@matrix it looks worse than Doom 2016. Bethesda should just give up at this point.

@newt It's not reasonable to compare a linear, static game to an open-world game with a day and night cycle

@matrix @newt just pre-bake 4-16 lighting setups for the day and night cycle and blend between them; it's not rocket science and it's perfectly doable. starfield looks like something that should run on gtx 700 series cards at 60 fps, so there should be headroom for lighting
@matrix @newt besides, didn't cyberbug 9077 have realtime GI lighting with a day/night cycle? it should definitely be possible in current year.
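For anyone wondering what "pre-bake a handful of lighting setups and blend between them" looks like in practice, here is a minimal sketch of the keyframe-and-lerp idea from the two posts above. Everything in it (the LightingSetup struct, the 8-entry table, sampleLighting) is invented for illustration; a real engine would blend baked lightmaps or light-probe sets per scene rather than a few floats, and this says nothing about how Creation Engine actually handles its day/night cycle.

```cpp
// Hypothetical sketch: blend between pre-baked lighting "keyframes" over a
// 24-hour cycle. All names and numbers here are made up for illustration.
#include <array>
#include <cmath>
#include <cstdio>

struct LightingSetup {
    float ambient;          // baked ambient intensity
    float sunR, sunG, sunB; // baked sun/sky tint
};

// e.g. 8 setups baked at fixed times of day (midnight, 03:00, 06:00, ...)
constexpr int kSetups = 8;
const std::array<LightingSetup, kSetups> gBaked = {{
    {0.05f, 0.10f, 0.10f, 0.30f}, {0.05f, 0.10f, 0.10f, 0.30f},
    {0.30f, 1.00f, 0.60f, 0.40f}, {0.80f, 1.00f, 0.90f, 0.80f},
    {1.00f, 1.00f, 1.00f, 0.95f}, {0.80f, 1.00f, 0.80f, 0.70f},
    {0.30f, 0.90f, 0.50f, 0.40f}, {0.05f, 0.10f, 0.10f, 0.30f},
}};

// Blend the two baked setups that bracket the current time of day (0..24 h).
LightingSetup sampleLighting(float hour) {
    float t  = hour / 24.0f * kSetups;        // position on the keyframe ring
    int   i0 = static_cast<int>(t) % kSetups; // previous keyframe
    int   i1 = (i0 + 1) % kSetups;            // next keyframe (wraps at midnight)
    float f  = t - std::floor(t);             // blend factor between them
    const LightingSetup &a = gBaked[i0], &b = gBaked[i1];
    auto lerp = [f](float x, float y) { return x + (y - x) * f; };
    return { lerp(a.ambient, b.ambient), lerp(a.sunR, b.sunR),
             lerp(a.sunG, b.sunG), lerp(a.sunB, b.sunB) };
}

int main() {
    for (float h : {0.0f, 7.5f, 12.0f, 18.25f}) {
        LightingSetup s = sampleLighting(h);
        std::printf("%05.2f h  ambient=%.2f  sun=(%.2f, %.2f, %.2f)\n",
                    h, s.ambient, s.sunR, s.sunG, s.sunB);
    }
    return 0;
}
```

The point is just that the per-frame cost is a cheap interpolation between data baked offline, which is why this kind of time-of-day blending was common long before realtime GI was feasible.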

@ic3l9 @matrix @newt I thought CP2077 was heavy on the system until I saw Starfield. It looks far worse and runs far worse. Starfield doesn't even have DLSS to save it; it has shitty FSR2 and that's it.

Something has gotta be spaghettied in their graphics rendering. I watched a video showing that NVIDIA cards tank on Ultra settings relative to AMD cards. Maybe a driver thing, but that's not something you ever really see.

@beardalaxy @ic3l9 @newt Nvidia is supposedly downsizing their gaming division, so it's possible it's partially Nvidia's fault
youtube.com/watch?v=IPSB_BKd9D

@matrix @ic3l9 @newt dude, there aren't even drivers for it yet on intel cards :meru_death:

@beardalaxy @matrix @newt when did it become normal that video card vendors need to add specific support for every single game?
@newt @beardalaxy @matrix that was not for Every. Single. Popular. Game.

and instead of "it's slow because nobody at activision/bethesda/ubisoft/namco/whatever can code", it always gets turned into "it's slow because the (amd|nvidia) drivers are bad >:)" by brain-dead shills with no personality beyond their favorite brands. and instead of saying "damn, not only am i wasting time, i'm also wasting loads of money", gamers keep "upgrading" their video cards and preordering shitty broken games that need day-one patches and driver workarounds to be playable at all. and instead of any form of consumer info you have shit like LTT, who give no fucks, or GN, who mean well but simply do not understand how much the software side is ruining amateur computing.
@ic3l9 @beardalaxy @matrix ok ok, I hear you. But what if..

what if..

What if both "it's slow because nobody at activision/bethesda/ubisoft/namco/whatever can code" AND "it's slow because the (amd|nvidia) drivers are bad >:)" are true?
@newt @beardalaxy @matrix i would find that more believable if genuinely well-written software didn't perform so close to the theoretical maximum. and the card vendors usually keep real software developers around for a long time, while game studios hire the bottom of the barrel and fire them once the bare minimum is completed. and if the driver devs are so bad, how come they can still pull the cart out of the mud for the AAA studios?
@ic3l9 @beardalaxy @matrix somehow i believe that engine devs are treated a lot differently than your typical gamedev. Consider that UE can render rather complex, almost photorealistic scenes on consumer (albeit top-of-the-line) hardware in real time at 60 fps or more.
@newt @beardalaxy @matrix yeah but those engines are maintained by people with actual skills who stay with the company for much longer than it takes to make one game. and in a world where you'll have to clean up after gamedevs anyway, you might as well contribute to game engines, and perhaps add support for some proprietary stuff while you're at it? and somehow, even with the most advanced game engines, some troglodyte will come along and make his game run at powerpoint framerates by using it wrong.
@ic3l9 @beardalaxy @matrix sure.

Then again, I can totally imagine a driver eating shit at some very specific workload. I don't know how good or bad windows drivers are, but graphics drivers for L'Eunuchs were famously shit for the longest time.
@newt @beardalaxy @matrix i guess it depends on what the manufacturer considers necessary to get people to buy the cards? for the longest time hardly anyone used video cards on linux for anything; it only became a thing for scientific computing and machine learning with CUDA and OpenCL. i remember cuda being very unstable under windows as well in maybe 2015 or so: every other cycles render used to crash, and when i first switched to amd/opencl i couldn't even get it to render without weird artifacts. since then demand for support has skyrocketed, so they fixed their shit; i don't think i've had even one fatal crash with cycles on HIP. basically, gpu compute has moved from being an experimental tech demo published for prestige to being an actual feature that sells cards.