@LukeAlmighty
>why have good 3D models and textures, when you can use thousands of watts of electricity to brute force a bad AI model to make your game look decent instead
Can we seriously fuck Nvidia into oblivion already?! I'm so done with this bullshit. Can we stop pretending that Nvidia's DLSS and RTX tech isn't part of the reason games are so horrendously optimized now?
@VD15 @LukeAlmighty
Of course they don't strictly need it to make games worse, but it lets them get lazier and lazier in their work while pretending they're the bomb. Same with DLSS.
@alyx
Funny thing is, it might actually be good tech if they used a human as the source.
But they used a 3D model, with all of its problems, just to project it onto literally almost the same 3D model.
So it's pointless. You get no additional data, just some slightly nicer fake polygons.
@LukeAlmighty Good tech and cool tech demo, but it's simply being used in the wrong place.
Just like when they took DLSS tech and applied it to making regular video playback look sharper. It's a cool idea in principle, but it's ridiculously energy wasteful.
I remember when LTT tried it out, and it maxed out the GPU. It's very unlikely that streaming the video at a higher bitrate in the first place would use more energy than playing it back through this DLSS hack. So it's a cool tech demo, but you simply wouldn't want it used en masse. The tech could be useful for something like restoring old footage, but it shouldn't be used for internet video playback (which is what they advertised it for) when it would be more energy efficient to just encode the content at a higher bitrate and/or resolution.
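To put rough numbers on it, here's a back-of-envelope sketch. Every figure in it is my own assumption for illustration (GPU draw, bitrate bump, network energy per GB), not a measurement:

```python
# Back-of-envelope: extra energy of GPU upscaling on playback vs. just
# streaming a higher-bitrate encode. All numbers are rough assumptions.

GPU_EXTRA_WATTS = 150       # assumed extra GPU draw while upscaling
EXTRA_BITRATE_MBPS = 8      # assumed bitrate bump, e.g. 8 -> 16 Mbps
NETWORK_WH_PER_GB = 30      # assumed marginal network energy per GB;
                            # published estimates vary by 10x or more
HOURS = 1.0                 # one hour of playback

gpu_wh = GPU_EXTRA_WATTS * HOURS

extra_gb = EXTRA_BITRATE_MBPS / 8 * 3600 * HOURS / 1000  # Mbps -> GB over the hour
network_wh = extra_gb * NETWORK_WH_PER_GB

print(f"GPU upscaling:         {gpu_wh:.0f} Wh per hour")      # ~150 Wh
print(f"Higher-bitrate stream: {network_wh:.0f} Wh per hour")  # ~108 Wh
```

Even with generous assumptions for the network side, the upscaling GPU burns a comparable amount of energy or more per hour of playback, and which side "wins" flips entirely depending on the numbers you plug in.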
The same goes for this thing. It's good tech, and I'm sure it could be useful somewhere, maybe even in the development process for 3D face models and textures, but not directly in consumers' hands.
@alyx @LukeAlmighty you're delusional if you think graphics programmers need RTX to make games run like shit