https://www.youtube.com/watch?v=KnozAHKTz9o
HAAAAHAHAHAHAHAH
HAHAAAHAHAAHAHAHAA
@LukeAlmighty
>why have good 3D models and textures, when you can use thousands of watts of electricity to brute force a bad AI model to make your game look decent instead
Can we seriously fuck Nvidia into oblivion already?! I'm so done with this bullshit. Can we stop pretending that Nvidia's DLSS and RTX tech isn't part of the reason games are so horrendously optimized now?
@LukeAlmighty Good tech and cool tech demo, but it's simply being used in the wrong place.
Just like when they tried to take the DLSS tech and apply it to make regular video playback look sharper. It's a cool idea in principle, but it's ridiculously energy-wasteful.
I remember when LTT tried it out, and it maxed out GPU usage. Just streaming the video at a higher bitrate in the first place would almost certainly use less energy than upscaling it on playback with this DLSS hack. So it's a cool tech demo, but you simply wouldn't want it used en masse. The tech could be useful for something like restoring old footage, but it shouldn't be used for everyday internet video playback (which is what they advertised it for) when it would be more energy-efficient to just encode content at a higher bitrate and/or resolution.
The same goes for this thing. It's good tech, and I'm sure it could be useful somewhere, maybe even in the development process for 3D face models and textures, but not directly in consumers' hands.