This talk about anti-aliasing in games made me think... remember when people actually had the rigs to do super-sampling AA? Resolutions have increased so much that if you have a 4K screen, you can't even play at native anymore, let alone use the holy grandfather of AA. We've actually regressed into doing the opposite of SSAA: we're undersampling during rendering and upscaling. I find that absolutely insane.
@LukeAlmighty
Yeah, but my pseudo-OCD doesn't allow my computers to do guesswork. Nvidia's DLSS 3 alone is enough to make me scream inside.
I've learned to be more accepting of DLSS 2, but only as a fancier, more complex upscaler and nothing more. 4K DLSS is not 4K and it never will be.
The same goes for AMD's FSR, though I prefer it over DLSS, simply because it doesn't try to shove in AI-generated "magic" (or at least that's my understanding of the reason behind the difference in quality).
P.S. Attached is basically my reaction to anything AI generated that people try to push as "the real thing".
@LukeAlmighty
>dithering shadows
Oh god... I see that sometimes and wonder if it's my drivers or the game at fault.
@LukeAlmighty
The whole point of computers in the first place IS precision. If you throw that out and start guesstimating what's presented on screen for a game, where will that lead us? How long until we let computers be lazy enough to estimate important things like the interactions in an atom smasher? There's no telling what unforeseen consequences that could have. And funny enough, to my understanding, there is a problem with some modern computers actually NOT being precise enough for the advanced scientific math they have to do, with the rounding errors potentially leading to false theories or results.
@alyx And funny enough, to my understanding, there is a problem with some modern computers actually NOT being precise enough
what???
@LukeAlmighty
From what I understood of it, for some very advanced math, the current standards for floating-point arithmetic don't actually provide enough precision. Basically, they need to switch to even more bits to represent numbers, because what they're working on is that sensitive.
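To put a number on that precision ceiling, here's a minimal Python sketch (my own illustration, not from the thread) of IEEE 754 doubles running out of significant digits:

```python
# A double has roughly 15-16 significant decimal digits, so adding a
# small number to a huge one can lose the small number entirely.
big = 1e16
small = 1.0
print(big + small == big)  # True: the 1.0 vanished in rounding

# Subtracting nearly equal numbers wipes out most significant digits
# (catastrophic cancellation); only the last representable bit survives.
a = 1.0000000000000002
b = 1.0
print(a - b)  # ~2.2e-16, a single unit in the last place
```

If a simulation's quantities span that many orders of magnitude, the format simply has no digits left for the small effects.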
@alyx
Oh, ok.
In school, one of the first homework assignments we did was to make our own number representation that got around this problem. (I failed out of that school for a reason :D)
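The classic homework version of this is an exact rational type built on arbitrary-precision integers; the sketch below is purely my own guess at what such an assignment looks like (the `Rat` name and API are invented, not from the thread):

```python
from math import gcd

class Rat:
    """Exact rational number: no rounding, ever."""
    def __init__(self, num, den=1):
        if den == 0:
            raise ZeroDivisionError("denominator is zero")
        if den < 0:  # normalize the sign into the numerator
            num, den = -num, -den
        g = gcd(num, den)  # reduce to lowest terms
        self.num, self.den = num // g, den // g

    def __add__(self, other):
        return Rat(self.num * other.den + other.num * self.den,
                   self.den * other.den)

    def __eq__(self, other):
        return self.num == other.num and self.den == other.den

# 0.1 + 0.2 != 0.3 in binary floats, but the exact rationals agree:
print(Rat(1, 10) + Rat(2, 10) == Rat(3, 10))  # True
```

Python's built-in `fractions.Fraction` does the same job, just with more features.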
@LukeAlmighty
Not sure you can make a standard that truly solves the problem. You can keep throwing bits at it to store larger and larger (or smaller and smaller) numbers with more and more precision, but ultimately numbers are infinite in size and precision, while your bits aren't.
@LaoBan @LukeAlmighty
I think you're talking about a completely different type of approximation.
Sure, games have approximated things like lighting and shadows from the very beginning. But at least those approximations were deterministic, in that two different computers, running at the same resolution with the same in-game parameters, would output the exact same image.
Now with the likes of DLSS, even the exact same computer, running the exact same in-game scenario multiple times, with the same parameters at the exact same resolution, will NOT output the exact same image.
>but that doesn't mean it's used in more serious applications
Not yet. And I'm honestly not completely sure even that is true.
@alyx
Dithering shadows....
I agree with you completely on the basis of that shit alone. DLSS 3 isn't even needed to ruin my day.