This talk about anti-aliasing in games made me think... remember when people actually had the rigs to do super-sampling AA? Now resolutions have increased so much that you can't even play at native anymore if you have a 4K screen, let alone use the holy grandfather of AA. We've actually regressed into doing the opposite of SSAA: we're undersampling during rendering, then upscaling. I find that absolutely insane.
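To illustrate what SSAA actually buys you, here's a toy sketch. Everything here (the scene function, the names) is made up purely to show the idea of averaging sub-pixel samples:

```python
# Toy SSAA sketch: render each pixel by averaging NxN sub-samples
# of a hypothetical scene (a hard diagonal edge).

def scene(x, y):
    # White above the line y = x, black below: a classic aliasing case.
    return 1.0 if y > x else 0.0

def render_pixel(px, py, factor):
    # Box-filter average of factor x factor sub-samples inside one pixel.
    total = 0.0
    for sy in range(factor):
        for sx in range(factor):
            total += scene(px + (sx + 0.5) / factor,
                           py + (sy + 0.5) / factor)
    return total / (factor * factor)

# 1 sample/pixel: pixels on the edge are all-or-nothing (jaggies).
no_aa = [render_pixel(x, x, 1) for x in range(4)]
# 4x4 = 16 samples/pixel: edge pixels land on smooth in-between greys.
ssaa = [render_pixel(x, x, 4) for x in range(4)]
```

Note that both results are fully deterministic: run it anywhere, you get the same numbers.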
@alyx
Well, I see it this way.
It takes the human brain about 0.1s to make an estimate of most things. But calculating it precisely takes pen, paper, a new skill, and a ton of time.
I always used to wonder why computers work differently. But they no longer do, because we've finally taught them to make a guess. And that's what lets them work faster.
@LukeAlmighty
The whole point of computers in the first place IS precision. If you throw that out and start guesstimating what's presented on screen for a game, where does that lead? How long until we let computers be lazy enough to estimate important things, like the interactions in an atom smasher? There's no telling what unforeseen consequences that could have. And funnily enough, to my understanding, there's a problem with some modern computers actually NOT being precise enough for the advanced scientific math they have to do, with the rounding errors potentially leading to false theories or solutions.
@LukeAlmighty
From what I understood of it, for some very advanced math the current standards for floating point arithmetic don't actually provide enough precision. Basically they need to switch to even more bits to represent numbers, because what they're working on is that sensitive.
@alyx
Oh, ok.
In school, one of the first homework assignments we did was to build our own number representation that got around this problem. (I failed that school for a reason :D)
@LukeAlmighty
Not sure you can make a standard that truly solves the problem. You can keep throwing bits at it to store larger and larger (or smaller and smaller) numbers with more and more precision, but ultimately numbers are infinite in size and precision, while your bits aren't.
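You can see that wall in practice: 64-bit floats silently drop a +1 once the number is big enough, and throwing more digits at it (Python's decimal here, purely as an illustration) only moves the wall, it never removes it:

```python
from decimal import Decimal, getcontext

# IEEE 754 double gives ~15-16 significant decimal digits. At 1e16 the
# gap between adjacent representable floats is 2, so adding 1 is lost.
big = 1e16
absorbed = (big + 1.0 == big)  # True: the +1 vanished

# Throw more digits at it: 30 significant digits of decimal precision.
getcontext().prec = 30
d = Decimal(10) ** 16
survives = (d + 1 != d)  # True: now the +1 is representable

# ...but go big enough and the same absorption happens all over again.
e = Decimal(10) ** 40
absorbed_again = (e + 1 == e)  # True: 41 digits don't fit in 30
```

Whatever precision you standardize on, there's always a magnitude where the small term falls off the end.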
@LaoBan @LukeAlmighty
I think you're talking about a completely different kind of approximation.
Sure, games have approximated things like lighting and shadows from the very beginning. But at least those approximations were deterministic: two different computers, running at the same resolution with the same in-game parameters, would output the exact same image.
Now with the likes of DLSS, even the exact same computer, running the exact same in-game scenario multiple times, with the same parameters at the exact same resolution, will NOT output the exact same image.
>but that doesn't mean it's used in more serious applications
Not yet. And I'm honestly not completely sure even that is true.
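The reproducibility point is easy to sketch, by the way. This is NOT what DLSS actually does internally (that's proprietary); it's just a toy showing that once random sampling, or any run-varying state, enters the render path, bitwise-identical output is gone unless everything is seeded:

```python
import random

def scene(x, y):
    # Same hypothetical hard edge as before.
    return 1.0 if y > x else 0.0

def jittered_pixel(px, py, samples, rng):
    # Stochastic sampling: sub-sample positions come from an RNG, so two
    # "identical" renders generally disagree unless the seed is fixed.
    return sum(scene(px + rng.random(), py + rng.random())
               for _ in range(samples)) / samples

run_a = jittered_pixel(0, 0, 8, random.Random())  # OS-seeded
run_b = jittered_pixel(0, 0, 8, random.Random())  # OS-seeded
# run_a == run_b is NOT guaranteed. With a fixed seed it is:
seeded_a = jittered_pixel(0, 0, 8, random.Random(42))
seeded_b = jittered_pixel(0, 0, 8, random.Random(42))
```

Classic rasterization is like the seeded case by construction; temporal/ML upscalers give that guarantee up.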
@alyx
> And funny enough, to my understanding, there is a problem with some modern computers actually NOT being precise enough
what???