
This talk about anti-aliasing in games made me think... remember when people actually had the rigs to do super-sampling AA? Now resolutions have increased so much that if you have a 4K screen you can't even play at native anymore, let alone use the holy grandfather of AA. We've actually regressed into doing the opposite of SSAA: we're undersampling during rendering and upscaling the result. I find that absolutely insane.
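
(To be clear about what I mean, here's a toy numpy sketch of the two opposite approaches; the function names and the 2x factor are made up, and it's nowhere near what a real engine does:)

```
import numpy as np

def ssaa(render, target_h, target_w, factor=2):
    # Supersampling: render at factor x the target resolution, then box-filter down.
    img = render(target_h * factor, target_w * factor)            # the expensive part
    return img.reshape(target_h, factor, target_w, factor, 3).mean(axis=(1, 3))

def undersample_and_upscale(render, target_h, target_w, factor=2):
    # The modern opposite: render below the target, then blow it back up.
    img = render(target_h // factor, target_w // factor)          # the cheap part
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

fake_render = lambda h, w: np.random.rand(h, w, 3)                # stand-in renderer

print(ssaa(fake_render, 1080, 1920).shape)                        # (1080, 1920, 3) from a 3840x2160 render
print(undersample_and_upscale(fake_render, 1080, 1920).shape)     # (1080, 1920, 3) from a 960x540 render
```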


@alyx
Well, I see it this way.
It takes the human brain about 0.1s to make an estimate of most things. But to calculate it precisely takes pen, paper, a new skill and a ton of time.

I always used to wonder why computers work differently. But they no longer do, because we have finally taught them to make a guess. And that allows them to work faster.

@LukeAlmighty
Yeah, but my pseudo-OCD doesn't allow my computers to do guesswork. Nvidia's DLSS 3 alone is enough to make me scream inside.

I've learned to be more accepting of DLSS 2, but only as a more complex and fancy upscaler, and nothing more. 4K DLSS is not 4K and it never will be.
The same goes for AMD's FSR, though I prefer it over DLSS, simply because it doesn't try to shove in AI-generated "magic" (or at least that's my understanding of the reason behind the difference in quality).

P.S. Attached is basically my reaction to anything AI generated that people try to push as "the real thing".

@alyx
Dithering shadows....
I agree with you completely on the basis of that shit alone. DLSS 3 isn't even needed to ruin my day.

@LukeAlmighty
>dithering shadows
Oh god... I see that sometimes and wonder if it's my drivers or the game at fault.

@LukeAlmighty
The whole point of computers in the first place IS precision. If you throw that out and start guesstimating what is presented on screen for a game, where will that lead us? How long until we make computers lazy enough to estimate important things like the interactions in an atom smasher? There's no telling what unforeseen consequences that could lead to. And funny enough, to my understanding, there is a problem with some modern computers actually NOT being precise enough for the advanced scientific math they have to do, with the rounding errors potentially leading to false theories or solutions.

@alyx
>And funny enough, to my understanding, there is a problem with some modern computers actually NOT being precise enough

what???

@LukeAlmighty
From what I understood of it, for some very advanced math, the current standards for floating-point arithmetic don't actually provide enough precision. Basically, they need to switch to even more bits to represent numbers, because what they're working on is that sensitive.
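
(Rough illustration of the idea with plain numpy, not any particular scientific code: the same naive running sum drifts far more in 32-bit floats than in 64-bit ones.)

```
import numpy as np

n = 1_000_000
exact = n * 0.1                      # what the sum "should" be: 100000

for dtype in (np.float32, np.float64):
    acc = dtype(0.0)
    step = dtype(0.1)                # 0.1 isn't exactly representable to begin with
    for _ in range(n):
        acc += step                  # every addition rounds to the nearest dtype value
    print(np.dtype(dtype).name, float(acc), "off by", abs(float(acc) - exact))
```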

@alyx
Oh, ok.
In school, one of the first homework assignments we did was to make our own number representation that got around this problem. (I failed that school for a reason :D)

@LukeAlmighty
Not sure you can make a standard that truly solves the problem. You can keep throwing bits at it to store larger and larger (or smaller and smaller) numbers with more and more precision, but ultimately numbers are infinite in size and precision, while your bits aren't.
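
(Tiny sketch of that point: 1/3 has no finite binary representation, so every fixed-width float type carries an error; adding bits only shrinks it, it never goes away.)

```
import numpy as np
from fractions import Fraction

third = Fraction(1, 3)                          # the exact value, for comparison
for dtype in (np.float16, np.float32, np.float64):
    approx = dtype(1) / dtype(3)                # the best this dtype can do
    # float(approx) is exact here (widening 16/32-bit to 64-bit loses nothing),
    # so this is the true representation error as a rational number.
    err = abs(Fraction(float(approx)) - third)
    print(np.dtype(dtype).name, "error ~", float(err))
```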

You're overreacting; approximations and "best guesses" have been in video games for over 20 years at this point, but that doesn't mean it's used in more serious applications

@LaoBan @LukeAlmighty
I think you're talking about a completely different type of approximation.

Sure, games have approximated things like lighting and shadows from the very beginning. But at least the approximations were deterministic, in that two different computers, running at the same resolution with the same in-game parameters, would output the exact same image.

Now, with the likes of DLSS, even the exact same computer, running the exact same in-game scenario multiple times, with the same parameters, at the exact same resolution, will NOT output the exact same image.
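
(Toy example of what I mean by deterministic: a plain fixed-function upscale like bilinear is a pure function of its input, so running it twice on the same frame gives a bit-identical result. This is just a sketch of mine, not anything from a real engine.)

```
import numpy as np

def bilinear_upscale(img, factor=2):
    # Deterministic upscale: sample the low-res image at fractional coordinates.
    h, w, c = img.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None, None], (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

frame = np.random.default_rng(0).random((540, 960, 3))   # fixed "low-res frame"
a = bilinear_upscale(frame)
b = bilinear_upscale(frame)
print(a.shape, np.array_equal(a, b))   # (1080, 1920, 3) True -- bit-for-bit identical
```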

>but that doesn't mean it's used in more serious applications
Not yet. And I'm honestly not completely sure even that is true.

It seems I misunderstood your argument. My bad.

@alyx the graphics quality has increased a lot, so the resolution suffers as a result. It's pretty telling that in most games, low and ultra have very little difference between them. It used to be wildly different.

@beardalaxy
I can't remember who covered it, but ultra is usually placebo. With textures, for instance, there's usually no size (as in resolution) difference between high and ultra; the only difference is that the ultra ones are uncompressed, which gives no discernible visual difference, just an increase in VRAM usage.
