cross-posted from: https://lemmy.world/post/11840660
TAA is a crucial tool for developers - but is its impact on image quality too great?
For good or bad, temporal anti-aliasing - or TAA - has become a defining element of image quality in today’s games, but is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay, so what is it, why do so many games use it and what’s with all the blur? At one point, TAA did not exist at all, so what methods of anti-aliasing were used and why aren’t they used any more?
TAA has become so common because it's "free". Temporal data is required by DLSS and FSR, so if you are implementing those technologies you already have the data needed to implement TAA, making it a no-brainer to include.
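As a toy illustration of the temporal part (the function name and fixed blend factor below are my own, not any engine's actual code; real TAA also reprojects the history with motion vectors and clamps it to reject ghosting), the core accumulation is just an exponential moving average over jittered frames:

```python
import numpy as np

def taa_resolve(current, history, alpha=0.1):
    """Blend the current frame into the accumulated history.

    Each frame is rendered with a slightly jittered camera, so
    averaging frames over time supersamples edges "for free".
    This sketch shows only the exponential accumulation; a real
    pass also reprojects `history` with motion vectors and clamps
    it against the current frame's neighborhood.
    """
    return alpha * current + (1.0 - alpha) * history
```

Over enough frames, a pixel whose jittered samples alternate between edge colors settles near the average of the two, which is exactly how stair-step edges get smoothed out.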
The need for anti-aliasing is a byproduct of moving away from CRT display technology. The natural image softening of CRTs is not replicated by LCD and LED displays.
TAA is one of the better options, but at the end of the day it's difficult to create a true AA solution that doesn't have artifacts without resorting to supersampling.
We used AA on our CRTs back in the day. Of course we were all running at something like 1024x768, so it was needed a lot more. The higher your resolution, the less you need it.
Yes, that's true. AA was helpful at what I call "medium resolutions", the range between 480 and 768 vertical pixels. But CRTs still had a softer image simply as a byproduct of how the technology worked, and they worked better at lower resolutions like 240p (AFAIK, any signal with less than 480 lines of vertical resolution was automatically progressive scan). Game developers of the time exploited this, famously using dithering for transparency effects on platforms that didn't fully support them, such as the SEGA Saturn (it supported transparent 2D sprites, but not transparent textured polygons like the PSX did). The softer image smoothed the dithered patterns out, giving the appearance of a bigger available color palette and of special effects. Flickering sprites every other field was also a common technique that relied on CRT image persistence. This is why games like Streets of Rage look awful on modern displays but display correctly on CRTs.
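The dither-plus-softening trick described above can be sketched numerically (a toy model, not an emulation of real Saturn output; `checkerboard` and `crt_soften` are names I made up): alternating two opaque colors per pixel, then letting adjacent pixels bleed together the way a CRT does, recovers the 50/50 blend the hardware couldn't draw directly.

```python
import numpy as np

def checkerboard(color_a, color_b, size=8):
    """Alternate two opaque colors per pixel - the era's stand-in
    for a 50% transparency blend the hardware couldn't do."""
    yy, xx = np.indices((size, size))
    return np.where((xx + yy) % 2 == 0, color_a, color_b).astype(float)

def crt_soften(img):
    """Crude stand-in for CRT softness: average each pixel with
    its horizontal neighbor, as adjacent beam spots bleed together."""
    return 0.5 * (img + np.roll(img, 1, axis=1))
```

Softening a black/white checkerboard this way yields a flat mid-gray, i.e. the intended half-transparent mix; on a sharp LCD the same pattern just reads as a visible mesh.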
But regardless, AA will probably be phased out eventually; it's just a tool to mitigate the growing pains of new display technology.
DLAA comes to mind
Interesting take. Do you think that natural image softening would come back in newer technologies?
I'm not that guy, but I don't think so. The trend will likely be that we get to the point where we render and display at such a high resolution that you can't even see pixels anymore. We're getting there already with smaller 4K displays, where turning on AA doesn't make an appreciable difference over 4K native rendering.
At one point
I was there, 3000 years ago, when the first consumer AA was barely usable on my Voodoo 1.
The first things I always turn off are motion blur, anti-aliasing and ray tracing.
Motion blur just makes it look like you’re drunk, anti-aliasing makes everything look like it’s smeared with vaseline and ray tracing tanks your FPS for not much added quality.
Try playing Forza without AA. Ray tracing tanks your performance, but it gives great visual enhancements; once you experience it, there's no going back.
I don't really play racing games or Forza, so maybe it's unique to Forza or racing in general, but every RPG, action, adventure, strategy, survival, shooter, and sim game I have played looks worse with AA, and ray tracing is not worth cutting your FPS in half for.
You must not notice aliasing and shimmering, then? Most find it very distracting to see everything flicker, shimmer, and stair-step with the slightest motion.
And ray tracing really depends on the game, implementation, and hardware. Ray-traced global illumination alone fixes the classic video game look that stems from rasterized lighting errors (light leaking, default ambient light, etc). It is the future for high-quality games, even non-photorealistic ones. Its expense is offset by both reconstruction and improved hardware. You won't be able to avoid it forever, even if you want to.
It has gotten much better in the last 7 years. I will say that I usually test 1.5x or 2x my resolution if possible, which can be less taxing depending on the engine, as I'm always trying to eke out a little extra on my 970.
2x on a 970? I struggled with my 970 at 1440p low-medium settings until I got the 3080. Often had to put scaling to 1080p, and that was on "last gen" titles. Can't imagine still trying to limp that thing along nowadays, as much as I loved it.
Same. I also disable stuff like film grain and lens flares whenever possible.
I always have film grain enabled. It provides some half-decent dithering that helps mask color banding, which is especially noticeable on my low-end monitor.
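That use of noise is ordinary dithering. A minimal sketch (my own function names, using an 8-level posterize to exaggerate the banding a low-bit-depth panel would show): adding zero-mean noise of about one quantization step before rounding turns hard bands into grain whose local average tracks the true value.

```python
import numpy as np

def quantize(img, levels=8):
    """Posterize to a few levels - exaggerates the banding a
    low-bit-depth display produces on a smooth gradient."""
    step = 255.0 / (levels - 1)
    return np.round(img / step) * step

def grain_dither(img, levels=8, seed=0):
    """Add zero-mean noise of one quantization step before
    posterizing, so the rounding error is decorrelated and
    bands break up into grain instead of hard edges."""
    step = 255.0 / (levels - 1)
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-step / 2, step / 2, img.shape)
    return quantize(np.clip(img + noise, 0, 255), levels)
```

A flat region quantizes to a single wrong value, while the dithered version scatters between the two nearest levels so its average stays close to the original - the same effect film grain has on banded skies.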
I don’t think I could stomach a game without AA. It’s on par with playing a game with an unstable 30fps frame rate, it’s just nauseating.
Motion blur just makes it look like you’re drunk
Someone hasn't tried motion blur since GTA in 2004.
It’s like you’ve used each thing once in some specific game where it was badly implemented and decided that’s how it looks in all games.
There is no objective "it looks like this"; every game does things slightly or very differently. I'm certain you are either unusually blind to detail, have serious vision problems, or are just very good at convincing yourself of your own bad ideas.
All temporal effects kinda blow.
TAA just makes beautiful graphics look crappy and blurry.
I'd rather not have AA at all instead.