A lot of newer game engines render many effects at resolutions considerably lower than the full render resolution of the image. This saves on performance, but it means you're basically forced to use TAA or an upscaler like DLSS or FSR to act as a denoiser. If the game does let you turn TAA off without using an upscaler, you get a very grainy-looking image, since lots of elements aren't rendered at full resolution and there's no blur filter over the image to smooth them out.
Even with TAA or an upscaler on, things can look grainy if the internal render resolution isn't high enough. At 4K the grain tends to be a lot less prominent, but that's very demanding on the GPU, so realistically only people with pretty high-end graphics cards have that option.
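To make the low-res-effects idea concrete, here is a minimal numpy sketch of what this comment describes: an effect buffer (say, ambient occlusion) computed at half resolution and naively upsampled to full resolution. All names, sizes, and the random stand-in data are illustrative, not taken from any real engine.

```python
import numpy as np

def upsample_nearest(buf, factor):
    """Nearest-neighbour upsample: each low-res texel becomes a factor x factor block."""
    return np.repeat(np.repeat(buf, factor, axis=0), factor, axis=1)

# Hypothetical full render resolution and a half-res effect pass (e.g. AO).
full_h, full_w = 1080, 1920
ao_half = np.random.rand(full_h // 2, full_w // 2)  # stand-in for a noisy half-res AO buffer

# Without TAA or an upscaler averaging this over time, the blocky/noisy buffer is
# composited straight into the full-res frame, which is the "grainy" look described above.
ao_full = upsample_nearest(ao_half, 2)
assert ao_full.shape == (full_h, full_w)
```

TAA effectively plays the denoiser here: jittering the low-res samples and blending them across frames hides the blockiness that a raw upsample like this leaves behind.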
SpeedRun355
Idk my man
TalentlessSavant87
That thing has been around forever. I remember in 2009 I bought a new PC with a 9600 GT, which was pretty good for most games. I installed one of the FlatOut games (can't remember which one came out around that time) and the shadows looked so weird I thought my card was defective. Crysis too, Fallout 3, etc. The important thing is it wasn't dying hardware, and I just accepted it and got used to it.
Memetron69000
because it’s a great boon to performance
transparent shaders re-shade the same pixel for every overlapping transparent poly, so overdraw can quite quickly ruin performance even on good hardware
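A back-of-the-envelope illustration of that overdraw cost; the layer count and resolution here are made-up numbers for illustration, not measurements:

```python
# With alpha blending, every transparent layer covering a pixel must be shaded
# and blended back-to-front, so shading cost scales with the number of layers.
pixels = 1920 * 1080
layers = 6                        # hypothetical: stacked smoke/glass/foliage quads
shaded_opaque = pixels            # opaque pass: roughly one shade per visible pixel
shaded_transparent = pixels * layers
print(f"transparent stack shades {shaded_transparent // shaded_opaque}x more pixels")
```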
Latter_Ad2247
Fake frames
A_Person77778
Low-res effects. A more extreme example is Red Dead Redemption 2 and its ambient occlusion when on medium (console settings); not even XeSS knows what to do with it, and even on high the dithering can still be visible
Sinister_Mr_19
Two things can cause what you're seeing: AI upscaling (DLSS/FSR), and/or effects and shadows being rendered at a low resolution to save on performance. You'd need to run your game on very high/ultra settings with no DLSS/FSR to get a fully clean image.
LukeLC
Modern games are made for TAA. It’s expensive to alpha blend a ton of tiny details even on modern GPUs. Using a jitter pattern + TAA is cheap by comparison and resolves nearly the same at resolutions like 4K.
You still need actually good TAA, of course. DLSS, XeSS, and FSR4 are the way to go.
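Here is a minimal sketch of the jitter + TAA accumulation idea from this comment: a toy scene, a sub-pixel jitter sequence, and an exponential history blend. It assumes a static camera and skips reprojection and neighbourhood clamping, both of which real TAA needs; every name and number is illustrative.

```python
import numpy as np

H, W = 4, 4  # a tiny "screen"

def render_jittered(jitter):
    """Stand-in renderer: one point sample per pixel of a detailed 'scene',
    offset by a sub-pixel jitter. A single frame of this aliases badly."""
    jy, jx = jitter
    ys = (np.arange(H)[:, None] + jy) / H
    xs = (np.arange(W)[None, :] + jx) / W
    return np.sin(6 * np.pi * xs) * np.cos(6 * np.pi * ys)

history = None
alpha = 0.1  # blend weight: how much of each new frame enters the history
jitters = [(0.5, 0.5), (0.25, 0.75), (0.75, 0.25), (0.125, 0.625)]  # toy Halton-ish sequence

for jitter in jitters * 16:
    frame = render_jittered(jitter)
    history = frame if history is None else alpha * frame + (1 - alpha) * history

# 'history' now approximates averaging many sample positions per pixel, i.e. the
# supersampled result that a single jittered sample per frame could never show alone.
print(history.round(3))
```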
jim_forest
because they just rely on TAA to make it presentable
Reptalex
That's aliasing. Anti-aliasing counters that.
millenia3d
dithered transparency is much better than translucency, both in performance and in what you can do with it – fully translucent shader models tend to have lots of restrictions versus standard opaque/masked shaders, on top of the performance cost
ofc it has its own limitations and issues, but generally if you can use masked dithered transparency it's better to go with that than a translucent shader
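For anyone unfamiliar with the masked/dithered ("screen-door") transparency being described, here is a minimal numpy sketch of the idea; the 4x4 Bayer matrix is the classic threshold pattern, and the sizes and alpha value are arbitrary:

```python
import numpy as np

# Instead of blending, draw the surface as opaque but discard pixels in a
# pattern whose density matches the alpha value. A 4x4 Bayer matrix gives
# evenly distributed thresholds in [0, 1).
bayer4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def dither_mask(alpha, h, w):
    """Boolean keep/discard mask: True where the 'transparent' surface is drawn."""
    thresholds = np.tile(bayer4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return alpha > thresholds

mask = dither_mask(0.5, 8, 8)
print(mask.mean())  # ~0.5 of pixels drawn, approximating 50% opacity
```

That dither pattern is exactly the grain people notice with TAA off; with TAA on, the jittered pattern averages out over frames and reads as smooth translucency.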
MiraiKishi
Temporal Anti Aliasing. Ugh, never liked it.
anthonycarbine
r/fucktaa and Threat Interactive on YouTube would be great resources for you. Every other "bad" game you mentioned is Unreal Engine 5, which is infamous for the blurry TAA implementation in its core rendering pipeline.
unknownsqwe
I play on a TV with HDR disabled at 144 Hz, and the best image I can get is DLSS on Quality with sharpness at 100%.
UnlimitedAidan
Yeah, I see this prominently in Darktide (without AI upscaling tech or AA). The hair looks like a grainy mop and object outlines are horribly aliased.
AI upscaling, even on the highest quality setting, produces terrible artifacts in every game I've played. The inaccuracy just isn't acceptable.
TAA blurs things to oblivion. Objects in the distance become too obfuscated to recognize as priority targets, which is especially terrible in FPS games.
djDouggpound
What a voice lol the way they say disgusting is disgusting
quadsimodo
A+ for OP’s thorough example to their question
IDK_IV_1
Because it needs better batteries.
PenCool479
I'll swear by 4K with in-game FXAA or ReShade FXAA for life. Very few other anti-aliasing solutions are as clear.
Thundergod250
I guess I didn't take this problem as seriously as I should have, because I didn't know what the video was on about until they zoomed in at 1:30 lmao.
fnv_fan
Welcome to modern gaming baby
Aromatic-Coconut-122
Wait. Why are we complaining about using AA and TAA?
Anti-aliasing is there because you can't light up only a fraction of a pixel, whether you're playing at 640×480 or 8K. The graphic can be round AF, but the game's engine has to preserve the shape of smooth surfaces, straight edges, and curved objects.
Now force that onto a display with round or square pixels: there are going to be missed pixels on any display for anything that isn't a perfectly horizontal or vertical line.
So a 45° line across any monitor or TV is going to look stair-stepped without anti-aliasing. Gaming monitors and gaming-capable TVs with good scaling hardware can apply various AA levels to any image, reducing aliased/jagged edges. But you also have to take the game into consideration. COD is a horrible example, yet a perfect one at the same time.
COD wants the engine to pump out as many rendered pixels as possible, raw, at a base level. This lets the developer set low minimum specs to say, "hey, it runs. No one said it'd be perfect," but crank up to the best AA and the game looks great, though viewing distance from the screen affects this.
Meanwhile, on the opposite end, I play everything either on my super ultrawide at 5120×1440 (32:9) or at 4K (16:9) with an RTX 4090, and while 4K is more demanding, I'm getting more frames rendered than I can actually perceive. At 5120×1440 it's even beyond that.
"AA" on its own is too generic; pretty much all games have basic anti-aliasing.
SSAA is the most hardware-intensive but gives the best image quality, since it renders the entire frame at a higher resolution every time (see the sketch after this comment). MSAA came along and, instead of supersampling the whole frame, takes multiple coverage samples per pixel while shading most of the image only once, so it resolves edges much more quickly. Nvidia cards can also use MFAA, which varies the sample pattern across frames and combines them temporally, pulling image data from the GPU's buffer. Then there's TAA, which is purely buffer-based temporal anti-aliasing, and TAAU, which is the same but with upscaling.
There are many more AA options, but the purpose is always the same: smooth the jagged edges produced by the fact that a display's pixels can't be only partially lit. Anti-aliasing compensates by rendering additional information and blending it into neighboring pixels.
No AA will always result in jagged edges, though less prominently at higher resolutions. Higher resolutions, NOT larger screens.
On a larger screen there's still a fixed number of pixels making up the resolution. You don't magically get more pixels just because the screen is larger; in fact, the larger the screen, the more you'll notice jagged edges, even with the best anti-aliasing method.
If I'm playing on a 65-inch OLED, I don't notice jagged edges because I'm sitting 10 to 15 feet away. You'd need to be 15 to 20 feet away from a 90-to-100-inch screen to stop noticing the jagged edges, even with full SSAA.
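To illustrate the SSAA point referenced above, here is a minimal numpy sketch: render a hard 45° edge at 4x the target resolution, then box-filter it down. The scene, factor, and sizes are all made up for illustration.

```python
import numpy as np

def ssaa_downsample(hi_res, n):
    """Average each n x n block of the supersampled image down to one display pixel."""
    h, w = hi_res.shape
    return hi_res.reshape(h // n, n, w // n, n).mean(axis=(1, 3))

size, n = 16, 4
yy, xx = np.mgrid[0:size * n, 0:size * n]
edge_hi = (xx > yy).astype(float)      # aliased 45-degree edge at 4x resolution
edge_aa = ssaa_downsample(edge_hi, n)  # edge pixels land between 0 and 1

print(np.unique(edge_aa))  # intermediate values are the "blend" that hides the staircase
```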
Euthanasiia
Omg, it sounds like you have so much mucus just stuck.
I wish I had 20/20 vision so I could know what y'all are talking about. All this shit looks the same to me.
DjXer007_
Please name all the games
Fun_Possible7533
One of the worst cases of this tragedy is Stalker 2. I use ReShade to subtly blur out the grainy textures.
iamonewiththeforce
I love this post. I had the same question. I noticed it on Stray and I was really puzzled. Would make a great video topic for LTT.
impala_knight
True, it used to be best practice to turn off AA if you played at 4K, because the pixel density is so high you won't notice the pixel staircase. But nowadays games are made with low console specs in mind, and devs don't bother to change that when porting to PC.