Anti-Aliasing Analysis, Part 1: Settings And Surprises


JPForums

Distinguished
Oct 9, 2007
104
0
18,680
With that in mind, what does 2X sampling multiply the axes by?

If I recall correctly, 2X is an MSAA mode and not available to SSAA. It just means there are two samples averaged in the pixel rather than the four shown in the example. Unfortunately, with only two samples, the blend's effectiveness is extremely dependent on the angle of the object sampled. Hence, 4X MSAA shows quite an improvement over 2X.

Nvidia article is wrong, or at the very least semantically sloppy.

4X samples generally means doubling of resolution for both axes. 4*(x*y)==(2*x)*(2*y).

Yes, 4X SSAA doubles the resolution of each axis. This results in a quadrupling of the pixel count (hence the 4X). In cameras, moving from 2 megapixels to 8 megapixels is considered a fourfold increase in resolution. Likewise, since resolution on monitors is also measured in two dimensions, a doubling of each axis is a quadrupling of resolution. It is technically the semantics of this article that are sloppy. Not that it makes much of a difference, as Don spells out what is meant.
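To make the quoted arithmetic concrete, here is a minimal sketch (Python, with a hypothetical 1920x1080 base resolution) of how an ordered-grid SSAA render target scales: each axis is multiplied by the square root of the sample count, so 4X means 2x per axis and four times the pixels.

[code]
import math

def ssaa_render_size(width, height, samples):
    # Each axis scales by sqrt(samples), so total pixel count scales by `samples`.
    scale = math.sqrt(samples)
    return round(width * scale), round(height * scale)

w, h = 1920, 1080                    # hypothetical base resolution
rw, rh = ssaa_render_size(w, h, 4)   # 4X SSAA
print(rw, rh)                        # 3840 2160
print((rw * rh) / (w * h))           # 4.0 -> four times the pixel count
[/code]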
 

cleeve

Illustrious


Did you select 'Advanced View' from the preferences button on the right? ;)
 

AnUnusedUsername

Distinguished
Sep 14, 2010
235
0
18,710
It's an interesting article, but I think it's really not relevant for most users. From a gamer's perspective, there seems to be hardly any reason to ever want to use AA. Either you can already run the game at your monitor's highest resolution, which generally has a high enough DPI that individual pixels aren't noticeable, so AA eats performance for no benefit, or you can't, and AA isn't an option because your card can't handle the game as it is.

In the past, when people were running ~86 PPI monitors (1280x1024 @ 19'', or similar), AA was relevant: resolution relative to screen size was low enough that visible pixels were a problem, and there was no way to fix it other than AA. Plus, resolutions were lower, so it was easier to run games well at native. Nowadays, ~100 PPI is common (1920x1080 @ 21.5'', 2560x1600 @ 30'', 1920x1200 @ 24''), and even more than that is possible, such as ~130 PPI from 1920x1080 @ 17'' laptop screens. All of these are high enough that you can effectively "resolution out" of aliased images, making AA just a resource-waster at best.
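For anyone who wants to check the PPI figures quoted above, here is a quick sketch using the standard diagonal formula for a few of the configurations mentioned in the post; the ~86, ~100, and ~130 PPI values fall out directly.

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = diagonal pixel count / diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(1280, 1024, 19), (1920, 1080, 21.5), (2560, 1600, 30), (1920, 1080, 17)]:
    print(f"{w}x{h} @ {d}in: {ppi(w, h, d):.1f} PPI")
[/code]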

To some like myself, the sharpness of the image is the most important aspect, so using something that both eats performance alive and reduces sharpness is lunacy. Sure, that's a matter of opinion, but it's more or less a given that blurring an image (by trying to display more detail than is possible) to reduce jagged edges is going to produce a blurrier image. Some people swear by AA, but I just don't see the point.
 

cleeve

Illustrious


It's subjective, I'll admit, but I completely disagree with you. I find it incredibly easy to recognize aliasing artifacts even on high-DPI monitors at native res, and I think you will find a lot of other folks notice it, too.

Even on my 2560x1600 monitor, I find AA makes a colossal difference in many games at the native resolution--and if the hardware can handle it with smooth frame rates there's no reason whatsoever not to enjoy the benefit of this feature.

As far as blurring goes, I think you have a misunderstanding of the term and are using it improperly here. For me, the visual quality benefit is obvious, and I see no 'blurriness' using MSAA -- only higher-quality edges without aliasing problems. Anti-aliasing an object's edge doesn't 'blur' it any more than a high-resolution photograph displayed on a monitor is blurred; it's actually averaging down more information to show a better approximation of the image within the limits of the resolution. Downsampling in this fashion isn't really 'blurring', a term that suggests you're losing visual fidelity. You're actually gaining fidelity with AA.
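To illustrate the averaging point, here is a minimal sketch of a supersampling "resolve" step (a simple box filter over hypothetical grayscale values, not any particular driver's code): each output pixel averages only its own block of sub-samples, so nothing bleeds between neighbouring pixels, which is why this is downsampling rather than blurring.

[code]
def resolve_ssaa(samples, factor):
    # samples: 2D list of grayscale values rendered at factor x the output size.
    # Each output pixel is the mean of its own factor*factor block of sub-samples.
    h, w = len(samples) // factor, len(samples[0]) // factor
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            block = [samples[y * factor + j][x * factor + i]
                     for j in range(factor) for i in range(factor)]
            out[y][x] = sum(block) / len(block)
    return out
[/code]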

Not all eyes are created equal. You may lack the acuity to notice the aliased effect on a high quality monitor, but don't begrudge those to whom it is an obvious detractor. :)
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310

Lol, if you knew how many times I have been asked that, you would find it funny as well :). Anyhow, I have selected both regular and advanced view, re-installed 10-20 times (lol), read the help section, and called AMD and XFX, and still no gaming tab :( At this point it's almost funny, but it still pisses me off when I think about it. All XFX would tell me was "you don't really need that tab anyway"... ok, great, lol. AMD was only slightly more helpful, running down a list of C++, OpenGL, and .NET Framework files I should have. When I ran down that list and found all the required files, all AMD could say was to re-install Windows. Ok, great, but pre-built systems don't come with install disks :( (I'm using the copy of Windows that came on a HDD from my other pre-built system that I'm no longer using. I called Microsoft and activated it, and it's working fine other than this.)

I guess I just might not know enough about how OSes work, but I don't see how a program like CCC should need a fresh install of Win 7 to work correctly when I have the latest GPU drivers and all the other required files/drivers. It seems like if CCC is just a program, I should be able to do a fresh install with the new drivers and have it just work. I feel like I'm going crazy, lol :fou:
 

cburke82

Distinguished
Feb 22, 2011
1,126
0
19,310

I would like to know what setup you use where you can't see the jagged edges, lol? My screen is max 1080p and that's what I game at. If I turn off AA, I get jagged edges all over the place and it looks really bad. With AA at about 4x, like the article states, they are all but gone and the game looks great.
 

Fokissed

Distinguished
Feb 10, 2010
392
0
18,810
[citation][nom]Cleeve[/nom]1.5 times?[/citation]
Close. It's the square root of 2, ~1.41421 (trig...). And 2x SSAA would cause terrible quality: downsampling from such a resolution would require blending each upsampled pixel across multiple downsampled pixels.
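In the same plain-text notation quoted earlier in the thread, the per-axis factor for doubling the total sample count works out to:

2*(x*y) == (sqrt(2)*x)*(sqrt(2)*y), where sqrt(2) ≈ 1.41421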
 

haze4peace

Distinguished
Mar 3, 2009
119
0
18,680
I agree with AnUnusedUsername. I prefer to play without AA on. Sure, I can see the jaggies if I am really looking for them, but they seem to disappear as I play. Plus, it makes the image blurrier. What really bugs me is when I see tearing, so I try to always have Vsync enabled.
To each their own I guess.
 

cleeve

Illustrious


AA does not make the image 'blurrier'. Multisampling doesn't even touch 99% of the image; it only smooths the pixels that object edges partially cover, approximating the edge with increased fidelity.
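As a rough sketch of that idea (a simplified coverage-weighted resolve, not any vendor's actual implementation): a fully covered interior pixel keeps its single shaded colour, and only a pixel whose sub-sample positions straddle an edge gets blended.

[code]
def resolve_pixel(covered_samples, total_samples, fg_color, bg_color):
    # Blend the shaded colour by the fraction of sub-sample positions the
    # triangle covers; interior pixels (full coverage) come out unchanged.
    coverage = covered_samples / total_samples
    return tuple(coverage * f + (1 - coverage) * b
                 for f, b in zip(fg_color, bg_color))

print(resolve_pixel(4, 4, (0, 0, 0), (255, 255, 255)))  # interior pixel: pure black
print(resolve_pixel(2, 4, (0, 0, 0), (255, 255, 255)))  # edge pixel: 50% blend
[/code]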

As you say, to each their own. But 'blurrier' isn't an appropriate term to use.
 

army_ant7

Distinguished
May 31, 2009
629
0
18,980
@Cleeve

Thanks for the answers. About that 1.5 multiplier, I'm guessing that Fokissed's answer of the square root of 2 is more accurate. But now it's kind of confusing, because you end up with a fraction of a pixel if you take the square root of 2 (or even 8) and multiply it by one axis of the frame's resolution. So this just makes me think that maybe it's not exactly the square root of 2 or 8, but a more "stable" number like 1.5, or 2.5 or 3, respectively, which won't result in a fraction of a pixel when multiplied by one of the axes. Hahaha!
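For what it's worth, a non-integer factor doesn't have to leave a literal fraction of a pixel; a driver could simply round the internal render target. A hypothetical example (how real drivers actually pick the internal resolution isn't documented in the article, so treat this as illustration only):

[code]
import math

w, h = 1920, 1080                      # hypothetical native resolution
scale = math.sqrt(2)                   # per-axis factor for 2x total samples
rw, rh = round(w * scale), round(h * scale)
print(rw, rh)                          # 2715 1527 -- rounded internal target
[/code]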
 

mac_angel

Distinguished
Mar 12, 2008
661
136
19,160
Great article. I'm confused about something, and maybe I'm just a dumbass and missed it, but if I turn on 8x MSAA in a game and I have a GTX 570, is it using 8xQ, or 4x MSAA + 4x CSAA? You said 8xQ is better than 4x + 4x. I'm just wondering what I'm actually getting and whether I should be tweaking my driver settings.
 

Assmar

Distinguished
Sep 14, 2009
250
1
18,790
[citation][nom]warhammerkid[/nom]I used Nvidia Inspector on Mass Effect 1 and Mass Effect 2 to play the game with 2x2 SSAA (2x horizontal and 2x vertical), and it definitely works if you set it to force the game to use those settings. Forcing MSAA in those games didn't seem to work though, or if it did the effect wasn't noticeable enough to be worth it, not surprising considering some of the issues Unreal Engine games have with anti-aliasing. I didn't bother trying this with Batman because the built-in AA for the game looked fine.[/citation]
I see, I'm running an ATI card so I use forced settings in CCC for those games.
 

AnUnusedUsername

Distinguished
Sep 14, 2010
235
0
18,710
It's not so much that AA blurs the image; I agree that wasn't the best word to use. But it does "dull" the image. The best way to look at it is with something like a 1-pixel-wide diagonal line. With no AA, it's going to be a line of solid black pixels. With AA, it's going to be a wider collection of semi-black pixels. You may get a straighter line, but it isn't as "crisp". To get a basic idea of what I'm trying to say, look at the top of the fence in the very first comparison image in the article. Without AA, it's a brighter color, and with it, it's duller. The fence itself in that picture is also a good example: it's a much darker black without AA than it is with it. Again, to each his own. I don't like that effect, and as such don't like AA. Some people do, and I'm not trying to dissuade them.
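A toy illustration of that 1-pixel diagonal line (coverage-based anti-aliasing versus a hard 0/1 threshold; the numbers are ink coverage per pixel, and the coverage function is a crude stand-in, not a real rasterizer):

[code]
def line_coverage(px, py, width=1.0):
    # Crude coverage estimate for a thin line along y = x, based on the
    # perpendicular distance from the pixel centre to the line.
    dist = abs(py - px) / (2 ** 0.5)
    return max(0.0, min(1.0, width - dist))

size = 6
aa = [[round(line_coverage(x + 0.5, y + 0.5), 2) for x in range(size)]
      for y in range(size)]
aliased = [[1.0 if line_coverage(x + 0.5, y + 0.5) >= 0.5 else 0.0 for x in range(size)]
           for y in range(size)]
# `aliased` keeps a single row of solid pixels; `aa` spreads partial ink into
# the neighbouring pixels -- the "wider collection of semi-black pixels" above.
[/code]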
 

tmax

Distinguished
Aug 24, 2007
107
0
18,710
This article made me rethink my graphics card settings, so I did my own unofficial testing at home. Every game I played looked and performed better with "Use Application Settings" enabled instead of the settings tested and recommended in this article. Did you find the same thing? I run two 5870s in CrossFire mode on the latest drivers. Is setting AA to higher qualities in the game better than using the video card settings? My unofficial testing seems to suggest that. Will that be covered in part two?
 

Rescator

Distinguished
Apr 20, 2008
22
0
18,510
The screens with the best pixel density for desktops are, as someone here mentioned previously, the 21.5-inch 1920x1080 models.
The pixel density is getting rather nice there, even if you sit close.
And if your machine can handle it, then 2x or 4x AA at that resolution, on a screen that size, makes any jaggies almost vanish.

So, by extension, a 21.5-inch screen at 3840x2160 would only need 2x AA.
And a 21.5-inch screen at 7680x4320 would probably not need any AA at all.

Obviously, a 21.5-inch at 1920x1080 is 102.46 PPI (pixel pitch 0.248 mm),
a 21.5-inch at 3840x2160 is 204.92 PPI (pixel pitch 0.124 mm),
and a 21.5-inch at 7680x4320 is 409.84 PPI (pixel pitch 0.062 mm).

A 21.5-inch at 7680x4320 (~400 PPI) is probably a bit overkill,
as ~300 PPI is around the point where we can no longer distinguish individual "points".

A 21.5-inch at 5760x3240 is 307.38 PPI (pixel pitch 0.083 mm).

With DisplayPort, resolutions like this should not be an issue, as the cable/interface has the bandwidth to support them.

To check your current DPI setting, or to calculate one, I have a nice DPI calculator here: http://www.emsai.net/projects/widescreen/dpicalc/
If your system is set correctly, it will automatically show you the current DPI/PPI setting of your system.

If you want the best image possible, the size of the screen does not matter; the pixel density is the key here (along with using the native resolution of the screen).
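Those figures can be reproduced with the usual formulas (PPI = diagonal pixel count / diagonal inches; pixel pitch in mm = 25.4 / PPI), for example:

[code]
import math

def ppi_and_pitch(w, h, diag_in):
    ppi = math.hypot(w, h) / diag_in
    return ppi, 25.4 / ppi             # pixel pitch in millimetres

for w, h in [(1920, 1080), (3840, 2160), (5760, 3240), (7680, 4320)]:
    ppi, pitch = ppi_and_pitch(w, h, 21.5)
    print(f"{w}x{h} @ 21.5in: {ppi:.2f} PPI, pitch {pitch:.3f} mm")
[/code]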
 

cleeve

Illustrious


The aliased line might be 'crisper', but only because you're seeing crisp stair-step artifacts as opposed to the actual straight line that the PC is doing its best to approximate.

To me, the anti-aliased approximation of a straight line is far more appealing than stair-stepping artifacts resulting from an imperfect digital representation of a line.

But like you said, to each his own. :)
 

MagicPants

Distinguished
Jun 16, 2006
1,315
0
19,660
I wish someone would do 3x subpixel anti-aliasing, you know, the same thing they do with ClearType: split each pixel into its individual R, G, and B elements. I suppose the best would be 5X: one sample for each of the three subpixels, plus one each for the top and bottom.
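A hedged sketch of how that could look in principle (mirroring the ClearType idea the post mentions, not anything an actual driver exposes): sample coverage at three horizontal positions per pixel and route each result to one colour stripe. The `coverage` function here is a hypothetical placeholder.

[code]
def subpixel_sample(coverage, x, y):
    # Evaluate coverage at the centre of each of the three vertical colour
    # stripes of an LCD pixel, the way subpixel text rendering does.
    r = coverage(x + 1/6, y + 0.5)   # left third   -> red stripe
    g = coverage(x + 3/6, y + 0.5)   # middle third -> green stripe
    b = coverage(x + 5/6, y + 0.5)   # right third  -> blue stripe
    return (1 - r, 1 - g, 1 - b)     # black ink on a white background
[/code]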
 

husker

Distinguished
Oct 2, 2009
1,251
243
19,670
[citation][nom]AnUnusedUsername[/nom]To some like myself, the sharpness of the image is the most important aspect, so using something that both eats performance alive and reduces sharpness is lunacy.[/citation]
What you say seems to make sense, but then how do you account for the fact that a low-resolution (640x480) analog TV show still looks more realistic to our eyes than a high-resolution (1900x1200) video game? Clearly the video game image is sharper and less blurry. Color is the only advantage the TV has. Color, and the blending of color. Even black-and-white TV programs look more realistic to our eyes because the shades of gray are so perfectly blended and not overly "sharp".
 