Anti-Aliasing Analysis, Part 2: Performance


compton

Distinguished
This series is one of the best. The first article was most illuminating, and the second keeps it coming. Before the first article I was clueless about Nvidia's AA nomenclature. Now it makes much more sense, and I applaud Nvidia for not making the situation worse (though both Nvidia and AMD still need nomenclature help in other areas).

I'm not a huge gamer, and the games I do play mostly run great on my 2500K + GTX 460. I decided that if it's going to be a while before the next generation of GPUs drops, I'd get another 460. So that's what I did; it should be here in a few days. I was worried that even at 1920x1200 I'd have problems with AA and the lack of VRAM, but it's good to see that two 460s work pretty admirably.

As an aside, I'm totally on an efficiency kick, and I don't relish the thought of needing two cards to get decent performance, but the GTX 460 is still one of the most efficient cards around, well over a year after its release.
 

Zeh

Distinguished
What happened to Morphological AA? When the 6000 series was released, Morph AA showed an impressively low demand on hardware (about 2 or 3 fps lost), and now it's cutting frame rates in half?

Seriously, what is it?
 
Guest
This article is going to have me diving into my settings tonight; I've basically set my aged 5770 to run as poorly as possible, given that I game at 1920x1200 :/ Learn something new every day ;)
 

ojas

Distinguished
[citation][nom]Zeh[/nom]What happened to Morphological AA? When the 6000 series was released, Morph AA showed an impressively low demand on hardware (about 2 or 3 fps lost), and now it's cutting frame rates in half?[/citation]

Was thinking the same thing... part 1 and part 2 are contradicting each other here, if I'm remembering part 1 correctly.

BTW, there's a typo at the start of page 2: "This is because the GT 420 is not DirectX 11-capable."
 

cleeve

Illustrious
[citation][nom]Zeh[/nom]What happened to Morphological AA? When the 6000 series was released, Morph AA showed an impressively low demand on hardware (about 2 or 3 fps lost), and now it's cutting frame rates in half? Seriously, what is it?[/citation]

On release, we tested StarCraft II because that was a game that choked with MSAA on Radeons. It turns out that game is severely CPU-limited, so it wasn't the best test subject for Morphological AA.
 
MauveCloud

I don't like these animated GIFs for comparing anti-aliasing modes, because: 1) GIFs are limited to 256 colors, and 2) moving around in a game will affect how noticeable the differences in quality between different anti-aliasing modes are (so will the physical size of the pixels, but that would probably be impractical to represent when viewed on other monitors). Would it be possible to get some animations that show anti-aliasing modes side by side (or half and half) while moving around in some of these games, instead of just fixed-position images that cycle between anti-aliasing modes?
 

cleeve

Illustrious
[citation][nom]MauveCloud[/nom]I don't like these animated gifs for comparing anti-aliasing modes, because 1. gifs are limited to 256 colors, 2. moving around in a game will affect how noticeable the differences in quality between different anti-aliasing modes are. [/citation]

As for #2, there are no worries there: the Half-Life 2 engine in Lost Coast, which we used for the majority of the comparison shots, doesn't move the camera during idle times. We used a save game and reloaded the scene at exactly the same position, so it's not an issue here.

As for your first concern, I was worried about that too, at first. But I carefully scrutinize the uncompressed TIFF files before exporting them to GIF, and in these cases there's no practical difference; the GIF does an excellent job of demonstrating the result with different AA modes.
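If you want to sanity-check that yourself, here's a minimal sketch of that kind of comparison using Pillow and NumPy (the file name is hypothetical); it quantizes a TIFF down to the GIF's 256-color palette and reports the per-channel error:

```python
# Sketch: measure how much detail a 256-color GIF palette loses
# versus the uncompressed TIFF. Requires Pillow and NumPy; the
# file name below is made up for illustration.
import numpy as np
from PIL import Image

original = Image.open("lost_coast_8xaa.tif").convert("RGB")
# Quantize to 256 colors, as the GIF format requires.
palettized = original.quantize(colors=256).convert("RGB")

diff = np.abs(np.asarray(original, dtype=np.int16) -
              np.asarray(palettized, dtype=np.int16))
print("max per-channel error:", diff.max())
print("mean per-channel error:", round(float(diff.mean()), 3))
```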
 
wolfram23

Very interesting article! Although I'm a tad confused by your nomenclature for the Radeon AA settings. There's MSAA, adaptive MSAA, and SSAA, and within those you can choose box, narrow tent, wide tent, and edge-detect filters (edge-detect being the only one, AFAIK, that increases demand), and then on top of that you can enable Morphological AA. So, I'm not sure what "EQ" means, as it's not a term used by Radeon (or at least CCC).

Also, as the first poster said, why is Morphological AA so demanding all of a sudden? When I first tried using it, I barely saw an impact on performance, and in a couple of games it made everything look blurry. I just tried enabling it in Skyrim (a game that really needs better AA) and my performance plummeted, which these results confirm. What changed?
 

cleeve

Illustrious
[citation][nom]wolfram23[/nom] So, I'm not sure what "EQ" means as it is not at all a term used by Radeon (or at least CCC).[/citation]

As it says in the article, EQAA is exclusive to the Radeon HD 6900 series. You probably don't have a 6900-series card.

[citation][nom]wolfram23[/nom]Also, as the first poster said, why is morphological so demanding all of a sudden? [/citation]

The answer is five posts above this comment. :) It depends on the game; you may have been testing with a CPU-bottlenecked title.
 
Guest
For a DX10 card, I wish they would have thrown in one that a lot of us probably have/had, like a GTX 260.
 

mt2e

Distinguished
I must say, the animated GIFs are pro status... I want to see one giant GIF with all modes compared, going gradually from no AA to max AA. I know there are tons of modes, but that's the gist of the idea.
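Something like this Pillow sketch (the screenshot file names are hypothetical) could stitch one together:

```python
# Sketch: build one animated GIF that steps from no AA to max AA.
# Requires Pillow; the screenshot file names are hypothetical and
# must all be captured at the exact same camera position.
from PIL import Image

shots = ["no_aa.png", "2x_msaa.png", "4x_msaa.png", "8x_msaa.png"]
# GIF is limited to 256 colors, so quantize each frame first.
frames = [Image.open(f).convert("RGB").quantize(colors=256) for f in shots]
frames[0].save("aa_ladder.gif", save_all=True,
               append_images=frames[1:], duration=1500, loop=0)
```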
 

yyk71200

Distinguished
Also, if you download Nvidia Inspector, you can use Sparse Grid Supersampling Transparency AA (it doesn't work in all games). It looks better than regular Transparency SSAA but is even more demanding. It is not available in the normal control panel.
 
MauveCloud

[citation][nom]Cleeve[/nom]As for #2, there are no worries there: the Half-Life 2 engine in Lost Coast, which we used for the majority of the comparison shots, doesn't move the camera during idle times. We used a save game and reloaded the scene at exactly the same position, so it's not an issue here.[/citation]

You misunderstand my concern here. I'm not talking about whether the screenshots are in the same position. I'm talking about users actually moving around in the game (it's not common to stay in one place for long when actually playing any of these games, right?) and whether the qualitative differences between anti-aliasing modes are still as noticeable while moving.
 

cleeve

Illustrious
[citation][nom]MauveCloud[/nom] whether the qualitative differences between antialiasing modes are still as noticeable while moving.[/citation]

Ah. Well, for that you should try it out. In my experience the stills represent what happens in-game quite well, except when it comes to post-processing effects like Morphological AA and FXAA. Those tend to crawl because the filter re-assesses each frame independently; it's not based on the same geometric data as MSAA.
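To illustrate why that crawling happens, here's a toy NumPy sketch of a post-process pass (not the real FXAA or MLAA shader): it sees nothing but the finished frame's colors, so two frames of the same scene that rasterize even slightly differently get filtered differently.

```python
import numpy as np

def post_process_aa(frame):
    """Toy post-process AA pass: blur only where luma contrast is
    high. It operates on the finished frame alone, with no geometry
    info, so every frame is re-filtered from scratch."""
    # Per-pixel luma from RGB; frame is an H x W x 3 float array in [0, 1].
    luma = frame @ np.array([0.299, 0.587, 0.114])
    # Mark pixels whose horizontal or vertical luma contrast is high.
    edges = (np.abs(np.diff(luma, axis=0, prepend=luma[:1])) +
             np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))) > 0.1
    # 4-neighbor average as a crude stand-in for a directional blur.
    avg = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
           np.roll(frame, 1, 1) + np.roll(frame, -1, 1)) / 4.0
    out = frame.copy()
    out[edges] = avg[edges]
    return out
```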
 

cleeve

Illustrious
[citation][nom]DXRick[/nom]Interesting, but the 450px × 448px images are too small for me to see the differences well.[/citation]

It's 1 to 1, so if you can't see a difference there, you won't see it in game.
 


wolfram23

Thanks for replying!

You're right, I'm not on a 6900-series card, so I guess that takes care of that :)

The recent game I just tried Morphological AA on was Skyrim. I've read that it's CPU-intensive, and yet my i5-750 at 4 GHz is barely being utilized, 50% at best. People have said it's a CPU-intensive game, yet plenty of games use much more of my CPU, like BF3 and Crysis 2, among others. At the same time, I've heard that it really only uses two threads, in which case I suppose it would make sense that I see 50% usage on a quad-core. I don't know if that's true or not.
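The back-of-the-envelope math at least lines up with that theory:

```python
# Rough sanity check: two saturated threads on a four-core i5-750
# (no Hyper-Threading) would show up as about 50% total CPU usage.
threads_used = 2
physical_cores = 4
print(f"expected utilization: {threads_used / physical_cores:.0%}")  # 50%
```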
 

cleeve

Illustrious
[citation][nom]wolfram23[/nom]I've read that it's CPU-intensive, and yet my i5-750 at 4 GHz is barely being utilized, 50% at best. People have said it's a CPU-intensive game[/citation]

Well, Skyrim might be better described as CPU-*dependent*, not CPU-intensive. It doesn't appear to use more than two threads; at least it didn't in our Skyrim performance analysis. Having said that, frame rate was very dependent on the CPU at the Ultra setting, but that dependence was greatly reduced at High or lower settings.

It's the Ultra setting that kills Skyrim and really shifts the bottleneck to the CPU.
 