Anti-aliasing is not removing some jaggies

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

radeonninjaxt

Distinguished
Apr 23, 2006
Misrach, I am willing to bet that the issue we are facing is just due to immature drivers for our X1900XTs; give it a couple more months and we should have better driver support for our 3-month-old cards. 8) I can also imagine that performance will get much better with optimized drivers. I am just totally amazed that I can run Oblivion on all high settings at 1280x1024 with 4xAA and HDR and average 45+ fps outside. :)
 

Misrach

Distinguished
Mar 25, 2006
Thanks for the low-down on AA in GRAW. I thought that I was going crazy again. It's a good-looking game, but without AA there's jaggies all over the place. It almost looks like it's running in 768, not 1200 (I'm playing it in 1200x1920 but I may drop down to 1050x1680 if the framerate begins to slide).

Actually, I picked up GRAW in the hopes that the PhysX card would be worth buying. From what I read on AnandTech, right now it's not worth the dough -- maybe later, but not now.

Yes, I do find it interesting how great Oblivion looks with HDR and 4xAA. Just goes to show you how far a little development can take a game. I run it at 1200x1920, though sometimes I'll toss back to 1050x1680. Granted, I finished the main quest, so I haven't played it much in the last 2-3 weeks, but it's always nice to look at -- especially the Imperial City. At 1200+, my framerates dip outside to 25-45 fps, but I can live with that.

While I'm not partial to either ATi or nVidia, Oblivion alone has convinced me that (in my case) I made the right choice in my X1900XT. Let's hope that GRAW can follow suit with the next ATi driver update. And let's also hope that we don't have to keep waiting for patches or updates to play the new titles with HDR and AA.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
I've sent in a request to ATi. If I hear back, I'll let you know. But I'm glad someone else has had this issue -- I thought I was just being too picky (which might still be the case...).

To me, it's almost as if the textures can't get small enough and tend to interpolate with nearby textures of similar size. If you look closely, it's not really a "jaggy" but 2 different textures that are vying for the same spot. The image fluctuates between the two but does not always maintain a solid line.

OK, that is a Z-depth precision issue, and I got it on all my ATI cards.
When you get closer to the object it stops, right?
It occurs because W-buffers are no longer used; learn to deal with it.

Read up on how Hyper Z I, II, III, III+, etc work, and you'll get a better understanding of why it happens, but you won't be able to fix it.
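The depth-precision point above can be sketched numerically (a minimal, made-up illustration, not specific to any ATI card or to HyperZ): a standard perspective Z-buffer stores a non-linear function of eye-space depth, so most of its representable values are spent close to the camera. Two distant surfaces can then quantize to the same stored depth and flicker between one another (z-fighting), while a linear W-buffer would still tell them apart.

```python
# Minimal sketch of why distant surfaces z-fight with a non-linear
# Z-buffer but not with a linear W-buffer. The near/far planes and
# 16-bit quantization here are assumptions for illustration only.

NEAR, FAR = 1.0, 10_000.0
BITS = 16
LEVELS = 2 ** BITS - 1

def z_buffer_value(z):
    """Perspective depth in [0, 1]: non-linear in eye-space z."""
    return (FAR / (FAR - NEAR)) * (1.0 - NEAR / z)

def w_buffer_value(z):
    """Linear depth in [0, 1]."""
    return (z - NEAR) / (FAR - NEAR)

def quantize(d):
    """Snap a [0, 1] depth to the nearest representable 16-bit level."""
    return round(d * LEVELS)

# Two surfaces 1 unit apart: once near the camera, once far away.
for z1, z2 in ((10.0, 11.0), (9000.0, 9001.0)):
    z_ok = quantize(z_buffer_value(z1)) != quantize(z_buffer_value(z2))
    w_ok = quantize(w_buffer_value(z1)) != quantize(w_buffer_value(z2))
    print(f"surfaces at {z1}/{z2}: Z-buffer separates={z_ok}, "
          f"W-buffer separates={w_ok}")
```

Run it and the far pair collapses to the same Z-buffer value (so the rasterizer flickers between the two surfaces) while the W-buffer still separates them, which matches the "it stops when you get closer" behaviour described above.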
 

michaelahess

Distinguished
Jan 30, 2006
He's just listing them wrong; a 24" widescreen LCD has a native res of 1920x1200. Or maybe he's got it rotated to vertical 8O

I run Oblivion at 1600x1200 on my 22" NEC CRT, maxed out, on an X1800 XT 512MB, and average around 40 fps outside. No jaggies at all, just the striped shadows :D
 

Flakes

Distinguished
Dec 30, 2005
OK, I gotta point out that the image jaggies are coming from grates and things, and there's one technology none of you guys have mentioned, and yes, ATI doesn't have it but Nvidia does.

Transparency Antialiasing - should be set to Supersampling.

This helps smooth out textures like the ones you guys have mentioned; when you switched to ATI you lost this, 'cause ATI doesn't support it.
 

TabrisDarkPeace

Distinguished
Jan 11, 2006
Errrrr, ATI have supported that for ages, since the X800 series, dude.

They just call it High Quality Anti-Aliasing, and CCC won't show it unless it detects an X1800 or X1900 series card. (But other tools have been able to enable it on X800 cards for... years.)
 

Flakes

Distinguished
Dec 30, 2005
Errrrr, ATI have supported that for ages, since the X800 series, dude.

They just call it High Quality Anti-Aliasing, and CCC won't show it unless it detects an X1800 or X1900 series card. (But other tools have been able to enable it on X800 cards for... years.)

lol, never mind then. I thought all the review sites had said, at the time of release of the 7800, that ATI didn't have it and it was a plus for Nvidia.
 

Misrach

Distinguished
Mar 25, 2006
OK, that is a Z-depth precision issue, and I got it on all my ATI cards.
When you get closer to the object it stops, right?
It occurs because W-buffers are no longer used; learn to deal with it.

Read up on how Hyper Z I, II, III, III+, etc work, and you'll get a better understanding of why it happens, but you won't be able to fix it.

Thanks for your explanation (insert big sigh of relief here). While I would have loved a solution, I can live with the effect. And yes, when I move closer to an object, the jaggies/braids stop; you've described the problem to a tee.

This would also explain why I was able to see the same effect on my Mobile Radeon GPU.

I know you've probably already said this, but what exactly are your settings in CCC? Have you tried using temporal AA or adaptive AA? Have you also tried forcing everything through the CCC? The only game I have ever looked for the differences in is HL2; AF and AA made a lot of difference. I only played a minute or 2 of F.E.A.R. as I hated the ladder climbing, but I'm pretty sure it looked OK to me. I might be wrong there, though.

I've tried switching every setting in CCC -- even the Avivo ones that cover deinterlacing, just to make sure. Nothing works, but at least now I know that my card is not defective. I really didn't want to try to RMA the card.

I've found that Doom 3 is actually worse than F.E.A.R. (I can't speak for GRAW right now, since the game doesn't even support AA for some silly reason). Early on in the game, when the Mars station is still intact, you can see a lot of very sharp, shiny metal edges. Those will often turn jagged at medium to far distances.

I can see the effect even in HL2, but it's much more subdued. The worst was in the beginning, on the subway: the handrails "shimmered" as the camera moved, something I did not see on my 7800GTX. Aside from that, I could barely tell the difference between the cards -- except that the X1900XT ran the game at faster framerates, of course.

And I have to apologize about switching my rez numbers. I'm just used to the old printers' convention of height x width. I can read the numbers as width x height and still switch them around in my head (without getting confused). But just in case that wasn't what radeonninjaxt was asking about, my monitor is 16:10 widescreen, so instead of 1200x... er, 1600x1200, I run at 1920x1200.
 

havenplaygame

Prominent
Oct 25, 2017
Bit of a stale thread, but a very clear description which shows up on Google. It's also cropping up again recently, because it will always rear its ugly head when fine polygonal detail outstrips monitor resolutions. So, for anyone discovering this, what you need to solve this is downsampling. The best lead I can give you on this topic is this: https://www.geforce.com/whats-new/articles/dynamic-super-resolution-instantly-improves-your-games-with-4k-quality-graphics

You might also be able to get results by 'cheating' with FSSGSSAA. See this guide: http://www.overclock.net/t/1250100/nvidia-sparse-grid-supersampling
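Downsampling is just supersampling done at the driver level: the game renders at a higher resolution than the monitor, and the result is filtered back down, averaging sub-pixel detail so thin geometry contributes partial coverage instead of flickering on and off. A minimal sketch of the idea (plain 2x2 box filter on a toy grayscale "render"; real drivers use fancier filters):

```python
# Minimal sketch of downsampling: render at 2x per axis, then box-filter
# each 2x2 block down to one output pixel. A one-pixel-wide bright line
# in the high-res render becomes a softened half-intensity line instead
# of a harsh on/off edge.

def downsample_2x(image):
    """Average each 2x2 block of a grayscale image (list of rows)."""
    h, w = len(image), len(image[0])
    return [
        [
            (image[y][x] + image[y][x + 1]
             + image[y + 1][x] + image[y + 1][x + 1]) / 4.0
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# 4x4 "high-res render": a vertical 1-pixel white line on black.
hi_res = [
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
]

lo_res = downsample_2x(hi_res)
print(lo_res)  # -> [[0.5, 0.0], [0.5, 0.0]]
```

The covered output pixels come out at 0.5 rather than a hard 0-or-1 edge, which is exactly the smoothing that kills the kind of grate/railing shimmer described earlier in the thread.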