OK, that is a Z depth precision issue, and I got it on all my ATI cards.
When you get closer to the object it stops, right?
Since W buffers are no longer used, this occurs; learn to deal with it.
Read up on how Hyper Z I, II, III, III+, etc. work, and you'll get a better understanding of why it happens, but you won't be able to fix it.
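The shimmering at distance follows from how a perspective Z buffer distributes its precision: the mapping from eye-space distance to stored depth is non-linear, so most of the representable values cluster near the near plane, while a W buffer stores depth linearly. A minimal sketch (my own illustration, not from this thread; near/far planes and bit depth are arbitrary assumptions) showing two surfaces one unit apart collapsing to the same quantized Z value far away but not up close:

```python
# Illustration only: Z-buffer vs W-buffer precision at distance.
# Assumes a standard perspective projection with near plane n and far plane f.

def z_buffer_depth(z, n=1.0, f=10000.0):
    """Non-linear Z-buffer value in [0, 1] for eye-space distance z."""
    return (f / (f - n)) * (1.0 - n / z)

def w_buffer_depth(z, n=1.0, f=10000.0):
    """Linear W-buffer value in [0, 1] for the same distance."""
    return (z - n) / (f - n)

def quantize(d, bits=16):
    """Round to the nearest representable value of an integer depth buffer."""
    return round(d * ((1 << bits) - 1))

# Two surfaces 1 unit apart, near the camera and far from it:
for a, b in ((5.0, 6.0), (5000.0, 5001.0)):
    zq = (quantize(z_buffer_depth(a)), quantize(z_buffer_depth(b)))
    wq = (quantize(w_buffer_depth(a)), quantize(w_buffer_depth(b)))
    print(f"z = {a} vs {b}: Z-buffer {zq}, W-buffer {wq}")
```

With these numbers, the far pair lands on the same 16-bit Z value (so the rasterizer can't tell which surface is in front, hence the flickering edges at range), while the W buffer still separates them; moving closer pulls the pair back into the well-resolved part of the Z curve, which matches the "it stops when you get close" symptom.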
Thanks for your explanation (insert big sigh of relief here). While I would have loved a solution, I can live with the effect. And yes, when I move closer to an object, the jaggies/braids stop; you've described the problem to a tee.
This would also explain why I was able to see the same effect on my Mobility Radeon GPU.
I know you've probably already said this, but what exactly are your settings in CCC? Have you tried using temporal AA or adaptive AA? Have you also tried forcing everything through the CCC? The only game I have ever looked for the differences in is HL2; AF and AA made a lot of difference. I only played a minute or two of F.E.A.R., as I hated the ladder climbing, but I'm pretty sure it looked OK to me. I might be wrong there, though.
I've tried switching every setting in CCC -- even the Avivo ones that cover deinterlacing, just to make sure. Nothing works, but at least now I know that my card is not defective. I really didn't want to try to RMA the card.
I've found that Doom 3 is actually worse than F.E.A.R. (I can't speak for GRAW right now, since the game doesn't even support AA for some silly reason). Early on in the game, when the Mars station is still intact, you can see a lot of very sharp, shiny metal edges. Those will often turn jagged at medium to far distances.
I can see the effect even in HL2, but it's much more subdued. The worst was in the beginning, on the subway: the handrails "shimmered" as the camera moved, something I did not see on my 7800GTX. Aside from that, I could barely tell the difference between the cards -- except that the X1900XT ran the game at faster framerates, of course.
And I have to apologize about switching my rez numbers. I'm just used to the old printers' convention of height x width. I can read the numbers as width x height and still switch them around in my head (without getting confused). But just in case that wasn't what radeonninjaxt was asking about, my monitor is 16:10 widescreen, so instead of 1200x... er, 1600x1200, I run at 1920x1200.