DirectX 10 Shootout: Nvidia vs. ATI

I suspect that you're only understanding part of my intended statement. Yes, I realize that all the DX10 cards are new designs compared to the former DX9 cards. But both the Nvidia 8800 series and the ATI 2900 series had their designs started before Vista was actually finalised and released. Both have had their individual problems, possibly because the released version of Vista was different from what it was while the cards were in development. Some of it is surely software/drivers. But I think the hardware will have to undergo more design changes before it will work well with Vista. Whether that means faster memory, faster GPUs, a combination of the two or something else, I don't know. But I think it's fairly obvious that the designs presently in use are not fully up to the task of Vista. I'm not an engineer, so I can't tell what or where the problem lies; I can only tell that a problem exists.
 
There's no bias in this article, you freakin' conspiracy thugs. For those complaining about the $700 card: don't buy it, it's not meant for you anyway. It doesn't matter how many ATI cards are on the table; the score of the HD 2900 is relative to how the other 2900s will perform, so what's all the mumbo jumbo about? The article is about the DX10 winner, so don't post crap about DX9 cards; they're irrelevant. Clearly Nvidia is ahead of the game right now.

ATI should've challenged the Ultra, since a king-of-the-hill scenario would benefit us all (more price depreciation for the non-enthusiasts, more updates to the architecture).
 

That was pretty much what I was thinking, but then, if we're counting the 320MB and 640MB GTS as different cards, ATI currently has 13 products available, allowing for the GDDR4 cards being different.
Mactronix
 
OK, I am not saying the results here are totally wrong. However, I have to stand up for my current card. I am using an MSI HD 2600 XT Diamond with 256MB of GDDR4, overclocked. I am playing BioShock at 1280x1024 with 4xAA on high settings; gameplay is smooth and it does not crash. I also played Lost Planet and averaged 23-25fps, with some scenes as high as 36fps, at 1280x1024 with 4xAA and mostly high settings; I think shadows was the only one set to low. I bought this card to upgrade from my 7600GS so I could play World in Conflict in DX10 during its beta, and statistics-wise it would have been an improvement.

I am in no way saying Nvidia isn't the better card; I just wanted to make the statement. Of course, I've decided to go with Windows XP from here on, because Vista can bite my balls.
 


Agreed, and I'm still waiting for somebody to figure out a way to hack DX10 into XP and XP64, but at the moment I doubt that's going to happen. M$ needed a reason to get people to give up XP, and DX10 is it. Without DX10 and its binding to Vista, how many people would actually buy Vista? There's no reason that I know of, except for DX10.
 
I agree that this review shows nVidia favoritism in general. I would've liked to see this:

8800 GTX stock
8800 GTX OC'd
2900 XT stock
2900 XT OC'd
8800 GTS 640MB stock
8800 GTS 640MB OC'd
2900 Pro 1GB stock
2900 Pro 1GB OC'd to XT speeds (hopefully)
8800 GTS 320MB stock
8800 GTS 320MB OC'd
2900 Pro 512MB stock
2900 Pro 512MB OC'd

That would've shown six nVidia cards and six ATI cards. Then the review would've been fair to both sides, and one wouldn't have to read between the lines and make exceptions. I seriously don't like weak and incomplete reviews. I understand that most sites use the cards that have already been donated, but at least try to represent most of the normal possibilities. Don't put up 20 cards for one team and 3 for the other; that doesn't play fair in my game. I don't care which side is favored, because either way the consumer can't make an informed decision about the product. I'm not an expert by any means, but I'm not stupid either. If it smells like a rat and looks like a rat, then I can deduce what it is.
 
I've been reading on another forum, which shall remain nameless (atomicpc) (oops), that the 2900XT performs almost as fast as the 8800GTX as a single card, and much faster with two in Crossfire than two 8800GTXs in SLI. I don't see the benchmarks confirming it either way on Futuremark, since there's no way of knowing which cards are modded and which aren't. If the Radeons are indeed faster in Crossfire, it would be worth knowing.

Can anyone confirm which is faster: Crossfire with two 2900XTs or SLI with two 8800GTXs?
 
There are several things to keep in mind here.

1) ATI never pitted their card against the 8800GTX, so anyone expecting it to be beaten by ATI's offering was sure to be disappointed. That said, let's focus on the 8800GTS when talking about ATI cards. Obviously, overclocked cards are not in line with ATI's statement either.

2) At the highest resolution in several of the games, the 2900XT actually beat the standard 8800GTS offerings in BioShock and Company of Heroes, and fell between them in Lost Planet. The only major loss, and it was severe, was World in Conflict. I suspect the World in Conflict score will come closer to parity with driver updates or patches.

3) The 2900XTs with GDDR4 were left out of the comparison. Given that ATI's architecture takes great pains to make sure that internal memory bandwidth doesn't slow the card down, wouldn't it make sense to get as much external memory bandwidth as possible?
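
For a rough sense of what GDDR4 would buy, peak bandwidth is just bus width times effective data rate. Here's a quick back-of-envelope sketch; the clock figures are assumed from memory of published HD 2900 specs, not taken from this article:

```python
# Back-of-envelope memory bandwidth estimate for a 512-bit card.
# Clock figures are assumed from published HD 2900 specs, from
# memory; they are not numbers from this review.

def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    """Peak bandwidth = bus width in bytes * effective transfer rate."""
    return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

gddr3 = bandwidth_gb_s(512, 1650)  # 2900XT GDDR3: ~825 MHz, 1650 MT/s
gddr4 = bandwidth_gb_s(512, 2000)  # GDDR4 version: ~1000 MHz, 2000 MT/s

print(f"GDDR3: {gddr3:.0f} GB/s, GDDR4: {gddr4:.0f} GB/s, "
      f"a {(gddr4 / gddr3 - 1) * 100:.0f}% increase")
# -> GDDR3: 106 GB/s, GDDR4: 128 GB/s, a 21% increase
```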

ATI's statement about their card not being pushed wasn't just fluff, at least as far as the 2900XT is concerned. It doesn't start out with ultra-high frame rates, but the card seems to scale with resolution far better than the 8800GTS in most of the games tested. Given more memory bandwidth, I wonder if the lower-resolution frame rates would go up too.

That said, this reminds me of the P4 vs. Athlon64 comparison. The P4 could only compete when given copious amounts of bandwidth, whereas the Athlon64 could still do well with much less. For that reason the Athlon64 was the superior architecture.

For similar reasons, until copious amounts of bandwidth are standard on ATI cards, nVidia's will be the better architecture. And even if ATI gets the bandwidth, and thus the better-architecture status, it won't mean much if they can't outperform nVidia's cards. Performance and price get the sales.

Someone above said that AMD was the worst thing that ever happened to ATI. However, this design was at least five years in the making, well before AMD came into the picture. Product delays may have been a result of the merger, but, AMD or not, this product would not have performed better.

I'm interested to see what ATI offers in the way of a product line update. Minor fixes in this case may be enough to make their cards more desirable.
 
Sailer,
I suspect that the simplest solution is the brute-force approach: a die shrink, more processing elements stuffed onto the chip, clocked faster. I really don't see either nVidia or ATI going to a brand-new architecture this quickly.
 
You could well be right about that. Starting a new architecture takes a lot of time and money. ATI doesn't have that much money, and Nvidia has no need to change unless ATI forces them to. But brute force will cost us in hotter-running cards that use ever more power. There's just no easy answer. If there were, someone would have done it already.
 
Did anyone see the link that Extrapolation290 posted a few messages ago? It appears that the ATI Catalyst 7.10 drivers dramatically improve the performance of the 2900 series cards. If the ATI HD 2900 cards are getting up to 30% improvement nearly across the board with this new driver release, they have got to be matching or beating the 8800GTX in some benchmarks.
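
That claim is easy to sanity-check against any of the charts: scale the old 2900 score by the claimed gain and compare. A sketch with hypothetical frame rates (placeholders for illustration, not numbers from the article or the release notes):

```python
# Sanity check of the "up to 30% improvement" claim. The frame
# rates below are hypothetical placeholders, for illustration only.

def improved(old_fps: float, gain_pct: float) -> float:
    """Scale a benchmark score by a claimed percentage gain."""
    return old_fps * (1 + gain_pct / 100)

hd2900_old = 40.0  # hypothetical 2900XT score on the old driver
gtx = 50.0         # hypothetical 8800GTX score in the same test

print(f"2900XT: {hd2900_old:.0f} -> {improved(hd2900_old, 30):.0f} fps "
      f"vs GTX at {gtx:.0f} fps")
# -> 2900XT: 40 -> 52 fps vs GTX at 50 fps
# A full 30% gain closes any gap under 30%; whether it actually
# does depends on the per-game deltas.
```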

I just downloaded the drivers today. Have not had a chance to play yet.

====================================================


Call of Juarez: DirectX 10 Crossfire performance improves up to 42% and single card performance improves up to 34% on all ATI Radeon HD 2000 series products.

Company of Heroes: DirectX 10 Crossfire performance improves up to 80% on all ATI Radeon HD 2000 series products, and single card performance improves as much as 31% on ATI Radeon HD 2900 and ATI Radeon HD 2600 products.

Enemy Territory: single card performance improves as much as 23% on both ATI Radeon HD 2000 and ATI Radeon X1000 series products.

FEAR: Crossfire performance improves as much as 16% on the ATI Radeon X1950 XTX, ATI Radeon X1650 XT, ATI Radeon HD 2400 series and ATI Radeon X1300/X1550 series.

Supreme Commander: Crossfire performance improves up to 30% on all ATI Radeon HD 2000 and ATI Radeon X1000 series products. The ATI Radeon X1650 and ATI Radeon X1300/X1550 series products see even greater improvements of 82% or more.
 
Well, the "educated" among us can fill in the blanks regarding other versions of the cards used, and know enough to read other reviews... it's just a bit of a shame that this article is pretty incomplete and doesn't offer an accurate conclusion. Some may actually even believe the results...

Unless I missed something in my late-night weakened state, WIC is clearly broken with the current ATI driver, and we should have seen an alternate chart or two showing performance that didn't include those results... or at least an acknowledgment of that. If ATI fixes the issue, which I expect they will, I see the 2900XT beating the 8800GTS overall, especially if you take out the virtually pointless 1024x768 resolution.

I call ATI the champ for that position, and I think the Pro will do well too. Shame the author was...ummm...blind to it.

Still, it's amazing to see how the GTX/Ultra kick everything in sight. Seriously monster performance compared to the rest of the lot.

Antix
 
Can't speak for anyone else, but I have seen that link. The recorded improvements on the second page with CoH and Call of Juarez were between 1 and 2.5 fps, averaging about 13% with AA turned off and reaching a high of 19% with AA turned on. While that is a boost, it is not the 34-42% that was promised. It helps, and it would have affected the test results, but not to a great extent.
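
For anyone redoing that arithmetic: the percentage is just (new - old) / old, and working backwards from the quoted delta and percentage shows how low the baseline frame rates must have been. My arithmetic; only the 2.5 fps and 19% inputs come from the figures above:

```python
# Work backwards from the quoted delta and percentage to the
# implied baseline frame rate, then verify the percentage.
# Only the 2.5 fps / 19% inputs come from the post above.

def pct_gain(old_fps: float, new_fps: float) -> float:
    return (new_fps - old_fps) / old_fps * 100

def implied_baseline(delta_fps: float, gain_pct: float) -> float:
    # delta = old * (gain / 100)  =>  old = delta / (gain / 100)
    return delta_fps / (gain_pct / 100)

old = implied_baseline(2.5, 19)
print(f"Implied baseline: {old:.1f} fps; "
      f"check: {pct_gain(old, old + 2.5):.0f}% gain")
# -> Implied baseline: 13.2 fps; check: 19% gain
# A big percentage on a nearly unplayable frame rate.
```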
 
I still want to know if they patched WIC before doing the benchies. As for the Cat 7.10s, why did they improve X1950 XT performance but not X1950 Pro performance? I am sick of hardware manufacturers not keeping proper support for hardware that is only a year or so old.
 
I don't get what all the fuss is about. Had it not been for all the BS from the ATI fanboy/propaganda machine before the release, far fewer of you would be so surprised.

What is a little surprising is that M$ is in bed with ATI and the card still has issues with DX10. Just like with HD DVD, they prop ATI up, yet it doesn't seem to be working for them in either case.
 
Late last year (or early this year, I can't find the link), around the time the G80 was being introduced, one of nVidia's senior executives, when asked about die sizes, replied that nVidia is prepared to use as much silicon as necessary for performance.

And with the highly parallelized architecture of current GPUs, it's a relatively simple matter of adding more stream processors and a wider data path. Even with a die shrink, power consumption and heat will go up.
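
As a toy model of that brute-force scaling: throughput grows roughly linearly with unit count and clock, while dynamic power scales with unit count, clock, and voltage squared, so a shrink only partly offsets doubling the units. Every constant below is illustrative, not a real G80 or R600 figure:

```python
# Toy model of brute-force GPU scaling. Every constant below is
# illustrative; none are real G80/R600 numbers.

def throughput(units: int, clock_ghz: float) -> float:
    return units * clock_ghz  # arbitrary relative units

def power_watts(units: int, clock_ghz: float, volts: float,
                per_unit_factor: float) -> float:
    # Dynamic power ~ units * frequency * V^2 * per-unit capacitance
    return units * clock_ghz * volts ** 2 * per_unit_factor

base   = power_watts(128, 1.35, 1.2, 0.5)        # original process
shrunk = power_watts(256, 1.35, 1.1, 0.5 * 0.7)  # shrink: 2x units,
                                                 # lower voltage, ~30%
                                                 # less cap per unit

speedup = throughput(256, 1.35) / throughput(128, 1.35)
print(f"{speedup:.1f}x throughput at {shrunk / base:.2f}x the power")
# -> 2.0x throughput at 1.18x the power: faster, but hotter.
```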

How many of us would have considered buying a dual-slot video card two years ago? Not me.
 


We don't know whether overclocked ATI cards were donated or not, but THG could very easily overclock any ATI card they have. Comparing the highest-clocked and even overclocked Nvidia cards to a lower-clocked 2900XT, with no overclocking at all, is weird and suggests bias.

Also, where is the 2900 Pro? The 2900 XT with GDDR4? Couldn't THG get them if they wanted to?

I do have one of the tested cards, the MSI 8800 GTS 320MB OC, and this test only proves again that it's a sweet card. Still, I do plan to upgrade to a second-refresh DX10 card for acceptable performance at 1920x1200. Not sure if it will be nVidia or ATI; the R670 looks nice, but let's wait and see which card will offer the best price/performance ratio.
 


MS may well be in bed with ATI, but didn't they (MS) do Nvidia a great favour and kick ATI where it hurts when they dropped the virtualization requirement from DX10 because Nvidia was having trouble with it?
Mactronix
 


That's right, M$ did that. Instead of writing out a set of standards and sticking to them, M$ changed their standards after Nvidia cried like a baby that the standards were impossible. Funny thing, though: ATI managed to meet those standards. Then again, M$ may have changed standards in ATI's favor in the past.

The one thing I do wish M$ would change is the DRM. But M$ has a history of not listening to the desires of its customers, which is why I'm setting up a computer to run Linux. It's only an experiment at the moment, but I'm going to be exploring life in a non-M$ world.
 
No matter whether it was overclocked Nvidia cards or Nvidia-supported games, or even both combined with a bit of luck, Nvidia should not have outperformed ATI by that much.
EDIT: Oh snap! I think I got the 100th post.