DirectX 10 Shootout: Nvidia vs. ATI

I can't believe how much ATI gets slaughtered here... I don't think using the 800-825 MHz cards would have made enough of a difference over the 743 MHz one. It would have helped, yes, but still left them well below the Ultra.
That's really embarrassing for DAAMIT, IMO. There's no way I will buy a 2900 now!

I DO wonder what the hell they (DAAMIT) are doing. If AMD tanks and takes ATI down with it, what a joke it will be. AMD was the worst thing to happen to ATI. If the RV670/680 fares no better, I will be buying NVIDIA for sure... at least until I can afford a CrossFire setup for my new ASUS X38.
 
You can't find any new games that aren't part of Nvidia's "TWIMTBP" program. That said, the program doesn't guarantee a game will run better on Nvidia hardware. Look at Oblivion; it came out as a "TWIMTBP" title and ran considerably better on ATI hardware.
 
True, an 825 MHz 2900 may not have helped that much, but it would have been interesting to see how an 825 MHz card with 1 GB of GDDR4 memory would have fared. It's also not just the clock speed alone, but the GDDR4 as opposed to GDDR3. It should have helped, and maybe the ATI cards wouldn't have been quite as far behind. But we don't know. Having good information makes for good choices; having incomplete information makes for guesses. I'd rather have good information than make guesses with my money.
 
ATM, DX10 is more of a super-high-quality setting than something games are actually built around. You aren't going to run every DX10 game on high in the future, are you? You don't run every DX9 game on high either.

Just wait until some proper DX10 games come out.
 
I was wondering why they included so many OC'd Nvidia cards.
If you look at the graphs as a whole, you'll see all this green and all the red on the bottom. :ouch:
ATI wouldn't have looked so bad if they got rid of the extra Nvidias.
 


I agree with you. In principle it is always better to have a more complete representation of both players. I am actually more concerned with the low-clocked card they used... most other benches I have seen do not show a big diff (if any at all) between the 1-gigger and the vanilla, but that is an 800 MHz vs. an 825 MHz card... only a 25 MHz diff, NOT a 75 MHz+ diff. And are there NO OC'd ATI cards on the market? (That is a legit question; I thought there would be a few, enough to be able to grab at least one...)

Other sites show similar issues with other ATI cards, though... the AA in WiC tanks it, but there is a patch for the game coming out (it may already be out) that is supposed to fix that, so it may not be an ATI driver thing but a game thing. (Other games don't tank the card with AA like that one does.) If I remember correctly, Nvidia has been pushing its non-DX10, old-school method of AA and most TWIMTBP games optimize for it, where ATI went with the "proper" DX10 implementation... but that could be bogus memory on my part. Proper or not, the fact that a game patch is "fixing" it points at the game not having AA implemented correctly for the hardware.

Also, if you look at resolutions above 1280 (and who runs under that after buying an 8800 or 2900?) you see many matches where the 2900 takes the win over the 8800. Why do we even still have a 1024 res on these tests? Is there anyone still gaming at that size?! (Unless you have 320 MB or less of memory and you HAVE to.) IMO that just removes frame buffer size as the large factor it is with AA and high resolutions and "equalizes" the smaller-VRAM cards with their larger cousins. If you remove the 1024 results from all calculations, I would wager that the numbers would put the 8800 GTS 640 and the 2900 much closer... maybe... probably... meh, I don't have time to check that, maybe I will tonight.
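If anyone wants to try that recalculation, here's a minimal sketch of the idea. The card names and FPS numbers below are made-up placeholders, not figures from the article:

```python
# Recompute average FPS per card with the 1024 results excluded, to see
# how the standings shift at higher resolutions. Data is hypothetical.
results = [
    # (card, horizontal resolution, fps)
    ("8800 GTS 640", 1024, 95), ("8800 GTS 640", 1280, 78), ("8800 GTS 640", 1600, 60),
    ("HD 2900 XT",   1024, 80), ("HD 2900 XT",   1280, 74), ("HD 2900 XT",   1600, 58),
]

def average_fps(data, min_res=1280):
    """Average FPS per card, keeping only runs at min_res or above."""
    totals = {}
    for card, res, fps in data:
        if res >= min_res:
            totals.setdefault(card, []).append(fps)
    return {card: sum(v) / len(v) for card, v in totals.items()}

print(average_fps(results))             # 1024 excluded
print(average_fps(results, min_res=0))  # everything included, for comparison
```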

Regardless, it was a good article to combine with info from other sites.
 

I hope you're right.
 
This article also proves that gaming in Vista sucks arse right now and isn't worth a squirt of urine over XP. Give it about two years for Microsoft to fix its mistakes, for hardware (mainly GPUs) to catch up to the bloated OS, and for mainstream DX10 games to be available.

Then "all of a sudden" gaming in Vista will be the best way to go.
 
I notice a lot of people shoot the HD2900Pro down very fast, and it's strange why. It's the same core as an HD2900XT, only underclocked, which means you can OC it to XT speeds and beyond very easily.

I have the 1GB version of the HD2900Pro; its GPU runs at 601 MHz with memory at 925 MHz, and it cost me $319.99. A 1GB HD2900XT would cost $499.99 at some places, so it's almost a $200 price difference. All you have to do is OC it to the XT speeds and I am sure you would see a huge difference.

I did a performance test using the Lost Planet demo and it got me roughly 44 FPS on the first map and 48 on the second at stock speeds. When I played the demo it was very smooth and it never slowed down.

I think we need to get some of the HD2900Pro cards in the labs and test them at stock and OC'ed to see what they can offer, especially since they are cheap compared to a lot of the competition. Plus, I don't see any reason to spend more than $350 on a video card, let alone $700.
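For a rough sense of the OC headroom, here's a back-of-envelope calc using the stock clocks quoted above, assuming performance scales linearly with core clock (a best case; real-world gains are usually smaller):

```python
# Best-case speedup from clocking an HD 2900 Pro core (601 MHz) up to
# XT speed (743 MHz), assuming linear scaling with core clock.
pro_core, xt_core = 601, 743  # MHz
scale = xt_core / pro_core
print(f"Best-case speedup: {scale:.2f}x (~{(scale - 1) * 100:.0f}%)")
# Applied to the Lost Planet numbers above:
print(f"44 FPS -> ~{44 * scale:.0f} FPS, 48 FPS -> ~{48 * scale:.0f} FPS")
```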
 

Ahhh... you were expecting to have your cake AND eat it too? Surely you know better than that by now! Just think about Windows... every time a new version comes out, Microsoft tells us how much faster it is... and in reality, that's NEVER true. I'm sure I could dig up an article/press release about how much faster Vista is than XP... but what's the point?
 
The reason they have more Nvidia cards than ATI cards is most likely because the companies making the Nvidia cards have sent them more cards than the ATI companies.
Most review sites can't afford to buy a lot of video cards (or most of what they review) and are dependent on companies donating the cards. Considering that for the most part the Nvidia cards have been around a lot longer, they (THG) have had a lot longer to accumulate more and different cards.


As for showing benchmarks at 1024, yes, people still run games at that resolution. Probably a fair number of people. 1280 would also be very common; you don't get most users above that most of the time. Sure, there are people spending the money for an 8800 Ultra or big SLI configurations, but that is a very small minority of users. Considering that the 8800s don't even fall into the mid-range price bracket, and mid-range basically defines what most people are using, the use of lower resolutions is highly relevant. Also, when you consider that they are looking at cards like the 8600s/2600s and lower, then 1024 is very relevant.
 



Expecting? No, not for a second... Surprised no one seems to be calling Microshaft on it? Um, yeah.
 
The ATI HD series is a joke; it reminds me a lot of the old GeForce FX series. I hope the next batch of ATI GPUs kicks a bit more ass.
 
I honestly think there is some serious Nvidia bias in this article. It's common knowledge that statistics can be manipulated to argue any point, and I think that is the case here. They did not use ATI's top card; they used a lower-end version of it. They compared six different versions of Nvidia's 8800 and one 2900. I was unaware that the games were optimized for Nvidia, but even before that I could tell by the writing that this article was going to lean in favor of Nvidia. Basically the first page said "haha, we told you so, and here's the proof." Further into the article, the author jumps to the defense of Nvidia, blaming game mechanics that rely heavily on memory (this seemed like a very accurate analysis), but it still sounded biased.

I'm a proud owner of an 8800 GTS 320 MB and it pleases me to see my money was well spent. However, I often depend on Tom's Hardware to make my comparisons, as I couldn't care less about ATI vs. Nvidia or AMD vs. Intel (I want the most bang for my buck). And with articles like this I lose faith in getting a fair comparison from this site.

 


😉
 

I thought I picked up on that too; otherwise I would have thought (prove me wrong here, by all means) they would have a 2900 Pro in it to compare to the 320.
 


Memory bug? I know there is some kind of 'bug' in some games like QW where the video card refuses to use virtual memory, which is why framerates drop more than they should at 16x AA.
 
AMDZone also had trouble getting new ATI cards, so there's been this "unwillingness" to donate cards for review. Hopefully the HD2950 will go in a better direction than the current 2x00 series.
 
I would also agree that this article comes across as biased: overclocked Nvidia cards and hand-picked games that favor Nvidia hardware. My view is to wait for other DX10 games to appear before writing ATI's offering off.
 
When developers started creating software for DX10, they used the only DX10-compliant hardware available, namely Nvidia's.

It makes sense that the first DX10 software available would run better on Nvidia hardware since that's what the software was being developed on.

Am I out of whack here?
 
From the article lead-in before the test:

"ATI simply needed more time - after all, Nvidia had six months to tweak its graphics drivers. Given enough time, ATI's drivers would be bound to improve, giving the Radeon 2900 XT the much-anticipated performance boost."

That sounds like a ready-made excuse for ATI's lack of a better showing. Nvidia has had six months or so to tweak drivers, and considering the memory bug in the 640 MB 8800 GTS mentioned above, they are not done yet. But remember, the 2900 was supposed to ship concurrently with the 8800s last winter. So ATI has had six months to tweak the hardware.
 
I understand the reasoning for the Nvidia-based games; not much you could do differently there. As previously stated, Nvidia was first with the technology, so games were built to its standards to start.

With all that aside, this article would still come off as biased to me, just by how it was written and the choice of cards to compare. Just a week or two ago an article said that a 2900 Pro would give you more bang for your buck than an 8800 GTS. At that point I wanted to see THG benchmarks to prove it, to see whether or not my 8800 was a worse investment. And this article doesn't even compare the two. It just doesn't make sense to me that they would compare six high-end cards to one ATI card. They put in overclocked Nvidia cards and fail to field even the stock high-end version of the ATI card.
 
It seems to me that the reason the software is Nvidia-tweaked is that there are really not a whole lot of ATI-tweaked games coming out right now. Maybe someone else can name a few mainstream titles if I'm wrong (wouldn't be the first time, nor the last). Considering that ATI took an extra six months to bring out the 2900, why would any software developers looking for bleeding-edge performance write code for outdated hardware?
Oh, and about the OC'd hardware: it's a good and relevant point that Tom's is comparing apples to oranges there. HOWEVER, if you ignore the OC'd versions, the price/framerate comparison is still valid (rough sketch of that calc below).
How can anyone argue with the fact that the 2XXX series still can't handle any sort of AA/AF without dropping 50% of its framerate? SAD!
I'm still holding out hope that ATI still has some driver tweaks up its sleeve for DX10 that will help improve performance. Nvidia needs some real competition on the top end to help drag those ridiculously high prices down. Hell, I got my 8800 GTS 640 for less than it sells for today back in January, for God's sake!!!
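To make the price/framerate point concrete, a minimal back-of-envelope sketch; the prices and FPS numbers are hypothetical placeholders, not figures from the article's charts:

```python
# Dollars per average frame: lower means more bang for the buck.
# Prices and FPS below are hypothetical placeholders.
cards = {
    # card: (street price in USD, average FPS across a test suite)
    "8800 GTS 640 (stock)": (399.99, 62.0),
    "HD 2900 XT (stock)":   (389.99, 55.0),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per average FPS")
```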
 
"ATI simply needed more time - after all, Nvidia had six months to tweak its graphics drivers. Given enough time, ATI's drivers would be bound to improve, giving the Radeon 2900 XT the much-anticipated performance boost."

They had six months to tweak the hardware, and now about five months to tweak the drivers. If this test isn't rigged, then ATI had better release a BEAST with their new releases, or else ATI goes down and NVIDIA starts killing consumers with price hikes.