ATI 3870 Crysis Benchmarks


systemlord

Distinguished
Jun 13, 2006
2,737
0
20,780
I don't get it. Today I loaded RivaTuner to monitor video memory usage, which should show how much VRAM is being used. The highest VRAM usage was 308MB @ 1280x1024 native with all settings on High except shaders on Medium. I find this hard to believe. Does anyone have an explanation?
 

amnotanoobie

Distinguished
Aug 27, 2006
1,493
0
19,360




I find it hard to believe that any part of this site is true. For a site that has only two articles, one about the 3870 and the other about the 8800GT, I find the information (if that's what you'd call it) to be total FUD. Also, after trying the site out on the Internet Archive Wayback Machine (http://web.archive.org/web/*/http://iax-tech.com/), I think the site was only made yesterday.

This would be like the time of the site with the premature R600 benchies, this site: http://level505.com/2006/12/30/the-full-ati-r600-test/1/ (which is now dead), but thanks to the Internet Wayback Machine: http://web.archive.org/web/*/http://level505.com/2006/12/30/the-full-ati-r600-test/1/ (the site went up on January 3, 2007, and its first article was dated December 30, 2006!). It showed the R600 above the 8800GTX, when in fact at launch it was nowhere near even an 8800GTS. And up to now it has little to no hope of beating an 8800GTX.


P.S. After looking at the site's Contact Us section, this is what you'd find:

------------------------------------------------------------------------------
IAX TECH is locating at California, Unite States.

For any question about our website or advertising
Please email us at iaxtech@iax-tech.com

Thank you
IAX TECH INC




COPYRIGHT ©2007 IAX TECH INC ALL RIGHT REVERED
------------------------------------------------------------------------------

For a supposedly US-based site, it's notable that they couldn't even spell "United States" right.
 

spoonboy

Distinguished
Oct 31, 2007
1,053
0
19,280
Yeah, the English is so poor that it must be in Asia, where the factories churning out products are. I'm pretty sure there's lots of intrigue and backhanders in the Asian business world, more than in the West, so I'm pretty sure it's at least possible that Far Eastern websites get hold of products before the US & Europe. The website doesn't make any wild performance claims, and what they're saying all seems very plausible, so I'll take it with a smaller pinch of salt than usual.

"all rights revered" lol :)
 

Jakc

Distinguished
Apr 16, 2007
208
0
18,680



Well it's at least impressive they managed to get all their right revered.
 

thuan

Distinguished
Sep 6, 2005
166
0
18,690
The internal Direct3D version of Catalyst 7.10 is 7.14.10.0532, so saying 7.14 is not wrong in this context, but it's hard for other people to understand.
 

tennen

Distinguished
May 25, 2003
102
0
18,680



10 fps is a picture update 10 times/sec = 0.1 sec per frame, right?
That's about your reaction time, and I bet you'd call that a slide show.

30 fps = 0.033 sec per frame. You need about two picture updates to react to a change, and I bet if you move around at a constant 30 fps you'll feel it dragging behind, a sense of delay in response.

60 fps = 0.0166 sec per frame. At this framerate games also feel smooth to me. I can clearly sense a difference between 30 fps and 60 fps when moving and looking around in a game, in terms of responsiveness.

So NO, high fps is not just about having some in reserve for the intense firefights. It's also about the feel, even if you can't tell straight out just by looking at it.

Go into a TV shop and compare a 50Hz TV side by side with a 100Hz TV. I'm sure you'll sense the flickering from the 50Hz TV. And if you can notice that difference, why wouldn't you notice 30 -> 60 fps?

Try it!
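The frame-time arithmetic above is just the reciprocal of the frame rate; a minimal sketch (Python here only as a calculator, nothing game-specific assumed):

```python
# Frame time in seconds is the reciprocal of the frame rate:
# 10 fps -> 0.1 s, 30 fps -> ~0.033 s, 60 fps -> ~0.0167 s.
def frame_time(fps):
    """Seconds between consecutive frames at a given frame rate."""
    return 1.0 / fps

for fps in (10, 30, 60):
    print(f"{fps:2d} fps = {frame_time(fps) * 1000:5.1f} ms per frame")
```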
 

bildo123

Distinguished
Feb 6, 2007
1,599
0
19,810


I thought the slideshow point was below 24 FPS? But here's another example that could be used: compare an 85Hz monitor with a 130Hz monitor... not much of a difference there.
 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890


You must own an old Nvidia card then, because image quality and color richness differences between ATI and Nvidia have been a thing of the past since the 8000 series. Why do ATI folks always have to dredge up that old argument all the time? :pfff:
 

While no other reviews show it, I have noticed a few posts here reporting choppy Crysis performance on Nvidia. 1280x1024 does come under the higher-resolution performance hit when you notice how much graphics work Crysis packs into 1280x1024. Doesn't the GTX suffer as much at 1280x1024 at high settings in Crysis as it does in other games at, say, 1900x1200? It's about the same issue. I have played Crysis on my 8800GTS, and I do appreciate how little difference it makes that the 8800GT can win on FPS at those settings when it's unplayable at those same settings. The 3870 being playable here is a major win for AMD/ATI and really makes it a performance threat to the GTX, as it doesn't suffer extremely low FPS at higher settings.
http://www.tomshardware.com/forum/246247-28-choppy-crysis-demo This one is at least a GT. :D

It's neither a benchmark blip nor a major issue with the hardware side of the 8800GT. It's a driver issue: due to its memory bandwidth, the 8800GT needs some changes that the GTX doesn't.
 

Paladyn

Distinguished
Nov 14, 2007
3
0
18,510
Did I miss something here? In the Catalyst Control Center the video bus was verified as being PCIe v2.0, HOWEVER to the right it showed that the card was only using x8 lanes instead of x16? And testing was done with a P35 chipset board instead of something that supports this card's new features?

I see a few things here:
(1) Four x8 video cards running on one system board
(2) A pair of dual-GPU x16 cards on one system board
(3) Losing benchmarks to nVidia

Somebody please correct me if I am wrong!
Somebody please benchmark this card with an X38-chipset board!
Am I the only one that saw the connection between the 38xx-series video matching the x38-series northbridge's new features?

IMO, this is Intel teaming up with their rivals AMD (who own ATI) to stick it to nVidia's market share. So who's got the popcorn?

EDIT: nevermind. Looks like someone already tested the 3870 on an X38 platform, and the 3870 still loses quite badly to the 8800GT.

http://www.legionhardware.com/document.php?id=703&p=0
 

spoonboy

Distinguished
Oct 31, 2007
1,053
0
19,280


I think it's a question of taste, or overly refined tastes. Even without FRAPS or something else to show fps, you can tell when a game dips under 25fps because you get that slow-motion, everything-is-swimming-in-treacle feeling. The difference between, say, 26 and 22 fps is very pronounced in a fast-moving game. Above that point, though, I think it's fairly hard to tell a difference. The difference (minimum frame rates aside) in an online FPS between, say, 30fps and 60fps isn't enough to give you an edge, and won't mean it'll be any easier to get points/kills. It's smoother, but not to the untrained eye/gamer.
 

No, the second digit is the release number within the year.

The first number is now the year number.

So 7.11 is 2007, 11th driver release.

You could have an x.14 version in a year; it just means a month or two would have to have multiple releases.

The last series to go higher than 12 was the Catalyst 5.13 driver.

AMD/ATI currently sticks to a monthly release cycle, but if that were to change for whatever reason, you'd get different numbers.

However, IMO that would be unwise, since people have come to accept the idea you mention above, so why mess with it and have people complain about 'confusion'?
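Purely to illustrate the numbering scheme described above (`decode_catalyst` is a made-up helper, not a real AMD/ATI tool, and it assumes a 200x-era driver):

```python
# The first digit is the last digit of the year; the second is the
# release count within that year (usually, but not always, the month).
def decode_catalyst(version):
    major, minor = version.split(".")
    return 2000 + int(major), int(minor)  # assumes a 200x-era driver

print(decode_catalyst("7.11"))  # Catalyst 7.11 -> (2007, 11): 11th release of 2007
print(decode_catalyst("5.13"))  # Catalyst 5.13 -> (2005, 13): a 13th release in one year
```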

 

Well, it doesn't necessarily contradict itself; even you point out the minimum-FPS argument.

Look at some of the HD series' performance and you see almost a tunnel of performance in some games, with no great highs but also no great lows, while the card it's being compared to has a higher average framerate and a much higher max fps, but also a deeper trough of min fps. It's a similar situation when people talk about SLI/Xfire offering little benefit, yet you look at an actual histogram or per-second benchmark of the two and suddenly there's a distinct difference that doesn't show up much in the avg fps test.

A perfect example of this was one of those early Crysis benchmarks with the HD3870 vs the GF8800GT:

http://iax-tech.com/video/3870/38704.htm

The average is higher, but look at the difference in the minimum fps; even on the retest on the next page it's 4x higher.

Now, that still doesn't show whether it's a very brief blip or a deep trough, but it starts to show how something can have lower FPS and yet still have smoother framerates. A consistent 50 fps is still smoother, while lower on average, than 20 fps - 150 fps - 30 fps - 90 fps - 15 fps - 120 fps - etc.

And this has been seen in some situations with the HD2900 & GF8800 series, but it's usually a rarity versus the Ultra/GTX, and not as rare versus the GTS.
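The consistent-50-fps point can be shown with a few lines; the spiky sequence is the one quoted above, and the per-second samples are otherwise hypothetical:

```python
# Hypothetical per-second fps samples: a card with the lower average
# can still be the smoother one, because smoothness tracks the minimum.
steady = [50, 50, 50, 50, 50, 50]    # consistent 50 fps
spiky = [20, 150, 30, 90, 15, 120]   # the sequence from the post

def avg(samples):
    return sum(samples) / len(samples)

print(f"steady: avg {avg(steady):.1f} fps, min {min(steady)} fps")
print(f"spiky:  avg {avg(spiky):.1f} fps, min {min(spiky)} fps")
# The spiky card wins on average fps, but its minimum dips far lower,
# which is what you actually feel as stutter.
```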


 

dtq

Distinguished
Dec 21, 2006
515
0
18,990


Grape, that site has only 4 articles covering just one topic, the 8800GT vs the 3800; I wouldn't be too confident taking any results from it seriously...

Looking at benchmark threads elsewhere, I can't see any duplicating those sorts of min-frame troughs, so I personally think that review is somewhat unreliable. While yes, you can have peaks and troughs leading to a higher overall average FPS, I don't believe the 8800GT actually has the issues that article is trying to make out, as the results aren't being duplicated elsewhere. Especially from a site with so little history...
 


You're missing the point, and focusing on a review you take issue with, which is really irrelevant; it was just handy as a recent illustration. It's what the results show that illustrates that you can have higher fps and still less fluidity, due to a drop in the min fps.
But it's not exclusive to that: you could have one card with a 0.5-second drop to 5 fps that isn't noticeable because the rest of the time it's playable, whereas the other card has a longer drop into unplayable territory.

Look at this image purely from a 'smoothness' aspect:
1194994614E3oshBmZcO_5_5.gif

The red line arguably has a much lower average and even max fps, but the yellow line has much more severe drops, which would greatly affect the 'smoothness' of gameplay.

That's just to illustrate how the two aren't necessarily related; I'm not commenting on whether either of the new cards is or isn't smoother, just that it's possible to have lower max and average fps but still maintain smoother gameplay. That histogram should make pretty clear what I was talking about.



 

pogsnet

Distinguished
Aug 15, 2007
417
0
18,780
ATI 3870 has better picture quality than 8800GT in Crysis
QUOTED "Specially the light refraction from water, HD3870 is very smooth but 8800GT is not. It is around frame 500 to 1000."

Take a look:
http://iax-tech.com/video/3870/38704.htm

Reading further: nVidia replied to the quality issue with a few updates and some file renaming (I don't see any necessity for that), and now the quality equals the ATI 3870's.
"I don't know what nVidia do to 169.04 but we can see the MAX FPS drops from 30 to 20FPS and the max frame is same as ATi 3870."
Then they both run at the same FPS, so the 8800GT is likely about equal to the 3870 at the same picture quality. Price, then, is the issue.

What's the use of a few FPS of advantage if you don't have the best picture quality available?
I'd rather go with the cheaper card that runs most games flawlessly with the best rendering quality; much wiser.
 

miribus

Distinguished
Nov 28, 2006
246
0
18,680
I look at this a completely different way.
While no doubt the 8800gt benches better, I, with an older system and a PSU that is just on the edge for the x1950xtx, can get a 3870 or even a very strong 3850 with superior performance and not need to upgrade the power supply.
That saves me like $200. Also, since ATI cards scale very nicely with older processors, I can be sure of less of a bottleneck.

I like Nvidia, I have no real preference, but at this point AMD/ATI fits better.

And as far as poor ATI driver support, what is this? 2003? The driver support hasn't been a real issue to me since the "X" series at least.

And congrats to me for apparently being a poster since 1970!(?)
 

tamalero

Distinguished
Oct 25, 2006
1,132
138
19,470
OK, we've seen enough benchmarks; now where the hell do we buy them?
Newegg is out of stock, and the rest of the shops are selling at insane prices (TigerDirect over $270 plus shipping, ZipZoomFly over $280 plus shipping).
 
