4870x2 HardOCP Preview - taking the fluff out of reviewing



I don't say if HardOCP is any good, but they say directly in their page that those very low points are save game spots... So that is "explanation" for those numbers. Othervice it's a matter of taste. Their reviews seems to be more like "matter of personal taste" than pure statisticks. (I would say that HardOCP is like English hifi-magazine where they test audio equipment knowing what they are listening, and other sites like Finnish Hifi-magazines where all test are normally done "blindly" without seeing the actual device... Quite often you can see that one magazice says that cable XXX sounds better than cable YYY by listening them, when other magazines measures the guality of cables with some electronical instrument and say that there is not any difference between these two caples... Both type magazines have their reader audience.)


I personally prefer "cold" numbers, but their articles are quite often entertaining reading; at least they are very different from other review sites...
 
The problem with them using the 'save points' as an excuse is: why does one solution take a bigger hit than another (often in Crysis it was ATi, but in AoC it was nV), and isn't that relevant if it affects performance differently?

They mentioned HDD thrashing in the AoC review, but if one solution is more efficient at texture or memory management, and handles it well enough not to swap out to the HDD as much, how is that different from one solution being more or less CPU dependent, or being better at HDR or AA or anything else?
If the frames really do drop, and it's not an artifact of Fraps showing 2 fps while the screen still refreshes at 30 fps, then it needs to be included. There's mention of excuses, but using hard drive thrashing as an excuse, as if the framerate drop doesn't matter, is ignorant - like saying the X48 vs P45 issue above isn't relevant to testing different solutions on different platforms.

I definitely find them to be the nice nuts or sprinkles on top of other, more substantial reviews - a different view of the hardware - but I do have issues with their testing methods, since they truly offer far less information than others. They may provide a few tidbits others aren't, but to me it remains just a few tidbits.
 
lol hannibal... your response made me laugh... tons and tons of errors... I love caples (cables) and magasices... (magazine)

sorry if i'm being stupid for pointing that out but I find it funny
 



Minimum frame rates are save points - and it specifically states that in the testing methodology.

The histogram is there for you to read, which you obviously seem quite capable of; so if you only take the "reported values" and nothing from the histogram then it means nothing, doesn't it?

That's like trusting THG when they list average fps without even giving a min, a max, or a histogram at all.
 
btw, the 17 fps minimum is directly above your second green circle in your picture that you drew all over. I can also see the 10 min fps, so obviously they made a mistake - which you picked up on. But at the same time, in the same picture later on in the bench it shows both cards at 8 fps. Go figure, huh?

But here is the difference - they actually gave you the information to determine that they made a mistake; they didn't just give you figures with no histogram and declare that you should base gameplay at max settings on that.

Draw your own conclusions based on the histogram - don't take the "results" at face value and don't ever use one review from one site as the only thing to base a purchase on. Any purchase i've made i've typically based it on information provided by Anandtech and Hardocp as they give me (between them) the total information that I want.
 


Actually, look again at that one: it's not above 15 fps, it's below 15 fps, so that's 13 or 14 fps. At best the 17 fps would be at the very end, and even then it looks more like 19 fps than 17, which would put it in the bottom half of the 20-15 range.

I can also see the 10 min fps, so obviously they made a mistake - which you picked up on. But at the same time, in the same picture later on in the bench it shows both cards at 8 fps. Go figure, huh?

Yeah go figure that in the only concrete thing we have there's issues. That's my point.

But here is the difference - they actually gave you the information to determine that they made a mistake; they didn't just give you figures with no histogram and declare that you should base gameplay at max settings on that.

Once again, that thing you think is a beneficial difference is nice to see, but that's the part that is like the other tests (it is a histogram showing the test, not a personal preference): it's quantitative/empirical data, while their decision about what to tweak is qualitative/subjective data. If they get the numbers wrong that are there for all to see, how can you trust their opinions?

Anywhoo, I'm not trying to convert you; you asked for the biased statement, and I said it's not the statement, it's issues like this. There are other issues, including Kyle's comments on the R600 long before its release. Anywhoo, like I said, you can question my or SS's motives, but so far you seem to simply be ignoring that there's an issue by pretending I should be OK with them providing me the evidence to refute their mistakes.
 



Good job getting the 3800 X2 to 2.8 GHz. I got my old one to boot at 2.7 GHz but never kept it there; the highest I ran 24/7 was 2.5 GHz.
 
I think The Great Grape Ape and Ovaltine Please need to get together and mix up a big glass of the chocolate delicacy and wait for the final review. 😀
 


You would have to ask them more about their testing method and histogram output in order to determine where the issue lies. I've done gameplay benchmarks myself (including yesterday, using an Ultra High Quality config) where my Fraps min/max/avg reported a 7 fps minimum - yet I wasn't able to detect it in my gameplay at all. I'd chalk that up to a strange anomaly in the software itself - I would expect that there is a reasonable explanation for consistent misplacement of minimums, if they are indeed as consistent as you claim.

In reality though, I could go and pick apart any review done by any site and find flaws in it - but does that make me somehow a superior reviewer, hardware enthusiast, or software output analyzer compared to them?

I picked apart that TechReport article quite well, and in my opinion the method is flawed and will remain flawed in my eyes - I don't approve of benchmarking high end GPUs on low end CPUs. Just like you don't approve of figure 1 claiming one min fps and figure 2 claiming another. Both situations likely have explanations for the discrepancies/problems.

1) Maybe TechReport picks a 3.0 GHz CPU because they feel that statistically more users are at the 3.0 GHz mark - I still think it's a flawed perception, but it's their choice to do it that way; maybe that's just me picking apart their article like you are picking apart HardOCP's particular article.

2) As you showed in your screenshot, both histograms show either card hitting a very low minimum towards the end of the benchmark; there could be a reason for this, however - such as an in-game anomaly or a collection of in-game anomalies, like scripted events commencing (assuming they benchmark on Assault Harbour, I imagine the cruiser bombing could produce a very low fps point even if just for half a second - which appears to be reflected in the histogram; well, maybe).

Or there could be a minor flaw in the Fraps software itself, outputting an occasional incorrect figure, or a figure that at least isn't easily detectable by eye. You'd really have to ask them why they would discard that number along with the save-point minimums, as it might not be a mistake - it may be intentionally discarded for a reason. I will agree to this much: if the number is being discarded, then they should say specifically why, and what criteria they use to decide this.
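For what it's worth, the kind of discard criterion I'm asking them to publish could be stated very simply - e.g. "a reported minimum is thrown out only if it lasts a single sample and sits far below its neighbours" (save-point stalls, logging glitches). A minimal sketch of that idea, where the threshold and function name are my own assumptions and not HardOCP's actual rules:

```python
def filtered_minimum(fps_samples, spike_ratio=0.5):
    """Return the minimum fps after discarding isolated one-sample dips.

    A sample is treated as spurious (save-point stall, logging glitch)
    only if both neighbours are healthy while the sample itself falls
    below spike_ratio times the neighbours' average. The 0.5 threshold
    is an illustrative assumption, not a published HardOCP criterion.
    """
    kept = []
    for i, fps in enumerate(fps_samples):
        if 0 < i < len(fps_samples) - 1:
            neighbours = (fps_samples[i - 1] + fps_samples[i + 1]) / 2
            if fps < spike_ratio * neighbours:
                continue  # isolated dip: exclude from the reported minimum
        kept.append(fps)
    return min(kept) if kept else min(fps_samples)
```

With a rule like this in print, a reader could tell a discarded save-point stall apart from a sustained dip that genuinely hurts gameplay - the sustained dip survives the filter because its neighbours are low too.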

However, the histogram is still provided - so obviously you can draw your own conclusions by reading the output data, which is a feature I like. If HardOCP did apples-to-apples only, with average fps only, you would never even be able to tell whether they were BS-ing, making mistakes, or otherwise; right?
 
bottleneck = **********

it is a bad word people.

It's not a bad word, it's only bad if not defined relative to the situation. My CPU isn't powerful enough to keep up with the 3870x2 at the resolution I'm stuck at until I get a new monitor after summer vacation.

If I had a Wolfdale overclocked to 3.4 GHz, then the CPU would be able to keep up with the card at 1280x1024. I knew I was buying a card too powerful for my system at the resolution I was playing at, but I didn't mind at all, because I planned on upgrading (though it turned out to be not as soon as originally planned).

People come on and ask why they don't get the frames per second in game X that a particular review gets. Well, most reviews are done with very high end CPUs, and not everyone has one. In some games, at some resolutions, having a mainstream CPU with a high end card doesn't matter. In other games it does.

As many ads say in the fine print "results may vary".



I'd approve of benchmarking high end GPUs on both mainstream and high end CPUs. I'd also approve of benchmarking mainstream GPUs on both low end and mainstream CPUs.

Granted, few will put a 9600 GT or 3850 in a system with an overclocked Penryn, but many will put a 4850 or a 9800 GTX in a system with a stock C2D-based Pentium or an Athlon X2 3800+. Sometimes it's because they plan on upgrading later, as I do.

Other times, it's not understanding that a high end card won't give spectacular results akin to the online reviews unless the same category of high end CPU is mated to the high end GPU.

What I'd love to see are interactive charts where you choose a CPU, a GPU, a resolution, and then AA and AF settings. Once the choices are made, the chart provides minimum, maximum, and average framerates for a set of games and synthetic benchmarks. It might be too much work for every site, but you'd think Tom's or Anandtech could manage to keep it updated.
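The backend for a chart like that could be nothing fancier than a lookup table keyed on the chosen parts and settings. A rough sketch, where every part name and fps figure below is invented for illustration and not measured data:

```python
# Hypothetical results store: (cpu, gpu, resolution, aa, af) -> fps stats.
# All entries are made-up illustrative numbers, not real benchmark data.
RESULTS = {
    ("E8400", "HD 4870 X2", "1920x1200", "4xAA", "16xAF"): {
        "min": 24, "avg": 47, "max": 81,
    },
    ("E8400", "GTX 280", "1920x1200", "4xAA", "16xAF"): {
        "min": 22, "avg": 43, "max": 76,
    },
}

def lookup(cpu, gpu, resolution, aa, af):
    """Return min/avg/max fps for the chosen combo, or None if untested."""
    return RESULTS.get((cpu, gpu, resolution, aa, af))
```

The hard part wouldn't be the code; it would be keeping the table populated across every CPU/GPU/resolution combination, which is exactly why the sites don't do it.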
 
lol hannibal... your response made me laugh... tons and tons of errors... I love caples (cables) and magasices... (magazine)

sorry if i'm being stupid for pointing that out but I find it funny

Well, you are like my old English teacher ;-) from what feels like light years ago...

I don't mind too much. When I have time, I reread everything I write two or three times, and then the text is readable (mostly so...).
When I'm writing my comments at 1 or 2 o'clock in the morning, there are many more of those "what the heck is this man saying..." things you point out. Well, I have to keep on practising...

Sorry for all the trouble...
and thanks for all the fish!

 


http://enthusiast.hardocp.com/article.html?art=MTUzMSwxMCwsaGVudGh1c2lhc3Q=








http://enthusiast.hardocp.com/article.html?art=MTUzMSw4LCxoZW50aHVzaWFzdA==

AMD's 4870x2 kicking ass in 2007 games


So in 3 out of 4 demonstrations, HardOCP showed AMD's card being a killer performer - and specifically, in 1 out of 2 benchmarks they showed the 4870x2 being clearly better than the Nvidia solution, while in Crysis the performance was very comparable. It's already been shown with the AMD 3870x2 that it takes some time for drivers to come through for this game, so today's fps on an AMD card is hardly the decider.

TGGA: I really don't think it's biased, based on how much positive light they give the AMD solution. Just because the Nvidia product wins one test - which is Nvidia-driver-biased to begin with - doesn't mean that HardOCP is instigating an NV bias in the article themselves. They even state in the conclusions of both the 4870 review and the 4870x2 preview that both cards have plenty of room to improve with software.

Anyways, I've had enough of the argument. This thread was posted to give people an idea of what kind of eye candy they can expect from a card like this, because it offers lots of high quality AA screens; and based on the information that HardOCP provided, I'd say the 4870x2 really delivers in the majority of games, and it'd be the first card I'd recommend to someone interested in buying a $500 GPU this August.
 


Well, it would be hard for them to come out negative with all the positives out there; it would make them stick out like a sore thumb.

I'm not sure if it's intentional bias, or simply cheerleading the latest fan favourites (the king is dead... all hail the king) - and there are still lots of rabid nV fans in [H]'s forums - but it's a definite pattern. As mentioned before, it's far from the first time these anomalies go in one direction, and the public statements of the site's owner and active participant are also far from objective at times.

My issue is less with that pattern than with the lack of transparency around the best playable setting, due to the number of errors and inconsistencies. What would make me feel better is if they showed what they think is the best playable setting for both (or more) contenders, with their histograms, and then showed the other card played at those same settings, with its histogram. That should make it pretty clear how each is better suited, but it still leaves you with a rather limited set of settings, and doesn't provide resources outside that narrow window for someone without an ultra-high-res panel, or with little appreciation for 8xAA.

Anywhoo, to me it's still a nice addendum to other reviews, but usually something I read after all the reviews I value more for their depth and detail.

 
I think it is a shame not to see performance for both the GTX 280 SLI/Tri-SLI and the 4870x2/dual 4870x2 at 1920 and 2560 resolutions with all settings at Very High in DX10. "Playable" comparisons are good, but it's also nice to see the actual performance when the cards are pushed to the extreme, to see whether they buckle or end up holding their own.

Quite frankly, the performance of the dual 4870x2 in CrossFire X is a major disappointment. Crysis is still the benchmarking game, and I know I and many others with 8-series SLI setups already run everything else at max settings. I expected much better scaling at 2560. The GTX 280 in Tri-SLI performed very nicely in Crysis at 2560x1600, Very High, all DX10, and as the prices come down, unless ATI has another trick up their sleeve, that seems to be the way to go for people with high-res monitors.
 

Did you read the review? They stated that ATI said the Crysis scaling was not performing the way they would like. That means it may improve, perhaps drastically, with the drivers available when the card is released.