4870x2 HardOCP Preview - taking the fluff out of reviewing t -.- t


Actually, they flattened the pair of GTX280's in everything but Crysis, which they said they were still working on scaling for (driver updates should cause significant improvement).
 


Oh, there we go again... driver, driver, and driver again.
Been hearing this kind of excuse since the release of the 38xx series.
 
Did you even read the review?

I wouldn't worry too much about this early performance behavior in Crysis, however; with CrossFireX, AMD has told us that they are not yet seeing the performance scaling they'd like to in Crysis.

Direct from the review.

Oh, and on the next page (in Age of Conan):
With GeForce GTX 280 SLI we were able to play at true 8X MSAA at 2560x1600 with the highest in-game settings, very impressive indeed. However, the ATI Radeon HD 4870 X2 CrossFireX platform surpassed that by allowing us to enable Adaptive AA plus 8X AA to reduce aliasing on all the foliage in the game as well! Not only that, but as you can see, the performance is still even higher than GTX 280 SLI at these higher settings. Just simply outstanding!

So I think I'm fairly safe in saying that the 4870's flattened the GTX280's in almost everything but Crysis, and that Crysis should be improved by drivers.
 
I'm sick of all this Crysis ****. Seriously, the game's a bad example of a benchmark IMO. Sure it lays the smackdown on any card that runs it, but that's because of unoptimized coding on the part of Crytek and there's probably some NV bias in that game.

It's quite obvious that with just about every other game, developed by people who know what they're doing, video cards don't have to be cooled with liquid nitrogen, OC'd to over 1 GHz core, and given 5000 shader cores to play at 2560x1600 w/ full settings and AA/AF.
 
The problem I have with [H] is that I have too often seen the biased conclusions they come to and the "most playable" settings they use.

Complete rubbish, and even with their apples-to-apples tests (what else would you do?) I do not trust them not to show their usual Nvidia bias.

I have even seen some reviews where their own data contradicted their conclusions. They really struggle to praise ATI, and when they do it is done grudgingly and only when clear-cut, yet even the slightest thing is cause for praise for Nvidia.

Your CPU remarks are useless in the real world, and I have a funny feeling you use the word "bottleneck".


Find me a biased statement in the 4800 series reviews and previews and you can go ahead and call me a monkey's uncle. I don't see them struggling to praise the 4800 series of GPUs.

So if you feel that everything they do is flawed and contradictory, feel free to call out specific examples right here - they have an entire archive of articles for you to hand-pick from if you wish.

The CPU remarks aren't useless in the real world, because in the real world many people own 19-24" displays; if you have a slow CPU there is absolutely no point in buying a top-end graphics array, as it will be held back by the CPU at anything less than the resolutions of a 28-30" display - except in Crysis, maybe.

If you have nothing to contribute then why do you even post? So that you can express your biased opinions about the legitimacy of another review site? My opinions are based on objective reasoning and testing, your opinions are based on "well I don't like it cause they said this about AMD this one time and I hate hardocp because I hate them!"
 



Well, you can certainly stress a GPU playing Call of Duty 4, or really any DirectX 9 title, at 1920x1200 under 16x Quality AA + transparency supersampling (Nvidia), or in AMD's case 24x edge-detect CFAA with adaptive AA.

You don't have to play Crysis in order to stress GPUs. The point I keep making is that if you aren't enabling high-quality AA and texture settings in your drivers, you aren't taking advantage of high-end hardware like the 4870x2; the whole point of buying a high-end GPU is to crank up IQ settings. IMHO, if you can play games at a minimum 30 fps with every conceivable detail maxed out in-game and in the drivers, then you're taking advantage of high-end GPUs.
 

I'd say that it is impossible to say whether Crysis is badly optimized or not. Does it run slower than basically anything else on the market? Yes. However, it also has more detail and looks better than almost anything else on the market. Because of this, it is impossible to compare. It also runs quite well on distinctly mediocre hardware if you turn the settings way down.

Now, if you want a badly optimized game IMO, look at Bioshock. It runs slower than Half-Life 2 Ep.2, Gears (PC version), and UT3 on my system, but it looks at best equal to those other games, if not worse.
 
I'm almost sure that Crytek even admitted that Crysis wasn't very optimized. The next iteration, Crysis Warhead, is supposed to run on the same engine, but with optimizations such that you can have all the delicious eye candy of the original without the need for top-tier hardware. That's what I've heard and read anyway.

Also, don't knock Bioshock. I liked it, though it did get ridiculously easy at the end. I run an Athlon X2 3800+ at 2.8 GHz, 2 gigs of DDR2 800, and a GF 7600GT, and I was able to run the game on XP comfortably with a mix of low and medium settings at 1280x1024. From benchmarks I've seen, Bioshock seems to run fairly well on modern hardware, even with the settings cranked up. Doesn't look too bad either.
 


Does it need to be bias, or can it be incompetence, that makes you question their ability to pick "best playable" and causes trust issues with their abilities/methods?

I can provide you a few of those.

I like that they offer the histogram (they're one of the few sites that do); however, if I were given only one review to get information from, I would never pick an [H] review as my sole source of info.

I'm optimistic that they'll improve their methods, but what was once valuable for its novelty needs to mature like other review sites and provide more information than just one arbitrarily picked setting.
 
I know one thing, though: they use those damn playable-settings tests, which are useless as they tell us nothing. I do not want to know what he reckons to be playable; I just want to know how the cards compare.

...

You can go as far back as the 8800s and 1900s and you will see the same thing: not testing cards at the same levels, and using dodgy settings for different cards. If everything scaled in a linear fashion that would be great, but graphics cards do not always do so, and you cannot use shoddy testing like he does.

Agreed.

The reviewer's bias for higher AA or higher resolution, or for higher/lower shader settings and features, is definitely an issue; just ask any Oblivion player about [H]'s habit of turning grass down or off when testing. That favoured one architecture in particular, and it's also how no one would actually play. Almost anyone playing Oblivion would change the grass size before turning it that far down or off; it totally ruins the game having it gone or popping up 10 ft in front of you.

Also someone with a 1680x1050 22" monitor might play very differently than someone with a 1920x1200 24" LCD, so showing me only one of the two settings or else neither (going to 2560x1600 only) isn't very insightful, especially since they scale very differently like you mentioned.

I would prefer to see histograms of 'preferred' settings rather than all the analysis, which is usually mediocre at best ("more AA is better than less AA" - well, thanks Einstein).

Anywhoo, the thing that bugs me the most is that even when confronted with their own blatant errors (not just errors in judgement), they try to act as if they're inconsequential, when really that just compounds the issue of 'blind faith' in their methods.

Now it seems they are more interested in manufacturing some kind of difference in settings, because to have both cards at the same playable settings would mean they'd just have one min/avg/max benchmark. So their focus seems almost to be to create a difference even if there may be none. That may not be the case, but it is the impression one gets when you see very slight differences in settings, and it's far from apparent that the other card couldn't also run at those settings, since it ran the slightly different setting much better (much higher min FPS).

I don't think there's just one way to do this, but considering the tools at their disposal, [H] definitely could have improved on what seemed like a good start 3+ years ago; now it begins to look a little limited IMO. Still a source of info, but far less weighty to me than before.
 

Bioshock was quite an enjoyable game, don't get me wrong. It just doesn't run as fast, at least on my system, as several other games that are comparable in the graphics department.

Note: I'm running it on a Geforce Go 7950GTX, T7600 C2D @ 2.83, and it runs smoothly on high DX9 @1680x1050. I can't quite run it on my monitor's native 1920x1200 smoothly though, while I can run HL2 Ep2, Gears, and UT3.
 



How is it arbitrary? They change the settings until the card has a minimum fps of 25. There isn't anything arbitrary about that; it makes perfect sense. I'm not going to turn the details up to a point where the card gives me 15 average fps and actually play like that, am I?
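(As an aside, the search being described is simple enough to sketch. A rough Python sketch of the idea, where benchmark_min_fps() is a hypothetical stand-in for an actual timedemo run and the settings ladder values are illustrative, not [H]'s:)

```python
# Walk a preference-ordered settings ladder and stop at the first entry
# whose benchmark run never drops below 25 fps.

MIN_PLAYABLE_FPS = 25

SETTINGS_LADDER = [  # most demanding first; illustrative values only
    {"res": (2560, 1600), "aa": 8},
    {"res": (2560, 1600), "aa": 4},
    {"res": (1920, 1200), "aa": 8},
    {"res": (1920, 1200), "aa": 4},
    {"res": (1680, 1050), "aa": 4},
]

def highest_playable(benchmark_min_fps):
    """Return the most demanding settings whose minimum fps clears the
    bar, or None if nothing on the ladder qualifies."""
    for settings in SETTINGS_LADDER:
        if benchmark_min_fps(settings) >= MIN_PLAYABLE_FPS:
            return settings
    return None
```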

If you are pointing fingers at me and claiming that I use Hardocp as my only resource when I'm looking into new technology, then you need to give your damn head a shake - but the fact still stands: benchmarking on 3.0 GHz CPUs is idiotic, because it does nothing but show dual-GPU setups in a bad light and makes them appear to scale less than they actually do when they are implemented in a balanced system with good CPU power behind them.

I don't even know what you're trying to get at here, but you missed the entire point of my argument to begin with.



You're reading into it too much; there is nothing conspiratorial about it. If you want nothing but same-settings benchmarks, then there are plenty of sites that offer that.



Nobody cares what you use for a display or what CPU you have, because if I wanted to make a biased GPU review I'd run every test on a Pentium 4 with 1 gig of RAM and scream NO! BAD GPU! NO! BAD CROSSFIRE! NO! BAD SLI! Just like you seem to think is sensible.

If you don't like the reviewing style, fine, but as far as I can tell you're just trolling my thread and throwing your bloated bias around to try to drag everything off-topic instead of actually contributing to the discussion. You are not going to make me agree that 3.0 GHz CPUs are suitable for doing comparative benchmarks on high-end GPUs, so drag your baseless opinions out of the thread, thank you kindly.
 
You know, if I can be arsed I will go look for those articles. I know one thing, though: they use those damn playable-settings tests, which are useless as they tell us nothing. I do not want to know what he reckons to be playable; I just want to know how the cards compare.

http://www.hardocp.com/article.html?art=MTUyNCw2LCxoZW50aHVzaWFzdA==

Why no apples-to-apples tests between the GTX 260 and the 4850? Why not let the reader decide for themselves?


GTX260: 11 min, 36 max, 25.2 avg

4850: 10 min, 31 max, 19.6 avg


feel free to remove your tinfoil hat.
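(For what it's worth, figures like those fall straight out of a per-frame timing log. A minimal sketch of the arithmetic, with made-up frame times rather than the review's data:)

```python
# Min/max/avg fps from a per-frame timing capture.
# The numbers here are illustrative values, not measurements.

frame_times_ms = [28.0, 31.5, 40.2, 90.9, 33.3, 27.8, 25.0]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

min_fps = min(fps_per_frame)
max_fps = max(fps_per_frame)
# Average fps is total frames over total seconds, not the mean of the
# per-frame fps values (which would overweight the fast frames).
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"{min_fps:.0f} min, {max_fps:.0f} max, {avg_fps:.1f} avg")
```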
 



I don't exactly understand what you're getting at, but when I started the thread I posted links to a website that shows the 4870x2 kicking the living crap out of the GTX 280 - and then some people came in with the express interest of giving me a headache with their lack of sense, by trying to say that the site showing the cheaper AMD card walloping the Nvidia card was somehow Nvidia-biased.

So what does this have to do with fanboyism?
 
To be honest, I am a bit surprised - especially since techreport already reviewed it and came to contrary results. Go figure, huh? I wonder who is right/wrong.

Edit: oh, you know though - TechReport's review only covered X48 vs P45; I guess I misled myself a bit.

I wonder what was actually holding back the X38 in CrossFire performance - it seems a little peculiar that it evidently had nothing to do with x16 lanes.

I am humbled. 😛
 


It's arbitrary because who decides what's most important to a game: higher resolution or higher AA (2560x1600 with no AA, or 1920x1200 with 4X QAA?)? Who decides that shader quality or features are less or more important than resolution/AA? It's arbitrary as to what is chosen. I could pick ten settings for many games that would yield a minimum of 25 fps, and even that measure they don't adhere to, so it's arbitrary. It relies solely on the choices of one person, and it's a limited number of benchmarks compared to other reviews, which in and of itself isn't bad, but the exclusion of all others to that end is ignorant, and that's the position they recently took in attacking other sites, including Anand, whom I have no particular affinity for but whose methods are no less/more valid.

If you are pointing fingers at me and claiming that I use Hardocp as my only resource when I'm looking into new technology, then you need to give your damn head a shake

You need to give your own head a shake and re-read what I said. I kept the sentence singular, about MY likes and dislikes; your internalizing it is your issue, bud!
My point was that the [H] review offers very little information and limits your ability to get a feel for a piece of hardware with disproportionately small slices of performance. It's nice as a companion to other reviews, but if I had only one to choose from, then I would prefer reviews like Computerbase, TechReport, Xbit, Digit-Life, THG, and yes, even Anand on occasion, because if I were limited to the one review, they do a good job of running extensive tests to expose the limits of the hardware and to compare and contrast. Some are better than others, but rarely do I get a good feel for the hardware with an [H] review. The best thing they have is the histogram, which tells me more than the 'highest playable settings'.

I don't even know what you're trying to get at here, but you missed the entire point of my argument to begin with.

No, but then again I wasn't talking about that; I was specifically replying to your statements about [H], not about CPUs. I have a feeling that is why you don't know what I'm getting at, and you miss the point more than me.

You're reading into it too much; there is nothing conspiratorial about it. If you want nothing but same-settings benchmarks, then there are plenty of sites that offer that.

I'm not reading too much into it, but you're definitely discounting too much if you see no flaw in their current implementation. As for other sites using the same settings: yes, there are many that do that, and there were others that used best-playable for a while; however, they moved away from that once people realized that different cards react differently to different stresses (do the GF8/9 and HD2/3K react the same to AA and resolution?), so you need to show those differences. Not everyone plays at the same settings, nor do they all have access to the same resolutions, nor do they value settings similarly. So testing a card at one resolution and one setting has little utility.

If you don't like the reviewing style, fine, but as far as I can tell you're just trolling my thread and throwing your bloated bias around...

Like you aren't?

Look, I'm pretty open with my criticisms to Kyle and in the [H] forums when I see something. The last time I saw something questionable was in the Age of Conan review:
http://www.hardforum.com/showpost.php?p=1032596440&postcount=85

You tell me whether this issue supports what StrangerStranger and I are saying, or whether [H]'s methods really are without fault.

Hey man, if they can't get an apples-to-apples test right (which so many other sites seem able to, as you point out), then how can I trust them with something as subjective as a 'max playable settings' test? And this is far from the first time; the errors always seem to appear in one direction.
Regardless of what you think my motives are, there or in this thread, as if anyone who disagrees is a troll, the data speaks for itself IMO.
 
Now, with just a quick look at the VERY FIRST test of the HD 4870 X2 review, we encounter a familiar issue. Is this sufficient to call you a monkey's uncle?

[image: firstherrorqd1.jpg]


It's anomalies like this that are commonplace and make me question what was happening during those 2-3 fps moments, and the bunch of sub-10 fps moments for both cards. Is the reviewer missing this when testing, or is it just a 'feeling' of better performance?
I want to ask them, like Axel in Beverly Hills Cop 2, "Are you driving with your eyes open? Or are you, like, using The Force?"
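(That's exactly the kind of thing a quick pass over the frame-rate trace would surface. A minimal sketch, assuming a hypothetical per-second fps capture rather than real review data:)

```python
# Bucket a frame-rate trace into a histogram and flag the sub-10 fps
# moments. fps_trace is a hypothetical per-second capture.

from collections import Counter

fps_trace = [34, 31, 9, 3, 2, 28, 33, 8, 30, 35]  # illustrative values

# 10-fps-wide buckets: 0-9, 10-19, 20-29, ...
histogram = Counter((fps // 10) * 10 for fps in fps_trace)

# Every second where the frame rate dipped below 10 fps.
dips = [(second, fps) for second, fps in enumerate(fps_trace) if fps < 10]

for bucket in sorted(histogram):
    print(f"{bucket:3d}-{bucket + 9} fps: {histogram[bucket]} samples")
print("sub-10 fps moments (second, fps):", dips)
```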



Edited to crop the image; forgot about the gutter created by Paint when adding info quickly.
 
To me, it's all about trust. I see mistakes in most reviews anyway. BUT, with other sites' review methods I have the ability and opportunity to run the tests myself; other sites use the same methods, which I have access to, and I can compare. Does running at one resolution help me if it's not my resolution? Is AA more important? AF? Total speed, damn the eye candy (this isn't me, I love my eye candy) for an FPS game? More games? There's just too much trust involved; without being able to actually reproduce their efforts, it's either useless or I have to believe them.

OTP, I too saw those things in their reviews, all the things Ape brought up, again. We've all seen it. Not that it's bad; it is interesting, it is different, but to me it's not the best, nor the most concise. And sometimes they just miss the point. I've read their forums after a review and seen the responses, and they are defended to the nth degree. Just like the Tom's "way it was meant to be played" thread, where a few people were claiming Tom's was biased towards Nvidia: I defended them, but then I also asked why there wasn't a follow-up 4870 review, which we did get. THAT'S apples to apples, making sure it's done right, no excuses, and open to a bit of opinion. That's just not what I've seen at [H], in their reviews or in their forums.
 


Well, more cores will be used later, just not now. CPU bottlenecks disappear (or lessen) at higher resolutions and with filters on. I'm bottlenecked by my CPU with a 3870x2 at 1280x1024, but won't be at 1920 come September. If I had a 4870x2, I might be bottlenecked at 1920 even with the Phenom 8750 overclocked to 2.8 GHz (which I'm considering). It's all relative. Plus, bottlenecks vary by game.
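(The logic here is just a min() of two caps. A toy sketch with made-up numbers, not measurements, to show why the bottleneck moves as resolution rises:)

```python
# Delivered fps is capped by whichever of the CPU or GPU is slower at a
# given resolution. All numbers are illustrative values.

cpu_cap_fps = 60  # roughly resolution-independent

gpu_cap_fps = {   # hypothetical GPU throughput by resolution
    (1280, 1024): 140,
    (1920, 1200): 55,
    (2560, 1600): 30,
}

for (w, h), gpu_fps in gpu_cap_fps.items():
    delivered = min(cpu_cap_fps, gpu_fps)
    limiter = "cpu" if cpu_cap_fps <= gpu_fps else "gpu"
    print(f"{w}x{h}: {delivered} fps ({limiter}-bound)")
```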


 
OK, those CPUs used in multi-GPU configurations will incur slowdowns at regular CPU clocks in a lot of games with the new cards, heheh. So you need to OC your CPU to get maximum, and sometimes even playable, fps, depending on the game and resolution.
 



Thankfully, ovaltineplease, I listened to your advice about the X38/X48 and completely ignored it, so it all worked out in the end.


Anywho, that's me all over: someone tells me to do something, I do the opposite.
 
