Best Of The Best: High-End Graphics Card Roundup


IzzyCraft

Distinguished
Nov 20, 2008
Lol, goddamn fanboys left and right. It's a review benchmark; as long as they list all the info on how they ran the games and what the setup was, I see nothing wrong with the results.

Y'all complain about DX10.1 when you know you won't use anything past DX9 to save FPS anyway. You complain about hand-picked results, but unless you run benchmarks at every resolution and every possible setting, you're going to look a bit biased. I could just as easily call Xbitlabs or AnandTech ATI-biased; mostly AnandTech, since just about all their articles are about ATI, and when they review cards the pictures are usually only of ATI cards.

We already know some games run better on Nvidia and some run better on ATI, probably due to drivers. Whoop-de-freaking-do; you should only care if you actually play those games.

So quit griping and be thankful for the results. As long as they tell you the settings and the test machine, you shouldn't say jack about the results, as long as they weren't lies.
 

spearhead

Distinguished
Apr 22, 2008
I think half of these benchmarks were crap. You either used incompatible hardware or crap drivers, because the Radeons are pretty much on par with Nvidia's offerings; the 4890, for example, is as fast as the GTX 275, with a maximum difference of 5-10% in certain tests. I have seen quite a lot of benchmarks out there where the 4890 beat or matched the GTX 275. Here you have a few benchmarks that would make you believe a Radeon 4870 X2 couldn't even manage 30+ FPS where a GTX 280 gets 120 FPS. That is total bullshit!
 

sebastian869

Distinguished
Jan 5, 2009
I can't seem to find this "MSI" water-cooling kit they claim they used on the 295. Can someone please let me know where to look, or what else (equally compact) could replace it?

thanks,
Sebastian
 

jontseng

Distinguished
May 27, 2009
This is just bad analysis, plain and simple. It does no service to the author, to TG, or to the readers. Come on folks, this is elementary Logic 101:

1) EITHER there is a problem with the Last Remnant test, OR we must accept that the GTX 295 is actually 639% faster than the 4870 X2 (303 aggregate frames versus 41), and that the 4850 is 17% faster than the 4870 X2 (48 frames versus 41).

2) In spite of this rather obvious issue, the author simply included the test and made no comment about it. The appropriate action is clearly either to exclude the test or to include it with an explanation of the discrepancy. The author did neither.

The error was compounded because he used the flawed data to draw (similarly flawed) conclusions. For example, he claims that the EVGA 295 is "29.8% faster" than the 4870 X2 based on the aggregate frame-rate data. In total the EVGA 295 tallies 620 more frames than the X2, but 262 of that difference (42% of the outperformance!) is attributable to this single flawed test.

3) I have to conclude that EITHER he did not notice the anomalous result (in which case he is incompetent) OR he did notice it and ignored it (in which case he is being misleading).

Which is it to be?
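
To check the arithmetic yourself, here is a quick sanity-check script; it uses only the frame counts quoted above, nothing else from the article:

[code]
# Frame counts quoted above: Last Remnant results and the aggregate lead.
gtx295, x2, hd4850 = 303, 41, 48   # Last Remnant frames per card
aggregate_lead = 620               # EVGA 295's total frame lead over the 4870 X2

print(f"GTX 295 vs 4870 X2: {100 * (gtx295 - x2) / x2:.0f}% faster")   # ~639%
print(f"4850 vs 4870 X2: {100 * (hd4850 - x2) / x2:.0f}% faster")      # ~17%
print(f"Last Remnant share of the lead: "
      f"{100 * (gtx295 - x2) / aggregate_lead:.0f}%")                  # ~42%
[/code]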
 

jontseng

Distinguished
May 27, 2009
On a secondary note, the documentation of the "Prozent normalisiert" (normalised percentage) metric, which should, in theory, help iron out the discrepancies outlined previously, is completely inadequate. This would be an issue under any circumstances, but in a situation where the author is demonstrably EITHER incompetent OR misleading, it is doubly serious.

There is a vague description that "These adjustments, or weights, create a different sum that has been mathematically adjusted to represent a fair basis for comparison," with no explanation of the methodology or of the weights involved. This makes it impossible to assess whether the results are meaningful or not.

I would suggest they are not. One obvious anomaly jumps out: the normalised results show the GTX 275 as faster than the GTX 280. How can this be, given the 280 is the same GPU (give or take a 55nm shrink) with MORE memory and a WIDER memory bus? How can it possibly rate as a SLOWER GPU in the normalised tests?
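
Absent any published methodology, here is a minimal sketch of what a documented normalisation could look like; the card names, test names, and numbers are made up for illustration, since the article never states its weights:

[code]
# Hypothetical per-test normalisation sketch (made-up data for illustration).
# Score each card relative to the fastest card in each test, then average
# the per-test percentages, so one outlier test cannot dominate a raw sum.
fps = {
    "card_a": {"test_1": 120, "test_2": 60, "test_3": 303},
    "card_b": {"test_1": 115, "test_2": 62, "test_3": 41},
}

def normalised_score(card):
    """Mean of per-test percentages relative to the best result per test."""
    tests = next(iter(fps.values()))
    return sum(
        100 * fps[card][t] / max(results[t] for results in fps.values())
        for t in tests
    ) / len(tests)

for card in fps:
    print(card, "raw sum:", sum(fps[card].values()),
          "normalised: %.1f%%" % normalised_score(card))
[/code]

Even a scheme like this should be published alongside the results; otherwise readers cannot tell whether the weights were chosen before or after the anomalous Last Remnant numbers came in.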
 
Guest
When something goes wrong in one or more measurements (at metrology laboratories, at least), we usually just skip the results that are definitely wrong. Well, this is not the best test I've ever seen; in fact, it's one of the worst. We have an AA problem with DX10? Skip that, let's see how DX9 does... AMD cards are extremely slow in a game or two? No problem here.
 

sonofhendrix

Distinguished
Jul 16, 2008
Is it just me, or has the PC graphics card market gone stale? I bought my ATI 4870 a long time ago at a cheap price, and I would have expected it to be well superseded by now.
Is it this great recession that's causing the chip makers to simply suspend development of new GPUs because of costs?
I'm not complaining, as long as new games are well optimized to take advantage of current-gen GPUs.
 

Netherscourge

Distinguished
May 26, 2009
Why would they sell a video card with a stock cooler if everyone then suggests we're in trouble unless we completely replace it with our own, more expensive, water cooler?

Why even bother selling video cards with air coolers anymore? Why don't the manufacturers make water coolers the stock cooler in the first place, if the stock air cooler is ultimately going to make the card fail?

Makes no sense to me.
 
Guest
Why aren't there any 2560x1600 results? I agree with 'drealar': 1920x1200 should be the lowest resolution tested. If I'm buying a big card, I'm pairing it with a big 30" monitor and its native resolution. I have actually been holding off on buying a video card because I'm not happy with the FPS of the current offerings.

Will Nvidia's next GPU finally be worth getting with a 30"?!

In the future, please match the tests to the target market (i.e., use the resolutions the potential owner would actually run). An ASSumption on my part.
 

steiner666

Distinguished
Jul 30, 2008
Why not include Crysis/Warhead among the benchmark games? Isn't it still the most GPU-intensive game out there? I've personally been holding off on getting a new GPU until a decently priced one can play it maxed out at 1080p with a consistent 30+ FPS... so this article is no help to me.
 

tristanx

Distinguished
May 29, 2009
This review is a bit broken because of The Last Remnant. That game is a pain for ATI cards: they take a huge performance hit when the texture/shadow setting is at maximum. It would be fairer to set texture/shadow to the lowest level for all cards.
 

magicandy

Distinguished
Jun 8, 2008
[citation][nom]Netherscourge[/nom]Why would they sell a video card with a stock cooler if everyone then suggests we're in trouble unless we completely replace it with our own, more expensive, water cooler? Why even bother selling video cards with air coolers anymore? Why don't the manufacturers make water coolers the stock cooler in the first place, if the stock air cooler is ultimately going to make the card fail? Makes no sense to me.[/citation]

Because if your card fails it means you have to buy a new one.
 

sirrell

Distinguished
May 3, 2008
Loved this article!
Especially that you used a few retail cards we've all seen or heard of.
Thanks to this review I've decided to get a Zotac GTX 285 AMP'd, and eventually three of them in SLI on a 790i mobo.
:D
 

sirrell

Distinguished
May 3, 2008
[citation][nom]Netherscourge[/nom]Why would they sell a video card with a stock cooler if everyone then suggests we're in trouble unless we completely replace it with our own, more expensive, water cooler? Why even bother selling video cards with air coolers anymore? Why don't the manufacturers make water coolers the stock cooler in the first place, if the stock air cooler is ultimately going to make the card fail? Makes no sense to me.[/citation]

They don't suggest modifying the cards; the water-cooled EVGA one ships stock with copper piping and a stock water hookup.
Retail ones are the branded cards at the shops.
Note they aren't all water-cooled, either.
They suggest buying an expensive brand like EVGA if you want results!

Besides, some people are still wary of mixing water and electricity, so fans are for them.
;)
 
Guest
I've been wondering: are those power consumption figures for just the GPU, or for the entire test system?
 

asuran83

Distinguished
Jun 10, 2009
Anyone have an idea of the release date for the EVGA Hydro Copper?

I heard a rumor that Nvidia will not allow an overclocked 295 out until they release an updated version of the card.
 
Guest
Saw this on the last page of the article:
"(using it as a standard air-cooled GeForce GTX 295 card is simply asking for trouble)"

I've had a BFG GTX 295 for 5 or 6 months now (air-cooled) with an i7, and no problems here.
Not sure what that statement is about.
 

bikeracer4487

Distinguished
Jan 20, 2009
[citation][nom]billy1ear[/nom]Saw this on the last page of the article: "(using it as a standard air-cooled GeForce GTX 295 card is simply asking for trouble)" I've had a BFG GTX 295 for 5 or 6 months now (air-cooled) with an i7, and no problems here. Not sure what that statement is about.[/citation]

I think the reviewer was saying that running the WATER-COOLED version of the GTX 295 on air alone would be asking for trouble, as it would quickly overheat without the water loop. I don't think he was saying that using any normal 295 is asking for trouble.
 