I am sad about HD 5870



I'm just curious: will one GTX 360 fit in a CM 690?

I think the CM 690 is one of the biggest mid-tower cases, so I hope it will fit; I don't want a full tower.


Well, until we have the final size of the card, we really won't know. Are you referring to the 360 or the 380 (GTX)? The 380 might be huge, but again, that's pure speculation. Hopefully as the cards take shape and proliferate we'll get more concrete numbers.


Jan 2, 2008
Just out of curiosity... how did you get three, troll? I haven't seen a place with more than a two-per-person limit. To be honest, I'm not sure I believe you. Also, if you have $1140 to spend, then why the HELL didn't you wait for the 5870 X2 to come out and get two of those for less? You would undoubtedly get better performance.


May 5, 2009
I don't know what that chart is showing, but it contradicts ATI's power claims and THG's own quotes from the same article. I'm talking about active and idle consumption according to AMD's website and the info I got from Tom's article on the 5870. I quoted Tom's above; did you not read the quote?

Let's do a simple math lesson here, kiddo.

One 5870 uses 188 W when active, so there's no way two can use 561 W.

One 5870 idles at 27 W, so there's no way two can idle at 166 W. Starting to get it now?

That chart shows the power consumption of the entire system, but I guess that's too difficult for you to understand.

i7-975 OC 4 GHz / X58 / 6 GB RAM / 2x HD 5870: 561 W under load
i7-975 OC 4 GHz / X58 / 6 GB RAM / GTX 295: 482 W under load

561W > 482W

get it now?

If you still believe that two 5870's consume less power than the GTX 295, then you are living in your own little world…

Not to mention, having three 5870's would be totally moronic because there is no CPU out there, even at 4.5 GHz, that could keep up with them. So if you did get mommy to buy you three 5870's, then you're a total friggin' idiot to begin with and you wasted your money (oops, mommy's money).
I have a 30" (2560x1600) display and an i7-975 OC'd to 5 GHz (custom W/C); as you can see, the i7-975 can keep up with three 5870's perfectly well…


I'm not sure I like this new trend where reviewers use FurMark or OCCT to create ridiculously high power consumption numbers for graphics cards; you won't see anything close to those numbers when gaming...

And I think this is the right time for a little funny flash:
Successful troll is successful
Since that Tom's article clearly shows the facts on the graph and then says the complete opposite in the text.


May 22, 2009
Sorry, I just have to jump in with regards to the power consumption - it's pretty straightforward, and Successful Troll is more correct.

first, Tom's quote:

Incidentally, that’s 71W more than a Radeon HD 4870 X2 and 79W less than a GeForce GTX 295.
But look at the chart (power consumption at the socket - how could it NOT be the load of the entire computer? Also, if everyone measures the 5870 at 180-ish, then why does this chart have it at 354?)

The article text is clearly a typo.

The numbers (I can't copy paste the chart here):

1x 5870:
idle = 141
load = 354

2x 5870:
idle = 166
load = 561

GTX 295:
idle = 195
load = 482

So the difference between one and two HD 5870's is, surprise surprise, 25 W idle and 207 W load (curiously, right within the boundaries of the oft-quoted 27 W and 188 W per-card figures).

If the 295 uses 128 W more than a single 5870 at load, how could the second 5870 possibly add merely ~60 W and leave the pair 70-odd watts below the 295, as the article text claims?

However, the two 5870's do use less power at idle than the 295, namely 29 W less.

These charts aren't that complicated....
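The subtraction the posts above walk through can be sketched in a few lines of Python. The wattages are the system-level figures quoted from the chart in this thread; the dictionary layout is just for illustration:

```python
# Wall-socket measurements from the chart (entire system, in watts).
systems = {
    "1x HD 5870": {"idle": 141, "load": 354},
    "2x HD 5870": {"idle": 166, "load": 561},
    "GTX 295":    {"idle": 195, "load": 482},
}

# The second 5870's contribution is the delta between the 2-card and 1-card systems.
second_card_idle = systems["2x HD 5870"]["idle"] - systems["1x HD 5870"]["idle"]
second_card_load = systems["2x HD 5870"]["load"] - systems["1x HD 5870"]["load"]
print(second_card_idle, second_card_load)  # 25 207 -- close to the 27 W / 188 W per-card specs

# The 295 system draws 128 W more than the single-5870 system at load...
print(systems["GTX 295"]["load"] - systems["1x HD 5870"]["load"])  # 128

# ...but 29 W more at idle than even the dual-5870 system.
print(systems["GTX 295"]["idle"] - systems["2x HD 5870"]["idle"])  # 29
```

The point is simply that subtracting two whole-system readings isolates the second card's draw, which is why the chart and the per-card specs don't actually contradict each other.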


Sep 26, 2008

How does this matter anyway? Two GTX 285's use more power than two HD 5870's. And while two HD 5870's use more power than one GTX 295, they also bring a lot more performance to the table than one GTX 295. It's also very likely that an HD 5870 X2 will use much less power than two HD 5870's (just as the GTX 295 uses much less power than two GTX 285's), probably even less than a GTX 295.

- ATI gives you more bang per watt

- ATI gives you more bang per $

- ATI gives you more bang in general: four GTX 285's or two GTX 295's would lose to four HD 5870's or two HD 5870 X2's, and one HD 5870 wins against one GTX 285



Sep 2, 2008
You did miss one major thing, though. If those measurements are taken at the wall, the second card isn't using 207 watts, at least not in the way we traditionally state power consumption. You forgot the efficiency of the power supply, which is probably ~80%. That means ~20% of that 207 W is actually lost in the PSU, meaning the second card is consuming closer to 165 W.

But yes, the article text is obviously a typo, because the difference in power usage between the GTX 295 and HD 4870 X2 was never 150 watts. And those numbers are definitely for the full system.
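The PSU correction described above fits in a tiny sketch. The ~80% efficiency figure is the poster's rough estimate, not a measured value, and the function name is mine:

```python
# Assumed PSU efficiency -- the poster's ~80% estimate, not a measurement.
PSU_EFFICIENCY = 0.80

def dc_power_from_wall(wall_delta_watts, efficiency=PSU_EFFICIENCY):
    """Convert a wall-socket power delta to the DC power a component draws.

    The PSU dissipates (1 - efficiency) of the input as heat, so only
    efficiency * wall_delta actually reaches the components.
    """
    return wall_delta_watts * efficiency

# The second HD 5870 added 207 W at the wall; at ~80% efficiency that is
# roughly 166 W of actual DC draw, consistent with the ~188 W card rating.
print(round(dc_power_from_wall(207), 1))  # 165.6
```

This is why a wall-socket delta always overstates a component's own draw: the conversion loss inside the PSU scales with everything the system pulls.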


Mar 21, 2006

With that chip size it's probably not possible, and even if they do it, ATI will cut prices on the 5870.



I disagree, since all the benches are basically comparing old Nvidia GPUs to the new 5xxx series; IMO it is only fair to compare both next-gen brands. Just because Nvidia is taking longer than usual to launch does not mean that you cannot compare both brands (next-gen). AMD could have done the same exact thing, but instead they decided to launch early, so all in all, the real comparison will be done sometime next year....

The reason many peeps here on Tom's are whining about the comparison issue is that they underestimated what Nvidia will soon release... Many thought that Nvidia was going to flop, and that is not the case......

BTW, Nvidia has been testing their next-gen GPUs for quite some time, meaning the tech/arch was already developed... They are just having a couple of issues that need to be fixed before launch, so basically you cannot say that comparing both brands gives a disadvantage to AMD.


Sep 2, 2008
We can only compare with what's out. If Nvidia takes 6 months to bring anything out, there could very well be a refresh from ATI out; I mean, we have the same amount of evidence for both things.

The fact that Nvidia will probably bring out their next gen in a few months means nothing for people who are building now, who are upgrading now, and who are buying a card now. With no good guess of when Nvidia could have cards out, and no benchmarks at all, how can you really tell someone to wait, especially when it looks like it'll be after Christmas before Nvidia really has anything?