Radeon R9 295X2 8 GB Review: Project Hydra Gets Liquid Cooling


That's mostly happening in the US. No such problems in Europe.
 
I guess I'm the only one thinking the price of graphics cards has become just outrageous over the years. I still remember when a top-of-the-line card was less than 300 bucks. Now I can get a used car or a new computer for less than a high-end graphics card, and I'm not just talking about this one.
 
Glad to see AMD nail this one. Performance - check! Noise - check! Heat - check! Appearance - oh hell yeah double check! And price? Yeah, they got that covered too.
 

So far as I know, no NV card has a 512-bit bus. You'd need to run a 6GB 780, a 3GB 780, a 4GB R9 290, and a 2GB R9 290 against each other. Half of those cards don't exist. It would be a very interesting experiment, though.
 


Back then there wasn't a high demand for graphics processing power. The graphics processing demands of video games have shot up in recent years. Additionally, screen resolution options are completely off the charts compared to what they were even 10 years ago. I remember 10 years ago when everyone still had a CRT monitor and 1280x1024 was considered a high-end resolution. Mainstream resolutions back then were more along the lines of 1024x768. Now mainstream is 1920x1080 and high end is 3840x2160. So back in the day you got about a 66% increase in pixel count going from the "mainstream" to the "high end" resolution. Currently the jump from "mainstream" 1920x1080 to "high end" 3840x2160 is a 300% increase in pixel count. Therefore there is a much larger disparity between "mainstream" and "high end" graphics card power and prices.
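For anyone who wants to double-check those percentages, here's a quick back-of-the-envelope sketch (the resolutions are the ones quoted above; the helper function is just made up for illustration):

```python
# Percent increase in pixel count going from a "mainstream" to a "high end" resolution.
def pixel_increase(low, high):
    low_px = low[0] * low[1]
    high_px = high[0] * high[1]
    return (high_px - low_px) / low_px * 100

# ~10 years ago: 1024x768 mainstream vs 1280x1024 high end
print(round(pixel_increase((1024, 768), (1280, 1024))))   # ~67% more pixels

# Today: 1920x1080 mainstream vs 3840x2160 high end
print(round(pixel_increase((1920, 1080), (3840, 2160))))  # 300% more pixels
```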
 


And I remember when 4MB of RAM was like $250, and that was more than a week's wages at minimum wage, so what about it?
 
The big Tom's Hardware watermark on every game benchmark graph just seems ridiculous and badly affects readability.

Maybe think of a more subtle watermark?
 
The 790 likely isn't coming, which saddens me. Kepler is on the way out and Maxwell is on the way in. But about a year or so from now I'll (possibly) be drooling over the 890.
 

It's going to be a while before we see high-end Maxwell GPUs, because Nvidia are waiting for TSMC to get their 20nm production sorted.
 


Yeah, it's a shame about that. I was thinking I'd go to Nvidia when Maxwell came out, but I want to see the performance numbers vs the AMD 300 series first. I just liked the idea of something like an 880 using less power than my 270X. But now I have some time to think, I guess.

Also, if they did release a 790, I would think it would be a move to keep putting out cards and bridge the gap in time between Kepler and Maxwell.
 
Can someone please explain how they came up with 500 watts? I understand it's twice that of the 290X, but with only 2 x 8-pin connectors and the PCIe slot, the rated limit is only 375 watts. Is this an "in general" spec for non-reference SKUs later on?
 
I may just switch to AMD for this, especially since the 790, if it gets released, will probably have only 3GB per GPU, as I last heard. I only have room left for one radiator for GPUs, so this looks good for me this time around.
 
"That also means most maximum power consumption calculations based on an assumed 75 W reading from the motherboard are wrong." That statement makes me squirm. :)

A bit controversial... But proof was provided.
 
Performance = Impressive
Frame Pacing = Non-issue
Cooling = Nice
Power Draw = Interesting

Overall = Winner!

Someone buying this GPU probably isn't worried about buying other quality parts (case, PSU, high end CPU), and power consumption probably isn't a concern either.

I am curious though: what's the temp of the air being exhausted? Is there room to OC? And lastly, is there a performance benefit from adding a fan for a push/pull exhaust?

Great review, Tom's! Can't wait to see how nVidia answers this: price cuts, or maybe a change to the specs of their upcoming dual-GPU solution?
 
3840x2160 vs 2560x1440:

4K gaming gets about HALF the frame rate vs 1440p, and most importantly drops below 60FPS in many titles.

However, I seriously doubt anybody could even tell the visual difference. Going from 1080p to 1440p mainly produces a slightly sharper HUD/text but everything else is basically exactly the same.

The ONLY two advantages of 4K versus 1440p that I can even think of are:
a) superior anti-aliasing, and
b) better textures (in future games or some current mods)

BOTH of these issues, however, will be solved for 1440p gaming by:
a) newer anti-aliasing methods, and
b) Tessellation (zooming in rebuilds higher textures automatically)

So in general, there's very little point to 4K gaming, at least until prices are not much more than an equivalent 1440p monitor. Gaming at 40 FPS at 4K when you could be doing 80 FPS at 1440p is just stupid.
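For context, here's the rough arithmetic behind the "about half the frame rate" observation, assuming performance scales roughly with pixel count (the FPS figures are just the illustrative ones from this post, not benchmark results):

```python
# Pixels per frame at the two resolutions discussed above.
px_4k = 3840 * 2160      # 8,294,400
px_1440p = 2560 * 1440   # 3,686,400

ratio = px_4k / px_1440p
print(ratio)             # 2.25x the pixels per frame at 4K

# If frame rate scaled purely with pixel count, 80 FPS at 1440p would land
# around 80 / 2.25 ~= 36 FPS at 4K, roughly the "half or worse" drop described.
print(round(80 / ratio))  # 36
```

Real scaling varies per game, of course, since not every workload is purely pixel-bound.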
 

As the test shows, you can exceed the rated wattage of the 8-pin connectors. Apparently it's safe to do with a quality power supply. And power consumption was measured at around 450 W, so a 500 W TDP isn't unreasonable.
 
You get the 500 watts with a maximized power target, as I wrote.
The peaks can exceed the 530-watt barrier, but this is not a real-world scenario. :)
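To spell out the arithmetic from this exchange, a small sketch (the connector ratings are the standard PCIe figures; the wattage numbers are the ones quoted in the thread):

```python
# Rated power budget on paper vs. the figures quoted in the thread.
pcie_slot = 75           # W, PCIe x16 slot
eight_pin = 150          # W, rated per 8-pin PEG connector

rated_budget = pcie_slot + 2 * eight_pin
print(rated_budget)      # 375 W on paper

measured = 450           # W, roughly what the review measured under load
tdp_spec = 500           # W, reached with the power target maximized
peak = 530               # W, short peaks only, not a real-world scenario

# The card draws well over 150 W through each 8-pin connector, which is why
# all three figures sit above the 375 W paper limit.
for watts in (measured, tdp_spec, peak):
    print(watts - rated_budget)  # 75, 125, 155 W over the rated budget
```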
 