PowerColor Devil R9 390X Review

Hmm... so a card that's pre-overclocked almost to its max, with water cooling, only around six months late (or more, if you count the 290X 8GB as basically the same card as a 390X). At this point, if you have a decent card, wait for 16nm.
 
An impressive result by AMD and PowerColor! I'm looking forward to future (more modern) releases from these companies, hoping they bring more competition to the once-stagnant GPU realm!

I'm aware this is a review of the Devil R9, yet I'm curious why the GTX 980 was mentioned in the noise graph but omitted from the temperature graph. I get the feeling it's because it would show the card throttling on thermals, which would help explain its performance in the earlier tests. This is strictly speculation on my part, however, and highly biased, as I currently own a 980.
 
Does the pump run constantly? I wish there were a hybrid liquid/air cooler that ran only the fan when idle, then turned on the water pump for more intensive tasks. I don't like the noise of water pumps when the rest of my system is idle.
 

The pump has to run, even at low RPM; otherwise the card would overheat. The waterblock itself is generally not enough to dissipate heat. It simply transfers the heat to the water, and the radiator does almost all of the heat dissipation. If the pump is off, there is no water flow through the radiator, which means heat from the waterblock is not carried away; the water in the block heats up and the GPU overheats.
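A rough lumped-parameter model makes the point. Every number below (heat load, water mass, radiator coefficient) is invented for illustration; none of it is measured from this card:

```python
# Rough sketch of why the pump must run: with no flow, the radiator
# sheds ~nothing and the water just soaks up the GPU's heat. All
# numbers here are made up for illustration.

GPU_HEAT_W = 250.0       # heat the GPU dumps into the waterblock (W), assumed
WATER_MASS_KG = 0.15     # water in the block and tubing (kg), assumed
C_WATER = 4186.0         # specific heat of water (J/(kg*K))
AMBIENT_C = 25.0
RADIATOR_W_PER_K = 20.0  # radiator heat removal per degree above ambient, assumed

def coolant_temp_after(seconds, pump_on):
    temp_c = AMBIENT_C
    for _ in range(seconds):  # 1-second time steps
        heat_in = GPU_HEAT_W
        # With flow, the radiator sheds heat in proportion to how far the
        # water sits above ambient; with the pump off it sheds ~nothing.
        heat_out = RADIATOR_W_PER_K * (temp_c - AMBIENT_C) if pump_on else 0.0
        temp_c += (heat_in - heat_out) / (WATER_MASS_KG * C_WATER)
    return temp_c

print(f"after 5 min, pump on:  {coolant_temp_after(300, True):.0f} C")
print(f"after 5 min, pump off: {coolant_temp_after(300, False):.0f} C")
```

With the pump on, the water settles roughly a dozen degrees over ambient; with it off, the same heat load sends the water past boiling within minutes.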

 
What's the point of testing on Windows 8.1? I mean, there's been enough time to upgrade to Windows 10 already... It has been shown several times that Windows 10 often provides a measurable performance advantage.
 
Why test on 15.7? Seriously, that's like six drivers old. AMD stated they will release WHQL drivers only occasionally, with more betas throughout the year. You're doing them a disservice by benching only "official" drivers.

Nvidia's latest... I dunno, 12 drivers in the last three months were all official, and most of them broke games or destroyed performance in a lot of other games.
 

At the time this review was written, it was not that old. As mentioned in the article, we first got this card over the summer. The tests were done a couple of months ago now, and at the time they were run with the driver that PowerColor suggested after we had problems with the first sample.


The temperature of the 980 was omitted because the ambient temperature of the room was 3 degrees cooler when that card was tested, which affected the results. I didn't have the GTX 980 in the lab to redo the tests with the new sample. I had the card when the defective 390X arrived for the roundup, but by the time the replacement came back, it was on loan to another lab.
Rather than delay the review even longer, I opted to omit the 980 from the test.

It had nothing to do with hiding any kind of throttling result. If we found that, we wouldn't sweep it under the rug.
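For what it's worth, one way to make runs from rooms at different temperatures comparable is to chart each card as a delta over ambient rather than as an absolute temperature. A minimal sketch in Python, with invented readings (not data from the review):

```python
# Report GPU temperature as delta-over-ambient so runs recorded on
# different days are comparable. The readings below are invented
# examples, not measurements from the review.

runs = [
    {"card": "Devil R9 390X", "gpu_c": 62.0, "ambient_c": 24.0},
    {"card": "GTX 980",       "gpu_c": 58.0, "ambient_c": 21.0},  # 3 C cooler room
]

for run in runs:
    delta = run["gpu_c"] - run["ambient_c"]
    print(f"{run['card']}: {delta:.0f} C over ambient")
```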


We have not made the switch to Windows 10 on any of our test benches yet. I don't make the call about when that happens and I don't know the reasons behind the delay.
 
Well then, sir @kcarbotte, I can't wait until you guys get to review some AMD GPUs on Windows 10, with the new Crimson drivers and some Skylake i7s thrown into the mix!
 
You and me both!
I have a feeling the Crimson drivers show better gains on Win10 than they do on older OSes.
 
*Face palm*
How is this an "impressive result"? The reviewer could not recommend it and said there are better options available, LOL.
 
Until some kind of drastic change comes about and the price of the highest-def monitors comes down to reality, I'm happier than a pig in poop, because the price-per-performance gains are minuscule. My XFX R9 290 and LG 27-inch 1920x1080 monitor work beautifully, thank you. VSR makes my display explode with reality!
P.S. TVs are cheap. When a monitor is classified as a PC device, its price soars. Are we really that stupid? A high-def 4K TV is hundreds of dollars cheaper than a "computer monitor", and that pertains to all resolutions and sizes.
 
Are you joking? Newegg's cheapest 4K monitor is $300 and their cheapest 4K TV is $250. Furthermore, many of the 4K monitors are only around $350, whereas the next-cheapest 4K TVs are over $400.

Also, labeling a TV as a computer monitor does not make it the same as a computer monitor. Monitors are generally built for lower response times, and there are other differences like that.
 
Because it directly competes with Nvidia's higher-end products, which is nice to see once more. Competition promotes advancement, which is good for everyone.

Without competition, advancement slows to a crawl (see Intel).
 
Not to mention most 4K TVs lack DisplayPort or HDMI 2.0 inputs, which are what enable refresh rates above 30Hz at 4K resolution. No one wants to use a 30Hz 4K TV as a monitor.
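The math behind that 30Hz ceiling is straightforward. A back-of-the-envelope sketch, assuming 8-bit RGB with no chroma subsampling and a rough 11% blanking overhead (the link figures are HDMI's published usable data rates):

```python
# Why 4K TVs without HDMI 2.0 top out at 30 Hz: the 60 Hz signal
# simply doesn't fit in an HDMI 1.4 link. Assumes 8-bit RGB, no
# chroma subsampling, and ~11% blanking overhead.

HDMI_1_4_GBPS = 8.16   # usable data rate after 8b/10b encoding
HDMI_2_0_GBPS = 14.4

def needed_gbps(width, height, hz, bits_per_pixel=24, blanking=1.11):
    return width * height * hz * bits_per_pixel * blanking / 1e9

for hz in (30, 60):
    need = needed_gbps(3840, 2160, hz)
    fits_14 = "fits" if need <= HDMI_1_4_GBPS else "does not fit"
    fits_20 = "fits" if need <= HDMI_2_0_GBPS else "does not fit"
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbps: "
          f"{fits_14} in HDMI 1.4, {fits_20} in HDMI 2.0")
```

4K at 30Hz needs only around 6.6 Gbps and squeezes through HDMI 1.4; at 60Hz it needs roughly double that, which only HDMI 2.0 or DisplayPort can carry.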

All I can say is that I'm running two of these in CrossFire and couldn't be happier. They play every game I throw at them with Ultra settings on my FreeSync 2K screen. I'm NOT overclocking them because, to me, the extra few frames per second won't be noticeable when I'm already at 100+.
 