Radeon R9 295X2 8 GB Review: Project Hydra Gets Liquid Cooling


Reaver192

Distinguished
Oct 2, 2011
61
0
18,630
What is it about AMD cards that allows them to perform so well at ridiculous resolutions? Is it the pixel fill rate or the memory bit rate? Am I missing something?
 

Ninjawithagun

Distinguished
Aug 28, 2007
747
16
19,165
These tests are invalid IMHO as they only tell half the story. Why did Tom's Hardware decide not to use the 6GB Titan? That is the ONLY reason the Titan SLI configuration scored lower at 4K resolutions. Everyone who knows anything about gaming at extremely high resolutions knows that a much larger frame buffer is required to render at higher frame rates. 3GB of GDDR5 is not enough to game fluidly at 4K. I'll wait for Tom's to redo this article properly using the 6GB Titan and the new dual-GPU Titan Z once it becomes available. This article only shows what we already knew - nuff said!
 

ekagori

Honorable
Feb 9, 2013
407
2
10,960
@ninjawithagun If you look at the game charts provided, you will notice they did use SLI Titans in the comparison, and they still scored below the 295X2. Even their 6GB of VRAM did not help the Titans.
 

ALL Titans have 6 GB. There is no 3 GB Titan. The Titans used in this review have 6 GB of VRAM, so your criticism is ridiculous. The Titans in SLI have 50% MORE memory per GPU than the R9 295X2 or an R9 290 CrossFire setup.
 

Ninjawithagun

Distinguished
Aug 28, 2007
747
16
19,165
@Sakkura

I did clarify properly in my post. The reason for my opinion is that Tom's used the old first-generation Titans, not the new Titan Black cards that have a fully unlocked GK110 GPU with 2,880 shader units as well as higher GPU and memory clock frequencies. Thus, my point is still valid :p
 

Isaiah4110

Distinguished
Jan 12, 2012
603
0
19,010
Your first response makes absolutely no mention of either the Titan Black Edition or an "old/original vs. new" version of the Titan. You only complained about the falsely alleged "lack of memory" on the Titans they used in the benchmarks.

That said, the fully unlocked GPU will not change any of these test results. You cannot compare a manually overclocked model from one manufacturer to the stock rates of another and consider it a fair benchmark. Comparing stock settings on the Titan Black Edition to stock settings on the "old model" Titan is not going to produce a large disparity in performance. If TH does a review of the overclockability of the 295X2 and its performance when overclocked, then yes, the Titan Black vs. the regular Titan would make a bigger difference, but not in benchmarks run at stock settings.
 

Lessthannil

Honorable
Oct 14, 2013
468
0
10,860
I don't see how the 384-bit bus on the 780 (Ti) would hold it back at 4K. Despite the 290(X)'s 512-bit bus, the Nvidia cards still have more memory bandwidth.

Yeah, it's just you. It's only more expensive for people who feel they need the latest and greatest video card. Otherwise, graphics power is getting cheaper. A $150 graphics card (750 Ti or R7 265, take your pick), now considered lower-mid range, matches or beats the HD 7850, which started retailing at $250. The HD 7970's price has essentially been halved from its starting price of $600 (I think? Could have been $550, too).

__

Good card, but there's one problem: the heat from the VRMs and the power cables is worrisome. From TechPowerUp's thermal imaging, the VRMs can get as hot as 105°C. They are rated to withstand temperatures up to 120°C max, but running them at or close to that temperature probably isn't good for longevity. The 8-pin power connectors also run hotter than the actual GPUs.

See for yourself. https://www.youtube.com/watch?v=gVDnQomkkaI#t=385
 

anthony8989

Distinguished
Wow, the PCB is so densely packed - it looks awesome. I'm surprised to see the 295X2 beat out GTX 780 Tis in SLI. The bar has been set pretty high for Nvidia.

Also, thank you, Tom's, for adding Arma 3 to the benchmark suite.
 

When was that? In 2006, the GeForce 8800 GTX launched with a $600 price tag, and each dollar was worth more back then. The top single-GPU cards today, the R9 290X and GTX 780 Ti, can be had for $570 and $650 respectively. No real change; if anything, prices are a little lower once you adjust for inflation.
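
To put a number on that, here's a quick sketch of the inflation adjustment; the CPI figures are rough assumptions from memory, not taken from the article:

```python
# Rough inflation adjustment for the 8800 GTX launch price.
# CPI values are approximate US CPI-U annual averages (assumed, not sourced).
CPI_2006 = 201.6
CPI_2014 = 236.7

launch_price_2006 = 600.00  # GeForce 8800 GTX launch MSRP

adjusted = launch_price_2006 * (CPI_2014 / CPI_2006)
print(f"$600 in 2006 is roughly ${adjusted:.0f} in 2014 dollars")
# -> roughly $704, i.e. above today's $650 GTX 780 Ti or $570 R9 290X
```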
 


It's the width of the memory bus. Nvidia has had tremendous trouble scaling up their memory bus width... and even when they do, they don't seem to get any real performance improvement from the wider bus.

AMD cards are simply able to move a lot more data to and from the RAM at any one time. That said, all that bus width means almost nothing at lower resolutions; at lower resolutions it's memory SPEED that matters. This is why the 780 Ti, which has a faster memory clock but a narrower bus, generally outperforms the 290/290X at lower resolutions.

It was actually this technical detail that predicted this behavior before we even saw any benchmarks of the new 290/290X. It was widely predicted that at 4K these Hawaii cards would outperform the Nvidia 780/Titan thanks to the memory design, yet at 1440p and lower the 780/Titan would outperform the new Hawaii cards. Which is exactly what happened.
 

That's not correct. AMD and Nvidia are simply using different strategies when it comes to memory bandwidth.

The GTX 780 Ti has a memory bandwidth of 336 GB/s. The R9 290X "only" has 320 GB/s. It's just that the GTX 780 Ti gets there with a 384-bit bus and extremely high memory clocks (7000 MT/s), while the R9 290X gets there with a 512-bit bus and moderate memory clocks (5000 MT/s). Which approach you use doesn't really matter for performance; what matters is the bandwidth you end up with.
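
To make the arithmetic explicit, here's a minimal sketch of how those figures fall out of bus width and transfer rate (nothing beyond the numbers quoted above):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (effective transfer rate)."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfer_rate_mt_s / 1000  # MB/s -> GB/s

print(memory_bandwidth_gb_s(384, 7000))  # GTX 780 Ti: 336.0 GB/s
print(memory_bandwidth_gb_s(512, 5000))  # R9 290X:    320.0 GB/s
```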

The problem the GTX 780 Ti has at high resolutions is simply that it only has 3GB memory, while the R9 290X has 4GB.
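
As a back-of-the-envelope illustration of why capacity bites at 4K (the buffer count here is an assumption for illustration, not taken from any game in the review):

```python
# Approximate memory for a set of full-screen RGBA8 render targets.
# Six buffers is an arbitrary illustrative figure (G-buffer + post-processing).
def render_target_mb(width: int, height: int, buffers: int = 6) -> float:
    bytes_per_pixel = 4  # RGBA8
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(render_target_mb(1920, 1080))  # ~47 MB at 1080p
print(render_target_mb(3840, 2160))  # ~190 MB at 4K -- 4x the footprint
```

Render targets are only part of the picture, but everything that scales with resolution roughly quadruples from 1080p to 4K; once total usage spills past 3GB, the card starts swapping over PCIe, and that is plausibly where the frame rates collapse.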
 


My understanding is a little different... I know bandwidth is bus width x speed, and generally bandwidth is the important number. However, from what I've seen and heard, bus width becomes a bottleneck in and of itself when dealing with huge blocks of memory at a time, and when bus width isn't a bottleneck on its own, memory speed has an impact on performance as well.

My understanding might be wrong... but from what I've seen, any GPU in the past with a very wide memory bus tends to do better at high resolutions (how high depends on the width of the bus in question) than a similar card with a narrower bus, while cards with higher memory speeds tend to do better at lower resolutions than similar cards with lower memory speeds.
 

jlwtech

Honorable
Mar 8, 2012
58
0
10,630
I doubt I could make use of all that power, even with my new 1440p monitor.
However, I still find myself thinking "I gotta get one. It's soo awesome."

It's been a long time since any AMD product has done that to me....
Keep up the good work.
 

jlwtech

Honorable
Mar 8, 2012
58
0
10,630
Then why did you say it?
Are you trying to elicit some comments from the "green team"?
 

rdc85

Honorable
Damn nice card...

Now I wonder if other AMD partners will provide their own custom coolers for this..

edit:

That's the wonder of water cooling; liquid can transfer heat much more quickly and evenly.. :D
 

ExpLicitSainT

Reputable
Apr 7, 2014
9
0
4,510
It would be nice if it supported 4K @ 60Hz over HDMI to connect to my Samsung 55" F9000 4K TV. I don't mind shelling out money for a beautiful TV, but it's a lot harder to pull that kind of money out for a smaller monitor meant only for gaming. AMD, please add HDMI 2.0 support to this card!! PLEASE!
 


Why?
I could get about the same performance with two good GTX 780s (about the same as a Titan). This card can pull ahead slightly at 4K, but then it's usually under 60 FPS, so why would you build a $3,000 computer and then run below 60 FPS?

CrossFire is still inferior to SLI overall, so that's another problem.

It's an interesting card, though, and if it stayed at $1,500 it might sell well. But if I had $1,500, I'd get a GTX 780 Ti and the upcoming G-Sync monitor (27", 2560x1440, 120Hz) from Asus.

I'd far rather game at 1080p or 1440p and be really smooth with G-Sync than put my money into chasing 4K and not being as smooth.
 
Jeez ... it's a monster. I couldn't get any 290X cards here, so I settled on two 280Xs.

The biggest issue with these high-end cards is availability ... both AMD and Nvidia let us down in this respect.

The shops make us pay such a high premium to get one that it is never RRP.
 