Radeon R9 295X2 8 GB Review: Project Hydra Gets Liquid Cooling

Status
Not open for further replies.

Ahmadjon

Distinguished


No :D They would sell their current GPUs and get one or two R9 295X2s for mining :)
 

anthony8989

Honorable
Feb 2, 2013
It's funny you say that; tons of people have been flooding Craigslist and sites like it trying to offload dozens of high-end Radeon cards. At least in my region, anyway.
 

sayantan

Distinguished
Dec 9, 2009
Please post a quad-CrossFire review of this outstanding card, specifically to see how far AMD's technology has progressed on the quad-CrossFire front, so we can tell whether it's a viable option in the future.
 

Sakkura

Illustrious

DisplayPort can carry audio just like HDMI, though obviously you'll either need DisplayPort at the other end as well, or need to use an adapter.
 

TeamBLU 4K

Honorable
Feb 6, 2014
Another question... what would be the PCIe 3.0 requirements for putting two of these 295X2 cards in CrossFireX? Would it require dual x16 by default, or would x16/x8 be sufficient? Is there a way to calculate such requirements?
 
Another question... what would be the PCIe 3.0 requirements for putting two of these 295X2 cards in CrossFireX? Would it require dual x16 by default, or would x16/x8 be sufficient? Is there a way to calculate such requirements?
x16/x16 PCIe Gen2 or x8/x8 PCIe Gen3 should work just dandy.

TechPowerUp has some great benchies -- even going so far as to say PCIe Gen2 x8/x8 does a bang-up job in multi-card.

edit: I forgot ...

Going above 1080p tends to really even things out ... assuming, of course, that you do not plan to run 2x295s at 1680x1050 :D
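For anyone wondering how to calculate it: usable link bandwidth is the per-lane transfer rate times the line-encoding efficiency times the lane count. A quick sketch using the published PCIe figures (the function name and structure are my own, not from the thread):

```python
# Back-of-envelope PCIe bandwidth math behind the "Gen2 x16 ~ Gen3 x8"
# rule of thumb. Figures are from the PCIe specs; everything else is
# an illustrative sketch.

def pcie_bandwidth_gbs(gen, lanes):
    """Approximate usable bandwidth of a PCIe link in GB/s."""
    specs = {
        2: (5.0, 8 / 10),     # Gen2: 5 GT/s, 8b/10b encoding (20% overhead)
        3: (8.0, 128 / 130),  # Gen3: 8 GT/s, 128b/130b (~1.5% overhead)
    }
    rate_gts, efficiency = specs[gen]
    return rate_gts * efficiency / 8 * lanes  # divide by 8: bits -> bytes

gen2_x16 = pcie_bandwidth_gbs(2, 16)  # 8.0 GB/s
gen3_x8 = pcie_bandwidth_gbs(3, 8)    # ~7.88 GB/s
print(f"Gen2 x16: {gen2_x16:.2f} GB/s, Gen3 x8: {gen3_x8:.2f} GB/s")
```

Which is why the two configurations land within a couple percent of each other, and why Gen3's lighter 128b/130b encoding is what gets each lane close to the full 1 GB/s.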

 

RedJaron

Splendid
Moderator

Does this change with AMD's new bridgeless XFire system? How much traffic now goes across the PCIe bus instead of the dedicated bridge? Probably not much, but is it enough to finally show some lag at PCIe 2.0 x8?
 
That could well be a possibility, but I've got no clue. It's easy to imagine a point where high-res textures, multiple GPUs, drivers, frame rendering, and bandwidth all collide in a negative fashion, but new tech tends to ride to the rescue.

PCIe Gen3 is approaching 1 GB/s per lane with reduced overhead. That's a bunch. DDR4 could be the next solution, with dedicated address space for each discrete card/GPU.

 

Grayson Grouge

Reputable
Apr 15, 2014
I've never been a fanboy of either company, but I have always hated AMD's dual-GPU solutions. This, however, is a damn good job. I don't have the money for it, but seriously, a damn good job.
 

cub_fanatic

Honorable
Nov 21, 2012
Apparently they were waiting for the whole Litecoin freakshow to die down after the first ASICs were released. Now you can get a 280X for MSRP, and in some cases even less with discounts, rebates, etc. Had ASICs not been released yet and the 295X2 still come out, they would probably be selling for more than $2,500. Ultra-high-end gamers can thank the makers of the "Gridseed" for the availability of these 295X2s, not just the "low" prices. I, on the other hand, will use my 7950 Boost until, in the not-too-distant future, I can't even play whatever the equivalent of Minecraft on low detail is.
 
