Partner Cards: One Radeon R9 290 And Three 290Xes, Reviewed


npyrhone

Distinguished
Mar 12, 2012
29
5
18,535
It seems that you've got DVI-I and DVI-D mixed up, for example with the HIS R9 290X. This has happened before, too. So here's how it goes: the connector in the picture is DVI-D. It is easy to identify because the right end of the connector has only a vertical aperture with no holes around it; it looks like a minus sign. DVI-D does not support an analog connection. DVI-I does allow an analog connection, and its right end looks not like "-" but like ":-:".
 

CaptainTom

Honorable
May 3, 2012
1,563
0
11,960
Ok, so I just want to point out that anyone complaining about the 290X's current price of $700 needs to recognize that it trades blows with the $700 780 Ti while having 1GB of extra VRAM. They both use about the same amount of power, and they both operate at about 80°C. Crypto miners have jacked up the price, and it SHOULD be lower; but don't think the price will drop any quicker than Nvidia drops theirs. The fact is the 290X should cost close to the same as the 780 Ti, and the 290 should cost MORE than the 780...
 

ZEPd3Z

Distinguished
Feb 7, 2008
106
0
18,690
Why not use the Kraken X60? Most tests with the Kraken G10 that I come across use the X40. GPUs (usually AMD's) tend to run hotter than CPUs (usually Intel's), so wouldn't the performance difference between the X60 and the X40 matter more on a GPU?
 

jin_mtvt

Honorable
Dec 25, 2012
26
0
10,530
Ok, let's set a few facts straight, please. I paid $550 CAD (~$520 USD) here in Canada for a Sapphire 290 Tri-X more than a month ago. Please stop using Newegg's inflated prices as the current price. We ALL know they have been price gouging these cards since they came out. How do we even know this is not a setup from Nvidia? (Teasing the NV fanboys reading comments on an R9 review :p)

Then let's look at prices elsewhere. At Overclockers.co.uk the 290 Tri-X is £284.99 ex VAT (don't include VAT in prices; I'm not including my country's 15% tax in the $550 either, and if I order from computeruniverse.de or overclockers.co.uk they don't charge me VAT). That gets us back down to ~$480 USD at the current 1.66 exchange rate. Computeruniverse.de lists it at €331,09, which is around $450 USD. So NOW tell me how you can use stupid $600-700 prices from the master gougers at Newegg in your reviews? I can order from computeruniverse.de and they will ship the card at that price plus ~$35 of express shipping. Far from $600-700, I'd say.

Then back to the temperature issue. What do we care how hot a certain part gets? Do you lick your card while playing a game? I'll say it for the second and last time: higher temperature = higher temperature delta = more efficient heat removal. The only number that matters from a consumer's point of view is power. 300W is 300W; it doesn't change a thing whether the GPU runs at 80°C or 70°C, as long as the part is designed for that. As far as I can tell, AMD has told us about ten times that their design target is 95°C. Ok, they screwed up the reference cooler badly, but with the partner boards, do we really need to be talking about excessive temps? Tell me it consumes excessive power (which it does not; it's very similar to high-end NV cards under load), but stop spreading BS about the high temps, which isn't relevant here other than for comparing the partner boards' coolers.
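For anyone who wants to check the arithmetic, here is a minimal sketch of the conversions above. The GBP rate (1.66) is the one the poster quotes; the EUR rate (~1.36) is back-derived from his "€331 ≈ $450" figure, so treat both as illustrative rather than live rates:

```python
# Rough sanity check of the ex-VAT prices quoted above, converted to USD.
# Exchange rates are the poster's figures (or derived from them), not live data.

def to_usd(price, rate):
    """Convert a local ex-VAT price to USD at the given exchange rate."""
    return price * rate

overclockers_gbp = 284.99      # Sapphire 290 Tri-X, ex VAT, overclockers.co.uk
computeruniverse_eur = 331.09  # same card, computeruniverse.de
shipping_usd = 35.0            # approximate express shipping quoted in the post

print(f"overclockers.co.uk:  ${to_usd(overclockers_gbp, 1.66):.0f} + ${shipping_usd:.0f} shipping")
print(f"computeruniverse.de: ${to_usd(computeruniverse_eur, 1.36):.0f} + ${shipping_usd:.0f} shipping")
# Both land around $485-510 USD delivered, versus the $600-700 Newegg listings.
```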
 

jin_mtvt

Honorable
Dec 25, 2012
26
0
10,530
My bad, computeruniverse does not have the Sapphire in stock, but they do have the XFX in stock @ €341,18. Not much of a difference.
 

jk47_99

Distinguished
Jul 24, 2012
207
5
18,765
As bad and loud as the stock cooler is, it isn't that difficult to set up a custom fan profile that ramps up the fan a bit sooner and actually keeps the core clock at 1000 MHz. It just means the fan occasionally goes over 55%, but temps can stay below 90°C for long gaming sessions. I can only imagine the hours they must have spent in the lab listening to the subtle differences in the cooler's noise before deciding that 47% and 55% were the cut-off points.
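For what it's worth, a custom profile like that is just a steeper temperature-to-fan-duty mapping than the stock one. Here's a minimal sketch of the idea; the breakpoints are illustrative, not the poster's actual settings, and a real profile would be configured in a tool such as MSI Afterburner:

```python
# Piecewise-linear fan curve: GPU temperature (°C) -> fan duty cycle (%).
# Breakpoints are illustrative; the goal is to ramp earlier than the stock
# profile so the core can hold 1000 MHz without hitting the thermal limit.
CURVE = [(40, 30), (60, 45), (75, 55), (85, 65), (95, 100)]

def fan_duty(temp_c):
    """Interpolate the fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_duty(80))  # 60% duty at 80 °C with these breakpoints
```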
 

tulx

Distinguished
May 17, 2009
220
0
18,690
Did not expect the HIS IceQ card to come out on top in a closed environment. I'll have to buy that one, unless the MSI Lightning comes out soon and tops them all.
 

Cichas

Reputable
Feb 27, 2014
4
0
4,510
Anybody with the GV-R929XOC-4GD? I bought it yesterday and I am disappointed by the fans. Even if you just spin them by finger (it doesn't matter in which direction), two of them make a "crackling" noise, the rotation slows down quickly, and in the end it stops in "steps" rather than continuously. Looks like bad bearings. Can anybody else confirm the same (terrible-quality fans used), or did I just have bad luck? I would expect such behaviour after two years of use, not on a totally new card.

PS: I was also disappointed by the behaviour during playback of HW-accelerated Full HD H264 video (MPC-HC, DVI output). The GPU core clocks up to ~450 MHz, the RAM goes to max, and the card temperature rises past 50°C (to 60°C), so the auto fan (default setting) increases speed from the idle ~39% to 43%, which is noticeably louder...

PS2: Don't try to run OCCT on this card. The VRM temperature (~110°C) will make you cry. :(
 

xerxces

Distinguished
Dec 28, 2010
328
0
18,810
AMD has to do more to win me back. Up until about six months ago, I only used AMD cards. I always had trouble with CrossFire and with performance overall, even with only one card enabled. Then I got lucky and won $500 on a scratch-off ticket, decided to try Nvidia, and got a 760. Best decision I could have made. Nvidia seems to handle AA much better, and in the games that do have PhysX there is a noticeable difference.
 

g00ey

Distinguished
Aug 15, 2009
470
0
18,790
Yes, bitcoin miners made these cards' prices skyrocket. It's really made these cards a non-viable upgrade for some compared to the competition's cards, which is a pain since these were the best cards at launch. Either bitcoin prices crash or Nvidia comes up with a competitor to GCN...
... or AMD decides to scale up production to meet the higher demand. I can't really see why some people view this "demand due to bitcoin mining" as a bad thing. If AMD can produce a card that caters to both gamers and bitcoin miners, they will have more revenue and more money to develop even cooler things. Also remember, higher volume means greater economies of scale, which means they can get more cost-efficient and thus afford to offer really good, well-performing products at more reasonable prices. I really think AMD should step up to meet this demand, because it won't last forever and other competitors may step in soon...
 

anthony8989

Distinguished


Bitcoin mining on GPUs is over. Why in the world would somebody buy an R9 290X to mine bitcoins at 0.99 GH/s when it costs $550+++? You can get an ASIC like the Monarch that mines at 600 GH/s. You'd need 600 R9 290Xs to reach that yield, and that's not even factoring in power consumption. That one ASIC uses 350 watts; I wonder how many watts 600 290Xs draw mining at full speed.

If the argument to that is "You can't game on a mining ASIC" - yeah, well, you can't game on an R9 290X while it's mining either. Miners didn't inflate the 290/290X's price. AMD introduced it at an artificially low price to trump Nvidia, then increased it to maintain profits. Why would a company offer $700+ performance for $550 when nobody else does? To be your buddy?
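To put rough numbers on that comparison, here is a minimal sketch using the figures quoted above (0.99 GH/s per 290X, 600 GH/s and 350 W for the Monarch); the ~300 W per-card mining draw is my assumption, not something the poster states:

```python
# Back-of-the-envelope comparison of GPU vs ASIC bitcoin mining,
# using the figures quoted in the post above.
GPU_HASHRATE_GHS = 0.99     # R9 290X, SHA-256, as quoted
GPU_POWER_W = 300           # assumed per-card draw under mining load
ASIC_HASHRATE_GHS = 600     # Monarch ASIC, as quoted
ASIC_POWER_W = 350

cards_needed = ASIC_HASHRATE_GHS / GPU_HASHRATE_GHS
total_gpu_power_kw = cards_needed * GPU_POWER_W / 1000

print(f"~{cards_needed:.0f} cards to match one Monarch")          # ~606 cards
print(f"~{total_gpu_power_kw:.0f} kW total vs {ASIC_POWER_W} W")  # ~182 kW vs 350 W
```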
 

rdc85

Honorable
Maybe he means litecoin, not bitcoin....

AFAIK it's the litecoin demand that caused the short supply at first...
Sadly, some uninformed people also buy the card for bitcoin mining, making it worse
(even some newspapers in my country printed "bitcoin" rather than "litecoin" in their articles, spreading the mistake further).
 