AMD Introduces Radeon R9 Fury Series Graphics Cards With Fiji GPUs

And to the guy complaining about DVI... I suggest you upgrade your monitor.

1440p 120 Hz monitors such as the QNIX EVO II require dual-link DVI and, IMHO, don't need an upgrade.

The man was talking about DVI-I, not dual-link DVI; he was replying to another fellow whose CRT monitors have VGA and DVI-I inputs.
 
400 A? You mean 40 A?

Probably not. Cards actually run at a little over 1 V and use VRMs to step the 12 V from your power supply down to that. A card like the R9 290X (which I have) will see around 270 A feeding the GPU under load (and the VRMs get extremely hot; they will die from heat just like the GPU itself). They're probably claiming that overclockers will be able to push the stock card to close to 500 W overall power draw, but I can't imagine those VRMs surviving that for long without bursting into flames unless they're properly cooled.
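If anyone wants to sanity-check where a figure like 270 A comes from, here's a rough sketch. The ~1.1 V core voltage and ~85% VRM efficiency are assumed ballpark numbers, not measured values, and ~290 W is only an approximate board power for a loaded 290X:

# Rough power/current sanity check (all figures are ballpark assumptions)
core_voltage = 1.1        # volts at the GPU core, roughly
gpu_power = 290.0         # watts, approximate 290X board power under load
vrm_efficiency = 0.85     # assumed conversion efficiency

gpu_current = gpu_power / core_voltage                  # current on the low-voltage side
rail_current = gpu_power / (12.0 * vrm_efficiency)      # current supplied by the 12 V rail

print(round(gpu_current), "A into the GPU core")          # ~264 A, close to the 270 A above
print(round(rail_current), "A drawn from the 12 V rail")  # ~28 A, what the PSU actually sees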
 
http://iyd.kr/746
Synthetic benchmarks are out for the new cards.
As expected, the Fury X outperforms the Titan X, and at the same price as the 980 Ti.
The Fury at $550 smashes the 980, but I guess Nvidia saw this coming and lowered their price to $500.
The Nano is going to sit between the 390X and the Fury, so better performance than the 980. I'm guessing it will be $450, since the 390X is going to be around $350; that places it between the Fury and the 390X in price, just like its performance.
Not too bad from AMD.
 


Yes and no. It is GCN 1.2, so basically think of it as an enhanced Hawaii XT.



Pricing has already been announced: the Fury X is $650, the Fury is $550, and the 390X is $429.

I would also take those numbers with a grain of salt until official benchmarks start to hit, rather than random ones from an unknown website.
 


It's hard to tell in a lot of ways. If AMD is smart, they will be outperforming the 980 Ti, and especially the 980. Their TDP is way higher than the 980's, so all that extra power had better buy a performance boost. I will be severely disappointed if the R9 Fury doesn't outperform the ~$500 980 by quite a bit. But if at $550 they are at the level of the $650 980 Ti, that could be interesting, even though I personally would almost certainly never buy a GPU over 200 watts. I hope I'm mistaken in comparing the Fury's TDP to the 980's.
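Just to put a rough number on the "extra power better buy performance" point, here's the simple ratio using the published board power figures (275 W for Fury X, 165 W for the GTX 980); treating TDP as a stand-in for real draw is obviously an approximation:

fury_x_tdp = 275.0    # watts, AMD's stated typical board power
gtx_980_tdp = 165.0   # watts, NVIDIA's stated TDP

ratio = fury_x_tdp / gtx_980_tdp
print(round(ratio, 2), "x the power budget of a 980")
# ~1.67x, so it needs to be roughly two-thirds faster just to match
# the 980 on performance per watt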

I'm not actually anticipating much, if any, performance difference between Fury and Fury X aside from overclocking, as the table comparing the 300 series on the AMD website jumps from R9 Fury to R9 Fury X. They mention in the pictured slide that Fury is dual "Fiji"... and based on the core count I suspect that what they have essentially done is find a way to cram two R9 280Xs onto a single die with unified memory... which means performance will most likely be below the R9 295X2. If that (or something similar) is the case, performance is going to be more driver-dependent than normal, and it will probably underperform unless it is using DX12/Mantle. On DX11, the 4GB of HBM may be split into the 2GB x2 arrangement you'd expect from a Crossfire setup.

Also, there is a user guide for the R9 Fury on AMD's site that describes adding a second card for Crossfire and how to install it, but no mention of three or four cards. I don't know if this is typical or not, but it is my understanding that Crossfire should support up to four GPUs. Maybe the limitation is that these are dual-GPU cards like the 295X2.

I hope I'm wrong on that and Windows, at a minimum, recognizes an R9 Fury as a single GPU. I thought the dual-GPU slide was a reference to their Quantum project, but that doesn't make sense right under the "World's fastest graphics card" tagline.
 
Once again, it all depends on the driver support. I've been using Radeons since the 5970 I bought years ago, and since then AMD has deprecated the CAP updates for Crossfire, so I'm stuck waiting weeks for Crossfire support in AAA titles.

Heck, to this day, Crossfire doesn't work very well in The Witcher 3 or GTA 5, even with the latest 15.5 betas. The former is a TWIMTBP title, but the latter isn't, and it even includes AMD technology (AMD CHS).
 
"Because what you are trying to do is not done by the majority of people. The majority of people will upgrade to a newer monitor and utilize newer interfaces."

What makes you think I don't have new monitors?? Like I said, you buy a card that only does half the job, and then you have to spend extra for it to work??

Why not a card that does it all right out of the box without spending extra?? AMD shortchanged you: you spend to support them, they don't support you.

Like I said, a fool and their money are soon parted. I'll buy the cards that do it all and not spend unnecessary money.

Hard to believe you would spend 400 bucks on a card that doesn't work on your monitors, regardless of what kind, and then have to go spend another 100-300 bucks on a monitor that the card actually works with???

Man, I wish I had your money just to waste on stupidity.

I guess you would buy a car without wheels and tires for the same money as one that has them?? What a joke.

But I understand you can't figure that one out.
 


I have been using ATI mainly since the 9700 Pro; I used to swap depending on performance, but I can say that driver support has dropped a bit in the past few years.

I get why they did some things, such as ending the monthly updates, but they need to work harder on releasing new game drivers alongside major releases.
 


How about we watch the personal insults? I didn't call you stupid or a fool; I pointed out that the majority of people have monitors that support DVI-D, so that's where the market trend goes. If you've noticed, they are also dropping DVI for the soon-to-be widely adopted DisplayPort. Why? Because DP can handle all current resolutions, as well as 4K and higher with newer versions.

I don't need DVI-I, nor do I need VGA, so why should I worry about something I do not need? Obviously you don't need S-Video, which was dropped years ago, and DVI/VGA support will probably be dropped in the next 5 years.

As I said, if it works for you, that's great, but the majority do not need DVI-I or VGA, just like the majority do not need more than 4GB of VRAM since most people are still at 1080p or lower.
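For anyone wondering why DisplayPort gets pushed for 4K while DVI is on the way out, here's a back-of-the-envelope bandwidth sketch. It assumes 24-bit color and ignores blanking overhead, so real timings need somewhat more than this:

def uncompressed_gbps(width, height, refresh_hz, bits_per_pixel=24):
    # raw pixel data rate, no blanking or protocol overhead
    return width * height * refresh_hz * bits_per_pixel / 1e9

print("1080p60:", round(uncompressed_gbps(1920, 1080, 60), 1), "Gbps")  # ~3.0
print("1440p60:", round(uncompressed_gbps(2560, 1440, 60), 1), "Gbps")  # ~5.3
print("4K60:   ", round(uncompressed_gbps(3840, 2160, 60), 1), "Gbps")  # ~11.9

# Dual-link DVI tops out around 7.9 Gbps of pixel data, so 4K60 doesn't fit;
# DisplayPort 1.2's ~17.3 Gbps effective rate handles it with room to spare.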
 
Well, I waited for HBM, but the price is way out of my range. I don't want the non-HBM 300 series either; they're not enough to justify upgrading from an HD 5850. Oh well, it's back to waiting another 6 months to see if more HBM-based cards come out at a decent price. 😛
 


I doubt we will see any more HBM parts until their next GPU on 14nm with HBM2, which should be more cost-effective.

But I can tell you this: an R9 380 will blow your HD 5850 away in just about every game, especially at 1080p.
 
OK, sorry if you got offended. Just to point out: AMD can't get ahead, and this is another reason their products don't sell as well. How much does my "minority" cost them in sales that Nvidia will pick up instead??

That's the point. I do prefer AMD, but once again it's a limited-use card that doesn't support the full range of hardware.
 
AMD has just hit a home run! With cards of such high-grade performance at such low prices, it looks like they just gained the upper hand on Nvidia. (I hope I'm not being too hasty in writing this; benchmarks, I guess, will tell.)

Don't be so sure. We don't know the impact of this new memory on current games, and theory is always awesome on paper. I am an ATI fan, but let's wait and see some benchmarks.

 

No. When you rebrand the majority of your lineup for two consecutive generations with GPUs of varying feature-level support and efficiency, the "joke" still stands.
 
I like that Nano and its size... beats the heck out of those 13" monsters they were putting out. I just don't like the missing DVI port. At least Nvidia cares enough to keep DVI-I on their cards and understands there are still folks who may require it.

Likewise, the cards above the 280X only came with DVI-D, so they were a no-sale as well.

Thanks, Nvidia.

If you still need DVI, then that's on you. Time for a monitor upgrade. Monitors have been coming with HDMI for about 1,000 years now; you might wanna look into it. There's also this newfangled thing called DisplayPort. I would say look it up, but you might think you died and somehow wound up in the future.
 
If we are actually talking 4096 GCN cores with 1.5x the performance per watt of Hawaii XT's 2816 GCN cores, we are roughly talking about something with the power of a 295X2 in a single-GPU card that draws 275 W. If this is all true, we are seeing something that has never been seen before: a 100% performance increase over the previous generation. I'll wait to see benchmarks before I get too hasty here, but if this all turns out to be true, Nvidia had better have something SERIOUSLY innovative in the pipeline. If their next gen is just a tick off Maxwell, then they are in big trouble, because Fiji seems to be a tock, and a big tock at that.
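Napkin math on that, taking the announced numbers at face value (4096 vs. 2816 shaders, the ~1.5x perf/W claim, and ~275 W vs. ~290 W board power); how close that really gets to a 295X2 is exactly what the benchmarks will settle:

hawaii_shaders, fiji_shaders = 2816, 4096
perf_per_watt_gain = 1.5                  # AMD's claimed improvement over Hawaii
hawaii_power, fiji_power = 290.0, 275.0   # approximate board power figures

# 1.5x the perf/W at a similar power budget implies roughly 1.4-1.5x a 290X,
# and the shader-count ratio points to about the same thing.
from_perf_per_watt = perf_per_watt_gain * fiji_power / hawaii_power
from_shader_count = fiji_shaders / hawaii_shaders
print(round(from_perf_per_watt, 2), "x and", round(from_shader_count, 2), "x a 290X")
# A 295X2 is roughly double a single 290X, so the "295X2 in one GPU" claim
# sits at the optimistic end of these numbers.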
 
Too bad it won't come out until fall. I don't want to wait that long.

My next (and current) build is going to be Skylake, so I can wait. It's also in a Corsair 250D, so that enforces a mini-ITX motherboard. If it can drive my dual 1440p monitors at, the very least, very good framerates in the few games I play, I'll be hooked.

(My enforced delay lets me look at all the reviews beforehand anyway.)
 