AMD Retires Legacy GPUs, GCN Only Going Forward



You said Nvidia still posts game-specific drivers for the 400 and 500 series. That's misleading, because they're the same drivers the newer cards use. You may know better, but your post certainly didn't make that clear.

AMD can't afford to care whether that excuses it or not.

The VLIW4 APUs were being designed at the same time as GCN (partly even earlier), not after it. No VLIW4 GPUs were designed after GCN came around. AMD couldn't have built those APUs around a GCN GPU until GCN was finalized, because you need the architecture ready before you can actually make something with it.

No, GDDR5 would not push the Fury X over 300W; GDDR5 would not consume over 80W more than HBM does. You're completely ignoring the huge changes made to Powertune (built on further with the Nano), which were responsible for most of the power-efficiency improvement. Yes, HBM helped considerably, but it certainly isn't saving almost 100W.
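For what it's worth, here's a quick back-of-envelope using the bandwidth-per-watt figures AMD itself quoted when it introduced HBM (treat both numbers as marketing ballpark, not measurements):

```python
# Rough sanity check of the memory-power claim, based on AMD's
# quoted bandwidth-per-watt figures (ballpark assumptions).

FURY_X_BANDWIDTH = 512        # GB/s total memory bandwidth
GDDR5_GBPS_PER_WATT = 10.66   # AMD's quoted GDDR5 efficiency
HBM_GBPS_PER_WATT = 35.0      # AMD's quoted HBM efficiency ("35+")

gddr5_watts = FURY_X_BANDWIDTH / GDDR5_GBPS_PER_WATT  # ~48 W
hbm_watts = FURY_X_BANDWIDTH / HBM_GBPS_PER_WATT      # ~15 W

print(f"GDDR5: ~{gddr5_watts:.0f} W, HBM: ~{hbm_watts:.0f} W, "
      f"delta: ~{gddr5_watts - hbm_watts:.0f} W")
```

That works out to a delta of roughly 33W: nowhere near an 80-100W swing.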
 

Creme
It's not 80W, but from what I've read it's around 50W of savings. Powertune did help, but if the 300 series is anything to go by, it would help around 10-15% at most. It's a combination of GCN 1.2, Powertune, and HBM, with the latter helping to lower the TDP considerably.

The game-specific drivers are for the 400-900 series cards. I never said they were "specific" to Fermi; I meant "specific" to the game itself.

VLIW4 was around for a year before GCN came out, and it stuck around until 2013, alongside VLIW5 even. If they chose to keep using that architecture, they should have chosen to support it for the people who bought in "late" too. It's a 2013 APU that has no support anymore, and that's not good for building consumer trust.
 


Even if it's 50W saved, Fury X wouldn't get within 10% of 300W, let alone go over it as you previously claimed.

Powertune did a whole lot more than a mere 10-15%; the Nano is huge proof of that. Binning and the slightly lower frequency don't come close to accounting for that power-efficiency difference, and HBM has nothing to do with the differences between the Fury X/Fury and the Nano. The Powertune advancements in the Nano are much closer to what Nvidia did with its power targets, especially in Maxwell's implementation.
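To make the power-target idea concrete, here's a toy sketch of the kind of governor both vendors describe; everything in it is invented for illustration, and the real logic lives in firmware with far finer telemetry and voltage/frequency states:

```python
# Toy sketch of a power-target governor, loosely in the spirit of
# Powertune / Maxwell's power targets. All names and numbers here
# are hypothetical, not AMD's or Nvidia's actual implementation.

POWER_TARGET_W = 175   # e.g. a Nano-like board-power cap (assumption)
CLOCK_STEP_MHZ = 13    # granularity of one clock step (made up)
MAX_CLOCK_MHZ = 1000
MIN_CLOCK_MHZ = 300

def next_clock(clock_mhz: int, measured_power_w: float) -> int:
    """Step the core clock down when over the cap, back up when under."""
    if measured_power_w > POWER_TARGET_W:
        return max(MIN_CLOCK_MHZ, clock_mhz - CLOCK_STEP_MHZ)
    return min(MAX_CLOCK_MHZ, clock_mhz + CLOCK_STEP_MHZ)

# e.g. telemetry reads 190 W at 950 MHz, so the governor steps down:
print(next_clock(950, 190.0))  # 937
```

A card governed like that holds a power number and lets the clock float, which is how the Nano can share Fiji silicon with the Fury X yet draw far less.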

More than half of the latest drivers from Nvidia broke as much as they fixed, if not more. Having more updates is not a good thing when they're that defective.

AMD didn't have a choice about continuing VLIW at the time. Again, I'm not saying this excuses anything, but AMD didn't simply decide to screw people over; they went with the only options available.

Furthermore, having no support for a product is nowhere near as relevant in this field as you seem to think. The vast majority of gamers, let alone non-gamers, don't update their graphics drivers even once a year, let alone every few months. Even if AMD kept shipping drivers for products that already work, very few people would bother installing them. That's a very inefficient use of AMD's scarce resources, and it has little impact on consumer trust for the majority of AMD's customers.
 

Creme


I believe the Nano's Powertune is the same as the Fury X's, which is improved over the 200 series'. The thing is, past a certain clock speed on a given chip, the GCN architecture (or at least the 28nm version) becomes increasingly inefficient, as was shown when one site lowered the Fury X's power limit and got Nano-level power consumption for a small performance hit. Downclocking just 100-200MHz and aggressively adhering to a TDP limit did wonders for the power consumption.

The R9 290 could have been a 200W card if they had clocked it at 850MHz, but it wouldn't have competed with the 970 as well, so they pushed the clock as high as they could and accepted the voltage, heat, and power requirements that come with it.
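Both of those cases boil down to the usual dynamic-power approximation, P ≈ C·f·V²: a modest clock drop that also permits a voltage drop cuts power much more than linearly. A quick illustration (the clocks are the real reference clocks; the voltages are my guesses):

```python
# Why a small downclock saves disproportionate power: dynamic power
# scales roughly with frequency times voltage squared (P ~ f * V^2).
# The clocks below are real reference clocks; the voltages are
# illustrative guesses, not measured values.

def relative_power(f_new, f_old, v_new, v_old):
    """Power at the new operating point relative to the old one."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Fury X at 1050 MHz vs a Nano-ish 900 MHz point, assuming the lower
# clock also allows dropping core voltage from ~1.20 V to ~1.05 V:
print(relative_power(900, 1050, 1.05, 1.20))  # ~0.66, a ~34% power cut

# Same idea for a hypothetical R9 290 run at 850 MHz instead of 947 MHz:
print(relative_power(850, 947, 1.05, 1.15))   # ~0.75 of stock power
```

A ~14% clock cut buying a ~34% power cut is exactly the "did wonders" effect described above.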

Yeah, the driver thing can be a placebo effect in most cases, or even a disaster, since some people are still stuck on Catalyst for one reason or another. The game optimizations aren't the important part; what matters is being there with a driver fix when a Windows Update or some other software breaks the display on a VLIW card. That's why I believe updates should have continued, even if they were "when needed", or quarterly, or even annual.
 