blazorthon
Glorious Creme:
My mention of DX was to show how Nvidia separated their drivers between DX10 and DX11 cards. I know that the 400 and 500 series use the same driver as the latest ones. VLIW5 and GCN being different doesn't excuse AMD from dropping support; they chose to keep using that architecture after the 6000 series even after GCN was out, so they should have owned up and supported the 2013 APUs longer too.
HBM did a lot for the Fury and Fury X. With GDDR5 they would have been over 300 watts easily. It's still a respectable amount of power that it saved them.
You said Nvidia still posts game-specific drivers for the 400 and 500 series. That's misleading because they're the same drivers used by the newer cards. You may know better, but your post certainly didn't make that clear.
AMD can't afford to care about whether it excuses it or not.
The VLIW4 APUs were being designed at the same time (even partly earlier), not after GCN. No VLIW4 GPUs were designed after GCN came around. AMD couldn't have developed the APUs with a GCN GPU until after GCN was finalized because you need the architecture ready before you can actually make something with it.
No, GDDR5 would not cause the Fury X to use over 300W. GDDR5 memory would not consume over 80W more than HBM. You are completely ignoring the huge changes made to PowerTune (which were further built on with the Nano) that were responsible for most of the power efficiency improvement. Yes, HBM helped considerably, but it certainly isn't saving almost 100W.
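A quick back-of-the-envelope check supports this. Using AMD's own HBM launch figures (roughly 10.66 GB/s per watt for GDDR5 and 35+ GB/s per watt for HBM, which are marketing estimates, so treat the result as a rough bound rather than a measurement) against the Fury X's 512 GB/s of bandwidth:

```python
# Rough estimate of memory power for the Fury X's 512 GB/s of bandwidth.
# Per-watt figures come from AMD's HBM launch slides and are approximate.

BANDWIDTH_GBPS = 512.0        # Fury X memory bandwidth spec
GDDR5_GBPS_PER_WATT = 10.66   # AMD's published GDDR5 efficiency figure
HBM_GBPS_PER_WATT = 35.0      # AMD's (conservative) HBM efficiency figure

gddr5_power = BANDWIDTH_GBPS / GDDR5_GBPS_PER_WATT  # about 48 W
hbm_power = BANDWIDTH_GBPS / HBM_GBPS_PER_WATT      # about 15 W
savings = gddr5_power - hbm_power                   # about 33 W

print(f"GDDR5: {gddr5_power:.0f} W, HBM: {hbm_power:.0f} W, "
      f"saved: {savings:.0f} W")
```

Even by AMD's own numbers, HBM saves on the order of 30-35W at that bandwidth, nowhere near the ~100W needed to push a GDDR5 Fury X past 300W.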