AMD Radeon R9 300 Series MegaThread: FAQ and Resources

Page 44 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.


I don't know if you're trolling or just want to be rich enough to run max distance. Seriously. I am pretty sure people would NOT put max distance at 4K... You even answered your own comment. It needs 16, oh, but at max distance. Well, yeah. But at ultra, max distance isn't included. For a good reason...
 


I like games to look real. Draw distance is important in multiplayer.
 
So 'Polaris' is the new 14nm FinFET architecture with 2.5 times the performance per watt of Fiji. Source: Twitter and WCCFTech.
http://wccftech.com/amd-polaris-2-5-times/
 
So now it's Polaris? Greenland, Polaris, the name changes all the time. If that 2.5X claim is true, that'll be really good for AMD, and I really hope AMD can bring something to the table that competes with Nvidia head-on (which they always do).
 


AMD has got pretty adept at scratching out the old name and writing a new one, so why does this name change surprise you?
 
Since the Arctic Islands were (internal?) code names, maybe RTG got to change the name/marketing to their own now that they're separate. Either way, it seems Polaris is the name we're given. 2016 is looking like a golden year for GPUs, can't wait!
 


I didn't even follow computer hardware "in the past" when they did.
 


What? Have you never heard of the HD7xxx series, R7 2xx series and the R9 3xx series?
 
Yes, I know of them, but at the time I was... oh, what was I doing with my life? Well, not following them. Of course I know the 2xx and 3xx (the HD7xxx I don't know too much about, and don't feel like spending the time), but TBH I pay no heed to their names. I know they're GCN. So any name change I do not know about.
 


Southern Islands was also the internal code name for the uArch, but we always got product names based on islands in the area, e.g. Tahiti was the top end for Southern Islands and Hawaii was the top end for Volcanic Islands. There is no Polaris island, which makes it hard to believe they would name it Arctic Islands internally and then use some other random name.

Of course, since Radeon is now its own group and GCN is no longer associated with top-end hardware, maybe they are changing the name of the main uArch from GCN to Polaris, but the rest stays the same, with the specific uArch being Arctic Islands and Greenland being the top-end GPU.

Still, we need to be careful with that 2.5x. There have been claims like that before, and in the end we didn't see that massive a leap in performance. It might be 2.5x faster in raw performance (TFLOPS), but in the real world TFLOPS doesn't mean a damn thing.
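For what it's worth, the "raw performance" number people quote is just theoretical peak FP32 throughput, computed from shader count and clock speed, with each shader issuing one fused multiply-add (2 FLOPs) per cycle. A minimal sketch, using the Fury X's published specs purely as a worked example:

```python
def fp32_tflops(shaders: int, clock_mhz: float) -> float:
    # Theoretical peak: shaders * clock (Hz) * 2 ops per cycle (FMA), in TFLOPS
    return shaders * clock_mhz * 1e6 * 2 / 1e12

# Fury X: 4096 shaders at 1050 MHz
print(fp32_tflops(4096, 1050))  # ~8.6 TFLOPS
```

That figure says nothing about how well games actually use those shaders, which is the point being made above.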
 


Apparently there is: http://www.anandtech.com/show/9387/amd-radeon-300-series/2

Tweaking a few bits does not change the uArch, and as such the cards are all the same; they just called them by a different name.
 
See, too many islands for me to keep track of 😛

They said 2.5X performance per watt, so let's take the 390's ratio as the baseline. Let x = performance; the baseline ratio is then x/300W. At 2.5X performance per watt the ratio becomes 2.5x/300W, so if we assume power usage stays the same, that's equivalent to the same performance x at 120W. Now, if we more realistically assume power usage drops to 2/3 of the 300 series (200W), performance comes out at 1.66x. If power draw is lower than 200W (which it very well may be), say 150W for the 490, then it's 1.25x at 150W. So a 25% improvement?

I totally think I did that wrong, but I'll leave it up there because 3 minutes of my time went into it.
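The arithmetic above can be sketched as a quick script. Note the figures are the post's own assumptions (a 300 W baseline for the 390, the claimed 2.5x efficiency gain, and the hypothetical 120/150/200 W power targets), not official specs:

```python
# Perf-per-watt scaling: if efficiency improves by GAIN, performance at a
# given power budget is GAIN * (baseline_perf / baseline_watts) * new_watts.

BASELINE_WATTS = 300.0   # assumed R9 390 board power
EFFICIENCY_GAIN = 2.5    # claimed perf/watt improvement

def relative_performance(new_watts: float) -> float:
    """Performance relative to the baseline card (baseline = 1.0)."""
    baseline_perf_per_watt = 1.0 / BASELINE_WATTS
    return EFFICIENCY_GAIN * baseline_perf_per_watt * new_watts

for watts in (120, 150, 200):
    print(f"{watts:>3} W -> {relative_performance(watts):.2f}x baseline performance")
```

This reproduces the numbers in the post: same performance at 120 W, 1.25x at 150 W, 1.66x at 200 W.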

@monkey: I said any name change I did not know about; when I started getting into hardware, the 2xx series was already out. I'm not talking about tweaks or anything. I don't think you see what I'm saying...
 


AMD supported VLIW to the end of DX11. Given that the last desktop VLIW card was released in 2010 (the HD 6970), it's hardly surprising they stopped supporting it by 2015. Note that all the HD 5000 parts were also supported to this date. HD 4000 support ended around 2012, and those were DX10 parts.

If they follow the same pattern, expect GCN 1.0 cards to keep getting support in driver updates for the duration of DX12, which doesn't look set to be superseded any time soon.

*edit: To clarify, by 'end of DX11' I mean the introduction of the next major revision of DirectX that will replace it. I know DX11 games are still being made, just as DX9 games were still released after DX11 came out.
 
Right now AMD is no longer releasing a top-to-bottom GPU lineup every year like they did in the past, so driver support won't be dropped as fast as it was before. But when they dropped support for the 4xxx series back then, many of the cards, especially the high end and CrossFire configs, were still very capable in performance despite being limited to DX10. CrossFire users got hit worst: even if they can apply CF profiles themselves, for problems that specifically need a driver update to fix, they can't do anything, since AMD no longer releases updated drivers for those series.
 
@renz496, I had a 4000 series card, which a friend is now running. Despite the lack of driver updates it still works for most games (performance allowing). Also, whilst people are quick to criticize AMD over the 4000 series, you need to remember NV dropped support for the GTX 200 range (same gen) around the same time. Neither company can be expected to support cards forever; 5 to 6 years strikes me as pretty reasonable.
 


Nope. AMD dropped support for the 4xxx series in 2012, while Nvidia dropped support for the 200 series and below in 2014, so no, Nvidia did not drop support for the 200 series around the same time as AMD. And right now the 200 series down to the 8xxx series still gets official drivers for Windows 10, while the AMD 4xxx series doesn't even have official drivers for Win 8.1.
 


Ah, you are correct; I thought they dropped support for those a long time ago. Still, I think that's more props to NV than AMD doing anything bad. Also, for the record, the HD 4670 is running fine on Windows 10 (Windows identified and installed the driver; I'm guessing the last official driver from AMD?).

I mean, my friend is using it mainly to play older games off GOG and Minecraft (but then again, it *is* an HD 4670). It does work, though.
 
As I said, since AMD no longer releases new cards (not rebrands) every year, they won't be as quick to drop support.

Those 4xxx series cards work with the MS driver just fine right now, but imagine if you were the one buying a top-of-the-line card. What would you think if AMD dropped support for the 7970/280X right now? That's how it is for the HD 4890. Not really complaining, just that AMD should think a bit before really dropping support for their cards. There are a few people actually jumping sides because of this. And I once saw people getting into a fight with an AMD forum admin (on the official AMD forum) because AMD did not officially support the 4xxx series in Win 8.1 despite having legacy support in Win 8.
 
Keep in mind the difference between unsupported and will not work throughout this conversation...

It's not "if your card is now unsupported, you have to upgrade or you can't play new games."

It's more "if your card is now unsupported, we aren't going to include that architecture in our driver optimizations."

Stuff should still work in most cases.
 
The issue is more about how long the support lasts. True, they will work most of the time, but there are bound to be games that need both a patch and driver updates to work properly. So you need to buy a new card just so you can play one game properly? And in my earlier post I did mention CF: without driver updates, such a setup has it worse, because multi-GPU setups tend to have problems that don't exist with a single GPU.

By the way, this is a non-issue now, because AMD is no longer releasing new GPUs top to bottom every year. No longer do they need to keep optimizing a different architecture every year like they did before.
 


Well, you say this; however, next year pretty much *has* to be a top-to-bottom lineup, given the manufacturing process change.
 
Judging GPU archs is like judging Ford for making Mustangs with V8s since the '60s and calling them lazy. If a GPU arch can keep going strong with a few tweaks gen over gen, then so be it. AMD riding GCN for so long has been the result of stupid (in my own view, of course) decisions about how they had the company organized: using the GPU department as a "glue" for all the other tech they were producing, with GCN being the jack of all trades behind APUs, GPUs and other stuff.

Now, nVidia has been using the same idea for arranging the SMXes since Fermi, with a few tweaks here and there, but keeping the same arrangement in the GPU arch. Or at least, that is my impression; one can judge what constitutes a "big" change on one's own terms. In any case, I think my point still stands: GCN is still around because there is nothing at the moment that would make sense to replace it with. AMD is now in a good position to tweak it for GPUs and GPUs alone.

Cheers!
 


But rumour has it that GCN is going to be replaced; it's just taken them this long to come up with something.