Discussion: Polaris, AMD's 4th Gen GCN Architecture

http://hexus.net/tech/news/graphics/94240-amd-radeon-software-1671-fixes-pci-e-power-draw-issues/

"AMD has emailed HEXUS to say that it has a solution to concerns over the Radeon RX 480 drawing excessive current from the PCIe bus. In a nutshell, it will be providing a new driver, AMD Radeon Software 16.7.1, which will address power distribution, within the next 48 hours. The driver will provide additional benefits to end users including an option to reduce total power consumption, and performance improvements of up to 3 per cent in several popular games titles."

AMD Statement:

"We promised an update today (July 5, 2016) following concerns around the Radeon™ RX 480 drawing excess current from the PCIe bus. Although we are confident that the levels of reported power draws by the Radeon RX 480 do not pose a risk of damage to motherboards or other PC components based on expected usage, we are serious about addressing this topic and allaying outstanding concerns. Towards that end, we assembled a worldwide team this past weekend to investigate and develop a driver update to improve the power draw. We’re pleased to report that this driver—Radeon Software 16.7.1—is now undergoing final testing and will be released to the public in the next 48 hours.

In this driver we’ve implemented a change to address power distribution on the Radeon RX 480 – this change will lower current drawn from the PCIe bus.

Separately, we’ve also included an option to reduce total power with minimal performance impact. Users will find this as the “compatibility” UI toggle in the Global Settings menu of Radeon Settings. This toggle is “off” by default.

Finally, we’ve implemented a collection of performance improvements for the Polaris architecture that yield performance uplifts in popular game titles of up to 3%[1]. These optimizations are designed to improve the performance of the Radeon RX 480, and should substantially offset the performance impact for users who choose to activate the “compatibility” toggle."

-------------------

[1] Based on data comparing Radeon Software 16.6.2 vs. 16.7.1:
- Total War: Warhammer, ultra settings, 1080p: 74.2 FPS vs. 78.3 FPS
- Metro: Last Light, very high settings, 1080p: 80.9 FPS vs. 82.7 FPS
- The Witcher 3, ultra settings, 1440p: 31.5 FPS vs. 32.5 FPS
- Far Cry 4, ultra settings, 1440p: 54.65 FPS vs. 56.38 FPS
- 3DMark11 Extreme: 22.8 vs. 23.7
System config: Core i7-5960X, 16GB DDR4-2666, Gigabyte X99-UD4, Windows 10 64-bit.
Performance figures are not averages and may vary from run to run.
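
For reference, a quick sketch (plain Python, not from AMD; the figures are copied from the footnote above) that turns the quoted before/after numbers into percentage uplifts:

```python
# Percentage uplift per title, Radeon Software 16.6.2 vs 16.7.1.
# Figures copied verbatim from AMD's footnote above; this is just
# illustrative arithmetic, not an official benchmarking tool.
figures = {
    "Total War: Warhammer (1080p ultra)": (74.2, 78.3),
    "Metro: Last Light (1080p very high)": (80.9, 82.7),
    "The Witcher 3 (1440p ultra)": (31.5, 32.5),
    "Far Cry 4 (1440p ultra)": (54.65, 56.38),
    "3DMark11 Extreme": (22.8, 23.7),
}

for title, (before, after) in figures.items():
    uplift = (after - before) / before * 100
    print(f"{title}: {before} -> {after} ({uplift:+.1f}%)")
```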

Also: http://www.tomshardware.co.uk/amd-rx-480-power-driver,news-53401.html
 


Definitely yes, having more never hurts. But this kind of move is actually not good for AMD: this way people have no reason to buy the 8GB model. What was AMD thinking when they decided to disable 4GB out of 8GB? Unlike unlocking shaders, this is a cost they could have saved, and they're doing this kind of stuff while they're already in the red every quarter.
 
Well, AMD has done it from time to time. I bet there have been products that we (the public) just never noticed could be converted.

The HD 69xx series had something similar, as did the Phenom II X2s and X3s, and I bet older designs (like the Athlons and Durons) did too.

I don't think they (AMD) "lose money" on it, since they already sold the chips priced to their liking. OEMs might be taking the hit more than AMD here. I would imagine they get kinda pissed off with this, but it's not like AMD wants this to be found out, and I bet OEMs *know* about it.

And yeah, nVidia and Intel also do it. Intel is a bit more of an ass about it though, from what I remember.

Cheers!
 

Rogue Leader

It's a trap!
Moderator


Regarding that, the X2 and X3 chips were lower-binned X4s whose cores didn't pass spec. On some of them the disabled cores were complete junk, but on others they were just slightly off. That's why you hear of some folks successfully unlocking and it being pretty much fine, and other times it just not working at all.

But I agree with you: the price difference to build a 4GB vs. an 8GB card is likely a couple of dollars to them, tops. They aren't losing anything; in fact, they are gaining customers by enticing some in at the $200 price point.

Heck, I bought mine because it was $200; if it hadn't been that cheap I would have held on to my CrossFired 280s and waited for the rest of my system build to happen.
 


That was true of the early Phenom II parts (for example, I had a 'Heka' Phenom II X3 710, which could unlock but was unstable). Later versions were somewhat more reliable: the Phenom II X4 960T (the only 'X4' with turbo mode) was based on the Thuban X6, and those almost always unlocked to hex cores without issue (I wonder if there was pressure from OEMs for AMD to make a quad core with turbo?). I built about 4 machines based on those for friends and all 4 unlocked successfully. That processor was such a great deal IMO, and the Phenom II X6 is still a decent processor to this day.
 

The one taking the hit was probably their board partner, though honestly I don't know, because we are talking about the reference design here. As I said in an earlier post, this is not quite the same as unlocking extra shaders or extra CPU cores, because there it was already in the design: you can't "cut" the bad cluster and make the chip smaller or anything like that, so you just depend on the silicon lottery for whether the extra shaders can be unlocked and work properly. But with the VRAM they had the option of simply not putting the chips on the card. Personally, I think it might not be much of a problem for AMD even if there's a shortage of the 4GB model.
 
I am still running a Phenom II 550 unlocked to 4 cores, and I built 4 Athlon X3 machines, of which 3 unlocked the 4th core and 1 also unlocked the L3 cache. Before I did any of this I did a lot of online research, and it seemed at the time (very few people had tried many) that 90%+ of dual cores unlocked to quad but only 50% of triple cores did. My guess is AMD got very few bad cores at the time, so duals were almost always locked purely due to demand, while around half of the triple cores really did have bad cores.
 


So you are saying it is all just a coincidence?

It is still higher-than-spec power draw, and AMD could have avoided all of this by just adding an 8-pin connector instead so the PCIe slot would draw less power.
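
For context on the numbers being argued about, here's a minimal sketch assuming the commonly cited PCIe CEM ceilings (roughly 75W from the x16 slot, 75W per 6-pin, 150W per 8-pin); the connector mixes shown are just illustrative:

```python
# Rough PCIe power-budget math, assuming the commonly cited CEM
# ceilings: ~75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin.
SLOT_W = 75
AUX_W = {"6-pin": 75, "8-pin": 150}

def board_budget(connectors):
    """In-spec power ceiling for a card with the given aux connectors."""
    return SLOT_W + sum(AUX_W[c] for c in connectors)

print(board_budget(["6-pin"]))           # 150 W -- reference RX 480
print(board_budget(["8-pin"]))           # 225 W -- the suggested fix
print(board_budget(["6-pin", "8-pin"]))  # 300 W -- rumored custom cards
```

With the reference card's ~150W typical board power sitting right at that 6-pin ceiling, any skew toward the slot side pushes the slot past its ~75W share, which is the complaint here.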
 


It will depend on how AMD negotiates with the partners, but I am fairly sure they only create a "prototype" and send the full spec to them. They then negotiate the GPU bulk price and call it a day. The final cards, I'd say, are on the shoulders of the partners and not AMD. Everything that costs money before the final reference design, I would imagine, falls on AMD.

And all 4GB cards are out of stock in every store near me. Only 8GB ref. ones are available. Darn it xD



More than coincidence, it's not out of the question that "blame it on the weakest link" could come into play here. As in: "they reported the card *might* cause issues, and my mobo died when I swapped my video card!" (**1). But it's never 100% of the cases reported. I'm not even sure how to assign a % chance here, but it at least sounds plausible.

**1 -> When swapping a video card, a lot of people make dumb mistakes that can actually harm their mobos, and we know that chance is greater than zero. Even when not swapping and just building from scratch; our forum is abundant with such cases, so there's that.

Cheers!

EDIT: Added Quote... Why am I missing your posts, Jimmy? XD
 

Rogue Leader

It's a trap!
Moderator


I would say it's highly likely the first run of "4GB" cards was very small due to this memory issue, but they figured take the hit on a few cards, get them out there to meet the price point, and have it out in public. Then you have the person in the store: they can't get the 4GB (out of stock), but at that point they're like, eh, what's 30 more bucks?

Makes sense why my Microcenter only had 2 of them vs. 30+ 8GB cards.

Most likely when 4GB cards come back into stock they will all be true 4GB cards.
 

Math Geek

Titan
Ambassador
here's some more fun with aftermarket cooling for the 480 http://wccftech.com/amd-rx-480-overclocked-air-water/

seems with solid air and water cooling, the reference pcb of the 480 is a VERY robust platform. on water it clocked high enough that it "beats R9 390X, 980 & Nano but is also effectively encroaching on R9 Fury territory."

gets me a bit more excited for the custom cards to start showing up in the next couple of weeks.
 
So the Asus Strix might have a 6-pin and an 8-pin connector? This is more like it. That's 300W and plenty of potential overclocking headroom.

All the reference cards have been a bit of a yawn for me.

Now the serious business begins with the 'no compromise' custom designs. Can't wait!
 
Lots of salt for WTFBBQTech.

I really don't think most partners will go over 1.2GHz base and 1.3GHz turbo for their non-reference cards. Maybe some halo products will push to 1.3/1.4GHz, but that won't be the norm. I will be happy with a cool-running, silent card; AMD, and most probably Sapphire, can have my money then. That Asus Strix looks nice as well, but the bling bling... Ugh.

Cheers!
 


Why are you refusing to read my posts? I specifically said that the issue with the card was exposing an issue with the motherboard.
 


The Strix's RGB is logical; everything is RGB these days. The good thing is you can normally turn it off as well.
 


A motherboard issue? No, it's an issue with the card. You would expect any consumer product you stick into a motherboard to comply with the specs and standards.
 


I am not. I just don't think the motherboard is the issue; I think it is the GPU's power draw. If this happened with nVidia I would be on the same page. It is just odd that people who ran older, higher-power-draw cards did not have this issue, because those GPUs did not pull as much power from the PCIe slot.

It is too much of a coincidence.
 

Math Geek

Titan
Ambassador
the strix is always a good card. but i love the understated classy look of the sapphire card. reminds me of the evga acx cooler for nvidia cards.

of course if there is a lot of oc headroom, we'll see moderate oc's for regular cards and the all-out oc's for the top end cards. that's a given really. but the fact that the reference pcb can handle a high oc leaves a lot of hope that, with an extra power connector and better cooling, this could be a really great card. we saw references to "beast mode" cards early on. those are probably the superclocked high end cards, but if priced right they could be a solid buy. we won't know how they will be priced or perform until they are out, but around $300 sounds about right considering the 8gb reference is $240.
 


The thing I like about the Strix is the DirectCU III cooler. Sure, it isn't as simple, but it is a good cooling design.
 

Math Geek

Titan
Ambassador
the more i look at it, the more i like the odd palit gamerock card. that grey/blue/white color is so out there that i almost like it. i can picture it, with the right component mix, being part of a nice looking system.

but the strix has always been a solid go-to card with great cooling and acoustics. it may not be the highest clocked card out there, but i can always say it is a great card whether i have read the review or not :)
 

Rogue Leader

It's a trap!
Moderator


We have seen it; it even has replaceable fans. A pic was included in the instruction booklet of the reference cards.

[Image: custom XFX RX 480 card]
 


That depends. At launch the Strix 980 Ti was the highest-clocked 980 Ti.
 