Discussion: Polaris, AMD's 4th Gen GCN Architecture

Status
Not open for further replies.


I'm sorry but that's not correct. During their reddit AMA, AMD said they sent reviewers an 8GB card with a special BIOS that allowed them to disable 4GB so they could benchmark it as if it were a 4GB model. Not the other way around as some think. The 4GB models being sold only have 4GB of physical memory.

https://www.reddit.com/r/pcmasterrace/comments/4qfy9d/i_work_at_amd_the_time_has_come_to_ama_about/d4smvmo
 


Ah, that makes a lot more sense
 


lol, I was thinking: if they intend to sell a 4GB card, why disable 4GB out of 8GB? This is not like unlocking extra shaders; that extra VRAM on the board still costs money. In any case, it is not a logical thing to do for a retail unit.
 


On another site however...

Another oddity – and one that may make enthusiasts a bit more cheery – is that AMD only built a single reference design card for this release to cover both the 4GB and 8GB varieties. That means that the cards that go on sale today listed as 4GB models will actually have 8GB of memory on them! With half of the DRAM partially disabled, it seems likely that someone soon will find a way to share a VBIOS to enable the additional VRAM. In fact, AMD provided me with a 4GB and an 8GB VBIOS for testing purposes. It’s a cost saving measure on AMD’s part – this way they only have to validate and build a single PCB.

http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-RX-480-Review-Polaris-Promise
 


It will depend on the memory interface configuration, but Nvidia had better not pair a 192-bit bus with a 4GB/8GB memory config. A configuration like that has the potential to ruin the card and make the RX 480 the more interesting choice instead.
 


Pffff my Asus Strix is sexier:

[attached image: 19b.jpg]
 


perhaps in that desperate "oh please someone look at me and give me your approval" kind of way. the sapphire is more of a classy "i know i'm the shiznit and don't need to be flashy to prove it" kind of thing. but hey whatever fills the void in your life 😛
 
So it's official and confirmed by nVidia of all people: the RX 480 is faster than the GTX 1060. The nVidia slide shows the 1060 as 15% faster, but that is undoubtedly the best showing for the 1060, from the most nVidia-friendly game they could find. I assume in the real world the cards are equal or the RX 480 is ahead :)
 


Got a link for that?
 


But pulling a bit too much power through the 6-pin power connector really wouldn't be an issue 99.9% of the time. Drawing too much through the PCIe slot is more sketchy.
 


Can we maybe just wait for benchmarks?
 


That depends on what the connectors are rated for; it is possible that drawing too much current could cause melting. More importantly though, if the card is being marketed as a 150W card then it should be a 150W (or less) card, and if it isn't, that is deceitful, is it not?
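For context on why slot draw is the sketchier failure mode: the PCIe CEM spec limits a x16 slot to roughly 75 W total, with the 12 V rail capped at about 5.5 A (66 W), while a 6-pin auxiliary connector is rated for another 75 W. A rough sketch of that budget (the even 75/75 split below is an illustrative assumption, not a measurement of any particular card):

```python
# Rough PCIe power-budget arithmetic for a nominal 150 W card.
# Spec figures: x16 slot allows ~5.5 A on the 12 V rail (66 W);
# a 6-pin auxiliary connector is rated for 75 W.
SLOT_12V_AMP_LIMIT = 5.5                         # amps, per PCIe CEM spec
SLOT_12V_WATT_LIMIT = SLOT_12V_AMP_LIMIT * 12.0  # 66 W
SIX_PIN_WATT_LIMIT = 75.0

def slot_current(watts_from_slot, volts=12.0):
    """Current the card pulls through the slot's 12 V pins."""
    return watts_from_slot / volts

# If a 150 W card splits its draw evenly, the slot sees ~75 W:
amps = slot_current(75.0)
print(f"{amps:.2f} A through the slot's 12 V pins (limit {SLOT_12V_AMP_LIMIT} A)")
# 6.25 A exceeds the 5.5 A slot limit, while the same 75 W through the
# 6-pin connector stays within its rating -- hence over-drawing the
# connector is far more forgiving than over-drawing the slot.
```

This is also why the reported fix was to shift more of the load onto the 6-pin connector: cables and PSU-side connectors typically have headroom well beyond 75 W, whereas the motherboard slot traces do not.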
 


Re-read what he posted. He is inferring something from the slide.

Cheers!
 


It seems to me as though he was inferring that only one game was used to get that result, which is why I asked for a link. Now, when the card is released, if the results show the 1060 to be faster across the board, we will know he was talking out of his arse.
 
JonnyGuru forum members have subjected 6-pin PCIe cables to over 150W of load without a problem. Also, if the GTX 1060 is better than the RX 480, it will be priced higher. That's how it works. Nobody starts price wars; that is stupid stupid stupid for business. Every card is always priced according to its performance.
 


No need to get so defensive on an opinion. You really love nVidia, don't you? 😛

In any case, AMD used Ashes and they were right for that game. I would imagine when nVidia releases more slides we'll know how the cards will stack up, since the 480 has been reviewed very thoroughly by now. IMO, there is merit to his theory.

Cheers!
 
Even with the power problem, I don't consider the RX480 a bad product. Once that is solved, it will sell very well I'd say.

On a broader spectrum, yes, AMD has made some very dumb decisions from the external point of view. Still, they have managed to stay afloat and I hope they keep on doing it.

What I am wondering now is who's going to take the fall for this one: Mr. "butterfingers" Koduri or Mrs. "itrytolaughbutidon'tknowhow" Su.

Cheers!
 