AMD RX 400 series (Polaris) MegaThread! FAQ & Resources

Status
Not open for further replies.


Well, that is just my own speculation anyway. But the way I see it, AMD puts far fewer restrictions on their board partners. Take the Asus RX 460 Strix, for example: the RX 460 in general was supposed to be a sub-75W GPU, but the Asus RX 460 can still easily use up to 90W, which basically eliminates the power advantage it was supposed to have vs the GTX 950.

https://www.techpowerup.com/reviews/ASUS/RX_460_STRIX_OC/22.html
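For context on why 90W matters, the PCIe spec caps slot power at 75W, with each 6-pin connector adding another 75W (and an 8-pin adding 150W). A minimal sketch of that budget math (spec limits, not measured values):

```python
# Rough PCIe power-budget math behind the 75W discussion: the x16 slot
# supplies up to 75W, each 6-pin adds 75W, each 8-pin adds 150W.
def power_budget(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Total board power allowed by the slot plus aux connectors, in watts."""
    SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

# A ~90W RX 460 exceeds what a slot-only card may draw...
print(90 <= power_budget())           # False
# ...which is presumably why Asus added the 6-pin (budget becomes 150W).
print(90 <= power_budget(six_pin=1))  # True
```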
 
About the warranty voiding:

I think AMD does not cover the replacement cost of a burned/damaged GPU when that happens; rather, the 3rd party (MSI, XFX and the others) covers it. Is that correct to assume?

About the disabled bits:

I think Asus and all the other fellas that slapped a 6-pin onto the RX 460 might have been anticipating this one way or another. I mean, it's really curious that they are increasing the power requirement for an extra 5%-10% when the original promise was a 75W card. I think that theory has some weight, given that Asus has a history of doing this sort of thing with every chip provider (the P4 era, with Intel's 865PE chipset, is the best example I can remember). Always finding ways to get more out of the stuff they receive 😛

In any case, if a user is willing to risk the RMA/warranty process, or they just feel like it, it's a good thing to have, I guess. Although I still think shipping the full GPU would have been better...

Cheers!

EDIT: Typos.
 
I think AMD does not cover the replacement cost of a burned/damaged GPU when that happens; rather, the 3rd party (MSI, XFX and the others) covers it. Is that correct to assume?

Not sure, but in Nvidia's case, if the chip somehow burns out while still running inside Nvidia's spec, Nvidia will provide the replacement chip. As far as I know there is no such restriction with AMD; I mean, AMD does not really dictate what parameters board partners should follow when they make factory-overclocked cards. So I assume that even if a board partner goes crazy with it, AMD will still provide the replacement. And looking at how board partners did not go crazy when making Nvidia cards proves that this free replacement from the GPU maker is really a big deal to them.

I think Asus and all the other fellas that slapped a 6-pin onto the RX 460 might have been anticipating this one way or another. I mean, it's really curious that they are increasing the power requirement for an extra 5%-10% when the original promise was a 75W card. I think that theory has some weight, given that Asus has a history of doing this sort of thing with every chip provider (the P4 era, with Intel's 865PE chipset, is the best example I can remember). Always finding ways to get more out of the stuff they receive 😛

Well, Asus is already releasing the GTX 1050 Ti, bar-none edition. But those are also more expensive; more so than the RX 470 4GB.
 
http://techreport.com/review/31093/amd-opens-up-machine-learning-with-radeon-instinct

Is that "Vega" sample image true to its projected size?

There's a lot of speculation one can do from that picture 😛

I posted that in the comments section at TR, since Tom's and AT didn't provide that picture, or at least I don't remember seeing it.

EDIT: In Tom's article, they have a picture of a server from Micro-something, but it doesn't have a label next to the picture, I guess?

Cheers!
 


Overclocking by the user is never covered by any of the manufacturers. Despite them designing and selling cards that can be overclocked, if you blow one up overclocking it, the warranty is void. I've dealt with a few manufacturers on this.

HOWEVER

Depending on the damage, there really is no way for them to know you overclocked it. So keep that in mind. Many things besides overclocking can fry a GPU.
 


We're gonna need a Vega thread at some point.

As for the contents, AMD claiming a certain level of performance is fine, but as always we should wait for reviews.
 
We won't need a separate thread if it's a 490...

But it's good to hear that certainty.

Remains to be seen I suppose... I guess we will know more tomorrow.
 
Why an extra Vega thread? Isn't Vega going to be another 400 card?

1080 performance doesn't look good, tbh. Unless the price is alluring.
Also: this was Vulkan. Let's wait for more benchmarks and see if it can hold onto 1080 performance.
Hopefully the big Vega will be out soon and we will finally see some competition.
 


Yeah if it's like that and there are more games other than doom benchmarked, then I'm all for it!

I mean, a 480@4K beats a 1060 by nearly 100% in Doom, if I remember correctly.
 
Something like that; it was a BIG difference. In my opinion AMD is going to look better and better as DX12 and Vulkan take center stage. I know Nvidia fanboys like to ignore the reality of it, but AMD has the DX12 thing pretty well figured out. So with equal/similar performance between an AMD and an Nvidia card, the nod goes to AMD simply for this fact. If I can get almost everything out of an AMD card in DX11 compared to a similarly priced Nvidia card, but a good bit more in DX12/Vulkan, that's a no-brainer to me. No amount of loyalty and fanboy nonsense can keep me from making that common-sense decision.

Clearly not the same thinking for others, but it makes sense to my wallet 😀
 

Err, don't think so. I remember when Doom was first patched to support Vulkan, the RX 480 was beating the GTX 1060 by a fair bit (don't think it was close to double though). But I'm pretty sure there was another patch later that improved Vulkan support on Pascal, and now the performance is fairly similar.
Here are 1440p benchmarks, and I can't imagine 4K being that much different:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review-20.html

If you look at the benchmarks in that article, you'll see that the whole "AMD is (much) better for Vulkan and DX12" isn't as pronounced as some people think. On the flip side, the RX 480 seems to have closed the gap in DX11 since release, and now trades blows with the GTX 1060. Overall performance between the two seems surprisingly balanced.

Edit: Derp, didn't look at the summaries on the last page of that article. They found the GTX 1060 performed virtually the same as the RX 480 in DX11 (2%/0% better at 1080p/1440p), and a little worse in DX12/Vulkan on average (-6% at both 1080p and 1440p).
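For what it's worth, per-API summaries like those are typically a mean of per-game fps ratios. A minimal sketch of that kind of roll-up, using made-up numbers (not the HardwareCanucks data), with a geometric mean so no single game dominates:

```python
from statistics import geometric_mean

# Hypothetical per-game average fps pairs (gtx1060_fps, rx480_fps);
# illustrative only, not the figures from the article.
dx12_results = [(60, 64), (45, 48), (80, 83)]

# Per-game ratio of GTX 1060 fps to RX 480 fps, summarized geometrically.
ratios = [a / b for a, b in dx12_results]
lead_pct = (geometric_mean(ratios) - 1) * 100
print(f"GTX 1060 vs RX 480: {lead_pct:+.1f}%")  # negative => 1060 behind
```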
 
I can't link you the site, since it was a while ago, but I remember that 4K Doom on a 1060 was like 25 fps and on a 480 around 40 fps. 1080p was virtually the same, but 4K was that huge leap.

EDIT: it might've been pre-patch for the green team now that I think about it.
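Taking those remembered figures at face value, the gap works out to 60%; large, but short of the "nearly 100%" mentioned earlier:

```python
# Percent lead of one average fps over another. The 40 vs 25 numbers are
# the poster's recollection of 4K Doom on an RX 480 vs a GTX 1060.
def percent_lead(fps_a: float, fps_b: float) -> float:
    return round((fps_a / fps_b - 1) * 100, 1)

print(percent_lead(40, 25))  # 60.0 -> a 60% lead, not quite double
```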
 


Well, this could end up being the "big" Vega until AMD can come up with Vega 20, which is slated to come out in 2019 on GloFo's 7nm process. AMD is introducing their new cards, called Radeon Instinct: the MI6 rated at 150W (pointing to Polaris 10), the MI8 rated at 175W (pointing to the Fiji Nano), and the MI25, rated at 300W, for the new Vega chip. Being rated at 300W, I don't think this is the 'small' chip. Or Vega simply has one chip only: a fully enabled one and a cut-down version. If that is the case, we just don't know whether the Vega used for the Doom demo is the fully enabled chip or the cut-down version.
 


That is the average combined score, though. They added more games that favor AMD cards even in DX11 (like Titanfall 2), hence in the averages we see the RX 480 closing the DX11 gap to nothing vs the 1060. It doesn't mean the RX 480 is now as fast as the GTX 1060 in every DX11 title, but I see some people throwing this bench around as proof that it is.
 
I love 4K gaming: press a key and you will react in about 3 seconds' time. Super cool gaming experience. If you want 4K gaming, get an Xbox One and a new LG 4K UHD TV and booooom, a locked 45 FPS at 4K without LAAAAAAG
 
To put the performance of the 'mystery' Vega card into perspective, in Doom (campaign) at 4K Ultra with AA off, I get an average of ~45 fps with an R9 Fury @ 1050MHz. Though that may be a conservative estimate; overclocking it pushes that figure up a bit. We don't know what settings were used in the demo (max?), but I expect it didn't have optimised drivers either.
 


Hope we get the answers to these questions at today's big AMD promotional event (3PM CST):

http://www.amd.com/en-us/innovations/new-horizon
 