Radeon RX 590 Allegedly Up to 9 Percent Faster Than GeForce GTX 1060

Page 2 - Tom's Hardware community discussion.

Yeah, what's up with this?

I feel like they were onto something with their Fury Nano. I don't really want water-cooling, but it'd be nice to have a similar option of a smaller, lower-clocked Vega 56 (or maybe Vega 52?).

My Linux PC has an HD 7870, which I'd like to replace with another AMD card. I was waiting for the crypto boom to die down, but now I'll probably wait until the RX 590 launches, and maybe Navi.
 

Nvidia is a company that can only be forced to do the right thing by market pressure alone. That said, I agree that the RTX cards probably have all the hardware needed to support VESA Adaptive Sync.

I wonder if the EU's anti-competition laws could force their hand...
 

This sounds like a refresh.

A rebrand is where they take the exact same GPU and sell it with a different name. A refresh is where they make some real improvement. Recall Intel's Haswell refresh and now their Coffee Lake Refresh. Ryzen 2000 should also be regarded as a refresh.

You could say the RX 500 series was a rebrand, even though they did a respin of the Polaris 10 to create the Polaris 20 used in the RX 570 and RX 580.
 


I get the impression that the margins on the Vega cards are pretty thin. I'm not sure how true that is, as I believe Vega 56 was intended to be priced at 1070 levels, and Vega 64 at 1080 levels. I've read some people say that AMD was actually losing money on the Vega cards, but I don't think that sounds likely.

Maybe HBM + Vega chip is too pricey to put on a lower end card, or to make a lower-clocked version of it? Not sure.

If it WERE profitable, something that performs well enough for max details at 60fps for 2560x1080 might be nice. Of course, I'm thinking of my own specific want/need at the moment.

But it seems like there's a gap between the 1060 6GB/580 8GB and the 1070/Vega 56, performance-wise. Like there's room for a product about halfway between them, maybe.
 

Yeah, pricing is probably the issue. I think AMD underestimated global HBM2 demand, which probably blew their pricing model.

The interposer also adds cost, as does the beefy VRM and cooling system needed to drive & dissipate Vega's peak TDP.

Most telling is probably the die size, at 486 mm^2 (12.5 B transistors). Compare that with GP102's 471 mm^2 (12 B transistors), and you get the feeling that Vega 64 was originally targeted more at GTX 1080 Ti territory. The GPUs' TFLOPS and memory bandwidth specs also match up quite well.
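For what it's worth, the TFLOPS comparison behind that observation is just shader count × 2 FMA ops × boost clock. A quick sketch using the approximate public specs (the numbers here are my own assumptions from published spec sheets, not from this thread):

```python
# Back-of-envelope peak FP32 throughput: shaders * 2 ops/cycle (FMA) * boost clock.
def peak_tflops(shaders, boost_mhz):
    return shaders * 2 * boost_mhz * 1e6 / 1e12

# Approximate published specs (assumed, for illustration only):
vega64 = peak_tflops(4096, 1546)      # Vega 64:     ~484 GB/s HBM2
gtx1080ti = peak_tflops(3584, 1582)   # GTX 1080 Ti: ~484 GB/s GDDR5X

print(f"Vega 64:     {vega64:.1f} TFLOPS")
print(f"GTX 1080 Ti: {gtx1080ti:.1f} TFLOPS")
```

Both land in the 11-13 TFLOPS range with essentially identical memory bandwidth, which is what makes the "Vega 64 was aimed at the 1080 Ti" reading plausible.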

Anyway, I'm guessing current Vega 56 pricing is probably near its floor. That's probably why we don't get a Nano.
 

Call it what you want; it makes no difference to me. What you stated in your comment only reinforces the validity of my original comment. Have a nice day 😉
 
1) Adapter. 😉

2) If you're serious, yes... the first component to get [strike]thrown out a window[/strike] replaced and gently repurposed is that PSU. Failing that, if your PSU really is actually competent but for some reason lacks the requisite second connector you might have options for an adapter there too.

3) Prices fluctuate but in recent memory I've always seen at least one model on Newegg and one model on Amazon around $400. Sometimes multiple models... I've seen them on sale for LESS than $400 sometimes. That doesn't even count MIR type offers.

Did this start as an OEM box? Even when I work with fairly small gaming towers (even sometimes mITX cubes) I make sure it has room for at least one huge graphics card, should the owner want one or want to upgrade to one later. Sometimes that entails removing a cage that reduces the number of drive bays a tiny bit. ("But I can't cram a dozen 3.5" spinners in my gaming rig inhibiting airflow and adding vibration/noise-" seriously just get a NAS if you need an army of spinners.)

I wanna work with you man, but a semi-serious gaming box has some soft requirements that, if they aren't met, turn into barriers. 😛 That being said, if you want a decent card for a great price, you can't go wrong with a 580/590. They're affordable, decent, and support all the latest FreeSync wizardry that display offers. But the FreeSync-capable card to cross-shop with a 1070 is a Vega 56, and it will offer a superior higher-FPS experience, better delivering on the whispered promises of that display.

Oh, one last thing... chill out on Chill. Unless you're really strapped for cooling, I don't think you need to artificially limit your framerate that much. Set it way higher. Personally I'm not a fan of it; I'd rather just bolster airflow. If I do any tinkering, it would be with voltage and power limit.
 

Consumer 7nm cards are gonna be a while. Mid-range models? Maybe even longer. I'd love to be wrong, though. I just can't envision 7nm capacity being bonkers good.
 
The RX 590 is obviously, just like the RX 480 and 580, best used at 1080p or 1920x1200.
Comparisons at higher resolutions are therefore fairly irrelevant.

If we realistically assume that the 7nm process will initially produce a relatively high number of flaws per wafer, it should be good business to sell chips with the flawed parts disabled, resulting in mid-range GPUs made from partially disabled higher-end dies.
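That salvage-die logic can be illustrated with the classic Poisson die-yield model, yield ≈ exp(-D0 × area). The defect density and die sizes below are made-up illustrative assumptions, not actual TSMC 7nm figures:

```python
import math

# Poisson die-yield model: fraction of fully working dies = exp(-D0 * area).
# D0 = defect density in defects/cm^2, area in cm^2.
def die_yield(area_cm2, d0):
    return math.exp(-d0 * area_cm2)

big_die = 3.3   # ~330 mm^2 high-end die (assumed)
mid_die = 1.5   # ~150 mm^2 mid-range die (assumed)
d0_new = 0.5    # immature process, defects/cm^2 (assumed)

print(f"big die, fully working: {die_yield(big_die, d0_new):.0%}")
print(f"mid die, fully working: {die_yield(mid_die, d0_new):.0%}")
```

On an immature process the big die's perfect-yield rate collapses, so most of what comes off the wafer has at least one flaw; selling those with the flawed blocks fused off (the Vega 56 approach) is what turns an otherwise scrapped die into a mid-range product.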
 

No go - it only has 2x6-pin connectors.


Yep, a Dell XPS 8910. Currently running a Gigabyte Windforce OC R9 285 (190W TDP, used to be in my system). Otherwise, i5, 8GB RAM, single spinning drive. It's just that, if the card portion itself is wider, and the PCIe connectors stick out as far as some of the wide fan shrouds, then it'll be a problem. Take a look, for example, at the various EVGA 10 series SC versions versus the FTW versions. The former will fit; the latter will also fit, but then plugging in the PCIe connectors will prevent putting the side panel back on.

That said, Dell outright states for their 460W PSU versions of the XPS that it will work with up to a 225W video card. I guess technically, that's correct (75W slot, plus 2x 6-pin at 75 each).
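Dell's 225W figure does fall straight out of the PCIe power-delivery limits (75W from the slot, 75W per 6-pin, 150W per 8-pin). A trivial sanity-check sketch (the function name is mine; the wattages are the PCIe spec limits):

```python
# Per-source power limits defined by the PCIe spec (watts).
CONNECTOR_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_limit(connectors):
    """Sum the spec-rated wattage of the slot plus aux connectors."""
    return sum(CONNECTOR_W[c] for c in connectors)

# The XPS config: slot + two 6-pin connectors.
print(board_power_limit(["slot", "6-pin", "6-pin"]))  # 225
```

That's why a card with any 8-pin requirement is off the table without an adapter or a PSU swap: the same math with two 8-pins gives a 375W ceiling, which the stock wiring was never rated to deliver.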


Yeah, the 56 and 1070 are on the same tier, but I was only considering the 1070 because of weird pricing, and I wondered if the 580 would fall short.



My son, strangely enough, prefers more details to more frames. Still, if something could manage to stay smooth at high details at 2560x1080 without going to 56/1070 level, I'd go for it. The FreeSync/Chill/LFC is just to keep things smooth during those more demanding moments in any given game, when the frame rate dips. Aside from that, we'd likely cap it at 60 or 75Hz with the new card.




On the OTHER other hand... if I do wind up getting a 2080 for my own machine, I'd just move my current card (a 1080) to his machine. But I can only justify my own upgrade if:
1 - I can confirm that nothing funky about Dell's BIOS would prevent it (It's an XPS 8700), AND
2 - A certain "may or may not happen" small windfall comes into my hands.

My system would handle the 2080, because before getting my 1080, I had anticipated getting a 1080Ti, and thus upgraded the power supply in anticipation of that. Wound up with a 1080 non-Ti instead, and, had I known that in advance, I would've stuck with the stock Dell PSU. They're actually reasonably durable - haven't had one fail yet, and my earliest of the same series is an XPS 8300 from 2012 that's still running strong (my son's PC at his mom's house).

Huh, seems like my plan is less definitive than it should be... lol.

I also feel like I've hijacked this thread way too far.
 
I'm not understanding the reference to higher than full HD resolutions here. The GTX 1060 and RX 580 were/are 1080p market oriented GPUs. Neither of them were solid sustained 60FPS/60Hz GPUs targeted at 1440p without game settings quality being turned down, and most certainly not meant for 4K.

But as others have mentioned many times, AMD is really missing a market opportunity by not challenging the 2070-2080 segment (as of yet). However, unlike Nvidia, AMD has a lot more challenges on its plate and less capital to spend on them. At the forefront, AMD focused a lot of resources on Ryzen development and is also working on the upcoming replacement APUs for the PS5 and the next Xbox (code-named Scarlett).

Nvidia really only does two things for the mainstream and prosumer markets: GPUs, and their small-market-share Android-based Shield multi-tasker, which is really more of a niche product and not a true console/AIO entertainment competitor to the PS, XB, or Nintendo's consoles.
 

What do you mean no go? Number 1 was about side panel clearance. Adapters can deal with the side-panel clearance, as well as adapt from 6 to 8. For example:

https://www.moddiy.com/products/PCIE-90-Degree-Angle-Low-Profile-8%252dPin-6%252dPin-Connector-Extension-Cable.html

Under "Type" select "6-Pin to 8-Pin". That cable deals with both issues. That being said, that PSU is underpowered, and I wouldn't consider it a good idea to keep it, regardless of whether he gets the hand-me-down 1080 or something else. If it takes standard ATX PSUs, $50 gets you a halfway decent 550-600W PSU with twin 8-pins. If it were mine, I'd probably also mod the chassis for more airflow.
 

We're not trying to pick on you - it's just that AMD has a sort of habit of rebrands and some of us feel it's worthwhile to draw a distinction when one exists.

I understand if it's not important to you, and I'm not saying it should be. Probably all of us wish AMD's 2018 GPU introductions had consisted of something a bit more substantial and "game-changing".
 

Well, they could take a page from Nvidia and its $3k Titan V.

I think we know that Vega 7 nm uses 4 stacks of HBM2. So, that will limit pricing and availability. Will they have enough defects to be worth creating a new consumer product? I'm doubtful. I see them doing a Vega 64/56 strategy, where the defects merely create a lower tier of cloud GPUs.

BTW, Vega 7 nm dies are going to pack even more transistors than Vega 10, since they include half-rate fp64 and new machine learning instructions.
 


Oh, when you said adapters, I didn't realize you meant the 90 degree angle things.

But - to try to use the two 6-pin PCIe cables (2x75W) as two 8-pin connectors (2x150W draw)? That would be a disaster. Two 6-pin into a single 8-pin, sure, but two 6-pin into two 8-pin? Yikes!

Yeah, a PSU upgrade would be the way to go, if I were to go with a heavy draw card. Of course, that then adds to the cost, which also plays a part in my calculation.

The GTX 1080, by the way, draws a little less power than the current R9 285.



All in all, I'd like to see how the new 590 performs, as well as the 1060 GDDR5X. Then I'll figure out whether to go 590, or Vega 56/1070.

Granted, upcoming holiday sales/discounts/pricing will also factor in.
 
Yes, and if you keep that chassis the 8-pin to 8-pin versions might still be needed for clearance.


If the PSU was more capable, it wouldn't be a problem. A competent PSU's 6-pin connectors and wiring are EASILY capable of supplying 75W or more. Now if you're using a cheap OEM PSU with marginal wiring... well, the math gets trickier and it's better to just get a new PSU... see below.

$50-60 for a competent 550-600W PSU with dual 8-pins, and I'd basically just recommend you go ahead and do so regardless. It's cheap insurance all around, especially if this thing is going to be his gaming box for the next year or two.

Maybe, if that 1080 is a reference design. Otherwise... all bets are off.
 
The latest leak/rumor I've heard was from Gamers Nexus: an early benchmark appeared that might be it, and it might put the card between a 1070 and 1080, roughly equal to a 1070 Ti. Like with any of these rumors, take it with a grain of salt until actual reviews and benchmarks come out. That's not to say that some rumors aren't true; no one believed the rumors and "hype" from AMD about first-gen Ryzen, and it turned out to be true. However, we also can't forget the hype surrounding Bulldozer...

Always best to just wait and see what the actual reviews and benchmarks say. But if it is the rough equal of a 1070 Ti, then even if it consumes more power, it will still be a welcome addition if the price is right. We definitely need more competition in higher-end GPUs. As it is now, Nvidia can charge what they want, and they know people will pay it because there really isn't a good choice "B". I've been stuck on my old R9 290 for years now, and the sad part is that an RX 580 really isn't a worthwhile upgrade for the cost. The 1070 Ti and 1080 prices haven't come down very much either; I was expecting a better price drop with the release of the 2070 and 2080...
 


Yep, Founders Edition - pretty much the only thing I could find in stock at the time (back in February).
 



That, admittedly, seems extremely dubious - no way this tweak is going to put the 590 over the Vega 56.
 

Wow, you are a lot more politically correct than I am. A 12nm Polaris with 36 CUs is stuck down in 1060 territory, period. It's just a little bump. The only way you'd see something new in Vega territory would be if they were ALSO going to shrink the larger Vega, but I feel like it's too late to bother. The first one to get replaced when they finally start churning out 7nm consumer GPUs is going to be Vega 56/64. The market segment that has to hold on a bit longer is the mid-range 36/32-CU Polaris units, so it makes sense to shrink those.