Discussion AMD NAVI RX 5700 XT's picture and SPECS leaked *off topic*

Hello,

You may or may not be aware of this, but Videocardz has just leaked some specs and info on AMD's upcoming Navi GPU.

The AMD Radeon RX 5700 XT features 40 Compute Units (2560 stream processors). The Navi GPU is clocked at 1605 MHz base, 1755 MHz in Game mode, and 1905 MHz in boost mode. Of course, the new addition here is the Game Clock.

With the said boost clock, AMD expects a maximum of 9.75 TFLOPs of single-precision compute from the Radeon RX 5700 XT. The card is also confirmed to feature 8 GB of GDDR6 memory that should run across a 256-bit wide bus interface.

The memory clock, pricing, and availability date were not available at the time of writing.
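As a sanity check on the leaked numbers, the quoted TFLOPs figure follows directly from the shader count and boost clock. A quick back-of-envelope sketch (note: the 14 Gbps GDDR6 per-pin speed below is my assumption, typical for GDDR6 at the time, since the leak did not confirm the memory clock):

```python
# Back-of-envelope check of the leaked RX 5700 XT numbers.
# FP32 throughput: stream processors x 2 ops per clock (FMA) x clock speed.
stream_processors = 2560
boost_clock_ghz = 1.905

tflops = stream_processors * 2 * boost_clock_ghz / 1000  # GFLOPs -> TFLOPs
print(f"FP32 compute: {tflops:.2f} TFLOPs")  # ~9.75, matching the leak

# Memory bandwidth on the rumored 256-bit bus. The 14 Gbps per-pin
# speed is an assumption (common GDDR6 bin), not part of the leak.
bus_width_bits = 256
gbps_per_pin = 14
bandwidth_gbs = bus_width_bits * gbps_per_pin / 8
print(f"Bandwidth (assuming 14 Gbps GDDR6): {bandwidth_gbs:.0f} GB/s")
```

So the 9.75 TFLOPs figure checks out against the leaked clocks, and a 256-bit bus at typical GDDR6 speeds would land around 448 GB/s.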

https://videocardz.com/80966/amd-radeon-rx-5700-xt-picture-and-specs-leaked

https://wccftech.com/amd-radeon-rx-5700-xt-7nm-navi-gpu-rdna-specs-leak-8-gb-2560-cores/
 

King_V

I just hope AMD has done the proper job of making sure their blower types are not a repeat of the HD6K series and HD7K.

That seems extraordinarily unlikely. I don't think the Vega cards went back to the 6K or 7K series for their blower design, no reason that Navi would. Especially since they had that detailed diagram and there was specific mention of a vapor chamber.
 
Some updates on this issue.

First of all, according to GURU3D, custom AIB Navi cards will be available in August. Yes, there will be AIB (custom) cards for the new Radeon RX 5700 and 5700 XT; however, these will not be available during the launch in a few weeks, but roughly one month later.

So AIB Radeon 5700 and 5700 XT cards are a definite yes, and they will become available roughly one month after the reference launch on the 7th....

https://www.guru3d.com/news-story/a...aib-customized-cards-available-in-august.html

In other news, VIDEOCARDZ posted this article... AMD will not sell its 50th Anniversary Edition of the Navi-based RX 5700 XT graphics card outside China and the USA.

During the Next Horizon Gaming product announcement, AMD president and CEO Dr. Lisa Su unveiled a limited 50th-anniversary edition of the Radeon RX 5700 XT. The 75-MHz-faster variant of the Navi 10-based model will only be available in China and the USA, according to the news report from Cowcotland.

No customers in Europe, Oceania or Africa will be able to buy this card directly. This means that the RX 5700 XT 50th will follow the path of the Radeon VII 50th SKU, which was also hard to buy in certain regions.

https://videocardz.com/newz/amd-radeon-rx-5700-xt-50th-anniversary-edition-only-for-usa-and-china

AMD implied during an interview that it will be downright impossible to overclock these cards or mess with the power settings to overclock them. In other words: AMD pushed them to the limits, like Vega.

Now let's look at AMD's history:

Fury Release:
  • Release timing against cheaper-to-make NVIDIA cards; HBM pricing handicap
  • 4GB memory limitation. Its bandwidth was designed for 4K, but it lacked enough memory FOR 4K.
RX480 Release:
  • Violated the 75W PCIe spec, which was a huge controversy.
  • Was a superior card to the 1060, however, based on pricing.
Vega Release:
  • Hot, expensive, late, and too slow against the 1080 Ti, which blew it out of the water.
Vega 7nm Release:
  • Low margin, just to keep AMD in the game
  • Was a decent competitor to the 1080 Ti if set up properly (undervolt, then crank the power limit)

Navi (5700 XT):
  • Priced directly up against the 2070 (which will be its main competitor, and has more features)
  • The 2070 will overclock. Will the 5700 XT? (Unknown, but based on history, if it does overclock, it will be LIMITED at best)
  • The 2070's price is about to drop, decimating the value of the 5700 XT.
  • <20% of the market sits in this price bracket, with no replacement for Polaris (now entry level)
  • No Variable Rate Shading
I would say this is about the most bone-headed decision AMD could have made on pricing. Almost out of the gate, they will need to lower margins to compete.

So I would say AMD messed up yet again with marketing and planning choices.

I wonder how many Fury Nanos they sat on? Do you remember no one would touch them? The Fury Nanos were slower, yet cost just as much as the regular Fury card, which wasn't that much bigger.

AMD just keeps misjudging the market. That said, AMD has done wonders (especially with drivers) given the EXTREMELY limited budget. Their GPU division has been starved of budget for years. Finding qualified engineers in this field is difficult. Then you have to on-board them, and they have to learn your plans and design strategies.

So even IF AMD could turn everything around right now and dump 50% of their total company margin back into the GPU biz, it would still take several years to come back and kneecap NVIDIA. And I think this is why Raja left in frustration. (I'm not a huge fan of Raja. His scope can be tunnel-vision-like. But he's a smart man, engineering-wise.)

I think AMD's end game here is to make APUs so powerful that they own the mid-range of the market, which represents 50% of the margin revenue. You could theoretically put out an APU that is just as fast and powerful as a mid-range card (80% of the market) in a few years using chiplets. It would be cheaper than separate parts, and because of that, margins will be greater and you can still underprice your competition. Imagine offering 1660 Ti-level performance for $100 less on a single-chip solution. Yes, I really do believe APUs will get this powerful in a few years if they integrate HBM on the package.

And NVIDIA can't compete here because they don't have an x86 license. Cut off 50% of NVIDIA's revenue, and their GPU biz will deeply suffer unless they offer something incredible AMD can't. However, NVIDIA will keep the top-end lead for years to come. There's just too much of a gap there to overcome. AMD is struggling to compete at 7nm against NVIDIA's 12/14 nm nodes.

Don't take this rant as anti-AMD. I rather dislike NVIDIA. But AMD needs a "come to digital Jesus" moment with this absurd marketing and R&D budgeting in the PC space. (The semi-custom business is awesome. The PC space, not so much.)

Other predictions:
  • Required horsepower for the same resolution will continue to increase, however not at the same rate.

  • Navi 20 will have to be on the 7nm+ node to keep power at reasonable levels. NVIDIA will blow it away at 7nm.

  • Ray tracing will be offloaded to servers, as a service you pay for to get full-scene ray tracing (à la Xbox/PS/Stadia), although simple low-level ray-tracing elements will be available locally.

  • Gaming resolution will stall with 4K monitors, because anything > 4K on a < 32" screen doesn't make any sense. (2560x1440 is now becoming mainstream.)

  • VR will start to gain a serious foothold. Foveated rendering will be based on eye tracking. Only objects within a 3-7 degree arc will get the highest resolution. Rendering resolution outside that arc will fall off quickly, dramatically improving performance and increasing the overall field of view.
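To illustrate the foveated-rendering idea in that last prediction, here's a toy sketch. The 5-degree foveal radius, the linear falloff rate, and the 10% resolution floor are purely illustrative assumptions, not how any shipping headset works:

```python
# Toy sketch of eye-tracked foveated rendering: full resolution inside
# a small foveal arc around the gaze point, falling off quickly outside.
# The fovea size, falloff rate, and floor are illustrative assumptions.
def render_scale(eccentricity_deg, fovea_deg=5.0, falloff=0.08):
    """Fraction of full resolution to render at a given angle from gaze."""
    if eccentricity_deg <= fovea_deg:
        return 1.0  # inside the foveal arc: full resolution
    # linear falloff beyond the fovea, clamped to a 10% floor
    return max(0.1, 1.0 - falloff * (eccentricity_deg - fovea_deg))

for angle in (0, 5, 10, 20, 45):
    print(f"{angle:>2} deg -> {render_scale(angle):.2f}x resolution")
```

The point is the shaded-pixel budget: most of a wide field of view ends up rendered at a small fraction of full resolution, which is where the performance win comes from.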
 
That seems extraordinarily unlikely. I don't think the Vega cards went back to the 6K or 7K series for their blower design, no reason that Navi would. Especially since they had that detailed diagram and there was specific mention of a vapor chamber.
True. I actually forgot they had blower designs with Vega, and I don't recall reading horrendous things about them.

Cheers!
 

King_V

AMD implied during an interview that it will be downright impossible to overclock these cards or mess with the power settings to overclock them. In other words: AMD pushed them to the limits, like Vega.


RX480 Release:
  • Violated the 75W PCIe spec, which was a huge controversy.
  • Was a superior card to the 1060, however, based on pricing.
Your post seems like an assuming-the-worst-of-all-worlds thing, somewhat unfairly, but I specifically want to touch on two particular points, quoted above.

1 - Um, the 50th anniversary is slightly overclocked, and the increased TDP suggests that they upped the power settings. I don't know if there's much more room than that or not, but obviously there was an overclock. Keep in mind this is the initial release. Further, the silicon can mature. Remember the 2016 article about undervolting the R9 Fury to improve efficiency?

2 - The 75W PCIe spec was violated, and it did become a controversy, rightly so. Strangely, when Nvidia moved their 1050 non-Ti from 2 to 3GB, and compensated for cutting the memory bandwidth, cache, and ROPs by upping the frequencies, lo and behold, they exceeded the spec, as well. Not by as much as the RX480 initially did. Yet, while it's mentioned in the review, there was remarkable silence for the most part, which seems strange to me, particularly since AMD took steps to fix the issue, whereas, to the best of my knowledge, Nvidia has not done so.
 
Your post seems like an assuming-the-worst-of-all-worlds thing, somewhat unfairly, but I specifically want to touch on two particular points, quoted above.

1 - Um, the 50th anniversary is slightly overclocked, and the increased TDP suggests that they upped the power settings. I don't know if there's much more room than that or not, but obviously there was an overclock. Keep in mind this is the initial release. Further, the silicon can mature. Remember the 2016 article about undervolting the R9 Fury to improve efficiency?

2 - The 75W PCIe spec was violated, and it did become a controversy, rightly so. Strangely, when Nvidia moved their 1050 non-Ti from 2 to 3GB, and compensated for cutting the memory bandwidth, cache, and ROPs by upping the frequencies, lo and behold, they exceeded the spec, as well. Not by as much as the RX480 initially did. Yet, while it's mentioned in the review, there was remarkable silence for the most part, which seems strange to me, particularly since AMD took steps to fix the issue, whereas, to the best of my knowledge, Nvidia has not done so.

I think AMD is in a really tough spot. Don't take me wrong, I want them to succeed. I really do. I love my AMD 7970, ATI 9800 Pro, and original ATI 8500. And my hatred for NVIDIA's business practices is what has kept me from jumping ship. But it looks like AMD is abandoning delivering mainstream graphics value, which is where their strong point was. I think they will see their market share start to plummet if this pricing model continues.

I'll agree with you on point 2. And to be honest, the 1070 had similar issues if you overclocked it at all (it was right at the line at stock config). NVIDIA should have been roasted on both.

That said, I don't think the current pricing models of both NVIDIA and AMD can be sustained. There have been rumblings of disappointing sales of the NVIDIA RTX series. I think NVIDIA falsely saw the price increases as justified because the crypto boom had created sustained high prices. Therefore NVIDIA believed they could continue it and the market would bear it because there isn't a choice. That was a mistake for various reasons (mainly supply-and-demand basics). And soon the boom will fall. This is especially true as soon as Intel gets in on the act.
 

King_V

Fair enough - I do think that if Navi is, at MSRP, equaling or undercutting the equivalent Nvidia card price-wise, while performing equal or better, they're in the right spot.

That seems to be the case so far with the 5700XT vs 2070, but maybe not with the 5700 vs 2060. I'm really looking forward to the reviews come July.

I'm also hoping that AMD actually has enough margin to cut prices if need be, and still make a profit. Competition and a battle in the market is needed. It does seem that AMD has the 1920x1080@60 price/performance category nailed down vs Nvidia, though the 5700/5700XT look to be pushing into higher territory a bit, but, obviously, not 2080/2080Ti territory.

On the other hand, while battling for the crown is worth kudos/bragging rights, are they losing a whole lot by not being in the topmost tier? I suspect maybe not, though I don't really know the answer to that.
 
Looking at most of ATI/AMD when they've introduced cards, out of all the big successes, the HD4870 stands out the most to me and it wasn't even a "homerun" from the performance perspective. I'm sure many of you will know why.

What AMD is doing with the Navi siblings is daft and short-sighted IMO. They'll get their hand forced sooner than later, but they lost their big chance at bringing a second HD4870.

Cheers!
 

King_V

Looking at most of ATI/AMD when they've introduced cards, out of all the big successes, the HD4870 stands out the most to me and it wasn't even a "homerun" from the performance perspective. I'm sure many of you will know why.

What AMD is doing with the Navi siblings is daft and short-sighted IMO. They'll get their hand forced sooner than later, but they lost their big chance at bringing a second HD4870.


I don't know why, actually, but at the time, I was sort of out of the PC game in general.

That said, what is it that AMD is doing that is daft, exactly? That they don't have a card that competes with the 2080Ti? Or, that they don't have it on day 1?

I mean, the 1080Ti didn't come out until, what, 10 months or so after the 1080, I think it was. Granted, the 1080 was the top dog (I tend to exclude the crazily priced Titan cards) until the 1080Ti.

2080Ti has Titan level pricing currently, which in and of itself seems pretty insane.
 
I don't know why, actually, but at the time, I was sort of out of the PC game in general.

That said, what is it that AMD is doing that is daft, exactly? That they don't have a card that competes with the 2080Ti? Or, that they don't have it on day 1?

I mean, the 1080Ti didn't come out until, what, 10 months or so after the 1080, I think it was. Granted, the 1080 was the top dog (I tend to exclude the crazily priced Titan cards) until the 1080Ti.

2080Ti has Titan level pricing currently, which in and of itself seems pretty insane.

The problem is that AMD typically released half a year to a year behind their competitors, with no clear advantage.

Now they are releasing with fewer features and a soon-to-be greater MSRP for similar performance.
 
I think AMD's end game here is to make APUs so powerful that they own the mid-range of the market, which represents 50% of the margin revenue. You could theoretically put out an APU that is just as fast and powerful as a mid-range card (80% of the market) in a few years using chiplets. It would be cheaper than separate parts, and because of that, margins will be greater and you can still underprice your competition. Imagine offering 1660 Ti-level performance for $100 less on a single-chip solution. Yes, I really do believe APUs will get this powerful in a few years if they integrate HBM on the package.

The problem with this is games are also becoming more and more demanding. By the time they can make an APU with a GPU as fast as a 1660 Ti, we will consider such performance very low-end. Also, NVIDIA is pushing RTRT real hard. What is the end goal of this RT stuff? To replace all the fake effects we are using now with RT-based effects, which in turn will clean game engines of the bloat we have right now and cause fewer issues going forward. RT effects are very demanding, even with specialized RT cores doing the job. And this is just one example. Over the years, game developers keep adding a lot of stuff, making their games look even better but also more demanding to run. This is why integrated graphics can't really catch up: when they get improved performance, the performance bar also gets raised. In 2010, the GTX480 was the fastest GPU in the world. In 2014, the GTX750 Ti had similar performance to the GTX480.
 
I think AMD is in a really tough spot. Don't take me wrong, I want them to succeed. I really do. I love my AMD 7970, ATI 9800 Pro, and original ATI 8500. And my hatred for NVIDIA's business practices is what has kept me from jumping ship. But it looks like AMD is abandoning delivering mainstream graphics value, which is where their strong point was. I think they will see their market share start to plummet if this pricing model continues.

But that strong point never really helped them as a company. No doubt being cheap will help them move more volume, but to effectively gain market share from NVIDIA, AMD needs more than just being the value champion. For one, they seriously need a real marketing effort.

That said, I don't think the current pricing models by both NVIDIA and AMD can be sustained. There have been rumblings of disappointing sales of NVIDIA RTX series. I think NVIDIA falsely saw justified price increases because the crypto boom created a sustained high price. Therefore NVIDIA believed they could continue it and the market would bear it because there isn't a choice. That was a mistake for various reasons (Mainly supply and demand basics). And soon the boom will fall. This is especially true as soon and Intel gets in on the act.

NVIDIA, for their part, has been experimenting a lot with pricing. That crazy RTX pricing is partly to gauge how far they can go with pricing. In AMD's case, they always want better profit for themselves (this has been the case since 2012). When NVIDIA went crazy with RTX pricing, AMD was very supportive of that; that's why the RX590 MSRP was set at $275. To me, it was more like an indirect signal from AMD telling NVIDIA to continue what they have been doing on the high end in the mid-range too: keep the price/performance of the new generation the same as the previous gen. But yeah, RTX sales are not as good, so NVIDIA still hammered AMD's RX590 with the $280 GTX1660 Ti. So Navi not really undercutting NVIDIA's existing products is probably another bait from AMD for NVIDIA not to go too aggressive with pricing.

About Intel... it seems some people expect them to charge expensive prices even if their parts are slower, simply because they have the "Intel" branding. But I think Intel can be very disruptive from the get-go. They may not beat AMD and NVIDIA in performance, but they probably can pressure AMD and NVIDIA on pricing. Or they can do promotions directly with OEMs, where they bundle their graphics card for "free" if the OEM builds its machine using an Intel CPU.
 
Looking at most of ATI/AMD when they've introduced cards, out of all the big successes, the HD4870 stands out the most to me and it wasn't even a "homerun" from the performance perspective. I'm sure many of you will know why.

What AMD is doing with the Navi siblings is daft and short-sighted IMO. They'll get their hand forced sooner than later, but they lost their big chance at bringing a second HD4870.

Cheers!

But I don't think they can take NVIDIA by surprise this time. Also, back then they could still compete with NVIDIA evenly on the high end. And they also need to make sure NVIDIA can't retaliate, just like how Intel is having trouble with Ryzen right now. Looking at what they have been doing, it seems they really hope NVIDIA doesn't enter SSJ mode against them.
 
But that strong point never really helped them as a company. No doubt being cheap will help them move more volume, but to effectively gain market share from NVIDIA, AMD needs more than just being the value champion. For one, they seriously need a real marketing effort.

There's also no data to support the contrary statement (contrapositive). But you have to look at the facts and make a few assumptions based on some questions.

Why would someone choose AMD over NVIDIA?
  • NVIDIA comes out of the gate first
  • NVIDIA is the market leader
Well, we can assume some are fanboys, or hate NVIDIA; then there are those who are looking for better value.

Can we assume fanboys will buy the more expensive cards just because? If that were the case, Vega 64s and Vega 56s would sell better than Polaris, because their price point is similar to the 5700 XT's. But we know this is demonstrably not the case based on unit numbers. Therefore, price is a factor.

For years I've been holding off upgrading, and it's becoming a vicious cycle. NVIDIA releases a new card and it's expensive. So I'm like, "AMD's next gen is supposed to be fast." Well, I wait that additional six months or a year and discover it's no faster, and the pricing, while better, isn't hugely so.

Also, I'm dealing with a moving target performance-wise. That means AMD is offering me a product that's a year behind the curve. So I think, "Well, I know in another 6 months to a year NVIDIA will come out with faster cards. Why would I invest in year-old tech, when that mainstream performance window was YESTERDAY?" Mainstream today and for the next few years is 1440p. I'm not paying $450 for that. $450 is NOT mainstream. My 7970 was cheaper than that, and that was top tier. At $450, a 5700 XT is mainstream middle tier at best.

I just couldn't stomach paying $250 or $300 for an RX580 when I knew NVIDIA was right around the corner, and knowing the RX580 is no longer mainstream, because 1440p is mainstream. VR is becoming more mainstream. The RX580 and RX570 struggle with VR. I even see 1080p titles struggle on the RX580 sometimes. I would argue the RX570/RX580 are now entry level only.

However, at $170 today, I could handle the RX580 as a stopgap. I'll get an RX580 and put it in my primary machine. I'll use it till the pricing wars begin. Then I'll likely pick up an RTX2070 at $400-$425, unless the 5700 XT is cheaper by $50. Then I'll transfer the RX580 to my secondary system. I'll then sell my 7970 for $50-$75 on fleebay. If no pricing war occurs, then I'll just sit and wait.
 

King_V

For me, it's always boiled down to:
  • I need X performance at Y resolution and Z refresh
  • What cards meet or exceed that performance?
  • Which one of them is the least expensive?
  • I slightly favor AMD due to their driver GUI being modern, and FreeSync being, well, free
    • Though Nvidia supporting FreeSync for 10xx and above negates that, but their driver GUI is awful.
  • Is it not too bad of a power hog? I prefer more efficient, can deal with less efficient but:
    • Not at the expense of too much noise
    • If I need to upgrade my PSU, that factors into the equation

Maybe I'm just crazy, going for "best cost/performance ratio, as long as it meets my performance needs."


That's my general logic, but my GTX 1080 purchase flouted most of it because:
  • I purchased an upgraded PSU in anticipation of a 1080Ti to better work with my 3840x1600 monitor.
    • Or to get a 1070 when/if I could, then resell it or give it to my son (he desperately needed an upgrade, though he actually wasn't complaining himself - but his R7 250E was not cutting it on a 2560x1080 screen) when I got the 1080Ti.
  • Crypto-currency madness meant I got what I could get: in this case, a 1080 FE which managed to stay in stock for 6 minutes rather than 5 for the 1080Ti and 1070. That same madness priced Vega well beyond reach.
In hindsight, I think I should've gone with a 3440x1440 monitor rather than 3840x1600... but I chose the latter because I needed the equivalent width of the dual 1920x1080 monitors at work. In hindsight, I guess I could've lived with the width-equivalent of 200 pixels of horizontal loss from each work monitor.


Ok, the exception with the 1080 was that, due to timing, etc., I could've gone with a Vega 64 if one had had an amazing cooling solution. But, eh, at the time, the Vegas were so overpriced (not due to AMD, but because of crypto) that all the rules went out the window.
 
Back when the HD4870 was released, AMD/ATI was still competing with the 8800GTX and its subsequent generation, right before the GTX280 was released. They did not touch the GTX280 at the top, but their VLIW-derived arch was able to net them a very efficient and small GPU that was ~80% as good as nVidia's best for, like... half the price or something? These two were launched right one after the other, so AMD/ATI had to play a good card, and they did back then. After having that lesson in their history, I'm surprised they did not do it again. Polaris was more or less similar in that vein, although we knew Vega was coming.

So, that is why they're being daft with this release. It's not a problem with the GPU itself, but the positioning. That's all.

Cheers!
 
Back when the HD4870 was released, AMD/ATI was still competing with the 8800GTX and its subsequent generation, right before the GTX280 was released. They did not touch the GTX280 at the top, but their VLIW-derived arch was able to net them a very efficient and small GPU that was ~80% as good as nVidia's best for, like... half the price or something? These two were launched right one after the other, so AMD/ATI had to play a good card, and they did back then. After having that lesson in their history, I'm surprised they did not do it again. Polaris was more or less similar in that vein, although we knew Vega was coming.

So, that is why they're being daft with this release. It's not a problem with the GPU itself, but the positioning. That's all.

Cheers!

Personally, I think it gets more complicated for AMD when they try to add a more compute-oriented design to their hardware. Just look at Cypress: die-size-wise it was only slightly bigger than the GF104 used in nvidia's GTX460, but performance-wise Cypress competed with GF100. During that generation (4k to 6k), AMD could still play aggressively with pricing, but going forward they started losing that die-size advantage.
 
Personally, I think it gets more complicated for AMD when they try to add a more compute-oriented design to their hardware. Just look at Cypress: die-size-wise it was only slightly bigger than the GF104 used in nvidia's GTX460, but performance-wise Cypress competed with GF100. During that generation (4k to 6k), AMD could still play aggressively with pricing, but going forward they started losing that die-size advantage.
I get what you're saying, but then again, you have to ask yourself if they need all the compute side of things for the mid or low end. Why put Vega cores in an APU, for instance? Professional workloads on an APU? Why not simplify the design to a point where you just do 3D effectively and efficiently? Polaris was a step in the right direction IMO, but somehow they went back to "bigger is better".

I once said they should go back to a VLIW-derived uArch like TeraScale was, but I know that's not as good an idea as it was before... Then again, I'll be damned if I don't think they should give it a try.

Cheers!
 

King_V

Back when the HD4870 was released, AMD/ATI was still competing with the 8800GTX and its subsequent generation, right before the GTX280 was released. They did not touch the GTX280 at the top, but their VLIW-derived arch was able to net them a very efficient and small GPU that was ~80% as good as nVidia's best for, like... half the price or something? These two were launched right one after the other, so AMD/ATI had to play a good card, and they did back then. After having that lesson in their history, I'm surprised they did not do it again. Polaris was more or less similar in that vein, although we knew Vega was coming.

So, that is why they're being daft with this release. It's not a problem with the GPU itself, but the positioning. That's all.

Cheers!

Yep, sounds about right. Judging by some brief/hasty digging up of older articles, it sounds like the GTX 280 had great performance, but was inefficient and crazy expensive.
For once, our assessment of this Radeon HD 4870 will be simple: It’s an excellent high-end graphics card! With the same architecture and most of the strong points of the Radeon HD 4850, it’s in a higher category performance- and price-wise. The bottom line: Though it’s faster by an average of 6% (and in the majority of our tests) than the GeForce GTX 260, it sells for $299 – $150 less than the competing Nvidia card (ie, priced at $449)! Even the top-end card from Nvidia, the GeForce GTX 280 – souped up with more transistors, twice as much memory and higher clock speeds – is not that far ahead. It showed only 13% better performance than the Radeon HD 4870, though it costs twice as much. ($599)
(emphasis mine)

And keep in mind that those are prices from 11 years ago. That $599 is about $713 in today's dollars. (at least according to an online calculator)
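Running the quoted review's numbers makes the value gap concrete. The ~1.19 inflation factor below is an assumption taken from the online calculator mentioned above, not an official figure:

```python
# Quick check of the HD 4870 vs GTX 280 numbers from the quoted review.
hd4870_price, gtx280_price = 299, 599
gtx280_perf_advantage = 1.13  # GTX 280 ~13% faster, per the review

# Performance per dollar, normalizing the HD 4870's performance to 1.0.
hd4870_value = 1.0 / hd4870_price
gtx280_value = gtx280_perf_advantage / gtx280_price
print(f"HD 4870 perf/$ advantage: {hd4870_value / gtx280_value:.2f}x")

# Rough 2008 -> 2019 inflation adjustment; the ~1.19 factor is an
# assumption from the online CPI calculator mentioned in the post.
print(f"GTX 280 in 2019 dollars: ~${gtx280_price * 1.19:.0f}")
```

In other words, the HD 4870 delivered roughly 1.8x the performance per dollar of the GTX 280, which is exactly the kind of positioning coup being discussed here.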

Of course, all this is a lot of guesswork and assumption on my part. I think really all they need, is equal or better performance than the equivalent Nvidia card, at a lower price.

While it still seems to me that the 5700XT is better than the 2070 in price/performance (based on the info we have so far, and, yeah, I admit, I'm dismissive of ray-tracing for now), I don't think they had a breakthrough in design like they did back then. Or maybe Nvidia was just assuming ATI had nothing, so priced however the hell they felt like it (ie: our precursor to what they did with the RTX cards?)

The numbers I saw for the non-XT vs the 2060 suggest they're missing the mark, though.
 
According to one Tom's Hardware article, some more Navi variants have been spotted in a Linux driver. The most recent Linux display driver contains multiple lines of code that make reference to AMD's Navi 10, Navi 12, Navi 14 and Navi 21 GPU variants.

It's unclear at this point where the Navi 12, Navi 14 and Navi 21 will find their places in AMD's graphics cards. However, it's speculated that AMD could use the Navi 21 silicon in the Radeon RX 5800 graphics cards while saving the Navi 12 and Navi 14 dies for the Radeon RX 5600 and RX 5500 lineups, respectively.

https://www.tomshardware.com/news/amd-navi-10-navi-12-navi-14-navi-21,39684.html
 

david_the_guy

According to one Tom's Hardware article, some more Navi variants have been spotted in a Linux driver. The most recent Linux display driver contains multiple lines of code that make reference to AMD's Navi 10, Navi 12, Navi 14 and Navi 21 GPU variants.

It's unclear at this point where the Navi 12, Navi 14 and Navi 21 will find their places in AMD's graphics cards. However, it's speculated that AMD could use the Navi 21 silicon in the Radeon RX 5800 graphics cards while saving the Navi 12 and Navi 14 dies for the Radeon RX 5600 and RX 5500 lineups, respectively.

https://www.tomshardware.com/news/amd-navi-10-navi-12-navi-14-navi-21,39684.html

When are these going to be released btw ??
 
When are these going to be released btw ??

Not sure about release dates for those GPU variants. BTW, Videocardz has been posting more and more leaks.

To quote their article:

A surprising leak came from Komachi on Twitter. Apparently Sapphire is preparing quite a few Radeon RX 5000 models:
RX 5950 XT, RX 5950, RX 5900 XT, RX 5900, RX 5850 XT, RX 5850, RX 5800 XT, RX 5800, RX 5750 XT, RX 5750, RX 5700 XT, RX 5700, RX 5650 XT, RX 5650, RX 5600 XT, RX 5600, RX 5550 XT, RX 5550, RX 5500 XT, RX 5500, RX 590 XT, RX 590

They suspect that these are just placeholders. Sapphire simply registered all possible trademarks to save time. It seems very unlikely that Radeon RX 5000 series would feature that many SKUs. AMD has not released this many cards in a very long time. However, it is possible that AMD still has something to say in the high-end and enthusiast market segments. There had to be a reason why AMD did not name the Navi10-based models as RX 5900 or RX 5800.

A dual-GPU RX 5950XT anyone?

https://videocardz.com/newz/sapphire-registers-radeon-rx-5950-5900-xt-rx-5850-5800-xt-series-at-eec
 
Honestly, I don't think AMD is really interested in making dual-GPU cards for gamers anymore; hence there is no real successor to the R9 295X2. Forget dual GPU: AMD is slowly killing multi-GPU support and instead wants game developers to take up the initiative themselves and support multi-GPU through low-level APIs like DX12. Something interesting here (more details in the comment section):

View: https://www.youtube.com/watch?v=wz5ZvG1M6tc&t=14s


In summary, CF support is worse on Vega GPUs than on Polaris-based GPUs. There are new games where Polaris in CF works but Vega CF does not, according to the guy who did the test. The Radeon VII has it worst: it only supports multi-GPU through low-level APIs like DX12.