Discussion: AMD Navi RX 5700 XT's picture and specs leaked *off topic*

Hello,

You may or may not be aware of this, but Videocardz has just leaked some specs and info on AMD's upcoming Navi GPU.

The AMD Radeon RX 5700 XT features 40 Compute Units (2560 stream processors). The Navi GPU is clocked at 1605 MHz base, 1755 MHz Game Clock, and 1905 MHz boost. Of course, the new addition here is the Game Clock.

With the said boost clock, AMD expects a maximum of 9.75 TFLOPs of single-precision compute from the Radeon RX 5700 XT. The card is also confirmed to feature 8 GB of GDDR6 memory, which should run across a 256-bit bus interface. The memory clock, pricing, and availability date were not available at the time of writing.
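
For context, that 9.75 TFLOPs figure falls straight out of the leaked specs: stream processors x 2 FMA ops per clock x clock speed. Just a quick sanity-check sketch using the leaked clock values (the 2-ops-per-clock FP32 assumption is the standard one):

Code:
# Theoretical FP32 throughput = stream processors * 2 ops/clock (FMA) * clock
STREAM_PROCESSORS = 2560

def peak_tflops(clock_mhz, shaders=STREAM_PROCESSORS):
    """Peak single-precision TFLOPs at a given clock (MHz)."""
    return shaders * 2 * clock_mhz / 1e6

for label, mhz in [("Base", 1605), ("Game", 1755), ("Boost", 1905)]:
    print(f"{label} clock {mhz} MHz -> {peak_tflops(mhz):.2f} TFLOPs")
# Boost clock 1905 MHz -> 9.75 TFLOPs, matching AMD's quoted figure.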

https://videocardz.com/80966/amd-radeon-rx-5700-xt-picture-and-specs-leaked

https://wccftech.com/amd-radeon-rx-5700-xt-7nm-navi-gpu-rdna-specs-leak-8-gb-2560-cores/
 
Honestly, I don't think AMD is really interested in making dual-GPU cards for gamers anymore; hence there is no real successor to the R9 295X2. Forget about dual GPU: AMD is slowly killing multi-GPU support and instead wants game developers to take the initiative themselves and support multi-GPU through low-level APIs like DX12. Something interesting here (more details in the comment section):

https://www.youtube.com/watch?v=wz5ZvG1M6tc&t=14s


In summary, CrossFire support is worse on Vega GPUs than on Polaris-based GPUs; according to the person who did the testing, there are new games where Polaris in CrossFire works but Vega in CrossFire does not. The Radeon VII has it even worse: it only supports multi-GPU through low-level APIs like DX12.
 
The "Game Clock" might be the new addition, but the actual higher "Boost Clock" clock speed is what is new, and it looks pretty darn impressive!

I like the XT and Pro throwback to ATI. Very cool.

The pricing does seem a bit too high for my taste. AMD is continuing the uptrend in GPU pricing that Nvidia piggy-backed off the crypto-mining boom of 2017-2018. As a free-market advocate, I suppose I can't blame them.

Off-topic: I see AMD is now holding back part of their Ryzen 3000 lineup, keeping their 16-core in reserve for whatever Intel has to counter with. Intel hasn't sufficiently countered, so AMD holds for now.

All these nice new parts and I don't need to upgrade; my system should last me 5+ years. AMD and Intel will find out that, while they are sandbagging, people won't need to upgrade nearly as often, because continuing to hold back the better-performing parts makes any new upgrade marginal and/or makes the new, higher-performing parts outrageously expensive.

*Update: If anyone else is like me: once I had bought the high-end part for everything and reached the pinnacle, PC gaming was no longer as interesting to me. I had more fun using budget-oriented gaming hardware back when I couldn't afford better parts than I ever have on more powerful hardware. Back then I actually played games because I liked gaming; now I just play games to see what the hardware can do, and I've stopped even that because life gets in the way. Now I only play League of Legends when I have the time. High-end hardware is overrated.
 

King_V

The "RDNA" re-brand of GCN feels cheap

This seems like a very unfair description. It's not a rebrand of GCN.

"Although the company says RDNA is all-new, vestiges of Graphics Core Next are clearly identifiable throughout."

There are probably good things in GCN that are worth keeping. I'm sure the same can be said of Nvidia's architectures: some older elements that were worthwhile became part of the new architecture.

After all:
vestige

noun
1. a mark, trace, or visible evidence of something that is no longer present or in existence: A few columns were the last vestiges of a Greek temple.

2. a surviving evidence or remainder of some condition, practice, etc.: These superstitions are vestiges of an ancient religion.

3. a very slight trace or amount of something: Not a vestige remains of the former elegance of the house.

4. Biology. a degenerate or imperfectly developed organ or structure that has little or no utility, but that in an earlier stage of the individual or in preceding evolutionary forms of the organism performed a useful function.

5. Archaic. a footprint; track.
 
Fair enough; thanks for pushing back on that topic. I just can't shake the feeling that GCN is still lurking under RDNA, but you can't really re-work something from scratch and have zero remnants of previous implementations or IP. I'll need to read more on RDNA then, as my first (quick) read gave me the impression it was GCN with a few tweaks instead of a proper ground-up re-work.

Cheers!
 
  • Like
Reactions: Metal Messiah.
A month later for the custom cards is kind of lame, but not unexpected.

I think this is normal when a GPU maker launches a new architecture. Part of it is that GPU makers want to keep the details of their new architecture away from the competition as much as possible, and board partners have always been the most likely source of leaks. That's why it is very hard to get GPU leaks now until the real thing is really close to release. A few years ago we could get almost-accurate performance leaks two or three months before a new GPU released; right now, seeing a performance leak is almost impossible, apart from the presentation-slide ones.
 
According to a Tom's Hardware article, some more Navi variants have been spotted in a Linux driver. The most recent Linux display driver contains multiple lines of code that make reference to AMD's Navi 10, Navi 12, Navi 14 and Navi 21 GPU variants.

It's unclear at this point where the Navi 12, Navi 14 and Navi 21 will find their places in AMD's graphics cards. However, it's speculated that AMD could use the Navi 21 silicon in the Radeon RX 5800 graphics cards while saving the Navi 12 and Navi 14 dies for the Radeon RX 5600 and RX 5500 lineups, respectively.

https://www.tomshardware.com/news/amd-navi-10-navi-12-navi-14-navi-21,39684.html
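
If anyone wants to verify that sort of thing themselves, those variant names show up as plain identifier strings in the open-source amdgpu driver code. A rough sketch of how you might count them (assumes a local Linux kernel checkout; the path below is just a placeholder):

Code:
import os
import re

KERNEL_AMD = "/path/to/linux/drivers/gpu/drm/amd"   # placeholder path to your checkout
pattern = re.compile(r"navi[\s_]?(10|12|14|21)", re.IGNORECASE)
hits = {}

# Walk the driver sources and tally references to each Navi variant.
for root, _dirs, files in os.walk(KERNEL_AMD):
    for name in files:
        if name.endswith((".c", ".h")):
            with open(os.path.join(root, name), errors="ignore") as f:
                for m in pattern.finditer(f.read()):
                    key = "Navi " + m.group(1)
                    hits[key] = hits.get(key, 0) + 1

for variant in sorted(hits):
    print(f"{variant}: {hits[variant]} references")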
 
I like the XT and Pro throwback to ATI. Very cool.

Yeah, I totally agree on that. This nomenclature is a welcome change for sure. In other news, it seems AMD will launch the Ryzen 9 3950X as the world's first 16-core mainstream processor.

The AMD Ryzen 9 3950X would utilize all available cores from both dies. It would preserve the TDP of the Ryzen 9 3900X, which features 12 cores, but at the cost of a lower base clock. This AM4 processor is supposedly a 105W TDP part. A bit on the high side, if you ask me...

https://videocardz.com/newz/amd-ryzen-9-3950x-to-become-worlds-first-16-core-gaming-cpu
 
By the way, what's the difference between the "Game Clock" and "Boost Clock" speed values? Isn't the boost clock the same as gaming mode? I mean, when the GPU is under full load, won't the card boost to the desired clock speed value?
Good question; I'm not sure how that will work. I was referring to the higher clock speed of 1905 MHz as the "value". We haven't seen a clock speed that high from AMD yet.
 

lux1109

What about the performance level? Any guesses? Will it compete with Nvidia's high-end offerings? Will this be faster or slower than the current Radeon VII GPU?
 
Will this be faster or slower than the current Radeon VII GPU?

Slower in my opinion, since AMD is only releasing mainstream NAVI GPUs this year.

According to the roadmap, Navi 20 is going to land next year, in 2020. Those might be the high-end Navi GPU offerings, though that is pure speculation for now. Also, Navi is the last AMD GPU to be based on the (refined) GCN architecture. In 2021 we might see a completely new arch, rumored to be ARCTURUS (most probably based on VLIW2, or Super-SIMD as AMD calls it). This is where things might change for AMD.
 
UPDATE: AMD Radeon RX 5700 XT official gaming benchmarks have leaked!

Official figures for 2560×1440 gaming have been leaked. According to the slide, AMD’s new Navi-based Radeon RX 5700 XT graphics card will on average offer better performance than NVIDIA’s GeForce RTX 2070. I'm not impressed though.

The card would have a blower-type cooler and would require dual 8-pin power connectors. Seriously? I think the TDP will be close to 300W (two 8-pin PCIe power connectors deliver up to 2x150W), which doesn't really bode well for this 7nm AMD architecture. NVIDIA at 12nm is a lot more power efficient. AMD has again pushed the voltages and frequencies to the absolute maximum in order to look good.
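
For what it's worth, here's the back-of-the-envelope power math behind that guess (connector limits are per the PCIe spec; the actual board power is still unknown):

Code:
# Maximum power delivery per the PCIe spec (watts)
PCIE_SLOT = 75      # x16 slot
EIGHT_PIN = 150     # per 8-pin PCIe connector

ceiling = PCIE_SLOT + 2 * EIGHT_PIN
print(f"Dual 8-pin + slot ceiling: {ceiling} W")   # 375 W
# The connectors only set an upper bound, so a ~225-300 W board
# power would still fit well under that ceiling.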


https://videocardz.com/80993/amd-radeon-rx-5700-xt-official-gaming-benchmarks-leaked

[Image: leaked benchmark slide]
 
It is odd that it requires 2x 8-pin power connectors. I don't think it will require a full 300W, though it could. Performance looks good, but for the same price as the RTX 2070 and consuming more power, I don't see it as anything spectacular.

I guess pricing is going to be a huge deciding factor here between the RTX 2070 and this new Navi XT GPU. Though I doubt many gamers are going to opt for a higher-TDP card, and one with a blower-type cooler at that, like this XT model.

If the upcoming "refresh" RTX 2070 is faster and more power efficient as well, then AMD will have a hard time selling this XT Navi GPU. I guess only time will tell; third-party benchmarks should provide a clearer picture.
 
I guess pricing is going to be a huge deciding factor here between the RTX 2070 and this new Navi XT GPU. Though I doubt many gamers are going to opt for a higher-TDP card, and one with a blower-type cooler at that, like this XT model.

If the upcoming "refresh" RTX 2070 is faster and more power efficient as well, then AMD will have a hard time selling this XT Navi GPU. I guess only time will tell; third-party benchmarks should provide a clearer picture.

If it comes in at higher power with no ray tracing, then the 5700 XT will have to be $100 less than the equivalent 2070 pricing to be effective. And I'll wait for the OEM cooler designs. Blowers just blow chunks noise-wise. When will they get that?!

My money is ready. I hate NVIDIA, but if AMD blows this, to NVIDIA I go. I've been waiting too long to upgrade my 7970 for VR.
 

King_V

*Update: If anyone else is like me: once I had bought the high-end part for everything and reached the pinnacle, PC gaming was no longer as interesting to me. I had more fun using budget-oriented gaming hardware back when I couldn't afford better parts than I ever have on more powerful hardware. Back then I actually played games because I liked gaming; now I just play games to see what the hardware can do, and I've stopped even that because life gets in the way. Now I only play League of Legends when I have the time. High-end hardware is overrated.

Decently beefy, but I never bought top-end. I was always a sort of "can I get 80-90% of top-end performance for 50% of the price" kind of guy.

I do have a decent system now, but, of course, adulting interferes with my game time even more than it used to. Even with the big monitor (partially for work purposes) and the GTX 1080, I was "stressing" it by mostly playing:
  • Pinball Arcade
  • Heroes of Might and Magic 3


The second system listed in my sig is my most recent, and it's what fascinated me the most: putting together something for light use / connecting via RDP to work that was better than what I already had for that task, consumed less power, and cost less.