Discussion AMD NAVI RX 5700 XT's picture and SPECS leaked *off topic*

Seems like I read somewhere that AMD no longer wants to be the "budget option". Either way, I think they should stop releasing the same performance at nearly (and sometimes exactly) the same price as Nvidia, nearly a year later. That's not going to put them on top.

It will soon be 2 years since I bought my Nvidia GPU and AMD has yet to release something that can even match it in performance.
They have been trying to get rid of that "budget option" sticker for a very long time now. Just look at the 7970's initial pricing. Many expected the card to replace the 5870/6970 price-wise, at around $380 to $400; those with more conservative predictions expected it could go as high as $450. Then $550 came, even more expensive than the GTX 480/GTX 580 MSRP ($500). Bye bye $400 flagship pricing from AMD. The 290X also debuted at $550, and with the Fury X they increased their flagship price to $650. If Nvidia hadn't dropped the GTX 1080's price by $100 when they launched the GTX 1080 Ti, AMD would probably have put Vega 64 at the same price as the GTX 1080's original MSRP.
 
I would go so far as to say that ray tracing is probably as small a factor as lower power consumption is
Don't underestimate Nvidia's marketing prowess when it comes to pushing features. Power efficiency, for example: back in 2014 Nvidia created the market for cards like the GTX 750 Ti. It was so successful that even AMD tried to create their own "powerful yet very power-efficient" card, the RX 460. Hence AMD showcasing Polaris 11 vs the GTX 950 at CES 2016, showing a Polaris 11 system drawing only half the power of one using a GTX 950.

How big is the "top end" market for video cards, versus, say, the mainstream/1080p level where AMD handily outdoes Nvidia in price/performance?
Having the fastest-GPU crown for yourself helps sell your other, slower cards. Trust me, this kind of marketing never fails. Yes, at face value AMD is dominating the mid-range segment on price/performance, but the reality is that Nvidia still sells boatloads more mid-range GPUs, volume-wise, even if they are more expensive. Right now Nvidia owns more than 80% of the discrete GPU market.

Until seeing your post, I never heard of Gameworks, so, can't say how much effect that has.
That is just one of Nvidia's initiatives, but at this point pretty much everyone knows Nvidia has really solid devrel that AMD can't match, starting with the "TWIMTBP" program more than a decade ago. AMD is quite competitive when it comes to triple-A games, but outside of that Nvidia still has the edge. Take this for example: among those who play a lot of MMO or MMORPG-type games, for years the rule has been that if you simply want your game to work, buy an Nvidia GPU. Remember, the majority of gamers don't only play triple-A games; they play other games too, and more often than not Nvidia covers their needs better than AMD.

I really don't think AMD has to outdo the price/performance of Nvidia by an overwhelming amount.
I can agree on this point. In fact, after being the value champion vs Nvidia for almost a decade, they still failed to significantly gain market share. AMD needs something more to fight Nvidia: a real marketing effort, and further improvement of their devrel, which is still behind Nvidia's at this point.
 
I can agree on this point. In fact, after being the value champion vs Nvidia for almost a decade, they still failed to significantly gain market share. AMD needs something more to fight Nvidia: a real marketing effort, and further improvement of their devrel, which is still behind Nvidia's at this point.
I suppose that is a good point. The RX 470/480 and 570/580 were and are a MUCH better value than Nvidia (GTX 1050 Ti & 1060) at that price point; yet we now see those Nvidia cards being the #1 and #2 cards on Steam by a HUGE margin, with AMD way down below them. I think the crypto-mining boom had some effect on Radeon availability, but people would still rather have an Nvidia GPU for some reason.

Reference
https://store.steampowered.com/hwsurvey/videocard/
 
The RX 470/480 and 570/580 were and are a MUCH better value than Nvidia (GTX 1050 Ti & 1060) at that price point; yet we now see those Nvidia cards being the #1 and #2 cards on Steam by a HUGE margin
I think this is because gamers tend to think Nvidia is superior to AMD, and this mindset does have an effect on their purchasing decisions. I've noticed this bent of mind.
 

digitalgriffin
I suppose that is a good point. The RX 470/480 and 570/580 were and are a MUCH better value than Nvidia (GTX 1050 Ti & 1060) at that price point; yet we now see those Nvidia cards being the #1 and #2 cards on Steam by a HUGE margin, with AMD way down below them. I think the crypto-mining boom had some effect on Radeon availability, but people would still rather have an Nvidia GPU for some reason.

Reference
https://store.steampowered.com/hwsurvey/videocard/
I can somewhat agree with this. AMD cards did significantly better on popular cryptocurrencies like Monero, so pricing was absurd and most AMD cards likely went to mining instead of gaming. By the time it was all over, most of the current gen had already been bought, and those who hadn't upgraded were waiting for next gen (like me).

There has been a massive upsurge of RX 580/RX 570 purchases now that they are being liquidated (and a respective increase on Steam), but it's too little, too late.

And I agree with Metal Messiah. There is a mindshare/mindset among many gamers that NVIDIA is always better. The truth is that which card you pick depends on many factors. NVIDIA isn't always the best answer unless you care about ultimate speed and budget is not a factor.
 
I think the crypto-mining boom had some effect on Radeon availability, but people would still rather have an Nvidia GPU for some reason.
By far. AMD cards practically became what I and some others call unobtainium for a while. It was absolutely insane.

So, practically all that was left for gamers were Nvidia cards. Eventually, even those started getting scarce when the cryptominers, now unable to get AMD cards themselves even at the over-inflated prices they'd driven them to, started going for Nvidia for cryptomining.

That was all such a colossal mess....
 

david_the_guy
Are you sure about this? Because I think GCN replaced VLIW with a traditional SIMD vector processor. The complex nature of VLIW made it harder to disassemble and debug, though.

Actually, there were some differences between the architectures. VLIW was poor for GPU computing purposes, unlike the non-VLIW SIMD. The principal issue was that VLIW is hard to schedule ahead of time, and there's no dynamic scheduling during execution. VLIW was all about extracting instruction-level parallelism (ILP), while the non-VLIW SIMD is about thread-level parallelism (TLP).

Here's an example code snippet from the VLIW compiler, as well as GCN. You can see there are some restrictions under VLIW (assuming you can read the code), which is why AMD dropped it.

IMO, in 2021 we might see a completely new arch, rumored as ARCTURUS (most probably on VLIW2, or as AMD calls it, SUPER-SIMD). This is where things might change for AMD.

Code:
.VLIW
// Registers r0 contains "a", r1 contains "b"
// Value is returned in r2
00   ALU_PUSH_BEFORE
       1  x: PREDGT     ____, R0.x,  R1.x
             UPDATE_EXEC_MASK UPDATE PRED
01 JUMP   ADDR(3)
02 ALU
       2  x: SUB        ____, R0.x,  R1.x
       3  x: MUL_e      R2.x, PV2.x, R0.x
03 ELSE POP_CNT(1) ADDR(5)
04 ALU_POP_AFTER
       4  x: SUB        ____, R1.x,  R0.x
       5  x: MUL_e      R2.x, PV4.x, R1.x
05 POP(1) ADDR(6)

Non-VLIW SIMD
// Registers r0 contains "a", r1 contains "b"
// Value is returned in r2
v_cmp_gt_f32       r0,r1          //a > b, establish VCC
s_mov_b64          s0,exec        //Save current exec mask
s_and_b64          exec,vcc,exec  //Do "if"
s_cbranch_vccz     label0         //Branch if all lanes fail
v_sub_f32          r2,r0,r1       //result = a - b
v_mul_f32          r2,r2,r0       //result=result * a


s_andn2_b64        exec,s0,exec   //Do "else" (s0 & !exec)
s_cbranch_execz    label1         //Branch if all lanes fail
v_sub_f32          r2,r1,r0       //result = b - a
v_mul_f32          r2,r2,r1       //result = result * b

s_mov_b64          exec,s0        //Restore exec mask

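For anyone trying to follow the exec-mask trick in the GCN snippet above, here is a rough Python sketch (my own illustration, not AMD code; the function name and lane values are made up) of how a SIMD machine runs both sides of an if/else across lanes using a per-lane condition (VCC) and a saved execution mask:

```python
# Emulates per-lane predication for: r2 = (a-b)*a if a > b else (b-a)*b,
# which is what the GCN assembly above computes across all lanes at once.
def simd_if_else(a, b):
    n = len(a)
    result = [0.0] * n
    exec_mask = [True] * n                   # s_mov_b64 s0,exec: all lanes active
    vcc = [a[i] > b[i] for i in range(n)]    # v_cmp_gt_f32: per-lane condition

    # "if" side: only lanes where the condition holds execute (exec & vcc)
    for i in range(n):
        if exec_mask[i] and vcc[i]:
            result[i] = (a[i] - b[i]) * a[i]

    # "else" side: invert the condition against the saved mask (s_andn2_b64)
    for i in range(n):
        if exec_mask[i] and not vcc[i]:
            result[i] = (b[i] - a[i]) * b[i]

    return result                            # exec mask restored afterwards

print(simd_if_else([3.0, 1.0], [2.0, 5.0]))  # -> [3.0, 20.0]
```

The key point the assembly shows is that both branch bodies are executed in sequence; the mask just disables the lanes that shouldn't see each side, which is why divergent branches cost both paths.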
Ouch! That was too technical for me to grasp, but informative nonetheless. So when is Arcturus coming??
:p
 
I think this is because Gamers tend to think Nvidia is superior to AMD, and this mindset does have an effect on their purchasing decision. I've noticed this bent of mind.
The majority of gamers are not interested in the tech details. When they see Nvidia has the fastest GPU, like the 2080 Ti, they quickly judge that as long as it's Nvidia, it will be superior to AMD. This is why AMD tried so hard to take the crown in the past, even if they had to use dual GPUs to win it. But somehow the only crown that matters is the single-GPU one; dual-GPU cards, while very fast, have their own issues, which became a negative point even when they were the "fastest". Nvidia probably became aware of this around the Fermi generation: their GTX 590 barely kept up with AMD's HD 6990 at the time, and many even questioned the GTX 590's build quality when reviewers like TPU blew up their GTX 590 during testing. If I remember correctly, it happened twice! Despite that, Nvidia kept gaining market share. The GTX 690 was the last time Nvidia got serious with a dual card.
 
The majority of gamers are not interested in the tech details. When they see Nvidia has the fastest GPU, like the 2080 Ti, they quickly judge that as long as it's Nvidia, it will be superior to AMD. This is why AMD tried so hard to take the crown in the past, even if they had to use dual GPUs to win it.
I agree. Most gamers don't even look at the GPU specs before buying. It's that psychological bent of mind some gamers have: if Nvidia's card is faster, then it is obviously going to be superior as well.
 

lux1109
I agree. Most gamers don't even look at the GPU specs before buying. It's that psychological bent of mind some gamers have: if Nvidia's card is faster, then it is obviously going to be superior as well.
All thanks to nvidia though. After all they are GREEN, lol.
 
So when is arcturus coming ??
Not anytime soon. Most probably it will come out in mid-2021, or it could even get delayed if AMD changes its roadmap. It will be a new GPU arch, though. Next year AMD will launch the high-end flagship NAVI GPUs.

So I don't expect ARCTURUS to launch within the same year. But I guess we will learn more in the coming months.
 

digitalgriffin
I would consider it upper mid range. (5700XT)

Two years ago it would have been high range.

And if you look at that, it's disappointing. Three years ago upper mid-range was $250 to $300 (before crypto took hold). Now games are more complex, mainstream gaming is moving towards 1440p, and mid-range really is the 5700 XT.

So we went from $300 to $450.

Meanwhile the relative price/performance needle hasn't changed much.
 
I would consider it upper mid range. (5700XT)

Two years ago it would have been high range.

And if you look at that, it's disappointing. Three years ago upper mid-range was $250 to $300 (before crypto took hold). Now games are more complex, mainstream gaming is moving towards 1440p, and mid-range really is the 5700 XT.

So we went from $300 to $450.

Meanwhile the relative price/performance needle hasn't changed much.
And then people ask why a lot of us feel disappointed with the announced MSRP... There's a reason AMD hasn't called them mid-range.

Cheers!
 

lux1109
Three years ago upper mid-range was $250 to $300 (before crypto took hold). Now games are more complex, mainstream gaming is moving towards 1440p, and mid-range really is the 5700 XT.

So we went from $300 to $450.
It's even more than that, I suppose... The previous mid-range price has now become upper mid-range/high, and prices have increased a lot as well. I think a time will come when AMD will sell their products in the same price bracket as Nvidia, or even higher.

Nvidia won't stop either.
 
The "Game Clock" might be the new addition, but the actual higher "Boost Clock" speed is what is new, and it looks pretty darn impressive!
Good question, I'm not sure how that will work. I was referring to the higher clock speed of 1905 MHz as the "value". We haven't seen that high a clock speed from AMD yet.
Actually, OC3D has just posted some insight into the meaning/purpose of the base, boost and Game clock speeds. To quote the article:

AMD's base clock is what could be considered as their minimum GPU clock speeds, what each graphics card will aim for when under extreme "power virus" loads such as FurMark. Navi's clock speeds shouldn't ever get lower than this under load, assuming that your PC is well enough ventilated to prevent thermal buildup and throttling.

Navi's Boost clock is what the Radeon RX 5700 series targets opportunistically. In ideal scenarios, the Radeon RX 5700 will target this clock speed, but it won't be achievable under many workloads. With base and boost, Radeon has provided gamers with what could also be described as maximum and minimum clock speeds, which begs the question; what about an average?

This is where AMD's new "Gaming" clock speeds come into play. AMD has tested 25 games, which are a mix of unknown AAA and esports titles, and has used their performance data to calculate a clock speed which acts as a ballpark figure for what gamers should expect across most gaming titles. AMD has been a little conservative with their Gaming clock to ensure that most games will sit between their Gaming clock and boost clock speeds.

The base clock acts as an effective minimum clock speed for extreme (power virus) loading conditions while the boost acts as a maximum. Gamers can expect the vast majority of titles run somewhere between Navi's gaming and boost clocks, assuming their system is adequately cooled and powered.


https://www.overclock3d.net/news/gpu_displays/what_does_amd_s_radeon_rx_5700_series_base_boost_and_gaming_clocks_mean/1
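To put rough numbers on the article's description, here is a small Python sketch of the idea. This is my own guess at how such a figure could be derived, not AMD's actual method: the per-game clock samples and the 2% conservative margin are made-up assumptions, while the 1605/1905 MHz base/boost values are the RX 5700 XT reference figures mentioned in the thread.

```python
# Sketch: derive a conservative "Game Clock"-style figure from measured
# sustained clocks across several titles, clamped between base and boost.
BASE_CLOCK = 1605   # MHz, RX 5700 XT reference base clock
BOOST_CLOCK = 1905  # MHz, RX 5700 XT reference boost clock

def game_clock(observed_mhz, base=BASE_CLOCK, boost=BOOST_CLOCK, margin=0.02):
    """Average the per-title clocks, shave a small margin to stay
    conservative (as the article says AMD does), then clamp the result
    so it never falls outside the base..boost window."""
    avg = sum(observed_mhz) / len(observed_mhz)
    conservative = round(avg * (1.0 - margin))
    return max(base, min(boost, conservative))

# Hypothetical sustained clocks (MHz) measured across five games:
samples = [1790, 1820, 1760, 1840, 1805]
print(game_clock(samples))
```

With these invented samples the result lands between base and boost, matching the article's point that most games should sit somewhere between the Game clock and the boost clock.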
 
Some update on this issue.

First of all, according to GURU3D, custom AIB NAVI cards will be available in August. Yes, there will be AIB (custom) cards for the new Radeon RX 5700 and 5700 XT; however, these will not be available at launch in a few weeks, but roughly one month later.

So AIB Radeon 5700 and 5700 XT cards are a definite yes, and they will become available roughly one month after the reference launch on the 7th....

https://www.guru3d.com/news-story/amd-radeon-rx-5700-(navi)-aib-customized-cards-available-in-august.html

In other news, VIDEOCARDZ posted this article... AMD will not sell its 50th Anniversary Edition of the NAVI-based RX 5700 XT graphics card outside China and the USA.

During the Next Horizon Gaming product announcement, AMD president and CEO Dr. Lisa Su unveiled a limited 50th-anniversary edition of the Radeon RX 5700 XT. The 75 MHz faster variant of the Navi 10-based model will only be available in China and the USA, according to the news report from Cowcotland.

No customers in Europe, Oceania or Africa will be able to buy this card directly. This means that the RX 5700 XT 50th will follow the path of the Radeon VII 50th SKU, which was also hard to buy in certain regions.

https://videocardz.com/newz/amd-radeon-rx-5700-xt-50th-anniversary-edition-only-for-usa-and-china
 
A month later for the custom cards is kind of lame, but not unexpected, I guess. And as for the 50th anniversary edition... Meh.

I just hope AMD has done a proper job of making sure their blower coolers are not a repeat of the HD 6000 and HD 7000 series. I also recall the R9 290 was quite hot and loud with the reference blower-design cooler.

Cheers!
 
A month later for the custom cards is kind of lame, but not unexpected,
I think this is normal when a GPU maker launches a new architecture. Part of it is that the GPU maker wants to keep the details of the new architecture away from the competition as long as possible, and board partners have always been the most likely source of leaks. That's why it is very hard to get GPU leaks now until the real thing is really close to release. A few years ago we could get almost-accurate performance leaks two or three months before a new GPU's release; right now, seeing a performance leak is almost outright impossible, apart from the presentation-slide kind.
 
