News: Did AMD Just Confirm Big Navi Is Coming?

Big Navi would have to be quite a good performer if any of the Ampere rumors are true. I won't say they are (rumors are rumors), but Nvidia is pushing a new uArch onto 7nm which, per TSMC's specifications, is 2x the density of their 12nm. Even Samsung's 7nm is probably 2x as dense as well, if not more, since it should be using EUV.

I really want to see Radeon be the performer it used to be. I still remember my 9700 Pro, which was a beast at the time.
 
Navi was a solid but very power-hungry chip.
But AMD did show us ~120%-ish gains on the mobile APU. IF they know how to do that again (or it was mostly from the GPU), then 2020 will really be a gamers' year from all the R/B/N teams.
 

paul prochnow

Distinguished
Jun 4, 2015
AMD CEO Lisa Su may have confirmed that Big Navi is coming in a question-and-answer session at CES 2020.

Did AMD Just Confirm Big Navi Is Coming? : Read more

If AMD comes up with a performance-comparable GPU for partners that sell AIO-cooled cards for $400 less than the top RTX 2080 Ti 11G, Nvidia will be out of it entirely, excepting the "N" über-fanboys. I feel there is a LOT of slack in the pricing at the $1,000 range, a lot, as AIO cooling bumps pinnacle Nvidia cards from $800 to $1,400.
 

joeblowsmynose

Distinguished
Navi was a solid but very power-hungry chip.
...

I wouldn't say that unless you are thinking it should be way less for being on 7nm.

5700 XT - average gaming power consumption (per Tom's) is 218W
2070S - average gaming power consumption (per Tom's) is 212W

5500 XT - average gaming power consumption was ~100W (per Tom's review)
1650S - average gaming power consumption was ~100W (per Tom's review)

So performance per watt is comparable to Turing.

TSMC's current 7nm isn't overly great on the power consumption side; 7nm+ will be a fair bit better in that area, according to them.
 

TJ Hooker

Titan
Ambassador
5700 XT - average gaming power consumption (per Tom's) is 218W
2070S - average gaming power consumption (per Tom's) is 212W
[...]
So performance per watt is comparable to Turing.
Bit of a nitpick, but on average the 5700 XT's performance is a lot closer to the 2070 than the 2070S. The 2070 FE draws ~188W according to Tom's review. So the 5700 XT is still lagging a bit on efficiency.
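To put rough numbers on that, a minimal sketch assuming the 5700 XT and the 2070 land at roughly the same average frame rate (which is the premise of the nitpick):

```python
# Rough efficiency comparison using only the power figures quoted in this thread.
# Assumption: the 5700 XT and 2070 deliver roughly the same average FPS.

power_5700xt = 218  # W, average gaming power per Tom's
power_2070fe = 188  # W, average gaming power per Tom's (2070 FE)

extra_power = power_5700xt / power_2070fe - 1        # how much more the 5700 XT draws
perf_per_watt_ratio = power_2070fe / power_5700xt    # 5700 XT perf/W relative to the 2070

print(f"5700 XT draws ~{extra_power:.0%} more power")               # ~16%
print(f"i.e. ~{1 - perf_per_watt_ratio:.0%} lower perf per watt")   # ~14%
```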
 

joeblowsmynose

Distinguished
Bit of a nitpick, but on average the 5700 XT's performance is a lot closer to the 2070 than the 2070S. The 2070 FE draws ~188W according to Tom's review. So the 5700 XT is still lagging a bit on efficiency.


Fair enough, but I've seen reviews land it anywhere between the two, sometimes right on the heels of the 2070S, and a bit over in a few titles.

Tweaktown's numbers over 5 games at 1080p put the 5700 XT 5.55% behind the 2070S (my math across all the games they tested): https://www.tweaktown.com/reviews/9049/amd-radeon-rx-5700-xt-step-right-direction/index9.html That said, I generally don't like to rely on a single review.
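Roughly, this is the kind of math I mean; a minimal sketch where the FPS values are made-up placeholders (not Tweaktown's actual results), just to show the averaging:

```python
# Average the per-game gap between two cards. The FPS numbers below are
# hypothetical placeholders purely to illustrate the method.

games = {
    # title: (5700 XT fps, 2070 Super fps)  <- placeholder values
    "Game A": (95, 100),
    "Game B": (140, 148),
    "Game C": (72, 76),
    "Game D": (110, 118),
    "Game E": (60, 63),
}

gaps = [xt / sup - 1 for xt, sup in games.values()]  # per-game relative gap
avg_gap = sum(gaps) / len(gaps)                      # simple arithmetic mean

print(f"5700 XT vs 2070S average gap: {avg_gap:.2%}")
```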

Anyway, "very power hungry" seems to be a common sentiment with Navi - but I think that's just mostly carried over sentiment (rhetoric) from Vega.

The higher the performance is pushed, the less efficient current Navi is. It's certainly not linear. And while I've been hard pressed to find the source (and can't seem to), I'm pretty sure I heard an AMD guy (maybe in a Full Nerds interview) say that current Navi is a GCN/Navi hybrid, and that their vision for full Navi won't come until sometime in 2020. So there may be more efficiencies coming in the architecture down the road.

On the flip side, the 5500 XT is, I believe, a better performer than the 1650 Super - so it's more efficient at the lower levels. Vega is also surprisingly efficient at lower performance levels, which is why AMD is still using Vega in its APUs (albeit they've kept working on it and made it 59% more efficient at that level, according to the CES presentation).
 

mcgge1360

Reputable
Oct 3, 2017
The GPU in Xbox Project Scarlett is claimed to offer twice the performance of the Xbox One. That's 12 TFLOPS. The RTX 2080 Ti has 13.45 TFLOPS of performance.
View: https://twitter.com/AMD/status/1205331072822718464?s=19
FLOPS doesn't mean performance in gaming; it means performance in compute. The RX 480 has the FLOPS of a GTX 1070 but the gaming performance of a GTX 1060. The Xbox One X has similar power to an RX 480 and is AMD-based as well. Twice the power would mean it's similar to a Vega 64 or RX 5700, not in a league CLOSE to a 2080 Ti. More like a 2060 Super.
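For context, the TFLOPS number is just a paper spec: shader count x boost clock x 2 (an FMA counts as two operations). A quick sketch using the published reference specs (treat the exact clocks as approximate):

```python
# Theoretical single-precision throughput from the paper specs:
# FLOPS = shaders * clock * 2 (fused multiply-add = 2 operations).

def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000  # GFLOPS -> TFLOPS

print(f"RTX 2080 Ti: {tflops(4352, 1.545):.2f} TFLOPS")  # ~13.45
print(f"RX 5700 XT:  {tflops(2560, 1.905):.2f} TFLOPS")  # ~9.75
print(f"GTX 1070:    {tflops(1920, 1.683):.2f} TFLOPS")  # ~6.5
print(f"RX 480:      {tflops(2304, 1.266):.2f} TFLOPS")  # ~5.8
print(f"GTX 1060:    {tflops(1280, 1.709):.2f} TFLOPS")  # ~4.4
```

On paper the RX 480 sits much closer to the 1070 than the 1060, yet it games like a 1060 - which is exactly the point about FLOPS not translating directly into gaming performance.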
 

TJ Hooker

Titan
Ambassador
AMD does seem to have significantly improved their gaming-performance-to-compute-performance (FLOPS) ratio with Navi (although they're still behind Nvidia in this regard). If we assume that the new Xbox has twice the FLOPS of the One X, it could actually have more than double the gaming performance of the One X thanks to the improved Navi architecture.

That being said, the exact quote I've found is "we're over 8 times the GPU power of the Xbox One, and two times what an Xbox One X is." So I don't know if we can say with certainty whether that means twice the FLOPS or what.

If we say that a 5700 XT is about on par with a 2070 while having ~30% higher FLOPS, and we apply that same ratio to the Xbox Series X FLOPS, we end up with something a little better than a 2070 Super. Obviously this is a very crude, back-of-a-napkin sort of estimate.
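Putting that crude estimate into numbers (paper-spec TFLOPS, all approximate, and it assumes the 5700 XT's FLOPS-to-gaming-performance gap carries over unchanged to the console chip):

```python
# Back-of-a-napkin console estimate. All TFLOPS values are approximate
# reference-spec figures; the Navi/Turing ratio is assumed to hold for the Xbox.

tf_5700xt = 9.75   # RX 5700 XT paper spec
tf_2070   = 7.5    # RTX 2070 paper spec, roughly on par with the 5700 XT in games
tf_xbox   = 12.0   # Xbox Series X figure being discussed
tf_2070s  = 9.1    # RTX 2070 Super paper spec, for comparison

navi_flops_premium = tf_5700xt / tf_2070            # ~1.3 (the "~30% higher FLOPS")
turing_equivalent  = tf_xbox / navi_flops_premium   # ~9.2 "Turing-class" TFLOPS

print(f"Xbox Series X ~ {turing_equivalent:.1f} Turing-class TFLOPS "
      f"(2070 Super is ~{tf_2070s})")
```

Which is where "a little better than a 2070 Super" comes from.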
 
I wouldn't say that unless you are thinking it should be way less for being on 7nm.

5700 XT - average gaming power consumption (per Tom's) is 218W
2070S - average gaming power consumption (per Tom's) is 212W

5500 XT - average gaming power consumption was ~100W (per Tom's review)
1650S - average gaming power consumption was ~100W (per Tom's review)

So performance per watt is comparable to Turing.

TSMC's current 7nm isn't overly great on the power consumption side; 7nm+ will be a fair bit better in that area, according to them.
Turing was very power-hungry crap.
Let me just remind you:
980 was 144W (did ~180W at full load)
1080 was 180W (did ~220W at full load)
2080 IS 225W (~250W)
for the same "tier" of performance.
Not something we should be happy about when the whole world is burning because we use too much fossil fuel.
 

joeblowsmynose

Distinguished
Turing was very power-hungry crap.
Let me just remind you:
980 was 144W (did ~180W at full load)
1080 was 180W (did ~220W at full load)
2080 IS 225W (~250W)
for the same "tier" of performance.
Not something we should be happy about when the whole world is burning because we use too much fossil fuel.

My country gets electricity from water. :) Maybe ask your country's leader to look for alternatives to burning coal/oil.

----
And while, in their respective times, the 980 and the 2080 were both "top tier" performance, you certainly wouldn't put them in the same performance category today; there's no comparison.

Yet gamers keep demanding more and more and more... but it's the manufacturers' fault if they do what's needed to try to meet those demands?

The very first GPU was probably like 5 watts. The demand for ever-greater performance, along with better cooling capabilities, has driven power up to the levels you see today.

If you can make a more power-efficient GPU design, be sure to let Nvidia know...
 
Turing was very power-hungry crap.
Let me just remind you:
980 was 144W (did ~180W at full load)
1080 was 180W (did ~220W at full load)
2080 IS 225W (~250W)
for the same "tier" of performance.
Not something we should be happy about when the whole world is burning because we use too much fossil fuel.

The 1080 used 20W more than the 980 but provided roughly 50% better performance on average. In some cases it was more.

https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/30

That's a good trade-off, personally.
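Put another way, taking the wattages from the quoted post at face value and AnandTech's ~50% average gain:

```python
# Perf-per-watt change from 980 to 1080, using the numbers already in this thread.
# Wattages are the quoted post's figures taken at face value, not official TDPs.

perf_gain  = 1.50   # GTX 1080 vs GTX 980, average (per AnandTech)
power_980  = 144    # W, as quoted above
power_1080 = 180    # W, as quoted above

perf_per_watt_gain = perf_gain / (power_1080 / power_980) - 1
print(f"~{perf_per_watt_gain:.0%} better performance per watt")  # ~20%
```

So roughly 20% better performance per watt by those figures, even before arguing about which wattage numbers to use.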

@joeblowsmynose Close. The first GPU was the GeForce 256, which was a 20W TDP card.