News: AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090

Oct 2, 2022
It doesn't matter when the 4080 is competing with the 4090 for the most expensive card of the year.
The choices are: go for the RX 7900, which is much cheaper, with performance that will likely be similar to the 4080's but a bit higher power use, or throw power consumption out the window and go for a 4090, with a 30-40% performance increase for a similar price to the 4080.

I've been keeping track of iBuyPower and CyberPower custom builds, and there's only a few hundred dollars' difference between a 4080 and a 4090 PC on builds that are all over $3k, except for the i7 versions that are just under $3k.
 

Math Geek

Titan
Ambassador
makes sense. if the 7900 will perform similarly to the 4080 (blatant guess based on the rumor mill) and use similar power, then obviously there is no efficiency gap to brag about.

even the extra power used by the 4090 still looks pretty good on a perf/watt scale compared to the 4080/7900.

i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
 
Oct 2, 2022
makes sense. if the 7900 will perform similarly to the 4080 (blatant guess based on the rumor mill) and use similar power, then obviously there is no efficiency gap to brag about.

even the extra power used by the 4090 still looks pretty good on a perf/watt scale compared to the 4080/7900.

i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
they've pretty much solved the connector thing. only 50 cards out of 124,000 sold had the problem (0.04% of cards), and all 50 cases were because the connector wasn't seated all the way. the fix is pretty simple: either add a better clip, or shorten the sense pins so the cable won't send power if the connector is not seated all the way. the current sense pins are too long and give a false positive on being fully seated.
 

nimbulan

Distinguished
Apr 12, 2016
makes sense. if the 7900 will perform similarly to the 4080 (blatant guess based on the rumor mill) and use similar power, then obviously there is no efficiency gap to brag about.

even the extra power used by the 4090 still looks pretty good on a perf/watt scale compared to the 4080/7900.

i'd stick with the "our card won't melt the connections and possibly ruin your psu and $1600-2000 gpu" angle for my press at this point :)
Yeah, based on the few performance numbers AMD's posted for the 7900 XT, compared to independent reviews of the 4080, it puts them basically equal in rasterization performance. The 7900 XTX looks around 15-20% faster than that in rasterization, but the 4080 will still beat it by 30% in ray tracing.

My guess is that AMD was expecting Nvidia's cards to hit their TGP limit under most workloads, as has been the case in the past, and prepared a slide with comparisons based on that. But with the 40 series frequently using significantly less power than the TGP limit (and to be clear, I'm not expecting the same to be true for AMD), that will likely make Nvidia's cards more power efficient this gen.
 

criticaloftom

Prominent
Jun 14, 2022
All I know as a consumer is that, with a 'hyper' expensive purchase either way, I won't be trusting Nvidia.
The press about how many insertions the plug is rated for, and the mere fact that any of them have created a system fire, means they have done their dash.
Maybe in a few years consumers will forget, but for now that connector is a dead selling point.
 

russell_john

Honorable
Mar 25, 2018
AMD reportedly hid a performance per watt slide at the very last moment, relating to its RX 7900 XTX/XT coverage on November 15th. AMD's reasoning is unknown, but we suspect it's related to Nvidia's RTX 4080 and its excellent power efficiency.

AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090 : Read more
Architecture alone usually gives you only small gains in efficiency, typically in the single-digit percentages. Over the last 20 years, nearly all the efficiency gains have come from moving to a smaller node, and since the 4000 and 7000 series cards are on essentially the same node, they are likely to have very similar performance per watt. The edge will likely go to AMD since they don't have the additional Tensor cores, which add overhead even when they aren't being used. However, when you look at performance per watt for ray tracing, Nvidia is still likely to blow AMD away, although AMD may whittle down the 23% efficiency edge Nvidia had last generation; I suspect not by much, though.
 

russell_john

Honorable
Mar 25, 2018
All I know as a consumer is that, with a 'hyper' expensive purchase either way, I won't be trusting Nvidia.
The press about how many insertions the plug is rated for, and the mere fact that any of them have created a system fire, means they have done their dash.
Maybe in a few years consumers will forget, but for now that connector is a dead selling point.

You'd better get used to it, because that connector is eventually going to be on motherboards (in fact, that's what Intel originally designed it for), since it is part of the ATX 3.0 standard. Intel's intention with ATX 3.0 is to get rid of the 24-pin motherboard connector and the 12V Molex connector and replace them with this single 12V connector, with all the voltage conversion/regulation for the other voltages handled on the motherboard itself. You'll basically have two of these 16-pin connectors coming from the power supply: one for the motherboard and another for the graphics cards.

What is throwing everyone for a loop is that Intel has been using this connector on server motherboards for almost two years without issue, but then again, those were installed by professionals.

BTW, you can't start a fire without a flame, and the composite plastic material used for those connectors is flameproof ... melting DOES NOT equal fire.
 

zecoeco

Prominent
BANNED
Sep 24, 2022
The RTX 4090 is untouchable in terms of performance, but AMD will still be ahead in efficiency, because that's the balanced approach they've decided to follow.
 

gruffi

Distinguished
Jun 26, 2009
AMD already made it clear: there are no comparisons to the RTX 4080 because it wasn't available back then, and AMD didn't get a sample from Nvidia for comparisons. All AMD could do was make some comparisons to the RTX 4090. But as the author correctly recognized, Navi 31 is targeted at AD103, not AD102. AD102 is a much larger monolithic design and much more expensive than AD103 or the chiplet-based Navi 31.
 

Warrior24_7

Distinguished
Nov 21, 2011
AMD reportedly hid a performance per watt slide at the very last moment, relating to its RX 7900 XTX/XT coverage on November 15th. AMD's reasoning is unknown, but we suspect it's related to Nvidia's RTX 4080 and its excellent power efficiency.

AMD Backs Off RDNA 3 Efficiency Comparison to RTX 4090 : Read more
The reasoning is known: the cards have the same power requirements as the 4090 and are LESS powerful! They want you to be stupid enough to buy the card first and realize that later!
 

King_V

Illustrious
Ambassador
In fact, according to our tests, the RTX 4080 Founders Edition consumed just 221W at 1080p.

That is a bit misleading, isn't it? After all, according to the very test article that it links to:

That's especially true of our 1080p results, where the average power use for the card was just 221W. With a faster CPU like a Core i9-13900K or Ryzen 9 7950X, we'd likely see higher power use from the 4080, though it's likely even an overclocked 13900K wouldn't max out its power limit at 1080p.

Basically, this is saying that the 4080 uses just 221W for gaming when it's forced to sit there waiting for the CPU to provide it with enough data.

Don't get me wrong: given that Nvidia officially rates the card at 320W, it is notable that, in the Additional Power, Clock, and Temperature Testing table, in the cases where the RTX 4080 is pushed to its maximum (99.0% utilization) regardless of resolution, it uses as little as 277.6W (Horizon Zero Dawn, 4K Ultimate) or as much as 308.8W (Bright Memory Infinite, 4K Very High).

side note: kind of interesting that its highest power consumption AND lowest power consumption, when at full utilization, were both at 4K.

Still impressive, don't get me wrong, relative to the stated official TDP. But, unless I'm missing something, saying that 221W usage is "impressive efficiency" for the card isn't really telling us much when the CPU is the limiting factor... it's not telling us about performance per watt.


Alternately, to compare apples to apples, I'd say that the other competing cards, whether from AMD, Intel, or even other 4000-series Nvidia models, would need a similarly detailed table to truly be comparable. Or a "this is the geomean across our tests at 1080p, 1440p, and 4K" style of comparison.
 

JarredWaltonGPU

Senior GPU Editor
Editor
That is a bit misleading, isn't it? After all, according to the very test article that it links to:



Basically, this is saying that the 4080 uses just 221W for gaming when it's forced to sit there waiting for the CPU to provide it with enough data.

Don't get me wrong: given that Nvidia officially rates the card at 320W, it is notable that, in the Additional Power, Clock, and Temperature Testing table, in the cases where the RTX 4080 is pushed to its maximum (99.0% utilization) regardless of resolution, it uses as little as 277.6W (Horizon Zero Dawn, 4K Ultimate) or as much as 308.8W (Bright Memory Infinite, 4K Very High).

side note: kind of interesting that its highest power consumption AND lowest power consumption, when at full utilization, were both at 4K.

Still impressive, don't get me wrong, relative to the stated official TDP. But, unless I'm missing something, saying that 221W usage is "impressive efficiency" for the card isn't really telling us much when the CPU is the limiting factor... it's not telling us about performance per watt.


Alternately, to compare apples to apples, I'd say that the other competing cards, whether from AMD, Intel, or even other 4000-series Nvidia models, would need a similarly detailed table to truly be comparable. Or a "this is the geomean across our tests at 1080p, 1440p, and 4K" style of comparison.
The point wasn't so much the 4080's 221W figure; it's that the 4090 also doesn't generally use its listed 450W either. If you look at the Gigabyte RTX 4090 review's power page, or the MSI RTX 4090 review's power page, both have full data on power use for most of the tests. (RDR2 doesn't, because that game crashes if you try to use FrameView with it.) Here's a better view of the Gigabyte results from Excel, which doesn't munge the tables like our CMS:

[Image: Gigabyte RTX 4090 per-game power results, exported from Excel]

Across all 13 games, it averaged just under 400W at 4K and under 350W at 1440p. Only in DXR testing did it get close to the 450W figure, averaging 432W; except it's a safe bet that the 4090 will be a lot faster than the 7900 XTX in demanding DXR tests, which would still give Nvidia the win on performance per watt. If you only look at non-DXR games, the 4090 averaged 311W at 1440p (geometric mean of 303.7W) and 376W at 4K (geomean of 373.8W). That's roughly 75W below the rated maximum TBP, so any assumption AMD might have made about power going into its testing probably proved incorrect once it had actual data.

From my reviews and testing, using the seven non-DXR games and geometric means of average FPS and average power:

RTX 4080 perf-per-watt (PPW) at 1440p: 0.526 FPS/W
RTX 4080 perf-per-watt (PPW) at 4K: 0.287 FPS/W

RTX 4090 perf-per-watt (PPW) at 1440p: 0.494 FPS/W
RTX 4090 perf-per-watt (PPW) at 4K: 0.319 FPS/W

So based on those results, the 4090 was more efficient than the 4080 at 4K ultra settings! Sadly, I don't have full matching data for other GPUs (YET!), but I'll definitely be capturing that on future testing. I'm going to be retesting everything on a new 13900K testbed in the coming weeks/months as well, using FrameView and PCAT v2 to measure real power data, and that should prove quite interesting.
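For anyone who wants to reproduce that kind of figure, here's a minimal sketch of the calculation as described above, i.e. geometric means of per-game average FPS and per-game board power. The per-game numbers below are placeholders for illustration, not the review data:

```python
from math import prod

def geomean(values):
    """Geometric mean: less skewed by a single outlier game than a simple average."""
    return prod(values) ** (1.0 / len(values))

def perf_per_watt(fps_by_game, watts_by_game):
    """FPS/W computed from the geometric means of per-game FPS and board power."""
    return geomean(fps_by_game) / geomean(watts_by_game)

# Hypothetical per-game results across seven non-DXR games (illustration only):
fps_4k   = [92.0, 110.5, 78.3, 131.2, 88.7, 102.4, 95.1]
watts_4k = [371.0, 382.5, 355.8, 390.2, 368.4, 377.9, 360.1]

print(f"PPW at 4K: {perf_per_watt(fps_4k, watts_4k):.3f} FPS/W")
```

The geometric mean is used so that one unusually light or heavy game can't dominate the result the way a simple average would.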
 

JarredWaltonGPU

Senior GPU Editor
Editor
Actually, here's another set of data, using Spider-Man: Miles Morales testing. AMD values are "approximate" because AMD basically lies in its software power reporting. (Looking at real vs. software values, AMD's power use is anywhere from ~18 to ~35 percent higher than the software reports, as the software only gives GPU power rather than full board power. This is why we need to use a tool like PCAT or Powenetics. But Powenetics is rather impractical for capturing this on every game, so I'm hoping to use PCAT soon.)

For these PPW figures, I applied a 20% higher TBP value for the AMD cards. It will be slightly high for some cards and quite a bit low for others, but it's at least a rough starting point. And this is a forum thread precisely because I don't have hard AMD TBP figures, so don't get bent out of shape; we know the AMD data is inaccurate, take it with a grain of salt, etc. Anyway, the tables below are sorted by PPW at 1080p, using ultra settings and ultra with maxed-out DXR settings.
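To make that adjustment concrete, here's a minimal sketch of the correction just described; the flat 20% uplift is only the rough factor used for these tables, and the sample numbers are hypothetical (the real software-vs-board gap varies by card, roughly 18-35% as noted above):

```python
def estimated_board_power(reported_gpu_watts, uplift=0.20):
    """Approximate total board power from AMD's software-reported GPU-only power.
    The flat 20% uplift is a rough correction; the true gap varies by card."""
    return reported_gpu_watts * (1.0 + uplift)

def adjusted_ppw(avg_fps, reported_gpu_watts, uplift=0.20):
    """FPS per estimated board watt for a card that only reports GPU power in software."""
    return avg_fps / estimated_board_power(reported_gpu_watts, uplift)

# Hypothetical example: a card whose software reports 190 W while averaging 120 FPS
print(f"{adjusted_ppw(120.0, 190.0):.3f} FPS/W")  # 120 / (190 * 1.2) = ~0.526
```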

Spider-Man: Miles Morales (1080p Very High)
Card | FPS/Watt | Avg FPS | GPU Clock (MHz) | GPU Temp (C) | GPU Power (W) | GPU Utilization (%)
RTX 4080 | 0.994 | 140.7 | 2820 | 42 | 141.6 | 69.4
RTX 4090 | 0.989 | 141 | 2775 | 44.1 | 142.6 | 61.3
RX 6600 XT | 0.785 | 83.4 | 2668.5 | 51 | 106.2 | 97.8
RX 6700 XT | 0.708 | 99.4 | 2570.9 | 69.4 | 140.3 | 97.2
RX 6800 XT | 0.597 | 115.3 | 2436 | 81.8 | 193.1 | 97.3
RTX 3060 | 0.594 | 79.4 | 1912 | 62.3 | 133.6 | 94.6
RX 6500 XT | 0.582 | 31.5 | 2711.3 | 50.9 | 54.1 | 98.9
RTX 3070 | 0.564 | 99.3 | 1973.2 | 65.4 | 176.1 | 92.3
RX 6950 XT | 0.531 | 122.9 | 2633.8 | 62 | 231.6 | 96.4
GTX 1650 Super | 0.500 | 43.2 | 1920 | 59.4 | 86.4 | 97.1
GTX 1660 Super | 0.487 | 49.2 | 1925.5 | 64.9 | 101 | 96.7
RTX 3080 | 0.478 | 124.8 | 1950 | 73.8 | 261.1 | 91.8
RTX 3090 | 0.471 | 132.3 | 1965 | 62.8 | 280.7 | 90.1
RTX 2060 | 0.453 | 68.9 | 1889.7 | 67.5 | 152.1 | 95.5

Spider-Man: Miles Morales (1080p Very High, DXR)
Card | FPS/Watt | Avg FPS | GPU Clock (MHz) | GPU Temp (C) | GPU Power (W) | GPU Utilization (%)
RTX 4080 | 0.640 | 98.3 | 2805 | 46.2 | 153.7 | 83.3
RTX 4090 | 0.575 | 98.6 | 2775 | 49 | 171.4 | 72.2
RTX 3060 | 0.377 | 49.5 | 1912 | 61.7 | 131.4 | 96.7
RX 6700 XT | 0.367 | 57.6 | 2569.5 | 74.7 | 156.8 | 98.7
RTX 3070 | 0.353 | 62 | 1965 | 66 | 175.4 | 96.2
RX 6600 XT | 0.327 | 29.3 | 2682.4 | 48.7 | 89.6 | 98.9
RX 6800 XT | 0.320 | 70.4 | 2437.5 | 80.1 | 219.7 | 97.9
RTX 3080 | 0.293 | 79.9 | 1950 | 74.8 | 272.4 | 95.3
RTX 3090 | 0.287 | 84.6 | 1950 | 67.6 | 294.9 | 94.8
RX 6950 XT | 0.283 | 76.9 | 2614.8 | 66.1 | 272.2 | 97.3
RTX 2060 | 0.255 | 35.1 | 1890 | 66.8 | 137.4 | 97.4
RX 6500 XT | 0.121 | 4.4 | 2714.5 | 48.7 | 36.3 | 99

Some food for thought, at least. It's only one game, but Nvidia's RTX 4090 and 4080 are at the top of both charts. I can also tell you that they're at the top of the 1440p and 4K results as well, by quite a wide margin in some cases. Actually, I'll just post the 4K figures, with the same caveats as above about AMD's power values.

Spider-Man: Miles Morales (4K)
Card | FPS/Watt | Avg FPS | GPU Clock (MHz) | GPU Temp (C) | GPU Power (W) | GPU Utilization (%)
RTX 4090 | 0.402 | 119.5 | 2760.0 | 59.4 | 297.2 | 95.0
RTX 4080 | 0.399 | 100.9 | 2805.0 | 54.7 | 253.0 | 96.2
RX 6700 XT | 0.266 | 48.2 | 2540.5 | 77.0 | 181.5 | 98.7
RX 6600 XT | 0.257 | 27.6 | 2657.2 | 51.2 | 107.6 | 98.9
RX 6800 XT | 0.249 | 63.1 | 2391.8 | 77.9 | 253.6 | 98.5
RX 6950 XT | 0.245 | 69.4 | 2450.8 | 65.0 | 283.0 | 98.6
RTX 3070 | 0.231 | 48.6 | 1917.8 | 70.3 | 210.0 | 96.6
RTX 3090 | 0.230 | 76.9 | 1839.8 | 71.6 | 334.1 | 95.8
RTX 3080 | 0.224 | 69.2 | 1870.0 | 78.7 | 308.4 | 96.4
RTX 3060 | 0.221 | 36.0 | 1886.8 | 69.2 | 162.9 | 97.3
GTX 1660 Super | 0.177 | 18.8 | 1922.3 | 64.9 | 106.3 | 98.7
RTX 2060 | 0.159 | 23.3 | 1879.0 | 68.5 | 146.9 | 97.9

Spider-Man: Miles Morales (4K, DXR)
Card | FPS/Watt | Avg FPS | GPU Clock (MHz) | GPU Temp (C) | GPU Power (W) | GPU Utilization (%)
RTX 4090 | 0.250 | 72.3 | 2760.0 | 58.7 | 289.4 | 94.9
RTX 4080 | 0.238 | 56.8 | 2790.0 | 55.7 | 238.6 | 96.8
RTX 3090 | 0.130 | 42.4 | 1870.2 | 71.0 | 327.4 | 97.3
RTX 3060 | 0.126 | 18.5 | 1896.9 | 66.1 | 146.7 | 98.3
RX 6800 XT | 0.126 | 32.1 | 2364.9 | 78.3 | 254.7 | 99.0
RX 6950 XT | 0.122 | 34.6 | 2417.4 | 66.8 | 283.4 | 98.9
RX 6700 XT | 0.116 | 19.0 | 2540.2 | 76.5 | 164.2 | 98.8
RTX 3070 | 0.115 | 20.6 | 1962.0 | 67.5 | 179.9 | 98.3
RTX 3080 | 0.112 | 32.6 | 1915.2 | 76.3 | 291.3 | 97.4

Based on just these figures, the PPW of the 4090 is 98% higher than the closest AMD card at 4K DXR, and 51% higher than the closest AMD card at 4K without DXR. If you take the RX 6950 XT values and apply a 50% improvement in PPW, the 4090 would still be 9% 'better' at 4K and 37% better at 4K DXR. I strongly suspect when AMD did the actual testing, they discovered exactly this and so dropped the slide and forgot to remove the footnote. We'll find out how it actually fares in real-world testing in December! 🙃
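For anyone checking the arithmetic, the two "would still be X% better" figures above fall straight out of the 4K tables; only the +50% uplift is the hypothetical part:

```python
# RX 6950 XT FPS/W from the 4K tables, bumped by a hypothetical +50% RDNA 3 gain
raster_amd = 0.245 * 1.5   # -> 0.3675
dxr_amd    = 0.122 * 1.5   # -> 0.183

# RTX 4090 FPS/W from the same tables
raster_4090, dxr_4090 = 0.402, 0.250

print(f"4090 lead at 4K:      {raster_4090 / raster_amd - 1:.0%}")  # ~9%
print(f"4090 lead at 4K DXR:  {dxr_4090 / dxr_amd - 1:.0%}")        # ~37%
```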
 

Elusive Ruse

Commendable
Nov 17, 2022
Although I'm impressed by the power efficiency of the 4000 series, a lot of it is down to the horrible perf/W of the 3090 and 3090 Ti. I wouldn't put much stock in gaming benchmarks for 4090 efficiency, though, as that card is just overkill for gaming purposes.
[Image: Kv6tMwJ.jpg]
 

LastStanding

Prominent
May 5, 2022
they've pretty much solved the connector thing. only 50 cards out of 124,000 sold had the problem (0.04% of cards), and all 50 cases were because the connector wasn't seated all the way. the fix is pretty simple: either add a better clip, or shorten the sense pins so the cable won't send power if the connector is not seated all the way. the current sense pins are too long and give a false positive on being fully seated.
Out of those 100k+ cards, how many are in data centers, at Hollywood studios, in scalpers' hands, sitting on eBay, etc.? That failure rate is still very much an unknown. And remember, NVIDIA has shown no FACTUAL data yet to back up its findings after GN's support video; it only delivered a well-orchestrated PR stunt.
 

Math Geek

Titan
Ambassador
yah the whole "we investigated ourselves and found we did nothing wrong and we are awesome" PR nonsense is crazy.

but i guess so long as folks continue buying their defective cards, all is good.
 

Elusive Ruse

Commendable
Nov 17, 2022
yah the whole "we investigated ourselves and found we did nothing wrong and we are awesome" PR nonsense is crazy.

but i guess so long as folks continue buying their defective cards, all is good.
I wouldn't be surprised if they just relied on GN's investigation without putting in any effort themselves.
 

Math Geek

Titan
Ambassador
to me the scary thing is that these connections melted down at much lower power than we first thought.

sure, 600w is easy to imagine going all meltdown. but at lower day-to-day gaming power draw, this is more alarming to me. leaves me wondering if we'll see this with the 4080 cards soon as well.
 