News AMD Ryzen 9 9950X Engineering Sample gets a full suite of Blender benchmarks at various TDPs, showcasing major efficiency improvements


Deleted member 2731765

Guest
One good thing worth mentioning about this early engineering Zen 5 sample test is that, apart from the stellar efficiency, the CPU temperature values were also pretty decent.

They never exceeded 62 C. Here is the breakdown anyway.
  • Ryzen 9 9950X (230W PPT) - 5620 MHz Peak Clock / 62C Temps
  • Ryzen 9 9950X (160W PPT) - 5555 MHz Peak Clock / 58C Temps
  • Ryzen 9 9950X (120W PPT) - 5220 MHz Peak Clock / 55C Temps
  • Ryzen 9 9950X (90W PPT) - 5050 MHz Peak Clock / 49C Temps
  • Ryzen 9 9950X (60W PPT) - 4084 MHz Peak Clock / 41C Temps
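If you want to eyeball the scaling, here's a rough sketch (Python) using just the clocks listed above; peak clock is only a crude proxy for actual Blender throughput, so treat the percentages as illustrative, not as measured performance.

```python
# Rough scaling check using the PPT / peak-clock figures quoted above.
# Peak clock is only a crude proxy for Blender throughput, so these
# percentages are illustrative only.
samples = {  # PPT in watts -> peak clock in MHz
    230: 5620,
    160: 5555,
    120: 5220,
    90: 5050,
    60: 4084,
}

ref_ppt = 230
ref_clk = samples[ref_ppt]

for ppt in sorted(samples, reverse=True):
    clk = samples[ppt]
    print(f"{ppt:3d} W PPT: {clk} MHz "
          f"-> {clk / ref_clk:.0%} of the 230 W clock at {ppt / ref_ppt:.0%} of the power")
```

For example, at 90W PPT the chip still holds about 90% of its 230W peak clock while using only ~39% of the power budget.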
 

OLDKnerd

BANNED
Jun 12, 2024
16
10
15
My Threadripper idles on the Windows desktop at 170-180 watts. :sneaky:

You might find one place in the world that pays a tiny little bit more for 1 kWh than we do here on my little pink cloud with a rainbow below.
 

vinay2070

Distinguished
Nov 27, 2011
281
74
18,870
My Threadripper idles on the Windows desktop at 170-180 watts. :sneaky:

You might find one place in the world that pays a tiny little bit more for 1 kWh than we do here on my little pink cloud with a rainbow below.
Yeah, AMD's idle power consumption is pathetic. Their laptops won't last long on battery compared to Intel or Snapdragon. All the benefits they see in reviews calling it an efficiency monster etc. are lost at the end of the day with mixed workloads. And no reviewer wants to cover this in depth for some reason.
 

OLDKnerd

BANNED
Jun 12, 2024
16
10
15
Well, mine is a Gen 1 Threadripper, 12 cores / 24 threads; I think things got a bit better later on.
Contemplating falling back to 8 cores / 16 threads on the next upgrade.

Really, what I do these days a dual core could handle, but I still want to be ready to game, even if that ship has sailed and for two decades now games have gone to poop, well, at least the FPS games I care for.
 
  • Like
Reactions: usertests

Simon_78

Distinguished
Aug 24, 2016
10
15
18,515
Yeah, AMD's idle power consumption is pathetic. Their laptops won't last long on battery compared to Intel or Snapdragon. All the benefits they see in reviews calling it an efficiency monster etc. are lost at the end of the day with mixed workloads. And no reviewer wants to cover this in depth for some reason.
While I agree that earlier generations of Zen certainly had an idle power consumption issue and Intel was vastly superior, this is a thing of the past.

OC3D.net total system idle power consumption data:

https://oc3dmedia.s3.eu-west-2.amaz...7950x3d-and-7900x3d-review_64b94dc5aa70f.jpeg

7950X3D = 96W

13900K = 98W

Then there's Guru3D which also still runs total system idle power consumption test for their reviews:

https://www.guru3d.com/data/publish/221/17ed1429c2d65837ceca4fefabe1b99ec5486d/untitled_1.webp

7950X3D = 78W

14900K = 81W

Conclusion: while there was a difference in idle total system power draw between AMD and Intel in Intel's favor a few years ago, nowadays it's a definitive tie.

Same experience with a 7800X3D; it idles at around 12W in HWiNFO64.

It's good to keep oneself updated on such matters, lest one be relegated to unwilling ignorance due to outdated information.

Same thing as some people who think Noctua's coolers are still the best, when coolers 2, 3 and sometimes even 4 times cheaper are just as good or better in noise normalized testing. To labor on outdated information is an issue best avoided.
 

Amdlova

Distinguished
Yo, my idle is 46W :) 225W max.
AMD CPUs have low power on the cores, but Intel can get the CPU to idle at about 1 to 2 watts.
I use the Intel iGPU for a video wallpaper and bring the RTX 4060 Ti down to 10W at idle.

On an AMD card you can't do this properly like with the Intel iGPU; there are lots of bugs, AMD driver crashes and other defects.

46 watts with four SSDs in the system, four DIMMs and an ATX motherboard is a dream.
 

Vanderlindemedia

Commendable
Jul 15, 2022
116
66
1,660
My Threadripper idles on the Windows desktop at 170-180 watts. :sneaky:

You might find one place in the world that pays a tiny little bit more for 1 kWh than we do here on my little pink cloud with a rainbow below.

I'd replace it with a new-generation Ryzen or so. TRs are getting old, and the idle consumption is purely due to the large cache that needs to be powered constantly.

Anyway: numbers look good.
 
Yeah, AMD's idle power consumption is pathetic. Their laptops won't last long on battery compared to Intel or Snapdragon. All the benefits they see in reviews calling it an efficiency monster etc. are lost at the end of the day with mixed workloads. And no reviewer wants to cover this in depth for some reason.
There are a lot of reviewers that ignore tons of things. For example, nobody ever benchmarks Intel and AMD chips at the same TDP. You can see a 50% performance increase in the 9950X going from 90W to the stock 230W. But for the last several years, flagship Intel chips have regularly drawn 250W+ at stock and been compared to AMD flagships drawing 125W.

I'd really prefer to see a 230W and 120W normalized comparison. From what I've heard, Intel builds from OEMs like Alienware throttle in the real world and lose 12% off of benchmark scores.

Intel laptops actually did terribly with batteries from 2016-2020, until they got past that 14nm node. So you're referring to a small window.
 
  • Like
Reactions: bit_user

abufrejoval

Reputable
Jun 19, 2020
480
323
5,060
My Threadripper idles on the Windows desktop at 170-180 watts. :sneaky:

You might find one place in the world that pays a tiny little bit more for 1 kWh than we do here on my little pink cloud with a rainbow below.
Servers aren't designed to save power at idle, because they aren't supposed to run idle: 100% idle means 100% of the original investment and operating cost is a financial loss.

Their design goal is to get the most compute out of the least wattage at near 99% utilization.

And since my home lab is mostly about functional testing, most of the hardware is actually idle, and the main reason I've liked the 16-core Ryzens so much is that their desktop design is notebook-driven and lets me run server workloads functionally at something nearer notebook heat and noise levels.

On my earlier Xeons it was actually the RAM that consumed most of the power: the 18-22 core CPUs would go down to around 10 Watts at idle, but the RAM was around 50 Watts at idle and 120 Watts when in use.

And then the other server components aren't typically designed for minimal noise and heat at idle, but for running near full load for years without fail. The last quad-socket Intel Xeons I remember measuring were 400 Watts at idle and 800 Watts at max load: that was long before they started putting GPUs into them, and more than 100 Watts of that might have been for the fans alone.
 
  • Like
Reactions: 35below0

abufrejoval

Reputable
Jun 19, 2020
480
323
5,060
Yeah, AMD's idle power consumption is pathetic. Their laptops won't last long on battery compared to Intel or Snapdragon.
With the Snapdragon having just entered the scene and the AMDs coming in two weeks, that statement is based on very personal data samples.

I bought two Lenovos within a year, one a G8 with an i7-8559U and the other with a Ryzen 5800U, both with 15-28 Watts of configurable TDP and very similar physical size and battery capacity.

It was night and day, with the Intel toasting my fingers and draining its battery within four hours max, while the Ryzen never got hot and lasted much longer, but obviously couldn't pull tons of power out of its eight cores when they had to share 15 Watts among them.

My much younger Alder Lake laptop easily beats the Ryzen 5800U, but it uses a 45 Watt TDP i7-12700H to get there, which isn't exactly fair.

But of course it can outlast the Ryzen as well, if you basically disable all P-cores and let it only sip droplets from its much larger battery. If you run it full tilt, there is a good chance it'll drop dead in two hours, although I doubt it can even do 45 Watts on battery.
All the benefits they see in reviews calling it an efficiency monster etc. are lost at the end of the day with mixed workloads. And no reviewer wants to cover this in depth for some reason.
Because mixed workloads are different for everyone: all attempts to come up with some standard mixed workload have immediately been attacked for bias. And standard apps keep changing all the time, while vendors invent things like MMX, VNNI or AI and market them as having intrinsic value, etc.

Just try to come up with a good mixed workload definition yourself and you'll see it's anything but trivial to define, automate and compare. If you can prove the opposite, it's time to apply for a job with TH or any other publication, because they'd love nothing more than to publish results that satisfy you and millions of others who aren't very much like you but want the same personal optimum.
 
  • Like
Reactions: bit_user
There are a lot of reviewers that ignore tons of things. For example, nobody ever benchmarks Intel and AMD chips at the same TDP. You can see a 50% performance increase in the 9950X going from 90W to the stock 230W. But for the last several years, flagship Intel chips have regularly drawn 250W+ at stock and been compared to AMD flagships drawing 125W.

I'd really prefer to see a 230W and 120W normalized comparison. From what I've heard, Intel builds from OEMs like Alienware throttle in the real world and lose 12% off of benchmark scores.

Intel laptops actually did terribly with batteries from 2016-2020, until they got past that 14nm node. So you're referring to a small window.
All the reviews and benchmarks are at 230W PPT for the top-tier Ryzen parts, like here where it draws 235W at stock; they just don't show you the real power draw, only the TDP, to keep the myth alive.
https://www.techpowerup.com/review/amd-ryzen-9-7950x/24.html
[Image: power-multithread.png]
 
  • Like
Reactions: dalauder

vinay2070

Distinguished
Nov 27, 2011
281
74
18,870
While I agree that earlier generations of Zen certainly had an idle power consumption issue and Intel was vastly superior, this is a thing of the past.

OC3D.net total system idle power consumption data:

https://oc3dmedia.s3.eu-west-2.amaz...7950x3d-and-7900x3d-review_64b94dc5aa70f.jpeg

7950X3D = 96W

13900K = 98W

Then there's Guru3D which also still runs total system idle power consumption test for their reviews:

https://www.guru3d.com/data/publish/221/17ed1429c2d65837ceca4fefabe1b99ec5486d/untitled_1.webp

7950X3D = 78W

14900K = 81W

Conclusion: while there was a difference in idle total system power draw between AMD and Intel in Intel's favor a few years ago, nowadays it's a definitive tie.

Same experience with a 7800X3D; it idles at around 12W in HWiNFO64.

It's good to keep oneself updated on such matters, lest one be relegated to unwilling ignorance due to outdated information.

Same thing as some people who think Noctua's coolers are still the best, when coolers 2, 3 and sometimes even 4 times cheaper are just as good or better in noise normalized testing. To labor on outdated information is an issue best avoided.
https://forums.tomshardware.com/thr...battle-for-the-high-end.3846021/post-23274482
 

vinay2070

Distinguished
Nov 27, 2011
281
74
18,870
All the reviews and benchmarks are at 230W PPT for the top-tier Ryzen parts, like here where it draws 235W at stock; they just don't show you the real power draw, only the TDP, to keep the myth alive.
https://www.techpowerup.com/review/amd-ryzen-9-7950x/24.html
[Image: power-multithread.png]
Every Tom's Hardware review shows Ryzen WAY below on power consumption at stock. If Ryzen gets a 20% improvement using 50% more power, which might be modest based on this article, then the 7000 series beats the 14000 series.
Tom's data: https://www.tomshardware.com/news/i...-intel-core-i9-14900k-i7-14700k-and-i5-14600k

How do I insert images?
https://cdn.mos.cms.futurecdn.net/3tjH9FMPjXZRpQp6ToT346-970-80.png.webp
 
  • Like
Reactions: bit_user
Every Tom's Hardware review shows Ryzen WAY below on power consumption at stock. If Ryzen gets a 20% improvement using 50% more power, which might be modest based on this article, then the 7000 series beats the 14000 series.
Tom's data: https://www.tomshardware.com/news/i...-intel-core-i9-14900k-i7-14700k-and-i5-14600k

How do I insert images?
https://cdn.mos.cms.futurecdn.net/3tjH9FMPjXZRpQp6ToT346-970-80.png.webp
Yeah, a lot of reviews are trying to convince people that Ryzen is using way less power than it does.
You can tell by comparing the results that Tom's uses the same(ish) settings as TechPowerUp.
37,900 on Tom's vs 37,500 on TPU, so Tom's actually has a slightly higher result.
[Image: eLEfSYtAtcrngnyMxN2cL8-1200-80.png.webp]

[Image: cinebench-multi.png]
 

abufrejoval

Reputable
Jun 19, 2020
480
323
5,060
Ryzens with distinct CCDs and IODs won't ever be as efficient as APUs or SoCs: there is a certain overhead to pay for the chip-to-chip communication and essentially those two (or three) dies will each do their own power management.

I'd say my desktop Ryzens are very efficient relative to Xeons with similar amounts of computing power, but they aren't the most parsimonious idlers overall. I have some Alder Lake i7-12700H based desktops using these H-class mobile chips at 120/90 Watt PL2/PL1, which are more or less the computing-power and peak-energy-consumption equivalent of an 8-core Ryzen 5800X but more power efficient at idle.

And a lot of that is probably the X570 chipset/mainboard versus the mobile-components mainboard of the Alder Lake. It would be interesting to measure the 5800X vs. a 5800G (or rather the exact 65 Watt equivalents) to see how big the idle impact of chiplet vs. monolithic Zen is.

But that's neither the 16-cores where Intel had to resort to horrendous Wattage to reach Zen peak performance nor the U-class notebooks, where Intel only gave you two P-cores vs. AMD's 8 (even if they are limited to only 2 GHz all-cores at 15 Watts).

I guess even at the top end Intel could win "the idle wars" against Zen, while losing big at peak and at many partial loads. Which of the two will then be more efficient for you obviously depends on how you operate your systems, which is why I believe a simple comparison can't be done.

But to say that Zens generally suck at idle is too broad a statement, and I'm eagerly looking forward to Strix Point in two weeks, which aims for a higher TDP than the Snapdragons but also delivers quite a bit more punch. They should also be rather good at partial, very low and idle loads if AMD has done their work right; measuring and proving that will still be a challenge.

And unfortunately Intel will be late to that party, so we'll have to wait until the end of the year to see if they can draw even.
 
Last edited:
  • Like
Reactions: vinay2070
And unfortunately Intel will be late to that party, so we'll have to wait until the end of the year to see if they can draw even.
???
AMD is releasing a new gen in the middle of summer, when nobody cares about PCs and nobody is home to care in the first place, and by the time people are back and eager to upgrade, Intel will be releasing their new gen, just in time...

But AMD has no choice here; they are forced to release now to have a chance to sell their new gen before it gets compared to Intel's new gen.

And if you think that they will be more efficient, just look at the history and how every gen of Ryzen has increased its power draw by quite a lot.
The new gen will have close to zero improvement in overall performance at the same power draw; that's why you see AMD pushing single-core performance so much, because in multi-core workloads clocks will drop immensely and performance will be barely any better at the same power.
 

Simon_78

Distinguished
Aug 24, 2016
10
15
18,515
Ryzens with distinct CCDs and IODs won't ever be as efficient as APUs or SoCs: there is a certain overhead to pay for the chip-to-chip communication and essentially those two (or three) dies will each do their own power management.

I'd say my desktop Ryzens are very efficient relative to Xeons with similar amounts of computing power, but they aren't the most parsimonious idlers overall. I have some Alder Lake i7-12700H based desktops using these H-class mobile chips at 120/90 Watt PL2/PL1, which are more or less the computing-power and peak-energy-consumption equivalent of an 8-core Ryzen 5800X but more power efficient at idle.

And a lot of that is probably the X570 chipset/mainboard versus the mobile-components mainboard of the Alder Lake. It would be interesting to measure the 5800X vs. a 5800G (or rather the exact 65 Watt equivalents) to see how big the idle impact of chiplet vs. monolithic Zen is.

But that's neither the 16-cores where Intel had to resort to horrendous Wattage to reach Zen peak performance nor the U-class notebooks, where Intel only gave you two P-cores vs. AMD's 8 (even if they are limited to only 2 GHz all-cores at 15 Watts).

I guess even at the top end Intel could win "the idle wars" against Zen, while losing big at peak and at many partial loads. Which of the two will then be more efficient for you obviously depends on how you operate your systems, which is why I believe a simple comparison can't be done.

But to say that Zens generally suck at idle is too broad a statement, and I'm eagerly looking forward to Strix Point in two weeks, which aims for a higher TDP than the Snapdragons but also delivers quite a bit more punch. They should also be rather good at partial, very low and idle loads if AMD has done their work right; measuring and proving that will still be a challenge.

And unfortunately Intel will be late to that party, so we'll have to wait until the end of the year to see if they can draw even.

AMD is, unequivocally, using much less power for the same work.

You seem to like TPU, so I'll include their data too.

https://tpucdn.com/review/intel-core-i9-14900k/images/efficiency-multithread.png

https://tpucdn.com/review/intel-core-i9-14900k/images/efficiency-gaming.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-games-compare-vs-7950x3d.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-games-compare-vs-7950x.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-applications-compare-vs-7950x3d.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-applications-compare-vs-7950x.png

Quote from W1zzard ( reviewer at TPU ): Compared to AMD's offerings there's still a night-and-day difference. During rendering, the 14900K consumes 282 W, the 7950X needs 80 W less, while offering virtually the same performance. The 7950X3D even uses only 140 W, which is half of the 14900K, and it's only marginally slower. Similar situation in gaming: 14900K average gaming power of 144 W, 7950X: 89 W, 7950X3D: 56 W, 7800X3D: 49 W (!)—all while offering virtually the same FPS.

I'll also include Gamers Nexus results for good measure.

https://gamersnexus.net/u/styles/la...N Logo) Power Efficiency GamersNexus.png.webp

Quote from Steve: "The 14900K is towards the bottom-end of the chart. It’s more efficient than the 13900K, 14700K, 12900K, and 13700K -- and AMD’s 5800X, barely -- but it’s far below top performers like the 5950X and 7950X. It’s not the least efficient we’ve tested, but Intel is pushing its thermal envelope and power draw higher and higher."

Now if you still want to claim otherwise, it's your prerogative, but it's also deeply flawed.
 
Last edited:
  • Like
Reactions: bit_user

abufrejoval

Reputable
Jun 19, 2020
480
323
5,060
AMD is, unequivocally, using much less power for the same work.

You seem to like TPU, so I'll include their data too.

https://tpucdn.com/review/intel-core-i9-14900k/images/efficiency-multithread.png

https://tpucdn.com/review/intel-core-i9-14900k/images/efficiency-gaming.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-games-compare-vs-7950x3d.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-games-compare-vs-7950x.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-applications-compare-vs-7950x3d.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-applications-compare-vs-7950x.png

Quote from W1zzard ( reviewer at TPU ): Compared to AMD's offerings there's still a night-and-day difference. During rendering, the 14900K consumes 282 W, the 7950X needs 80 W less, while offering virtually the same performance. The 7950X3D even uses only 140 W, which is half of the 14900K, and it's only marginally slower. Similar situation in gaming: 14900K average gaming power of 144 W, 7950X: 89 W, 7950X3D: 56 W, 7800X3D: 49 W (!)—all while offering virtually the same FPS.

I'll also include Gamers Nexus results for good measure.

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN CPU Benchmark Blender 3.6.4 (GN Logo) Power Efficiency GamersNexus.png.webp

Quote from Steve: "The 14900K is towards the bottom-end of the chart. It’s more efficient than the 13900K, 14700K, 12900K, and 13700K -- and AMD’s 5800X, barely -- but it’s far below top performers like the 5950X and 7950X. It’s not the least efficient we’ve tested, but Intel is pushing its thermal envelope and power draw higher and higher."

Now if you still want to claim otherwise, it's your prerogative, but it's also deeply flawed.
Pretty sure you're responding to the wrong guy here...

I never claimed that Intel was more efficient than AMD at work, especially at peak loads.

I'll grant them that they can be more efficient at idle, when compared to AMD parts with a single CCD, an IOD and an X570 chipset. That was a vain attempt to appease TerryLaze, which went nowhere.

I noticed, when I ran a 5800X3D on an X570 board against an Alder Lake i7-12700H that had been pushed to nearly the same TDP, that the Alder Lake actually seemed to have a leg up, because it matched the 5800X3D on peak synthetic loads and idled at lower power.

But of course the 5800X3D would have trounced the i7-12700H in many games, and it wasn't quite fair in terms of PCIe lanes, as the i7-12700H is both a mobile and monolithic design and offers quite a lot less I/O than an X570 board.

But on my big workstations I switched from Broadwell Xeons to Ryzens with the 5950X and 7950X3D, and I actually haven't seen any reason to go back to Intel in either the EPYC-class or desktop ranges, while Strix Point will most likely dominate 25-65 Watts, too. I've used a 5800U below that; I'm not sure I'll actually want something new in the 15 Watt range for a while, so Lunar Lake would be lost on me.
 
  • Like
Reactions: eX_Arkangel

TheHerald

Prominent
Feb 15, 2024
614
166
560
AMD is, unequivocally, using much less power for the same work.

You seem to like TPU, so I'll include their data too.

https://tpucdn.com/review/intel-core-i9-14900k/images/efficiency-multithread.png

https://tpucdn.com/review/intel-core-i9-14900k/images/efficiency-gaming.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-games-compare-vs-7950x3d.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-games-compare-vs-7950x.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-applications-compare-vs-7950x3d.png

https://tpucdn.com/review/intel-core-i9-14900k/images/power-applications-compare-vs-7950x.png

Quote from W1zzard ( reviewer at TPU ): Compared to AMD's offerings there's still a night-and-day difference. During rendering, the 14900K consumes 282 W, the 7950X needs 80 W less, while offering virtually the same performance. The 7950X3D even uses only 140 W, which is half of the 14900K, and it's only marginally slower. Similar situation in gaming: 14900K average gaming power of 144 W, 7950X: 89 W, 7950X3D: 56 W, 7800X3D: 49 W (!)—all while offering virtually the same FPS.

I'll also include Gamers Nexus results for good measure.

https://gamersnexus.net/u/styles/large_responsive_no_watermark_/public/inline-images/GN CPU Benchmark Blender 3.6.4 (GN Logo) Power Efficiency GamersNexus.png.webp

Quote from Steve: "The 14900K is towards the bottom-end of the chart. It’s more efficient than the 13900K, 14700K, 12900K, and 13700K -- and AMD’s 5800X, barely -- but it’s far below top performers like the 5950X and 7950X. It’s not the least efficient we’ve tested, but Intel is pushing its thermal envelope and power draw higher and higher."

Now if you still want to claim otherwise, it's your prerogative, but it's also deeply flawed.
The myth lives on, doesn't it?

Restricted to 125W it has similar efficiency to the 7950X3D and far better than the 7950X. Please, the data is there.

[Image: efficiency-multithread.png]
 

Simon_78

Distinguished
Aug 24, 2016
10
15
18,515
The myth lives on, doesn't it?

Restricted to 125W it has similar efficiency to the 7950X3D and far better than the 7950X. Please, the data is there.

[Image: efficiency-multithread.png]
It's also much slower in that configuration.

A stock 7950X3D consumes 140W in MT (source: https://tpucdn.com/review/amd-ryzen-9-7950x3d/images/power-multithread.png ). If both have the same points per watt and the 14900K is at 125W, it means the 7950X3D finishes the task ~140/125 = 1.12, i.e. 12% faster.

To be clear, at approximately ISO efficiency:

7950X3D @ 140W × 253.3 points/W = 35,462 points
14900K @ 125W × 250.9 points/W = 31,363 points

35,462 / 31,363 = 1.1307, i.e. ~13.1% faster for the 7950X3D.

With a bit of clever math (and MyCurveFit, source: https://mycurvefit.com/ - reference for that particular curve: https://imgur.com/6HcwGWe ) we can extrapolate the 14900K's performance at 140W, given the data points at 253W, 200W, 125W, 95W, 65W and 35W on TPU's curve, which lands around 31,882 points.

So at ISO power:

7950X3D @ 140W = 35462
14900K @ 140W = 31882

11.2% faster for the 7950X3D

With that data we can also extrapolate, for fun, the points per watt at 140W for the 14900K to 227.7, giving:

7950X3D = 253.3 Points per Watt
14900K = 227.7 Points per Watt

And at ISO performance (around 35,000 points for both processors, within a few tenths of a percent) we can also arrive at:

7950X3D = 140W
14900K = 253W

80.7% higher power consumption for the 14900K.

In conclusion:

- At ISO efficiency the 7950X3D is faster by ~13%.

- At ISO power the 7950X3D scores ~11% higher.

- At ISO performance the 7950X3D consumes ~44% less power (or the 14900K consumes 80.7% more, which is the same thing).
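
If anyone wants to double-check the arithmetic, here's a minimal sketch that just replays the ratios above from the quoted numbers (note that the 14900K score at 140W is the curve-fit extrapolation mentioned earlier, not a measured result):

```python
# Replays the ISO comparisons above from the quoted data points.
# The 14900K score at 140 W is the curve-fit extrapolation mentioned
# above, not a measured result.

x3d_watts, x3d_score = 140, 140 * 253.3      # 7950X3D stock: ~35,462 points
k_125_watts, k_125_score = 125, 125 * 250.9  # 14900K @ 125 W: ~31,363 points
k_140_watts, k_140_score = 140, 31_882       # 14900K @ 140 W (extrapolated)
k_iso_watts = 253                            # 14900K power at ~35,000 points

# ISO efficiency: same points/W, different power budgets
print(f"ISO efficiency:  7950X3D {x3d_score / k_125_score - 1:.1%} faster")

# ISO power: both at 140 W
print(f"ISO power:       7950X3D {x3d_score / k_140_score - 1:.1%} faster")

# ISO performance: ~35,000 points each, compare power draw
print(f"ISO performance: 14900K {k_iso_watts / x3d_watts - 1:.1%} more power, "
      f"7950X3D {1 - x3d_watts / k_iso_watts:.1%} less")
```

Running it reproduces the ~13%, ~11% and 80.7%/~44% figures in the conclusions above.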


You can dance around reality all you want and try to distort the facts to suit your argument; it doesn't matter.

Whether or not you dispute those proven, indisputable facts, which have been confirmed over and over again by the whole professional reviewing community and beyond, is irrelevant.

BTW, are you related to the guy running Userbenchmark? I'm getting similar vibes here. I just hope this demonstration gets you back on track with reality, unlike that guy, who seems pretty hopeless.

I rest my case, it is ironclad and I'm done here, there's nothing left to say.

Note: Sorry abufrejoval, I misquoted, my bad.
 
I rest my case, it is ironclad and I'm done here, there's nothing left to say.
You made up a bunch of numbers which are not accurate (your picture also uses Blender power consumption, not Cinebench, though they're close to the same). Your overall point is still mostly accurate (the 7950X3D has better perf/W in multi-threaded work, but @TheHerald didn't say otherwise either), but basically none of your math is an accurate reflection of reality.

Below are the actual numbers for Cinebench R23 Multi if you don't understand why I said what I did.
[Image: cinebench-multi.png]