Uh, did you miss the chart that showed the power draw? The "renders per day per watt" chart literally measures work per watt. That's how it wins on efficiency.
Also, look at the Handbrake and Y-cruncher power draws: 31 and 33 watts versus Intel's 53 and 54 watts.
That means the 4100 uses 58.5% of the 12100's power in Handbrake and 61.1% of it in Y-cruncher.
So if the 4100 were only 58.5% and 61.1% as fast as the Intel, it would merely break even on efficiency. Any faster than that, as it turned out to be, and it wins on efficiency. The data is literally right there.
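If it helps, here is that arithmetic spelled out as a trivial Python sketch, using only the 31/33 W and 53/54 W figures quoted above; the performance side is whatever the charts say, the snippet just shows where the break-even point sits.

```python
# Full-load power figures quoted above (watts)
power_4100  = {"Handbrake": 31, "Y-cruncher": 33}
power_12100 = {"Handbrake": 53, "Y-cruncher": 54}

for test in power_4100:
    ratio = power_4100[test] / power_12100[test]
    # Efficiency = work / energy. If the 4100 delivers at least `ratio` of the
    # 12100's performance while drawing `ratio` of its power, work-per-watt ties;
    # anything above that and the 4100 wins on efficiency.
    print(f"{test}: the 4100 draws {ratio:.1%} of the 12100's power, "
          f"so {ratio:.1%} of its performance is the break-even point.")
```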
It's not utter nonsense. It's mathematical reality.
Does it really provide around 60% of the rendering capability? If not, it's still bull. And considering gaming power consumption wasn't even measured, and going by my 12700K, I doubt the 12100 will lose there. Hint: my 12700K plays in that ballpark when gaming despite outright dwarfing both of these CPUs in literally everything. From what I see, the 4100 has around 58% of the 12100's performance according to the text, so I find it somewhat baffling that this is supposed to translate into higher power efficiency. It should be a wash, as you stated yourself.
As stated above, neither of these CPUs makes any sense for rendering workloads anyway, so it's questionable whether that's even a useful metric here. Other reviewers like Igor's Lab also back up Alder Lake's gaming efficiency, so no, it's not just anecdotal. And if you actually look at the graph, the 12100 sits quite a bit further towards the "lower left corner" that, according to the text, is the ideal spot. Maybe that graph should just be left out completely if it's this misleading; it has been for every Intel and AMD CPU in it so far.

Looking only at the highest possible power draw is misleading in general, and disingenuous, since it ignores 90% of use cases and 90% of a CPU's uptime. Fun fact: unless you jump into heavy rendering tasks the second the computer is turned on and shut it down the moment you're finished, it's highly unlikely that maximum draw means much for you. If you work on your computer for 8 hours a day but only render for 1 of those hours, there are 7 hours during which it won't draw anywhere near its maximum. Unfortunately, nobody has reviewed these chips like that so far, so we don't know how much they actually draw, which honestly annoys me a bit. I did find a figure of 9.5 W for the 12100, but nothing for the 4100.
https://www.hardwareluxx.de/index.p...3-12100f-ist-der-neue-p-l-koenig.html?start=8
Unfortunately it's in German, but that shouldn't stop you from looking at the graph. I'm not sure how accurate it is, though. It would also be better to have both CPUs in one graph, but oh well.
However, unless the 12100 draws more at idle, which would make it an outlier for Alder Lake, just dropping back to idle would help balance things out here, if not outright negate the higher draw. If in a theoretical workload the 12100 needs 45 minutes and the 4100 needs 60, that higher draw for the 12100 means jack (rough sketch below). Again, we're talking about a workload that is neither the norm nor something people do for the majority of their time; it shouldn't ever be used to make a definitive judgement, and that applies to every CPU on the market. Gaming consumption needs to be more prominent; it's what most people use their computers for, after all.
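To put that 8-hour-day point into numbers, a rough sketch, not a measurement: it assumes the ~9.5 W figure above is the 12100's idle draw and, purely as a placeholder, that the 4100 idles at the same level, since nobody has published a number for it. Active draws are the Handbrake figures, and the render times are the theoretical 45 vs 60 minutes.

```python
def session_energy_wh(active_w, active_h, idle_w, window_h):
    """Watt-hours over a fixed window: one active stretch, idle the rest of the time."""
    return active_w * active_h + idle_w * (window_h - active_h)

window = 8.0  # an 8-hour workday

# Active draws = Handbrake figures from the review; render times are the
# hypothetical 45 vs 60 minutes. 9.5 W idle for the 12100 is the figure from the
# hardwareluxx graph; the 4100's idle draw is an assumption (no published number).
e_12100 = session_energy_wh(active_w=53, active_h=0.75, idle_w=9.5, window_h=window)
e_4100  = session_energy_wh(active_w=31, active_h=1.00, idle_w=9.5, window_h=window)

print(f"Over {window:.0f} h: 12100 ~ {e_12100:.1f} Wh, 4100 ~ {e_4100:.1f} Wh")
# -> roughly 108.6 Wh vs 97.5 Wh: about an 11% gap over the day, versus the
#    71% gap the peak-draw comparison suggests.
```

With those placeholder numbers the 12100 still ends up with a bit more total energy for the day, but the gap is far smaller than the peak-draw comparison implies, and the longer the machine sits idle, the smaller the relative gap gets. How it actually shakes out depends on the idle draws nobody has measured yet.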