The problem with AMD CPUs and GPUs is their higher power consumption at idle, and nobody is making enough noise about this. All the benchmarkers show you that AMD is more efficient under CPU load, but they don't tell you that over an average day an AMD system may end up consuming more power overall because of those higher idle watts.
@JarredWaltonGPU Is it possible for you to do an Intel vs AMD CPU comparison with a power meter connected, showing the watt-hours consumed over a typical day of web browsing, gaming, and idling? Let's say 24 hours: 14 hours idling, 4 hours gaming, and 6 hours of web surfing and YouTube playback, with Nvidia GPUs installed in both systems (to take GPU efficiency out of the equation, since we know AMD GPUs are worse there). Or better yet, use each CPU's iGPU when not gaming.
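To spell out the kind of number I'm after, here's a rough sketch in Python of the watt-hour math for that 14/4/6 split. Every wattage figure below is a made-up placeholder, not a measurement; the whole point of the request is to replace them with readings from an actual wall meter.

```python
# Rough sketch of the comparison being requested. All wattages are
# made-up placeholders -- substitute real wall-meter readings per activity.

HOURS = {"idle": 14, "gaming": 4, "web/YouTube": 6}  # the 24-hour split above

# Hypothetical average wall draw (watts) per activity, per platform.
platforms = {
    "Intel build (placeholder numbers)": {"idle": 55, "gaming": 450, "web/YouTube": 90},
    "AMD build (placeholder numbers)":   {"idle": 75, "gaming": 430, "web/YouTube": 100},
}

for name, watts in platforms.items():
    wh_per_day = sum(watts[activity] * hrs for activity, hrs in HOURS.items())
    print(f"{name}: {wh_per_day} Wh/day ({wh_per_day / 1000:.2f} kWh)")
```

Even with placeholder numbers like these, a small gaming-efficiency win can be wiped out by a 20-watt idle penalty sitting there for 14 hours, and that's exactly the effect I'd like to see measured.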
My take on this is that if you're concerned about saving all that extra energy, you shouldn't buy such hardware in the first place. Tbh I don't really buy it; I don't think the savings make enough of a difference in a world where everything is so expensive and going up and up. If I were going to game, for example, it would be on a console: less electricity cost and heat produced, and besides, GPU prices have destroyed PC gaming for me.
My 7950X, with 40 GB of RAM in use (Brave with many tabs, a heavy game paused in the background, lots of miscellaneous things open) while playing a 4K H.265 60 fps video, draws around 38-44 watts. Compare that to my ThinkPad's CPU, which is something like 10 watts IIRC.
Doing some quick and dirty math with my usage:
The PC is on every day for 12+ hours and sleeps the other 10+. During those 12 hours the CPU sits at 25-30 watts about 50% of the time (idle, not being used) and at 35-44 watts the other 50% (multitasking). If I could somehow get that 25-30 watt idle figure down to 6-10 watts, I'd save around 2,500-3,000 Wh a week. That's worth about as much as running my small AC for 5 hours a week, so not a huge impact tbh.
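For anyone who wants to redo that back-of-envelope with their own numbers, the arithmetic is just a few lines of Python. The inputs below are my rough figures from above and only count the idle half of the day, so treat the output as a ballpark that scales with however many hours and watts you plug in, not as a measurement.

```python
# Back-of-envelope idle-power savings calculator. All inputs are rough
# estimates of my own usage -- plug in your own figures.

hours_on_per_day = 12        # machine awake ~12 h/day, sleeping the rest
idle_fraction = 0.5          # roughly half of that time it just sits there

current_idle_w = 27.5        # midpoint of my observed 25-30 W idle
target_idle_w = 8.0          # midpoint of the 6-10 W I'd like to see

saving_w = current_idle_w - target_idle_w
wh_per_week = saving_w * hours_on_per_day * idle_fraction * 7
print(f"~{wh_per_week:.0f} Wh saved per week")
# Scales linearly: count more hours, or a bigger watt delta, and it grows.
```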
Sure, it's something, but then again, all I have to do is keep a few more lights off, and putting my PC to sleep during those breaks would, for me, undermine the advantage of the fabled 7-watt CPU.
Battery power, though, is when every single watt is vital, which is why, when I'm on battery, I use a laptop for my desktop work and sleep the machine every chance I get.
My 4080 Super gets 25-27 fps in Portal RTX at 1600p with no DLSS, fully native, the GPU loaded at 99% and drawing 310+ watts. Turn on DLSS and the difference in image quality is quite obvious: it gets somewhat blurry and the reflections on the dimpled textures look unrealistic. Overall it was bad enough that I actually preferred the slow frame rate. It makes me think I might as well go console gaming. It also used 12 GB of VRAM!
Since work and energy are fundamental, intimately linked physical concepts, since shrinking transistors no longer gives the performance gains it used to, and since game graphics have stagnated so badly (most games still don't have path tracing or those extreme Unreal Engine 5 tech-demo visuals that make Crysis look like an N64 game), we have to accept that we'll soon be drawing well over a kilowatt just to game, or else they'll force that DLSS nonsense on us for "damaging the environment." The PS5 is said to draw only about 200 watts. Sounds efficient, right? But it also does maybe a quarter of the work a high-end PC does. Even the 5090 won't stand much chance of doing 4K path tracing with the true next-gen graphics that Unreal Engine 5 is already capable of.