Numerous things.
The display you are using matters, and so does its refresh rate. Having a second display hooked up to the GPU, even if it's turned off, can also affect this.
Is FreeSync on? Is the system really idle, or are there other processes running?
Any tweaking done on the system? Some people turn off hardware acceleration in their browsers, etc.
There are so many variables to it.
Yes, there are lots of variables. But interestingly, when using a single monitor, Nvidia's power draw tends to be pretty much the same across multiple different displays. It seems as though AMD hasn't properly tuned its various GPU firmwares / drivers to better optimize for lower idle power. I don't have a ton of different monitors to test with, so I can only report my findings with either four different 4K 60Hz displays, a 4K 144Hz display (which is what I used), or a 4K 240Hz (with DSC) display. Switching monitors would require rearranging a bunch of other stuff as well, however, as the monitor footprint is larger on the last one.
So to reiterate:
1) The idle value is full idle, while the monitor is on and showing content, but everything is static (no open windows that aren't minimized). Long idle where the display powers off is a different metric and tends to be even lower because the card really doesn't need to do anything. Most GPUs should have long idle power use of <10W, assuming there's no background GPU compute task running, but I haven't tried to test this.
2) G-Sync or FreeSync (depending on the GPU) is disabled, even though the monitor supports VRR. This is done for consistency. Why force a constant 144Hz refresh rate? Because that's more of the "worst case scenario," and high-end GPUs will often be paired with high-end, high refresh rate monitors. Also, it's one less variable (i.e. VRR on or off introduces variability). Resolution is 3840x2160 4:2:2 YCbCr, though, because the monitor doesn't have DSC support and you can't do 144Hz at 4K with full 4:4:4 color (there's a rough bandwidth sketch after this list).
3) Test system is Core i9-13900K, the same in both cases. Power use is measured via PCAT v2 while idle, while decoding a 4K30 AV1 video in VLC, and while playing that same video off YouTube. The video, if you want to look, is this one:
https://www.youtube.com/watch?v=qZ4n-0162nY
Power data logging starts right after the black screen disappears and Houston Jones starts talking (~3 seconds into the video) and stops 60 seconds later (where Scott says Girthmaster 5000 or whatever); a small sketch of averaging that window out of a logged capture also follows below.
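
On point 2, here's a quick back-of-the-envelope check on why 4K 144Hz with full 4:4:4 color won't fit over DisplayPort 1.4 without DSC, while 4:2:2 will. The ~9% blanking overhead and the DP 1.4 payload figure are my own approximations rather than measured timings from this exact monitor, so treat it as a ballpark sketch:

```python
# Rough link-bandwidth check: why 4K 144Hz needs chroma subsampling without DSC.
# Blanking overhead (~9%) and the DP 1.4 payload figure are approximations.

ACTIVE_W, ACTIVE_H, REFRESH = 3840, 2160, 144
BLANKING_OVERHEAD = 1.09          # assumed reduced-blanking timing overhead
DP14_PAYLOAD_GBPS = 25.92         # 4x HBR3 lanes (32.4 Gbps) after 8b/10b encoding

pixel_clock = ACTIVE_W * ACTIVE_H * REFRESH * BLANKING_OVERHEAD  # pixels per second

formats = {
    "RGB / YCbCr 4:4:4, 8-bit": 24,   # bits per pixel on the wire
    "YCbCr 4:2:2, 8-bit": 16,
}

for name, bpp in formats.items():
    gbps = pixel_clock * bpp / 1e9
    verdict = "fits" if gbps <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"{name}: ~{gbps:.1f} Gbps -> {verdict} within ~{DP14_PAYLOAD_GBPS} Gbps")
```

With those assumptions, 8-bit 4:4:4 comes out around 31 Gbps (too much for DP 1.4's ~26 Gbps of payload), while 8-bit 4:2:2 lands around 21 Gbps, which is why the subsampled mode is the one that works at 144Hz.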
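And on point 3, this is roughly how that 60-second YouTube window could be averaged out of a logged power capture. The column names ("time_s", "power_w") and the file name are placeholders I've made up for illustration; an actual PCAT/FrameView export has its own headers, so adjust accordingly:

```python
import csv

def average_power(csv_path, start_s=3.0, duration_s=60.0):
    """Average the logged board power between start_s and start_s + duration_s."""
    samples = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            t = float(row["time_s"])                   # seconds since capture start (assumed column)
            if start_s <= t <= start_s + duration_s:
                samples.append(float(row["power_w"]))  # board power in watts (assumed column)
    return sum(samples) / len(samples) if samples else float("nan")

# Example: print(f"YouTube playback average: {average_power('capture.csv'):.1f} W")
```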
AMD's behavior on various GPUs is quite odd in terms of power use, and appears to fluctuate a lot with monitor, refresh rate, and resolution AFAICT. The RX 7600 uses more power while idle than an RX 7900 XT, and the selected refresh rate definitely matters. 4K 60Hz idle power is ~8W on the 7600, but anything higher (75Hz, 82Hz, 98Hz, 120Hz, 144Hz) jumps up to around 17W. Idle power on the 7900 XT and XTX is generally the same at 4K, regardless of refresh rate (maybe 1~2W lower at 60Hz). The 7800 XT and 7700 XT both seem to have higher power use above 82Hz, and I think the 7900 GRE had lower power at 120Hz and below, so it only "didn't like" 144Hz, where power use jumped by maybe 15W.