It's no myth. The RX 7900 XT does use an extra 120W over the RTX 4070. To be fair though, the RTX 4070 isn't even in the same tier; the RX 7900 XT is a whopping 38% faster.
The real competitor to the RX 7900 XT is the RTX 4070 Ti, which itself uses an extra 80W over the RTX 4070. So comparing the RX 7900 XT to a card from a lower performance tier isn't exactly fair, especially since the RX 7900 XT is still 11% faster than the RTX 4070 Ti, the card that sits one tier above the RTX 4070.
Having said that, the RX 7900 XT still uses 50W more than the RTX 4070 Ti but that's not a big difference for two cards in the high-end performance tier.
Even with the 120W gaming difference, if you live in the UK, where electricity is VERY expensive, you're looking at only about 4p extra per hour, which is roughly 5¢. In the USA, you'd be looking at a "whopping" 2¢/hr difference. Here in Canada, it's half of that at 1¢/hr. Not really enough to be concerned about, eh? It's far worse to have a power-hungry CPU than a power-hungry GPU, because the CPU is ALWAYS trying to run as fast as possible, while the GPU spends most of its time on tasks that barely tax it; basically anything that isn't high-end gaming, encoding or benchmarking. Playing a video, for instance, is child's play even for the weakest integrated GPUs in existence. Most applications leave your GPU almost at idle, but not your CPU.
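If you want to sanity-check that math yourself, here's a quick back-of-the-envelope sketch in Python. The per-kWh rates are my own rough assumptions picked to line up with the numbers above (roughly 34p/kWh for the UK, 17¢/kWh for the USA, 8¢/kWh for Canada), not official tariffs:

```python
# Rough cost of a 120W gaming power gap per hour of play.
# Rates below are assumed ballpark figures, not official tariffs.
EXTRA_WATTS = 120  # RX 7900 XT over RTX 4070 while gaming

rates_per_kwh = {
    "UK (GBP)": 0.34,
    "USA (USD)": 0.17,
    "Canada (CAD)": 0.08,
}

for region, rate in rates_per_kwh.items():
    extra_kwh_per_hour = EXTRA_WATTS / 1000       # 0.12 kWh every hour
    cost_per_hour = extra_kwh_per_hour * rate     # in that region's currency
    print(f"{region}: ~{cost_per_hour * 100:.1f} pence/cents per gaming hour")
```

Swap in your local rate and the gap stays firmly in pocket-change territory.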
It's a simple fact that the further up the stack you go, the more juice you're going to pull, like choosing a Mustang with a V6 or a V8. In this case, you're comparing an inline-four to a V8.
The Navi 31 GPU that the 7900 XT is based on apparently undervolts really well, and while you can do the same with the 4070, you'd likely see a bigger drop from the 7900 XT simply because it has more power to shed in the first place. It still won't get as low as the GeForce card, but it won't be as bad as it is now.
I just bought an RX 7900 XTX but I'm on vacation in Montreal so I won't get to tinker with it until Saturday night.