Also depends highly on your PSU. An 80+ (White) is only 80% efficient, so if one card pulls 50w extra inside the pc, that's closer to 63w extra from the wall just for the gpu, which can add up in a hurry for someone like Lafong. If your PSU is Platinum rated, that's around 92% efficient at typical loads, so that 50w extra would only be about 54w from the wall, which still adds up, just not nearly as fast.
Total: if your pc is using 250w, that's about 313w from the wall with a White or about 272w with a Platinum, so roughly a 40w savings. With the 50w-hungrier card, that's 300w = 375w vs 300w = 326w, so closer to a 50w savings - the less efficient the PSU, the more every extra watt the card pulls costs you.
That's roughly one 60w incandescent bulb turned off, or ~10x overhead LED lights, every hour.
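The underlying math is just wall draw = DC draw / efficiency. A quick sketch (the efficiency figures here are assumptions taken from the 80 Plus spec at ~50% load, roughly 80% for plain 80+ White and 92% for Platinum on 115v, not measured numbers for any particular unit):

```python
# Wall-side power = component (DC) power / PSU efficiency.
def wall_watts(dc_watts, efficiency):
    return dc_watts / efficiency

# Efficiency values assumed from the 80 Plus spec (115v, ~50% load):
# plain 80+ "White" = 0.80, Platinum = 0.92.
for label, eff in [("White", 0.80), ("Platinum", 0.92)]:
    base = wall_watts(250, eff)      # 250w system
    loaded = wall_watts(300, eff)    # same system with a 50w-hungrier card
    print(f"{label}: {base:.0f}w -> {loaded:.0f}w from the wall "
          f"(+{loaded - base:.0f}w)")
```

Same formula for any PSU: the worse the efficiency, the more each extra watt of GPU draw costs you at the wall.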
Over the course of a monthly bill, the energy difference adds up to not much more than a hamburger, ± the cheese.
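To put a number on the hamburger, here's a rough monthly-cost sketch (the hours per day and the $0.15/kWh rate are assumptions, plug in your own usage and local rate):

```python
# Rough monthly electricity cost of drawing extra_watts more at the wall.
# Assumed: 4 hours of gaming per day, $0.15 per kWh -- adjust for your situation.
def monthly_cost(extra_watts, hours_per_day=4, price_per_kwh=0.15, days=30):
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

print(f"50w extra: ${monthly_cost(50):.2f}/month")
print(f"50w extra, pc on 24/7: ${monthly_cost(50, hours_per_day=24):.2f}/month")
```

Even the worst case here stays in fast-food territory, which is the point: the difference is real but small.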
They show both 6700 XT cards drawing an identical 7 watts at idle, which is actually slightly lower than the 3060 Ti's 9 watts, while both drew 21 watts during video playback, slightly higher than the 3060 Ti's 17 watts. So typical desktop power draw across all of these cards is likely to be fairly similar.
They did find the 6700 XT cards to draw more power when idling with 2 monitors running at different resolutions though (which can drive up power draw on some cards), at 33 watts versus the 3060 Ti's 17 watts. That could add up if you leave your computer on most of the time with certain multi-monitor configurations, but is probably more of a niche scenario.
And under their tested gaming load (running Cyberpunk 2077 at 1440p Ultra without RT), both the reference 6700 XT and the Nitro+ drew 221 watts, while the 3060 Ti drew 199. That's only a 22 watt difference, or 11% higher power draw, while they showed the 6700 XT pushing around 15% higher frame rates under those same conditions. And across their entire test suite of 22 games, they found the 6700 XT cards to be almost 9% faster on average than a 3060 Ti in games without RT enabled at 1440p, so the higher power draw roughly matches the card's higher performance. It should be noted, though, that in most games with RT effects enabled, the 3060 Ti tends to perform notably better, although they didn't do RT power testing.
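One way to sanity-check the "power roughly matches performance" point is frames per watt. Using the review's Cyberpunk numbers (the fps values below are normalized, since the article quotes percentages rather than raw frame rates):

```python
# Perf-per-watt from the Cyberpunk 1440p Ultra (no RT) figures:
# 6700 XT ~15% faster at 221w, 3060 Ti as the baseline at 199w.
fps_3060ti = 100.0               # normalized baseline fps
fps_6700xt = 115.0               # ~15% faster per the review
ppw_3060ti = fps_3060ti / 199    # frames per watt
ppw_6700xt = fps_6700xt / 221
print(f"6700 XT perf/watt relative to 3060 Ti: {ppw_6700xt / ppw_3060ti:.2f}x")
```

So in that one game the 6700 XT actually comes out slightly ahead on efficiency, not just raw performance, at least without RT.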
The only part of their testing where the Nitro+ drew more power than a reference 6700 XT was the FurMark stress test, where the Nitro+ went up to 254 watts versus the reference card's 217 watts and the 3060 Ti's 197 watts. So it seems stress-testing was the only way they got the card to approach its higher power limit. Some games or other workloads might also manage to draw more power on that card, but judging by the Cyberpunk testing, along with the card's very similar performance to the reference model across their test suite, that's probably not the norm. They did, however, show both 6700 XTs drawing notably more power than the 3060 Ti when enabling 60Hz V-Sync at 1080p in Cyberpunk to cap frame rates at 60fps: 125-130 watts compared to just 73 watts for the 3060 Ti. So in frame-limited, moderately demanding workloads, the 3060 Ti might draw less power, though it's hard to say from testing just one game under those conditions.
Overall, I don't think the power draw differences between any of these cards will make too much of a difference to how much they cost in electricity. On the desktop, power draw should in most cases be similarly low between all of them, with an exception possibly being some multi-monitor setups, where the 6700 XTs might draw a little more. In games, the power draw of the 6700 XTs may be a little higher, but performance should be a little higher as well, at least outside of raytracing. Power draw in something like Unreal Engine is hard to say, and might depend on how much demand the editor is placing on the hardware.