I have an EVGA FTW3 3080 10GB on the quiet BIOS. As a test I did a clean boot of Win10 (latest version) with my 55-inch LG CX 4K TV at 120 Hz, my 27-inch ViewSonic Elite XG270QG at 120 Hz, and my 31.5-inch Samsung 4K U32H85x at 60 Hz. I opened HWiNFO, and over a 30-minute average of sitting AFK with nothing happening on my computer, total GPU power was 99 W average, with an 89 W minimum and a 109 W maximum.
Thanks!
Are you trying to tell me that I am wrong due to the monitoring software, or what?
No, I'm just saying that it shouldn't take that much power to read framebuffer values from memory and send them out to a display, even if the output is YUV. So there's something going on that I definitely don't understand.
For reference, a little Raspberry Pi can drive 2x 4K monitors @ 30 Hz, and its GPU is 28 nm garbage. The whole device shouldn't use even 10 W doing that.
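To put a rough number on the scanout claim above, here's a quick ballpark of the memory bandwidth needed just to read out the three framebuffers each refresh. This is a sketch only: it assumes 4-byte (8-bit RGBA) pixels and ignores blanking intervals, HDR formats, and any compression.

```python
# Ballpark scanout bandwidth for the three displays in this thread.
# Assumption: 32-bit (4-byte) pixels, blanking and compression ignored.

displays = [
    ("LG CX 4K @ 120 Hz",      3840, 2160, 120),
    ("XG270QG 1440p @ 120 Hz", 2560, 1440, 120),
    ("U32H85x 4K @ 60 Hz",     3840, 2160,  60),
]

BYTES_PER_PIXEL = 4  # assumed 8-bit RGBA framebuffer

total = 0
for name, w, h, hz in displays:
    bw = w * h * BYTES_PER_PIXEL * hz  # bytes/second read for scanout
    total += bw
    print(f"{name}: {bw / 1e9:.2f} GB/s")

print(f"total: {total / 1e9:.2f} GB/s")
```

That comes out to under 8 GB/s total, a small fraction of a 3080's roughly 760 GB/s of memory bandwidth, which is why reading and sending out framebuffers by itself doesn't look like it should cost much power.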
My total power pulled at the wall for my entire PC in this state also says 158 watts. Is that wrong too?
If the GPU figure is correct, then that sounds about right.
I said specifically that the lack of competition was the reason for the industry's woes. I skipped the exact explanation that you so aptly made because it would be the 3rd or 4th time I have made the same such post on forums in the past month, and I can no longer be bothered going into why the competition sucks.
You can put it in your profile, for easy access, and then just drop a link to it each time it comes up. Or just bookmark your best post on it and drop that link, instead of repeating yourself.
Anyway, it's your speculation vs. mine. I know a little about graphics APIs and I know a GPU driver developer, so mine is somewhat informed. But, you're entitled to your opinions, even if they're wrong.
; )