mmmmMehhhhh . . . .
Vegas GPU compute on line one . . .
My CAD does OpenGL, too
I didn't say Nvidia was always faster for professional use cases, but the professional market is something like 90–95 percent Nvidia from what I understand. AMD is doing better at getting into supercomputers, but its workstation aspirations have been floundering for years. As an example, I've personally known lots of engineers over the past 15 years, and I have literally never been to a business using professional workstations where they had AMD cards. Never. Dozens of companies with thousands of workstations, and they were all Nvidia shops. I'm sure there are places that do use AMD GPUs, but they're simply far less common in the real world.
You'll note that I point out AMD's gains in SPECviewperf 2020 scores in the review. That's because of those new OpenGL drivers. But improving performance and gaining market share don't happen overnight. There's a lot of history and momentum that AMD needs to overcome.
Seeing how much wattage these GPUs use in a loop is interesting, but it still tells me nothing about real-life cost. Cloud gaming suddenly looks more attractive when I realize I won't need to pay to run a GPU at 300 watts.
The running cost of a GPU should now be part of reviews, imo, considering how much people in Europe, Japan, and Southeast Asia are now paying for electricity and how much these new GPUs consume.
Household appliances with similar power usage usually have their running costs discussed in reviews.
Household appliances usually post numbers that are "optimized" to make them look better, but they're also far more predictable. "A refrigerator in a 70F house needs to run xxx hours per day to stay cool." Simple. "We assume you'll watch two hours of TV per day" on the other hand may or may not match your actual usage. For a PC, it's going to be all over the map, and I give the power figures precisely so you can decide how much it would actually cost you. I suppose I didn't explicitly list idle power, which you can sort of see on the far left of the power line charts.
Anyway, let's say a "normal" user runs their PC for 10 hours per day. Of that time, seven hours is more or less idle power — surfing the internet, watching YouTube, etc. Only three hours is high impact gaming. The idle power use of the 4080 is around 20W.
That means you get 7 hours * 20W = 140Wh, plus 3 hours * 300W = 900Wh. Basically, about 1kWh per day of power gets used by the RTX 4080. That's $0.09 per day where I live, I think, but in CA or NY it might be triple that, and in parts of Europe it would be up to five times that much.
Even so, $0.50 per day is $15 per month or around $180 per year. If you can afford to go out and buy a $1,200 graphics card, you're probably not going to worry much about the $180 a year in added electricity costs. Or you can get a game streaming service like Stadia... oh, wait. Hmmm. Well, GeForce Now is an option, and you'll only need to pay $200 a year in subscription fees for access to an RTX 3080 in the cloud or whatever.
But if you play games ten hours per day, every day of the year? Yeah, electricity costs will be a lot higher. $0.27 per day for me, up to $1.50 per day for some regions of the world. "OMG YOU'RE SPENDING $550 A YEAR ON POWER FOR YOUR GAMING ADDICTION!" And some people go out and buy muscle cars that get 10 miles per gallon. ¯\_(ツ)_/¯
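For anyone who wants to plug in their own numbers, here's a quick back-of-the-envelope sketch in Python using the same assumptions as above (roughly 20W idle, 300W gaming on an RTX 4080, and a few example electricity rates pulled from this thread, not anything official):

```python
# Rough GPU electricity cost estimator, using the figures from the posts above.
# All wattages, hours, and rates are assumptions; swap in your own numbers.

IDLE_WATTS = 20      # approximate RTX 4080 idle draw
GAMING_WATTS = 300   # approximate RTX 4080 gaming draw

def daily_kwh(idle_hours, gaming_hours):
    """Energy used per day, in kWh."""
    return (idle_hours * IDLE_WATTS + gaming_hours * GAMING_WATTS) / 1000

def yearly_cost(idle_hours, gaming_hours, rate_per_kwh):
    """Electricity cost per year at the given $/kWh rate."""
    return daily_kwh(idle_hours, gaming_hours) * rate_per_kwh * 365

# Example rates ($/kWh): cheap US, roughly CA/NY, and a high European rate.
rates = {"cheap US": 0.09, "CA/NY": 0.27, "Europe (high)": 0.50}

for label, rate in rates.items():
    normal = yearly_cost(idle_hours=7, gaming_hours=3, rate_per_kwh=rate)
    heavy = yearly_cost(idle_hours=0, gaming_hours=10, rate_per_kwh=rate)
    print(f"{label}: ~${normal:.0f}/yr (3h gaming/day), ~${heavy:.0f}/yr (10h gaming/day)")
```

Run it and you get the same ballpark as above: roughly $35 to $190 per year for the 7-hours-idle, 3-hours-gaming case, and around $100 to $550 per year for the ten-hours-a-day scenario, depending on your rate.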