[Discussion] Ampere power consumption improvements

The new 30 series cards are power hogs and put out a ton of heat; we all know that.
We also know they offer very good performance compared to last gen.

Here's a question that's been on my mind:
The 3070 is similar in performance to the 2080 Ti, but it's much newer and on a smaller node, so you'd imagine it would draw less power. It actually doesn't; its power consumption is very similar.

So did Ampere just let Nvidia run more juice through the chips, but with the same performance per watt?

edit: Just checked, and the 3070 actually draws about 50-70 watts less, but that's still not a lot.
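
For a rough sense of what "same performance per watt" would mean here, a quick back-of-the-envelope sketch (using the official board powers, 250 W for the 2080 Ti and 220 W for the 3070, and treating the two cards' performance as roughly equal, which is a simplification on my part):

```python
# Back-of-the-envelope perf-per-watt comparison.
# Board power: official TDPs (2080 Ti = 250 W, 3070 = 220 W).
# Performance is normalized to the 2080 Ti; treating the 3070 as equal
# is a simplification (reviews put them roughly neck and neck).
cards = {
    "RTX 2080 Ti": {"perf": 1.00, "tdp_w": 250},
    "RTX 3070":    {"perf": 1.00, "tdp_w": 220},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['tdp_w'] * 100:.2f} perf per 100 W")

# At equal performance, the efficiency gain is just the inverse power ratio:
print(f"3070 perf/W gain: ~{(250 / 220 - 1) * 100:.0f}%")  # ~14%
```

So by TDP alone it's maybe a ~14% efficiency bump, which feels small for a node change.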

My theory (and that's all it is, really) is that it's a combination of Samsung's 8 nm process and the goals Nvidia set for itself.

Nvidia loves their profit margins, so I'm guessing they settled on the 8 nm process because it was cheap and would be 'enough' to stay ahead of AMD. I keep hearing that Samsung's 8 nm process is basically a slightly improved version of their older 10 nm process, which doesn't seem like a huge step up from the 12 nm process Nvidia used for Turing. By the time Nvidia realized that the performance/efficiency gains from the node shrink weren't enough, it was too late to change nodes. The only real way to increase performance beyond the lackluster improvements they were seeing in internal testing was to crank up the power.

They decided to go for it, figuring that most gamers care more about generational performance improvements than generational efficiency improvements. After all, AMD spent years making (and selling) GPUs that ran a lot hotter and drew a lot more power than equivalent Nvidia offerings (the RX 580/RX 590 come to mind). Even today, some people still buy the RX 580 because it's cheap and works fine.

In theory, this could have all worked out just fine for Nvidia: they'd have both their higher profit margins and a whole lot of people lining up to buy their significantly more powerful GPUs, with the only real downside being the increased electricity the new cards would draw (which the customers would pay for, not Nvidia, so they'd be fine). The problem is, now we have new AMD GPUs on the way that just might outperform their Nvidia counterparts at a lower price and with lower power consumption. Nvidia didn't see that coming, which is why they're not in the best position right now. If they had known how competitive AMD would be this time around (it's been a while, after all), they probably wouldn't have even considered the 8 nm Samsung node.
 
edit: Just checked, and the 3070 actually draws about 50-70 watts less, but that's still not a lot.

That's about right. Take this as an example: the 1070's power reduction from the 980 Ti is around 66 watts (for an average gaming session):

[image: power_average.png — average gaming power draw chart, GTX 980 Ti vs GTX 1070]

And that was going from 28 nm to 16 nm.
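
To put that 66 W in the same terms as the 50-70 W above, here's a rough percentage sketch (the baseline draws are my own ballpark assumptions for the older card's average gaming power, so treat the exact percentages loosely):

```python
# Rough comparison of power saved per node shrink at similar performance.
# The savings come from this thread; the baseline draws are assumptions
# (ballpark average gaming power for the older card in each pair).
shrinks = [
    # (shrink label, assumed baseline W, reported saving W)
    ("28 nm -> 16 nm (980 Ti -> 1070)",  235, 66),
    ("12 nm -> 8 nm (2080 Ti -> 3070)",  250, 60),  # midpoint of 50-70 W
]

for label, base_w, saved_w in shrinks:
    print(f"{label}: {saved_w} W saved, ~{saved_w / base_w:.0%} less power")

# Both land in the same rough 25% ballpark, so Ampere's saving isn't
# out of line with what a node jump has delivered before.
```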

My theory (and that's all it is, really) is that it's a combination of Samsung's 8 nm process and the goals Nvidia set for itself.

It was not just about being cheap. Nvidia has been scouting Samsung as an alternative fab since 2012, and after 8 years they seriously need Samsung to be capable of making their big GPUs, not just their small ones. Nvidia's ultimate goal is to have a fab they can switch to when one fab fails to deliver what they want. And because that goal means having both Samsung and TSMC capable of making their big GPUs, TSMC can be Nvidia's backup plan if AMD ends up being more competitive than they expected; hence the rumor we've already heard about Nvidia refreshing Ampere at TSMC next year.