Question: How will Ampere scale on laptops?

kunal_1991

Oct 12, 2016
With the RTX 3080 and 3090 drawing 300-400 watts on third-party boards, will the performance delta between laptops and desktops be even greater than it was with Turing? Assuming Nvidia sticks with the Samsung 8nm node rather than 7nm, it's hard to imagine massive gains, especially since laptops can't accommodate much more than 200 watts. That seems to be the upper limit even in Alienware/MSI/Clevo machines, because anything higher is very hard to cool.
It makes sense that Pascal was the generation where laptop and desktop performance were closest, given that the GTX 1080 FE had a TDP of only 180 watts.
My guess is that laptop gains will be a lot more modest, around 30-40% over Turing, with maybe 50-60% better RTX performance from the 2nd-gen RT cores, but not the 2x claimed on desktops.
All of this applies to the 180-200 W variants, which are rare because the majority of laptops opt for Max-Q, where gains might be even lower. Ampere might not get enough breathing room in thin-and-light laptops to have a sizeable impact over Turing.
Considering that Turing still had a manageable TDP, yet some RTX 2080 Max-Q laptops were flat-out beaten by overclocked desktop RTX 2060s, I'm not overly optimistic about the mobile RTX Ampere launch. Maybe Nvidia should go back to the mobility naming scheme rather than mislead people into paying for desktop performance.
Thoughts?

SteveRX4

Sep 29, 2020
The 3080 is about 25% faster than the 2080 Ti, and the 3090 is about 15% faster than the 3080.
They are really power hungry, so they are hard to cool; you'd only get them in big, expensive laptops.
As you said, a 3080 is about 60% faster than a 2080. I'd expect that sort of scaling from the mobile versions.