Question: Will the RTX 3000 series require more watts than the 2000 series?

King_V

Distinguished
I think it's hard to say.

Electronics have been getting more efficient.

For example... if you look at the GTX vs the RTX series...

The GTX 1080 Ti consumes the same amount of power as the RTX 2080 Ti.
Slightly more, no? By a small amount, 260W vs 250W. Then again, same process node, but more performance, so, not entirely surprising, though they seem to have kept the need for more power to a minimum with the added efficiency of their new architecture.

After all, the 1080Ti's performance is about where the 2070 Super is, and the 2070 Super only needs about 215W to get the 1080Ti's performance.
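As a rough back-of-the-envelope check of that claim (using the thread's own ~250W and ~215W figures, which are estimates, not measurements), the efficiency gain at roughly equal performance is just the ratio of the two power draws:

```python
# Rough perf-per-watt comparison at roughly equal performance,
# using the wattage figures quoted in this thread (estimates).
gtx_1080_ti_watts = 250    # GTX 1080 Ti reference TDP
rtx_2070_super_watts = 215  # approximate draw for similar performance

# With performance normalized to 1.0 for both cards, the efficiency
# gain is simply the ratio of power draws, minus one.
efficiency_gain = gtx_1080_ti_watts / rtx_2070_super_watts - 1
print(f"~{efficiency_gain:.0%} better perf/W")  # ~16% better perf/W
```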
 
The reference specification on the RTX 2080 Ti is 250 watts.
https://hexus.net/tech/news/graphics/110969-nvidia-geforce-gtx-1070-ti-specs-price-availability-leak/
 

King_V

Distinguished
From your link:
Mind you, the system wattage is measured at the wall socket side and there are other variables like PSU power efficiency. So this is an estimated value, albeit a very good one. Below, a chart of relative power consumption. Again, the Wattage shown is the card with the GPU(s) stressed 100%, showing only the peak GPU power draw, not the power consumption of the entire PC and not the average gaming power consumption.
So this single sample, using one methodology and not accounting for different throttling mechanisms across architectures, and so forth, is not the be-all and end-all.

Otherwise, why does Nvidia themselves rate the 1080Ti at 250W, and the 2080Ti at 260W? What benefit would Nvidia gain from "pretending" that the newer card draws more power than the old one?


I suspect that more real-world testing would probably give results similar to what is seen here (which is a much more recent test):

https://www.tomshardware.com/features/graphics-card-power-consumption-tested
 

King_V

Distinguished
Oooh, crap . . . forgot about the reference spec being lower than the Founders Edition spec.

Still, even at the reference 250W, the 2080 Ti isn't drawing less than the 1080 Ti's 250W.

OTOH - if we compare the Founders Edition version of both generations (using the Tom's article I linked), we're looking at about a 20W gap on Metro Exodus, and a 25W gap on FurMark.

I wonder if they're equal when using a board partner's reference, non-OC'd board, or if it falls somewhere between the two, given that the real-world usage gap is more than the on-paper 10W difference.
 
10 watts is kind of "in the noise" when we are looking at 250 watts anyway.
...and with the tolerances on electrical components being what they are, I wouldn't be surprised to see 10-watt differences within the same batch of cards.
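To put that "in the noise" point in perspective, a 10W spread against a 250W TDP works out to:

```python
# Relative size of a 10 W card-to-card difference against a 250 W TDP.
tdp_watts = 250
delta_watts = 10
relative = delta_watts / tdp_watts
print(f"{relative:.0%} of TDP")  # 4% of TDP
```

A 4% spread is plausibly within normal component tolerance and measurement variation, which is the point being made above.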
 
