Whoa. So is the rear fan hanging off the PCB?
If you want to be a leader, you need "Exotic Cooling Solutions" that push trends. More info on it here:
https://www.tweaktown.com/news/7308...ler-rumored-to-cost-150-on-its-own/index.html
The leak says it's a shorter PCB with a cutout in it, and that this won't be the reference PCB design that AIBs get. If true, the Founders Edition will no longer be the go-to version for water cooling. I don't understand why Nvidia would develop such an exotic and reportedly expensive cooling solution for their own branded cards.
> I was being a bit sarcastic when I called it a decent PSU; I wanted some eyes to roll.

Anything that's Titanium rated is more than just a decent PSU, it's top of the line.
And Seasonic is the best in the game; it's why I chose my PSU, which is built on the same platform as the Prime, just with a Corsair label on it.
Meh, 320W is insane. I thought power consumption would go down, or at least stay the same, when moving to a smaller process node.
> ... I don't understand why Nvidia would develop such an exotic and reportedly expensive cooling solution for their own branded cards.

We have to keep in mind the greater thermal densities of the die shrinks. Like how Pascal is 16nm and Turing is 12nm... Ampere is 7nm.

> We have to keep in mind the greater thermal densities of the die shrinks. Like how Pascal is 16nm and Turing is 12nm... Ampere is 7nm.

Those TDPs are pretty high for mainstream gaming cards ... it would seem odd to me if those were their original TDP targets.
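To see why a die shrink alone can make cooling harder even at flat power: the same watts flow through a smaller area, so W/mm² climbs. A minimal sketch, with placeholder die areas that are purely hypothetical, not from the leak or any spec sheet:

```python
# Hypothetical numbers, purely to illustrate thermal density on a shrink:
# same power through a smaller die means more heat per square millimeter.
power_w = 215            # hypothetical chip power held constant
area_16nm_mm2 = 545      # hypothetical 16nm die area
area_7nm_mm2 = 300       # hypothetical 7nm area for the same design

for node, area in [("16nm", area_16nm_mm2), ("7nm", area_7nm_mm2)]:
    print(f"{node}: {power_w / area:.2f} W/mm^2")
# Same watts, ~1.8x the thermal density -- harder to pull the heat out.
```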
> Yes. 320W for the 3070? I'll believe it when Nvidia announces it. 2070 is 175w. I don't believe for a second that Nvidia increased the TDP of the x70 by 145W, or 83% from last generation.

You saw the TDPs, right?
So if the 2070 has a 225W TDP ... and we believe the headline that leaked a while back about Ampere being 50% faster and 50% more efficient, that means the 3xxx series should have maybe 4 times the performance of Turing ... ?😉
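For what it's worth, straightforwardly compounding those two claims lands well short of 4x. A rough sketch, assuming "50% more efficient" means +50% performance per watt and taking the 225W figure above and the leaked 320W at face value:

```python
# Back-of-the-envelope compounding of "50% faster / 50% more efficient".
# Assumptions (not from any official source): efficiency means +50%
# performance per watt, and the leaked 320W board power is real.
base_power = 225          # 2070 power figure used in the post above (W)
leaked_power = 320        # leaked 3070 figure (W)
perf_per_watt_gain = 1.5  # "50% more efficient"

power_scale = leaked_power / base_power       # ~1.42x the power budget
est_perf = perf_per_watt_gain * power_scale   # ~2.13x Turing, not 4x
print(f"~{est_perf:.2f}x estimated uplift under those assumptions")
```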
> Those are defaults - they easily pull more than that.

No, the 2070's TDP is 175W. The 2070 Super is 215W. That's how we know this leak is BS.
Yes. 320W for the 3070? I'll believe it when Nvidia announces it. 2070 is 175w. I don't believe for a second that Nvidia increased the TDP of the x70 by 145W, or 83% from last generation.
You really think the 3070 and the 3080Ti, and thus the 3080 as well, are all going to have the same TDP? The leak is complete bull.
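Spelled out, the jump being disputed works out like this (a quick check using the leaked 320W figure against the 175W reference TDP cited above):

```python
# The disputed generational jump: leaked 320W for the 3070 vs. the
# 2070's 175W reference TDP.
old_tdp, new_tdp = 175, 320
delta = new_tdp - old_tdp     # 145 W
increase = delta / old_tdp    # 0.829 -> ~83%
print(f"+{delta}W, a {increase:.0%} jump over last gen")
```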
> That explanation is meaningless unless Nvidia changed how they determine the TDP of a card for the 30xx series from default draw to max draw. Pretty unlikely. The problem seems to be that THG somehow screwed up copying the chart from the original source.

Those are defaults - they easily pull more than that.
RTX 2070 has a default of 185w, with a limit of 215w: https://www.techpowerup.com/vgabios/204549/nvidia-rtx2070-8192-180831
RTX 2070 Super, 215w default, 260w max: https://www.techpowerup.com/vgabios/212046/nvidia-rtx2070super-8192-190531
That doesn't account for the fancier aftermarket models, which tend to have higher limits, like this Gigabyte Aorus 2070 Super with 314w limit: https://www.techpowerup.com/vgabios/212861/gigabyte-rtx2070super-8192-190625
My 1080Ti is rated for 250w, but pulls 300w quite often.
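Putting those defaults and limits side by side (a small sketch; the values are the watt figures from the TechPowerUp links above — the Aorus default isn't quoted here, so only its limit is noted):

```python
# Default power target vs. BIOS power limit, from the TechPowerUp
# entries linked above (all values in watts).
cards = [
    ("RTX 2070, reference BIOS", 185, 215),
    ("RTX 2070 Super, reference BIOS", 215, 260),
]
for name, default, limit in cards:
    headroom = (limit - default) / default
    print(f"{name}: {default}w default, {limit}w limit (+{headroom:.0%})")
# The Gigabyte Aorus 2070 Super BIOS tops out at 314w; its default
# isn't quoted above, so it's left out of the loop.
```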
> If these cards really require a $150 cooler to function properly, what are AIBs going to do? Think about how many different versions of each card AIBs like Asus and Gigabyte produce. They aren't going to develop 5 or 6 different coolers that cost that much to manufacture. Unless the 3080 FE is going to sell for $900, you can't put a $150 cooler on it. A $150 cooler on a $5000 Quadro, sure, that makes sense. A $150 cooler on a card we hope is in the $700-800 range makes no sense.

You ask why they would need such expensive and exotic cooling (appears authentic), and the listed leaked TDPs directly answer that question. Just saying ...
If these cards really require a $150 cooler to function properly, what are AIBs going to do? Think about how many different versions of each card AIBs like Asus and Gigabyte produce. They aren't going to develop 5 or 6 different coolers that cost that much to manufacture. Unless the 3080 FE is going to sell for $900, you can't put a $150 cooler on it. A $150 cooler on a $5000 Quadro, sure, that makes sense. A $150 cooler on a card we hope is in the $700-800 range makes no sense.
> Who knows? Could be a typo.

That explanation is meaningless unless Nvidia changed how they determine the TDP of a card for the 30xx series from default draw to max draw. Pretty unlikely. The problem seems to be that THG somehow screwed up copying the chart from the original source.
Original source, Igor's Lab, linked from Tom's article:
| Part | PCB | Chip | Model | Extension | Memory | Interface | TBP | Connectors |
|---|---|---|---|---|---|---|---|---|
| SKU10 | PG132 | GA102 | RTX 3090 | (Ti/Super)* | 24 GB GDDR6X (Double-Sided) | 384-bit | 350 W | 3x DP, HDMI, NVLink |
| SKU20 | PG132 | GA102 | RTX 3080 | (Ti/Super)* | 11 GB GDDR6X* | 352-bit* | 320 W | 3x DP, HDMI |
| SKU30 | PG132 | GA102 | RTX 3080 | none | 10 GB GDDR6X | 320-bit | 320 W | 3x DP, HDMI |
Here's the chart THG posted:
| Graphics Card | GPU | Memory | Memory Interface | TDP (W) | Outputs |
|---|---|---|---|---|---|
| GeForce RTX 3090 (Ti / Super) | GA102 | 24GB GDDR6X | 384-bit | 350 | 3 DisplayPort, HDMI, NVLink |
| GeForce RTX 3080 (Ti / Super) | GA102 | 11GB GDDR6X | 352-bit | 320 | 3 DisplayPort, HDMI |
| GeForce RTX 3070 | GA102 | 10GB GDDR6X | 320-bit | 320 | 3 DisplayPort, HDMI |
Why did THG replace the vanilla 3080 in the original chart with a 3070 in their chart?
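Side note on the capacities in both charts: they track the bus widths exactly. A quick sketch, assuming standard 32-bit GDDR6X chips at 1GB (8Gb) each — an assumption, not from the leak — with double-sided mounting on the 3090 as the leak describes:

```python
# Each GDDR6X chip has a 32-bit interface, so bus_width / 32 gives the
# chip count; at an assumed 1 GB per chip, that reproduces the leaked
# capacities (doubled for the 3090's double-sided layout).
for model, bus_bits, sides in [
    ("RTX 3090", 384, 2),          # double-sided per the leak
    ("RTX 3080 Ti/Super", 352, 1),
    ("RTX 3080", 320, 1),
]:
    chips = bus_bits // 32
    print(f"{model}: {bus_bits}-bit -> {chips} chips -> {chips * sides} GB")
```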