[SOLVED] Can someone explain to me in detail the TDP of a GDDR6 memory chip and of the VRM found in an RX 6900 XT?

Dec 11, 2021
I have recently been wanting to watercool my 6900 XT with a Kraken G12 and would like to know what size heatsinks to get for the components, considering that I don't know the TDP of the components that need cooling!
NZXT's Kraken G12 manual says that the fan alone provides enough cooling!

Naturally I would like to add them just in case!

Buying a proper GPU block is out of the question here!


Your opinion is greatly appreciated!
 
https://www.micron.com/-/media/client/global/documents/products/technical-note/dram/tned03_gddr6.pdf

This only applies to Micron-based chips since it's Micron's marketing, but I doubt memory made by SK Hynix or Samsung (who also make GDDR6) is significantly different. In any case, Micron claims their GDDR6 modules consume 5.5 pJ/bit. Doing some math:

( 5.5 * 10^-12 joules per bit) * (16 * 10^9 transfers per second per pin) * (32 pins * 1 bit per transfer) = 2.816 watts

If you're curious about how the units work out: a joule is also a watt-second, so the formula can be rewritten (using only the units) as:

(W * s / b ) * ( 1 / s ) * (b) = (W * s * b ) / (b * s) = W

So round it up to 3W per chip.
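Doing the same math in a few lines of Python, with an assumed chip count: a stock 6900 XT carries 16 GB on a 256-bit bus, which I take to mean eight 32-bit chips (check your own card's layout):

```python
# Rough GDDR6 power estimate based on Micron's ~5.5 pJ/bit marketing figure.
# The 8-chip count is an assumption (16 GB over a 256-bit bus = eight 32-bit
# chips on a stock RX 6900 XT); verify against your card's actual layout.

ENERGY_PER_BIT_J = 5.5e-12   # joules per bit (Micron's figure)
DATA_RATE_PER_PIN = 16e9     # 16 Gbps per pin
PINS_PER_CHIP = 32           # each GDDR6 chip has a 32-bit interface
CHIPS = 8                    # 256-bit bus / 32 bits per chip

watts_per_chip = ENERGY_PER_BIT_J * DATA_RATE_PER_PIN * PINS_PER_CHIP
total_memory_watts = watts_per_chip * CHIPS

print(f"Per chip:  {watts_per_chip:.3f} W")      # 2.816 W
print(f"All chips: {total_memory_watts:.3f} W")  # 22.528 W
```

So the whole memory subsystem is on the order of 20-25 W, which small stick-on heatsinks with direct airflow can handle.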
 

Eximo

That is more or less correct. Direct airflow is generally enough for memory chips and VRMs.

As for heatsinks, if you want to stick some on, anything that fits really. It will be an improvement.

As for power, let's see if anyone has posted screenshots.

There are some pre-filled full coverage blocks from the likes of EK and Alphacool.
 
Dec 11, 2021
You gave me confidence, thank you very much bro! :)
 
Dec 11, 2021
Sadly it looks like GPU-Z only grabs the total board power; it doesn't separate the GPU from the rest of the board. So the 300 W limit appears to be correct, as that is where it will max out at stock settings for a stock card.
Yeah, though my original question was the TDP of each memory chip, not the main unit (GPU).
 

Eximo

Yes, other cards break out the various power draws separately from the GPU. It doesn't look like AMD did that on the stock 6900 XT. Maybe some of the higher-end cards do, but I didn't find any screenshots showing that section of GPU-Z with a card under load.
 
Dec 11, 2021
Hats off to you my friend!
 