Question: GPU memory temp question, 3060 Ti FTW

Arbiter051

Honorable
Mar 28, 2016
Hello, hope this isn’t a dumb question.

I have a 3060 Ti FTW, stock, no manual OC. My temps during gaming reach maybe 62C at the highest in the games I'm playing, and I know that's great.

My question, however: my GPU does not have sensors for the memory, and I've read tons of mixed opinions about memory temps for 3060 Ti and 3070 cards (I know the memory was mostly a problem for 3080 and 3090 models).

My hotspot temp is always 11-12C higher than my core, and I just have a few questions.

  1. Is the hotspot always supposed to be 11-12C higher?
  2. Is the hotspot the same as memory, or is there no way to know?
  3. With summer approaching, I worry: if my card were to go to something like 73C and my hotspot hit 85C, is that bad?
  4. I have read in some places that people report their memory temps 10C higher than their hotspot temp. Is that true / should I be worried? To add to this, it could mean my memory temps are already in the 80s, and I wouldn't want them in the 90s or even 100s in the summer.

I don’t mine, only game on my pc. Thank you!
 

Phaaze88

Titan
Ambassador
1) Normal to be 10-20C higher than the GPU core.
For example, my 1080 Ti has about a 15C gap.

2) No, it is not.
HWiNFO should be able to read 30-series VRAM thermals.

3) No. Worry about:
GPU core at 83C or higher.
GPU hot spot at 100C or higher; its acceptable range is greater than the core's.
The recommendation is below 95C on VRAM.

4) That's the first time I've heard that one... I'd still say no.
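Those rough thresholds can be sketched as a small helper, assuming the exact cutoffs given above (83C core, 100C hot spot, 95C VRAM); this is a hypothetical check, not something read from any monitoring tool:

```python
def temp_warnings(core_c, hotspot_c, vram_c=None):
    """Return a list of warnings based on the rough thresholds above.

    vram_c is optional, since some cards don't expose a VRAM sensor.
    """
    warnings = []
    if core_c >= 83:
        warnings.append("GPU core at or above 83C")
    if hotspot_c >= 100:
        warnings.append("GPU hot spot at or above 100C")
    if vram_c is not None and vram_c >= 95:
        warnings.append("VRAM at or above 95C")
    return warnings
```

By these numbers, the OP's 62C core / 74C hot spot scenario produces no warnings at all.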
 
is the hotspot always supposed to be 11-12c higher?
Yes, because it's the hottest spot on the GPU. The other temperature sensor is more of an average.

Is the hotspot the same as memory or is there no way to know?
It's only for the GPU itself.

With summer approaching, I worry if my card was to go to something like 73c and my hotspot hits 85c is that bad?
No, GPUs are designed to work up to 95-100C. And even then, it'll start downclocking when it gets close to those numbers.

I have read some places that people report their memory temps 10c higher than their hotspot temp. Is that true/should I be worried?
To add to number 4, this could mean my memory temps are already in the 80s and I wouldn’t want them to be in the 90s or even 100s in the summer.
There's no direct correlation between hot spot and memory temperatures, other than that if the hot spot temperature is up, the GPU must be under some load, so it stands to reason the memory is being used enough for its temperature to rise as well.

Either way, unless you see strange artifacting, there's nothing to worry about with regards to memory health and/or performance.
 

Arbiter051

Thank you both very much for your replies.

So there's no real way for me to figure out the memory temp without a sensor or a thermal gun? That's unfortunate, if true.

When you say artifacting, can you give examples? I ask because sometimes in Destiny some textures will turn into really low-poly boxes and such. I never really thought about it, because weird textures, screen tearing, etc. seemed par for the course in Destiny since I started months ago.

I have watched a Gamers Nexus video of them testing artifacting on cards, and I remember one of the cards showing "space invaders" patterns along with obvious broken visuals.

Sorry if it is a dumb question to ask. Just want to be sure before I start getting paranoid or anything.

From what I have read, the 3060 Ti FTW does not have sensors for memory temps. If I am wrong, please let me know where I can see them. HWiNFO and Precision X1 do not show them no matter where I look.
 

Eximo

Titan
Ambassador
There were reports of various GA102 cards with memory temperatures as high as 110C, likely warmer than the hot spot. On the 3090, half the memory was on the back of the card with only a heatspreader.

The 3090 Ti corrects this by doubling the memory density and putting it all on the front. In a lot of cases, swapping out the thermal pads on the earlier cards proved to be the best solution.
 

Arbiter051

Yeah, that's what I remember. Reading people say I should undervolt a 3060 Ti because of hotspot readings and such troubles me, because I'd rather not do that if I don't have to.

Currently, all of you nice people are telling me I am perfectly fine.
 
Yea that is what I remember. Just reading people say that I should undervolt a 3060ti because of hotspot readings etc troubles me because I would rather not do that if I don’t have to

currently all of you nice people are telling me I am perfectly fine
I'd still recommend doing an undervolt to make the card more efficient.

As an example, my 2070 Super needs about 220W to hit 2050MHz, but with some tweaking I can get it down to 180W, topping out at 1930MHz, and I notice almost no performance hit. I went a little further and found the lowest voltage at 1800MHz (its stock boost speed), where it floats around 160W. There are also some games I play where, for some reason, running the card to its fullest yields almost no performance gain.
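For a sense of the tradeoff, here is the simple arithmetic on the figures quoted above (220W/2050MHz stock vs. 180W/1930MHz undervolted):

```python
watts_stock, mhz_stock = 220, 2050  # stock figures quoted above
watts_uv, mhz_uv = 180, 1930        # undervolted figures quoted above

power_saved_pct = (1 - watts_uv / watts_stock) * 100  # ~18% less power
perf_lost_pct = (1 - mhz_uv / mhz_stock) * 100        # ~6% lower clocks

print(f"{power_saved_pct:.0f}% less power for {perf_lost_pct:.0f}% lower clocks")
```

Roughly 18% less power for about 6% lower clocks, which is why the performance hit is barely noticeable.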
 

Arbiter051

I would rather not mess with that stuff because I'm not really sure what I am doing, and I'd rather not risk it.

Is the purpose of undervolting less heat and power usage?
 
Undervolting reduces the power consumption, which in turn reduces heat generated.

You're not going to damage the hardware by undervolting. You only lower the settings to the point where there's a software stability issue (driver crashing), then you dial it back a bit, re-verify, and if it's stable, you're done.
 

Arbiter051

Undervolting reduces the power consumption, which in turn reduces heat generated.

You're not going to damage the hardware by undervolting it and you only set the parameters to the point where there is a software stability issue (driver crashing) but then you just dial it back a bit, re-verify, and if it's stable you're done.
I wish I had the confidence to do it heh.
 
