[SOLVED] Question about Techpowerup GPU-Z GPU Clock readings

Nemalp031

Reputable
Jan 18, 2016
19
0
4,510
I have the Sapphire Pulse 5700 XT. I have attached three screenshots of three different clock settings in MSI Afterburner and the respective GPU-Z readings.
View: https://imgur.com/a/DqzC5Vg


When I reset my settings in MSI Afterburner it shows 2029 MHz, but GPU-Z reads everything as default, with the Boost clock at 1925 MHz. Why does MSI Afterburner (and Radeon's software too, for that matter) say 2029?
When I set the clock in MSI Afterburner to 1925 MHz, which is supposed to be the card's default Boost clock, GPU-Z says the GPU clock is now lower than default while the Boost clock remains the same. How come?
Another example: OC to 2050 MHz. GPU-Z also shows a 2050 Boost clock, but the GPU Clock is still significantly lower than default. Why is that?

I'm still new to all this so maybe I'm misunderstanding something.

Edit: I think I've actually mostly figured it out. The first box in GPU-Z appears to be the midpoint of a linear clock-speed curve that starts at 800 MHz and ends at whatever your current boost clock is. 1815 MHz is the default max GPU clock (no boost) according to the manufacturer. The Mystic Afterburner skin has a clock curve and voltage curve editor (under the voltage curve tab); it's a linear graph, and the midpoint sits at 1414 MHz by default. If I OC to 2050, that's +21 from 2029, so the midpoint moves up by 11 (technically 21/2): 1414 + 11 = 1425, which is exactly what the first box in GPU-Z shows. If I UC to 1925, that's -104 from the default 2029, so the midpoint moves down by 52 (104/2): 1414 - 52 = 1362, and again that matches the first box in GPU-Z.

So the only odd thing left is that Afterburner and Radeon say my default boost clock is 2029 MHz while GPU-Z says it's 1925 MHz. 1925 MHz is also the default boost clock according to the manufacturer, but from what I understand it varies from card to card, so maybe mine really does default to 2029. Another interesting thing: back when I got the card and was using the older Adrenalin driver (before Adrenalin 2020), it always defaulted to 1989 rather than 2029, so maybe the default setting is a driver+card specific thing, I don't know.
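The arithmetic above can be sketched in a few lines of Python. To be clear, the 800 MHz curve start and the midpoint formula are my own inference from matching the GPU-Z readings, not anything documented by AMD or TechPowerUp:

```python
# Assumption: GPU-Z's first "GPU Clock" box shows the midpoint of a
# linear clock curve that runs from 800 MHz up to the current boost clock.

CURVE_START_MHZ = 800  # inferred low end of the clock curve

def curve_midpoint(boost_mhz: int) -> int:
    """Midpoint of the linear clock curve, truncated to whole MHz."""
    return (CURVE_START_MHZ + boost_mhz) // 2

# Default boost per Afterburner/Radeon software on my card: 2029 MHz
print(curve_midpoint(2029))  # 1414, the default midpoint
print(curve_midpoint(2050))  # 1425, matches GPU-Z after the +21 OC
print(curve_midpoint(1925))  # 1362, matches GPU-Z after the -104 UC
```

So the "+11" and "-52" shifts I observed are just the midpoint tracking half of whatever offset gets applied to the boost clock.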
 
...
I'm still new to all this so maybe I'm misunderstanding something.
...

I don't think you're alone. I gave up trying to figure it out much beyond learning the terms: base, game, and boost clocks for the three key points on the curve. I also found that raising the base (and therefore game) clock in Wattman CAN result in higher boost clocks, but also in much more severe drop-backs as the chip loses thermal headroom. The result can be terrible stuttering even though I'm seeing terrific FPS numbers on screen. So rather than trying to figure out the relationships between the six variables (three clocks, with a voltage at each point), I just leave base and game alone and adjust boost only, which is what most of the AB skins do.

Doing that, what I have come to learn is that overclocking a 5700 (XT) isn't nearly as productive as undervolting while also keeping the highest 'boost' clock setting somewhat lower. In benchmarks (TimeSpy, mostly) the maximum clock used may be lower, but it's far, far more consistent, with fewer drops, sometimes staying rock steady. The resulting scores are at worst no different, and usually better. I can also feel it, with much more stable clocks/FPS being maintained during intense gaming action.

It is a bit of a balancing act, of course, because you can always just drop the boost clock at any given low voltage to stay stable. But once you find the sweet spot, it's amazingly good performance with much lower thermal output, so less fan noise is needed to keep temperatures in control.
 
Solution

Nemalp031
I do have clock stability issues in some games (Mordhau in particular). Just for reference, what settings do you use?

Edit: I should say that in benchmarks the clock speeds are very stable; in Mordhau they're least stable on 64-player servers with big maps, where my CPU struggles a lot (i7-3770 non-K; I'd like to upgrade soon, I just don't really have the time to deal with it right now, since I'd have to buy a new mobo and RAM too). So there's definitely some bottlenecking going on in my case as well.
 
Mine's not a good example to follow; I have a 5700 Red Dragon that I flashed with a 5700XT Red Dragon BIOS, so it's heavily OC'd by that. I'm running with a 'boost clock' of 1905 MHz and a GPU voltage setting of 1070 mV. It's important to remember they down-binned 5700 GPUs for a reason, so if I want some undervolting headroom, there's not much more in it than getting to a stock-clocked Red Dragon XT.

My chip needs more voltage than almost any true 5700XT I know of. I've read that a lot of people with proper 5700XTs can get 2100 MHz with GPU voltage around 1000 mV. My down-binned chip needs almost a full 1200 mV at only 2010 MHz and down-clocks a lot; even though it gets decent scores and passes stability tests, it stutters pretty frequently playing Ghost Recon. I was trying to turn my 5700 Red Dragon into a 5700XT Red Devil without the massive heatsink and triple fans, LOL. It just doesn't work well.
 

Nemalp031
I see. I think the 2100 MHz @ 1000 mV figure is a bit of an exaggeration, though. In most posts I've seen, people run more like 1120-1140 mV for 2050-2100 MHz.
 
It's really hard to tell for sure, but I'm pretty sure they aren't lying, even if they're not telling the whole story. People remove the heatsink/fan to carefully apply fresh paste, then use modded heatsinks and mountings for full contact on the die, open-air cases, and other tricks to get high scores for bragging rights. And most importantly, they cherry-pick units to find the best silicon. There are lots of ways to do it, most of them exceptional and impractical for average owners. But it's a decent measure of ultimate capability, I should think.

One thing I'd like to do to my Red Dragon is put some thermal pad between the back of the GPU and the aluminum backplate. Right now it's just an insulating air gap; I have to think closing the gap and letting heat conduct directly into the plate would improve cooling of the GPU. But taking the plate off means unmounting the fans and heatsink completely, and I'm not willing to go that far just yet.
 

Nemalp031
Oh, if you're talking about special setups, modding, and water cooling, then I understand. When you said "a lot of people" I was thinking of average users.
 
Not all of them are extreme mods, but yes, modding. Mostly it's taking the advice Steve at GN dishes out on how to offset design deficiencies or simply improve your card. You might be surprised how many people take their heatsinks off and find the manufacturer left the plastic surface protectors on when installing them... I was following a whole series of posts with pics on Reddit from people who did exactly that and found it. And I mean different manufacturers, not all the same one!

Serious modders get into the water-cooling mods, or rigging up a huge heatsink salvaged off an older card, or a big CPU heatsink. And that doesn't even include the crazy LN2 crowd.
 
