[SOLVED] AfterBurner showing wrong clock

Sep 27, 2019
Specs:
CPU: i7 9700k (stock clock)
Motherboard: MSI Z390-A PRO
RAM: 2x 8GB Corsair Vengeance 3.6GHz
SSD/HDD: Samsung 850 EVO (250GB & 1TB)
GPU: RTX 2080 Super by Zotac (latest driver)
PSU: Corsair TX-M 650 Watt
Chassis: Corsair Carbide 275r
OS: Windows 10 home (build 1903)

Afterburner (as well as HWMonitor and Zotac's Firestorm) all show my memory clock hovering around 405MHz at idle, then leaping into the thousands under load, often peaking at 7750MHz.

My GPU has a max memory clock of 1938MHz. Its octo-pumped data rate gives it an effective memory clock of 15504MHz, and 15504 / 2 = 7752MHz - almost exactly the max reading I described above.

GPU-Z reports my memory clock accurately - it's 101MHz at idle. That times 4 gives the clock reported by Afterburner.

I don't know whether AB takes the actual memory clock and multiplies it by a data-rate factor, or takes the effective clock and divides it by one, but the former would make everything make sense if AB is calibrated to assume quad-pumped GDDR5.
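The arithmetic can be sanity-checked quickly. This is just a sketch using the readings quoted in this post and assuming an 8x data-rate factor for the effective clock:

```python
# Assumed values, taken from the readings described above.
ACTUAL_MAX_MHZ = 1938     # real memory clock (as shown by GPU-Z)
DATA_RATE_FACTOR = 8      # "octo-pumped": 8 transfers per clock cycle

effective = ACTUAL_MAX_MHZ * DATA_RATE_FACTOR
print(effective)          # 15504 - the advertised effective clock

# Afterburner's max reading (~7750MHz) matches effective / 2,
# which is the same as actual * 4 - i.e. a quad-pump assumption.
print(effective // 2)     # 7752
print(ACTUAL_MAX_MHZ * 4) # 7752

# The idle readings line up the same way: GPU-Z's 101MHz actual
# times 4 gives Afterburner's ~405MHz reading.
print(101 * 4)            # 404
```

Both hypotheses produce the same number here, since effective / 2 and actual x 4 coincide whenever the true data-rate factor is 8.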

Is my solution correct? Does anyone know how to get it reporting correctly? Thanks.
 

dotas1

Admirable
Ok, so first things first: memory clock and GPU clock are two different things. Your GPU core clock is 1938MHz, and it can go higher with Nvidia's GPU Boost.

The rest of your math is correct. Afterburner shows half the effective clock (double data rate), and GPU-Z shows a quarter of what Afterburner shows (I don't know why; I haven't looked into it further).

The clocks you see are how it's supposed to be. At idle your GPU runs in a low-power mode and jumps to higher clocks when needed.

If you wish, I can upload my Afterburner screenshot along with the GPU-z.
 

extreme_noob

Notable
Jul 30, 2018
My GPU has a max memory clock of 1938MHz. Its octo-pumped datarate gives it an effective memory clock of 15504MHz. 15504 / 2 gives me exactly the max effective memory clock I described above.
Your math is wrong. 15504/2 is 7752.
Edit: I misinterpreted what you said, sorry about that.

As for the actual reason, I don't know, but my 2070 has similar clocks.
 
Your math is wrong. 15504/2 is 7752.
Edit: I misinterpreted what you said, sorry about that.

As for the actual reason, I don't know, but my 2070 has similar clocks.
Do your monitoring utilities report clocks that don't make sense, like mine do - i.e. half of your max effective data rate?
 

