Specs:
CPU: i7 9700k (stock clock)
Motherboard: MSI Z390-A PRO
RAM: 2x 8GB Corsair Vengeance 3600MHz
SSD/HDD: Samsung 850 EVO (250GB & 1TB)
GPU: RTX 2080 Super by Zotac (latest driver)
PSU: Corsair TX-M 650 Watt
Chassis: Corsair Carbide 275r
OS: Windows 10 Home (version 1903)
Afterburner (as well as HWMonitor and Zotac's Firestorm) reports my memory clock hovering around 405MHz at idle, then leaping into the thousands under load, often maxing out at 7750MHz.
My GPU has a max memory clock of 1938MHz. Its octo-pumped GDDR6 data rate gives it an effective memory clock of 15504MHz, and 15504 / 2 = 7752MHz - almost exactly the max reading I described above.
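Spelling that arithmetic out (these are just my own numbers from above, nothing beyond what the tools show):

```python
# Arithmetic only - the x8 factor is the "octo-pumped" GDDR6 data rate,
# and the /2 is what seems to turn it into Afterburner's number.
real_max_mhz = 1938                 # max memory clock as GPU-Z reports it
effective_mhz = real_max_mhz * 8    # effective GDDR6 data rate
print(effective_mhz)                # 15504
print(effective_mhz / 2)            # 7752.0 - within rounding of AB's 7750MHz max
```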
GPU-Z reports my memory clock accurately - it's 101MHz at idle. Multiplying that by 4 gives the value shown by Afterburner.
I don't know whether AB takes the real memory clock and multiplies it by a data-rate factor, or takes the effective clock and divides one out, but the former would make everything make sense if AB is calibrated for quad-pumped GDDR5.
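Here's a minimal sketch of what I think is happening; the x4 and x8 multipliers are my assumptions about how the tools report, not anything documented by Afterburner or GPU-Z:

```python
# Hypothesis 1: AB treats the memory as quad-pumped GDDR5 (real clock x 4).
# Hypothesis 2: AB takes the GDDR6 effective clock and halves it.
# Both produce identical numbers because effective = real x 8, so the readings
# alone can't distinguish them - but hypothesis 1 would explain the calibration.

def ab_if_quad_pumped(real_mhz: float) -> float:
    return real_mhz * 4

def ab_if_halved_effective(real_mhz: float) -> float:
    return (real_mhz * 8) / 2

for real_mhz in (101, 1938):        # idle and max clocks as GPU-Z reports them
    print(real_mhz, ab_if_quad_pumped(real_mhz), ab_if_halved_effective(real_mhz))
# 101  404  404.0   -> matches the ~405MHz idle reading
# 1938 7752 7752.0  -> matches the ~7750MHz max reading
```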
Is my solution correct? Does anyone know how to get it reporting correctly? Thanks.