Wrong readings: GPU-Z or FurMark?

Letten

Honorable
Dec 17, 2014
194
2
10,695
Hello, I've had a 750 Ti for a few years and overclocked it over time until it reached its maximum potential:
Clock: 1469MHz
Memory: 3000MHz
Temps below 60°C in FurMark.

But wait, that's only what FurMark reads. GPU-Z reads just 1287MHz on the core and 150MHz on the memory (I guess it just splits it?), plus 1366MHz as the boost clock.

I don't know what's wrong here, so I also tried GPU Shark:

At idle: 1287MHz (GPU-Z's reading)
Under full load: 1469MHz (FurMark's reading)

Sometimes in FurMark it jumps to 1502MHz, while GPU Shark stays at 1469MHz.

Anything I need to know?
 
Solution
Your GPU has a self-boosting clock; it's Nvidia's GPU Boost technology.
At idle your frequency should actually drop lower than that. It's normal: when you aren't using the card, it goes into an energy-saving mode.
Your base GPU frequency is 1287MHz. When you put the card under load, the clock raises itself, usually above the "boost" clock quoted by the manufacturer; the manufacturer only guarantees that the card will reach that boost clock. http://www.geforce.com/hardware/technology/gpu-boost/technology
GPU-Z shows your default GPU clock and your default boost clock, and it also shows the current clocks. To see your peak clock, the GPU must be under load. Every program should show you the exact current clocks; the best place to monitor them is the program you use to overclock.
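If you want a second opinion besides GPU-Z and FurMark, Nvidia's own nvidia-smi tool reports the clocks the driver is applying right now. A minimal sketch in Python (the parsing helper is my own illustration; `clocks.current.graphics` and `clocks.current.memory` are standard nvidia-smi query fields):

```python
import subprocess

def read_current_clocks(csv_line):
    """Parse one line of nvidia-smi CSV output, e.g. '1287 MHz, 150 MHz',
    into a (core_mhz, mem_mhz) tuple of integers."""
    core, mem = (field.strip().split()[0] for field in csv_line.split(","))
    return int(core), int(mem)

def query_gpu_clocks():
    """Ask nvidia-smi for the clocks the GPU is running at right now."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=clocks.current.graphics,clocks.current.memory",
         "--format=csv,noheader"],
        text=True,
    )
    # First line corresponds to GPU 0.
    return read_current_clocks(out.strip().splitlines()[0])
```

Run it once at the desktop and once while FurMark is looping: at idle you should see figures near the 1287MHz/150MHz that GPU-Z shows, and under load something near FurMark's 1469MHz.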