Hey Guys,
I am currently in the process of overclocking my graphics card (Asus 1GB GeForce 9800 GT).
Stock:
GPU Clock: 600MHz
Memory Clock: 900MHz
Shader Clock: 1500MHz = 55 FPS
I am currently using EVGA Precision and OC Scanner.
I can get the GPU Clock up to 720MHz without any artifacting in OC Scanner.
Using OC Scanner in fullscreen mode at 1440x900 resolution, during benchmarking I get:
720MHz = 61 FPS
but if I reduce the GPU Clock to 660MHz I get:
660MHz = 242 FPS
This is what confuses me: how does this translate into games? Should I run it at 720MHz or 660MHz? Which is going to be better overall?
Full speeds:
GPU Clock: 660MHz
Memory Clock: 1150MHz
Shader Clock: 1674MHz = 242 FPS
GPU Clock: 720MHz
Memory Clock: 1150MHz
Shader Clock: 1782MHz = 61 FPS
Please help.
Thanks.
PC Specs:
AMD Phenom II X6 1090T 3.2GHz (not OC'd, I would but I'm too scared)
4GB DDR2 800 RAM
Asus 1GB GeForce 9800 GT