Now, I am nearly 100% certain that what I am about to say is completely incorrect, but I have no other way to estimate the difference. Can someone please correct my logic here?
The PassMark benchmark score for each CPU is:
CPU1: 6600K @ 3.5 GHz stock - benchmark score 8056
CPU2: 9600K @ 3.7 GHz stock - benchmark score 13521
Now, assuming there is a linear relationship between clock speed and performance...
If my overclock on my 6600K is 4.6 GHz, then the effective benchmark is 8056 * 4.6/3.5 ≈ 10588
If I expect to overclock my 9600K to 5 GHz, then the effective benchmark would be 13521 * 5/3.7 ≈ 18272
Based on these numbers, I would estimate roughly a 72.5% performance gain when switching from my 6600K @ 4.6 GHz to a 9600K @ 5 GHz in tasks that utilize all the cores.
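For anyone who wants to plug in their own numbers, here is a small Python sketch of the same back-of-the-envelope math. The scores and clocks are the ones quoted above, and the linear-scaling assumption baked into it is exactly the assumption I'm asking people to check.

```python
def scaled_score(stock_score, stock_clock_ghz, oc_clock_ghz):
    """Scale a stock PassMark score linearly with the overclock ratio.

    Assumes performance scales 1:1 with clock speed, ignoring IPC,
    memory bottlenecks, and thermal/power limits.
    """
    return stock_score * oc_clock_ghz / stock_clock_ghz

# Stock PassMark scores and base clocks from the post
cpu1 = scaled_score(8056, 3.5, 4.6)    # 6600K overclocked to 4.6 GHz
cpu2 = scaled_score(13521, 3.7, 5.0)   # 9600K overclocked to 5.0 GHz

gain = (cpu2 / cpu1 - 1) * 100
print(f"6600K @ 4.6 GHz (estimated): {cpu1:.0f}")   # ~10588
print(f"9600K @ 5.0 GHz (estimated): {cpu2:.0f}")   # ~18272
print(f"Estimated gain: {gain:.1f}%")                # ~72.6%
```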
Edit: Clarity.
Edit2: I went ahead and upgraded my CPU from the 6600K to the 9600K. I am getting about 72.5% more frames in previously CPU-bottlenecked games like Destiny. CPU-bound tasks such as video rendering also feel about twice as fast, though I have no exact data on that.