binderasaf1:
Hello dear Superposition people.
There's a mystery I have yet to solve, and I'm super curious what the solution is.
While browsing through Superposition's HOF on Unigine's website, I stumbled upon this score (attached in the pic).
http://i63.tinypic.com/vpx0xw.jpg
There's a difference of roughly 700 points between us, and 5-6 FPS, which is a lot.
I'm baffled by this, as we both have Ryzen systems. The only CPU difference is that I have a 6-core running at 3.95 GHz while he runs 3.9 GHz; the impact on results is marginal at best, and if anything my 50 MHz advantage should help more than his 2 extra cores. We both run 16 GB in dual channel, and my memory is actually clocked higher. Clock-wise on the GPU, this guy runs his core 50 MHz higher than mine, and his memory 90 MHz higher. That can never account for such a big difference in score/FPS; in the real world it's 1 FPS at best. I've tried all the usual NVIDIA Control Panel tweaks, but I can't get past ~6346.
Am I missing something? What's the secret sauce for this big discrepancy in results? I'm pretty sure it's not the clocks...
Help, anyone?
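A quick sanity check of the numbers in the post above, as a minimal Python sketch. All figures come from the post itself; the assumption that the score scales at most linearly with any single clock is mine:

```python
# How big are the clock deltas from the post, relative to the ~700-point
# score gap? Assumes score scales at most linearly with any single clock
# (my assumption, not something stated in the thread).

my_score, score_gap = 6346, 700   # "6346ish", "700ish points"

cpu_gain = 3.95 / 3.90 - 1        # OP's CPU clock advantage: ~1.3%
mem_gain = 6300 / 6210 - 1        # other guy's GPU memory advantage: ~1.4%

print(f"Score gap to explain: {score_gap / my_score:.1%}")   # ~11.0%
print(f"CPU clock delta:      {cpu_gain:.1%}")
print(f"GPU memory delta:     {mem_gain:.1%}")
# A ~1-2% clock delta can't plausibly explain an ~11% score gap even
# under linear scaling, which is the OP's point.
```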
jankerson:
Yeah, big difference... He OCed his GTX 1080 Ti to the moon. Look at the GPU memory clock: 6300 MHz. That's the difference.
binderasaf1:
Nope... mine is clocked at 6210 MHz. I actually benched at 6300 as well, and it certainly wasn't a 5 FPS difference. It's not the memory.
jankerson:
SP is all about GPU memory bandwidth.
binderasaf1:
I can promise you, and I'm saying this from personal experience: the difference between 6210 MHz and 6300 MHz on the memory is never 5-6 FPS. It's 1 FPS at most, if not 0.5, in SP. It's most certainly not the memory.
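That claim can be roughly bounded with figures already in the thread. A minimal sketch, assuming worst-case linear (fully bandwidth-bound) scaling, and deriving a points-per-FPS ratio from the "700 points ≈ 5-6 FPS" gap mentioned in the first post:

```python
# Upper-bound estimate of the FPS gain from 6210 -> 6300 MHz memory,
# assuming perfectly bandwidth-bound (linear) scaling -- the real gain
# would be smaller.

points_gap, fps_gap = 700, 5.5          # "700ish points ... 5-6 FPS"
my_score = 6346

points_per_fps = points_gap / fps_gap   # ~127 points per FPS
my_fps = my_score / points_per_fps      # ~50 FPS

mem_gain = 6300 / 6210 - 1              # ~1.45%
fps_from_memory = my_fps * mem_gain     # ~0.7 FPS

print(f"~{my_fps:.0f} FPS; memory OC worth at most {fps_from_memory:.1f} FPS")
# ~0.7 FPS even under a worst-case bandwidth-bound assumption, consistent
# with the "1 FPS at most, if not 0.5" claim above.
```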
What's your core and shader clock?
As the person who moderates the table, and the person with the most entries in it: there's a LOT more to this than just memory speed. The point difference on my 1070 Ti between stock core with +500 MHz memory and 2100 MHz core with +500 memory is about 800 points.
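A rough scaling check on those 1070 Ti numbers. The 1683 MHz figure is NVIDIA's rated boost clock for the 1070 Ti, not something stated in the thread, and real cards boost higher out of the box, so the computed gain is only an upper bound:

```python
# Core-clock scaling implied by the moderator's 1070 Ti data point.
# stock_boost is NVIDIA's rated boost clock (assumption -- actual cards
# boost higher, so the true relative gain is smaller than this).

stock_boost = 1683        # MHz, rated 1070 Ti boost clock (assumption)
oc_core     = 2100        # MHz, from the post
points      = 800         # score difference, from the post

core_gain = oc_core / stock_boost - 1   # ~25% at most
print(f"Core clock gain: <= {core_gain:.0%} -> ~{points} points")
# Core clock alone produces a score swing of the same magnitude as the
# ~700-point gap the OP is trying to explain.
```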