[quotemsg=18756571,0,212627][quotemsg=18755023,0,1782036]@rcald2000
I didn't want to reply directly to your question in case Filippo does answer you.
Our CPUs are very similar. The most notable differences are the 5820K (15 MB L3 cache, 28 PCIe lanes) vs the 4930K (12 MB L3 cache, 40 PCIe lanes). I know 1080s aren't Titans, but they're not that far off. My CPU has yet to go over 60%, with both GPUs holding over 95% usage at 4K (60-80 fps, ultra, vsync on, AA at minimum or off). It mostly averages 40%. This is with ROTTR, Witcher 3, GTA V, and DOOM. The most obscene thing I've seen is how ROTTR loves to eat all the RAM it can get, wherever it can get it.
The Firestrike physics test nails it at 100%, along with HD video conversions (including software that uses hardware acceleration).
That being said, if you're really going all in for two of these, then I would recommend seeing how your 5820K handles it. My personal experience going from quads to hexes is that the lower hex clock speeds are more than compensated for by the larger L3 caches and increased PCIe lanes, versus a quad trying to hit 5 GHz. I'm actually very surprised how many reviewers still obsess over using quads for these kinds of tests. Just look at the monsters in the 3DMark hall of fame; most are using 6, 8, or 10 cores with their Titan SLIs. [/quotemsg]
Increased PCIe lanes make absolutely zero (ok, maybe 1-2% at most) difference. Proof:
http://www.tomshardware.com/reviews/graphics-performance-myths-debunked,3739-3.html
The real difference is the clock speed and, to an extent, the cache size.
The monsters in the OVERALL 3DMark hall of fame do indeed use 6-, 8-, and 10-core CPUs, but if you take out the CPU tests and look specifically at the graphics scores, the picture changes a lot.
[quotemsg=18755023,0,1782036]@rcald2000
When I moved my old 970s from a 2600K (OC'd to 4.20 GHz) to this 4930K (at the time 3.8 GHz), I saw an average 10-20 FPS increase at 4K. I was using the same memory and hard drives at the time.
If people are going to try to use a quad for 4K/5K at 60+ Hz or 1440p at 200 Hz with Titan SLI, or the mythical 1080 Ti SLI (January 2017?), then the quad will likely need to be at ~5 GHz or higher (I'm thinking closer to 6). I'm pretty certain a hex or bigger will sit happy around 4-5 GHz for this. Mine's at 4.10 GHz because I'm still on air cooling, I simply don't need more, and it's 24/7 stable.
The other thing you may need if you take the leap... a bigger PSU. 850 W can do it, but a general recommendation I've heard for optimal PSU efficiency is to double the wattage you need. Two Titans and that CPU are going to push real close to 500-600 W under full load. I've watched mine, and it averages 450-550 W as is (but I've also got two RAIDs and three or four other drives, and that's including the display as well).
[/quotemsg]
The answer is actually simpler than that, since CPUs and GPUs follow Little's Law and queueing network theory. If his CPU utilization is at 50% with the GPUs as the bottleneck, then the overclock needed to shift the bottleneck to the CPU is precisely 100% (going from 50% to 100% utilization implies doubling the effective frequency, i.e. a 100% overclock). As mentioned in another response, that is simply not achievable under normal circumstances.[/quotemsg]
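On the PSU point quoted above, here's a quick back-of-the-envelope on that "double the wattage you need" rule of thumb. The per-part wattages below are my rough guesses for illustration, not measurements:

[code]
# Back-of-the-envelope PSU sizing using the "double your load" rule of thumb.
# All wattage figures are rough guesses for illustration, not measured numbers.

estimated_load_w = {
    "two Titan-class GPUs": 2 * 210,   # guess: ~210 W per card under gaming load
    "overclocked hex-core CPU": 130,   # guess
    "drives, fans, board, RAM": 50,    # guess
}

total_load = sum(estimated_load_w.values())  # ~600 W, in line with the 500-600 W quoted above
recommended_psu = 2 * total_load             # rule of thumb: rate the PSU at ~2x the load so it
                                             # sits near the middle of its efficiency curve

print(f"Estimated load:    ~{total_load} W")
print(f"Rule-of-thumb PSU: ~{recommended_psu} W")
[/code]

By that rule an 850 W unit comes in under the 2x mark, which is why it "can do it" but runs higher up its load curve.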
In my case I'm pretty sure it was the cache sizes that made the most difference, but I really did gain 10-20 FPS going from my 2600K to the 4930K. I'll also emphasize that the 4930K has yet to go over 60% in gaming with 1080 SLI, while Filippo stated that his 6700K (4.2 GHz) pegged 100%, leaving the Titans bottlenecked at 50% (quick sketch of that utilization math just below).
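To put rough numbers on the Little's Law point from the quote, here's a minimal sketch of the utilization arithmetic. It treats the CPU and GPUs as a simple two-stage pipeline and ignores caches, RAM, and driver overhead, so take it as back-of-the-envelope only:

[code]
# Bottleneck arithmetic sketch: the component pegged near 100% is the bottleneck;
# the other component takes over as the bottleneck once the current one speeds up
# by a factor of 1 / (the other component's utilization).

def speedup_to_shift_bottleneck(other_utilization: float) -> float:
    """How much faster the current bottleneck must get before the other
    component (currently at `other_utilization`) hits 100% and takes over."""
    return 1.0 / other_utilization

# Filippo's reported case: 6700K pegged at 100%, Titans sitting at 50%.
gpu_util = 0.50
needed = speedup_to_shift_bottleneck(gpu_util)  # -> 2.0x
print(f"CPU needs a {needed:.1f}x speedup, i.e. a {(needed - 1) * 100:.0f}% overclock")
print(f"That's 4.2 GHz -> {4.2 * needed:.1f} GHz, which is not happening on normal cooling")
[/code]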
I do completely agree with you on 144 Hz though. I honestly can't see much of a difference from 110 to 144. I just mentioned it because of the new craze the display manufacturers are obsessing over, and you know that no matter how ridiculous it is, there will be people trying to push it.