Thanks Palladin.
Guys, those of you who have posted regularly in this thread over the past year or so will have now received a PM from one of our staff called "Tittilating" asking you for some details so we can ship you a Toms Case Badge for your PC.
I went back through this thread and nominated about 10 of you.
Please respond to the link provided and you will get something in the mail in a couple of days.
Please feel free to post a pic of your pooter with the new case badge.
There are only a few of these so treasure them.
Tom.
Thanks!
Also, in other news, FX 8350 pushes SLI @ 4K resolution better than 4930K:
http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html
Wow, I would have never guessed. Do you have any idea why? Even PCIe 3.0 alone on the Intel build should put Intel ahead. Maybe it's the fast RAM; I do notice the minimum frame rates are lower.
Starting to think The Evil Within is just a bad port:
http://www.dsogaming.com/news/rumor-the-evil-within-pc-may-be-locked-at-30fps/
I've got a theory on that actually. Remember that so far, Intel Quads in newer titles, maxed out, are reaching the upper 70s in core usage in the worst case. As I've noted several times: The highest usage core is the one that will limit you the most. It's possible that at 4k, even though the GPU should be doing vastly more work, the Intel CPUs are getting just enough extra work where the two heavy threads are maxing out one core, creating a CPU bottleneck in the process. The effects are minor due to the GPU bottleneck at that res, but they are there.
Working from that theory, Intel should see a gain from DX12/OGLNG/Mantle/whatever due to reduced CPU loads, as that one heavy core wouldn't be a problem anymore. That would make per-core performance dominant again, which benefits Intel processors.
Again, just a theory. Would need core loading numbers to confirm.
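Just to illustrate the "busiest core is the limit" idea behind that theory, here's a toy sketch. All the numbers are made up for illustration; the point is that frame time tracks the single most-loaded core, not total CPU usage:

```python
# Toy model: a frame can't finish before its most-loaded core does,
# so the busiest core sets the CPU-side frame-rate ceiling.
# All workloads below are hypothetical numbers, not measurements.

def frame_time_ms(core_work_ms):
    """CPU-side frame time = work on the single busiest core."""
    return max(core_work_ms)

# Hypothetical quad-core, same 22 ms of total work in both cases:
piled_up   = [14.0, 3.0, 3.0, 2.0]  # two 7 ms heavy threads stuck on core 0
spread_out = [7.0, 7.0, 3.0, 5.0]   # heavy threads split across two cores

print(frame_time_ms(piled_up))    # 14.0 ms -> ~71 fps ceiling
print(frame_time_ms(spread_out))  # 7.0 ms  -> ~142 fps ceiling
```

Same total load, double the ceiling, which is why one maxed-out core can bottleneck even while overall CPU usage looks modest.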
So, the FX's weaker single-core performance forces the game to split up the threads across more cores?
Windows SMT support should also avoid putting 2 heavy threads on the same core.
I'd like to say Windows 7 does this already with the scheduler patches installed, and 8, 8.1, and 10 do it out of the box.