Upgrade: 3570K to 6800K?

Jonathanese

Distinguished
Jun 7, 2010
273
0
18,790
After I got my GTX 1070, I started realizing how much of a bottleneck my CPU apparently is. For a card that is supposed to beat the Titan X, it's running something like 40% slower than it should.

Now, I'm not sure how much of a bottleneck it actually is, because my GPU still shows high usage. But for instance, Assassin's Creed: Unity runs at about 40FPS maxed out at 1440p, while my previous 970 was able to lock 60 at 1080p. So I'm thinking maybe it has to do with an outdated PCIe revision or not enough PCIe lanes? Either way, I figure it might be time to start looking at upgrades. (Ugh. I just spent $450 on a graphics card and $500 on a monitor.)

The upgrade path I'm looking at is the i7-6800K. I figure an upgrade from 4 threads to 12 might be at least a little noticeable, even though it seems like Intel hasn't done much in terms of per-core performance improvements. I'd pair that with 16GB of quad-channel DDR4-3200, an upgrade from my 16GB of dual-channel DDR3-1600.

I assume the ASUS X99-A is a good motherboard and the Zalman CNPS 9900 MAX is a good cooler. An older cooler, but from the days of more power-hungry CPUs.

Altogether, that would be about $800, which would put the total spent in two months at $1,750. About $500 more than my car is currently worth.

Do you think something like this would be worth it? Or should I stick with my 3570K for another year or so? I mean, that's still less than half of what I make in a month.

EDIT: Also, my 3570K is running at 4.6GHz. I hear that should be plenty. Should I just wait for new drivers?
 
photonboy
Hey,
1) The real-world performance advantage in most games from upgrading your current CPU to a Skylake i7 would be MINIMAL.

2) It's not a PCIe issue.

3) Skylake has roughly 15% more processing power per physical core than your Ivy Bridge chip (ignoring Hyper-Threading, which isn't a benefit most of the time).

4) Your "proof" that your current CPU is bottlenecking things isn't very scientific. You need to look at CPU benchmarks comparing the same settings, or at least log CPU and GPU utilization together while you play (see the sketch at the end of this post).

Summary:
IMO upgrading would give very poor value.

In fact, there are much better ways to spend the money, such as a G-SYNC monitor if you don't have one of those. Though if you already have a nice non-G-SYNC monitor, I'd wait for prices to drop (about $750 USD for a 27", 2560x1440 IPS).
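On point 4, a rough way to check for a CPU bottleneck is to log per-core CPU load next to GPU utilization while the game is running. A minimal sketch, assuming Python with psutil installed and nvidia-smi on the PATH (the sample count and thresholds are just placeholders to adjust):

import subprocess

import psutil

# Sample once a second for a minute while the game is running.
# If one or two cores sit near 100% while the GPU stays well below
# ~95%, the CPU is probably the limit; if the GPU is pegged, it isn't.
for _ in range(60):
    cores = psutil.cpu_percent(interval=1, percpu=True)
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True).stdout.strip()
    print("CPU per core:", cores, "| GPU:", gpu + "%")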
 

lrrelevant

Commendable
Jun 22, 2016
210
0
1,760
If you're looking solely at gaming performance, go for the 6700K instead of the 6800K. Personally, I'd suggest you wait for Kaby Lake like he suggested. It'll save you quite a chunk of cash as well. So if you want to remove your gaming bottleneck completely, my advice is to stick with your 3570K until the release of the i7 7700K(?), which will most likely go on sale in a couple of months.
 
http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/16

Some games are more demanding, but observe how little benefit many of them get. Fallout 4 is one of the worst-case scenarios, though things may have slightly improved since THIS benchmark -> http://www.gamersnexus.net/game-bench/2182-fallout-4-cpu-benchmark-huge-performance-difference

From the 1440p benchmarks you can see maybe a 15% average FPS increase if you upgraded to Skylake. It used a GTX 980 Ti Hybrid, so it's very comparable to your card.

Plus, the average FPS was already 77FPS for a stock i5-4690K. Heck, even the i3-4130, a 2C/4T part, is holding up pretty well, so you can see that a dual core with Hyper-Threading often gets you most of the performance... there are diminishing returns beyond a good quad-core CPU.
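To put that 15% in perspective, here's a quick back-of-envelope in Python, using the 77FPS average above and the roughly $800 quoted for the platform upgrade (both numbers come from this thread, so treat this as rough arithmetic only):

base_fps = 77.0                    # stock i5-4690K average from the link above
uplift = 0.15                      # rough Skylake gain from the 1440p numbers
new_fps = base_fps * (1 + uplift)  # about 88.6 FPS
cost = 800.0                       # the OP's CPU + board + RAM estimate
print(round(new_fps, 1), "FPS, roughly $",
      round(cost / (new_fps - base_fps)), "per extra frame per second")

About 11-12 extra frames for $800, or around $69 per frame per second, on top of a baseline that's already playable.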
 

Jonathanese

Distinguished
Jun 7, 2010
273
0
18,790
Excellent info guys! Makes me feel much better.


@PhotonBoy

Yes, definitely unscientific. I haven't had time to do any really detailed benchmarks. Normally I go all out and create a spreadsheet to do many of the calculations for me.

And if you noticed my mention of spending $500 on a monitor, G-Sync is exactly what I set out to get. I figured I would make an investment, but also take a risk, so I got the ASUS ROG Swift, but refurbished, with half the reviews saying theirs didn't work. Mine worked right away, thank god. I ran whatever stress tests I could on the monitor: varying refresh rates, 144Hz constant movement overnight, etc. The thing works like a charm, with the exception of minor flickering at frame rates below 30, such as on load screens. So now I have 1440p, 144Hz, G-Sync, which is what made me want to upgrade to the 1070. But yeah, G-Sync is bliss. No longer do I have to worry about dipping and hiking frame rates.

I do notice, however, that 60FPS on this monitor looks worse than 60FPS on a 60Hz monitor. You can tell the individual frames apart more easily.

My brother bought the 970. At a steal, too: $200, and it still has my fastest stable overclock flashed to it. The thing is neck and neck with a 980.


@Irrelevant

I'm going to have to take a look at what they are saying about Kaby Lake. And I'm also crossing my fingers that new drivers change the game altogether.

By far the most CPU-bound game I play is Minecraft. You probably chuckled. But with shader mods, large texture packs, and BuildCraft, it goes anywhere from a sustained 100fps in open scenes down to a sustained 20fps in heavy BuildCraft scenes, with all CPU cores maxed out and 8GB+ of RAM in use. The CPU struggles to keep up with loading in chunks, even though OptiFine allows for multi-core chunk loading. That likely has less to do with an underpowered CPU and more to do with shoddy programming, or just Java itself.
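For what it's worth, heavily modded Minecraft often responds more to JVM tuning than to raw CPU speed. A common starting point for the launcher's JVM arguments, assuming an 8GB heap suits your pack (the numbers are guesses to tune, not gospel):

-Xms8G -Xmx8G -XX:+UseG1GC -XX:MaxGCPauseMillis=50

Pinning -Xms to -Xmx avoids heap-resize stalls, and the G1 collector tends to produce fewer long GC pauses than the older collectors on big modded heaps, which is usually what those sustained 20fps dips are.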
 

lrrelevant

Commendable
Jun 22, 2016
210
0
1,760
Honestly, the guy in the video is probably exaggerating a bit. But as you might know, GPUs are getting ridiculously fast at the moment, so the older chips from the 2500K up to the 4690K might produce some bottlenecks if not overclocked pretty well (someone correct me if I'm wrong). And most games only utilize 4 threads, with a small number using more. But the newest 6600K/6700K is GUARANTEED to NOT bottleneck your 1070/1080 if overclocked to around 4.5GHz, which is very normal. I'm talking about literally any game here. So if you want to be absolutely certain you are not limited by your CPU in any way, go for Skylake, or preferably Kaby Lake for its improvements in instructions per clock, which will grant you better FPS in CPU-bound games, clock for clock, compared to anything else. Hope this helps a little.
 
Solution


There is no CPU that is never the bottleneck. StarCraft 2 is just one example: you'll never stay above 60FPS in an intense battle, no matter what you play it on. Fallout 4 has bottlenecks even with the i7-6700K.

All you do by purchasing a faster CPU is reduce or eliminate the bottleneck (depending on the game).

The question is whether that's worth the extra money for a new CPU, motherboard, memory, and copy of Windows, plus reinstalling everything, and I say no F'ing way.