Should I upgrade from my i5-2320?

a_shrub

May 10, 2013
I'm looking to upgrade from an i5-2320 to an i7-2700K because of the mediocre performance I'm getting in almost every modern game, and in benchmarking suites like Futuremark's.

I don't know much about CPUs to begin with, but from what I've read elsewhere, it wouldn't be much of an upgrade (aside from Hyper-Threading, which I would find useful, and slightly higher clocks). But when I compare it to a PC I use at work that has an i7-2700 (same manufacturer, roughly the same parts), the story changes: I get about 2-3 times better performance (even with its weak GT 220 GPU) in the couple of games I was able to bench.

Futuremark scores even seem to back this up, with the i5-2550K scoring about 70% higher than the 2320, and the i7-2700K scoring twice as high (I'm not sure whether that includes the HT threads).


I'm unsure what to believe at this point, though I'm leaning toward the 2700K giving far better performance than my current CPU does.

Anyway, should I upgrade to an i7-2700K or not? (The PC is a store-bought Lenovo K330B-7747 with an added GTX 680, if that info helps any.)

Also, to clear up any remaining doubt: would my current stock OEM motherboard support an i7-2700K?
 
I don't understand why you are getting such poor scores. An i5-2320 is essentially a 2500K at 3.0GHz, so you should be only about 5% slower than a 2500K. You do have a quad-core processor. If you upgrade to a 2500K or 3570K you won't see much difference unless you overclock them; if you do overclock, THEN you'll see a huge difference. I don't know why you're getting such low scores. A Sandy Bridge i5 is pretty good, almost the best you can get for gaming.

The only meaningful upgrade I can recommend would be a 3770K, and even that won't increase performance all that much, maybe 10% in games. I'm really surprised to hear such low benchmark scores from a QUAD-core Sandy Bridge chip running at 3.0GHz.

Definitely don't get a 2700K; it's two-year-old tech. At least get a 3570K or 3770K so you can overclock. A 3570K and 3770K perform exactly the same in games and single-threaded apps; the 3770K is only roughly 30% faster in well-threaded benchmarks. So if you don't use well-threaded apps, the 3770K is pointless. Plus, you can overclock a 3570K and match a stock 3770K in well-threaded apps anyway (though you can always overclock the 3770K too).

Most people who have a Sandy Bridge i5 aren't considering upgrading right now, since they're still pretty fast compared to the new Ivy Bridge i5s.
 
Methinks you have problems other than the CPU, like drivers.
1) What driver version are you using? Have you updated it?
2) Are you actually switching from the iGPU to your GTX 680 when gaming? FurMark should show which GPU is being used.

Here's a quote I found:
Statement: "I'm currently hitting 45-60fps on Skyrim and 55-60 in Battlefield. Which is great. But I'm wondering if my i5 2320 is holding me back. Was kind of hoping to get a constant 60 at least maybe 70-80. Maybe that was unrealistic."
The response: "I think your CPU is fine."

Read: http://www.tomshardware.com/forum/360083-33-solved-core-clock-problem

The biggest advantage of the i7 comes IF the game is coded to use more than two cores; if the game only uses up to two cores, you'll probably see very little performance gain from upping the CPU.



 
For games, Hyper-Threading is not very useful. It mainly benefits FPU-heavy instruction mixes, which games don't have in abundance. Hyper-Threading rarely gives much benefit over non-HT processors in games.

Your i5-2320 is still very competent, even today. It is very, very odd that you are seeing a massive 2-3 times better performance over it. At most, under ideal conditions, I would expect maybe 40% better performance, mostly down to cache and clock speed.

If upgrading is on the horizon, I would go for an i5-3570K if your motherboard supports it. If not, I would wait for Haswell, since an upgrade to an i5-2500K would make very little difference in performance.

Which games are you seeing such a massive performance difference in, though?
 
Also, you must have a really poor graphics card to be beaten by a GT 220. Games are mainly GPU-dependent: if you're getting low FPS, it's usually because of your GPU, not your CPU. The CPU still matters, but not nearly as much as the GPU.

Which Futuremark benchmark are you using: PCMark 7, 3DMark 11, or 3DMark? If you're using 3DMark 11 or 3DMark, those scores mostly reflect your GPU, not your CPU. The only part of those benchmarks that rates the CPU is the Physics test, and with that processor you should get around a 5000 Physics score in 3DMark 11, which is good.
 
Wow, I just realized you have a GTX 680. There's no way you're getting beaten by a GT 220. You must be running on the integrated Intel HD graphics. You have to select the GTX 680 in the NVIDIA Control Panel so everything runs on it.

Something is not right with your rig. Given that you have a $500 graphics card, I'd definitely try to fix it so you get what you paid for.
 


It's using the 680. Intel's integrated graphics would never be able to run some of the more GPU-demanding games I've played (Far Cry 3, Metro 2033, Crysis 2 and 3, etc.). Plus, the fact that the card reaches about 80-82°C (quite hot, IMO) means it can't be the Intel HD doing the work.

All of the drivers are up to date. Before updating them, I did a complete restore of my OS hoping it would fix the problem, which it didn't.

As for the Futuremark scores, I do have this:
http://www.3dmark.com/fs/68997 (the GPU's clock was logged at 705MHz when it should be 1175MHz)

However, I was referring to this comparison in 3DMark 11 (which is in line with the performance difference I'm seeing):
i5-2320: 3930
i5-2550K: 6140
i7-2700K: 8040
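Just my own arithmetic on the scores above, to put the relative gaps in one place (the score values are the ones I quoted, nothing official):

```python
# Relative 3DMark 11 scores, normalized to the i5-2320's result.
scores = {"i5-2320": 3930, "i5-2550K": 6140, "i7-2700K": 8040}
base = scores["i5-2320"]
for cpu, score in scores.items():
    print(f"{cpu}: {score} ({score / base:.2f}x the 2320)")
# i5-2320: 3930 (1.00x the 2320)
# i5-2550K: 6140 (1.56x the 2320)
# i7-2700K: 8040 (2.05x the 2320)
```

So by these numbers the 2550K comes out about 1.56x ahead and the 2700K about 2x.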




The only engine weak enough to run well on the GT 220 while still having the CPU be the bottleneck (for the most part during my demo benches) was the Source engine, specifically Counter-Strike: Source running with a max-FPS config.

Benching some demo playbacks told the same story as the last demo bench.
For example, on a demo from a 64-slot ZE server running a very unoptimized map, the 2320 averaged ~63 FPS on a run, while the same bench on the 2700 averaged ~112 FPS.
Playing a demo from a 32-player dust2 server, the i5 rig refused to reach or stay at 300 FPS (GPU at about 30% load) unless I looked at the ground, whereas the i7 was pushing 700 in some areas (GPU at 99% load).
And this carries over to every game I play: Far Cry 3 can't keep a stable 60, Assassin's Creed 3 drops to 20 FPS (while GPU usage drops too) when looking at the entire town or a bunch of NPCs, and Skyrim jumps from 60 down to 30-45 when I enter an area with NPCs. Even my friends on Steam, running the 2600/2700 CPU models, are getting far better results than me.
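To put a number on the gap from those two demo benches (again just my arithmetic on the averages I measured, not a proper benchmark):

```python
# i7-over-i5 speedup in my Counter-Strike: Source demo playback benches.
# Each entry: (i5-2320 rig average FPS, i7-2700 rig average FPS).
benches = {"64-slot ZE demo": (63, 112), "32-player dust2 demo": (300, 700)}
for name, (i5_fps, i7_fps) in benches.items():
    print(f"{name}: {i7_fps / i5_fps:.2f}x faster on the i7 rig")
# 64-slot ZE demo: 1.78x faster on the i7 rig
# 32-player dust2 demo: 2.33x faster on the i7 rig
```

That's roughly a 1.8-2.3x gap, which matches the "2-3 times better" I mentioned earlier.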

Since both PCs seem to have near-identical hardware, the drivers are up to date, the 680 is being used, and you're all saying the CPUs should perform the same (which some tests back up and some don't), I'm confused.

I may go for broke and just grab a 2700K to see whether it's my PC bottlenecking or the CPU truly underperforming, and return it if there's little difference.

 


82°C is throttling territory, but that's usually not a worry since it only happens under the auto fan profile. I would never let my card run that hot outside of a quick two-minute benchmark; it drops down to the low 70s when I control the fan speed manually (apparently the card's auto-fan feature thinks 80+°C is fine, lol). I also run the card overvolted using a BIOS mod to achieve a slightly higher clock (yes, I know the risks), so it's natural that its temperature runs that high. I generally never let the card go above 75°C when playing a game; it was just the best proof I could give that my 680 was being used and not the Intel HD.

The i5 usually averages in the upper 50s under full load (when I stream my Blu-ray movies to my phone, which requires transcoding them to another codec/format), and it runs cooler when playing games since the CPU isn't fully utilized. I've yet to see it break 58°C, actually.
 


Okay, fair enough. I am still convinced beyond a shadow of a doubt that something is amiss here. An i7 does not outperform an i5 by such a margin, especially at stock clocks, and especially within the same microarchitecture. Something just isn't right here.