How Many CPU Cores Do You Need?

Page 4 - Tom's Hardware community forum
[citation][nom]WheelsOfConfusion[/nom]it would have been nice to throw some data about power usage with each configuration. Does disabling a core (or three) significantly reduce power consumption? What about temps?[/citation]

I agree, but I didn't put that into the article and here's why: I'm not confident that disabling the cores through software would have provided accurate temperature and power usage data compared to actual retail CPUs with varying amounts of CPU cores.

It looks like there's been enough interest in this story for some follow-up articles where I'll explore more of these issues though!
 
This article doesn't take into account the new Core i7 processors; the old Core 2 architecture has a memory bottleneck caused by the FSB. I would like to see the same test performed on a Core i7.
 
[citation][nom]Tindytim[/nom] ..... As the amount of cores increases in the mainstream, more developers will write apps that take advantage of more cores. And as such having one core more than a resource intensive application will increase performance. So I want to make amendment to my previous statement.I want a 9 core processor since Blender can only use 8 threads.[/citation]
Which is one MAJOR point the article completely missed -- planning for the future. If I take the article's advice and get a two- or three-core system for $1-2k, then when support for 4, 8, 16 and more cores arrives in the future I'll be left behind, and since I just dropped a good amount of money on it, I'm not going to want to upgrade.

But if you get a 4-8 core system now, you're set for a long time to come.
 
This is getting on my last nerve and I will say this once, so listen up. The i7 does NOT have 8 cores. It has 4 cores and runs 8 threads. It only appears as 8 processors to the operating system. Each core can execute instructions from two threads at the same time, but only for a performance gain of around 10-30%. A single CPU with HT will not perform as well as two CPUs without HT. You do not have 8 CPUs (cores) with an i7.
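The distinction above is easy to see in code: what the OS (and most APIs) report is the number of *logical* processors, so a 4-core i7 with Hyper-Threading shows up as 8. Here's a minimal sketch in Python; `os.cpu_count()` is the standard call, while the physical-core estimate parses `/proc/cpuinfo` and is Linux-only, falling back to the logical count elsewhere.

```python
# Logical vs physical CPU counts. A Hyper-Threaded 4-core i7 reports
# 8 logical processors but only 4 physical cores.
import os

def logical_cpus():
    """Logical processors as the OS sees them (includes HT siblings)."""
    return os.cpu_count()

def physical_cores():
    """Best-effort physical core count (Linux); falls back to logical."""
    try:
        cores = set()
        phys_id = core_id = None
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("physical id"):
                    phys_id = line.split(":")[1].strip()
                elif line.startswith("core id"):
                    core_id = line.split(":")[1].strip()
                elif line.strip() == "":
                    if phys_id is not None and core_id is not None:
                        cores.add((phys_id, core_id))
                    phys_id = core_id = None
        if phys_id is not None and core_id is not None:
            cores.add((phys_id, core_id))
        return len(cores) or logical_cpus()
    except OSError:
        return logical_cpus()

if __name__ == "__main__":
    print(f"logical: {logical_cpus()}, physical: {physical_cores()}")
```

On an HT machine the two numbers differ by a factor of two; on a non-HT chip they match, which is exactly why Task Manager alone can't tell you whether you "have 8 cores."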
 
Quick question: what programs were running in the background? As in, other than your benchmarking software, were you also running anti-virus, a networking suite (like the Intel Wireless Suite that used to come with most laptops), and/or any media-playing software like RealPlayer?

I knew that most games and apps would probably work best with at least a dual core, but what happens when, in addition to the apps you are primarily using, there are programs running in the background? I'd expect that the cores not being used on the primary application would be used for those programs, thereby offering the potential for better performance in the main program if the processor isn't having to devote clock cycles to processes other than what you are primarily using.

But if you guys also had, say, AVG Internet Security, RealPlayer, and all the other stuff that usually makes its way down to the bottom left-hand corner of the screen during your benchmarks, then I guess dual core really is the way to go for most people, with triple core being the gamer's budget solution.

And of course, the guys who blow five grand on a new system every two years really don't care about this, they'll keep plugging away with as many cores as Intel or AMD care to offer.
 
[citation][nom]darkmyst85[/nom]This article doesn't take into account the new core I7 processors, the old core 2 architecture has a memory bottleneck caused by the FSB. I would like to see the same test performed on a core I7.[/citation]

A very small fraction of the community actually owns an i7 due to the excessive cost of the platform. I'm actually surprised a quad-core Phenom X4 or Phenom II X4 wasn't used alongside the Intel. Aside from that, why not use both an ATI and an Nvidia-based graphics card to compare the differences more fairly? 🙂

-- MaSoP
 
[citation][nom]masop[/nom]A very small fraction of the community actually own an I7 due to the excessive cost of the platform. I'm actually surprised a quad core Phenom X4 or Phenom II X4 wasn't used alongside the intel.[/citation]
Excessive cost? $200 for a mobo, $300 for a processor, and $100 for 6 gigs of RAM doesn't seem excessive to me considering the performance, versus the $200 for an AM3 mobo, $250 for the processor, and $75 for 4 gigs of RAM of a Phenom II.
 
I actually read the entire article this time and didn't just skip to the results, so kudos to the author. Someone else mentioned this and I think it's worth exploring further; I think these results are due to memory bandwidth saturation and cache consumption. If this test were run on a well-optimized i7 system you might see about the same relative performance change between 1 and 2 cores, but then a more dramatic change between 2 and 3 cores and then 3 to 4 cores. If you could tune the system to increase the memory clock without increasing the core clocks, you may see a further shift in the results. I'm sure the engineers at AMD and Intel spend a fair bit of time on this when optimizing their designs, but for those of us who like to tweak our systems, it gives us another parameter to optimize.
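The kind of core-scaling this comment speculates about is easy to probe yourself: split the same fixed workload across 1..N worker processes and time it. A compute-bound task like the one below should scale near-linearly up to the physical core count, while a memory-bound one saturates earlier. This is only a rough sketch; the timings are entirely machine-dependent and the workload size is an arbitrary choice.

```python
# Rough core-scaling probe: run a fixed total amount of CPU-bound work
# split across a varying number of worker processes and time each run.
import os
import time
from multiprocessing import Pool

def busy(n):
    # CPU-bound inner loop: sum of squares, very little memory traffic.
    return sum(i * i for i in range(n))

def timed_run(workers, total_work=400_000):
    """Return (elapsed_seconds, checksum) for `total_work` split N ways."""
    chunk = total_work // workers
    start = time.perf_counter()
    with Pool(workers) as pool:
        results = pool.map(busy, [chunk] * workers)
    return time.perf_counter() - start, sum(results)

if __name__ == "__main__":
    for w in (1, 2, min(4, os.cpu_count() or 1)):
        elapsed, _ = timed_run(w)
        print(f"{w} worker(s): {elapsed:.3f}s")
```

If the 3-to-4-core step stops helping here even for a cache-friendly loop, that points at the FSB/bandwidth ceiling the comment describes rather than the software.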
 
Great article

I would like to see how a few more Adobe programs perform, especially After Effects and Premiere, as well as a few more "professional" video renderers: the Adobe video encoder, the Flash video encoder, Canopus ProCoder, etc. I was stunned by the lack of difference in Photoshop; however, the CS4 64-bit version should be tested.

Thanks
 
Besides those who want to see more benchmarks and on different platforms, there seem to be two camps here.

Camp #1 - Dual core is good enough for most everyone, buncha dorks that went and purchased a Quad aren't getting what they paid for. (apparently this camp owns dual cores and they need to justify their purchase)

Camp #2 - With a Quad core I can encode the planet while playing Crysis, run a virus scan, and listen to ITUNES!!! MUHAHA!! Buncha dorks that only have dual cores don't know what they are missing. (apparently they also need to justify their purchase)

Funny stuff, carry on folks. :)
 
Excellent article. I guess for it to mean much to the once-in-a-lifetime upgrader, it may have been more pertinent to compare actual processors from the past and present. An example may be a progression from an AMD 3800+ single core (2.4) - 4800X2 dual core (2.4) - 8750 triple core (2.4) - 9750 quad core (2.4). At the same time, utilizing one standard GPU.
Understanding that technology has changed dramatically in recent times, the potential upgrader is faced with this real scenario.
I'm wondering whether the results would be the same compared to starting at the top with four cores and simply switching off individual cores?
 
Please, let's stop all the discussion on the i7 and do the benchmarks, from 1 to 8 hyperthreaded cores.

I bet 5 or 6 "cores" would generally show the peak performance.
 
Great article. Useful. Some thoughts:

- glawk et al. need to consider total system cost, i.e.:

[citation][nom]glawk[/nom]
Q9650 = $325 @ newegg = 149.7 fps
E8500 = $185 @ newegg = 133.7 fps
That's a 75% increase in cost for an 11% increase in performance.[/citation]

In a $1500 E8500 system, the $140 premium for a Q9650 yields an 11% performance increase for a 9.3% increase in cost. Not terrific, but the point remains.

- There's no need to justify the duo or quad you bought last year. Just remember multiprocessing is an option, and choose your cpu accordingly.

- With (most) games using more than one core now, there's far less rationale for gamers to buy dual core today, except . . .

- NOT at the expense of (HOW MUCH?) processor speed. If this were 1 year ago, tossing an E8XXX into the mix would have been useful . . . but now?
 
This would have been a five-star article except that the author forgot to mention one very significant point - MULTITASKING. When you're analyzing the performance difference between 1-4 cores, you don't just look at how one app runs at a time. These are not the days of DOS 6.0.

Even when you have nothing going on, background tasks and services rob the CPU of performance. When I surf at home, I've got WMP running in the taskbar, MSN running in one corner of the screen, and AVG doing its thing in the background. Guess which CPU wins? Quad-core, hands down.

If the author had gone the extra mile and done a multitasking exercise to show the REAL benefits of what more CPU cores can do for your system and the user experience, Tom's would have had an excellent article.

That being said, I would always recommend quad-cores. Dual core is primitive compared to a Core i7 today. And while you may think a Core i7 is overkill or expensive, once you begin to multitask it will make a world of difference, regardless of whether your apps are single-thread-optimized or not.
 
[citation][nom]joeman42[/nom]SHAME ON ALL OF YOU! The reason why 3 cores gives a slight advantage in certain cases over 4 is obvious yet completely overlooked by the writer and all the respondents. C2Q are essentially two C2Ds sandwiched. When a strictly 3 core load is presented the overhead of managing an unneeded 4th on the 2nd pair created the results shown. The i7 and Phenom cores are native quad and up and would, I suspect, not have this glitch.[/citation]

Small problem with that theory, I believe. I think the overhead is minimal, if it even exists, as the operating system would just ignore the extra core as if it didn't exist and not issue instructions to it. If there really is overhead, though, explain why ignoring the core has less overhead and gives better performance than using it.
 
They should have included music playing or a download running in the background during the games testing. While many of the games don't use the 3rd core, something tells me that if anything is running in the background, the situation will be handled much better with the 4th core enabled. I mean, when I do FPS gaming, I tend to play Enigma with in-game music disabled and only effects enabled, while some sort of download is running in the background...
 
lol
How much did AMD pay you to say that triple cores are all you need for gaming? :)
Obviously you're reaching a GPU bottleneck at 70-80 fps.
Upgrade your GPU and rerun the tests.
 
It's a good idea for an article, but why didn't you use a quad core for testing? You used the wrong processor, and therefore the results aren't very useful.

The Nehalem or Phenom (the Phenom probably best, since you wouldn't have to worry about Hyper-Threading) would have been a true quad core, not the 2 x 2 you picked. The Phenom also offers a dual core now, so it makes a little sense, although the Nehalem does not, so the data would be more academic.

Which cores are shut off when you pick dual core? Is it one from each die, or two from the same die? I'd guess by looking at it that you're getting one from each, but either way, you

That's why you see some of the strange behavior in the benchmarks, by the way: you didn't use the right processor. A true quad would have exhibited different behavior.

Also, why use an obsolete quad core? Who would consider it today? The size of the cache is very important to the behavior of the processor as you use different cores. What thrashes a smaller cache may not thrash a larger one, and thus you could see different performance characteristics.

For the reasons given, I would by no means call this a definitive statement on how many cores to get. It tells you how one obsolete processor performs with different enabled cores, and gives a rough idea of what to expect, but only a rough one that could be contradicted by different configurations. I'm not saying it's useless by any means, I just wish it did not aspire to be more meaningful than it is.
 
[citation][nom]Robert Spanjaard[/nom]There are a couple of things to consider here that haven't been mentioned yet. The slight speed increase on single-thread applications, when moving from one to two cores, is easy to explain. On two cores, all the processes running in the background are handled by a different core, so you only measure the application itself. The AVG test is also interesting. Most times, this will be a process running silently in the _background_. Now ask yourself this: do you _want_ it to occupy all four cores, potentially slowing your entire system down? Or should it stick to a single core, and leave the other three free for whatever it is that you're doing in the foreground? In some cases, multi-threading is a good choice. But in other cases, it seems to undermine the concept of multitasking, especially when the application fails to set a low priority for itself. Some benchmarks, like archivers, will probably always scale badly, simply because compression algorithms are hard to split up without sacrificing compression rate. Of course, if you compress multiple archives at the same time, they will scale just as well as any other application.[/citation]
You can choose which cores you want AVG to run on in Windows Task Manager...
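The Task Manager trick above (right-click a process, "Set Affinity") can also be done programmatically. Here's a minimal Linux-only sketch using the Python standard library; on Windows you'd need the Win32 `SetProcessAffinityMask` API or a third-party package like psutil instead.

```python
# Pin the current process to a single logical CPU and restore the
# original affinity mask afterwards. Linux-only: os.sched_setaffinity
# is not available on Windows or macOS.
import os

def pin_to_cpu(cpu):
    """Restrict the current process to one logical CPU; return the old mask."""
    original = os.sched_getaffinity(0)   # 0 means "this process"
    os.sched_setaffinity(0, {cpu})
    return original

if __name__ == "__main__":
    cpu = min(os.sched_getaffinity(0))   # pick a CPU we're allowed to use
    saved = pin_to_cpu(cpu)
    print("now restricted to:", os.sched_getaffinity(0))
    os.sched_setaffinity(0, saved)       # undo the pinning
```

This is exactly what the quoted comment is asking for: confine a background scanner to one core and leave the rest free for the foreground application.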
 
I'm going to second that suggestion for a follow-up article.

How many people game with no background apps?

I usually have at least 10 Firefox tabs, an instant messenger, Thunderbird, and some random crap open while I'm running a game or something.
 
As far as I know, WinRAR does use all four cores of a quad. At least, I see them all working when creating a RAR file.
 
It seems to me that some of the apps that see a slowdown when moving to four cores are likely bumping into bandwidth and bus-arbitration overheads, as the Q6600 is essentially two C2Ds packaged on the same chip, sharing the FSB. The Core i7 eliminates this bottleneck, and I'd be willing to bet the performance decrease from 3 to 4 cores goes away as well.

I personally don't see this happening with games. I've suspected that investing in either brand's quad-core CPUs might not be as efficient as AMD's triple cores if gaming is your only consideration.

The majority of PC games are now at the mercy of console ports, and the Xbox 360 uses 3 cores. The PS3 has more cores than that, but to my understanding they use the additional cores to make up for the inferior graphics chipset.
 