Is This Even Fair? Budget Ivy Bridge Takes On Core 2 Duo And Quad

The benchmarks I have been waiting for...
It seems clear that IPC gains have slowed dramatically in recent years; I don't see any revolution in the last few CPU generations. Look at the 3570K and the Q9550 at the same clock speed: a 30% improvement in more than five years is... nothing (IMHO).

From a business point of view, power consumption is the only real improvement.

I think that traditional computing is heading toward a new MEDIEVAL AGE. Yes, you know, the horse cart stayed about the same for hundreds of years. I think we are going to live through a CPU performance stasis for a long time (the new horse cart); I hope for a technological breakthrough.

The nice thing about this is, I'm not going to upgrade my horse cart for a looong tiiimeeeee
 

If having more threads and cores could benefit a broader selection of mainstream applications, the obvious way forward would be to do exactly that: add cores and threads.

The biggest challenge to higher performance in mainstream applications is twofold. On one hand, software that depends heavily on user input tends to just sit there waiting for input most of the time, regardless of how well or poorly it may be threaded. On the other hand, a lot of code that depends on user input is intrinsically sequential, which makes it difficult if not impossible to thread in a meaningful and efficient manner.

In other words: mainstream hardware is waiting for mainstream software to catch up.
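To make the contrast concrete (a toy sketch of my own, not from the article): the first function below is input-driven and inherently serial, since each step depends on the previous user action, while the second splits cleanly across however many cores the machine reports.

```cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <string>
#include <thread>
#include <vector>

// Sequential by nature: every iteration waits on the user, so extra
// cores sit idle no matter how many the CPU offers. (Shown for
// contrast only; it is not called below.)
void input_loop() {
    std::string line;
    long running_total = 0;
    while (std::getline(std::cin, line) && line != "quit") {
        running_total += std::stol(line); // depends on all prior input
        std::cout << "total: " << running_total << '\n';
    }
}

// Trivially parallel: the work divides cleanly, so more cores help.
long parallel_sum(const std::vector<long>& data, unsigned threads) {
    std::vector<long> partial(threads, 0);
    std::vector<std::thread> pool;
    const std::size_t chunk = data.size() / threads;
    for (unsigned t = 0; t < threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == threads) ? data.size() : begin + chunk;
        pool.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0L);
        });
    }
    for (auto& th : pool) th.join();
    return std::accumulate(partial.begin(), partial.end(), 0L);
}

int main() {
    std::vector<long> data(10'000'000, 1);
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::cout << parallel_sum(data, n) << " summed on " << n << " threads\n";
}
```

Adding cores speeds up the second kind of code almost linearly; it does nothing for the first, which is exactly the wall mainstream software keeps hitting.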
 
[citation][nom]Diabowx[/nom] 2.Core 2 duo and core 2 quad uses DDR2-1066 , 1000 , 1045, 890 Rams , where as core i series , celeron and pentium uses DDR3 1600 and 1333 respectively [/citation]

IIRC, Core 2 didn't see much of a performance benefit from faster DDR3 RAM. Intel in general doesn't seem to benefit as much from faster RAM or tighter latency. AMD, however, has a history of being sensitive to RAM speed and timings; AM2 made this painfully obvious back in the day. You had to go with fast DDR2-800 to achieve performance similar to an S939 chip with the same 3xxx+ rating running fast DDR-400.
 
[citation][nom]Diabowx[/nom]Found some interesting things/facts in this review. They are as follows: 1. Core 2 Duo and Core 2 Quad don't have L3 cache, whereas the Core i series, Celeron, and Pentium do. 2. Core 2 Duo and Core 2 Quad use DDR2 at 1066, 1000, 1045, and 890, whereas the Core i series, Celeron, and Pentium use DDR3 at 1600 and 1333 respectively. 3. Core 2 Duo and Core 2 Quad use 4 GB of DDR2, whereas the Core i series, Celeron, and Pentium use 8 GB of DDR3. I simply believe (maybe I'm wrong) that Core 2 Duo and Core 2 Quad are no match for these Core i and Pentium chips because of the RAM they were using, as we have already seen in other posts how RAM timings and frequencies affect gaming and all the apps used in this benchmark. Maybe we would get equivalent performance if we used exactly the same hardware (DDR3 RAM) for these CPUs; remember, DDR3 is faster than DDR2. I have one doubt regarding the power consumption: the Core 2 Duo and Core 2 Quad draw 122.4 W and 154.5 W under CPU load and 264 W and 299 W in gaming, respectively. How can they consume more power than their thermal limit? There should be a method to find out how much power the CPU alone is using, not the power consumed at the AC source; even converting AC to DC wastes some power. Lastly, what I'm thinking is (again, I may be wrong) that today's games are not optimized for Core 2 Duo and Core 2 Quad processors.[/citation]

Cache and RAM do influence performance, but simply slapping DDR3 onto a Core 2 won't make up the difference. Mostly that's because they are incompatible and you can't do it, but several architectural changes account for the difference in performance as well, such as the memory controller being on-die for Ivy Bridge.

Measuring power consumption at the wall socket is the easiest way to show CPU power consumption differences. There are other ways to measure CPU power specifically, but in this case the large majority of the difference can be attributed to swapping the CPU.

Also, TDP != power consumption. TDP is reported differently by different manufacturers, but more or less it is a guide to how much cooling a part needs. And remember, the article's figures are whole-system draw measured on the AC side: at a typical ~85% PSU efficiency, 300 W at the wall is only about 255 W of DC power, shared among the CPU, GPU, motherboard, and drives, so the CPU alone never exceeds its thermal limit.

And finally, nobody optimizes games for specific generations of chips. Architectural changes from Core 2 to Ivy Bridge (pipeline improvements, better branch prediction, etc.) may execute the same software more efficiently, but it's not as though programmers are writing specifically for Ivy Bridge or any other CPU.
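A classic way to see this in action (a hypothetical demo of my own, not anything from the article): run the exact same loop over random data and then over sorted data. The sorted pass is usually far faster, purely because the branch predictor can start guessing the `if` correctly; nothing in the source was "written for" the newer chip, the newer chip just predicts better.

```cpp
#include <algorithm>
#include <chrono>
#include <cstdlib>
#include <iostream>
#include <vector>

// Sums only the "big" values; the if() is hard to predict on random
// data and trivial to predict once the data is sorted.
long sum_big_values(const std::vector<int>& v) {
    long sum = 0;
    for (int x : v)
        if (x >= 128)
            sum += x;
    return sum;
}

int main() {
    std::vector<int> data(20'000'000);
    for (int& x : data) x = std::rand() % 256;

    auto time_it = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        volatile long s = sum_big_values(data); // volatile: keep the work
        auto t1 = std::chrono::steady_clock::now();
        std::cout << label << ": "
                  << std::chrono::duration<double>(t1 - t0).count()
                  << " s (sum " << s << ")\n";
    };

    time_it("unsorted (predictor guessing blindly)");
    std::sort(data.begin(), data.end());
    time_it("sorted   (branch now predictable)");
}
```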
 
I still use an E8500, but I will upgrade when the new processors come out. These benchmarks would explain why I only get 30-80 FPS in Borderlands and why L4D2 dips into the 30s from time to time with my 7850.
 
Nice concept, and I say that as a happy Core 2 Q8200 owner, but this article reads like one giant ad for Intel and the i5. Rampant fanboyism does come to mind while reading it...

Also, this article should have included the following AMD processors, which fill the same price ranges and, in most cases, perform better than the Core 2/Celeron/Pentium series: Athlon II X4 640, Phenom II X4 965, A8-5600K, A10-5800K, FX-4300, FX-6300, and FX-8350. That would have made this a more objective and useful article.

Personally, I am content with my Core 2 Quad desktop and my Turion 64 X2 laptop. The desktop is perfectly fine for media encoding, video editing, most games, etc., and the lappy is great with Lubuntu for programming, web, and watching videos.
 

That's what I wanted to say: since DDR3 RAM is typically incompatible with Core 2 series processors, comparing the two architectures is somewhat unfair. My point is simply that if Core 2 processors were somehow given an IMC (integrated memory controller) like Ivy Bridge's, made compatible with DDR3, and given an L3 cache, then we might see comparable results.
 
My friend and I are still rocking an overclocked QX9650 (his) and QX9770 (mine). Coupled with an old GTX 570 in my rig, it will still comfortably run almost all games at the highest settings, with an SMAA injector instead of MSAA, at a ~40-60 FPS average (BF3). It is amazing that an i3 can get quite close to the Q9550 at stock, although the mere $30 difference in launch price between the E8400 and the i3 tells me they really have not evolved much. At that rate, it looks like i7 920-940 owners may be a bit scared after seeing this, because the Q9550 @ 3.4 is not that far off from a 920-940. Finally, DDR3 is the smallest performance gain known to man; DDR2 will never bottleneck any system. If a 3570K had a DDR2 memory controller running at 1100 MHz, you would maybe lose 1 FPS in games versus DDR3-1600, since 1866+ is worthless on an Intel system anyway.
 


Interesting. I understood the memory controller was not on-die, but I was still under the impression that no LGA 775 chipsets took DDR3. Good to know, thanks.
 


The G41 chipset supports both. I am running DDR3 with a C2D right now.
 
[citation][nom]Soma42[/nom]Interesting. I understood the memory controller was not on die, but I still was under the impression that there were no chipsets for LGA775 that took DDR3. Good to know, thanks.[/citation]
[citation][nom]ingtar33[/nom]an old core2duo won't work with ddr3 ram....[/citation]
As others have stated, many LGA 775 chipsets, like the X48, X38, and P45 Express, came in both flavors. You couldn't mix and match RAM on the same mobo, but you could pop the processor out of a DDR2 board and into a DDR3 board. DDR3 was very expensive back then, so far fewer Core 2 owners went that route. What didn't help is that there was little extra performance to be squeezed from the extra money.

Here is our DDR2 Asus Rampage Formula X48 Express test board:
http://www.asus.com/Motherboards/RAMPAGE_FORMULA/

Here is the equivalent DDR3 Rampage Extreme:
http://www.asus.com/Motherboards/RAMPAGE_EXTREME/
 
[citation][nom]ingtar33[/nom]It's funny, the two different camps after reading this article. One side says "wow, look at how much faster CPUs have gotten" while the other says "wow, look how well the old hardware runs".[/citation] Yeah... I even had a hard time choosing a side, and my viewpoint changes between stock and OC. That's what made this story a fun one.
 
[citation][nom]JAYDEEJOHN[/nom]Again, thanx Paul for putting Toms on top again, great article, a Toms classic[/citation]
My pleasure! And many thanks for the feedback and encouragement too! We're thrilled so many of you enjoyed the piece and data.

[citation][nom]iam2thecrowe[/nom]would have liked to see some comparisons to AMD cpu's.[/citation]
Me too! And that is the plan.
Because I want to test each chip both at stock and overclocked, three or four processors will likely be the limit. Personally, I lean toward the gaming bang of the Athlon II X4 640, Phenom II X4 965 BE, FX-4300, and FX-6300.

[citation][nom]veroxious[/nom]You are misunderstanding me........ according to the benchmarks in this article Crysis 3 should be unplayable on my rig @ 1080p due to my E6750 CPU. But it is playable , very much so...[/citation]
As stated at the top of the page, we test a brutal sequence (within "Welcome to the Jungle") for a worst-case look at Crysis 3 performance (down in the swamp where you first get the instruction to use electric arrows).
Other areas of the game have a lighter load and will play more smoothly. But this part of the game is unavoidable during the single-player campaign, meaning your frame rates will tank there if we're comparing the same loads. Quite frankly, it's brought every dual-core we've thrown at it to its knees; well, up until this 4.5 GHz Wolfdale, anyway.
So "unplayable" is a bit harsh. But I'd argue it is certainly FAR LESS ENJOYABLE on a dual-core. Of course, not everyone's tolerance for low fps is the same. Consider this: in this area of the game, a stock Core i5-3570K delivered a smoother experience at Very High details than a 4.5 GHz E8400 did at the very lowest details.

 
I was using my trusty Q9550 up until the end of 2012, as various other bits of tech crumbled and failed under the onslaught of new gaming technologies, including a GTX 260, at least one WD Caviar Green, and one stick of RAM that never seemed to be right. I recently purchased a new machine with a Z77 board and a 3570K, but if anything kept my old, slightly broken machine going before its replacement, it was the Q9550. Great to see it at least hold its own in some of these tests!
 
[citation][nom]Steelwing[/nom]Very nice review! I've got a C2D E6600 (2.4 GHz) and had been considering the Core i5-3570K (or possibly wait for a Haswell i5) and was wondering about the performance differences. My CPU is still good for a lot of apps, but I can definitely see a reason to upgrade.[/citation]

As long as your processor does what you ask of it, there is ZERO reason to upgrade. Unless, of course, you care about coal mining.
 
Funny how all the gamer boys on Tom's will flame FX like it's a horrible excuse for a CPU even though it has better IPC than Core 2, and Tom's just proved that Core 2 Quad is still relevant and can hang with the Ivy stuff. So what does that say about FX? lol, it just makes me chuckle a bit. Glad I hung on to my old Q6600 all these years; she's still a trusty gaming rig to this day.
 
[citation][nom]Soma42[/nom]Interesting. I understood the memory controller was not on die, but I still was under the impression that there were no chipsets for LGA775 that took DDR3. Good to know, thanks.[/citation]

I myself am using a Q6600 @ 3GHz with 2x2GB DDR3 RAM.
 
Now these are the kinds of reviews I'd like to see more of going forward: pitting older platforms against current games and apps. My E8400 rig, built a little over four years ago, ran at 4.4 GHz for two years before I upgraded to an overclocked 4.8 GHz 2500K / 680 SLI build. The E8400/GTX 275 SLI rig is still running strong; it once ran Crysis 1 at high settings at over 50 FPS at 1920x1200, but is now relegated to backup gaming duty for older nostalgia games and to backup-server work.

I was always curious how it would handle FC3 and Crysis 3 specifically. Now I have a very good idea. Glad I upgraded two years ago!
 



[citation]I myself am using a Q6600 @ 3GHz with 2x2GB DDR3 RAM.[/citation]

775 4 Life! Sorry, just had to throw that in there. My Q6600 G0 @ 3.2 is still folding in another rig to this day, coupled with an 8800 GTS; I bet it can still play all DX10 games on medium settings at 720/900p. I should thank myself for not getting the E6850, as that wonder-CPU lasted forever, or should I say is still hauling proteins. Even my hand-me-down QX9770 is not that much better than the G0 Q6600.
 
Looking at how the i3 trashes the Pentium, it seems to me that the same thing will happen with the 2600K vs. the 2500K. HT really makes a huge difference in a threaded environment. I guess when 8-thread apps start to become common three years from now, the 2600K will start showing its muscle.
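If you want to test that theory on your own box, here is a rough benchmark sketch of my own (not from the thread): time a fixed amount of CPU-bound work at 1, 2, 4, and 8 threads and see where the curve flattens. A 4-core/4-thread chip should stop improving after 4 threads; a chip with HT can keep gaining, modestly, out to 8 on a well-threaded load.

```cpp
#include <chrono>
#include <cmath>
#include <iostream>
#include <thread>
#include <vector>

// Busy arithmetic for one worker; volatile keeps the compiler from
// optimizing the loop away.
void burn(long iters) {
    volatile double x = 1.0;
    for (long i = 0; i < iters; ++i)
        x = std::sqrt(x + 1.0);
}

// Splits a fixed total amount of work across n threads and returns
// the wall time, so lower is better as n grows.
double run_with(unsigned n, long total_iters) {
    auto t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n; ++t)
        pool.emplace_back(burn, total_iters / n);
    for (auto& th : pool) th.join();
    return std::chrono::duration<double>(
               std::chrono::steady_clock::now() - t0).count();
}

int main() {
    const long total = 400'000'000;
    for (unsigned n : {1u, 2u, 4u, 8u})
        std::cout << n << " thread(s): " << run_with(n, total) << " s\n";
}
```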
 