The Gigahertz Battle: How Do Today's CPUs Stack Up?

Page 4
Well, you'd think this could be a Mac vs. PC debate, but all it is is a debate about PC processors running Winblows and people splitting hairs. I think PCs are only as good as their operating system, and buddy, we have all been handed a bag of Sh**! Why can't we all just get along 😳
 
The Core 2 is descended from the P3, as everyone knows.

Here is a real article: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2939

I wonder if I will get banned; it would be funny to be banned for pointing to a real article from this one.

You linked the wrong article:
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=2748
Sweet, I will read that one too. I have forgotten to check Anandtech lately; I will start going there first from now on. Ack...

I am not going to upgrade, so I guess I will wait and see if Barcelona kicks ass (chances are it will 😀)
 
I always find the "C2D is just a reworked P3" argument interesting; it was actually a reworked Yonah core... and Yonah had a strong showing against the K8.
http://www.gamepc.com/labs/view_content.asp?id=coreduo&page=9
I meant "based on" as in "descended from": both those Anandtech links point out that Yonah is descended from the Pentium M, and the Pentium M is descended from the P3.

Even then, clock for clock, Dothan/Banias was about equal to or better than the K8 for the most part:
http://www.gamepc.com/labs/view_content.asp?id=dothandesktop&page=12

I always wondered why Intel did not push the Dothan/Banias product up into the desktop space and ditch Netburst earlier.
Yes, that is what I have been thinking. Intel likes to jump at stuff (R-DIMM?), and it usually doesn't pay off (Netburst). I looked again at the prices and the C2D is pretty competitive, but I am going to wait, because a 3GHz K8 w/1MB L2 isn't bad at video games :). I think I can wait it out 3-6 months (I don't do hi-res gaming).
 
Here we go again. Yet another person comes out of the woodwork to claim bias.

Does the woodwork have holes in it? How are they getting in?
Yes, I claim bias also! I've noticed a strong Intel bias on this board.
But then again who isn't biased about something.

:wink:

I'm biased against prune juice because it tastes too much like Dr. Pepper. Plus it gives you the r-r-r-r-r-runs.

I liked the Northwood comparison there. Considering a Celly D 356 is a lot like a Northwood, I can see that at 5GHz it may be able to compete with a 3GHz single-core AMD 😱

Only in the 512KB L2 spec. Northwood uses a 20-stage pipeline; Cedar Mill/Presler/Smithfield/Prescott use a 31-stage pipeline. That makes a big difference, but yes, a 5GHz 356 would be fast... just not so much in games. 😀
 
Wasn't he the same guy that did the "$300 PC"? :?
Are you referring to the $300 PC that actually cost over $300? Yeah, that was a joke.


Ahhhh, the $300 PC - or the £300 PC, as it was called.
American ignorance wins again.
 
[...]

I always wondered why Intel did not push the Dothan/Banias product up into the desktop space and ditch Netburst earlier.
I'd say probably because Marketing wouldn't let go of the gigahertz credo; nothing else makes sense. Why maintain two different chip (and chipset) designs? The P4 was known to be flawed from the beginning (otherwise the Pentium III design would have been ditched, not kept for laptops).
I mean, the P4 was always an overpowered, underperforming product good at only one thing: ramping up the clock speed. It may be that the massive marketing campaigns built around clock speed took four years to pay for themselves - or that Marketing had locked in a four-year budget for a specific campaign and wouldn't let go of it until the end (effectively dictating Intel's product development), at which point Intel could finally enter a different paradigm - one AMD had started (or never left): a single, very good all-around product.
 
Frankly, saying 'the P4 only works well when it's turbocharged/clocked high' is like saying 'this car would be much faster if it was run at a higher RPM'.

Actually, even the K8 can be made to run at 3 GHz - and even then it smokes the P4. I won't even mention the C2D (which can clock as high as the P4).
 
I think it all comes down to money...

If AMD gets to market something better than the C2D at a lower price, people will turn to AMD.

And I also think that the majority of PC users look for best performance/ buck... at the lowest price of course 😀 ...I am...


For the moment Intel is kicking AMD's ass, and that is not a bad thing after all. The buyer is the one who benefits in the end...

cheers
 
I don't know if this has been mentioned before (I don't care to read 4+ pages) but I meant to post this yesterday...

Why not use a motherboard with downward multipliers available? That way they could test LGA775 P4s and PDs on relatively equal footing. It is good, however, that we saw the difference between Conroe and Allendale at the same clock and FSB speeds. Informative to say the least.
 
I wouldn't say foolish so much as impossible to maintain for any company other than the Intel behemoth - one that can still force its half-baked products down resellers' throats (see: Dell et al.) considering they perform well enough - as in, they could run i386 code (they weren't strong enough to force Itanium through, though).
No, I think it has more to do with bad design choices at the time: Intel didn't believe in SDRAM and wanted to promote Rambus, which worked better at very high frequencies. For that, the P4 actually made some sense - and, had they managed to impose Rambus serial RAM as a standard, they might have raked in the big bucks.
They did, however, screw that up, refusing to consider that SDRAM still had some oomph in it and that DDR actually solved the throughput problem.
This led to either costly Intel P4s (Rambus cost a lot and required tricky assembly: fill all the memory slots...) or buggy SDRAM chipsets (remember the Memory Translator Hub?). When Rambus went down kicking and screaming, Intel was stuck with an ill-matched core that needed to be squeezed for all it was worth, and they gave Marketing full power to sell what they had - and Marketing took it and ran with it.

Results: slow, overheating, power-guzzling P4s inundated the market; users were (mis)educated to believe MHz = power; Rambus made as many people as possible miserable (I mean both the tech and the company here); and Intel got stuck with a chip design they couldn't scrap, after all their mistakes (bungling the P-III, screwing up the 440BX successor, ignoring DDR, forgetting graphics chips, the Itanium...), without completely losing face.

What d'ya think?
 
Frankly, saying 'the P4 only works well when it's turbocharged/clocked high' is like saying 'this car would be much faster if it was run at a higher RPM'.

Actually, even the K8 can be made to run at 3 GHz - and even then it smokes the P4. I won't even mention the C2D (which can clock as high as the P4).

But the P4 was MADE to be clocked high. It was built for speed. Hell, NetBurst had the capacity to operate at over 8GHz.

Which leads me to the point that yes, the K8 can be made to run at 3GHz, but that's after you get the highly binned batches and push it to speeds it wasn't made to run at.

Get what I mean? Using a slow Pentium 4 is like taking a tank to an ergonomics contest...

I wouldn't say foolish so much as impossible to maintain for any company other than the Intel behemoth - one that can still force its half-baked products down resellers' throats (see: Dell et al.) considering they perform well enough - as in, they could run i386 code (they weren't strong enough to force Itanium through, though).
No, I think it has more to do with bad design choices at the time: Intel didn't believe in SDRAM and wanted to promote Rambus, which worked better at very high frequencies. For that, the P4 actually made some sense - and, had they managed to impose Rambus serial RAM as a standard, they might have raked in the big bucks.
They did, however, screw that up, refusing to consider that SDRAM still had some oomph in it and that DDR actually solved the throughput problem.
This led to either costly Intel P4s (Rambus cost a lot and required tricky assembly: fill all the memory slots...) or buggy SDRAM chipsets (remember the Memory Translator Hub?). When Rambus went down kicking and screaming, Intel was stuck with an ill-matched core that needed to be squeezed for all it was worth, and they gave Marketing full power to sell what they had - and Marketing took it and ran with it.

Results: slow, overheating, power-guzzling P4s inundated the market; users were (mis)educated to believe MHz = power; Rambus made as many people as possible miserable (I mean both the tech and the company here); and Intel got stuck with a chip design they couldn't scrap, after all their mistakes (bungling the P-III, screwing up the 440BX successor, ignoring DDR, forgetting graphics chips, the Itanium...), without completely losing face.

What d'ya think?

I would just blame a poor architecture that wasn't scalable and ran into the laws of physics.

:/
 
Yes, I claim bias also!
I AGREE!
The article is HARDCORE BIASED to make Intel look BAD!

They are using a crappy Northwood, which has only 512KB of L2!!!
They should have included the P4 EE (Gallatin), which is also older than the first Athlon 64. They used a much newer Athlon 64 with a dual-channel ODMC and a faster HTT, just to make Intel look very slow! They should have used an s754 part (WITH A SINGLE-CHANNEL ODMC AND AN 800MHZ HTT) instead. Also, they should have included a desktop variant of the Pentium M or Core Solo.
Their BIAS has no limits, and just to make Intel look bad they used the faster sAM2 90nm X2's. Comparing them (the 90nm X2's) to 65nm Intel CPUs is like comparing apples to oranges.
They also compared the E4300, which has no virtualisation and a default 800MHz FSB!!!! WTF????? Why not the E6400 instead, which has virtualisation and a default 1066MHz FSB!

What a BIAS! :x
You people (TH) are making me HATE AMD for NO REASON!
YOU ARE PAID BY AMD!!!!!!!!!!
Your articles make me laugh!!!! Only an underage Jimmie would blindly believe you!

BASTARDS! :twisted:
 
But the P4 was MADE to be clocked high. It was built for speed. Hell, NetBurst had the capacity to operate at over 8GHz.
Errr... where? Even 65nm models weren't able to get past 6.2 GHz, even cooled with liquid nitrogen... And at the end of the day, the point is to be able to use those processors in a desktop system - meaning on air, with a heatsink that fits inside a small case.

Which leads me to the point that yes, the K8 can be made to run at 3GHz, but that's after you get the highly binned batches and push it to speeds it wasn't made to run at.

Get what I mean? Using a slow Pentium 4 is like taking a tank to an ergonomics contest...
Thing is, in this case you can see that in practice the maximum clocks you can get on air cooling, for each core, are:
- P4: 4.5 GHz (a nice on-air overclock)
- K8: 3 GHz (the highest speed found at retail)
- C2D: 4 GHz (higher speeds aren't officially released yet)
The problem is, for the P4 not to look too bad it needs 200% of its opponents' clock speed - and its max clock speed is (at best) only 150% of its slowest competitor's (here, the 90nm AMD).
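If you want to sanity-check that claim, here's a rough back-of-the-envelope sketch. The 0.5 per-clock ratio is just my stand-in for the "needs 200% of the clock" figure above, not a measured number:

```python
# Back-of-the-envelope check of the 200% vs. 150% argument above.
# Assumption (illustrative only): a P4 does roughly half the work per
# clock of a K8, per the "needs 200% of the clock speed" claim.

p4_max_air = 4.5       # GHz, best realistic on-air P4 clock (from the post)
k8_retail  = 3.0       # GHz, fastest retail K8 (from the post)
per_clock_ratio = 0.5  # P4 work per cycle relative to K8 (assumed)

p4_effective = p4_max_air * per_clock_ratio   # K8-equivalent clock
clock_needed = k8_retail / per_clock_ratio    # P4 clock needed to match

print(f"A P4 at {p4_max_air} GHz performs roughly like a K8 at {p4_effective} GHz")
print(f"To match a {k8_retail} GHz K8, a P4 would need about {clock_needed} GHz")
```

Under those (admittedly rough) numbers, even the best on-air P4 overclock falls well short of the ~6 GHz it would need.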

I would just blame a poor architecture that wasn't scalable and ran into the laws of physics.

:/

Which was, actually, my point: P4 is bad and has always been so.
 
Here is a price/performance analysis for Gigahertz Battle.

The prices are current prices from Newegg.
The performance figure is the time in seconds to encode video with DivX 6.5, as reported at
http://www.tomshardware.com/2007/03/26/the_gigahertz_battle/page13.html#video

Intel Core 2 Quad Q6600 Kentsfield.................115 sec......$846
Intel Core 2 Duo E6600 Conroe.......................116 sec......$313
Intel Core 2 Duo E4300 Allendale 1.8GHz.........125 sec......$169
AMD Athlon 64 X2 4600+(65W) Windsor AM2....148 sec......$126
AMD Athlon 64 X2 4600+ Toledo S939..............158 sec......$200
Intel Pentium 4 2.4B Northwood S478..............252 sec........$63

Seems like the Athlon 64 X2 4600+ AM2 is the clear winner for value.
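If you want to put an actual number on "value," here's a quick sketch using the figures above. The dollars-per-second-saved metric (relative to the cheap Northwood) is my own choice for illustration, not anything from the THG article:

```python
# Rough value comparison using the DivX 6.5 numbers quoted above.
# Metric: extra dollars paid per second of encode time saved,
# relative to the cheapest chip (the Northwood baseline).

chips = [
    ("Core 2 Quad Q6600",        115, 846),
    ("Core 2 Duo E6600",         116, 313),
    ("Core 2 Duo E4300",         125, 169),
    ("Athlon 64 X2 4600+ AM2",   148, 126),
    ("Athlon 64 X2 4600+ S939",  158, 200),
    ("Pentium 4 2.4B Northwood", 252,  63),
]

base_time, base_price = 252, 63  # Northwood S478 baseline

for name, seconds, price in chips:
    saved = base_time - seconds
    if saved > 0:
        cost_per_sec = (price - base_price) / saved
        print(f"{name:26s} ${price - base_price:3d} extra, {saved:3d} s saved "
              f"-> ${cost_per_sec:.2f}/s")
```

By that measure the AM2 4600+ is indeed the cheapest upgrade per second saved (about $0.61/s), with the E4300 not far behind.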
 
Seems like the Athlon 64 X2 4600+ AM2 is the clear winner for value.

Look at iTunes, a very popular application (although you only import your CDs once); clearly the code is optimized for AMD. That's somewhat strange considering Apple's new connection to Intel - maybe they wrote the conversion core a few years ago.

Intel Core 2 Quad Q6600 Kentsfield.................204 sec......$846
Intel Core 2 Duo E6600 Conroe.......................204 sec......$313
Intel Core 2 Duo E4300 Allendale 1.8GHz.........206 sec......$169
AMD Athlon 64 X2 4600+(65W) Windsor AM2....225 sec......$126
AMD Athlon 64 X2 4600+ Toledo S939..............210 sec......$200
Intel Pentium 4 2.4B Northwood S478..............252 sec........$63

If you're not interested in overclocking, so you can keep your power-saving features going, and you're looking at a mid- to upper-range system, the AMD is still a very viable product and should not be overlooked just because it doesn't hold the ultimate performance crown.
 
The AM2 choice should have been an FX-62 downclocked to FX-53 speed... just lower the multiplier.

In fairness to the AMD fans, the "E6600" they used was in fact an X6800 with the multiplier lowered (see the CPU-Z screen or the photo of the chip). I don't know if that matters or not but it was a bit deceptive.
 
Eh, I'd like to point out that cost per processor was not within the scope of this article at all. THG has another article they publish periodically that focuses specifically on $/performance.

@Senor_Bob

It should make zero difference. At the settings they used it was effectively an E6600, and they did include a screenshot of CPU-Z.

They really ought to have included a couple more AMD CPUs and some benchmarks that highlight AMD's I/O and memory capabilities.
 
Eh, I'd like to point out that cost per processor was not within the scope of this article at all. THG has another article they publish periodically that focuses specifically on $/performance.

@Senor_Bob

It should make zero difference. At the settings they used it was effectively an E6600, and they did include a screenshot of CPU-Z.

They really ought to have included a couple more AMD CPUs and some benchmarks that highlight AMD's I/O and memory capabilities.

According to THIS, the X6800 would benefit from the reduced multiplier on a 965, as its NBCC* (North Bridge Core Clock) = (11/9) x 266 = 325, vs. an E6600's NBCC* of 266.

*Note: CPU-Z does not report NBCC.
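To make that arithmetic explicit, here's a tiny sketch that simply reproduces the claimed relationship, taking the linked thread's formula at face value; whether it actually applies to an X6800 is exactly what's disputed in the reply below:

```python
# NBCC claim from the linked thread, taken at face value (not verified):
# NBCC = (chip's default multiplier / multiplier actually set) * FSB
def nbcc(default_multi, set_multi, fsb_mhz):
    return default_multi / set_multi * fsb_mhz

print(round(nbcc(11, 9, 266)))  # X6800 (11x default) run at 9x -> ~325 MHz
print(round(nbcc(9, 9, 266)))   # a native E6600 at its stock 9x -> 266 MHz
```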
 
I note you compared crippled AMD64 dual cores (half the cache disabled).
How can we trust you when you publish this kind of review?


The 939 choice of CPU should have been the 4800+ (E6 Toledo), not the crippled 4600+ Manchester (E4) core you used instead.

You also should have used a 4000+ San Diego core - SSE3??? - not the earlier 130nm 4000+ core.

The AM2 choice should have been an FX-62 downclocked to FX-53 speed... just lower the multiplier.

Once again Tom's has put AMD at every disadvantage possible in the benchmark comparisons.

But the free rent in the Intel building you're located in must be nice.

You always seem to put just enough spin on the articles...

I remember when the world was bagging NetBurst after the A64 was introduced and you were still showing bent benchmarks here showing the Extreme in front... man, that made me laugh.

Tom's will always be known as a site that Intel bought and uses as part of its long-term marketing strategy... it's just sad that the average Joe believes what you tell them.

For the rest of us .... well.

🙁

Well, you totally missed the point of the article, didn't you? Intel people could whine that the P4 was designed to run at higher clock speeds, and thus it's not fair.
There's no point, though, since the whole idea is to show that it's not about the number of MHz, but what is done with each cycle.

Congrats on your appalling first post.

Which Intel people would say that? This article is about filling time until AMD finds a way to compete again. If AMD had the faster chip we would be hearing "Intel sucks" yada yada yada, but the fact is, no matter how you dice it, the best Intel versus the best available AMD is a walkover: on the majority of the benchmarks, Intel has the better product. Putting the P4 into this mix was what was weird to me. You take a 2-year-old product and claim it to be a system of today. In the CPU world the P4 is not a computer of today; it is what most of us are handing down.

At any rate, as an Intel loyalist it is always a little fun to see you AMD loyalists squirm a little, like we did when AMD was on top of the charts. No matter what you say, seeing my C2D smoke through these MPEG-4 encoding jobs at 100 FPS is sweet. Throw two on and it's 60 FPS each. VERY, VERY NICE! It takes almost as much time to rip a CD as it does to encode one. I love technology enhancements!
 
Eh, I'd like to point out that cost per processor was not within the scope of this article at all. THG has another article they publish periodically that focuses specifically on $/performance.

@Senor_Bob

It should make zero difference. At the settings they used it was effectively an E6600, and they did include a screenshot of CPU-Z.

They really ought to have included a couple more AMD CPUs and some benchmarks that highlight AMD's I/O and memory capabilities.

According to THIS, the X6800 would benefit from the reduced multiplier on a 965, as its NBCC* (North Bridge Core Clock) = (11/9) x 266 = 325, vs. an E6600's NBCC* of 266.

*Note: CPU-Z does not report NBCC.

That thread STILL hasn't been updated since November, and it doesn't say that at all. It says that it would be exactly the same as an E6600, as the concept they are discussing supposedly doesn't apply to the "Extreme" edition CPUs (which makes zero sense to me and causes me to question the validity of any of the new statements they make).

Since the NB is only used for memory access in the benchmarks run (no I/O, no graphics), the theoretically increased NBCC would not give a performance advantage, as it would not improve memory bandwidth or timings. If the Extreme CPU reacted the same way to having its multiplier lowered as other processors do, it could actually suffer a performance PENALTY due to an increased NB strap, if such a higher strap even existed on the mobo used in the test. At 325MHz it shouldn't be bumping up to the 333 strap yet, but some mobos seem to move to higher straps early to achieve higher clock rates (though usually with reduced performance). If it went to a higher strap, the increased NB latencies would lower NB performance even with a higher clock rate.

So it shouldn't be doing that and even if it did it shouldn't matter.
 
In the CPU world the P4 is not a computer of today; it is what most of us are handing down.

Wrong - there are a huge number of NetBurst chips still running; Dell are still selling them by the droves! They only recently started to sell AMD in this segment. In case you didn't know, Dell is a pretty big supplier.

Edison (the electricity supplier) currently sponsors the disposal of old, inefficient refrigerators; can we get NetBurst on the list?
 
In the CPU world the P4 is not a computer of today; it is what most of us are handing down.

Wrong - there are a huge number of NetBurst chips still running; Dell are still selling them by the droves! They only recently started to sell AMD in this segment. In case you didn't know, Dell is a pretty big supplier.

Edison (the electricity supplier) currently sponsors the disposal of old, inefficient refrigerators; can we get NetBurst on the list?

It is very true that Intel is still cranking them out and other manufacturers are still selling them in droves, but adding them to the benchmark mix was weird to me. This is what the battle was pre-C2D. I just did not see the value in showing how Intel has improved its product line. TH could have saved time and money by saying "P4 vs. C2D is no comparison." Of course, if they did that we would have the exact same article they had six months ago.

Man, when is AMD coming out with something new so we have new material to debate? Anyone want to fill the void with an evolution vs. creation conversation while we wait? :)
 
. . . a 3GHz K8 w/1MB L2 isn't bad at video games :). I think I can wait it out 3-6 months (I don't do hi-res gaming).
Actually, even the K8 can be made to run at 3 GHz - and even then it smokes the P4.
:lol: :lol: I wish I could go higher than a 250MHz FSB on my MSI nForce4; this 4000+ is solid at 3GHz and I don't feel like I am pushing it at all 🙁. Is there truly a better processor for $78 that I am missing?

Vista Experience Index: 4.4 (due to the processor)

RAM: 4.5 (capped because of 1GB of dual-channel DDR; supposedly Vista "needs" 1.5GB to show me a real score, pffft. Stupid M$)

Video: 5.9

Gaming: 5.9 (amazing what a $120 7900GS clocked at 650MHz will do 😉. I just used NiBiTor and nvflash on a "floppy-bootable" USB stick to write that into the BIOS so I don't need any utilities in Vista 😀. Isn't that a stock GTX clock? 😛)
 
