P4E 3.4GHz reviewed - Find the errors!

TheRod

Distinguished
Aug 2, 2002
2,031
0
19,780
First error: <A HREF="http://www.tomshardware.com/cpu/20040322/prescott-04.html" target="_new">THG doesn't mention the P4E 3.4GHz in their test setup</A>
Second error : <A HREF="http://www.tomshardware.com/cpu/20040322/prescott-19.html" target="_new">Why include SETI results WITHOUT Prescott 3.4GHz</A>

I know these are not big mistakes, but can you find more? :smile:

--
Would you buy a potato powered chipset?<P ID="edit"><FONT SIZE=-1><EM>Edited by TheRod on 03/22/04 12:09 PM.</EM></FONT></P>
 
I noticed that too... but at least that first error isn't really a big one.

The second one isn't too big anyway either. Interestingly, though, they even put some Opteron and Xeon numbers in the mix.

<i><font color=red>You never change the existing reality by fighting it. Instead, create a new model that makes the old one obsolete</font> - Buckminster Fuller </i>
 
I know these are not big errors, I was just wondering why no one has started a thread on this review?

Probably no one is excited about Prescott anymore. Many probably thought this would be the A64 killer, but it wasn't!

Well, I read a couple of Prescott 3.4GHz reviews today, and I can only say one thing: new CPUs never give stellar performance gains anymore!

Always small steps...

--
Would you buy a potato powered chipset?
 
RE: Well, I read a couple of Prescott 3.4GHz reviews today, and I can only say one thing: new CPUs never give stellar performance gains anymore!

Always small steps...
---

It's not very often I agree with so many posts from the same guy. This is the 3rd or 4th time I wanted to respond to a post only to see TheRod has said what I was going to say. You and I seem to be on the same page when it comes to Computers.
 
I agree that Intel as of late really hasn't been improving, and that overall AMD is only now getting back in the game. AMD, though, has made many jumps in technology and performance. If you compare the AMD64 line to the XP line, you see AMD jump ahead in many areas by leaps and bounds. Gaming was only a small step, yes, but AMD has become a competitor in many areas that people thought untouchable by AMD.

In the end, all I have to say is that I am looking forward to the future of computers. With AMD and Intel both duking it out right now, I think the future will hold some great improvements. We can see that both AMD and Intel are now trading and using technology the other never thought of using. AMD is now using SSE2, SSE, and I think SSE3, but don't quote me on that one. Intel is now going to be putting on-die memory controllers along with AMD64 technology in their IA-32E, I think that's the name, sorry, I don't feel like looking [-peep-] up and I know you get what I mean. I'm lazy 😛.

Anyway, now that both are fighting and both see the benefits in the other's technology, we will see two very powerful processors, and more than likely a split in the market: Intel for audio or video encoding applications and AMD for gaming. The way I see it, AMD wins clearly in some areas and Intel wins clearly in others, but all around both kick @ss. I'm just hoping that things stay this way and both excel in their own areas.

Right now when I look at a review, I don't look at anything but gaming benchmarks, as all I care about is AMD kicking butt in gaming. I then jump onto the boards, listen to what you guys say, look to see if anyone is rightfully complaining about either of the companies, and investigate what you all talk about. As far as I can see, AMD is good at everything and doing great in gaming; Intel is good at everything and doing great in audio and video encoding. And I personally hope things follow this trend of no clear winner in many areas, with Intel and AMD each excelling over the other in one area.

SO I DON'T KNOW WHEN TO SHUT UP 😛

-------------------------------------------------
Remember what you're fighting for, Remember why you even started fighting, and Remember who you are
 
LOL, they said the P4 Willamette was popular because it was faster than the PIII Tualatin. Problems are:
1.) The Willies weren't faster than the Tualatin
2.) The Tualatin was released after the Willies were replaced with the Northwood.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font>
 
This is the 3rd or 4th time I wanted to respond to a post only to see TheRod has said what I was going to say. You and I seem to be on the same page when it comes to Computers.
We might be the same virtual being? :smile:

--
Would you buy a potato powered chipset?
 
As far as I can see, AMD is good at everything and doing great in gaming; Intel is good at everything and doing great in audio and video encoding. And I personally hope things follow this trend of no clear winner in many areas, with Intel and AMD each excelling over the other in one area.
I will try to be honest with what I will say here...

I really think that MOST people who buy home PCs buy them for gaming/internet/office apps. I don't know many people who do video/audio stuff a lot.

To be honest most people download music and video from the internet, but they don't transcode or edit them a lot.

I just bought a Mini-DV camera and I transfer the video to my PC. I do all my editing in real time with my Athlon XP 1800+ (o/c to 2400+) without any problem or lag, because most high-quality video editing applications render the applied FX in real time based on CPU power. So when I push the play button, the application renders the FX at low quality because I only have a 2400+. If I had a P4EE 3.4GHz I would probably see a high-quality preview, but I don't care! I just want to preview my video.

And when it comes time to RENDER or transcode, I really don't care about speed, because I start a batch process before I go to bed.

Today, when I check CPU comparisons I mostly don't care about audio/video benchmarks, because these numbers are not representative of MY reality (or the reality of most PC buyers).

Honestly, when I play a game I want good FPS and fast load times, but when it comes to video editing, I know that rendering a long movie will take a long time, and I know that when these jobs are running on my PC, I don't stay in front of it waiting for the completion bar to fill. Even if I had a P4EE 3.4GHz, I would not wait for a 2-hour video to render. I would start the rendering at night or when I know I will not use/need my computer for a long period.

Only hardcore video users will benefit from the P4 architecture, because average video users like most of us really don't care about the time it takes to render a movie.

This was my point. I don't say that Intel sucks, I only tell the truth: the majority of users need good performance in games and office applications, not in audio/video editing.

--
Would you buy a potato powered chipset?
 
The numbers for the Opterons in the SETI benchmark look a bit odd... the 4-way Opteron is exactly twice as fast as the 2-way Opteron. I knew they scaled well, but I didn't think they scaled perfectly!
 
the 4-way Opteron is exactly twice as fast as the 2-way Opteron... I knew they scaled well, but I didn't think they scaled perfectly!
It might be that the SETI team coded the multi-CPU support very well. And Opteron scales well because each CPU has its own memory controller. And I think SETI is memory intensive (lots of reads/writes to memory)...

But, I'm not sure... It would be great to ask THG and/or SETI about this very impressive scalability.

--
Would you buy a potato powered chipset?
 
Actually, SETI was designed to be perfectly modular so that work units could be distributed over the internet... So it's more a case of SETI scaling perfectly than of Opteron scaling perfectly. A single thread doesn't require absolutely anything from any other thread...

<i><font color=red>You never change the existing reality by fighting it. Instead, create a new model that makes the old one obsolete</font> - Buckminster Fuller </i>
 
If it were simply a case of superb MP support, then the Xeons would scale more or less the same (I assume), but they appear to be scaling the same as they do everywhere else. I know K8s have their own on-die memory controller, but even with this, why do they scale so perfectly here and only scale well in every other test?
 
>>>I would start the rendering at night or when I know I will not use/need my computer for a long period.

Why? Because your CPU can't do an acceptable job rendering and gaming at the same time? You seem to be missing the point of hyperthreading - to provide a good experience to those people who want to do both simultaneously, instead of waiting for a time "when I know I will not use/need my computer for a long period". And that is what benchmarks don't show you...
 
The same reason they don't scale perfectly in every other benchmark: the biggest thing is that they have to share one bus, and bandwidth gets limited. Most benchmarks show Opterons scale much better.
 
Because your CPU can't do an acceptable job rendering and gaming at the same time?
No, my CPU can do both well. I actually run SETI 24/7, ZoneAlarm, NAV, and an FTP server, and when I play games I don't shut anything down. And when I want to render at the same time, I can actually do it. But like many people, I don't try to overload my CPU for nothing. I don't NEED to render my home-made videos on the spot. That's why I usually start my rendering when I'm doing nothing on my computer. It's not because I can't do it while doing other stuff!

You seem to be missing the point of hyperthreading
I never talked about hyperthreading or multi-threading. In fact, most reviews don't actually do multi-threaded benchmarks.

So, the best thing for review sites to do would be to run a GAME with some other apps in the background, to test hyperthreading or the effective multi-threaded workload on each CPU. If we check only technical specs, Intel would win these benchmarks because of their hyperthreading technology.

Benchmark numbers would look like this:
Game X FPS : 100
Game X FPS with background DivX encoding : 70
Game X FPS with DivX and MP3 encoding : 60

This would provide you the information you want!

to provide a good experience to those people who want to do both simultaneously, instead of waiting for a time "when I know I will not use/need my computer for a long period".
I repeat, maybe I was not clear enough... I said when "I know I will not use" it, which means I would NOT be in front of my PC anyway! Why would I start a rendering job that is not important when I want to enjoy a game at full FPS? I prefer to wait to start my rendering job. And even if I did start my rendering job while playing a game, the time I saved would not be much, because the game would take much of the CPU cycles, so the rendering would not go as fast.

Can you actually provide us numbers on encoding while playing? It would be interesting to analyse this!

--
Would you buy a potato powered chipset?
 
Hyperthreading isn't some magical thing. Yes, it would allow you to do both at the same time, but you still take a performance hit; this is something you might notice when you're playing a game like Unreal or Battlefield Vietnam, where every FPS is precious. Plus, why not use both threads so that the video work goes faster? In that case you couldn't be doing other things as well, as it would hinder the encoder from using both threads completely. So you'd leave the computer alone while it's working... so you're back to where we started. Hyperthreading is a boost, but it's not what has kept the P4s in the lead in video/audio work.
 
The same reason they don't scale perfectly in every other benchmark: the biggest thing is that they have to share one bus, and bandwidth gets limited. Most benchmarks show Opterons scale much better.


So explain to me why Itanium scales better than Opteron in most benchmarks.

I need to change my user name.
 
>Always small steps...

Nah man, you got it wrong. Intel is making bigger and bigger steps every year. From the 486 (120) to the Pentium 66 was only a small step back. From Pentium to Pentium Pro was a rather major step back running Win95, from P3 to P4 was a HUGE step back for pretty much everything except Quake, and from Xeon to Merced was a DISASTER, giving you a fraction of the performance at a huge price premium. Now Intel is about to make the biggest step ever: go back from the Pentium 4/5 to the 10-year-old Pentium Pro! Small steps my arse, those are GIANT LEAPS (backwards)! LMAO, next thing, Intel will reinvent EDO RAM and external FPUs! Or maybe they can license the K6 design from AMD?

---------------------------------------------------
I am severely limited in what my mind can perceive.
 
Poor dude, do you suffer from strabismus?

---------------------------------------------------
I am severely limited in what my mind can perceive.
 
re: Honestly, when I play a game I want good FPS and fast load times, but when it comes to video editing, I know that rendering a long movie will take a long time, and I know that when these jobs are running on my PC, I don't stay in front of it waiting for the completion bar to fill. Even if I had a P4EE 3.4GHz, I would not wait for a 2-hour video to render. I would start the rendering at night or when I know I will not use/need my computer for a long period.
---

Yea, I agree, for home users the P4EE 3.4 is not needed. I just paid to have a video job done for a project, and it cost me $40,000. I almost couldn't get the crew I got cuz they were very busy. If having a P4EE 3.4 over anything else for their video editing can help them do one more job a year due to time savings, then for them it's worth it. That's the market where they don't care about a few extra hundred dollars for a chip. Hell, 5 years ago they were paying 30 grand for a PC to do the same job for the same money.