Intel bribing THG? Is it possible?


coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
That has nothing to do with their performance

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5950ULTRA <-- I wish this was me
waiting for aBox~~~~~~~~~~~~~~~~
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
And at least Intel's rebranded Xeon runs on P4 mobos and doesn't require ECC RAM to run. (How many performance freaks do you know that use ECC?) So it's an actual desktop chip, even if it is a Xeon at heart.

AMD's rebranded Opteron however won't even touch a plain A64 mobo <i>and</i> needs the slower and more expensive ECC RAM just to run. So with AMD you end up with a complete and entire Opteron workstation, but with an Athlon label. Yeah. Great idea AMD.

<pre><A HREF="http://ars.userfriendly.org/cartoons/?id=20030905" target="_new"><font color=black>People don't understand how hard being a dark god can be. - Hastur</font color=black></A></pre><p>
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
What's it been, 1.5 to 2-plus years for Hammer, and this is all we get? I would have thought AMD would have done more. What are they waiting for?
Actually it's pretty impressive in my opinion even if it is just an upgraded K7.

For starters AMD is <i>finally</i> taking advantage of memory bandwidth. That's a huge step forward for them.

And then there's the actual 64-bitness, which really isn't needed yet but at least <i>might</i> shift PCs towards 64-bit. And at the very least it makes it a good low-cost server.

Really there are only two disappointments as far as I can see:
1) AMD actually bothered with a single-channel DDR version of A64. They could have just designed a non-ECC Opteron and ditched Athlon forever as far as I'm concerned.
2) AMD's dual-channel A64 FX (though it really should just be called the Opteron that it is, since there IS no difference other than name and perhaps price) is stuck using the slower ECC RAM. Once AMD gets dual-channel working with <i>normal</i> RAM (or ultra-low-latency RAM more specifically) it'll perform even better.

Really I think that the Opteron at the very least proves that AMD isn't going down without a fight. They're still struggling to keep up with Intel. Now if they can just design a new chip from scratch using all of Opteron's good concepts they might really have a chance of impressing us.

<pre><A HREF="http://ars.userfriendly.org/cartoons/?id=20030905" target="_new"><font color=black>People don't understand how hard being a dark god can be. - Hastur</font color=black></A></pre><p>
 
About time with the bandwidth. They should have dual channel. That's a big IF they can design that new Opteron-style chip you mentioned. That would be real nice. When? Do we have to wait two more years?
 

Grub

Distinguished
Aug 30, 2002
2,814
0
20,780
did he fall on a chain link fence or something?
I think that is the key right there. The current architecture is a little tired. I know that my athlon is good on games, but the low clock speed makes me want to get an intel processor whenever I start encoding video. If they can figure out a way to raise the clock speed on the current design, then fine, but if they can't maybe they should explore some ways of increasing performance in this application---> new chip design?

Scamtron doesn't like my sig...
 
G

Guest

Guest
> Most show the opposite, but to a smaller extent than THG
>(like 11-9 instead of 32-15). None seem to use as MANY
>tests as THG did.

If you are going to award a performance crown and base a buying decision upon the number of benchmarks won, then a little common sense wouldn't hurt. For instance, do you seriously find Q3A 640x480 results THREE times as important as a DX9 game result? Or SPECviewperf FIVE times as important as LAME or 3D Studio? That is nonsense.

The problem with THG (and many loyal followers on this board) is not the benchmarks as such, but the interpretation of them. Let me give you an alternate interpretation of the very same datapoints (benchmarks) with a focus on gaming. I'll stick to the A64 versus the P4C (as the P4EE isn't there yet, and the FX is likely too expensive for most of us):

1) Q3A: IMHO utterly irrelevant. No one buys a new CPU to get better framerates in Q3, but since it's the only OpenGL game bench -> P4C
2) Serious Sam -> A64
3) Wolfenstein -> P4C
4) Comanche 4 -> P4C
5) UT2003 -> A64
6) Splinter Cell -> A64
7) Warcraft 3 -> P4C
8) X2 -> A64
9) Gunmetal -> A64
10) 3DMark 2001 -> A64
11) 3DMark 2003 -> A64

Out of 11 game benchmarks the A64 wins 7, and it wins every DX9 bench except Aquamark (which is more a DX8+ bench anyway). So, guess which CPU gets my recommendation for gaming?
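The tally above is easy to double-check. Here is a minimal Python sketch of the bookkeeping; the benchmark names and winners are copied straight from the list in this post, not new measurements:

```python
# Tally game-benchmark wins per CPU, using the winners listed above.
from collections import Counter

results = {
    "Q3A": "P4C",
    "Serious Sam": "A64",
    "Wolfenstein": "P4C",
    "Comanche 4": "P4C",
    "UT2003": "A64",
    "Splinter Cell": "A64",
    "Warcraft 3": "P4C",
    "X2": "A64",
    "Gunmetal": "A64",
    "3DMark 2001": "A64",
    "3DMark 2003": "A64",
}

wins = Counter(results.values())
print(wins)  # A64: 7 wins, P4C: 4 wins, out of 11 benchmarks
```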

If you look at media encoding, the only obvious choice is the P4C, although 64-bit support might well change that, as some early 64-bit benches indicate (i.e. the DivX encode on Ace's Hardware).

For 3D rendering & CAD both CPUs are pretty much even, and office performance is a total non-issue no one cares about.

Pricewatch lists the P4C 3.2 at $599; the Athlon 64 is $441.
So if you are a gamer without an unlimited budget, the A64 seems to be your logical choice, and 64-bit support is there for free. If you do a lot of media encoding, the P4C's price premium over the A64 is worth it, and gaming performance is still respectable.
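For what it's worth, the size of that premium works out as follows, using the Pricewatch figures quoted above:

```python
# Prices as listed in the post (Pricewatch, 2003 USD).
p4c_price = 599
a64_price = 441

premium = p4c_price - a64_price          # absolute premium for the P4C 3.2
premium_pct = premium / a64_price * 100  # premium relative to the A64's price

print(f"P4C premium: ${premium} ({premium_pct:.0f}%)")
```

So the encoding crowd is paying roughly a third more for the P4C at these prices.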

The FX is for people with too much money, and the P4EE a collector's item for those with too much money and patience to boot.

That's my analysis...

= The views stated herein are my personal views, and not necessarily the views of my wife. =
 

TomaHawK

Distinguished
Aug 28, 2003
64
0
18,630
Hmm, I think the 64 FX will dominate the market, even though it's being released a bit too early, with not many 64-bit apps, no standardized RAM, and premature boards... but isn't that what all the great ones have done? Man, remember Win95? OS/2 Warp was a lot better, but launched the wrong way. It's all about marketing and the rest will follow. If the 64 FX wins the race, and I think it will due to its early launch, the rest (chipsets, apps, and memory...) will automagically follow.
(don't kill the messenger)

Beat the heat with the USB-Powered Fan :wink:
 
G

Guest

Guest
>hmm i think the 64 fx will dominate the market

Dominate? LOL!! Get real, AMD states the A64 will outsell the FX by 40 to 1. The FX may (or may not) dominate benchmarks, but there's no way it's dominating anything else. It's a very low-volume niche product (much like the P4EE). We are talking tens of thousands, not (tens of) millions like Athlons/Durons or P4s/Celerons.

>even tho releasing it abit to early i think with not many
>64bit apps,

How do you expect software vendors to port if there are no chips on the market? Of course there is still little AMD64 software out there; how long did it take to get MMX, SSE, and SSE2/P4-optimized apps? Hardware first, software follows, always, and never the other way around. You can't expect AMD to wait until everyone has ported to AMD64 before releasing the chip, can you?

> and premature boards
The FX uses the same boards as the Opteron 1xx, which have been out for a while. Compared to the 1999 Athlon (K7) launch, boards and support are very mature.

= The views stated herein are my personal views, and not necessarily the views of my wife. =
 

sjonnie

Distinguished
Oct 26, 2001
1,068
0
19,280
It seems like every review looks biased, no matter which brand wins.
In the end, there is far more to think about than which brand has the absolute top benchmark in some hardware lab in Munich. In reality you could build a system using an AMD or Intel processor with an ATI or Nvidia card and end up with a machine capable of playing any current game perfectly. So why is it always portrayed as a "war"? My guess is because there are only two main players. Infineon isn't at war with Corsair because Kingston/Crucial/Mushkin etc. are waiting in the wings. MSI isn't at war with ASUS because Gigabyte/Iwill/Abit offer equally good products. But even with only two main players, there can still be strengths and weaknesses to both products, just like there are strengths and weaknesses to MSI boards or Crucial RAM.

Ah well, I guess your calls for rational assessment of the facts will continue to fall on deaf ears Crashman. Yes, the P4 EE is certainly a spoiler for what is actually a nice product from AMD. In a way I feel a bit sorry for the guys at AMD who must be feeling pretty down about it. But then that's why Intel released it. Having said that, remember how crap the P4 was when it was released and how we all derided it. The future is bright, it's neither Intel's nor AMD's, it's ours. Build your system and enjoy.

<A HREF="http://www.anandtech.com/myanandtech.html?member=114979" target="_new">My PCs</A> :cool:
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
Yes, the P4 EE is certainly a spoiler for what is actually a nice product from AMD. In a way I feel a bit sorry for the guys at AMD who must be feeling pretty down about it.
I would have said to remember how AMD held an AMD64 love-in right outside the very doors of IDF to spoil <i>that</i> show. Turnabout is fair play after all. ;) If you can't play with the big dogs then stop pissing in their yard. :O ...An' all that.

<pre><A HREF="http://ars.userfriendly.org/cartoons/?id=20030905" target="_new"><font color=black>People don't understand how hard being a dark god can be. - Hastur</font color=black></A></pre><p>
 

darko21

Distinguished
Sep 15, 2003
1,098
0
19,280
Re: WTF are you on, crack? Don't you remember how Intel mistreated THG over its exposure of the PIII 1.13 Coppermine flaw? Or Tom's attack on the i820 Rambus chipset? Or Tom's general attacks on Rambus? Or Tom's attacks on the inside deals between Intel and Rambus before their partnership? Or the fact that Tom himself testified against Rambus in the memory makers' lawsuits? Or Tom's attacks on the Socket 423 platform, or the low IPC of the P4 Willy?

So is Intel bribing THG? Personally I doubt it.

However, this site has definitely changed over the years. From what I remember (and I have been using this site for many years), things started really changing around the time that article about fanboys was written. The term "fanboy" is now common language on review sites and forums all over the internet.

IMHO (yes, this is merely a theory on my part) Tom sold out to investors or partners. I'd guess Tom still has a stake, but the site is out to make money, so it's more commercialized. As a company out to make more and more money (hard to do on the internet with all the competition), they must respect the large advertisers like Intel, Nvidia, and Microsoft. In the old days Tom would slag Intel to no end, and he would never do that today.

THG has to walk a fine line: too many smart, knowledgeable people read the reviews. These are not reviews about questionable matters of taste, like whether Coke or Pepsi tastes better, but reviews about factual things, and if they fudge the numbers this site will lose all credibility.

In a review today THG still does fairly objective reporting. However, THG does not seem to reprimand THE BIG GUYS when they pull very questionable marketing stunts; Tom definitely would have torn them a new one in the past. Case in point: Nvidia and all the garbage they have pulled in the last year. I know THG was careful to point out that the beta drivers were suspect and to be taken with a grain of salt, but in the past Tom would have torn Nvidia a new one for all those stunts. Like I said, THG is walking a fine line between knowledgeable readers and keeping the big advertisers happy or content.

So in regards to is Intel bribing THG? Is it possible?
Possible yes, likely no, but putting that overclocked P4 in that review was tacky to say the least.
 

Copenhagen

Distinguished
Oct 21, 2001
552
0
18,980
So in regards to is Intel bribing THG? Is it possible?
Possible yes, likely no, but putting that overclocked P4 in that review was tacky to say the least.
Guess what, THG has just officially admitted that it was a mistake:
<i><font color=red>
Update Sept 24,2003: Unfortunately we have made a mistake in the original article: In addition to the official P4 EE 3.2GHz we had included benchmark scores of the P4 Extreme 3.4GHz and 3.6GHz. These values were planned for a future THG article and were not intended to be included here. We would like to apologize especially to those readers who misinterpreted our charts. The two bars of the P4 Extreme 3.4GHz and 3.6GHz have now been removed. However, this issue does not affect our conclusion as we have only compared the official P4 3.2GHz EE to all other test candidates in our original article. For your information: The press sample of the P4 Extreme provided by Intel does not have a multiplier lock and is already designed for higher clock speeds. </i></font color=red>





<i>/Copenhagen - Clockspeed will make the difference... in the end</i> :cool: - <A HREF="http://icq277242841.subnet.dk/_1046137.html" target="_new"> <b><font color=blue>My Rig </font color=blue></b> </A>
 

Copenhagen

Distinguished
Oct 21, 2001
552
0
18,980
And here is the link:

<A HREF="http://www20.tomshardware.com/cpu/20030923/athlon_64-22.html" target="_new">http://www20.tomshardware.com/cpu/20030923/athlon_64-22.html</A>

<i>/Copenhagen - Clockspeed will make the difference... in the end</i> :cool: - <A HREF="http://icq277242841.subnet.dk/_1046137.html" target="_new"> <b><font color=blue>My Rig </font color=blue></b> </A>
 

Incitatus

Distinguished
May 20, 2003
36
0
18,530
They 'accidentally' added results from a non-existent chip meant for a future article? Yeah, right.
Sounds very probable. Personally I think they didn't expect the backlash they got and are now trying to weasel their way out.
 

Ncogneto

Distinguished
Dec 31, 2007
2,355
53
19,870
That could not have been any more absurd. I suppose when the XP2600+ was paper launched, THG didn't use it later when it was never available, same as the XP2800+ huh?
Actually, you're quite wrong. What you say is absurd. A paper launch is a paper launch, regardless of whether it's AMD or Intel. If the scenario were reversed I would be saying the same thing in support of Intel. I am not a fanboy crying foul by any means; anyone suggesting such really has no clue. If AMD paper-launched processors in the past, then the same applies to them. However, this is not a paper launch (for AMD); no one can argue it is. Even Intel zealots have to agree that, at this time, the Athlon 64 FX is the fastest processor you can buy.


Gee, I suppose THG deserves slapping.
Yup, they do. The quality of the reviews has steadily gone downhill over the last two years.


It's not what they tell you, its what they don't tell you!
 

Ncogneto

Distinguished
Dec 31, 2007
2,355
53
19,870
And at least Intel's rebranded Xeon runs on P4 mobos and doesn't require ECC RAM to run. (How many performance freaks do you know that use ECC?) So it's an actual desktop chip, even if it is a Xeon at heart.

AMD's rebranded Opteron however won't even touch a plain A64 mobo and needs the slower and more expensive ECC RAM just to run. So with AMD you end up with a complete and entire Opteron workstation, but with an Athlon label. Yeah. Great idea AMD.
Yeah, OK, run a Willamette-core processor on a Northwood board. Enough with the motherboard issue.

And now to address the ECC RAM. Big deal; as demand goes up, the price will come down. Most of the people who use their computers to do more than just play games actually consider ECC a blessing, not a curse. You want to run more than a gig of RAM and not have ECC? Have fun! For those who find ECC unacceptable, the 939-pin variant will not require it and will compete quite favorably with the P4. Furthermore, the requirement for the Opteron and Athlon FX is registered memory, with ECC optional.

It's not what they tell you, its what they don't tell you!<P ID="edit"><FONT SIZE=-1><EM>Edited by ncogneto on 09/24/03 07:07 PM.</EM></FONT></P>
 

rain_king_uk

Distinguished
Nov 5, 2002
229
0
18,680
Did they include it to show :
a)how well the EE scales up?

How does overclocking a hand-picked engineering sample show how well the EE scales up?

Let's face it, Intel sent out unlocked CPUs hoping some n00b reviewer would do exactly what THG did. The only way anything other than a P4 3.2 EE is ever going to be available is if Prescott is severely delayed. Therefore having a 3.4 and a 3.6 in the review did nothing but mislead.
 

Crashman

Polypheme
Former Staff
Yes, with crap articles on things like the VIA Epia platform (we won't test it in games because that's not what it was intended for), Shuttle cubes (it could hold two hard drives but they'd be so close together we can't recommend it), fanbus devices, etc., they're becoming more and more a marketing site rather than an enthusiast site.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 

Crashman

Polypheme
Former Staff
YOU DON'T UNDERSTAND! IT'S UNFAIR FOR ANY WEBSITE TO COMPARE A FACTORY MODIFIED XEON TO THE FX-51, HOW DARE YOU CALL IT AN OPTERON CLONE!!!

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 

Crashman

Polypheme
Former Staff
None sucked, it's just that any website that doesn't claim AMD superiority sucks, regardless of the facts (that both perform similarly).

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 

Crashman

Polypheme
Former Staff
I still think AMD stock will hit at least 20, and if I'm wrong it will be my first bad stock pick.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>