AMD vs. Intel: Refuting Historical Inaccuracies



The standards we expect everyone to share are those of sound reasoning and the ability to absorb new information without feeling threatened (Open Minded).

The facts and the evidence don't support your favored company (AMD) when it comes to the technological innovations claimed by its fans. Simply posting the facts and the evidence does not make me an Intel fanboi. The evidence is what it is. History (or the evidence, for that matter) does not change just because we don't like what it has to tell us about a given subject.

Opinions should be formed based on evidence, not prior to obtaining it. The reason I appear to be winning this debate is simple: I'm not arguing in favor of Intel... I am arguing in favor of the evidence. You shouldn't be getting defensive about this; you should absorb the evidence, gain knowledge, and then freely share that newfound knowledge yourself. But instead of following that path, you're showing yourself to be a partisan hack who cares nothing about the "truth/evidence/facts" and would rather spin the facts and history so they suit his own worldview.

You are making quite a few "cry" remarks, and you're correct: I do feel sad... for you. I feel sad for someone who would rather stand their ground knowing they're wrong than simply admit defeat, thank those who have provided enlightening material, and move on to spread the word of this newfound knowledge.
 


The fact that they used it on one product line and then abandoned the technology actually DOES mean something important. To pretend it doesn't mean anything just because you are trying to make a point is ridiculous.

They abandoned the technology. They did not implement it large scale. This is something that happened.

When you abandon something that means you've decided it is not something you will use. You have abandoned it. Disowned it. Realized you have no use for it.

If somebody comes along later and takes that idea and implements it on a large scale it makes you look pathetic for abandoning it. Perhaps we shouldn't use the word "innovative" for the company that implemented it. But we definitely can use the word "pathetic" for the company that abandoned it.
 


So because I don't blindly accept your facts, I'm somehow not as open-minded as you. On that basis I'd rather be closed-minded than as "open-minded" as somebody who blindly accepts your opinion and "spreads the word of this new found knowledge."

(You do realize that the last paragraph above sounds like something that would be said by a cult?)
 


That's not true; Intel did not disown/abandon it. They decided to use an alternative which they felt was better.


Correct me if I'm wrong (I don't know all the gritty details), but hasn't Intel re-implemented this technology in QuickPath?

Can Intel reimplement and abandon something simultaneously?

Also, does not Intel's implementation of the technology provide better results than AMD's implementation in CPU intensive benchmarks?



Not that HOW you get to the result matters much; it's what the result is that counts. Now, to be fair, this doesn't mean that AMD's stuff is junk, because for most people the goal isn't to have the fastest processor; the goal is to have something that will meet their needs within a certain budget, and AMD is going to be perfectly sufficient for many people.
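
To make the benchmark point a bit more concrete: the place an on-die memory controller shows up most clearly is in memory-latency tests, where every load has to wait on the controller and DRAM. Below is a rough, hypothetical sketch of such a pointer-chasing microbenchmark, assuming a Linux/POSIX toolchain; the buffer size and iteration count are arbitrary illustration values, and this is not any of the benchmarks discussed above.

```c
/* Rough sketch of a pointer-chasing latency test (assumes a POSIX system).
 * Each load depends on the previous one, so the loop time is dominated by
 * round trips through the memory controller and DRAM. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (1u << 24)   /* ~16M entries, far larger than any cache */
#define ITERS (1u << 26)

int main(void) {
    size_t *chain = malloc((size_t)N * sizeof *chain);
    size_t *order = malloc((size_t)N * sizeof *order);
    if (!chain || !order) return 1;

    /* Build a random cycle so the hardware prefetcher cannot predict the walk. */
    for (size_t i = 0; i < N; i++) order[i] = i;
    srand(42);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % (i + 1);
        size_t t = order[i]; order[i] = order[j]; order[j] = t;
    }
    for (size_t i = 0; i < N; i++)
        chain[order[i]] = order[(i + 1) % N];
    free(order);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    size_t p = 0;
    for (size_t i = 0; i < ITERS; i++)
        p = chain[p];                /* dependent load chain */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("average load-to-use latency: %.1f ns (checksum %zu)\n", ns / ITERS, p);

    free(chain);
    return 0;
}
```

Build it with something like gcc -O2 and compare the reported nanoseconds per load across machines; the absolute numbers matter less than the relative difference between memory subsystems.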
 


Ah... a year later and you are STILL attempting to put words into my mouth. (It didn't work the last few times you tried it either.)

I never said there was anything wrong here. In fact I've said several times: Coming here is entertaining and makes me laugh.


AND TC: You keep telling yourself those things. Make it a mantra. "Intel didn't abandon it. They just didn't bother using it in the next 3 or 4 architectures. There's a difference."
 


No, you shouldn't; according to the definition of the word, that honour goes to Intel.


Innovative adjective = using new methods or ideas

It does not describe taking an old, already tried idea, and running with it.

 


I didn't put one word in your mouth. What was the first thing you posted? Pretty much arguing against the OP's facts and then twisting them. Sure, Intel didn't go with it. AMD jumped technologies pretty fast too; they went from S939 to AM2 quickly, thus abandoning it.

Well, if you are entertained, then good. Besides, it's all opinion unless you can post actual facts, which, funny thing, the OP did. None of that was opinion. Yours is.

Interesting....
 
The ONLY thing keeping Intel's IGPs afloat is their CPUs; otherwise, they'd be dead.
Cheaper? That's what I'd like to see, as I don't really have a pricing comparison and would love to see one. I'm thinking they aren't competitive on price/perf, and I won't be swayed by hearsay; if you have links, I'd appreciate it.
And I, too, disagree with you: since the IGPs of nVidia and ATI will have caught up with Intel in process and HKMG, this won't just go to waste.
As we move toward more mobile solutions, and as we see more GPU usage in our day-to-day apps, having a useful and powerful IGP becomes more and more important, and yes, the game has changed.
I know it's early on, but to say nothing happens in 2-3 years goes against two major releases, plus trends in all directions.
LRB and Fermi will both have huge impacts on desktop GPGPU-type usage over the next 2-3 years, and having all the things Intel currently has just means an extra generation's growth/development in those IGPs/GPGPUs.
The way I see it, as time goes forward, Intel doesn't have as much to offer as far as process goes, unlike those IGPs/GPGPUs, and the IGPs will only gain a generation or two over current Intel solutions.
That's why I say it's imperative that LRB delivers, and big time: Intel has basically shut out all other alternatives in this direction and NEEDS a full platform, regardless of their CPUs' abilities, and it's imperative that LRB scales extremely well in this regard.
 


Did you not notice the whole host of links (evidence) on the topic at hand? Did you not notice that they're all credible sources and historically accurate?

I'm not asking you to take my word for it... I'm asking you to look at the evidence. If you look at the evidence, you should, if you're being reasonable, come to the same conclusions I have.

I don't want you to blindly follow what others say (which in fact is what you're doing right now... you're blindly following AMD and AMD fans). I want you to simply absorb the evidence.

That is all.
 


Intel did not outright abandon the technology. Intel used the technology across two product lines. Intel stopped using the technology for many years and is now back to using it. So in essence Intel never abandoned it; they just stopped using it for a while until it became something that was needed again.

Intel did the same with Hyper-Threading. After the Pentium 4/D series (NetBurst), Intel stopped using HT on their desktop product lines (and in the mobile segment as well). The Core 2 didn't have HT. Intel is once again using HT with their Core i7 line.
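
As an aside, whether HT/SMT is actually enabled on a given machine is easy to check yourself. Here is a minimal sketch, assuming a Linux system that exposes the usual sysfs CPU topology files; the "core 0" choice and the file path are just the common case, not anything specific to the chips discussed here.

```c
/* Minimal sketch (assumes Linux sysfs CPU topology is available):
 * if CPU 0 lists more than one logical sibling, SMT/Hyper-Threading is active. */
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *path = "/sys/devices/system/cpu/cpu0/topology/thread_siblings_list";
    char buf[64] = {0};
    FILE *f = fopen(path, "r");
    if (!f || !fgets(buf, sizeof buf, f)) {
        fprintf(stderr, "could not read %s\n", path);
        return 1;
    }
    fclose(f);
    /* A value like "0" means one thread on the core; "0,4" or "0-1" means SMT. */
    int smt = strchr(buf, ',') || strchr(buf, '-');
    printf("SMT/Hyper-Threading active: %s\n", smt ? "yes" : "no");
    return 0;
}
```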

Perhaps the dictionary is a great place for you to start:
innovation [ˌɪnəˈveɪʃən]
n
1. something newly introduced, such as a new method or device
2. the act of innovating
innovational adj
innovationist n

Who "newly" introduced the IMC to the x86 architecture? Intel.

End of debate.

(Unless of course you feel that you're worthy of re-writing a dictionary).
 

The only Motorola CPU I own resides in my car's fourth-generation electronic engine control computer, running at an amazing 15 MHz.
 


Oh... you're just so right. I guess it's just wrong to say that they abandoned the technology because they stopped using it for many years. We need another word to describe shelving something and not using it for many years. Let me think... what would be the best word... let me think... Oh yeah... the word "abandoned" fits that description.

You are correct in your original facts. What you are not correct on is the IMPORTANCE or RELEVANCE of those facts.

==============
PLUS something not even mentioned:

What was Intel's original intent? When the IMC was originally designed, was it meant to be implemented in all their systems? Was that their goal? Or did they just want something that would let them fold the chipset into the CPU to make manufacturing cheaper, because it would be one piece instead of two? It had nothing to do with performance.

So pop quiz: Did they "invent" the IMC as it is implemented now?
 


The idea for an integrated memory controller was not "abandoned". A more fitting word would be shelved.

Well I understand why that was not mentioned... because it's a straw man argument and has nothing to do with the topic at hand.
"the author attacks an argument different from (and weaker than) the opposition's best argument."

You want to divert the conversation away from the original intent (which was to counter false claims regarding who came up with what technology first) and instead focus on the intent of the CPU makers (as though intent has any relevance to the topic at hand).

I'll simply ignore your attempt at a straw man.
 


It might be a straw man argument IF I was claiming your main argument was incorrect.

Your main argument is correct. Your secondary argument which you are only implying is not correct.


PLUS analyzing their intent is actually a lot more important than you pretended. (I understand why you would want to sweep that under the rug as fast as possible since it is devastating to your secondary implied argument; which also reduces the impact of your primary argument.)

To use YOUR word... you can spin it all you want. It won't change reality.
 

No.

It's that simple. No.

Intent does not at all have anything to do with the topic at hand. Whether or not Intel's intent was to integrate the memory controller onto the i386SL in an attempt to save space does not at all affect the outcome (that they integrated the memory controller into the CPU die).

It also does not diminish the accomplishment.

Whether the intent was to obtain more performance or save space does not at all have any bearing on the topic(s) at hand. You have completely lost this argument. Go home (AMDZone) and lick your wounds. While you're at it.. send back the best "they" have. 😉
 


You have failed to actively back up your secondary implied argument.

So now you have started to resort to getting inflammatory because you realize that there is no way you can support that implied argument.

Your first fact is correct. Your implied argument that it is somehow important has failed. It is making you crazy because there is no way for you to make me accept that secondary argument. (Especially when you dance around the issue.)

As I said earlier... this forum is fun in that you guys are completely guilty of the very things that are blamed on AMD Fanboys. But somehow that's okay for you.

Oh WAIT... I forgot you're not a fanboy... you're just "unbiased".

EDIT: PS::: The intent issue IS important to your secondary argument. (Whether you like it or not.)
 

You are completely out of your league.

What second argument... this one?
Intel did not outright abandon the technology. Intel used the technology across two product lines. Intel stopped using the technology for many years.

That's factual. Intel did not abandon the "idea" of an Integrated Memory Controller. How do I know this? Because when Intel was posed the question they replied with this (http://blogs.zdnet.com/BTL/?p=1455):
I’m not saying Intel will or won’t go this route someday. History shows we’ve gone both ways. We integrated the memory controller in our CPUs as far back as 1990 (i386SL w/ Page Mode DRAM + SRAM + FLASH MC) and 1992 (i486SL w/ Fast Page mode 3.3v DRAM, x4, x8, x16). We do it now with our X-scale based products. And, we had it as part of a desktop PC chip project called Timna that we cancelled in the late 90s.

This is an acknowledgment that Intel knows they've done it before and continue to do it to this day with their X-scale based products, and that they had a desktop project named Timna (which would have included an IMC) which they canceled in the late 90s.

In other words, Intel has been using IMCs ever since 1990 across various products. So they did not abandon the technology as you claim. They stopped using it on the x86 architecture for many years and are now using the idea once more (all the while still using it on their X-scale lineup).

Again you lose that argument just like any other argument you've brought forward.

Now please... enlighten us: just how does Intel's intent diminish their accomplishments in this field?

I'm not a fanboi, my friend... you are the fanboi (you admitted as much yourself, and you frequent AMDZone). You simply disagree with the evidence I am providing, which does not favor AMD but favors Intel; therefore you assume I have ulterior motives (which is simply an attempt at projecting your own personal imperfections onto someone else).
 
Let me guess, keithlm thinks he's not an AMD fanboy?

As I see it:

- "Intel made the first x86 CPU with IMC" - correct
- "AMD was the first to make IMC widely available in a performance/desktop procoess" - correct

I just don't quite get what is being argued here. It seems to just be all spinning/marketing.
The first of those two points was unknown to many, and is a useful piece of history.
The second is just a correction of the common "Intel copied HyperTransport" or "AMD did IMC first!" statements.

 
Anonymous wrote:

1-2-2005

Which RAM do you guys suggest? I've had a stability problem since the build was new. An OCZ DDR333 problem? The system freezes randomly and I have to hold in the power button, I found out. It's getting on my nerves. Any fix for this?

DFI S939 NF4
OCZ Gold 400 512 MB x 2
Antec 550w

Let me know if further info is needed. I think I have the board set up for dual channel. Hello. Goodbye.
 

He is arguing that intent (meaning the fact that Intel integrated the memory controller to save space) somehow diminishes the accomplishment of being the first to do so, because in his worldview... it's only an accomplishment if the CPU maker's intent was performance.

In other words he doesn't have a single argument. He's simply being a troll.
 