Intel Gets Start of Antitrust Backlash from OEMs


enigma067

Intel Gets Start of Antitrust Backlash from OEMs

By Erik Sherman | Jan 4, 2010

A recent announcement that Lenovo would use CPUs from AMD (AMD) in a couple of its ThinkPads rather than chips from Intel (INTC) is the beginning of the price the chip giant could end up paying for its alleged anticompetitive activities: OEM customers shifting their orders.

In two separate statements, Lenovo said that it would use AMD chips in the ThinkPad X100e ultraportable as well as the 13-inch ThinkPad Edge series, which is aimed at small- to medium-sized businesses. This is the first time that the ThinkPad brand, originally owned by IBM, will have used non-Intel chips:

An ultraportable PC positioned between a notebook and a netbook, the ThinkPad X100e can be equipped with AMD’s Athlon Neo single-core and dual-core, as well as the Turion dual-core processors. The ThinkPad Edge model, the smallest of three offerings in this product family and targeted at small and midsize businesses, may be paired with dual-core AMD Turion and Athlon Neo processors. The 14-inch and 15-inch ThinkPad Edge versions will still be powered by Intel’s Core 2 Duo chips.

Before you say, “But those are the small systems,” remember that the smallest systems, like netbooks, are the ones whose sales are really growing. To put it differently, AMD may not be in the prestige machines, but they’re going into the ones that may get the greater volume sales.

Starting in mid-November, I began noting that the upshot of all the antitrust activity focused on Intel would be customer defections:

PC vendors get completely wary of being sucked into the investigatory void and start shifting a significant portion of their purchasing to AMD. Forget fines and forget legal fees. That’s going to be the real price tag for years of allegedly using money and influence to keep a competitor constrained, and it will be a number with a whole lot of zeros.

I think the Lenovo switch is the first sign of that real price tag. Who knows how large a card it will need to be to record all the potential long-term loss for short-term gain?

Image via stock.xchng user MeHere, site standard license.

http://industry.bnet.com/technology/10004584/intel-gets-start-of-antitrust-backlash-from-oems/
 


I will see what I can find; it was a while ago. Something about their fiber-embedded-into-silicon stuff.
 

I was generalizing vastly. The point is Intel has the stuff to take on ARM. Sorry to be rude here, but it's annoying when people just don't think outside of a linear pattern of symbols representing an idea.

Look into Atom SoCs and realise how silly you have been.

It's annoying when people just don't think outside of a linear pattern of symbols representing an idea.
I'll keep it in mind that some people don't look to the visualization of the idea but rather limit their perspectives on opinions to said symbols.

No offense, man, but friggen derr. It would have been less stressful if you'd just suggested Atom where I was generalizing about SoCs. It's like myopia with some people; they have to get stuck on themselves instead of trying to translate ideas.

Mooo, FYI dewd. I even give Jenny that benefit, and we're at each other's throats too often. But then I guess expecting others to be able to see ideas in words is asking too much for most. At least Jimmy seemed to sense I was generalizing a bit; must have a decent IQ.

Meh, it's not like I don't know a PhD who can't get past his own inner ape 😀 Nothing to feel bad about.
 
I think the exactness of thought and ideas can be crippling, especially if those ideas are trusted and not expanded upon.
I love looking at the bigger picture, which requires this.
Not digging up bones here, but Intel's IGPs are a great example.
As we progress and our CPUs get better, we reach the point of diminishing returns for the average user and need other things, like graphics, thus Intel's IGP. It's another feature average Joe can and will appreciate, and it's important for sales.
So, sure, I've never denied this is a need in the business environment, unless other apps start using GPGPU things, but to apply it toward a large market, say, of average Joes...
If you get to the point where every i needs a dot and every t needs to be crossed in casual conversations, even though technical, we slow down to the pace the governments most dislike.
 
I think if Intel could get i3 power into the size and TDP of Atom for cell phones and embedded, ARM would be killed instantly. Right now the fastest cell phone CPU I have seen is the 1GHz Qualcomm Snapdragon in the Google Nexus One.

Imagine a 2.5GHz CPU with an IGP that can play better-than-PlayStation (probably near-PS2-quality) games in a cell phone the size of the Droid...
 


This I didn't know - thanks for the update. I suspect that with 32nm and below, Intel could stick something in between an Atom and a Nehalem core on the new CPU. Or else design a mobo with both x86 & non-x86 support.

Even though I think it is unlikely, Itanium could still end up as the dominant CPU architecture in 15 years' time, but barring that, Intel will be with x86 forever and a day.

Itanium is VLIW, correct? So the burden of instruction scheduling (what out-of-order hardware does on x86) falls upon the compiler rather than the CPU hardware, which means software would have to be better optimized and thus more expensive, or better compilers developed. VLIW would probably kill off PC gaming, which would make Jay tear off on yet more anti-Intel rants 😀.

As for ARM, they are already optimized for low power embedded apps - it's Intel who can make the most progress there with Atom (to paraphrase Jay's & Jenny's arguments on HKMG :kaola: ). And I also don't think ARM has the resources to move upwards to mobile and DT & server and displace x86.

AFAIK even Nehalem still supports legacy stuff like 16-bit 'real' mode with segmented addressing.

Really it comes down to Intel to move the industry forward, and I agree it's time to shed the legacy poundage and jump into the 21st century with a new CPU and instruction set.
 


Yes, let's all imagine a world where Intel progresses and the rest don't. That's the sort of world Intel would love us all to be living in; thankfully it's quite far removed from reality. 😀
 


That's also what I was saying: taking SOMETHING LIKE i3 (notice the intent to generalize) below 22nm, or Atom, or the 40-core x86 chip... that's where I was going. With Atom you add functionality as with i3, and there is the 80-core thinga-ma-bob (technical term for generalizing that chip).

Point is, can Intel kick ARM's butt? Not right this minute, but definitely on smaller nodes. The examples are all over their tech, from i3 to Atom to the now non-mainstream Larrabee ideas.
 


We will get a better idea later on this year when this product gets released.

[flash=560,340]http://www.youtube.com/v/MKC0UpizqUU&hl=en_US&fs=1&[/flash]

 

The problem with ARM tech is that it's mostly subfunctional to PC demands, and people want a PC in a phone; they want as much PC functionality as you can get in that format. I think that trend will continue to the point where most casual PC users are plugging monitors and keyboards into cell phones.
The question is, how relevant is ARM in that scenario?
 

Wow, no sooner than I said it and it's already here. That's the future of casual PC use right there. Add a monitor jack and a mouse jack and you'll sell them to everyone.

That will probably be capable of Windows Mobile, Apple will probably eventually throw its OS on one like it, and what does ARM have? The lower end of the cell phone spectrum. ARM can't do that level of computation, not yet at least, and that really drives home the fact that they teamed up with GF as possibly a means to survive.
 
No, the question is, how relevant is x86, and how much does it cost as we try pushing it smaller?
I've heard estimates as high as 15%. This isn't PCs per se here, and again, this isn't Intel's territory either. Much like LRB has to face, Intel will have to face this as well.
And will ARM just roll over and wait till Intel comes?
This hasn't been shaken out yet, and there's still time for many changes.
 

That Atom cell phone just killed ARM on the higher end; now Apple will make an Atom-powered cell. You can deny it all you want, but it's clear that Intel just killed ARM's higher-end clientele.

Technically, JD, that's a PC in a phone form factor.
 

Of course they won't roll over; they'll team up with an AMD partially-owned foundry. I wouldn't be shocked if ARM and AMD did some kind of merger; it's that or VIA.

Within 5 years Intel could take 50% of ARM's clients, because they have the capacity to lower costs and compete on mid-level phones, and in 5 years things like Atom will be at nodes and power thresholds that could utterly destroy ARM. That node timeframe could also bring massive computing capacity (the ability to multitask and compute) to cell phones because of the shrinks.

In the 5-year-and-beyond timeframe, ARM is screwed without help.

I see people doing Photoshop on phones within 10 years, among all of the other things that are done. AMD would be wise to "merge" with ARM and aim at 64-bit cell phone computation. I use the term merge very loosely.
 

Can ARM compete on desktops? That pretty much answers the ARM question in the 5-year-and-beyond scenario.

I could see flexible monitors and keyboards opening up things like ultra-mobile studio recording. This is exciting: imagine having a cell phone with a dual core and 2GB DDR6 and a Pro Tools mobile 32-track, self-powered or battery- and line-powered, with flexible monitors and keyboards... drool. This is the shyt, JD.

And stream casting live to paid vid sites goes mainstream :bounce: that's the good stuff. Imagine every college band out there using this kind of tech; they wouldn't need to own it. Someone with money could buy like 10 and send mixers out to stream the vids; there's a mini industry waiting to pop.

How about coverage of sports? One guy gathering streams from several cell phones. Die, mainstream networks, die :lol: I wish they could do quad cores in that format already. The mainstream applications for this tech are awesome.
 
Yes, let's all imagine a world where Intel progresses and the rest don't. That's the sort of world Intel would love us all to be living in; thankfully it's quite far removed from reality. 😀

Actually that wasn't my point. A CPU with the power of a desktop, or near it, for cell phones would further that market a lot. You know the next-gen 4G network is being done on Intel's WiMAX, right?

But I forget. Evil Intel keeps putting out something new while AMD keeps putting out something based on the same thing. It's ok though. I am sure in 1-3 years AMD will finally have an Atom-like chip.
 


According to Intel, you first have to develop an arch on a full node, then you can refine said arch on a half-node. The problem is it costs a lot of $$$ to switch over to a half-node, and while on a half-node, you can't design new archs.

Half-nodes work well for GPUs where you just copy/paste on more cores or clock them up, but no major changes. Half-nodes on CPUs usually involve very minor changes and extra cache/clock speed.

Intel just plans to skip over the half-nodes, save $$$, and in the long run they develop the next full node faster.

Don't forget, Intel has already put out 32nm and AMD is yet to deliver.

AMD will be releasing their Bulldozer early next year on 32nm. If they wanted to switch over to a half-node, it would take a few months, then they'd be at 28nm. But Intel already has a 22nm full-node updated arch for Nehalem planned for mid-2011. The *entry* level Intel 22nm chip will feature 16MB L3, 1MB L2, 128k L1 and be quad-core with HT. The mid/high end will be the same but 6/8 cores and even more L3 cache.
 


There is no good reason to use 128-bit other than to consume more transistors. All heavy CPU work is already being done in 128/256-bit registers. AVX will have 512- and later 1024-bit registers. There are just multimedia extensions alongside the 64-bit general registers. This is probably what you're thinking about.

We won't be consuming our 64-bit memory range anytime soon. 128GB is only ~209,700 times more than 640k, and that took about 30 years. 64-bit supports ~4.3 billion times more memory than 4GB. It took us about 8 years to go from 2GB to 4GB. Now we'd need to get to 17,179,869,184GB before we need 128-bit.

At least if we ever get to 128-bit, we'll never need to upgrade that for memory reasons, unless we find a way to consume 309,485,009,821,345,068,724,781,056 terabytes of RAM. That number is so large that any device holding it would be astronomically big; your memory could have its own moons.
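For what it's worth, the ratios in that post check out. Here's a quick sanity-check sketch in Python (not from the original post; the unit constants are just powers of two):

```python
# Sanity-check the address-space arithmetic from the post above.

KB, GB, TB = 2**10, 2**30, 2**40

# 128 GB vs the old 640 KB limit
print((128 * GB) // (640 * KB))   # 209715, i.e. "~209,700 times"

# A 64-bit address space vs a 32-bit (4 GB) one
print(2**64 // 2**32)             # 4294967296, i.e. "~4.3 billion times"

# The full 64-bit address space expressed in GB
print(2**64 // GB)                # 17179869184 GB

# The full 128-bit address space expressed in TB
print(2**128 // TB)               # 309485009821345068724781056 TB
```

Python's arbitrary-precision integers make the 128-bit figure exact rather than an approximation.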
 

It is far from clear that Bulldozer will be out in Q1 of 2011.

I think Q2 is their best case scenario, with Q3 being just as likely.

But Intel already has a 22nm full-node updated arch for Nehalem planned for mid-2011.
I think you are off on your dates here.

If all goes to the tick tock schedule, Sandy Bridge will come out in Q1 2011 on 32nm, then in Q1 2012, 22nm is due.

However I am beginning to suspect that Sandy Bridge and 22nm will come at least 1 QTR respectively beyond what the tick tock schedule of a year or two ago was indicating.
 
Chad, I don't think Sandy Bridge is going to be late, at least from the processor side. Intel showed off Sandy Bridge at 2009 IDF. If anything, I think it is the next-gen chipset that will slow the release of Sandy Bridge down. I think they used Ibex Peak as the chipset to work with Sandy Bridge.