AMD Responds to Intel's Larrabee Delay


climber

Distinguished
Feb 26, 2009
325
0
18,780
Imagine GPU-accelerated applications for laptops when they're plugged into the wall, and CPU-only operation on battery power with minimal video acceleration and low-power-state GPU functionality. Sort of like a math coprocessor on steroids.
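A minimal sketch of that switching idea in Python, assuming a psutil-style battery query; the "gpu"/"cpu" device names are illustrative placeholders, not any real API:

[code]
# Sketch: prefer the GPU on wall power, fall back to CPU-only on battery.
import psutil  # cross-platform power/battery query


def pick_compute_device() -> str:
    battery = psutil.sensors_battery()
    # Desktops report no battery at all; treat that as wall power.
    if battery is None or battery.power_plugged:
        return "gpu"  # plugged in: full GPU acceleration
    return "cpu"      # on battery: CPU only, GPU sits in a low-power state


print("Compute device:", pick_compute_device())
[/code]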
 

lumpy

Distinguished
Aug 17, 2007
136
0
18,680
I like separate CPU/GPU; it's too damn expensive as is for high-end stuff.
Put it all on one chip and, well... $$$$
I suppose someday even RAM and SSD could all be on one chip. I wonder.
 
Shadow703793

[citation][nom]lumpy[/nom]I suppose someday even RAM and SSD could all be on one chip. I wonder.[/citation]
That would be a Bad Thing, as we won't be able to upgrade individual parts without replacing the entire box.
 

festerovic

Distinguished
[citation][nom]Shadow703793[/nom]That would be a Bad Thing, as we won't be able to upgrade individual parts without replacing the entire box.[/citation]

Just thinking the same thing...
 

Honis

Distinguished
Mar 16, 2009
702
0
18,980
[citation][nom]lumpy[/nom]I suppose someday even RAM and SSD could all be on one chip. I wonder.[/citation]The problem with this is that we are stuck on an archaic architecture (x86). Even the latest 64-bit chips are x86-64. The architecture requires the use of a north bridge to access RAM and a south bridge, which accesses the hard drive (through the north bridge). Fitting all of this onto a single chip would be a production headache, since the die size would be enormous (leading to a greater loss in production).
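To put a rough number on that, here's a back-of-the-envelope in Python using the classic Poisson defect-yield model (yield ≈ e^(-defect density × die area)); the 0.5 defects/cm² figure is an assumption for illustration, not a number from any real process:

[code]
# Back-of-the-envelope: die area vs. yield under the simple Poisson model.
import math

DEFECT_DENSITY = 0.5  # defects per cm^2 -- assumed, purely illustrative


def poisson_yield(die_area_cm2: float) -> float:
    # Fraction of dies with zero defects: exp(-D * A)
    return math.exp(-DEFECT_DENSITY * die_area_cm2)


for area in (1.0, 2.0, 4.0):  # CPU-sized die vs. everything-on-one-die
    print(f"{area:.0f} cm^2 die -> ~{poisson_yield(area):.0%} good dies")
[/code]

Under these assumed numbers, going from a 1 cm² die (~61% good) to a 4 cm² die (~14% good) wipes out most of the wafer, which is why bolting RAM and storage onto the CPU die gets expensive fast.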

System-on-a-chip processors greatly reduce the bridge logic the processor requires, but they are highly specialized for the system they are implemented in.

More on SoC:
http://en.wikipedia.org/wiki/System-on-a-chip
 

belardo

Splendid
Nov 23, 2008
3,540
2
22,795
With Intel owning the CPU market, it's mostly good that this is another business area they are not taking over.

Of course, Intel is doing very well with their SSDs, which, because they are very good, are at the top of everyone's list.
 
Actually, it may be easier to upgrade, especially if pricing is competitive. Just pop out the old APU and replace it with a new APU, and you have an all-in-one upgrade. As mentioned, it does remind one of the math coprocessor or the front-side bus being assimilated into the CPU. For now I assume Fusion will not focus on gamers, but 5 years from now we all may be running an APU or two in our rigs. Fusion could be a huge hit in laptops and HTPCs.
 

XD_dued

Distinguished
Dec 23, 2008
415
0
18,810
[citation][nom]Honis[/nom]The problem with this is that we are stuck on an archaic architecture (x86). Even the latest 64-bit chips are x86-64. The architecture requires the use of a north bridge to access RAM and a south bridge, which accesses the hard drive (through the north bridge). Fitting all of this onto a single chip would be a production headache, since the die size would be enormous (leading to a greater loss in production). System-on-a-chip processors greatly reduce the bridge logic the processor requires, but they are highly specialized for the system they are implemented in. More on SoC: http://en.wikipedia.org/wiki/System-on-a-chip[/citation]

Um... x86 describes the processor only. How about P55 without a north bridge? Or how about Phenom with HyperTransport?
 

rambo117

Distinguished
Jun 25, 2008
1,157
0
19,290
I was pretty bummed out when I found out Larrabee wasn't happening, but there are still many things to come in the next few years. Fusion is a fascinating concept; I'm excited to see how it performs.
 

ik242

Distinguished
Mar 25, 2009
96
0
18,640
I don't see it that way - in fact, I dare to call the "keep them separate" claims silly.

Integration is what has brought low prices and high availability to just about any product (especially electronics).

Memory and a memory controller integrated into the CPU don't cost much, and since they're part of the CPU, they get replaced together with it.

Just because there is some cache on the CPU, or some flash memory in a new digital camera (just to make a point), it does not mean that you cannot add more RAM (on a computer) or larger storage (an SD card, for example, in the case of a camera).

For those who don't remember, there was a time when cache was not integrated into the CPU. It was damn expensive and often cost more than the CPU.

There was a time when a CD drive needed a dedicated controller (before they could attach to IDE, for example) that occupied a mobo slot. Needless to say, it was clutter, with slow performance and high cost.

There was also a time when a chipset was just that -> a collection of a few dozen chips (a set) performing only a few very basic functions (it didn't include a modem, serial or parallel ports, a network card, sound card, HDD or FDD controller, etc. - think about what comes on today's mobos or in the north and south bridge).

My first network card, sound card, modem, etc. each cost about the same as the CPU of the day. Nowadays those things are part of the chipset/motherboard, just like video output, which may not be faster than a discrete card but is good enough for 95% of applications and - it's "free". And just because there is onboard video, nobody says that you can't add another graphics card (or two, or three...).

Another thing: with integration, many things can be resolved more efficiently, including size, power consumption, footprint, bandwidth, etc.

So AMD and Intel, please make my next PC small; the size of a dime sounds about right, as I would like to carry it around without straining my arm. Heck, integrate it into glasses that can double as a high-definition monitor.
 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
If my money was on one company to successfully pull this off, it would be on AMD. They are the only company to house both sides of the court and they already have the know-how and technology from both sectors to do it. The interesting part will be to see how they manage to marry the two together into one product.
 

mman74

Distinguished
Mar 22, 2006
403
0
18,790
If this were a low-cost, low-power chip, I can totally see it being put into numerous devices. Standardization of the CPU/GPU platform, with a bare minimum of Full HD as with the ION chipset, less board space, and less power. There could be no limit to what they put this chip into - microwaves, fridges, etc.
 

biofrog

Distinguished
Oct 13, 2004
22
0
18,510
Then again, with Intel displaying their 48-core processor recently, perhaps they realised that processor development was a lot further along than expected, making Larrabee somewhat superseded already.
 

elel

Distinguished
Jun 18, 2009
1,042
0
19,360
[citation][nom]ik242[/nom]So AMD and Intel, please make my next PC small; the size of a dime sounds about right, as I would like to carry it around without straining my arm. Heck, integrate it into glasses that can double as a high-definition monitor.[/citation]
lol, nice point. But if you are afraid of cell phones, do you have any idea how much electrical noise this would make? Right next to your eyes? With a high-frequency clock? But I do like the idea of integrating more stuff on one chip, if it saves me money.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
[citation][nom]festerovic[/nom]Just thinking the same thing...[/citation]
Who says the SSD can't come with an expansion clip for RAM? I can imagine regular desktop RAM being the size of laptop RAM at 4-8 gigs per stick and speeds past 2 GHz... Coupled with 2-8 terabytes of SSD, we could have great computers with 8 cores and dual GPUs like Larrabee on boards as small as micro-ATX. These could probably run one floor of a house, along with its security, music, vids, TV, electricity, billing, phones, and all the media-centric needs a person would have in 5-10 years. I even think 6-8 terabytes will be standard, or too little, by that time.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
[citation][nom]mman74[/nom]If this were a low-cost, low-power chip, I can totally see it being put into numerous devices. Standardization of the CPU/GPU platform, with a bare minimum of Full HD as with the ION chipset, less board space, and less power. There could be no limit to what they put this chip into - microwaves, fridges, etc.[/citation]

Only because they recently bought out ATi. Wait over the next 5 years to see how they fare once their combined SKUs are actually merged, meaning their teams are actually integrated in making one single product, because right now AMD is based in several countries (Singapore, for example) and is just running and creating the chips, while ATi is just focusing on their GPUs (which are now good, btw).
 