Ati says "no comment" on AMD's takeover


LordPope

Distinguished
Jun 23, 2006
553
0
18,980
We are having the same discussion over at the JUNGLE.

When is the INQ credible?

The answer seems to be... it depends which AGENDA you are pushing.

Pro-AMD news on the INQ is crap to an INTEL fanboy;
bad AMD news on the INQ is GOSPEL TRUTH to an INTEL fan.

And vice versa.
 

sailer

Splendid
All interesting to say the least, yet if it came to investing in stocks based on these speculations, there might be few takers. Of course, those takers might end up very rich too. Speculating can be fun, as long as no hard money is involved. Whatever happens, I think the future will be interesting. At least I hope it is. I hate being bored.
 

chrislax20

Distinguished
Jul 6, 2006
75
0
18,630
If we go to a GPU within the CPU, what would happen to consoles? I sure as hell wouldn't buy a PS4 if I could build my own comp that had just as good graphics integrated into the CPU. Also, integrated CPU graphics could mean that laptops had just as tight graphics as desktops, no more huge card to try to fit in 8O

What do you all think? If someday it's all integrated, do we all play on kick @$$ laptops?
 
After AMD broke off from Intel with the whole 386 flap (at that time, AMD already had plans for the 486 and so they cloned those too) AMD made their independently-designed processors pin- and chipset-compatible with Pentium Socket 7 motherboards. Once Intel switched from Socket 7 to Slot 1 with the Pentium II, that broke AMD's compatibility with Intel's motherboards. Manufacturers responded with the Super Socket 7, which gave AMD a little more headroom on that platform, but AMD eventually needed to get a new socket and with that a new chipset. AMD could and did make some chipsets, but they needed their fab capacity to make CPUs more than chipsets, so they needed a second source. Intel pretty much told most every chipset maker that they would not be able to use or make Intel-compatible chipsets if they made AMD-compatible chipsets. NVIDIA was big enough and had the experience to flip Intel the bird and help AMD to launch the Athlon Classic in Slot A with their chipset support.
 

turpit

Splendid
Feb 12, 2006
6,373
0
25,780
We are thinking along the exact same lines, my brother.

The PC as we know it... is going away.

With all the talk of MULTI-CORES,

having GPU, PPU, CPU, north and south bridges... all on one die is coming.

It may take 22nm or lower to achieve it cost-effectively,

but it's coming.

AMD wants to be ready... ATI in-house would really push the, dare I say,

OPU

OMNI PROCESSING UNIT

While it is possible, I wouldn't count on it. If you remember 15 years ago, the CPU pretty much did everything, with graphics and audio cards being little more than output interfaces.

The advent of the GPU years ago (then called the VPU) freed CPU resources consumed by processing video data, which allowed for substantial performance increases as well as more complex program writing. It also led to further separation of processes within systems.

While developing independent subsystem processors does invariably lead to compatibility conflicts, it also significantly reduces component R&D costs and cycle times, while allowing experience bases to be built at a faster rate through specialization.

Additionally, I find it highly unlikely that any manufacturer would integrate CPU/GPU for desktop/server (though laptop, tablet, mobile etc. are strong possibilities for the obvious space/power savings), as it would severely restrict custom system tailorability as well as significantly confine the possibility of individual component upgrading. You are right, it could increase efficiency if implemented correctly, but people love the freedom of options and choices, don't you? Choose your own components and have it your way, or have it the only way one manufacturer says you can have it. Blaaa.
 

mpjesse

Splendid
meh... you're paraphrasing. what ATI really said was, "we don't comment on rumors."

besides, I think the bigger story is that nVidia's stock is at $18. WTF???? that is undervalued.
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
meh... you're paraphrasing. what ATI really said was, "we don't comment on rumors."

besides, I think the bigger story is that nVidia's stock is at $18. WTF???? that is undervalued.

That is cheap; I may buy some. I guess no one realizes nVidia wouldn't be very pleased, and they did help AMD build to this point. I could see them (AMD/nVidia) merging, but even then, why piss off the other chipset/GPU provider?

Bad blood isn't good for business.
 

chrislax20

Distinguished
Jul 6, 2006
75
0
18,630
Additionally, I find it highly unlikely that any manufacturer would integrate CPU/GPU for desktop/server (though laptop, tablet, mobile etc. are strong possibilities for the obvious space/power savings), as it would severely restrict custom system tailorability as well as significantly confine the possibility of individual component upgrading.

Kinda like we see with the console market now, huh? Not saying it's likely, but if it did all get integrated, hypothetically you would have your AMD/ATI system or Intel/nVidia system, perhaps with each manufacturer offering different levels of gaming performance per CPU.
 

LordPope

Distinguished
Jun 23, 2006
553
0
18,980
Think of this scenario:

It's 2012.

AMD/ATI introduces the ATHLON K10+ line.

12-core top model has a built-in quad-core Radeon X6000X GPU
8-core has a dual-core Radeon X6000
8-core has a dual-core Radeon X5000
4-core has one Radeon X6000
4-core has one Radeon X5000

This could be something we would be looking at,

power user... down to value segment.

There will be no more SAPPHIRE or HIS or EVGA making graphics cards, however.....
 

MarcusL

Distinguished
May 18, 2006
127
0
18,680
I doubt that the GPU will be incorporated into the CPU. The two are different enough that the pain of integration outweighs the benefits.

1) GPU design cycle is much faster than CPU
- Graphics feature set changes much faster than CPU
- GPU is obsolete much sooner than CPU

2) Modern GPU takes just as much silicon as the whole multicore CPU
- It's not just a matter of dropping the GPU in place of one of the CPU cores
- Maybe a value GPU core could replace a CPU core, but nobody who knows the difference will want that. Maybe a 3-core Celeron/Sempron bundled with a value GPU core would make a super-cheap entry-level system.

3) GPU likes to have a lot of wide, fast, dedicated memory access. The GPU will not want to share this with the CPU cores if it wants to maintain performance. Pin count on CPU just increased by 300-500. GRAM is now on motherboard very close to CPU. Board layout will be a nightmare. Motherboard is now tied directly to a specific CPU stepping.
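Point 3 can be put in rough numbers. A back-of-the-envelope sketch (the DDR2-800 and GDDR3 figures below are typical 2006-era values chosen for illustration, not exact part specs):

```python
def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

# Dual-channel DDR2-800 on a 2006 desktop: 128-bit bus at 800 MT/s,
# shared by every CPU core (and by an on-die GPU, if there were one).
cpu_bw = bandwidth_gb_s(128, 800)    # 12.8 GB/s

# A high-end 2006 graphics card: 256-bit GDDR3 at roughly 1400 MT/s,
# dedicated entirely to the GPU.
gpu_bw = bandwidth_gb_s(256, 1400)   # 44.8 GB/s

print(round(gpu_bw / cpu_bw, 1))     # prints 3.5
```

Even with generous desktop memory, the GPU would have to squeeze roughly 3.5x its accustomed bandwidth through a bus it shares with the CPU cores, which is why the pin count and board layout problems above follow.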

The problems go on and on. The benefits are compromised by the problems so that the super CPU is worse than the current system.
 

chrislax20

Distinguished
Jul 6, 2006
75
0
18,630
Think of this scenario:

It's 2012.

AMD/ATI introduces the ATHLON K10+ line.

12-core top model has a built-in quad-core Radeon X6000X GPU
8-core has a dual-core Radeon X6000
8-core has a dual-core Radeon X5000
4-core has one Radeon X6000
4-core has one Radeon X5000

This could be something we would be looking at,

power user... down to value segment.

There will be no more SAPPHIRE or HIS or EVGA making graphics cards, however.....

Exactly what I was trying to say. Word.
 

LordPope

Distinguished
Jun 23, 2006
553
0
18,980
You make good points to counter mine....

Appreciate the discussion.

What we don't know... is what new tech will come out to make the high-powered OPU a reality...

Today's constraints may not be tomorrow's.
 

ltcommander_data

Distinguished
Dec 16, 2004
997
0
18,980
ATI saying no comment to a possible AMD takeover doesn't make it a done deal. Far from it.

There are many reasons why AMD would not merge with ATI and many have already been mentioned. One for instance is ATI's close relationship with Intel after Intel agreed to allow some ATI chipsets to be branded under the Intel banner. The other, is ATI's relatively indifferent attitude toward AMD after AMD gave nVidia exclusive rights to their own 4x4 platform.

A great example of my points is here:

http://www.theinquirer.net/default.aspx?article=32915

It is an Intel-only chipset and there are no signs of an equivalent PCIe 16x3 chipset for AMD. ATI wants to focus on Intel and wants to be early on the Conroe bandwagon so it can make some extra money.
Essentially, now that AMD has given nVidia their 4x4 platform, ATI is partnering with Intel to offer their own exclusive 16x3 platform. (Well, the marketing people are definitely working full out with these concepts.) It doesn't make sense for ATI to devote themselves to Intel while working on a contract with AMD. Certainly, Intel would never allow ATI to get this close to them if ATI was working behind their back. The fact that ATI is focusing on Intel also doesn't make a good starting place of mutual understanding for acquisition negotiations with AMD.

It's 2012.

AMD/ATI introduces the ATHLON K10+ line.

12-core top model has a built-in quad-core Radeon X6000X GPU
8-core has a dual-core Radeon X6000
8-core has a dual-core Radeon X5000
4-core has one Radeon X6000
4-core has one Radeon X5000
Furthermore, this type of scenario with GPUs "built in" to your CPU is completely ridiculous. There is no way you can offer as wide a product matrix with combinations of CPU and GPU power as you can when the two are separate. Secondly, if that was really AMD's plan, no one would ever agree to it. ATI would be completely reliant on AMD, since cooperation is out of the picture, and nVidia would never work with AMD again since they would be completely cut out of the market. This is on top of the fact that the thermals for such an integrated CPU/GPU solution would be through the roof regardless of what magical process you are on. This type of bundling is also anti-competitive, just as Microsoft bundling Internet Explorer with their OS is frowned upon.

Regardless, I cannot support any sort of CPU-GPU merger or acquisition. It doesn't matter who partners with whom; all it'll serve to do is split the market into camps, say the AMD-ATI camp against the Intel-nVidia camp. Dividing the market destroys consumer choice and limits free competition and the survivability of the market in general. These recent platformization campaigns and AMD's lawsuit are already pooling people into camps, and I can't say I like it.
 

Mex

Distinguished
Feb 17, 2005
479
0
18,780
Think of this scenario:

It's 2012.

AMD/ATI introduces the ATHLON K10+ line.

12-core top model has a built-in quad-core Radeon X6000X GPU
8-core has a dual-core Radeon X6000
8-core has a dual-core Radeon X5000
4-core has one Radeon X6000
4-core has one Radeon X5000

This could be something we would be looking at,

power user... down to value segment.

There will be no more SAPPHIRE or HIS or EVGA making graphics cards, however.....
I doubt this scenario will ever come to pass. However, you are not entirely wrong about the possibility of integrated video controllers, as Intel is going to try... again.
http://www.xbitlabs.com/news/cpu/display/20060615075534.html
It's mostly about the possibility of an IMC on Intel chips, but they say this towards the end:
Rumours about Intel's plans to incorporate a memory controller into the processor have been floating around for a couple of years now, even though no one knows which future chip from Intel will get the feature. At the same time, this is the first time the company has spoken about building a graphics core – which would obviously result in increased die size, transistor count and power consumption of the CPU – into microprocessors after the failure of the project code-named Timna, which contained an Intel Pentium III core, memory controller and graphics controller on chip.

Obviously, a project that revitalizes an idea which is about eight years old may be targeted primarily at low-end personal computers. For example, it could enable very affordable systems for developing countries. However, such an all-in-one chip would never boast support for the latest technologies.
 

kansur0

Distinguished
Mar 14, 2006
140
0
18,680
How far do you think we will need to go before we make a video card or processor so powerful that the difference from one generation to the next is barely visible to the eye? If that were to happen, what is the next step? A smaller version. 10 years is probably way too far forward to think about what the configuration might be, but if the CPU/GPU/whatever was so powerful from one generation to the next that not even programmers could think of making something more realistic through sheer advances in programming technologies, then the only thing to do is make it smaller.

Maybe the desktop will soon disappear and the laptop will become a standard piece of equipment... and the only one needed for even the fastest tasks. 10 years isn't far off from having a totally flat integration of CPU/GPU/RAM/whatever. I think combining everything into one chip is just a matter of time. It has to start somewhere.

Think of the first computers that filled rooms. The same power is now in a desk calculator. I think the same thing will happen to the desktop PC.

Just a thought.
 

Mex

Distinguished
Feb 17, 2005
479
0
18,780
One of the future advancements I have heard about in recent months comes from USB flash drives and how they could replace mobile desktop computing in 10-15 years. Mind you, this is for people who need to check e-mail and do basic office tasks, not game, but I still think it is interesting. The OS and all the user's data would be on the flash drive. The idea is that desktops would become commodity items and you would find them everywhere, like in hotel rooms, etc. One would merely plug in their drive and go to work. Basically, instead of having to remotely access one's desktop to do work while abroad, you would "carry your desktop" with you.

That's what I heard.
 

Action_Man

Splendid
Jan 7, 2004
3,857
0
22,780
you think GRAPHIC cards will be around forever?.....

Your idiocy amazes me.

with all this talk of 50-core CPUs in 2010

You simply cannot compare CPUs to GPUs for 3D graphics.

With AMD working on CO-PROCESSING so hard...

Yeah, look at all the options.

my text formatting is fine..... this is not 10th grade english class...

Christ, man, kindergarten is of higher quality than this.
 

Mex

Distinguished
Feb 17, 2005
479
0
18,780
I sense that you and I may end up going at it shortly

pope_cp_7177772.jpg

VERSUS
Trex.jpg

$50 says it ends in a draw.