ATI's Intel Chipset Team Reassigned

I was aware of the ULi acquisition by nVidia, but I am unaware of a VIA acquisition; could you post a link?

I believe it was The Inquirer that also pointed out that CPU deficiency for nVidia. They suggested that nVidia might acquire VIA. It's pretty far-fetched right now, but it is logical, and it would give nVidia very strong mobile expertise.


Cheers!
 
On the other hand, nVidia 'only' lacks its own CPU for a complete "n-platform"...

They should merge with VIA, and put out a 64bit CPU....

edit: oops sorry, I see someone has beaten me to the idea... Good to know I am not the only one here who thinks outside the box 😀
 
They should merge with VIA, and put out a 64bit CPU....

VIA may become a very appealing company, in the mid-term... :wink:


Cheers!

It might, but how long would it take for an Nvidia/VIA merger to develop a CPU that could face C2D and K8L? And how would AMD and Intel react to such a merger? Would both companies just end all licensing to Nvidia? If such a thing happened, how long could Nvidia survive before having its own complete system? With ATI, AMD wouldn't need Nvidia GPUs, and Intel might bring up some hidden cards (like discrete graphics) to face this...
 

Well, these are just suppositions & speculations, but here's a view:

a. AMD/ATi: CPU/Chipset/GPU/...; full platforms (mobile/desktop/server); low-end/mainstream/high-end;

b. Intel (+ 3D Labs...): CPU/Chipset/GPU/...; full platforms (mobile/desktop/server); low-end/mainstream/high-end;

c. nVidia: GPU/Chipset/...; depleted platforms (mobile/desktop/server); low-end/mainstream/high-end.

NOTE: the designations are not exactly accurate.

As the Inq. suggested, nVidia could acquire VIA as the complement to its [CPU-depleted] platform; it's not that it will happen for sure, but it's a possibility nevertheless.
AMD/ATi won't be needing any CPU/Chipset/GPU/Mainboard manufacturers/suppliers to achieve a full-blown platform; neither will Intel.
Apparently, Intel has no reason to stop supplying its CPU lines to nVidia's platforms (Chipset/GPU/Mainboard); and nVidia will keep supplying its [CPU-depleted] platform to Intel.
With this sort of 'passive' agreement, Intel & nVidia both lose AMD/ATi's platform space; Intel can afford it, but that alone leaves nVidia totally dependent on Intel; and if, with the incorporation of brainpower from 3D Labs, Intel strengthens its graphics division, upgrades its chipset line, improves its CPUs and so on, nVidia will be left with an increasingly shattered & pulverized platform.

I think the question should be quite the opposite of yours:

As suggested, nVidia's acquisition of VIA would make it somewhat independent of Intel, namely as regards the mid-to-low-end & embedded spaces; it would keep its dependence on Intel in the mainstream & high-end... but it would still have to compete, directly, with AMD/ATi & with Intel itself! And there are not many options as regards chipmakers.
While there are other niches where nVidia can compete with some aggressiveness (communications, ultra-portable graphics chips, ...), so can ATi (which acquired the Finnish Bitboys Oy last year, I think).

So, how long could nVidia survive if it doesn't create its own CPU division? (Reformulating your question.)

Of course, this is a possible, but not unique, scenario.


Cheers!
 

Joset,

What I meant is that VIA's CPU is a really weak processor and probably won't be able to compete with current processors unless they redesign it. This brings a few things into the scenario:

Nvidia would spend a lot of money on an acquisition/merger, which might impact their ability to invest in R&D for a new CPU. It could also affect their chipset and GPU business.

Intel might feel even more threatened by this second merger in less than 6 months and could really give Nvidia a hard time over the next year.

AMD, which has stated it won't take support from Nvidia in their platform, might feel threatened as well (as they are the underdog in the CPU industry) and might cut out all Nvidia products next year.

This could be really bad for Nvidia, because it would hamper their business, and they would have to come up with a great solution in a short amount of time, which means investing a really large amount of money. Things could get really hard for them.

But they could also get to the other side of the tunnel and bring a whole bunch of new technology.

Well, as long as we're talking about suppositions, Nvidia has worked on the PlayStation 3, and so they have access to Cell and XDR. If they brought such a platform to the PC, it could be an interesting thing. Rambus has already developed XDR2 (8 GHz bandwidth), and we could see a very powerful platform that could change computing in the next few years.
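
To put a rough sense of scale on that 8 GHz figure, here is some napkin math of mine (not from Rambus's spec sheets): I'm assuming an 8 Gbps-per-pin signalling rate and a 64-bit interface like the PS3's XDR arrangement; real XDR2 configurations could well differ.

```python
# Rough peak-bandwidth figures for XDR vs. the announced XDR2 signalling rate.
# The 64-bit width mirrors the PS3's XDR setup; reusing it for XDR2 is purely
# an assumption, made here for comparison's sake.

def peak_gbs(gbits_per_sec_per_pin: float, width_bits: int) -> float:
    """Aggregate theoretical peak bandwidth in GB/s."""
    return gbits_per_sec_per_pin * width_bits / 8

print(f"XDR  at 3.2 Gbps/pin, 64-bit: {peak_gbs(3.2, 64):.1f} GB/s")  # ~25.6 GB/s (PS3)
print(f"XDR2 at 8.0 Gbps/pin, 64-bit: {peak_gbs(8.0, 64):.1f} GB/s")  # ~64.0 GB/s
```

Even with that same narrow interface, it would be several times the peak of today's dual-channel DDR2 PC setups.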
 
Joset, you're really reaching for nVidia to create a CPU.

😀 No, not really. Just speculating about a possible, viable scenario.

And VIA's post-Cyrix CPU is kind of a joke; that being said, they could redesign the chip, although the fab investment would be outrageous; not to say it couldn't or shouldn't be done, just that it would be a long road to bring it up to current speeds and processes.

Yes, the investment would be drastic; however, better late than never (i.e., while nVidia still can...).
nVidia is getting somewhat locked in its isolation; the scenario I considered above wouldn't be threatening for Intel since, as you state, VIA's CPUs are actually unable to compete from the low-end up (the embedded space is another matter); but VIA has a good portfolio in the chipset arena for low-end customers. Hence, such an acquisition would provide, for some time, the best of both worlds: nVidia + VIA for the low-end & nVidia + Intel for the remaining spaces (priced accordingly). Obviously, any 'proprietary' nVidia CPU would be a mid-to-long-term process...


Cheers!
 
Well, as long as we're talking about suppositions, Nvidia has worked on the PlayStation 3, and so they have access to Cell and XDR. If they brought such a platform to the PC, it could be an interesting thing. Rambus has already developed XDR2 (8 GHz bandwidth), and we could see a very powerful platform that could change computing in the next few years.

That would be another possible scenario.

I just brought up this VIA thing because it was quoted from the Inq. (and from someone else before, it seems).

The issue remains, however: with these latest two moves (AMD/ATi & Intel/3D Labs), how will nVidia overcome its seemingly outcast position in the near term?


Cheers!
 
Let me sum up what joset & I are saying this way....

Nvidia has to do something before they go out of business.... They are sitting pretty now - no doubt about it.... But as Intel/AMD start to integrate the CPU and the GPU - Nvidia will start declining.... So basically the question is: How can NVidia afford not to try something drastic.... Intel doesn't seem to need to merge with them - they are going to find a way to do high end gfx on their own....

Maybe VIA/nVidia could make their mark in the small-form-factor/PDA/MP4/handheld business....
 
I'm glad you don't manage a business.

I created 3 very successful businesses.... Every time you criticise me - I have been proven correct.... You'd think by now you would learn to listen to me when I speak.... I have been in technology 24 years, and have been a manager and/or business owner for 12 years....

You're one of the people who ganged up on me and insisted the GPU and the CPU couldn't merge - and weren't shut up until AMD publicly said they could be....

OK smarty.... How do you think NVidia is going to still be in business 5/10 years from now - if they don't make a big change? You think the PC business is going to be the same? The industry is moving to smaller devices (if you have not read).... Notebook sales will probably overtake desktop PCs in 2007.... When HDTVs are the norm - a lot of people will buy 1 box to do their games and internet browsing (a lot will dump their PCs).... Handheld devices will be even more popular when Sprint/Nextel has 1/3 of the U.S. set up with high-speed wireless 4G/WiMAX by end of 2007.... I am sure Australia and the EU will do the same thing (if not sooner)....
 
Every time you criticise me - I have been proven correct

:lol: In your little fantasy world maybe.

You're one of the people who ganged up on me and insisted the GPU and the CPU couldn't merge - and weren't shut up until AMD publicly said they could be....

:roll: Why don't you go back and read my thoughts on it; then you'll see how much of an idiot you really are.

OK smarty.... How do you think NVidia is going to still be in business 5/10 years from now - if they don't make a big change?

By making a series of small changes according to the market? No, that'd be too smart.
 

😀 I was tossing the idea around months ago as well. While the nVidia/VIA thing seems exciting, I just wonder how things will evolve. Having 3 proprietary vendors is kind of scary and exciting; are we looking at a future where interchangeability is grossly limited?

We could be witnessing an era in which there are 3 platforms and no CPU is interchangeable, nor is the graphics solution, making for a kind of console-esque PC market; and if that is the end result, IBM should jump back into the game and take what's left of the GPU vendors, like Matrox, or whoever else remains.
Is this just crazy or what? Sure, it opens new doors of technology, but it potentially closes off the enthusiast builders' market once those companies' mergers mature.
The concern I point out is: in 5 or 10 years, can I put an nVidia card on an AMD board? Are we seeing the closure of this interchangeability?

Well, first, I don't think IBM will go back to the desktop market (they sold their PC and notebook business to Lenovo).

On the other issues, if things do go as you say, in a few years we'll probably have something like the PC/Mac market of a few years ago, where you also get a specific OS for each system and they aren't compatible with each other.
 
You forget that there are OSes that are relatively CPU-independent that would likely run on all of the platforms. Just not Windows.

Oh no, I thought about those too (Unix, Linux, FreeBSD, etc.), but I was referring to the more widely used ones, like MacOS and Windows, which are far more used on their platforms (at least by end users).
 
I hope you're (Joset) right about Intel entering the high-end graphics space. Intel has a tremendous amount of expertise in fabrication and design, and it seems a shame that they don't enter it. They have a great reputation for reliability, but a poor reputation when it comes to graphics. It would be nice to see them address this.

I'd like to see Intel pull a rabbit out of their hat with GPUs the way they did with Conroe. That was just awesome :)

As for VIA and nVidia, it seems like a mini-ATi/AMD merger to me. The difference is that the merger (or even partnership) would apply more to the embedded and CE sector. SoC (System-on-Chip) has been growing tremendously over the years. A lot of embedded solutions depend more on price per functionality, not price per performance. I don't think VIA or nVidia would be able to produce something, together or separately, for the desktop PC that would compete with AMD or Intel at this point. I think that it would be a fantastic move when you're talking CE, UMPCs, and the like.
 

I agree on the direction a Nvidia/VIA merger might go, but this could bring a dark age for graphics on the desktop... Nvidia could lose focus on the desktop and move to longer-term boards, and AMD/ATI could go into the on-board frenzy with Intel, making the GPU less important. This could make GPU development lose its 6-month renewal rate (which might even be good for our pockets).
 
Ultimately I think we're going to see a slowing in product releases. AMD's purchase of ATi has essentially removed some of the fierce competition, at least in the graphics space. This is mainly because if you now buy an Intel chip, you're limited to nVidia.

If Intel enters the high-end space, this will change the landscape a bit. AMD dropping support for Intel chips means that if you buy an ATi/AMD vid card, you're married to the AMD platform. At the same time, you're locking out a lot of your potential customer base. It's a risky move, but not entirely unexpected. The point is, once Intel enters the mainstream graphics space, nVidia may be the only company producing video cards that are fully compatible with both platforms.

From a strictly competitive standpoint, if Intel is indeed entering the high-end gfx space, it may be in their best interest to buy out nVidia, just to eliminate them as a competitor (if the SEC approves, of course).

As an aside, Intel really needs to do more with their current integrated gfx chipsets. They're a real disappointment.
 
😀 I was tossing the idea around months ago as well. While the nVidia/VIA thing seems exciting, I just wonder how things will evolve. Having 3 proprietary vendors is kind of scary and exciting; are we looking at a future where interchangeability is grossly limited?

We could be witnessing an era in which there are 3 platforms and no CPU is interchangeable, nor is the graphics solution, making for a kind of console-esque PC market; and if that is the end result, IBM should jump back into the game and take what's left of the GPU vendors, like Matrox, or whoever else remains.
Is this just crazy or what? Sure, it opens new doors of technology, but it potentially closes off the enthusiast builders' market once those companies' mergers mature.
The concern I point out is: in 5 or 10 years, can I put an nVidia card on an AMD board? Are we seeing the closure of this interchangeability?

Still within speculation, I believe that, sooner or later, new players will interrupt this "linear" platform interplay. Take IBM/Apple/Intel, for instance: who would have guessed?! Or AMD/ATi?!
Two years ago, the landscape was mostly quiet & flat, CPU-wise: Intel, AMD, IBM (Motorola, Freescale if you wish). Suddenly, take that!
As for interchangeability, almost only CPUs have been polarized towards their own manufacturers (sockets, board logic, features/performance...); there has been a move away from proprietary solutions in favour of standardization (excluding some specific niches, of course); mutual profit was the slogan. What ticks inside a Macintosh, now? IBM left its mobile & DT business (for the time being?), Intel came up with Core and AMD's still reacting... whim certainly wasn't the reason it acquired ATi. Exclusive profit, massive reorganization & tight timings seem to be the rules now.
Graphics-wise, there have been rumours of another paradigm shift: the gradual inclusion of graphics functions (and others) into the CPU. Ultra-portable comms are growing in image, sound & feature-rich capabilities, hence demanding very sophisticated, low-profile & low-power processors in a highly (and increasingly) profitable market. The list goes on and no one wants to lose track...

Although I still believe standardization will continue (for the sake of customers... and mutual profit), Intel has not joined the HyperTransport Consortium and will certainly create its own protocol; AMD (and others, Apple included) is utterly committed to HTT development... and so will ATI be, from now on. Apparently, different protocols will co-exist across the existing platforms... unless someone takes a more drastic turn.
Again, where does nVidia fit in this panorama?

(All the above is my opinion and merely speculation).


Cheers!
 
I hope you're (Joset) right about Intel entering the high-end graphics space. Intel has a tremendous amount of expertise in fabrication and design, and it seems a shame that they don't enter it. They have a great reputation for reliability, but a poor reputation when it comes to graphics. It would be nice to see them address this.

I merely stated my (speculative) opinion on the matter; I'm not affirming that it will happen.

That said, I really don't see any other strong reason for Intel's recruitment of brainpower from 3D Labs; and, I find it too close to the AMD/ATi merger to be mere coincidence.

By the way, all this is somewhat contextualized in this thread's first post (with its respective quotes).


Cheers!
 
I agree on the direction a Nvidia/VIA merger might go, but this could bring a dark age for graphics on the desktop... Nvidia could lose focus on the desktop and move to longer-term boards, and AMD/ATI could go into the on-board frenzy with Intel, making the GPU less important. This could make GPU development lose its 6-month renewal rate (which might even be good for our pockets).

Addressing your previous post, "virtualization" technologies are now more viable than ever; as chips grow in processing power, more VT capabilities will be put into them. It seems that software will be the most independent & pervasive technology in the mid-term. VT isn't a coincidence either, Intel's or AMD's.

Concerning your above statement, I beg to disagree: whether in discrete cards, on-board discrete chips, included in CPUs or in any other form, the GPU/PPU/whatever will always have to increase in power & features, especially on the DT (and mobile), just because we demand it (who would buy the most powerful CPU and get stuck with [very] early 21st-century graphics?); OGL 2.x & DX 1x.x only make sense in a highly sophisticated powerhouse, and that is precisely the workstation (DT or mobile) we'll have right in front of us. :wink:


Cheers!
 
I thought I would provide an ATI chipset update, but it doesn't really deserve its own thread, so I'll just add it here.

http://www.dailytech.com/article.aspx?newsid=3993

http://www.ati.com/products/Radeonxpress1250mob/index.html

Today, ATI launched the last of the breed and the first of the Rx600 series. This is the RS600M, dubbed the Xpress 1250. It's for mobile and supports up to Merom. Interestingly, it doesn't officially support the NetBurst processors, yet they promote HT support. It's also the first mobile chipset to claim DDR2 800 support (although actual SO-DIMMs are lacking), which even Santa Rosa and the GM965 don't have. It doesn't list FSB support, but with the DDR2 800 I'm going to assume the chipset can also be used with future 800MHz FSB Meroms.

Also, the RS600M contains an X700-based IGP. They don't list the configuration, but it's probably safe to assume it'll be 4PS/2VS (double the Xpress 200M). Of interest is that they don't try to claim DX9.0b support like an X700 would, but only regular DX9.0 support like an X600. Anyway, Vista Premium and Aero Glass are obviously supported.
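
To put some rough numbers on the DDR2 800 / FSB pairing, here's my own back-of-the-envelope arithmetic (theoretical peak figures only; none of this comes from ATI's spec sheet):

```python
# Theoretical peak-bandwidth arithmetic: why single-channel DDR2-800
# lines up neatly with an 800 MT/s front-side bus.

def peak_bandwidth_gbs(transfers_millions_per_sec: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = transfer rate x bus width in bytes."""
    return transfers_millions_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

fsb_800      = peak_bandwidth_gbs(800, 64)   # quad-pumped 200 MHz FSB  -> 6.4 GB/s
ddr2_800_1ch = peak_bandwidth_gbs(800, 64)   # single-channel DDR2-800  -> 6.4 GB/s
ddr2_800_2ch = peak_bandwidth_gbs(800, 128)  # dual-channel DDR2-800    -> 12.8 GB/s

print(f"800 MT/s FSB:             {fsb_800:.1f} GB/s")
print(f"DDR2-800, single channel: {ddr2_800_1ch:.1f} GB/s")
print(f"DDR2-800, dual channel:   {ddr2_800_2ch:.1f} GB/s")
```

A single DDR2-800 channel already matches an 800 MT/s FSB's peak; anything beyond that would mainly benefit the UMA IGP, which has to share system memory bandwidth with the CPU.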
 

If, as reported in the news (http://www.xbitlabs.com/news/chipsets/display/20060823235224.html, for instance), ATi ceases to support Intel platforms, namely with its Rx7xx chipset line, this Rx600 series will certainly be their last incursion into their competitor's platforms.
At first, I thought ATi was throwing all they could into it (DDR2 800 SO-DIMMs, backwards-compatible ATA133, AVIVO, power-management enhancements, an integrated UMA IGP and so on...); curiously, they include support for HyperThreading but not NetBurst; an x16 PCIe slot but no >DX9.0b support and no support at all for dual-channel DDR2...

Somewhat puzzling...


Cheers!
 
Seems like they're releasing a half-done project for a little ROI.


Exactly.
I'd even dare to say that this series was thought out well before the merger; "at the last minute", DX9.0b & dual-channel DDR2 were left out for the "bombastic" Rx7xx series (DT/mobile?).
A conspiracy theory could go something like this: dual-channel DDR2 support & >DX9.0b (DX 10.0?) should come out at the same time for both AMD and Intel platforms (both chipsets have a lot in common...); and the HyperThreading/no-NetBurst support could be a "wait & see" step regarding Intel's moves on NetBurst/Core (I hardly believe whatever x-Threading AMD ends up with will be Intel-compliant...).


Cheers!