baseline:
It is my understanding that both Fusion and Penryn are transition products. Fusion will integrate the GPU and CPU into what AMD calls the APU (accelerated processing unit). AMD will then integrate the chipset logic when Swift is introduced. Once the buses are on the die, won't this provide a significant advantage over add-in solutions?
We really can't compare existing integrated-graphics motherboards to what Fusion and Nehalem represent; they are entirely different and are more about putting the whole motherboard on a single chip.
Intel will integrate the GPU with Nehalem, along with the long-awaited memory controller integration. While both Fusion and the early Nehalem parts with an integrated GPU will likely be laptop and CE products, it seems clear that AMD, at least, will be looking to expand Swift beyond those markets into the mainstream and enthusiast markets. I would expect Intel to follow suit and integrate the chipset logic as well.
This is when it occurred to me that NVIDIA might find themselves on the outside looking in. What market will be left for them, given that they have already been losing access to AMD's customers since the acquisition of ATI?
The landscape does appear to be changing. Thoughts?
First of all, I think you are correct that the landscape is changing. This is why I think Nvidia needs to be very careful. Nvidia needs to heal the breach between it and Intel. Otherwise, in the future, it could very easily end up in a position where SLI is no longer offered on anyone's platform, and its motherboard chipsets for both AMD and Intel CPUs are effectively cut off. There could be a market left for it with some of the other, minor CPU producers, but that would be devastating to Nvidia.
As to Fusion and Nehalem, they may well integrate the CPU with the GPU, but in my opinion that is a dead-end street. No matter how well the GPU part of the chip is made, enhancing graphics performance would mean either replacing the chip as a whole or allowing a discrete graphics card to be used, as we do now with ATI or Nvidia cards. I personally can't see how performance enthusiasts would ever tolerate the idea of having to buy a whole new CPU/GPU every time a new game came out that demanded more than the old chip could give. We get enough complaints just having to change graphics cards when some new game, Crysis, Oblivion, etc., comes out. Having to spend even more money would be too much to tolerate. Another problem is that the CPU companies would have to make even more chips to accommodate the various levels of performance that are expected.
On the other hand, maybe this is already occurring without our noticing it much. Many have predicted the end of the PC as a gaming machine for years, with gaming being relegated to a separate machine, such as the Xbox, PlayStation, etc. Perhaps this is the desired effect: PCs will no longer do it all, but only be used as business machines or for simple tasks, such as those handled by a cheap E-Machine, while all gaming ends up on the Xbox and its kind. I can't give an answer to that.
Just a few thoughts.