Half Of All Notebooks To Use gCPUs This Year

  • Thread starter: Guest
Status: Not open for further replies.
I realize I'm a gamer, but I can't see consumer numbers that high. True, the market share may be that high, but I can't see consumer adoption matching it exactly that way.

Until gCPUs are able to handle the same workload, that is.
 
Really shouldn't surprise anyone, since a gCPU is all you'd ever need to play Farmville or Plants Vs. Zombies.
I know that's a pretty typical spiteful PC gamer response, but heck, when you really think about it another way, most gCPUs coming out this year are probably more powerful (graphics-wise probably, processor-wise definitely) than an Xbox 360 or PS3, considering those consoles are using five-year-old hardware. So games of console caliber should run reasonably well on a PC with a gCPU, which is just fine for most people. Sad, but true.
 
Not surprising at all, considering all the mainstream parts are gCPUs. It's not about whether you want it or not, it's about whether you have a choice. And most people don't need more than that for what they do with a portable computer anyway.

People who actually want to game on them will invest in something with a dedicated GPU, even though the CPU may have an unused graphics core built into it anyway.
 
The most intensive task my CPU does most of the time is gaming anyway. It'd be a good thing if the CPU could pitch in with its integrated GPU alongside the discrete GPU; then you'd see nice performance increases. There's already talk of this with the upcoming AMD processors.

You could potentially have a 6850 CrossFire solution with a Llano/Zacate CPU providing an even bigger performance boost. It's just a matter of how well it works... it took them quite a while to get CrossFire right, so I can't imagine it'll be smooth sailing right out of the box.
 
Well, the integrated GPU used to be in the chipset; now it's in the CPU. Nothing has changed for those who use discrete graphics. All this article is really saying is that the share of chipset-based graphics is going to practically disappear by 2014.
 
[citation][nom]rmmil978[/nom]Really shouldn't surprise anyone, since a gCPU is all you'd ever need to play Farmville or Plants Vs. Zombies. I know that's a pretty typical spiteful PC gamer response, but heck, when you really think about it another way, most gCPUs coming out this year are probably more powerful (graphics-wise probably, processor-wise definitely) than an Xbox 360 or PS3, considering those consoles are using five-year-old hardware. So games of console caliber should run reasonably well on a PC with a gCPU, which is just fine for most people. Sad, but true.[/citation]

Since when can integrated graphics play Crysis 2 on settings equal to consoles?
 
Can someone tell me how this is any different from integrated graphics on the motherboard? Is it cheaper this way? Faster? More power efficient? Because it seems like the exact same thing, just moved closer to the CPU.

All of the above.
 
What will be interesting to watch is how Intel handles the GPU refresh for its Sandy Bridge and Ivy Bridge CPUs. AMD has a continually evolving library of OpenCL-based GPUs to add to future refreshes of Llano and Brazos. This makes Intel a GPU design house as well as a CPU design house, but without a portfolio.

AMD simply turns this year's star GPU into the APU's graphics.

Designing GPUs does not come cheap. AMD's library is already paid for. That should be a huge price or margin advantage for AMD.
 
Plus, don't forget that northbridge IGP access to memory is 32-bit (at least according to CPU-Z). I'm hoping that moving the IGP onto the CPU will increase that to 128-bit. Memory bandwidth will still be a severe limiting factor for modern gaming, as all modern discrete cards pack GDDR5. However, the problems some encounter with DXVA2 on an IGP when the CPU throttles down should be totally eliminated.
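For a rough sense of that bandwidth gap, here's a back-of-envelope sketch (the bus widths and transfer rates below are illustrative assumptions, not measured figures for any specific part):

# Theoretical peak memory bandwidth = (bus width in bytes) * (transfers per second)
def peak_bandwidth_gb_s(bus_width_bits, transfers_per_second):
    return (bus_width_bits / 8) * transfers_per_second / 1e9

print(peak_bandwidth_gb_s(32, 1.6e9))   # 32-bit path, DDR3-1600:   ~6.4 GB/s
print(peak_bandwidth_gb_s(128, 1.6e9))  # 128-bit path, DDR3-1600:  ~25.6 GB/s
print(peak_bandwidth_gb_s(128, 4.0e9))  # 128-bit GDDR5 at 4 GT/s:  ~64 GB/s

Even a fourfold wider path to system memory would still trail a mid-range GDDR5 card, which is why gaming on an IGP stays bandwidth-limited.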
 
Good news and bad. Intel IGPs suck unless you're using them for video encoding. Also, AMD needs to release Llano. Brazos is cool, but that's for netbooks.
 
[citation][nom]aftcomet[/nom]Since when can integrated graphics play Crysis 2 on settings equal to consoles?[/citation]
The same integrated graphics from AMD that has been shown handling games like Aliens vs. Predator under DX11. If the integrated graphics chip is capable of DX11, it already beats console graphics from a tech standpoint.
 
Contemporary integrated GPUs may beat console-grade graphics from a technical and specs standpoint, but a game will still not run better on a PC or laptop using them.
Why?
Due to porting and poor optimization for the PC.

Ideally, any game made for consoles should be playable at maxed-out console-level settings on a 2008 notebook with a mid-range discrete GPU, and yet numerous console ports require high-end PC hardware just to run at those settings (never mind higher ones).

So the hardware is not the problem, it's the idiots who port games to PC.

However, where is the profit in being able to run any console ported game that looks good in the process?
There isn't any... at least not for manufacturers of new desktops and laptops.

It's all interconnected to be honest.
And seriously, unless a game can take full advantage of DX11, there will be no improvement in image quality between the DX10 and DX11 versions of a game.
You might notice some minor differences (to which you'd be oblivious 90% of the time), but not enough to justify the lag that occurs when the GPU has numerous extra computations to do at those settings (and for what? An obscure detail you probably won't be able to notice).
 
Sandy Bridge and AMD Fusion mean the end of Nvidia as we know it.
Llano uses a one-year-old discrete GPU core design! If you look at the Radeon cores being used for AMD Llano, several things really stand out. The HD 6370 is a 7-watt core as a discrete GPU, so it probably draws much less as part of an APU. Clearly this is headed for laptops. These cores also have one other HUGE common denominator.
Here's a link for Radeon releases: http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units
They were all released in November 2010! If anybody does not believe that AMD Llano will eliminate the mid-price mass market for discrete GPUs, they really need to take a hard look at the facts.
AMD is releasing the Radeon 6990 without quantity restrictions. Nvidia is releasing fewer than 1,000 GeForce 590s. The 590 is a cherry-picked dual-GPU board. It may perform equal to or actually outperform the Radeon 6990, but what good is it if you can't buy it? Or is the market for bleeding-edge bragging rights also just not there?
The mass market supports new GPU core development. Without the sale of millions of discrete GPUs for legacy upgrades, the next generation doesn't get designed; and if it does get designed without the prospect of mass sales volume, it becomes a very expensive piece of silicon.
A good example is the ATI FirePro and Nvidia Quadro brands. They simply do not have the mass-volume sales to allow for a purchase price lower than the $2,000-$3,000 point; the demand is simply not there. Product refreshes are also not as frequent as in the mass market, again due to demand.
If AMD is using this year's top discrete GPU design for next year's Fusion APU, then the discrete GPU market is most certainly dead. Will there be a reason to upgrade a one-year-old Llano box with the latest discrete GPU? For what gain other than bragging rights? And what would discrete GPU demand look like going forward?
The real question becomes: is that AMD's plan? And if so, how does Nvidia plan to keep the discrete market open? Does Nvidia license core designs to Intel?
The other question is just what AMD plans to do with Bulldozer. It seems that Bulldozer will be the server, workstation, high-performance desktop, and gamer CPU. This is certainly not a mass-market CPU. In a server, graphics obviously are not needed beyond a motherboard-integrated GPU, so there will be some demand for discrete GPU boards alongside Bulldozer.
The next question becomes: when does AMD release Bulldozer with an on-die graphics core? Probably with Llano's replacement, the Trinity Fusion APU with 2nd-gen Bulldozer cores, because Bulldozer will be the only market left open for discrete GPUs. That would imply a Bulldozer development APU first.
Of course, just how Intel intends to answer AMD will determine the future of Nvidia graphics. Arguably, Intel cannot compete with the AMD/ATI library: every few months AMD releases new graphics silicon, continually evolving the product to meet current market demand. Intel is not a graphics design house, but now it has to be to keep its CPU business competitive. That means designing GPUs to penetrate a market that is owned 100% by AMD and Nvidia.
AMD is now designing discrete GPUs with the intention of integrating that design on-die for an APU release ONE YEAR LATER! That has to be an optimized model, and as such, how can Nvidia compete with AMD when it doesn't have that kind of insight into Intel's future architectures? Nvidia's only market will be on an Intel Inside box.
Right now AMD is directing the future of CPU design. It has the edge over Intel with ownership of arguably the world's best graphics design portfolio and GPU design team, and it has the cost edge over Nvidia, since it simply sells a one-year-old core design on-die to millions of consumers as an APU. For Intel to remain competitive, it is forced into the same model, and this model shuts out Nvidia.
 
"APU" was coined by AMD to describe Fusion: a CPU with an on-die GPU. The author either does not know that, or he does not want to use AMD nomenclature.

What is NOT being said here is that Fusion and Intel Sandy Bridge will achieve 100% notebook penetration by next year. That is the end of Nvidia notebook graphics.

When a notebook CPU has a Radeon HD 6550 on die, that is huge! Apple is using Radeon HD 6750M boards in the MacBook Pro now, and that GPU is only a few months' newer refresh. Apple has to go to Fusion.
 
So basically I will have to upgrade my CPU to get the latest graphics (combining the price of both the CPU and GPU). Very poor for desktop gamers; I upgrade my components at different times. It seems like choice and customization are slowly being driven out of this market. Of course the niche will always be there, but cost may drive most of us into the integrated market.
 
I don't see discrete graphics cards going away anytime soon, at least for the mainstream gaming market; not until they can cram as much GPU power into a gCPU and make the price of upgrading the same as buying a mid-range discrete graphics card, and I don't see either of those things happening anytime soon. It's clear, though, that budget discrete graphics will be dead in the next year or so, seeing as gCPUs will start to overtake that market in no time.
 
I wouldn't worry about gCPUs dominating the market for video cards. The article even supports that idea if you read between the lines.

The gCPU will offer poor performance: enough to watch YouTube, play Flash games, and browse the web. I wouldn't put them faster than Intel's garbage GPU in the GM chipset.

All that is going to happen is that they combine the CPU and GPU so they can reduce production lines. Instead of having two, they only need one, or they can build twice as many parts by running two lines building the same thing.

What is annoying is that even if half the computer users out there don't need any more graphics performance, Intel/AMD will shift to making gCPUs exclusively, giving them an excuse to jack up the price even though their costs have gone down.

A performance or gamer box will cost more because the CPU with the crap-ass graphics HAS to be purchased, along with a decent dedicated GPU.

It's just capitalism at its finest here; nothing to see, move along.
 