Intel to Introduce GPGPU Functions Into Westmere

They'll get it all on one die eventually, and it will be more cost effective than manufacturing separate dies, but they'll never compete in gaming class graphics. Having the whole platform, including graphics, on one die will make it hard for nVidia to compete in enterprise graphics markets. AMD could do something similar, though they haven't moved in that direction yet.
 
GPGPU on an Intel GMA IGP? Intel IGPs can barely handle the graphics processing that already gets thrown at them!
 
[citation][nom]Zoonie[/nom]2 heads ARE better than 1 D[/citation]

So that explains why guys are better than girls...I knew there had to be a reason.
 
Can this extra graphics processor be utilized even with a separate graphics card? Say I have a 5850 and one of these chips: will the CPU dump off some work onto the integrated graphics while I use my primary GPU for other things, or does having a separate GPU leave the integrated GPU sitting dead in the box taking up space?

How much does this add to the heat output of the CPU?
 
[citation][nom]pbrigido[/nom]Hey AMD Fanboy, get over it. Intel and AMD are both great companies.[/citation]

Intel is good at making CPUs and AMD is good at... whatever. They need to put more cores on CPU chips, not some ultra-low-end GPU. I want my 48 Hyper-Threaded cores ASAP!
 
I remember reading about Westmere a long time ago... this is mainly just an announcement that Intel is working on driver support for GPGPU functions, which isn't exactly a surprise. It isn't really anything interesting yet, since Intel hasn't even given a vague indication of when this driver update will release, so it's probably just an effort to get any positive news out, no matter how small, after they admitted failure and scrapped the "first version" of Larrabee.

 
[citation][nom]Abrahm[/nom]Can this extra graphics processor be utilized even with a separate graphics card? Say I have a 5850 and one of these chips: will the CPU dump off some work onto the integrated graphics while I use my primary GPU for other things, or does having a separate GPU leave the integrated GPU sitting dead in the box taking up space? How much does this add to the heat output of the CPU?[/citation]

If you have a 5850, who cares whether there are any other GPUs in the system? (Unless it's a second 5850 for CrossFire 😉) But to answer your question: no.
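For what it's worth, whether the IGP ends up usable alongside a discrete card mostly comes down to whether Intel's driver exposes it through a compute API at all. Purely as a sketch (assuming an OpenCL runtime that actually lists the IGP, which nothing in this announcement confirms), this is what an application would do to see every GPU in the box and decide where to send work:

[code]
/* Minimal OpenCL device enumeration: lists every GPU the installed
 * drivers expose. Whether a Westmere IGP shows up here alongside a
 * 5850 depends entirely on the driver, not the hardware. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;

        /* Ask each vendor's platform (Intel, AMD, ...) for its GPUs. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("GPU %u on platform %u: %s\n", d, p, name);
        }
    }
    return 0;
}
[/code]

If the IGP shows up in that list, an application can dispatch work to it even while the 5850 handles rendering; if it doesn't, it really is just sitting there.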
 
Yes, the future is a unified CPU and GPU. I really don't see any future in discrete graphics cards. Once the CPU and GPU are centralized, resources can be allocated dynamically depending on the application, there is less wasted silicon, and without the messy interfacing between chips the system as a whole can operate much faster.
 
It's good enough for most games!
Just not the latest.
I don't think it's a fail at all!

I'm really looking forward to this chip, and I bet a lot of other people are too!

I see they integrated the memory controller onto the same die as the graphics chip, which means lower latencies for graphics applications. That lets them get faster effective VRAM speeds without much additional cost; but since the memory controller is no longer on the CPU die, the chip will also perform slightly slower in ordinary programs.
On the other hand, the CPU is fast enough for most programs (save perhaps video encoding), while the graphics side is not fast enough for most games.
So they did well in dedicating the memory controller to the graphics side.
That detail alone could increase frame rates by a couple percent, maybe more than ten!

Combined with what will most likely be a somewhat faster GPU than the GMA 950, we're probably looking at a decent graphics solution for playing DX9 games fluidly at resolutions up to 1280x800!
Perfect for netbooks and notebooks, servers, and small desktops!

Those who don't like this can still equip their PC with a Core i7 and a pair of Radeon 5870s if they like; I mean, it's not like this chip is going to totally replace the current lineup of regular CPUs.
 
Has anybody thought of the power-saving capabilities of these processors? Not only are they 32 nm, they also come with a GPU. Yes, I know it is not a good GPU, but for running Windows alone it does the job pretty well. Now your real GPU can sit idle most of the time, saving power.
 
Not that I generally support Intel graphics, but the Westmere IGP looks to be a major improvement over the GMA 4500HD and, as far as I can tell from the information already out in the wild, can handle everything you'd expect from an IGP.

Being able to do something like video transcoding on the GPU would effectively give you an extra "core" for free in the IGP. What's not to like?

I mean, I'm sure it won't be _amazing_ or anything, but everything that leads to better utilization of all the available resources in the computer must surely be a good thing.

I'm in the market for a notebook myself, though I'm holding out for Arrandale before making the purchase, and since I have a dedicated desktop for gaming I'm going to aim for a product that relies on the IGP alone. If I can get more functionality out of it that's all good in my book.
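To put the "extra core for free" idea in concrete terms, here's a rough sketch, assuming the driver update exposes the IGP through OpenCL (the article doesn't say which API it will actually use) and using a trivial brightness-scale kernel as a stand-in for one stage of a real transcode. The point is simply that the enqueue call returns immediately, so the CPU stays free while the IGP chews on the frame:

[code]
/* Queue a kernel on the first GPU the runtime reports -- with a
 * GPGPU-capable driver that could be the Westmere IGP -- and let the
 * CPU keep working while it runs. Hypothetical example, not Intel's API. */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void scale(__global float *px) {\n"
    "    size_t i = get_global_id(0);\n"
    "    px[i] *= 0.5f;   /* e.g. halve brightness */\n"
    "}\n";

int main(void)
{
    enum { N = 1 << 20 };          /* one dummy "frame" of pixel data */
    static float pixels[N];
    for (int i = 0; i < N; ++i) pixels[i] = 1.0f;

    cl_platform_id plat;
    cl_device_id dev;
    if (clGetPlatformIDs(1, &plat, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no GPU exposed by the driver\n");
        return 1;
    }

    cl_int err;
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "scale", &err);

    /* Hand the frame to the device and queue the kernel; the call
     * returns right away, so the GPU works in the background. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(pixels), pixels, &err);
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

    /* ...meanwhile the CPU is free for decode, audio, UI, whatever... */

    /* Pull the result back only when we actually need it. */
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(pixels), pixels,
                        0, NULL, NULL);
    printf("first pixel after GPU pass: %f\n", pixels[0]);
    return 0;
}
[/code]

None of that is specific to Intel; it's just the general shape of offloading a parallel chunk of work to whatever GPU the driver exposes.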
 