Intel to Introduce GPGPU Functions Into Westmere

[citation][nom]Abrahm[/nom]Can this extra graphics processor be utilized even with a separate graphics card? Say I have a 5850 and one of these chips, will the CPU dump off some work onto the integrated graphics while I use my primary GPU for other things, or does having a separate GPU leave the integrated GPU sitting dead in the box taking up space?How does this add to the heat of the CPU?[/citation]
Yes, that's what switchable graphics means.
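If the integrated GPU ends up exposing a compute API such as OpenCL, a program could in principle enumerate both GPUs and hand compute work to whichever one is idle while the discrete card renders. A minimal sketch, assuming a pyopencl-capable driver exists for the integrated part (that driver is an assumption, not something Intel has announced):

# Sketch: list every GPU that OpenCL can see, so compute work could be
# dispatched to the integrated GPU while a discrete card handles graphics.
# Assumes pyopencl is installed and the IGP actually ships an OpenCL driver.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices(device_type=cl.device_type.GPU):
        print(f"{platform.name}: {device.name} "
              f"({device.global_mem_size // (1024 ** 2)} MB global memory)")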
 
[citation][nom]ViDER[/nom]About freaking time![/citation]

Sure, AMD is great, as long as they don't mess up like they did with the first Phenom chip and finally go bankrupt, which would be no surprise.

But in the end Intel will need to bail them out, because they need the 64-bit instruction set that AMD built on Intel's original CPU design.
 
[citation][nom]pbrigido[/nom]Hey AMD Fanboy, get over it. Intel and AMD are both great companies.[/citation]
You're assuming he's an AMD fanboi. Intel isn't the great innovator they want you to think they are, whereas AMD has made CPUs great and affordable. I am not a "fanboi" but yeah, Intel does = fail.
 


Really? I guess you're not counting their Core 2 CPUs, the Atom CPU that popularized the netbook market, the Core i5, i7, etc., etc. Oh, and what about all their mobile CPUs for notebooks? Trying to find a notebook with an AMD CPU is like looking for a needle in a haystack because their mobile CPUs FAIL.
 
[citation][nom]Abrahm[/nom]Can this extra graphics processor be utilized even with a separate graphics card? Say I have a 5850 and one of these chips, will the CPU dump off some work onto the integrated graphics while I use my primary GPU for other things, or does having a separate GPU leave the integrated GPU sitting dead in the box taking up space?How does this add to the heat of the CPU?[/citation]
It probably does turn off the IGP with a discrete card installed. Of course, then you can use the far more powerful discrete card to do the work instead.
 
[citation][nom]omnimodis78[/nom]You're assuming he's an AMD fanboi. Intel isn't the great innovator they want you to think they are, whereas AMD has made CPUs great and affordable. I am not a "fanboi" but yeah, Intel does = fail.[/citation]
wow, what a douchebag.
 
[citation][nom]omnimodis78[/nom]You're assuming he's an AMD fanboi. Intel isn't the great innovator they want you to think they are, whereas AMD has made CPUs great and affordable. I am not a "fanboi" but yeah, Intel does = fail.[/citation]

Hey AMD fanboy...get over it. Like I said, they are both great companies. Without Intel, there would be no x86 market...period. Without AMD, there would be no value-minded market.

 
This fusion of GPU and CPU sounds like it would be pretty decent in laptops and low-end desktops. It will most likely become the standard for all laptops and low-end desktops (i.e., replace IGP+CPU) once AMD's Fusion is out the door as well.
 
[citation][nom]omnimodis78[/nom]You're assuming he's an AMD fanboi. Intel isn't the great innovator they want you to think they are, whereas AMD has made CPUs great and affordable. I am not a "fanboi" but yeah, Intel does = fail.[/citation]
So why are the Core i-series the best CPUs on the market? I haven't seen impressive benchmarks running on AMDs for quite some time, and when they are run on one, it's probably because of their accessible price.
 
Off-topic here, but one of the major annoyances with the TH posting system is the extreme ass-hattery going on with ratings. Good posts get up/down-rated due to fanboyism rather than content, which makes the system inherently pointless since you're forced to check all posts just in case.
 
[citation][nom]jharel[/nom]Do your research before making uninformed remarks.Clarkdale pulls even with AMD's 780G integrated solution in performance:http://www.xtremesystems.org/forum [...] stcount=38[/citation]Take your own advice. You're comparing the unreleased Clarkdale (desktop variant) to AMD's slowest current desktop IGP? Look instead at the Radeon 3300, or better yet, compare it to a future 5000 series IGP. 😛

I have no doubt that Intel's new processors will be fantastic, but the purpose of their new more tightly integrated GPU is NOT performance. It will have very low power consumption, and it will help provide platform dominance in the low power and low cost markets. Why integrate an Nvidia graphics processor into a mainboard when the processor package includes one already? They're trying to squeeze Nvidia a little harder. If they release an Atom with this GPU on-package, it will put even more pressure on Nvidia.
 
[citation][nom]jj463rd[/nom]Intel CPU = Awesome. Intel GPU = Sucks. AMD CPU = Good. AMD (ATI) GPU = Awesome. I think I'll trust AMD's CPU/GPU integration.[/citation]

Awesome + Sucks = Sucksome ?

Good + Awesome = Goosome ?
 
When it comes to CPUs, I prefer Intel.
When it comes to GPUs, I prefer ATI.
The future is a fusion/integration of CPU+GPU.
So AMD made a smart move acquiring ATI.
And Nvidia's future is very gloomy.
In the end, once discrete graphics are gone, we're all stuck with buying either an AMD system or an Intel system.
Now, which should I choose...?? Someone please help me toss a coin?
 
I don't believe the future lies in a unified architecture.
- People that demand high GPU power won't see any performance benefit for quite some time.
- In order to upgrade one component, you have to buy an entire new rig.
- Computer prices will go up because you don't have much choice between components.
- Competition will be heavy between Intel and AMD (figuring that they will have such technology), while other players such as EVGA, Sapphire, Corsair, Kingston... will be in deep trouble because of the unified architecture.


I may come up with more points....
 
I agree with all your points, Zehpavora.
Any reasonable unified architecture won't come in the near future, but I believe it is destined to happen, maybe in 20+ years, because our current hardware architecture is bound to hit a limit. Just like CPU clock speed: clock speeds haven't gone up for the last 2 to 3 years and have been hovering around 3 GHz for quite some time. Instead, we went multi-core.

Also, I cannot see myself still using a computer the size of a microwave oven in the next 20 to 30 years. That would be something for a computer history museum.

And pretty soon all those long copper traces will be replaced by an on-chip photonic network that unifies the processor core, the memory, and the graphics core.

The fact is that the evolution of the architecture will leave behind what we once dearly loved. There's nothing that can be done about it.
 
[citation][nom]DjEaZy[/nom]... that mean new MoBo?[/citation]

Nope. It keeps support for LGA 1156. Notebooks, on the other hand, may need one.

It's nice to see it will have switchable graphics support. That means you can have the GMA for internet, music, Office, and other tasks and then use an ATI/nV GPU for mobile gaming. That will help save battery life for the mobile gaming community.

I personally want to see exactly what GPU will be going with it. Is it the GMA 4500HD or maybe a new one? Plus, it's moving to 45nm (the 4500HD was 65nm), and being on the same chip should make for some interesting performance improvements, since it won't have to communicate from the NB to the CPU; now it will be via DMI from the CPU to the GPU.

Can't wait to see it though. It's the first step towards a single chip. Still not what enthusiasts want, but good for regular people.
 
[citation][nom]jharel[/nom]Do your research before making uninformed remarks.Clarkdale pulls even with AMD's 780G integrated solution in performance:http://www.xtremesystems.org/forum [...] stcount=38[/citation]

That's interesting to see. Kinda makes you wonder what it is, if it outpaces a 4500HD and keeps up with a 780G, which is at least an HD 3200...
 
Can't wait until the GPU is also 32nm, and they are eventually on the same die.

Does the GPU still use system memory? If so, why not go for a 256-bit memory interface (as compared to the current 64-bit for LGA 1156 and 96-bit for LGA 1366)? Current GPUs use a wider bus for increased bandwidth/performance, and faster frequencies too by running a much higher voltage on the memory, though at the cost of dramatically higher power usage.
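To put the bus-width point in rough numbers, here's a back-of-the-envelope sketch; the transfer rates (DDR3-1333 for the 64-bit system-memory case, 4 GT/s GDDR5 for the 256-bit card) are illustrative assumptions, not specs for any particular part:

# Rough peak-bandwidth comparison; figures are illustrative assumptions only.
def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    # bytes per transfer * million transfers per second -> GB/s
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000

print(peak_bandwidth_gb_s(64, 1333))    # 64-bit DDR3-1333 system memory: ~10.7 GB/s
print(peak_bandwidth_gb_s(256, 4000))   # 256-bit GDDR5 at 4 GT/s: ~128 GB/s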

Great potential, but I can't see the initial batches being anything better than a cheap Athlon II X4 and a 4200 integrated GPU.
More than anything though, I see this as a chance to reduce the power usage of a laptop dramatically (now if only every single chip could be put into one for a complete SoC).
 