GeForce GPU Will be Inside 200 Sandy Bridge PCs

Status
Not open for further replies.

joytech22

Distinguished
Jun 4, 2008
1,684
0
19,810
10
About time. The 360 has been doing this for years, and it's overdue on PCs too.

But you can bet your ass these solutions will probably cost *something stupid here*
 
^ They already have; they were just testing it out. Intel's i5s, and I think i3s, already have an iGPU. AMD also proved they could do it; they just don't have it on a desktop CPU. Their laptop processors do have an iGPU, however.

 

warfart1

Distinguished
Sep 29, 2009
11
0
18,510
0
They already have; they were just testing it out. Intel's i5s, and I think i3s, already have an iGPU. AMD also proved they could do it; they just don't have it on a desktop CPU. Their laptop processors do have an iGPU, however.
The previous generation of Intel processors with integrated graphics had them on package, not on die. I believe that the CPU cores were 32nm while the graphics cores were 45nm.
 

Blessedman

Distinguished
May 29, 2001
577
0
18,980
0
I can understand this design for a business-drone type of machine, but doesn't their entire new lineup have this? I bet Nvidia is laughing at this decision. I could see putting a Larrabee-style GPU on die, so you'd have decent graphics plus an amazing co-processor, but why waste the space on ... junk?
 

toxxel

Distinguished
Apr 14, 2009
68
0
18,630
0
I personally won't even think of buying a CPU with integrated graphics, let alone use it. I could only see something like this being popular in mobile systems, not desktops. I build a desktop to have the freedom to pick and upgrade what I want; integrated graphics takes that option away. If you find it in a desktop, it belongs to a casual user, or someone who couldn't care less.
 

mtyermom

Distinguished
Jun 1, 2007
956
0
18,980
0
[citation][nom]toxxel[/nom]More of a casual user if you would find it in a desktop, or someone who could care less.[/citation]

And there are a LOT more of those desktops than ones like ours.
 

phatboe

Distinguished
Sep 2, 2006
237
0
18,680
0
I agree with toxxel. While I like the idea of an APU/Fusion/integrated GFX core for laptops, I will probably never use it on a desktop computer, as I will always have a dedicated card. It's a shame that both AMD and Intel seem set on giving all their CPUs a GFX core when that die space could be used for an extra CPU core, more L3 cache, etc. And since I mostly stick with Nvidia cards, I doubt either AMD or Intel will let me pair the GFX abilities of the APU with a dedicated card (like some kind of hybrid SLI/CrossFireX).
 

Nintendork

Distinguished
Dec 22, 2008
464
0
18,780
0
If you read the Fusion articles (on other websites), there's the possibility that the APU can be used to offload physics calculations in games.

For now, AMD will provide regular desktops without an IGP on die (Zambezi, with 8/6/4 cores) alongside Fusion Llano (based on improved Deneb cores), updated later to Fusion Trinity (Bulldozer cores). So if you don't want a "useless" IGP, wait for Zambezi.
 

madevil59

Distinguished
Feb 18, 2009
12
0
18,510
0
[citation][nom]mtyermom[/nom]And there are a LOT more of those desktops than ones like ours.[/citation]

If you add in a video card, it will disable the GPU in the Intel chip. The i-series was built with a GPU inside from the start. This is just one of the first times it has caught enough media attention for people to read into it.
 

RazberyBandit

Distinguished
Dec 25, 2008
2,303
0
19,960
93
These 200 PCs could just as easily have Radeon GPUs in them.

What I've taken from this is that Nvidia has somehow partnered up with some major PC manufacturers to ensure their GPUs are featured alongside the Sandy Bridge CPU launch, which is expected to be a major attraction at CES. Nvidia is essentially hitching a ride on Intel's coattails. It's simply clever marketing and product exposure.
 

Silmarunya

Distinguished
Nov 3, 2009
810
0
19,010
13
[citation][nom]phatboe[/nom]I agree with toxxel, while I like the idea of an APU/Fusion/integrated GFX core for laptops but I will prob never use it on a desktop computer as I will always have a dedicated card. It sucks that it seems that both AMD and Intel will make it so that all their CPUs will have a GFX core when that die space could be used for an extra CPU core or more L3 and etc. And seeing as to how I mostly stick to Nvidia cards I doubt either AMD or Intel will make it so that I can pair the GFX abilities on the APU with the dedicated card (like some kind of hybrid SLI\CrossFireX)[/citation]

1) The die space 'wasted' on the IGP is not wasted. Today, many calculations can be offloaded to the GPU. If that trend continues, the IGP would in effect be an extra processor core that kicks in for computationally intensive applications. That's not a waste at all...

2) You might not use an IGP, but the majority of the market does, with good reason. Modern IGPs can do everything Average Joe asks of them (video playback, web browsing, casual gaming and word processing) quite well.

3) Intel has nothing to gain from strong-arming Nvidia GPUs out of the market. First and foremost, it would invite massive antitrust scrutiny, and second, it would mean effectively ceding the GPU market to arch-rival AMD.

FYI, IGPs can already be used in Crossfire/SLI with discrete GPUs (Hybrid SLI/Crossfire, anyone?). However, only the very lowest-end GPUs see a noticeable performance benefit; even the 5450 fails to post significant gains. After all, IGPs are terribly weak and their Crossfire/SLI scaling is absolutely pathetic.
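To make the offloading idea in point 1 concrete, here is a toy pure-Python sketch (my own illustration, with made-up names, not anything from the article) of a physics-style workload: every body's force depends only on read-only inputs, so the iterations are independent. That is exactly the data-parallel shape of work an on-die IGP or GPU could take off the CPU via something like CUDA or OpenCL.

```python
def pairwise_forces(bodies, eps=1e-3):
    """Naive O(n^2) gravity-style force accumulation in 2D.

    Each body's force reads only the (immutable) position list, so
    every iteration of the outer loop is independent -- the kind of
    work a GPU kernel would run with one thread per body.
    """
    forces = []
    for i, (xi, yi) in enumerate(bodies):
        fx = fy = 0.0
        for j, (xj, yj) in enumerate(bodies):
            if i == j:
                continue  # no self-interaction
            dx, dy = xj - xi, yj - yi
            d2 = dx * dx + dy * dy + eps      # softened squared distance
            inv = d2 ** -1.5                  # 1 / |d|^3 factor
            fx += dx * inv
            fy += dy * inv
        forces.append((fx, fy))
    return forces

bodies = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
forces = pairwise_forces(bodies)
print(forces)  # one 2D force vector per body
```

By Newton's third law the forces sum to (roughly) zero, which is a handy sanity check; on a real GPU the outer loop simply becomes the kernel launch grid.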
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
1
I hope that processors without an integrated GPU will be cheaper, more overclockable (fewer transistors), or more powerful (for the same reason).

I do not want to waste money and energy on unneeded transistors, unless Nvidia and AMD manage to add up the power of the integrated chipset and the discrete card. They should work on that, since it would give a performance advantage for free. But I suspect it will not work without DirectX support.
 

woshitudou

Distinguished
Oct 11, 2006
302
0
18,790
2
I like Nvidia cards, but the company is the Ryan Seacrest of computing (because they're douches and will backstab you for a promotion). They made a dedicated Intel-hate site and said DX11 means nothing back when they didn't have DX11 yet (http://www.tomshardware.com/news/Nvidia-GPGPU-GPU-DirectX-ATI,8687.html), and now they've flip-flopped.
 

tommysch

Distinguished
Sep 6, 2008
1,165
0
19,280
0
[citation][nom]toxxel[/nom]I personally won't even think of buying a cpu with an integrated graphics, or even use it. I could personally only see something like this being popular in mobile systems, not desktops. I build a desktop to have the customization to pick and upgrade what I want, integrated graphics disallows that option. More of a casual user if you would find it in a desktop, or someone who could care less.[/citation]

Well, you will have to... All the top-end unlocked Sandy Bridge chips will come with that PoS space-wasting integrated crap. It will be disabled, but it will be there, taking up precious die space and costing money for absolutely no reason. Even the Core i7-2600K... What's the point of putting such crap on a K model?
 

phatboe

Distinguished
Sep 2, 2006
237
0
18,680
0
[citation][nom]Silmarunya[/nom]The die size 'wasted' on the IGP is not wasted. Today, many calculations can be offloaded to the GPU. If that trend continues, the IGP would in effect be an extra processor core that only takes effect in computationally intensive applications. That's not a waste at all...[/citation]
The die space is wasted. You do realize that on Sandy Bridge, once a discrete GFX card is in use, the IGP on the CPU will simply shut down? So no, it will not be used for offloading calculations; the discrete card will have to take care of that.

[citation][nom]Silmarunya[/nom]You might not use an IGP, but a majority of the market does - with good reason. Modern IGP's can do all the things Average Joe asks of them (video playback, web browsing, casual gaming and word processing) quite well.[/citation]
Yeah, I know and agree, but I would hope they'd make a version of Sandy Bridge without the IGP for those few of us who don't want it.

[citation][nom]Silmarunya[/nom]Intel has nothing to gain from strong arming Nvidia GPU's out of the market. First and foremost it would cause massive antitrust regulations, and second it would mean they'd effectively cede the GPU market to arch rival AMD.[/citation]
Intel has everything to gain. Intel has already dealt with the antitrust issue for the most part, and it is unlikely Intel will be brought up again just for integrating its GPU into the CPU.

As far as ceding to AMD goes, you may have a point, but Intel is still working on Larrabee (they might not have released it, but Intel is still designing a discrete GFX card, despite claiming they won't release one anytime soon).

[citation][nom]Silmarunya[/nom]FYI, IGP's can already be used in Crossfire/SLI with discrete GPU's (Hybrid SLI/Crossfire anyone?). However, only for the lowest of lowest end GPU's there's a noticeable performance benefit.[/citation]
There is no hybrid Crossfire/SLI with Sandy Bridge; as I said before, the IGP will be disabled.
 

K-zon

Distinguished
Apr 17, 2010
358
0
18,790
3
I don't mind the idea of it, but I think if they keep going, they are going to have to increase die size again and really contend with higher wattage for the processors. Then they'll have to take the lower-wattage parts and lean on the motherboards again to keep power draw down during sustained calculations and processing. Hopefully they keep releasing the related products needed for this kind of parts integration.
 

nebun

Distinguished
Oct 20, 2008
2,841
0
20,810
19
I guess Intel can't make their own GPU, lol. Way to go, Nvidia... like the saying goes, "if you can't beat them, join them"... Intel had no choice here ;) but to join envidia, pun intended.
 

nebun

Distinguished
Oct 20, 2008
2,841
0
20,810
19
[citation][nom]joytech22[/nom]About time, the 360 has been doing this for years and it's about time it hit PC's too.But you can bet your ass these solutions will probably cost *something stupid here*[/citation]
You are so wrong, buddy; the 360 has a dedicated GPU.
 
