GeForce GPU Will be Inside 200 Sandy Bridge PCs

It is all stupid and a waste of time. The performance you get will be a slide show in any current game. Get a regular CPU and a dedicated graphics card, since any motherboard you buy now has at least one PCIe x16 slot. For any video encoding/decoding, use your dedicated GPU!
 
Do you guys even realize how much heat you will have to deal with? You can't even overclock the CPU; instead you're offered some Turbo joke that dynamically overclocks one of your cores. Performance-wise you gain nothing unless all the cores are overclocked and stay overclocked until you shut down your machine.
 
The integrated Intel graphics is not a DirectX 11 part. For full DX11 support, OEMs have to turn to graphics solutions from Nvidia (or AMD).

This makes me chuckle and smirk. I figured Intel would have to do this unless they wanted to make their competitor AMD rich by using AMD's cheaper, lower-power GPUs, which would have been smarter for the near future. After Nvidia's stock tumbled from that move, Intel could have picked up Nvidia for dirt cheap!
If this jacks up the price of their CPUs ridiculously more than it already is, though, it's going to flop. Without a massive battery to keep a laptop powered for longer than an hour for the people who would actually take advantage of it, I don't see this ending well as long as Nvidia and Intel stay separate companies, and there's no way Nvidia can buy Intel. If this doesn't work and Nvidia doesn't make it work, they had better start searching for a decent CPU manufacturer and either make their own or buy one out and massively overhaul it like AMD did, or they will wind up like 3dfx soon.
Most buyers are moving toward lightweight, portable computing that doesn't require much upkeep. Apple is proving they will pay through the nose for it, too.
 
[citation][nom]phatboe[/nom]I know and agree, but I would hope they would make a version of Sandy Bridge without the IGP for those few of us who don't want it.[/citation]
Simply because it's more expensive to create. They'd have to make a separate mask and change their tooling to do that; it's easier to disable the iGPU and call it a day. I also doubt a CPU-only die would have any performance advantage. It wouldn't overclock better, and the disabled iGPU actually acts as a cold spot.

I'd actually like a CPU+GPU combo. When I'm not playing games, folding, etc., I can use the iGPU and draw less power. For laptops this is a must. Currently, when a dGPU is installed the iGPU is disabled, but I think manufacturers can work around that.
 
:/ Don't tell me they're still going to stick with their GF110s and boast about nailing ATI's coffin shut with it.

If Intel's new hush-hush iGPU is designed by Nvidia's techies... I'm all in favor of not buying anything with Nvidia on it. Overheating CPUs, anyone?
 
Why would anyone (that's directed at you, Intel) make any kind of graphics solution, to be released NEXT YEAR, that is not DX11? Stick with CPUs, Intel, and leave the graphics to Nvidia and ATI.
 
Quoting Marcus:
"While the graphics prowess of is Intel’s best yet,"
Marcus.. just what are you TRYING to say there? I don't understand...

.. or let me ask in your lingo...
Marcus.. you what try to saying you there? I no understand you...
 
[citation][nom]Silmarunya[/nom]1)
The die size 'wasted' on the IGP is not wasted.

Well, what is it then?

Today, many calculations can be offloaded to the GPU. If that trend continues, the IGP would in effect be an extra processor core that only comes into play in computationally intensive applications.

Oh, like two other huge manufacturers already do, one of whom has been / is involved in protracted lawsuits brought by Intel to stop them from using their GPUs as CPUs?

2) You might not use an IGP, but a majority of the market does - with good reason. Modern IGPs can do all the things Average Joe asks of them (video playback, web browsing, casual gaming and word processing) quite well.

That's market share. It says nothing about the quality of the product, how much it costs, or the flexibility it gives someone in their choice. Intel has given buyers of this product series absolutely nothing and instead taken away everything. Essentially, they've packaged their chip with Internet Explorer, and if you don't know any different, you'll stay with it forever and ever. The cost is hidden in the price, and you're stuck with it.

3) Intel has nothing to gain from strong-arming Nvidia GPUs out of the market.

HAAAAAAAAAAAHAHAHAHAHAAHAHAHHAAAAHHAHA!!!

First and foremost it would cause massive antitrust regulations, and second it would mean they'd effectively cede the GPU market to arch rival AMD.

It would trigger antitrust action if someone could be bothered, politically, to take one of America's largest tech manufacturers to court for being "innovative" and "progressing American values and business". Which they won't.

And as for AMD... I'm confused and I don't understand.

FYI, IGPs can already be used in Crossfire/SLI with discrete GPUs (Hybrid SLI/Crossfire, anyone?). However, only the lowest of low-end GPUs see a noticeable performance benefit. Even the 5450 fails to post significant gains. After all, IGPs are terribly weak and their Crossfire/SLI scaling is absolutely pathetic.

So you're saying an IGP isn't any good at anything other than being an IGP? And that Mr Bloggs out there wouldn't use it for Crossfire or SLI? (Firstly because he wouldn't know what the hell you were talking about.)

Tell me again: what is actually GOOD about this for ANYONE other than Intel? Why didn't they just bl**dy leave it the way it was? Oh yes. MONEY.



[/citation]
 
[citation][nom]scrumworks[/nom]Nvidia GPU inside 200 Sandy Bridge PCs? Well, that's not much. I bet AMD GPUs will be inside hundreds of thousands of Sandy Bridge PCs.[/citation]

... and a lot more in AMD PCs too... 😛 😛 😛

(Maybe I should drop Intel / nVidia and get my ass back to AMD / AMD)
 
[citation][nom]eddieroolz[/nom]I'd love to see more Radeon/GeForces inside PCs, and this is a good step towards that.[/citation]

I'm not sure I get you - this will just make 75% of the people I know who have PCs not bother with any graphics card at all. It'll be marketed as some shiny new-fangled pair of shoes that everyone will have to have, and before you know it there'll be no 330M in laptops, no 5750s in Best Buy - just a nice clean sparkly row of Sandy Bridge PCs and laptops.
 
[citation][nom]mtyermom[/nom]And there are a LOT more of those desktops than ones like ours.[/citation]

Casual users don't buy $300 unlocked-multiplier CPUs...
 
[citation][nom]amillion[/nom]... and a lot more in AMD PCs too... (Maybe I should drop Intel / nVidia and get my ass back to AMD / AMD)[/citation]

If you don't mind the epic crappiness of AMD.
 
OK, after READING the article, I see one thing: Sandy Bridge's on-die graphics is too weak to take advantage of IE9 (GPU acceleration for the web and all that junk), soooo... all the OEMs are getting their arms twisted by Nvidia, because AMD was too busy actually developing a better product to pay their marketing department.

[citation][nom]TommySch[/nom]Casual users don't buy $300 unlocked-multiplier CPUs...[/citation]
I think this guy was already saying exactly that, TommySch.

[citation][nom]TommySch[/nom]If you don't mind the epic crappiness of AMD.[/citation]
Maybe he doesn't want to spend 100% more money for a 20% gain in performance?
 
I just had to RMA my 5770. I'm really glad I forked over the extra $10 for a motherboard with 4250 graphics. Now my desktop is running rock-solid instead of being out of service. Some type of integrated graphics can be really helpful on the desktop.
 
Well, seeing how consoles already did this, and we know how they turned out, it feels like we would still be better off buying discrete (and most likely for a long time to come, too).
 