What is the point of integrated graphics???


Tig2575

A lot of the hubbub about Ivy Bridge has to do with how much better its integrated graphics core is than Sandy Bridge's, and people often talk about how Haswell will handle integrated graphics, especially in comparison to AMD's current line of APUs.

I understand that integrated graphics might be nice if you're using a netbook or something, but what in the world is the point of it on a desktop, the vast majority of which have a discrete graphics card? Ivy Bridge may be a godsend for notebooks, but Intel produces CPUs for desktop computers, too! Why would any of us care about a minuscule GPU buried away in our processor when we're never going to use it, because we have our own dedicated graphics processor?
 
Solution
In the 9 years that I have been with Intel®, the vast majority of systems that I have talked with people about have used integrated graphics of one type or another. Heck, even most of the computers that I have advised home users on use integrated graphics. The only time I start to look at putting in a video card is when they are a gamer or some other type of power user.
The Intel® HD 4000 graphics are powerful enough to give even some gamers and power users the performance they need to run their applications. So if you are a heavy gamer or someone doing heavy graphics work (e.g., video editing), then you might look at adding a video card, but otherwise the onboard graphics are good.


Christian...

Tig2575

What about the 3770K, which is obviously intended for enthusiast users? Why in hell would Intel bother including "more powerful" (i.e., still not quite as good as a GT 440) integrated graphics on a chip that will virtually only be used by people who have discrete GPUs?
 

Tig2575

I would appreciate a genuine explanation of the underlying principle, rather than unwarranted, borderline immature hostility.
 

Tig2575

I was hoping someone could explain the emphasis on integrated graphics in enthusiast models (like the 3770K) that will only be used by people with discrete GPUs. The first reply to this topic doesn't explain that.
 
It actually does.
It's not like they could take the IGP out just for 5% of the market and rework the rest of the CPU to ... do something they feel doesn't need to be done at this point in time.
The basic underlying principle is a business decision.
 
'While there have been relatively minor changes to the CPU component of the chip, Intel also claims to have made significant improvements to Ivy Bridge’s integrated graphics processor (IGP). The Intel HD Graphics 4000 found in both the Core i7-3770K and the Core i5-3570K sees the execution unit (EU) count rise from 12 in Intel HD 3000 to 16, as well as support, at last, for DirectX 11. Of course, no self-respecting PC enthusiast would want to run their system solely off of integrated graphics, but it’s a step in the right direction for mass-market users, and it still brings the advantages of technologies such as Intel’s Quick Sync video, which benefit from the increased EU count'.
 

Tig2575

Gotcha, thanks, all. Going forward, is there any potential for using the IGP to process additional computations alongside the CPU, the kind of work that would never have been handed off to a discrete graphics card because of the relative slowness of PCIe?
 


Quick Sync... Even though CUDA is great, Quick Sync is much faster and gives better-quality video transcoding. The HD 4000 just makes it even faster.

Edit: Not to mention that using the HD 4000 graphics with Virtu MVP (with a Z77 mobo and a discrete GPU, of course) might actually help gaming performance.
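
And since the question was about compute rather than just transcoding: the HD 4000 also shows up as an OpenCL device once Intel's OpenCL runtime is installed, so in principle you can hand it kernels while the CPU (and a discrete card) are busy with other work. Here's a minimal sketch in C, not from anything above, that just lists the OpenCL platforms and devices a host program could dispatch to; the names and counts it prints will vary by driver, and it assumes the OpenCL headers and an ICD loader are present on the system:

/* Minimal sketch: enumerate every OpenCL platform and device.
   On an Ivy Bridge box with Intel's OpenCL runtime installed, the HD 4000
   should appear as a GPU device that compute kernels can be sent to,
   independent of any discrete card.
   Build with something like: gcc list_cl.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS ||
        num_platforms == 0) {
        printf("No OpenCL platforms found.\n");
        return 1;
    }
    if (num_platforms > 8) num_platforms = 8;  /* only 8 were stored above */

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char pname[256] = "";
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof pname, pname, NULL);

        cl_device_id devices[8];
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;
        if (num_devices > 8) num_devices = 8;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char dname[256] = "";
            cl_device_type type = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof dname, dname, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_TYPE,
                            sizeof type, &type, NULL);
            printf("%s: %s [%s]\n", pname, dname,
                   (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other");
        }
    }
    return 0;
}

On a typical Z77 build you'd expect the Intel platform to report both the CPU and the HD 4000, alongside whatever the NVIDIA or AMD driver exposes for the discrete card.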
 

cmi86

My logic would lead me to believe that Intel is equipping these chips with their new HD 4000 graphics in an attempt to reclaim the onboard GPU performance crown from AMD. Just a guess, though; I could be completely wrong. Having a backup if the discrete card fails is kind of nice, though.
 
I have 6 :eek: desktops. My three "personal" systems each have discrete graphics. One is pretty old - that's where my G80 640 MB 8800 GTS finally ended up.

My three "office" systems use entirely adequate G41 motherboards with built in graphics.

You all need to remember that, hard as it is to believe :), we gamers are a niche market.

There are also a lot of people with "K" chips who use them for tasks other than gaming. And for them, the built-in graphics are more than adequate.
 

loneninja

Because there are many reasons to purchase a powerful processor like the 3770K, and if you're not going to game, then the integrated graphics are usually more than powerful enough. All Intel did was move the graphics from the motherboard chipset into the processor; it's not like they added something that wasn't there before. :lol:
 
I can't speak for the Ivy Bridge chips, and I'm not gonna bother looking them up, but the 2500K does not support integrated graphics. I'd imagine the enthusiast-targeted chips will come sans integrated graphics support as well.

P.S. The four-core hyper-threaded chips like the 2600K are not targeted at enthusiasts, IMO. As AMD has so gracefully shown us *cough*, games aren't gonna use 8 cores any time soon.

The vast majority of PCs use integrated graphics, not discrete cards.
 


Fair enough. Wrong model, same point.

I suppose they either hadn't figured out which model to target at enthusiasts at launch, or, it being a virgin fabrication process, just didn't want to mess with too much in the beginning.
 