Integrated graphics with dedicated graphics card

clankirkpatrick

Honorable
Jan 10, 2013
If you have integrated graphics, like on the i5-3570K, does that work in tandem with a dedicated card to boost performance, or do they only work alone? Thanks!
 
They work alone. When a discrete graphics card is installed, the IGP is disabled unless you change a BIOS setting to keep it always enabled.
You cannot get them to work together.
Only with AMD APUs is it possible to use the IGP together with a discrete graphics card, but that also has constraints: only an AMD graphics card can be used, and only from certain series.
 
+1 for rds...that is the most important question. If you just want to enjoy watching HD movies, the HD 4000 IGP in the 3570K will hold its own.
If you want to do video editing/encoding/rendering, play games at a consistent 60+ fps, or do things like product rendering, then a discrete graphics card is mandatory.
 
It's going to be mostly gaming and normal daily tasks...my total budget is around $800-$900. This is my first complete build; I have some help from friends who have done it many times, but for the most part I am learning as I go.

I was leaning toward the 3570K, but I didn't know if I would need a card to go with it. I've seen some benchmarks with the 3570K alone and with the 3570K plus a card, so that's where my question arose. Here's what I was looking at:
http://www.xbitlabs.com/articles/graphics/display/intel-hd-graphics-4000-2500_5.html
 
For any gaming build there is the important point of future proofing, i.e. being able to play games that will be released in, say, another couple of years. The link you provided shows an old game being played almost okay at a low resolution and low game settings. The point of that review is that Intel is really focused on improving its IGP and bringing it on par with AMD's APU IGP, but what's noteworthy is that they still have a long way to go. :)

As far as future proofing goes, an IGP won't let you enjoy most games, especially those that employ technologies like PhysX. A discrete GPU adds gaming muscle many times over and is therefore very important to a gaming/rendering build.

Here is a way to understand things. Video is recorded at 24 frames per second, and when frames are played back at that speed the human eye sees fluid motion. All those frames are prerendered, though: every detail of shadow, lighting and texture is already captured in each frame. An IGP will easily play any HD movie because of this.

When you game, it is a different story. Playing an FPS, for example, every time you turn your head the game feeds your GPU compute data to calculate the shadow, lighting and texture details. Remember, nothing here is prerendered, so every detail you see on screen is being rendered in real time. To get the same fluid experience as an HD movie while rendering in real time, you should ideally be sitting at more than 30 fps.

All of the above requires a lot of parallel computing muscle, which for now is only possible with discrete GPUs.
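To put those frame rates in perspective, here is a minimal sketch (plain Python, function name is mine) of the per-frame time budget a GPU has to hit at each target:

```python
# Per-frame render budget: at a target frame rate, the GPU must finish
# all shadow/lighting/texture work for a frame within this many ms.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the target fps."""
    return 1000.0 / fps

for target in (24, 30, 60):
    print(f"{target} fps -> {frame_budget_ms(target):.1f} ms per frame")
```

At 60 fps that works out to roughly 16.7 ms per frame, which is why real-time rendering demands far more parallel muscle than simply decoding prerendered video.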
 
Great answer! So what would you recommend for a GPU under $150?

Also, this may be the wrong forum to ask, but would I be better off with one better card or two cheaper cards in SLI or CrossFire?

You guys have been super helpful so far, thanks!
 


Can you help me please? I have an R7 250 GPU, and when I go to Device Manager > Display Adapters I can see only AMD Radeon R7 200 Series, but the integrated graphics (i5-3470/HD 2500) isn't there. Why is the integrated graphics missing? I have seen on some computers that both the iGPU and the GPU are listed there. :/