[SOLVED] Intel Integrated Graphics or Basic Graphics Card?

Oasis Curator

Reputable
Apr 9, 2019
236
7
4,595
I know this is probably not going to make a huge difference, but I'm wondering whether I should get something like an Nvidia GeForce EN210 1GB graphics card instead of using my Intel i5-9500's integrated graphics for 4K film watching.
The processor does okay, but I'm looking to run some 10-bit HDR files, so I'm wondering whether a graphics card would take some of that load off the processor?

I don't have hundreds of pounds to spend on this - the graphics card is £10.
I could spend up to £100, but prices are stupid at the moment. That's why I got that particular processor, so it could handle the 4K files. Plus, it's for our HTPC, which is nearly silent, so adding a card with fans would add noise - a lot more noise, I fear.
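For reference, one way to confirm whether a file really is 10-bit HEVC (and therefore which decoder it needs) is to inspect it with ffprobe from the FFmpeg package. A rough Python sketch, assuming ffprobe is on the PATH and using a placeholder file name:

```python
import json
import subprocess

# Ask ffprobe for the video stream's codec, profile and pixel format.
# "film.mkv" is a placeholder path - substitute a real file.
result = subprocess.run(
    [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,profile,pix_fmt",
        "-of", "json",
        "film.mkv",
    ],
    capture_output=True, text=True, check=True,
)

stream = json.loads(result.stdout)["streams"][0]
print("Codec:", stream.get("codec_name"))      # e.g. "hevc" for H.265
print("Profile:", stream.get("profile"))       # e.g. "Main 10" for 10-bit
print("Pixel format:", stream.get("pix_fmt"))  # e.g. "yuv420p10le" = 10-bit
```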
 
Solution
The 210 would be a bad choice.
It does not have dedicated decoders for newer video formats, so decoding would fall back to the CPU, or to the integrated graphics if that is still enabled in the BIOS.
Some boards automatically disable the integrated graphics when you install a dedicated video card, so the burden would shift to CPU decoding.
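If you want to see where decoding actually lands, FFmpeg can be told to use a specific hardware decoder and will warn or fail if that path isn't available. A rough Python sketch, assuming an FFmpeg build with D3D11VA support (the usual Windows case) and a placeholder file name:

```python
import subprocess

# Try to decode the first few seconds using a hardware decoder.
# "d3d11va" is the usual choice on Windows builds of FFmpeg;
# on Linux, "vaapi" (Intel) or "cuda" (NVIDIA) would be used instead.
# "film.mkv" is a placeholder path.
cmd = [
    "ffmpeg", "-v", "warning",
    "-hwaccel", "d3d11va",
    "-t", "10",              # only decode the first 10 seconds
    "-i", "film.mkv",
    "-f", "null", "-",       # decode only, discard the output
]

proc = subprocess.run(cmd, capture_output=True, text=True)
if proc.returncode == 0 and not proc.stderr:
    print("No warnings - the hardware decode path looks fine.")
else:
    print("Decode problems / fallback messages:")
    print(proc.stderr)
```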
 
Still a little old for hardware decoders.
The 900 series does not support hardware HEVC decoding. It can manage it in a hybrid mode, but at high CPU/GPU usage, which translates to noise.

My bad. The 960 was released later and does have NVIDIA's HEVC decoder. The higher-end cards do not.
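If you do end up with an NVIDIA card, one way to tell whether it has the fixed-function HEVC decoder rather than the hybrid path is to push a short HEVC clip through NVDEC and see whether it errors out. A rough sketch, assuming an FFmpeg build compiled with NVDEC/CUVID support and a placeholder clip name:

```python
import subprocess

# "ffmpeg -decoders" lists what this build was compiled with; it does not
# prove the installed GPU has the hardware block, so we also run a short
# decode through NVDEC (hevc_cuvid), which fails on GPUs without it.
# "clip.mkv" is a placeholder for any short HEVC file.
build = subprocess.run(["ffmpeg", "-v", "error", "-decoders"],
                       capture_output=True, text=True)
print("Build supports NVDEC HEVC:", "hevc_cuvid" in build.stdout)

test = subprocess.run(
    ["ffmpeg", "-v", "error", "-c:v", "hevc_cuvid",
     "-t", "5", "-i", "clip.mkv", "-f", "null", "-"],
    capture_output=True, text=True,
)
print("GPU decoded it:", test.returncode == 0)
if test.returncode != 0:
    print(test.stderr)
```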
 
Last edited:
Since you already have integrated graphics, why not try it first and see how you do?

What is the make/model of your motherboard, and the device that you will be using?
Usually, an HDMI-connected display on integrated graphics will show 4K resolution OK, but only at 24 Hz; a DisplayPort connection may do 60 Hz.
My guess is that a GT 1030 might be a good card to use:

I do not understand the implications of 10 bit processing.
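The 24 Hz limit on older HDMI ports is essentially a bandwidth question, and 10-bit output needs about 25% more bandwidth than 8-bit. A back-of-the-envelope calculation (active pixels only, ignoring blanking intervals, so real link requirements are somewhat higher):

```python
# Rough uncompressed video bandwidth: width x height x refresh x bits per pixel.
# Approximate figures that ignore blanking and protocol overhead, so the real
# link requirements are higher - but the comparison still holds.
def gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(f"4K 24 Hz,  8-bit: {gbps(3840, 2160, 24, 8):5.1f} Gbit/s")   # ~4.8
print(f"4K 60 Hz,  8-bit: {gbps(3840, 2160, 60, 8):5.1f} Gbit/s")   # ~11.9
print(f"4K 60 Hz, 10-bit: {gbps(3840, 2160, 60, 10):5.1f} Gbit/s")  # ~14.9

# HDMI 1.4 carries roughly 8 Gbit/s of video data, which is why it tops out
# around 4K at 24-30 Hz; HDMI 2.0 and DisplayPort 1.2 or later have enough
# headroom for 4K at 60 Hz.
```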
 

InvalidError

Titan
Moderator
My i5-3470 had enough processing power to decode 4K in software, so a newer i5 should have no problem with it. I wouldn't bother slapping on an ancient entry-level GPU for that, especially one that doesn't have hardware acceleration for newer codecs either, since you'd end up relying on CPU-based decode with only partial hardware acceleration either way.
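A quick way to check whether a given CPU keeps up in software is to time a pure decode run with FFmpeg's -benchmark option. A rough sketch with a placeholder file name, leaving hardware acceleration off so only the CPU does the work:

```python
import subprocess

# Decode one minute of the file on the CPU only (no -hwaccel requested) and
# discard the output; -benchmark reports CPU time and real time at the end,
# and the "speed=" figure should stay comfortably above 1x for smooth playback.
# "film.mkv" is a placeholder path.
proc = subprocess.run(
    ["ffmpeg", "-benchmark", "-v", "info",
     "-t", "60", "-i", "film.mkv", "-f", "null", "-"],
    capture_output=True, text=True,
)

# ffmpeg writes its progress and benchmark lines to stderr.
for line in proc.stderr.splitlines():
    if "bench:" in line or "speed=" in line:
        print(line)
```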
 

Oasis Curator

Reputable
Apr 9, 2019
236
7
4,595
I think what's also throwing me off is that the TV can process all this stuff, but I guess if the PC isn't outputting it, then it won't matter?
I need to try some proper HDR material and see what it looks like, I think.
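Whether the TV actually receives HDR depends on the whole chain (file, player, GPU output mode, HDMI port and cable), but the file side is easy to verify, since HDR10 material is tagged with BT.2020 primaries and the SMPTE 2084 (PQ) transfer function. A rough ffprobe sketch with a placeholder file name:

```python
import json
import subprocess

# HDR10 content is normally tagged bt2020 primaries + smpte2084 transfer.
# "film.mkv" is a placeholder path.
out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=color_primaries,color_transfer,color_space,pix_fmt",
     "-of", "json", "film.mkv"],
    capture_output=True, text=True, check=True,
)

stream = json.loads(out.stdout)["streams"][0]
is_hdr10 = (stream.get("color_primaries") == "bt2020"
            and stream.get("color_transfer") == "smpte2084")
print("Stream tags:", stream)
print("Looks like HDR10:", is_hdr10)
```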
 
