Do I need a graphics card for watching movies and playing 4K content via YouTube?

click2saurabh

Hello,

Sorry for being naïve, as I am very new to this.
My system configuration is as below:
i5 4690K
Z97I-Plus mobo in a SilverStone Raven RVZ01 mini-ITX case
8 GB DDR3
500 GB M.2 SSD
Connected to a Samsung 55in UHD TV via a DisplayPort-to-HDMI cable
My question is: do I need a discrete graphics card to play HD movies and watch 4K content from YouTube? I use a PS4 for gaming, so I don't intend to play games on the PC. I may occasionally use photo editing software (Lightroom/Photoshop), though. If I do need one, which should I go for? It has to fit in the mini case I have (SilverStone Raven RVZ01, mini-ITX form factor).
Please advise.
 

Thanks, and yes, it shows me options to increase the resolution beyond the usual 1920x1080; however, Windows keeps notifying me that the resolution is not optimal. Also, as soon as I open CyberLink PowerDVD, it changes the resolution with a message.

If I have to buy one, what would it be? A decent card with 4K support.
 
You should be able to; the supported resolutions are:

Multi-VGA output support : HDMI/DVI-D/RGB/DisplayPort ports
- Supports HDMI with max. resolution 4096 x 2160 @ 24 Hz / 2560 x 1600 @ 60 Hz
- Supports DVI-D with max. resolution 1920 x 1200 @ 60 Hz
- Supports RGB with max. resolution 1920 x 1200 @ 60 Hz
- Supports DisplayPort with max. resolution 4096 x 2160 @ 24 Hz / 3840 x 2160 @ 60 Hz

I don't know if you are using an old HDMI cable; not all of them are capable of the 4K standard.
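
If you want to double-check what the driver is actually offering over that cable, here's a rough sketch that lists every display mode Windows exposes for the primary display (assumes Windows plus Python with ctypes; the struct is the standard Win32 DEVMODEW layout):

```python
# Rough sketch: list the display modes the graphics driver exposes for the
# primary display, so you can see whether 3840x2160 (and at what Hz) is
# actually on offer over the current connection. Assumes Windows + Python 3.
import ctypes
from ctypes import wintypes

class DEVMODE(ctypes.Structure):
    # Win32 DEVMODEW layout (display fields; trailing ICM/panning DWORDs
    # included so dmSize matches what EnumDisplaySettings expects).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

mode = DEVMODE()
mode.dmSize = ctypes.sizeof(DEVMODE)

# iModeNum = 0, 1, 2, ... walks the driver's mode list for the primary display.
seen = set()
i = 0
while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    key = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
    if key not in seen:  # the list repeats each mode once per colour depth
        seen.add(key)
        print(f"{key[0]} x {key[1]} @ {key[2]} Hz")
    i += 1
```

If 3840x2160 only ever appears at 24/30 Hz in that list, something in the chain (port, cable or adapter) is limiting it, not Windows.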

What message is it giving you on the forced change?
 


Hi,
It says, "This is not an optimal resolution for the primary display. The optimal resolution is 1920x1080. Select the notification for more information." When clicked, it says, "Resolution settings other than 1920x1080 may reduce the quality of viewing."
One thing I forgot to mention above: my PC is connected via the DisplayPort-to-HDMI cable (a new cable, bought recently) to a Yamaha RX-V675 AV receiver, whose HDMI out goes over a new HDMI 2.0 cable to my Samsung UHD TV's HDMI in. This receiver has 4K upscaling and passthrough. I tried passthrough too, but got the same result.

Please help
 


Whoa, many thanks. Yes, I did as suggested, and now it shows the optimal resolution as 3840x2160 at 30 Hz. Changing it to 4096x2160 throws the same notification about the resolution not being optimal. However, the text has become too small for me to read, even at maximum size. Is there a way out?
 
What is the exact model of the TV?

As for text size: right-click the desktop → Screen Resolution, then "Make text and other items larger or smaller".
At least, that's where it is on Windows 7; I imagine 8 has the same thing, but I don't know where 8's resolution settings are.
 


The TV model is a Samsung UHD UE55JU7500. I am using Windows 7 and figured out the text sizing. I also connected it back to the receiver, changed the video settings to passthrough (it was set to auto earlier) and forced it to 3840x2160. The resolution looks the same with or without the receiver now. However, I can't get it to 60 Hz either way, even though your previous response says it's supported [- Supports DisplayPort with max. resolution 4096 x 2160 @ 24 Hz / 3840 x 2160 @ 60 Hz]. Am I still doing something wrong?
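
For what it's worth, the 60 Hz ceiling may simply be link bandwidth: a DisplayPort-to-HDMI cable usually runs as an HDMI 1.4-class link. Here's a back-of-the-envelope check as a Python sketch (the 8.16 and 14.4 Gbit/s video data rates are the published HDMI 1.4/2.0 figures; the ~20% blanking overhead is an approximation):

```python
# Rough bandwidth estimate: can a given HDMI link carry 3840x2160 at a
# given refresh rate? Figures are approximate; blanking overhead varies
# by timing standard (~10-20% extra pixels beyond the visible ones).
def needed_gbps(width, height, hz, bits_per_pixel=24, blanking=1.20):
    return width * height * hz * bits_per_pixel * blanking / 1e9

LINKS = {
    "HDMI 1.4 (8.16 Gbit/s video data)": 8.16,
    "HDMI 2.0 (14.4 Gbit/s video data)": 14.4,
}

for hz in (24, 30, 60):
    need = needed_gbps(3840, 2160, hz)
    for name, cap in LINKS.items():
        ok = "fits" if need <= cap else "does NOT fit"
        print(f"3840x2160 @ {hz} Hz needs ~{need:.1f} Gbit/s -> {ok} in {name}")
```

By this arithmetic, 4K at 24/30 Hz squeezes into an HDMI 1.4-class link, but 4K at 60 Hz needs HDMI 2.0 end to end, which could explain topping out at 30 Hz through that cable.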
 


Not sure if this is the problem, and you may have already solved your issue, BUT you need a TV or monitor which supports NATIVE 4K resolution. (Just because you have a 70-inch widescreen TV doesn't mean it supports 4K.)

Resolution is a bit confusing, since a TV or monitor of any size can be driven at many different resolutions; what graphics hardware does is render the image and scale it to fit the screen.

For example, 1920x1080 (Full HD) means 1920 pixels across and 1080 pixels down; it says nothing about the physical size of the screen.

4K UHD is 3840x2160, which is twice Full HD in each direction and four times the pixel count (DCI 4K is the slightly wider 4096x2160). A 40-inch panel and a 70-inch panel can both be 4K; the larger one simply has bigger pixels.
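
To make the pixels-versus-inches point concrete, here's a quick pixel-density calculation (a small Python sketch; the diagonal sizes are just example values):

```python
# Pixel density (PPI) = diagonal pixel count / diagonal size in inches.
# Resolution fixes the pixel count; the panel size only changes how big
# each individual pixel is.
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    return hypot(width_px, height_px) / diagonal_inches

for diag in (40, 55, 70):
    print(f'{diag}" Full HD: {ppi(1920, 1080, diag):5.1f} PPI   '
          f'4K UHD: {ppi(3840, 2160, diag):5.1f} PPI')
```

Same resolution, very different pixel sizes.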

Every TV and monitor stores a small data block called the EDID (Extended Display Identification Data) that describes the resolutions and timings it supports; GPUs and graphics chips read it over the cable, and thanks to that they can automatically pick, render and scale output to suit your screen.
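
If you're curious, that data block can be decoded by hand. Here's a minimal sketch that decodes the first (preferred) detailed timing descriptor of an EDID, per the VESA EDID field layout; the example bytes are the standard 297 MHz, 3840x2160 @ 30 Hz timing found in many 4K TVs' EDIDs (it assumes you've dumped the raw EDID with some tool; only the first 8 of the descriptor's 18 bytes are needed for resolution and refresh):

```python
# Minimal EDID sketch: the first Detailed Timing Descriptor (18 bytes,
# at offset 54 in a 128-byte EDID block) encodes the display's preferred
# mode. Only the first 8 bytes are decoded here.
def parse_dtd(dtd):
    pixel_clock_hz = (dtd[0] | dtd[1] << 8) * 10_000   # stored in 10 kHz units
    h_active = dtd[2] | (dtd[4] & 0xF0) << 4
    h_blank  = dtd[3] | (dtd[4] & 0x0F) << 8
    v_active = dtd[5] | (dtd[7] & 0xF0) << 4
    v_blank  = dtd[6] | (dtd[7] & 0x0F) << 8
    refresh = pixel_clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, refresh

# Example: the standard 297 MHz, 3840x2160 @ 30 Hz timing.
example = bytes([0x04, 0x74, 0x00, 0x30, 0xF2, 0x70, 0x5A, 0x80])

w, h, hz = parse_dtd(example)
print(f"Preferred mode: {w} x {h} @ {hz:.0f} Hz")  # -> 3840 x 2160 @ 30 Hz
```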

If your TV/monitor doesn't support native 4K, the closest you can get is a dedicated GPU with a feature that renders at a higher-than-native resolution and downscales the result to the panel (on Nvidia that feature is called DSR, Dynamic Super Resolution); the screen itself still only shows its native pixels.

Intel CPUs have an integrated graphics chip (yes, it's a graphics chip, not an actual graphics card; AMD's FM2+ APUs integrate a more substantial GPU on the CPU package), and integrated graphics won't output a resolution higher than what your monitor or TV natively supports.

Some 4K monitors require a DisplayPort cable for 4K resolutions, so that can also be something to check.

But back to the question itself:

Do you need a dedicated GPU for 4K video?

Answer: No, for 4K resolution you don't need a dedicated graphics card, as long as you have a 4th or 5th gen Intel CPU.

HOWEVER, a dedicated graphics card is the smarter choice, because it won't put your CPU under heavy load, you can use 3D, you get somewhat better video quality, and you can set up multi-monitor with Nvidia Surround or AMD Eyefinity, up to 36 screens if you want (36 screens would require 4 AMD GPUs in CrossFire or 4 Nvidia GPUs in SLI).

Unless you are planning to install 36x 10-inch monitors and create a home-cinema wall, you don't need more than an Intel i7-5775C CPU, about 4 GB of RAM, 32-bit Windows 10, a 4K monitor or TV and a 5.1 speaker system to watch movies in 4K resolution.