Do GPUs help with streaming videos?

DeclanDornstauder

I was just wondering, I know that GPUs help for rendering graphics in games and other programs, but would they help with watching videos and streams clearly and at high quality?
 
Solution
Modern GPUs have a video DECODER which is used instead of the CPU.

If you have a fast CPU then it's usually not a big deal, though on a laptop it should help keep fan noise down. Modern browsers often have "hardware acceleration" on by default for web site plugins, though for watching your own content with programs like K-Lite you usually have a choice, and the default I choose is NOT to use hardware acceleration.

Anyway, for most people it's the SAME general experience whether it's used or not.

Other:
QUALITY streaming of video is often more about network BANDWIDTH, with YouTube and other services throttling quality if you don't have sufficient bandwidth.
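
As a quick way to see which hardware decode paths your own system exposes (nothing the poster mentioned, just an illustrative check), you can ask ffmpeg to list them, assuming ffmpeg is installed and on the PATH:

```python
# List the hardware-acceleration methods ffmpeg can see on this machine
# (e.g. dxva2/d3d11va on Windows, vaapi/vdpau on Linux, cuda for NVIDIA cards).
# Assumes ffmpeg is installed and on the PATH.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```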

DeclanDornstauder



OK, well the CPU on the desktop is fairly weak (Core 2 Duo E4500), so would there be much improvement? I plan on adding a GT 730 and upgrading the RAM to 4GB DDR2.

 
It comes down to what you mean by streaming. If you're watching pirated, filmed-in-the-theater-on-a-phone movies off some website that ends in weird letters, then no, nothing will help that. If you're trying to stream 4K H.265 content, then it helps to have either a CPU or a GPU that can handle H.265 decoding. There is no right or wrong answer, but mostly, no, it doesn't have a big impact. You can buy a $50 PC-on-a-stick that can do Netflix at 1080p, and cell phones can do Netflix and Hulu and all that. Even if streaming is stressing your CPU and it's 80% maxed out because you don't have a GPU, all a GPU is going to do is take that load off the CPU so it drops to 10% usage. It still doesn't improve the video quality.
 
Yes and no. All modern GPUs have hardware video decoders which make mincemeat of tasks like decoding a 1080p H.264 stream (converting the compressed stream into displayable video). That's why your phone can decode a 1080p video without breaking a sweat.

The fly in the ointment is Hollywood. They are paranoid that if Netflix streamed raw video to you, instead of playing it on your screen, you'd capture and save it to a file - essentially giving you a copy of the movie. Consequently, they require streaming services to implement some sort of encrypted stream before they'll license them to stream movies and TV shows.

Because Hollywood doesn't want the computer to be able to access the decrypted stream, the decryption and video decode are done immediately prior to displaying on the screen. Consequently, the decrypted stream can never be sent to the GPU for decoding. Instead, the decryption and decode are done inside an encrypted virtual machine running in Flash or Silverlight. So it's entirely up to the CPU to decode the video, and because the CPU isn't optimized for video decoding it takes a lot more processing power.

This is why it usually takes an i3 or better to watch a Netflix or Amazon or Hulu video at 1080p. This is also why there are very few TV viewing apps for the PC (Windows Media Center was the last one to be approved by Hollywood, and it looks like Microsoft has given up on getting it re-approved for newer versions of Windows). And why it took longer for Android devices to get a Netflix app than iOS devices (Apple could just submit each hardware device for approval, but since Android is a hardware-agnostic platform the software app itself needed to be approved by Hollywood).

So if you've got an unencrypted movie file sitting on your hard drive, it'll be decoded by the GPU (unless you disable hardware decode). If you're streaming it from a "legit" source like Netflix, it'll be decrypted and decoded by the CPU.
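
If you want to see that difference for yourself on a local, unencrypted file, one rough way is to time a software-only decode against a hardware-assisted one with ffmpeg and watch CPU usage while each runs. This is just an illustrative sketch, not something from the posts above: it assumes ffmpeg is installed and on the PATH, a Windows machine where the dxva2 hwaccel is available, and a placeholder file name sample.mp4.

```python
# Rough comparison: decode a local file in software vs. via the GPU (DXVA2).
# "sample.mp4" is a placeholder; dxva2 is just one possible hwaccel method
# (run "ffmpeg -hwaccels" to see what your system actually offers).
import subprocess
import time

def decode(extra_args):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-hide_banner", "-loglevel", "error"]
        + extra_args
        + ["-i", "sample.mp4", "-f", "null", "-"],  # decode only, discard output
        check=True,
    )
    return time.time() - start

print("software decode: %.1f s" % decode([]))
print("dxva2 decode:    %.1f s" % decode(["-hwaccel", "dxva2"]))
```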
 


I used a Core 2 Duo with 2GB of RAM and onboard video as a box to run Kodi and Netflix in my living room. It handles Netflix fine and streams some of my Blu-ray rips in the area of 20GB from my NAS box with no problem. I did add a GPU, but only because I wanted HDMI output since it plugs into a home theater receiver; otherwise the onboard video was fine. I did upgrade the RAM to 4GB, but only because I got it for free.
 
Hey,
You can EXPERIMENT, but basically if your video isn't STUTTERING then there's nothing to worry about.

You can also open the TASK MANAGER (Ctrl-Alt-Del) and look at "Performance -> CPU", changing the graph to show all cores (two in your case), because you want to see if any single core is running close to 100%. If a core hits 100% you'll get stuttering. The average across two cores is useless for this scenario.
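
If you'd rather log the numbers than watch the graphs, a small script can sample per-core usage while the video plays. A minimal sketch, assuming the third-party psutil package is installed (pip install psutil); this isn't something the poster used, just one way to capture the same information:

```python
# Print per-core CPU usage once a second while your video is playing.
# If any single core sits near 100%, that's where the stuttering comes from;
# the average across cores can look fine even when one core is pegged.
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join("core %d: %5.1f%%" % (i, p) for i, p in enumerate(per_core)))
```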

You should be okay, but if any program kicks in, like antivirus, you'll have stuttering issues with HD content. I tested a slightly weaker CPU and it was having issues with HD content, but it was pretty close.

*I then added a cheap HD 6450 graphics card and installed K-Lite Standard, and then the content played just fine.

K-Lite: http://www.codecguide.com/download_kl.htm

**For K-Lite (for your own content) you'll want to enable hardware acceleration. I'm not sure what your current GPU supports, but the GT 730 should support the main formats: H.264, MPEG-2, VC-1.

Summary:
- Task Manager can show CPU usage (100% on any core is bad)
- K-Lite Standard (enable hardware acceleration on install or under "Options -> Internal Filters")
- web browser plugins should default to hardware acceleration (you can right-click to see)

Other:
If you have W7 or W8, then W10 might help. I say might, for two reasons:
a) a clean install (even just upgrading is "cleaner"), or
b) better memory management

Probably pretty similar though. I had W7 on an older PC (X2 4800+ CPU plus an HD 5450) and it's running much better with Windows 10, though I'm almost certain most of that is because the W7 install had software issues of some kind. Whatever the reason, it's working better now, and my BSODs and other issues are finally gone on all my W10 machines.
 
Netflix etc:
CPU requirements appear to be pretty low. I can't find a reference to hardware/GPU acceleration, just that it needs an x86 1.6GHz CPU (a bit vague).

Netflix does support HTML5, so it's possible that this works through the GPU decoder. I'd be surprised if it did not. I suspect it removes the DRM wrapper via the CPU and the raw video then streams through the GPU decoder, but I can't be certain.

(It may work better via the Windows 8/10 app rather than running through a browser. Performance may be similar, but either way it's probably best to leave the browser and other programs NOT RUNNING to avoid taxing the CPU.)

*Again, all that matters is whether any single core of your CPU is being maxed out; if not, no worries.
 
