Need advice on upgrade/Build

hoarybat61

Reputable
Mar 31, 2015
13
0
4,510
0
Need to upgrade my mini-ITX

I have a Biostar 1037U mini-ITX mobo (5+ year old tech) that accommodates a PCIe 2.0 x16 graphics card. Currently I am using the onboard video, and I want to be able to run 4K DirectX 12. If I buy a $50 graphics card, will the board's dual-core CPU still be too slow and bottleneck 4K DirectX 12 even with the card installed?

I am also considering a new embedded mobo: the ASRock J5005-ITX, an Intel quad-core Pentium Silver J5005 (up to 2.8 GHz) mini-ITX motherboard/CPU combo. That would handle it no problem, but it is a more expensive solution. Thanks for your suggestions.
 

hoarybat61
Ah, I did not realize that it was more CPU-dependent. In that case, no wonder this 5-year-old $90 embedded dual-core 1.7 GHz board can't play 4K video. So you are saying that if I buy a cheap low-profile video card such as the one below, I will see no benefit, since it only helps gaming and not 4K video... hmm. I guess I will have to go with the newer ITX board; at least I know it will handle 4K video well for a few years. Thanks to all who replied. I almost pulled the trigger on this card for the Biostar mobo's x16 slot and won't now: https://www.newegg.com/Product/Product.aspx?Item=N82E16814137199R&cm_re=low_profile_video_card_pci_2.0_express_x16-_-14-137-199R-_-Product
 
The best thing to do is monitor CPU and GPU usage while playing 4K. If your video player is maxing out the CPU and not the onboard graphics, then a new GPU isn't going to help. I believe there are some video players that can use GPU power, but they are not as common. It's been a few years since I paid attention to video player software.
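The interpretation rule described above can be sketched as a tiny helper. This is purely illustrative: the function name and the 90%/50% thresholds are my own assumptions, not values from any monitoring tool.

```python
# Hypothetical helper illustrating the advice above: if the CPU is pegged
# while the GPU sits mostly idle during 4K playback, the player is
# software-decoding, and a faster GPU alone will not help.
# Thresholds are illustrative, not authoritative.

def playback_bottleneck(cpu_pct: float, gpu_pct: float) -> str:
    """Classify a 4K-playback bottleneck from utilization samples (0-100)."""
    if cpu_pct >= 90 and gpu_pct < 50:
        return "cpu-bound"   # software decode; a GPU upgrade won't help
    if gpu_pct >= 90 and cpu_pct < 50:
        return "gpu-bound"   # hardware decode path is the limit
    return "unclear"         # mixed load; sample longer or check player settings

print(playback_bottleneck(100, 10))  # -> cpu-bound
```

In practice you would read the utilization numbers from Task Manager (or a similar monitor) while the video plays, then apply this kind of rule of thumb.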
 

hoarybat61

4K video is at 100% CPU usage and 1080p is at 75%. Looks like the ASRock J5005-ITX build will soon be underway; I just have to get one stick of SO-DIMM memory. Thanks for your help!

 

hoarybat61
Conflicting posts from here at Tom's; I think I agree with #2:

#1 Hmm, this one says the latest GPUs from NVIDIA and others will offload video playback from the CPU to the GPU:

2016: "Modern GPUs have a video decoder which is used instead of the CPU.

Yes. If you have a modern graphics card, it will take the load off your CPU. You'll be able to stream much higher quality video. Both AMD and Nvidia cards are good at this, so pick whichever you like."

#2 "All modern GPUs have hardware video decoders which make mincemeat of tasks like decoding a 1080p H.264 stream (converting the stream to video). That's why your phone can decode a 1080p video without breaking a sweat.

The fly in the ointment is Hollywood. They are paranoid that if Netflix streamed raw video to you, instead of just playing it on your screen you'd capture and save it to a file, essentially giving you a copy of the movie. Consequently, they require streaming services to implement some sort of encrypted stream before they'll license them to stream movies and TV shows.

Because Hollywood doesn't want the computer to be able to access the decrypted stream, the decryption and video decode are done immediately prior to display on the screen. Consequently, the decrypted stream can never be sent to the GPU for decoding. Instead, the decryption and decode are done inside an encrypted virtual machine running in Flash or Silverlight. So it's entirely up to the CPU to decode the video, and because the CPU isn't optimized for video decoding, it takes a lot more processing power.

This is why it usually takes an i3 or better to watch a Netflix, Amazon, or Hulu video at 1080p. This is also why there are very few TV-viewing apps for the PC (Windows Media Center was the last one to be approved by Hollywood, and it looks like Microsoft has given up on getting it re-approved for newer versions of Windows). And it's why it took longer for Android devices to get a Netflix app than iOS devices (Apple could just submit each hardware device for approval, but since Android was a hardware-agnostic platform, the software app itself needed to be approved by Hollywood).

So if you've got an unencrypted movie file sitting on your hard drive, it'll be decoded by the GPU (unless you disable hardware decode). If you're streaming it from a "legit" source like Netflix, it'll be decrypted and decoded by the CPU."
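The decode path post #2 describes boils down to a simple decision rule. This sketch only restates that 2016-era explanation; the function name and parameters are illustrative, not a real API, and actual behavior varies by player and DRM scheme.

```python
# Sketch of the decode path described in post #2 (simplified).
# DRM implementations differ; this mirrors the quoted explanation only.

def decode_path(source_encrypted: bool, hw_decode_enabled: bool) -> str:
    """Return where the decode work lands, per the explanation above."""
    if source_encrypted:
        # Per the post, DRM keeps the stream inside a protected software
        # path (e.g. Flash/Silverlight), so the CPU does the decode.
        return "cpu"
    # Local unencrypted file: the GPU's hardware decoder handles it,
    # unless hardware decode is switched off in the player.
    return "gpu" if hw_decode_enabled else "cpu"

print(decode_path(source_encrypted=False, hw_decode_enabled=True))  # -> gpu
print(decode_path(source_encrypted=True, hw_decode_enabled=True))   # -> cpu
```

Under this model, the poster's DRM-free local 4K files would benefit from a GPU with a capable hardware decoder, while DRM-protected streams would still be CPU-bound.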
 
