[SOLVED] Oldest Generation Intel for decoding x265?

Mike_368

Prominent
May 10, 2017
Does anyone have a short list of the oldest generation of Intel chips I'd need in a rig to encode/watch my x265 video? My current old laptop is NOT good enough!

Does the GPU even matter when encoding these video files?

 

dvo

Distinguished
Jan 16, 2008
Encoding x265 with a GPU gives pretty terrible quality; it's best to use the CPU for that. It's a resource-heavy task, so you really just have to go with the most powerful CPU you can afford. Intel CPUs are more efficient at encoding x265 than AMD Ryzens are; they seem to have better AVX support. No matter what, though, if you encode, say, a 4K movie-length title, it's going to take a while. I use distributed encoding across the three Ryzen 5 systems I have in my house, and it can still sometimes take hours for a movie. Something like a 1080p TV episode can be knocked out in 15 minutes or less with that much computing power, though.
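If it helps, here's roughly what a basic CPU-only encode looks like as a small Python script. This is just a sketch: it assumes you have ffmpeg built with libx265 on your PATH, and the filenames, preset, and CRF value are placeholders to tweak for your own files.

import subprocess

# Software (CPU) HEVC encode through ffmpeg's libx265.
cmd = [
    "ffmpeg",
    "-i", "input.mkv",      # source file (placeholder name)
    "-c:v", "libx265",      # software HEVC encoder, runs entirely on the CPU
    "-preset", "medium",    # slower presets compress better but take longer
    "-crf", "22",           # quality target; lower = better quality, bigger file
    "-c:a", "copy",         # pass the audio through untouched
    "output_x265.mkv",
]
subprocess.run(cmd, check=True)

Distributed encoding setups basically farm jobs like this out to each machine (usually in chunks) and stitch the results back together afterward.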

As far as decoding goes, most people say to use a GPU for that, but there's a caveat they don't seem to realize: if you do not have an HDR display, decoding an HDR title with a GPU will cause banding, because the GPU does a bad job converting the color from 10/12-bit down to 8-bit. It looks awful. If you do have an HDR display then it's a moot point, and I suggest using a GPU (I use an AMD RX 550 for this from time to time; it works great and is about as cheap as a GPU can get).
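If you ever want to deal with that banding on the CPU side instead, the usual trick is to tone-map the HDR down to SDR in software with dithering. Here's a rough sketch of one commonly used ffmpeg recipe, driven from Python; it assumes an ffmpeg build that includes the zscale (zimg) and tonemap filters, and the filenames and filter values are only illustrative.

import subprocess

# Convert a 10-bit HDR10/BT.2020 file to 8-bit SDR BT.709 on the CPU,
# with tone mapping and error-diffusion dithering to limit banding.
vf = (
    "zscale=t=linear:npl=100,"    # linearize the PQ transfer curve
    "format=gbrpf32le,"           # work in floating point to keep precision
    "zscale=p=bt709,"             # move to BT.709 primaries
    "tonemap=hable:desat=0,"      # squeeze HDR highlights into SDR range
    "zscale=t=bt709:m=bt709:r=tv:d=error_diffusion,"  # back to BT.709, dithered
    "format=yuv420p"              # 8-bit output
)

subprocess.run([
    "ffmpeg", "-i", "hdr_input.mkv",   # placeholder source
    "-vf", vf,
    "-c:v", "libx265", "-crf", "20",   # re-encode the SDR result
    "-c:a", "copy",
    "sdr_output.mkv",
], check=True)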

Since I do not have an HDR display, but I do decode 4K HDR material to watch on my 4K non-HDR TV, I use my CPU for software decoding. The lowest-end CPU I've personally found capable of doing this is my i5-4690K; it sits at about 80% usage decoding a 4K HDR file played with Plex Media Player (direct play from my server). With VLC or MPC, that CPU simply wasn't powerful enough. What I settled on was a Ryzen 5 1600; it'll play my most demanding titles at 25% CPU usage. That machine is in my living room, so at 25% usage it stays quiet.
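If you want to check whether a particular CPU can keep up before committing to it, a quick and dirty test is to software-decode a sample file to nowhere and look at the speed ffmpeg reports. Again just a sketch, assuming ffmpeg is installed and using a placeholder filename:

import subprocess

# Decode-only benchmark: nothing is re-encoded, the output is thrown away.
subprocess.run([
    "ffmpeg",
    "-benchmark",                # print CPU time used when finished
    "-i", "4k_hdr_sample.mkv",   # placeholder test clip
    "-f", "null", "-",           # discard the decoded frames
], check=True)

Watch the "speed=" figure ffmpeg prints while it runs; anything comfortably above 1x means that machine can software-decode the file in real time, with headroom left over for the player itself.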

My general personal guide for decoding x265: any Intel Core series chip from 7th generation or newer should do a fine job, as should any Ryzen 5 or better. There are other capable chips, I'm sure, but the ones I mention are the ones I have actual experience with. Some slightly older chips will manage it (like the 4th-generation i5 I mentioned), but the CPU usage will be significantly higher, so you'll end up using more electricity, building more heat, and making more noise; I don't believe that's worth it.

To literally answer your question, though: the oldest Intel generation would be a 4th-gen i5, or an 8th-gen i3. My experience is with desktop parts, though, not laptop parts; with laptop parts I would imagine you'd have to go maybe a generation or two newer. Also, don't forget that the player you use plays a big part. VLC and MPC, which everybody recommends, are terrible players for this sort of thing and very inefficient. You'll get far better performance from Plex Media Player, or even the built-in movie player in Windows 10.


I hope some of this helps. I probably used way too many words to answer your straightforward question, but if you have any more questions about it, or need me to clarify anything I've said, I'll be happy to help.
 
Solution

Mike_368

Prominent
May 10, 2017
Wow, thanks so much! I am trying to learn on the fly and take this all in, but there are SO MANY chips out there!

Just trying to decide whether to get a laptop or a used desktop to hook up to the TV. I would LOVE to spend $250 or less. I have a buddy who has a Dell XPS 8900 desktop with a GTX 750 card and a 6th-generation i5. All I really need is to hook it to my TV via HDMI and I'm set.

A laptop would be easier, but more money, I assume.

I was reading in an old forum post somewhere about the "Ivy Bridge" microarchitecture being able to decode x265 efficiently, so I've got to read into that.


I am learning so much, thanks so much for the help!!!!!