[SOLVED] Nvidia 1050ti for 4k editing

k1114

Titan
Moderator
A GPU does a lot in video editing and can be the most important component, but it does depend on the software. The good editors use it because it can be 10x+ faster than a CPU doing the same work.

Photo editing requires a lot less processing power, so weak GPUs are sufficient, but I wouldn't say the GPU is doing almost nothing. Depending on what you're doing, the GPU can be doing all of the work, so it can be the most important part once again.
 

The GPU is used for decoding and encoding the video: basically, decompressing the video from its stored state to the displayed state, and compressing your newly edited displayed state to create a new stored state.

The decode stage is standardized and relatively trivial. The GPU on your phone can do it in real time.

The encode stage is a lot harder, since the compression algorithm first has to search over many possible ways of compressing each video segment to figure out which works best, before finally compressing the video. This is where a beefy GPU helps the most. But for 1-minute videos, it just means the final encode step of your editing process takes 10 minutes instead of 1.
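The search-versus-replay asymmetry shows up even in general-purpose compressors. A rough Python analogy using zlib (not a video codec, and the compression levels here are not codec presets; it just illustrates why encoding is the expensive direction):

```python
import zlib

# Encoding searches for a good representation (higher levels try harder),
# while decoding just replays the decisions already stored in the stream.
data = b"the quick brown fox jumps over the lazy dog " * 20000  # ~880 KB

fast = zlib.compress(data, level=1)  # little searching, usually bigger output
best = zlib.compress(data, level=9)  # much more searching, smaller output

# Decoding is the same cheap replay no matter how hard the encoder worked:
assert zlib.decompress(best) == data
print(len(fast), len(best))
```

The same idea is why a phone can decode 4K in real time while encoding it well takes serious hardware: decode effort is fixed by the format, encode effort is open-ended.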

The GPU (and RAM and a good SSD) can also help if you're editing together lots of different video clips. But I can't see that being much of an issue with 1 minute videos.

Unless color grading is some processor-intensive function that is massively sped up by the GPU, I would think a 1050 Ti would be overkill for 1-minute videos. I suspect that's what the OP's question really is. I'd be curious whether anyone has done color grading of 4K videos with/without a GPU, and how much GPU is necessary to do it in a timely fashion.
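For a sense of why grading can lean on the GPU: a grade is mostly the same small piece of math repeated once per pixel. A toy Python sketch (a hypothetical gamma lift on one 8-bit channel, not any editor's actual pipeline):

```python
# Toy per-pixel "grade": a gamma lift on one 8-bit channel.
# Purely illustrative -- real graders work in wider bit depths.
def grade_pixel(v, gamma=2.2):
    return round(255 * (v / 255) ** (1 / gamma))

row = [0, 64, 128, 255]  # tiny stand-in for one channel of one scanline
graded = [grade_pixel(v) for v in row]

# A UHD frame is 3840 * 2160 = 8,294,400 pixels; at 30 fps that's roughly
# 250 million of these little calculations per second, which is why a GPU
# (thousands of pixels at once) beats a CPU loop for live grading.
print(graded)
```

The math per pixel is trivial; the pixel count is what makes the GPU matter.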

Photo editing requires a lot less processing power, so weak GPUs are sufficient, but I wouldn't say the GPU is doing almost nothing. Depending on what you're doing, the GPU can be doing all of the work, so it can be the most important part once again.
It's barely used at all for photo work. A GPU from the 1990s could do it; they just didn't design GPUs back then for today's 4K monitors.

In photo editing, the GPU is only used to accelerate a few filters. Most filter effects will apply in a few seconds anyway, so it becomes the difference between waiting a few seconds to see the filter effect, and the filter effect showing up almost immediately.
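Those filters are mostly per-pixel neighborhood math, which is exactly the shape of work a GPU accelerates. A minimal sketch (a 1-D box blur, purely illustrative, not any editor's implementation):

```python
# Minimal 1-D box blur: each output pixel averages itself with its
# neighbors. Every output depends only on nearby inputs, so a GPU can
# compute all pixels simultaneously; a CPU walks them one at a time.
def box_blur(row, radius=1):
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) // (hi - lo))
    return out

print(box_blur([0, 0, 90, 0, 0]))  # -> [0, 30, 30, 30, 0]
```

On a small photo either processor finishes in a blink; on a 4K image or a big filter radius, the per-pixel count is what opens the gap.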

If you shoot in the camera's RAW mode, a GPU can also speed up decompression of RAW images. The RAW formats of the different camera manufacturers are proprietary, so there are no standardized hardware decoders like there are for video. You have to rely either on the CPU to decompress them one pixel at a time (slow), or on a GPU, which can decompress a bunch of pixels in parallel (fast).
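The parallel win described above can be sketched in plain Python. This is not a real RAW decoder; the tile data and `decode_tile` function are made up, and Python threads won't actually speed up CPU-bound work, but the fan-out pattern is the same one a GPU applies across thousands of lanes at once:

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for the per-pixel demosaic/decompress math (purely fake).
def decode_tile(tile):
    return [v * 2 for v in tile]

# A fake sensor readout split into independent tiles -- independence is
# what lets the work be handed out in parallel instead of pixel by pixel.
tiles = [[1, 2], [3, 4], [5, 6]]
with ThreadPoolExecutor() as pool:
    decoded = list(pool.map(decode_tile, tiles))

print(decoded)  # -> [[2, 4], [6, 8], [10, 12]]
```

The CPU's serial path is the same loop without the pool; the GPU's advantage is simply having vastly more lanes to hand tiles to.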
 

k1114

Titan
Moderator
There's also the preview window. When you scroll through the video and it stutters worse than vsync on a bad day, it's a pain to edit on. Then there's the software that only uses the CPU, where you wait 10 minutes for 1 second of scrolling. I didn't say a 1050 Ti wasn't sufficient, but saying the GPU does very little or nothing for video/photo editing isn't always correct. I know you didn't mention a '90s GPU here, but if the GPU really did do very little, then a '90s GPU could work, and it would only work if you like torture or slow internet.

Try applying filters on something bigger than your normal 5x7 photos and see how many minutes a UHD 620 takes. Even something from Sandy Bridge, 8 years ago let alone 20 years ago, will have some trouble editing larger projects depending on what you are doing, and I'm not talking about filters. When your panning blocks up or goes slow, it's not something you want to work on. That canvas is GPU accelerated; it's not just filters. How do you think 4 MB of SGRAM (that's late '90s) will handle even a compressed 6 MB JPG?
 
