Question: Do I even need a graphics card with this CPU?

PewterScreaminMach

Distinguished
Nov 18, 2010
I am spec'ing a desktop for work. The primary function - aside from typical office work - will be editing and rendering photos and video up to and including 4k. I will be using Lightroom, Photoshop, and Premiere. There will be no gaming.

While it wouldn't be my first choice, they currently want to stick with a Dell workstation for consistency reasons, which is understandable based on other factors.

The best processor available for this workstation while keeping the computer within budget is the i9-10980XE, which appears to be a monster according to overall benchmarks.

So my question is: would I even need to spend the extra on a separate video card in this case?

My current home computer has an 8700K and 1080 Ti because I game a bit but I have had zero issues with these editing/rendering tasks when working from home. That said, I don't know how involved the 1080 is getting during those tasks.

My understanding is that a higher end modern processor should handle these particular tasks pretty well on its own if all other components are up to snuff (enough memory, fast HD, etc.), but I don't know if that's accurate.

To give even more info, the highest level single video card that can be added to this build within budget would be an RTX 4000 8GB. The RTX 5000 16GB could potentially happen if the cost:reward ratio was high enough but that's a huge jump in price. Or I could possibly downgrade the processor a little bit as a partial trade-off so the video card upgrade isn't so huge.

Thanks for any advice or input.
 

RodroX

Respectable
Aug 4, 2019
"The best processor available for this workstation while keeping the computer within budget is the i9-10980XE, which appears to be a monster according to overall benchmarks."

Yeah, Intel had to cut prices a lot with the 3xxx Threadrippers around. Funny how things change once there's some competition.
 

neojack

Prominent
Apr 4, 2019
At home, you could test your usual tasks by removing your current GPU, activating the iGPU of your 8700K, plugging the screen into the motherboard, installing Intel's GPU drivers, and timing how long your usual tasks take to complete before/after.

That way you could be sure of the effect of a GPU on those tasks.
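
If you want that before/after comparison to be more than a stopwatch guess, a tiny wrapper like this makes the timing repeatable. This is only a sketch: the command shown is a dummy placeholder, so substitute whatever scripted export or batch job you actually run.

```python
import subprocess
import sys
import time

def time_command(cmd):
    """Run a command and return its wall-clock duration in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

if __name__ == "__main__":
    # Placeholder workload -- substitute your real export/render command here.
    elapsed = time_command([sys.executable, "-c", "pass"])
    print(f"Run took {elapsed:.2f} s")
```

Run it once with the 1080 Ti installed and once on the iGPU, and compare the printed times.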

I recommend backing up your system beforehand with Macrium Reflect Free, so you won't have to reinstall your drivers etc. when you are done with the tests.

Anyway, you would need a GPU as tennis2 said, but if it's just for display, you can get away with a $50 card.

On a side note, Threadrippers tend to have a better price/performance ratio than those extreme Intel CPUs. Don't know how much Dell sells them for, though.
 

Karadjgne

Titan
Ambassador
Much depends on the software you use and how it's used. Most people will encode using NVENC, letting the GPU do all the legwork, as that's what a GPU is designed for in the first place: rendering frames. They'll get better time performance from a strong GPU and a milder CPU, which also happens to double as a gaming PC. It's only when the PC is set up to do everything on the CPU that the GPU takes a back seat, and that takes an absurd amount of CPU power.

The Quadro-class GPUs are designed specifically for content-creation apps like Photoshop, as their CUDA core count works in the apps' favor and alleviates the need for a huge CPU and supporting hardware.

GPUs are not just for gaming; that's just an added bonus. As @geofelt linked above, doing some decent research into exactly what's going to benefit you, the apps, and the PC best would be a wise amount of time spent. What you think you desire is often not what's going to be better for your actual needs.

Another reason to split up the workloads is speed and required cooling. At stock, a heavy threaded workload on a 10980XE will be pulling just under 200 W-ish, with the cores running a lot closer to 3.0 GHz. A 10900K will be running at about the same wattage, except at closer to 5.0 GHz: serious savings in time. If you lock the cores to 4.8 GHz on the 10980XE to get close to 10900K speeds, power consumption jumps far closer to, if not above, 350 W. Kiss air coolers goodbye, kiss AIOs goodbye, welcome to the wonderful world of full custom-loop liquid cooling and, at minimum, a 1000 W good-quality PSU. And that's not even considering the $900 price tag of the CPU, the $700+ price tag of a supporting motherboard, $700+ worth of liquid cooling, and a full-tower case capable of supporting at least 2x 280 mm radiators and the loop.
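
For what it's worth, a power jump like that roughly tracks the usual dynamic-power rule of thumb, P scaling with frequency times voltage squared. The voltage values below are hypothetical guesses, not measured 10980XE numbers; the point is only how quickly the estimate blows past air-cooling territory once you raise both clocks and voltage.

```python
def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Rough dynamic-power estimate: P scales with f * V^2."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical: ~200 W all-core at 3.0 GHz / 0.95 V,
# pushed to 4.8 GHz at a guessed 1.20 V.
est = scaled_power(200, 3.0, 4.8, 0.95, 1.20)
print(f"Estimated all-core draw: {est:.0f} W")  # well above 350 W
```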

Yes, things can get out of hand with one click of the mouse and one seemingly small decision like "lock cores" in the BIOS.
 

tennis2

Honorable
I have a friend who's a graphic designer. He's used Intel integrated graphics in Photoshop/Lightroom and was satisfied with the performance. It really depends on what your usage is.

I think that conclusion starts to change when you start talking about Premiere.
 

Karadjgne

Titan
Ambassador
and was satisfied with the performance.
That's a very broad and argumentative word, "satisfied." That's going to be determined by each individual's sense of what exactly constitutes satisfaction.

It's easy to be satisfied with an average hamburger; it's doable. But it's not in any way a rib-eye steak.

Trying to render a 4K video of any real length on an iGPU is a lesson in futility unless you have all day to do nothing else.
 

tennis2

Honorable
^ Exactly. Hence why I was sure to include "depends on what your usage is" for Photoshop/Lightroom and "the argument changes when you include Premiere," since that's video editing now.

I suppose the nice thing about an i9-10900K is that you can TRY it with just the IGP and see how/if that works.
 

PewterScreaminMach

Distinguished
Nov 18, 2010
Thanks again for all of the replies and info above.

That's a very broad and argumentative word, "satisfied." That's going to be determined by each individual's sense of what exactly constitutes satisfaction.

It's easy to be satisfied with an average hamburger; it's doable. But it's not in any way a rib-eye steak.

Trying to render a 4K video of any real length on an iGPU is a lesson in futility unless you have all day to do nothing else.
Fair point here, so I should clarify. The last video I created in Premiere that falls into this "satisfied" category was shot in 4K 30p, comprised three or four original video files (all around a minute or less), edited with splices/FX/other editing in "full quality" (not the 1/2 or 1/4 setting), and the finished product was about a minute long.

In this case on my home computer (8700K with 16GB RAM and 1080 Ti), all video and audio editing was seamless with no freezing/skipping/hesitation and the final rendering in 4K 30p took maybe 20 seconds.

Again, it was a fairly simple piece with only a few original clips to pull from, so I'm aware that a much longer video comprising dozens of original clips could be a different story. But most of what I'm working on will likely be 5 minutes or less, probably more like one minute or less typically. And I probably won't be working on videos that require dozens of long video files to pull from (although, who knows what the future holds?).

So I guess my definition of "satisfied" comes down to the computer not slowing me down in any way during the editing process and then taking no more than half the final video's runtime to do the rendering. Generalizations, of course, but that would satisfy me for this purpose.
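
That yardstick is simple enough to write down explicitly; the numbers below are the roughly 20-second render of the roughly one-minute clip described earlier:

```python
def render_is_satisfactory(render_seconds, video_seconds):
    """True if rendering took no more than half the finished video's runtime."""
    return render_seconds <= video_seconds / 2

# The home-PC example: ~20 s render of a ~60 s clip clears the bar.
print(render_is_satisfactory(20, 60))  # prints True
```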
 

PewterScreaminMach

Distinguished
Nov 18, 2010
The HD-630 in the 10900k is the same HD-630 in the 8700k, so you gain nothing but what the cpu can bring to the table, which is considerable when talking about the actual amount of work a 10c/20t can do vs a 6c/12t at roughly the same speeds. Just bring on the cooling 👍
Well, I did some more "real world" testing on my home computer in Premiere while monitoring in Task Manager, per recommendations here (reminder: 8700K CPU and 1080 Ti 11GB GPU with 16GB system memory).

I loaded up the previously mentioned project, then added another dozen or so clips (all 4K 30fps as well) to the project to see how it affected memory. A bit over 10 GB of system memory was used at peak throughout the process.

Used five total clips in the video with one main 65-second clip as the foundation and the others as shorter/trimmed ones stacked on top of it (some in-screen, some dissolved transitions). Added a few 3D-type transitions on top of the dissolves that were already there, then added "Stylizing" effects to tracks, including one to the entire 65-second track.

As you might imagine, things took a serious performance dive here compared to the previous version which was three tracks and no stylizing effects other than a few dissolve transitions.

Couldn't preview at full quality in any way. Half quality was still pretty skippy when it hit the heavier effects. Quarter quality worked fine.

What's interesting to me (being a novice with this stuff) is that at no point did the GPU take on any significant level of the work. I think it maxed out at 3% usage while the CPU was pegged between 75% and 100% the entire time I was trying to preview the video, meaning the CPU was taking on the entire workload. [EDIT: After reading some articles, I now understand this is how it always works, as Premiere doesn't use the GPU for live preview, only rendering]

I then rendered the video to see what happened. Interesting results.

CPU (8700K) hovered around 70-75% most of the time with a few 100% spikes while GPU was barely used during that time. Occasionally, the CPU would drop down to maybe 30% and the GPU would spike at a max of around 30-35%, which I'm guessing was during the peak "FX" requirements. Overall render time on the 65-second video went from well under 30 seconds in the initial version to about a minute for the "final" effects version.

So during the entire editing/preview process at 1/4 quality and then rendering, the CPU typically hovered around 70-75% with 100% spikes and the GPU was either doing nothing or occasionally spiked to around 35% but only at a few points while rendering.
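
For anyone who wants a repeatable log instead of eyeballing Task Manager, something like this polls nvidia-smi once a second and parses its CSV output. It's only a sketch: it assumes nvidia-smi is on your PATH, and the query fields shown are the standard ones.

```python
import subprocess
import time

def parse_utilization(csv_text):
    """Parse 'utilization.gpu, memory.used' CSV rows (noheader, nounits) into (util %, MiB) pairs."""
    samples = []
    for line in csv_text.strip().splitlines():
        util, mem = (field.strip() for field in line.split(","))
        samples.append((int(util), int(mem)))
    return samples

def poll_gpu(samples=10, interval=1.0):
    """Print GPU utilization and memory use every `interval` seconds while a render runs."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        print(parse_utilization(out))
        time.sleep(interval)
```

Kick off `poll_gpu()` in one window, start the render in Premiere, and you get a timestamped-by-row record instead of a lucky screenshot.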
 

PewterScreaminMach

Distinguished
Nov 18, 2010
Some of your apps can use the CUDA cores of nvidia graphics cards to help performance.
The power of the cards is not as important as the number of CUDA cores it has.
Here is a discussion of recommendations for photoshop:
https://www.pugetsystems.com/recommended/Recommended-Systems-for-Adobe-Photoshop-139/Hardware-Recommendations
I just wanted to thank you for posting this link. At first I only quickly perused it since it was for Photoshop, but as I dug deeper on that site, I found some more great articles focused on Premiere Pro, which I'm figuring out is actually my primary concern. I've yet to see any other site or article present such current hardware testing and results so clearly, with enough tech detail to be useful without going so deep that it gets over my head. They do a great job of separating out the CPU vs. GPU question and addressing live preview vs. rendering results, etc. So thanks again.
 

tennis2

Honorable
This was at least 5 years ago, but I seem to recall having to go through a somewhat convoluted process to enable CUDA acceleration (for non-quadro GPUs) in Adobe software. Maybe GPU acceleration isn't enabled in your case? Might be a good idea to at least look up the setting and check to make sure it's enabled.

Either way, I'm interested in your findings.
 

PewterScreaminMach

Distinguished
Nov 18, 2010
This was at least 5 years ago, but I seem to recall having to go through a somewhat convoluted process to enable CUDA acceleration (for non-quadro GPUs) in Adobe software. Maybe GPU acceleration isn't enabled in your case? Might be a good idea to at least look up the setting and check to make sure it's enabled.

Either way, I'm interested in your findings.
I planned on doing exactly that when I have time. I would hope that in 2020 it would be pretty automated, with such pro-level software recognizing your key hardware and enabling settings to take advantage of it, but I never assume that's the case.
 

Karadjgne

Titan
Ambassador
And what really sucks is all this research means nada if you use a different program. It would be nice, as you say, if they automated some of this, but then you run into people who like Premiere Pro for one task, Maya for another, Sony Vegas Pro for another, etc. Different software might have one setting that's adjustable where other software doesn't, which can change requirements totally.

And yes, Puget Systems is the single best source of info for content creation and PC requirements and performance that I know of. There might be others, but I can actually read their articles, unlike others whose technical jargon might as well be written in Court Mandarin.
 

PewterScreaminMach

Distinguished
Nov 18, 2010
And what really sucks is all this research means nada if you use a different program. It would be nice, as you say, if they automated some of this, but then you run into people who like Premiere Pro for one task, Maya for another, Sony Vegas Pro for another, etc. Different software might have one setting that's adjustable where other software doesn't, which can change requirements totally.

And yes, Puget Systems is the single best source of info for content creation and PC requirements and performance that I know of. There might be others, but I can actually read their articles, unlike others whose technical jargon might as well be written in Court Mandarin.
Luckily I'll likely be using Premiere Pro, Lightroom, and Photoshop pretty exclusively for the purposes of this system. I'm not saying it's impossible that I might incorporate others in the future but those three should comprise the bulk of the work going forward as far as I can see (I don't see my limited skills requiring any more than that any time soon, honestly).

And I'm thrilled to have discovered Puget through this post. What a wealth of info. Not sure my company will go for it, but I will definitely float the idea of ordering the computer through them (any similar companies worth comparing?). I'd really prefer this not be a Dell workstation based on past experience, but you know how companies can be set in their ways and put a lot of stock in consistency of brand across the company. It has its merits.

If it was entirely up to me, I'd buy the components and put the system together myself to max out value/performance, as I do with my home computers, but I'm nearly positive they won't go for that option. Too bad because it would save them a bunch of money and allow me to know I'm getting the best performance within the budget.
 

tennis2

Honorable
If your company is large enough, they likely get a pretty good discount on whichever OEM they're working with.

For purposes of building your own, that's almost always a no-go unless you're working for a very small company.

Depending on company size, they may also have disk images and specific encryption/security features that need to be present/installed, not to mention any hardware compatibility requirements, etc. It all gets pretty convoluted for a consumer-level DIYer: lots of requirements and limitations that don't make practical sense.
 

Karadjgne

Titan
Ambassador
The thing about companies like Puget and Dell, when dealing with a company purchase, is the warranty. Take a card like an Nvidia Quadro: it's not all that big or powerful compared to its GTX/RTX brethren, BUT it can be set up under a warranty where, IF there is an issue, you have a direct line to Nvidia's Quadro tech department and possibly a mobile tech on a conference call who will come to your place of business and either fix the PC on the spot or replace the card, on the spot, no charge. If your GTX/RTX goes belly up, it's weeks away for an RMA, IF it's still under vendor warranty; otherwise you have to deal with the manufacturer's warranty.

That's the difference. With Dell and Puget, a company PC bought through them has 24-hour tech support, an upgraded warranty process, all-the-bells-and-whistles coverage, even full replacement value in case of catastrophic failure. Stuff you build yourself... ehh, not so lucky.

Most people have one PC. Some families might have 2 or 3, but it stops there. There won't be upgrades in the next 4-5 years, so whatever you have is a dead end. Company PCs are different, and that's what Dell is counting on: not an individual sale, not the warranty work, not any repair bills, but the 10-100+ PCs your company might buy, the service contracts, upgrades for 100+ PCs, software purchases, multiple license agreements, and the networking systems and servers to support that many PCs. THAT'S big money, and it's consistent. Home PCs are a dead end after purchase. So a Puget/Dell PC, company bought, had better make you happy, and they will want to make you ecstatically happy, because they are hoping you'll sign on for a 100-PC support contract.
 

PewterScreaminMach

Distinguished
Nov 18, 2010
Yeah, we're a fairly small company and it's not out of the realm of possibility that they'll at least consider the best "bang for your budget" option. It's not likely but they know that the budget is very small for what the computer needs to handle and will potentially defer to my better judgement, at least to some degree. They understand there are a few computers around the shop that Dell is not best-suited to provide (CAD-related), so those have been from other companies in the past.

Regardless, I've received some great info and leads here, so thanks to everyone. For the hell of it and out of curiosity, I've done some more testing on my home computer (again, 8700K and 1080 Ti). This particular test used three 4K clips at around 36 seconds total time with all three running the full length (PiP). Since reading about the GPU acceleration usage for Premiere Pro, I wanted to focus on that. Here's the layout:

- a "main" clip with blur in and out and a Lumetri Cinespace Faded Film filter on the entire thing (GPU-accelerated effect)

- a 25% lower left picture-in-picture of the second clip with Lumetri Fuji F125 filter on the entire thing (PiP and filter both GPU-accelerated effects)

- a 25% upper right picture-in-picture of the third clip with color correction on the entire thing (PiP and color correction both GPU-accelerated effects)

This is a screenshot of Task Manager's Performance tab and CoreTemp while live-previewing the entire thing at full quality in Premiere Pro (snapped mid-preview):

https://ibb.co/DWd1jLH

And this is a screenshot from near the end of the 4K render, which took about 36 seconds for the 36-second video:

https://ibb.co/wRHZgcQ

I just found it interesting to test and post results from this particular setup. All video shot with a Nikon Z6 at 4K 30p.

Oh, and I also have an extra 16GB of this same memory on its way to help future-proof the setup for a while (hopefully makes it last a few extra years).
 
Last edited:

PewterScreaminMach

Distinguished
Nov 18, 2010
I wanted to thank everyone for their input and advice and also give the update that I was, indeed, able to go with a custom Puget Systems option for this purchase. I will be eagerly awaiting its arrival...

Specs are:

10900K
Z490 Vision D
32GB RAM
2060 Super
970 Evo Plus 1TB main drive
6TB WD Black storage drive (already have separately)

Being that I also work from home sometimes, it will certainly be interesting to see how this system compares in real world use to my home computer, particularly with Premiere Pro being the software that will tax each system the most. Home computer is:

8700K
1080 Ti
860 Evo main drive
other specs comparable to new work system

Thanks, again!
 
