[SOLVED] Intel UHD 750 vs GeForce GTX 770?

hotlips69

Distinguished
Jan 27, 2009
I really need to know how the Intel UHD 750 on the new 11th-gen CPUs compares to my ageing Nvidia GeForce GTX 770 for 4K video editing & general use.

Until I can afford an RTX 3070 in a few months, my new build will have an Intel Core i9-11900K, but I can't decide whether it's better to simply use the built-in graphics from the CPU or install my ageing GeForce GTX 770 for the best graphics performance for video editing & general use...
I don't play any games.

Thanks in advance.
 
hotlips69

Distinguished
Jan 27, 2009
Punkncat said:
The 770 will be far stronger than any of the iGPUs.

The 750 is a solid improvement over the 630, but don't expect miracles. The 3xxx Vega graphics are better for most tasks (with a gaming slant, of course).

Side note: consider a really good cooler. These puppies are hot.
Thanks very much Punkncat.

I presume you mean the GeForce 3070-series cards run very hot?
They all have multiple fans by default, though.
 

Karadjgne

Titan
Ambassador
The HD 630 found in 7th-, 8th-, 9th-, and 10th-gen Intel CPUs comes in at roughly half of a GT 1030 in performance. The UHD 730/750 is a few fps higher than the 630, generally not enough to really notice; most people can't see the difference between 50 fps and 60 fps, but the UHD will generally allow slightly higher graphics settings.

Your old GTX 770 stomps all over any iGPU, including the Vega graphics on the AMD APUs.

The best use of an iGPU is general office-type use, or in a tiny HTPC box where the only thing needed is playback of a movie collection.

Of course grandma wouldn't mind the odd game of mahjong or solitaire, but don't get your hopes up for anything else.
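To put that 50 vs 60 fps point in numbers, a couple of lines of Python do the frame-time arithmetic:

```python
# Frame-time view of the 50 vs 60 fps comparison: the per-frame gap
# is only about 3.3 ms, which is why most people can't see it.
for fps in (50, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```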
hotlips69 said:
Thanks very much Punkncat.

I presume you mean the GeForce 3070-series cards run very hot?
They all have multiple fans by default, though.
No. The CPU itself. That's where the iGPU is located, right next to the cores. With a discrete GPU you aren't running the iGPU, so CPU temps are somewhat moderate, but with an iGPU trying to push fps, that's serious added heat to the CPU. Also figure that the iGPU uses system memory, so the memory controller in the CPU is put under added stress as well.

So if you are gaming in the 55-60°C range with a discrete GPU, figure on closer to 70-75°C with the iGPU blasting away, maybe higher.
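If you want to see that delta on your own machine, here's a rough sketch, assuming Linux with the psutil package ("coretemp" is the usual Intel sensor name here, an assumption; on Windows a tool like HWiNFO shows the same readings):

```python
import time
import psutil  # pip install psutil

# Sample the CPU temperature for ~30 s while the iGPU is busy, then
# repeat the run with the discrete GPU driving the display and compare.
samples = []
for _ in range(30):
    temps = psutil.sensors_temperatures()  # sensor support is Linux-only
    cores = temps.get("coretemp", [])      # Intel sensor name; an assumption
    if cores:
        samples.append(cores[0].current)   # package temp is usually first
    time.sleep(1.0)

if samples:
    print(f"avg {sum(samples) / len(samples):.1f} °C, "
          f"peak {max(samples):.1f} °C")
```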
 
Solution

hotlips69

Distinguished
Jan 27, 2009
Karadjgne said:
So if you are gaming in the 55-60°C range with a discrete GPU, figure on closer to 70-75°C with the iGPU blasting away, maybe higher.
Thanks for the detailed answer.

If you have a discrete GPU, am I incorrect in thinking that the iGPU is used automatically for basic use & the discrete GPU only kicks in when a threshold is hit?
 

Karadjgne

Titan
Ambassador
Yes and no. In a laptop, the battery is the most important component; everything is geared towards saving power and therefore extending run time on battery. So a laptop's primary graphics adapter is always the iGPU, until a threshold is reached or the GPU is needed, and then the GPU takes over.

But a desktop PC doesn't rely on a battery and power saving is not a consideration, so the GPU is the primary graphics adapter. As such, the iGPU is not used at all, unless you actually facilitate its use by plugging a monitor into the motherboard ports instead of the discrete GPU's ports.
 

InvalidError

Titan
Moderator
hotlips69 said:
If you have a discrete GPU, am I incorrect in thinking that the iGPU is used automatically for basic use & the discrete GPU only kicks in when a threshold is hit?
Most motherboards will automatically disable the IGP, either systematically when a dGPU is installed or only when no monitor is connected to the IGP. Some motherboards have an option to let you force the IGP enabled; others don't. If you don't see it in Device Manager, it probably is disabled.
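If you'd rather check from a script than click through Device Manager, a minimal sketch like this works, assuming Windows with PowerShell available: it lists the video controllers Windows currently sees, and a board-disabled IGP simply won't appear.

```python
import subprocess

# List the display adapters Windows knows about. If the IGP has been
# disabled by the motherboard, it won't show up in this output, just
# as it won't show up in Device Manager.
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | Select-Object Name, Status"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```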