Question: Using one graphics card to render while the other handles display output

Feb 21, 2022
So I got an RTX 3070 Ti, but my monitor is only compatible with it through HDMI at 1080p@60Hz. When I used my old GTX 1060 this was fine, because it has a DVI-D (dual link) output, which supports 1080p@144Hz.
I want to render my games on my RTX 3070 Ti but output to my monitor through my GTX 1060.
I know this is possible because I got it to work once for Overwatch, but it was not happy: my monitor went black multiple times, among other strange things. But once that stopped, it was running at 144Hz on my monitor, rendering on my RTX 3070 Ti but displaying through my GTX 1060. Is there a way to achieve this again?

Oh, also, I'm not buying an active DVI-D to DisplayPort adapter; they are around £80.
 
Go to Settings -> System -> Display (or wherever your display settings are) -> Graphics. Then click on an app, select "Options", then choose which video card you want:
[screenshot of the Graphics settings page]


Note this is a bit different on desktops; there'll be another option called "Specific GPU".
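For anyone who prefers to script this instead of clicking through Settings: the Graphics page stores its per-app choices as string values under HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences (Windows 10 1803+), one value per executable path. A minimal Python sketch, assuming that key layout; the exe path in the demo would be whatever game you want to pin:

```python
# Sketch: write a per-app GPU preference straight to the registry key the
# Settings -> Graphics page uses (Windows 10 1803+).
# 0 = let Windows decide, 1 = power saving GPU, 2 = high performance GPU.
import sys

def build_gpu_preference(mode: int) -> str:
    """Build the value string the Graphics settings page stores."""
    if mode not in (0, 1, 2):
        raise ValueError("mode must be 0, 1 or 2")
    return f"GpuPreference={mode};"

def set_app_gpu_preference(exe_path: str, mode: int) -> None:
    """Write the preference for one executable (Windows only)."""
    if sys.platform != "win32":
        raise OSError("UserGpuPreferences only exists on Windows")
    import winreg  # imported lazily so the sketch loads on any OS
    with winreg.CreateKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\DirectX\UserGpuPreferences",
    ) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          build_gpu_preference(mode))

print(build_gpu_preference(2))  # GpuPreference=2;
```

Once written, the choice shows up in the Graphics settings page as if it had been set there; the game usually needs a restart to pick it up.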
 
Yeah, I don't have this option; it just shows my GTX 1060 for all options. I got it to work, but it relies on outputting my second monitor from my RTX 3070, making it my primary display in Windows, and then in each game manually choosing my other monitor to run the game on. If a game lacks this feature, I can't use my RTX 3070 on my main (good) monitor. 🙁
 
DVI-D does not need an active adapter since it's already digital.
What are you hoping to achieve by using two GPUs? I see no benefit in gaming, only in specific apps, not games.
 
While the screenshot I took was from my laptop, I did do this with my desktop with the monitor plugged into a GT 1030 (the other card is an RTX 2070). I had to use DDU to reinstall the drivers so Windows could see both cards properly, but otherwise it worked after that.

No, single link DVI goes up to 1080p@60Hz and dual link DVI goes up to 1080p@144Hz. An active adapter cannot change the bandwidth; it just changes the analog signal to a digital signal. At least the ones I know and use.
You have to use an active converter from DVI to DP since DP uses a different signalling method. DP to DVI can be done passively, as DP has a mode to output DVI-compatible signals.
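The link limits can be sanity-checked with a little arithmetic: single link TMDS tops out at a 165 MHz pixel clock, dual link at 330 MHz, and the raw pixel rate (width × height × refresh, ignoring blanking) is a hard lower bound on the pixel clock a mode needs, so it's enough to rule a mode out. A quick Python check:

```python
# Back-of-the-envelope check of why single link DVI can't do 1080p@144Hz.
SINGLE_LINK_MHZ = 165.0  # single link TMDS pixel clock limit
DUAL_LINK_MHZ = 330.0    # dual link doubles the TMDS pairs

def raw_pixel_rate_mhz(width: int, height: int, hz: int) -> float:
    """Minimum pixel clock in MHz, ignoring blanking intervals."""
    return width * height * hz / 1e6

r60 = raw_pixel_rate_mhz(1920, 1080, 60)    # ~124 MHz: fits single link
r144 = raw_pixel_rate_mhz(1920, 1080, 144)  # ~299 MHz: needs dual link
print(f"1080p@60:  {r60:.1f} MHz, single link ok: {r60 < SINGLE_LINK_MHZ}")
print(f"1080p@144: {r144:.1f} MHz, single link ok: {r144 < SINGLE_LINK_MHZ}")
```

Real modes add blanking on top of the raw rate (so the true pixel clocks are higher still), but even this lower bound already puts 1080p@144Hz well past what single link can carry.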
 
I get the why, regarding somewhere to plug in your legacy monitors. However, you are trying to use OS features designed for switching between a laptop's discrete and integrated graphics, which aren't really suited to your purposes (i.e. switching between discrete and integrated graphics on the same laptop port output is why those settings exist). Multi-card systems in their heyday typically used dedicated proprietary link interfaces (like NVLink or SLI) to pass that data at the driver level, and they still took special configuration. However, Nvidia and ATI have both moved away from that setup because it is a mess, and current-gen mainstream cards are powerful enough on their own not to warrant it. I would strongly suggest going to a single card and running converters. Your overall performance will be better, and you will end up fighting driver support forever trying to mix those two cards.
 
OP has not mentioned that the system is a laptop; @hotaru.hino was talking about his own laptop.
 

I've tried DDU and NV clean install, but I think the drivers might be the problem, because both cards use the same driver package, so Windows might be confused.
Also, I got this to work with Monster Hunter World, as that game lets you swap monitors in the settings. So I know the results I want are achievable.
 
I used a single driver package to get both working as I described.

The thing about DDU they probably don't mention is that after cleaning the system of drivers, do not have the computer connected to the internet when you reboot back into Windows. Windows will try to find drivers and install them, which will screw things up.
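To confirm that Windows actually sees both cards after the reinstall, you can list the video adapters from a Command Prompt with `wmic path win32_VideoController get name` (wmic is deprecated on newer builds; `Get-CimInstance Win32_VideoController` in PowerShell does the same). A small Python sketch that parses that command's plain-text output; the sample text below is illustrative, not captured from a real machine:

```python
# Sketch: pull adapter names out of the plain-text output of
# `wmic path win32_VideoController get name` to check that Windows
# currently enumerates both GPUs.
def parse_wmic_names(output: str) -> list[str]:
    """Extract adapter names, skipping the 'Name' header and blank lines."""
    lines = [ln.strip() for ln in output.splitlines()]
    return [ln for ln in lines if ln and ln != "Name"]

# Illustrative sample of what the command prints when both cards are seen.
sample = """Name
NVIDIA GeForce RTX 3070 Ti
NVIDIA GeForce GTX 1060 3GB
"""
print(parse_wmic_names(sample))
```

If only one name comes back after the driver reinstall, Windows never enumerated the second card, and no Settings page or game option will be able to offer it.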
 

Thanks, man, I'll try that now.
 
That's what he wants: DP from the 3070 to DVI on the monitor.
Right, and I seem to have missed the other points of your post. But I'll just focus on this part:

An active adapter cannot change the bandwidth; it just changes the analog signal to a digital signal. At least the ones I know and use.
Assuming the output of the adapter is dual link DVI, then yes, an active DP to DVI adapter can output 1080p@144Hz. Otherwise explain this: https://www.amazon.com/StarTech-com-DisplayPort-DVI-Adapter-Dual-Link/dp/B00A493CNY

The only thing that makes an active adapter "active" is that it uses "active" components, i.e., IC chips that require additional power to do the conversion.
 
Maybe I am not understanding what you say, or something is lost in translation (since I am not a native English speaker), but I don't think we disagree. I never said that an active adapter cannot output 1080p@144Hz. I said that it cannot make a single link DVI output 1080p@144Hz. In other words, it cannot make single link DVI perform like dual link DVI.
 
Looking back at what you quoted originally, I think you got confused. OP has a dual link DVI output. They're not trying to convert a single link DVI output into a dual link one or equivalent. They just don't want to spend a non-trivial amount on a converter that's compatible with dual link DVI.

At the end of the day, sure, we agree on the same thing, but I'm also nitpicking about the finer details.
 
The following quotes contradict each other.
it has a DVI-D (dual link)
unfortunately DVI-D does require an active adapter to convert to DisplayPort if you want 1080p@144Hz; you can use a regular adapter but it only goes to 1080p@60Hz
If he has dual link then it would output 144Hz; if he has single link it will output 60Hz.
I was not confused. I believe OP does not have dual link; otherwise a regular DP (GPU) to dual link DVI (monitor) converter would have sufficed.
 

Only my GTX 1060 has a DVI dual link output, and my monitor can only run at 144Hz with a DVI-D input.
 
Then the adapter should have output 144Hz and not 60Hz. In any case, the solution is to buy a new monitor with proper connections and sell the 1060 and the old monitor (either or both). Less power consumed and no adapters involved.

Also, please provide your monitor model and the exact model of the 1060.
 

I want to keep my GTX 1060 for Linux and virtual machines. The only adapter I can get that does 1080p@144Hz is an active adapter; look at the Amazon listings for the regular DP to DVI-D, it's never 144Hz. And regardless, when I upgrade to a new monitor, I'm going to want to use my current one to its full capacity as a second monitor.
 
Since you do want to use both GPUs, it's better if you also list your full system specs, including the make and model of your PSU.
As you are interested, my specs are:
CPU: Ryzen 3600
Mobo: ROG STRIX B550-A GAMING
GPU: RTX 3070 Ti (top PCIe slot)
GPU: GTX 1060 3GB
PSU: Corsair 850W gold rated
Monitor: Acer Predator GN246HL 24" LED Gaming 144Hz Monitor

I also have an old 60Hz TV connected to my GTX 1060.
Currently I'm just using my Acer monitor at 60Hz through HDMI to my RTX 3070.