Question: Triple monitors with GTX 1050 and GTX 1060

Apr 13, 2020
Hello everyone! Excuse my English!
I have two monitors connected via DVI and HDMI to a GeForce GTX 1060 3GB in PCIe slot 1, and one monitor connected via HDMI to a GeForce GTX 1050 2GB in PCIe slot 0.
Whatever I start on the 1060, the 1050 shows usage as well. If I open Chrome on one of the monitors connected to the 1060, it shows Chrome running on the 1050, and that GPU works harder than the 1060, which is where the action is actually happening. When I watch YouTube videos it's the same story: it doesn't matter that I watch on the 1060, the 1050 gets more usage. Moving windows between the monitors behaves the same way. I connected the two monitors to the 1060 because it's more powerful than the 1050, yet the 1050 ends up doing the hardest work.
I cannot figure this out. Is it supposed to work like that? Before this, I used the same 1050 together with an old GT 610, I believe, and didn't have this problem.
I truly believe I can find a solution here. Thank you in advance for your advice! Looking forward to reading it.
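In case it helps, this is roughly how I have been checking which card is actually busy: a minimal sketch that just polls nvidia-smi (it comes with the NVIDIA driver and has to be on the PATH) and prints per-GPU utilization once per second while I move windows around. It's only an illustration, nothing official.

    import subprocess, time

    # Ask nvidia-smi for per-GPU utilization; these are standard query fields.
    QUERY = [
        "nvidia-smi",
        "--query-gpu=index,name,utilization.gpu,memory.used",
        "--format=csv,noheader,nounits",
    ]

    while True:
        out = subprocess.run(QUERY, capture_output=True, text=True).stdout
        print(out.strip())   # e.g. "0, GeForce GTX 1050, 34, 512"
        print("-" * 40)
        time.sleep(1)        # sample once per second while moving windows around

Watching that output is how I see the 1050's utilization climb even though the window is on a 1060 monitor.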
 
You could simply say you don't have any idea! Thank you anyway!

No, I said that using two different cards can cause issues, including performance problems. Again, there is almost no advantage to running two cards like this in almost all consumer applications. The difference in GPU load between driving two monitors and driving three monitors off one card during general computer work is inconsequential.

Also remember that each card works independently of the other, so the workload is not split evenly between them.
 
Apr 13, 2020

Thanks for your reply!
If the workload is not split evenly between them, why does the 1050 keep doing more work than the 1060 even when everything is running on the 1060? It acts like the app is running on the 1050 when it is actually running on the 1060, which is very strange to me. I don't want them to work together on a single task; I want them to work separately, each handling whatever is on its own monitor. When something is opened on the 1060, I think the 1060 should get the workload rather than the 1050. I don't know if that's the logic behind it or if I'm mistaken. Thank you!
 

Again, it's not that simple; you cannot assign tasks to individual cards outside of specific programs that let you do so.

Also, the 1050 will always show a higher load percentage since it's a weaker card; the 1060 can handle far more.

That's why I keep telling you to just ditch the 1050 and use only the 1060. You will get better performance overall, without the issues of running a multi-GPU setup like this.
 
Apr 13, 2020

I understand that I cannot assign tasks to individual cards outside of specific programs that allow it. But before, when I had the 1050 and the GT 610, moving Google Chrome (for example) from monitor to monitor made the GPU connected to the destination monitor take over the work: move it from the 610's monitor to the 1050's monitor and the 1050's load went up while the 610's load dropped. That's how it was, and that's the logic I expect. Now it's not like that; the 1050 always gets the load.

I am not trying to assign a specific task to the 1060. I just think that when I move a window to a monitor connected to the 1060, the 1060 should take the load and the 1050's load should drop drastically. I have reinstalled the latest driver a few times; at first it works correctly, but after a few minutes it starts behaving as I described. On the 1050's monitor I only have two small diagnostic programs open. The monitors on the 1060 get the real work, yet the 1050 shows load from that work as well, because when I stop, the 1050's load drops, even though I only use the 1060's monitors for the heavy work. I hope you can understand what I am trying to ask. Excuse my English again, and thank you for your replies and patience.