Dec 12, 2018
Could you use multiple GPUs together and use a VM to connect them into a vGPU (if the Graphics Cards support virtualization) so that you can have parallel tasks running together on the multiple GPUs? (e.g. using 2 GPU and connecting them into 1vGPU)
 
Solution
That's what mining does, so yes, you can — but it's useless for gaming. In mining (and in parallel computing tasks in general) it doesn't matter if the results come back out of order; for gaming, it would be really annoying. If you cut the screen into, say, 8 equal parts and have 8 GPUs each work on one part, those parts could end up updating at slightly different moments, because not every part of the screen has the same amount of work happening.
You would have 8-way tearing going on...
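To make the difference concrete, here's a toy sketch (hypothetical, not real GPU code — workers just sleep for a random time to simulate uneven per-slice workloads): eight workers each "render" one slice of a screen. A mining-style workload can consume results in whatever order they finish, but a frame can only be displayed once every slice is ready, so the whole frame waits on the slowest worker.

```python
# Toy illustration of out-of-order completion vs. frame synchronization.
# Eight workers each "render" one slice; sleep times stand in for uneven load.
import concurrent.futures
import random
import time

def render_slice(i):
    time.sleep(random.uniform(0.01, 0.05))  # uneven per-slice workload
    return i

with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(render_slice, i) for i in range(8)]

    # Mining-style: consume results as they arrive; order doesn't matter.
    done_order = [f.result() for f in concurrent.futures.as_completed(futures)]

    # Gaming-style: the frame can only be shown once ALL slices are done,
    # so display is gated on the slowest slice (a synchronization barrier).
    frame = [f.result() for f in futures]  # in-order; blocks until each slice is done

print("completion order:", done_order)  # often shuffled
print("frame slices:    ", frame)       # always [0, 1, ..., 7]
```

Without that barrier, each slice would hit the screen whenever its GPU happened to finish — which is exactly the multi-way tearing described above.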
Look up CrossFire and SLI, unless you mean something else here.
Sorry for the late reply, but what I meant was: could vGPUs be a better way to do multi-GPU scaling? My thinking was that a game would only see one GPU (hence "virtual GPU", or vGPU) and would simply render frames to it, without needing any multi-GPU code. In reality, that single GPU would be a vGPU backed by multiple physical GPUs. So could that work — gaming on a vGPU so you don't need multi-GPU code to run a multi-GPU setup?
As a follow-up question, how well do vGPUs scale in terms of gaming?