Exclusive: DirectX 12 Will Allow Multi-GPU Between GeForce And Radeon

Just as the implementation of these different distributed rendering techniques will be left up to the developers, won't the use of mixed GPUs, especially cross-vendor, be up to the GPU manufacturers? I doubt AMD and Nvidia will allow such a configuration. Suddenly a less expensive AMD flagship combined with an inexpensive Nvidia CUDA/PhysX card would gain a lot more traction, I would imagine.
 

John Wittenberg

Reputable
Mar 9, 2014
159
0
4,710
Yep - Nvidia wrote the ability to use any of their cards as a PhysX card alongside a beefier AMD GPU as primary out of their drivers years and years ago. I highly doubt Nvidia will play ball - but stranger things have happened.
 

Maddux

Reputable
Feb 24, 2015
5
0
4,510
I'm excited about this, as it means you can always use your last two video cards to give yourself a nice boost in performance. That way I'm not wasting money buying two of the same cards to set up SLI or Crossfire, only for both to become antiquated at the same time. Just use your newest card as the master.

My question is: is DX12 smart enough to use this to give any boost at all to older games, or does it strictly require a supported game?
 


The article points out that it's all up to the developers to take advantage of DX12's ability to operate in such a way. I imagine PC-only devs will be the first to test the waters.

Only DX12 titles will be able to take advantage of the tech. There have been a number of articles reviewing backwards compatibility; previous releases would require too substantial an overhaul.
 

Foobar88

Reputable
Feb 24, 2015
1
0
4,510
Rather than the ability to combine an Nvidia and an AMD GPU, I see the big takeaway here as being able to upgrade and run SLI on two cards from different generations. For example, I have a GTX 970 now. When Nvidia comes out with a "1070" or similar GPU on a 14nm or 16nm die, I could simply slap one of those into my machine and run SLI with the two cards, rather than having to replace my 970 outright. That seems like the most cost-effective way to get true 4K capability, assuming the devs play along.
 

edwd2

Honorable
Feb 20, 2013
69
0
10,660
I have an HD 7950 that's just sitting there collecting dust right now. It'd be great if I could combine it with my current 290X.
 

Grognak

Reputable
Dec 8, 2014
65
0
4,630
So it's gonna work with APUs/iGPUs too? That'd be pretty awesome. Those lousy Intel graphics will finally be useful, and APU + GPU systems will get a serious performance boost.
 

booyaah

Distinguished
Mar 17, 2006
171
0
18,690
Now if they can only find a way to allow SLI/XFire to work in fullscreen windowed mode, all my problems will be solved !!
 

spiketheaardvark

Distinguished
Apr 14, 2009
134
14
18,715
This may give you a reason to slap in that old card you have lying around, but I'd wager the real reason for this whole setup is the future arrival of 3D headsets like the Oculus. It makes it possible to assign one card per eye rather than having each card render alternate frames for both eyes. The bonus reduction in latency is a big deal for 3D headsets, since it helps prevent nausea.
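For what it's worth, the submission side of per-eye splitting could be as simple as this sketch. SubmitStereoFrame is a made-up name of mine, and it assumes the two queues were created on different adapters with the eye command lists already recorded:

```cpp
#include <d3d12.h>

// One adapter per eye: both GPUs render the same frame at the same time
// instead of alternating whole frames (AFR), which is what cuts the latency.
void SubmitStereoFrame(ID3D12CommandQueue* leftQueue,  ID3D12CommandList* leftCmds,
                       ID3D12CommandQueue* rightQueue, ID3D12CommandList* rightCmds)
{
    leftQueue->ExecuteCommandLists(1, &leftCmds);    // left eye on GPU 0
    rightQueue->ExecuteCommandLists(1, &rightCmds);  // right eye on GPU 1
    // Both queues now chew on the same frame in parallel; composition would
    // still need a fence plus a copy to whichever adapter owns the display.
}
```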
 

Maddux

Reputable
Feb 24, 2015
5
0
4,510
Reading the article a second time, it sounds like it could theoretically work with more than two GPUs as well. So I could use two discrete GPUs AND gain a little extra power from my CPU's integrated GPU. Man, this sounds like a dream if it's true!
 

Kewlx25

Distinguished
Just as the implementation of these different distributed rendering techniques will be left up to the developers, won't the use of mixed GPUs, especially cross-vendor, be up to the GPU manufacturers? I doubt AMD and Nvidia will allow such a configuration. Suddenly a less expensive AMD flagship combined with an inexpensive Nvidia CUDA/PhysX card would gain a lot more traction, I would imagine.

To be DX12 compatible, your device must be able to do compute. DX12 will treat all compute engines as a resource pool and distribute work among them. It doesn't matter if they're AMD, Nvidia, Intel, or whatever, as long as the device can support DX12.
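Concretely, here's roughly what building that pool looks like at the API level: enumerate every adapter and keep whichever ones can create a D3D12 device. (A minimal sketch; EnumerateDx12Devices is my own helper name, not an official API.)

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Any adapter that can create a D3D12 device -- GeForce, Radeon, or an
// Intel iGPU -- joins the pool; the vendor is irrelevant.
std::vector<ComPtr<ID3D12Device>> EnumerateDx12Devices()
{
    std::vector<ComPtr<ID3D12Device>> pool;
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return pool;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            pool.push_back(device);  // only DX12 support matters
    }
    return pool;
}
```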
 


That doesn't make sense to me; can you explain more? I understand the concept, but I don't see DX12 doing away with the need for drivers, which currently lock their resources when a card from a competing vendor is present.
 
I'm with Maddux and Foobar. My first thought was that your old GPU would still be useful after an upgrade. Of course both would need to be DX12 compliant, so just how old can the "old" card be? And depending on how much an APU can contribute to framerates, one might actually be preferable to the low-budget Intel chips. This could also make Z-series motherboards more desirable, since PCIe lane splitting could be put to better use.

My second thought was how this would reconcile the different rendering effects between AMD and Nvidia cards. My best guess is that only "pure" DX12 rendering methods would be supported when trying to use mixed graphics resources.
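If that guess is right, an engine would presumably have to query every device and settle on the lowest common denominator of features. A tiny illustration of the idea (CommonBindingTier is my own name, and a real engine would compare far more caps than this one field):

```cpp
#include <d3d12.h>
#include <algorithm>

// In a mixed pool, a feature is only usable if every adapter supports it,
// so fall back to the lowest tier any device reports.
D3D12_RESOURCE_BINDING_TIER CommonBindingTier(ID3D12Device* a, ID3D12Device* b)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS oa = {}, ob = {};
    a->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &oa, sizeof(oa));
    b->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &ob, sizeof(ob));
    return std::min(oa.ResourceBindingTier, ob.ResourceBindingTier);
}
```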
 

jnanster

Distinguished
Dec 29, 2008
15
0
18,510
This article only referenced gaming, which is huge, especially from an AMD/CUDA/PhysX standpoint. I'd also be interested in improved video editing software (OpenCL/CUDA).
 

Achoo22

Distinguished
Aug 23, 2011
350
2
18,780
Although the article makes it sound like we are ready to move away from SLI and Crossfire licensing, I suspect that things will somehow be twisted in such a way as to do the opposite.
 
I believe drivers will hold the keys to cross-vendor resource sharing, and historically segregated single-platform buying habits will keep that pool shallow.

The big upside is that I will now have a use for my GTX 770 after my ROG Swift + GTX 970 arrive. Also, I will finally make use of that beefier power supply, once a developer puts out a game that takes advantage of all this "closer to the metal" API.
 


Unlike SLI/Crossfire, which work only within the same architecture, this is directed toward much more universal compatibility.
 