[SOLVED] How much extra work is required of game developers to incorporate CrossFire or SLI compatibility?

carlsow

Honorable
Dec 31, 2018
Hi Everyone,

I recently picked up a second Radeon RX 580 from a friend at a good price, and figured it would be fun to try CrossFire since the card was such a good deal.

I've managed to get everything set up and I've seen big improvements running The Witcher 3. With a single GPU I was averaging a generous 30 fps on my 34" UWQHD monitor with in-game graphics set to Ultra. After setting up CrossFire, I now get 55-60 fps at the same in-game settings.

After experiencing such an improvement (which I know isn't the case in every game), I've been wondering what game developers actually have to do to include SLI or CrossFire compatibility in a game. I've found plenty of sites debating the advantages and disadvantages of the technology, as well as setup and compatibility. What I can't seem to find is how the cards actually work together to render an image. Additionally, how much work is it for a game developer to add this functionality? I'm guessing it's a lot, but I'd like to know why.

If anyone has some knowledge they would be willing to share on this, I would appreciate it!

Thanks and Best,

Carlsow
 

kanewolf

Titan
Moderator
Multi-GPU rendering is usually done in one of two ways -- alternate frame rendering, where one card renders the "even" frames and the other the "odd" frames, or split frame rendering, where each frame is divided roughly 60%/40% and the card handling the 60% share drives the display. The split isn't 50/50 because of the time required to copy the second card's results over to the display card. This Wikipedia article has more info -- https://en.wikipedia.org/wiki/Alternate_frame_rendering
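To make the two schemes concrete, here's a toy sketch of how the driver divides the work. This is not real driver or game code -- `render_on_gpu`, `afr_schedule`, and `sfr_split` are made-up names just illustrating the scheduling idea, with the 60/40 split from above as an assumed default:

```python
def render_on_gpu(gpu_id, frame_number):
    # Stand-in for submitting a frame's draw calls to one GPU;
    # here it just records which GPU handled which frame.
    return (frame_number, gpu_id)

def afr_schedule(num_frames, num_gpus=2):
    # Alternate frame rendering: GPU 0 gets frames 0, 2, 4, ... and
    # GPU 1 gets frames 1, 3, 5, ... The driver presents frames in
    # order, which is why uneven frame times between the cards show
    # up as micro-stutter.
    return [render_on_gpu(frame % num_gpus, frame)
            for frame in range(num_frames)]

def sfr_split(height, primary_share=0.6):
    # Split frame rendering: the primary (display) card takes the top
    # ~60% of scanlines, the secondary card the rest. The uneven split
    # leaves the primary card time to copy in the secondary's portion.
    cut = int(height * primary_share)
    return range(0, cut), range(cut, height)

print(afr_schedule(6))
# → [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0), (5, 1)]

top, bottom = sfr_split(1440)  # 1440 rows, e.g. a UWQHD panel
print(len(top), len(bottom))
# → 864 576
```

Either way, the point is that the scheme lives in the driver and the game's renderer has to cooperate with it -- for example, AFR breaks whenever a frame reads data rendered in the previous frame, since that data sits on the other card.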
 