News AMD Apple-Exclusive Dual GPU Beats RTX 4080, RX 7900 XTX

bit_user

Titan
Ambassador
If this card is not available natively for a standard desktop, then how did this person find drivers for it?
It must just work with the normal AMD drivers. Either that or it was running on Linux, under Wine - but I seriously doubt that.

I'd hazard a guess that the software sees it as two RX 6800 GPUs on a PCI bridge, with a Crossfire link between them.
 

blppt

Distinguished
Jun 6, 2008
579
104
19,160
"I'd hazard a guess that the software sees it as two RX 6800 GPUs on a PCI bridge, with a Crossfire link between them. "

I tend to doubt it is using CrossFire (CFX) technology, which never worked on Mac that I can recall, in either OpenGL or Metal. It might actually be seen as one GPU.
 

Elusive Ruse

Estimable
Nov 17, 2022
459
597
3,220
Damn, does it mean Apple paid AMD enough money to stop them from releasing this on PC even if it meant it would rival Nvidia's high-end products? Sheeeeiiiittt!
 

healthy Pro-teen

Commendable
Jun 23, 2022
58
54
1,610
Damn, does it mean Apple paid AMD enough money to stop them from releasing this on PC even if it meant it would rival Nvidia's high-end products? Sheeeeiiiittt!
This dual GPU doesn't give any gaming benefits over the regular RX 6800. If it did, AMD would have released it alongside the regular RX 6800 and dominated the RTX 3090 Ti. Heck, AMD could've released a dual RX 7900 XTX GPU monster that even the RTX 5090 might not have beaten.
 
Kudos to 3DMark for still supporting multi-GPU, and obviously props to der8auer for his curiosity and ingenuity.

I wonder if a dual RX 6900 XT could've even bested an RTX 4090. That'd be 33% more CUs. If it scaled perfectly linearly, then it could just squeak past the 27% margin held by the Nvidia card.
I doubt it's a case of 3DMark enabling dual-GPU support on recent GPUs so much as it being a relic of the fact that Time Spy is now over six years old; it was released back when dual-GPU was actually a thing, though admittedly not a very important one. Multi-GPU gaming support was maybe 50-50 at best, and it has effectively ended now; it's not just Nvidia killing SLI, AMD never talks about CrossFire these days either.

It's the same old story, though: considering the latest-gen AMD GPUs aren't more than twice as fast as their predecessors, two previous-gen GPUs can, in an ideal case, deliver more compute than a single latest-gen GPU. A dual RTX 3090 / 3090 Ti would also likely best an RTX 4090, at least in Time Spy. But 3DMark has always been more of a contest for bragging rights than anything truly useful, especially when it comes to multi-GPU support. Also, dual-GPU has increased latency compared to a single GPU, because of AFR.
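
For anyone who wants to sanity-check that scaling argument, here's a back-of-the-envelope sketch in Python. The CU counts are the real ones (60 per die on the dual-GPU card, 80 on an RX 6900 XT); the 27% RTX 4090 margin and the perfectly linear scaling are just the assumptions from the posts above, not measurements.

```python
# Back-of-the-envelope check of the "dual previous-gen vs. single current-gen" argument.
# CU counts are real; the 27% RTX 4090 margin and perfect linear scaling are
# assumptions carried over from the discussion above, not measured numbers.

RX_6800_CUS = 60        # per GPU on the dual-GPU card
RX_6900_XT_CUS = 80     # hypothetical dual RX 6900 XT build

dual_6800_cus = 2 * RX_6800_CUS        # 120 CUs total
dual_6900_xt_cus = 2 * RX_6900_XT_CUS  # 160 CUs total

cu_uplift = dual_6900_xt_cus / dual_6800_cus - 1.0  # ~0.33, i.e. 33% more CUs
rtx_4090_margin = 0.27  # assumed lead of the RTX 4090 over the dual RX 6800 result

print(f"CU uplift from dual RX 6800 to dual RX 6900 XT: {cu_uplift:.0%}")
if cu_uplift > rtx_4090_margin:
    print("With perfectly linear scaling, the dual RX 6900 XT would squeak past.")
else:
    print("Even with linear scaling, it would fall short.")
```

Swap in different CU counts or margins if you want to test other pairings.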
 

bit_user

Titan
Ambassador
Also, dual-GPU has increased latency compared to a single GPU, because of AFR.
How confident are you that it's using AFR?

Also, it seems to me that, even if it is using AFR, the downside of that extra frame of latency would be a lot smaller at higher framerates. Ideally, if the CPU would otherwise just be twiddling its thumbs, waiting on the one GPU to finish so it could start rendering the next frame, then with AFR it can fill that idle time by queuing a frame to render on the second GPU. That simplistic description presumes no double-buffering, but you get the idea.
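
Here's a toy timeline of what I mean, sketched in Python. The 4 ms of CPU work and 10 ms of GPU work per frame are made-up numbers purely for illustration, and the model deliberately ignores double-buffering and driver queue limits.

```python
# Toy timeline comparing one GPU to two GPUs doing AFR (alternate-frame rendering).
# The 4 ms CPU / 10 ms GPU per-frame costs are made up purely for illustration,
# and the CPU is allowed to run ahead freely (real drivers cap how far it can queue).

CPU_MS, GPU_MS, FRAMES = 4.0, 10.0, 8

def single_gpu():
    t, finish_times = 0.0, []
    for _ in range(FRAMES):
        t += CPU_MS   # CPU prepares the frame...
        t += GPU_MS   # ...then twiddles its thumbs while the one GPU renders it
        finish_times.append(t)
    return finish_times

def afr_two_gpus():
    t_cpu, gpu_free = 0.0, [0.0, 0.0]  # time each GPU becomes available again
    finish_times = []
    for frame in range(FRAMES):
        t_cpu += CPU_MS                 # CPU prepares the frame
        gpu = frame % 2                 # alternate frames between the two GPUs
        start = max(t_cpu, gpu_free[gpu])
        gpu_free[gpu] = start + GPU_MS
        finish_times.append(gpu_free[gpu])
    return finish_times

for name, times in (("1 GPU", single_gpu()), ("AFR x2", afr_two_gpus())):
    fps = 1000.0 * len(times) / times[-1]
    print(f"{name}: {len(times)} frames in {times[-1]:.0f} ms (~{fps:.0f} fps)")
```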
 
How confident are you that it's using AFR?

Also, it seems to me that, even if it is using AFR, the downside of that extra frame of latency would be a lot smaller at higher framerates. Ideally, if the CPU would otherwise just be twiddling its thumbs, waiting on the one GPU to finish so it could start rendering the next frame, then with AFR it can fill that idle time by queuing a frame to render on the second GPU. That simplistic description presumes no double-buffering, but you get the idea.
While there was talk of split-frame rendering for multi-GPU, to my knowledge it was almost never implemented. It required games to directly code for multi-GPU, rather than having the drivers do the work. Way back in the early SLI days, there were some examples of SFR, but I suspect nothing since 2015 has done multi-GPU without using AFR. And of course, 3DMark was never a game engine and thus had no real reason to look at other approaches. My understanding is that SFR basically requires more work and doesn't scale as easily as AFR. But, LOL, some tool did write this back in the day: https://www.anandtech.com/show/8643...crossfire-with-mantle-sfr-not-actually-broken

Anyway, with AFR my understanding is that to avoid having the game potentially get ahead of itself, you basically render frame 1, render frame 2, send frame 1 to the actual display, render frame 3, send frame 2 to the display, render frame 4, etc. Maybe some games were better about this, but I'm pretty sure they always showed frame n-2 whereas with one GPU you'd show frame n. If I still had two of the same GPU with the requisite NVLink or HB-SLI bridge, I could test! (I probably do have the necessary parts, but some are buried in an unknown box.)
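
That submit/present ordering, written out as a little Python sketch. The ordering itself is the point; the "about two frames behind" note in the final comment just restates my recollection above, it's not a fresh measurement.

```python
# AFR submit/present ordering as described above:
# render frame 1, render frame 2, present frame 1, render frame 3, present frame 2, ...

FRAMES = 6

def afr_schedule(frames):
    events = []
    for n in range(1, frames + 1):
        events.append(f"render frame {n} on GPU {(n - 1) % 2}")
        if n >= 2:
            events.append(f"present frame {n - 1}")  # presents always run one render behind
    return events

for event in afr_schedule(FRAMES):
    print(event)

# By the time "present frame n" happens, the game has already queued frame n+1 and is
# working toward n+2, which is the "showing frame n-2" lag mentioned above.
```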
 

Phyzzi

Commendable
Oct 26, 2021
11
14
1,515
Are you aware Nvidia has killed SLI?

I'm not sure how you can compare if you can't do SLI.

At around $5K a pop, you can get the Ada silicon on a card that will do SLI, and it would absolutely dust this thing. That's not to say AMD couldn't provide equal power for better value, but if you're spending that much on GPU processing, hopefully you aren't installing it in a Mac.