Crossfiring the same GPUs with different VRAM amounts

Zuriel Ramirez

Jul 2, 2016
I was thinking about upgrading to an MSI RX 580, but I'm not sure if I should get the 4GB or the 8GB model. I plan to get a second GPU in the future to Crossfire. If I get the 4GB model, will I be able to Crossfire it with the 8GB model? Right now I'm still saving up for the GPU, and the 4GB model costs $50 less than the 8GB model.
 
You will be able to Crossfire these GPUs, but they will be limited to 4GB, making the extra memory on the 8GB card a waste. That said, you should still get better performance in most cases with Crossfire enabled, just with less room to store textures and other graphical data. Even so, many games don't take advantage of the full 8GB anyway, so... yeah.
 
What's your timeframe on moving to the second GPU? Crossfire (and SLI) was always a questionable undertaking, with limited game support. The problem is that both CFX and SLI support have been getting steadily worse over the last few years. From what I know anecdotally, I would expect only around 50%, maybe 70% at best, of AAA games to provide decent Crossfire support... and even those aren't necessarily going to scale particularly well. So you're spending something like $400 US, probably more, for a graphics solution that half the time is only going to work like a $200 solution, and even when it does work, maybe gives you 150% of the performance.

The point is you're well into GTX 1070 (or a similarly priced Vega GPU, which will be coming soon) territory price-wise, but you're not getting that sort of performance consistently.
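To make that value argument concrete, here's a rough expected-value sketch in Python. All of the numbers (50-70% title support, ~1.5x scaling when it works, $200 per card) are the anecdotal guesses from above, not benchmarks:

```python
# Illustrative expected-value estimate for a dual-GPU purchase.
# Every number here is an assumption from the discussion, not a measurement.
pair_cost = 400.0     # two mid-range cards, USD
single_cost = 200.0   # one of those cards on its own

support_rate = 0.6    # guessed fraction of titles with working Crossfire
scaling = 1.5         # guessed relative performance when Crossfire works

# Average performance relative to one card, across a whole game library:
# supported titles run at ~1.5x, unsupported titles run at 1.0x.
expected_perf = support_rate * scaling + (1 - support_rate) * 1.0

print(f"Expected performance: {expected_perf:.2f}x a single card")
print(f"Dollars per unit of performance: "
      f"${pair_cost / expected_perf:.0f} (pair) vs ${single_cost:.0f} (single)")
```

With these assumptions the pair averages only about 1.3x a single card for 2x the money, which is the crux of the objection.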

Multi-card setups are just a really bad idea right now.
Can you stretch your budget right away?
Can you wait until your budget nets you a better single GPU?
Can you just get a cheaper GPU and hold off a bit longer before replacing it with a better single GPU?
Any of the options above are going to be much better, IMHO, than dabbling in the quagmire that is Crossfire (or SLI) right now.
 
Solution
I would say XFire has actually gotten better and only SLI has gotten worse.

XFire scales very well, upwards of 70%. SLI doesn't scale as well.

XFire no longer requires any sort of card bridge. Current SLI requires higher cost, high bandwidth bridges.

XFire will work with PCI-e slots with fewer than 8 lanes. NVIDIA will not allow you to enable SLI unless all slots are 8x or greater.

XFire has been very smooth since the frame pacing update in the drivers a few years ago. I can't speak to in-game feel for SLI.

XFire doesn't require motherboards to be licensed. SLI requires licensing and is the main reason fewer boards support the technology, not because the boards couldn't run SLI.

XFire supports cards with different memory amounts; they simply run with the amount of the lesser card. SLI does not allow this. Cards in SLI must have the same memory sizes.

If you XFire two 580s, you won't get a lower ability to store textures; you simply don't get any more. Both cards will essentially hold the same working set in their memory. DX 12 has the ability to put different working sets on each card, but that is a whole new can of worms for developers.

Running multi-GPU under DX 12 does not mean you get better memory usage on your cards, so always go into it expecting 2x 4 GB cards to behave as 4 GB cards, and 2x 8 GB cards to behave as 8 GB cards.
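The memory rule above is simple enough to state as a one-liner: under alternate-frame rendering, both cards mirror the same working set, so the usable pool is whatever the smaller card has. A trivial sketch (the GB figures are just the examples from this thread):

```python
# Effective VRAM for a mixed-memory Crossfire pair under alternate-frame
# rendering: both cards hold the same working set, so the smaller card
# sets the limit. Adding a bigger card never adds usable memory.
def effective_vram_gb(card_a_gb: int, card_b_gb: int) -> int:
    return min(card_a_gb, card_b_gb)

print(effective_vram_gb(8, 4))  # an 8GB + 4GB pair behaves like a 4GB card
print(effective_vram_gb(8, 8))  # a matched 8GB pair is still 8GB usable, not 16GB
```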

Multi-GPU support varies on a per-title basis. If the game you want to play doesn't support multi-GPU, you're out of luck. You can usually force it, but you'll often find that the reason the graphics drivers from AMD or NVIDIA don't already come with support for the title is that it simply doesn't work, or causes more problems than it solves.

If you end up with an 8 GB card and a 4 GB card, make sure the 8 GB card is your primary in the XFire configuration. In the event you run a title that does not use XFire, at least you'll be running it on the 8 GB card.

Make sure your computer's power supply is up to the task of powering both cards.
 
@bigpinkdragon... while I agree with some of what you say, there are a few issues.

"SLI doesn't scale as well" is a massive generalisation. There are certainly games where SLI scales better, and others where CFX scales better. I don't think you can generalise like that without hard data to back it up.

"XFire will work in PCI-e slots with fewer lanes" - technically true... but you absolutely do not want to run it this way. On all current Intel and AM4 platforms, any x4 PCIe slots go through the PCH (or the Ryzen equivalent), which adds significant latency and means the GPU traffic is competing with everything else that goes over the PCH (SATA devices, USB devices, network cards, etc.).
While SLI licensing does add cost, it's very hard to find a board that supports two x8 slots (basically required for good CFX performance as noted above) that does NOT have SLI support. So you're probably paying for it anyway.

You're going to have a hard time finding support for the statement that CFX support has gotten better over recent years. AMD had a dual GPU version of the Fury X ready to release which would have been (technically) the most powerful card on the market. But they chose not to release it as a gaming card, almost certainly because crossfire game support was so poor.
There's a pretty detailed article exploring the situation from Ryan Smith at Anandtech here: http://www.anandtech.com/show/9874/amd-dual-fiji-gemini-video-card-delayed-to-2016
It's 18 months old now, so the specific games he looks at are no longer hugely relevant. But the game development landscape hasn't really changed much IMHO.
The TL;DR of the article is that game developers are increasingly using specific rendering techniques (such as temporal reprojection) which either outright break, or significantly complicate, alternate-frame rendering, which is the basis of both Crossfire and SLI.

Anyway, the biggest issue is a pricing one. 2x 480s (or 580s), especially if you get the 8GB versions you really want for the performance you're hoping for, is going to land in the high-$400 range... right around a GTX 1080.
I understand not everyone can afford to buy something like that outright, and would prefer to stagger the cost by purchasing two cheaper cards at different points in time, but you're just making such huge sacrifices that it's really not worthwhile for most people.

I would suggest OP gets an RX 580 right now, hangs on to it for as long as he's happy with the performance, and then looks to sell that card and purchase the best single card he can for his budget when it comes time to upgrade. In the long run you'll spend a similar amount of money, get much more consistent performance, can manage a cheaper PSU with less heat and noise, and benefit from whatever new features come with an upgraded GPU in 2 years or so once the single 580 isn't cutting it anymore.
 
My link isn't exactly brand new either, but you're welcome to knock yourself out. It's a reasonably well done piece that set out to investigate multi-GPU. NVIDIA loses in the scaling department at every resolution, no matter how many cards are used:

Multi-GPU scaling comparison

If you dig deeper in the article, it shows that SLI has higher (worse) frame timing as well.

You may not like my massive generalization, but I'm comfortable saying AMD's XFire scales better than NVIDIA's SLI.

I forgot to mention that NVIDIA also dropped support for SLI on mid-range cards, and no longer allows using more than 2 cards together.

Even if XFire worked perfectly in every title, a dual-GPU Fury X card is an impractical idea. It's too hot, too power-hungry, too expensive, and too niche a product to be worth purchasing over two Fury X cards. Hardly anybody was falling over themselves to buy a single Fury X, so going the extra mile and trying to convince them to buy a dual Fury X seems a stretch. Why waste company resources pushing a card that won't return the investment, when its only real benefit is the space it saves?

I really wouldn't worry about a PCI-e x4 slot being used for XFire on the AM4 platform, as only the X370 chipset supports multi-GPU, and those boards pretty much all feature well-balanced slots. If performance were that bad, I suspect a redesign would have been in order just to get XFire working on the platform. Older AMD platforms derive all PCI-e lanes from the chipset, so x4 slots wouldn't be an issue there anyway. Whether this is an actual issue on Intel platforms I don't know, as I've never heard the assertion before now. There could be something to it, and I'd honestly be curious to see real-world testing done.

Now, don't take all of this as me saying multi-GPU is a better solution than a stronger single card. It's not, and for the foreseeable future it won't be. But it is a solution that fits certain scenarios, whether or not you or others agree.

The way I see it is, by the time most people have saved up and are ready for the second card in a multi-GPU setup due to initial budgetary constraints, the market has usually moved on to a point that the 2nd card is no longer worth getting, unless you can purchase it cheaply 2nd hand. I see multi-GPU as a solution initially when a build is put together, shortly after a build is put together, or long after the cards have passed from primary circulation, as a means of adding value or a bit of extra life to an aging system. There's a point in the middle of a card's lifespan where it makes more sense to spend the money toward a faster, single card configuration.

For the price of the 2nd card down the road, you can probably get a 2nd hand top-tier card to replace your middle-tier card.
 
So the difference is a few %? And they only look at top tier cards. And it's wccftech :-(
Sure, it does look like a detailed analysis, but the gap is small and generalising it to different cards of different generations is a stretch. You also originally said "XFire scales very well, upwards of 70%. SLI doesn't scale as well" -> according to the article you linked, the difference is a few %.
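For anyone following along, "scaling" in these comparisons is just the dual-card frame rate over the single-card frame rate, minus one. A quick sketch with made-up FPS numbers (not figures from either linked article) shows why a few points of difference is easy to overstate:

```python
# Multi-GPU scaling computed from single- and dual-card average frame rates.
# The FPS values below are invented for illustration, not review data.
def scaling_pct(single_fps: float, dual_fps: float) -> float:
    """Extra performance from the second card, as a percentage."""
    return (dual_fps / single_fps - 1.0) * 100.0

cfx = scaling_pct(60.0, 105.0)  # e.g. 60 fps -> 105 fps with a second card
sli = scaling_pct(60.0, 101.0)  # e.g. 60 fps -> 101 fps with a second card

print(f"CFX scaling: {cfx:.1f}%")   # 75.0%
print(f"SLI scaling: {sli:.1f}%")   # ~68.3%
print(f"Gap: {cfx - sli:.1f} percentage points")
```

Both setups scale "upwards of 70%" in this toy example; the vendor gap is a handful of points, which is the objection being made here.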

I agree with all the criticisms you put to the dual Fury X - except that AMD actually developed it and released it. It was a working card with all the R&D done. They simply chose not to market it as a gaming card because it didn't game very well... because Crossfire support is in such a bad state. The link I provided explains why.


I agree with 90% of what you're saying here, and this is basically what I've been saying since my first post. Crossfire is very rarely a good idea.

I agree it can be an option to extend the life of an old system/card if you can pick up a matched card for cheap on the second hand market... and your PSU has the required headroom.

I agree it can be an option when buying new, but **only** when you're going all out on a top-tier rig. If you want to game on 3x 1440P 144Hz monitors, then going 2x 1080 Ti (or dual Vega when they're released) is not a bad idea... because you don't really have any other options.
In OP's situation, 2x RX 480s put you well past the price of a 1070 and approaching 1080 territory. Surely you'd agree that any well-rounded gaming rig is better off saving ~$80-$100 and getting a 1070 instead of two 480s... and you could just about afford a 1080 if you're looking at 8GB 480/580s.
Both AMD & Intel require you to spend up on motherboards with two x8 slots (not mandatory for Crossfire, but surely recommended at least?), so once you add that cost and the extra PSU you'd need to drive two 480s, the "value" of that solution looks even worse.

I stand by the recommendation in my previous post: IMHO OP would be better served by a RX 580 (or 480 if it can be had significantly cheaper), and then upgrading to another single card in a couple years time when that card is no longer cutting it.
 
They really released the Dual Fury X? I must have missed that one. Sounds like an expensive mess.

Not sure where the OP is located, but if he were in my neck of the woods and willing, the used market has R9 290X cards from $200 down to $130. That right there is a good bargain, and is why new isn't always better. A Fury X can be had for as little as $300. Really makes me question the bargain of a 580 at $230 or more.

A shiny new 480 or 580 sounds tempting, but are all of the incremental architectural improvements going to allow its 2304 cores and 256-bit bus to out-grunt the 2816 cores and 512-bit bus on the 290X, much less a Fury X? As with multi-GPU, I would say only on a per-title basis. The biggest benefit is power draw.
 

They did indeed: http://www.anandtech.com/show/10279/amd-releases-radeon-pro-duo-fiji-350w-vr
And it was indeed!

In the US, 4GB 580s start at $205 and are the go-to for me right now. That's $25 less than the 6GB 1060, which I think is a fair trade off for the power draw.

I'm certainly not knocking second-hand options: a 290X at under $150 would be an interesting proposition. The Guru3D review has a decent aftermarket 580 coming in a few % faster overall than a 290X and 390X, particularly at 1080P (review here). I don't know whether Hilbert over at Guru3D retested those older cards with updated drivers, but both the memory and core are clocked much higher on the 580, so that probably accounts for how it makes up for the lack of resources. It's still an interesting proposition though, effectively trading a couple of FPS, a new-card warranty and much lower power draw for a chunk of extra cash.

Anyway - I don't mean to take things off topic. I still recommend OP abandon the plans for Crossfire and get a single 4GB 580 @ just over $200. Second hand may well be a viable option. Hopefully our back-and-forthing above can help OP make an informed decision.