Radeon R7 240 Crossfire

Wayne Winquist

Jan 11, 2014
In a previous post I mentioned how I gave my old card (a 2 GB Radeon 6670) to my mom for HDMI use while upgrading my machine (MSI A75-G55 FM1 motherboard) to two 2 GB Gigabyte Radeon R7 240s. This was not a light decision: one of the reasons I went with these was that AMD's specifications website lists the cards as CrossFire-capable. Unfortunately, Gigabyte is saying otherwise, and while both cards are running fine (each driving one of my two displays - the TV on one, the monitor on the other), the lack of CrossFire leaves me concerned for a few of my games. Is there any way of enabling CrossFire, either via a hack or by forcing it? Catalyst Center isn't giving me the option (probably due to the expectation that CrossFire should be set up with two compatible cards, such as the 6670 or 7670 - the latter of which I had difficulty finding when I was making the purchases, which forced my hand further). Thanks.
 
Well, the R7 240 is indeed supposed to support up to two cards in CrossFire without the need for a bridge connector. Wayne, are you using the newest Catalyst package? You should be able to enable it there. What do you mean that Gigabyte says otherwise - did they specifically tell you these cards won't work with CrossFire? Also, the only issue I could see is that your mobo only provides x4 PCIe 2.0 lanes on the 2nd slot. That card should probably be fine with that, but maybe Catalyst doesn't think so... I dunno.
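If you want to confirm what link width each card actually negotiated, one option is to boot a Linux live USB and look at the `LnkSta` lines in `lspci -vv`. A minimal Python sketch of that parse is below - note the sample text is purely illustrative (a made-up excerpt, not real output from Wayne's machine):

```python
import re

# Hypothetical lspci -vv excerpt: each VGA device reports its negotiated
# PCIe link speed/width on a "LnkSta:" line. This sample is illustrative only.
sample = """\
01:00.0 VGA compatible controller: AMD Radeon R7 240
        LnkSta: Speed 5GT/s, Width x8
02:00.0 VGA compatible controller: AMD Radeon R7 240
        LnkSta: Speed 5GT/s, Width x4
"""

def link_widths(lspci_text):
    """Map each VGA device address to its negotiated PCIe link width."""
    widths = {}
    current = None
    for line in lspci_text.splitlines():
        m = re.match(r"^(\S+) VGA compatible controller", line)
        if m:
            current = m.group(1)  # remember which device the next LnkSta belongs to
            continue
        m = re.search(r"LnkSta:.*Width x(\d+)", line)
        if m and current:
            widths[current] = int(m.group(1))
    return widths

print(link_widths(sample))  # {'01:00.0': 8, '02:00.0': 4}
```

If one card shows a much narrower width than expected, that at least tells you what the slot is really running at, independent of what Catalyst reports.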
 
Baba, in hindsight the single 260X would have been a better buy - that said, I was hoping the extra VRAM and processing power working together (as it had with my previous setup - the 6670 worked perfectly in CrossFire with the 6530D built into the 3650 Llano APU) would outweigh going the single-card route, under the assumption that CrossFire would work. (And yes, I had already bought them - if I were still considering it, we'd be having a different discussion.)

Swifty, I already mentioned CCC (Catalyst Center in the post - I forgot to add Control, the third C). Sorry. Yes, I already looked there.

 
Lark, you snuck by me while I was answering the other two guys. I'm using the latest version of CCC (unless something's changed in the last 24 hours). As for features, when looking up information on these specific cards, Gigabyte (the cards' manufacturer) doesn't mention anything about CrossFire working, in either the ads or the support pages. I'm thinking the conflict may be arising either because the motherboard is specifically looking for CrossFire connectivity between the first PCIe slot and the APU (and those are incompatible with each other), or because Gigabyte disabled it for some strange reason. You're right that the second PCIe x16 slot runs at x4, but I doubt that's the actual problem; the same goes for the power supply (which I had checked on a couple of power supply calculator sites).
 
OK, what about some kind of setting in the BIOS? Could you try setting the graphics mode to PCIe-only instead of "auto"? Does that make any difference? Ah, and from your original post - I'm pretty sure that for CrossFire to work, both monitors must be plugged into only one of the cards (use the primary card, the one closest to your APU, NOT the motherboard ports). Try that first.
 


The whole reason I originally discovered the CrossFire issue was because both displays reported the same graphics card at the same time. Looking into the BIOS, my setup was already set to use only the PCI Express slots.
 
The only other thing I can think of would be a BIOS or Catalyst setting for "Dual Graphics". Dual Graphics is like CrossFire but uses the APU in conjunction with a single compatible graphics card. Make sure any settings for Dual Graphics are disabled, since CrossFire is slightly different. I sympathize - if AMD says the R7 240 should do CrossFire, then any card carrying that chip should do CrossFire. I wonder if it's the x4 slot, but again, it doesn't seem like that should be a problem.
 
Swifty, that chart was one of the first things I looked up when I was researching this. As for the HIS page, it's funny, because the 6670 was a HIS card. However, the two R7 240s I have are the Gigabyte GV-R724OC-2GI, found here: http://www.gigabyte.us/products/product-page.aspx?pid=4794#dl

Lark, I originally had that deselected with no luck, so I turned it back on with (still) no luck and an extra disabled GPU! I agree with your thoughts on "shoulds," but companies deviate from stock all the time - the core clock on both of these cards is 900 MHz, up from the standard 780, according to Gigabyte's website. As for the x4 slot, I don't think that's doing it, though it's worth noting that with both cards in, the x16 slot drops down to x8 - so it's possible.
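For rough context on what those slot widths mean in bandwidth terms: PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, which works out to about 500 MB/s of usable bandwidth per lane, per direction. A quick back-of-the-envelope calc (the per-lane figure is the standard PCIe 2.0 rate, not anything measured on this board):

```python
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> ~500 MB/s usable per lane, per direction.
MB_PER_LANE = 500

for width in (16, 8, 4):
    gb_s = width * MB_PER_LANE / 1000
    print(f"x{width}: {gb_s:.1f} GB/s per direction")

# x16: 8.0 GB/s per direction
# x8:  4.0 GB/s per direction
# x4:  2.0 GB/s per direction
```

So the second slot at x4 gets a quarter of the first slot's full bandwidth - usually still enough for a low-end card like the R7 240, which is why the slot width is a plausible but unlikely suspect here.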

I suppose I should look on the bright side: for what I'll need while in school this semester, they'll work better than what I had before. Until I start making games and other software that takes advantage of the speed found in the better cards, I can survive on this.
 


It doesn't say one way or the other about CrossFire - there's no mention of it in the support or features sections for the card. From what I've seen, manufacturers that support it usually mention it, even when it's a standard feature of the underlying chip's basic graphics solution.
 


That doesn't answer the question. The original question is whether there's a hack or other means to do CrossFire with the R7 240, not "what do you think of my video card?" If you don't know the answer, don't waste time here.
 
Hi! I've just built a PC with two Radeon R7 240.
Specs:
mb: ASUS P5E
cpu: Core2Quad Q6600
ram: 2x1GB Corsair DDR2 (I'll buy some more)
gpu: 2x Gigabyte R7 240 2GB
Bios at default settings.

I got the motherboard, CPU, RAM, and one of the R7s for free, so I bought another R7 because that was the cheapest solution.
Crossfire works like a charm!
GTA V at 1366x768, 4xAA, high settings: 30-50 FPS :) Sometimes it freezes for about 0.5 sec; I think low memory causes that.
The Witcher at 1366x768, 4xAA, high post-process settings and medium graphics settings: 40-60 FPS.
Dirt 2 (just for fun) at ultra settings @ 1920x1080: 60 FPS (that can't get any better because of the 60 Hz refresh rate).

Hope this helps somebody :)