Crossfire HD 7850s user review


Few Oranges
Jan 15, 2013
Hello all, just here to give my personal opinion and review of the HD 7850s I have in crossfire.
First off, I have:
--one PowerColor HD 7850 2GB at the typical 860MHz core clock and 1200MHz memory clock
--one HIS HD 7850 2GB, also at 860MHz/1200MHz
--13.2 beta drivers

Currently I am playing at 1920 x 1080 resolution (however, I'm going to whip out the Eyefinity setup in a couple of days).
--All games I play are first-person shooters at absolute max settings, with a minor adjustment between 2x-8x anti-aliasing (I don't care much about anti-aliasing since I personally don't notice much of a difference).
=======================================================================================

With that being said, here are the games and the experiences I've had.

--Black Ops 2 multiplayer: 124-178 FPS
--Far Cry 3: 78-150 FPS
--Borderlands 2: 175-245 FPS, PhysX on Low
--Dishonored: 128-130 FPS
--Crysis 2: 65-102 FPS, DX11 with MaLDo High Resolution Textures
--Just Cause 2: 122-280 FPS, 8x anti-aliasing
--Metro 2033: 49-82 FPS, DX11
--Crysis 3: 39-65 FPS, anti-aliasing disabled

Dolphin Emulator
--Super Smash Bros. Brawl with a 1080p texture pack: 60 FPS (100% emulation speed) almost the entire time. The majority of the time I could achieve this with a single 7850, with the exception of having 4 players on the screen.
(For those that know about this, 100% emulation speed is hard to achieve due to sloppy coding in the emulation.)
=======================================================================================

Things to Keep in Mind:
--I am currently using a VIZIO 32" LCD TV with a 60Hz refresh rate, therefore anything over 60 FPS is not displayed.
--Yes, I do in fact own a Wii and an actual copy of the SSBB game.
--Where a game is noted as DX11, that means it offers multiple renderer settings (DX9/DX10/DX11) you can choose between, trading visual quality for performance.
--Crossfire is NOT for everyone... there IS a bit of tweaking involved in getting what you want out of your crossfire setup.
--My HD 7850s are connected using a single crossfire bridge, as that's the only option my cards have; I believe all 7850s are single-finger crossfire capable, but I'm not positive.
--Crossfire DOES generate a lot of heat if you are not using proper cooling and airflow/fan profiles.
+My first GPU is the PowerColor; other than in Crysis 3, its core temperature never exceeds 56C (61C for Crysis 3).
+My second GPU (HIS) never exceeds 53C in any game.
+Both cards use stock single-fan coolers, but both have Antec Formula 6 thermal compound applied.
+Both cards are REFERENCE designs, meaning the fan pulls air in across the PCB and exhausts it out the back.
=======================================================================================

What about Microstutter?!?
--I have yet to run into ANY microstuttering of any sort, even in games where there were some dips in FPS. I attribute this to the fact that I'm getting some very high frame rates, and microstutter is rumored to plague lower-framerate situations, but no one is actually certain about what causes microstutter.
=======================================================================================

Tweaks/Issues?
--The only issue I have run into is that on the 13.2 beta drivers, the "AMD HD Audio" driver is no longer present in CCC or anywhere on my computer. I'm using speakers, so this is not a problem for me; it would only affect those using the built-in speakers in their monitor or television.
--I use "MSI Afterburner" to monitor my temperatures. The only annoyance is that I must start the game, go back to the desktop, and then launch Afterburner in order for both my GPU1 and GPU2 temperatures to be displayed. It's a minor grievance, but a kink in the fluidity of gameplay nonetheless.
=======================================================================================
Verdict:
After reading and researching countless reviews and talking with individuals in person about the horrors of crossfire, I decided to dive in against the majority opinion, and I am glad I did.
While crossfire may have its ups and downs, I've had an excellent experience thus far. I understand that crossfire problems vary from rig to rig, and that's definitely understandable.
--Don't go the crossfire route if you are not patient, technologically sound, or willing to experiment.
--DO go crossfire if you already have a decent card and are not willing to sell it and buy a bigger one, but still want a performance boost; this is what I did.
Even with my crossfire setup being smooth, there are always going to be bumps with drivers and the like, so use caution.

Thanks for reading this short novel. If you have any questions about my setup, I'll be happy to help!
 


Few Oranges, I was about to ask what main rig you use, but I see in your signature you're running an i7 2600. I myself have about 10 PCs! Four of them are quads and are still quite current (about 2-3 years old, apart from some recent upgrades like video cards, CPUs and mobos). Three of them are Phenoms: an X4 B99 ITX build in a Cooler Master ITX case (the 7850 was bought specifically for this as my portable gaming rig) and two 555 BEs (which both unlock to stable quads @ 4.0 GHz), and my beast is an i5 2500K on a Z68 Extreme4 Gen3 board. The i5 is in a HAF 912 case, powered by an Antec 750 watt modular PSU, with an Evo 212 with 2 fans on the chip! Oh yeah, it overclocks to 4.4 GHz without breaking a sweat!

Since the Z68 Extreme4 is my only board that supports both SLI and Xfire (and is PCI Express 3.0 compatible), I plan to bench both the 7850s and the GTX 460s on it and get my own unbiased results. Unfortunately, the i5 2500K (Sandy Bridge) does not support the newer PCI Express 3.0. Since the HD7850 is a PCI Express 3.0 card, there might be a slight performance penalty with that combination! I'll also bench the 460s on one of my unlocked 555 BEs; they're both planted in an Asus EM4N98TD EVO mobo, which only supports SLI. This will give me a chance to see if there is a noticeable difference between the 2 CPU platforms. And while I'm at it, I might as well bench the pair of 9800GTs with Zalman coolers I picked up recently for dirt cheap!

One thing I forgot to mention was how poorly some older games (such as Far Cry and Crysis) play on an SLI/Xfire setup. Occasionally while playing one of these games, GPU usage drops below 30%, and that's with V-Sync off. I'm not sure if it's a CPU limitation (older games apparently don't use all 4 cores, so a faster dual core is better), if the game is poorly coded, or if it's a driver issue again. I've noticed this only in SLI (I can't vouch for Xfire until I get the 2nd card). Currently my HD7850 is in one of my unlocked Phenom II 555 BE rigs with the Asus SLI board. Just to test how well Crysis 2 is coded, I ran the game on the same level at 2 different CPU speeds, noting the CPU usage on a single graph. At 3.2 GHz, CPU usage was consistently between 50-60%. I then dropped the speed right down to 2.0 GHz (and the core down to 1.0v; talk about cool running) and CPU usage hovered between 70-80%! The HD7850 was run on a 22 inch screen with a native res of 1680 x 1050 (about 85% of the pixel count of 1920 x 1080). The second-highest quality setting was used and V-Sync was on at 60 FPS. I did this multiple times and I could not notice any difference in gameplay at all! So, do we really need to OC the CPU to see any difference if a game is coded really well?

As for getting a 2nd 7850, that will definitely happen in about 2-4 weeks now. I just bought myself an Align Trex 500E helicopter (my other ones are getting lonely!), but that's another hobby for another forum!
 
If you're using V-sync and hitting 60 FPS most of the time, you will not notice microstutter. You will notice it if you push the cards too hard and the FPS stays below the V-sync cap for long enough periods. If you don't plan on using V-sync, then you will notice microstutter/frame time variations. Having a slow CPU paired with a crossfire setup will let the FPS dip more often, which doesn't help the situation. These are things people looking at crossfire setups should note.
 


Yes, I am aware of micro-stutter, but I have not experienced it myself. For a while I thought my 460s were stuttering, but it turned out to be too low a polling rate on my mouse! In Crysis 2, movement with the keyboard was silky smooth, but movement with the mouse was very jerky. I don't know why it took me so long to figure it out. My old Logitech gaming mouse was only polling at 125 Hz. Once I bumped it up to 500 Hz, the problem went away.

To make sure my 460s were not stuttering, I logged frame times in FRAPS and saw no problem there. As for micro-stutter on Xfire 7850s, I have seen that the newest drivers have made a big improvement in frame times on some of the latest titles, bringing them down closer to the equivalently rated Nvidia cards and thereby reducing noticeable stutter, if any. Once I get the 2nd 7850, I will be testing for the slowest CPU speed possible that does not impact game performance at full HD!
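
(For anyone who wants to run the same frame time check on their own logs, here is a minimal Python sketch of the analysis. It assumes a FRAPS-style frametimes CSV with a header row followed by frame index and cumulative elapsed milliseconds; the file name is hypothetical.)

    import csv

    def load_frame_deltas(path):
        # FRAPS-style frametimes log: header row, then frame index
        # and cumulative elapsed time in milliseconds
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row
            times = [float(row[1]) for row in reader]
        # per-frame time = difference between consecutive timestamps
        return [b - a for a, b in zip(times, times[1:])]

    deltas = load_frame_deltas("frametimes.csv")
    deltas.sort()
    avg = sum(deltas) / len(deltas)
    p99 = deltas[int(len(deltas) * 0.99)]  # 99th percentile frame time
    print(f"average: {1000 / avg:.1f} FPS")
    print(f"99th percentile: {p99:.1f} ms (~{1000 / p99:.0f} FPS in the worst 1%)")

A big gap between the average and the 99th percentile figure is the numeric signature of the stutter being discussed here.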

The only reason I use V-sync is because I notice tearing on the screen quite easily. I know the FPS goes up to well over a hundred at times with V-sync off, but even tearing at high FPS is annoying to my eyes. It makes the motion seem jerky to me!
 
One of several issues with Crossfire is "runt" frames. While an FPS monitor may show a high FPS number, what it actually counts as a whole single frame is often only a very small part of the screen, just a fraction of a frame. This inflates FPS readings, lending the illusion that performance is higher than it actually is. When a reviewer uses proper FCAT testing techniques, what they often find is that the real or "Observed" FPS is actually no higher than a single card's.

Filtering out the runt frames leads to "Practical FPS" results that are half the "Hardware FPS" (what would be measured by FRAPS, Afterburner, etc.). You can see that AMD is working on a "prototype" driver to address the issue. Nvidia SLI does not experience the problem in any way:
http://www.tomshardware.com/reviews/radeon-hd-7990-review-benchmark,3486-5.html
[Chart: bf3-average.png, BF3 average FPS from the linked review]
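
(To make the runt-frame idea concrete, here is a hedged sketch of the FCAT-style filtering described above. The 21-scanline cutoff matches what the review sites used, and the sample capture data is made up.)

    # Frames occupying fewer than 21 scanlines of the captured output
    # are "runts": counted by FRAPS, but practically invisible.
    RUNT_CUTOFF = 21

    def fps_with_and_without_runts(scanlines_per_frame, capture_seconds):
        full_frames = [s for s in scanlines_per_frame if s >= RUNT_CUTOFF]
        hardware_fps = len(scanlines_per_frame) / capture_seconds
        observed_fps = len(full_frames) / capture_seconds
        return hardware_fps, observed_fps

    # hypothetical capture: full frames alternating with tiny runts
    hw, obs = fps_with_and_without_runts([1065, 12, 1068, 15, 1059, 9], 0.05)
    print(f"hardware FPS: {hw:.0f}, observed FPS: {obs:.0f}")  # 120 vs 60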
 


You raise an interesting point here, matto17secs! It seems like an important issue for AMD to fix since the release of their dual-GPU 7990, which should perform similarly to Xfire 7970s! I wonder if the thread starter is aware of this. Or is this really an issue with the high-end cards only, even though it could be a problem for the entire 7000 series?

Are you sure Nvidia cards are not affected by this to some degree? I seem to recall that quite often my SLIed 460 setup (with V-Sync on) would display a constant 60 FPS, yet panning around the screen would seem to jitter even though FRAPS showed no dip below 60 FPS! Maybe FRAPS is incorrect here? Once I get my 2nd 7850 I will certainly be testing for this! There is nothing more annoying than jittery gameplay when the hardware should be capable of high FPS.
 


The 460s are 2 generations old and may have had stuttering problems back then; the newer 6xx cards and drivers don't exhibit this. Plus, panning around can exhibit jittering even on a single card if high texture detail and draw distance are filling up your VRAM, new textures are streaming in, the game engine is poorly optimized, etc. If you're running V-sync on, which most people do, and you're hitting the V-sync limit, you have nothing to worry about anyway. Just get your detail settings right so you're hitting 60 FPS most of the time.
 
My 670 SLI was both cheaper and faster than my old 7950 Xfire, which was littered with problems. From my experience, SLI > Xfire. The only disadvantage was losing the 3GB frame buffer, which would have been useful for the 1440p I play at, but then again, even in Crysis 3 I still don't run out of frame buffer at 1440p maxed out.
 


That's something I will have to look into further. I have to admit, for the amount of time I've had my 460s, they haven't really had that much use! Together they outperform my single 7850, but not by much. They will end up going into my 2nd main rig once I get the 2nd 7850. And as I said in an earlier post, I'll do some extensive benching on 3 dual-card setups and determine which works best!
 

PC Perspective has done an exhaustive series on these issues. They found that the problems increase as you move down into the mid-range Crossfire setups. They also did not find any hint of the problem with Nvidia SLI.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Test-3
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Tes-12
 


While this is informative, it's outdated as of the last 4 driver releases from AMD. I know this because I have tested my setup not only on a 60Hz monitor but also on 120Hz and 240Hz monitors. When the frames are very high, such as the 140s in Borderlands on the 120Hz monitor, there is an odd sensation of fluidity in movement that you just don't get from a 60Hz refresh rate. If the "effective" frame rate were only half the reported frame rate, there would be some choppy scenery in the game on the higher-refresh-rate monitors. Also, one thing to keep in mind is that although Nvidia may not seem to have these issues (I am by no means a fanboy of either company), it is well known that Nvidia Surround has more issues than crossfire outside of the high-end GPUs such as the 680 and 690. Crossfire does suffer from driver issues upon the release of fairly new cards, but they have definitely pumped out much better drivers over the past few releases.

Also, one serious thing to keep in mind with this data is that it comes from BF3, which is not only geared toward Nvidia cards; they actually used Nvidia-specific encoding. The marketing even shows the Nvidia logo during startup and all that jazz. Similarly, Far Cry 3, BioShock Infinite, etc. are partnered with AMD and would give similar results in favor of AMD. I'm not rejecting the information at hand, I'm simply bringing a wider perspective to light.
 


As far as the performance difference between PCIe 2.0 x16 and 3.0 x16 goes, there is seriously like a 3% difference, if even that. Very few cards can actually push the bandwidth limit of PCIe 2.0 x16, let alone 3.0 x16, so no worries on that part. Also, I have played every single game listed above with V-sync on to be sure I'm holding the proper FPS. When I made this thread I was specifically referring to V-sync off, since it wouldn't be quite as helpful if I listed all my games and just said 60 FPS for every one, lol.
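
(For anyone curious why the slot rarely matters, here is a rough sketch of the raw link bandwidth behind that claim, using the published PCIe signalling rates and encoding overheads.)

    def pcie_gb_per_s(gt_per_s, encoded_bits, payload_bits, lanes=16):
        # usable bytes/s = transfers/s * payload fraction / 8 bits, per lane
        return gt_per_s * (payload_bits / encoded_bits) / 8 * lanes

    gen2 = pcie_gb_per_s(5.0, 10, 8)     # PCIe 2.0: 5 GT/s, 8b/10b encoding
    gen3 = pcie_gb_per_s(8.0, 130, 128)  # PCIe 3.0: 8 GT/s, 128b/130b encoding
    print(f"PCIe 2.0 x16: {gen2:.1f} GB/s")   # ~8.0 GB/s
    print(f"PCIe 3.0 x16: {gen3:.2f} GB/s")   # ~15.75 GB/s

A mid-range card comes nowhere near saturating even the 8 GB/s of 2.0 x16, which is why the measured difference stays in the low single digits.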
 
UPDATE: Metro Last Light
Highest settings (except I have SSAO turned off).
I'm hitting an average of 77.6 FPS (sometimes it's in the high 80s, sometimes the mid 50s).
Temps are GPU1: 55-59C, GPU2: 53-56C.
Also, for fun I decided to overclock my cards; the temps above are with the master card overclocked to 910MHz and the slave card to 1000MHz. The temps are outstanding for that overclock range. I have both cards set at 75% fan speed.

One thing to keep in mind when crossfiring is to overclock the slave card slightly higher than the master card, due to the relay of information sent in a loop from the master to the slave and back to the master card. This improves response time and really helps the scaling when the workload is near the 80% range.
 


It certainly isn't looking good for going the CrossFire way, is it! I had a good look at those links and I will definitely agree there is a problem with CrossFire. But it really does look like a driver issue, and I'm sure AMD is doing its best to resolve it. Could you imagine how upset AMD video card owners (wanting to go Xfire) would be if it turned out to be a hardware flaw! Would it even be possible to do a video card recall?

Here's a link to how the new prototype driver is doing. As you can see, there is significant improvement, but still not as good as Nvidia SLI!
http://www.techngaming.com/home/news/controversial-odd/controversial/those-serious-crossfire-microstuttering-and-runt-frame-issues-amd-responds-r780
 


Lol, this is exactly what I was referring to in the post above. To put it simply, this is just marketing done by both companies to try and downplay the opposing company's sales. When you do real-life benchmarks of the equivalent competing graphics cards, they almost ALWAYS go blow for blow in the ring. If you buy a GTX 690 and compare it to an HD 7990, you're going to get almost perfectly level performance. The same goes for most cards down the line (680 vs 7970, etc.). Then most people hop on a bandwagon of sorts to claim their loyalty to a certain brand. I have both Nvidia and AMD cards. I love both brands; both have pros and cons that make up for where the other falls short. The reason I am going off on a rant about this is that there always seems to be very biased information out there, and the average consumer gets lost in things that really don't matter. My GTX 670 is a beast, I love it. But I also love my crossfire HD 7850s. Both setups play the majority of the games I like on the highest settings, and when it comes down to that, does it really matter what other small snippets float around on the web? 😀
 


Yes, I totally agree with you! I am not a fanboy of any company either. I mostly buy a product for Bang 4 Buck! I had my 460s for about a year on a board with a Phenom II 555 BE (unlocked as a quad) but wasn't happy with the performance. I thought an upgrade to the i5 2500K + a new mobo would improve the situation, but it really didn't! That was $600 I could have spent on something else. As I said in an earlier post, dropping my Phenom CPU speed from 3.2 GHz to 2.0 GHz made no difference in gameplay (in Crysis 2, that's all I tested). The CPU usage did go up, but that's to be expected. So it appears the upgrade was unnecessary in this case!

I bought both an 8800GTS and an HD4870 when they were new on the market and paid top dollar, only to be disappointed with their performance within a short period of time. What we really need to see is benchmarks made on more mainstream setups, not high-end ones! That way the majority of the public (the Bang 4 Buck people like myself) will get a better indication of what performance they can afford. Also, as I said in a previous post to someone else, I'm sure the Xfire issue is driver related. Here's a link to a better driver:

http://www.techngaming.com/home/news/controversial-odd/controversial/those-serious-crossfire-microstuttering-and-runt-frame-issues-amd-responds-r780

I'm sure AMD spends a substantial amount of money on research and development but somehow let a few driver bugs out. They just need to catch them now! I will be getting the 2nd 7850 in about 10 days. Can't wait!!!

Edit: Just purchased another 7850 today; I should have it tomorrow, the 25th! I didn't get the same card as the one I already have; I bought the HIS Radeon HD7850 IceQ X Turbo 2GB (it was a bit cheaper and has 2 very thick heat-pipes). I will be mating it with the Gigabyte 2GB OC (with 2 fans). So stay tuned for some additional benchmarks sometime next week!
 
Okay, this will be just a short post (I hope), as I haven't had the time to do the extensive benching I'd hoped for.

Delivery was slow for some reason and I didn't receive the card until the 28th. I spent a whole night trying to adjust the voltage on the second card to 1.075v to match the Gigabyte card (using AB, Trixx, etc.), but it always boosts to a minimum of 1.210v in 3D. I have flashed several firmwares to no avail! Both cards are stable at 1150MHz with the cores at 1.225v, but I will mainly use the stock clocks of 1000MHz on the core and 4800MHz on the memory, because for an OC of about 10-15%, system power went from 350 watts to 450 watts and temps rose 10-12 degrees C while running FurMark! I have noticed that in Xfire mode, increasing the memory speed to, say, 5200MHz made little to no difference in frame rates in FurMark. Scaling is very close to 100% in FurMark, too. I prefer to use minimum FPS to gauge performance, and with the settings I used, minimum FPS went from 80 with one card to 158 in Xfire!
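
(As a quick sanity check on those minimum-FPS numbers, here is the scaling arithmetic as a one-off sketch.)

    def scaling_efficiency(fps_single, fps_multi, n_gpus=2):
        # 100% would mean each extra GPU adds a full card's worth of performance
        return (fps_multi / fps_single - 1) / (n_gpus - 1) * 100

    print(f"{scaling_efficiency(80, 158):.1f}% scaling")  # -> 97.5% scaling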

I only had a chance to test Xfire with Crysis 2, as the Crysis games are my favourite first-person shooters! I had to install the 1.9 patch as flickering was an issue, and I only tested with DX9 (I will install the DX11 patch soon). The res was only 1680 x 1050 (about 15% fewer pixels than full HD). Everything was on Ultra settings; with V-Sync off, frames were around 90-150. With V-Sync on at 60 FPS, what can I say! As Few Oranges said in a previous post, there is a strange sense of smoothness, even during intense action! It's this smoothness which makes aiming so much easier (this will improve my hit/miss ratio a little!). According to AB, GPU load was between 40-60%, with my watt meter showing system power around 250 watts. Temps were also good, at a little over 50C for both cards! The top card has 2 fans, the bottom card only 1. I set my fan speed percentage to match the card's temperature in Celsius, such as 50% at 50C.
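
(Here is a minimal sketch of that fan profile in code. The one-to-one temp-to-percent mapping is from the post; the 30/85 clamp limits are my own assumption.)

    def fan_percent(core_temp_c, floor=30, ceiling=85):
        # fan speed tracks core temperature one-to-one, within safe limits
        # (floor keeps the fan audible-but-spinning, ceiling protects the motor)
        return max(floor, min(ceiling, core_temp_c))

    for temp in (45, 50, 59, 90):
        print(f"{temp}C -> {fan_percent(temp)}% fan")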

Before I finish this post I must mention one thing I noticed during Crysis 2 with V-Sync off, while I had AfterBurner displaying frame times. While I was around 100-120 FPS, I could see the readout flickering between two rates of around 10 and 20 msec! Well, 20 msec is only 50 FPS. I don't know exactly how Xfire renders the frames, but if it's alternate frame rendering, then it's probably the 2nd card causing some sort of delay while rendering. More than likely a driver issue, and I'm sure it's getting attention! If there are runt frames, I certainly didn't notice them. But tearing at well over 100 FPS is annoying to my eyes, with V-Sync off of course. I'm pretty sure the tearing is not caused by runt frames, as my SLI rig gives me the same sensation with V-Sync off!
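
(To put numbers on that 10/20 ms flicker, here is a small sketch. The "felt" figure is a simplification of mine, on the assumption that the eye tracks the slower of the two alternating frames.)

    fast_ms, slow_ms = 10.0, 20.0              # the two frame times observed
    counter_fps = 2000 / (fast_ms + slow_ms)   # what an FPS counter averages to
    felt_fps = 1000 / slow_ms                  # pacing set by the slow frames
    print(f"counter: ~{counter_fps:.0f} FPS, pacing feels closer to ~{felt_fps:.0f} FPS")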

I did play around a little with BF3 and CoD MW3, and the smoothness of gameplay is absolutely amazing and well worth going Xfire, IMHO. I plan on getting a single 7970 (when the prices drop) before the end of the year to go in my gaming cube and compare it to my Xfire 7850s.

But for now, that's all I can say until I do more benching!
 


Yes, it's alternate frame rendering. Plenty of sites have tested this; the latency is improved with the prototype driver in some games: http://techreport.com/review/24703/amd-radeon-hd-7990-graphics-card-reviewed/8
Better, but not a fix for all games. I'm sure one day they will fix this crossfire problem, which has existed since the beginning but is only now being worked on thanks to sites like TechReport asking the hard questions.
 


Yes, I have seen that review before; that's how I know of the prototype driver. There are still some nasty spikes in some of the captured frames on some titles, regardless of manufacturer (Nvidia/AMD). No doubt these spikes can be perceived as stutter! I suppose it's games like Crysis 3 that can tax even the best system money can buy today. It's definitely the game engines of today that are pushing the video cards of tomorrow to the limit!
 
The problem with developing the prototype driver is: how do you decrease frame latency without also reducing performance? As we have seen in the charts, when you factor out the runt frames, Crossfire simply does not perform all that well. The Observed FPS is the real, actual performance of the Crossfire setup, completely stripped of the illusion provided by framerate-boosting runts.
 


Excellent to hear, Crash! I am glad you're enjoying your setup and I appreciate the feedback. For scaling purposes, to possibly decrease the latency (frame times) between the slave card and the master card, always run your slave card slightly higher on the core clock and, if need be, the memory clock as well. I have noticed a major improvement in the smoothness of frame delivery, especially in GPU-intense games like Crysis, Far Cry 3 and Metro LL. This works really well for me, so it may help decrease the frame latency for you!
 


Yes, everything is going well with the CrossFire setup. I've been really busy doing fresh installs (and backups) of Windows on the new setup and my older SLI setup (even though I've had the SLI setup for a couple of years, it really hasn't been used much; both fans on the 460 cards don't even have any dust on them!). I plan on keeping both systems, as I will continue to do some more benching on the two different platforms. Besides, it's hard to sell technology that's a few years old, even if it is like new!

So far with the 7850s, I have been able to max all settings in every game I've tested (with V-Sync on; I don't like tearing, even at high frame rates) except for Crysis 3! But I think most gamers will be in the same boat with that game! V-Sync keeps the wattage and temps down, so I use it all the time unless I'm benching, of course. It appears my Xfire 7850s are about 50-60% faster than my SLI 460s. I thought it would be more, as the 460s are only the 768MB versions. I'll do more stringent testing to verify this soon.

Few Oranges, thanks for your advice on the slave card speed. I will have a chance to do more benching after the weekend, so I'll give it a go then.
 
Hey, thanks for the great and informative thread. I currently have a Gigabyte 7870 GHz Edition (it's a Pitcairn, not the cut-down Tahiti), and I would love to match or exceed the performance of some of the higher-end cards with an Xfire setup. I do have some questions and concerns that perhaps you could help alleviate. First is the horrific performance issues Tom's has posted about in the recent past. It appears as though the drivers don't even utilize the second card. This may be an architectural issue, as the only Xfire benchmarks I can find are on Tahiti GPUs. That being said, it seems none of you experience these problems at all and are posting some great FPS, so it has tempted me to go ahead with it. My second concern is that I am using an FX-8350 (I know, I know, but I have been an AMD diehard for a long time) and am worried I may hit a bottleneck. Thanks for any input.
 


Have a look at this as a guideline; it's the same generation of GPU, but on the high end:
http://www.anandtech.com/bench/Product/768?vs=769

The FPS you will get depends on the title and the resolution you play at. As you can see, scaling varies a bit. I've read quite a few articles claiming that 100% scaling is only theoretical, but you can see that some benchmarks actually show 100% scaling is possible. So much for theoretical! I believe that as long as the hardware can push the data through quickly enough, 100% scaling should be possible in most titles.

Anyway, what I suggest you do is play some of your games at the quality/resolution you like and keep a copy of Task Manager in the background while playing. Make sure you have the Performance tab active, showing one graph for all CPU cores. After playing your game for a minute or so, swap to the graph. If your usage is around 60% or below, your CPU should be OK with the 2nd card. If you haven't OCed your CPU at all, then you still have plenty of headroom!
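
(The same headroom check can be scripted instead of eyeballing Task Manager; here is a hedged sketch using the third-party psutil package. The 60% rule of thumb is the one from above, not a hard limit.)

    import psutil

    # sample total CPU usage once per second for ~30 seconds while the game runs
    samples = [psutil.cpu_percent(interval=1.0) for _ in range(30)]
    avg = sum(samples) / len(samples)
    print(f"average CPU usage: {avg:.0f}%")
    print("looks like headroom for a 2nd card" if avg <= 60
          else "CPU may become a bottleneck")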

Oh, and one last thing: make sure your motherboard is CrossFire capable!

Edit: In response to your concerns about Xfire issues, I have not experienced any problems like stuttering/runt frames; they are not visible to my eyes, anyway. Driver updates do appear to be improving Xfire performance, at least in benchmarks. Heck, even my trusty old 768MB GTX 460s are performing better than they were 2 years ago with newer drivers! But I have to admit, I don't run my gaming rigs on XP or Vista anymore. It's 64-bit Seven all the way!
 


No, your CPU is nowhere near a bottleneck for this crossfire setup; bottlenecks are really more of a concern on low-clocked dual-core systems. Also, most games truly only utilize a single core at a time. For fun, though, you can always bench with your CPU at stock and then juice it a bit to see if there is really any difference in the results.
As far as the results you're referring to, those are fairly old tests and reflect the driver performance of that time; we've gone through many driver revisions since then. The 13.6 betas are the most recent at the time of this post.
As an update, if you are interested, I will be dishing out some more FPS counts for games such as "Remember Me" and "Deadpool." If you have any more questions, please feel free to ask!
 
Thanks for your information and reassurance. I'm definitely looking forward to getting some delicious Xfire on my 7870s. I will also look at some benchmarks as soon as I get everything up and running, and I am looking forward to more benchmarks from you as well.
 
