Best Graphics Cards For The Money: January 2012 (Archive)




Looks like a PSU issue caused that, or a loose/shorted connection. I wouldn't say it was bad enough to doom a manufacturer; if they re-ordered now and re-tested with different physical GPUs and got the same results, then maybe.
 
One person's bad results are an insignificant sample size (as are my one-person excellent results). In any case, I only ran one card, not two. Mine overclocked easily to 1130MHz (I didn't try any higher; I noticed that speed gave me a nice round 800MHash/s rate).
Worth noting, before I bought the Gigabyte card, I had tried an XFX "DD" model since I'd had excellent results with prior "DD" cards (HD7770 and HD7870) and two other XFX cards (HD5770 and HD7750). Their HD7970 ran hot, howled, and support was unresponsive on whether or not replacing the TIM would void the warranty, so I RMA'ed it for the Gigabyte version. That was a shame, because I really like how solid my other XFX cards have been, and how quiet they are.
 
Yeah, XFX seems to be suffering from some issues as of late. Hopefully they can get their act together on quality control, as I used to like their products as well. My 4870 and HD 5850s were Sapphire and they are still doing well for me. Well, one 5850 is, anyway; the other is shelved until I get the time to replace the 4870 in my file server with it. :lol: The 4870 will probably end up in my PhII X2 rig (WoW rig #4) once I get RAM and a PSU for it.
 

It's really funny how that goes. I love Asus motherboards, but 2/4 Asus graphics cards I bought new died within hours or days of first use; they're near the bottom of my list now. OTOH, I've had excellent results with MSI graphics cards, yet their cheaper motherboards are known for VRM issues (to be fair, their Z77A-GD65 Gaming board I got to review was one sweet board; had my case been ATX, I would have switched to it as my primary).
 


Interesting. Do you have any idea how much you have made right now compared to how much the cards have cost you in electricity usage?
 
For me, the electricity use wasn't much. I mined on my open testbed, which was an i5-3570K. My UPS showed it pulling 312W from the wall. I know it has gone up this year, but I think my rate is in the $0.12/kWh range, so it cost me less than a dollar per day. One BTC is currently at $869, so I'm well ahead of the game.
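If anyone wants to sanity-check that, here's a rough back-of-the-envelope sketch in Python; the 312W draw and $0.12/kWh rate are just my own approximate numbers from above, so swap in your own.

```python
# Rough daily electricity cost for a mining rig (back-of-the-envelope only).
wall_draw_watts = 312    # measured at the UPS
rate_per_kwh = 0.12      # approximate utility rate in $/kWh

kwh_per_day = wall_draw_watts / 1000 * 24     # 312 W for 24 h is about 7.5 kWh
cost_per_day = kwh_per_day * rate_per_kwh     # about $0.90 per day

print(f"~{kwh_per_day:.1f} kWh/day, about ${cost_per_day:.2f}/day")
```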
 
@3rd-party cards with satisfactorily performing coolers suddenly turning sub-par: this seems to happen when vendors recycle parts without paying attention to quality and performance, so cards that previously performed well start performing below standard.
http://www.tomshardware.com/reviews/r9-290x-case-performance,3710.html
In the article, take a look at Asus' 290X DCUII and Gigabyte's GTX 780 WindForce press sample. Check the comments (actually the THG Germany site) for the follow-up and the fix. FUD is: I haven't read the German article yet; I think it's a QC issue.
 


Unless/until you sell the BCs, you've not made anything back at all. If the goal was to recoup the cost of the GPUs, shouldn't you have sold the BCs by now? Waiting for the value to keep climbing just to get those extra few $ when the original investment goal has already been met doesn't make sense to me. Delaying merely risks the value crashing for whatever reason.

Ian.

 
Reviews of XFX cards consistently show worse temperature and overclocking results compared to directly competing products.
So besides the fact that issue reports from users with XFX cards are a lot more common, that should be enough to avoid them in my opinion.

I remember the HD7950 reviews from before I bought mine; I came across a review that showed crazy voltage fluctuations corresponding alarmingly with VRM temps exceeding 80°C.

So yeah, just my 2 cents. :)
 
I'm not sure what to think about XFX now. My HD7870 has been a real sweet-spot card. The HD7770, HD7750, and HD5770 are now with their second owners, and have done well also.
In any case, I'm not likely to need a graphics card any time soon. If I were to buy one, it probably wouldn't be stronger than the GTX760 though.
 
My HD 7970 replaced CF 5850s. I think I am good for a while as well. To think I would hit a point where I had absolutely no need of an upgrade for probably a couple of years is astounding to me. :lol: My 3570K won't be obsolete anytime soon. Maybe more RAM and a better storage solution are in my future, but I doubt I really need them. My spare rigs will probably get upgrades instead of my main rig. :lol:
 
Why did you decide to go with a bar/line chart for the performance/price comparison? I feel like the natural choice would have been a scatter plot with price on one axis and performance on the other. You could even do a sort of candlestick deal with lines connecting the different settings for each card if you wanted to emphasize them.
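Just to illustrate what I mean, something like this rough matplotlib sketch would do it; the card names and numbers here are made-up placeholders, not the article's actual data.

```python
# Minimal sketch of a price-vs-performance scatter plot (placeholder data only).
import matplotlib.pyplot as plt

# Hypothetical example values; substitute real prices and benchmark scores.
cards = ["R9 270", "R9 270X", "GTX 760", "GTX 770"]
price = [180, 200, 250, 330]          # USD, illustrative
performance = [100, 110, 115, 140]    # relative index, illustrative

fig, ax = plt.subplots()
ax.scatter(price, performance)
for name, x, y in zip(cards, price, performance):
    ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5))

ax.set_xlabel("Price (USD)")
ax.set_ylabel("Relative performance")
ax.set_title("Performance vs. price")
plt.show()
```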
 
True! I'm getting itchy from lack of an excuse to build! Phoenix has a 1TB RAID1 and Omega has a 2TB RAID1; both have SSDs as well. I'm contemplating a new case though; the Enermax Hoplite is loaded with nice options, but they create some cable management snarls.
 
@Isaiah4110, I'm pulling roughly 1.4 MH/s combined with a 7850, R9 270X, and 280X. My differential energy cost is about 8-9 cents per kWh, and I'm pulling an extra 550-600 watts. But I'm also not having to run my space heater under my desk, because the 280X puts out enough heat to keep things toasty. I'm looking at this short term and probably won't run through the summer.

I started the 7850 mining the week after Thanksgiving, and the two others right after Christmas, and we have just over $300 in LTC in the wallet. I've probably spent an extra $40 in electricity (not counting that I'm not running the space heater).
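For anyone curious how the electricity side adds up, here's a rough sketch; the wattage and rate are midpoints of the ranges above, and the day count is only an illustrative assumption.

```python
# Rough electricity cost of the extra mining load (approximate numbers only).
extra_watts = 575        # midpoint of the extra 550-600 W at the wall
rate_per_kwh = 0.085     # midpoint of the 8-9 cents/kWh differential rate

daily_cost = extra_watts / 1000 * 24 * rate_per_kwh
print(f"~${daily_cost:.2f} per day with all three cards running")

# Net-so-far comparison is illustrative only, since the 7850 ran by itself
# for the first few weeks before the other two cards came online.
wallet_value_usd = 300
full_load_days = 30      # assumed, for illustration
print(f"Wallet value minus {full_load_days} days of full-load power: "
      f"~${wallet_value_usd - full_load_days * daily_cost:.0f}")
```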

There's always the risk that the currency markets will crash, but it's not like I'm investing my life savings. I personally would never do this for a real investment. We'll cash out probably in a couple months. Right now, my boys are locked into it and getting a good education on top of it all!

So, this is a best-GPUs-for-the-money thread, and from my perspective the AMD GPUs were the better buy, even at inflated prices, and even though our main purpose for buying them was gaming.
 
This question is for Onus: are you using AMD cards to mine Bitcoins?

If yes, would it take you forever to get one coin because of the difficulty?

I just want to know how you're ahead of the game...
 

Well, you've got quite the graphics gamut there. StarCraft 2 is actually CPU bound and doesn't require much in the way of GPU. BF4 will take as much GPU as you can throw at it. Skyrim is somewhere in the middle, depending on what mods and texture packs you're using.

I do agree that GPUs are hitting the point that even a mid-to-low-end model has enough oomph to get passable results at 1080p. My 6870 is three years old now, and it still gives me acceptable performance at 1920x1200. I don't max out details, but it's medium to high settings depending on the game and I maintain a comfortable 40fps. Older games of course are maxed out and hit my 60fps v-sync ceiling. While I was dreaming of replacing it last year with a 7950, I had to admit to myself that I wasn't really missing out on much so I didn't need to spend the money. Depending on what's released this year, I might replace it next spring.

If you just want good performance on a single screen, I'd start with a 7870 / 270. It's overkill for StarCraft and Minecraft, but it'll do a better job at BF4, Tomb Raider, and the more demanding titles without breaking a tight budget. If you have a bit more money, look at the 270X or 760. I have a hard time recommending anything above a 770 right now. The 770 has plenty of power for anything on a single screen without being a complete bank-breaker. The only reason you'd need more is for multiple displays or bragging rights.

Any card between an R9 270 and GTX 770 should be viable for at least three years (and probably more). Yeah, a few titles always come along that try to break GPUs. But most games look pretty dang good at medium-to-high detail settings. As long as you're willing to turn the detail settings down a bit, I see very few reasons a new GPU can't last at least four years.
 
Well, I'm coming from a 6850, so I don't want to spend 200 bucks on something that's only marginally faster, so I figured at least a GTX 760. I'm just debating with myself whether the 770 or the 780 would be overkill.
 