Best Graphics Cards For The Money: January 2012 (Archive)


There are small differences in the clock bands, but the real difference is the TDP, which determines where they'll sit within the very large range of allowed clocks. On the desktop CPUs, the base clock of all the iGPUs is either 200 MHz or 350 MHz. The max turbo ranges from 1 GHz to 1.3 GHz (1.25 GHz is the highest among the non-R models). So all of them fall within reasonably similar ranges, but with a HUGE spread from minimum to maximum clock, which makes the TDP crucial in determining actual performance.
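To put that in concrete terms, here is a minimal, purely illustrative Python sketch of the idea. The clock range mirrors the figures above, but the power numbers and the linear clock-versus-power assumption are placeholders, not Intel specifications.

# Toy model: a TDP cap pins the iGPU's sustained clock somewhere between
# its base and max turbo frequencies. Power figures are assumptions.

def sustained_igpu_clock(base_mhz, max_turbo_mhz, gpu_power_budget_w, watts_per_100mhz):
    """Estimate the clock the iGPU can hold under a given power budget."""
    # Headroom (in MHz) that the power budget buys above the base clock.
    headroom_mhz = (gpu_power_budget_w / watts_per_100mhz) * 100
    # Clamp the result to the advertised base/turbo range.
    return max(base_mhz, min(max_turbo_mhz, base_mhz + headroom_mhz))

# Same silicon, two hypothetical power budgets for the graphics portion:
print(sustained_igpu_clock(350, 1250, gpu_power_budget_w=4, watts_per_100mhz=0.8))   # 850.0 MHz
print(sustained_igpu_clock(350, 1250, gpu_power_budget_w=15, watts_per_100mhz=0.8))  # 1250 MHz (turbo-limited)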
 

That's why the chart ought to simply show them at their best. Those of us who regularly reference this chart know to look out for factors that will bring performance down. Put a "*" next to all of them, with something like: "will vary, typically lower, based on CPU TDP limitations."

 
If I may make a suggestion to the article writer: you should note in your recommendations whether freebies are (or are not) factored into the cost. I have a feeling a lot of people are recommending cards because they come with free, recent games. So if we went with AMD's Never Settle bundle, which I think offered at most three recent games for the picking, some people could claim, "Well, then the video card is effectively $180 less!"

Also, regarding mobile GPUs: notebookcheck.com does have a comprehensive list of mobile GPUs and their performance tiers.
 
Why aren't these miners using ASICs to do the work for them? Wouldn't that be more efficient than stacking up 6-8 cards and burning megawatts in the meantime?
 
I really love the 'hierarchy' chart with older cards. Comparing old to new is really valuable.

The addition of the 'performance per dollar' chart is great. Please keep it. You looked at some runner-up cards that did not get a recommendation. Any chance you could add a few of those to the 'performance per dollar' chart?
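In case it helps, here's a trivial sketch of how such a column could be computed for the runner-up cards: average FPS divided by street price. The card names and numbers are placeholders, not figures from the article.

# Sketch: performance per dollar = average FPS / street price.
# Hypothetical cards and values for illustration only.

runner_ups = {
    "Card A": (58.0, 180.0),  # (average FPS, price in USD)
    "Card B": (71.0, 250.0),
}

for name, (fps, price) in runner_ups.items():
    print(f"{name}: {fps / price:.3f} FPS per dollar")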
 

Although ASICs have rendered BTC mining with GPUs obsolete, Litecoin uses a different hashing algorithm and there are not yet ASICs for it.
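For anyone curious about the difference, here is a small Python illustration using the standard hashlib module. Bitcoin's proof of work is double SHA-256 (which is what BTC ASICs are built for), while Litecoin uses the memory-hard scrypt function; the scrypt parameters below (N=1024, r=1, p=1, 32-byte output) are the commonly cited Litecoin settings, and the 80-byte header is just a dummy placeholder.

import hashlib

# Dummy 80-byte "block header" stand-in; real headers encode version,
# previous hash, merkle root, time, bits and nonce.
header = b"\x00" * 80

# Bitcoin-style proof of work: double SHA-256 (what BTC ASICs implement).
btc_hash = hashlib.sha256(hashlib.sha256(header).digest()).digest()

# Litecoin-style proof of work: scrypt, which needs a chunk of fast memory
# per hash, so SHA-256 ASICs are useless for it. Parameters are assumed.
ltc_hash = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

print(btc_hash.hex())
print(ltc_hash.hex())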
 


Yeah, HD 7970s are in the same boat on that one. Good thing I didn't want another one for CF. :lol:
 

It's probably temporary. The 280X is still available cheaper than the GTX 770 in my corner of Europe. And Bitcoin, Litecoin etc. are currently dropping in price (making it less attractive to buy GPUs to mine them).
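A quick back-of-the-envelope sketch of why the price drop matters: the payback period on a GPU bought for mining stretches out fast as the coin price falls. Every number below is a made-up placeholder, not a real hash rate or market price.

# Payback period in days for a GPU bought purely to mine; all inputs are
# invented placeholders, not real rates or prices.

def payback_days(gpu_price, coins_per_day, coin_price, card_watts, kwh_price):
    daily_revenue = coins_per_day * coin_price
    daily_power_cost = (card_watts / 1000) * 24 * kwh_price
    daily_profit = daily_revenue - daily_power_cost
    return float("inf") if daily_profit <= 0 else gpu_price / daily_profit

print(payback_days(400, 0.05, coin_price=40, card_watts=250, kwh_price=0.15))  # ~364 days
print(payback_days(400, 0.05, coin_price=20, card_watts=250, kwh_price=0.15))  # ~4000 days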
 


Litecoin uses a different algorithm, yes. I had a customer looking for a bunch of 7950s to build multiple mining machines. Of course, there is a big shortage of all the AMD cards, so stock is low and prices are high.
 


 
So there will be different HD 4400s at different points in the hierarchy. If my idea is right, then a few indications could be given to identify each HD 4400.
 


 
Thanks a lot for this great update! I am currently trying to upgrade my computer and it's the first time I'm doing it myself. I was planning on getting an AMD Radeon R9 280X, but now I have to find something else because I don't want to wait too long. I'm thinking of getting a Zotac GeForce GTX 770 Extreme Edition, which seems just as good (even a little bit faster):
http://versus.com/en/amd-radeon-r9-280x-vs-zotac-geforce-gtx-770-extreme-edition
 
All the comments at the top say they are from December 2012 - maybe your filter isn't working?
And it would be nice to have the newer integrated Intel graphics on the list also.
Thanks for a good list.
 


The 240 is like the 750 on an A8, the 250 like the 7550 (256 or 384 stream processors at 800 MHz).
 