Best Graphics Cards For The Money: January 2012 (Archive)

I don't see how the GTX 770 is a good recommendation at $400 when you compare it to the R9 280X (a rebranded 7970), which costs $300. You might have to be braindead to buy that GTX 770, but oh well, it seems like Nvidia is paying big bucks so its card appears competitive.
 

Ehm, I agree almost 100% with the first part of your post. The GTX 770 offers performance that, compared to the R9 280X, would justify something like $320-330.

So why in the hell is the R9 280X/7970 GE one tier ahead of the GTX 770 and on the same level as the GTX 780???
Not only is this totally wrong, since the 7970 GE and R9 280X should be in the same tier as the GTX 770 rather than one tier higher, but they also recommend the GTX 770 at $100 more, LOL!
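For what it's worth, here's the back-of-the-envelope value math behind these complaints, as a rough Python sketch (the ~8% performance gap is a made-up placeholder, not a measured figure):

r9_280x_price = 300.0  # street price cited in this thread (USD)
perf_ratio = 1.08      # hypothetical: assume the 770 averages ~8% faster

justified_770_price = r9_280x_price * perf_ratio
print(f"Performance-justified GTX 770 price: ~${justified_770_price:.0f}")
# -> ~$324, in line with the $320-330 estimate above; nowhere near $400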
 
Third month in a row that TH insists on stating that a 7950 costs as much as a GTX 760, which has not been the case since at least June 2013. Telling a lie a thousand times won't make it any more true.
 


Been wondering the same thing.
Probably crippled (sounds bad when I say it) Bonaire GPUs.
 
(Oct 2013): One thing I'm wondering about with the new AMD 200 series is whether they've taken the opportunity to do anything about the micro-stuttering and runt frames beyond Catalyst driver updates. This would have been a golden opportunity to tweak the hardware design of the boards to support future driver changes that finally eliminate the problem.
 

People have been complaining about that for several months now. It's pretty silly. I think the reason is they wanted to show that the 7970GE had a little something extra compared to the regular 7970 etc., but it's pretty ridiculous to rank it alongside the GTX 780 instead of the 770.
 

It's a driver problem, not a hardware problem. They've already pretty much solved it for DX11 single-monitor configurations; they just need to extend support to DX9 and multi-monitor configurations.

Hardware changes are not necessary; better to save any real hardware changes for their true next-generation GPUs (Rx 300 series, presumably).
 

I would understand if their chosen benchmarks (some games run better on Nvidia hardware and some run better on AMD hardware) favored the 7970 GE/R9 280X enough to reach the GTX 780.
But in both Tom's Hardware Index Extreme and Tom's Hardware Index Performance, the 7970 GE scores even lower than the GTX 770.
So placing the 7970 GE/R9 280X in an even higher tier while their own benchmarks show the opposite doesn't make sense at all, really...

And today they did another weird thing. While their hierarchy chart places the R9 280X one tier ahead of the GTX 770, they recommend the GTX 770 at $100 more. :??:
 
Please add all of the integrated stuff (Intel/AMD); it looks like you're a generation behind on it. Having that information makes it easier to decide what the threshold is for buying a card to get a performance increase. I know it's difficult and probably boring, but it's useful information for people looking at the value segment.
 
Sooo. The 770 is placed a tier lower than the 280X and costs $100 more, yet it's a recommendation?

I can understand choosing the 770 over the 280X for dual-GPU setups, but not otherwise.
 
I wonder how GPU performance will be affected when AMD releases the Mantle API for the PC platform. Rumour has it that Mantle will make AMD GPUs beat the crap out of the nVidia GPUs in gaming performance and will force nVidia to dramatically reduce the prices of their GeForce cards as soon as this November.
 


The problem is that it has become clear (since the first time integrated graphics were added to the chart) that iGPUs (APUs especially) are very dependent on the rest of the platform, especially RAM speed.

The difference from 1600 MHz to 2133 MHz RAM, for example, can be MASSIVE. Where does Tom's put them on the chart then? Upper end? Lower end? That would just confuse people and add even more complaints to an already broken chart (broken because it's near impossible to maintain properly).
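To see why RAM speed swings iGPU results so hard, here's the peak-bandwidth arithmetic as a small Python sketch (the formula is the standard one: 8 bytes per transfer on DDR3's 64-bit channel):

def ddr3_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    # bandwidth (GB/s) = transfer rate (MT/s) * 8 bytes per transfer * channels
    return mt_per_s * 8 * channels / 1000

for speed in (1600, 1866, 2133):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s")
# DDR3-1600: 25.6 GB/s vs DDR3-2133: 34.1 GB/s -- a ~33% jump in the
# bandwidth an APU's graphics side feeds on, hence the large swings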

 

Mantle needs each game to be coded for it. So far, zero games support Mantle. BF4 will eventually support Mantle, but that's just one game. Since Mantle is PC-only, it's extra effort for the game developers, so I'm not convinced it will gain much traction.

In any case, it won't really be relevant within the next few months. The rumored Nvidia price drop has more to do with the simple price/performance ratio AMD offers on their products, regardless of Mantle.
 
The CPU hierarchy chart is just as messed up as the GPU one. I don't even know why I bother looking at either chart anymore. Reviews are published on here that constantly contradict what is in the charts, yet they are never fixed.
 


I see them on PCPartPicker. I doubt there will be any serious reviews given they are low-end, entry-level parts. But it would be nice to see where the R7 240 and 250 fall on the hierarchy chart.

Their stream processor counts (320 and 384) match up with the A8 and A10 APUs' built-in GPUs, so they are likely designed to pair with those units (i.e., for Dual Graphics).
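For the curious, the raw-throughput math behind that pairing, sketched in Python (the clock speeds are approximate assumptions for illustration, not official figures):

def gflops(stream_processors: int, clock_ghz: float) -> float:
    # each AMD stream processor does 1 FMA = 2 FLOPs per clock
    return stream_processors * 2 * clock_ghz

print(f"R7 240 (320 SP @ ~0.78 GHz assumed):   {gflops(320, 0.78):.0f} GFLOPS")
print(f"A10 iGPU (384 SP @ ~0.84 GHz assumed): {gflops(384, 0.84):.0f} GFLOPS")
# throughput in the same ballpark is what makes sensible Dual Graphics pairs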
 

AFAIK, only the A10-6800K officially supports DDR3-2133 RAM (at 1.5 V only):
http://www.amd.com/us/products/desktop/Pages/consumer-desktops.aspx#7
Almost all the APUs officially support DDR3-1866 RAM except the very low-end and Kabini APUs, maybe.
That's why only the A10-6800K (DDR3-2133) or something like that should take a higher position in the graphics card hierarchy chart.
 
Would love to see you test the 4870 X2 against a 650 Ti Boost in some new games, since you have them at equal levels on the chart.
 


That would be interesting, I agree, but imagine if they had to do this for EVERY tier?
 
I have a question about the 280X vs the 770. You rank the $400 version of the 770 as more powerful than the 280X. Does the extra gigabyte of VRAM on the 280X give it an advantage in any situation, or is the 2 GB 770 always better for gaming?
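A rough Python sketch of when the extra gigabyte can matter (the buffer counts and numbers here are simplifying illustrations, not measurements):

def render_targets_mb(width: int, height: int, msaa: int = 1,
                      bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    # approximate color + depth render-target footprint, scaled by MSAA samples
    return width * height * bytes_per_pixel * buffers * msaa / 1024**2

print(f"1920x1080, 4x MSAA: {render_targets_mb(1920, 1080, 4):.0f} MB")
print(f"5760x1080, 4x MSAA: {render_targets_mb(5760, 1080, 4):.0f} MB")
# even triple-monitor 4x MSAA targets stay under ~300 MB; it's large
# high-resolution texture sets on top of that which can push past 2 GB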
 


Mantle is not PC-only; in fact, Mantle is what is being used in the XBone, and it is rumoured to be used in the PS4 to boot (both consoles are AMD-based, so it is highly likely that this low-level API will also be used in the PS4).

What has been a very well-known fact at least since the Xbox 360 and PS3 is that games use the hardware resources of these consoles far more efficiently than a standard PC does through interfaces such as OpenGL or DirectX (which is even more "bloated" than OpenGL). This is a serious issue that game developers have been trying to solve for years now.

With the introduction of Mantle (which, unfortunately for nVidia, only works with AMD hardware), gaming performance on existing hardware is supposed to improve by orders of magnitude.

Porting games from the PS4 and the XBone will be a breeze compared to what it has been in the past; heck, even an AMD representative said that "our middle class card will ridicule NVIDIA Titan in Battlefield4"!!!

More news regarding Mantle will come out in November, and Battlefield 4 will get a Mantle patch in December.

In response, nVidia is said to be giving their current lines of GPUs a substantial price cut.

This is said to be the most revolutionary breakthrough, or game-changer if you will, of the past 10 years!

I wouldn't want to be a GPU salesman right now; it feels like the sky is about to fall... At the same time, as a buyer/end-user, some really exciting times are about to come.
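The efficiency claim boils down to per-draw-call CPU overhead: a thinner API burns fewer CPU cycles per submitted draw, so the GPU spends less time waiting. A toy Python model (the microsecond costs and draw-call count are invented for illustration; this is not actual Mantle or DirectX code):

def cpu_submit_time_ms(draw_calls: int, overhead_us: float) -> float:
    # CPU time spent per frame just issuing draw calls to the driver
    return draw_calls * overhead_us / 1000

for api, overhead in (("thick API (DX11-style)", 10.0),
                      ("thin API (Mantle-style)", 1.0)):
    ms = cpu_submit_time_ms(5000, overhead)
    print(f"{api}: {ms:.0f} ms of CPU time for 5000 draws")
# 50 ms of submission alone caps a frame at 20 fps; 5 ms leaves room
# for the GPU to become the limit instead -- that's the whole pitch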
 
This list does not make any sense.

The 7950 Boost can always be found substantially cheaper than the 760, so it earns the recommendation there.
THERE IS NO SENSE in paying $100 more for the 770 when the R9 280X is just as fast or faster.
THERE IS NO SENSE in paying $650 for the 780 when the 7990 can be found for $600.
The Titan, even though it's the fastest single GPU around, does not make sense at that price.
GTX 690? No. JUST NO.
 

OK, you win, it is not Mantle per se, but the API for the XBone is based on the Mantle API, or shall we say closely related to the Mantle API for the PC. It isn't too hard to guess what that will do for porting console games to the PC, and what that will do for AMD...
 