260 and 4870 are the same price. Help!



Both the GTX 260 and the HD 4870, and in fact most modern video cards, support 2560x1600. That's a 30" monitor. There are 1080p TVs that can be a lot larger than 30", but they have fewer pixels than a 30" monitor: 1920x1080. Both cards handle that resolution just fine too.

AFAIK 2560x1600 is the biggest resolution available in consumer products. OK, technically, both the HD 4870 and the GTX 260 can be connected to two such monitors in DualView mode, so you could say you have 5120x1600.
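Just to put those resolutions in perspective, here's a quick pixel-count comparison (a back-of-the-envelope Python sketch using only the numbers above):

```python
# Pixel counts of the resolutions mentioned above.
resolutions = {
    "30-inch monitor (2560x1600)": 2560 * 1600,
    "1080p TV (1920x1080)": 1920 * 1080,
    "two 30-inch monitors in DualView (5120x1600)": 5120 * 1600,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} megapixels")

# 2560x1600 is ~4.10 MP, almost twice the ~2.07 MP of a 1080p TV,
# which is why it's so much more demanding on the video card.
```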

 


Am I the only one seeing near 100% scaling on those GTX 260s??? I thought nothing got that close to perfect scaling in games. It really looks fishy to me, but heh... I could be wrong.
 
And who won that Anandtech review @ 2560 x 1600 with AA + AF, my friend?

GTX 280 SLI scored an average of 82.60 fps.

4870 CF scored an average of 71.90 fps.

Across 7 games.

Can we agree that GTX 260 SLI is roughly 15% slower than GTX 280 SLI?

That puts GTX 260 SLI @ 69.75 fps, omg they are almost equal 😀
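For what it's worth, the arithmetic behind that estimate looks like this (a quick Python sketch; the 15% gap between GTX 260 SLI and GTX 280 SLI is the poster's assumption, not a measured number):

```python
# Averages quoted above from the Anandtech numbers @ 2560x1600 with AA+AF.
gtx280_sli_avg = 82.60   # fps, average of 7 games
hd4870_cf_avg = 71.90    # fps, average of 7 games

# Assumption from the post: GTX 260 SLI is ~15% slower than GTX 280 SLI.
assumed_gap = 0.15
gtx260_sli_est = gtx280_sli_avg * (1 - assumed_gap)

print(f"Estimated GTX 260 SLI average: {gtx260_sli_est:.2f} fps")  # ~70 fps, close to the 69.75 quoted
lead = (gtx280_sli_avg - hd4870_cf_avg) / gtx280_sli_avg * 100
print(f"GTX 280 SLI lead over 4870 CF: {lead:.0f}%")               # ~13%
```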



 


Yeah, you're right! Those benches are very sketchy indeed.
 


We can't agree on that, actually. You can't just take one benchmark, say that another card is x% faster or slower, and then apply a "correction factor." It's more complicated than that.

Plus, who plays at 2560x1600?
 


Wrong. That's a big assumption to make for something as variable as an SLI (or CF) setup.
 


Thanks for the input. I was planning to run both X-Plane and FSX. I was planning to get the Q6600 and OC it to 3.2. Does that sound acceptable for running both programs?

Thanks.

And please! Stop arguing; just give me the facts and your opinion about the subject, not about each other's opinions.
 


Well, I did ask you to find me a GTX 260 SLI review, didn't I?

Seems the one I gave wasn't good enough.

 


Why are you asking me to go find evidence to prove your point?
You did not give a GTX 260 SLI review; you gave a bunch of graphs at the insane resolution of 2560 x 1600!

Did you even say which source they are from?
 
This is interesting. I get the feeling the green team is getting antsy, seeing their lead dissolve away. All sorts of excuses. I'm seeing this in other forums too. Hey, it's called competition, get over it. It's destroyed the rip-off prices nVidia had on their cards; a whole generation of nVidia cards was brought down in value overnight, and here we see people trying to show how superior they are? LOL..... Sorry, but as someone told me looong ago, even a green teamer that didn't see this coming: follow the money.

This particular nVidia fanboy said these cards were too cheap to be any good. He also said the Assassin's Creed thing was nothing, after he supposedly benched them himself. He's now saying the 55nm refresh will bring a card that'll edge out the X2, but guess what? He did admit... it'll be expensive, of course, LOL. So there you have it: you want to spend more money, and thus spending more automatically means the card is better, right? What a joke.

I only ask this: don't go by what anyone's said here in this post. I suggest you go and look for yourself; therein lies the truth.
 
BTW, the 1GB version would solve the problem at 2560 x 1600, since 512MB could be a limitation at that resolution.
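To give a rough idea of why 512MB can get tight at that resolution, here is a simplified back-of-the-envelope sketch; real memory use depends heavily on the game, textures, and driver, so treat the numbers as illustration only:

```python
# Rough render-target memory at 2560x1600 with 4x MSAA and 32-bit color.
# Textures, geometry, and driver overhead come on top of this, so the real
# footprint is much larger; the point is just how fast it scales with resolution.
width, height = 2560, 1600
bytes_per_pixel = 4        # 32-bit color
msaa_samples = 4

color_buffer = width * height * bytes_per_pixel * msaa_samples
depth_buffer = width * height * 4 * msaa_samples   # 24-bit depth + 8-bit stencil
resolve_buffer = width * height * bytes_per_pixel  # resolved back buffer

total_mb = (color_buffer + depth_buffer + resolve_buffer) / (1024 ** 2)
print(f"Render targets alone: ~{total_mb:.0f} MB out of 512 MB")   # ~141 MB
```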
 


Thanks for some real reviews. This is what we need, not some obscure graph.
 



Don't know about X-Plane, sorry. For FSX, assuming you've got at least one decent video card so it doesn't become a bottleneck, your fps will scale almost perfectly with the clock of a quad CPU.

For example, you could get a GA-X48-DS4 + Q6700 (multiplier of 10) and reach 4 GHz.
Or a Q6600 (multiplier of 9) and reach 3.6 GHz.
Or a Q9450 (multiplier of 8) at 3.2 GHz, which is almost as good as a Q6600 at 3.6 GHz because it's a newer, slightly improved architecture with more cache.
The Q9450 can do better than 3.2, but you'd need to start overclocking the motherboard and RAM too.
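All three of those combos work out to the same 400 MHz FSB, which is the point; here's the arithmetic (core clock = FSB x multiplier) as a quick sanity check:

```python
# Core clock = FSB x multiplier. The three examples above all land on a
# 400 MHz FSB (1600 MT/s quad-pumped), which an X48 board supports natively.
examples = [
    ("Q6700", 10, 4.0),   # (CPU, multiplier, target clock in GHz)
    ("Q6600", 9, 3.6),
    ("Q9450", 8, 3.2),
]

for cpu, multiplier, target_ghz in examples:
    fsb_mhz = target_ghz * 1000 / multiplier
    print(f"{cpu}: {target_ghz} GHz / {multiplier} = {fsb_mhz:.0f} MHz FSB")
```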

There's a guy reviewing the GA-X48-DS4 at Newegg who says he runs a Q6700 at 4 GHz with an XP-90 cooler. There are better coolers than that. Examples:

Sunbeam Core Contact
http://www.newegg.com/Product/Product.aspx?Item=N82E16835207004
or
Xigmatek HDT-S1283
http://www.newegg.com/Product/Product.aspx?Item=N82E16835233003
http://www.newegg.com/Product/Product.aspx?Item=N82E16835233019

Of course, if you get a Q6600, you don't have to try for 3.6 GHz. 3.2 GHz should be easier to get and it will cost you less in electricity bills too.

 

First of all, you're implying that GTX 280s in SLI would be slower than a pair of GTX 260s. Lunacy. Now read the pretty graphs:

[benchmark charts: 17187.png, hl2-1920.gif, grid-1920.gif, etqw-1920.gif]

A little harder to read, but notice the victorious 4870s at 1920x1200:
[benchmark chart: oblivion.jpg]


And as expected, the GeForces are victorious in Crysis:

[benchmark charts: crysis-1920-high.gif, 17188.png]


The GTX 280s tend to win @ 2560x1600:

[benchmark charts: hl2-2560.gif, 17189.png, etqw-2560.gif]


So, in conclusion, a pair of 4870s will handily dickslap a pair of GTX 260s unless you have a 30" LCD. Then again, these benchmarks are for the GTX 280, so maybe not.
 
Performance-wise they are the same. One card has a slight lead in one game, the other card in another; a 2 fps difference is not very important.
However, what IS important for some people is power consumption (read: temperature in the room). The ATI card consumes 50W more than the NVIDIA at idle. This, for me, is the most important difference.
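To put those 50 idle watts into perspective, a quick estimate (the idle hours and electricity rate are assumed example values, not from the thread):

```python
# Rough yearly cost and energy from an extra 50 W at idle.
extra_watts = 50
idle_hours_per_day = 8    # assumed: the PC sits idle 8 hours a day
rate_per_kwh = 0.12       # assumed electricity rate in USD/kWh

kwh_per_year = extra_watts / 1000 * idle_hours_per_day * 365
print(f"Extra energy per year: {kwh_per_year:.0f} kWh")              # ~146 kWh
print(f"Extra cost per year:   ${kwh_per_year * rate_per_kwh:.2f}")  # ~$17.52
# And all of that ends up as heat in the room, which is the poster's point.
```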
 
You may also consider the 4870 a better value because it supports DirectX 10.1. NVIDIA said in some article that their next gen is going to support it too, which means games will start using it more for better performance.
Small now, but a big thing in the future.
 
Add up the averages from all tests @ 2560 x 1600 with AA + AF and get back to me.

I did it, and the GTX 280 SLI beat 4870 CF by 13%.

"We know it's crap in UT3 etc. etc., so we won't show it."



 
@MxM: The ATI card can be underclocked when it's idle, if you care about those 50 watts. You'll save more than 50 watts that way. When it's under load, the ATI gives more fps per watt than the nVidia. It's really hard to say which one is the more politically correct and environmentally conscious of the two.

@homerdog: nice charts, but you ruined my day, man. My trusty old 8800GTX is only #13 out of 16 there. Tomorrow is our anniversary too - I paid $651 + 14% tax for that card exactly a year ago. 🙁
On the bright side, 45 fps is still acceptable as far as I'm concerned.


 
I just went to Fry's and got a BFG 280 OC 1GB DDR3 for $299.00. I only saw the ad in the newspaper... I think it's until Tuesday, while supplies last...

My original intent was to get a 4870, but I ended up with the BFG 280...
 


I was planning to get the S1283 cooler.
Do you think I should spend the extra money for the Q9450 over the Q6600?
 

Methinks I took the trollbait 🙁
 
