ATI Radeon HD 4770 In CrossFire: Unbeatable At $220

Page 4
Status
Not open for further replies.
I don't like the use of the expensive i7 965 and X58 mobo as the test platform. I would rather see the use of a Q9550 or Phenom II test platform. The platform should be in line with a targeted budget.

Will anyone invest that much money in a CPU and mobo and then choose to CrossFire two 4770s?

Does using the fastest CPU exaggerate the differences one would see on other platforms?

How would a cheaper system, with a single PCIe 2.0 slot and DDR2 memory, running an HD 4870 X2 or GTX 295, compare?
 
[citation][nom]DXrick[/nom]I don't like the use of the expensive i7 965 and X58 mobo as the test platform. I would rather see the use of a Q9550 or Phenom II test platform. The platform should be in line with a targeted budget. Will anyone invest that much money on a CPU and mobo and then chose to CF two 4770s? Does using the fastest CPU exaggerate the differences one would see on other platforms? How would a cheaper system, with a single PCIe 2.0 slot and DDR2 memory, and the use of a HD4870 X2 or GTX 295 compare?[/citation]

I don't believe it matters much. Even with an overpowered CPU, the benchmarks will only be as good as the performance the GPU can push out. Any difference would be very small.
 
At the time of publication they were all $109.99 before MIR? Well, you posted the article at 5:04 AM today, Monday, May 4. The prices I listed were current as of 8 AM the same day...

Again, this is nit-picking... but still... are you absolutely, positively sure that you checked right before hitting publish? Because it strikes me as odd that the price went down inside of three hours, especially at such an early hour of the day.
 
4890 cards are probably only good in CrossFire for those wanting ultimate detail all the way!

The problem with CrossFire is that one card can't be turned off.
It would be nice if one card could be disabled when using 2D (office tasks, reading web pages, or watching movies).

I'm more interested in the power usage, since I don't really play a lot of games. But it was interesting to note that a 4770 now costs so little that I can play my games on one, and perhaps later add an identical card knowing I could then play Crysis-like games with ease!

I would love to see 40 nm 4830, 4850, and 4870 cards with GDDR5!

I'm still looking for that 60W graphics card that can play all games fluidly.
 
I'm still biased towards that Nvidia card over ATI. The performance of that single card looks impressive compared to the dual-card setup from ATI.
 
[citation][nom]DXrick[/nom]I don't like the use of the expensive i7 965 and X58 mobo as the test platform. I would rather see the use of a Q9550 or Phenom II test platform. The platform should be in line with a targeted budget.Will anyone invest that much money on a CPU and mobo and then chose to CF two 4770s?Does using the fastest CPU exaggerate the differences one would see on other platforms?How would a cheaper system, with a single PCIe 2.0 slot and DDR2 memory, and the use of a HD4870 X2 or GTX 295 compare?[/citation]

A lower-end platform would make more sense from an economic standpoint, but then you'd be introducing other variables into the test. You wouldn't, for instance, get the true difference between GPUs because processing power might hold you back.

For a look at exactly this phenomenon, check out our original 4770 review, where we addressed this concern:

http://www.tomshardware.com/reviews/radeon-hd-4770,2281-12.html
 
[citation][nom]dario77[/nom]at the time of publication they were all 109.99 before MIR? well, you posted the article at 5:04am on today, monday may 4. the prices i listed were current as of 8am same day....again, this is picking of nits...but still...you absolutely, positively sure that you checked right before hitting publish? cause it strikes me odd that the price went down inside of 3 hours, especially at such an early hour of the day.[/citation]

Publication happened over the weekend; the CMS automatically posts the story at 11 PM PST. So yes, sometime in between, the prices dropped. That's even better news for the people looking at this card, though. A pair of these boards for $200 is an outright deal for 1) gamers with 2) two PCIe slots, 3) a compatible platform and 4) some money. :)
 
Wow, flame on.

Thanks Chris for the link. Now for the part that will get me shot. I compared the GTX 275 vs. the 4770 CrossFire using Tom's charts only, at 1920x1200 (what I play at) and max settings. The 4770 CrossFire beat the 275 in every test, sometimes by no more than a frame, but still a win. Now the point: the lowest I can find a 275 for in Canada is $317, while the lowest I can find a 4770 is $120, so two cost $240 (newegg.ca). It also has CrossFire support. That's roughly an $80 saving for the same quality of card, minus $12 for a bridge. I have four spares, so that's not a problem for me. So, for my personal situation, I would go with two 4770s and then overclock.
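A quick sketch of the arithmetic behind this comparison, using the CAD prices quoted above (the ~$80 figure rounds up the actual difference):

```python
# Cost comparison from the post above (CAD, newegg.ca prices at the time).
gtx_275 = 317.00            # cheapest GTX 275 found
hd_4770 = 120.00            # cheapest single HD 4770 found
bridge = 12.00              # CrossFire bridge, if you don't already own one

cf_pair = 2 * hd_4770       # two 4770s in CrossFire
saving = gtx_275 - cf_pair  # saving before the bridge

print(f"CF pair: ${cf_pair:.2f}")                        # CF pair: $240.00
print(f"Saving vs. GTX 275: ${saving:.2f}")              # Saving vs. GTX 275: $77.00
print(f"After buying a bridge: ${saving - bridge:.2f}")  # After buying a bridge: $65.00
```

So the pair still comes out well ahead even after a bridge purchase, though the true pre-bridge saving is $77, not a full $80.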

 
Wow, great write-up. The only flaw I see here is that 40 nm tech is probably going to manifest itself in a 58xx-series card soon, and then we'll kick ourselves for spending $220 on two 4770s. Oh well, progress is painful at times. Suck it up.


Page viewing should be telepathic.

That is all.
 
Why don't they have a table of contents on every page? Why just the first page, letting you choose only once?

It's so hard to navigate now.
 
Then in a month or so, the price of the HD 4870 1GB or HD 4890 drops by another few tens of dollars. New games still need updated Catalyst support to get decent scaling, while a single card will just keep performing well regardless.

This CF/SLi thing in the mid/low-end is getting out of hand.
 
[citation][nom]m3kw9[/nom]why don't they have a table of content on every page? why just the first page and let you choose once?Is so hard to navigate now.[/citation]
Did you read the second comment on the first page?
 
Generally I'd tend to agree that even the single-card segmentation is nuts. Adding prices for 2-3 cards makes things too complex. However, in this case, I'm convinced that there is a case to be made for 4770s in CrossFire. Perhaps that story will change when ATI refreshes its high-end lineup.

And for those who don't read the preceding comments--yes, not having a drop-down or ToC bothers me, too. Several people here are working with the developers in France to get this fixed/updated ASAP. Please stay tuned.
 
I wonder how this would work on a P35 system; that platform was choked doing CrossFire with high-end cards. Maybe it wouldn't be hit so badly by this implementation.
 
Uh WOW. Good on AMD/ATI for helping out the financially challenged crowd with a great upgrade option. What will Nvidia rebadge/recycle to compete?



You're not clever.
montyuk 05/04/2009 11:51 AM
It was meant as a rhetorical question given Nvidia's recent history, so I'm not sorry for pointing out the obvious. I think it's great AMD/ATI threw out a stunning mid/lower-budget card; it really helps out the consumer. I don't care about fanboys and such; you sound as though you are a sore Nvidia fan. I don't get the fascination with a particular brand because it's red or green... what matters is what they're giving me for my money.
 
No one, except Nvidia themselves, should be upset that ATI continues to drive down the price points on video cards. If you're an Nvidia fan, enjoy the 275 for $250.00. Nvidia would be selling it for a lot more if they weren't keeping pace with ATI's pricing.
 
Wow, I see a whole lot of people whining and moaning here. It seems most of you were so intent on arguing in favor of your favorite card (be it ATi or nVidia) that you missed some things, such as the following line Mr. Cangelini should probably correct:
[citation]memory at 800 MHz—effectively 3,600 MT/s.[/citation]
GDDR5 performs four data transfers per command-clock cycle, so it'd be 3,200 MT/s. 3,600 MT/s is the rate of the GDDR5 on the 4870, which has a base clock of 900 MHz. Also, as noted by others, I'm seeing the baseline price on Newegg for 4770 cards at $99.99 US. Oh, and yeah, I hope as well that Bestofmedia, or whoever the heck is running the site, gives us back the old layout... at least the drop-down bar. I also could've sworn the old format ran faster in my browsers (Firefox 2, 3, and Opera 9).
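The correction above is easy to verify; a minimal sketch, assuming GDDR5's quad-data-rate signaling (four data transfers per command-clock cycle):

```python
def gddr5_transfer_rate(command_clock_mhz):
    """Effective transfer rate in MT/s: GDDR5 moves four data words
    per command-clock cycle (quad data rate)."""
    return command_clock_mhz * 4

print(gddr5_transfer_rate(800))  # HD 4770: 800 MHz -> 3200 MT/s
print(gddr5_transfer_rate(900))  # HD 4870: 900 MHz -> 3600 MT/s
```

An 800 MHz clock can't produce 3,600 MT/s at four transfers per cycle; that number belongs to the 4870's 900 MHz memory.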

As for the article itself... I must say I'm impressed. For those with the option for CrossFire, the technology is, in fact, mature enough that the whole "1 GPU is more reliable" stuff is largely BS; after all, we've seen tons of benchmarks pairing off higher-end GPUs in dual-setups against each other, and excitedly comparing 4850/4870 CF setups before their respective X2 cards came out. The only case I see where it doesn't work well is Grand Theft Auto IV, which I think can be chalked up to it being an incredibly shoddy console->PC port.

All told, it is quite valid: for those who have the option, it's an unprecedented price/performance ratio; a $200 US pair consistently beats ATi's and nVidia's $250 US offerings more or less across the board, and even gives the $300 US cards a run for their money. I still would've liked to see the GTX 275 there; given that it's weaker than the 280 and still more expensive than the 4770 CF setup, it could by no means hope to fare better, but it would've given us a better idea of where the 4770 CF setup sat in between.

Of course, there's the slight trade-off of settling for a 512MB frame buffer; obviously, that cut the price a bit, and in some cases (most notably Crysis) it shows through as something of a potential weakness. I guess the real shame here is that Tom's couldn't have tested this setup alongside 512MB versions of comparable cards as well, to explore the significance of that. And, of course, this raises questions about the possibility/purpose of a 1GB 4770 or a 4770 X2 with 2x1GB.

As for those questioning why the RV740 has a 128-bit memory interface: it's because the GPU core is only 136 mm² in size. Unlike, say, the number of stream processors or TMUs, you can't just cram a wider memory interface in with more transistors; it needs actual surface area for wires to connect to. The smallest 256-bit GPU was the RV670, at 190 mm², and the smallest GPU with an interface above 256 bits was the R600, at 420 mm². This is also why G80 was able to fit a 384-bit interface, as it was a monstrous 484 mm²; G92 cut the size to 324 mm², so they had to go down to 256-bit.
 