ATI Radeon HD 5570: Reasonable Gaming Performance For $80?

[citation][nom]nforce4max[/nom]Right now there is a GTX295 Co-op for parts and repair for less than this but still on bid for 3 more days. $80 can buy a decent used card that can perform better than this.[/citation]

Oops, forgot to say where: it's on eBay, just search GTX under the graphics card section.
 
[citation][nom]nforce4max[/nom]Right now there is a GTX295 Co-op for parts and repair for less than this but still on bid for 3 more days. $80 can buy a decent used card that can perform better than this.[/citation]

Not seeing it. And since when does a used, broken card, one that may or may not be fixable, and that will take work even if it is, compare against a new one?

I wish Tom's would do two sets of benchmarks. I understand eliminating the CPU bottleneck by using a fast CPU, but how helpful is that to the people who might actually buy this card? As the article says, it offers 'reasonable gaming performance for $80'. At the same time, Tom's often says that spending more on the video card is smarter than spending more on the CPU for gaming, so who would pair a ~$300 CPU with this card? They should add results for a budget CPU matched with a budget card, to give a better idea of real-world performance.
 
[citation][nom]rambo117[/nom]Two things to point out: first, my 3870's in crossfire did fantastic with their measily 320 shader cores and GDDR3 memory[/citation]

Yes, but the point is that now you can get a single card, more powerful than two 5570s, for about the same price or less. That was not the case when the 3870s were new.

[citation][nom]rambo117[/nom]and second, I don't mean to sound like an ass but the 5750 has 720 shader cores[/citation]

Absolutely, my mistake, I meant the 5770. Fixed!
 
For all future tests, I would love to have noise measurements. Noise is far more important to me (and probably to lots of other professionals who still game) than heat.
 
Interesting, but it looks like an HD4670 equivalent for about the same price. Being low-profile will probably sell a lot of them, but since there's a low-profile 9600GT, we may see some welcome price drops. The $80 price point is good for a backup-PC GPU, so hopefully the HD5670 will drop to there.
As is, it still looks like an excellent choice for business users stuck with those miserable slim cases who want 3D acceleration for Flash.
 
I'm using a factory-overclocked 8800GT because it is still enough for my needs (although it's noisy), but for someone coming from an older or less powerful card, or even building a new system, the long list at the end of the article is worth a look: AMD has launched its entire DX11 line and there is nothing to be seen from Nvidia, except some crazy, unethical rebadging schemes mixed with some entry-level DX10.1 cards. When I read a laptop review nowadays, I no longer know, or care for that matter, because it truly disgusts me, what all those Nvidia GT100s, 200s, and 300s actually are.

And by the way, why does Steam still not show DX11 usage statistics in its monthly survey? I just filled it in, viewed the results at the end, and nothing: no DX11 cards. Hard to imagine more than 2 million cards sold and none listed... At the time AMD announced the 2 million+ sales of DX11 cards, only the 5700, 5800, and 5900 series were available, which appeal to gamers... a percentage of whom could well have Steam installed... oh well...

Competition is great, and I hope Nvidia gets its act together quickly, launches competitive cards, and drops its recent, absurdly unethical behaviour.

And of course, congratulations to AMD/ATI on its latest launch, a worthy successor to the 4670. It may not have much more performance, but it has more features and is more versatile: low profile and low heat mean you can fit it in more places.
 
It's a budget card; you get budget gaming. I won't be surprised if/when the price drops to current HD4670 levels and, in about five months, it replaces that card completely. Low profile is also nice.

Waiting on HD5200 (IGP), HD5650, HD5830, HD5890, HD5950.
 
It seems that $80 isn't really budget-card territory anymore. A quick look at Newegg shows that I can get a GT240, a 9600GT, and even a 9800GT for less than $80. Why would I buy a less powerful card? If I'm not gaming and only want the HDMI audio goodies, I would just get the cheaper 5450.
 
ATI's release price for the 5570 is reported as $75-$80, but I already see prices from $83 to $96. ATI needs to hold retailers to the price point it sets! I know the MSRP is just that, suggested, but come on, we're already taking a deep gouge on the 5850/70 series!!!

I just want Nvidia to release a price point and some benchmarks so maybe prices will settle back to launch levels.

Is that too much to ask for?
 
Two things you should add when benchmarking lower-priced, value graphics cards:
1) 2D performance; Tom's had a great write-up and created its own benchmark to showcase it.
2) Blu-ray performance: CPU utilization and dropped frames (a rough sketch of the kind of measurement I mean follows below).

Consider that this is what most people would buy the card for, rather than gaming. If someone wants a better gaming experience, they are most likely going to spend an extra $20 to $40 to move up in performance.
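To make point 2 concrete, here is a minimal sketch of how one might log CPU utilization while a Blu-ray or HD clip plays. It assumes Python 3 with the third-party psutil package installed; the 60-second window and half-second sampling interval are arbitrary choices of mine, not anything from the article.

[code]
# Sample system-wide CPU utilization during video playback and report
# the average and peak. Start playback first, then run this script.
import psutil

SAMPLE_INTERVAL_S = 0.5   # seconds per sample (arbitrary)
DURATION_S = 60           # total measurement window (arbitrary)

samples = []
for _ in range(int(DURATION_S / SAMPLE_INTERVAL_S)):
    # cpu_percent(interval=...) blocks for the interval and returns the
    # average utilization across all cores over that window.
    samples.append(psutil.cpu_percent(interval=SAMPLE_INTERVAL_S))

print(f"average CPU: {sum(samples) / len(samples):.1f}%")
print(f"peak CPU:    {max(samples):.1f}%")
[/code]

Dropped frames would still have to come from the player's own statistics overlay, since they aren't visible from outside the process.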

 
[citation][nom]philippeart[/nom]can someone explain me that means:"Just make sure the board you buy has a DisplayPort output first"[/citation]

Some boards do not have DisplayPort outputs, and therefore cannot be used in triple-monitor Eyefinity configuration.

A DisplayPort output is needed for the third monitor. DisplayPort is an alternative type of digital video output; the cable looks similar to HDMI. A few new monitors have a DisplayPort input, so if you want to use an older monitor for Eyefinity, you need an active DP-to-VGA/DVI/HDMI converter.
 
[citation][nom]aneasytarget[/nom]2) Blu-Ray performance ~ CPU utilization, dropped frames?[/citation]

I looked at CPU utilization in an integrated-GPU review a while back. Frankly, CPUs are so powerful now, even budget dual-cores, that it isn't much of an issue anymore, and these discrete cards accelerate video playback even better than integrated chipsets do.

I ran an HD Blu-ray quality benchmark on the 5450 last week and it passed with a perfect score. There's just not all that much to report on in this area.
 
Tom's needs to ban a lot of those AMD/ATI fanboys before this site turns into a shrine.

Even a neutral statement will get voted down multiple times now. Before you know it, any sentence that doesn't end with “ATI FTW” will not be posted.
 
I don't see why this card even exists; between the 9600GT and the 5670, there is no place for it.

You can fit a 5750 in an HTPC case, so why go smaller?

I hate tiny things.
😀
 
[citation][nom]pei-chen[/nom]Tom's needs to ban a lot of those AMD/ATI fanboys before this site turns into a shrine. Even a neutral statement will get voted down multiple times now. Before you know it, any sentence that doesn't end with “ATI FTW” will not be posted.[/citation]
ATI FTW!!
 
[citation][nom]pei-chen[/nom]I don't understand your reply. My 4850 didn't suffer from the blank screen problem so....[/citation]

Sorry, I wasn't picking on you. I also work with ATI (and Nvidia) cards. I recently bought the HD5770 because of the many reviews I read, and I was just pointing to a problem that has been plaguing some owners of 5xxx-series cards (including me). The thread I linked to has over one thousand posts. And here's another:

http://forums.amd.com/game/messageview.cfm?catid=260&threadid=126958&enterthread=y

My main point is this: if AMD is trying to get ahead in the market by putting out poorly tested products, it might regret it in the long term.

But I find it interesting that Tom's and many other review sites never reported any problems. Oh well, I guess the cards sent to these places go through better quality control.
 
Tom's sure loves to mention Eyefinity and its three-display output. Seriously though, while it's a great feature for desktop use, I don't see the appeal for most gaming; I don't think many gamers are all that interested in the feature. For one thing, we need cards that can handle insanely high, very wide resolutions (see the quick math below). Also, it would be better if we could buy triple-wide monitors with a nice curve, so we don't have the annoying bezels between monitors. Personally, I don't know many gamers who used dual-monitor gaming, so I'm not sure why all the excitement over three-monitor gaming (simulators are different; I can see the excitement for that niche market).
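For a rough sense of scale, here is the back-of-the-envelope pixel math behind that concern. It assumes three 1920x1080 panels side by side and ignores bezel compensation, which only adds to the total.

[code]
# Pixel counts: one 1920x1080 panel vs. three side by side (5760x1080).
single = 1920 * 1080            # 2,073,600 pixels
triple = (3 * 1920) * 1080      # 6,220,800 pixels
print(f"single: {single:,} px")
print(f"triple: {triple:,} px ({triple / single:.0f}x the pixels to render)")
[/code]

Three times the pixels per frame means roughly three times the fill-rate and memory-bandwidth load, which is exactly where an $80 card runs out of headroom.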


I'm very interested in seeing stereoscopic 3D support from ATI. The tech is just in its infancy as far as desktop gaming is concerned. I think it needs to mature: we need more game development with 3D as its intended medium, and quality drivers. That won't happen without many gamers having capable hardware. I'd like to see ATI jump into that arena; we need healthy competition over this specific feature for the tech, the drivers, and high-refresh monitors to mature.

It just seems odd to me that Tom's mentions Eyefinity all the time but rarely mentions stereoscopic 3D.
 
[citation][nom]hixbot[/nom]It just seems odd to me that Tom's mentions Eyefinity all the time but rarely mentions stereoscopic 3D.[/citation]

What's to mention? I think all there is to report is that AMD is allowing iZ3D to work with its driver, but there is nothing proprietary going on. When some hard tech is out, we'll report on it.

If you're a 3D fan, however, I'm reviewing a homemade stereoscopic 3D setup that simulates how it's done in theatres: two projectors, polarized filters, a polarized screen, and polarized glasses, for wall-sized 3D gaming and movies. :)
 