ATI Radeon HD 5670: DirectX 11 For Under $99

Status
Not open for further replies.

mayne92

Distinguished
Nov 6, 2009
743
0
18,980
[citation][nom]hannibal[/nom]Good points. It is always better to read many articles, but it is also good to have some reference sites. And when choosing those reference sites, we get into an area where things like "unbiased" fly out the window! What most sites seem to agree on is that these new cards from ATI and Nvidia are too expensive compared to older models. We also know that these new cards are (or should be) cheaper for Nvidia and ATI to produce than those older models, so we can expect either price cuts once they can produce enough of them, or the dropping of their more expensive older models (making these the only alternative for consumers), or both. They also seem to eat less power (surprise), so they are now a viable alternative for those who need a GPU that draws less power. We can only hope that this is not the start of a trend of PCs getting more expensive again...[/citation]
I read another article elsewhere online that made that same inference about PC prices going up again. It was a good article too, since it followed logical business strategies to support the idea. Only time will tell... and no, I don't currently have the link either.

I agree with some people on here that I'm not all that interested in the 5xxx models, but I'm also not one to just jump on the bandwagon... I usually wait around a bit. I still have my CrossFire 4870s, which perform well enough for me. But if I wanted the latest feature set of modern capabilities... for whatever reason... the 5870 would be my choice, since I have the PSU to support it. My brother's older Dell, on the other hand, would do well with the 5670, as he has an older PSU, so this upgrade would be great for him.
 

neosoul

Distinguished
Oct 13, 2006
38
0
18,530
I think too many people are discounting the HDMI output with its own DTS decoder and display link. This is an awesome HTPC card for those who do zero gaming and only care about media.

Personally I got a 5770, because I do a tiny bit of gaming :D
 

kettu

Distinguished
May 28, 2009
243
0
18,710
[citation][nom]Cleeve[/nom]I guess I wouldn't be doing very well if my motivation was to bicker over irrelevant details.However, if my motivation was to poke fun at your unhealthy obsession, I think my success speaks for itself.[/citation]

Fine, if you don't want to be criticised. But if you want to be a writer when you're all grown up...
 

cleeve

Illustrious
[citation][nom]kettu[/nom]Fine, if you don't want to be criticised. But if you want to be a writer when you're all grown up...[/citation]

Constructive criticism is great!

Whining about rhetoric is a waste of time, though. If that makes you a grown-up, kettu, you're an honorary senior citizen. :)


 

Casecutter

Distinguished
Jan 15, 2010
23
0
18,510
[citation][nom]casecutter[/nom]On the Left 4 Dead benchmark page you said “Again, the GeForce 9800 GT is beating the Radeon HD 5670 by a slim margin.” But on the page testing with AA/AF it sure looks like a dead heat! Then look at what that GV-NX88T512 (not under-clocked) did in the AA/AF results (1680x) in the GT 240 article, and something is strange: it came in lower than the supposedly “stock” 9800 GT? All on the same Intel Core i7-920 overclocked to 3.06 GHz setup.[/citation]
I suppose there won't be an explanation for this?
 

cleeve

Illustrious
[citation][nom]casecutter[/nom]I suppose there won't be an explanation for this?[/citation]

Sorry case, missed the question the first time around.

I've looked at the results you're talking about, and they're extremely close. When we're talking results over 100 fps, a difference of a couple of FPS is well within the margin of error, especially when the game is mostly CPU-capped, like Left 4 Dead is.

In addition, the GeForce drivers were newer in the more recent article. Combine those two things and you're going to see a couple of FPS difference here or there. IMHO it's not a significant difference.
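To put rough numbers on what I mean (all fps values and the 3% threshold below are made up for illustration, not our actual bench data), this is essentially the check we apply when two averaged results sit close together:

```python
# Rough illustration with made-up numbers -- not actual bench data.
# At 100+ fps, a couple of fps of run-to-run noise is normal, so we
# average several runs and treat small relative gaps as noise.

def mean(runs):
    return sum(runs) / len(runs)

def within_margin(a_runs, b_runs, margin=0.03):
    """True if the averaged results differ by less than `margin`
    (relative), i.e. the gap could just be benchmark noise."""
    a, b = mean(a_runs), mean(b_runs)
    return abs(a - b) / max(a, b) < margin

# Hypothetical Left 4 Dead runs, three per card:
card_a = [117.5, 118.9, 118.5]  # averages to ~118.3 fps
card_b = [115.8, 116.4, 117.0]  # averages to ~116.4 fps

print(within_margin(card_a, card_b))  # True: ~2 fps at ~118 fps is noise
```

A gap of a couple of FPS at those frame rates comes out well under any reasonable noise threshold, which is why we don't call it significant.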
 

cinergy

Distinguished
May 22, 2009
251
0
18,780
[citation][nom]Anonymous[/nom]recently in your nvidia 240 review.....it was all ++++ superduper card..... for 100$....now... ati is priced high......buaaah..........[/citation]

It has been this way for a long time: Nvidia always gets a much more positive tone in reviews, even when there is definitely no need for it:

"The GeForce GT 240 serves up just what the doctor ordered. The cost-effective 40nm process, combined ...blah blah"
"Though at-launch pricing is usually high, Nvidia will finally have the flexibility to compete at the entry-level once production has ramped up, and we will undoubtedly see that happen with the GeForce GT 240."

But here, no clear mention of, e.g., high launch prices. No sir. It's just negative, negative, negative from the beginning. Even when the HD 5670 smashes its direct competition (the GT 240) in both speed and features (and with a smaller die, which is no small merit), the overall conclusion is that this card sucks, even though it's better than what the competition has.

"Here it will have to compete against the similarly-performing $80 GeForce 9600 GT, the slightly-faster $95 GeForce 9800 GT, the clearly-superior $110 Radeon HD 4770,"
"Are we saying the Radeon HD 5670 is a bad card? Certainly not, it's a respectable mainstream offering. It just costs too much."

It's like comparing a 10-year-old tuned BMW with 350 hp to a new 300 hp Mercedes. Buy the BMW, it has more hp, they say. But I want a NEW car, with cruise control, automatic air conditioning, and lower gas consumption. Why do you insist I go for old beloved patriot crap that's at the end of its life cycle?
 

Casecutter

Distinguished
Jan 15, 2010
23
0
18,510
[citation][nom]Cleeve[/nom]Sorry case, missed the question the first time around.I've looked at the results you're talking about and they're extremely close. When we're talking results over 100 fps the difference in a couple of FPS is well within the margin of error, especially when it's mostly CPU capped like Left 4 Dead is. In addition, the GeForce drivers were newer in the more recent article. Combine these two things and you're going to see a couple FPS difference here or there. IMHO it's not a significant difference.[/citation]
I’m trying to wrap my brain around your response.
So that 8800 GT running a 17% increase on the core provides a 3% lower frame rate than a “stock clock” unit, and all because of a driver revision? I've looked at the 195.62 release notes, and they don't say anything about AA or about Left 4 Dead on Vista 64-bit. Pulling 3% more FPS from basically one driver revision is plausible, but getting that after being hobbled by a 17% reduction in clock is amazing! That's a driver revision I would have thought would have been exalted from the green mountains of marketing!

Looking further, it appears the 9600 GT used in both reviews was the GV-N96TSL-1GI. That's not a standard or generic 9600 GT, but a 1GB card could be a decent representative. Though looking at Newegg today, 9600 GTs sporting 1GB are priced at $80-85, most needing at least a $15 rebate, while a nice Gigabyte unit like the one you use is at $110.

What's more interesting is that you tested that Gigabyte card in May '09 in a Core 2 2.7 GHz (Kentsfield) Vista 32-bit system, with no AA in L4D at 1680x. The Gigabyte nipped at the heels of an Asus GTS 250 in Dark Knight trim (53 vs. 55.7 FPS). When the eye candy got turned up to 4xAA/16xAF, that 9600 GT dropped off to 42 FPS (a ~21% loss), while the Dark Knight held at 52 FPS.

Fast forward to the reviews with the i7 Vista 64-bit system, and that Gigabyte 9600 GT is pounding out 118.3 FPS in L4D at 1680x with no AA, and 110.2 with 4xAA/8xAF! That's not the significant percentage drop we saw in May. If that's not a testament to upgrading to an i7... and better drivers, I don't know what is. Then, interestingly enough, the GT 240 review clearly calls out the GV-N96TSL-1GI, while in the 5670 review it's shown in the charts as a nondescript GeForce 9600 GT, obscuring the fact that it's a 1GB card, revealed only if one checks the test setup page.

Now what is interesting is that in both the GT 240 and 5670 reviews, that 9600 GT provides identical results. I suppose the driver revision isn't optimized for the 9600 GT, only the 9800 GT. But wow, no discrepancy from one test run to the other... excellent repeatability.

I suppose this bickering is moot when the Sapphire 4860 is $105 and walks all over a 9800 GT, betters the 4850, and probably the 1GB GTS 250. So if you do indeed intend to game, don't see yourself purchasing the newest games over the next several months, low idle power isn't a concern, and you just want a sheer graphics powerhouse... there's no other consideration, even if you wanted to spend $50 more.

Now, if you want a card that saves a bunch of power when idling 24/7, don't want to buy a new PSU for your OEM box, and don't intend to really game (though you'd like to play some older titles), there are 512MB 5670s that provide that for $10 less.
 

cleeve

Illustrious
[citation][nom]casecutter[/nom]I’m trying to wrap my brain around your response. So that 8800 GT running a 17% increase on the core provides a 3% lower frame rate than a “stock clock” unit, and all because of a driver revision?[/citation]

I think your difficulty might be because you missed an important part of my response.

The key here is that this title appears to be CPU-limited, not GPU-limited. The idea is that even if you increased the GPU core clock 200%, the CPU would still limit the frame rate at the same level.

The 3% difference is within the margin of error. We actually see quite a bit of difference between runs, and we average them out, but there will always be a margin of error because most games will never bench the exact same framerate twice.

As such, I believe you're concentrating on the 17% clock speed difference, which is irrelevant in a CPU-limited game, as opposed to the 3% frame rate difference, which lies within the margin of error.
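A toy model makes the point (every number below is hypothetical, not from our test rig): in a CPU-limited title, the delivered frame rate is whichever of the two components is slower, so overclocking the GPU core doesn't move the needle.

```python
# Toy model of a CPU-limited benchmark (all numbers hypothetical):
# the delivered frame rate is capped by whichever component is slower.

def delivered_fps(gpu_fps, cpu_fps_cap):
    """Frame rate observed in-game: the lower of what the GPU can
    render and what the CPU can feed it."""
    return min(gpu_fps, cpu_fps_cap)

CPU_CAP = 110.0            # hypothetical CPU limit for this title

stock_gpu = 150.0          # GPU alone could render 150 fps
oc_gpu = stock_gpu * 1.17  # a 17% core overclock

print(delivered_fps(stock_gpu, CPU_CAP))  # 110.0
print(delivered_fps(oc_gpu, CPU_CAP))     # 110.0 -- the CPU is the wall
```

Once the GPU is already faster than the CPU cap, any extra core clock is invisible in the results; only run-to-run noise remains.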

I hope that helps.
 
G

Guest

Guest
The card is too weak for DX11. If DX11 is not your thing, the 4770 is the better choice.

 

keropi88

Distinguished
Feb 20, 2010
2
0
18,510
Yo guys, if any of you plan to buy one, especially a Sapphire, beware of the model code! I got my Sapphire HD 5670 512MB GDDR5 after much research online a couple of weeks ago, and now I regret it! I didn't realise that the model number plays an important role too... my model number is SKU 11168-02, and if you check the specs on the Sapphire website you will see that this model DOES NOT support the Eyefinity and HyperMemory features!
Please check the specs properly before you buy... Anyway, the performance of this card is good, but it still lags in some intense parts of games (Modern Warfare 2). I wonder if it's my computer's specs (free disk space, memory, etc.) or the video card itself...
Good luck.
 
G

Guest

Guest
Anyone else notice the core was dropped 25 MHz for the 5670 in this review? 750 is the standard. My MSI 1GB comes in at 775, and I can tell you I'm completely happy with this board. Of course... I only run mid resolutions on my 19" LCD. If I had the money to buy a huge high-res display, I wouldn't be buying a budget GPU either. This product may not be the fastest out there, but it is a good value. I'm running mine at 850/1050 and it runs at 36C under load. My 450W PSU is happy, my overclocked CPU is happy, and I'm happy. The performance is definitely on par with 9800 GTs, and with 4770s and 4850s for that matter.
 
[citation][nom]Guest[/nom]Anyone else notice the core was dropped 25 MHz for the 5670 in this review? 750 is the standard. My MSI 1GB comes in at 775, and I can tell you I'm completely happy with this board. Of course... I only run mid resolutions on my 19" LCD. If I had the money to buy a huge high-res display, I wouldn't be buying a budget GPU either. This product may not be the fastest out there, but it is a good value. I'm running mine at 850/1050 and it runs at 36C under load. My 450W PSU is happy, my overclocked CPU is happy, and I'm happy. The performance is definitely on par with 9800 GTs, and with 4770s and 4850s for that matter.[/citation]

The 5670 might be on par with the 9800 GTs, but it's not even close to the 4770 or the 4850. The 4850 has double the memory and SPs ;) . So you are actually paying $99.99 for the DX11 alone, and yes, at low resolutions the card will perform well. Other than that it is a fairly weak card. You might notice more performance than expected because at low resolutions the CPU does most of the work :)

 
G

Guest

Guest
I always thought your reviews were kind of strange. You're pairing low-end video cards like the Radeon HD 5670 and GeForce 9600 GT with a high-end CPU, the Core i7-920.

Of course, the CPU doesn't make a lot of difference, but still, this is strange.

Anyway, very good review
 
G

Guest

Guest
Hey, great review! I need a little help, though...
Is the HD 5670 better overall than a two-year-old EAH4850?
 

cleeve

Illustrious
[citation][nom]ani_6991[/nom]Hey, great review! I need a little help, though... Is the HD 5670 better overall than a two-year-old EAH4850?[/citation]

No. The 4850 is better.
 

cleeve

Illustrious
[citation][nom]Anonymous[/nom]I always thought your reviews were kind of strange. You're pairing low-end video cards like the Radeon HD 5670 and GeForce 9600 GT with a high-end CPU, the Core i7-920. Of course, the CPU doesn't make a lot of difference, but still, this is strange. Anyway, very good review[/citation]

It's not strange, it's on purpose.

We're trying to isolate graphics card performance. Using a lesser CPU may shift the bottleneck to the processor and deliver useless data, capping performance at the CPU's limit rather than the graphics card's limit.

Therefore, we use a very fast CPU so that the measured differences reflect graphics power.
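As a sketch of the idea (every frame rate below is invented for illustration): with a slow CPU, two different cards hit the same cap and look identical, which tells you nothing; a fast CPU lets the real GPU gap show.

```python
# Why budget GPUs get benched with a fast CPU (toy numbers only):
# a slow CPU caps both cards at the same level, hiding the GPU gap.

def bench(gpu_fps, cpu_cap):
    """Measured frame rate: limited by the slower of GPU and CPU."""
    return min(gpu_fps, cpu_cap)

hd5670, gf9600gt = 60.0, 70.0    # hypothetical GPU-only capabilities
slow_cpu, fast_cpu = 55.0, 200.0 # hypothetical CPU frame-rate caps

# Slow CPU: both cards read 55 fps -- useless for comparing GPUs.
print(bench(hd5670, slow_cpu), bench(gf9600gt, slow_cpu))

# Fast CPU: the real 60 vs. 70 fps difference is visible.
print(bench(hd5670, fast_cpu), bench(gf9600gt, fast_cpu))
```

The fast CPU doesn't inflate the numbers; it just moves the bottleneck onto the component we're actually trying to measure.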
 

kaspro

Distinguished
Jun 1, 2010
256
0
18,780
Hey there, can I ask what custom settings were used for the Crysis test? I have the HD 4670, and playing at 1280x1024 with a 52 FPS average is amazing!!!!
 

cleeve

Illustrious
[citation][nom]kaspro[/nom]Hey there, can I ask what custom settings were used for the Crysis test? I have the HD 4670, and playing at 1280x1024 with a 52 FPS average is amazing!!!![/citation]

Sure! Just have a look at page 5, "Test Setup and Benchmarks". :)
 

jjxn

Distinguished
Jan 29, 2011
7
0
18,510
PLEASE HELP ME!!!!!

OK, so I really want this GPU for my compact PC. My computer is only 10 cm wide, so the PSU is only 300 watts; this GPU only uses around 80, so it would be great.

Now, the problem is... I can't find anywhere to buy it! When I google "ATI Radeon HD 5670", I get tons of 5670s that are all different. E.g., I look at one website, and it says it needs a 400 watt minimum, and the picture is different.

So basically, I want THIS GPU, but I can't find where to buy it! I need help. I live in Canada.

JJXN
 