Radeon HD 5770 And 5750 Review: Gentlemen, Start Your HTPCs


anamaniac

Distinguished
Jan 7, 2009
2,447
0
19,790
Of course, excellent article again, Chris.
You always do wonders.

It's interesting though...
The 5750 vs. 4770... same performance, damned close to the same power usage, same with noise, and the price isn't too far off.
Seems to be just a more matured version of the 4770 (which I guess it is).

However, the extra $20 or whatever for Eyefinity is worth it.
A Matrox TripleHead2Go is quite expensive (~$300) and only supports low resolutions.
SoftTH (a software variant) also has heavy performance issues.

I'm going for 3x 23"@2048x1152 Samsung 2343BWX monitors.

HOWEVER!
One issue needs to be addressed...
Triple-head monitor stands are VERY expensive (and one is needed if I want to prop up a triple-head solution in portrait for more vertical pixels).

 ---   ---   ---
|   | |   | |   |
|   | |   | |   |
|   | |   | |   |
 ---   ---   ---
(3456x2048, 27:16 aspect ratio)

As opposed to:

 -----   -----   -----
|     | |     | |     |
 -----   -----   -----
(6144x1152, 48:9 aspect ratio)
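
A quick sanity check of those two layouts in Python (purely illustrative; the only real inputs are the 2048x1152 panels mentioned above):

from math import gcd

def triple_layout(w, h, portrait=False):
    """Combined resolution and reduced aspect ratio of three identical panels in a row."""
    if portrait:
        w, h = h, w                      # rotate each panel 90 degrees
    total_w, total_h = 3 * w, h
    g = gcd(total_w, total_h)
    return total_w, total_h, f"{total_w // g}:{total_h // g}"

print(triple_layout(2048, 1152, portrait=True))  # (3456, 2048, '27:16')
print(triple_layout(2048, 1152))                 # (6144, 1152, '16:3'), i.e. 48:9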


Also, the Matrox DualHead2Go and TripleHead2Go had bezel management: if you're playing a game, you can simulate pixels in the area where the bezel is, to prevent awkward views in games such as racing sims (at the cost of a small blind spot, however).
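
To make that concrete, here's a rough sketch of the bezel-compensation math; the ~0.25 mm pixel pitch roughly matches a 23" 2048x1152 panel, but the 20 mm bezel width is just an assumption for illustration:

# Bezel compensation, roughly: hide the pixels that would physically sit behind each seam.
# All numbers are assumptions for illustration, not specs from this thread.
PIXEL_PITCH_MM = 0.249       # ~23" panel at 2048x1152
BEZEL_MM = 20                # assumed bezel width per adjoining monitor edge

panels, width_px, height_px = 3, 2048, 1152

gap_px = round(2 * BEZEL_MM / PIXEL_PITCH_MM)             # pixels "hidden" per seam
virtual_width = panels * width_px + (panels - 1) * gap_px

print(f"{gap_px} px hidden per seam -> virtual desktop of {virtual_width}x{height_px}")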

I want the 5870 to go down in price already. =/
I understand the 5850 is only knocked out by a small amount, but yeah...
Plus, when the hell are we going to get the 2GB version? And when will they fix the Eyefinity CrossFire issues, and allow a passive DP-to-DVI adapter (instead of forcing us to cough up a good chunk of change for an active DP-to-DVI)?

And most importantly of all...
Someone got Zelda: Ocarina of Time working properly at 5760x1200 (yeah, it's an N64 game)...

Why does WC3 only support 4:3 and 5:4 aspect ratios and a max of 1280x1024? Why hasn't someone fixed that yet? It still has a decent user base...

I think one of the first games I will use my super widescreen on is Oblivion...
 

kelfen

Distinguished
Apr 27, 2008
690
0
18,990
Eyefinity is definitely worth having if you ever have enough money to purchase the monitors and are a heavy multi-tasker; being able to read the forums while you wait to find a group to raid is rather nice. For WC3, I'm not sure what the max resolution is, but I suppose it is not 1280x1024.
 

greevar

Distinguished
Oct 13, 2009
7
0
18,510
"Actually, this is more about allowing the playback software to pass that audio data. The path has to be implemented in such a way that the high-definition soundtrack doesn’t become accessible to a malicious user who’d happily intercept the DRM-free audio."

Any consumer who supports this technology is shooting themselves in the foot. I do not want the MPAA, RIAA, and hardware manufacturers dictating how I use my media. I bought it, and if I want to rip the audio for my own use, I am allowed to do so under the fair-use provisions of copyright law. "Protected path" is just a tool to get you to buy more of what you already paid for.
 

Aerobernardo

Distinguished
Apr 2, 2006
135
0
18,680
Echoing the other comments praising Angelini, these last GPU reviews were great. I would like to highlight the need for benchmarking at 1920-wide resolutions even on high-tier cards, since they're becoming mainstream from what I've seen.

With the 4850 review I also picked my card, but there was a problem: since I needed it this month and it was barely available anywhere, most vendors were asking a price premium that brought the price difference to the 5870 down to 8%. In that case I opted for the 15% performance increase, even though I am concerned (to the point of almost regretting it) about noise, especially because my case (HAF 932) exposes all the noise a graphics card makes.
 

trinix

Distinguished
Oct 11, 2007
197
0
18,680
Great article, kudos. Too bad the 5700 series is not what we all expected yet, but with a nice price drop or better drivers it might be competitive.

Also, don't forget these are reference boards. I don't know for sure, but I think board makers can add different outputs to the cards, so I would watch for whether such versions get released. I've seen some do really odd things with boards, too.

And even at this price, it's not that bad. The features are great, and it will serve you for a long time, especially if you are a gamer who games on one card for many years. So if you are upgrading from an HD 2000 or HD 3000 series card, it would probably be worth it: you won't have to buy the 6000 series for DX11 support in a year or two, and can continue your upgrade path and wait until the 7000 or 8000 series for a new card.

Interesting times. Can't wait for the 5600 series, or whatever the budget cards are called, to see how they compare to the new 210 and 220, lol.
 

jestersage

Distinguished
Jul 19, 2007
62
0
18,630
Great review, again, Chris and Tom's.

I think the currently elevated prices of the reviewed cards have a rationale rooted more in economics and production than in price-to-performance positioning.

Since the 4770 launch flop, I'm betting AMD learned their lesson regarding stock. With the current pricing, most folks will opt for the better-performing yet less expensive last-gen cards. This lets AMD and their board partners sell out their existing last-gen stock, all the while giving them time to build up enough supply of the reviewed GPUs for when prices go down again, as tends to happen when the competition brings out its guns, supply increases, or both.
 

cinergy

Distinguished
May 22, 2009
251
0
18,780
"But the real shocker happens when you enable PhysX in this one" Yeah, some shocker all right when ATI doesn't support it. You just had to add physx to give some shine for nvidia. This closed API lasts exactly amount of time how long nvidia has afford to fund game houses and before open standards start kicking in.
 

scrumworks

Distinguished
May 22, 2009
361
0
18,780
[citation][nom]ambientmf[/nom]What's the benefit of DirectX 11 capabilities if the cards are worse performing than last gen cards in DX9/10 games? I'd rather get a 4800 series card, being a gamer myself, for slightly better framerates. I can see the other benefits for the hardcore HTPC crowd though.[/citation]

My educated guess is that the benefit of having DX11 capabilities will emerge when the next generation of DX11 games starts to show up. Let's see which card performs worse then.
 

lowguppy

Distinguished
Apr 17, 2008
192
0
18,710
Aggh, those charts are hard to read. List them in some kind of logical order and, better yet, put the cards/configurations that are going head to head price-wise next to each other. Differentiating the new cards would be good too, as suggested.
 
DX11 for the masses!
As for Nvidia, I'll stick with my G80 and G92 since they can still be had at a reasonable price, while GT200 supply dries up and gets even more expensive. I have my sights on the 5770 or the 5850 for my CrossFire box.
 

dlux

Distinguished
Oct 6, 2009
37
0
18,530
Yeah this is kind of disappointing. Guess I'll wait to see what Nvidia throws out come next year (or whenever) before I dump the ol' trusty 8800GT.

5850 looks tasty though for the price.
 

verrul

Distinguished
Jun 29, 2009
80
0
18,630
I would like to see an AMD bottleneck comparison too; nice to see that MHz speed isn't that relevant for gaming. I just wonder when AMD is going to get their head out of their rear, stop using dual-channel memory, and switch to triple or quad channel, as that is where AMD is really lagging behind the i series.
 

cleeve

Illustrious
[citation][nom]verrul[/nom]I would like to see an amd bottleneck comparision too nice to see that mhz speed isn't that relevant for gaming. I just wonder when amd is gonna get their head out of their rear and stop using dual channel memory and switch to a triple or quad as that is where amd is really lagging behind the i series[/citation]

From what we've seen, it's not a MHz thing... it's an architecture thing. The i7/i5 will kick the Core 2 Quad's and Phenom II X4's tail in most game titles by a surprising margin. Even when resolution is raised high enough that you'd assume the bottleneck would move to the graphics cards, there are some surprising i5/i7 wins. Games really seem to like the i5/i7 CPUs.

Dual channel vs. triple channel, there's not much difference there. The i5 is limited to dual-channel, and it's just as fast as the i7s for all intents and purposes.
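
For context, a back-of-the-envelope peak-bandwidth comparison (a sketch using the officially supported memory speeds of that era, not measured numbers):

# Theoretical peak bandwidth: channels * 8 bytes per transfer * transfers per second.
def peak_gbs(channels, mt_per_s, bus_bytes=8):
    return channels * bus_bytes * mt_per_s / 1000  # GB/s (decimal)

print(f"Core i5, dual-channel DDR3-1333:   {peak_gbs(2, 1333):.1f} GB/s")  # ~21.3
print(f"Core i7, triple-channel DDR3-1066: {peak_gbs(3, 1066):.1f} GB/s")  # ~25.6

Neither figure is close to being the gaming bottleneck in practice, which lines up with the observation above.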
 

dlux

Distinguished
Oct 6, 2009
37
0
18,530
[citation][nom]Cleeve[/nom]From what we've seen, it's not a MHz thing... it's an architecture thing. The i7/i5 will kick the Core 2 Quad's and Phenom II X4's tail in most game titles by a surprising margin. Even when resolution is raised high enough that you'd assume the bottleneck would move to the graphics cards, there are some surprising i5/i7 wins. Games really seem to like the i5/i7 CPUs.Dual channel vs. triple channel, there's not much difference there. The i5 is limited to dual-channel and it's just as fast as the i7's for all intents and purposes.[/citation]

Maybe a dumb question, but are games programmed for Intel's architecture, much the way a game may play better on Nvidia cards than on ATI because it was built more for that particular graphics card? If so, isn't it the programmers' fault, then, simply because they optimize for those processors so that they run more smoothly on Intel machines?
 

Guest

Guest
Cleeve the i7 fanboy: I seem to recall plenty of reviews suggesting that any Phenom II X4 BE CPU performs about the same as an i7, save for a few "special" games.

See:

http://www.tomshardware.com/reviews/core-i5-gaming,2403.html


Although you are right about the triple-channel memory, and the same goes for Hyper-Threading: people assume that whenever the i7 wins a benchmark, those are the reasons. They are not. Just as SSSE3 (with three S's) was one of the biggest factors in Core 2's dominance, SSE4.2 is the big factor in Core i7's occasional dominance in things like video encoding.
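
If anyone wants to see which of those extensions their own CPU reports, here's a quick illustrative snippet for Linux (flag names as they appear in /proc/cpuinfo):

# Read the CPU feature flags and report the SIMD extensions mentioned above.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for ext in ("ssse3", "sse4_1", "sse4_2"):
    print(f"{ext}: {'yes' if ext in flags else 'no'}")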
 

invlem

Distinguished
Jan 11, 2008
580
0
18,980
*DROOLS*

I finally have something to replace the 4550 in my HTPC now; I refused to pay over $200 for a sound card just to get bitstreaming to work. Not to mention that with a micro-ATX HTPC case I'm trying to keep the card count to a minimum: right now I only have the video card and a Wi-Fi N card, and I'd like to keep it that way.

I'm glad ATI put this feature in their cards.

I'd like to see some passively cooled cards hit the market, though. I'm sure one of the manufacturers will have something in the near future for HTPC users who must have absolute silence in their systems.
 

subox247

Distinguished
Sep 17, 2009
4
0
18,510
"The path has to be implemented in such a way that the high-definition soundtrack doesn’t become accessible to a malicious user who’d happily intercept the DRM-free audio." What the hell! As if someone would actually go to the trouble of capturing audio that way; it's called AnyDVD HD, and voila! Anyway, if you buy a Blu-ray movie you should be able to bitstream the audio or do whatever you want with it; this only stops legitimate consumers. People who want to copy lossless audio won't have any trouble getting it. This only hurts the person who sincerely wants to watch the movie on an HTPC and can't afford expensive cards.
 

niknikktm

Distinguished
Jan 29, 2009
39
0
18,530
Ummmmm... did I miss something? The title of this article makes reference to this being a good fit for an HTPC build, as does some of the text.

Why, then, are all of the benchmarks about gaming? You test these cards on a plethora of games but not one video application. Shouldn't there at least have been a couple of tests covering DivX and H.264 video playback as well as transcoding? That was the whole gist of the introduction to this card. If it is suggested for an HTPC, then it should have been tested on more than just games.

Sorry. I call 'em as I see 'em. You dropped the ball big time.
 