ATI's Radeon 2600 XT Remixed


cleeve

Illustrious


I thought so...

When reddozen said "I'm sure with an X2 I'll be able to play damn near anything I want with very few exceptions or problems" and mentioned the projected price of the 2600 X2, I replied that "I'm pretty sure an 8800 GTS 320 will beat down a single 2600 X2 for about the same price."

- indicating that an X2 probably wouldn't be a good buy at that price, because it's a stone's throw away from the 8800 GTS 320MB.

Is there a potential inaccuracy in there that I'm missing?

 

cleeve

Illustrious


Thanks man.

Sometimes I like to have a whole bunch of cards for comparison, but sometimes I think that dilutes the point a bit, and I occasionally like to home in on two competitors. I thought the 2600 XT/8600 GT matchup deserved some focus in this case, but in my next review I might add a whole bunch of other similarly priced cards.




Yeah, if I'd known they'd perform so dismally with 4xAA and 8xAF, I probably would have benched 2xAA/4xAF or even 8xAF. Something to consider in future reviews of this class of card, for sure.
 

reddozen

Distinguished
May 11, 2007
62
0
18,630
Fair enough, and thanks for the input, guys.
I'll have to wait on a GFX purchase regardless of what I get, since I just bought a new motor for my car and a PS3.
But I'll keep your points in mind. Plus, by then I'm sure there will be some real X2 and X2 quad benches out there to look at for S&G's. Hell, they may announce a dual 2900 by Christmas. Who knows.
 

Slobogob

Distinguished
Aug 10, 2006
1,431
0
19,280


Indeed. The 7900 GS and 1950 Pro are priced quite aggressively, offering superior performance for less money. I'm quite interested in seeing a comparison between the different 2600 models, even though their performance can't keep up with the older mid-to-high-range cards.

Regarding the 2600 XT (GDDR4), I must say I've read some reviews showing it was only 3-4% faster than the regular 2600 XT. It seems memory bandwidth isn't the real problem with the 2600 series.

I'm looking forward to reading the updated article.
 

cleeve

Illustrious
Damn you and your elitism, Strangestranger! (j/k)

I think max framerates have limited usefulness, so I didn't include them; the average FPS represents what you can expect during normal gameplay, and the minimum shows the bottom when there's lag.

Maximum framerates are superfluous information IMHO: they don't really tell you anything about the user experience, but they can give a false impression and take the focus away from the important info if included in a graph.

I turned off HDR when benching Oblivion because these cards couldn't even handle 4xAA; 4xAA plus HDR wouldn't even be playable at 1024x768, which I consider a minimum.

 

kpo6969

Distinguished
May 6, 2007
1,144
0
19,290
These "mid-range" cards aren't usually my interest, but I thought I'd have a look. Good to see the use of minimum FPS; any chance of min, max, and average, and perhaps a graph? I know it's asking a lot, but it gives readers much more info, since high FPS means nothing in a game if it's fluctuating like mad and makes for a choppy experience.

Also, in Oblivion, why did you turn off HDR? We're no longer hindered by Nvidia, so was it just because the performance drop was too much, or habit? I know it's a small matter (kinda like a typo :sarcastic: ), but Oblivion just ain't Oblivion without both on at the same time.

Still, I feel sorry for these poor souls who can't afford better (patronising, ain't I).
I hope everyone in Scotland isn't like you.
BTW: Any reason you still run a 4400? I may not have your $, but I upgraded from a 4400 to a 5000 for $119.00 and it helped my $-challenged system a great deal.
 


Yeah, it's too bad it's not easier to gather that information (yeah, I know, I've been there; not even as much work as yours, and it's tough enough). I agree max FPS means nothing. The thing that would be nice to know is whether the min FPS is just a blip or consistently low. There was a game or two where the min FPS stayed almost the same with AA enabled, yet only the average budged a bit, which might lead me to believe it was bound by ROPs or something, and that it was the max FPS that was affected, not the bottom end. But unfortunately you can't tell for sure: the min may have stayed at 10 fps, but maybe it dipped to 15 more often than before, where it had only dipped to 25; or maybe AA took the bite out of the top end, and while the card still maxes out at 150 fps at the easiest point, it spends more time at 70 fps where before it was more like 90 fps. Showing max FPS wouldn't clear that up, but that's where your comments come in handy.
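For what it's worth, one way to actually answer the "blip or consistently low" question is to look at the whole frame-time distribution instead of just min/avg/max. Here's a minimal Python sketch along those lines; the log format (one frame time in milliseconds per line) and the file name are assumptions for illustration, not anything used in the review.

# Minimal sketch: distinguish a one-off minimum-FPS blip from a consistently
# choppy run. Assumes a plain-text log with one frame time in milliseconds per
# line; the format and file name are hypothetical, not from the article.

def analyze_frametimes(path, slow_fps_threshold=30.0):
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    # Instantaneous FPS for each frame, sorted slowest-first.
    fps = sorted(1000.0 / ms for ms in frame_ms)
    n = len(fps)

    # Average FPS = total frames / total seconds.
    avg_fps = n / (sum(frame_ms) / 1000.0)

    # "1% low": average of the slowest 1% of frames. A single hitch barely
    # moves this number, but a consistently choppy run drags it way down.
    worst = fps[:max(1, n // 100)]
    one_pct_low = sum(worst) / len(worst)

    # Fraction of wall-clock time spent below the playability threshold.
    slow_ms = sum(ms for ms in frame_ms if 1000.0 / ms < slow_fps_threshold)
    slow_share = slow_ms / sum(frame_ms)

    return {
        "min_fps": fps[0],
        "avg_fps": avg_fps,
        "max_fps": fps[-1],
        "1pct_low_fps": one_pct_low,
        "share_of_time_below_threshold": slow_share,
    }

if __name__ == "__main__":
    # Hypothetical log from one benchmark pass.
    for name, value in analyze_frametimes("oblivion_4xaa_frametimes.txt").items():
        print(f"{name}: {value:.2f}")

With numbers like these, a minimum that's just a blip barely moves the 1% low or the time-below-threshold figure, while a genuinely choppy run shows up clearly in both, which plain min FPS can't tell you.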

Ahhh guess we're never satisfied, but hey it's about info, so the more you give us, the more questions it brings up and the more we want.... we're greedy that way. :whistle:

Anywhoo, like I said, I enjoy the new format and look forward to the next one.
 

DBaron

Distinguished
Sep 18, 2007
2
0
18,510
Can anyone tell me what the performance of this card would be on an old Athlon XP 3200+? I would assume it would bottleneck the card quite a bit, but it HAS to be a step up from my 9800 non-Pro. AFAIK this is the only DX10 card out there for AGP. I'm currently playing the Hellgate beta, and my 9800 isn't cutting it anymore. I was leaning more towards the 7600 GT, or possibly the 1950 (possible power supply problems with that card), but this article got my attention.

Also, how much slower would the 2600 Pro be? A quick look at Newegg shows only three versions of this card in AGP: two Pro versions at around ~$110 each (DDR2), and one XT version (DDR3) at $170 (which is too much IMO). I'm more interested in the Pro, since it's in my price range. Mainly I'm looking for a card to extend the life of the machine a bit longer and gain some performance in my older games. I'm not really ready to upgrade yet, so that's not an option.
 

cleeve

Illustrious


Here's a review I wrote showing how the 7600 GT and X1950 PRO worked on an Athlon XP 2500+. The 2500+ has the same cache as the 3200+; it just has a 333 MHz FSB instead of a 400 MHz FSB, so the results should be very similar:

http://www.tomshardware.com/2007/01/10/agp-platform-analysis/

Long story short, the 7600 GT class cards worked great with the older CPU. The X1950 showed big gains in some games, but in CPU-bottlenecked games it performed the same as the 7600 GT.

The 8600 GT and 2600 XT will be about as fast as the 7600 GT, although I don't think they're available in AGP quite yet...
 
I was hoping to see a 2600 XT/8600 GT/7600 GT shoot-out with lower-end CPUs, like maybe an X2 3800+/E2140. I'm probably not the only one wanting something like this. When someone is trying to build a budget gaming rig in the $400-600 range, this kind of card can and will be considered. I know a 7900 GS/X1950 Pro is a nicer setup, but all of the reviews (at least most of them) use high-end CPUs and memory that most of us can't afford. I know that most enthusiasts don't want to see a budget system, but I do. I have a lot of other priorities in my life, so I don't have the $ to fork out for a top-of-the-line system. I just thought that most people would be interested in true budget-system benchmarks. Even if the budget system was, say, $500, you could then add an 8800 GTS to see how much of a performance increase you'd get by spending more just on the GPU. That information, I think, is more valuable than benchmarks of a $3,000 system. Sorry for the rant, but I needed to get that off my chest.

My 2cp's
 

cleeve

Illustrious


Just look for older reviews of the 7600 GT... the 2600 XT and 8600 GT perform similarly.