ATI's Radeon 2600 XT Remixed


utaka95

Distinguished
Jan 12, 2006
I really don't wanna take sides, but... the little typos and editing mistakes that run rampant on the net take away some of the net's credibility. I still have people say things to me like "where did you hear that? On the internet?" meaning they don't give the net any real respect (like television is some bastion of truthfulness :sarcastic: ). Attacking someone because they are pointing out the lack of editing and oversight on the net seems kinda counterproductive, IMHO. Snyper may have written something that didn't sit well with the writer, but I believe he was truly being altruistic in his desire to point out a mistake. I recently read an article somewhere on Tom's where the writer called 1366x768 720p, so I took the rest of the article with a grain of salt, because the writer must not have tried ACTUALLY using an HDMI cable. Was my rush to judgment called for? Probably not, but that was my gut feeling about the article after seeing that. So I have to agree - little mistakes tend to flush the writer's credibility down the tubes with the more educated readers. And I'm not looking for a flame war - just adding my 2 cents.
 

cleeve

Illustrious
Once again - I certainly don't have a problem with having my mistakes pointed out. And yes, I will make them. That's the kind of guy I am; I'm not perfect.

I like to think my conclusions are for the most part pretty solid, though. The gist of it, the conclusion, what a person takes away from an article is what I'm going to concentrate on. To be honest, I'm not sweating the resolution typo all that much. It has very little bearing on the point of the article. Now, if it turned out that I was wrong on pricing and the competing 8600 GT card was $20 cheaper than I thought, hell, I'd be damned ashamed of myself. That would have a profound effect on the conclusion.

Now Mr. Snyper felt the need to point out my mistake, and I'll reiterate: that's a good thing. If mistakes aren't pointed out, they can't get fixed, and might be repeated in the future. I think we can all agree on that.

I just don't think pointing out the mistake necessarily had to be accompanied by what I see as a derogatory statement like that. What kind of value did that statement add? Did it assist me in learning of the mistake? No.

We can argue it up and down, but I can guarantee this: if his post had simply and respectfully mentioned the mistake - instead of suggesting that the rest of the article was suspect because of it - all of this extra stuff wouldn't have happened.

I'm sorry, but I don't see the value of the derogatory comment. If you guys do, great.
 
Absolutely spot on, Cleeve. If people took the time to sit and think about how they worded posts, I reckon about half of these types of threads could be avoided. Now, I'm not saying I'm perfect; I've worded posts badly myself before.
I don't think Snyper intended to be insulting, as he pointed out himself, and if he was trying to be confrontational then he would have seized on lostandwandering's joking post.
So I for one would like to welcome him and the input he will bring to the forum.
Oh, and Cleeve, keep up the good work :)
Mactronix
 

shargrath

Distinguished
May 13, 2007

There are volt mods out there that will get you to a 1 GHz core with air cooling (albeit with some funky custom air cooler).
I think they could have waited for the 7.9 Catalysts to be released before they did this article.
 
Volt mods are nice for the enthusiasts, but I think I will stick to stock :) Overclocking has me worried, never mind modding :(
I guess they could have waited for the drivers, but then we would be slating them for the review taking too long.
Rock and a hard place, I guess :)
Mactronix
 

Slobogob

Distinguished
Aug 10, 2006


A good buy is a relative thing, as you have pointed out yourself. And, as your article shows, the 2600 XT seems to be a decent buy at that price point. The question is, who buys it? For those who may be interested in a 2600 XT, it usually represents an upgrade or a component for a new build.
For upgraders, a comparison is important to see whether it is a worthy upgrade or not. A comparison with the 7600 GT and similar cards would be a good idea.
For those building a new computer, the budget can easily shift. Buying a smaller processor might free up enough money to turn a 2600 XT (700 MHz) into a 2600 XT (1100 MHz) if needed, or even an entirely different card.

The big problem I see here doesn't actually lie within the article, but within the card itself. The expectations for the 2600 series were high and AMD didn't meet them. At first the prices were off and the drivers were bad; now they change the memory clock speed while drivers and pricing have improved. Overall that creates a situation of high reader expectations. Some want to see how the new drivers perform, or to compare the 800 MHz version to the 700 MHz version. Some just want to know how it performs at all, or whether it is a worthy upgrade, or whether it will work with their media PC, etc.

I think a comparison between the old model and the new model would be a good addition. Comparing them with the older 7600 GT and/or X1650 XT would add a lot too - but doing it all would certainly blow it out of proportion. It's easy to criticize, so I hope I haven't worded my reply too harshly.
 

Jakc

Distinguished
Apr 16, 2007
Again, as far as I know the old model also has 700 MHz memory.
Maybe not the reference design, but the regular 2600 XT from Sapphire does.
 

cleeve

Illustrious
I hear you, Slobogob.

I'll try hard to get a GDDR4 version (and a 700 MHz mem version if I can get one) for an upcoming review for comparison.

My gut tells me the GDDR4 version won't be worth consideration since its price puts it in X1950 PRO territory...
 

prodystopian

Distinguished
Jul 7, 2006
Wait too long and the refresh cards will be out. They might just be much more worthwhile. Then again, we don't know the pricing of the Gemini or the new cards.
 

reddozen

Distinguished
May 11, 2007
Except that the card still comes with the crossfire bridge, so you're talking quadfire. Also, I don't think that the cards will maintain that price. We'll see since the actual list price hasn't been posted yet.
 

prodystopian

Distinguished
Jul 7, 2006
Quadfire, that sounds interesting. And you do have a good point about the price. They could be similar to the 2600xt, not good at the original price point, but finding a new home at a lower price later.
 

reddozen

Distinguished
May 11, 2007
What's only DDR2?

The 2600 comes in DDR2, 3 and 4.

X2's will be DDR3.

EDIT:
I'll rephrase that.
The GeCUBE X2's are DDR2.
The Sapphire X2's are DDR3.
 

San Pedro

Distinguished
Jul 16, 2007
Dual 2600 XTs in CrossFire don't seem to me like they will be a good value. Might as well just get an 8800 or HD 2900, but one 2600 XT for $100 seems to be a decent value.
 
I agree I'd like to see other cards in comparison, but for what the goal was I liked it, and love the addition of the min fps.

I think it's pretty obvious that the lack of ROPs and dedicated AA resolve hardware just kills them, and unlike the HD 2900, the shader power difference isn't enough to help with AA when the GF8600 had 8 ROPs to do the work and the HD 2600 had to go back to the shaders to do the AA. I did find it interesting, though, that in D3 the performance change went the other way around, and I can't figure that out based on the hardware AA issue. So it's obviously not a cut-and-dried case; maybe the ROPs of the HD 2600 were holding it back in the pixels it could render, and thus the shader power wasn't being maximized in the older title. But it was an interesting change from the usual 'HD series loses because of AA' position we've all tended to see and come to accept.

I would've liked to see a low but nonzero level of AF applied to the no-AA setting, like 0xAA and 4xAF, as I've seen the performance impact is almost nil, but the visual effect is noticeable. Of course, as a baseline, I think more people would (IMO wrongly) complain that they need a 0/0 baseline.

Cleeve, if you do re-run it, how about checking a mid-grade IQ level, where 2xAA and 4xAF seem to offer a good balance of improved IQ and maintained performance? Just a personal desire based on my past experiences with the midrange and the sweet spots I've found for those 128-bit solutions.

I like the new style, though. I think more info (like min fps) with slightly fewer cards is a good thing, but adding 2 more reference points, like the GF7600GT and X1950 Pro, would understandably give people additional upgrade information on whether each solution is worth it (worth it to move from a GF7600GT, or choose those instead of it, or also spend a few more and go to the X1950 Pro?).

Great work though!
 

reddozen

Distinguished
May 11, 2007


The only problem with that idea is this:
2900 = $400~$500
2600 X2 = $250~$270

I'm sure with an X2 I'll be able to play damn near anything I want, with very few exceptions or problems. Not to mention that I honestly don't see the point of blowing $400+ on a video card that will be obsolete in 6 months, when the alternative costs 25%~50% less. I'm more than happy to bide my time and wait for a price drop on any and everything. I'm not trying to divert from the fact that the 2900 is a better card; it's just that not everyone is willing to pay so much for a video card.
 

sojrner

Distinguished
Feb 10, 2006


here is my problem with your logic:

a higher-end card on release = $400-$500.
after 6 months, the card is hardly obsolete in the sense of functionality or market demand... and THAT determines when the price drops just as much as a newer model release does. Take the 8800gts right now: Released almost a YEAR ago and it is still only ~50 bucks under your magic $400 mark. It is also hardly experiencing functional obsolescence...

Take my current 1900 XT 512, which was $500 right when it came out (which is when I bought it). My logic has always been to buy the most you can afford at the time, always reading reviews and watching the tech directions... So at the moment I was upgrading, the 1900s came out and I jumped. I gamed for 6 months and still saw no drop in price. (I always look back on what I bought to modify my thinking for the next upgrade.)

I am (obviously) still gaming on this card very well w/o being forced to make any sacrifice in quality. I have not seen a game force me to lower settings... yet... (crysis may do that, or UT3... but we will see) It is now well over 1.5 years old and still very viable. If I had waited until the price came down I would have saved (maybe) $100 but been gaming on my finally overtaxed 9700pro that was over 3 years old. (which I also bought at $400 I believe)

Now, my 9700pro is still living in a secondary system... but was my primary gamer for over 3 years and rocked it. I expect to at least get 2 years out of this 1900 if not more. (can't expect it to last like the workhorse 9700 but it is possible) So basically over 5 years I have bought 2 cards and been gaming at or near the top-end for all that time.

Every friend I had who bought the 9600/FX5700 mainstream cards after I got my 9700 was replacing them with an x700 or 6600 (or a now cheaper 9800) as soon as those were released. My 9700 (oc'd by then) was still viable but showing its age by the time of the X1000/7000 generation, so I went to the 1900. They then jumped to a 7600 (or a now cheaper 6800/x850) and/or then jumped to an 8600.

Figure it out: even setting the baseline price for those cards at $220 (generous, IMO), you have 4 different card purchases for ~$880. In that same time, mine was 2 purchases at $900. For my extra $20 I only turned down my settings toward the end of the life of my 9700, which brought me to the level my friends were gaming at (about 6 months). For the rest of that 5 or so years I was (and still am) gaming at the top end of visual quality. If you upgrade less often, then you are REALLY scraping the bottom of the visual barrel, like some who are still using a 6600 on games like Oblivion.
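That back-of-the-envelope math checks out; here's a quick sketch of it (the card names and prices are the post's own rough figures, not exact retail numbers):

```python
# Compare cumulative spend over ~5 years for two upgrade strategies,
# using the ballpark prices from the post above.

def total_spend(purchases):
    """Sum the cost of a list of (card, price) purchases."""
    return sum(price for _, price in purchases)

# Mainstream path: four ~$220 cards, replaced every generation.
mainstream = [("9600", 220), ("x700", 220), ("7600", 220), ("8600", 220)]

# High-end path: two ~$400-$500 cards, each kept for years.
high_end = [("9700 Pro", 400), ("1900 XT 512", 500)]

print(total_spend(mainstream))  # 880
print(total_spend(high_end))    # 900
```

Roughly the same money either way; the difference is how much of that time is spent gaming at top-end quality.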

What I am saying is that you end up spending the same but getting a better gaming experience by getting a high-end card when you upgrade. They last longer so you upgrade less (as long as you didn't buy the high-end FX5800 ;) )

Of course, if you just plain don't have the scratch... then my argument is moot and you have to do what you gotta do...

...rock on man.
 
