R600: Finally DX10 Hardware from ATI

It's good that ATI finally has a DX10 card out (how long have we been waiting?), but I think it might be premature to buy based on speculation about DX10 games that still do not exist. For now, I think I'm going to wait until DX10 games come to market, as DX9 cards offer significant savings.

Until actual DX10 games exist, it is somewhat speculative to say which card is best. And once DX10 games are out, there might be second-generation DX10 cards that outperform the current ones, and we will be able to see how these cards actually behave in a real DX10 gaming environment.
 
Yeah, but the 8-pin power connector on the 2900 XT for overclocking tells you it needs more power. Tell me if I'm wrong.
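For context, a minimal sketch of the power arithmetic behind that remark, assuming the published PCIe limits (75 W from the slot, 75 W per 6-pin plug, 150 W per 8-pin plug); the 2900 XT carries one 6-pin and one 8-pin socket:

```python
# Back-of-the-envelope PCIe power budgets (published connector limits).
PCIE_SLOT_W = 75     # deliverable through the x16 slot itself
SIX_PIN_W = 75       # per 6-pin auxiliary plug
EIGHT_PIN_W = 150    # per 8-pin auxiliary plug

# A card with two 6-pin sockets vs. the 2900 XT's 6-pin + 8-pin layout:
two_six = PCIE_SLOT_W + 2 * SIX_PIN_W                   # 225 W ceiling
six_and_eight = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W ceiling
print(f"6+6 budget: {two_six} W; 6+8 budget: {six_and_eight} W")
```

The 8-pin socket raises the card's available ceiling from 225 W to 300 W, which is why its presence reads as a signal of higher draw.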
 
Dude, you're right. Before the R600 launch they were saying the 2900 XT was a GTX killer, lol. But after launch it's another excuse: the drivers will improve, or the R650 will be the GTX killer. Wake up, ATI people: Nvidia will also release an 8900. And saying that the 2900 XT was built from the ground up for DirectX 10 is bull. Try running any DirectX 10 demo. Oh wait, you can't. Lost Planet is optimized for Nvidia, okay, but the 2900 XT should at least run it. Or try the other DirectX 10 game demo. Oh, that doesn't run on the 2900 XT either (maybe the 2900 XT is too superior to run DirectX 10 games, LOL). And now wait another seven months for DirectX 10 drivers from ATI (then it will be a GTX killer, lol). The XTX will be a GTX killer, oh yeah, by being a year late ("XTX here: I give 5 more frames than the GTX after being a year late, and I need a dedicated power supply and nitrogen cooling"). Get down off your high horse and accept that it was not the killer AMD made you believe it was. Nvidia screwed up their 5800, and ATI did the same with their R600. Big deal, life goes on. In six months we'll get newer cards, and everybody will forget the 8800 and the 2900 XT.


VERY WELL SAID 8)
 
Rob has a point? Please. He is a bitter troll who deliberately says what he does, like any good troll; you can never be sure what he actually thinks.

I do have a point in what I am saying regarding the R600 vs. the 8800 GTX. Why don't you try and prove me wrong?

Try and prove me wrong that:
The R600 uses more power than any current GPU.
Try and prove me wrong that:
The R600 is priced $80-$100 above its so-called competition, the 640 MB 8800 GTS.
Try and prove me wrong that:
The R600 gets beaten in image quality by the G80.
Try and prove me wrong that:
All the ATI fanboys have been ranting and raving for months that the R600 was going to be an 8800 GTX killer.
Try and prove me wrong that:
AA and AF absolutely eat up the performance of the R600.
Try and prove me wrong that:
The R600 is the hottest (temperature-wise) card on the face of the planet.


People want to know why I get so heated. Well, it's because of all the douches in denial over the obvious facts, just like you, Strangestranger. :roll:
 
All you can do is laugh at this rob guy. His posts aren't worth anything else.

And all I can do is laugh at your stupid ass, because on this topic you lack the intelligence to actually add anything useful besides telling me my posts are not worth anything. :roll:

I am fuking 100% right in everything I have stated thus far concerning the R600 vs. the 8800 GTX.

Lol, if anyone lacks intelligence, it's you, my friend. Don't even try to tell me that you think you're intelligent... you remind me of the Incredible Hulk: all you do is get blinded by your emotions. Haha. You're just too funny.
 
Nah, I'm having too much fun with someone as "intelligent" as you. I've already looked at plenty of reviews/benchmarks to know the difference between the 2900 and the 8800, so there's no point debating that, especially in a thread that's supposed to be all about the 2900 and not some fool's argument about how uber he and his GPU are. On that note, shouldn't you be off masturbating to the super FPS and resolution of your GPU? Keep on posting and I'll keep on laughing.
 
Nah, I'm having too much fun with someone as "intelligent" as you. I've already looked at plenty of reviews/benchmarks to know the difference between the 2900 and the 8800, so there's no point debating that, especially in a thread that's supposed to be all about the 2900 and not some fool's argument about how uber he and his GPU are. On that note, shouldn't you be off masturbating to the super FPS and resolution of your GPU? Keep on posting and I'll keep on laughing.

Thanks for proving my point about you :roll: :lol:
 

you're still at it?
 
I'd be interested in first-hand experience of this card. I 'upgraded' to a 7600 GT about six months ago, and the image quality was awful in both 3D and video compared to my previous ATI. It didn't even have FSAA enabled in the default driver settings (choosing not to AA textures with transparency).

Sure, nVidia gets the best FPS, but they cheat and take shortcuts.

The drivers were (and still are) buggy too; I never really had a problem with ATI's that wasn't fixed within weeks.

Give me a functioning ATI card with its higher-quality 3D and video over a few extra FPS and a dull, lifeless image any day 😀

EDIT: I have just checked some comments/reviews that say the nVidia cards are better than ATI on 3D quality... even if most agreed ATI is better, the ATI drivers are pretty immature, so I guess we'll just have to wait and see what comes out of ATI in the next few weeks to really produce a decisive result one way or the other 😀
 
Sure, nVidia gets the best FPS, but they cheat and take shortcuts.

I am all for ATI; hell, I really wanted to go with ATI this gen. I got a CrossFire board and everything. After a few months of waiting I said "what the hell" and got an 8800 (thanks to grape ape and robsx2/robslin/green768/slinrob, who advised me not to wait anymore; ironically, I think that's the only time they ever agreed on anything).

Please explain in detail how nVidia is cheating and taking shortcuts.
 
This thread was doing well. Let me say: if you can't spell it... like STFU, fuk, or WTF... it seems some guy here (I'm guessing it's a guy) keeps using these over and over. Maybe, and I'm just tossing this out, if you can't spell the word, don't abbreviate it? Lol, yeah, you're the smart one of the bunch. Bud, you sit behind a computer saying nothing but words. That's the best you can do? YAWN.

Does anyone remember when the 8800 first came out? WOW, it made Vista fly... three months later... NOT! Now, unlike Rob, I can show you site after site where people were pissed and ready to lynch Nvidia over the lack of drivers. It's funny how you don't hear anything about that, only now. Oh, and pat yourself on the back for that new 8800 you bought in November: it runs DX9 games super fast! So ATI comes out and, guess what, yep, their drivers are raw. But what is ATI really late for? Nothing, just late on their word about when the card would come out.

Now me, I always have one of each ATI/Nvidia card, so when I talk it's not from what I have read but from what my computers show me. Three computers, but testing on two: one AMD X2, one Intel Duo. LOL, for me, on both computers the 8800 does not run on par in Vista with how it runs in XP; XP is a lot faster, but I guess it's just me. Oh yes, let me guess, it must be the wrong beta drivers, and there's a ton out now. ATI in Vista runs games fine, but at the same speed as my X1900. Now in XP, WOW, every game is more than double the FPS my X1900 got, though it has fog problems and little things like that. And you have to be kidding me when it comes to FPS: if I see 115 FPS, I don't give a crap whether ATI or NV can do 150.

Anyway, I know I jumped around, but ATI just came out, and the only ball they dropped was the "6 months" release. If I'm lucky I'll get to see how it does in Crysis and WoW, and by then all this "ATI sucks" or "ATI is late" talk will be all gone. So behind or ahead of NV, I don't care. I love both. Love NV for doing 3D...
 
Sure, nVidia gets the best FPS, but they cheat and take shortcuts.

I am all for ATI; hell, I really wanted to go with ATI this gen. I got a CrossFire board and everything. After a few months of waiting I said "what the hell" and got an 8800 (thanks to grape ape and robsx2/robslin/green768/slinrob, who advised me not to wait anymore; ironically, I think that's the only time they ever agreed on anything).

Please explain in detail how nVidia is cheating and taking shortcuts.

I totally understand.

I wanted to go with ATi this generation as well. I moved my nF4 SLI 939 mobo and 4400+ to my tech bench at work, and bought an ASUS A8NR-32 MVP and an Opty 170 OC'd to 2.9 GHz to be ready for the R600.

I just ripped out my X1800XT 512 and gave it to a special friend :wink: and installed an EVGA 8800GTS 320MB (Superclocked) 😀 .

I wanted my existing games to run even better at 1680x1050 with more than 4X AA, and this card guarantees that. Now I'll sit back and relax and see what wins the Crysis war...

PS: This card is VERY loud, and it takes a while after you shut the game down for it to return to idle speed. So people poking fun at the R600's noise can STFU.
 
I just ripped out my X1800XT 512 and gave it to a special friend :wink: and installed an EVGA 8800GTS 320MB (Superclocked) 😀 .

I wanted my existing games to run even better at 1680x1050 with more than 4X AA, and this card guarantees that. Now I'll sit back and relax and see what wins the Crysis war...
STFU

Hi mate

What's your screen size? I am planning to buy an 8800 GTS 320 MB too, but I'm not sure it'll cope well with a 22" widescreen.
 
Sure, nVidia gets the best FPS, but they cheat and take shortcuts.

I am all for ATI; hell, I really wanted to go with ATI this gen. I got a CrossFire board and everything. After a few months of waiting I said "what the hell" and got an 8800 (thanks to grape ape and robsx2/robslin/green768/slinrob, who advised me not to wait anymore; ironically, I think that's the only time they ever agreed on anything).

Please explain in detail how nVidia is cheating and taking shortcuts.

Well, Nvidia got caught before, optimizing for 3DMark (it was 2003, if I remember correctly) by reducing a lot of image quality.
I also remember one website a looong time ago (when the X1800 XT was out and the 7800 GTX 512 was considered king) claiming that Nvidia didn't even process the full DirectX standards, forcing all the graphics from 32-bit mode down to 16-24-bit mode (compressing it when rendering) to get more FPS... it might have been a rumor, but who knows.

I wonder if that site is still up 😵
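To make the alleged shortcut concrete: the claim above amounts to quantizing 32-bit color down to 16 bits to halve framebuffer traffic. A purely illustrative sketch of that quantization (not based on any actual driver code):

```python
# Illustrative only: dropping from 32-bit RGBA8888 to 16-bit RGB565 halves
# per-pixel memory traffic, which is the kind of trade the post alleges.

def rgb888_to_rgb565(r: int, g: int, b: int) -> int:
    """Quantize 8-bit-per-channel color into a packed 16-bit RGB565 word."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_rgb888(p: int) -> tuple[int, int, int]:
    """Expand back to 8 bits per channel; the low-order detail is gone."""
    r, g, b = (p >> 11) & 0x1F, (p >> 5) & 0x3F, p & 0x1F
    # Replicate high bits into the low bits to cover the full 0-255 range.
    return (r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2)

original = (200, 123, 57)
print(original, "->", rgb565_to_rgb888(rgb888_to_rgb565(*original)))

# Framebuffer cost at 1600x1200: 4 bytes/px vs. 2 bytes/px.
px = 1600 * 1200
print(f"32-bit: {px * 4 / 2**20:.1f} MiB; 16-bit: {px * 2 / 2**20:.1f} MiB")
```

Half the bytes per pixel means roughly half the fill bandwidth, hence the FPS gain if a driver quietly made that substitution.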
 
I just ripped out my X1800XT 512 and gave it to a special friend :wink: and installed an EVGA 8800GTS 320MB (Superclocked) 😀 .

I wanted my existing games to run even better at 1680x1050 with more than 4X AA, and this card guarantees that. Now I'll sit back and relax and see what wins the Crysis war...
STFU

Hi mate

What's your screen size? I am planning to buy an 8800 GTS 320 MB too, but I'm not sure it'll cope well with a 22" widescreen.

Mine is a 22". If I had a 24-26" (1920x1200/1080) I wouldn't have gotten a 320 MB card; I would have at least bought a 640 MB, if not a GTX.

I run everything at 1680x1050, or 1600x1200 if widescreen doesn't work, with a minimum of 4X AA. So far everything runs much quicker than on my 512 MB X1800 XT, except FEAR. It runs like crap now, but I think it's because I'm using a Logitech G5 mouse; I've read that a lot of the Logitech G-series gear kills FEAR's framerate for some stupid reason.

A 320 MB model is fine at 22" for today's games. Tomorrow, who knows, but whatever.

Just keep in mind you can find stock-clocked 640 MB models for the price of the OC'd 320s. I was actually going to buy a cheap 640, but I would have had to wait and order it, as no store in town had them.

I got impatient and talked a bargain out of a store that had this one in stock.
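For anyone weighing the 320 MB question, a rough sizing sketch may help. It counts only the fixed color/depth buffer overhead under MSAA (textures dominate the rest of VRAM), and the per-pixel byte counts are common defaults, not measurements from any particular game:

```python
# Rough MSAA buffer sizing: color and depth are stored per sample, so the
# fixed buffer cost scales with resolution and sample count.

def msaa_buffers_mib(width: int, height: int, samples: int) -> float:
    bytes_per_sample = 4 + 4            # 32-bit color + 32-bit depth/stencil
    multisampled = width * height * samples * bytes_per_sample
    resolved = width * height * 4       # resolved (displayable) buffer
    return (multisampled + resolved) / 2**20

for w, h in ((1680, 1050), (1920, 1200)):
    print(f"{w}x{h} @ 4x MSAA: {msaa_buffers_mib(w, h, 4):.0f} MiB")
# ~61 MiB at 1680x1050, ~79 MiB at 1920x1200
```

The AA buffers themselves fit comfortably in 320 MB at either resolution; it's the texture budget of newer games at 1920x1200 that pushes people toward 640 MB cards.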
 
With all the rampant pro-ATI vs. pro-nVidia postings in this thread, the point *I* was trying to make is that not everyone can afford to throw out perfectly usable hardware (even AGP hardware) just because a GPU manufacturer (such as nVidia) is trying to force the issue. The question I *have* to ask all those now gaming at 1280x960 (or greater) vs. 1024x768 (given that few games support widescreen gaming at all, and even fewer support gaming at 1680x1050, which has become the typical 22" or 24" desktop resolution of choice) is: is the detail difference *that much higher* at the taller resolution?

The prime reason for *my* reluctance to change over is that it is, literally, a $1,000 decision (and that doesn't even include the cost of Windows Vista). Going from AGP to PCIe requires a new motherboard + new CPU + new RAM + new PSU (to take the new graphics card). Worse, no 8800 GTS 320 is going to cost less than $300 (even e-tail) after factoring in shipping. So I'd be looking at no less than $1,500 (floor cost), and that does *not* include new drives (hard or optical, let alone floppy). That is serious money, and I'm not rich by any means.

However, ATI is giving me the option (via AIB Sapphire Technology) of DX10 AGP hardware. While *some* games may not take advantage of it (such as Crysis or Lost Planet), others (such as CnC3 or even SupCom) very easily could (as with the 8800 GTS, it depends on your application mix, even when the applications are games). Again, that is an option *none* of nVidia's AIBs are giving me at all (have any of them even brought up the possibility of an 8600 GT, let alone GTS, in AGP?). That is a savings of a not-insignificant sum.
 
I understand what you are saying. It is a nice thought that ATi/Sapphire plans to release DX10 AGP cards.

nV isn't exactly forcing the issue, though. They have supported AGP as well as, if not better than, ATI in the past.

Until the X1950 Pro AGP, there were the 7800/7600 GS, and now we have the 7950 GT AGP.

I wouldn't be surprised at all to see an 8-series AGP card come out. If nV sees ATi (or XFX sees Sapphire...) shipping DX10 AGP cards in volume, you can pretty much guarantee they'll whip one up in no time. If they want to be quick and lazy about it, it just takes a bridge chip to make it work.

I'm sure nV and/or a board partner just doesn't want to commit until they see the demand is there. They know damn well the install base exists. But the lack of a killer DX10 Vista game and not knowing how many users will install Vista on their AGP machine (as opposed to just buying a new PC) may be holding them back for the moment.
 
I have two Sapphire Radeon HD 2900 XTs in CrossFire mode running on an ASUS P5W DH Deluxe motherboard with an Intel Q6600 quad-core CPU. It gives me amazing graphics in the games that actually run on Vista 64; however, it is HOT! So far (I have only run this for three days) it has been running between 70-91 degrees Celsius, which pushes my CPU up to 38 C and my motherboard up to 68 C when gaming.
It gives me a lot of difficulty with the games themselves, though. I get blinking, game crashes, etc., but when it runs stable it's incredible. Just running Half-Life 2 (as long as I enable Vsync, otherwise it blinks), the graphics are intense.

Worth the $1,000.00 CDN? Not yet...
I think I have to revert to Windows XP in order to really try the card out. Vista 64 Ultimate seems to be giving me all kinds of trouble, and the motherboard is a chore just to get it to recognize the hardware correctly (it recognizes my Mushkin PC2-6400 RAM as PC2-5300 by default, and still only runs it at half the speed, 400 MHz).

Just info for you... hopefully it's helpful.
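A side note on that memory reading, since DDR2 naming is easy to misread: a PC2-6400 module is DDR2-800, and DDR2-800 genuinely runs a 400 MHz bus clock (data moves on both clock edges, hence the doubled rating), so a tool reporting "400 MHz" can mean the RAM is already at full speed. The real downclock here would be the PC2-5300 (DDR2-667) default. A quick sketch of the arithmetic, offered as an illustration rather than a diagnosis of this particular board:

```python
# DDR2 naming arithmetic: "PC2-xxxx" encodes peak bandwidth in MB/s, and
# "DDR2-xxx" encodes effective transfers/s, i.e. twice the actual bus clock.

def ddr2_specs(bus_clock_mhz: float) -> tuple[float, float]:
    """Return (effective MT/s, peak MB/s) for a given DDR2 bus clock."""
    transfers = bus_clock_mhz * 2   # double data rate
    bandwidth = transfers * 8      # 64-bit (8-byte) wide module
    return transfers, bandwidth

for clock in (400.0, 1000 / 3):
    mts, mbs = ddr2_specs(clock)
    print(f"{clock:.0f} MHz bus -> DDR2-{mts:.0f} -> PC2-{mbs:.0f}")
# 400 MHz bus -> DDR2-800 -> PC2-6400
# 333 MHz bus -> DDR2-667 -> PC2-5333 (marketed as PC2-5300)
```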
 
...the motherboard is a chore just to get it to recognize the hardware correctly (it recognizes my Mushkin PC2-6400 RAM as PC2-5300 by default, and still only runs it at half the speed, 400 MHz).

You mean you couldn't correct that in the BIOS?

Yeah, I'm too afraid of Vista.

Then again, the next time I build a machine, it might be totally different. I mean, I might have an HDCP-compliant monitor, a Blu-ray/HD DVD burner, SATA hard drives (instead of PATA), a quad core (instead of a single core), DDR3 RAM, and Windows Vista.

This stuff might all be here next year. I can't wait for UT3 though.
 
BIOS??? Lol. This motherboard's BIOS has a lot of features; unfortunately, that ain't one of them. I've searched for a way to bump it to its correct 800 MHz for the past five hours... nada.

Thanks for the tip, though; I tried that. Then, to get back into Windows Vista without it crashing, I had to disable Catalyst... and uninstall and reinstall it. The whole system isn't 100% stable, and I suspect it's the motherboard. But for now... it serves. I may just swap the motherboard out and see what happens.

For anyone wondering about quad core: I have run this baby hard testing it and still haven't maxed the CPUs... or even come close... lol.

FYI

Case
- Thermaltake VA9000BWS (Kandalf series)
CPU cooling
- Thermalright Ultra-120 eXtreme aluminum heatsink (LGA 775 / AM2)
Power supply
- Thermaltake Toughpower 1200 W PSU
Motherboard
- ASUS P5W DH Deluxe (ATI CrossFire ready)
CPU
- Intel Core 2 Quad Q6600 2.4 GHz w/ 2x4 MB cache
Operating system
- Microsoft Vista 64 Ultimate
Graphics cards
- Sapphire Radeon HD 2900 XT 512 MB @ 740 MHz (two cards in CrossFire mode)
Hard drives
- Western Digital 150 GB Raptor, 10k RPM, 16 MB cache, SATA NCQ (2 drives, RAID 1)
Memory
- Mushkin 4 GB PC2-6400 dual pack (2x2 GB)
Keyboard
- Logitech G15 keyboard (Logitech drivers installed)
Mouse
- Logitech G5 laser mouse (Logitech drivers installed)
Optical drives
- ASUS 18x DVD-RW LightScribe SATA
- Lite-On 16x DVD-ROM SATA
- Mitsumi 7-in-1 floppy/card reader

It does HL2 fine, but I am sure I am only getting a portion of what this machine is capable of. We'll see what I can fix or find fixes for... a long road ahead.
 
Hey, you say the 2900 XT won't keep up with the 8800 GTX? Well, the boys over at exterem-pc.ca have a solution. They tweaked the card: gave it a gig of RAM at a speed over 2 GHz, with the core GPU at 875 MHz. Now they say it will keep up with the 8800 Ultra in most games. Tom's Hardware should get one of those babies and test it out. They also say it will kick the butt of whatever Nvidia throws at it in CrossFire against SLI, but there's no actual proof.
 


It doesn't.

You just said yourself that a TWEAKED 2900 XT caught up. If you tweaked the 8800 GTX as well, I'm sure you'd find the G80 keeps the lead.
 

Wait for the G100/R700. Then you will see some cards that don't suck at DX10. It will be worth the (long) wait. By then everything else (including drivers) should be mature enough as well.
 


Yeah, but the reason those numbers and results are interesting is that those were the target launch stock speeds of the XTX model: 875 MHz core / 1100 MHz memory (2200 MHz effective).

It's interesting from an academic perspective, but it doesn't change what's actually out there.
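To put those clocks in perspective, a rough bandwidth sketch: GDDR memory is double-data-rate, so a 1100 MHz memory clock moves data at an effective 2200 MT/s. The bus widths below are the published specs (512-bit for R600, 384-bit for G80); treat the comparison as back-of-the-envelope, not a benchmark:

```python
# Peak memory bandwidth = memory clock x 2 (DDR) x bus width in bytes.

def bandwidth_gbs(mem_clock_mhz: float, bus_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s for a DDR memory interface."""
    return mem_clock_mhz * 2 * (bus_bits / 8) / 1000

print(f"2900 XT at 1100 MHz, 512-bit: {bandwidth_gbs(1100, 512):.1f} GB/s")  # 140.8
print(f"8800 GTX at 900 MHz, 384-bit: {bandwidth_gbs(900, 384):.1f} GB/s")   # 86.4
```

Raw bandwidth was arguably never the R600's weak point, which may be why clock tweaks help only in some games.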
 

I expect that on a tessellation benchmark the 2900 XT will crush the 8800 GTX, no matter how TWEAKED the 8800 GTX is. One of those should come soon.