R600: Finally DX10 Hardware from ATI

Skyguy is on the right track. I want a card that's silent and gives me Crysis with all features turned on at 1080p/60fps. I want it connected to my 52-inch 1080p LCD over HDMI with perfect pixel-to-pixel representation, any overscan/underscan issues solved, and the full resolution. Does it support the higher bit depths of HDMI 1.3? Will it do 1080/24p for movies encoded at 24p? This setup may be the only way to watch 1080p with perfect pixel-to-pixel representation (from Blu-ray/HD-DVD particularly), and since a Windows computer is an awesome scaler, it is also the best setup for watching SD content, whether TV or DVD.
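A quick aside on why 1080/24p output matters: on a fixed 60Hz display, 24fps film has to go through 3:2 pulldown, so frames alternate between three and two refreshes and motion judders. A minimal illustrative sketch (Python):

```python
# 3:2 pulldown: mapping 24fps film frames onto a 60Hz display.
# Frames alternate between 3 and 2 refreshes, which causes judder.
from itertools import cycle

cadence = cycle([3, 2])
for frame in range(4):
    print(f"film frame {frame}: held for {next(cadence)} refreshes at 60Hz")
```

A display chain that accepts 1080/24p (or an integer multiple like 72Hz) shows every frame for the same duration instead, which is what the question above is really after.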
 
Yeah... I agree. But there are always going to be drama queens that are like "ATI IS DEAD! ARRRRRRGGGGGHHHHHHHHH!".

Anyway, ATI appears to have some setbacks at the moment, but even if the 2900 doesn't get any better, it's not the end of the world (there have been plenty of times ATI has kicked nVidia's arse up and down, and they are both still around).

Yeah, I also have to agree. There are also always going to be drama queens that are like "NVIDIA IS DEAD! ARRRRRRRGGGGGHHHHHHHH!".

R600 came out. It SUCKS. DEAL WITH IT!!!

NV 1 - 0 ATi(AMD)

Nuff said :!:
 
Skyguy is on the right track. I want a card that's silent and gives me Crysis with all features turned on at 1080p/60fps. I want it connected to my 52-inch 1080p LCD over HDMI with perfect pixel-to-pixel representation, any overscan/underscan issues solved, and the full resolution. Does it support the higher bit depths of HDMI 1.3? Will it do 1080/24p for movies encoded at 24p? This setup may be the only way to watch 1080p with perfect pixel-to-pixel representation (from Blu-ray/HD-DVD particularly), and since a Windows computer is an awesome scaler, it is also the best setup for watching SD content, whether TV or DVD.
For a silent setup, wait for the Sapphire watercooled solution; if it's really effective, this card seems to OC well. So let's see: a quiet card, heat issue solved, OCs well, with only power left to deal with. It'll handle HD; whether it displays it on your HDTV expertly, I haven't heard or read. I'm sure Cleeve will eventually have an article out on it.
 
My opinion is that it is too early to say anything. The 2900 drivers have not matured yet. Until they do, it is not fair to compare it with the 8800 GTS 640MB.
 
I posted this in another thread, but I think it's interesting. Maybe, maybe not, but here it is:
Here are some things for everyone to chew on. I went to Anand's for their 2900 review, and in the process I also went back to their original 8800 review. Three games were benched on the same platform then and now: Oblivion, BF2, and Prey, all at 2560x1600 with 4xAA. The results in fps:

Game      | GTX then / now | GTS then / now | 2900 now
Oblivion  | 24 / 31.9      | 17.8 / 22.2    | 19.2
BF2       | 59.5 / 98      | 59.5 / 63.8    | 95.8
Prey      | 56.3 / 61.7    | 35.1 / 43.5    | 49.7

What this shows me is that in seven months' time the 8800 series has seen a 10% to over 20% increase from driver improvements. I believe, all things being equal, that the 2900 will see these increases as well. This may help answer the OP's question, as well as determine someone's next purchase. It could end up that the 2900 sits just a little under the GTX after all is said and done.
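As a sanity check on those figures, here is a minimal sketch (Python, using the fps numbers exactly as quoted above) that computes each card's gain between the two reviews. One caveat: the BF2 launch scores look CPU-limited (GTX and GTS tie at 59.5), which inflates the GTX's apparent BF2 gain.

```python
# Percentage fps gains for the 8800 series between the launch-era
# review and the 2900-era review, using the numbers quoted above.
launch = {
    "GTX": {"Oblivion": 24.0, "BF2": 59.5, "Prey": 56.3},
    "GTS": {"Oblivion": 17.8, "BF2": 59.5, "Prey": 35.1},
}
current = {
    "GTX": {"Oblivion": 31.9, "BF2": 98.0, "Prey": 61.7},
    "GTS": {"Oblivion": 22.2, "BF2": 63.8, "Prey": 43.5},
}

for card, games in launch.items():
    for game, old_fps in games.items():
        new_fps = current[card][game]
        gain = (new_fps / old_fps - 1.0) * 100.0
        print(f"{card} {game}: {old_fps} -> {new_fps} fps ({gain:+.1f}%)")
```

That works out to roughly +7% to +33% per game (and +65% on the suspect BF2 run), which is the order of driver-maturity improvement the post is betting the 2900 could see as well.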
_________________
In the end, who knows, huh?
 
I agree: the Sapphire HD 2900 XT TOXIC, with its overclocking and watercooling, looks like a very cool solution that should let ATI put up a decent fight against the GTX and Ultra cards.

Just look at the beauty:
[image: 20070514155127_bigwater.jpg]

And it can even cool your CPU :)
 
It's nice if you have a bay to put it in. And the power draw only goes up. For $400, getting a quiet and cool solution may work for a lot of people. I just hate the leaky GPU, the true flaw of the 2900. My 1900 draws a lot too, and runs hot as well. If I get a 2900, then I'll go watercooled.
 
Hmm, interesting to think about too: for many people, upgrading their video card to the R600 may mean not only buying the card but a new PSU as well, so the $399 may not be the bottom line; perhaps the 8800 GTS runs on their PSU while the 2900 does not. Still, the 8800 GTS doesn't exactly draw a small amount of power either.
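To put rough numbers on that, here is a minimal PSU-headroom sketch. The wattages are assumptions based on commonly cited figures of the day (~215W board power for the HD 2900 XT, ~145W for the 8800 GTS 640), plus a hypothetical 150W for the rest of the system on a hypothetical 450W PSU:

```python
# Hypothetical PSU headroom check; all wattages are rough assumptions,
# not measurements from the article.
PSU_WATTS = 450          # a typical mid-range PSU of the era (assumed)
REST_OF_SYSTEM = 150     # CPU, board, drives, fans (assumed)

cards = {"8800 GTS 640": 145, "HD 2900 XT": 215}  # approx. board power

for card, watts in cards.items():
    total = watts + REST_OF_SYSTEM
    ok = total <= 0.8 * PSU_WATTS  # keep sustained load under ~80%
    print(f"{card}: ~{total}W system load -> "
          f"{'fits' if ok else 'marginal or over'} on a {PSU_WATTS}W PSU")
```

Under those assumptions the GTS squeaks by while the 2900 pushes the same PSU past a comfortable sustained load, which is exactly the hidden cost being described.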
 
So what they're trying to say is that, while the R600 NOW is not performing as well as the G80, in the future, when games use more complex shaders, the R600's performance will stay fairly constant while the G80's will dip?
 
I guess that's pretty cool. Seems like they'd advertise that aspect a little more, though, if that is indeed the case. Kind of odd to release a product whose selling point is "it might not be the best now, but give it a year or two and it'll beat what IS the best card now".
 
We are in a transition from DX9 to DX10, and no one really knows for sure which direction it will take. This is why buying a DX10 card now, for DX10's sake, isn't the best thing to do at this point in time. I'm still going to wait a little longer. Not too long, as prices will drop, and I'm sure a DX10 game will come out that I'll have to get. Besides, I want a cooler solution for the 2900. I don't care about the power, other than that it does suck it up pretty badly. The 2950 may just be the way to go.
 
As a diehard AMD/ATI fan since the K5, it really pains me to say this, but I will most likely be going with Intel/Nvidia in Q3-Q4. I do a lot of video and photo processing, and my X2 4400+ is starting to show its age. I've upgraded from a 6MP camera to a 10MP one, and batch processing, Photoshop CS3, and Adobe Premiere are really socking it to me time-wise. By going Intel I'll be able to complete my work in almost half the time! Price-wise, when AMD releases their new quads, Intel will drop their prices, and then it's upgrade time for me, just in time for Christmas.

However it is spun, ATI really dropped the ball with this card. Even if you just compare it with Nvidia's lesser offerings (price vs. price), it still spells failure, because they could not offer a higher-end card. Well, it looks like it's going to be a losing year for ATI/AMD.
 
I'm not at all clear on what exactly "OS Idle 2" is versus plain "OS Idle". Does anyone here know?

In the chart, "OS Idle" depicts the HD2900XT as demanding a not-so-green 87W, but "OS Idle 2" depicts the HD2900XT as demanding a ridiculous 133W for doing apparently nothing at all.

It's a sad day when we buy graphics cards that draw more power doing nothing than a quad-core CPU running at 100%.
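For perspective, a quick back-of-the-envelope on what that 133W idle figure costs for an always-on box; the electricity rate is my assumption, not a figure from the article:

```python
# Back-of-the-envelope: yearly cost of 133W of idle draw, assuming
# an always-on machine and a hypothetical $0.10/kWh rate.
IDLE_WATTS = 133        # "OS Idle 2" figure for the HD2900XT
RATE_PER_KWH = 0.10     # hypothetical electricity rate

kwh_per_year = IDLE_WATTS / 1000 * 24 * 365
print(f"~{kwh_per_year:.0f} kWh/year idling, "
      f"~${kwh_per_year * RATE_PER_KWH:.0f}/year")
```

That's over 1100 kWh a year for doing apparently nothing, which is why the chart raises eyebrows.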

WTF are AMD and NV thinking?

-Brad
 
As a diehard AMD/ATI fan since the K5, it really pains me to say this, but I will most likely be going with Intel/Nvidia in Q3-Q4. I do a lot of video and photo processing, and my X2 4400+ is starting to show its age. I've upgraded from a 6MP camera to a 10MP one, and batch processing, Photoshop CS3, and Adobe Premiere are really socking it to me time-wise. By going Intel I'll be able to complete my work in almost half the time! Price-wise, when AMD releases their new quads, Intel will drop their prices, and then it's upgrade time for me, just in time for Christmas.

However it is spun, ATI really dropped the ball with this card. Even if you just compare it with Nvidia's lesser offerings (price vs. price), it still spells failure, because they could not offer a higher-end card. Well, it looks like it's going to be a losing year for ATI/AMD.
 
So what they're trying to say is that, while the R600 NOW is not performing as well as the G80, in the future, when games use more complex shaders, the R600's performance will stay fairly constant while the G80's will dip?

What they are saying is that ATi was gambling when they designed the R600. nV designed a product they KNEW would perform very well now; ATi rolled the dice on an approach they thought the industry was heading toward.
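For a back-of-the-envelope look at that gamble, here is a minimal sketch using the commonly quoted shader specs (320 stream ALUs at 742 MHz for the R600, 128 scalar ALUs at 1.35 GHz for the G80 GTX), counting one MADD (2 flops) per ALU per clock and ignoring the G80's co-issued MUL:

```python
# Rough theoretical shader-ALU throughput in GFLOPS, assuming one
# multiply-add (2 flops) per ALU per clock. Ignores texture/ROP
# throughput, where the G80 is stronger in today's games, and assumes
# the R600's wide VLIW units can be kept full by complex shaders.
cards = {
    "R600 (HD 2900 XT)": (320, 0.742),  # (ALUs, clock in GHz)
    "G80 (8800 GTX)":    (128, 1.350),
}

for name, (alus, ghz) in cards.items():
    gflops = alus * 2 * ghz
    print(f"{name}: ~{gflops:.0f} GFLOPS peak MADD throughput")
```

On raw ALU math the R600 comes out ahead; the catch is that the wide units only pay off if shaders are complex enough to keep them busy, which is exactly the bet described above.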

Maybe they hoped there would be a complex shader DX10 demo out by now to show it off.

All I know is I won't cry "suck" (like [H] did) on ATi's new card until I see Lost Planet, Crysis, Alan Wake, etc. running on it.
 
My opinion is that it is too early to say anything. The 2900 drivers have not matured yet. Until they do, it is not fair to compare it with the 8800 GTS 640MB.

Here is another reason why it is too early to tell. Everyone should read this:


http://forums.vr-zone.com/showthread.php?t=147883

If that is the case, then Nvidia has built a card for today's games and is now working on a card for tomorrow's. ATI has built a card for tomorrow's games that is poorly performing (heat, noise, and power) in comparison, and by the time tomorrow's games are released, there could be a better Nvidia replacement, or even an ATI replacement.

Either way, it's looking like the R600 has no real place of its own in the market at present, certainly not at the current price anyway.
 
This is speculation, but the evidence bears it out. The XT was to compete with the GTS, as the XTX was to compete with the GTX. As we've read, the XT OCs very, very well. But you have two things against it: it's hot, very hot, and it draws a lot of power. I currently own a 1900 XT. If I want to, I can OC to XTX levels and higher, though I don't. The XT was the smarter buy, IMO. Now look at the 2900 XT. IF the 2900 didn't leak juice like a California orange, and IF the XT didn't use as much juice as battery power at a mega nympho party (all girls), then the XTX would be here now kicking the GTX. Just look at the 12-inch overheating monster that was supposed to be the XTX. They had to cool it somehow. And designing it with 1GB of memory was, I think, a huge mistake. Now, as I said, IF the transition to 65nm goes well, the 2950 will rule the roost.
 
My opinion is that it is too early to say anything. The 2900 drivers have not matured yet. Until they do, it is not fair to compare it with the 8800 GTS 640MB.

Here is another reason why it is too early to tell. Everyone should read this:


http://forums.vr-zone.com/showthread.php?t=147883

If that is the case, then Nvidia has built a card for today's games and is now working on a card for tomorrow's. ATI has built a card for tomorrow's games that is poorly performing (heat, noise, and power) in comparison, and by the time tomorrow's games are released, there could be a better Nvidia replacement, or even an ATI replacement.

Either way, it's looking like the R600 has no real place of its own in the market at present, certainly not at the current price anyway.

Bad timing and terrible execution aside, it has a real place of its own in today's market.

People will walk into Staples/Future Shop/London Drugs/Best Buy and see the big bad new ATi and buy it regardless of all this discussion LOL.

Hell, I still have customers coming into my store refusing to buy Intel because AMD64 is "the king of gaming" :roll:
 
As far as the "AMD is king" thing, just let 'em believe it, heheh. It's not that I have anything against Intel, nor do I think AMD is currently better, but I do like competition, and AMD unfortunately needs those sales, just like when NetBurst was selling better because it "went faster", heheh.
 
Either way, it's looking like the R600 has no real place of its own in the market at present, certainly not at the current price anyway.

I don't think the 2900 XT is all that terrible, but it'd have to get at least a few bucks below the price of an 8800 GTS 640 before I'd recommend it.

If ATI/AMD is smart, they'll undercut the 8800 GTS 640 by a notable margin and release an HD 2900 XL that likewise undercuts the 8800 GTS 320.

Only way the 2900 is going to get any street cred is with a lower price...
 