R600: Finally DX10 Hardware from ATI

It will only be a matter of time until we have some fully functioning DX10 titles. I have a CoJ demo, but it does not always work. That being said, I was told of another title we should be getting shortly.

It will be interesting to see how it will do on G80 and R600.
 
you know, i see too much verbiage here.
the more words people use, the less i trust what they say
this is about the R600 release
which, by any stretch of the imagination, is disappointing
 
This is perfect. Based on this past post... you guys now want shorter articles again? LOL

I like the articles that give you more information. You should be able to keep your own Fud-O-Matic 3000 Spin Detector Deluxe so you can sift through the crap and separate truth from lies in the statements you read. The hard thing about dealing with companies these days is the amount of stuff you have to know in order to figure out what the heck you are looking at. I give credit to pure enthusiast sites that say "How fast is it?" because you can plug it in and bench till your eyes fall out. The good thing there is that you know how fast your performance was in current scenarios.

There are tech sites that have good articles on the technology and sometimes give what seem to be wishy-washy conclusions. Some feel my conclusion was just that. So be it. I said I liked the hardware. There is some cool technology in there that seems to be ready for DX10.1. Look at the Tech Report article; I think Scott did a good job on the content and was able to run down some rabbit trails I postponed for another article due to time. Kudos to Scott. I think his conclusion is close to mine... hot, late, slow, needs better drivers, and still has cool untapped technology.

The fact of this hardware debate is that it is complex. We are not only dealing with hardware designs for graphics; we also have two operating systems (one with so-so drivers and bugs GALORE), and we have two APIs, one of which seems to be getting redefined as we get revisions (DX10 vs. DX10.1). Shader Model 5.0 could be as close as 2008. Nvidia has stated that the geometry shader (GS) was not designed for large-scale tessellation. (http://developer.download.nvidia.com/presentations/2006/develop/next-gen-dx10-games-develop06.pdf - see page 16)

It is far too soon to pass final judgment on R600. It is great for controversy but not quite ready for a verdict.




BTW: If you need a Fud-O-Matic 3000 Spin Detector Deluxe I know a guy who has a bunch of them for sale… I hear he has the Brooklyn bridge for sale too.
 
Overadvertised, overpriced, overdelayed = really disappointing underperformer.

Drivers will improve for sure, but for now we've got what we've got.
And prices... the 2900 XT is drowning on the high sea. It's $50 more expensive than the 8800 GTS 640MB 8O , and I'm not talking about the 320MB, which kicks ass in perf/dollar.
When comparing an 8600 GTS and an 8800 320 that is $50 more expensive, you'll take the 8800 8) , but in the case of the 2900 XT, where's the point of the extra $50???? :?
Video stuff is cool, but in my case I don't use it.

ATI has to shift gears faster than NV
 
Right now at Newegg the 2900 XT is selling for $429. Given that this card is brand new, it is very fair and competitive with the $399 8800 GTS.

Overall I think this was a good release for AMD/ATI. In my eyes it didn't disappoint. Rather, it shows promise for a good, competitive future.

ummm... actually, the 320mb's are going for $310 or so.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814122022
$269.99 after MIR.

Holy Tamales!!! Thanks for straightening that out.
 
I'm disappointed. I think this new ATI vid card should be $50-75 less, and they should scramble to get something better out the door in the next 6-9 months.


A 740 MHz clock speed and crap performance, wow. Just wow.
 
Well I am gonna skip this generation of video cards from both ATI and Nvidia. The current DX9 video cards perform well and are cheap. There are no DX10 games out there. In a year the DX10 games should be out in full force and Vista should/will hopefully have all of the bugs worked out of it.

By then (in a year) the 8800 and R600 will be old technology and the next generation cards will be coming out. I can't wait!!!!! 8O


Just my opinion...... :wink:
 
That is what I'm planning to do also; no point getting anything this gen. My 7950 does well in WinXP games, and I doubt anything worth anything for Vista gaming will be out this year. I doubt Crysis will be out this year...
 
No, that's stupid.

Haha you have a real way with people, Track. :lol:

Yeh, and they always flame me for it. 😀

If they only knew what I know...

And to add to that claim, I will admit I only looked at one benchmark to come to the conclusion that the alphas don't suck... but hey, I may still be right.
 
I have a 7950 GT as well and I am very happy with it. I am not a huge gamer; I play BF2, C&C, and City of Heroes/Villains. At 1280x1024 it's more than enough for me.
 
I don't normally post these things, but ATI cards are known for their ability to take load off the CPU. This bench was with a Core 2 Duo at ~3 GHz. What about a Pentium D 820 with the same bench? Don't get me wrong, I am mostly an Nvidia fan just cuz (no real reason). But I want to know if the ATI card (especially with its built-in math processor) can outperform a GTX with a cheap processor (obviously both on the cheap proc). I mean, I know people that are using 8800 GTSs with a slightly overclocked Pentium D 805 and 2GB of RAM. I have a Pentium D 920 OC'ed at 3.73 GHz and 2GB of memory. I am contemplating just buying one of each and then reselling them on eBay to make as much money back as possible, just to see how they perform. Maybe overclock only to 3.2 GHz and see what happens.

The fact is, if a person buys an AMD X2 2 GHz Dell PC and goes to the store with a bonus check, the guy behind the counter is not only going to be biased but maybe even just plain wrong: "Um, the 8800 GTX performs better most of the time and especially at higher resolutions."

And another thing: without rebates the 8800 GTS 320MB is $120 cheaper than the 2900 XT. That is roughly 30% less, and most of the time, at almost all resolutions except for the ridiculous ones, it was less than 25% weaker and in some cases just better. So on value the 8800 GTS wins. Again, a 2900 XT on a basic platform would be nice just to see where it goes. I almost want to buy a $300 Dell from Dell Outlet and run the tests myself. But I don't know. Did get this new job. N E Wayz, that was my two cents.
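To put rough numbers on that value argument, here is a quick sketch using the figures from this thread; the ~$429 price, the "$120 cheaper" gap, and the "less than 25% weaker" estimate are all the posters' claims, not new measurements:

```python
# Perf-per-dollar sketch using the thread's own numbers (prices and the
# "less than 25% weaker" figure are the posters' claims, not measured).
xt_price = 429.0                 # 2900 XT at Newegg, per this thread
gts_price = xt_price - 120.0     # 8800 GTS 320MB, "$120 cheaper" w/o rebates
xt_perf = 1.00                   # normalize the 2900 XT to 1.0
gts_perf = 0.75                  # "less than 25% weaker" at sane resolutions

print(f"2900 XT : {xt_perf / xt_price * 1000:.2f} perf units per $1000")
print(f"GTS 320 : {gts_perf / gts_price * 1000:.2f} perf units per $1000")
# ~2.33 vs ~2.43: the GTS 320 edges ahead on value, as the poster argues.
```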
 
It is far too soon to pass final judgment on R600. It is great for controversy but not quite ready for a verdict.

Well said, everyone should "listen" up.


Now my .02 for those who don't understand - or don't want to know - the economics and reasoning behind prices. And also for those fan b0is who can't look past the praise and smiles for what they bought to see the light in something else. Some people either have too much pride in what they bought, or are too stubborn to see things unbiased.

What everyone doesn't seem to understand is that in order to make money, a price point at which to sell a product has to be set at launch. Some are willing to pay that premium for a product regardless of the slight difference in performance versus a cheaper counterpart. Why? Because it's a new product with new features.

Will the price stay at the same point? NO! It will gradually decrease as demand from those who initially were willing to pay the premium goes down. Then, after this occurs, the pricing battles will begin, and there you go... the GTS (the competition) is suddenly a closer price/performance comparison. Why would a company that knows it will sell many cards lower its price just because the competition came out a few months earlier, with now-reduced prices, if the performance isn't terribly worse?

Would one expect a new-model BMW to come out at a lower price point because the competition has been on the market for a year (and is now on clearance at dealers to make room for new cars, or priced to compete with the competition's new offerings) with the same performance? NO!

Maybe cars aren't good for analogies, but the picture should still be understood. Everyone needs to stop complaining about the price being too high when in fact it's at a good spot to begin with. If one would rather have that 8800 GTS 320MB for $260, so be it; some people would rather pay double for an 8800 GTX, which is only faster, not wipe-the-floor-clean faster.

After both gradually fall close to each other's price points, which will then look like the better buy? Both have about the same performance (at least from initial launch reviews for the moment), both are DX10 capable, but the ATI card has theoretically better technology introduced with it.

I, for one, will wait. I will wait for both updated drivers, as well as the next offerings from both nVidia and DAAMIT. I like to keep an open mind, as should every true enthusiast.


PS: Great review from the Tom's staff. Great product from DAAMIT. Great alternatives from nVidia, which has already carved its performance in stone.

Disclaimer: please don't take this post in any way, shape, or form as an intentional inflammatory statement directed towards any individual on this forum.
 
HD2900XT vs 8800 GTX in DX10

[Call of Juarez DX10 benchmark chart: cojbenchmark2hj5.jpg]


With the resolution cranked up to 1600x1200, the tables were turned, and the ATI Radeon HD 2900 XT was able to take the lead over the more expensive GeForce 8800 GTX by less than two frames per second. Keep in mind our previous comments back in the image quality section about the use of a filter, though!

It's not over yet, folks. The war has only just begun. :twisted:
 
I am working on coming up with some ways to test the shader horsepower. NV's G80 seems to be PS-lopsided while it also has some serious texture horsepower. Based on some simple calcs, NV should not be able to fulfill uncompressed texture fill needs.

Tex fetch rate (Gtex/s):

8800 GTX: 10.80
8800 GTS: 8.00
2900 XT: 11.87

One person at NV said I was correct while another said I was wrong... the person that said I was wrong has yet to tell me why... still waiting.
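For anyone who wants to poke at those numbers, here is a back-of-the-envelope sketch of one way the calc could work: take the lesser of what the texture units can request per clock and what memory bandwidth can feed for uncompressed FP16 texels. The unit counts, clocks, and bandwidth figures come from the public specs for each card, and the min() model is my reading of the calc, not something the author confirmed:

```python
# Rough uncompressed-texture-fetch estimate (my reading of the calc, not
# confirmed by the author): the achievable rate is the lesser of the
# texture units' request rate and memory bandwidth / texel size.

CARDS = {
    # name: (texture address units, core clock in GHz, bandwidth in GB/s)
    # -- figures taken from the public specs for each card
    "8800 GTX": (32, 0.575, 86.4),
    "8800 GTS": (24, 0.500, 64.0),
    "2900 XT":  (16, 0.742, 105.6),
}

BYTES_PER_TEXEL = 8  # uncompressed FP16 RGBA: 4 channels x 2 bytes each

for name, (units, clock_ghz, bw) in CARDS.items():
    unit_rate = units * clock_ghz       # Gtex/s the units can request
    mem_rate = bw / BYTES_PER_TEXEL     # Gtex/s memory can actually feed
    limit = "bandwidth" if mem_rate < unit_rate else "texture-unit"
    print(f"{name}: {min(unit_rate, mem_rate):.2f} Gtex/s ({limit} limited)")
```

Under those assumptions the G80 cards come out bandwidth-limited (10.80 and 8.00 Gtex/s) while the 2900 XT is texture-unit-limited (11.87 Gtex/s), which matches the table above.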

Well, like you mention, it depends on the implementation as to where the choke points arise. B3D did a good job of looking at the various texture throughput rates dependent on int, FP, and compression:

http://www.beyond3d.com/content/reviews/16/13

Unlike most tests, they're just writing from cache, so they aren't as choked by actual memory bandwidth as would be the case in your example and in most multitexturing tests, like the Tech Report's for basic fill rate:

http://www.techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=4

The next page includes scaling from bilinear to 16X AF, and the drop is pretty dramatic. I haven't looked at the design enough to talk intelligently about that; it's pretty amazing considering how small the drop for the X1900 series is.

Granted, this calc does not take compression into account, but with HDR that is not a huge factor.

Well, and that's what's kind of puzzling; just look at the HDR results from ShaderMark in that Tech Report review:
http://www.techreport.com/reviews/2007q2/radeon-hd-2900xt/index.x?pg=3

The R600 kills, and the X1900 does well too, with or without filtering.

There is a lot more to test and quite frankly,

Oh I'm sure, I'm just trying to catch up on my reading, for you it's going to be deciding the tests, and then figuring out what the results mean.


Nvidia does not like sharing how it does things. They believe there is no point in giving any detailed information out. They feel it is just like handing ATI and Intel their technology.

Yep, it's like trying to peel an onion. nVidia never fully explained their FP16 HDR + MSAA limitation, and I don't think they have even to this day. Some of us slowly picked it apart through here and the ElderScrolls forums.

Yeah, I can imagine the fun of trying to get feedback. Most companies would rather feed you well-selected information than have you asking questions they might be uncomfortable answering.
 
Nvidia has stated that the geometry shader (GS) was not designed for large-scale tessellation. (http://developer.download.nvidia.com/presentations/2006/develop/next-gen-dx10-games-develop06.pdf - see page 16)

I find points 6 & 7 interesting, involving multi-pass to the stream output instead of a single pass that relies on the higher speed it enjoys instead of greater complexity, and point 7 specifically stating to favour the vertex shader. It makes me want to revisit the VS/GS relationship in the two architectures.

This could offer some insight into previous statements on the G80's GS; some interesting workarounds in there with the tetrahedra. Gonna have to unhook the linear headspace for that.

GREAT, now more interesting reading in addition to all the benchies! 😳
 
Well, I think technology-wise it was meant to compete with it, so it's a fair comparison from that perspective (surely AMD would prefer to have an ultra-high-end offering to maximize their return on investment in R&D); the engineers were aiming to get something to market that beat the G80.
But for whatever reasons (80nm leakage, GDDR4 issues, or just plain miscalculation of the combatants and the playing field) it fell a little short, so it was retargeted.

From a price/performance perspective it's obvious that the GTS is now the marketing target, and they hope to push enough through to recoup some of their costs and at least have a revenue stream.

Right now it's looking like it could turn out to be a good value for the money shortly, but it's still got a ways to go to compete at its current MSRP versus the discounted GTSs. Those may be selling for more at someplace like Best Buy or CompUSA, but on Newegg the GTS 320 and 640 are still very attractive.

However things have only begun to get interesting.
 
It is far too soon to pass final judgment on R600. It is great for controversy but not quite ready for a verdict.

The card is out now; it has to be judged whether it's ready or not.

Also, I would love to see some minimum fps in the benchies. If the 2900 gets consistently higher minimum frame rates than the 8800 GTS 640 it's competing against, then that's a whole lot more useful than an extra 5 fps when you're already pushing 150+.
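To illustrate why the minimum matters, here is a toy sketch; all the frame times in it are invented for the example, not taken from any benchmark:

```python
# Toy example: two cards with similar averages can feel very different.
# All frame times here are invented for illustration.

def fps_stats(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

steady = [8.0] * 100               # ~125 fps, no dips
spiky = [7.0] * 99 + [50.0]        # higher average, but one 20 fps dip

for name, times in (("steady card", steady), ("spiky card", spiky)):
    avg, low = fps_stats(times)
    print(f"{name}: avg {avg:.0f} fps, min {low:.0f} fps")
# The spiky card "wins" on average fps yet stutters; the minimum shows it.
```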

Judging by the technology on this card, it seems most suited to people who aren't going to upgrade until the new decade. Maybe its performance will decline at a slower rate. Time will tell, though.
 
132, 158, 192 fps....meh, it's all above a critical minimum so fine. A few frames here, a few frames there.....whatever. More importantly, NV and ATI can compete now. Good for us consumers!

I'm more curious about the heat on this sucker. I shudder when I think of my old X1900 XT and the heat that baby put out. OUCH. So, how hot does this thing get? Honestly, I couldn't care less about a few more frames if this thing will heat my house and make noise doing it..... ATI can keep it. I already have a furnace, I don't need one in my case too.

Benchies are fine, and true enthusiasts may push for ultimate performance, but what about all of us who care about the other factors: heat, noise, power consumption, control panel interface, ease of use, driver stability, overclockability, and so on and so on?? I'm sure there are MANY people that consider more than a few % points of difference in framerate when dropping hundreds and hundreds of dollars on a purchase.

Any comment from the author on all the "other" things besides framerate....even the practical stuff like will it easily fit into a regular ATX case with cables coming out the back of the hard drive cage?

Surely there must be more information than benchies provided??? The author went to such great lengths to practically beg people to take things "in context", and so I would ask the same in return.....what about all those other factors that should be taken "in context" when deciding which card is the best purchase for a consumer???
 
