AMD Pushes R600 Back To May

Page 5 of a Tom's Hardware community discussion.
I agree with most everything you wrote. I would only add that I think AMD are missing out on plain high-end sales, which they would love to have; let's not pretend AMD wouldn't want to be selling something competitive now. But I agree completely: depending on their reply, if something needed to be done to make the return to the high end more successful, who knows whether it was worth it until it's done.

If (and let me just go off the map here, people, for the 'IF' factor) AMD were able to do something dramatic and worthwhile, like moving 65nm production ahead, would this delay be so dramatic? If they were able to somehow significantly reduce costs, would it matter as much? As long as it were competitive, it'd be a success.

At this point we have no info, which sucks, and we have no card, which sucks, but we won't know its impact on either company until we know what is being launched.

Personally I'm not optimistic that it's something great like that, probably some PR crap at best, but we'll see; I'd hope to be proven wrong.
 
About the price, a few things that you need to take into account:
They can only produce the X1k series so cheap, and still make a profit.
While the G80 is a new architecture, and doesn't cost that much more to produce.

But yeah, I don't see how ATi will recover from this. And I'm an ATi fanboy.... so, yeah. :cry:
 
...IMO, buy the GF8800GTS-320 now, far less of a kick factor either way, not a great loss if the R600 is better, and in the meantime, you gets lotsa gaming...

Well, isn't the 8600 coming out in March, or am I mistaken? If I am correct, I'll likely grab the 8600 in March.

Not according to the rumours. It got pushed back as well; about 2 weeks ago this was the buzz on the GF8600, and it's an April-May timeframe now too. But even if it launched tomorrow, IMO the GF8800GTS-320 crowd still would have no regrets, because it's significantly better, and if anyone were looking for GF8600-level performance for $200, they would've already bought the X1900XT for $180 or the X1900/1950GT for $130-140. The GF8600 is a product for people forced to upgrade now because of a new build or something that brings them to that point; anyone looking towards the future to replace the past should be looking at the GTS-320 IMO. The GF8600 will have its benefits once DX10 hits, but until then it's not as critical; it'll still sell like hotcakes, though, because people will see DX10 for $150-200. Only once the GF8600 has been out for a few weeks/months and prices get attractive will it be the compelling upgrade IMO, and even then the GTS crowd won't be regretting the added power.

Either way, the kick factor for the GTS-320 is low IMO; regardless of anything launched, it'll be solid and a well-spent $270. But if you're worried about things like that, the kick factor gets bigger the higher up the spectrum you move.

Just my view on it.

Oh, PS, this has little to do with other comparisons. The market reality is that if nV gets the GF8600 to market significantly faster or significantly better than the X2600, then it's a huge headache for ATi, because that will start majorly affecting the market. It's not about the wise upgrade move at that point; it's about opening up another undefended segment of the market where ATi was leading, and they would almost assuredly get clobbered until they had a reply. Just look at the X1600 series: 2-3 month lead, sold well to those not aware of its weaknesses, then the GF7600 comes out and bang, a beauty for the upgraders (better performance in modern games with things like HDR enabled than the GF6800 series) and death to the enthusiast X1600 market. Another launch like that without a nice lead-in and ATi will have difficulty replying without something stellar in comparison, the way the GF7600 was compared to the pre-existing X1600.
 
They can only produce the X1k series so cheap, and still make a profit.

Yeah, except it's a far more mature process, and it already has many products at a lower half-node than the G80.

While the G80 is a new architecture, and doesn't cost that much more to produce.

Baloney. It's almost twice the size, there is no way the yields per wafer are anywhere near the same, and they're still on 90nm too. Not close at all.
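To see why "almost twice the size" hits cost so hard, here's a rough back-of-the-envelope sketch. The die areas and defect density below are illustrative assumptions, not actual RV570/G80 figures; the dies-per-wafer formula is a common approximation and the yield term is a simple Poisson defect model.

```python
import math

def dies_per_wafer(wafer_diam_mm, die_area_mm2):
    # Common approximation: gross dies = wafer area / die area,
    # minus an edge-loss term proportional to the wafer circumference.
    radius = wafer_diam_mm / 2
    side = math.sqrt(die_area_mm2)  # treat the die as roughly square
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diam_mm / (math.sqrt(2) * side)
    return int(gross - edge_loss)

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Probability a die lands with zero random defects.
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Illustrative numbers only: a "small" die vs one almost twice its size,
# with an assumed defect density on a 300 mm wafer.
d0 = 0.002  # defects per mm^2 (assumption)
for area in (230.0, 480.0):
    good = dies_per_wafer(300, area) * poisson_yield(area, d0)
    print(f"{area:.0f} mm^2 die: ~{good:.0f} good dies per 300 mm wafer")
```

The point is that doubling die area cuts candidate dies per wafer roughly in half *and* shrinks the fraction that survive defects, so good dies per wafer fall by well over 2x; that's the cost gap the post is getting at.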
 
If you were waiting for Direct3D 10 compliant hardware from AMD (formerly ATI) you will have to wait even longer. AMD pushed the launch of R600 yet once more.

Who cares, it isn't worth jumping onto the dx10 bandwagon until next year at the earliest anyway.
 
Just a Mac Pro ad 😉 (actually I stole it from Motu's web site). Seriously, I have most of the common OSes loaded up; OS X just happens to be the one I use the most, hence it gets the air time right now.

If MS Vienna comes out good, I'll give it some air time. But Vista, no way; it's just a dog of an OS from both a user's and a developer's standpoint. If it weren't for DX10, I wouldn't consider Vista even viable.

I thought we had specs on the R600 -- I guess that has changed significantly? It seemed to have a pretty significant jump on nVidia's G80 and certainly was much more flexible for developers.

My guess (and that's all it is) is AMD have a new/modified CPU, new chipset, and slightly modified GPU that will work significantly faster as a package. They control all three components, so it would make sense to see a combined package working in harmony. A 7-month delay is enough of a delay to "re-think" a design. Or it could be they just ran into some serious production issues that forced them back into design mode, but based on their press release that doesn't sound like the issue.
 
They can only produce the X1k series so cheap, and still make a profit.

Yeah, except it's a far more mature process, and it already has many products at a lower half-node than the G80.

While the G80 is a new architecture, and doesn't cost that much more to produce.

Baloney. It's almost twice the size, there is no way the yields per wafer are anywhere near the same, and they're still on 90nm too. Not close at all.

I'll be impressed if the X2900 is 65nm and doesn't require the 6 pin power plug.
That will be worth waiting a few months for 😀
 
Gee... I've waited long enough for the R600. Finally gave in and bought an 8800GTX. Hmm, for some odd reason I've always bought ATI Radeon cards, but this is just ridiculous. Didn't want to spend the dough on a 1950XTX (kinda wanted a unified shader architecture).

Must be the buyout problems. I guess AMD's business model may be different from ATI's, and even the compensation packages (I'd guess lots of ATI employees were shown the door or were just disgusted at the buyout). I know how this feels, as I was recently part of a company that was bought by another big company: everything got pushed back due to budget cuts, everybody busy looking for a new job, etc...

AMD's busy trying to fight off C2D. Too bad, I need a computer...NOW. My Radeon 9600XT + AMD64 3000 (socket 754) just crapped out...
 
My 9700pro still games hard on FEAR and others. They were just that good. 😉

While both the 9700pro and 9800pro cards were good for their time, I find this hard to believe unless you are talking about running FEAR at really low settings.

I have seen first hand how badly FEAR puts the smack down on a 9800pro, much less a 9700pro, and the settings were not even really turned up; plus it was running at a relatively low resolution, 1024x768.

It seems that some of the newer games especially the shader heavy games such as FEAR are a little too much for the old 9700 and 9800pro's to handle.

I know what you mean. I played both the FEAR demo and Oblivion on an AIW Radeon 9800 Pro. It was medium settings to get good framerates. I pushed the resolution up to 1024 x 768, but got some slideshow moments. So, the 9800 Pro can still play the games, but not at high settings.

My new interim GeForce 7600GS has high settings in Oblivion, giving me close to 30 fps outdoors and 60 fps indoors on average, using FRAPS. I'm not complaining. Of course, I'm still running at 1024 x 768 and will run at 1280 x 1024 when I get new 19" LCD monitors.

The only thing I can kvetch about is that I got an MSI barebones with an Nvidia chipset that doesn't like older ATI X series cards and MSI tech support says it doesn't like newer X1x series cards and I should choose Nvidia for best DX10 performance on that chipset.

So, I'll go 8800GTS on April 1st to replace the 7600GS. I can swing the extra $100 and would appreciate the performance. Now, if I get Pure Video to behave with my H.264 fansubs, then I'll be happy. Looks like I need to e-mail Nvidia there!

I still want R600, but I was planning on building another PC in the fall, so instead of using the spare MSI Nvidia 405 chipset motherboard, I'll just keep that as a spare and get an ATI chipset board. That way, I can go X2600 or X2800 with no issues.

Anyone else have a large enough family that they need two or three PCs? That's a good way to go both ATI and Nvidia, AMD and Intel and then do your own real world comparison.

Our six year old loves his new Northwood with a Radeon 9800 Pro. It cuts through Reader Rabbit math like you wouldn't believe. It just doesn't do what my modder wife and I want it to do now. Hence the new builds.
 
This may have been said in an earlier post, and if so I'm sorry, but here goes.

I don't think this is really that bad a thing to happen. Now, I am sure part of the reason for the delay is the merger; however, that would probably be minor, since they (ATI) were working on the card before the merger. We have seen leaked photos showing the new card, so they are close. I am thinking like this:

I would bet they are probably having problems getting everything working together like they want (I know, no $h!t Sherlock), but they are trying to work in GDDR4 memory, which few other cards have at the moment, although the benefits of GDDR4 have yet to be tested. But frankly, there is no need for a DX10 card right now other than better performance in current-gen games (which everyone wants), and the current-gen ATI cards, while not top of the heap, are still pretty darn competitive price/performance-wise, excluding DX10 compatibility. Yes, they are losing some market share on the bleeding edge, but that happens to both nVidia and ATI when the other one puts out a new top-of-the-line card, so they are not really losing something they don't have the opportunity to get back (possibly) when the R600 is released.

Until DX10 games are released, having a DX10 card is just for bragging rights more than anything else (also higher FPS). I would bet they are working on a couple of things. One: getting good, stable drivers for Vista (hopefully; hello, nVidia) for those of you who have moved up or down, depending on your opinion of Vista. They are hopefully also preparing a full lineup of next-gen cards from high end to midrange, maybe even low end; at least, if they are smart, that's what they are doing. I could foresee them having problems getting a good, stable GDDR4 chip supply, since GDDR4 is still basically brand new. While it would be nice to get the R600 out and have some price competition at the next-gen level, I think it will all work out in the end when they release a fully tested (again, I would hope so with all the delays) and fairly bug-free card to compete in the DX10 arena.

Basically, until DX10 games come out there is, as I stated above, no reason to get an 8800 series card other than bragging rights; well, that and the ability to run Oblivion maxed out (I know how badly many of you hoped and dreamed for that). While the delay is not the worst thing in the world that could happen, I think it will let AMD/ATI release a product fully ready to compete head-on with the 8800 and whatever next top-of-the-line card nVidia puts out. One last thing: with ATI delaying the card longer, it does give them a good chance to check out the 8800 series cards and see where they can improve the R600 to compete, or maybe blow away the 8800s and whatever nVidia is cooking up to top the 8800GTX. Rather than both of them rushing to market and trying to play keep-up, ATI is sitting back, letting nVidia do its thing while working on ways to top whatever nVidia can throw at them. (We all hope.) Sorry for the long rambling post; I tend to stretch it out a bit. Any thoughts on the matter would be greatly welcomed; feel free to PM if you like, just no flames please.
 
Always a good read from your posts, GrapeApe, but something just occurred to me since you were playing the what-if game. What if, and this is a huge IF, ATI was originally going to bring out, say, a 2900 (or whatever name you want to give it), but they realized that the follow-up they had planned to top the 2900, say the 2950 (to compete with the 8900), is basically almost ready for production? So instead of releasing the 2900 and then a few months later releasing a faster card, they go with the 2950 as the top-of-the-line launch card and then sell the 2900 at a lower price point. So instead of bringing out a 2900 that will supposedly compete head to head with or beat the 8800GTX at the high end, they one-up nVidia and try to blow the 8800GTX or even the 8900GTX out of the water right from the get-go. There is of course no way to know, with no benchmarks, going only from supposed final release specs. (They do, however, have to bring a monster to market if they want to regain ANY ground lost due to the delays.) Like was said before, the bleeding edge is a very small piece of the whole GPU pie; they do make good money on the high end, but let's be honest, the bread and butter of the GPU market is the midrange cards.

If they could manage to release a high-end card that just destroys the 8800 or the 8900, then sell a cheaper card, say the originally planned "2900", for around the price of the 8800GTS while still beating the 8800GTX, they would be in a good market position. Of course this is all crazy speculation, but an interesting thought, to me at least. Also, as I said in my first post on this topic, if ATI could manage to release a full lineup from mid to low end all the way to the bleeding edge all at once (assuming they outperform similar cards from nVidia and can manage that big a release as a hard launch), they would be in a very good market position. Yes, this is crazy speculation, and they have not, at least that I have read, said anything about releasing lower-end cards at the same time as the R600; but if they play their cards right, they could pick up any ground lost due to the delays.

One last thing; sorry for the two long posts in a row. In regards to SLI vs Crossfire: yes, SLI has been around longer, as was mentioned before, since the 3dfx days (I loved 3dfx, by the way; too bad they are not around anymore). I think one reason most people liked to say SLI was better (the maturity argument has been answered) was that, IMO, they saw SLI, where you just had a little bridge from card to card inside your case, whereas Crossfire, up until recently, had a kind of awkward dongle from DVI port to DVI port on the back of the cards. To me, at least, if I knew nothing about graphics cards, I would say SLI was better mainly because it looked nicer than having a dongle adding to the rat's nest that is already behind most computers. Last post tonight, I promise, LOL.
 
Until DX10 games are released, having a DX10 card is just for bragging rights more than anything else (also higher FPS). I would bet they are working on a couple of things. One: getting good, stable drivers for Vista (hopefully; hello, nVidia) for those of you who have moved up or down, depending on your opinion of Vista.

I don't mind delays, especially since I have to build a new second PC with the upcoming R690G motherboard to give the X2800 a go. I'll be happy with an 8800GTS on the Nvidia board until then. If it turns out it doesn't run DX10 games as well as the later 8900 or ATI equivalent, then I've been there before. First generation cards of a new Direct X are really best for top performance in existing Direct X titles. The lineups need a refresh once the actual games arrive.

So, it's not just higher framerates. The Radeon 9700 Pro played DX 8.1 games like Morrowind fast with all the features maxed and I'm sure the first generation of DX10 hardware will cut through Oblivion at Very High settings. The benchmarks on the 8800 series already show this, so I'm sure ATI will do equally well.

So, a DX10 card is the best choice today at every price point they've been released at. I just wish ATI had theirs out now so they wouldn't lose market share. I like competition in more than just high-end benchmarks.

...Our six year old loves his new Northwood with a Radeon 9800 Pro. It cuts through Reader Rabbit math like you wouldn't believe...

:mrgreen: :mrgreen: :mrgreen: ROFLMAO!!!:mrgreen: :mrgreen: :mrgreen:

I'm glad you got a good laugh. I also hope you have a younger relative who can use that X700; it's in the same general category as my Radeon 9800 Pro. My kid's having such a good time with his own PC now that I decided not to upgrade it to an X1650XT or X1950 AGP.

I'll let it age gracefully until it's fodder for the local Goodwill computer store (they have one here in Austin dedicated to PC's and they have a quirky little museum of ancient computer history: https://www.austincomputerworks.org/museum/index.html )
 
"Hmmm...there's more here than meets the eye". Could it be that AMD is wanting to more closely coordinate the release of Barcelona with that of R600. Most of the pundits anticipate that R600 will grab the crown (at least temporarily) from the G80s and likewise for Barcelona over Core 2 Duo. For the FISRT time in modern PC history, ONE company could simultaneously claim the speed crown for both CPUs and GPUs! Likely NVidia and Intel would regain the lead, but AMD would have made history! I can think of no better corporate strategy to boost AMD's sagging share price (if only temporarily).

Just a thought...
 
Murphy's Law,

No matter what you do you'll be kicking yourself.

- Wait for the R600, you might be kicking yourself for not buying the GF8800 and enjoying it all that time.
- Buy the GF8800 now you might be kicking yourself when the R600 is launched.

Well I decided that kicking myself later wouldn't be nearly as bad if I had been using one of these
:arrow: http://www.newegg.com/Product/Product.asp?Item=N82E16814130079 for a few months.

Anyway if the r600 really has a significant % gain in dx10 when it FINALLY is released, then there is always ebay 😀

Tip of the hat:
RobsX2, those screenies of Oblivion were the final straw. I think I'll have to reinstall when my card arrives.
 
I'll be impressed if the X2900 is 65nm and doesn't require the 6 pin power plug.
That will be worth waiting a few months for 😀

Yeah, I doubt we'll see much dropping of the 6-pin connector, except on the crippled models, or maybe the midrange.

It'd be nice to think that the companies would be more focused on that, but usually they both take the opportunity to crank a few extra mhz out of the cores.

Perhaps a GS or XL version may appear with better power requirements, but I suspect it'll never trickle to the GTX/XTX crowd because it'll be more important to add 100mhz than to save 20W.
 
What if, and this is a huge IF, ATI was originally going to bring out, say, a 2900 (or whatever name you want to give it), but they realized that the follow-up they had planned to top the 2900, say the 2950 (to compete with the 8900), is basically almost ready for production? So instead of releasing the 2900 and then a few months later releasing a faster card, they go with the 2950 as the top-of-the-line launch card and then sell the 2900 at a lower price point. So instead of bringing out a 2900 that will supposedly compete head to head with or beat the 8800GTX at the high end, they one-up nVidia and try to blow the 8800GTX or even the 8900GTX out of the water right from the get-go.

Yep, that's similar to the X1900 scenario, where the issues with the X1800 got so bad that they accelerated the X1900, because there was the possibility of having it ready by the time they fixed the 3rd-party memory issue. If something similar happened this time around, it would definitely make sense, because if they hadn't gone into full-scale production by now, then what they did produce they could turn into special partner boards (an X2900GTO or something). It's a possibility, and similar to the move to another production node, although it could be even easier, considering that if it was slated for 80nm then the refresh Xxx50 part would be ready to go and production capacity is simply waiting for them.

There is of course no way to know, with no benchmarks, going only from supposed final release specs. (They do, however, have to bring a monster to market if they want to regain ANY ground lost due to the delays.)

I don't think they're worried about making up lost ground (at least I hope they aren't wasting their time on that); it seems like the logical thing to do, but really they didn't lose much overall. The main thing now would be to get an EQUAL part out ASAP; delaying 2 more months to get another part out makes it even longer. And finally Vista is here, so that gap is going to start spreading, even worse if any DX10 patch comes for something like FSX. Risky gamble IMO, but I understand the thinking; for it to work, though, it needs to be a significant boost to ensure customer migration to the new product. It can't be only 1fps better; that's not worth it IMO.

Like was said before, the bleeding edge is a very small piece of the whole GPU pie; they do make good money on the high end, but let's be honest, the bread and butter of the GPU market is the midrange cards.

Yep, I agree. That's why IMO the X2600/GF8600 saga will be the one that determines the fate of these two, and then the X2300/8300 will do similar battle, but there's more movement there and truly less performance focus (how many people buy OEM computers with crap 7300/X1300s expecting to game, or thinking they are gaming at a high level on games like FEAR/Oblivion?).

If they could manage to release a high-end card that just destroys the 8800 or the 8900, then sell a cheaper card, say the originally planned "2900", for around the price of the 8800GTS while still beating the 8800GTX, they would be in a good market position.

Yep, but of course nV likely has built-up inventory, so the only concern AMD runs into there is the potential of a price war by nV to clear out old stock and help pay for the GF8900 coming to market, while also hurting X2K sales. It will get very interesting if they do launch a two-punch; the market will be as bloody as it ever was.

Also, as I said in my first post on this topic, if ATI could manage to release a full lineup from mid to low end all the way to the bleeding edge all at once (assuming they outperform similar cards from nVidia and can manage that big a release as a hard launch), they would be in a very good market position.

And they do have a history of that; they like to launch top-to-bottom solutions, not always first, but usually within weeks of each other. And with these staggered launches nowadays, no matter what, they will all be detailed at the same time IMO.

Yes, this is crazy speculation, and they have not, at least that I have read, said anything about releasing lower-end cards at the same time as the R600; but if they play their cards right, they could pick up any ground lost due to the delays.

Well the plan has always been to launch lower end cards in Q1, so if everything gets pushed to Q2, I'd say it's very likely that it will be top to bottom.

As for the Xfire/SLi thing, I think Xfire is the more mature implementation (more features, wider options), but SLi's maturity in the marketplace means that companies have optimized for it in many cases.
IMO those two things counterbalance each other, and while I think they are very limited in use and still more about eWang performance most of the time, those who need it for 30" LCDs need to spend the time to find out which is better for their needs, because neither wins in all scenarios from what I've seen. I really can't wait to see the 3-4 card implementations, because that's when I think the load balancing will be interesting, and maybe the supertiling might be more useful than previously demonstrated.

Anywhoo, I hope there's something interesting hardware-wise prompting this delay, but their st00pid press release makes me think it's a marketing ploy more than anything. I hope I'm wrong.
 
Yikes. I'm not going to read all the posts before me, just put in my opinion. Sorry!

Unless the R600 gets an average of 100% higher performance than the leading GPU at the time of release, I will forsake ATI. This shit is unacceptable. I'll be fairly pissed off if they have a processing powerhouse better than everything else out there, and the delays are because it's so advanced that it's really hard to get ironed out. I'll be pissed off to the point of rejecting ATI, after years of supporting them, if what they have is merely moderately better than the leader. And if nVidia makes something faster than the R600 by the time it launches... well, I'll be somewhat fire and brimstone >.o
 
ATI has had better technology for the last 3 generations of cards, and a lot better hardware, unlike what you seem to think.


Actually, I'd say it oscillated from R9700 to GF6800 to X1800/1900. Performance isn't necessarily an indicator of technology, and while the X8 series may have performed better in many ways, the GF6800 was the undisputed technology leader of the time. And this round, if the rumours are true, the tech advantage will go back to the R600 with its DX10.1 support; however, they may have issues with the performance crown if that's the reason for the delay. Also, who knows what the 'extra benefits' offer, comparing the GPGPU and physics features, etc.

However, until it comes, we won't know; and for now the GF8800 is the current leader in technology, with no 'exceptions' the way there have always been.

Yes this is important because we are all living in the past and the fact that Nvidia is king right now holds no ground.

Depends on what people are looking for. Just like those who bought the GF6800 because of SM3.0, or the X1800 for FP16 HDR+AA, people may be looking towards more robust future feature support. There are reasons to care, but IMO they aren't good enough to forgo the benefits of having played with the GF8800 all this time. I see it more like any other refresh: just like buying an X1800 and then moving to an X1900, the same would apply here, which I think is your strategy: buy the best of whatever's available, and upgrade when you have to. Not the same market as those who are looking for the best long-term support and have no reason to upgrade now, because they are OK for current titles and just want support in future titles as it starts to be needed. Heck, most of those folks likely won't upgrade until weeks/months after the R600 comes out, once prices drop and games start poking out over the horizon.
 
Lots of replies, so I'll pick a random one by the Ape to make my own.

I'd echo what has been said already: there are no DX10 games out there to show off this hardware, but fortunately the DX10 cards run DX9 games better than any previous DX9 card. Unless there are some really severe further delays with the R600, as I've said before, we'll likely all be buying our DX10 cards on the back of DX9 performance, which is fine, I suppose, for the first steps in a new generation; they'll probably all be upgraded anyway when the first decent DX10 games arrive.

What is hurting ATI more at the moment is not their lack of presence in the market; it's more the lack of credibility. Being in bed with AMD of course hasn't helped, as they are in a slump at the moment in the seesaw CPU battle. IMO what damages ATI more is that they simply are not doing what they say they are going to do. You do of course expect the odd delay, lapse, or manufacturing problem in this industry. However, ATI have used up a lot of goodwill and faith with the constant R600 delays (with little or no explanation); whilst they may not need to release it yet, it doesn't look good if they keep saying they are going to release it and then don't. Nvidia have had their own problems in the past; however, it so happens that this time around the 8800 release went pretty smoothly (some GTX hitches aside, which were quickly resolved) and, to cap it all, the performance jump took a lot of people pleasantly by surprise. Relatively speaking, it couldn't look worse for ATI.

However, these things tend to vanish and be forgotten quickly once the latest and greatest is released, and let's all hope the R600 exceeds our expectations :)
 
Actually, I'd say it oscillated from R9700 to GF6800 to X1800/1900. Performance isn't necessarily an indicator of technology, and while the X8 series may have performed better in many ways, the GF6800 was the undisputed technology leader of the time.

For sure. Nvidia also learnt from the FX5800 cards and produced a really good card with the 6800; not only because it performed well, but because it didn't need a 1 kW+ PSU to run it and because it used a single-slot cooler that wasn't loud.

Decibels and power usage may not be the defining purchase points for power users; however, I would rather buy a card that is 5-10% slower but runs cool and quiet on a 350W PSU than something that requires 350W to run and has a triple-slot cooler (?!)