AMD Pushes R600 Back To May

This news kinda stinks. And feels kinda familiar, too... Do you guys remember way back when S3, Voodoo, and the early Nvidia chips were around, there were the Rendition Vérité chips (I don't know how to spell it anymore)? I had a Rendition and was waiting for the next big thing, because they kept sending review sites paper launches of their so-called new graphics card... Oddly enough, the same scene is playing out here... They tell you they're going to release it, then move it, and move it again, and then finally, boom, nothing. They just stopped making graphics cards...

I used to own a Rendition V2200 AGP 8MB a long time ago... Anyway, if anyone still remembers those chips and all the paper launches and leaked specs, it kinda makes you wonder if ATI will still be around by May. Especially with AMD just pushing out new processors for the heck of having something new out there...

It's like they are killing themselves slowly for something...
 
EwWW!


LOL!
Yeah, I grow on people like mold, or a flesh-eating disease. :wink:
 
It played awesome, but that was at 10x7 = 150 fps,
and now at 16x10 it's 65 fps.

Uhh... just noticed you were running SLI, so never mind. LOL.

That was with a single card (in the in-game performance test);
SLI was around 200 fps.
Not quite double the performance like some people think :lol:
I did like SLI, but the heat was too much.
And I had AC Freezer coolers for both cards, but
they wouldn't fit in my case.
 
Execution, execution, execution: something AMD has been struggling with since C2D hit the market. AMD is starting to look like Intel two years ago :lol:
 
This was at 1024x768, though.

One card at 10x7 maxed out at 150 fps.
Now at 16x10 it maxes out at 65 fps. :cry:

Can't use SLI anymore, as I put the other card in my
other computer and added the other Arctic Cooling HSF.
When both cards have that aftermarket cooler they won't fit
in my case.

When I did run SLI, the top card had the cooler and the bottom
card had the stock cooler.
 

Oh ok bud, now it makes sense :wink:

That's one thing some people have said to me before:
(oh, you're lying, you can't crank the settings up on this game or that).

I just always forget to mention that I'm gaming at 10x7 or 12x10.

The old 6800 GT is a good card, but (like me) it's getting some grey hairs :lol:
8O Oh wait, I started getting grey 15 years ago :cry: :lol:

But yeah, you and Mr. Ape got me more confused than I was.
Spend $150 or less on a DX9 card, or $300 or a little more on the G80?

I'm thinking the G80:
around $300 for the 320MB version
or $375 for the 640MB version.

I think the 640MB version. I'll probably buy in 2 weeks or so, hopefully.
 
I used to own one of the Rendition V2200 AGP 8MB a long time ago....

Sigh, I had one of those too. I was also bummed not to see their dreams turn into reality.

Yeah, well, it was a bummer. But this whole ATI thing is kinda bothersome... Hope they release a beast when and if they do launch it. I was holding off my $500 graphics card upgrade waiting to see what ATI would release... But if May comes and there's still no sign of ATI's new card, I'm getting an 8800 series, or by that time probably an 8600 mid-range card: not too pricey, but with DX10 support.
 
My 9700pro still games hard on FEAR and others. They were just that good. 😉

While both the 9700 Pro and 9800 Pro were good cards for their time, I find this hard to believe unless you are talking about running FEAR at really low settings.

I have seen first-hand how badly FEAR puts the smack down on a 9800 Pro, much less a 9700 Pro, and the settings were not even really turned up, plus it was running at a relatively low resolution, 1024x768.

It seems that some of the newer games, especially shader-heavy games such as FEAR, are a little too much for the old 9700 and 9800 Pros to handle.

In no way was I saying that it was gaming at a top-notch level... hence my purchase of my current card a year ago... (~13 months?) But just the fact that a card that "old" could still run the game decently at 1024 with some settings turned up amazed me.

I was only making the point (half sarcasm) that he could have skipped a lot of time and still run the 9800 before upgrading. My ol' yeller here is OC'd quite a bit with aftermarket cooling, though, so it is far from stock... > 9800 Pro (stock). It's been a loyal companion through many a frag-fest. 8)

Regardless... it was just a jab at the other guy posting about the lag in card upgrades... nothing more. 😉
 
There are only a few things that can cause this...
The most likely one I can think of is that the top R600 offering is not competitive in performance with the 8800GTX.

Without even engineering samples shipping, the R600 is, at best, still semi-mythical.

john

This was my guess as well, I can think of no other reason why they would delay this again.

My guess is that the R600 is more than competitive with the 8800, but the power requirements are out of control. Pictures of the card were posted on numerous sites less than two weeks ago, and it was a monster, reportedly consuming 270 watts under load. I bet AMD has been working feverishly to get these chips to run cooler and more economically without sacrificing too much in the way of performance. I personally wouldn't be surprised if it's delayed again from May into the summer; a dramatic reduction in power usage could involve some serious redesign.
 
Yes, AMD has no DX10 card but so what; there's no DX10 games (yet). AMD knows that the majority of enthusiasts aren't going to buy something that supports a technology that isn't available yet, so in AMD's eyes there's not much point releasing a DX10 card yet. I can see their point.

I think we can forget DX10 in the current context, it's an afterthought of the 8800 and serves well only for marketing purposes. DX10 won't be relevant to most of us for many months. The point to note here is that the 8800 blows the current AMD cards away in *current DX9 games*. nVidia has the performance crown and is taking share in the enthusiast market, and then when they introduce lower priced offerings it'll be a C2D vs. K8 story all over again in the mainstream market.

I'm interested in the R600 but until AMD get the power requirements (270 Watts!) under control it's about as practical a purchase as the Quad FX CPUs. Not to mention that I'm running an Intel setup, and I wouldn't be surprised if AMD "optimized" the R600 drivers to run slower on C2Ds than on the K8. From that perspective alone I'd be leery of buying an AMD video card.
 
I will admit I didn't read the whole thread, since I honestly don't have the time or the patience to read 5 pages, but it seems like AMD is trying to release the R600 and their new processors at the same time, or at most a couple of weeks apart... Plus, who cares anyway... DX10 games aren't out yet, so whatever. When DX10 games come out, if the R600 still isn't out, then it's time to start complaining hardcore.
 
If AMD/ATI makes their cards perform worse on my C2D, they will lose me as a customer forever, and I have pretty much only owned ATI cards. =/
 
Please, could someone out there with loads of cash acquire AMD and fire all those good-for-nothing executives? They seem to have lost their way and are bent on destroying AMD. Mr. Gates, you can do this! PLEASE BUY OUT AMD, PLEEEEEEESE!
 
I.M.O. this is a bad thing for AMD. We all know the competition is fierce, and once AMD comes out with their DX10 product, NVIDIA should be ready with their next iteration.

Understandably, if AMD's product is not ready, then they should not ship. It's worse to send a half-baked product and hurt your business than to wait and get it right.

One could argue that there's not enough of a market for DX10 products because there's barely any software driven by DX10 right now. There have been some reported concerns over the drivers for Nvidia's 8800, so perhaps Nvidia's haste is something AMD could capitalize on by having a product, feature set, and driver that is more competitive.

Unfortunately for AMD, they are lacking in one area in which Nvidia still holds great strength: multi-card configurations (SLI vs. Crossfire). Although Crossfire has the capacity to run with cards of varied strength, compared to SLI the numbers show that Crossfire just doesn't do as well. If AMD's plans for the DX10 card include a new, improved Crossfire system that beats Nvidia, I'm sure that people will be happy they waited and will push more for AMD.

The fact is, whether it's Nvidia, AMD, Matrox, Intel, or any other manufacturer... whoever makes a great performing product at a price worth spending, people will buy it. Brand loyalty is a myth.
 
Understandably, if AMD's product is not ready, then they should not ship. It's worse to send a half-baked product and hurt your business than to wait and get it right.

One could argue that there's not enough of a market for DX10 products because there's barely any software driven by DX10 right now. There have been some reported concerns over the drivers for Nvidia's 8800, so perhaps Nvidia's haste is something AMD could capitalize on by having a product, feature set, and driver that is more competitive.
good word. could be the case... dunno really but good speculation.

Unfortunately for AMD, they are lacking in one area in which Nvidia still holds great strength: multi-card configurations (SLI vs. Crossfire). Although Crossfire has the capacity to run with cards of varied strength, compared to SLI the numbers show that Crossfire just doesn't do as well. If AMD's plans for the DX10 card include a new, improved Crossfire system that beats Nvidia, I'm sure that people will be happy they waited and will push more for AMD.
Dunno where you get that Crossfire is bad. Looking at benchies (obviously not 8800 SLI... but previous gen), Crossfire is right where it should be. Forget the whining about the dongle (worst reason to complain, IMO, and it's gone in the new stuff) and just look at quality/performance. In many cases Crossfire is better ("better" being either bang/buck or raw perf). OpenGL is improving, but NV still has it there across the board.

The fact is, whether it's Nvidia, AMD, Matrox, Intel, or any other manufacturer... whoever makes a great performing product at a price worth spending, people will buy it. Brand loyalty is a myth.
again, word.
 
Yes, AMD has no DX10 card but so what; there's no DX10 games (yet). AMD knows that the majority of enthusiasts aren't going to buy something that supports a technology that isn't available yet, so in AMD's eyes there's not much point releasing a DX10 card yet. I can see their point.

I think we can forget DX10 in the current context, it's an afterthought of the 8800 and serves well only for marketing purposes. DX10 won't be relevant to most of us for many months. The point to note here is that the 8800 blows the current AMD cards away in *current DX9 games*. nVidia has the performance crown and is taking share in the enthusiast market, and then when they introduce lower priced offerings it'll be a C2D vs. K8 story all over again in the mainstream market.

I'm interested in the R600 but until AMD get the power requirements (270 Watts!) under control it's about as practical a purchase as the Quad FX CPUs. Not to mention that I'm running an Intel setup, and I wouldn't be surprised if AMD "optimized" the R600 drivers to run slower on C2Ds than on the K8. From that perspective alone I'd be leery of buying an AMD video card.

Who said 270?
The rumors said 240 watts at peak; that's 20 amperes off the 12 V rail 😵
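For what it's worth, the 20 A figure only checks out if you assume the draw comes off the 12 V rail; the rumor only quoted wattage, so the rail voltage here is an assumption. A quick sanity check:

```python
# Back-of-the-envelope check on the rumored R600 power figure.
# Assumption: the 20 A number comes from dividing the rumored
# 240 W peak draw by a 12 V rail voltage (not stated in the rumor).
watts_peak = 240.0    # rumored peak power draw, in watts
rail_voltage = 12.0   # ATX/PCIe 12 V rail, assumed

amps = watts_peak / rail_voltage  # P = V * I  =>  I = P / V
print(f"{amps:.0f} A")  # prints "20 A"
```

On the 5 V rail the same 240 W would be a wildly impractical 48 A, which is why the 12 V assumption is the natural reading.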
 
Dunno where you get that Crossfire is bad. Looking at benchies (obviously not 8800 SLI... but previous gen), Crossfire is right where it should be. Forget the whining about the dongle (worst reason to complain, IMO, and it's gone in the new stuff) and just look at quality/performance. In many cases Crossfire is better ("better" being either bang/buck or raw perf). OpenGL is improving, but NV still has it there across the board.

I had not stated that Crossfire is bad. Based on what I read from reviews on THG, Crossfire system performance was not as good as SLI's... but I never said "bad". In fact, bang for the buck is a good point, and as I mentioned before, Crossfire has the ability to use cards of varied performance. I would like to see AMD push this to higher levels, and I hope they will.

Being familiar with Radeon and Geforce cards in my machines at home, they both certainly do well. For the consumer's benefit, I hope the AMD solution will force another pricing war. 🙂 Nothing beats two companies dropping their prices to get my hard earned dollars.

Cheers.
 
I do not know where this whole "SLI is better and more mature" stuff comes from.

Please explain or link me to an independent explanation. Unless it is just me, my own X1900XT config won and lost benchies against the 7900GTX config. It was never trounced; if anything, the opposite.

I have wondered that myself. I have always heard that SLI is so much better and as you say "more mature" but what does that mean? Benchmarks seem pretty solid on a crossfire setup...right?
 
So why does AMD think there is Vista growth? Sales numbers for Vista are disastrous. MS CEO Ballmer has reported Vista sales are 58.9% lower than WinXP sales of 5 years ago. MS cut back employee benefits packages in 2004, and there is a very good indication they'll start cutting back employees this year.

That aside, are there any DX10 must have games out now? Will there be any by May?

Vista is slower than XP so the rush to DX10 hardware just doesn't exist.

AMD can afford to wait a couple more months; this is certainly NOT the death rattle. AMD needs to produce a card significantly faster than nVidia's best offerings, because you know nVidia is just waiting to release their 8900GTX 1GB as soon as AMD releases.

But the money is not in the high-end market; high-end folks such as myself buy the next best thing as soon as they can, no brand loyalty here. The folks AMD wants are those "on the fence" and/or those who don't wanna spend $600+ on a GPU. If AMD can produce an R600-based card for $200-$300 that performs as well as nVidia's 8800GTX, that WILL be a cash cow for them; hitting the right price/performance point is what makes a strong bottom line.

Rob.
 
...IMO, buy the GF8800GTS-320 now; far less of a kick factor either way, not a great loss if the R600 turns out better, and in the meantime you get lotsa gaming...

Well, isn't the 8600 coming out in March, or am I mistaken? If I'm right, I'll likely grab the 8600 in March. Don't get me wrong, the X700 in my sig is an awesome card for the money I paid for it (super deal: $44 after tax and shipping); it was a solid performer that noticeably beat, but did not annihilate, my GeForce 6600's performance.

It's nothing personal...but ATI is a little too slow this time. Makes me wonder if it is going to be like NVIDIA's initial FX launch (remember the infamous FX5800 "dustbuster", anyone?)
 
I do not know where this whole "SLI is better and more mature" stuff comes from.

Please explain or link me to an independent explanation. Unless it is just me, my own X1900XT config won and lost benchies against the 7900GTX config. It was never trounced; if anything, the opposite.

I have wondered that myself. I have always heard that SLI is so much better and as you say "more mature" but what does that mean? Benchmarks seem pretty solid on a crossfire setup...right?

SLI has really been around since 3dfx; Crossfire is a much newer design. The truth is, SLI really is more mature...

...The question, however, is not which one is more mature, but which one is better. That is a question maturity has little to do with, other than flaunting SLI as a "proven design that has stood the test of time". Frankly, ATI's Crossfire seems to be pretty reliable based on the reviews so far, so there is little reason to conclude that SLI is "obviously better than Crossfire" when there are no statistics showing that Crossfire has a higher failure rate than SLI.