AMD Pushes R600 Back To May

You want facts do ya? GeForce 7950 GX2 released June 5, 2006 http://www.nvidia.com/object/IO_31673.html
GeForce 8800 GTX released November 8th 2006 http://www.nvidia.com/object/IO_37234.html

What's that a fact of?
It's an OEM product (not retail), and they aren't even in the same product segment. The GX2 is two GTs stuck together, and the GTX is a new architecture. You are confusing the true product cycle of the GF2/3/4 generation, when the 6-month product cycle statement was made by nV, with the rebadged clock pushes of the current generation. Even using your silly method of counting SKU changes, what was the product cycle for the GF6800U? It certainly wasn't anywhere near 6 months. Early April 2004 to late June 2005: that's 14 months without even a new SKU in that market segment.

http://www.nvidia.com/object/IO_12687.html
http://www.nvidia.com/object/IO_23415.html
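Since the whole argument comes down to counting months between press-release dates, here's a quick sketch of that arithmetic (an illustrative snippet; the launch dates are the ones cited in this thread, not independently verified):

```python
from datetime import date

def months_between(a: date, b: date) -> int:
    """Whole calendar months elapsed from date a to date b."""
    months = (b.year - a.year) * 12 + (b.month - a.month)
    if b.day < a.day:  # last partial month hasn't completed yet
        months -= 1
    return months

# Launch dates as cited in this thread
gf6800_ultra = date(2004, 4, 14)   # GeForce 6800 Ultra (AGP)
gf7800_gtx = date(2005, 6, 22)     # GeForce 7800 GTX

print(months_between(gf6800_ultra, gf7800_gtx))  # → 14
```

By the same arithmetic, June 5, 2006 to November 8, 2006 comes out to 5 months, which is where the two sides' numbers in this thread come from.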

Now I'm pretty bad with Mathematics but I believe that's 5 months, not 18! Now you were saying that I didn't have any FACTS to back up what I was saying?


GF7800 to GF8800 is 16+ months, not 5. The only way you can even fit within the old statements is by changing a product cycle to just a product refresh. You say: "ATi's cycle seems longer than Nvidia's but it's generally around 6 months per release for all-new cards including refreshes as well"
So including refreshes it took nV 16.5 months to go from GF7 to GF8, and just as long to go from GF6 to GF7, and that's only if you consider the GF7 a new product and not the NV47/48 extension that even their own software sees it as.

nV's original statement talks about "Doubling performance with each generation at a rate of essentially Moore's Law cubed", not a 30% boost here and 30% boost there over three refreshes.

OK, then where are your facts?

As for the 18month cycle, here's an actual article from BusinessWeek on it when ATi proposed to push it out to 24months;
http://www.businessweek.com/technology/content/apr2003/tc2003044_3712_tc024.htm

That's a published source, not your assumption of what a product cycle is without supporting info. You just post dates and draw your own conclusions, which matches most of what you post: flimsy support, and then you draw your own conclusions.

I see that I've put hard facts down and you keep criticizing me, but I've yet to see any facts from you! Put up or SHUT UP!

I put up supporting facts that support my actual statements; you haven't. I keep criticizing you because your 'facts' don't match your assumptions: you draw wild conclusions from the smattering of data you can cobble together, and then say 'from what I hear' or 'I bet' to pass off your hopes and dreams as if they were facts. Like your following statement:

"Let's hope not because ATi got sued for artificially inflating the market with the R520 when it was a noshow. "

Like I said, provide support for that specific statement: not something about accounting practices or insider trading, but actually being sued over the R520 itself 'inflating the market'.

C'mon now it's your turn to put up or STFU!
 
It's an OEM product (not retail) and they aren't even in the same product segment.

As for the 18month cycle, here's an actual article from BusinessWeek on it when ATi proposed to push it out to 24months;
http://www.businessweek.com/technology/content/apr2003/tc2003044_3712_tc024.htm

It's also retail too. Don't think so? Then why is eVGA selling it for $600? http://www.evga.com/products/moreinfo.asp?pn=01G-P2-N592-AX&family=22
That article was from 2003, and yes, both companies talked about doing that, but it only lasted about one cycle. Anyway, maybe I should qualify what I'm saying. By product cycle, I mean every time they release a new card, be it a whole new architecture or just a refresh. The 7800 GTX 512 is considered a refresh, but it was more than that; it scored noticeably higher than the 7800 GTX did. Anyway, it's a new card...now do you have any more outdated articles to show me that are no longer relevant?
 
That article was from 2003 and yes both companies talked about doing that but that only lasted about 1 cycle.

Doesn't matter if it's from 2003 or 2000, it's actual support for the definition of product cycle, not your assumption of what a product cycle is.

Anyway, it's a new card...now do you have any more outdated articles to show me that are no longer relevant?

It's not my turn bud, it's your turn to provide even one single article that supports the idea that a refresh is a full product cycle. You ask for proof, I provided it, you simply provided release dates and assumed what they meant.

And so your qualifier doesn't fit nVidia's own definition of doubling performance: the GTX-512 didn't do that, the GF7900GTX didn't do that; only the GF8800GTX matches their definition. And the only business definition has already been shown by me. So do you have something other than card release dates, your assumptions, and nothing else? Because even going by your system, the GF6800U was still 14 months, which doesn't match your definition.

Actually the lawsuit should be fairly easy for you to produce as it would be a matter for public record for a registered company. So let's see it.
 
They were sued for inflating their stock price, not the market. The only thing the R520 had in common with the lawsuit is that they were having issues with it at the time and those issues were being concealed from investors. It was one of several issues in the lawsuit.

Honestly corporations get sued all the time for various reasons. Both ATI and nVidia have had their share.

And enough with the lameass FUDZilla links people.
 
Grapeape is completely right; there is NOT a 6-month product cycle.

We tend to see the high-end cards in a range come out first (G70), then the mid/low end (GF7300, 7600, etc.), and then refreshes of the high end (7800GTX512, G71, 7950GX2, etc.).

7800-7900 is not a new product or new generation; it is a reworked die and also a die shrink. The "GeForce 7 series" to "GeForce 8 series" gap is a lot longer than 6 months 😛

R600 will come out eventually. It more than likely will be at least 10% faster than the 8800GTX; it's 7 months later, ffs. And if it is, I, and many like me, will buy one and replace my 8800GTX.

AMD is going to open crossfire, and there has been no such announcement from nvidia. Bearlake is the next killer intel chipset, which will support Xfire as AMD will open it, but not SLI. This is a big plus for AMD/ATI.

Still, even if R600 isn't released till May 2008 and is way slower than anything from nVidia, there will still be ATI fans who will buy it anyway, just as there would be nVidia fans who would buy nVidia if the reverse happened. After all, some poor sod bought the 5950U.
 
You may have bought a DX10 card early, but you'll still have missed having the cream-of-the-crop DX9 card for 10 months. I don't know what you guys are all arguing about; I can't see how anybody can deny that Nvidia is wiping the floor with AMD right now. It's like somebody lying bruised and bloodied on the floor after a fight and shouting 'I'll kill you when I get up', only to get a foot in his face.
 
It's not my turn bud, it's your turn to provide even one single article that supports the idea that a refresh is a full product cycle.

Actually the lawsuit should be fairly easy for you to produce as it would be a matter for public record for a registered company. So let's see it.

Here's a link of the lawsuit: http://www.hardocp.com/archives.html?news=LDA4LDIwMDUsaGVudGh1c2lhc3QsMiwxMg==

And guys, if you want to call a product cycle every 12 months, then that's fine, but then let me just say this. Nvidia releases a new high-end card every six months. That is what I mean. ATi falls behind on releasing a new high-end card every six months. By that I mean a completely new product cycle OR a refresh. The point is, Nvidia releases SOME type of card every six months and ATi does NOT do this all the time. They seem to always be late for some reason. I could understand if this happened once, but this is a recurring theme with them.

In any case, I hope I made myself clear this time. My point was about the correct term for a "product cycle" (but I understand what you mean) and the fact that ATi is constantly delaying. Back in November of 2006, they were already behind a quarter. While some cards have been released since then, we now know they all haven't been (R600). http://www.theinq.com/default.aspx?article=35871

Now that we have that straightened out, I don't see how anyone can dispute the fact that Nvidia releases either a REFRESH or a NEW ARCHITECTURE every 6 months while ATi can't do this! Have I said this enough times now so that everyone can understand my point? I don't know how to make it any clearer. I don't care what you might say now; there's no evidence to say otherwise. The facts speak for themselves: Nvidia releases a new high-end card, whether a refresh or a new product cycle, every six months, but ATi does not.
 
I can't see how anybody can deny that Nvidia is wiping the floor with AMD right now.

Clearly you understand what I've been saying...I don't see why some of these other people don't. ATi has had the crap beat out of them because they just keep delaying products, but Nvidia doesn't do this anymore. They used to, but they learned: hey, if we keep delaying like this, we lose money. Here's a novel idea: why don't we stick to our release dates as closely as possible, and that will help us make more money. Apparently ATi hasn't learned this lesson yet. I hope AMD teaches it to them!
 
Here's a link of the lawsuit: http://www.hardocp.com/archives.html?news=LDA4LDIwMDUsaGVudGh1c2lhc3QsMiwxMg==

Actually read the lawsuit. The R520 is not used to inflate the market; it's that its production delays were not revealed. Completely different things: one is overstating (which nobody but you is claiming), the other is "production/design/yield issues". That's not the same as the claims of inflated market value, which rest on the misstatement of margins, yields, and their current market share against nV and Intel, the ongoing concerns and the only things that could artificially inflate the market.
It's pretty clear in the suit itself that the artificial inflation comes from the misstating. Like I said, the R520 itself isn't inflating the market the way you say. Once again you're drawing conclusions that aren't there, just like with what a product cycle is.

By that I mean a completely new product cycle OR a refresh. The point is, Nvidia releases SOME type of card every six months and ATi does NOT do this all the time.

And neither does nV. Like I said, explain the GF6800U: no new card every 6 months, 14 months without a new card of any kind. And like with the R520 and pretty much the rest of your posts, it's how you're saying it that's causing problems. You're wrong about a product cycle, and you rely too much on rumour and FUD to make your statements.

My point was the correct term for a "product cycle"

No it isn't, and you don't have anything to support it. I think it's pretty clear what the industry considers a product cycle. You confuse releasing some type of new product with an actual product cycle, which doesn't even match nV's definition, which clearly states doubling performance as their goal in a product cycle.

(but I understand what you mean) but the fact that ATi is constantly delaying.

As did nV. The NV47/48 was delayed and became the G70, and the G80 was delayed. So really there's no difference other than the duration of the delay; they BOTH delay products, contrary to your statement.

Now that we have that straightened out, I don't see how anyone can dispute the fact that Nvidia releases either a REFRESH or NEW ARCHITECTURE every 6 months while ATi can't do this!

Once again Geforce 6800Ultra, nothing for 14 months. What does that do to your theory?

ATi has had the crap beat out of them because they just keep delaying products but Nvidia doesn't do this anymore.

Right. :roll: So is 'anymore' AFTER the G80 delay or were you trying to claim that the G80 itself wasn't delayed?
 
I don't know what you guys are all arguing about, I can't see how anybody can deny that Nvidia is wiping the floor with AMD right now.

I don't think anyone denies the G80 is the king of the high end, but they aren't winning in all segments, which is likely part of the reason there isn't a rush to get a problematic R600 part to market. Heck, AMD is struggling harder in the CPU and chipset segments; at least they have profitable products in the middle and low end of the VPU line. If nV were winning the middle too, that would likely put more pressure on AMD to release any R600 at all to get a better replacement for their mid range out, rather than feeling comfortable enough to do this mega-launch thing instead. It's annoying, but there's no pressure right now.

The GF8800 is and will be a solid card. I don't see why so many people feel the need to rewrite history in order to defend their choice because they're afraid of what's around the corner. Anyone who cares that much, especially if they believe in this '6 month cycle', would be expecting a better card by now anyway. Seems disingenuous.

People should worry less about the politics and concern themselves more with the performance in games they actually play.

I doubt I'll hear one word of remorse from anyone who bought a GF8800 and has been playing it since the time we originally recommended them when they first came out. The only people who'll have regrets are those who waited until the end and are more concerned with their ego and having the best card/3DMarks than actually playing games on it.
 
You can never be cutting edge for more than a month at best. I purchased an 8800 for a new build I did at the beginning of February, and to say I am happy with what I've got would be understating it. Sure, come May a better card will surely be out, but my card plays the games I own now, and owns them too. 😛

The 65nm story about the R600 is an interesting one too, as reduced power requirements/heat is always a good thing.

It's good that we have two major chipset manufacturers tussling for dominance, as it keeps both on their toes, but I just wonder how long it will be before rising core counts in chips start to ring the final bells for the graphics card.
 
Apparently the card is being released at the end of March. Check out:

http://www.tweaktown.com/news/7193/amd_roadmap_update_on_r600_details_confirmed/index.html

"The big daddy of them all, the Radeon X2900XTX (codenamed Dragons Head 2 in 9.5” size XTX configuration with GDDR4 memory), is due to hit shop shelves on 30th of March 2007 but it’s still unsure if they will actually make that date. Samples were already delivered to AMD’s partners last week and many companies will likely have R600 live demos on display at CeBIT. The Radeon X2900XT (codenamed Cats Eye also measuring 9.5” but with GDDR3 memory and reduced clock speeds about 15% below XTX) will hit shop shelves on 19th of April. Dragons Head 2 is actually already well into production and product will be delivered to AIB’s in just a few days from now. After many revisions, it seems like AMD have finally got things right and are satisfied with production output – Rev90 it is.

The original and monster Dragons Head measuring 12.4” long will still be sold but recommended just for OEM and system integrators. AMD will also release the Radeon X2900XT and X2900XL sometime in April with reduced clock speeds and only 512MB GDDR3 memory. The Radeon X2900XTX will use faster 1GB GDDR4 memory operating at over 2000MHz DDR using a 1024-bit internal memory bus and 512-bit physical memory interface.

The Dragonshead 2 which will become the card that you can actually buy will draw a maximum of 240 watts which is much higher than first speculated. If you’re interested in running Crossfire, that’s a total of 450 watts just for your graphics cards and it’s no wonder we are starting to see consumer power supplies above 1000w hit the market now. The power will supplied by 1 x 6-pin and 1 x 8-pin PCI Express connector and the coolers used are dual slot configuration."
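For what it's worth, the memory figures in the article quoted above imply a concrete peak bandwidth. A back-of-the-envelope sketch (illustrative only; the 2000 MT/s effective rate and 512-bit physical bus are the article's rumored specs, not confirmed numbers):

```python
def peak_bandwidth_gbs(effective_mt_s: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_mt_s * 1e6 * bytes_per_transfer / 1e9

# Rumored X2900XTX figures from the quoted article
print(peak_bandwidth_gbs(2000, 512))  # → 128.0

# The shipping 8800 GTX (900 MHz GDDR3 = 1800 MT/s, 384-bit) for comparison
print(peak_bandwidth_gbs(1800, 384))  # → 86.4
```

That roughly 128 vs 86 GB/s gap is why the rumored 512-bit interface drew so much attention in this thread.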
 
Once again Geforce 6800Ultra, nothing for 14 months. What does that do to your theory?

So is 'anymore' AFTER the G80 delay or were you trying to claim that the G80 itself wasn't delayed?

You are right about the span when Nvidia released no high-end cards, but it was 11 months, not 14. Granted, that was no 6-month release either, but ATi makes a trend of delays while Nvidia doesn't. http://www.nvidia.com/object/IO_14696.html
http://www.nvidia.com/object/IO_23416.html

Maybe the G80 was delayed, but not by more than a month or two...not 6 months! And don't forget the GeForce 7950 GX2! Yeah I know, it's an OEM part, but it's also a retail part...remember the link I showed you from eVGA selling it...it's currently backordered.
 
Apparently the card is being released at the end of March. Check out:

http://www.tweaktown.com/news/7193/amd_roadmap_update_on_r600_details_confirmed/index.html

I'd REALLY like to believe that, but I highly doubt it will be released before the end of this month. I hope I'm wrong, but I doubt it...especially if it's a 65nm part too...then again, that's also just a rumor until it can be verified. I must say, though, if it's true then I don't see how Nvidia can beat ATi unless they skip a refresh cycle and go straight to a new architecture, because there's no way a refresh is going to beat the R600 according to the specs...at least on paper anyway. Then if you add a 65nm process on top of that...I don't see it.

I'd have to concede that if all this is true, then ATi did OK with this move and the delay would be understandable, but it's hard to say whether it's true or not. There are rumors on both sides. There's hope for Nvidia too: if they're actually working on the G90, then it's got as much of a chance as the R600, since originally the G90 was supposed to go against the R700...so we've got hype on both sides.

This is what I think: the R600 is an 80nm part and it will beat the G80, but not the G81 (yeah, I think Nvidia is working on the G81 right now, not the G90...or rather, the G81 will be released soon). But I also think that even though the G81 will be faster than the R600, it won't be by much, and either card will be a great card.

I think the real question will be which card has more features and, perhaps more importantly, which card has better IQ. We know the R600 has better specs than the G80 and G81: a 512-bit bus with 1GB of GDDR4 RAM (there's supposed to be an R600 variant with GDDR3 RAM and 512MB too) versus a 384-bit bus with 768MB of GDDR4 RAM for the G81. They both have unified shaders. Oh, and the R600 is rumored to have DX10.1 vs DX10. Not a big change, but the 10.1 spec is supposed to make it easier for devs to code shaders, I think.
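The spec comparison in the paragraph above is easy to make concrete: at equal memory clocks, bandwidth scales linearly with bus width. A sketch (the equal-clock assumption is mine, and the "G81" interface width was rumor at the time):

```python
r600_bus_bits = 512  # rumored R600 physical memory interface
g81_bus_bits = 384   # G80 / rumored G81 memory interface

# Bytes moved per memory transfer on each bus
print(r600_bus_bits // 8, g81_bus_bits // 8)  # → 64 48

# Relative bandwidth advantage at identical memory clocks
advantage = r600_bus_bits / g81_bus_bits - 1
print(f"{advantage:.0%}")  # → 33%
```

So on paper the wider bus alone buys a third more bandwidth before memory type (GDDR3 vs GDDR4) or clock speed even enters the picture.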
 
The only people who'll have regrets are those who waited until the end and are more concerned with their ego and having the best card/3Dmarks than actually playing games on it.

I'm in total agreement with your perspective. Apparently, some folks enjoy running 3DM06 and other benches more than playing games.
 
The only people who'll have regrets are those who waited until the end and are more concerned with their ego and having the best card/3Dmarks than actually playing games on it.

Agreed here too, but I do want to have the best card available at the time. That's why I'm waiting until the Intel price drop on April 22, and then I'll buy whatever the fastest card is at the time. I've learned that you can't trust release dates of any kind, and it wouldn't surprise me if even the Intel price-drop date changed.

If you really want to get down to it, it doesn't matter what card you buy, as long as it runs the games you want at your resolution and detail level and you're happy with the card. Besides, like someone else said earlier in this thread, your shiny new card will only be king for a short time, normally about six months, until something else is released. It's just nice to buy the fastest thing out so it will also last longer and run whatever game you want with all the bells and whistles on.
 
I doubt I'll hear one word of remorse from anyone who bought a GF8800 and has been playing it since the time we originally recommended them when they first came out. The only people who'll have regrets are those who waited until the end and are more concerned with their ego and having the best card/3Dmarks than actually playing games on it.

My bold; couldn't have said it better myself. It seems like this whole thread has become a battle of egos: one side defending their choice of going for the card, and one side defending their choice of waiting it out. Like you said, both have lost sight of the goalpost, which is enjoying games at the highest settings possible. 😛
 
If you can wait, then wait for better products. I mean, no one is going to pan you for buying an 8800 card, and those who do are idiots; not everyone is willing to wait (any amount of time), and some want to buy something now. That article you linked talks about the 6800 Ultra and GT, and the GT came out after the Ultra. The 6800 Ultra first came out in AGP form on 4/14/2004, while the 7800 GTX came out on 6/22/2005, so that's 14 months.

What you're talking about is the PCIe version, which came out on 6/28/2004.
 
I'd REALLY like to believe that but I highly doubt it will be released before the end of this month. I hope I'm wrong but I doubt it...especially if it's a 65nm part too...then again that's also just a rumor until it can be verified.

From the (quoted) article, there's no reason to believe it will be anything other than an 80nm part. It has the same (240W) power consumption as before. Also, since AMD has only just released CPU chips on 65nm, there is another reason to doubt they would use this (immature) technology for an unreleased chip from another division...
 
From the (quoted) article, there's no reason to believe it will be anything other than an 80nm part.

Who said that AMD would be using their 65nm fab to make them?
RV610 and RV630(XT) are confirmed to be 65nm. Given the R600's high power consumption, a die shrink makes sense.
 
The original, monster Dragons Head, measuring 12.4" long, will still be sold but recommended only for OEMs and system integrators.

If that's true, that'll be more of a reason for enthusiasts to get it than anything else (except for the performance). It's the forbidden-fruit thing: if they're recommending it to a specific market because they don't think plain old enthusiasts will have the proper power and cooling equipment, the power users will want it even more.

Great statement by AMD.
 
Who said that AMD would be using their 65nm fab to make them?
RV610 and RV630(XT) are confirmed to be 65nm. Given the R600's high power consumption, a die shrink makes sense.

No one. I'm stating my own opinion that the first R600 (DragonHead/DragonHead2) cards referred to in the article, supposedly due for release in March, will not be 65nm. I'm basing this opinion on the claimed power consumption of the cards, and on my opinion that AMD will not use (for them) immature technologies to launch new chipsets. It's a given that they will do die-shrinks in the future.
 
You are right about the span when Nvidia released no high-end cards, but it was 11 months, not 14. Still, it was no 6-month release cycle either; yet ATi makes a trend of delays, while Nvidia doesn't. http://www.nvidia.com/object/IO_14696.html
http://www.nvidia.com/object/IO_23416.html

You are picking the wrong link again. I already posted the release links on the last page. Your first link is for D3, not for the GF6 series itself; it's 14 months from GF6 to GF7. To requote the last page:

"Early April 2004 to late June 2005, that's 14mths without even a new SKU in that market segment.

http://www.nvidia.com/object/IO_12687.html
http://www.nvidia.com/object/IO_23415.html "


It's actually 14 months and a bit.
Unless, of course, you're not focusing on just that market segment; if you're talking about new products, period, then your statement crumbles again too. Either way it's more than 6 months.

Maybe the G80 was delayed but not more than a month or two...not 6 months though!

So now it's not that nV never delays; now you're going for record delays? The G80 was scheduled for a spring/summer release by all accounts, and when Vista got delayed it slipped to September, and then got pushed back again. Even if you only count the last slip, the fact is that your statement that ATi/AMD delays and nV doesn't is false.
Would the NV47/48 hold that record? It took so long it was renamed the G70, even though the drivers still saw it as an NV4x-series part. But like the R600, if the planned replacement offers limited reasons to buy, why not push back until you have the part you want to release? Either way your statement was wrong.

Don't forget though the GeForce 7950 GX2! Yeah I know, it's an OEM part, but it's also a retail part...remember the link I showed you from eVGA selling it....it's backordered currently.

That really has no bearing on those two comments, though. The 7950GX2 (which I replied to as a 7900GX2 as I was skimming at work, my mistake, I admit) does nothing to excuse the G80's delay or the fact that the GF6800U didn't have a replacement for over 14 months. Both of which are issues you say nV doesn't have.
 
No one. I'm stating my own opinion that the first R600 (DragonHead/DragonHead2) cards, referred to in the article, and supposedly due for release in March will not be 65nm. I'm basing this opinion on the claimed power consumption for the cards,

The article is now somewhat out of date as a rumour source, given the current statements about what the January respin was really doing (a shrink, not just a bug fix). The current rumour (which is all this thread seems to be) is that the R600 will be 65nm at launch in early May. That would also mean that if they don't push the chips too hard there should be power savings, and likely overclocking headroom for enthusiasts if they don't lock the parts down too tightly.

and my opinion that AMD will not use (for them) immature technologies to release new chipsets. It's a given that they will do die-shrinks in the future.

Well, ATi did jump to 90nm with the X1800 right off the bat, no waiting, so jumping to TSMC's 65nm, which was already used on the low-end and mid-range parts, would be nothing new for a launch. It makes sense if they've been playing with those parts for a while and find 80nm hard to work with. The whole risk of 65nm is putting all your eggs in one basket and having them crushed; but if they are having trouble with 80nm, then run a 65nm line to prepare for the replacement part and find few issues with it, it makes sense to abandon the 80nm and switch to the 65nm process on which they are achieving their goals. It should also be significantly cheaper per chip than on 80nm, and, barring any major yield issues, likely cheaper than the G80 despite that chip's process maturity, which would make it very attractive for AMD.

Of course they are all still ether products, as is the rumoured GF8800U; until they hit retail we won't know for sure what's what. No argument, the wait just to see another DX10 part with which to compare designs is nerve-racking, even from the technical side alone.