AMD Pushes R600 Back To May

Yeah, ehh, what is with you guys? I don't think you're seeing the big picture here. They're delaying the R600 not because they want to keep it out of your hands, but because they want it to integrate better into their next chipset series. I believe May is also when Barcelona gets done baking in the fabs, and I'm sure AMD/ATI has planned something to integrate their platform better.

Also, you guys don't realize there's no worthwhile DX10 game out, and the ones coming out that I really want, with amazing graphics, are Unreal 3. So who gives a flying F*$% about DX10 at this point in time? Otherwise you're just preparing for the future. It's about as useful as a 64-bit operating system (great in theory, but no one really has anything for it). So go spend your money on first-gen DX10 cards from Nvidia; personally I don't buy first-gen anything, and AMD just wants to skip a generation. Four months of sales won't sink them. Great, demanding DX10 games drive graphics card sales on a large scale, not power-hungry enthusiasts.
 
It's about as useful as a 64-bit operating system (great in theory, but no one really has anything for it).


Wow, spoken like a real noob. :roll: Unlike 64-bit operating systems, the G80s actually do have plenty of use. They play any current DX9 game with max settings, at higher resolutions than ever before achieved with a single card, with 16xAA, all while getting awesome frame rates.


The G80 is the very first card that truly makes more than a handful of games out there playable with maxed-out settings and unbelievable levels of AA, all while getting fast frame rates.


Stop acting like it's useless when it clearly isn't. :roll: The only people who ever say this sort of thing are the ones who don't even own one.


Eh, please tell me of one highly played title that has sold millions of copies and has a very large player base that the X1950 XT couldn't handle. My X1900 XT plays any DX9 game (90% of the time with max settings) with no visible lag, ever. I run at 1680x1050 with full 16xAA. DX9 hardware didn't need an improvement. Yeah, its shadows and dynamic lighting are better, but why do I care when I can see things like Gears of War play on the old-ass ATI Hollywood on the 360?

Please explain the oh-so-high demand for this type of card outside of watching a jump in framerate or a better-looking shadow. Also, please explain the difference between 16xAA and 12xAA, because for a typical gamer (the largest portion of the buyers' market) it really doesn't matter. Like I said: minimal capabilities for a lot of cash. Without getting into it too much, X1900 XTs were being linked together to create supercomputer cores for Stanford not too long ago; they push out 0.5 TFLOPS. Trust me, that's not outdated, nor am I a n00b. So the delays in ATI's product release really don't matter in the big scheme of things. I hope you start reading and thinking more before you speak. Consider this: "not everyone is a computer enthusiast."
 
Does the delay have anything to do with Samsung's announcement that they've managed to produce GDDR4 memory clocked at 4 GHz? The announcement about the R600 delay came 1-2 days after Samsung's announcement.
:?:
 
...I like what you're getting at, but it's not going to happen. AMD will probably have three refreshes: one where they just tweak the chit out of their current hardware, then they upgrade the hardware, then they tweak that one. I think the second refresh is when they'll go for uber-high-speed GDDR4, although I do think the first-gen cards will have some nice GDDR4.
 
Wow, AMD/ATI lost my business. I am buying 2 OC GTS cards - I am not waiting any more!

AMD/ATI sux!

IFB anti is always #2 (except for IGC)

I wonder if they can't pull off 250 watts in a single slot (dual-slot card). That's a lot of heat to dissipate - I bet they're overheating the mobos?
 
It played awesome, but that was at 10x7 = 150 fps; now at 16x10 it's 65 fps.

Uhh... just noticed you were running SLI, so never mind. LOL.

That was with a single card (on the in-game performance test); SLI was around 200 fps.
Not quite double the performance like some people think. :lol:
I did like SLI, but the heat was too much, and I had AC Freezer coolers for both cards, but they wouldn't fit in my case.

Bro, I'm sorry, but I'm gonna have to call you out on this one. Here is a screenie of FEAR with my 8800 GTX overclocked to 650 core and 2030 memory, along with an E6400 @ 3.5 GHz and 4 GB of DDR2-1000 RAM at 4-4-4-5.

The last time I checked, an 8800 GTX will lay the smack down on SLI'd 6800 GTs, let alone an overclocked one like mine.


Settings are 1680x1050 with all in-game options at max, 4xAA, and 16xAF.


[Screenshot: 8AA.jpg]

Unless you are running fairly low settings, there is just no way.

That's with soft shadows, right? With soft shadows I'm getting close to those numbers, although I haven't overclocked my card yet.
 
I believe this is about driver problems with Vista and the amount of power the GPU needs. Maybe even performance issues.

The downside is that fast GPUs like the 8800 GTX will stay at their high prices (even though it's still hardly available here :/).

If the performance is not significantly superior to the 8800 GTX, AMD would have a hard time selling that toaster.

I'm voting for Cool'n'Quiet for GPUs!
 
[Screenshot: 8AA.jpg]

That's with soft shadows, right? With soft shadows I'm getting close to those numbers, although I haven't overclocked my card yet.

Actually, I screwed up: that's with 16xAA+MS and 16xAF. Even with only 4xAA the difference is only 1-2 fps in FEAR. :wink:
I get some strange numbers with FEAR, such as higher frame rates at max settings than at mid settings. Is it well optimized, or buggy?
 
I'm voting for Cool'n'Quiet for GPUs!

Hear, hear! Maybe in another six months, when both companies release their mid-range GPUs (8600, X2600). My G4Ti is currently passively cooled and there's no way I'm going back to an active cooler.

P.S. Yes, I know my G4Ti sucks ass, but hell, man, I can't afford a new PC at the moment, so I'm stuck playing System Shock 2, Half-Life and Empire Earth (no bad thing, though).
 
Well, just got back from my ski trip today, took a nap, watched 'A Scanner Darkly', and now back to you folks. :twisted:

I run at 1680x1050 with full 16xAA.

Just an additional BTW, on top of Rob already pointing out that it isn't possible on one card: even CrossFire AA only goes up to 14x. Ah, the difference between AA and AF: subtle in acronyms, but hugely important in visuals and performance.

DX9 hardware didn't need an improvement.

Sure it did. ATi didn't have sub-pixel precision or vertex texturing (without R2VB), and nV didn't have FP16 HDR with MSAA in hardware. And the G80 improved on AF levels. All nice features to have in one place.

but why do I care when I can see things like Gears of War play on the old-ass ATI Hollywood on the 360?

Well, n00b, when discussing things in a specific graphics forum, it's good to get your hardware right before you complain about it. Hollywood is the Wii; Xenos is the X360. And the Xenos has a lot of features that the X1K series doesn't (as well as many features that unfortunately aren't used, as Rob would be more than happy to point out, I'm sure :twisted: ). Even then, Gears of War is nice, but it's upscaled to reach the resolutions achieved natively by desktop VPUs.

Please explain the oh-so-high demand for this type of card outside of watching a jump in framerate or a better-looking shadow.

That's like asking someone to explain the demand for the Mustang Shelby GT500 or Corvette Z06 beyond higher speed and better handling. :roll:

Like I said: minimal capabilities for a lot of cash.

So? What was the big difference between the GF6800 and GF7950? Less of a technological jump than this, and just as much cash.

Without getting into it too much, X1900 XTs were being linked together to create supercomputer cores for Stanford not too long ago.

And they're doing the same with the G80, and even more thanks to CUDA. No doubt the R600 will continue that trend, but the G80 is no slouch there either. GPGPU work spills over into the game physics realm; both ATi and nV (and Intel) are very interested in pushing further into GPGPU territory, and right now the G80 can do more.
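
To give a concrete feel for what "GPGPU through CUDA" actually looks like, here's a minimal, purely illustrative sketch (not code from Folding@home or any real physics engine): a kernel that adds two large arrays in parallel, one element per GPU thread, which is the kind of data-parallel grunt work these cards are built to chew through.

```cuda
#include <stdio.h>
#include <cuda_runtime.h>

/* Each GPU thread handles exactly one array element. */
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;                  /* one million elements */
    const size_t bytes = n * sizeof(float);

    /* Host-side buffers. */
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    /* Device-side buffers, plus the copies to and from the card. */
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    /* Launch enough 256-thread blocks to cover all n elements. */
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", hc[0]);           /* expect 3.0 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

The Stanford client and any physics middleware obviously do far more than this, but structurally it's the same idea: thousands of lightweight threads running the same math over different data.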

consider this:"not everyone is a computer enthusiast"

But consider these two more important things:

A) Everyone buying the G80 is a computer enthusiast (or at least a gaming enthusiast), and more importantly...

B) Everyone reading and replying to this forum is a computer enthusiast, and therefore the arguments you've provided don't hurt people's perception of the G80 so much as they hurt people's perception of you.
 
Personally, I want to upgrade from my 7800GT soon, but I'm worried about power consumption in regard to the 8800 series. I have a 550W power supply, though the rails are dubious and I'm already powering two hard drives and two optical drives on top of the regular stuff (sound, video, card reader, tv tuner, dual core processor). I'm sort of holding out on my purchase in the hopes that the 8900 or R600 will be more power-friendly because changing power supplies would be a real pain, not to mention more power draining.
 
And then this round, if rumours are true, the tech advantage will go back to the R600 with its DX10.1 support; however, they may have issues with the performance crown if that's the reason for the delay.

Did I understand you correctly that the R600 is going to be DX10.1 compliant? How come? DX10.1 isn't supposed to come out until late this year, if not next year.
 
And then this round, if rumours are true, the tech advantage will go back to the R600 with its DX10.1 support; however, they may have issues with the performance crown if that's the reason for the delay.

Did I understand you correctly that the R600 is going to be DX10.1 compliant? How come? DX10.1 isn't supposed to come out until late this year, if not next year.

Because at this rate the R600 won't come out till next year either... 😉
 
Did I understand you correctly that the R600 is going to be DX10.1 compliant? How come? DX10.1 isn't supposed to come out until late this year, if not next year.

Well, for the same reason the GF6800 came out with SM3.0/DX9.0c compliance well in advance of actual full support in software: when they were designing the core they didn't know the timetable of the releases, but that was the level of support they targeted.

ATi and nV (and Intel et al.) are all given advance warning about the future requirements of DX versions and help M$ mould and develop those specs, especially the 'required' vs 'optional' features, before they're provided to developers. There's no point in requiring a spec that is nearly impossible for the IHVs to support in hardware; that'd be counterproductive for everyone. So both ATi and nV aim for certain feature sets, and sometimes the software support runs ahead of or behind the hardware. Technically DX10 came out before the G80, in Vista RTM and the corporate/business editions, but its real general release came after, with the consumer editions of Vista.

We'll see. I don't see the difference mattering much, but then again we really have no point of reference yet, because we're only just getting DX10 demos now, and we have little way of telling what the choke points are in current hardware.
 
Yeah, EB's article is a good one. The thing is, multi-core won't make a difference for either the CPU or VPU situation, since the R600 and G80 are similar in those respects.

so I guess it's safe to buy an 8800 now.

Oh, most definitely, but for GF8800 buyers it shouldn't matter much, because if something better comes out they'll buy that too.

Don't keep waiting forever; buy something that looks good, and the GF8800GTS-320 looks damn fine!

Interesting comment from the M$ FlightSim X developers on DX10, posted yesterday, about delaying the SP1 that adds DX10 to FSX:

http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimized-for-vista-does-not-mean-dx10.aspx

"Given the state of the NV drivers for the G80 and that ATI hasn’t released their hw yet; it’s hard to see how this is really a bad plan. We really want to see final ATI hw and production quality NV and ATI drivers before we ship our DX10 support. Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."
 
Damn, that put a wrench in my gearbox. :?

I hope that this is not a 9700/5900 story all over again. I'm trying to stagger out my spending so as not to make it hurt so much, because here in Malta things are really expensive. For example, that BFG 8800 GTS 320MB OC retails at $450, and that's the cheapest GTS I could find. The GTX is at $900. And to make it worse, I can't order online, because with postage and packaging from England things come out costing the same. This country really sucks about these things; on average, things are 50% higher than in England and 100% higher than in the States. I really, really hate this place. :evil:
 
Damn, that put a wrench in my gearbox. :?

I hope that this is not a 9700/5900 story all over again. I'm trying to stagger out my spending so as not to make it hurt so much, because here in Malta things are really expensive. For example, that BFG 8800 GTS 320MB OC retails at $450, and that's the cheapest GTS I could find. The GTX is at $900. And to make it worse, I can't order online, because with postage and packaging from England things come out costing the same. This country really sucks about these things; on average, things are 50% higher than in England and 100% higher than in the States. I really, really hate this place. :evil:
Can you buy in a foreign country? People in Thailand vacation in Singapore, for example (better and cheaper PC stuff).
You may need to travel a bit to save. Do your big spending and try to save the most while you're away.
 
Damn, that put a wrench in my gearbox. :?

I hope that this is not a 9700/5900 story all over again.

No, no, I wouldn't think it's anywhere near the same. If anything it's like the difference between the GF7900 and the X1900. Remember, it's geometry shaders, which are neat for things like grass and boulders but won't be the killer feature for a while, and the G80 does support them, just supposedly not as fast. Likely the performance differences are like the shader-power differences between the X1900 and GF7900: sure they exist, sure they're exploitable, but they are often held back by other portions of the chain, like the ROPs, etc.

I'm trying to stagger out my spending so as not to make it hurt so much, because here in Malta things are really expensive. For example, that BFG 8800 GTS 320MB OC retails at $450, and that's the cheapest GTS I could find. The GTX is at $900... I really, really hate this place. :evil:

Yeah, I know, and the thing is, if it's a once-in-a-long-while purchase for you, maybe wait; but really I would say the GF8800GTS-320 is a damn fine card. Even if the GTS is slightly slower, as that blog suggests, the developers will take that into account and program for it. I suspect you will barely notice the difference for quite some time. Maybe in some situations, say a game with grass like Oblivion but using DX10 shaders, you might see a slight difference in rendering those scenes, but like Oblivion it'll probably let you turn those features down.

I wouldn't worry too much about it, like I said the support differences will be minor at best, and they likely won't be exploited fully for some time. I could be wrong, but I doubt it.

The main thing is: why are you upgrading now? If you have a competent card for your current games, then maybe wait; but if you're doing a new build, then buy the GTS-320 now and be secure in the fact that it'll handle all games thrown at it for the next year very well, regardless of what else comes out. And if you really want something else sooner, re-sell the GTS and then buy the best you can at that point. I'd wait until games ship before worrying about any differences.
 
Personally, I want to upgrade from my 7800GT soon, but I'm worried about power consumption in regard to the 8800 series. I have a 550W power supply, though the rails are dubious and I'm already powering two hard drives and two optical drives on top of the regular stuff (sound, video, card reader, tv tuner, dual core processor). I'm sort of holding out on my purchase in the hopes that the 8900 or R600 will be more power-friendly because changing power supplies would be a real pain, not to mention more power draining.

That depends... what brand do you have? If it's a generic PSU, then I'd say probably not, but if it's a decent brand like PC Power and Cooling, OCZ, Enermax, Antec... brands like that, then yes. I have an Enermax 535 W PSU and I'm going to buy an 8900 GTX, and I'll have enough power. Just use this PSU calculator to see (there's a rough sketch of the same math at the end of this post): http://extreme.outervision.com/psucalculator.jsp

This is just for the 8800 GTX or 8900 GTX. If you try to get the R600, you'll definitely need a new PSU... something no less than 700 W to be safe, and preferably more. An 850 is a good place to start, but if you can afford it, get a 1 kW unit. Of course, you don't want to upgrade your PSU (neither do I), so if you buy either a G80 or G81 you'll be fine with what you have, as long as it's not a generic brand.
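
For anyone who just wants the back-of-the-envelope version of what that calculator does, here's a rough sketch. All the wattage figures below are ballpark assumptions for a system like the one described above, not measured numbers; plug in values from reviews or spec sheets for your actual parts.

```c
#include <stdio.h>

int main(void)
{
    /* Assumed, ballpark component draws in watts (not measured figures). */
    int cpu       = 90;      /* dual-core CPU under load                  */
    int gpu       = 180;     /* high-end DX10 card in the 8800 GTX class  */
    int drives    = 4 * 15;  /* two hard drives + two optical drives      */
    int mobo_misc = 60;      /* board, RAM, fans, sound, TV tuner, etc.   */

    int load = cpu + gpu + drives + mobo_misc;

    /* Leave roughly 30% headroom so the PSU isn't running flat out. */
    int recommended = load * 130 / 100;

    printf("Estimated system load: ~%d W\n", load);
    printf("Recommended PSU size:  ~%d W\n", recommended);

    /* Most of the GPU's draw comes off the +12V rail(s): amps = watts / 12.
       This is why a single weak 12V rail can choke a modern card. */
    printf("GPU 12V draw: ~%d A\n", gpu / 12);
    return 0;
}
```

On those assumed numbers it works out to roughly a 500 W quality unit for a single G80-class card, which is in the same ballpark as the advice in the next post.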
 
The retail R600 will likely be within about 50 W of the 8800 GTX, and likely closer to the 8900 GTX. So a similarly competent PC P&C unit will be more than fine for either solution.

Neither will need more than a solid 600 W PSU, unless CrossFire is used or a honkin' CPU setup with a multi-drive RAID array, etc., starts factoring in and drawing too much on 12V. The other thing to consider is ensuring the rails can handle enough on their own, which is another reason quality current PSUs are good; older PSUs spread the burden too evenly and didn't supply enough power on a single 12V rail.

However, remember that the hotter a system runs inside the case, the less efficient the PSU gets as it warms up. Good airflow helps a lot.