InQ's got a blurb about the next ATI chip, and they call it the RV480 but also mention the R480. I thought the RV line was for midrange and value cards.

<A HREF="http://www.theinquirer.net/?article=16906" target="_new">http://www.theinquirer.net/?article=16906</A>

Also note the mention of the possibility of SM3.0 in the new chip, but of course, like everything leading up to May, this is simply speculation. Really I think it will depend heavily on the response to DX9.0c and whether any applications/demos which can now expose SM3.0 features show any tangible benefit short term. Long term is one thing, but if it becomes a 'must have right now' feature, there may be a situation where the X800 becomes this generation's FX to the GF6800. Of course, there's still nothing to indicate that yet. But it's nice to think that before the new year is up we'll have a new core, even if it is only a refresh.

Also a 0.11µ core, now that would be interesting, and by then they should have some experience with the RV370. 0.11 Low-K/D would be pretty sweet.

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
<P ID="edit"><FONT SIZE=-1><EM>Edited by TheGreatGrapeApe on 06/29/04 05:35 PM.</EM></FONT></P>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
omg... you know what, I've had it. I'm tired of video cards releasing every 25 seconds. That's it! I'm going onboard video now; at least it'll last a lot longer on the market and won't get replaced in 8 months

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
Me, too. Especially for the low end sector.

ATI have released nothing new or better for the low end within the last 2 years. Instead, they crippled their low end lineup with 64-bit memory junk

------------
<A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A>

<A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig</A> & <A HREF="http://geocities.com/spitfire_x86/benchmark.html" target="_new">3DMark score</A>
 
ATI have released nothing new or better for the low end within the last 2 years
The X300 is out (of course in limited release, like everything else lately), and it is definitely better than the R9200, even if only by a small amount.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
WTF? Scamtron edited ur post?!?!?

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

CaPtAiN_InSaNo

Distinguished
Feb 25, 2004
296
0
18,780
They are and they aren't; they have taken too long to release this newer generation, but it doesn't really matter whether the new R480 has SM3.0 or not. The funny part is that most of the flame wars over SM3.0 claim it is useless, but according to this, ATI doesn't think so. Oh well, my 6800 GT will still last the next couple of years, and it won't be totally taken out of the high end sector until the next generation comes out in 2006~2007.

P4 2.6c@3.1, OCZ DDR533, Abit IC7-Max3
(my vid is shameful, a 3D Prophet that can't run games, lol)
 
the funny part is that most of the flame wars over SM3.0 claim it is useless
Actually, the biggest issue wasn't whether it is useless, but whether it will have a big impact or not in this generation; according to this, the X800 generation may actually be shorter than we thought, which also changes the picture. It's like 3Dc: it's not useless, but exactly how much difference will it make, and if the impact is minor and only in a few titles, is it worth any possible performance penalties? That was the crux of the issue, as it's not a determined fact, but future speculation. Anyone basing their buying decisions solely on something that is unproven (rather than on the test of time / a known commodity) is an idiot. If you are already looking in one direction and there is an added feature, hey, that's nice icing on the cake. That's the way to look at it.

Anyone buying for future features/games would do best to buy a last generation card, save their money, then sell that last generation card when the features are actually exploited in a game you like to play, and buy the next gen card cheaper. As an example, let's say (just theoretically, don't jump on the example) the XGI had PS4.0 support, yet it played games the way it currently does. Would that support for something not yet exploited, nor offering MUCH benefit in its lifecycle, be worth the performance penalty? I doubt it. The nice thing is that the performances this time are close, so risking +/- 5% is worth the potential feature reward.
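
To put rough numbers on that wait-and-sell idea, here's a minimal sketch; every price below is made up purely for illustration, not quoted from this thread or any real deal.

```python
# Purely hypothetical prices, just to illustrate the buy-later math above;
# none of these numbers come from the thread.

buy_new_gen_now = 500                 # pay full price today for features you can't use yet

last_gen_now   = 300                  # cheaper card that runs today's games fine
resale_later   = 150                  # what you recover once the killer app actually ships
next_gen_later = 350                  # next-gen card after its own price drop
wait_and_swap  = last_gen_now - resale_later + next_gen_later

print(f"Buy the new gen now: ${buy_new_gen_now}")
print(f"Wait-and-swap route: ${wait_and_swap}")  # same $500 here, but you end up on a newer card
```

With those assumed prices the two routes cost the same, except the second one leaves you holding the newer card when the features finally matter; that's the whole point of the argument.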

Oh well, my 6800 GT will still last the next couple of years, and it won't be totally taken out of the high end sector until the next generation comes out in 2006~2007.
The same goes for the X800, even. Sure, it has fewer features, but it will still be playable well beyond the Unreal Engine 3 games. And as for your card being taken out, nothing has proved one way or the other that the GF6800 can use these features as they will be implemented. How do you know it doesn't wind up like an FX5600: nice that it has PS2.0 support, but without enough power to use it? That may change with DX9.0c, but I wouldn't hold my breath. Right now the more stressful features may be supported, but may not actually be functionally useful. I have a feeling that anyone willing to fork over $500-600 for either card right now won't care in a year's time about longevity so much as about the NV50/R500 having been released and outclassing their card by 50%, plus having more features. If those killer apps don't ship soon enough, then the impact will be minimal, and the next gen cards will simply put these ones to shame regardless of features.

It's just the natural order of things: always better than the last (although the FX5600 wasn't really an improvement over the GF4 Ti4200). Personally I think the rate of change is too slow, but the important thing is that for the manufacturers, THEY think it's too fast.

I don't understand the concept of 'too fast' as long as products aren't rushed and messed up like some have been in the past. New hardware drives down the price of old hardware, and new additions to the middle and lower classes create new options. The buyers of the top cards will always want faster, and that helps drive the development process too.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

entium

Distinguished
May 23, 2004
961
0
18,980
Well, features such as 3Dc are really nice; ATi really does make nice technology, but they don't push it to be adopted :/. SM3.0 is a bit different; it will be adopted because it's the next wave of special effects for both DX and OGL (the next revision of GLSL is going to support it, from what I've heard).

As fast as a card is, would someone want to spend 300-500 bucks on a card that will only last 6 months?

Someone has to push the envelope for new technology. ATi did it with the 9700 last round, and that card lasted a good 2 years and is still usable. That's the nice thing about the nV 6800 GT and Ultra: they will be usable. Take a look at the graphics of the Unreal 3 tech demo; it's not the graphics that's slowing it down to 30 fps, as much as they want to say that, it's the CPU. Playing Unreal 2004 on a 3800+ with a GF6 GT, Ultra, X800 Pro, or XT, I don't get higher than 60 fps at the highest resolutions. On a 2.5 GHz chip it's at 40, and the Unreal 3 tech demo at E3 was on a 3.4 and it was going at 40. Unreal 3's physics are a lot more intensive than UT2004's.

My speculation is that Unreal 3's technology is majorly CPU limited. I think they mentioned in one of the first released videos that the engine can handle 20 characters per screen on today's technology (the GF6800).

So this kinda supports the idea that it is CPU limited.
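
For anyone who wants to check that on their own rig, here's a rough sketch of the usual resolution-scaling test; the function name and the FPS figures are placeholders I made up, not real benchmark data.

```python
# Rough sketch of a resolution-scaling test: if average FPS barely moves when
# you drop the resolution, the CPU (not the GPU) is the bottleneck.
# The numbers below are made-up placeholders, not real benchmark results.

def bottleneck_guess(fps_by_resolution, tolerance=0.10):
    """fps_by_resolution maps 'WxH' -> average FPS measured at that resolution."""
    fps = list(fps_by_resolution.values())
    spread = (max(fps) - min(fps)) / max(fps)   # relative change across resolutions
    return "CPU-limited" if spread <= tolerance else "GPU-limited"

# Nearly flat FPS across resolutions points at the CPU, as argued above.
print(bottleneck_guess({"1600x1200": 58, "1280x1024": 60, "1024x768": 61}))
```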



That's true, the performance is close, but at the end of it all, the next revision is just a speed boost; then in a year a new architecture with more speed, but not too much. At most you will see a 50% increase, which really won't be utilized until CPUs get fast enough that the GPU is the bottleneck again.
 
Hah-hem, notice the edit time/date at all? :lol:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
The thing is, while SM3.0's lifespan is X + Y years, it's still a question of when and how it's adopted within the lifespan of the cards.

3Dc DEFINITELY has a more finite lifespan, and heck we may see squat from it, but it's equally as 'unknown'/unproven right now.

The impact of either may simply be little more than a check-box feature, and may not provide significant impact within the next 6 months (by which time the next refresh arrives). That's the main point: if nothing comes out as a killer app, then there's no driving force. D]|[ on its own and without equivalent bells and whistles may have more impact on the GF6800 than any 'SM3.0 game' over the next year. That seems more realistic to me.

Of course, that's still a guess until the actual game hits the test benches, and even more importantly the stores, in a FINAL format. And while it may bring little more speed to the table than a GF6800 running equal with an X800XT, that will be enough to be a big buying-decision advantage to many who've waited a long time for this particular title, and for anyone who thinks that this will be a very influential game for future titles. That impact, to me, is more important than something that may only offer shinier water/bumpier surfaces and a few FPS in games that have much, MUCH less influence.

Likely ONE single 'killer app' will make or break the 'winner' of this round, not the check-box/bullet-point features.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
Where are the X300 benchmarks?

------------
<A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A>

<A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig</A> & <A HREF="http://geocities.com/spitfire_x86/benchmark.html" target="_new">3DMark score</A>
 
Where are the X300 benchmarks?
There have been three posted by me here.

Check Extremetech's and Sudhain's GMA900 reviews; both included X300 results. It appears Sudhain included the X300SE results, since Extremetech's results are far better.
DH (DriverHeaven) included the whole ATI line.

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
