Could it really have happened? Is it true?

i love ATI myself; if ATI was a warm pie i'd drop everything and have sex with it. But i still recommended the 4200 over the 8500 to my friend because it IS better, god damnit, not the same. And whoever thinks the 9700 is only a LITTLE bit better than the 4600 is a stupid face.
 
bear in mind that having the fastest GPU is nice, but not essential for business; the real money is in the Ti4200 range. Top cards like the 9700 and the top GF-FX aren't going to hold a company afloat... so i think Nvidia are still doing pretty well, to be honest. But let's wait for cards to appear, with benchmarks, before we put too much effort into discussing something we know very little about.
 
there is no way in hell ATI is driving Nvidia out of business right now with the 9700 Pro. All they may have done is take the top performance crown away; that does not mean Nvidia is going bankrupt. Just look at the growth of Nvidia as a company compared to ATI, plus all the exclusives with Dell etc. The high-end graphics card market is not where the profits lie; i bet that even now the Ti4200 sells way more than the 9700 Pro or the 9500 Pro. It's not about top performance when it comes to staying in business, it's about money. Hell, Nvidia has had a slight edge in performance for a couple of years and they have not come close to putting out ATI. Nvidia only drove 3dfx out by undercutting them on price, not by vastly superior performance.

:tongue: If it ain't broke, overclock it.
 
I'm sorry, I seem to be the only one who is irritated by davepermen's writing style. I was referring to him, if you want to know.

Greetz,
Bikeman

Then again, that's just my opinion
 
hm. you mean i'm negative?
i'm not negative about the nv30.
i'm negative about:
the way nvidia tries to make everyone believe it is the solution, while it is still far from being available and no real useful info is out.
the way nvidia implements additional features over the r300 (there is only a small number of additional features, but they hype them in an awesome way), features that are incompatible with anything else, which makes it really difficult for developers to support both in an optimal way (so nvidia gets support as they sell more, and ati, making an equally good card, gets a kick in the ass just because they did not sell as many cards..)
nvidia tries to get developers to work for them and only for them by making their stuff more and more completely incompatible with anything else. it's cool if you want to work for nvidia only, then you get great tools. but i, and very many others, don't want to. that's like coding only for amd, or only for intel => stupid. you as a developer either lose possible market (if you code for radeon, you can easily port to mac, for example..), or have to do the work twice. _ALL_ other vendors are more or less compatible with each other. only nvidia is _WAY_ offroad there.
the numbers suggest, as much discussed, that the card is _NOT_THAT_MUCH_FASTER_ than its specs would imply. it sounds more like a p4 clocked at 2 or 3 gigahertz but still at the same speed as an athlonxp at 1.5 to 1.8ghz. they say they use powerful knowledge, clever techniques. no, it's brute force. bad for overclocking. bad for advancing. etc.

the 350m vertices shown against the 325m vertices already show that raw clock speed isn't everything anymore.

about gpus getting more and more like processors:
a reciprocal square root is slower than a multiplication on my radeon9700 in the pixelshader. in short: instructions don't all have the same speed (else they would all have to run at the lowest common denominator, or so.. well, slower at least).
so yes, you can't listen to the numbers anymore, just as you can't with processors. benchmarks are the only thing that counts from now on.
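to show what i mean with "benchmarks are the only thing that counts": here's a tiny cpu-side sketch in plain c (my own toy example, nothing to do with real gpu code) that just times a multiply loop against a reciprocal-square-root loop. same idea: you measure, you don't read the spec sheet.

/* toy cpu-side microbenchmark: multiply vs reciprocal square root.
 * only an analogy for per-instruction cost differences on a gpu,
 * not shader code. */
#include <math.h>
#include <stdio.h>
#include <time.h>

#define N 10000000

int main(void)
{
    volatile float acc = 1.0001f;   /* volatile so the loops aren't optimized away */
    clock_t t0, t1;
    long i;

    t0 = clock();
    for (i = 0; i < N; i++)
        acc = acc * 1.0000001f;          /* plain multiply */
    t1 = clock();
    printf("mul loop: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (i = 0; i < N; i++)
        acc = 1.0f / sqrtf(acc + 1.0f);  /* reciprocal square root */
    t1 = clock();
    printf("rsq loop: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);

    return 0;
}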

the hairdryerFX is still a powerful card, no one can say it isn't (from the numbers).
it's just not a great new step imho. so i'm quite disappointed to see nvidia did it this way. and their marketing methods are really ugly, blinding..
whoever has to make a huge show about something (and nvidia does make an immense show about something which will not be available to end users in any real quantity for months), there is something in the air. for me, that show, that announcement, was all just to diss ati out of the leader position and make people believe in nvidia again.

ati is still the leader. their card is capable of nearly everything the nv30 can do (from the specs, from a dev point of view, it's equal), and it looks like it performs nearly as well (depending on the situation, of course).

so what?

i am disappointed in nvidia, i don't like how i had to work with them, how they act, and what their future plans are. that's why i diss them.

"take a look around" - limp bizkit

www.google.com
 
Implementing those extra "features" just for marketing can't be the answer. It costs (I'm assuming) to make a chip that goes beyond the current specs; they wouldn't do that just for the marketing value of it (I doubt they're that sneaky :smile: ).

I'm starting to feel like a real computer consultant.
 
That's what I am thinking.
I doubt these added features are as weak as dave claims.
Cg has got to be more than just a few added surprises to make the DirectX 9 newcomer worth it.

However, I agree with dave's anger about nVidia's card specs. No really, WHY, just WHY are we moving to a "clock speed isn't everything" CPU-style world?
GPUs were one in a million because they had impressive per-clock performance. For nVidia to go and waste 0.13 micron on a core clocked 175MHz higher just to compete at maybe 10% faster is horrible, and an insult!

However, some things I believe should technically rock: they have 8 pixel shaders, one on each pipe (previously no company would say how many pixel shader units there were), and 8 vertex shaders as well, if I am not mistaken. That should be sufficiently powerful, if not nearly twice the R300.

--
*You can do anything you set your mind to man. -Eminem
 
Before I read his reply just now, I was also feeling irritated, as if he was using what he knows to prove that nVidia's new features are crap.
But his reply has now changed my opinion a bit, as he has a point. However, I am still not with him on the features side; I do think they are more than just the small things he claims them to be.

--
*You can do anything you set your mind to man. -Eminem
 
While the 9700 Pro is the performance king right now, it won't give ATI more market share. Neither will Nvidia's FX. No top-end card ever has or ever will. The only good thing is that the technology of top-tier cards trickles down to the mid and low range cards, which Nvidia dominates right now. However, the 9700, 9500 Pro and 9500 give ATI an excellent array of weapons with which to battle Nvidia's GF4 Ti series.

After reading a number of reviews of the 9700 and 9500 Pro, it was clear to me that the 9500 Pro is on par with the Ti4400 and 4600 and kicks the 4200's ass at a great price of 199, maybe lower, while the 9700 clearly beats them all at a still reasonable price of 300. I think it will be about 5-6 months before Nvidia can get a grip back on the midrange market, and by then ATI may have captured that cash-cow segment. Basically, I think the real battle will be fought in the midrange, and no matter who wins the "performance king" crown, Nvidia or ATI, I think Nvidia's delay of the FX has given ATI the opportunity to take the market share crown for the first half of 2003, maybe longer depending on how fast Nvidia gets its FX-generation midrange cards to market. I returned my Ti4200 yesterday, after I read up on the 9500 Pro, which I plan to purchase very soon. ATI got my dollar, and will probably get many more very soon.

Jsf
 
Thanks guys, this has really turned into a positive discussion, and thank you dave for not lashing back at the guy who was quite rude to you. There is no need for that!

Now, on the topic of market share.... Granted, a top-end card ITSELF is not the money maker or what gives you market share... at least not directly.

but... taking the crown affects how your entire company is viewed as a whole. So if ATI maintains a reputation for having the most powerful card available, it is only a matter of time until that reputation trickles down to the average user and they start buying ATI cards in the midrange.

"There is no dark or light side, only power and those too weak to seek it."
 
I am hoping that ATI will gain enough market share that they become as well known as NVidia in the graphics card market. If there are two names recognized by everyone, the competition increases exponentially and it makes a better market for the buyer. For myself, I won't be buying a computer for another six months, but when I do I hope that NVidia and ATI have good midrange cards out to choose from. Right now I would buy the 9500 Pro, but in six months who knows what the best midrange card will be?

My other personality is schizophrenic.
 
okay. about the features that nvidia does have.

they have:
real branching in vertexshaders.
that means you can have loops, branches, and conditionals depending on vertex data. a vertex can for example store a value, true or false, for.. uhm.. "move me 2 units to the left" (stupid example😀), and then you can, depending on that value, move it 2 units to the left.
you can't do that on the ati. there you can only set some constants for the vertexshader, which determine such a thing. that means stuff per object: set a flag that determines if you want to move your _object_ (the whole enemy, powerup, wall, whatever) 2 units to the left.

now nvidia's thing _is_indeed_ more powerful. but it means you have to fully redesign what you do in a vertex program, what you do on the cpu, and everything. vertex programs are mainly there for animation, called vertex skinning. and there, the whole design to use the nvidia features means you have to recode everything. look at ut2003 and you see skinning in action; the animations are generated in realtime and do indeed look great. but they could implement them the same way for all gpus! now for nvidia they would have to redesign. and this for nvidia's card only. in the case of ut2003 that will happen, as nvidia sponsored them quite a huge bit to do so. but in normal situations, it depends.
the real vertexshader design (which includes branching and all) has even more features (like per-vertex textures for stuff like programmable displacement mapping, but many other usages are possible) that are _not_ implemented here. those, and not nvidia's thingy, will be the next standard (vs3.0 in dx9) vertex shader/program. so why work for one gpu only? it's quite a bunch of work..
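to make that concrete, here's a rough cpu-side sketch in plain c (not real vertex-program source, the names are made up by me) of the difference between branching on per-vertex data and only having a per-object constant:

/* toy example of per-vertex branching vs. per-object constants,
 * using the silly "move me 2 units to the left" case from above. */
#include <stdio.h>

typedef struct {
    float x, y, z;
    int move_left;          /* per-vertex flag stored in the vertex data */
} Vertex;

/* nv30-style idea: the program can branch on data carried by each vertex,
 * so one draw call handles mixed vertices. */
static void shade_with_vertex_branch(Vertex *v, int count)
{
    for (int i = 0; i < count; i++)
        if (v[i].move_left)         /* per-vertex conditional */
            v[i].x -= 2.0f;
}

/* standard dx9/r300-style idea: no per-vertex branching, only a constant
 * set once per object, so the decision applies to the whole draw call. */
static void shade_with_object_constant(Vertex *v, int count, int move_left_constant)
{
    for (int i = 0; i < count; i++)
        if (move_left_constant)     /* same answer for every vertex */
            v[i].x -= 2.0f;
}

int main(void)
{
    Vertex mesh[3] = { {0, 0, 0, 1}, {1, 0, 0, 0}, {2, 0, 0, 1} };

    shade_with_vertex_branch(mesh, 3);
    for (int i = 0; i < 3; i++)
        printf("vertex %d: x = %.1f\n", i, mesh[i].x);

    /* to get the same per-vertex result with only object constants, the mesh
     * would have to be split into separate draw calls per flag value --
     * which is exactly the redesign work i'm talking about. */
    (void)shade_with_object_constant;
    return 0;
}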


pixel shaders:
nvidia has 1024 instructions, ati has.. 96 or so (64 math and 32 texture instructions max). that means much bigger pixel shading programs. on the other hand, there is no branching or anything in there, which would, guess what? be the next standard after ati's. now again, you have to code completely new pixelshaders for nvidia's card, and only for this one. ask carmack how easy it is to work with different pixelshader versions, and he will tell you (he develops doom3, for all who don't know). in short: it's crap. why? because you can't just say, look, my pixelshader has 200 instructions, so i have to split it into 3 parts on the ati. you can't just split; you have to redesign your data setup and everything to do those 3 passes. that leads to a different design on dx9 cards than on the gfFX. for a gain of what? supporting one lonely gpu. and the additional pixelshader instructions are often not even used (looking at the output of the cg compiler currently: it compiled a simple per-pixel lighting program to 61 instructions! i have the same thing in 32 instructions.. so what? 😀).
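and to show why "just split it into 3 parts" isn't free, here's a little cpu-side sketch in plain c (toy math i made up, not real shader code): the same calculation once as one long "shader" and once split into three passes, where each pass needs a whole intermediate buffer (on the card that would be a render target you have to set up, render to, and read back as a texture).

/* one long "shader" vs. the same math split into three passes
 * with intermediate buffers between them. toy math only. */
#include <stdio.h>

#define W 4
#define H 4

/* single long version: fits on a chip with a big instruction limit */
static float shade_one_pass(float x)
{
    float a = x * x + 0.5f;         /* step 1 */
    float b = a * 0.25f + x;        /* step 2 */
    return b * b - a;               /* step 3 */
}

int main(void)
{
    float input[W * H], pass1[W * H], pass2[W * H], out[W * H];

    for (int i = 0; i < W * H; i++)
        input[i] = (float)i / (W * H);

    /* multipass version: each pass writes a full intermediate buffer
     * that the next pass has to read back */
    for (int i = 0; i < W * H; i++)
        pass1[i] = input[i] * input[i] + 0.5f;
    for (int i = 0; i < W * H; i++)
        pass2[i] = pass1[i] * 0.25f + input[i];
    for (int i = 0; i < W * H; i++)
        out[i] = pass2[i] * pass2[i] - pass1[i];

    /* the two versions agree, but the multipass one needed extra storage
     * and a completely different data flow */
    printf("one pass:  %f\n", shade_one_pass(input[5]));
    printf("multipass: %f\n", out[5]);
    return 0;
}

and on the real card each of those extra passes also costs a full trip through the framebuffer, which is where the performance goes.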

the additional features over standard opengl1.5 and dx9 cards:
from 96 to 1024 pixelshader instructions.
real branching in vertex shaders.

to the next standard:
real branching in pixelshaders.
textures in vertex shaders.

the next real standard will be out on the next card _after_ the nv30, i guess. till then, dx9 is the standard (as there are many different dx9 cards coming, as we all know).

the feature set is ridiculous. it's completely off the standard.

about cg:
it's just a programming language (which i think is not that well designed) to generate vertex and pixel programs. you can run them (partially thanks to the "great" support of nvidia in making it a standard) on ati cards as well. cg was just put out to make their cards (gf3, gf4) look programmable in parts that aren't (gf3 and gf4 have, hardware-wise, no pixelshaders.. it's just an advanced multitexture unit, and most of the features had already been there on matrox and ati cards.. envbump, dot3 lighting, etc.. still, gf3 and gf4 were fun to work with😀). then they used cg to spit out a crappy emulator for the nv30, so that people "can code" for a gpu that doesn't exist yet and don't think they would need to move to ati to do such stuff.

there aren't many features on the card, but they are done wisely: those features make the card look so much more advanced, while it really isn't. and with that, and names like cinematic effects and cg - "c for graphics" everywhere, they look so future-proof. they aren't. it's just another gpu. a hardware- and software-wise not even well-done gpu imho.
as stated above: moving to .13 micron, boosting to 500mhz core and 1ghz ram. should be faster, no? should be over 100% faster, no? well.. it will not be. so it's not a good card.



btw, it's quite possible that microsoft will not implement nvidia's additional features in dx9 but rather in dx9.1 (which will take over half a year to come out). that means for a long time you will not even be _able_ to use the non-standard features of the gfFX in dx. in gl you can, but there it's even more work.

hope that clarifies a bit.

btw, the ati card for its part has real 128bit floating point textures, cubemaps, 3d textures, whatever you want. the gfFX doesn't. that makes demos like the HDR natural lighting one quite a bunch more work to code (meaning eating up many more pixelshader instructions to emulate cubemaps and all that), which in turn just leads to a) bad performance in comparison and b) fewer really usable pixelshader instructions..
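to give an idea what "emulating a cubemap in the pixel shader" means, here's a rough sketch in plain c (simplified, the face orientations and signs aren't any particular api's exact convention) of the per-pixel work a real cubemap unit does for free: pick the dominant axis of the vector, divide the other two components by it, remap to texture coords.

/* toy cubemap face selection + coordinate remap, the kind of arithmetic
 * you'd have to spend pixelshader instructions on if the hardware
 * can't do the lookup itself. */
#include <math.h>
#include <stdio.h>

typedef struct { int face; float s, t; } CubeCoord;

static CubeCoord cubemap_lookup(float x, float y, float z)
{
    CubeCoord c;
    float ax = fabsf(x), ay = fabsf(y), az = fabsf(z);
    float ma, sc, tc;

    if (ax >= ay && ax >= az) {        /* X-major face */
        c.face = (x > 0.0f) ? 0 : 1;
        ma = ax; sc = z; tc = y;
    } else if (ay >= az) {             /* Y-major face */
        c.face = (y > 0.0f) ? 2 : 3;
        ma = ay; sc = x; tc = z;
    } else {                           /* Z-major face */
        c.face = (z > 0.0f) ? 4 : 5;
        ma = az; sc = x; tc = y;
    }
    c.s = 0.5f * (sc / ma + 1.0f);     /* remap [-1,1] -> [0,1] */
    c.t = 0.5f * (tc / ma + 1.0f);
    return c;
}

int main(void)
{
    CubeCoord c = cubemap_lookup(0.3f, -0.9f, 0.2f);
    printf("face %d, s = %.3f, t = %.3f\n", c.face, c.s, c.t);
    return 0;
}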

well, i think you got it now a bit more😀

just believe me: i would never diss something if i did not have a reason. really not.

the only thing i can't diss for now is the real performance. but the theoretical performance, the numbers nvidia has stated and all that, that isn't "wow, fast" for me. and the rest isn't either.

"take a look around" - limp bizkit

www.google.com
 
BTW, can I note that I don't like the total negative attitude that is taken towards this new product.
No

Some like it and some don't, that's life, get used to it. :lol:

and you davepermen, those _things_ you _use_ remind me of people who make those damn "rabbit ears" while speaking; I just want to hurt those people when they do that 5-6 times in less than a minute. I always want to reach out and grab those damn fingers and twist/twist/twist till they say "uncle". LOL :wink:

I'm not in a bad mood; people have trouble understanding when I'm kidding (even when we're face to face).

Got a silent setup, now I can hear myself thinking.... great silence
 
and you davepermen, those _things_ you _use_ remind me of people who make those damn "rabbit ears" while speaking; I just want to hurt those people when they do that 5-6 times in less than a minute. I always want to reach out and grab those damn fingers and twist/twist/twist till they say "uncle". LOL
actually, it's so i don't have to TYPE WITH CAPSLOCK, YOU KNOW?!?!? i just hate capital letters.. _got_it_?
you mean those "rabbit ears"? there is the " symbol for them, ya know? 😀

"take a look around" - limp bizkit

www.google.com
 
"As big a flop as the VooDoos"? I ran one of those "flop" VooDoo5 cards up until Unreal Tournament 2003 was released. It ran everything I threw at it up to that point. I wouldn't exactly say that the card was a "flop"! I'd say it had a pretty good life as video cards go.

UsHe_564

"You can run, but your punk ass will only die tired!"
 
"don't want to see them put nVidia our of business"? I would, especially after what nVidia did to us VooDoo card owners when they helped put 3DFX out of business! The part that really pissed me off was when they took over all of the remaining 3DFX stock, tooling, technology, etc. and then left us VooDoo card owners out in the cold, having to resort to "Hacked" drivers for Windowz XP! I guess they thought I'd scrap my VooDoo cards and "rush out" and buy one of theirs, NOT!!!!! I held out until the very end and only upgraded to an ATI Radeon 8500 AFTER the release of UT2003. I hope the bastards go under! I hope SiS will get their act together on the Xabre400 soon. I bought one of those but it wouldn't run UT2003 well enough to play, sent it back. The 8500 does fine though and I'll continue to use ATI cards until someone else, (except for nVidia) comes up with a better deal.

UsHeR_564

"You can run, but your punk ass will only die tired!"
 
Why are you so angry? It's just business! 3dfx got sloppy and nvidia used that to their advantage. That’s what is happening all over the world every day: company takeovers whether they are hostile or not. I can understand that you are pissed because of the lack of driver support, but do you honestly believe that any company would continue to use a good deal of their workforce to develop drivers for a product that’s reached the end of the road with no way back? I, for sure, don’t. Writing drivers for products they haven’t any interest in (have not made any money on) is just plain stupid when the competition is as tough as it is in the graphics industry. Writing new drivers for old products produced by nvidia itself is an entirely different matter altogether.

By the way, writing stable drivers is a very time consuming process and that is why almost 2/3 of the technical staff at nvidia are software engineers.

I’m not on nvidia’s payroll, I’m just a bit cynical.

Regards
Andreas



<P ID="edit"><FONT SIZE=-1><EM>Edited by AndyK on 11/29/02 03:23 PM.</EM></FONT></P>
 
NVIDIA made the Xbox's GPU, which took months and months of development and resources. That detour slowed NVIDIA's work on the GeForce 4 and allowed "PC only" ATI to take the lead. NVIDIA can't make products for consoles and PCs while also holding the top performance spot. Now that (as far as I know) NVIDIA isn't working on console GPUs, they can focus on the FX and likely take back the crown, unless ATI's new chip is a real monster.
Remember when it was Pentium 2, and only Pentium 2? Now, with AMD's competition, processors are at great prices and performance. Competition is good for us (the consumers). I welcome ATI's current lead; it forces NVIDIA to get back in the game and lower prices =)

Most people live at the speed of life, I drink Mountain Dew!
 
Haha, that's the most original excuse I've heard yet... not forgetting nvidia's comparative company size.. number of developers.. software writers.. finances....
 
Um, ATi is not a PC-only company.
What about the GameCube's "Powered by ATi" sticker, which proves the Dolphin's GPU inside is by ATi?

--
*You can do anything you set your mind to man. -Eminem
 
I've always wondered what's with the underscores. I thought it was just a side effect of being a programming addict who spends over 10 hours a day coding, so the underscore habit spread.

--
*You can do anything you set your mind to man. -Eminem