Could it really have happened? Is it true?

nVidia is still working on console chips. Xbox 2, anyone?

I'm starting to feel like a real computer consultant.
 
Whoa. I didn't know ATI made the GameCube's GPU. So there's competition on consoles as well as PC.
Did Sony make their own chip for the PS2?

Most people live at the speed of life, I drink Mountain Dew!
 
It looks like Sony made their own "Graphics Synthesizer", as they call it: http://us.playstation.com/hardware/PS2/415007666.asp

My other personality is schizophrenic.
 
The GameCube GPU was made by a company (can't remember the name) which ATI bought after the design was finished, but that doesn't stop Nintendo from slapping an ATI sticker on the Cube. As far as I know, the R300 core was the first project where the two design teams worked together. It actually reminds me a little of the nVidia-3dfx cooperation on the NV30.

Regards
Andreas
 
There was no cooperation between 3Dfx and nVidia.

Why do people have a problem accepting that 3Dfx doesn't exist anymore? The NV30 was designed by nVidia, which partially consists of ex-3Dfx employees. They may or may not have used technology that 3Dfx was working on before the collapse, but that's a moot point today, since nVidia owns all former 3Dfx technologies and patents.

I'm starting to feel like a real computer consultant.
 
Actually I know that, and when I review my post I see that my last sentence really didn't make much sense the way it was written. I merely wanted to state that the NV30 is the product of both nVidia and former 3dfx engineers, which I'm sure everybody knows by now. Of course, buying the most valuable parts of a company and leaving an empty shell can hardly be described as cooperation.

By the way, the company that designed the Flipper GPU for Nintendo is/was ArtX. I don't know if the company still exists as a subsidiary of ATI or if it "suffered" the same fate as 3dfx.

Regards
Andreas
 
Never mind what I wrote in this post; I just read Anandtech's GC probing again.
It seems nothing in the GC was actually made by ATi; all they did was attach a sticker claiming they made the Flipper GPU (Dolphin, to others).
Also, it has fixed-function T&L, so no pixel and vertex shaders. But if that's the case, Dave, how were they able to create the jaw-dropping water effects in Mario Sunshine?!
I recall these were only possible with pixel shaders, like the ones in the 3DMark Nature test!


--
*You can do anything you set your mind to man. -Eminem
 
Just saw your edit before I posted. I totally agree with you that the water effects are amazing, but Anand writes that the fixed-function pipes are probably a good deal better than the fixed-function pipes on DX7 hardware (although I know that it still wouldn't allow anything like per-pixel lighting). I'm sure Nintendo were quite involved in the development of the Flipper, and great water effects could have been a requirement.
 
Additionally, the water effects caused no frame rate drops; they were simply jaw-dropping and could extend, fully animated, all the way to the horizon.

I think the GC is showing that it can compete with the Xbox very well, even graphically, when the development effort is well spent. It may not match or beat it, but it sure as hell does a good job for a little cube!
Metroid Prime on my GC looks awesome and plays awesome anyway, so I'm happy!

--
*You can do anything you set your mind to man. -Eminem
 
OK, I'll keep that in mind! Actually, I just realized that I have begun to use terms such as "suffer" and "empty shell" when writing about what nVidia "did" to 3dfx. Talk about being a fast learner... lol!
 
Actually, the GameCube has NO DX8 or higher effects... it's all DX7!

I read that it has no pixel or vertex shaders, as someone else said before me, and from personal experience I found the graphics capabilities to be somewhat disappointing.

But then a game like Metroid Prime comes out and makes you realize that fast hardware is only half of the story. You've got to have good programming to back it up; with that you can make good games no matter how fast the hardware you're working with is.
 
I found Mario Sunshine to really blow past what DX7 is supposed to be and feel like DX8. Mario's face is well animated and the polygon count gives good realism. The draw distance is also unlimited, so it really pushes the graphics and keeps it all awesome.

--
*You can do anything you set your mind to man. -Eminem
 
Uh oh, that hurts...

The GC does not implement any DX standard at all => not even DX1 maps onto the GC.

So it's still quite awesome, and all fixed function? WTF?

Simple: it has just the features you need, and you can enable or disable them by choosing some path. Now, if you want to learn something today: that is technically equivalent to how the GeForce3 and GeForce4 work. They _DON'T_HAVE_ programmable pixel shaders. _NO_. They _DON'T_. DX8 makes the fixed-function pixel pipeline _look_like_ it is programmable, but OpenGL coders know better. It's fixed function, just configurable.
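To make that concrete, here's a minimal OpenGL sketch of NV_register_combiners, the extension this class of hardware actually exposes. It assumes the extension's entry points are already loaded (via GLEW or wglGetProcAddress), and all it does is wire fixed arithmetic units together to compute texture * vertex color, which DX8 would dress up as a ps.1.x "program":

```cpp
// Minimal sketch, not a full program: configure one general combiner stage
// so that spare0 = texture0 * primary (vertex) color.
glEnable(GL_REGISTER_COMBINERS_NV);
glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

// Combiner 0, RGB portion: A = texture0, B = primary color, spare0 = A * B.
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                  GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                  GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                   GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                   GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

// The final combiner computes A*B + (1-A)*C + D; pass spare0 straight
// through by setting A = spare0, B = 1 (zero, inverted), C = D = 0.
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                       GL_UNSIGNED_INVERT_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                       GL_UNSIGNED_IDENTITY_NV, GL_RGB);
```

Notice there's no instruction stream anywhere: you pick inputs and outputs for fixed arithmetic units, which is exactly the "choosing a path" idea above.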

That does not mean fancy per-pixel effects are not possible.

Environmental bump mapping, like the water in 3DMark2001, was possible on the Matrox G400, on the old, first Radeons, etc. Why not the GC? It's a simple technique, quite old. Everyone except nVidia knew that the effect looks amazing; nVidia forgot it until the GF3 came out, and then they called it "programmable". *urgh*

Yes, the GF3 and GF4 can do quite a lot per pixel, but that's all just configuration of the path the data has to go through, and those paths are predefined. Now, the GC has some paths there as well. One is environmental bump mapping, used for the fancy water (I haven't actually seen it; I'll check it out to make sure it is that effect). The GC can do bump mapping and per-pixel lighting as well. You can do most of that even on old Voodoos or TNTs; it's really just a matter of GPU speed. The API only defines how you code for it, and that was never much of a problem. (Sure, newer hardware can do some things better, or do for the first time what old hardware couldn't, but you could work around most of it.)
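For reference, environmental bump mapping is literally two fixed-function texture stages. A rough Direct3D 8 sketch (dev, bumpTex and envTex are hypothetical names; error checking omitted):

```cpp
#include <d3d8.h>
#include <cstring>

// Reinterpret a float's bits as a DWORD, as SetTextureStageState expects.
inline DWORD F2DW(float f) { DWORD d; std::memcpy(&d, &f, sizeof d); return d; }

void setupEMBM(IDirect3DDevice8* dev,
               IDirect3DTexture8* bumpTex,   // per-texel (du, dv) offsets
               IDirect3DTexture8* envTex)    // environment map to perturb
{
    // Stage 0: the bump map. D3DTOP_BUMPENVMAP doesn't output a color;
    // it perturbs the texture coordinates of the NEXT stage.
    dev->SetTexture(0, bumpTex);
    dev->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_BUMPENVMAP);
    // 2x2 matrix scaling the (du, dv) offsets; here a plain 0.08 scale.
    dev->SetTextureStageState(0, D3DTSS_BUMPENVMAT00, F2DW(0.08f));
    dev->SetTextureStageState(0, D3DTSS_BUMPENVMAT01, F2DW(0.0f));
    dev->SetTextureStageState(0, D3DTSS_BUMPENVMAT10, F2DW(0.0f));
    dev->SetTextureStageState(0, D3DTSS_BUMPENVMAT11, F2DW(0.08f));

    // Stage 1: the environment map, sampled at the perturbed coordinates.
    dev->SetTexture(1, envTex);
    dev->SetTextureStageState(1, D3DTSS_COLOROP, D3DTOP_SELECTARG1);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
}
```

No shader in sight; the same texture-stage states go back to DX6, which is why a G400 or an original Radeon can pull off the 3DMark-style water.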

The GC is an awesome piece of hardware, and I prefer it over the Xbox for gaming, just because I'm from the Nintendo generation. Nothing can beat Mario, Zelda, Donkey Kong, whoever... Metroid... ahh...

😀

I don't even know what the GC can do exactly; I could ask one of my friends who codes for the GC. BTW, his dev project is awesome, worth buying a GC for when it's out. But I'm not allowed to talk about it... you know the blah blah 😀

"take a look around" - limp bizkit

www.google.com
 
Oh really? Well, technically, if you designed a game in DX7, pushed the polygon count high, and used the API to its full potential so that only a Radeon 9700 Pro could run it fast enough, it would look sweet.

And there's a lot of potential STILL in DX7... for example, bump mapping (one of the most incredible effects IMO) has been around since DX6, but NO ONE USES IT! I can't understand why they don't, because it makes everything look so real and gorgeous, and even my old original Radeon runs at over 120 fps with environment bump mapping. Totally freaking incredible effect... and it's only DX6, I say again! Bah...

Maybe you're right and the GameCube does have DX8-like effects... if so, then my sources are wrong... dammit lol
 
Yes, you're right about the GPU speed.

As you said, there have been great effects available for a long time; they just haven't been utilized. My Radeon can do pixel shaders (not officially supported by ATI, but the hardware is there), but when they're used I get like 10 FPS, roofles...

DX7 can produce some INCREDIBLE effects... just look at Warcraft III (it says DX8.1, but it only uses DX7 effects, and maybe a more efficient DX8 texture rendering path) and Quake 3. Q3 is outdated, yes, but Jedi Knight was released on that engine and it looks gorgeous...

Anyway, all this is getting very off-topic.
 
I'm really astonished by what you're writing about the GF3 and GF4 not being programmable, but you sound like you actually develop shaders and therefore know quite a bit about the hardware. Why have none of the major hardware sites pointed that out? If they have, please post a link.

To phial: You say that you have run pixel shaders on your Radeon. How did you accomplish that?

Right now I'm really confused!!

- Edit: What about vertex shaders? Are they also a big bluff?

Regards
Andreas


 
Vertex shaders have always been doable on any system or card. If a card does not support them, the CPU takes over the task, since it's just math moving vertices around.
Pixel shaders AFAIK can't be, as they have specific per-pixel instructions that couldn't be emulated at any usable speed on a non-pixel-shader card.
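To see why the CPU fallback works, here's a minimal sketch of the core per-vertex work in plain C++ (everything here is hypothetical and simplified; a real T&L path also does lighting, clipping and so on):

```cpp
#include <cstddef>

struct Vec3 { float x, y, z; };

// Multiply each vertex (x, y, z, 1) by a 4x4 row-major matrix and divide
// by w: the heart of what a fixed-function T&L unit or a vertex shader
// does per vertex, here done by the CPU instead.
void transformVertices(const float m[16], const Vec3* in, Vec3* out, std::size_t n)
{
    for (std::size_t i = 0; i < n; ++i) {
        const Vec3& v = in[i];
        float w  = m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15];
        out[i].x = (m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3])  / w;
        out[i].y = (m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7])  / w;
        out[i].z = (m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11]) / w;
    }
}
```

It's slower than dedicated hardware, but it's the same math, which is why drivers can offer software vertex processing transparently.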

Can you make a PSX do bilinear filtering? No; at least no game ever had it, it's all huge blocky textures. You could always try to emulate it by blending in lots of extra in-between pixels so it looks filtered (naturally, adding pixels to a picture makes it smoother), but holy crap, your performance would tank, and it's still not quite the same picture quality as real bilinear filtering. I guess emulating pixel shaders would work out the same way.
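For the curious: the filter itself is just a weighted average of the four nearest texels; the expensive part is doing it for every pixel, every frame. A minimal software sketch (grayscale texture, clamp-to-edge, all names made up):

```cpp
#include <cmath>
#include <cstdint>

// Sample an 8-bit grayscale texture at (u, v) in texel units, bilinearly:
// blend the four surrounding texels by their fractional distances.
float bilinearSample(const std::uint8_t* tex, int w, int h, float u, float v)
{
    int x0 = (int)std::floor(u);
    int y0 = (int)std::floor(v);
    float fx = u - x0, fy = v - y0;

    auto at = [&](int x, int y) -> float {
        x = x < 0 ? 0 : (x >= w ? w - 1 : x);   // clamp to edge
        y = y < 0 ? 0 : (y >= h ? h - 1 : y);
        return (float)tex[y * w + x];
    };

    float top = at(x0, y0)     * (1 - fx) + at(x0 + 1, y0)     * fx;
    float bot = at(x0, y0 + 1) * (1 - fx) + at(x0 + 1, y0 + 1) * fx;
    return top * (1 - fy) + bot * fy;
}
```

A handful of multiplies per sample is nothing for dedicated texture units, but done on a mid-90s console CPU for a whole screen, the frame rate dies, which is the point above.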

Gawd, I wish I could already start my 3D animation years and learn a bit about all this... 🙁

--
**Canadian joke:
Here we don't say the word "retarded", we say "Alliance"! -Mike Bullard**
 
Something that confuses me a lot here: all these new "DirectX 9 features", as DX calls them... why do we also see them in OGL?
I mean, if they were only for DX9, how does OpenGL end up with the same realistic effects?
Vertex shaders and PS are technically DX8 features, so how does OGL get around that?
I guess I'm mixing up the programming APIs with the hardware features... uhhhh, sometimes an all-knowing graphics expert can screw up the simple stuff you THOUGHT you knew! :wink:

--
**Canadian joke:
Here we don't say the word "retarded", we say "Alliance"! -Mike Bullard**
 
I know that vertex shaders are very much doable on the CPU.

As a response to your last post:
DX8 features are only called DX8 features because DX is the bar that graphics cards get measured against. You could just as well call them OpenGL 1.x features (forgot the exact number), or whatever you want; they're just hardware features exposed by the API.
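A concrete way to see it: the same hardware capability is just discovered differently in each API. A rough sketch (simplified; the substring test is naive, and the D3D part assumes a hypothetical d3d8 device):

```cpp
#include <cstring>
#include <GL/gl.h>

// OpenGL: the feature shows up as an entry in the extension string.
bool hasGLExtension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    // Naive substring match; good enough for a sketch.
    return exts && std::strstr(exts, name) != nullptr;
}

// e.g. hasGLExtension("GL_ARB_vertex_program")

// Direct3D 8: the very same silicon shows up as a caps bit instead:
//   D3DCAPS8 caps;
//   dev->GetDeviceCaps(&caps);
//   bool hasVS11 = caps.VertexShaderVersion >= D3DVS_VERSION(1, 1);
```

Either way, the card can do what the card can do; the API just gives it a name and a calling convention.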


 
Yes, but I was just telling you that vertex shading probably isn't a bluff, since even a CPU can do the vertex geometry work. However, the fact that the hardware has dedicated units for it, can store even more instructions now, and controls them efficiently is what makes hardware-based vertex shaders much better.

--
**Canadian joke:
Here we don't say the word "retarded", we say "Alliance"! -Mike Bullard**
 
NVIDIA made the Xbox's GPU, which took months and months of development and resources. That speed bump slowed NVIDIA's work on the GeForce4 and allowed "PC only" ATI to take the lead. NVIDIA can't make products for consoles and PCs while holding the top performance spot. Now that (as far as I know) NVIDIA isn't working on console GPUs, they can focus on the FX and likely take back the crown, unless ATI's new chip is a real monster.
Remember when it was Pentium 2, and only Pentium 2? Now, with AMD's competition, processors come at great prices and performance. Competition is good for us consumers. I welcome ATI's current lead; it forces NVIDIA to get back in the game and lower prices =)

Uh, dude... ATI created a custom chip for the Nintendo GAMECUBE.

bah, I see that was already addressed... sorry :)

"There is no dark or light side, only power and those too weak to seek it."<P ID="edit"><FONT SIZE=-1><EM>Edited by vegandago on 11/30/02 11:14 PM.</EM></FONT></P>
 
Which I also claimed, like you, but I was wrong: ATi didn't create the chip, ArtX did. All ATi did was put it out under their name, since they had acquired the company. Of course, as Anandtech says, the sticker will be a huge help to ATi's sales in the future, as people see that their GC uses that company's powerful graphics. It will tempt them more easily. Not a bad marketing strategy at all, but like nVidia's, it isn't entirely honest.

--
**Canadian joke:
Here we don't say the word "retarded", we say "Alliance"! -Mike Bullard**
 
But the fun part is that after ATI acquired ArtX, they immediately put the ArtX staff to work on the R300. When nVidia acquired 3Dfx, it took them years to get the 3Dfx engineers working on the NV30. ATI assimilated (god, I sound like a Borg) a lot quicker than nVidia. I fully believe that nVidia took over 3Dfx just to eliminate a competitor and not for any "good" reasons, whereas ATI took over ArtX because they actually wanted what ArtX had.

Anyway, the reason I hate nVidia is the GeForce4 MX. IMHO they should have called it the GeForce2 MX. I know several friends who were suckered into buying it because it was "a GeForce4 card". When I tell them that they would be better off with a GeForce3 card than that POS, they never believe me.

IMHO nVidia's marketing department is too powerful, and ATI's is not powerful enough. nVidia's marketing department is responsible for the MX line, as well as for the pomp and ceremony surrounding the paper launch of the NV30 months before it will be available, strictly to take the wind out of ATI's Christmas sales. On the other hand, I never heard anything definite about the R300 until it was announced, which was a couple of weeks before it was available, and I am still hearing nothing about the R350 other than that it may use DDR2, even though the R350 should be available within a month of the NV30.

--------------
Knowan likes you. Knowan is your friend. Knowan thinks you're great.
 
"When I tell them that they would be better off with a geforce 3 card than that POS they never believe me."

Most people are like sheep: you can show them why they're wrong, give them plenty of proof, and they still don't believe you. You just have to let it go and let them suffer.

At work I watched three guys almost get into a fist fight because one said drugs are stupid and two said drugs are cool; the two decided that the one against drugs knows nothing and shouldn't say anything. The guy who said drugs are stupid got so frustrated that he just left the room, and then the two drug lovers told everyone else that the guy must be crazy.

You can't win against dumb people; they have no clue when they're wrong. And when there is more than one, it's like they've got their own support group, so you have no chance. But there are also the lemmings, a.k.a. sheep: they just follow whatever others do and believe it's the only (and right) way because others do it.

so don't bother, not worth it 😉



**Press 1 if you want to be on hold, 2 for disconnect, 3 for a representative who will put you on hold before disconnecting.**
 
Where do you work, Andrew? :smile: As for the drugs thing, I guess it's sort of up to the individual. Everyone is entitled to his/her opinion though. *NOTE* I don't do drugs.

I'm starting to feel like a real computer consultant.