leaked NV20 specs! impressive!

Guest
You look at the specs and say the core speed is way faster than the memory clock? 250-300 MHz DDR, the original post says... that is 500-600 MHz equivalent, compared to a 300 MHz core. Guess they did learn. 300 MHz DDR, btw, is *extremely* fast. I also heard it would be coupled to a 256-bit interface, doubling the throughput again. Just to give you an idea, a GTS has "only" a 166 MHz memory clock (DDR, so 333 MHz equivalent) with a 128-bit bus. If the leaked specs are right, the NV20 will have 3-4x higher memory performance. That is nothing to laugh at, especially coupled with the hidden surface removal features. Frankly, I won't really believe it until I see it. I've never heard of such fast DDR chips. Even the Ultra has "only" 230 MHz DDR memory, and is damn expensive because of it.
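
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (the NV20 figures are just the rumored specs from this thread, not anything confirmed):

def bandwidth_gb_s(clock_mhz, bus_bits, ddr=True):
    # Theoretical peak memory bandwidth in GB/s
    effective_mhz = clock_mhz * (2 if ddr else 1)  # DDR transfers twice per clock
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

gts   = bandwidth_gb_s(166, 128)  # GeForce2 GTS: ~5.3 GB/s
ultra = bandwidth_gb_s(230, 128)  # GeForce2 Ultra: ~7.4 GB/s
nv20  = bandwidth_gb_s(250, 256)  # rumored NV20, low end: ~16.0 GB/s

print(nv20 / gts)  # ~3.0x the GTS; ~3.6x at the rumored 300 MHz high end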
 

dannyaa

So I take it NVIDIA has lost its 6-month cycle now? This one is more like a year! So when is the next card after the NV20 coming out? If this card hits shelves in June, their next card, if still on the development cycle, would be out this time next year, wouldn't it?
Now that 3Dfx is dead, the only real competition is from other companies like ATI. I really hope ATI starts producing some good cards to drive these prices down; $800 is insane!! I remember when a brand-new 3D card was $200 for the best one... $800 for one card is more than I spent on a 1.2 GHz chip, RAM, etc.!!!!
 

Grizely1

Well, they've lost it, but I think they will regain it once they start producing different versions of the NV20 (like they did with the GeForce).

----------------------
I don't hate Intel............ Do I?
 
Guest
Someone who works for NVIDIA said the GeForce3 gets over 120 fps at 1600x1200x32 in Quake 3. That's over 2x Ultra speeds. Only a 500 MHz, 256-bit memory bus makes this possible :)
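
For what it's worth, a rough sanity check on what that claim costs in bandwidth (the overdraw factor below is a guess, just for illustration):

width, height, bytes_per_pixel, fps = 1600, 1200, 4, 120

frame_bytes = width * height * bytes_per_pixel  # ~7.7 MB per 32-bit frame
color_gb_s = frame_bytes * fps / 1e9            # ~0.9 GB/s just writing color
overdraw = 3                                    # assumed average overdraw (made up)
total_gb_s = color_gb_s * overdraw * 3          # color write + Z read + Z write per pass
print(total_gb_s)                               # ~8.3 GB/s before any texture traffic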
 
Guest
Ace, I think you are missing the point on memory speed. The memory will be faster on the NV20 than on the GeForce2 Ultra, but the real key is that the NV20 uses the memory so much more efficiently. You can't directly compare the memory bandwidth between current cards and the NV20 and come out with anything meaningful. The NV20 has the ability to do a LOT more with the same amount of bandwidth, and I don't think memory speed will hold them back this time.
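
A toy illustration of the point; the overdraw and culling numbers here are invented just to show the shape of the argument, not actual NV20 behavior:

def effective_bandwidth(raw_gb_s, overdraw, fraction_culled):
    # Bandwidth wasted on pixels that end up hidden shrinks as culling improves
    wasted = (overdraw - 1) * (1 - fraction_culled)
    return raw_gb_s / (1 + wasted)

print(effective_bandwidth(7.4, overdraw=3, fraction_culled=0.0))  # brute force: ~2.5 GB/s useful
print(effective_bandwidth(7.4, overdraw=3, fraction_culled=0.8))  # good HSR:    ~5.3 GB/s useful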

As a note, I am sure I read somewhere that the 256-bit memory bus was a myth and that it will be only 128-bit like current cards. I am looking forward to finding out for sure, though :)

Regards,
Warden
 
Guest
Don't ya wish ye were a millionaire? Man, I can't stop dreaming of the GeForce3's massive performance!
 
Guest
Sixteen colors? Two hundred and fifty-six?

The human eye can't tell the difference.




Tom Mc

Even a fool, when he remains silent, appears wise.
 
Guest
In just a few hours, the GeForce3's specs and benchmarks shall go public!!!
 
Guest
Hehe, you had to resurrect your favorite thread, didn't you..... :)

Cheers,
Warden

(edited for typos--yeah, even in a post this short!)
 
Guest
LOL, yup, and it's 11:10 here. I know the GeForce3 goes public on the 27th, but at what time, I don't know. Nevertheless, it's expected to render 150 fps at 1600x1200x32, according to what the NVIDIA employees say.
 

noko

I've seen animations on a computer at varying speeds up to around 120 FPS; bottom line, they are smooth at 30 FPS or 120 FPS (120 FPS due to monitor limitations). If 30 FPS is jerky or noticeable, then watching a movie in the theater at 24 FPS must be just terrible; how could anyone take it? Why haven't movies died a long time ago, with new technology replacing that slow 24 FPS crap? But hey, let's go to the movies :smile: .

 
Guest
I was merely curious whether you had seen any good comparisons of frame rates in movie-type settings; I was not really making a point. I have not had the chance to see such a comparison. Of course, we both know that games, with their average frame rates, are a whole other story. But I agree that you don't need a buffer of 200 fps. When I look at game benchmarks, I only pay attention to 1024x768x32 and above, as this paints a much better picture of a card's high-end performance. I also consider the value of next-gen cards like the GeForce 3 to be in adding more quality, like 1024x768x32 FSAA, not in running things at 120+ fps.

Cheers,
Warden
 

noko

Having enough power to do FSAA at 1024x768x32 or beyond would definitely set the GF3 apart from everybody else, and the difference would be very noticeable. Can the GF3 do skeletal/bone animation like the Radeon? I hope so; that way, lifelike character movements would become more standard in games.
 

HolyGrenade

FOR THE IMPATIENT PEOPLE:

MOVIES: GOOD @ 24+ FPS

GAMES: NOMINAL @ 24+ FPS, OK @ 40+ FPS, NICE @ 60+ FPS


FOR EVERYONE ELSE:

The movie theatre is dark; the only bright light is being projected onto a silver screen. Each frame is projected all at once (unlike TVs and monitors, where the image is scanned line by line).

A bright light in the middle of the dark stays a while in the eye. This "afterimage" lingers even after the source is switched off. At this rate, the highest number of discrete images the eye can register is closer to 13 fps than 30 fps. The next frame transitions in well before the afterimage of the current one fades from the eye.

This also works on TVs to an extent. The image on a TV is very vivid compared to that on a computer monitor, and it is also much softer. Monitors were designed to be viewed from very small distances and to show very sharp, high-resolution images. So there go the vividness and the softness, and there is less of an afterimage from a monitor.

Now, the next bit applies to all recorded motion pictures, whether projected, displayed on a TV, or, to an extent, shown on a monitor. The human eye cannot concentrate on detail and movement simultaneously. This is where motion blurring comes into effect. For example, take any car chase scene and pause it (works best on DVDs, works well with freeze-frame on TVs, and is OK with a clean still pause on videotape): you'll see it is blurry as hell (not that I've seen hell), and very little of the detail you perceived while it was in motion remains.

CGI in games, however, is extremely sharp. On monitors, where all that sharpness is retained, the motion seems jerky even at around 30 fps. That is why we all like our games at 60+ fps.
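
A tiny sketch of that contrast: a film camera's exposure effectively averages many sub-positions of a moving object into one blurred frame, while a game renders a single razor-sharp instant. (Toy Python code, illustrative only.)

import numpy as np

frame_width, samples = 100, 8  # average 8 sub-positions per "exposure"
start, speed = 10, 5           # object position and pixels moved per frame

sharp = np.zeros(frame_width)
sharp[start] = 1.0             # game-style frame: one crisp position

blurred = np.zeros(frame_width)
for s in range(samples):       # film-style frame: smear along the motion path
    blurred[start + speed * s // samples] += 1.0 / samples

print(sharp.nonzero()[0])      # [10] -- a single sharp point
print(blurred.nonzero()[0])    # [10 11 12 13 14] -- spread along the path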
 
Guest
Good call! Want to play games? Get a gaming computer! Want to work with graphics? Then this machine should do it without this kind of video card. Most likely they will make a card for your computer, but what is the point?

The right tool for the right job.

The Facts:
Games = PC, Gaming Boxes
Graphic Design = Macintosh

And that's all there is to it!
 

noko

The light shutter on a movie projector isn't instantaneous, nor is it so good that it can show the momentarily stopped frame all at once. Once the movie frame has moved into the light path and stopped, the shutter opening sweeps across the stopped frame to allow light to pass through the lens and onto the screen. So in effect there is a fast drawing of the frame on the screen. The shutter then blocks the light passing through the frame, un-drawing the image, and the screen is momentarily <b>black</b> while the next film frame is positioned behind the shutter. (That's right: the only light reaching the screen is ambient, hopefully, in a dark theater.) Now, the time the image is shown is longer than the time the frame spends switching behind the closed shutter, but both are still happening 24 times a second: a lighted image, then a blank screen. So in reality it is more like a strobe, with a longer period of light on than off, but still running at 24 FPS.

So now we are dealing with the human side. Take a light in a dark room and switch it off, and you continue to see the light for a while. That is persistence of vision: the nerve cells in the retina keep firing after the light source is removed. In other words, the eye has a time delay when the light changes, which allows a 24 FPS movie to look fluid. In addition, this time delay due to nerve cell response time will blur a fast-moving lighted object. Go outside in the sun, place your fingers in front of your eyes with them spread, and move your hand back and forth. As you speed up shaking your hand, what do you see? Naturally, you will see a blurring of the fingers. Now, how many fingers can you count in a second as they move rapidly across your field of vision? Not many. Still, you can tell your hand is moving faster and faster by the amount of blurring that occurs, not by the number of fingers you can count crossing your field of vision.

So 24 FPS allows for fluid movement in a movie theater, except when the titles and credits scroll by; then you see some flicker, or at least I notice more. Why? Now your eye has something with more contrast, an amplified difference it can discern, especially if the credits are not anti-aliased with soft shadows blending the scrolling characters into the background. Likewise, with black-and-white text on a monitor being updated at 60 Hz (60 times/sec), the eye can discern the flickering. Now take that 60 Hz monitor, blend in a 24-bit color background with textured text nearly matching the background, and the flickering will go away. I watch my DVDs on my monitor at an 85 Hz refresh rate, but the frame rate of a movie is still, guess what, 24 FPS, buddy. The video stream may be 30 FPS (actually 29.97 FPS NTSC), but the movie was originally recorded at 24 FPS, so really I could be seeing less than 24 FPS. Well, when I watch a DVD up close on my monitor, it looks pretty smooth running at 24 FPS. Kinda blows that up-close monitor theory.

Why wouldn't a game be smooth at 24 FPS? You said it in your discussion: freeze a movie frame on a DVD and it is blurry wherever objects are moving fast. Think about it. When you were moving your hand back and forth, you could tell its speed by the amount of blurring you saw. A DVD (film at 24 FPS converted into MPEG-2 format) holds recorded images from the film that exhibit properties similar to what the eye does: they blur when motion is present. So basically our brain reads the blurriness as motion, just like with your hand outside. So 24 FPS can look good on a high-resolution monitor, a.k.a. DVD, as well as on the movie screen.

A video camera is sort of the same way: it also has persistence and will blur naturally if the motion is fast enough. So 30 FPS seems to be OK for video, but not perfect. If the image has a lot of subdued colors, then the eye pretty much thinks it is fluid. If you run black-and-white text from your computer to your monitor, well, you know, it sucks.

Now, most video games don't have motion blurring routines, so only the human eye's limitations come into play. So, like you said, some people can tell the difference between 30 FPS and 60 FPS; or can they? Well, if we are dealing with people, I guess there will be some variation in a person's ability to discern FPS. I doubt anybody could tell me the FPS of a given scene, as in "that one is running at 42 FPS and that one is running at 48 FPS." NOT!!!!! But how about 30 FPS versus 60 FPS? Well, once again, games don't work that way. If you ever run 3DMark2000, you will notice that the frame rate moves all over the place. One moment it is at 140 FPS and the next at 42 FPS. Did you notice the slowdown? Probably not, but once the FPS drops below your tolerance zone (maybe 18 FPS, 20 FPS, or, if you are really good, 30 FPS), then you will notice: hey man, that is jerky.

Do you really think a computer game runs at a constant 60 FPS, and that this is the magic number for smoothness? Benchmarks give averages, and the game may never even play at that frame rate at any point during the whole session. In reality, the frame rate is jumping all around the place; it is only when it dips below your tolerance or threshold that you say, hey man, this is jerky, and I got fragged!
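
That point is easy to show with a few invented frame times (Python; the numbers are made up purely for illustration):

frame_times_ms = [7, 7, 8, 7, 24, 50, 48, 9, 7, 7]  # two stutters mid-run

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(round(avg_fps))    # ~57 FPS average -- looks fine on a benchmark chart
print(round(worst_fps))  # 20 FPS at the worst moment -- that's the hitch you feel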

One more item: your monitor imposes its own limitation on FPS. If your monitor is at a 60 Hz refresh rate (all scanlines, pixels, etc., updated 60 times a second), then any frame rate greater than 60 FPS will not be displayed on the screen. :smile:
