a direct question about 8800 series

anticupidon

Distinguished
Jan 20, 2007
231
0
18,680
hi all
i'm planning to upgrade my good ol' 6600GT 128MB to a better card, so I want a good one, and I'm considering an 8800 as an option... but I have a question for all of you out there:
is it really that important to have more and more FPS in games?
I ask because I've seen a topic where they said the human eye can only perceive 75 fps... or so I remember... :?
What if you buy a card that gives you 100-120 fps in HL2 or Far Cry and the human eye only perceives a fraction of that? What happens with the rest of the FPS? Are those numbers just for benchmarking purposes, or to boost our egos... look, I'm the king of da hood, I have an 8800 GTX card :twisted: ...lol :D
 

mpjesse

Splendid
Generally speaking, the higher the max FPS, the higher the average FPS.

The two most important aspects of a video card are its AVERAGE frames-per-second performance and its minimum FPS performance. Yes, the eye cannot perceive beyond 30 FPS. However, in gaming, performance will occasionally drop below 30 FPS, which gives the impression that the card is not powerful enough and can lead to a poor gaming experience.

In all games, the 8800 series has an average FPS higher than 30. In very few cases does it even drop below 30 FPS at average resolutions (1280x1024).

Again, this is very generally speaking.
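
To make the average-vs-minimum distinction concrete, here's a minimal Python sketch; the frame times are made-up numbers, purely for illustration:

# Rough sketch: the average FPS can look healthy while the minimum
# tells you about the dips you actually notice. Frame times in ms.
frame_times_ms = [12, 14, 13, 15, 40, 45, 13, 12, 14, 38]

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)

print(f"average FPS: {avg_fps:.1f}")  # ~46, looks fine on a benchmark chart
print(f"minimum FPS: {min_fps:.1f}")  # ~22, the stutter you feel in-game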
 

Dr_asik

Distinguished
Mar 8, 2006
607
0
18,980
I think 60 fps and up is perfect for just about any game. Others will say more, maybe less, but that's not really the point. The point in buying a very fast card is that if it can output 120 fps in a certain game at certain video settings, it'll output less in a more graphically intensive game. Yeah, the 8800 GTX is overkill for almost any game you can throw at it, but that ensures you can turn up every graphical bell and whistle and not suffer any performance drawback. Yeah, perfectly acceptable gaming performance can be found with a much more modest video card; it just depends what it takes to satisfy you. Or how reasonable you are. :wink:
 

pidesd

Distinguished
Feb 20, 2007
141
0
18,680
A fast card right now ensures performance for years to come. It depends on how long you want to keep the card, what kind of money you're willing to spend, and what size your monitor is, so it's not necessarily a waste.
 

cleeve

Illustrious
Yes, the eye cannot perceive beyond 30 FPS.

The human eye can perceive MUCH higher than 30 FPS. But the minimum required for smooth animation is around 25-30. You're right though, it's the minimum framerate you have to worry about.

Also, just because something is delivering 30 fps doesn't mean it's smooth. During a lot of movement, even 30 fps can be a bit choppy.

At 60 or so though, everything is quite smooth and pretty much everyone is OK with that except Uber FPS players.

The real cap is the refresh rate of the monitor. At 85 Hz, if you're getting more than 85 fps it'll cause tearing... that's why Vsync is an option.
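
To sketch that cap (the numbers here are just examples, not measurements of any particular card or monitor):

# Rough sketch: the monitor only shows one new image per refresh cycle,
# so the frame rate you can actually see is capped by the refresh rate.
def visible_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

print(visible_fps(120, 85))  # 85: the extra 35 frames are never fully shown
print(visible_fps(60, 85))   # 60: below the cap, every frame gets displayed

# Without vsync, frames arriving faster than the refresh get swapped mid-scan,
# which is what appears as tearing; vsync waits for the next refresh instead.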
 

stemnin

Distinguished
Dec 28, 2006
1,450
0
19,280
Most movies are filmed at 24 FPS. For you FPS freaks out there, 300 was filmed mostly at 50-150 FPS (various cameras); I can't wait to see it, btw...

Really though, all I want is a minimum FPS high enough for smooth motion, the point beyond which my eye can't detect a difference.
 

barlag

Distinguished
Feb 27, 2006
26
0
18,530
25-30 FPS only looks smooth with motion blur. I explained this in another post.
The camera's frame exposure (PAL) is set to 1/25th of a second, so people running in a movie look blurred within each frame (pause an action DVD if you don't believe me). This gives the illusion that we can only see 25 FPS. I can tell the difference between 80 Hz and 100 Hz monitor settings on CRTs, but above 100 Hz it's not really noticeable.
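
To put rough numbers on that exposure point (the object speed here is just an assumed example):

# Back-of-the-envelope: how far a moving object smears within one PAL frame.
exposure_s = 1.0 / 25.0       # PAL frame exposure, as above
speed_px_per_s = 1000.0       # assumed: object crossing the screen quickly

blur_px = speed_px_per_s * exposure_s
print(f"motion blur per frame: {blur_px:.0f} px")  # ~40 px of smear

# A game rendering 25 crisp frames shows the same object jumping ~40 px
# between frames with no smear in between, which reads as stutter.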

We do not use motion blur on the PC.
An 8800 GTX might get 300 FPS in current games, but with 22"+ monitors and all antialiasing etc. enabled, it's not even that fast. The minimum framerate should stay above 50 fps for silky-smooth gameplay, and that's something an 8800 GTX might not even be able to achieve in some of the first DirectX 10 titles at high resolution with high quality options.

Thus, in conclusion, it is best to go for the best card available; it is more future-proof, and minimum framerates are much better on a GTX than on a GTS or something else.

Gabor
 

cleeve

Illustrious
Barlag is right about motion blur, but the bottom line is you want 60 fps.

If it dips into the 30s once in a while you'll be OK, but if you can sustain a minimum of 60 you're in the sweet spot.

To add a final complication, your requirements depend on the type of game you're playing. First-person shooters or 'twitch' games really need a high framerate, because if your framerate gets choppy you are much less effective.

RPGs or RTSes can make do with a lower framerate because combat is based on stats, not how fast you aim and shoot. That's why people can still play Oblivion at 20 fps...
 

rodney_ws

Splendid
Dec 29, 2005
3,819
0
22,810
Generally speaking, the higher the max FPS, the higher the average FPS.

The two most important aspects of a video card are its AVERAGE frames-per-second performance and its minimum FPS performance. Yes, the eye cannot perceive beyond 30 FPS. However, in gaming, performance will occasionally drop below 30 FPS, which gives the impression that the card is not powerful enough and can lead to a poor gaming experience.

In all games, the 8800 series has an average FPS higher than 30. In very few cases does it even drop below 30 FPS at average resolutions (1280x1024).

Again, this is very generally speaking.

I disagree with the whole thing about the eye not being able to see more than 30 fps; I can clearly tell the difference when the FPS drops from 60 to 30.
Well, I completely disagree with this... isn't TV just 30 fps? It's not like people are complaining about that.
 

carver_g

Distinguished
Feb 10, 2007
341
0
18,810
You're comparing apples to oranges when comparing TV programming to the non-stop action of most video games. Try running any game with an FPS lock, Flight Simulator for example, and compare what you see at 30 fps vs. 60 fps. You may notice a *slight* difference. By slight, I mean huge.

For the record, people apparently have complained about their TVs, because progressive scan, which came out years ago, raised the effective frame rate to 60 full frames per second.
 

dsidious

Distinguished
Dec 9, 2006
285
0
18,780
A lot of LCD monitors can only display 60 fps (that's the refresh rate of 60 Hz). Anything more is overkill if you have that kind of monitor.

Of course, you might have a (really expensive) CRT monitor with a refresh rate of 100 Hz, or LCD technology might improve a lot in the future.

For now, try deciding which games you like most, then find benchmarks for those and pick a card that offers at least 60 fps with all the eye candy enabled (AA, AF, whatever).

Or just wait for the 8950 GX2 to be released and then buy an 8800 GTX at reduced price on eBay...

(Edit) There was a very nice article on THG some time ago explaining how you need a really good CPU to get the best out of the 8800 cards. If your CPU is from the same era as your original card, then it will probably be a bottleneck. To replace the CPU with an E6600 or better, you'd probably need a new motherboard. Sorry about the bad news...
 

surf2di4

Distinguished
Dec 9, 2005
78
0
18,630
Let me go check, but I believe TV (i.e. NTSC) is 60 cycles interlaced, so you get half the picture on one cycle and the other half on the next cycle. So it is 60 fps, it's just half the info in each frame! Also, LCDs refresh as fast as every 5 ms, depending on size and screen. It is only 60 Hz in analog, not digital.
 

mpjesse

Splendid
Generally speaking, the higher the max FPS, the higher the average FPS.

The two most important aspects of a video card are its AVERAGE frames-per-second performance and its minimum FPS performance. Yes, the eye cannot perceive beyond 30 FPS. However, in gaming, performance will occasionally drop below 30 FPS, which gives the impression that the card is not powerful enough and can lead to a poor gaming experience.

In all games, the 8800 series has an average FPS higher than 30. In very few cases does it even drop below 30 FPS at average resolutions (1280x1024).

Again, this is very generally speaking.

I disagree with the whole thing about the eye not being able to see more than 30 fps; I can clearly tell the difference when the FPS drops from 60 to 30.

That's because the game is dropping below 30 FPS occasionally. If your card is averaging near 30 fps, it's safe to say it is probably going lower sometimes, even if FRAPS or whatever else isn't catching it. Anyways, I know what you mean.
 

mpjesse

Splendid
You're comparing apples to oranges when comparing TV programming to the non-stop action of most video games. Try running any game with an FPS lock, Flight Simulator for example, and compare what you see at 30 fps vs. 60 fps. You may notice a *slight* difference. By slight, I mean huge.

For the record, people apparently have complained about their TVs, because progressive scan, which came out years ago, raised the effective frame rate to 60 full frames per second.

Hmmm... I believe the reason for going to progressive scan was to eliminate artifacting on HDTVs. As far as I'm concerned, progressive scan doesn't make the picture look any smoother... just better.
 

mpjesse

Splendid
Let me go check, but I believe TV (i.e. NTSC) is 60 cycles interlaced, so you get half the picture on one cycle and the other half on the next cycle. So it is 60 fps, it's just half the info in each frame! Also, LCDs refresh as fast as every 5 ms, depending on size and screen. It is only 60 Hz in analog, not digital.

Other technical standards in the final recommendation were a frame rate (image rate) of 30 frames per second consisting of 2 interlaced fields per frame (2:1 interlacing) at 262½ lines per field or 60 fields per second along with an aspect ratio of 4:3, and frequency modulation for the sound signal.

FYI. You're seeing both interlaced fields at the same time... so it's really just 30fps. I think you're confusing fields per second with frames per second.

Response time and refresh rate have little to do with each other, BTW. Response time is the time it takes for a pixel to change from one color to another (usually measured gray-to-gray by the panel manufacturers, which is cheating). Refresh rate is the number of times per second the display is updated. But you're right, refresh rate doesn't apply to LCDs in quite the same way, because the display is always illuminated... a distinct difference from CRTs.
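
A quick illustration of that independence (both figures are just typical examples, not specs for any particular panel):

# Response time limits how fast a single pixel can change colour;
# refresh rate limits how often a new frame reaches the panel.
response_time_s = 0.005               # e.g. a "5 ms" panel
refresh_hz = 60                       # typical LCD refresh

max_transitions_per_s = 1.0 / response_time_s  # ~200 possible colour changes
frames_sent_per_s = refresh_hz                 # but only 60 new frames arrive

print(max_transitions_per_s, frames_sent_per_s)
# The pixel could change far more often than the panel is ever asked to.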
 

kaotao

Distinguished
Apr 26, 2006
1,740
0
19,780
Well, I completely disagree with this... isn't TV just 30 fps? It's not like people are complaining about that.

TV also has motion blur though; PC games don't.
Check Barlag's post above.

That is, of course until Crysis comes out. :)
 

anticupidon

Distinguished
Jan 20, 2007
231
0
18,680
First of all, thanks for the input.
Second, it's the average frame rate that's important to me, because I play at 1280x1024...
Third, as I was saying, I'm planning to change my card for an 8800, but I'm thinking about power consumption and all... it seems that with the arrival of the new DX10 gaming era, the other side of the coin is that our electricity bills will rise like hell... to where, I don't know.
Sorry, my green ideas seem to be bursting out...
See ya
 

jorbor36

Distinguished
Aug 9, 2006
50
0
18,630
Well, movies like Lord of the Rings are 60 fps.

But I can tell a difference between 60 and 50 fps.

But of course most monitors' refresh rates are at something like 75 Hz, so 140 fps looks just like 75. I always set my max FPS to 75, because it would be overworking my card to go anything over that.
 

cleeve

Illustrious
FYI. You're seeing both interlaced fields at the same time... so it's really just 30fps. I think you're confusing fields per second with frames per second.

The NTSC standard is 30 frames per second (I think it's actually 29.97 or something), but if it's INTERLACED then each of those frames is split into two interlaced fields.

So NTSC Television in North America is 60 fields per second, or 30 frames per second, both are accurate.

If it's progressive though, then there are no fields. It'd be simply 30 frames per second.
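
The arithmetic as a tiny sketch (using the commonly quoted 30000/1001 figure for NTSC):

# NTSC: each ~29.97 fps frame is split into two interlaced fields.
frame_rate = 30000.0 / 1001.0                # ~29.97 frames per second
fields_per_frame = 2
field_rate = frame_rate * fields_per_frame   # ~59.94 fields per second

lines_per_field = 262.5
lines_per_frame = lines_per_field * fields_per_frame   # 525 scan lines total

print(f"{frame_rate:.2f} frames/s = {field_rate:.2f} fields/s, "
      f"{lines_per_frame:.0f} lines per frame")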
 

cleeve

Illustrious
That is, of course until Crysis comes out. :)

Heheh, too true. Although it has to be accurate motion blur.

There are lots of games with motion blur (Need for Speed, for instance), but it's just a cheesy effect; it doesn't do much to smooth out the presentation.

If Crysis has accurate motion blur, that'd be nice. But I imagine it's going to take a good deal of processing power... which leaves the conundrum: do I leave motion blur off to get better framerates, or do I enable it and get worse framerates but possibly the appearance of smoother motion?... :p