Archived from groups: alt.comp.periphs.mainboard.asus
"Kyle" <me@privacy.net> wrote in message
news:2mv31cFr2pnjU1@uni-berlin.de...
>
>
> "Jens Nixdorf" <spamtrap@trackpoint.de> wrote in message
> news:ced4rq$mtg$03$1@news.t-online.com...
> > Bubba Sort wrote:
> >
> > > Well, I was just listening to a couple of programmers discuss this
> > > yesterday. One of them said that the human eye can only see around
> > > 30-40 fps; anything over this is overkill. So I don't know the exact
> > > answer to your first question, but it should be within this range.
> >
> > There is a difference between the illusion of motion, which you get at
> > roughly 25 fps and above (as in TV, video and so on), and how quickly
> > you can see something for the first time. So some of the following will
> > be a bit theoretical for most people (me included), but gamers still
> > believe in this:
> >
> > In a 3D shooter: if a new enemy appears at 0.49 sec and you are running
> > 100 fps, you see that enemy on the very next frame, at about 0.49-0.50
> > sec. At 25 fps the next frame is not drawn until 0.52 sec, so you see
> > the enemy roughly 30 ms later. Depending on the enemy's own reaction
> > time, you are dead at 25 fps or alive at 100 fps 😉
> >
> > As a conclusion for the original poster: the point is not just seeing a
> > "smooth" game, but having enough time to react. So I can give only one
> > answer to gamers: higher fps is always better.
> >
> > regards, Jens
>
> I'm not certain what your proposed timing numbers are intended to
> exemplify, but a 0.03 second difference is interesting. Most online game
> servers run a game update rate of 20-25 updates per second (in terms of
> time, a 50 ms or 40 ms update interval), so once your vid card equals or
> surpasses that rate, little is gained by a higher fps. An analysis of
> reaction time versus fps should also consider the timing of the data
> received from the game server. This example assumes two players receive
> data from the server with the same ping/delay. Using a 30 fps vid card,
> the first computer requires about 33 ms to render a new image based on
> newly received game data. At 50 fps, the second computer requires 20 ms
> to render the image. Any advantage enjoyed by the player using the faster
> vid card is in that 13 ms time difference. In essence, there will be a
> 13 ms advantage on every rendered frame, but never any more. I would
> estimate that any delay of less than 40 ms has a negligible impact on a
> player, given typical human reaction times.
>
> More important is the ability of a vid card to deliver at least 30 fps
> under the most demanding conditions, such as when there are a large
> number of moving or animated actors on screen, a large number of sound
> effects playing simultaneously, and a large quantity of data flowing
> over the network connection. These factors, along with the response time
> to the game server, are the most critical.
>
> I offer these comments in view of my experience moving up to faster vid
> cards, yet seeing no improvement in my ability to react once my vid card
> exceeded a minimum of 30 fps, since the on-screen animations are still
> time-locked to the rate at which data from the game server arrives at my
> computer.
> --
> Best regards,
> Kyle
>
>
>
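Putting numbers on Jens' example above: here is a quick sketch (Python,
purely illustrative) that assumes an idealized display drawing frames at
exact 1/fps intervals starting from t = 0, and shows when an enemy that
appears at 0.49 sec is first drawn at 25 fps versus 100 fps.

import math

def first_visible(event_time_s, fps):
    # Idealized model: frames are drawn every 1/fps seconds starting at
    # t = 0, and an event shows up on the next frame at or after it.
    frame = 1.0 / fps
    return math.ceil(event_time_s / frame) * frame

for fps in (25, 100):
    print(f"{fps:3d} fps: enemy appearing at 0.49 s is first drawn "
          f"at {first_visible(0.49, fps):.2f} s")
# 25 fps: enemy appearing at 0.49 s is first drawn at 0.52 s
# 100 fps: enemy appearing at 0.49 s is first drawn at 0.49 s
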
Once you do in fact visually see an object on screen, you still have to
take into account the time it takes to mentally perceive whether or not
that object is a threat, and then to press a button on a keyboard, mouse,
joystick, controller, etc., to respond appropriately...
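
And to that point about perceiving the threat and pressing a button, here
is a second rough sketch that tallies the whole chain, using Kyle's 50 ms
server tick and the 30 vs 50 fps render times from his post, plus ballpark
figures for perception (~200 ms) and the motor response (~100 ms) that are
my own assumptions rather than measurements.

# Rough end-to-end delay from "new data arrives" to "button pressed".
# The perception and motor numbers are assumptions for illustration only.
server_tick_ms = 50      # 20 updates/sec server, per Kyle's post
perceive_ms    = 200     # assumed time to recognize the object as a threat
press_ms       = 100     # assumed time to physically press the button

for fps in (30, 50):
    render_ms = 1000 / fps   # time to draw the next frame at this fps
    total = server_tick_ms + render_ms + perceive_ms + press_ms
    print(f"{fps} fps: render {render_ms:.0f} ms, total ~{total:.0f} ms")
# 30 fps: render 33 ms, total ~383 ms
# 50 fps: render 20 ms, total ~370 ms

The ~13 ms a faster card saves is a small slice of that chain, which is
consistent with Kyle's point that anything under about 40 ms is lost in
the noise of human reaction time.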