Refresh rate, 85 or 100Hz?

Guest
Archived from groups: alt.comp.periphs.videocards.nvidia

Which setting do you think would be better? My monitor supports refresh
rates up to 150 Hz.
 
Guest

Depends on the user and ambient light level. Most people find 85 Hz to be
the minimum acceptable refresh rate. Personally I run at 100 because I can
still see flicker at 85 in the dark.

A higher refresh rate has negligible impact on performance, but is more
stressful on the monitor. The old 15" Panasonic on my secondary PC shows
shaky imagery in the corners at 100 Hz, so I have to run at 85.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Richard Dower" <richarddower@hotmail.com> wrote in message
news:dgji4m$j02$1@reader01.news.esat.net...
 
Guest

Richard Dower wrote:

> Which setting do you think would be better? My monitor supports refresh
> rates up to 150 Hz.

In general, you should use the slowest refresh rate that does not cause
eye strain due to flicker. Typically this means anything at or faster
than about 75 Hz. Faster refresh rates just use up video card computing
power to little if any benefit.

If you want to prove this to yourself, try any video test program such
as MadOnion's 3DMark and run the test at 75 Hz and 150 Hz. What happens
to your test scores?
 
Guest

On Sun, 18 Sep 2005 12:13:31 +0100, Richard Dower wrote:

> Which setting do you think would be better? My monitor supports refresh
> rates up to 150 Hz.

100 Hz would be perfect if the picture is clear and not blurry!
 
Guest

85 Hz and 100 Hz seem most common.
I use 100 Hz for most resolutions, but 85 is fine also; when using higher
settings the image can get a bit more blurry.
Remember that a television operates at 60 Hz or 100 Hz in Europe. I believe
50 Hz and 100 Hz in the US?
Anyway, there is no need to run your PC at a higher setting for this, unless
you are a graphics designer with a very big monitor screen; then it could be
useful to have the highest setting possible.
In general, the bigger the screen, the higher the frequency can be set.

"Richard Dower" <richarddower@hotmail.com> schreef in bericht
news:dgji4m$j02$1@reader01.news.esat.net...
 
Guest

The higher you set the refresh rate, the harder your monitor has to work.
Set it to the lowest refresh rate that prevents you from seeing any flashing
of the screen. Usually that's around 75 or 85 Hz. Running it at 100 Hz
will shorten the monitor's life.

--
DaveW
__________

"Richard Dower" <richarddower@hotmail.com> wrote in message
news:dgji4m$j02$1@reader01.news.esat.net...
 
Guest

'Flow' wrote, in part:
| 85 Hz and 100 Hz seem most common.
| I use 100 Hz for most resolutions, but 85 is fine also; when using higher
| settings the image can get a bit more blurry.
| Remember that a television operates at 60 Hz or 100 Hz in Europe.
| I believe 50 Hz and 100 Hz in the US?
_____

Television (PAL or SECAM) in Europe is 25 frames per second (50 fields per
second). Television (NTSC) in the USA is 30 frames per second (60 fields
per second). But television images (PAL, SECAM, or NTSC) are interlaced
signals, and television CRTs are optimized for slower phosphor decay than
computer CRT monitors, so the viewing experience isn't directly comparable.

Some newer, more advanced television monitors digitize the video signal,
buffer it, and display the image non-interlaced at double or quadruple the
original frame rate. This produces a better viewing experience (viewers
accustomed to NTSC find PAL and SECAM to have objectionable flicker at the
normal 25 frames per second / 50 fields per second because of phosphor
decay). Optical film projection is normally at 24 frames per second, but no
flicker is apparent because the illumination of the screen is at a constant
level during each frame, with a very brief black interval between frames as
the film is advanced. Rotating-prism projectors can virtually eliminate the
black interval.

CRT computer monitors are optimized for the highest frame rate the sweep
electronics can accomplish. Increasing the frame rate increases the
required high-frequency response for circuits in the display adapter,
monitor, and connecting cables, even though the resolution of each frame is
not increased. Higher frame rates can therefore result in lower detail on
the screen.
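
To put rough numbers on that, here is a minimal sketch in Python, assuming
a 1280x1024 mode and a typical ~1.4x blanking overhead (both are
assumptions for illustration, not figures from the post):

# Rough estimate of the pixel clock a CRT video mode needs.
# pixel_clock ~ width * height * refresh * blanking_overhead, where the
# overhead factor (~1.3-1.4 for typical VESA timings) accounts for the
# horizontal and vertical retrace intervals.

def pixel_clock_mhz(width, height, refresh_hz, overhead=1.4):
    return width * height * refresh_hz * overhead / 1e6

for hz in (75, 85, 100, 150):
    print(f"1280x1024 @ {hz} Hz: ~{pixel_clock_mhz(1280, 1024, hz):.0f} MHz")

# ~138 MHz at 75 Hz versus ~275 MHz at 150 Hz: doubling the refresh rate
# doubles the bandwidth the card's DAC, the cable, and the monitor's video
# amplifiers must pass cleanly, which is why very high refresh rates can
# soften fine detail.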

Phil Weldon

"Flow" <Flowing@zonnet.nl> wrote in message
news:pQeXe.1421$le5.238@amstwist00...
 
Guest

'DaveW' wrote:
| The higher you set the refresh rate, the harder your monitor has to work.
| Set it to the lowest refresh rate that prevents you from seeing any
| flashing of the screen. Usually that's around 75 or 85 Hz. Running it at
| 100 Hz will shorten the monitor's life.
_____

Based on?

Phil Weldon

"DaveW" <nowhere@dot.org> wrote in message
news:f8GdnZvH2cQLZbDeRVn-tw@comcast.com...
 
Guest

"Robert Gault" <robert.gault@worldnet.att.net> wrote in message
news:XyeXe.55202$qY1.42379@bgtnsc04-news.ops.worldnet.att.net...
> Richard Dower wrote:
>
>> Which setting do you think would be better? My monitor supports
>> refresh rates up to 150 Hz.
>
> In general, you should use the slowest refresh rate that does not cause
> eye strain due to flicker. Typically this means anything at or faster
> than about 75 Hz. Faster refresh rates just use up video card computing
> power to little if any benefit.
>
> If you want to prove this to yourself, try any video test program such as
> MadOnion's 3DMark and run the test at 75 Hz and 150 Hz. What happens to
> your test scores?

I found that 75 was the best for me, my monitor, and my eyes. I once bumped
it up to 100 or something, and afterwards couldn't work out why I was
getting headaches all the time.

Of course, I finally realised what it was, put it back down to 75, and I no
longer got the headaches.
 

Charlie

Straighten me out on this: I have read previously that fps will be enhanced
by higher refresh rates. What is that all about?

eMachines W2686 / 1.5 GB RAM / AMD Athlon 2.6 GHz / GeForce 6600 GT

--

Charlie

"Dragoncarer" <wee@ihaveabrandspankingnew.computer> wrote in message
news:432d8c47@dnews.tpgi.com.au...
>
> "Robert Gault" <robert.gault@worldnet.att.net> wrote in message
> news:XyeXe.55202$qY1.42379@bgtnsc04-news.ops.worldnet.att.net...
>> Richard Dower wrote:
>>
>>> Which setting would you think would be better, my monitor supports
>>> refresh rates upto 150Hz.
>>
>> In general, you should use the slowest refresh rate that does not cause
>> eye strain due to flicker. Typically this means anything at or faster
>> than about 75Hz. Faster refresh rates just use up video card computing
>> power to little if any benefit.
>>
>> If you want to prove this to yourself, try any video test program such as
>> MadOnion and run the test at 75Hz and 150Hz. What happens to your test
>> scores.
>
> I found that 75 was the best for me, my monitor, and my eyes. I once
> bumped it up to 100 or something, and for the next couldn't work out why I
> was getting headaches all the time.
>
> Of course, I finally realised what it was, put it back down to 75 and I no
> longer got the headaches.
>
 
Guest

On Sun, 18 Sep 2005 16:30:34 -0700, "DaveW" <nowhere@dot.org> wrote:

>The higher you set the refresh rate, the harder your monitor has to work.
>Set it to the lowest refresh rate that prevents you from seeing any flashing
>of the screen. Usually that's around 75 or 85 Hz. Running it at 100 Hz
>will shorten the monitor's life.



Very, very true. I have always used 72 Hz, but now with an LCD it's set at
60 Hz, as an LCD is not scanned and does not really need the higher (75 Hz)
rate, from my observations.
 
Guest

That depends on monitor size and the quality of the internal components,
obviously. 75 Hz causes noticeable flicker in dark rooms. How much is a
healthy pair of eyes worth compared to a CRT?

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"DaveW" <nowhere@dot.org> wrote in message
news:f8GdnZvH2cQLZbDeRVn-tw@comcast.com...
 
Guest

Only in apps (games, mostly) where vertical sync is used and enabled. If
vsync is enabled, the game's framerate will top out at the current refresh
rate.
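
Here is a minimal sketch in Python of why that cap appears, assuming simple
double buffering (the per-frame render costs are made-up numbers; real
drivers with triple buffering can behave differently):

import math

# With vsync, a finished frame is held until the next refresh tick, so each
# frame occupies a whole number of refresh intervals.

def fps(render_ms, refresh_hz, vsync):
    if not vsync:
        return 1000.0 / render_ms      # card runs flat out
    interval_ms = 1000.0 / refresh_hz  # e.g. 10 ms per tick at 100 Hz
    ticks = max(1, math.ceil(render_ms / interval_ms))
    return 1000.0 / (ticks * interval_ms)

for render_ms in (4.0, 12.0):          # hypothetical frame costs
    for hz in (85, 100):
        print(f"{render_ms} ms/frame @ {hz} Hz: "
              f"vsync off {fps(render_ms, hz, False):.0f} fps, "
              f"vsync on {fps(render_ms, hz, True):.1f} fps")

# A 4 ms frame (250 fps uncapped) tops out at exactly 85 or 100 fps; a
# 12 ms frame (~83 fps uncapped) drops to 42.5 fps at 85 Hz and 50 fps at
# 100 Hz because it just misses each refresh tick and waits for the next.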

"Charlie" <charlie@play.com> wrote in message
news:11ir6mqp7072272@news.supernews.com...
 
Guest

Thank you for the explanation, Phil. :) I always mix up the PAL and NTSC
frequencies.

"Phil Weldon" <notdiscosed@example.com> schreef in bericht
news:rDjXe.256$q1.101@newsread3.news.atl.earthlink.net...
 
Guest

'Flow' wrote:
| Thank you for the explanation, Phil. :) I always mix up the PAL and NTSC
| frequencies.
_____

There is an easy way to remember: the European standards PAL and SECAM have
a frame rate originally based on the AC electrical frequency there, 50 Hz.
The NTSC standard used in Japan and the US has a frame rate originally
based on the AC frequency used there, 60 Hz. Before the days of color, the
horizontal and vertical frequencies didn't need to be very exact, and the
clock used was just twice the horizontal rate (twice, so that alternate
fields could be offset by half a horizontal line for interlace). With the
advent of modern color broadcast television, the clock is based on the
color subcarrier frequency (~3.58 MHz for NTSC, ~4.43 MHz for PAL and
SECAM), and the horizontal and vertical frequencies are divided down from
that clock.
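
As a worked check of that division for NTSC (the ratios in this Python
snippet are the standard published NTSC numbers, not something from this
thread):

# How NTSC timing divides down from the color subcarrier clock.

f_sc = 315e6 / 88      # color subcarrier, exactly 315/88 MHz (~3.579545 MHz)
f_h  = f_sc * 2 / 455  # 227.5 subcarrier cycles per line -> ~15734.27 Hz
f_v  = f_h / 262.5     # 262.5 lines per field -> ~59.94 Hz field rate

print(f"subcarrier: {f_sc / 1e6:.6f} MHz")
print(f"line rate:  {f_h:.2f} Hz")
print(f"field rate: {f_v:.4f} Hz (frame rate {f_v / 2:.4f} Hz)")

# The half-line counts (227.5 cycles per line, 262.5 lines per field) are
# what give alternate fields their half-line offset for interlace, and the
# slightly-off-60 field rate (59.94 Hz) is the legacy of fitting the color
# subcarrier into the original black-and-white timing.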

Phil Weldon

"Flow" <Flowing@zonnet.nl> wrote in message
news:ZoHXe.26$b66.17@amstwist00...
 
Guest

On Sun, 18 Sep 2005 14:09:59 GMT, Robert Gault
<robert.gault@worldnet.att.net> wrote:


>If you want to prove this to yourself, try any video test program such
>as MadOnion's 3DMark and run the test at 75 Hz and 150 Hz. What happens
>to your test scores?

Nothing, because most modern benchmark tools turn off vsync prior to
running the test. Because of that, your example is worthless.