jlwtech writes:
> ... This was done with video projection equipment, ...
Was this done with DLPs? Even so, it's not remotely the same thing.
> ... Most of the "audience" couldn't tell the difference.
The music industry made the same mistake with the hifi market, settling on 16bit over 24bit
as the default because "most people" couldn't tell the difference. Thing is, the people who could
tell the difference were the audiophiles, classical fans and people with the real money to spend.
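For what it's worth, the back-of-envelope numbers behind that debate: each PCM bit is worth roughly 6dB of theoretical dynamic range, so the gap between the formats is large on paper even if most playback gear masks it. A rough sketch of the arithmetic (ignoring dither and real-world converter noise):

import math

def dynamic_range_db(bits):
    # Theoretical dynamic range of an ideal PCM quantiser: 20*log10(2^bits)
    return 20 * math.log10(2 ** bits)

for bits in (16, 24):
    print(f"{bits}-bit PCM: ~{dynamic_range_db(bits):.0f} dB")

# 16-bit PCM: ~96 dB
# 24-bit PCM: ~144 dB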
> With computer games, while the average framerate may be 30, the actual time between
> frames can vary wildly, so it's a lot more noticeable. ...
Hence the emphasis now on minimum frame rates and broken frames (with analysis tools such as
FCAT), on newer technologies to drive displays in a better way, etc.
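To illustrate the point about averages hiding the problem (just a rough sketch with made-up frame times, not FCAT output): two runs can both report ~30fps while one stutters badly, which is exactly what minimum-fps and frame-time figures are meant to expose.

# Two made-up runs, both roughly 30fps on average, one with bad stutter.
steady = [33.3] * 30                 # ms per frame, perfectly even
spiky  = [20.0] * 25 + [100.0] * 5   # same second of play, with 100ms hitches

def report(name, frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = max(frame_times_ms)
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {worst:.0f} ms")

report("steady", steady)   # avg 30.0 fps, worst frame 33 ms
report("spiky", spiky)     # avg 30.0 fps, worst frame 100 ms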
> If you have played on any console other than the 720 and ps4, it was 30fps. (with a few, rare, exceptions)
Traditionally, consoles could get away with lower frame rates because field-based analogue displays
naturally blended one half-screenful of dots into the next (phosphor persistence). This doesn't work
so well with newer digital displays though (hence the use of overscanning, etc.), and certainly not
with low-latency computer monitors.
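A toy illustration of what the fields are doing (an assumed example, not any particular broadcast standard): each interlaced "frame" arrives as two half-height fields of alternating scanlines, which a digital display has to weave or interpolate back together, whereas a CRT's phosphor persistence did the blending for free.

def weave(odd_field, even_field):
    # Interleave two half-height fields (lists of scanlines) into one full frame.
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

odd_field  = ["scanline 1", "scanline 3", "scanline 5"]
even_field = ["scanline 2", "scanline 4", "scanline 6"]
print(weave(odd_field, even_field))
# ['scanline 1', 'scanline 2', 'scanline 3', 'scanline 4', 'scanline 5', 'scanline 6']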
> One of the reasons I asked my initial question was the fact that movies and tv shows are all displayed at 24fps.
Not exactly. Only traditional cinema film and the original lesser 1080p/24 format are 24Hz, and that's
changing fast (who would buy anything less than 1080p 50Hz today?), with standards up to 4K/100Hz
being explored atm (check out what Douglas Trumbull is up to with his 3D research). 24Hz was a
hangover from the early days of cinema; there was never any rational reason to use it for TV (CRTs
used for radar in WW2 were way better), and certainly not for HD, but of course initial compatibility
meant it had to be that way.
Personally I don't like fast motion in standard cinematic film, far too jerky/blurry, but others are less
bothered by it. Ironic that the recent higher-refresh Hobbit movies left some viewers feeling less than
impressed - people have gotten so used to the 24Hz flicker that it's become part of the cinematic
norm for their visual experience, at least for older viewers anyway. Unsurprisingly, younger viewers
prefer the newer format, though perhaps part of the problem is that with a higher refresh and greater
resolution, the level of detail in set design, makeup, CGI, etc. has to be much better, otherwise the
result doesn't look right. Reminds me of comments from some TV presenters who don't like HD,
because the greater detail suddenly means audiences can see their spotty/wrinkled faces in full
hideous detail.
There are various TV standards, but the most common formats are 30Hz (60 fields/s) NTSC and 25Hz
(50 fields/s) PAL; the latter has a 40% higher resolution, but factor in the lower refresh and the overall
difference works out to about 16% in PAL's favour, if one can use blunt math in such a manner. PAL
does have better colour though. There's also SECAM, which used to be used in France, though I'm
not sure if that's still the case.
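Spelling out that blunt math (taking the 40% resolution figure above as a given rather than deriving it):

pal_resolution_advantage = 1.40   # PAL vs NTSC, the figure quoted above
refresh_ratio = 25 / 30           # PAL 25Hz vs NTSC 30Hz
overall = pal_resolution_advantage * refresh_ratio
print(f"Overall PAL advantage: ~{(overall - 1) * 100:.1f}%")
# ~16.7%, i.e. roughly the 16% figure above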
> Most people don't believe that the first time they hear it (especially gamers and enthusiasts)...
It helps if one reads the Ladybird book on TVs at age 6, and then ends up in a career messing with
VR tech, CAVEs, film, etc. My dissertation was on stereo vision and the side effects of gaming, albeit
a narrow study due to limited time.
So much of this is a lot more complicated than typical hobbyists like to imagine - read the refs below
and you'll see what I mean.
> I have not yet experienced gaming over 60fps, but I have seen some video comparisons where one
> is 24fps and the other is 72fps, and I didn't really notice a difference.
Assuming you had a display capable of showing the difference, all this really means is that you may
not be able to discern the two, but most people can, and as with the hifi market, it tends to be that
gaming fanatics who want the best quality are also the people with serious money to spend.
Alas the problem with TV standards, video, etc. is the plethora of legacy issues which have shaped
how the tech has evolved, never affording the possibility of a clean break from the messy early
analogue days and all the compromises they endured. This is true even of the latest 2K/4K/etc.
formats, which are a bit of a pickle for similar reasons (there's no single standard; I can post a
summary if you like).
If you want to get a headache with some proper background reading on all this, see:
http://www.lurkertech.com/lg/
i.e.:
http://www.lurkertech.com/lg/video-systems/
http://www.lurkertech.com/lg/fields/
http://www.lurkertech.com/lg/pixelaspect/
Sometimes I miss my old CRT monitor for gaming, which supported 2048x1536 @ 96Hz, but there's
no doubt that modern IPS panels like the HP/Dells I have now give much better pixel precision,
colour accuracy, etc. At 96Hz, the CRT was struggling with just a VGA signal feed, but Oblivion
& Stalker did look good.
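Rough arithmetic for why that mode strained an analogue VGA feed (the ~30% blanking overhead is an assumed typical CRT figure, not that monitor's actual timings):

width, height, refresh_hz = 2048, 1536, 96
blanking_overhead = 1.30   # assumed allowance for horizontal/vertical blanking
pixel_clock_mhz = width * height * refresh_hz * blanking_overhead / 1e6
print(f"Approx. pixel clock: {pixel_clock_mhz:.0f} MHz")
# ~393 MHz - right at the limit of the ~400 MHz RAMDACs of that era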
Ian.