flat panel monitors

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> Sorry, you missed the important part of that: can you maintain 24-bit
> accuracy in an analog system which fills a 200 MHz bandwidth,
> in any current practical example?

I'm not sure what you mean by "24-bit accuracy." How many bits per
second?

You can always maintain at least the accuracy of the equivalent digital
system.

If you can push 200 Mbps through a digital channel, you can also get at
least 200 Mbps through the same channel with analog encoding (and
typically more). However, the analog equipment may cost more.

> But so far, this is merely an assertion on your part; you have
> not given any real theoretical or practical reason why this
> should be so.

Information theory proves it.

> Again, practical examples exist of "digital"
> systems which come very, very close to the Shannon limit
> for their assumed channels. So what's this basic, inherent
> limit that you seem to be assuming for "digital" transmissions?

The basic limit is the fact that you declare anything below a certain
level to be noise. You thus sacrifice any actual signal below that
level, and in doing so you also sacrifice part of your bandwidth. You
don't make this arbitrary distinction in an analog system, so your
bandwidth is limited only by the _actual_ noise in the channel.

> Not only used them, I've designed around them. There
> is most definitely an upper limit on the resolution of ANY
> CRT; it's just more obvious in the case of the standard
> tricolor type.

What limits resolution in a monochrome CRT? Scanning electron
microscopes prove that electron beams can be pretty finely focused.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Mxsmanic wrote:

> Bob Myers writes:
>
>> Sorry, you missed the important part of that: can you maintain 24-bit
>> accuracy in an analog system which fills a 200 MHz bandwidth,
>> in any current practical example?
>
> I'm not sure what you mean by "24-bit accuracy." How many bits per
> second?
>
> You can always maintain at least the accuracy of the equivalent digital
> system.
>
> If you can push 200 Mbps through a digital channel, you can also get at
> least 200 Mbps through the same channel with analog encoding (and
> typically more). However, the analog equipment may cost more.
>
>> But so far, this is merely an assertion on your part; you have
>> not given any real theoretical or practical reason why this
>> should be so.
>
> Information theory proves it.
>
>> Again, practical examples exist of "digital"
>> systems which come very, very close to the Shannon limit
>> for their assumed channels. So what's this basic, inherent
>> limit that you seem to be assuming for "digital" transmissions?
>
> The basic limit is the fact that you declare anything below a certain
> level to be noise. You thus sacrifice any actual signal below that
> level, and in doing so you also sacrifice part of your bandwidth. You
> don't make this arbitrary distinction in an analog system, so your
> bandwidth is limited only by the _actual_ noise in the channel.

This is one of the silliest arguments I have ever seen. Noise is not
signal. If your signal is dropping significantly below the noise threshold
then you've got a problem.

I think you have the properties of analog _measurement_ systems confused
with the properties of analog _communication_ systems.

>> Not only used them, I've designed around them. There
>> is most definitely an upper limit on the resolution of ANY
>> CRT; it's just more obvious in the case of the standard
>> tricolor type.
>
> What limits resolution in a monochrome CRT?

Generally the cost, but beyond that the grain size of the phosphor.

> Scanning electron
> microscopes prove that electron beams can be pretty finely focused.

At low energy levels in a cavity between multiple magnets.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

J. Clarke writes:

> And of course you are a better marketing guy than the
> people at Sony.

Time will tell.

> So why ain't you running a 50 billion dollar corporation if
> you're such an expert?

I haven't been hired for such a position, and I don't own such a
corporation myself. Then again, I have no ambition to do this type of
work, either.

> That's odd, it was my understanding that this year LCDs outsold CRTs for the
> first time. Or maybe that was last year.

Neither, as I recall.

> Television sets are not computer monitors.

Same technology.

> And there are very few CRT televisions that can display full HD.

So?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Mxsmanic wrote:

> J. Clarke writes:
>
>> And of course you are a better marketing guy than the
>> people at Sony.
>
> Time will tell.
>
>> So why ain't you running a 50 billion dollar corporation if
>> you're such an expert?
>
> I haven't been hired for such a position,

Why not, if you're so smart? What are you doing that pays better?

> and I don't own such a
> corporation myself.

Why not, o marketing genius? You being so much smarter than the people at
Sony and all, it should be easy for you to build a business to that size.

> Then again, I have no ambition to do this type of
> work, either.

So what work do you have ambition to do? If it's anything in electronics,
don't quit your day job.

>> That's odd, it was my understanding that this year LCDs outsold CRTs for
>> the
>> first time. Or maybe that was last year.
>
> Neither, as I recall.

Well, I wouldn't expect you to get that right. Try googling "LCD CRT Sales"
and I think you'll find that your recollection is considerably in error.

>> Television sets are not computer monitors.
>
> Same technology.

So? Bicycle tires and car tires are "the same technology" but the fact that
steel-belted radials are making few inroads into the bicycle market does
not mean that they aren't selling well in the car market. The two markets
are different.

>> And there are very few CRT televisions that can display full HD.
>
> So?

So they're clearly obsolescent if they can't display the current broadcast
standard.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

J. Clarke writes:

> This is one of the silliest arguments I have ever seen.

In other words, you disagree.

> Noise is not signal.

Yes. But the definition of both is arbitrary.

> If your signal is dropping significantly below the noise threshold
> then you've got a problem.

Not if there is no noise obscuring it.

> I think you have the properties of analog _measurement_ systems confused
> with the properties of analog _communication_ systems.

Same thing.

> Generally the cost, but beyond that the grain size of the phosphor.

And how does the grain size compare to LCD pixel sizes?

> At low energy levels in a cavity between multiple magnets.

Not unlike a vacuum tube.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Mxsmanic wrote:

> J. Clarke writes:
>
>> This is one of the silliest arguments I have ever seen.
>
> In other words, you disagree.
>
>> Noise is not signal.
>
> Yes. But the definition of both is arbitrary.

"Signal" is whatever information you are trying to transfer. "Noise" is
whatever is on the line that you did not intentionally put there. There is
nothing "arbitrary" about it.

>> If your signal is dropping significantly below the noise threshold
>> then you've got a problem.
>
> Not if there is no noise obscuring it.

If there is "no noise obscuring it" then it is not dropping below the
noise threshold.

>> I think you have the properties of analog _measurement_ systems confused
>> with the properties of analog _communication_ systems.
>
> Same thing.

No, not the same thing. In a measurement system you don't control the
signal that you're trying to measure; in a communication system you do.

>> Generally the cost, but beyond that the grain size of the phosphor.
>
> And how does the grain size compare to LCD pixel sizes?

What difference does it make? You're claiming that there is no limit.
>
>> At low energy levels in a cavity between multiple magnets.
>
> Not unlike a vacuum tube.

You don't have a clue how a vacuum tube, a CRT display, or an electron
microscope work, do you? First, most vacuum tubes have no magnets of any
kind associated with them. Second, running an electron microscope with the
same beam energy as a CRT would destroy just about any sample you put in it
in short order. Third, the type of CRT used in computer monitors and
televisions has one set of magnets, not the several sets that are used in
focusing an electron microscope, and the cavity in the microscope is also
much smaller than the face of a CRT.

But you don't care about the facts, do you? Since you claim to be manic you
might look into a mood stabilizer.


--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:rc82s0pghlqob6r95lq2ibmvp3if9aqehl@4ax.com...
> Pixels (or subpixels) on a flat panel have a constant luminosity
> throughout their dimensions. On a CRT, the luminosity will vary
> depending on the colors of the adjacent pixels and the bandwidth of the
> monitor. This can make aliasing less obvious on a CRT. I'm not saying
> that this is good or bad, just that it happens.

Exactly. That's part of that change in overall "look" I
was talking about. But it's not the same thing as
actually being able to resolve the "pixels" any finer than
the limit imposed by the mask/dot pitch. One thing you
clearly CAN get with CRTs is "softer" edges on things,
but on the other hand you're also having to deal with
luminance and color errors in single-pixel details (when
the size of the "pixel" within the image is getting down
to the same size as the dot triad or mask pitch).

> It's interesting that ClearType makes text on an LCD look much better in
> most cases, even though it makes the pixels "messier" with its low-level
> antialiasing. I haven't tried ClearType on a CRT, so I don't know what
> that does (I doubt that it works very well).

Here again is a case where "works well" is pretty much a
matter of personal preference; I don't care much for ClearType
on a CRT, but that's just because I'm overly sensitive to that
"messier" aspect you mentioned. (I can never use myself as
a test subject for display evaluations, since after all this time
I simply do NOT look at these things the same way the average
user will. Display engineers are the last people you want
subjectively evaluating displays. 🙂)

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:lv82s09777c9e027bsoo5gso5ijflj4tvi@4ax.com...
> Bob Myers writes:
>
> > Sorry, you missed the important part of that: can you maintain 24-bit
> > accuracy in an analog system which fills a 200 MHz bandwidth,
> > in any current practical example?
>
> I'm not sure what you mean by "24-bit accuracy." How many bits per
> second?

"24-bit accuracy" is not dependent on the data rate. It simply
means that your system can accurately produce, communicate,
and interpret levels, repeatedly and without error, to within
half of the implied LSB value - in this case, whatever the peak
expected signal would be, divided by (2^24-1). For instance,
in a typical analog video system (with 0.7 Vp-p signal swings),
"24-bit accuracy" would mean that you are confident you can
determine the signal amplitude to within about 21 nV - and
yes, that's NANOvolts. But this is simply not possible in any
real-world video system, since the noise in any such system
over the specified bandwidth is significantly higher than this
value. (The thermal noise across a 75-ohm termination
resistor at room temperature alone, over that bandwidth, is on
the order of 15 microvolts RMS.)
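
For anyone who wants to check the arithmetic, here is a quick
back-of-the-envelope sketch (Python; the 0.7 V swing, 75-ohm load,
290 K and 200 MHz figures are the assumptions stated above):

    import math

    k = 1.380649e-23        # Boltzmann constant, J/K
    T = 290.0               # room temperature, K
    R = 75.0                # termination resistance, ohms
    B = 200e6               # bandwidth, Hz

    lsb = 0.7 / (2**24 - 1)            # one LSB of a 0.7 V range at 24 bits
    half_lsb = lsb / 2                 # accuracy target: half an LSB
    v_noise = math.sqrt(4 * k * T * R * B)   # Johnson noise, volts RMS

    print(f"LSB      = {lsb * 1e9:.1f} nV")       # ~41.7 nV
    print(f"half LSB = {half_lsb * 1e9:.1f} nV")  # ~20.9 nV
    print(f"noise    = {v_noise * 1e6:.1f} uV")   # ~15.5 uV RMS
    print(f"noise is ~{v_noise / half_lsb:.0f}x the half-LSB target")

The thermal noise floor alone is several hundred times larger than the
level spacing a true 24-bit system would have to resolve.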

> You can always maintain at least the accuracy of the equivalent digital
> system.

Sure; but that's just it - you can always build an EQUIVALENT
digital system. You can't do better than the noise limit in
either case, and the noise limit sets the bound on accuracy -
and so information capacity - no matter how you encode the
information, whether it's in "analog" or "digital" form.

> If you can push 200 Mbps through a digital channel, you can also get at
> least 200 Mbps through the same channel with analog encoding (and
> typically more). However, the analog equipment may cost more.

Sorry - not "typically more". You're still comparing
specific examples of "analog" and "digital"; "digital" does NOT
imply that straight binary coding, with the bits transmitted in
serial fashion on each physical channel, is your only option.
For instance, a standard U.S. TV channel is 6 MHz wide -
and yet, under the U.S. digital broadcast standard, digital
TV transmissions typically operate at an average data rate
of slightly below 20 Mbps. How do you think that happens?
(You should also not assume that straight binary, serial
transmission is all we will ever see in display interfaces; there
are published standards which employ more efficient coding
methods.)
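
As a rough sketch of how those numbers fit together (the 8-VSB figures
below are approximate, quoted from memory of the ATSC standard):

    channel_bw = 6e6          # Hz, one U.S. broadcast channel
    payload    = 19.39e6      # bps, approximate ATSC net payload
    sym_rate   = 10.76e6      # 8-VSB symbols per second, approximate
    bits_sym   = 3            # 8 levels -> 3 raw bits per symbol

    print(sym_rate * bits_sym / 1e6, "Mbps raw, before FEC overhead")
    print(payload / channel_bw, "bits/s per Hz of channel")   # ~3.2

More than three bits per second per hertz of channel - impossible with
one serial bit per symbol, routine with multi-level coding.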

>
> > But so far, this is merely an assertion on your part; you have
> > not given any real theoretical or practical reason why this
> > should be so.
>
> Information theory proves it.

Information theory proves exactly the opposite; it shows
that the maximum capacity of a given channel is fixed, and
that that limit is exactly the same for all possible general
forms of coding. Specific types within those general forms
may be better or worse than others, but the maximum limit
remains unchanged.
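
The limit in question is the Shannon-Hartley capacity,
C = B * log2(1 + S/N). A small sketch, using an arbitrary 200 MHz
channel and a few assumed SNRs, just to show that the ceiling is set
by bandwidth and noise and by nothing else:

    import math

    def capacity_bps(bandwidth_hz, snr_db):
        snr = 10 ** (snr_db / 10.0)
        return bandwidth_hz * math.log2(1 + snr)

    for snr_db in (20, 40, 60):
        c = capacity_bps(200e6, snr_db)
        print(f"SNR {snr_db} dB -> {c / 1e9:.2f} Gbps")

    # The figure rises with SNR but stays finite for any real channel,
    # and it is the same ceiling whether the coding is "analog" or
    # "digital".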

> The basic limit is the fact that you declare anything below a certain
> level to be noise.

Which is equally true in analog systems. No analog system
can provide "infinite" accuracy, or anything remotely approaching
it, and for the same fundamental reasons that limit digital. You
are also here again assuming that a digital system cannot be made
noise-adaptive, which is incorrect even in current practice.
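
A hedged sketch of what "noise-adaptive" can mean in practice: pick
the densest symbol constellation the measured SNR will support, the
way ADSL bit-loading or modem rate negotiation does. The 6 dB margin
here is purely illustrative, not taken from any particular standard:

    import math

    def bits_per_symbol(snr_db, margin_db=6.0):
        usable = max(snr_db - margin_db, 0.0)
        return int(math.log2(1 + 10 ** (usable / 10.0)))

    for snr_db in (10, 20, 30, 40):
        print(f"SNR {snr_db} dB -> {bits_per_symbol(snr_db)} bits/symbol")

    # A cleaner channel automatically carries more bits per symbol;
    # the system is not stuck at one fixed threshold.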


> What limits resolution in a monochrome CRT? Scanning electron
> microscopes prove that electron beams can be pretty finely focused.

Yes, but there the beam does not have to pack enough punch
to light up phosphor. There is an inescapable tradeoff between
beam current and spot size, and also a point below which practical
phosphors simply cannot be driven to usable levels. This, along
with the unavoidable degradation in spot size and shape which
results from the deflection system (another little detail that SEMs
don't worry about in nearly the same way) results in some very
well-defined limits on any practical CRT.


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:l053s0l4f8vitjr5km4t7fbkre922h6fm7@4ax.com...

> > Noise is not signal.
>
> Yes. But the definition of both is arbitrary.

Not at all; "signal" is the information we desire to
communicate; "noise" is EVERYTHING else that gets
in the way of doing that. That's hardly arbitrary, given
that the goal of any communications system is to transfer
information.

> > If your signal is dropping significantly below the noise threshold
> > then you've got a problem.
>
> Not if there is no noise obscuring it.

Ummm...how does your signal drop below the noise
threshold, and still have "no noise obscuring it"?


> > Generally the cost, but beyond that the grain size of the phosphor.
>
> And how does the grain size compare to LCD pixel sizes?

Not relevant; the claim was that there was no inherent limit to
the resolution of a monochrome CRT, not how that technology
compared to others. Phosphor grain size DOES set one such
limit, but other limits have already come into play well before
that point is reached.

Also, please note that practical LCD devices have been
constructed with pixel sizes well below 10 microns. You HAVE
heard of LCOS, right? (And yes, those ARE usable as direct-
view displays.) It would be POSSIBLE to produce pixels of
that size on a larger display, even on a glass substrate (the
best polysilicon-on-glass process can now work with 0.8 micron
design rules) - there's just no practical reason for doing so.


> > At low energy levels in a cavity between multiple magnets.
>
> Not unlike a vacuum tube.

Very much unlike the vacuum tube in question in any
practical form.

FWIW, the best CRT display I saw, in terms of resolution, got
ALMOST to 300 dpi, or a "pixel" size of about 0.085 mm.
It may still be in limited production, but it was used only in an
extremely expensive and very limited-volume monitor. This
level of performance HAS been duplicated within the past 5
years in a fairly large-screen monochrome LCD monitor, again
aimed at a very limited high-end market.
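
The conversion behind those figures is just 25.4 mm per inch divided
by the pitch; a quick check (Python, purely illustrative):

    MM_PER_INCH = 25.4

    def mm_from_dpi(dpi):
        return MM_PER_INCH / dpi

    def dpi_from_mm(pitch):
        return MM_PER_INCH / pitch

    print(mm_from_dpi(300))     # ~0.085 mm, the CRT mentioned above
    print(dpi_from_mm(0.25))    # ~102 dpi for a typical 0.25 mm pitch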

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Bob Myers <nospamplease@address.invalid> wrote:

> FWIW, the best CRT display I saw, in terms of
> resolution, got ALMOST to 300 dpi, or a "pixel"
> size of about 0.085 mm.

Laser printing intro'd around 180 dpi. Inkjet printing
intro'd at 75 dpi. Both grew quickly, and are now at
1200 dpi or higher.

A 300 dpi monitor of any typical size would of course
impose a huge raster requirement, only recently economical,
and a substantial rendering compute load for games,
CAD, etc., something not necessarily desired even yet.
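
To put rough numbers on that raster requirement (assuming, purely for
illustration, a 20-inch-viewable 4:3 screen, 24 bits per pixel and a
60 Hz refresh):

    dpi = 300
    width_in, height_in = 16, 12                  # 20" viewable 4:3, roughly
    h_px, v_px = dpi * width_in, dpi * height_in  # 4800 x 3600
    pixels = h_px * v_px                          # ~17.3 Mpixels

    print(pixels / 1e6, "Mpixels")
    print(pixels * 3 / 2**20, "MB framebuffer at 24 bpp")   # ~49 MB
    print(pixels * 60 / 1e9, "Gpixels/s at 60 Hz")          # ~1.0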

But the limitations on monitor res are set by Microsoft.

In the early days of Windows, Mr.Bill or one of his
minions made an assumption that displays would always
be in the 60-100 dpi range, and locked their icons and
early screen raster font pixel sizes to that assumption.

XP may finally have icon scaling, but the installed
base of pre-XP Windows does not, nor do many existing
and legacy apps.

If you go much above 100 dpi, Windows is nearly or
completely unusable. I had a .22mm dp CRT (115 dpi).
A true 1600x1200 on such a 21" (20" viewable) CRT is just beyond
the limit for what's practical to use with Windows.

Sony demo'd a .15mm dp (169 dpi) CRT at COMDEX a few years
ago, but never released it, probably due to this. IBM has
a 200 dpi LCD (the T221), but doesn't promote it to the general
market, partly on account of the Windows.ico problem. (It also
requires dual-link DVI.)

Display technology is not the reason why we don't today
have common higher-res displays.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:9i82s052m7gu864ll137sq68h1qgc93364@4ax.com...
> What recent changes have been made to Trinitrons?

Sony continued to patent changes to the gun design,
manufacturing processes, etc., right up to the day that
monitor-type Trinitrons left production, and continues to do
so in the case of their TV tubes. Being able to build something
based on the technology described in the expired original
patents would let you duplicate Trinitrons the way they were
in the 1970s, which would mean an expensive and, by
current standards, lower-performance tube. No one is
going to invest in the equipment needed to produce these
things (which is different than a standard CRT process) just
to make 1970s tubes, when they can already make flat-screen
conventional-mask designs more cheaply and with better
performance.


> I was considering the Diamontron for a time, but I understand that,
> although it's an aperture grille like a Trinitron, it apparently just
> doesn't have the quality of a true Trinitron. I guess that's all
> becoming increasingly academic now.

Absolutely.

> > Because of the high overhead necessary to continue to make
> > the basic technology (in this case, the CRT) in light of a rapidly
> > diminishing market. You don't think Sony and NEC have stopped
> > making these things just because they were tired of playing
> > in the professional market, do you?
>
> I think they stopped making them out of misguided business decisions.

Time will tell. Having discussed this matter with both
companies, I personally am convinced that their decision was
made on a very sound basis and at (for them) the correct time,
although it certainly has caused some problems for those who
might still wish to purchase these products.

>
> > CRT plants ARE fairly expensive things to keep running, and it
> > simply is not feasible to run them for production runs that are
> > going to be measured in the tens of thousands per year, tops.
>
> The vast majority of monitors being sold today are still CRTs. This is
> even more true for television sets.

In the case of monitors, you are incorrect. LCDs took over
the #1 spot in the monitor market by unit volume this past
year. (They had already been #1 in terms of revenue for some
time, of course.) The TV market remains a good 85% or better
CRT, but with that share expected to decline over this decade
and into the next. The rate of decline may accelerate, depending
on what happens with the pricing of other technologies; it will
not be reversed.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:pt43s09v108jqfoddbubufhl5anppmd7jo@4ax.com...
> > That's odd, it was my understanding that this year LCDs outsold CRTs for the
> > first time. Or maybe that was last year.
>
> Neither, as I recall.

This year. Actually, several quarters ago. It was last
year if only certain markets (U.S., Europe, Japan) are
considered. This year, the switch occurred in the worldwide
numbers.


> > Television sets are not computer monitors.
>
> Same technology.

And the Saturn V is the same basic technology as my
kid's little Estes model rocket - stuff shoots out THIS end,
and it moves in the direction the OTHER end is pointing. 🙂
TV and monitor CRTs are VERY different beasts. (One
significant difference at this point is that the former remains
in production in a lot of places that have already given up
on the latter!)


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi33110bft8z8r@news.individual.net...
> > Bob Myers <nospamplease@address.invalid> wrote:
>
> > FWIW, the best CRT display I saw, in terms of
> > resolution, got ALMOST to 300 dpi, or a "pixel"
> > size of about 0.085 mm.
>
> Laser printing intro'd around 180 dpi. Inkjet printing
> intro'd at 75 dpi. Both grew quickly, and are now at
> 1200 dpi or higher.

OK, but I'm not sure how that's relevant here. The
above represents the best monochrome CRT I have
ever seen or heard about - and there is also basically
zero development underway right now aimed at pushing
these devices beyond that point.

> A 300 dpi monitor of any typical size would of course
> impose a huge raster requirement, only recently economical,
> and a substantial rendering compute load for games,
> CAD, etc., something not necessarily desired even yet.

Well, again, this was a monochrome product, with a very
limited gray scale ("bit depth") capability, and aimed at
an extremely high-end niche market. It was shipped only
with a custom graphics system which, as I recall, produced
something like 4k x 3k at perhaps 4-6 bits/pixel. The
computational/rendering load was not all that significant
here, as it was aimed at such things as document review,
radiology, etc., where you're simply dealing with static images
scanned in from another source.

>
> But the limitations on monitor res are set by Microsoft.
>

No; Bill Gates may sometimes be accused of thinking he's
God, but I seriously doubt that even he would claim
responsibility for the laws of physics. Microsoft is also not
the only game in town, and especially not in the markets
where such displays are of interest.


> Sony demo'd a .15mm dp (169 dpi) CRT at COMDEX a few years
> ago, but never released it, probably due to this. IBM has
> a 200 dpi LCD (the T221), but doesn't promote to the general
> market, partly on account of the Windows.ico problem. (It also
> requires dual-link DVI.)

More precisely, QUAD-link DVI; monitors using that
9.2 Mpixel panel require two dual-link DVI outputs from
the graphics card. And at that, the delivered frame rate
at the monitor input is somewhat under 50 Hz. (The panel
itself doesn't necessarily run at that rate, due to some other
internal bizarreness.)
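
The arithmetic behind that, assuming the usual ~165 Mpixel/s ceiling
per single DVI link and something like 15% blanking overhead (both
figures approximate):

    h, v, refresh = 3840, 2400, 48     # T221-class panel, ~48 Hz delivered
    link_limit = 165e6                 # pixels/s per single DVI link
    blanking = 1.15                    # rough timing overhead

    rate = h * v * refresh * blanking
    print(rate / 1e6, "Mpixels/s")              # ~509
    print(rate / link_limit, "single links")    # ~3.1 -> two dual links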

> Display technology is not the reason why we don't today
> have common higher-res displays.

Well, perhaps it's not the ONLY reason. There ARE
some markets that would very much like to have a full-color,
video-rate 300 dpi display, if only such things (and the
hardware to drive them) were available for a price less
than the contents of Ft. Knox, despite the limitations Windows
might also place on things.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers wrote:

>
> "Bob Niland" <email4rjn@yahoo.com> wrote in message
> news:opsi33110bft8z8r@news.individual.net...
>> > Bob Myers <nospamplease@address.invalid> wrote:
>>
>> > FWIW, the best CRT display I saw, in terms of
>> > resolution, got ALMOST to 300 dpi, or a "pixel"
>> > size of about 0.085 mm.
>>
>> Laser printing intro'd around 180 dpi. Inkjet printing
>> intro'd at 75 dpi. Both grew quickly, and are now at
>> 1200 dpi or higher.
>
> OK, but I'm not sure how that's relevant here. The
> above represents the best monochrome CRT I have
> ever seen or heard about - and there is also basically
> zero development underway right now aimed at pushing
> these devices beyond that point.
>
>> A 300 dpi monitor of any typical size would of course
>> impose a huge raster requirement, only recently economical,
>> and a substantial rendering compute load for games,
>> CAD, etc., something not necessarily desired even yet.
>
> Well, again, this was a monochrome product, with a very
> limited gray scale ("bit depth") capability, and aimed at
> an extremely high-end niche market. It was shipped only
> with a custom graphics system which, as I recall, produced
> something like 4k x 3k at perhaps 4-6 bits/pixel. The
> computational/rendering load was not all that significant
> here, as it was aimed at such things as document review,
> radiology, etc., where you're simply dealing with static images
> scanned in from another source.
>
>>
>> But the limitations on monitor res are set by Microsoft.
>>
>
> No; Bill Gates may sometimes be accused of thinking he's
> God, but I seriously doubt that even he would claim
> responsibility for the laws of physics. Microsoft is also not
> the only game in town, and especially not in the markets
> where such displays are of interest.
>
>
>> Sony demo'd a .15mm dp (169 dpi) CRT at COMDEX a few years
>> ago, but never released it, probably due to this. IBM has
>> a 200 dpi LCD (the T221), but doesn't promote to the general
>> market, partly on account of the Windows.ico problem. (It also
>> requires dual-link DVI.)
>
> More precisely, QUAD-link DVI; monitors using that
> 9.2 Mpixel panel require two dual-link DVI outputs from
> the graphics card. And at that, the delivered frame rate
> at the monitor input is somewhat under 50 Hz. (The panel
> itself doesn't necessarily run at that rate, due to some other
> internal bizarreness.)
>
>> Display technology is not the reason why we don't today
>> have common higher-res displays.
>
> Well, perhaps it's not the ONLY reason. There ARE
> some markets that would very much like to have a full-color,
> video-rate 300 dpi display, if only such things (and the
> hardware to drive them) were available for a price less
> than the contents of Ft. Knox, despite the limitations Windows
> might also place on things.

IBM and Viewsonic both used to sell monitors using that panel to end-users.
I notice that Viewsonic no longer lists it on their Web site and IBM only
sells it bundled with Intellistations and suitable video boards now.

Still, I would think that the $8000 price tag was more of an obstacle to its
widespread acceptance than any concern over the manner in which it displays
Windows icons.
>
> Bob M.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

> Bob Myers <nospamplease@address.invalid> wrote:

>> Laser printing intro'd around 180 dpi. Inkjet printing
>> intro'd at 75 dpi. Both grew quickly, and are now at
>> 1200 dpi or higher.

> OK, but I'm not sure how that's relevant here.

Only in that the topic of higher display resolutions
has arisen. No one today would consider buying a
raster printer of less than 600 dpi (@ 1 bit depth).
"Photo" quality is widely considered to be at least
200 dpi (24b).

It would probably surprise most people, if they ran
the numbers, to discover that the majority of available
displays are "only" 100 dpi. The industry takes no pains
to point this out, of course 🙂

>> But the limitations on monitor res are set by Microsoft.

Let me restate that. The limitations on what you can
sell into the retail market, in any economic quantity,
are set by MS. Computer artisans might well want to
have something closer to "photo" res on screen, but
if the GUI/dialogs are unusable ...

> Microsoft is also not the only game in town, and
> especially not in the markets
> where such displays are of interest.

Specialty markets certainly exist. I'm speaking of
displays that CDW might offer.

>> IBM has a 200 dpi LCD (the T221) ...(It also
>> requires dual-link DVI.)

> More precisely, QUAD-link DVI;

Zounds. IBM doesn't make the technical details on the T221
very prominent on their Web site, so I didn't see that. My
guess, of course, is that they don't want affluent
but otherwise ordinary end users buying these things
and then discovering the gotchas (starting with: who
makes a quad-link card? Matrox?)

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland wrote:

>> Bob Myers <nospamplease@address.invalid> wrote:
>
>>> Laser printing intro'd around 180 dpi. Inkjet printing
>>> intro'd at 75 dpi. Both grew quickly, and are now at
>>> 1200 dpi or higher.
>
>> OK, but I'm not sure how that's relevant here.
>
> Only in that the topic of higher display resolutions
> has arisen. No one today would consider buying a
> raster printer of less than 600 dpi (@ 1 bit depth).
> "Photo" quality is widely considered to be at least
> 200 dpi (24b).
>
> It would probably surprise most people, if they ran
> the numbers, to discover that the majority of available
> displays are "only" 100 dpi. The industry takes no pains
> to point this out, of course 🙂
>
>>> But the limitations on monitor res are set by Microsoft.
>
> Let me restate that. The limitations on what you can
> sell into the retail market, in any economic quantity,
> are set by MS. Computer artisans might well want to
> have something closer to "photo" res on screen, but
> if the GUI/dialogs are unusable ...

No, the limitation on what you can sell into the retail market, in any
economic quantity, is how low you can get the price. Right now, high res
displays cost more than lower res displays and larger displays with a given
res cost more than smaller ones, and this has a lot more to do with the
ability of the manufacturers of the panels to tune their production
processes than it does with Microsoft.

>> Microsoft is also not the only game in town, and
>> especially not in the markets
>> where such displays are of interest.
>
> Specialty markets certainly exist. I'm speaking of
> displays that CDW might offer.

Well, let's see. CDW offers Apple displays and Apples won't run any
Microsoft operating system.

>>> IBM has a 200 dpi LCD (the T221) ...(It also
>>> requires dual-link DVI.)
>
>> More precisely, QUAD-link DVI;
>
> Zounds. IBM doesn't make the technical details on the T221
> very prominent on their web, so I didn't see that. My
> guess, of course, is that they don't want affluent
> but otherwise ordinary end users buying these things
> and then discovering the gotchas (starting with: who
> makes a quad-link card? Matrox?)

There's a whole list of boards that can drive it on the IBM site--all
workstation boards IIRC.


--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> This year. Actually, several quarters ago. It was last
> year if only certain markets (U.S., Europe, Japan) are
> considered. This year, the switch occurred in the worldwide
> numbers.

What happens when individual countries and regions are examined?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

J. Clarke writes:

> "Signal" is whatever information you are trying to transfer. "Noise" is
> whatever is on the line that you did not intentionally put there. There is
> nothing "arbitrary" about it.

You've just illustrated the arbitrary character of the distinction.

> If there is "no noise obscuring it" then it is not dropping below the
> noise threshold.

It can drop below your predetermined noise threshold, as in digital
systems.

> What difference does it make?

That depends on the ratio between the two sizes.

> But you don't care about the facts, do you?

I enjoy facts. Do you have any?

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> Not at all; "signal" is the information we desire to
> communicate; "noise" is EVERYTHING else that gets
> in the way of doing that.

And our desire is not arbitrary?

> That's hardly arbitrary, given
> that the goal of any communications system is to transfer
> information.

What makes the distinction between information and noise, if not our
arbitrary decisions?

> Ummm...how does your signal drop below the noise
> threshold, and still have "no noise obscuring it"?

You define a specific threshold and call anything below it noise, and
anything above it signal. This is the basis of digital systems.

The advantage is that you have error-free transmission, as long as the
actual noise level stays below your predefined threshold. The
disadvantages are that you lose any real information below the
threshold, and that you sacrifice part of the channel's bandwidth in
order to eliminate errors.

Of course, for it all to work in a practical sense, you must define your
artificial threshold so that it is above the actual noise level in the
channel.
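
A minimal sketch of that thresholding idea (an assumed two-level
system with an arbitrary 0.5 decision threshold; nothing here is
specific to video):

    import random

    THRESHOLD = 0.5                    # decision level between "0" and "1"
    random.seed(1)

    bits = [random.randint(0, 1) for _ in range(10000)]
    noise_rms = 0.1                    # actual noise, well below the margin
    received = [b + random.gauss(0.0, noise_rms) for b in bits]
    decoded = [1 if s > THRESHOLD else 0 for s in received]

    errors = sum(b != d for b, d in zip(bits, decoded))
    print("bit errors:", errors)       # almost certainly 0 at this level

Everything finer than the two nominal levels is thrown away, which is
exactly the trade described above: no errors while the real noise
stays under the margin, at the cost of the sub-threshold detail.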

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> In the early days of Windows, Mr.Bill or one of his
> minions made an assumption that displays would always
> be in the 60-100 dpi range, and locked their icons and
> early screen raster font pixel sizes to that assumption.

These problems had been considered long before "Mr. Bill" came around.

The actual limitations are imposed by human vision, not by any
engineering difficulties. There's no point in building displays that do
substantially better than human vision can perceive (if they are
intended for human eyes).

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Myers writes:

> "24-bit accuracy" is not dependent on the data rate. It simply
> means that your system can accurately produce, communicate,
> and interpret levels, repeatedly and without error, to within
> half of the implied LSB value - in this case, whatever the peak
> expected signal would be, divided by (2^24-1). For instance,
> in a typical analog video system (with 0.7 Vp-p signal swings),
> "24-bit accuracy" would mean that you are confident you can
> determine the signal amplitude to within about 21 nV - and
> yes, that's NANOvolts. But this is simply not possible in any
> real-world video system, since the noise in any such system
> over the specified bandwidth is significantly higher than this
> value.

Then there's not much point in using it in any system with analog
components, which includes all systems that interface with the physical
world, which in turn includes all display systems.

> Sure; but that's just it - you can always build an EQUIVALENT
> digital system.

Digital systems can never match analog systems, not even in theory.

But ironically, in practice, it's often cheaper and easier to build very
precise digital systems than it is to build equally precise analog
systems, at least where chains of components that accumulate errors are
required.

> For instance, a standard U.S. TV channel is 6 MHz wide -
> and yet, under the U.S. digital broadcast standard, digital
> TV transmissions typically operate at an average data rate
> of slightly below 20 Mbps. How do you think that happens?

It just depends on the noise level. If the noise level is zero, the
capacity of the channel is infinite.

> Information theory proves exactly the opposite; it shows
> that the maximum capacity of a given channel is fixed, and
> that that limit is exactly the same for all possible general
> forms of coding.

It also shows that the lower the noise level, the higher the capacity of
the channel, all else being equal, which means that a noise-free channel
has infinite capacity.

Thus, if you improve systems in a way that lowers noise, you can get
more capacity out of them. This is how dial-up modems have been doing
it for years.
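
The modem example can be put in Shannon-Hartley terms directly (the
band width and SNR values below are rough, illustrative figures for a
voice channel):

    import math

    def capacity_bps(bandwidth_hz, snr_db):
        return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10.0))

    band = 3100.0                      # approximate voice-band width, Hz
    for snr_db in (25, 35, 45):
        print(f"SNR {snr_db} dB -> about "
              f"{capacity_bps(band, snr_db) / 1000:.0f} kbps")

    # Quieter lines raise the ceiling, but the ceiling stays finite for
    # any nonzero noise.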

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:nat3s01530vvorchau7ooc57r3f19egiv2@4ax.com...
> What happens when individual countries and regions are examined?

Obviously, there are still areas where the CRT leads
the LCD in unit volume; these are pretty much
without exception the developing markets where
the cost advantage of the CRT outweighs all other
considerations. Still, the LCD is coming on surprisingly
strong even in these areas, and especially (according
to recent industry news) in mainland China. China
very much wants to move into being a "high-tech"
locale, joining Japan, Korea, and Taiwan in supplying
LCD panels and such to the world.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:ndt3s0d2f8a1fiuo6voe92b2e02dt9v7od@4ax.com...
> J. Clarke writes:
>
> > "Signal" is whatever information you are trying to transfer. "Noise" is
> > whatever is on the line that you did not intentionally put there. There is
> > nothing "arbitrary" about it.
>
> You've just illustrated the arbitrary character of the distinction.

How is this arbitrary? It's fundamental to information theory -
signal is what you want, and noise is what you don't want.

> > If there is "no noise obscuring it" then it is not dropping below the
> > noise threshold.
>
> It can drop below your predetermined noise threshold, as in digital
> systems.

Well, assuming that the "digital" system in question could
not adapt to this, you would be correct. But that's not the
case in many digital systems, and certainly is not a restriction
which applies to "digital" per se.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:tht3s0dg90ef1l2shaoffvsnnk48o3v795@4ax.com...
> Bob Myers writes:
>
> > Not at all; "signal" is the information we desire to
> > communicate; "noise" is EVERYTHING else that gets
> > in the way of doing that.
>
> And our desire is not arbitrary?

Only if you believe that the distinction of
information from noise, which is fundamental
to information theory, is "arbitrary." But since
we ARE talking here about concepts which
are addressed in that field, you'd better be
willing to stick within the definitions used there,
whether or not you consider them "arbitrary."

Identifying a certain amount of electromotive
force to be a "volt" is also arguably arbitrary;
this does not at all mean that it is not useful to
phrase discussions of certain electrical phenomena
in terms of volts.

> What makes the distinction between information and noise, if not our
> arbitrary decisions?

Sorry, but we're discussing information theory
here. If you wish to discuss semantics, that's on
the other side of the campus. But don't expect
too many from THIS discussion to follow you
there.


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:not3s01jupdh0bhg3ve09l7nrovr8s6cgf@4ax.com...
> The actual limitations are imposed by human vision, not by any
> engineering difficulties. There's no point in building displays that do
> substantially better than human vision can perceive (if they are
> intended for human eyes).

While this is certainly true, it is of academic interest
only. There is no practical display technology which
challenges the limits of human spatial acuity in all
situations. There is most certainly nothing available
for desktop monitors which does this, given the fact
that it's not all that hard to put my eyes six inches from
the screen.
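
To put approximate numbers on that, assuming the usual rough figure
of one arcminute of resolvable detail for normal vision:

    import math

    def dpi_at_acuity(distance_in, arcmin=1.0):
        detail_in = distance_in * math.tan(math.radians(arcmin / 60.0))
        return 1.0 / detail_in

    for d in (6, 12, 18, 24):
        print(f"{d} in -> about {dpi_at_acuity(d):.0f} dpi")

    # Roughly 570 dpi at six inches, falling to about 140 dpi at arm's
    # length - which is why the viewing distance matters so much.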

Bob M.