It appears that the terms "dots per inch" and "sensitivity", as
applied to mice, mean exactly the opposite of what they might
be expected to mean. The more dots per inch, or the greater the
sensitivity, the coarser the mouse pointer motion, and the more
difficult it is to control. Move the mouse a little bit and the
pointer jumps wildly across the screen.
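To make the arithmetic behind that jump concrete, here is a minimal sketch. It assumes the simplest possible mapping, one sensor count per screen pixel with no acceleration or scaling, which is an assumption for illustration, not how any particular OS is documented to behave:

```python
# Pointer travel for a given hand motion, assuming the OS maps each
# sensor count to exactly one screen pixel (no acceleration/scaling).
def pixels_moved(inches: float, dpi: int) -> float:
    """Counts reported by the sensor = pixels the pointer moves."""
    return inches * dpi

# The same tiny 1/100-inch hand twitch at two dpi settings:
low = pixels_moved(0.01, 800)    # about 8 pixels: controllable
high = pixels_moved(0.01, 8000)  # about 80 pixels: the pointer "jumps"
```

Under that assumption, a tenfold increase in dpi turns an 8-pixel nudge into an 80-pixel leap, which is exactly the coarseness described above.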

Mice have been built with higher and higher dpi over the years. When they were first invented, it would arguably have been easier to make 8000 dpi mice than 800 dpi mice, but since screens had fewer pixels, a lower dpi was all that was needed. So higher-dpi mice were introduced for faster pointer speeds on larger screens, not because the technology improved.

I want greater precision, not speed, so I need lower dpi.
That just sounds so wrong.

This is the opposite of how a scanner or printer works. Higher dpi in a scanner or printer means greater detail and finer control of the image; higher dpi in a mouse means less detail and coarser control. Yet a mouse works essentially the same way as a scanner: it sees the surface under it as a moving pattern of dots, and the more dots it can see per inch, the finer and more precise its tracking should be.
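The distinction that makes this puzzle dissolve, as far as I can tell, is that the sensor's resolution and the pointer's on-screen travel are linked only through a software multiplier. This is my own mental model, not something from a datasheet; the names below (`cpi`, `sensitivity`) are illustrative:

```python
# Sensor resolution (counts per inch) vs. pointer travel: the two are
# coupled only by a software sensitivity multiplier.
def pointer_pixels(inches: float, cpi: int, sensitivity: float) -> float:
    counts = inches * cpi        # what the sensor actually "sees"
    return counts * sensitivity  # what the pointer actually does

# Identical pointer speed, very different precision:
coarse = pointer_pixels(0.01, 800, 1.0)   # 8 px derived from 8 counts
fine = pointer_pixels(0.01, 8000, 0.1)    # 8 px derived from 80 counts
```

If this model is right, a high-cpi sensor with a scaled-down multiplier would give the finer control I want; the trouble is that marketing quotes only the product of the two.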

Is there a specification for mice that identifies the dpi the sensor can actually see, rather than the dpi the pointer moves?