Kevin, I assume you meant 12 bits per channel, not 12 bits per colour.
Btw, SGI was doing 48-bit RGBA more than 20 years ago. It's nothing new.
If anything, the switch to flat panels has meant the fidelity of perceived
images has gone down since then; back then, CRTs could show the 4096 different
shades of each colour just fine with 36-bit RGB. 12 bits for alpha has always
been important for accurate low-light/contrast visual simulation and precision
imaging/video, while 16-bit to 64-bit greyscale has been used in medical imaging,
GIS and defense imaging analysis for ages as well; e.g. the Group Station for
Defense Imaging was dealing with 60GB 2D images probably a decade before
many of the readers of this forum were even born. 😀
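For anyone who wants to sanity-check those figures, here's a quick
back-of-the-envelope sketch (plain Python, nothing vendor-specific; the
numbers fall straight out of the bits-per-channel arithmetic):

    def channel_stats(bits_per_channel, channels):
        """Return (shades per channel, total bits per pixel)."""
        return 2 ** bits_per_channel, bits_per_channel * channels

    print(channel_stats(12, 3))  # 12-bit RGB:  (4096, 36) -> 36-bit colour
    print(channel_stats(12, 4))  # 12-bit RGBA: (4096, 48) -> 48-bit colour
    print(channel_stats(16, 1))  # 16-bit grey: (65536, 16)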
Modern display tech is merely catching up to what high-end gfx was doing
in the last century with CRTs. *yawn*
Ian.