These were both naive statements, even at the time: one misunderstands perceptual acuity, and the other mispredicts technology trends.
The professional video industry has defined standards for 10 bits per channel (and higher) since the mid-'90s. Back then, I remember hearing about how film telecines used 12-bit log, and one professional CGI package used 16-bit linear for film work. Nobody who ever said "32-bit was the best graphics can get" had studied color science or human visual perception.
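To make the perceptual point concrete, here's a rough back-of-the-envelope sketch (my own numbers, not anything pulled from a standard): it compares the luminance jump between adjacent code values at 8, 10, and 12 bits per channel against a ~1% contrast threshold, assuming a simple 2.2 gamma display. Both the threshold and the gamma are simplifying assumptions, but they illustrate why 8 bits per channel bands visibly while higher bit depths generally don't.

```python
# Rough sketch: luminance step between adjacent code values at a given
# gray level, for different bit depths. The 2.2 gamma and the ~1% contrast
# threshold are simplifying assumptions, not standard values.

GAMMA = 2.2        # assumed display transfer function
THRESHOLD = 0.01   # ~1% contrast step is roughly where banding becomes visible

def step_contrast(bits, luminance):
    """Relative luminance jump between the two codes nearest `luminance`."""
    levels = 2 ** bits
    code = round(luminance ** (1 / GAMMA) * (levels - 1))
    code = max(1, min(code, levels - 2))
    lo = (code / (levels - 1)) ** GAMMA
    hi = ((code + 1) / (levels - 1)) ** GAMMA
    return (hi - lo) / lo

for gray in (0.01, 0.05, 0.20):          # dark, shadow, and midtone gray levels
    for bits in (8, 10, 12):
        c = step_contrast(bits, gray)
        flag = "visible banding" if c > THRESHOLD else "below threshold"
        print(f"{bits:2d}-bit at {gray:.0%} gray: {c:.2%} step ({flag})")
```

In this toy model, the 8-bit steps exceed the threshold from the shadows well into the midtones, 10-bit only in the darkest values, and 12-bit nowhere, which is roughly why 10-bit video, 12-bit log, and 16-bit linear formats exist in the first place.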
More ignorant statements from people who had no firm basis for making them. And beyond that, it also matters what kind of display technology you're talking about.