To support 10-bit color the following are needed:
- A monitor supporting it.
- A GPU supporting it (apparently only professional cards such as AMD FirePro and NVIDIA Quadro do?).
- Compatible software. Unless I am mistaken, very few programs support 10-bit color; Photoshop is a notable example.
The questions are about how 10-bit monitors perform in comparison with 8-bit monitors:
- In which situations would a 10-bit monitor give a noticeable advantage over an 8-bit monitor (say, for professional photography)?
- Have 10-bit monitors been compared against 8-bit monitors based on subjective or objective tests? What were the results?
- Human eyes can reportedly distinguish only about 10 million colors, so would a monitor capable of roughly 1 billion colors make a difference?
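For reference, the "10 million vs. 1 billion" figures follow directly from the bit depths; a quick sketch of the arithmetic (assuming three color channels, R, G, and B):

```python
# Number of distinct colors for a given bit depth per channel,
# assuming the usual three channels (R, G, B).
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(color_count(8))   # 16,777,216   -> ~16.7 million colors
print(color_count(10))  # 1,073,741,824 -> ~1.07 billion colors
```

So an 8-bit panel already exceeds the oft-quoted 10 million distinguishable colors, which is part of what the question is probing.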
Best Answer
I think the biggest factor in this is not the high fidelity output, but the possibility to more accurately match a given target color.
Especially when working in print, you want what you see on screen to match the printed result to a tee. That is much harder if you have only a small number of colors to choose from; with a billion colors, it is much easier to produce a match.
HP also brings up what they call "banding": when a smooth gradient is displayed, adjacent colors that should blend seamlessly become distinguishable from each other as discrete bands.
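Banding is essentially a quantization effect, and a simplified model makes the difference in step count concrete (a hypothetical sketch; `distinct_levels` is not from any real graphics API):

```python
# Quantize a smooth 0.0-1.0 gradient to a given bit depth and count
# the distinct output levels. Fewer levels means coarser steps,
# which show up as visible bands in wide, gentle gradients.
def quantize(value: float, bits: int) -> int:
    levels = 2 ** bits
    return round(value * (levels - 1))

def distinct_levels(samples: int, bits: int) -> int:
    return len({quantize(i / (samples - 1), bits) for i in range(samples)})

print(distinct_levels(4096, 8))   # 256 levels per channel
print(distinct_levels(4096, 10))  # 1024 levels -> 4x finer steps
```

With four times as many levels per channel, each step in a 10-bit gradient is a quarter the size, which is why banding is far less likely to be visible.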
Additional Information
Photoshop can manipulate and display images that use more than 8 bits per color channel. That does not imply direct support for 10-bit-per-channel displays.
That was at least the case in 2010.