Most scanning software and most picture editing software - all that I have seen, in fact - uses a simple histogram to give some hint of how many samples sit at each grey-scale level. I don't find this particularly useful: a picture often has a larger range than is immediately visible, with relatively few samples at either extreme. That can lead to adjustments which throw away information at the extremes, because the operator is relying on the accuracy of the display system to show the image.
Outside a professional image editing suite, with calibrated displays and controlled room lighting, I suspect that accuracy is rarely to be had. Viewing angle on many LCD screens is a particular concern: a movement of only a few degrees vertically changes the apparent contrast and brightness of the screen (IPS panels are much better in this respect).
I spent over thirty years in professional broadcast engineering (at the BBC), and there is a tool used regularly there to judge exposure accurately (even in a regularly calibrated environment): a waveform monitor. In such a display, the signal is scanned across its width and the vertical position of the trace shows the luminance at that point; where large areas share the same brightness, the trace itself grows brighter.
I've been playing with a proof-of-concept for such a display using Python - very much an unfinished work in progress. Here are some examples (each picture being worth a thousand words, this saves eight thousand words of explanation):
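If anyone wants the flavour of it before I post the code, here's a minimal sketch of the core idea - not the actual proof-of-concept (which lives inside a GTK+ UI); this standalone version assumes numpy and Pillow, and the filenames and gain factor are just placeholders for illustration. Each image column gets its own luminance histogram, and the count at each level sets the brightness of the trace at that height:

```python
import numpy as np
from PIL import Image

def waveform(path, gain=8):
    # Load as 8-bit greyscale luminance.
    img = np.asarray(Image.open(path).convert("L"))
    h, w = img.shape
    wf = np.zeros((256, w))
    # One luminance histogram per column: the count at each level
    # becomes the brightness of the trace at that height.
    for x in range(w):
        wf[:, x] = np.bincount(img[:, x], minlength=256)
    # Normalise so a column entirely at one level would saturate,
    # with gain so sparse samples at the extremes remain visible.
    wf = np.clip(wf * gain * 255.0 / h, 0, 255)
    # Flip so black sits at the bottom and white at the top.
    return Image.fromarray(wf[::-1].astype(np.uint8))

# Placeholder filenames.
waveform("scan.png").save("waveform.png")
```

A brighter trace means more pixels at that level, and the thin faint wisps reaching the very top or bottom are exactly the sparse extremes that a plain histogram makes so easy to miss.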
Any interest in this kind of display? It's currently living on a Linux Mint system, running Python 3.4 with the GTK+ toolkit, with the UI built in Glade. I don't think there would be any difficulty porting it to other OSes, but that's someone else's problem; I don't have access to them. If anyone else wants to play, you're welcome; I'll post the code.
Neil