Originally Posted by
Bernice Loui
Digital imaging, as a technology, necessarily involves filtering and varying degrees of numeric processing, which can and does alter the results. How the results are altered can be widely adjusted to deliver a specific perception of the resulting image, e.g. the illusion of extreme sharpness and resolution that is not really there in the originally acquired data.
Some time ago, during an LF get-together, one of the participants showed me a digital print that appeared really, really sharp, with extreme resolution. Within moments of looking at this digital print, I noted that it had been digitally sharpened. He was very surprised that this was visible in the print, which led to an explanation of how, and what visual clues in the print, pointed to the fact that it had been digitally sharpened. As that discussion went on, it turned out that the image was made at f/90, then digitally sharpened and worked over to produce a sharp, high-resolution, contrasty and snappy look.
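For readers curious what those "visual clues" usually are: digital sharpening is typically some variant of unsharp masking, which boosts local contrast at edges and produces overshoot "halos" on either side of them. The sketch below is a minimal, hypothetical 1-D illustration (not anyone's actual workflow) of how sharpening exaggerates an edge without adding any real detail:

```python
# Minimal 1-D unsharp-mask sketch (hypothetical values), showing how
# sharpening exaggerates edge contrast without adding real detail.

def box_blur(signal, radius=1):
    """Simple moving-average blur."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=1.0, radius=1):
    """sharpened = original + amount * (original - blurred)."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A soft edge: brightness ramps gradually from 0 to 1.
edge = [0.0, 0.0, 0.25, 0.5, 0.75, 1.0, 1.0]
sharp = unsharp_mask(edge, amount=1.5)

# The sharpened edge now overshoots below 0 and above 1 -- the
# "halos" that give away a digitally sharpened print.
print(min(sharp) < 0.0, max(sharp) > 1.0)  # True True
```

The overshoot makes the edge look crisper to the eye, but no information lost to diffraction (e.g. at f/90) is recovered; it is precisely this artifact that a trained eye can spot in a print.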
Point being, digital is a completely different technology from film; the difference in their results is baked into how each works to produce images. IMO, comparing the two is more academic than useful: they should be treated as different imaging technologies producing different results.
Bernice