View Full Version : Capabilities of film (400 lp/mm)



Neal Shields
15-Feb-2005, 07:41
The latest issue of "Camera Lens News" (Zeiss) talks about Gigabit film.


http://www.zeiss.de/C12567A8003B58B9/ContentsWWWIntern/9B38941E0C36CF0DC1256F2C002B7DBB

They say that with their new lenses they can record 400 lp/mm on film.

An earlier issue of the same publication says that they use Velvia to test lenses for color film.

I find it interesting that with all the latest electronic magic they still have to use film to get enough resolution to test lenses.

Donald Qualls
15-Feb-2005, 08:12
They don't "have to" -- it's quite possible to design a digital sensor system to test resolution at and well beyond that kind of number, using aerial image magnification, moving masks and other trickery to make a sensor resolve at considerably below the physical sensing element size. However, it's a lot cheaper, and easier to store the results, if you use film.

Ralph Barker
15-Feb-2005, 10:48
Neal - are you suggesting that digital might not be better than film after all? (lol)

Interestingly, this gets back to the question previously discussed here - how much real detail (as opposed to interpolated pixels) is necessary in an image. To approach 400 lp/mm, a "full frame" sensor in a 35mm-based digital would need to be at least 9600x14400 pixels, depending on how one defines resolving a line pair (i.e. 2x the resolution, or 1x +1).
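
Just to put rough numbers on that, here is a quick back-of-the-envelope sketch in Python; the 24 x 36 mm frame size and the two counting conventions are my own assumptions for illustration, not anything from the Zeiss article:

# Rough pixel counts for recording 400 lp/mm across a 24 x 36 mm frame,
# under two sampling assumptions (illustrative arithmetic only).
FRAME_MM = (24, 36)      # nominal "full frame" 35mm format, in mm
LP_PER_MM = 400          # the figure quoted by Zeiss

for px_per_lp, label in [(1, "1 pixel per line pair (the '1x' reading)"),
                         (2, "2 pixels per line pair (Nyquist minimum)")]:
    h, w = (round(side * LP_PER_MM * px_per_lp) for side in FRAME_MM)
    print(f"{label}: {h} x {w} pixels, about {h * w / 1e6:.0f} MP")

# Prints roughly: 9600 x 14400 (~138 MP) and 19200 x 28800 (~553 MP)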

Neal Shields
15-Feb-2005, 12:10
Unfortunately, for those of us using films like Tmax and Velvia, it also takes away any excuses we might have for a picture being less than perfectly sharp.

The best answer (supported by testing) that I have seen to "how much is enough?" is in Ctein's book "Post Exposure".

He shows that, up to a detail level of 30 lp/mm, people can tell the difference between otherwise identical prints compared side by side (if the difference is great enough, say 20 vs. 30 lp/mm).

I would love to see some data on the point at which people can no longer tell real data from interpolated data. I suspect that it would be strongly subject dependent.

Most of the serious testing that I have seen indicates at least two if not three rows of pixels are required to resolve a line pair.

It isn't the "how many angels can dance on the head of a pin" argument that some might think. There has been some serious research on digital archiving, where they need to know exactly how many pixels they need to retain the information content of a document. They don't want to use any more memory than necessary, but they don't want to lose information either. (I can't find the links right now, but they are numerous.)

I am having a bit of trouble getting my mind around the "1x+1". I thought I had it and it slipped away. Won't I always have to have a row of black pixels and a row of white ones for a line pair?
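
A little sampling sketch might make the two-versus-three-pixels question concrete. This is just my own illustration in Python, with made-up numbers, not anything from the testing mentioned above: it averages a black/white bar target over a pixel grid and reports how much contrast survives.

import numpy as np

def sampled_contrast(pixels_per_line_pair, phase):
    """Average each pixel over a bar target and return the remaining contrast."""
    line_pair = 1.0                               # one black + one white bar
    pitch = line_pair / pixels_per_line_pair      # pixel width
    xs = np.linspace(0, 20 * line_pair, 20000)    # finely sampled target
    target = np.floor(xs / (line_pair / 2)) % 2   # 0 = black bar, 1 = white bar
    pixel_index = np.floor((xs + phase * pitch) / pitch)
    interior = np.unique(pixel_index)[1:-1]       # drop partly covered edge pixels
    values = [target[pixel_index == i].mean() for i in interior]
    return max(values) - min(values)              # 1.0 = full contrast, 0.0 = none

for ppl in (2, 3):
    for phase in (0.0, 0.5):
        print(f"{ppl} px per line pair, phase {phase}: "
              f"contrast {sampled_contrast(ppl, phase):.2f}")

When the pixel grid happens to line up with the bars, two pixels per line pair keep full contrast; at the worst alignment every pixel averages to gray and the contrast collapses, while three pixels per line pair hold up at any alignment. That is consistent with the testing that quotes two pixels as a bare minimum and three as the safer figure.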

QT Luong
15-Feb-2005, 15:19
Film continues to have higher resolution than top digital sensors such as the 1DsII; however, any scanning results in lower MTF, which correlates well with the perception of sharpness.

Henry Ambrose
15-Feb-2005, 16:30
QT,

"any scanning"

Do you mean poor or average scanning, or did you mean literally "any scanning"?

QT Luong
15-Feb-2005, 22:58
To quantify my previous statement, measured with Imatest, film scanned on a current 4000dpi desktop scanner gets less than 30 cycles/mm at MTF 50, while the 1DsII gets close to 50 cycles/mm (50/1.4 @f8). In general, there is a difference of only about 25% between a 4000dpi deskstop scan and a top drum scan.