I know there is a school of thought that believes in scanning at a slightly higher resolution than the real optical resolution of your scanner, but I don't see any real advantage in doing this, nor have any tests I have done shown there to be one. I am also not sure that I understand the interpolation issue as it relates to the scan.
Using the 4990 as an example: when you scan at the stated resolution of the scanner, 4800, much of what you are capturing is blank information. The scanner is designed to theoretically scan at 4800, but most of what the CCDs record at that setting adds no real detail. The real-world resolution of this scanner is roughly 2200, so if you scan at or near that (2400) you will capture all the available information (depending on your settings) and will keep your file size much more reasonable. Some even argue that scanning at the higher resolution will introduce noise into the image, but I have no proof of that.
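To put the file-size point in rough numbers, here is a small sketch. The frame dimensions (a 35mm negative) and the 8-bit RGB depth are my assumptions for the example, not 4990 specs; the point is only that doubling the dpi quadruples the uncompressed file size while adding no real detail past the scanner's true resolution.

```python
# Rough illustration: uncompressed size of a 35mm frame scanned as 8-bit RGB.
# Frame dimensions and bit depth are assumptions for the example, not scanner specs.

FRAME_W_IN = 36 / 25.4   # 35mm frame width in inches (~1.42")
FRAME_H_IN = 24 / 25.4   # 35mm frame height in inches (~0.94")
BYTES_PER_PIXEL = 3      # 8 bits per channel, RGB

def scan_size_mb(dpi: int) -> float:
    """Uncompressed size in megabytes of a scan at the given resolution."""
    pixels = (FRAME_W_IN * dpi) * (FRAME_H_IN * dpi)
    return pixels * BYTES_PER_PIXEL / 1_000_000

for dpi in (2400, 4800):
    print(f"{dpi} dpi: ~{scan_size_mb(dpi):.0f} MB")
# 2400 dpi: ~23 MB
# 4800 dpi: ~93 MB  -> four times the data, no extra real detail past ~2200 dpi
```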
The point here is that, when we are talking about consumer flatbed scanners, the hardware's real resolution and the manufacturer's claimed resolution differ by a factor of 2-4 depending on the scanner in question. Yes, scan at the hardware's optical resolution; that means its real resolution, not its theoretical resolution as claimed in the specs.
As discussed in many other threads, these issues do not exist with the high-end scanners, which actually deliver as promised; in part because there are printing-industry standards (mostly set by Seybold) to which the manufacturers of these machines test and write their specs.