A lot has been written about the true resolution of different scanners. Take, for example, a typical flatbed scanner whose true resolution at a 4800 dpi setting is only about 2000 dpi. But what happens when we resize the final scan?
To make this clearer, let me consider two different scanners scanning at the same spi - say 4000:
scanner A) The true resolution is very close to the spi of the scanner (a good quality drum scanner or other high-end model)
scanner B) The true resolution is about 1/2 of the scanner's spi (that is, 2000). In other words, only (approximately) 1/4 of the produced data contains true information - the rest is just ballast.
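Just to spell out the arithmetic behind that "1/4" (a quick sketch in Python, using the numbers from the example above): the resolution figures are linear, so the fraction of informative pixels is the square of the linear ratio.

```python
spi = 4000        # the scanner's sampling rate (samples per inch)
true_res = 2000   # its measured true resolution, per the example

# Resolution is linear, pixels are 2D: square the linear ratio.
fraction = (true_res / spi) ** 2
print(fraction)   # 0.25 -> only ~1/4 of the pixels carry real information
```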
Now - what happens to the information in such a scan (where the true resolution lags behind the spi) if we DOWNSIZE the file (for example in Photoshop) by a linear factor of 2 (a factor of 4 in the total number of pixels)? Scan (A) will lose 3/4 of the original information, but the resulting image will have the same "information density" as the original.
But what happens to scan (B)? Will the software miraculously manage to throw away the "useless" 3/4 of the data and keep the 1/4 that is full of information? In that case, after downsizing, scan (B) should be more or less identical to scan (A). Or does the resizing software "randomly" decide which data/pixels to keep, so that we end up with a scan that is no sharper than the original? Those would be the best-case and worst-case scenarios, respectively.
But what truly happens?
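One way to get a feel for it is a toy simulation (a sketch in Python/NumPy; the 2x2 box blur is just my crude stand-in for a scanner whose true resolution is half its spi, and "detail" is a crude neighbour-difference metric, not a real MTF measurement):

```python
import numpy as np

rng = np.random.default_rng(0)

# Scan A: stand-in for a scan whose true resolution matches its spi --
# pure per-pixel detail (random noise is enough for this experiment).
scan_a = rng.random((256, 256))

# Scan B: same pixel count, but true resolution ~1/2 the spi.
# Modelled here as a 2x2 box blur, which wipes out pixel-level detail.
scan_b = (scan_a
          + np.roll(scan_a, 1, axis=0)
          + np.roll(scan_a, 1, axis=1)
          + np.roll(np.roll(scan_a, 1, axis=0), 1, axis=1)) / 4.0

def downsize_2x(img):
    """Downsize by a linear factor of 2 using 2x2 block averaging."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def detail(img):
    """Crude 'information density' proxy: mean absolute neighbour difference."""
    return np.abs(np.diff(img, axis=1)).mean()

small_a = downsize_2x(scan_a)
small_b = downsize_2x(scan_b)

print(detail(scan_a), detail(scan_b))    # B starts much softer than A
print(detail(small_a), detail(small_b))  # after downsizing the gap narrows
```

In this toy model the downsized B does not magically catch up with the downsized A, but the relative gap between them shrinks - which is at least consistent with neither of the two extreme scenarios above being quite right.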
The above was triggered by a review of the Plustek 7600i scanner, where it was found that the true resolution @ 7200 spi is about 3250, but @ 3600 spi only 2600. While this is not quite the same issue as what I discussed above, it got me thinking..