One is to consider how many film grains it takes to record the smallest unit of actual image information. For film to work at all, it must take a lot of grains to describe each "unit" of image information; if it didn't, photography would look like a Seurat painting. So there is a many-to-one relationship between film grains and image detail, yes?
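To make that many-to-one idea concrete, here is a minimal sketch (not a physical film model, and the grain counts are arbitrary illustration parameters): each unit of detail is a patch of binary grains, a grain develops with probability equal to the local tone, and the perceived tone is the average over the patch.

```python
import numpy as np

rng = np.random.default_rng(0)

def perceived_tone(true_tone: float, grains_per_unit: int) -> float:
    """Average many developed/undeveloped grains into one tone value."""
    grains = rng.random(grains_per_unit) < true_tone  # True = developed grain
    return grains.mean()

for n in (4, 64, 4096):
    samples = [perceived_tone(0.5, n) for _ in range(10_000)]
    print(f"N={n:5d} grains/unit: tone std dev = {np.std(samples):.3f}")
```

The tone noise falls roughly as 1/sqrt(N): with only a few grains per unit the result is coarse and pointillist, and only with many grains per unit does the patch converge on the true tone.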
This in turn implies that there is some resolution at which said film can be scanned that will capture all the image detail but little, if any, grain detail. Some middle ground, as it were. In fact, many scanner operators seem to believe that scanning above roughly 2400-3200 ppi becomes meaningless because it captures no more image detail; all it does is capture more grain detail (a.k.a. noise). There seem to be some elements of truth to this "school" of scanning.
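A quick back-of-envelope check of what those resolutions mean physically. The grain-size figures here are illustrative ballpark values for silver-halide grains and grain clumps, not measurements of any particular film stock:

```python
# Pixel pitch at common scanning resolutions.
MICRONS_PER_INCH = 25_400

for ppi in (2400, 3200, 4800):
    pixel_pitch_um = MICRONS_PER_INCH / ppi
    print(f"{ppi} ppi -> pixel pitch ≈ {pixel_pitch_um:.1f} µm")

# 2400 ppi -> ≈10.6 µm, 3200 ppi -> ≈7.9 µm, 4800 ppi -> ≈5.3 µm.
# Individual grains and clumps span very roughly 0.5-25 µm (assumed figures),
# so a pixel at these pitches already sits near the boundary where finer
# scanning resolves grain structure rather than additional image detail.
```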
I say "some elements of truth" because, if nothing else, this seems to offer an explanation of why scanning works at all, and why it works so bloody well: why the scanned copy (second generation) can be such an excellent representation of the film (first generation) that it is, for all intents and purposes, an exact copy. What you lose when you scan is some film grain data, but you lose very little of the actual image information (note the distinction between the concepts of data and information). And this is why it doesn't matter that a scanner can't image the film grain itself.
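The data/information distinction can be shown with a toy example: a smooth "image" signal plus fine grain noise, then a box-average "scan" at a coarser pitch. All the sizes and amplitudes below are arbitrary illustration parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
x = np.linspace(0, 1, n)
image = 0.5 + 0.3 * np.sin(2 * np.pi * 3 * x)   # slow-varying image detail
grain = 0.2 * rng.standard_normal(n)            # fine-scale grain "data"
film = image + grain

scan = film.reshape(-1, 16).mean(axis=1)        # 16-sample box average = "scan"
truth = image.reshape(-1, 16).mean(axis=1)

print(f"mean |film - image| = {np.abs(film - image).mean():.3f}")  # grain dominates
print(f"mean |scan - image| = {np.abs(scan - truth).mean():.3f}")  # grain averaged out
```

The scan throws away most of the grain data, yet it tracks the underlying image information closely, which is the sense in which the second-generation copy can stand in for the first.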