Just because you can see a detail that you perceive to be a hair doesn't mean you are really resolving lines as thin as the hair. Try having someone hold a few strands of hair at the same distance and see if you can count them. I bet you can't.
What was the lighting like on the foreground and background? As an extreme example: was her hair back- or side-lit, with the background 50% or more darker than her hair? What color was the background; did it complement or contrast with the color of her hair?
My wife also says to stop staring at her, Neal. It has been over between the two of you for ten years, and knowing that you are staring at her from so close gives her the creeps. (Disclaimer: This post is intended to impart a sense of humor. Given
an internet forum's inability to carry inflections, tone and facial expressions it may
fail miserably in its intent. The sender acknowledges the limitations of
the technology and assigns to the software in which this message was
composed any ill feelings that may arise.)
Interesting thoughts, but it brings to mind three technological marvels: MP3, JPEG, and MPEG. How do these compression systems work? From my understanding (which I freely acknowledge to be very limited indeed), they factor out what the brain is least likely to notice missing. So, for an MP3, the algorithm knows (somehow) what the brain won't miss anyway, and strips those frequencies out; the higher the level of compression, the more comes out, until eventually you notice the degradation. Same with JPEGs, except (amazingly) this time with visual data, not simply aural. That this even works is an amazing testimony to the complexity of how we think; that people figured it out AND built algorithms that manipulate it is simply astounding to me.
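To make that "strip out what you won't miss" idea concrete, here is a minimal toy sketch of the principle (not the actual MP3 or JPEG algorithm, which use perceptual models and DCTs): transform a signal to the frequency domain, discard the weak components, and reconstruct. The signal values and the 10% threshold are my own illustrative assumptions.

```python
import numpy as np

# Toy signal: a strong low tone plus a faint high-frequency component
# (hypothetical values, chosen only to illustrate the idea).
t = np.linspace(0, 1, 1000, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t) + 0.01 * np.sin(2 * np.pi * 200 * t)

# "Compress" by keeping only the strongest frequency components --
# the discarded part is what a listener is least likely to miss.
spectrum = np.fft.rfft(signal)
threshold = 0.1 * np.abs(spectrum).max()
compressed = np.where(np.abs(spectrum) > threshold, spectrum, 0)
reconstructed = np.fft.irfft(compressed, n=len(signal))

kept = np.count_nonzero(compressed)
error = np.abs(signal - reconstructed).max()
print(f"kept {kept} of {len(spectrum)} frequency bins")
print(f"max reconstruction error: {error:.3f}")
```

Only one frequency bin survives, yet the reconstruction is off by at most about 1% — the faint component is gone, but you would be hard pressed to notice.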
That said, it strikes me that your average viewer-- especially if he/she is NOT given a side-by-side comparison with a traditional print-- finds 300 DPI perfectly acceptable, possibly in some relation to this phenomenon. That is, it's not merely about resolution, but also about how the brain functions and recognizes images. Add to that some interpolation from a decent printer, and you've got a lot of factors at play.
One more thought-- it may be unfair to compare the resolving ability of the eye for something live, like the hair of the person in front of you, to the resolving ability of the same eye when looking at a photo. Photos are simply different things to view-- not 3D, not active, not limited by the natural constraints of photography (even large format). Perhaps we adjust what is acceptable resolution on a print because psychologically (and subconsciously) we know that it's just a print. The same resolution (or color, or contrast) would be surreal in "real life" circumstances. We do this with audio and video, too, right? Has anyone else noticed how nearly all of CBS' primetime shows use high-contrast, low-saturation imaging? It looks nothing like real-life, but who watches TV to watch "real life" anyway?
There were two counteracting effects at work. Red hair is the most coarse (and therefore most visible), followed by brunette and blonde. This was probably offset by the smoking fires of Hades because you were looking at the redhead instead of following the sermon.
No, really. Several folks had the right idea. You can see stars, which are essentially dimensionless points (at least from space). Even from under our undulating atmosphere (she wasn't undulating, was she?), apparent stellar disk sizes are on the order of an arc second or two, well below the one-arc-minute resolution threshold. Our eyes no doubt then blur this one-second disk out to an arc minute or more.
Given sufficient contrast, it's perfectly possible to see things you can't resolve.
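A quick back-of-envelope check of the angles involved, assuming a typical hair width of roughly 70 micrometres (my assumption, not from the thread):

```python
import math

ARCMIN_PER_RAD = 60 * 180 / math.pi  # arc minutes per radian

def angular_size_arcmin(size_m: float, distance_m: float) -> float:
    """Angle subtended by an object, in arc minutes (small-angle formula)."""
    return (size_m / distance_m) * ARCMIN_PER_RAD

# A human hair is roughly 70 micrometres wide (assumed typical value).
hair = angular_size_arcmin(70e-6, 1.0)  # hair seen from 1 metre
star = 1.5 / 60                         # a ~1.5 arc-second stellar disk

print(f"hair at 1 m:  {hair:.2f} arcmin")
print(f"stellar disk: {star:.3f} arcmin")
```

Both fall well below the ~1 arcmin resolution threshold, yet both can be detected given enough contrast — which is exactly the point.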
Write again when your eyes are good enough to see her underwear!
On another note, the most current human optical research I've seen suggests that the resolution limit of the human eye for objects at 10 inches (the closest most people with good eyes can focus) is 11 line pairs/mm. This translates to about 550 dots per inch. This is the research used by engineers at Schneider.
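The conversion is straightforward arithmetic: one line pair is one light and one dark line, so it takes two dots per cycle, and there are 25.4 mm to the inch. A quick sketch:

```python
MM_PER_INCH = 25.4

def lp_per_mm_to_dpi(lp_mm: float) -> float:
    # One line pair needs two dots (one light, one dark) per cycle.
    return lp_mm * 2 * MM_PER_INCH

theoretical = lp_per_mm_to_dpi(11)  # limit for perfect eyes at 10 inches
typical = lp_per_mm_to_dpi(6)       # a more ordinary eye

print(f"11 lp/mm -> {theoretical:.0f} dpi")
print(f" 6 lp/mm -> {typical:.0f} dpi")
```

The 11 lp/mm figure works out to about 559 dpi, which rounds to the quoted 550; notably, the 6 lp/mm that most eyes actually manage works out to almost exactly 300 dpi.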
Keep in mind a few things: 1) most people's eyes aren't nearly this good. This number presumes theoretically perfect optics, and a resolution limit based on that minimum focusing distance and the density of rods and cones in the retina. Most eyes can resolve 6 or 7 lp/mm, and typically only at quite high contrast. 2) resolution itself is a very small component of print quality. It doesn't even have much to do with perceived sharpness. 3) the number of dots per inch (which we should call frequency, to avoid confusing it with resolution of actual image detail) influences things besides resolvable detail. For example, you won't be able to count more hairs on a 1200 dpi print than on a 600 dpi print, all else being equal. But slightly oblique lines, and broad radius curves, will be smoother on the higher resolution print. This has nothing to do with detail resolution, only with edge rendering.
300 dpi can certainly make a nice looking print. 600 dpi will be capable of showing a little more visible detail. 1200 will be able to make lines and curves (and type) almost perfectly smooth (although 600 dpi, with anti-aliasing, comes extremely close).
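The edge-rendering point can be checked with the same small-angle arithmetic: a jaggy on a slightly oblique line is one printer pixel high, so its visibility depends on whether one pixel subtends more or less than an arc minute. A sketch, assuming a 10-inch viewing distance:

```python
import math

ARCMIN_PER_RAD = 60 * 180 / math.pi  # arc minutes per radian

def step_arcmin(dpi: int, viewing_in: float = 10.0) -> float:
    """Angular size of a one-pixel stair-step on an oblique line,
    seen from viewing_in inches (small-angle approximation)."""
    return (1 / dpi) / viewing_in * ARCMIN_PER_RAD

for dpi in (300, 600, 1200):
    print(f"{dpi:>5} dpi: {step_arcmin(dpi):.2f} arcmin per jaggy")
```

At 300 dpi a jaggy subtends just over an arc minute (potentially visible); at 600 dpi it is about half that, and at 1200 dpi barely a quarter — consistent with lines and curves looking almost perfectly smooth at the higher frequencies.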