Let me say up-front that this is all speculation-- albeit educated speculation-- and not based on any solid facts, other than history and experience. I am not a prophet (nor the son of a prophet...), but I do think that my prediction is reasonable.
Let me begin with the computer industry in general. There is a general principle called "Moore's Law" which is widely considered accurate (up to a point) and should hold for the foreseeable future. Strictly speaking, it says that the number of transistors that can be put on a chip doubles roughly every two years; in popular use it has come to mean that processor speed grows at that same exponential rate. Anyone who has been watching chip-speed advancement knows that this has held ever since Moore postulated it in the mid-60s. I've been simply astonished at the chips available today, and the raw power they represent; not only is it possible to get a 3.4 GHz Intel processor today, but such chips are becoming common offerings in mid-to-high-end desktop machines.
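To make the doubling arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python (the two-year doubling period and the 3.4 GHz starting figure are simply the numbers from the paragraph above, treated naively; clock speed is only a rough stand-in for what Moore actually measured):

    # Naive Moore's Law projection: a quantity doubles every `period_years` years.
    def moores_law_projection(start_value, years_out, period_years=2.0):
        return start_value * 2 ** (years_out / period_years)

    # Starting from a 3.4 GHz chip today, project a few years forward.
    for years in (0, 2, 4, 6):
        print(f"{years} years out: ~{moores_law_projection(3.4, years):.1f} GHz")
    # Prints roughly 3.4, 6.8, 13.6, and 27.2 GHz.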
That said, I don't know anyone who owns one-- in fact, I don't even know anyone who wants to own one. There are basically two realms of computing that use such lightning-fast chips: gaming and video editing (and, ironically, the video editors almost exclusively use Apple Macintosh machines, to Intel's chagrin). No one else-- not even hardcore professional photo manipulators-- requires this sort of technology, nor will they for another year or two (at which point, according to Moore's Law, we will have chips twice as fast...). There is simply more speed and power in those high-speed Pentium 4s than the average computing world could ever use.
As a result, the computer industry is no longer supplying according to demand when it comes to the fastest chips. Instead, they are progressing (because they must, to stay competitive with one another) for the sake of progress. It is useful to know that, when Windows 2010 or Mac OS C requires it, I will have the chip speed needed to run my system at a comfortable pace; however, there is no real stake in the highest-speed chips for guys like me (and most, if not all, of you).
What does this have to do with photography? I think everything. The most clear and present threat to film photography is the digital photography industry. "Listening" to some of the discussions here, one might think otherwise: one might infer that the enemies of traditional film photography (and especially its less mainstream avenues, such as Large Format) are the manufacturers, who, with malice aforethought, discontinue our most precious products to spite the buyer and control what sorts of photography we're even able to do. In reality, they are simply following their business charters, which say that they must turn a profit.
As digital photography has progressed, it has become (for many) a truly viable alternative to film photography. In fact, at this point, the main things that keep the average snapshooter from converting entirely are convenience and the high cost of ink cartridges. For more serious shooters, convenience matters less than image permanence, but only a few obstacles remain to making the jump. Granted, this is obviously less the case for a Medium Format photographer, as digital equipment gets very costly very quickly, and almost every Large Format photographer is stuck with a hybrid-digital option at best, since few can afford the equipment it takes to go 100% digital in LF. However, the threat remains, as the photo industry takes its cues, on average, from the "pro-sumer" who typically shoots a high-end 35mm SLR. And the profits that the industry must produce are found, right now, in the film-to-digital conversion market; witness the Nikon D100, etc.
While the threat to film photography is real, I don't think it will prevail over film in the long run. Moore's Law works on digital camera sensors, too, and before very long, the same phenomenon we now see with the processor in the average desktop computer will occur in the digital photography world: namely, there will be more pixels, more megabytes (or gigabytes), and more resolution than a photographer could ever need, no matter what Epson develops. It is bound to happen, since we are already at the point where most point-and-shoot (or as we used to call them at the camera store, PHD-- "push here, dummy") digitals can print up to 8x10 with little or no visible difference from a 35mm negative enlargement.
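As a rough sanity check on that print-size claim, here is a quick pixel-count calculation in Python (300 dots per inch is just a common rule of thumb for photo-quality output, not a hard standard, and the function name is mine):

    # How many megapixels does a print of a given size need at a given resolution?
    def megapixels_needed(width_in, height_in, dpi=300):
        return (width_in * dpi) * (height_in * dpi) / 1_000_000

    print(f"8x10 at 300 dpi: ~{megapixels_needed(8, 10):.1f} MP")           # ~7.2 MP
    print(f"8x10 at 240 dpi: ~{megapixels_needed(8, 10, dpi=240):.1f} MP")  # ~4.6 MP

On that arithmetic, once consumer digitals comfortably clear roughly the 5-7 megapixel mark, the 8x10 print stops being a limitation at all.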
Why, then, is there hope for film photography? Because when this happens-- when every digital camera can print flawless poster-sized prints-- people will stop choosing cameras by which one has the fewest limitations, and will start buying cameras for the sake of photography again. And, just as folks who shoot 35mm film often get around to trying (and liking) larger formats, folks who shoot digital will often get around to trying film photography. Thus, the digital photography "movement" we are experiencing now, rather than posing a true long-term threat to film photography, will actually be a benefit, a refinement of sorts, to good photography.
As a result, the implication I see for the industry is this: those companies that can and will hold onto their "traditional" lines of equipment and supplies will, in 5-7 years' time, become quite successful, even profitable, in those areas. Those companies that can't, or won't, will regret it, but the loss of their half-hearted commitment to the future of photography will not hurt us in the end.