I've worked here in Silicon Valley as a test engineer for three decades, mainly on computers and networking equipment. However, I've never worked directly on storage products such as hard drives, so my input below is from a modest technical perspective.
One thing missing from the equation most of us doing backups are familiar with: there is more to guaranteeing long-term data reliability than simply backing data up and then placing that media into long-term storage. Each form of storage media goes through a period of quality evolution in which data integrity becomes increasingly reliable, until newer technologies appear and the old technology becomes neglected and static. Consider what has happened to magnetic tape backups. Unless one transfers information from old tape backups to newer storage media, it becomes increasingly difficult even to run appropriate applications on current computers and operating systems to read that data back. Not only does the hardware go out of production, but so do product support and data standards. Often the whole companies that had anything to do with the technology have come and gone. Consumer products are just horrible in this way, while commercial and corporate product support tends to last longer, as long as someone can make money from it.
The microelectronic devices being created today are mind-bogglingly complex and tiny. Internal device structures required to maintain operational functionality are sometimes just a few molecules wide. Manufacturers engineer such devices to work for the expected lifetime of those products plus a few more years. However, that does not guarantee that some transistor material structure inside one of these devices will still maintain its physical integrity years beyond that lifetime. There are just a lot of chemical- and atomic-level things that molecular matter does over long periods that one cannot be sure about unless someone took the time to actually do the science and engineering to find out. So our electronic devices, our storage media, all have large unknowns into the distant future. Scientists may have some understanding of where some of these things lose reliability, but not much of that filters down to the rest of us. Fortunately, most of us probably don't care what happens even 20 years hence with current media. Heck, there might be hand-sized cameras by then with a couple orders of magnitude better resolution and quality, making anything our generation is creating of small value. Still, don't go burying your CDs in the ground intending to dig them up 50 years from now and read the data off them.
Another issue is what one is doing with the storage media to guarantee the data is still okay. I backed up many CDs from the mid-'90s that generally seemed fine. The FAT tables remained intact, as did most files. But then I'd come across a few image files that, when opened in Photoshop, showed the usual colorful thin lines of corrupted data segments. If that happens in the FAT table or other key structures, the whole medium may be hosed. So really, what one ought to be doing is periodically running comparison software against two sets of identical media backups in order to catch errors when they crop up. That of course requires two complete storage systems running on the same computer, whether that is two CD drives, two DVD drives, or two backup hard drives.
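The comparison idea above can be sketched with a short script. This is a minimal illustration, not any particular commercial tool: it walks two mounted copies of the same backup (the paths and function names here are my own invention) and hashes every file on each, reporting files that differ or are missing from the second copy.

```python
import hashlib
from pathlib import Path

def hash_file(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def compare_trees(primary, secondary):
    """Compare every file under two mounted backup copies.

    Returns two lists of relative paths: files whose contents differ
    between the copies, and files missing from the secondary copy.
    """
    primary, secondary = Path(primary), Path(secondary)
    differing, missing = [], []
    for file_a in primary.rglob("*"):
        if not file_a.is_file():
            continue
        rel = file_a.relative_to(primary)
        file_b = secondary / rel
        if not file_b.is_file():
            missing.append(rel)
        elif hash_file(file_a) != hash_file(file_b):
            differing.append(rel)
    return differing, missing
```

Run against, say, the two mounted CD drives, any path that shows up in either list is a cue to re-burn that medium from the good copy while one still exists.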
And then, as new storage technology comes out, transferring all the old backups onto the new. And did you bother to somehow record what is on each backed-up medium? Think you'll actually remember even ten years later? Or will you have to dig through each one inefficiently in order to rediscover what it contains? No doubt much might be tossed. If one actually knows the contents without accessing the media, dealing with a pile of stored media becomes far more efficient. Recently I thought about this, so before my long-term backups get out of hand, I'm going to start making file listings of every piece of backup media I use, saved separately. ...David
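A follow-up sketch of the file-listing habit described above. Again this is just one way to do it (the CSV layout, label scheme, and function name are assumptions of mine): it records each file's path, size, and modification time for one piece of media into a manifest that gets stored separately, so the pile can be searched without ever mounting a disc.

```python
import csv
import datetime
from pathlib import Path

def write_manifest(media_root, manifest_path, label):
    """Write a CSV listing of every file on one piece of backup media.

    The manifest records a human-readable media label, the date it was
    made, and one row per file (relative path, size, modified time).
    Store the manifest somewhere other than the media it describes.
    """
    media_root = Path(media_root)
    with open(manifest_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["label", label])
        writer.writerow(["created", datetime.date.today().isoformat()])
        writer.writerow(["path", "bytes", "modified"])
        for f in sorted(media_root.rglob("*")):
            if f.is_file():
                st = f.stat()
                writer.writerow([
                    f.relative_to(media_root),
                    st.st_size,
                    datetime.datetime.fromtimestamp(st.st_mtime)
                        .isoformat(timespec="seconds"),
                ])
```

With a folder of these manifests, a plain text search ("grep vacation-1997") tells you which labeled disc to pull off the shelf, which is exactly the ten-years-later problem.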