By Willie Jones, UM Records Analyst
The following is a guest post by Chelcie Rowell
Frequency of occurrence? Rare. Impact when it occurs? Huge. I’m talking about digital disasters.
Stewards of digital content, like stewards of analog and paper content, must plan for catastrophe in advance in order to minimize loss and recover quickly. True, digital disasters may occur infrequently. But at the scale at which institutions collect digital content, and over the length of time they wish to preserve it, the risk of a disaster is non-trivial.
Disasters may be natural (such as tornadoes and earthquakes) or failures of infrastructure (such as power failures). Disasters may result from intentional human action (such as cyber-terrorism) or simply human error (such as accidental deletion).
A digital disaster negatively impacts an institution’s digital content. What distinguishes many catastrophes that threaten digital content from those that threaten analog and paper content is that digital disasters can be far less visible. Bit rot, for example, may be a one-in-a-million occurrence, but because it is silent, special tools are needed to seek it out before it becomes a digital disaster.
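Those “special tools” are, in practice, fixity checks: recording a cryptographic checksum for each file when it is ingested, then periodically recomputing and comparing it to catch silent corruption. Here is a minimal sketch in Python; the function names and the choice of SHA-256 are illustrative assumptions, not any particular institution’s practice:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks
    so large files don't have to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_fixity(path: Path, recorded_digest: str) -> bool:
    """Return True if the file still matches the checksum
    recorded at ingest; False signals possible bit rot."""
    return sha256_of(path) == recorded_digest
```

Run on a schedule against a stored manifest of digests, a check like this surfaces single-bit corruption that would otherwise go unnoticed until the file is actually needed.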
The following is an experience related by Pixar:
Pixar faced a digital disaster of catastrophic impact involving the film Toy Story 2. As described by one of Pixar’s technical directors, an accidental deletion wiped the working files before the film was finished. What audiences experience as an animated film is actually a complex digital object made up of thousands upon thousands of smaller files. Combined, these files are rendered into frames—including animation, set, and lighting data—that sequentially make up the moving image.
As the accidental deletion unfolded, pieces of that complex digital object were removed from disk, seemingly before the makers’ very eyes. As Oren Jacob, the film’s assistant technical director, put it, “Woody’s hat disappeared. And then his boots disappeared. And then as we kept checking, he disappeared entirely. Woody’s gone.”
Fortunately, the studio attempted to quickly restore the film from back-ups. But the back-ups were revealed to be corrupt, and the only recourse was to inventory the different versions of the back-up and perform human-intensive quality reviews to stitch together enough valid data to render a relatively complete film. Jacob recalled, “In the end, human eyes scanned, read, understood, looked for weirdness, and made a decision on something like 30,000 files that weekend.”
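Pixar’s review was necessarily human-intensive, but part of such a recovery can be automated today: when several back-up copies of the same file survive, their checksums can be compared and any disagreeing copy flagged for human review. A rough sketch, assuming simple majority voting across copies (an illustration of the idea, not Pixar’s actual process; the function names are hypothetical):

```python
import hashlib
from collections import Counter
from pathlib import Path

def sha256_of(path: Path) -> str:
    """SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_suspect_copies(copies: list[Path]) -> list[Path]:
    """Given paths to several back-up copies of the same file,
    majority-vote on the checksum and return the copies that
    disagree -- candidates for human quality review."""
    digests = {p: sha256_of(p) for p in copies}
    majority_digest, _ = Counter(digests.values()).most_common(1)[0]
    return [p for p, d in digests.items() if d != majority_digest]
```

A triage pass like this narrows the pile of files that human eyes must actually scan, which is exactly the bottleneck Jacob describes.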
July 19th, 2012, by Bill LeFurgy, Digital Initiatives Manager, Library of Congress
Episodes like this raise the issue of risk tolerance. When an institution manages unique digital materials, it needs to consider seriously what steps must be taken to prevent and mitigate disasters.
This entry was posted on Tuesday, November 6th, 2012 at 11:29 am and is filed under 2012 - 4th Quarter.