by Mike Casey, Director of Technical Operations, Audio/Video, Media Digitization and Preservation Initiative, Indiana University
Quality Control as Risk Management
In this post, I’ll continue exploring the quality control system developed for the audio/video side of the MDPI project. My earlier post on types of quality control ended with a discussion of risk-based QC. This is a type of quality control that involves identifying points of heightened risk among formats and digitization workflows so that more resources can be directed to them. In fact, the entire QC function of a media digitization operation may be conceptualized as an exercise in risk management. Quality control procedures must provide a reasonable level of confidence that the products of digitization meet expectations. However, all human beings make mistakes, and a human-run digitization operation therefore carries a risk of error. A useful definition of risk for our purposes is the chance or probability of a loss.
Risk in Media Digitization
The primary high-level risk within a media digitization operation is that it will produce digital files that do not meet an organization’s specification for preservation, and that these ‘bad’ files will be stored into the future as trusted preservation surrogates. More specifically, there is a risk that the digitization process itself will introduce errors into the files (loss of accuracy). There is also a risk that the work won’t be done optimally—that we will have ‘left something on the table’ that results in, generally speaking, audio that could have sounded better or video that could have looked better (loss of fidelity and accuracy). Further, there is a risk that errors will be introduced into the metadata that accompanies a recording (loss of context and interpretability). Finally, there is a risk that the operation won’t digitally preserve an item at all because of a mistake in keeping track of recordings or files (loss of preservation).
The impact of these losses on future uses of the target content may be subtle or profound. They can result in researchers using representations of content that are inaccurate or of lower quality than is possible, reaching false conclusions based on misleading or absent metadata, or not discovering content at all because it was not preserved.
These risks can be managed by the QC system in tandem with a robust quality assurance (QA) program. Risk management may be defined as “the identification, assessment, and prioritization of risks followed by coordinated and economical application of resources to minimize, monitor, and control the probability and/or impact of unfortunate events.” Classic risk management thinking identifies four basic areas in which an operation may respond to or treat risks:
- Transfer the risk
- Avoid the risk
- Reduce or mitigate the risk
- Accept the risk
Transferring the risk involves moving responsibility to another entity so that the impact to the organization carrying the risk is minimized or removed. A typical example is the use of insurance. The insurance company assumes specifically defined financial risks from a policy holder who pays a premium for this service. However, risk transfer may also be employed through a contract that contains an indemnity clause in which one party agrees to be financially responsible for specified liabilities that may be incurred by another party.
Within the audio/video part of the MDPI project, some risk is transferred to the digitization vendor through provisions of its contract with IU. The contract stipulates that the vendor is responsible for fixing all problems and re-digitizing as necessary at no cost to MDPI as long as those problems are reported within a fixed time period after digitization (40 days for audio and 30 days for video). This provision protects IU from mistakes made by the vendor as long as they are detected promptly.
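A reporting window like this is simple enough to encode directly. Below is a minimal sketch of such a deadline check; the function names and date handling are my own illustration, not MDPI's actual tracking system:

```python
from datetime import date, timedelta

# Reporting windows from the MDPI/vendor contract described above
REPORTING_WINDOW_DAYS = {"audio": 40, "video": 30}

def report_deadline(media_type, digitization_date):
    """Last date on which a problem can be reported at no cost to MDPI."""
    return digitization_date + timedelta(days=REPORTING_WINDOW_DAYS[media_type])

def within_window(media_type, digitization_date, report_date):
    """True if a problem report falls inside the contractual window."""
    return report_date <= report_deadline(media_type, digitization_date)
```

Automating a check like this is itself a small act of risk reduction: it keeps a problem found on day 29 from quietly becoming a problem reported on day 31.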
There are two ways in which an operation can avoid risk altogether: by choosing not to take whatever action exposes it to risk or by employing a resource that removes the risk.
Avoiding actions that result in greater risk is often not an option for media digitization operations. For example, the use of parallel transfer workflows is typically considered to carry greater risk than 1:1 workflows, although this is a complex question that is open to debate. Institutions facing cost and time limitations may find it unreasonable to digitize their collections using only 1:1 workflows, which cost more and take longer. Similar issues may be found in specific workflow steps. Performing an azimuth adjustment as part of an audio tape digitization workflow adds significant risk: if not done well, the audio will be missing high frequencies. However, this step is essential to obtaining maximum fidelity and accuracy in the digital file and is considered mandatory.
There are, however, some areas in which risk can be successfully avoided. For example, MDPI removes risk by designing QC checks into its post-processing system. All files are handled by post-processing and therefore all are subject to this series of checks. A list of some of the QC checks performed by the post-processing system may be found in the earlier blog post on types of quality control. In this way, we remove practically all risk for the variables checked by the system. For example, all audio files are checked for bit depth. If the files are not 24 bit, they will be failed and the original recordings sent back to the vendor for re-digitization. There is no risk of inaccuracy since computers are quite capable of accurately checking this variable and every single file is checked.
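As an illustration of this kind of automated check, here is a minimal sketch that reads the declared bits per sample from a WAV file's header. The parser and function names are my own; MDPI's post-processing system is a far more extensive suite of checks:

```python
import struct

def wav_bit_depth(path):
    """Return the bits-per-sample declared in a WAV file's fmt chunk.

    Minimal RIFF parser; it walks the chunk list, so extra metadata
    chunks (e.g. the 'bext' chunk in Broadcast Wave files) are skipped.
    """
    with open(path, "rb") as f:
        riff, _, wave_id = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave_id != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                raise ValueError("no fmt chunk found")
            chunk_id, size = struct.unpack("<4sI", header)
            if chunk_id == b"fmt ":
                fmt = f.read(size)
                # bits per sample is the last field of the base fmt chunk
                return struct.unpack_from("<H", fmt, 14)[0]
            f.seek(size + (size & 1), 1)  # RIFF chunks are word-aligned

def check_file(path, required_bits=24):
    """Pass/fail a file against the project's bit-depth specification."""
    return wav_bit_depth(path) == required_bits
```

Because a check like this is deterministic and applied to every file, it carries none of the sampling uncertainty that attaches to human review.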
Even when risk can’t be avoided altogether, the probability of loss can still be lessened by applying controls or taking particular actions. With this in mind, MDPI has implemented a number of policies, procedures, and programs to reduce the risk that out-of-spec digital files will be placed into preservation storage. First, all fragile formats such as wax cylinders, lacquer discs, wire recordings, and ½” EIAJ videotapes are automatically routed to the 1:1 workflows used by the IU Media Digitization Studios (IUMDS) rather than the parallel transfer workflows used by the vendor. Playback of these formats typically requires constant attention, which can only be provided by an engineer digitizing one recording at a time. There is the added bonus that recordings in these formats tend to represent some of the highest value content for IU. This policy is more akin to a quality assurance step than a quality control check.
MDPI also implemented a human-intensive QC program to reduce risk. An MDPI staff member listens to or views files from a randomly selected sample of 10% of digitized recordings to judge their suitability for preservation storage. This part of the QC operation includes the use of direct QC, value-based QC, and risk-based QC as described in the earlier post on types of QC.
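A 10% random sample like the one described above can be drawn in a few lines. This is an illustrative sketch rather than MDPI's selection code; the optional seed is an assumption of mine, there to make a draw reproducible if the selection needs to be auditable:

```python
import random

def qc_sample(recording_ids, rate=0.10, seed=None):
    """Draw a random QC sample from a batch of digitized recordings.

    rate=0.10 mirrors the 10% sample described above; passing a seed
    makes the same draw repeatable for audit purposes.
    """
    rng = random.Random(seed)
    k = max(1, round(len(recording_ids) * rate))  # inspect at least one item
    return sorted(rng.sample(recording_ids, k))
```

For example, `qc_sample(batch_ids, seed=2016)` would hand a reviewer the same twenty items from a two-hundred-item batch every time the selection is re-run.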
Finally, MDPI has implemented a program to retrospectively select and perform focused QC on files created from some of IU’s highest-value recordings and collections to provide further assurance that they meet our specification. Our aim is to confirm that the most significant content owned by IU, as selected by curators from the media-holding units themselves, was digitized accurately.
The fourth response is to accept a risk that we consider tolerable, recognizing it and working with it in the interest of achieving a greater gain. For example, the IU Music Library holds some 38,000 open reel tape recordings of student and faculty recitals and concerts dating from the 1950s. While curatorial staff tell us that there are recordings of a number of prominent classical and jazz musicians interspersed within this collection, the majority of items are judged to be of moderate value. That is, they are valuable enough to justify digitization and long-term preservation but are not considered highly valuable. In addition, some of the most valuable tapes were digitized as part of an earlier project. Digitizing this collection using parallel transfer workflows may entail a greater risk than using 1:1 workflows. IU is willing to accept this risk rather than incurring the much higher cost of using 1:1 workflows to digitize this large collection.
The boundaries between these risk response procedures can be a little blurry. For example, transferring risk can also be thought of as a form of avoiding risk for the party that hands the risk off to another entity. Also, reducing risk can imply accepting whatever risk remains after it is reduced.
It is critical to acknowledge and confront the risks inherent in digitization work for the sake of future staff who must manage preserved content and for future researchers who must rely on the content for their inquiry. Although it is not possible to remove or reduce all risk associated with media digitization, it is feasible to manage risk through the QC system so that we have a high level of confidence that the products of digitization meet our specification. Using procedures for transferring, avoiding, reducing, and accepting risk enables us to find problems, prevent problems, reduce the likelihood that problems will occur, and understand areas in which a small number of problems are acceptable. All of this engenders trust in the output of the digitization process.