Digital obsolescence is the risk of data loss caused by the inability to access digital assets, due to the hardware or software required for information retrieval being repeatedly replaced by newer devices and systems, resulting in increasingly incompatible formats. [2] [3] The threat of an eventual "digital dark age", in which large swaths of important cultural and intellectual information stored on archaic formats becomes irretrievably lost, attracted little concern until the 1990s. Since then, digital preservation efforts in the information and archival fields have implemented protocols and strategies such as data migration and technical audits, along with the salvage and emulation of antiquated hardware and software, to limit the potential damage to long-term information access. [3] [4] [5]
A false sense of security persists regarding digital documents: because an unlimited number of identical copies can be created from original files, many users assume that their documents have a virtually indefinite shelf life. [5] In reality, the media used for digital information storage and access present unique preservation challenges compared to many of the physical formats traditionally handled by archives and libraries. Paper materials and printed media migrated to film-based microform, for example, can remain accessible for centuries if created and maintained under ideal conditions, compared to the mere decades of physical stability offered by magnetic tape, disk, and optical formats. [7] Digital media therefore face more urgent preservation concerns than the gradual change in written or spoken language experienced with the printed word.
Little professional thought in the fields of library and archival science was directed toward digital obsolescence while the use of computerized systems grew more widespread, but substantial discussion began to emerge in the 1990s. [4] [5] Despite this, few options were proposed as genuine alternatives to the standard method of continuously migrating data to newer storage media, employed since magnetic tape began succeeding paper punch cards as practical data storage in the 1960s and 1970s. [4] [8] [9] These basic migration practices persist into the modern era of hard disk and solid-state drives, as research has shown that many digital storage media last considerably less time in the field than manufacturer claims or laboratory testing suggest, leading to the facetious observation that "digital documents last forever—or five years, whichever comes first." [5]
The causes of digital obsolescence are not always purely technical. Capitalistic accumulation and consumerism have been labeled key motivators of digital obsolescence in society, with newly introduced products frequently assigned greater value than older ones. [10] Digital preservation relies on the continuous maintenance and use of hardware and software formats, which the threat of obsolescence can interfere with. Four types of digital obsolescence exist in the realm of hardware and software access. [4]
Because the majority of digital information relies on both hardware and software for curation and retrieval, it is important to classify separately how digital obsolescence affects digital preservation through each of these two mediums.
Hardware concerns are two-fold in archival and library fields: in addition to the physical storage medium of magnetic tape, optical disc, or solid-state computer memory, a separate electronic device is often required for information access. While proper storage can mitigate some environmental vulnerabilities of storage formats (including dust, humidity, radiation, and temperature) and extend preservation for decades, other endangering factors are inevitable. [12] [7] Magnetic tape and floppy disks are vulnerable both to the deterioration of the adhesive holding the magnetic data layer to its backing and to the demagnetization of the data layer, commonly called "bit rot"; optical discs are specifically susceptible to physical damage to their readable surface and to oxidation occurring between improperly sealed outer layers, a process referred to as "disc rot" or, inaccurately, "laser rot" (particularly in reference to LaserDiscs). [13] Older forms of floating-gate MOSFET-based read-only-memory storage, such as (some) cartridges and (most) memory cards, encounter their own form of bit rot when the charges representing individual bits of binary information dissipate beyond a certain level (called "flipping") and the data is rendered unreadable. [14]
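Although bit rot originates in the physical medium, archives typically detect it in software through periodic fixity checks against previously recorded checksums. The following is a minimal sketch of such a check, assuming a simple manifest that maps file names to expected SHA-256 values; the manifest layout and function names are illustrative and not drawn from the cited sources.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute a SHA-256 fixity value by streaming the file in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_damaged(manifest: dict, root: Path) -> list:
    """Return files whose current checksum no longer matches the manifest,
    or which have gone missing entirely."""
    damaged = []
    for relative_name, expected in manifest.items():
        target = root / relative_name
        if not target.exists() or sha256_of(target) != expected:
            damaged.append(relative_name)
    return damaged
```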
The playback or recording devices appropriate to each format possess their own vulnerabilities. Cassette decks and disk drives rely on the functionality of precision-manufactured moving parts that are susceptible to damage caused by repetitive physical stress and foreign materials like dust and grime. Routine maintenance, calibration, and cleaning can help extend the lifetime of many devices, but broken or failing parts will need repair or replacement: sourcing parts becomes more difficult and expensive as the supply of stock for older machines grows scarce, and the necessary technical skills become harder to find as newer machines and storage formats use fewer electromechanical parts and more integrated circuits and other complex components. [12]
Only a decade after the 1970s Viking program, NASA personnel discovered that much of the mission data stored on magnetic tapes, including over 3,000 unprocessed images of the Martian surface transmitted by the two Viking probes, was inaccessible due to a multitude of factors. [15] Although the agency possessed notes written by long-departed or deceased programmers, the notes were indecipherable, and the computer hardware and source code needed to correctly run the decoding software had been replaced and disposed of. [15] [4] The information was eventually recovered after more than a year of reverse engineering how the raw data had been encoded onto the tapes, which included consulting the original engineers of the Viking landers' cameras and imaging hardware. [15] NASA experienced similar issues when attempting to recover and process images from 1960s lunar orbiter missions: engineers at the Jet Propulsion Laboratory acknowledged in 1990, following a one-year search that located a compatible data tape reader at a United States Air Force base, that a missing part might need to be rebuilt in-house if a replacement could not be sourced from computer salvage yards. [15]
Over the past several decades, a number of once industry-standard file formats and application platforms for data, images, and text have been repeatedly superseded by newer iterations, often with increasing degrees of incompatibility both with one another and within their own product lines. Such incompatibilities now frequently extend to which version of an operating system is installed (for example, versions of Microsoft Works predating Version 4.5 cannot run on Windows 2000 and later). One example of a developer cancelling an instance of planned obsolescence occurred in 2008, when Microsoft, facing intense public outcry, retracted plans for an Office service pack that would have dropped support for a number of older file formats. [16]
Systemic obsolescence in software can be exemplified by the history of the word processor WordStar. A popular option for WYSIWYG document editing on CP/M and MS-DOS operating systems during the 1980s, WordStar lost significant market share to competitors WordPerfect and Microsoft Word by 1991 after a delayed port to Windows 1.0. [17] [18] Further development of the Windows version stopped in 1994, and WordStar 7 for MS-DOS was last updated in 1999. [19] Over time, every version of WordStar grew increasingly incompatible with versions of Windows beyond 3.1, to the frustration of long-devoted users, including authors William F. Buckley, Jr. and Anne Rice. [20] [21]
Digital obsolescence has a prominent effect on the preservation of video game history, since many older games and hardware platforms were regarded by players as ephemeral products amid the continuous cycle of computer hardware upgrades and home console generations. Such cycles are often the result of both systemic and technical obsolescence. Some of the oldest computer games, like 1962's Spacewar! for the PDP-1 commercial minicomputer, were developed for hardware platforms so outdated that they are virtually nonexistent today. [22] Many older games of the 1960s and 1970s built for contemporary mainframe terminals and microcomputers can only be played today through software emulation. While video games and other software applications can be orphaned by their parent developers or publishing companies, the copyright issues surrounding software remain a complicated hurdle in the path of digital preservation. [22]
A prime example of copyright issues with software was encountered during preservation efforts for the BBC Domesday Project, a 1986 UK multimedia data collection survey that commemorated the 900th anniversary of the original Domesday Book. While the project's specially customized LaserDisc reader presented its own hardware-based preservation problems, the combination of one million personal copyrights belonging to participating civilians and corporate claims on the specialized computer hardware means that publicly accessible digital preservation efforts might be stalled until 2090. [23] [24]
Organizations possessing digital archives should perform assessments of their records in order to identify file corruption and reduce the risks associated with file format obsolescence. Such assessments can be accomplished through internal file format action plans, which list the digital file types in an archive's holdings and document the actions required to ensure their continued accessibility. [25]
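As a rough illustration of the inventory step behind such an action plan, the sketch below walks an archive's holdings and tallies files by extension. The directory layout and output are assumptions for the example; a real plan would also identify formats by signature (for instance against a registry such as PRONOM) and record the preservation action chosen for each format.

```python
from collections import Counter
from pathlib import Path

def format_inventory(holdings_root: str) -> Counter:
    """Tally the files in an archive's holdings by lower-cased extension."""
    counts = Counter()
    for path in Path(holdings_root).rglob("*"):
        if path.is_file():
            counts[path.suffix.lower() or "<no extension>"] += 1
    return counts

if __name__ == "__main__":
    # Hypothetical holdings directory; replace with the archive's real root.
    for extension, total in format_inventory("holdings").most_common():
        print(f"{extension}\t{total}")
```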
One emerging strategy for combating digital obsolescence is the adoption of open-source software, owing to its source code availability, transparency, and potential adaptability to modern hardware environments. [26] [27] For example, the Apache Software Foundation's OpenOffice application supports a number of legacy word processor formats, including Version 6 of Microsoft Word, along with basic support for Version 4 of WordPerfect. [16] This contrasts with criticism directed at Microsoft's own Open XML format by the open source community over non-disclosure agreements and demands placed on translator development. [27]
Standard strategies for digital preservation employed by information institutions are frequently interconnected or otherwise related in function or purpose. Bitstream copying (or data backup) is a foundational operation often performed before many other practices, and facilitates redundancy across multiple storage locations; refreshing is the transfer of unchanged data, frequently between identical or functionally similar storage formats, while migration converts the format or coding of digital information so that it can be moved between different operating systems and hardware generations. [4] Normalization reduces organizational complexity for archival institutions by reducing the number of similar file types through conversion, and encapsulation bundles digital information with its associated metadata to guarantee information accessibility. [4] Digital archives employ canonicalization to ensure that key aspects of documents have survived the process of conversion, while reliance on standards established by regional archival institutions maintains organization within the broader field. [4] Technology preservation (also called the computer museum approach) and digital archeology respectively involve institutions maintaining possession of, or access to, legacy hardware and software platforms, and the salvage methods employed to recover digital information from damaged or obsolete media and devices. [4] Following recovery, some data, such as documentation, can be converted to analog backups in the form of physically accessible copies, while executable code can be launched through emulation platforms within modern hardware and software environments designed to simulate obsolete computer systems. [4]
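Of the strategies listed above, encapsulation lends itself to a brief illustration: the payload files are packaged together with machine-readable metadata describing what they are and how to verify them. The sketch below writes such a package as a plain directory with a JSON metadata file; the field names and layout are assumptions for the example rather than any particular archival standard.

```python
import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

def encapsulate(files: list, destination: Path, description: str) -> None:
    """Copy payload files into a package directory alongside metadata
    recording their checksums, sizes, and the package creation time."""
    payload_dir = destination / "payload"
    payload_dir.mkdir(parents=True, exist_ok=True)
    records = []
    for source in files:
        target = payload_dir / source.name
        shutil.copy2(source, target)
        records.append({
            "name": source.name,
            "sha256": hashlib.sha256(target.read_bytes()).hexdigest(),
            "size_bytes": target.stat().st_size,
        })
    metadata = {
        "description": description,
        "created": datetime.now(timezone.utc).isoformat(),
        "files": records,
    }
    (destination / "metadata.json").write_text(json.dumps(metadata, indent=2))
```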
Writing in 1999, Jeff Rothenberg was critical of many contemporary preservation procedures and how inadequately they addressed digital obsolescence, which he considered the most prominent problem in long-term digital information storage. Rothenberg disapproved of the reliance on hard copies, arguing that printing digital documents strips them of their inherent "digital" qualities, including machine readability and dynamic user functionality. [5] Computer museums were also cited as an inadequate practice: only a limited number of locations can realistically maintain obsolete hardware indefinitely, restricting full access to legacy digital documents, and most older data does not exist in formats that take full advantage of its original hardware or software environments. [5] Two digital preservation processes were specifically criticized: the implementation of relational database (RDB) standards and an overreliance on migration. While designed for standardization, RDBs and the features of their management systems (RDBMS) often promoted unintentionally insular practices among regional institutions, introducing incompatibilities between RDBs, while the ubiquity of file and program migration frequently risked failing to compensate for paradigm shifts in conversion between successive software environments. [5] Emulation, with the digital data supported by an encapsulation of metadata, documentation, and software and emulation environment specifications, was argued to be the most suitable preservation practice in the face of digital obsolescence. [5]
The UK National Archives published a second revision of its Information Assurance Maturity Model (IAMM) in 2009, providing an overview of digital obsolescence risk management for institutions and businesses. After instructing senior information risk owners on the initial requirements for determining both the potential risk of digital obsolescence and the mitigating actions to counter it, the guide lays out a multi-step process for maintaining the digital continuity of archival information. [28] The steps range from assigning responsibility for information continuity and confirming the extent of content metadata, to ensuring that critical information remains discoverable through institutional use and that system migration does not affect information accessibility, to guaranteeing IT support and enforcing contingency plans so that information survives organizational changes. [28]
In 2014, the National Digital Stewardship Alliance recommended developing file format action plans, stating that "it is important to shift from more abstract considerations about file format obsolescence to develop actionable strategies for monitoring and mining information about the heterogeneous digital files the organizations are managing". [29] Other important resources for assessment support are the Library of Congress' Sustainability of Digital Formats page and the UK National Archives' PRONOM online file format registry.
CERN began its Digital Memory Project in 2016, aiming to preserve decades of the organization's media output through standardized initiatives. [30] CERN determined that its solution would require continuous access to metadata, the implementation of an Open Archival Information System (OAIS) archive as soon as possible to reduce costs, and the advance execution of any new system's archiving plan. [30] Building on OAIS and the ISO 16363 standard for the certification of trustworthy digital repositories (TDR), CERN implemented E-Ternity as the prototype for its compliant digital archive model. [30]
On 1 January 2021, Adobe ended support for its Flash Player and blocked content from running in it, citing the advancement of open standards for the Web. [31] The move, announced in July 2017, affected the user experience of millions of websites to varying degrees. [32] Since January 2018, the Flashpoint Archive has been one of several Adobe Flash Player preservation projects, salvaging more than 160,000 animations and games. [33]
Computer data storage or digital data storage is a technology consisting of computer components and recording media that are used to retain digital data. It is a core function and fundamental component of computers.
Data storage is the recording (storing) of information (data) in a storage medium. Handwriting, phonographic recording, magnetic tape, and optical discs are all examples of storage media. Biological molecules such as RNA and DNA are considered by some as data storage. Recording may be accomplished with virtually any form of energy. Electronic data storage requires electrical power to store and retrieve data.
A disk image is a snapshot of a storage device's structure and data typically stored in one or more computer files on another storage device.
Digitization is the process of converting information into a digital format. The result is the representation of an object, image, sound, document, or signal obtained by generating a series of numbers that describe a discrete set of points or samples. The result is called a digital representation or, more specifically, a digital image for the object and a digital form for the signal. In modern practice, the digitized data takes the form of binary numbers, which facilitates processing by digital computers and other operations, but digitizing simply means "the conversion of analog source material into a numerical format"; the decimal or any other number system can be used instead.
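To make the sampling and quantization steps concrete, the sketch below digitizes a synthetic sine wave standing in for an analog source signal; the sample rate and bit depth are arbitrary choices for the example.

```python
import math

def digitize(signal, duration_s: float, sample_rate_hz: int, bits: int) -> list:
    """Sample a continuous-time signal and quantize each sample to `bits` bits.

    `signal` is a function of time returning values in the range [-1.0, 1.0].
    """
    levels = 2 ** bits
    samples = []
    total = int(duration_s * sample_rate_hz)
    for n in range(total):
        value = signal(n / sample_rate_hz)                # sampling
        code = round((value + 1.0) / 2.0 * (levels - 1))  # quantization
        samples.append(code)
    return samples

# Example: one millisecond of a 440 Hz tone, sampled at 44.1 kHz with 8-bit depth.
tone = lambda t: math.sin(2 * math.pi * 440 * t)
codes = digitize(tone, 0.001, 44_100, 8)
```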
A ROM image, or ROM file, is a computer file which contains a copy of the data from a read-only memory chip, often from a video game cartridge, an arcade game's main board, or a computer's firmware. The term is frequently used in the context of emulation, whereby older games or firmware are copied to ROM files on modern computers and can, using a piece of software known as an emulator, be run on a device other than the one for which they were designed. ROM burners are used to copy ROM images to hardware, such as ROM cartridges or ROM chips, for debugging and QA testing.
The BBC Domesday Project was a partnership between Acorn Computers, Philips, Logica, and the BBC to mark the 900th anniversary of the original Domesday Book, an 11th-century census of England. It has been cited as an example of digital obsolescence on account of the physical medium used for data storage.
DECtape, originally called Microtape, is a magnetic tape data storage medium used with many Digital Equipment Corporation computers, including the PDP-6, PDP-8, LINC-8, PDP-9, PDP-10, PDP-11, PDP-12, and the PDP-15. On DEC's 32-bit systems, VAX/VMS support for it was implemented but did not become an official part of the product lineup.
Data migration is the process of selecting, preparing, extracting, and transforming data and permanently transferring it from one computer storage system to another. Additionally, the validation of migrated data for completeness and the decommissioning of legacy data storage are considered part of the entire data migration process. Data migration is a key consideration for any system implementation, upgrade, or consolidation, and it is typically performed in such a way as to be as automated as possible, freeing up human resources from tedious tasks. Data migration occurs for a variety of reasons, including server or storage equipment replacements, maintenance or upgrades, application migration, website consolidation, disaster recovery, and data center relocation.
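A minimal sketch of the validation step is shown below, with the legacy and replacement stores modelled as plain dictionaries of record bytes (an assumption made purely for illustration; in practice they would be databases or file systems): completeness is checked by comparing keys, and integrity by comparing per-record checksums.

```python
import hashlib

def checksum(record: bytes) -> str:
    """Hash a single record so its content can be compared across stores."""
    return hashlib.sha256(record).hexdigest()

def migrate(source: dict, target: dict) -> None:
    """Copy every record from the legacy store into the new store."""
    for key, record in source.items():
        target[key] = record

def validate(source: dict, target: dict) -> bool:
    """Confirm completeness (same keys) and integrity (same content per key)."""
    if source.keys() != target.keys():
        return False
    return all(checksum(source[k]) == checksum(target[k]) for k in source)
```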
UVC-based preservation is an archival strategy for handling the preservation of digital objects. It employs a Universal Virtual Computer (UVC), a virtual machine (VM) specifically designed for archival purposes that allows both emulation and migration to a language-neutral format such as XML.
In library and archival science, digital preservation is a formal process to ensure that digital information of continuing value remains accessible and usable in the long term. It involves planning, resource allocation, and application of preservation methods and technologies, and combines policies, strategies and actions to ensure access to reformatted and "born-digital" content, regardless of the challenges of media failure and technological change. The goal of digital preservation is the accurate rendering of authenticated content over time.
Digital permanence addresses the history and development of digital storage techniques, specifically quantifying the expected lifetime of data stored on various digital media and the factors which influence the permanence of digital data. It is often a mix of ensuring the data itself can be retained on a particular form of media and that the technology remains viable. Where possible, as well as describing expected lifetimes, factors affecting data retention will be detailed, including potential technology issues.
The term Open Archival Information System refers to the ISO OAIS Reference Model for an OAIS. This reference model is defined by recommendation CCSDS 650.0-B-2 of the Consultative Committee for Space Data Systems; this text is identical to ISO 14721:2012. The CCSDS's purview is space agencies, but the OAIS model it developed has proved useful to other organizations and institutions with digital archiving needs. OAIS, originally published as ISO 14721:2003, is widely accepted and utilized by various organizations and disciplines, both national and international, and was designed to ensure preservation. The OAIS standard is considered the optimum standard to create and maintain a digital repository over a long period of time.
The digital dark age is a lack of historical information in the digital age as a direct result of outdated file formats, software, or hardware that becomes corrupt, scarce, or inaccessible as technologies evolve and data decays. Future generations may find it difficult or impossible to retrieve electronic documents and multimedia, because they have been recorded in an obsolete and obscure file format, or on an obsolete physical medium; for example, floppy disks. The name derives from the term Dark Ages in the sense that there could be a relative lack of records in the digital age as documents are transferred to digital formats and original copies are lost. An early mention of the term was at a conference of the International Federation of Library Associations and Institutions (IFLA) in 1997. The term was also mentioned in 1998 at the Time and Bits conference, which was co-sponsored by the Long Now Foundation and the Getty Conservation Institute.
The conservation and restoration of new media art is the study and practice of techniques for sustaining new media art created using materials such as digital, biological, performative, and other variable media.
Oral history preservation is the field that deals with the care and upkeep of oral history materials, whatever format they may be in. Oral history is a method of historical documentation, using interviews with living survivors of the time being investigated. Oral history often touches on topics scarcely touched on by written documents, and by doing so, fills in the gaps of records that make up early historical documents.
Digital curation is the selection, preservation, maintenance, collection, and archiving of digital assets. Digital curation establishes, maintains, and adds value to repositories of digital data for present and future use. This is often accomplished by archivists, librarians, scientists, historians, and scholars. Enterprises are starting to use digital curation to improve the quality of information and data within their operational and strategic processes. Successful digital curation will mitigate digital obsolescence, keeping the information accessible to users indefinitely. Digital curation includes digital asset management, data curation, digital preservation, and electronic records management.
PREservation Metadata: Implementation Strategies (PREMIS) is the de facto digital preservation metadata standard.
In computing, an emulator is hardware or software that enables one computer system to behave like another computer system. An emulator typically enables the host system to run software or use peripheral devices designed for the guest system. Emulation refers to the ability of a computer program in an electronic device to emulate another program or device.
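As a toy illustration of the idea, the sketch below interprets a tiny invented "guest" instruction set on the host machine, mapping each guest opcode to host-side behaviour; the instruction set is entirely hypothetical and far simpler than any real emulated system.

```python
def run_guest(program: list, memory_size: int = 16) -> list:
    """Interpret a toy guest program of (opcode, *operands) tuples.

    Supported opcodes of the imaginary machine:
      ("LOAD", addr, value)  - store an immediate value at a memory address
      ("ADD",  dst, a, b)    - memory[dst] = memory[a] + memory[b]
      ("PRINT", addr)        - emulate guest output on the host console
    """
    memory = [0] * memory_size
    for instruction in program:
        opcode, *operands = instruction
        if opcode == "LOAD":
            addr, value = operands
            memory[addr] = value
        elif opcode == "ADD":
            dst, a, b = operands
            memory[dst] = memory[a] + memory[b]
        elif opcode == "PRINT":
            print(memory[operands[0]])
        else:
            raise ValueError(f"unknown guest opcode: {opcode}")
    return memory

# Guest program: compute 2 + 3 and print the result via the emulated output.
run_guest([("LOAD", 0, 2), ("LOAD", 1, 3), ("ADD", 2, 0, 1), ("PRINT", 2)])
```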
Database preservation usually involves converting the information stored in a database to a form likely to be accessible in the long term as technology changes, without losing the initial characteristics of the data.
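A minimal sketch of such a conversion, assuming the source is a SQLite database: each table is exported to CSV, a plain-text form more likely to remain readable as technology changes. Only standard-library calls are used; the output directory name is an assumption.

```python
import csv
import sqlite3
from pathlib import Path

def export_tables_to_csv(db_path: str, out_dir: str) -> None:
    """Dump every table of a SQLite database to one CSV file per table,
    preserving column names in the header row."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    connection = sqlite3.connect(db_path)
    try:
        tables = [row[0] for row in connection.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        for table in tables:
            cursor = connection.execute(f'SELECT * FROM "{table}"')
            with (out / f"{table}.csv").open("w", newline="") as handle:
                writer = csv.writer(handle)
                writer.writerow(col[0] for col in cursor.description)
                writer.writerows(cursor)
    finally:
        connection.close()
```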
The conservation and restoration of time-based media art is the practice of preserving time-based works of art. Preserving time-based media is a complex undertaking within the field of conservation that requires an understanding of both physical and digital conservation methods. It is the job of the conservator to evaluate possible changes made to the artwork over time. These changes could include short, medium, and long-term effects caused by the environment, exhibition design, technicians, preferences, or technological development. The approach to each work is determined through various conservation and preservation strategies, continuous education and training, and resources available from institutions and organizations across the globe.