Emulator

DOSBox emulates the command-line interface of DOS.

In computing, an emulator is hardware or software that enables one computer system (called the host) to behave like another computer system (called the guest). An emulator typically enables the host system to run software or use peripheral devices designed for the guest system. Emulation refers to the ability of a computer program in an electronic device to emulate (or imitate) another program or device. Many printers, for example, are designed to emulate Hewlett-Packard LaserJet printers because so much software is written for HP printers. If a non-HP printer emulates an HP printer, any software written for a real HP printer will also run in the non-HP printer emulation and produce equivalent printing. Since at least the 1990s, many video game enthusiasts have used emulators to play classic (and/or forgotten) arcade games from the 1980s using the games' original 1980s machine code and data, which is interpreted by a current-era system.

In hardware

A hardware emulator is an emulator which takes the form of a hardware device. Examples include the DOS-compatible card installed in some 1990s-era Macintosh computers like the Centris 610 or Performa 630 that allowed them to run personal computer (PC) software programs and FPGA-based hardware emulators. In a theoretical sense, the Church-Turing thesis implies that (under the assumption that enough memory is available) any operating environment can be emulated within any other environment. However, in practice, it can be quite difficult, particularly when the exact behavior of the system to be emulated is not documented and has to be deduced through reverse engineering. It also says nothing about timing constraints; if the emulator does not perform as quickly as the original hardware, the emulated software may run much more slowly than it would have on the original hardware, possibly triggering timer interrupts that alter behavior.

In preservation

Emulation is a strategy in digital preservation to combat obsolescence. Emulation focuses on recreating an original computer environment, which can be time-consuming and difficult to achieve, but valuable because of its ability to maintain a closer connection to the authenticity of the digital object. [2] Emulation addresses the original hardware and software environment of the digital object and recreates it on a current machine. [3] The emulator allows the user to have access to any kind of application or operating system on a current platform, while the software runs as it did in its original environment. [4] Jeffrey Rothenberg, an early proponent of emulation as a digital preservation strategy, states, "the ideal approach would provide a single extensible, long-term solution that can be designed once and for all and applied uniformly, automatically, and in synchrony (for example, at every refresh cycle) to all types of documents and media". [5] He further states that this should apply not only to out-of-date systems, but also be upwardly mobile to future unknown systems. [6] Practically speaking, when a new version of an application is released, rather than addressing compatibility issues and migration for every digital object created in the previous version, one could create an emulator for the application, allowing access to all of those digital objects.

Benefits

Basilisk II emulates a Macintosh 68k using interpretation code and dynamic recompilation.

Obstacles

In new media art

Because of its primary use of digital formats, new media art relies heavily on emulation as a preservation strategy. Artists such as Cory Arcangel specialize in resurrecting obsolete technologies in their artwork and recognize the importance of a decentralized and deinstitutionalized process for the preservation of digital culture. In many cases, the goal of emulation in new media art is to preserve a digital medium so that it can be saved indefinitely and reproduced without error, so that there is no reliance on hardware that ages and becomes obsolete. The paradox is that the emulation and the emulator have to be made to work on future computers. [14]

In future systems design

Emulation techniques are commonly used during the design and development of new systems. Emulation eases the development process by providing the ability to detect, recreate, and repair flaws in a design even before the system is actually built. [15] It is particularly useful in the design of multi-core systems, where concurrency errors can be very difficult to detect and correct without the controlled environment provided by virtual hardware. [16] Emulation also allows software development to take place before the hardware is ready, [17] thus helping to validate design decisions.

Types

Windows XP running an Acorn Archimedes emulator, which is in turn running a Sinclair ZX Spectrum emulator.
Tetris running on the Wzonka-Lad Game Boy emulator on AmigaOS, itself running on E-UAE on a modern Fedora Linux system.

Most emulators just emulate a hardware architecture: if operating system firmware or software is required for the desired software, it must be provided as well (and may itself be emulated). Both the OS and the software will then be interpreted by the emulator, rather than being run by native hardware. Apart from this interpreter for the emulated binary machine's language, some other hardware (such as input or output devices) must be provided in virtual form as well; for example, if writing to a specific memory location should influence what is displayed on the screen, then this would need to be emulated.

While emulation could, if taken to the extreme, go down to the atomic level, basing its output on a simulation of the actual circuitry from a virtual power source, this would be a highly unusual solution. Emulators typically stop at a simulation of the documented hardware specifications and digital logic. Sufficient emulation of some hardware platforms requires extreme accuracy, down to the level of individual clock cycles, undocumented features, unpredictable analog elements, and implementation bugs. This is particularly the case with classic home computers such as the Commodore 64, whose software often depends on highly sophisticated low-level programming tricks invented by game programmers and the "demoscene".
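As an illustration of that last point, the following is a minimal sketch of memory-mapped display emulation; the address range, function names, and helper are invented for this example, not taken from any real emulator:

#define VRAM_START 0xA000u   /* hypothetical video window bounds */
#define VRAM_END   0xC000u

static unsigned char Memory[0x10000];

static void UpdateScreen(unsigned int Offset, unsigned char Value)
{
    /* Host-side redraw would go here (stubbed for this sketch). */
    (void)Offset; (void)Value;
}

void WriteByte(unsigned int Address, unsigned char Value)
{
    Memory[Address & 0xFFFFu] = Value;
    /* A store into the video window must also refresh the emulated
       display, because on the real machine the video hardware reads
       this region directly. */
    if (Address >= VRAM_START && Address < VRAM_END)
        UpdateScreen(Address - VRAM_START, Value);
}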

In contrast, some platforms have made very little use of direct hardware addressing, such as the PlayStation Vita, for which an emulator exists. [18] In these cases, a simple compatibility layer may suffice. This translates system calls made by the foreign system into system calls for the host system, e.g., the Linux compatibility layer used on *BSD to run closed-source Linux native software on FreeBSD, NetBSD and OpenBSD. For example, while the Nintendo 64 graphics processor was fully programmable, most games used one of a few pre-made programs, which were mostly self-contained and communicated with the game via a FIFO; therefore, many emulators do not emulate the graphics processor at all, but simply interpret the commands received from the CPU as the original program would.

Developers of software for embedded systems or video game consoles often design their software on especially accurate emulators called simulators before trying it on the real hardware. This allows software to be produced and tested before the final hardware exists in large quantities, and to be tested without taking the time to copy the program to be debugged at a low level and without introducing the side effects of a debugger. In many cases, the simulator is actually produced by the company providing the hardware, which theoretically increases its accuracy.

Math co-processor emulators allow programs compiled with math instructions to run on machines that do not have the co-processor installed, though the extra work done by the CPU may slow the system down. If a math co-processor is not installed or present on the CPU, executing a co-processor instruction raises a determined interrupt ("coprocessor not available") that calls the math emulator routines. When the instruction is successfully emulated, the program continues executing.
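As a sketch of how such a co-processor emulator might be structured, the handler below performs the trapped operation in software; all names and the dispatch mechanism are invented for illustration, and a real implementation would be invoked from the CPU's "coprocessor not available" trap:

enum { FP_ADD, FP_MUL };        /* invented operation tags */

static double FpRegister[8];    /* emulated FPU register file */

void EmulateFpInstruction(int Op, int Dst, int Src)
{
    switch (Op) {
    case FP_ADD: FpRegister[Dst] += FpRegister[Src]; break;
    case FP_MUL: FpRegister[Dst] *= FpRegister[Src]; break;
    }
    /* On return, the interrupted program resumes as if a hardware
       FPU had executed the instruction. */
}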

Structure

Typically, an emulator is divided into modules that correspond roughly to the emulated computer's subsystems. Most often, an emulator will be composed of the following modules: a CPU emulator or CPU simulator (the two terms are mostly interchangeable in this context), a memory subsystem module, and various input/output (I/O) device emulators.

Buses are often not emulated, either for reasons of performance or simplicity, and virtual peripherals communicate directly with the CPU or the memory subsystem.

Memory subsystem

It is possible for the memory subsystem emulation to be reduced to simply an array of elements each sized like an emulated word; however, this model fails very quickly as soon as any location in the computer's logical memory does not match physical memory. This clearly is the case whenever the emulated hardware allows for advanced memory management (in which case, the MMU logic can be embedded in the memory emulator, made a module of its own, or sometimes integrated into the CPU simulator). Even if the emulated computer does not feature an MMU, though, there are usually other factors that break the equivalence between logical and physical memory: many (if not most) architectures offer memory-mapped I/O; even those that do not often have a block of logical memory mapped to ROM, which means that the memory-array module must be discarded if the read-only nature of ROM is to be emulated. Features such as bank switching or segmentation may also complicate memory emulation. As a result, most emulators implement at least two procedures for writing to and reading from logical memory, and it is these procedures' duty to map every access to the correct location of the correct object.

On a base-limit addressing system where memory from address 0 to address ROMSIZE-1 is read-only memory, while the rest is RAM, something along the lines of the following procedures would be typical:

void WriteMemory(word Address, word Value) {
    word RealAddress;
    RealAddress = Address + BaseRegister;
    /* Writes must land inside the mapped segment and outside ROM;
       addresses below ROMSIZE are read-only. */
    if ((RealAddress < LimitRegister) && (RealAddress >= ROMSIZE)) {
        Memory[RealAddress] = Value;
    } else {
        RaiseInterrupt(INT_SEGFAULT);
    }
}

word ReadMemory(word Address) {
    word RealAddress;
    RealAddress = Address + BaseRegister;
    /* Reads are legal anywhere inside the mapped segment, ROM included. */
    if (RealAddress < LimitRegister) {
        return Memory[RealAddress];
    } else {
        RaiseInterrupt(INT_SEGFAULT);
        return 0;
    }
}

CPU simulator

The CPU simulator is often the most complicated part of an emulator. Many emulators are written using "pre-packaged" CPU simulators, in order to concentrate on good and efficient emulation of a specific machine. The simplest form of a CPU simulator is an interpreter, which is a computer program that follows the execution flow of the emulated program code and, for every machine code instruction encountered, executes operations on the host processor that are semantically equivalent to the original instructions. This is made possible by assigning a variable to each register and flag of the simulated CPU. The logic of the simulated CPU can then more or less be directly translated into software algorithms, creating a software re-implementation that basically mirrors the original hardware implementation.

The following example illustrates how CPU simulation can be accomplished by an interpreter. In this case, interrupts are checked for before every instruction is executed, though this behavior is rare in real emulators for performance reasons (it is generally faster to use a subroutine to do the work of an interrupt).

void Execute(void) {
    if (Interrupt != INT_NONE) {
        SuperUser = TRUE;
        WriteMemory(++StackPointer, ProgramCounter);
        ProgramCounter = InterruptPointer;
    }
    switch (ReadMemory(ProgramCounter++)) {
        /*
         * Handling of every valid instruction
         * goes here...
         */
        default:
            Interrupt = INT_ILLEGAL;
    }
}

Interpreters are very popular as computer simulators, as they are much simpler to implement than more time-efficient alternative solutions, and their speed is more than adequate for emulating computers that are more than roughly a decade older than the host machine. However, the speed penalty inherent in interpretation can be a problem when emulating computers whose processor speed is on the same order of magnitude as the host machine's. Until relatively recently, emulation in such situations was widely considered impractical.

What allowed breaking through this restriction were advances in dynamic recompilation techniques. Simple a priori translation of emulated program code into code runnable on the host architecture is usually impossible for several reasons: code may be modified while in RAM, even if it is modified only by the emulated operating system when loading the code (for example, from disk), and there may be no way to reliably distinguish data (which should not be translated) from executable code.

Various forms of dynamic recompilation, including the popular just-in-time (JIT) compiler technique, try to circumvent these problems by waiting until the processor control flow jumps into a location containing untranslated code, and only then ("just in time") translating a block of the code into host code that can be executed. The translated code is kept in a code cache, and the original code is not lost or affected; this way, even data segments can be (meaninglessly) translated by the recompiler, resulting in no more than a waste of translation time.

Speed may not always be desirable, as some older games were not designed with the speed of faster computers in mind. A game designed for a 30 MHz PC with a level timer of 300 game seconds might give the player only 30 seconds on a 300 MHz PC. Other programs, such as some DOS programs, may not even run on faster computers. Particularly when emulating computers that were "closed-box", in which changes to the core of the system were not typical, software may use techniques that depend on specific characteristics of the computer it ran on (e.g. its CPU's speed), and thus precise control of the speed of emulation is important for such applications to be properly emulated.
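A minimal sketch of such a dynamic-recompilation dispatch loop follows, with invented helper names (LookupBlock, TranslateBlock, InsertBlock) standing in for a real code cache and recompiler:

typedef void (*HostCode)(void);     /* a translated guest code block */

extern unsigned int ProgramCounter; /* emulated program counter */

/* Assumed helpers: probe the code cache, recompile a guest block,
   and remember the result for next time. */
HostCode LookupBlock(unsigned int GuestPC);
HostCode TranslateBlock(unsigned int GuestPC);
void InsertBlock(unsigned int GuestPC, HostCode Block);

void RunDynarec(void)
{
    for (;;) {
        HostCode Block = LookupBlock(ProgramCounter);
        if (Block == 0) {
            /* First execution of this location: translate it "just
               in time" and cache the result; the original guest code
               is left untouched. */
            Block = TranslateBlock(ProgramCounter);
            InsertBlock(ProgramCounter, Block);
        }
        Block();   /* running the block advances ProgramCounter */
    }
}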

Input/output (I/O)

Most emulators do not, as mentioned earlier, emulate the main system bus; each I/O device is thus often treated as a special case, and no consistent interface for virtual peripherals is provided. This can result in a performance advantage, since each I/O module can be tailored to the characteristics of the emulated device; designs based on a standard, unified I/O API can, however, rival such simpler models, if well thought-out, and they have the additional advantage of "automatically" providing a plug-in service through which third-party virtual devices can be used within the emulator. A unified I/O API may not necessarily mirror the structure of the real hardware bus: bus design is limited by several electric constraints and a need for hardware concurrency management that can mostly be ignored in a software implementation.
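A unified API of that kind might look something like the following sketch, in which each virtual device registers callbacks that the emulator core invokes on port access; the structure and its members are illustrative only, not taken from any particular emulator:

typedef struct VirtualDevice {
    const char *Name;
    unsigned int PortBase, PortCount;   /* claimed I/O port range */
    unsigned char (*Read)(unsigned int Port);
    void (*Write)(unsigned int Port, unsigned char Value);
    struct VirtualDevice *Next;         /* registration list */
} VirtualDevice;

static VirtualDevice *DeviceList;       /* third-party plug-ins chain here */

unsigned char ReadPort(unsigned int Port)
{
    const VirtualDevice *d;
    /* Walk the registered devices until one claims the port. */
    for (d = DeviceList; d != 0; d = d->Next)
        if (Port >= d->PortBase && Port < d->PortBase + d->PortCount)
            return d->Read(Port);
    return 0xFF;    /* unclaimed ports read as open bus */
}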

Even in emulators that treat each device as a special case, there is usually a common basic infrastructure for managing interrupts (for example, a procedure that sets a flag readable by the CPU simulator whenever a device raises an interrupt) and for writing to and reading from the emulated memory, as with the memory-access procedures shown above.
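As a minimal sketch of the interrupt half of that infrastructure, consistent with the invented INT_* constants used in the earlier examples, a device emulator only needs to set a flag that the CPU simulator's Execute() loop polls:

enum { INT_NONE, INT_ILLEGAL, INT_SEGFAULT };   /* invented constants */

int Interrupt = INT_NONE;

void RaiseInterrupt(int Which)
{
    /* Record the pending interrupt; the CPU simulator notices it
       before fetching the next instruction, as in Execute() above. */
    Interrupt = Which;
}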

Comparison with simulation

The word "emulator" was coined in 1963 at IBM [19] during development of the NPL (IBM 360) product line, using a "new combination of software, microcode, and hardware". [20] They discovered that using microcode hardware instead of software simulation, to execute programs written for earlier IBM computers, dramatically increased simulation speed. Earlier, IBM provided simulators for, e.g., the 650 on the 705. [21] In addition to simulators, IBM had compatibility features on the 709 and 7090, [22] for which it provided the IBM 709 computer with a program to run legacy programs written for the IBM 704 on the 709 and later on the IBM 7090. This program used the instructions added by the compatibility feature [23] to trap instructions requiring special handling; all other 704 instructions ran the same on a 7090. The compatibility feature on the 1410 [24] only required setting a console toggle switch, not a support program.

In 1963, when microcode was first used to speed up this simulation process, IBM engineers coined the term "emulator" to describe the concept. Since the 2000s, it has become common to use the word "emulate" in the context of software alone. However, before 1980, "emulation" referred only to emulation with a hardware or microcode assist, while "simulation" referred to pure software emulation. [25] For example, a computer specially built for running programs designed for another architecture is an emulator, while a simulator could be a program running on a PC that allows old Atari games to be simulated on it. Purists continue to insist on this distinction, but currently the term "emulation" often means the complete imitation of a machine executing binary code, while "simulation" often refers to computer simulation, where a computer program is used to simulate an abstract model. Computer simulation is used in virtually every scientific and engineering domain, and computer science is no exception, with several projects simulating abstract models of computer systems, such as network simulation, which differs both practically and semantically from network emulation. [26]

Logic simulators

Logic simulation is the use of a computer program to simulate the operation of a digital circuit such as a processor. This is done after a digital circuit has been designed in logic equations, but before the circuit is fabricated in hardware.

Functional simulators

Functional simulation is the use of a computer program to simulate the execution of a second computer program written in symbolic assembly language or compiler language, rather than in binary machine code. By using a functional simulator, programmers can execute and trace selected sections of source code to search for programming errors (bugs) without generating binary code. This is distinct from simulating execution of binary code, which is software emulation. The first functional simulator was written by Autonetics around 1960 for testing assembly language programs intended for later execution on the military computer D-17B. This made it possible for flight programs to be written, executed, and tested before the D-17B hardware had been built. Autonetics also programmed a functional simulator for testing flight programs for later execution on the military computer D-37C.

Video game consoles

Video game console emulators are programs that allow a personal computer or video game console to emulate another video game console. They are most often used to play older 1980s-era video games on 2010s-era personal computers and more contemporary video game consoles. They are also used to translate games into other languages, to modify existing games, and in the development process of "homebrew" DIY demos and new games for older systems. The Internet has helped in the spread of console emulators, as most, if not all, would be unavailable for sale in retail outlets. Examples of console emulators that have been released in the last few decades are: RPCS3, Dolphin, Cemu, PCSX2, PPSSPP, ZSNES, Xenia, Citra, ePSXe, Project64, Visual Boy Advance, Nestopia, XQEMU, and Yuzu.

Terminal

Terminal emulators are software programs that provide modern computers and devices interactive access to applications running on mainframe computer operating systems or other host systems such as HP-UX or OpenVMS. Terminals such as the IBM 3270 or VT100, among many others, are no longer produced as physical devices. Instead, software running on modern operating systems simulates a "dumb" terminal and is able to render the graphical and text elements of the host application, send keystrokes, and process commands using the appropriate terminal protocol. Some terminal emulation applications include Attachmate Reflection, IBM Personal Communications, and Micro Focus Rumba.

Impersonation by malware

Due to their popularity, emulators have been impersonated by malware. Most of these emulators are for video game consoles such as the Xbox 360, Xbox One, and Nintendo 3DS. Generally, such emulators make currently impossible claims, such as being able to run Xbox One and Xbox 360 games in a single program. [27]

Legal issues

See also: Console emulator § Legal issues

United States

As computers and global computer networks continued to advance and emulator developers grew more skilled in their work, the length of time between the commercial release of a console and its successful emulation began to shrink. Fifth-generation consoles such as the Nintendo 64 and PlayStation, and sixth-generation handhelds such as the Game Boy Advance, saw significant progress toward emulation during their production. This led to an effort by console manufacturers to stop unofficial emulation, but consistent failures such as Sega v. Accolade, 977 F.2d 1510 (9th Cir. 1992), Sony Computer Entertainment, Inc. v. Connectix Corporation, 203 F.3d 596 (2000), and Sony Computer Entertainment America v. Bleem, 214 F.3d 1022 (2000) [28] have had the opposite effect. According to all legal precedents, emulation is legal within the United States. However, unauthorized distribution of copyrighted code remains illegal, according to both country-specific copyright law and international copyright law under the Berne Convention. [29] Under United States law, obtaining a dumped copy of the original machine's BIOS is legal under the ruling in Lewis Galoob Toys, Inc. v. Nintendo of America, Inc., 964 F.2d 965 (9th Cir. 1992), as fair use, as long as the user has a legally purchased copy of the machine. To mitigate this, however, several emulators for platforms such as the Game Boy Advance are capable of running without a BIOS file, using high-level emulation to simulate BIOS subroutines at a slight cost in emulation accuracy.

References

  1. Warick, Mike (April 1988). "MS-DOS Emulation For The 64". Compute!. p. 43. Retrieved 10 November 2013.
  2. "What is emulation?". Koninklijke Bibliotheek. Retrieved 2007-12-11.
  3. van der Hoeven, Jeffrey, Bram Lohman, and Remco Verdegem. "Emulation for Digital Preservation in Practice: The Results." The International Journal of Digital Curation 2.2 (2007): 123-132.
  4. Miura, Gregory. "Pushing the Boundaries of Traditional Heritage Policy: maintaining long-term access to multimedia content." IFLA Journal 33 (2007): 323-326.
  5. Rothenberg, Jeffrey (1998). "Criteria for an Ideal Solution." Avoiding Technological Quicksand: Finding a Viable Technical Foundation for Digital Preservation. Council on Library and Information Resources. Washington, DC. Retrieved 2008-03-08.
  6. Rothenberg, Jeffrey. "The Emulation Solution." Avoiding Technological Quicksand: Finding a Viable Technical Foundation for Digital Preservation. Washington, DC: Council on Library and Information Resources, 1998. Council on Library and Information Resources. 2008. 28 Mar. 2008 http://www.clir.org/pubs/reports/rothenberg/contents.html
  7. Miura, Gregory (2016). "Pushing the Boundaries of Traditional Heritage Policy: Maintaining long-term access to multimedia content". IFLA Journal. 33 (4): 323–6. doi:10.1177/0340035207086058.
  8. Granger, Stewart. Digital Preservation & Emulation: from theory to practice. Proc. of the ichim01 Meeting, vol. 2, 3-7 Sept. 2001. Milano, Italy. Toronto: Archives and Museum Informatics, University of Toronto, 2001. 28 Mar. 2008 http://www.leeds.ac.uk/cedars/pubconf/papers/ichim01SG.html
  9. Verdegem, Remco; Lohman, Bram; Van Der Hoeven, Jeffrey (2008). "Emulation for Digital Preservation in Practice: The Results". International Journal of Digital Curation. 2 (2): 123–32. doi:10.2218/ijdc.v2i2.35.
  10. Granger, Stewart. "Emulation as a Digital Preservation Strategy." D-Lib Magazine 6.19 (2000). 29 Mar 2008 http://www.dlib.org/dlib/october00/granger/10granger.html
  11. Rothenberg, Jeffrey. "The Emulation Solution." Avoiding Technological Quicksand: Finding a Viable Technical Foundation for Digital Preservation. Washington, DC: Council on Library and Information Resources, 1998. Council on Library and Information Resources. 2008. 28 Mar. 2008
  12. http://tcrf.net/Pok%C3%A9mon_Black_and_White#Anti-Piracy
  13. "Mega Man Star Force - The Cutting Room Floor". tcrf.net.
  14. "Echoes of Art: Emulation as preservation strategy" . Retrieved 2007-12-11.
  15. Peter Magnusson (2004). "Full System Simulation: Software Development's Missing Link".
  16. "Debugging and Full System Simulation".
  17. Vania Joloboff (2009). "Full System Simulation of Embedded Systems" (PDF).
  18. PSVEP. "PlayStation Vita Emulator Project". Retrieved 2017-08-08.
  19. Pugh, Emerson W. (1995). Building IBM: Shaping an Industry and Its Technology. MIT. p. 274. ISBN 0-262-16147-8.
  20. Pugh, Emerson W.; et al. (1991). IBM's 360 and Early 370 Systems. MIT. pp. 160-161. ISBN 0-262-16123-0.
  21. Simulation of the IBM 650 on the IBM 705
  22. "IBM Archives: 7090 Data Processing System (continued)". www-03.ibm.com. 23 January 2003.
  23. "System Compatibility Operations". Reference Manual IBM 7090 Data Processing System (PDF). March 1962. pp. 65–66. A22-6528-4.
  24. "System Compatibility Operations". IBM 1410 Principles of Operation (PDF). March 1962. pp. 56–57, 98–100. A22-0526-3.
  25. Tucker, S. G (1965). "Emulation of large systems". Communications of the ACM. 8 (12): 753–61. doi:10.1145/365691.365931.
  26. "Network simulation or emulation?". Network World. Network World. Retrieved 22 September 2017.
  27. "The Emulation Imitation". Malwarebytes Labs. Retrieved 2016-05-30.
  28. See Midway Manufacturing Co. v. Artic International, Inc., 574 F.Supp. 999, aff'd, 704 F.2d 1009 (7th Cir. 1983) (holding the computer ROM of Pac-Man to be a sufficient fixation for purposes of copyright law even though the game changes each time played) and Article 2 of the Berne Convention.