Software portability

Software portability can be exemplified with multiple devices running the same video game.

Software portability is a design objective for source code to be easily made to run on different platforms. An aid to portability is the generalized abstraction between the application logic and system interfaces. When software with the same functionality is produced for several computing platforms, portability is the key issue for development cost reduction.

Strategies

Achieving software portability may involve transferring installed files between similar systems, adapting software to different processors, or rebuilding it from source code, as described in the following sections.

Similar systems

When operating systems of the same family are installed on two computers whose processors have similar instruction sets, it is often possible to transfer the files implementing a program between them.

In the simplest case, the file or files may simply be copied from one machine to the other. However, in many cases the software is installed on a computer in a way which depends upon its detailed hardware, software, and setup, with device drivers for particular devices, using the installed operating system and supporting software components, and using particular drives or directories.

In some cases, software, usually described as "portable software", is specifically designed to run on different computers with compatible operating systems and processors, without any machine-dependent installation. Porting is no more than transferring specified directories and their contents. Software installed on portable mass storage devices such as USB sticks can be used on any compatible computer simply by plugging the storage device in, and stores all configuration information on the removable device. Hardware- and software-specific information is otherwise often stored in configuration files in specified locations (such as the registry on Windows).
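The following C sketch illustrates this self-contained style: instead of consulting machine-specific locations such as the Windows registry, the program looks for a settings file stored next to its own executable. The file name settings.conf and the use of argv[0] to locate the executable are simplifying assumptions made for this example.

/* Minimal illustrative sketch: keep configuration in a file next to the
 * executable instead of in machine-specific locations such as the registry.
 * The file name "settings.conf" is an assumption made for this example, and
 * argv[0] is not guaranteed to be a full path on every platform. */
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[])
{
    char confpath[1024];
    char line[256];
    (void)argc;

    strncpy(confpath, argv[0], sizeof(confpath) - 1);
    confpath[sizeof(confpath) - 1] = '\0';

    /* Strip the executable name, keeping only its directory. */
    char *slash = strrchr(confpath, '/');
#ifdef _WIN32
    char *bslash = strrchr(confpath, '\\');
    if (bslash && (!slash || bslash > slash))
        slash = bslash;
#endif
    if (slash)
        slash[1] = '\0';
    else
        confpath[0] = '\0';     /* no path given: use the current directory */

    strncat(confpath, "settings.conf",
            sizeof(confpath) - strlen(confpath) - 1);

    FILE *f = fopen(confpath, "r");
    if (!f) {
        printf("no configuration at %s, using defaults\n", confpath);
        return 0;
    }
    while (fgets(line, sizeof(line), f))
        printf("setting: %s", line);   /* one key=value pair per line */
    fclose(f);
    return 0;
}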

Software which is not portable in this sense will have to be modified more extensively to support the environment of the destination machine.

Different processors

As of 2011 the majority of desktop and laptop computers used microprocessors compatible with the 32- and 64-bit x86 instruction sets. Smaller portable devices use processors with different and incompatible instruction sets, such as ARM. The difference between larger and smaller devices is such that detailed software operation is different; an application designed to display suitably on a large screen cannot simply be ported to a pocket-sized smartphone with a tiny screen even if the functionality is similar.

Web applications are required to be processor independent, so portability can be achieved by using web programming techniques, such as writing in JavaScript. Such a program can run in any common web browser. Such web applications must, for security reasons, have limited control over the host computer, especially regarding reading and writing files. Non-web programs, installed upon a computer in the normal manner, can have more control, and yet achieve system portability by linking to portable libraries that provide the same interface on different systems.
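One common way to provide such a portable library interface in C is conditional compilation: application code calls a single function, and the platform-specific implementation is selected when the library is built. The sketch below is illustrative only; the wrapper name sleep_ms is an assumption made for this example, not an existing library function.

/* One interface, two implementations: the application always calls
 * sleep_ms(), and the preprocessor picks the platform-specific body. */
#ifdef _WIN32
#include <windows.h>

void sleep_ms(unsigned int ms)
{
    Sleep(ms);                         /* Win32: interval in milliseconds */
}
#else
#include <time.h>

void sleep_ms(unsigned int ms)
{
    struct timespec ts;
    ts.tv_sec  = ms / 1000;
    ts.tv_nsec = (long)(ms % 1000) * 1000000L;
    nanosleep(&ts, NULL);              /* POSIX: seconds plus nanoseconds */
}
#endif

int main(void)
{
    sleep_ms(500);                     /* identical application code on every platform */
    return 0;
}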

Source code portability

Software can be compiled and linked from source code for different operating systems and processors if written in a programming language supporting compilation for the platforms. This is usually a task for the program developers; typical users have neither access to the source code nor the required skills.

In open-source environments such as Linux, the source code is available to all. In earlier days, source code was often distributed in a standardised format and could be built into executable code with a standard Make tool for any particular system by moderately knowledgeable users, provided no errors occurred during the build. Some Linux distributions deliver software to users in source form. In these cases there is usually no need for detailed manual adaptation of the software for the system; the distribution adapts the compilation process to match the target system.
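Such adaptation is typically driven by a configure step or build recipe that defines feature macros, which the source code then uses to choose between a system facility and a bundled fallback. The sketch below follows the common Autoconf-style naming convention (HAVE_STRLCPY); the macro and the wrapper copy_string are assumptions made for illustration, not part of any fixed standard.

#include <stdio.h>
#include <string.h>

#ifdef HAVE_STRLCPY
/* The build system detected strlcpy() in the system C library. */
#define copy_string(dst, src, size) strlcpy((dst), (src), (size))
#else
/* Fallback for systems whose C library lacks strlcpy(). */
static size_t copy_string(char *dst, const char *src, size_t size)
{
    size_t len = strlen(src);
    if (size > 0) {
        size_t n = (len >= size) ? size - 1 : len;
        memcpy(dst, src, n);
        dst[n] = '\0';
    }
    return len;
}
#endif

int main(void)
{
    char buf[8];
    copy_string(buf, "portability", sizeof(buf));
    printf("%s\n", buf);    /* always prints "portabi", truncated safely */
    return 0;
}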

Effort to port source code

Even with seemingly portable languages like C and C++, the effort to port source code can vary considerably. The authors of UNIX/32V (1979) reported that "[t]he (Bourne) shell [...] required by far the largest conversion effort of any supposedly portable program, for the simple reason that it is not portable." [1]

Sometimes the effort consists of recompiling the source code, but sometimes it is necessary to rewrite major parts of the software. Many language specifications describe implementation-defined behaviour (e.g. right-shifting a signed integer in C can perform either a logical or an arithmetic shift). Operating system functions or third-party libraries might not be available on the target system. Some functions are available on a target system but exhibit slightly different behaviour (e.g. utime() fails under Windows with EACCES when it is called for a directory). The program code itself can also contain non-portable elements, such as the paths of include files, drive letters, or the backslash as a path separator. Implementation-defined properties such as byte order and the size of an int can also increase the porting effort. In practice, the claim that languages such as C and C++ are "write once, compile anywhere" (WOCA) is arguable.
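The short C program below makes a few of these implementation-defined differences visible; its output may legitimately vary between compilers and platforms, which is precisely the porting hazard described above. It is an illustrative sketch, not a complete diagnostic.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* The size of int is implementation-defined; the fixed-width types
     * from <stdint.h> are the portable alternative. */
    printf("sizeof(int)     = %zu\n", sizeof(int));
    printf("sizeof(int32_t) = %zu\n", sizeof(int32_t));

    /* Right-shifting a negative signed integer may be arithmetic or
     * logical; shifting an unsigned value is well defined. */
    int s = -16;
    unsigned int u = 0xFFFFFFF0u;
    printf("-16 >> 2        = %d (implementation-defined)\n", s >> 2);
    printf("0xFFFFFFF0 >> 2 = %u (well defined)\n", u >> 2);

    /* Byte order differs between platforms; the first byte of a known
     * 32-bit value reveals which convention the host uses. */
    uint32_t probe = 1;
    unsigned char first = *(unsigned char *)&probe;
    printf("byte order      = %s-endian\n", first ? "little" : "big");

    return 0;
}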

See also

Software
Computer programming
Linux distribution
Operating system
Plan 9 from Bell Labs
Application binary interface
Cross-platform software
Computing platform
Hardware abstraction
Printer driver
Inferno (operating system)
Abstraction layer
SpareMiNT
Linux
Timeline of virtualization development
Binary-code compatibility
Unix
Scripting language
NetBSD
Outline of Perl

References

  1. Thomas B. London and John F. Reiser (1978). A Unix operating system for the DEC VAX-11/780 computer. Bell Labs internal memo 78-1353-4.
