Machine-dependent software

Machine-dependent software is software that runs only on a specific computer. Applications that run on multiple computer architectures are called machine-independent, or cross-platform. [1] Many organizations opt for machine-dependent software because they believe it is an asset and will attract more buyers. Organizations that want application software to work on heterogeneous computers must port it to the other machines. Porting involves writing, or rewriting, parts of the application's code to suit the target platform.
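
As an illustration of what makes code machine-dependent, the following C sketch (hypothetical, not taken from the cited sources) reads the same four bytes in two ways. The first read reinterprets the raw bytes and therefore depends on the processor's byte order, giving different results on little-endian and big-endian machines; the second spells the byte order out and behaves identically on every architecture.

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void) {
        /* Four raw bytes, e.g. read from a file or a network packet. */
        unsigned char bytes[4] = { 0x01, 0x00, 0x00, 0x00 };

        /* Machine-dependent: the result depends on the CPU's byte order
           (1 on a little-endian machine, 16777216 on a big-endian one). */
        uint32_t dependent;
        memcpy(&dependent, bytes, sizeof dependent);

        /* Portable: the byte order is spelled out, so every architecture
           computes the same value. */
        uint32_t portable = (uint32_t)bytes[0]
                          | (uint32_t)bytes[1] << 8
                          | (uint32_t)bytes[2] << 16
                          | (uint32_t)bytes[3] << 24;

        printf("machine-dependent read: %u\n", (unsigned)dependent);
        printf("portable read:          %u\n", (unsigned)portable);
        return 0;
    }

Porting code of the first kind usually means finding every such assumption and rewriting it in the explicit style of the second read.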

Porting

Porting is the process of converting an application from one architecture to another. [2] Programming languages such as Java are designed so that applications can move across architectures without source code modifications. The term also applies when software or hardware is changed to make it usable in a different environment.

Code that runs properly only on a specific system must be ported before it can be used on another system.

Porting effort depends on several factors, including the degree to which the original environment (the source platform) differs from the new environment (the target platform) and the developers' experience with platform-specific programming languages. [3]

Many languages offer machine-independent intermediate code that can be processed by platform-specific interpreters to address incompatibilities. [4] The intermediate representation defines a virtual machine that can execute all modules written in the intermediate language. Intermediate code instructions are translated into distinct machine code sequences by a code generator to create executable code. The intermediate code may also be executed directly, without static conversion into platform-specific code. [5]
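
As a concrete, deliberately tiny sketch of direct execution, the following C program runs a hypothetical machine-independent intermediate code on a small stack-based virtual machine; the instruction set and the sample program are invented for illustration and are not drawn from the cited sources.

    #include <stdio.h>

    /* A hypothetical intermediate instruction set. */
    enum op { PUSH, ADD, MUL, PRINT, HALT };

    /* Execute intermediate code directly, with no conversion to native
       machine code; only this interpreter has to be ported per platform. */
    static void vm_run(const int *code) {
        int stack[64];
        int sp = 0;                      /* stack pointer   */
        for (int pc = 0; ; ) {           /* program counter */
            switch (code[pc++]) {
            case PUSH:  stack[sp++] = code[pc++]; break;
            case ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
            case PRINT: printf("%d\n", stack[sp - 1]); break;
            case HALT:  return;
            }
        }
    }

    int main(void) {
        /* Intermediate code for (2 + 3) * 4; the same array runs unchanged
           on any platform that provides the interpreter above. */
        int program[] = { PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT };
        vm_run(program);                 /* prints 20 */
        return 0;
    }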

Related Research Articles

Computer programming is the process of performing a particular computation, usually by designing and building an executable computer program. Programming involves tasks such as analysis, generating algorithms, profiling algorithms' accuracy and resource consumption, and the implementation of algorithms. The source code of a program is written in one or more languages that are intelligible to programmers, rather than machine code, which is directly executed by the central processing unit. The purpose of programming is to find a sequence of instructions that will automate the performance of a task on a computer, often for solving a given problem. Proficient programming thus usually requires expertise in several different subjects, including knowledge of the application domain, specialized algorithms, and formal logic.

In computing, a compiler is a computer program that translates computer code written in one programming language into another language. The name "compiler" is primarily used for programs that translate source code from a high-level programming language to a low-level programming language to create an executable program.

Machine code: Set of instructions executed directly by a computer's central processing unit (CPU)

In computer programming, machine code is any low-level programming language, consisting of machine language instructions, which are used to control a computer's central processing unit (CPU). Each instruction causes the CPU to perform a very specific task, such as a load, a store, a jump, or an arithmetic logic unit (ALU) operation on one or more units of data in the CPU's registers or memory.

In computing, source code, or simply code, is any collection of text, with or without comments, written using a human-readable programming language, usually as plain text. The source code of a program is specially designed to facilitate the work of computer programmers, who specify the actions to be performed by a computer mostly by writing source code.

In computing, a "virtual machine" (VM) is the virtualization or emulation of a computer system. Virtual machines are based on computer architectures and provide the functionality of a physical computer. Their implementations may involve specialized hardware, software, or a combination of the two. Virtual machines differ and are organized by their function, shown here:

Interpreter (computing): Program that executes source code without a separate compilation step

In computer science, an interpreter is a computer program that directly executes instructions written in a programming or scripting language, without requiring them previously to have been compiled into a machine language program. An interpreter generally uses one of the following strategies for program execution:

  1. Parse the source code and perform its behavior directly (a minimal sketch of this strategy follows this list);
  2. Translate source code into some efficient intermediate representation or object code and immediately execute that;
  3. Explicitly execute stored precompiled bytecode made by a compiler and matched with the interpreter's virtual machine.
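
A minimal sketch of the first strategy in C is shown below; the one-character "language" it interprets ('+' increments an accumulator, '-' decrements it, '.' prints it) is hypothetical and chosen only to keep the example short.

    #include <stdio.h>

    /* Strategy 1: read the source text and perform its behavior directly,
       with no intermediate representation or object code. */
    static int run(const char *src) {
        int acc = 0;
        for (; *src; ++src) {
            switch (*src) {
            case '+': acc++; break;
            case '-': acc--; break;
            case '.': printf("%d\n", acc); break;
            default:  break;             /* ignore anything else */
            }
        }
        return acc;
    }

    int main(void) {
        run("+++ - .");                  /* prints 2 */
        return 0;
    }
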
Application binary interface: Binary interface between two program units

In computer software, an application binary interface (ABI) is an interface between two binary program modules. Often, one of these modules is a library or operating system facility, and the other is a program that is being run by a user.
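
One visible consequence of an ABI can be sketched in C: the size, alignment and padding of a struct are fixed by the platform's ABI rather than by the language, so the same source compiled against different ABIs yields binaries that cannot safely exchange such data in memory. The struct below is hypothetical; a 64-bit LP64 platform typically reports 8, 16 and 8, while 32-bit or LLP64 platforms report other values.

    #include <stdio.h>
    #include <stddef.h>

    /* Layout decided by the platform ABI, not by this source code. */
    struct message {
        char tag;        /* usually followed by ABI-mandated padding  */
        long value;      /* size and alignment differ between ABIs    */
    };

    int main(void) {
        printf("sizeof(long)           = %zu\n", sizeof(long));
        printf("sizeof(struct message) = %zu\n", sizeof(struct message));
        printf("offsetof(value)        = %zu\n", offsetof(struct message, value));
        return 0;
    }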

In computing, cross-platform software is computer software that is designed to work on several computing platforms. Some cross-platform software requires a separate build for each platform, but some can be directly run on any platform without special preparation, being written in an interpreted language or compiled to portable bytecode for which the interpreters or run-time packages are common or standard components of all supported platforms.

A computing platform or digital platform is an environment in which a piece of software is executed. It may be the hardware or the operating system (OS), even a web browser and associated application programming interfaces, or other underlying software, as long as the program code is executed with it. Computing platforms have different abstraction levels, including a computer architecture, an OS, or runtime libraries. A computing platform is the stage on which computer programs can run.

In software engineering, porting is the process of adapting software for the purpose of achieving some form of execution in a computing environment that is different from the one that a given program was originally designed for. The term is also used when software/hardware is changed to make them usable in different environments.

In computing, binary translation is a form of binary recompilation where sequences of instructions are translated from a source instruction set to the target instruction set. In some cases such as instruction set simulation, the target instruction set may be the same as the source instruction set, providing testing and debugging features such as instruction trace, conditional breakpoints and hot spot detection.

In computing, just-in-time (JIT) compilation is a way of executing computer code that involves compilation during execution of a program rather than before execution. This may consist of source code translation but is more commonly bytecode translation to machine code, which is then executed directly. A system implementing a JIT compiler typically continuously analyses the code being executed and identifies parts of the code where the speedup gained from compilation or recompilation would outweigh the overhead of compiling that code.
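
At its core, a JIT compiler emits native machine code into memory at run time and then jumps to it. The fragment below is a minimal sketch of only that final step, not of a real JIT compiler: it assumes an x86-64 processor and a POSIX system that permits a writable-and-executable anonymous mapping (hardened systems may refuse one), and the six bytes it emits simply return the constant 42.

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* x86-64 machine code for:  mov eax, 42 ; ret */
        unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

        /* Ask the operating system for memory we can write and execute. */
        void *mem = mmap(NULL, sizeof code, PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (mem == MAP_FAILED) {
            perror("mmap");
            return 1;
        }

        memcpy(mem, code, sizeof code);            /* "compile" at run time */

        int (*fn)(void) = (int (*)(void))mem;      /* jump into the bytes   */
        printf("generated code returned %d\n", fn());   /* prints 42 */

        munmap(mem, sizeof code);
        return 0;
    }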

LLVM: Compiler backend for multiple programming languages

LLVM is a set of compiler and toolchain technologies that can be used to develop a front end for any programming language and a back end for any instruction set architecture. LLVM is designed around a language-independent intermediate representation (IR) that serves as a portable, high-level assembly language that can be optimized with a variety of transformations over multiple passes. The name LLVM originally stood for Low Level Virtual Machine, though the project has expanded and the name is no longer officially an acronym.

In software engineering, retargeting is an attribute of software development tools that have been specifically designed to generate code for more than one computing platform.

Parallax Propeller: Multi-core microcontroller

The Parallax P8X32A Propeller is a multi-core processor parallel computer architecture microcontroller chip with eight 32-bit reduced instruction set computer (RISC) central processing unit (CPU) cores. Introduced in 2006, it is designed and sold by Parallax, Inc.

Software portability: Ability of a program to run on different platforms with little alteration

A computer program is said to be portable if only minimal effort is required to make it run on different platforms. The prerequisite for portability is a generalized abstraction between the application logic and system interfaces. When software with the same functionality is produced for several computing platforms, portability is the key issue for reducing development cost.

In computer science, ahead-of-time compilation is the act of compiling an (often) higher-level programming language into an (often) lower-level language before execution of a program, usually at build-time, to reduce the amount of work needed to be performed at run time.

An application-specific instruction set processor (ASIP) is a component used in system on a chip design. The instruction set architecture of an ASIP is tailored to benefit a specific application. This specialization of the core provides a tradeoff between the flexibility of a general purpose central processing unit (CPU) and the performance of an application-specific integrated circuit (ASIC).

Data-intensive computing is a class of parallel computing applications which use a data parallel approach to process large volumes of data, typically terabytes or petabytes in size, commonly referred to as big data. Computing applications which devote most of their execution time to computational requirements are deemed compute-intensive, whereas computing applications which require large volumes of data and devote most of their processing time to I/O and manipulation of data are deemed data-intensive.

References

  1. Agrawala & Rauscher (2014)
  2. Rashid, Patnaik & Bhattacherjee (2014)
  3. Huang, Li & Xie (2015)
  4. Yin et al. (2012)
  5. Mathur, Miles & Du (2015)