Job (computing)

In computing, a job is a unit of work or unit of execution (that performs said work). A component of a job (as a unit of work) is called a task or a step (if sequential, as in a job stream). As a unit of execution, a job may be concretely identified with a single process, which may in turn have subprocesses (child processes; the process corresponding to the job being the parent process) which perform the tasks or steps that comprise the work of the job; or with a process group; or with an abstract reference to a process or process group, as in Unix job control.
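
To make the "process group" view concrete, here is a minimal, POSIX-only Python sketch (the `start_job` helper and its lambda "steps" are inventions for illustration, not how any particular shell implements job control): the parent starts each step as a child process, places them all in one process group, and can then signal the whole job at once.

```python
import os
import signal
import time

def start_job(steps):
    """Start each step of a job as a child process, placing them all in one
    process group whose ID identifies the job as a whole."""
    pgid = None
    for step in steps:
        pid = os.fork()
        if pid == 0:                      # child: perform one step of the job
            os.setpgid(0, pgid or 0)      # join (or create) the job's process group
            step()
            os._exit(0)
        if pgid is None:
            pgid = pid                    # the first child's PID names the group
        os.setpgid(pid, pgid)             # parent also sets it, to avoid a race
    return pgid

if __name__ == "__main__":
    job_pgid = start_job([lambda: time.sleep(30), lambda: time.sleep(30)])
    time.sleep(1)
    os.killpg(job_pgid, signal.SIGTERM)   # signal every process in the job at once
    while True:
        try:
            os.wait()                     # reap the terminated children
        except ChildProcessError:
            break
```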

Jobs can be started interactively, such as from a command line, or scheduled for non-interactive execution by a job scheduler, and then controlled via automatic or manual job control. Jobs that have finite input can complete, successfully or unsuccessfully, or fail to complete and eventually be terminated. By contrast, online processes such as servers have open-ended input (they service requests for as long as they run) and thus never complete, only stopping when terminated (sometimes called "canceled"): a server's job is never done.
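
As a rough sketch of this lifecycle in Python (the `run_job` helper is hypothetical, and the example relies on the POSIX `sleep` command), a finite job either runs to completion, reporting success or failure through its exit status, or is terminated if it fails to complete in time:

```python
import subprocess

def run_job(command, timeout=None):
    """Run a finite job to completion and report success or failure.

    If the job does not finish within `timeout` seconds, terminate it,
    as a job-control mechanism would for a job that is not completing."""
    proc = subprocess.Popen(command)
    try:
        returncode = proc.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        proc.terminate()            # ask the job to stop
        proc.wait()                 # reap it
        return "terminated"
    return "succeeded" if returncode == 0 else "failed"

if __name__ == "__main__":
    # A job with finite input completes; here it simply sleeps briefly.
    print(run_job(["sleep", "1"], timeout=5))     # succeeded
    # A job that runs too long is terminated instead of completing.
    print(run_job(["sleep", "60"], timeout=2))    # terminated
```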

History

The term "job" has a traditional meaning as "piece of work", from Middle English "jobbe of work", and is used as such in manufacturing, in the phrase "job production", meaning "custom production", where it is contrasted with batch production (many items at once, one step at a time) and flow production (many items at once, all steps at the same time, by item). Note that these distinctions have become blurred in computing, where the oxymoronic term "batch job" is found, and used either for a one-off job or for a round of "batch processing" (same processing step applied to many items at once, originally punch cards).

In this sense of "job", a programmable computer performs "jobs", as each one can be different from the last. The term "job" is also common in operations research, predating its use in computing, in such uses as job shop scheduling (see, for example, Baker & Dzielinski (1960) and the references therein from throughout the 1950s, including several "System Research Department Reports" from the IBM Research Center). This analogy is applied to computer systems, where the system resources are analogous to machines in a job shop, and the goal of scheduling is to minimize the total time from beginning to end (the makespan). The term "job" for computing work dates to the mid-1950s, as in this use from 1955:

"The program for an individual job is then written, calling up these subroutines by name wherever required, thus avoiding rewriting them for individual problems". [1]

The term continued in occasional use, such as for the IBM 709 (1958), and was in wider use by the early 1960s, such as for the IBM 7090, with widespread use following the Job Control Language of OS/360 (announced in 1964). A standard early use of "job" is compiling a program from source code, as this is a one-off task; the compiled program can then be run on batches of data.

Related Research Articles

Computer multitasking: concurrent execution of multiple processes

In computing, multitasking is the concurrent execution of multiple tasks over a certain period of time. New tasks can interrupt already started ones before they finish, instead of waiting for them to end. As a result, a computer executes segments of multiple tasks in an interleaved manner, while the tasks share common processing resources such as central processing units (CPUs) and main memory. Multitasking automatically interrupts the running program, saving its state and loading the saved state of another program before transferring control to it. This "context switch" may be initiated at fixed time intervals, or the running program may be coded to signal to the supervisory software when it can be interrupted.
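
The cooperative case, where the running program signals when it can be interrupted, can be sketched in a few lines of Python; the generator-based `task` and `run_round_robin` names are made up for this illustration, and a real operating system switches between machine-level contexts rather than generator frames:

```python
def task(name, steps):
    """A task that signals the supervisor (by yielding) when it can be interrupted."""
    for i in range(steps):
        print(f"{name}: step {i}")
        yield                 # voluntary point at which a context switch may occur

def run_round_robin(tasks):
    """Interleave tasks: resume one, and on each yield switch to the next."""
    queue = list(tasks)
    while queue:
        current = queue.pop(0)
        try:
            next(current)     # restore the task's saved state and run one step
            queue.append(current)
        except StopIteration:
            pass              # the task finished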

Operating system: software that manages computer hardware resources

An operating system (OS) is system software that manages computer hardware and software resources, and provides common services for computer programs.

Computerized batch processing is a method of running software programs, called jobs, in batches automatically. While users are required to submit the jobs, no other interaction by the user is required to process the batch. Batches may be run automatically at scheduled times or when computer resources become available.
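
A minimal sketch of the idea in Python (the `run_batch` helper and the `echo` commands are stand-ins, assuming a POSIX-like environment): jobs are submitted up front, then processed one after another with no further user interaction.

```python
import subprocess

def run_batch(jobs):
    """Run each submitted job in turn; no user interaction after submission."""
    results = []
    for command in jobs:
        completed = subprocess.run(command)       # run one job to completion
        results.append((command, completed.returncode))
    return results

if __name__ == "__main__":
    # Jobs are submitted up front; the batch then runs unattended.
    batch = [["echo", "job 1"], ["echo", "job 2"], ["echo", "job 3"]]
    for command, code in run_batch(batch):
        print(command, "exited with", code)
```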

Process (computing): particular execution of a computer program

In computing, a process is the instance of a computer program that is being executed by one or many threads. There are many different process models, some of which are lightweight, but almost all processes are rooted in an operating system (OS) process which comprises the program code, assigned system resources, physical and logical access permissions, and data structures to initiate, control and coordinate execution activity. Depending on the OS, a process may be made up of multiple threads of execution that execute instructions concurrently.

User interface: means by which a user interacts with and controls a machine

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

History of operating systems: aspect of computing history

Computer operating systems (OSes) provide a set of functions needed and used by most application programs on a computer, and the links needed to control and synchronize computer hardware. On the first computers, with no operating system, every program needed the full hardware specification to run correctly and perform standard tasks, and its own drivers for peripheral devices like printers and punched paper card readers. The growing complexity of hardware and application programs eventually made operating systems a necessity for everyday use.

In computing, scheduling is the action of assigning resources to perform tasks. The resources may be processors, network links or expansion cards. The tasks may be threads, processes or data flows.
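
One simple scheduling policy, shown here only as a Python sketch with hypothetical names, assigns each task (represented by its expected run time) to the currently least-loaded processor; real schedulers use far more elaborate policies.

```python
import heapq

def assign(tasks, n_processors):
    """Greedily assign each task (given by its run time) to the least-loaded processor."""
    # Heap of (current load, processor id); the smallest load is popped first.
    loads = [(0, p) for p in range(n_processors)]
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_processors)}
    for duration in sorted(tasks, reverse=True):   # longest tasks first
        load, p = heapq.heappop(loads)
        assignment[p].append(duration)
        heapq.heappush(loads, (load + duration, p))
    return assignment

if __name__ == "__main__":
    print(assign([4, 3, 3, 2, 2, 1], n_processors=2))
```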

Job Control Language (JCL) is a name for scripting languages used on IBM mainframe operating systems to instruct the system on how to run a batch job or start a subsystem. The purpose of JCL is to specify which programs to run, using which files or devices for input or output, and at times also under what conditions to skip a step. Parameters in the JCL can also provide accounting information for tracking the resources used by a job, as well as which machine the job should run on.

Spooling: form of multitasking in computers

In computing, spooling is a specialized form of multi-programming for the purpose of copying data between different devices. In contemporary systems, it is usually used for mediating between a computer application and a slow peripheral, such as a printer. Spooling allows programs to "hand off" work to be done by the peripheral and then proceed to other tasks, or to not begin until input has been transcribed. A dedicated program, the spooler, maintains an orderly sequence of jobs for the peripheral and feeds it data at its own rate. Conversely, for slow input peripherals, such as a card reader, a spooler can maintain a sequence of computational jobs waiting for data, starting each job when all of the relevant input is available; see batch processing. The spool itself refers to the sequence of jobs, or the storage area where they are held. In many cases, the spooler is able to drive devices at their full rated speed with minimal impact on other processing.
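
A toy spooler can be sketched in Python with a queue and a worker thread standing in for the slow peripheral (the names `spooler` and `application`, and the half-second "print" delay, are inventions for illustration):

```python
import queue
import threading
import time

spool = queue.Queue()          # the spool: an ordered sequence of jobs for the device

def spooler():
    """Feed the slow peripheral at its own rate, one spooled job at a time."""
    while True:
        document = spool.get()
        if document is None:   # sentinel: nothing more to print
            break
        time.sleep(0.5)        # the "printer" is slow
        print("printed:", document)

def application():
    """Hand work off to the spool and continue immediately with other tasks."""
    for i in range(3):
        spool.put(f"report {i}")
    print("application: work handed off, moving on")

if __name__ == "__main__":
    worker = threading.Thread(target=spooler)
    worker.start()
    application()
    spool.put(None)            # tell the spooler to finish after draining the queue
    worker.join()
```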

In computing, single program, multiple data (SPMD) is a term that has been used to refer to computational models for exploiting parallelism whereby multiple processors cooperate in the execution of a program in order to obtain results faster.
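
A loose SPMD-style sketch in Python, using `multiprocessing.Pool` as a stand-in for cooperating processors: the same function runs in every worker process, each over its own slice of the data, and the partial results are combined at the end. The slicing scheme and worker count are arbitrary choices for this example.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    """The same program runs in every worker, each over its own slice of the data."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]        # four slices, one per worker
    with Pool(processes=4) as pool:
        print(sum(pool.map(partial_sum, chunks)))  # combine the partial results
```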

In computing, a pipeline, also known as a data pipeline, is a set of data processing elements connected in series, where the output of one element is the input of the next one. The elements of a pipeline are often executed in parallel or in time-sliced fashion. Some amount of buffer storage is often inserted between elements.
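
A minimal pipeline can be sketched with Python generators, each stage consuming the previous stage's output; the stage names here (`read_numbers`, `square`, `keep_even`) are invented for the example, and real pipelines often add explicit buffering and parallelism between stages.

```python
def read_numbers(limit):
    """First element: produce raw data."""
    for n in range(limit):
        yield n

def square(numbers):
    """Second element: its input is the previous element's output."""
    for n in numbers:
        yield n * n

def keep_even(squares):
    """Third element: filter the stream."""
    for n in squares:
        if n % 2 == 0:
            yield n

if __name__ == "__main__":
    # Elements are connected in series; items flow through one at a time.
    pipeline = keep_even(square(read_numbers(10)))
    print(list(pipeline))      # [0, 4, 16, 36, 64]
```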

The Job Entry Subsystem (JES) is a component of IBM's MVS mainframe operating systems that is responsible for managing batch workloads. In modern times, there are two distinct implementations of the Job Entry Subsystem, called JES2 and JES3. They are designed to provide efficient execution of batch jobs.

Task (computing): unit of execution or work in software

In computing, a task is a unit of execution or a unit of work. The term is ambiguous; precise alternative terms include process, light-weight process, thread, step, request, or query. In a typical arrangement, there are queues of incoming work to do and outgoing completed work, and a thread pool of threads to perform this work. Either the work units themselves or the threads that perform the work can be referred to as "tasks", and these can be referred to respectively as requests/responses/threads, incoming tasks/completed tasks/threads, or requests/responses/tasks.
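
A bare-bones version of this arrangement in Python (queue and thread names are made up for the sketch; the "work" is just squaring a number): worker threads repeatedly take tasks from an incoming queue and place completed work on an outgoing queue.

```python
import queue
import threading

incoming = queue.Queue()       # tasks (units of work) waiting to be performed
outgoing = queue.Queue()       # completed work

def worker():
    """A pool thread: repeatedly take a task, perform it, and report the result."""
    while True:
        task = incoming.get()
        if task is None:                 # sentinel: shut this thread down
            break
        outgoing.put(task * task)        # the "work" here is just squaring

if __name__ == "__main__":
    pool = [threading.Thread(target=worker) for _ in range(4)]
    for t in pool:
        t.start()
    for n in range(10):                  # enqueue incoming tasks
        incoming.put(n)
    for _ in pool:                       # one sentinel per thread
        incoming.put(None)
    for t in pool:
        t.join()
    print(sorted(outgoing.get() for _ in range(10)))
```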

A job scheduler is a computer application for controlling unattended background program execution of jobs. This is commonly called batch scheduling, as execution of non-interactive jobs is often called batch processing, though traditional job and batch are distinguished and contrasted; see that page for details. Other synonyms include batch system, distributed resource management system (DRMS), distributed resource manager (DRM), and, commonly today, workload automation (WLA). The data structure of jobs to run is known as the job queue.
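
A drastically simplified scheduler can be sketched in Python (the `run_scheduled` helper, the delays, and the `echo` commands are placeholders; production schedulers handle calendars, dependencies, retries, and persistence): each job is run unattended once its scheduled time arrives.

```python
import subprocess
import time

def run_scheduled(schedule):
    """Run each (delay_in_seconds, command) pair unattended at its scheduled time."""
    start = time.monotonic()
    for delay, command in sorted(schedule):          # earliest job first
        wait = delay - (time.monotonic() - start)
        if wait > 0:
            time.sleep(wait)                         # wait until the job is due
        subprocess.run(command)                      # run the batch job, no interaction

if __name__ == "__main__":
    run_scheduled([
        (1, ["echo", "nightly backup (pretend)"]),
        (2, ["echo", "log rotation (pretend)"]),
    ])
```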

Scheduling is the process of arranging, controlling and optimizing work and workloads in a production process or manufacturing process. Scheduling is used to allocate plant and machinery resources, plan human resources, plan production processes and purchase materials.

In system software, a job queue is a data structure, maintained by job scheduler software, containing jobs to run.
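
As one possible realization, sketched in Python with invented names, a job queue can be kept as a priority heap so that higher-priority jobs are released first and jobs of equal priority run in submission order:

```python
import heapq
import itertools

class JobQueue:
    """A minimal job queue: jobs are released in priority order (smaller number =
    higher priority), first-in first-out within a priority level."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker preserves submission order

    def submit(self, job, priority=0):
        heapq.heappush(self._heap, (priority, next(self._counter), job))

    def next_job(self):
        priority, _, job = heapq.heappop(self._heap)
        return job

if __name__ == "__main__":
    q = JobQueue()
    q.submit("payroll", priority=1)
    q.submit("backup", priority=2)
    q.submit("report", priority=1)
    print(q.next_job(), q.next_job(), q.next_job())   # payroll report backup
```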

In a non-interactive computer system, particularly IBM mainframes, a job stream, jobstream, or simply job is the sequence of job control language (JCL) statements and data that comprise a single "unit of work for an operating system". The term job traditionally means a one-off piece of work, and is contrasted with a batch, but non-interactive computation has come to be called "batch processing", and thus a unit of batch processing is often called a job, or by the oxymoronic term batch job; see job for details. Performing a job consists of executing one or more programs. Each program execution, called a job step, jobstep, or step, is usually related in some way to the others in the job. Steps in a job are executed sequentially, possibly depending on the results of previous steps, particularly in batch processing.
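
The step-by-step structure can be sketched in Python (the `run_job_stream` helper and the `true`/`false` stand-in commands are illustrative and POSIX-specific): steps run in order, and a failing step causes the remaining steps to be skipped, loosely mirroring conditional step execution in JCL.

```python
import subprocess

def run_job_stream(steps):
    """Run the steps of one job in order; stop if a step fails (non-zero return code)."""
    for name, command in steps:
        completed = subprocess.run(command)
        print(f"step {name}: return code {completed.returncode}")
        if completed.returncode != 0:      # later steps depend on this one succeeding
            print(f"skipping remaining steps after {name}")
            break

if __name__ == "__main__":
    run_job_stream([
        ("COPY",  ["true"]),       # stand-ins for real programs
        ("SORT",  ["true"]),
        ("PRINT", ["false"]),      # fails, so nothing after it would run
    ])
```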

In computing, job control refers to the control of multiple tasks or jobs on a computer system: ensuring that each has access to adequate resources to perform correctly, preventing competition for limited resources from causing a deadlock in which two or more jobs are unable to complete, resolving such situations where they do occur, and terminating jobs that, for any reason, are not performing as expected.
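
One classic way to prevent the deadlock mentioned above is to make every job acquire shared resources in the same fixed order; the short Python sketch below uses two locks as stand-ins for limited resources (all names are invented for the example).

```python
import threading

# Two limited resources that competing jobs both need.
tape_drive = threading.Lock()
printer = threading.Lock()

def job(name):
    """Acquire the resources in one fixed, global order, so two jobs cannot deadlock
    by each holding one resource while waiting for the other."""
    with tape_drive:
        with printer:
            print(f"{name}: holds both resources, doing its work")

if __name__ == "__main__":
    jobs = [threading.Thread(target=job, args=(f"job {i}",)) for i in range(2)]
    for t in jobs:
        t.start()
    for t in jobs:
        t.join()
```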

DAC-1, for Design Augmented by Computer, was one of the earliest graphical computer-aided design systems. Development began at General Motors; IBM was brought in as a partner in 1960, and the two companies developed the system and released it to production in 1963. It was publicly unveiled at the Fall Joint Computer Conference in Detroit in 1964. GM used the DAC system, continually modified, into the 1970s, when it was succeeded by CADANCE.

MTS system architecture: software organization of the Michigan Terminal System

MTS System Architecture describes the software organization of the Michigan Terminal System, a time-sharing computer operating system in use from 1967 to 1999 on IBM S/360-67, IBM System/370, and compatible computers.

References

  1. Armour Research Foundation, ed. (1955). (Unknown title). Computer Applications. Vol. 2. Macmillan. p. 68.