Software deployment is all of the activities that make a software system available for use.[1][2]
The general deployment process consists of several interrelated activities with possible transitions between them. These activities can occur on the producer side, on the consumer side, or both. Because every software system is unique, the precise processes or procedures within each activity cannot be defined in general terms. "Deployment" should therefore be interpreted as a general process that has to be customized according to specific requirements or characteristics.[3]
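As a rough illustration, the activities and their transitions can be modeled as a small state machine. The activity names below (release, install, activate, update, deactivate, uninstall, retire) are one common breakdown, and the allowed transitions shown are illustrative rather than prescriptive:

```python
# A sketch of a deployment process as activities with allowed transitions.
# The specific activities and transitions vary between organizations.
ALLOWED_TRANSITIONS = {
    "release":    {"install"},
    "install":    {"activate"},
    "activate":   {"update", "deactivate"},
    "update":     {"activate"},
    "deactivate": {"activate", "uninstall"},
    "uninstall":  {"retire"},
    "retire":     set(),
}

def advance(current: str, target: str) -> str:
    """Move to the next activity only if the transition is allowed."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"cannot go from {current!r} to {target!r}")
    return target

state = "release"
for step in ("install", "activate", "update", "activate"):
    state = advance(state, step)  # ends in the "activate" state
```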
When computers were extremely large, expensive, and bulky (mainframes and minicomputers), the software was often bundled together with the hardware by manufacturers. If business software needed to be installed on an existing computer, this might require an expensive, time-consuming visit by a systems architect or a consultant. For complex, on-premises installation of enterprise software today, this can still sometimes be the case.
However, with the development of mass-market software for the new age of microcomputers in the 1980s came new forms of software distribution – first cartridges, then Compact Cassettes, then floppy disks, then (in the 1990s and later) optical media, the internet and flash drives. This meant that software deployment could be left to the customer. However, it was also increasingly recognized over time that configuration of the software by the customer was important and that this should ideally have a user-friendly interface (rather than, for example, requiring the customer to edit registry entries on Windows).
In pre-internet software deployments, deployments (and their closely related cousin, new software releases) were by nature expensive, infrequent, bulky affairs. It is arguable therefore that the spread of the internet made end-to-end agile software development possible. Indeed, the advent of cloud computing and software as a service meant that software could be deployed to a large number of customers in minutes, over the internet. This also meant that typically, deployment schedules were now determined by the software supplier, not by the customers. Such flexibility led to the rise of continuous delivery as a viable option, especially for less risky web applications.
Other options for software deployment include blue–green deployment and canary release deployment.
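A minimal Python sketch of the difference between the two strategies, with made-up version names and an arbitrary 5% canary weight:

```python
import random

# Canary release: a small, adjustable fraction of requests is served by the
# new release while the rest stay on the stable one.
CANARY_WEIGHT = 0.05  # 5% of traffic goes to the canary

def choose_backend(stable="app-v1", canary="app-v2", weight=CANARY_WEIGHT):
    return canary if random.random() < weight else stable

# Blue-green deployment, by contrast, switches all traffic at once:
live, idle = "blue", "green"   # deploy the new version to the idle environment
live, idle = idle, live        # verify, then cut over; old environment kept for rollback
```

The canary weight can be raised gradually as confidence in the new release grows, whereas blue-green trades that gradualism for a simple, instantly reversible cutover.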
The complexity and variability of software products have fostered the emergence of specialized roles for coordinating and engineering the deployment process. For desktop systems, end users frequently also become the "software deployers" when they install a software package on their own machine. The deployment of enterprise software involves many more roles, and those roles typically change as the application progresses from the test (pre-production) to production environments.
A Linux distribution is an operating system that includes the Linux kernel for its kernel functionality. Although the name does not imply product distribution per se, a distro, if distributed on its own, is often obtained via a website intended specifically for the purpose. Distros have been designed for a wide variety of systems ranging from personal computers to servers and from embedded devices to supercomputers.
In telecommunications, provisioning involves the process of preparing and equipping a network to allow it to provide new services to its users. In National Security/Emergency Preparedness telecommunications services, "provisioning" equates to "initiation" and includes altering the state of an existing priority service or capability.
A package manager or package-management system is a collection of software tools that automates the process of installing, upgrading, configuring, and removing computer programs for a computer in a consistent manner.
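A minimal Python sketch of one core task a package manager automates: resolving a dependency graph so that each package is installed after the packages it depends on. The package names below are hypothetical:

```python
# Hypothetical dependency graph: package -> packages it depends on.
DEPENDS = {
    "webapp":  ["runtime", "tls"],
    "runtime": ["libc"],
    "tls":     ["libc"],
    "libc":    [],
}

def install_order(pkg, seen=None):
    """Depth-first resolution: dependencies are installed before dependents."""
    seen = seen if seen is not None else []
    for dep in DEPENDS[pkg]:
        install_order(dep, seen)
    if pkg not in seen:
        seen.append(pkg)
    return seen

print(install_order("webapp"))  # ['libc', 'runtime', 'tls', 'webapp']
```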
Advanced Package Tool, or APT, is a free-software user interface that works with core libraries to handle the installation and removal of software on Debian and Debian-based Linux distributions. APT simplifies the process of managing software on Unix-like computer systems by automating the retrieval, configuration and installation of software packages, either from precompiled files or by compiling source code.
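For illustration, the same retrieval and installation steps can be scripted. The commands below are standard apt-get invocations (normally run directly from a shell, and requiring administrative privileges); "curl" is just an arbitrary example package:

```python
import subprocess

# Refresh the package index, then install and later remove an example package.
subprocess.run(["apt-get", "update"], check=True)
subprocess.run(["apt-get", "install", "-y", "curl"], check=True)
subprocess.run(["apt-get", "remove", "-y", "curl"], check=True)
```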
Most software systems have installation procedures that must be completed before they can be used for their main purpose. Testing these procedures to verify that they produce a working, usable installed software system is known as installation testing. These procedures may involve full or partial upgrades and install/uninstall processes.
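A sketch of such a test in Python, assuming a hypothetical installer.py that accepts --prefix and --uninstall flags and installs an executable named myapp:

```python
import pathlib, subprocess, tempfile

# Install into a scratch prefix, check the result, then verify clean removal.
with tempfile.TemporaryDirectory() as prefix:
    subprocess.run(["python", "installer.py", "--prefix", prefix], check=True)
    exe = pathlib.Path(prefix) / "bin" / "myapp"
    assert exe.exists(), "install did not produce the executable"
    subprocess.run(["python", "installer.py", "--prefix", prefix, "--uninstall"],
                   check=True)
    assert not exe.exists(), "uninstall left files behind"
```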
Installation of a computer program is the act of making the program ready for execution. Installation refers to the particular configuration of software or hardware with a view to making it usable with the computer. A digital copy of the software is needed in order to install it. Because the installation process varies for each program and each computer, programs often come with an installer, a specialised program responsible for doing whatever is needed for the installation. Installation may be part of a larger software deployment process.
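A minimal sketch of what an installer does under the hood: copy files into place and record a manifest so they can be cleanly removed later. The paths and file layout are illustrative:

```python
import json, pathlib, shutil

def install(src_dir: str, prefix: str) -> None:
    """Copy every file from src_dir into prefix and record what was installed."""
    installed = []
    for f in pathlib.Path(src_dir).rglob("*"):
        if f.is_file():
            dest = pathlib.Path(prefix) / f.relative_to(src_dir)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dest)
            installed.append(str(dest))
    # The manifest is what makes a clean uninstall possible later.
    (pathlib.Path(prefix) / "manifest.json").write_text(json.dumps(installed))

def uninstall(prefix: str) -> None:
    """Remove exactly the files listed in the install manifest."""
    manifest = pathlib.Path(prefix) / "manifest.json"
    for path in json.loads(manifest.read_text()):
        pathlib.Path(path).unlink(missing_ok=True)
    manifest.unlink()
```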
Dependency hell is a colloquial term for the frustration of some software users who have installed software packages which have dependencies on specific versions of other software packages.
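A toy Python example of how the problem arises, with invented package names: two installed packages pin incompatible versions of a shared dependency, and no single installed version can satisfy both:

```python
# Each installed package pins an exact version of a shared library.
requirements = {
    "report-tool": {"libfoo": "==1.2"},
    "media-app":   {"libfoo": "==2.0"},
}

def shared_conflicts(reqs):
    """Yield dependencies pinned to different versions by different packages."""
    pins = {}
    for pkg, deps in reqs.items():
        for dep, version in deps.items():
            if dep in pins and pins[dep][1] != version:
                yield dep, pins[dep], (pkg, version)
            pins[dep] = (pkg, version)

for dep, a, b in shared_conflicts(requirements):
    print(f"{dep}: {a[0]} wants {a[1]}, {b[0]} wants {b[1]}")
    # libfoo: report-tool wants ==1.2, media-app wants ==2.0
```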
VectorLinux, abbreviated VL, was a Linux distribution for the x86 platform based on the Slackware Linux distribution, originally developed by Canadian developers Robert S. Lange and Darell Stavem. Since version 7 the Standard Edition is also available for the x86-64 platform, known as VLocity64 7.
ClickOnce is a component of Microsoft .NET Framework 2.0 and later, and supports deploying applications made with Windows Forms or Windows Presentation Foundation. It is similar to Java Web Start for the Java Platform or Zero Install for Linux.
Software remastering is software development that recreates system software and applications while incorporating customizations, with the intent that it is copied and run elsewhere for "off-label" usage. The term comes from remastering in media production, where it is similarly distinguished from mere copying.
This is a comparison of notable free and open-source configuration management software, suitable for tasks like server configuration, orchestration and infrastructure as code typically performed by a system administrator.
Windows Vista contains a range of new technologies and features intended to help network administrators and power users better manage their systems. Notable changes include a complete replacement of both the Windows Setup and Windows startup processes, completely rewritten deployment mechanisms, new diagnostic and health monitoring tools such as a memory diagnostic utility, support for per-application Remote Desktop sessions, a completely new Task Scheduler, and a range of new Group Policy settings covering many of the features new to Windows Vista. The Subsystem for UNIX-based Applications, which provides a POSIX-compatible environment, is also introduced.
Microsoft Application Virtualization is an application virtualization and application streaming solution from Microsoft. It was originally developed by Softricity, a company based in Boston, Massachusetts, acquired by Microsoft on July 17, 2006. App-V represents Microsoft's entry to the application virtualization market, alongside their other virtualization technologies such as Hyper-V, Microsoft User Environment Virtualization (UE-V), Remote Desktop Services, and System Center Virtual Machine Manager.
Configurable Network Computing, or CNC, is JD Edwards's (JDE) proprietary client–server architecture and methodology. JD Edwards is now a division of Oracle Corporation, which continues to sponsor the ongoing development of the JD Edwards Enterprise Resource Planning (ERP) system. While highly flexible, the CNC architecture is proprietary and, as such, cannot be exported to other systems. Although its chief claim to fame, the insulation of applications from the underlying database and operating systems, has largely been superseded by modern web-based technology, CNC technology continues to be at the heart of both the JD Edwards OneWorld and EnterpriseOne architectures and is planned to play a significant role in Oracle's developing Fusion architecture initiative. While a proprietary architecture, CNC is neither an Oracle nor a JDE product offering. The term CNC also refers to the systems analysts who install, maintain, manage and enhance this architecture; CNC analysts make up one of the three technical areas in JD Edwards ERP work, alongside developer/report writer and functional/business analyst roles.
The Linux Schools Project is an operating system designed for schools. It is a Linux distribution based on Ubuntu. The project maintains two custom distributions, with one designed for use on servers and the other for use with the server version on client machines. The server distribution is the official Karoshi, while the client is known as Karoshi Client.
Salix OS is a multi-purpose Linux distribution based on Slackware.
A definitive media library is a secure information technology repository in which an organisation's definitive, authorised versions of software media are stored and protected. Before an organisation releases any new or changed application software into its operational environment, any such software should be fully tested and quality assured. The definitive media library provides the storage area for software objects ready for deployment and should contain only master copies of controlled software media configuration items (CIs) that have passed appropriate quality assurance checks, typically including both procured and bespoke applications as well as "gold build" source code and executables. In the context of the ITIL best-practice framework, the term definitive media library supersedes the term definitive software library used prior to ITIL v3.
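One control a definitive media library makes possible is verifying, at deployment time, that an artifact matches the approved master copy. The sketch below assumes a checksum registry populated at QA sign-off; the artifact name and registry contents are illustrative:

```python
import hashlib, pathlib

# Checksums recorded for approved master copies (illustrative entry).
DML_CHECKSUMS = {
    "payroll-2.3.1.tar.gz": "<sha256 recorded at QA sign-off>",
}

def verify_release(artifact: str) -> bool:
    """Accept an artifact for deployment only if it matches the master copy."""
    digest = hashlib.sha256(pathlib.Path(artifact).read_bytes()).hexdigest()
    return DML_CHECKSUMS.get(pathlib.Path(artifact).name) == digest
```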
EMCO Remote Installer is a software distribution tool for Windows. It allows network administrators to install and uninstall software on remote Windows computers connected to a local network, and to audit installed software and Windows updates remotely.
OpenLMI provides a common management infrastructure for Linux systems. Available operations include configuration of various operating system parameters and services, hardware components configuration, and monitoring of system resources. Services provided by OpenLMI can be accessed both locally and remotely, using multiple programming languages and standardized APIs.
A DevOps toolchain is a set or combination of tools that aid in the delivery, development, and management of software applications throughout the systems development life cycle, as coordinated by an organisation that uses DevOps practices.
Configuration management systems are designed to make controlling large numbers of servers easy for administrators and operations teams. They allow many different systems to be controlled in an automated way from one central location.
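The central idea can be sketched in a few lines of Python: desired state is declared once and applied idempotently, so repeated runs converge instead of reapplying changes. The file path and contents below are examples:

```python
import pathlib

# Declared desired state: these files should exist with exactly this content.
DESIRED_FILES = {
    "/etc/myapp/app.conf": "port = 8080\n",
}

def apply_state(files=DESIRED_FILES) -> None:
    """Bring the machine to the desired state; a no-op if already converged."""
    for path, content in files.items():
        p = pathlib.Path(path)
        if p.exists() and p.read_text() == content:
            continue  # already in the desired state; change nothing
        p.parent.mkdir(parents=True, exist_ok=True)
        p.write_text(content)
```

Tools in this space differ in how state is declared and distributed, but idempotent convergence toward a declared state is the property that lets one central definition safely manage many machines.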