Stages of growth model

The stages-of-growth model is a theoretical model describing the growth of information technology (IT) in a business or similar organization. It was developed by Richard L. Nolan during the early 1970s, and the final version of the model was published by him in the Harvard Business Review in 1979. [1]

Development

Both articles describing the stages were first published in the Harvard Business Review. The first proposal, made in 1973, consisted of only four stages. [2] Two additional stages were added in 1979 to complete the six-stage model. [1]

Summary

Nolan's model concerns the general approach to IT in business. The model proposes that the evolution of IT in organizations begins slowly in Stage I, the "initiation" stage. This stage is marked by "hands off" user awareness and an emphasis on functional applications to reduce costs. Stage I is followed by further growth of IT in the "contagion" stage. In this stage there is a proliferation of applications as well as the potential for more problems to arise. During Stage III a need for "control" arises. Centralized controls are put in place and a shift occurs from management of computers to management of data resources. Next, in Stage IV, "integration" of diverse technological solutions evolves. In Stage V, management of data allows development without increasing IT expenditures. Finally, in Stage VI, "maturity", high control is exercised by using all the information from the previous stages. [3]
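The progression above can be summarized as a simple lookup of each stage and its defining emphasis. This is an illustrative sketch of the model's structure only; the stage names come from the article, while the code itself is not from Nolan's papers.

```python
# Nolan's six stages and the emphasis the model associates with each,
# encoded as an ordered list of (name, emphasis) pairs.
NOLAN_STAGES = [
    ("Initiation", "functional, cost-reducing applications; 'hands off' users"),
    ("Contagion", "proliferation of applications; little project or budget control"),
    ("Control", "centralized controls; shift from managing computers to managing data"),
    ("Integration", "integration of previously separate systems"),
    ("Data administration", "emphasis on managing corporate data rather than IT"),
    ("Maturity", "application portfolio mirrors the organization's information flows"),
]

def stage_name(n: int) -> str:
    """Return the name of stage n (1-based)."""
    return NOLAN_STAGES[n - 1][0]
```

For example, `stage_name(3)` returns `"Control"`, matching the shift from management of computers to management of data resources described above.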

Stage I – Initiation

In this stage, information technology is first introduced into the organization. According to Nolan's 1973 article, computers were introduced into companies for two reasons. The first is that the company has reached a size where its administrative processes cannot be accomplished without computers, and the success of the business justifies a large investment in specialized equipment. The second reason deals with computational needs. Nolan identified the critical size of the company as the most prevalent reason for computer acquisition. Because personnel are unfamiliar with the technology, users tend to take a "hands off" approach to it. This introductory software is simple to use and cheap to implement, which provides substantial monetary savings to the company. During this stage, the IT department receives little attention from management and works in a "carefree" atmosphere. [1] [2]

Stage II – Contagion

Even though computers are recognized as "change agents" in Stage I, Nolan acknowledged that many users become alienated by computing. Because of this, Stage II is characterized by a managerial need to explain the potential of computer applications to alienated users. This leads to the adoption of computers in a range of different areas. A problem that arises in Stage II is that project and budgetary controls are not developed. Unavoidably, this leads to saturation of existing computer capacity and the acquisition of more sophisticated computer systems. System sophistication requires employing specialized professionals, and due to the shortage of qualified individuals, hiring them results in high salaries. The budget for the computer organization rises significantly and causes concern for management. Although the price of Stage II is high, it becomes evident that planning and control of computer systems are necessary. [1] [2]

Stage III – Control

Stage III is a reaction against excessive and uncontrolled expenditure of time and money on computer systems, and the major problem for management is organizing tasks to control computer operating costs. In this stage, project management and management reporting systems are organized, which leads to the development of programming, documentation, and operation standards. During Stage III, a shift occurs from management of computers to management of data resources. This shift is an outcome of analyzing how to increase management control and planning in expanding data processing operations. The shift also provides the flexibility in data processing needed to accommodate management's new controls. The major characteristic of Stage III is the reconstruction of data processing operations. [1] [2]

Stage IV – Integration

Stage IV features the adoption of new technology to integrate systems that were previously separate entities. This creates data processing (IT) expenditure growth rates similar to those of Stage II. In the latter half of Stage IV, exclusive reliance on computer controls leads to inefficiencies, and the inefficiencies associated with rapid growth may create another wave of problems. This is the last stage that Nolan acknowledged in his initial proposal of the stages of growth in 1973. [1] [2]

Stage V – Data administration

Nolan determined that four stages were not enough to describe the proliferation of IT in an organization and added Stage V in 1979. Stage V features a new emphasis on managing corporate data rather than IT. Like the succeeding Stage VI, it is marked by the development and maturity of the new concept of data administration. [1]

Stage VI – Maturity

In Stage VI, the application portfolio — tasks like order entry, general ledger, and material requirements planning — is completed and its structure "mirrors" the organization and information flows in the company. During this stage, tracking sales growth becomes an important aspect. On average, 10% of processing is batch and remote job entry, 60% is database and data communications processing, 5% is personal computing, and 25% is minicomputer processing. Management control systems are used the most in Stage VI (40%). There are three aspects of management control: manufacturing, marketing, and financial. Manufacturing control demands forecasting — looking down the road at future needs. Marketing control deals strictly with research. Financial control forecasts future cash requirements. Stage VI exercises high control by compiling all of the information from Stages I through V, which allows the organization to function at high levels of efficiency and effectiveness. [1]

Initial reaction

Richard Nolan's Stages of Growth Model seemed ahead of its time when it was first published in the 1970s. [4]

Legacy

Critics agree that Nolan's model has several shortcomings and is somewhat out of date. As time has progressed, Richard Nolan's Stages of Growth Model has revealed some apparent weaknesses. However, many agree that this does not take away from his innovative look into the realm of computing development.[citation needed]

Criticism

One argument concerns the model's reliance on change in the data processing budget, and whether it is "reasonable to assume that a single variable serves as a suitable surrogate for so much." [4] It seems plausible that this single variable could be an indicator of other variables, such as the organizational environment or an organization's learning curve, but not that it is the sole driving force of the entire model. Nolan offers little evidence that would validate this assumption.

In his model, Richard Nolan states that the force behind the growth of computing through the stages is technological change. King and Kraemer find this to be far too general, saying, "there are additional factors that should be considered. Most important are the 'demand-side' factors that create a ripe environment for technological changes to be considered and adopted." [4] As proposed, technological change has a multitude of facets that determine its necessity. Change cannot be brought forth unless it is needed under certain circumstances; unwarranted change would result in excess costs and potential failure of the process.

Last, the stages of growth model assumes straightforward organizational goals that are to be determined through technological change. This can be viewed as very naïve from the user perspective. King and Kraemer state, "the question of whether organizational goals are uniform and consistent guides for the behavior of organizational actors, as opposed to dynamic and changing targets that result from competition and conflict among organizational actors, has received considerable attention in the literature on computing." [4] Clearly, organizational goals are ever-changing and are rarely rigid indicators of direction. They cannot be "uniform" objectives that are not subject to change.


References

  1. Nolan, Richard (1979). "Managing The Crisis In Data Processing". Harvard Business Review. 57 (2): 115–126.
  2. Nolan, Richard (1973). "Managing The Computer Resource: A Stage Hypothesis". Communications of the ACM. 16 (4): 399–405. doi:10.1145/362280.362284.
  3. Gottschalk, Petter (2002). "Toward a Model of Growth Stages for Knowledge Management Technology in Law Firms". Informing Science. 5 (2): 81–93.
  4. King, John Leslie; Kraemer, Kenneth L. (1984). "Evolution and organizational information systems: an assessment of Nolan's stage model". Communications of the ACM. 27 (5): 466–475. doi:10.1145/358189.358074.

Further reading