Big data maturity model

Big data maturity models (BDMMs) are artifacts used to measure big data maturity.[1] These models help organizations create structure around their big data capabilities and identify where to start.[2] They provide tools that assist organizations in defining goals for their big data program and in communicating their big data vision to the entire organization. BDMMs also provide a methodology to measure and monitor the state of a company's big data capability, the effort required to complete the current stage of maturity, and the effort required to progress to the next stage. Additionally, BDMMs measure and manage the speed of both the progress and adoption of big data programs in the organization.[1]

The goals of BDMMs are:

Key organizational areas refer to "people, process and technology", and the subcomponents include alignment, architecture, data, data governance, delivery, development, measurement, program governance, scope, skills, sponsorship, statistical modelling, technology, value and visualization.[3]

The stages or phases in BDMMs depict the various ways in which data can be used in an organization and are among the key tools for setting direction and monitoring the health of an organization's big data programs.[4][5]
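The idea of scoring subcomponents and mapping the result to a named stage can be sketched in a few lines of Python. All names, scales and stage cut-offs below are illustrative assumptions, not taken from any particular BDMM:

```python
# Hypothetical sketch: score big data maturity across key organizational
# subcomponents and map the average to a named stage. The stage names follow
# the TDWI-style progression described later in this article; the 0-5 scale
# and the evenly spaced cut-offs are assumptions for illustration only.

STAGES = ["nascent", "pre-adoption", "early adoption",
          "corporate adoption", "mature/visionary"]

def maturity_stage(scores: dict[str, float]) -> str:
    """Map subcomponent scores (each on a 0-5 scale) to a maturity stage."""
    if not scores:
        raise ValueError("at least one subcomponent score is required")
    avg = sum(scores.values()) / len(scores)
    # Evenly spaced cut-offs over the 0-5 scale (an assumption, not a standard).
    index = min(int(avg), len(STAGES) - 1)
    return STAGES[index]

assessment = {
    "alignment": 2.0, "architecture": 1.5, "data governance": 1.0,
    "skills": 2.5, "sponsorship": 3.0, "technology": 2.0,
}
print(maturity_stage(assessment))  # average 2.0 -> "early adoption"
```

A real BDMM would typically weight subcomponents differently and assess each key area separately rather than averaging them into one number.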

An underlying assumption is that a high level of big data maturity correlates with increased revenue and reduced operational expense. However, reaching the highest level of maturity involves major investments over many years.[6] Only a few companies are considered to be at a "mature" stage of big data and analytics.[7] These include internet-based companies (such as LinkedIn, Facebook, and Amazon) as well as non-internet-based companies, including financial institutions (fraud analysis, real-time customer messaging and behavioral modeling) and retail organizations (click-stream analytics together with self-service analytics for teams).[6]

Categories

Big data maturity models can be broken down into three broad categories, namely:[1]

Descriptive

Descriptive models assess a firm's current maturity by qualitatively positioning the firm in one of several stages or phases. Such a model does not provide recommendations as to how a firm could improve its big data maturity.

Big data and analytics maturity model (IBM model)

This descriptive model aims to assess the value generated from big data investments towards supporting strategic business initiatives.

Maturity levels

The model consists of the following maturity levels:

Assessment areas

Maturity levels also cover assessment areas in matrix format, focusing on: business strategy, information, analytics, culture and execution, architecture, and governance.[8]

Knowledgent big data maturity assessment

Built around an assessment survey, this big data maturity model evaluates an organization's readiness to execute big data initiatives. Furthermore, the model aims to identify the steps and appropriate technologies that will lead an organization towards big data maturity.[9]

Comparative

Comparative big data maturity models aim to benchmark an organization in relation to its industry peers and normally consist of a survey gathering quantitative and qualitative information.

CSC big data maturity tool

The CSC big data maturity tool acts as a comparative tool to benchmark an organization's big data maturity. A survey is undertaken and the results are then compared to other organizations within a specific industry and within the wider market. [10]
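The core of such comparative benchmarking can be sketched as a percentile rank against peer survey scores. The data and the ranking formula below are illustrative assumptions, not the CSC tool's actual methodology:

```python
# Hypothetical sketch of comparative benchmarking: an organization's survey
# score is ranked against peer scores from the same industry. The peer data
# and the "share at or below" percentile definition are illustrative only.

def percentile_rank(own_score: float, peer_scores: list[float]) -> float:
    """Return the share of peers scoring at or below own_score, in percent."""
    if not peer_scores:
        raise ValueError("peer scores are required for benchmarking")
    at_or_below = sum(1 for s in peer_scores if s <= own_score)
    return 100.0 * at_or_below / len(peer_scores)

industry_peers = [41.0, 55.0, 62.0, 48.0, 70.0, 35.0, 58.0, 66.0]
print(percentile_rank(60.0, industry_peers))  # 5 of 8 peers at/below -> 62.5
```

The same ranking would be repeated against the wider market to give the two comparison points the tool reports.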

TDWI big data maturity model

The TDWI big data maturity model is an established model in the big data maturity area and is accompanied by a significant body of knowledge.[6]

Maturity stages

The different stages of maturity in the TDWI BDMM can be summarized as follows:

Stage 1: Nascent

The nascent stage represents a pre–big data environment. During this stage:

Stage 2: Pre-adoption

During the pre-adoption stage:

Stage 3: Early adoption

The "chasm"

Between the early adoption and corporate adoption stages there is generally a series of hurdles that an organization needs to overcome. These hurdles include:

Stage 4: Corporate adoption

The corporate adoption stage is characterized by the involvement of end users: the organization gains further insight and transforms the way it conducts business. During this stage:

Stage 5: Mature / visionary

Only a few organizations can be considered as visionary in terms of big data and big data analytics. During this stage an organization:

Research findings

In an assessment of 600 organizations, TDWI[6] found that the majority are either in the pre-adoption (50%) or early adoption (36%) stages. Additionally, only 8% of the sample had managed to move past the chasm towards corporate adoption or the mature/visionary stage.
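A survey-based distribution of this kind is simply a tally of stage classifications. The sample responses below are made up for illustration and are not TDWI's data:

```python
# Illustrative tally of assessment results by stage, mirroring the kind of
# distribution TDWI reports. The sample responses are fabricated examples.
from collections import Counter

def stage_shares(responses: list[str]) -> dict[str, float]:
    """Return each stage's share of responses as a percentage."""
    counts = Counter(responses)
    total = len(responses)
    return {stage: 100.0 * n / total for stage, n in counts.items()}

sample = ["pre-adoption"] * 5 + ["early adoption"] * 4 + ["corporate adoption"]
print(stage_shares(sample))
```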

Prescriptive

The majority of prescriptive BDMMs follow a similar modus operandi: the current situation is first assessed, followed by phases plotting a path towards increased big data maturity. Examples are:

Info-tech big data maturity assessment tool

This maturity model is prescriptive in the sense that it consists of four distinct phases that together plot a path towards big data maturity. The phases are:

[11]

Radcliffe big data maturity model

The Radcliffe big data maturity model, like other models, consists of distinct maturity levels ranging from:

[5]

Booz & Company's model

This BDMM provides a framework that not only enables organizations to view the extent of their current maturity, but also to identify goals and opportunities for growth in big data maturity. The model consists of four stages, namely:

[4]

Van Veenstra's model

The prescriptive model proposed by Van Veenstra aims first to explore an organization's existing big data environment, followed by exploitation opportunities and a growth path towards big data maturity. The model makes use of four phases, namely:

[12]

Critical evaluation

Current BDMMs have been evaluated under the following criteria: [1]

The TDWI and CSC models have the strongest overall performance, with steady scores in each of the criteria groups. The overall results indicate that the top-performing models are extensive, balanced, well-documented, easy to use, and address a good number of the big data capabilities that are utilized in business value creation. The models of Booz & Company and Knowledgent are close seconds: these mid-performers address big data value creation in a commendable manner, but fall short on completeness and ease of application. Knowledgent suffers from poor quality of development, having barely documented any of its development processes. The rest of the models, i.e. Info-tech, Radcliffe, Van Veenstra and IBM, have been categorized as low performers. Whilst their content is well aligned with business value creation through big data capabilities, they all lack quality of development, ease of application and extensiveness. The lowest scores were awarded to IBM and Van Veenstra, since both provide little guidance for the maturity model's practical use and lack documentation entirely, ultimately resulting in poor quality of development and evaluation.[1]
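A criteria-based evaluation like the one described above amounts to scoring each model per criterion and ranking by an aggregate. The criteria labels, scores and equal weighting below are illustrative assumptions, not the cited study's actual data:

```python
# Hypothetical sketch of a criteria-based model evaluation: each model gets a
# score per criterion (here on a 0-5 scale) and models are ranked by the
# equal-weight average, best first. All scores are illustrative only.

CRITERIA = ["extensiveness", "quality of development",
            "ease of application", "value creation"]

def rank_models(scores: dict[str, list[float]]) -> list[tuple[str, float]]:
    """Rank models by mean score across the criteria, best first."""
    ranked = [(model, sum(vals) / len(vals)) for model, vals in scores.items()]
    return sorted(ranked, key=lambda item: item[1], reverse=True)

evaluation = {
    "TDWI": [4, 4, 4, 4],
    "CSC": [4, 4, 3, 4],
    "Knowledgent": [3, 1, 2, 4],
    "IBM": [2, 1, 1, 3],
}
for model, avg in rank_models(evaluation):
    print(f"{model}: {avg:.2f}")
```

A real benchmarking study would weight criteria and document how each score was derived; the ordering here is only meant to mirror the top/mid/low performer grouping described in the text.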

See also

References

  1. Braun, Henrik (2015). "Evaluation of Big Data Maturity Models: A benchmarking study to support big data assessment in organizations". Master's thesis, Tampere University of Technology.
  2. Halper, F., & Krishnan, K. (2014). TDWI Big Data Maturity Model Guide. TDWI Research.
  3. Krishnan (2014). "Measuring maturity of big data initiatives". Archived from the original on 2015-03-16. Retrieved 2017-05-21.
  4. El-Darwiche; et al. (2014). "Big Data Maturity: An action plan for policymakers and executives". World Economic Forum.
  5. 1 2 "Leverage a Big Data Maturity model to build your big data roadmap" (PDF). 2014. Archived from the original (PDF) on 2017-08-02. Retrieved 2017-05-21.
  6. Halper, Fern (2016). "A Guide to Achieving Big Data Analytics Maturity". TDWI Benchmark Guide.
  7. "How to summit "data maturity mountain" and make data your superpower". Fast Company. December 15, 2023. Retrieved October 8, 2024.
  8. "Big Data & Analytics Maturity Model". IBM Big Data & Analytics Hub. Retrieved 2017-05-21.
  9. "Home | Big Data Maturity Assessment". bigdatamaturity.knowledgent.com. Archived from the original on 2015-02-14. Retrieved 2017-05-21.
  10. "CSC Big Data Maturity Tool: Business Value, Drivers, and Challenges". csc.bigdatamaturity.com. Retrieved 2017-05-21.
  11. "Big Data Maturity Assessment Tool". www.infotech.com. Retrieved 2017-05-21.
  12. van Veenstra, Anne Fleur. "Big Data in Small Steps: Assessing the value of data" (PDF). White Paper.