Big data maturity model

Big data maturity models (BDMMs) are artifacts used to measure big data maturity. [1] These models help organizations create structure around their big data capabilities and identify where to start. [2] They provide tools that assist organizations in defining goals around their big data program and in communicating their big data vision to the entire organization. BDMMs also provide a methodology for measuring and monitoring the state of a company's big data capability, the effort required to complete the current stage or phase of maturity, and the effort required to progress to the next stage. Additionally, BDMMs measure and manage the speed of both the progress and adoption of big data programs in the organization. [1]

The goals of BDMMs are to:

- create structure around an organization's big data capabilities and identify where to start;
- define goals around the big data program and communicate the big data vision across the organization;
- measure and monitor the state of the big data capability and the effort required to progress to the next stage of maturity;
- measure and manage the speed of both the progress and adoption of big data programs.

Key organizational areas refer to "people, process and technology", and the subcomponents include alignment, architecture, data, data governance, delivery, development, measurement, program governance, scope, skills, sponsorship, statistical modelling, technology, value and visualization. [3]

The stages or phases in BDMMs depict the various ways in which data can be used in an organization, and they are among the key tools for setting direction and monitoring the health of an organization's big data programs. [4] [5]

An underlying assumption is that a high level of big data maturity correlates with an increase in revenue and a reduction in operational expense. However, reaching the highest level of maturity involves major investments over many years. [6] Only a few companies are considered to be at a "mature" stage of big data and analytics. These include internet-based companies (such as LinkedIn, Facebook, and Amazon) as well as non-internet-based companies, including financial institutions (fraud analysis, real-time customer messaging and behavioral modeling) and retail organizations (click-stream analytics together with self-service analytics for teams). [6]

Categories

Big data maturity models can be broken down into three broad categories: descriptive, comparative and prescriptive. [1]

Descriptive

Descriptive models assess a firm's current maturity by qualitatively positioning it in one of several stages or phases. Such models do not provide recommendations on how a firm might improve its big data maturity.

Big data and analytics maturity model (IBM model)

This descriptive model aims to assess the value generated from big data investments towards supporting strategic business initiatives.

Maturity levels

The model defines a series of maturity levels against which an organization's current state is assessed.

Assessment areas

The maturity levels are also mapped, in matrix format, against six assessment areas: business strategy, information, analytics, culture and execution, architecture, and governance. [7]

Knowledgent big data maturity assessment

Consisting of an assessment survey, this big data maturity model assesses an organization's readiness to execute big data initiatives. Furthermore, the model aims to identify the steps and appropriate technologies that will lead an organization towards big data maturity. [8]

Comparative

Comparative big data maturity models aim to benchmark an organization in relation to its industry peers and normally consist of a survey containing quantitative and qualitative information.

CSC big data maturity tool

The CSC big data maturity tool acts as a comparative tool to benchmark an organization's big data maturity. A survey is undertaken and the results are then compared to other organizations within a specific industry and within the wider market. [9]
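As an illustration of how such a comparative benchmark works (this is not CSC's actual methodology), an organization's survey score can be expressed as a percentile against its industry peers and against the wider market. All scores below are hypothetical.

```python
from bisect import bisect_left

def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Percentage of peer organizations scoring strictly below `score`."""
    ordered = sorted(peer_scores)
    return 100.0 * bisect_left(ordered, score) / len(ordered)

# Hypothetical survey scores (0-100) for organizations in one industry,
# and a wider-market pool that includes other industries.
industry_peers = [42, 55, 61, 48, 70, 39, 66, 58, 51, 63]
wider_market = industry_peers + [35, 44, 72, 80, 57, 49]

our_score = 61
print(f"Industry percentile: {percentile_rank(our_score, industry_peers):.1f}")
print(f"Market percentile:   {percentile_rank(our_score, wider_market):.1f}")
```

The two reference pools mirror the tool's two comparisons: within a specific industry and within the wider market.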

TDWI big data maturity model

The TDWI big data maturity model is an established model in the big data maturity area and is supported by a significant body of research. [6]

Maturity stages

The different stages of maturity in the TDWI BDMM can be summarized as follows:

Stage 1: Nascent

The nascent stage can be described as a pre–big data environment. During this stage:

Stage 2: Pre-adoption

During the pre-adoption stage:

Stage 3: Early adoption

The "chasm"

Between the early adoption and corporate adoption stages, an organization generally faces a series of hurdles it needs to overcome, referred to collectively as the "chasm". These hurdles include:

Stage 4: Corporate adoption

The corporate adoption stage is characterized by the involvement of end-users: the organization gains further insight, and the way it conducts business is transformed. During this stage:

Stage 5: Mature / visionary

Only a few organizations can be considered visionary in terms of big data and big data analytics. During this stage an organization:

Research findings

TDWI [6] assessed 600 organizations and found that the majority were in either the pre-adoption (50%) or early adoption (36%) stages. Only 8% of the sample had managed to move past the chasm to corporate adoption or the mature/visionary stage.
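The five TDWI stages above can be sketched as an ordered scoring rubric that maps an overall assessment score to a stage. The score thresholds below are hypothetical and purely illustrative; they are not part of the published TDWI model.

```python
# Hypothetical mapping from an overall assessment score (0-100) to the five
# TDWI maturity stages. The thresholds are illustrative, not TDWI's own.
TDWI_STAGES = [
    (20, "Nascent"),
    (40, "Pre-adoption"),
    (60, "Early adoption"),    # the "chasm" sits after this stage
    (80, "Corporate adoption"),
    (100, "Mature / visionary"),
]

def stage_for(score: float) -> str:
    """Return the first stage whose upper bound covers the given score."""
    for upper, name in TDWI_STAGES:
        if score <= upper:
            return name
    raise ValueError("score must be in [0, 100]")

print(stage_for(35))  # a pre-adoption organization under these thresholds
```

Under any rubric of this shape, the interesting boundary is the one after early adoption, since TDWI's findings suggest most organizations never cross it.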

Prescriptive

The majority of prescriptive BDMMs follow a similar modus operandi: the current situation is first assessed, after which a series of phases plots the path towards increased big data maturity. Examples are:

Info-tech big data maturity assessment tool

This maturity model is prescriptive in the sense that it consists of four distinct phases that plot a path towards big data maturity. [10]

Radcliffe big data maturity model

The Radcliffe big data maturity model, like other models, consists of a series of distinct maturity levels. [5]

Booz & Company's model

This BDMM provides a framework that not only enables organizations to view the extent of their current maturity, but also to identify goals and opportunities for growth in big data maturity. The model consists of four stages. [4]

Van Veenstra's model

The prescriptive model proposed by Van Veenstra aims first to explore an organization's existing big data environment, and then to identify exploitation opportunities and a growth path towards big data maturity. The model makes use of four phases. [11]

Critical evaluation

Current BDMMs have been evaluated against criteria covering, among other aspects, the completeness and extensiveness of the model, the quality of its development and documentation, its ease of application, and the extent to which it addresses big data value creation. [1]

The TDWI and CSC models show the strongest overall performance, with steady scores in each of the criteria groups. The overall results indicate that the top-performing models are extensive, balanced, well-documented, easy to use, and address a good number of the big data capabilities that are utilized in business value creation. The models of Booz & Company and Knowledgent are close seconds; these mid-performers address big data value creation in a commendable manner, but fall short on completeness and ease of application. Knowledgent suffers from poor quality of development, having barely documented any of its development processes. The remaining models, i.e. Info-Tech, Radcliffe, Van Veenstra and IBM, have been categorized as low performers. Whilst their content is well aligned with business value creation through big data capabilities, they all lack quality of development, ease of application and extensiveness. The lowest scores were awarded to IBM and Van Veenstra, since both provide only low-level guidance for their model's practical use and almost entirely lack documentation, ultimately resulting in poor quality of development and evaluation. [1]

See also

- Capability Maturity Model
- Capability Maturity Model Integration
- Business intelligence
- Information technology governance
- ISO/IEC 15504
- Enterprise risk management
- Microsoft Solutions Framework
- Implementation maturity model
- Portfolio, Programme and Project Management Maturity Model (P3M3)
- Trillium Model
- Operating model
- Information governance
- Software development process
- IT risk management
- Prescriptive analytics
- Tudor IT Process Assessment
- Software intelligence
- Innovation management measurement
- Technology readiness level
- Cybersecurity Capacity Maturity Model for Nations

References

  1. Braun, Henrik (2015). "Evaluation of Big Data Maturity Models: A Benchmarking Study to Support Big Data Assessment in Organizations". Master's thesis, Tampere University of Technology.
  2. Halper, F.; Krishnan, K. (2014). TDWI Big Data Maturity Model Guide. TDWI Research.
  3. Krishnan, K. (2014). "Measuring Maturity of Big Data Initiatives". Archived from the original on 2015-03-16. Retrieved 2017-05-21.
  4. El-Darwiche, B.; et al. (2014). "Big Data Maturity: An Action Plan for Policymakers and Executives". World Economic Forum.
  5. "Leverage a Big Data Maturity Model to Build Your Big Data Roadmap" (PDF). 2014. Archived from the original (PDF) on 2017-08-02. Retrieved 2017-05-21.
  6. Halper, Fern (2016). "A Guide to Achieving Big Data Analytics Maturity". TDWI Benchmark Guide.
  7. "Big Data & Analytics Maturity Model". IBM Big Data & Analytics Hub. Retrieved 2017-05-21.
  8. "Big Data Maturity Assessment". bigdatamaturity.knowledgent.com. Archived from the original on 2015-02-14. Retrieved 2017-05-21.
  9. "CSC Big Data Maturity Tool: Business Value, Drivers, and Challenges". csc.bigdatamaturity.com. Retrieved 2017-05-21.
  10. "Big Data Maturity Assessment Tool". www.infotech.com. Retrieved 2017-05-21.
  11. van Veenstra, Anne Fleur. "Big Data in Small Steps: Assessing the Value of Data" (PDF). White paper.