Information capital


Information capital is a concept which asserts that information has intrinsic value that can be shared and leveraged within and between organizations. Information capital connotes that sharing information is a means of sharing power, supporting personnel, and optimizing working processes. [1] Information capital consists of the pieces of information that enable the exchange of knowledge capital.


Overview

In management, information capital is usually described as a set of data that is valuable to an organisation and can be reached through different data storage systems, such as intranet and internet systems, computer databases, libraries, and information-sharing networks. [2] Information capital can be used not only by organisations but by individuals as well. For example, if information capital enables an individual to analyse spending on a certain type of product and determine how it compares with spending on other products or with the spending of other people, this might affect future purchasing decisions. [3] In the information era, efficient use of information capital depends heavily on information capital readiness, because information capital is derived from the readiness of an organisation's information systems. Companies that invest more in IT systems might therefore gain a competitive advantage over other businesses. [4]
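The personal spending example above amounts to simple aggregation and comparison. The following minimal Python sketch illustrates it under assumed data: the purchase records, category names, and amounts are hypothetical, not drawn from any particular system.

from collections import defaultdict

# Hypothetical purchase records for one individual.
purchases = [
    {"category": "groceries", "amount": 220.0},
    {"category": "entertainment", "amount": 60.0},
    {"category": "groceries", "amount": 180.0},
    {"category": "transport", "amount": 90.0},
]

# Aggregate spending per category.
totals = defaultdict(float)
for p in purchases:
    totals[p["category"]] += p["amount"]

grand_total = sum(totals.values())

# Express each category as a share of overall spending,
# which is the comparison that might influence future purchases.
for category, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{category}: {amount:.2f} ({amount / grand_total:.0%} of total)")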

Information capital market

Information capital markets are commercial markets for the buying and selling of information and data. These markets connect data aggregators with organisations and individuals who need information for business, scientific or other purposes. Regulations such as the Data Protection Act 1998 and the Data Protection Directive were introduced to control the information capital market and prevent inappropriate use of personal information by data aggregators or any other individuals and organisations. Although information has been bought and sold since ancient times, the idea of an information marketplace is relatively recent. [5] The first information market formed around credit bureau-type organisations for the exchange of personal information in the financial industry. [6] Since that time, information markets have changed radically: they are now mainly hosted on electronic data-aggregation systems, the vast majority of which are accessible to both governments and organisations in the corporate and other sectors. [7] Some information capital market platforms can be accessed directly by the public; for example, SocialSafe Ltd is a social media backup tool that allows users to download their content from a variety of social networks to their own personal data store and then sell this information directly. [8]
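To make the "personal data store" idea concrete in general terms, the sketch below gathers posts a user has already exported from several networks and writes them to a single local JSON file. It is a minimal sketch only: the fetch_exported_posts helper, the network names, and the record fields are hypothetical and do not reflect SocialSafe's actual product or any network's API.

import json
from datetime import datetime, timezone

def fetch_exported_posts(network):
    # Hypothetical placeholder: in practice this would read an export file
    # that the user has already downloaded from the network in question.
    return [{"network": network, "text": f"example post from {network}"}]

def build_personal_data_store(networks, path="personal_data_store.json"):
    records = []
    for network in networks:
        records.extend(fetch_exported_posts(network))
    store = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "records": records,
    }
    # The resulting file is the user's own copy of the data, which they
    # could then choose to offer on an information capital market platform.
    with open(path, "w", encoding="utf-8") as f:
        json.dump(store, f, indent=2)
    return path

build_personal_data_store(["network_a", "network_b"])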

Era of big data

Big data refers to amounts of information so massive and complex that they become impossible to analyse using traditional data processing technologies and require special technologies instead. [9] Recent advances in big data analysis have the potential to change the way the information capital market operates, because if commercial organisations are able to analyse and structure information about millions of people in any part of the world, this will negate the value of information that comes from any single individual or organisation and will allow companies to make faster and more accurate data-driven decisions. Some scientists even predict that advances in big data analysis will have a larger effect on the information capital market than the creation of the internet. [10]
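The practical difficulty is that such data sets do not fit in memory, so they are processed incrementally rather than loaded all at once. The minimal Python sketch below illustrates this with pandas' chunked CSV reading; the file name events.csv and its user_id column are hypothetical.

import pandas as pd

# Count events per user without ever loading the whole file into memory.
counts = {}
for chunk in pd.read_csv("events.csv", chunksize=1_000_000):
    # Each chunk is an ordinary DataFrame of up to one million rows.
    for user_id, n in chunk["user_id"].value_counts().items():
        counts[user_id] = counts.get(user_id, 0) + n

print(f"distinct users: {len(counts)}")

At larger scales the same pattern is distributed across many machines, which is what the platforms listed below provide.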

List of companies operating in big data analytics:

IBM - IBM offers the DB2, Informix and InfoSphere database software, the Cognos and SPSS analytics applications, and services through its Global Services division. [11]

HP - HP is a major provider of big data analysis software tools.

Oracle - Oracle develops both hardware and software products for big data processing. These include Oracle NoSQL Database, Oracle Data Integrator, a distribution of Apache Hadoop, and many others.

SAP - SAP is one of the largest providers of software appliances for big data handling and analytics. [12]

Microsoft - Microsoft, in partnership with Hortonworks, offers the HDInsight tool, which is used to analyse unstructured information provided by data aggregators.

Google - Google develops BigQuery, one of the first cloud-based big data processing platforms; a query sketch follows this list.
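As an illustration of how such a cloud platform is queried programmatically, the sketch below uses Google's google-cloud-bigquery Python client to run a single aggregate query. The project ID, dataset, table, and column names are hypothetical placeholders, not real resources.

from google.cloud import bigquery

# The client authenticates via application-default credentials;
# "my-project" is a hypothetical project ID.
client = bigquery.Client(project="my-project")

# "my_dataset.events" and its user_id column are hypothetical placeholders.
query = """
    SELECT user_id, COUNT(*) AS n_events
    FROM `my_dataset.events`
    GROUP BY user_id
    ORDER BY n_events DESC
    LIMIT 10
"""

# The query executes on Google's infrastructure; only the result rows
# are returned to the caller.
for row in client.query(query).result():
    print(row.user_id, row.n_events)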


Related Research Articles

Customer relationship management (CRM) is a process in which a business or another organization administers its interactions with customers, typically using data analysis to study large amounts of information.

Oracle Corporation is an American multinational computer technology company headquartered in Austin, Texas. Co-founded in 1977 by Larry Ellison, who remains executive chairman, Oracle ranked as the third-largest software company in the world by revenue and market capitalization as of 2020, and placed 80th in the Forbes Global 2000 in 2023.


Analytics is the systematic computational analysis of data or statistics. It is used for the discovery, interpretation, and communication of meaningful patterns in data, which also falls under and directly relates to the umbrella term, data science. Analytics also entails applying data patterns toward effective decision-making. It can be valuable in areas rich with recorded information; analytics relies on the simultaneous application of statistics, computer programming, and operations research to quantify performance.

Managerial economics is a branch of economics involving the application of economic methods in the organizational decision-making process. Economics is the study of the production, distribution, and consumption of goods and services. Managerial economics involves the use of economic theories and principles to make decisions regarding the allocation of scarce resources. It guides managers in making decisions relating to the company's customers, competitors, suppliers, and internal operations.


Data management comprises all disciplines related to handling data as a valuable resource; it is the practice of managing an organization's data so it can be analyzed for decision-making.

Knowledge workers are workers whose main capital is knowledge. Examples include ICT professionals, physicians, pharmacists, architects, engineers, scientists, design thinkers, public accountants, lawyers, editors, and academics, whose job is to "think for a living".


SAS is a statistical software suite developed by SAS Institute for data management, advanced analytics, multivariate analysis, business intelligence, criminal investigation, and predictive analytics. SAS' analytical software is built upon artificial intelligence and utilizes machine learning, deep learning and generative AI to manage and model data. The software is widely used in industries such as finance, insurance, health care and education.


A business analyst (BA) is a person who processes, interprets and documents business processes, products, services and software through analysis of data. The role of a business analyst is to ensure business efficiency increases through their knowledge of both IT and business function.

Predictive analytics, or predictive AI, encompasses a variety of statistical techniques from data mining, predictive modeling, and machine learning that analyze current and historical facts to make predictions about future or otherwise unknown events.

Procurement software refers to a range of business software designed to streamline and automate purchasing processes for businesses and organizations. By managing information flows and transactions between procuring entities, suppliers, and partners, procurement software aims to cut costs, improve efficiency, and boost organizational performance.

Master data represents "data about the business entities that provide context for business transactions". The most commonly found categories of master data are parties, products, financial structures and locational concepts.


Greenplum is a big data technology based on MPP architecture and the Postgres open source database technology. The technology was created by a company of the same name headquartered in San Mateo, California around 2005. Greenplum was acquired by EMC Corporation in July 2010.


Pentaho is the brand name for several Data Management software products that make up the Pentaho+ Data Platform. These include Pentaho Data Integration, Pentaho Business Analytics, Pentaho Data Catalog, and Pentaho Data Optimiser.

HP Information Management Software is software from the HP Software Division, used to organize, protect, retrieve, acquire, manage, and maintain information. The HP Software Division also offers information analytics software. The amount of data that companies have to deal with has grown tremendously over the past decade, making the management of this information more difficult. The University of California at Berkeley claims the amount of information produced globally increases by 30 percent annually. An April 2010 Information Management article cited a survey in which nearly 90 percent of businesses blame poor performance on data growth. The survey concluded that many businesses' applications and databases are growing by 50 percent or more annually, making it difficult to manage the rapid expansion of information.


Big data primarily refers to data sets that are too large or complex to be dealt with by traditional data-processing software. Data with many entries (rows) offer greater statistical power, while data with higher complexity may lead to a higher false discovery rate.

Actian is an American software company headquartered in Santa Clara, California that provides analytics-related software, products, and services. The company sells database software and technology, cloud engineered systems, and data integration solutions.


A data lake is a system or repository of data stored in its natural/raw format, usually object blobs or files. A data lake is usually a single store of data including raw copies of source system data, sensor data, social data etc., and transformed data used for tasks such as reporting, visualization, advanced analytics, and machine learning. A data lake can include structured data from relational databases, semi-structured data, unstructured data, and binary data. A data lake can be established on premises or in the cloud.

Embedded analytics enables organisations to integrate analytics capabilities into their own, often software as a service, applications, portals, or websites. This differs from embedded software and web analytics.

A human resources management system (HRMS), also human resources information system (HRIS) or human capital management (HCM) system, is a form of human resources (HR) software that combines a number of systems and processes to ensure the easy management of human resources, business processes and data. Human resources software is used by businesses to combine a number of necessary HR functions, such as storing employee data, managing payroll, recruitment, benefits administration, time and attendance, employee performance management, and tracking competency and training records.


Apache Pinot is a column-oriented, open-source, distributed data store written in Java. Pinot is designed to execute OLAP queries with low latency. It is well suited to contexts where fast analytics, such as aggregations, are needed on immutable data, possibly with real-time data ingestion. The name Pinot comes from the Pinot grape vines that are pressed into liquid used to produce a variety of different wines. The founders of the database chose the name as a metaphor for analyzing vast quantities of data from a variety of different file formats or streaming data sources.

References

  1. Albert Wee Kwan Tan; Petros Theodorou (1 January 2009). Strategic Information Technology and Portfolio Management. Idea Group Inc (IGI). pp. 254–. ISBN 978-1-59904-689-1.
  2. Robert Kaplan; David P. Norton (30 December 2013). Strategy Maps: Converting Intangible Assets into Tangible Outcomes. Harvard Business Press. pp. 179–. ISBN 978-1-4221-6349-8.
  3. "Are You Maximizing Your Information Capital? | Innovation Insights | WIRED". www.wired.com. Archived from the original on 2014-04-07.
  4. Masanobu Kosuga; Yasuhiro Monden (1 January 2007). Japanese Management Accounting Today. World Scientific. pp. 125–. ISBN 978-981-277-973-1.
  5. Alessandro Ludovico (2012). Post-digital Print: The Mutation of Publishing Since 1894. Onomatopee. ISBN 978-90-78454-87-8.
  6. United States. Congress. Senate. Select Committee on Small Business (1980). The Impact of Commercial Credit Reporting Practices on Small Business: Hearings Before the Select Committee on Small Business, United States Senate, Ninety-sixth Congress, First Session ... October 31 and November 1, 1979. U.S. Government Printing Office.
  7. Commission on Physical Sciences, Mathematics, and Applications; Computer Science and Telecommunications Board; Committee on the Internet in the Evolving Information Infrastructure; National Research Council, Division on Engineering and Physical Sciences (22 January 2001). The Internet's Coming of Age. National Academies Press. pp. 136–. ISBN 978-0-309-06992-2.
  8. Fernando Francisco Padró (1 January 2004). Statistical Handbook on the Social Safety Net. Greenwood Publishing Group. pp. 1–. ISBN 978-1-57356-516-5.
  9. Anand Rajaraman; Jeffrey David Ullman (27 October 2011). Mining of Massive Datasets. Cambridge University Press. ISBN 978-1-139-50534-5.
  10. Viktor Mayer-Schönberger; Kenneth Cukier (2013). Big Data: A Revolution that Will Transform how We Live, Work, and Think. Houghton Mifflin Harcourt. ISBN 978-0-544-00269-2.
  11. Paul Zikopoulos (1 October 2011). Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data. McGraw Hill Professional. ISBN 978-0-07-179054-3.
  12. Financial Times (29 January 2013). Decoding Big Data. Penguin Books Limited. pp. 13–. ISBN 978-0-670-92384-7.