Information capital

Information capital is a concept which asserts that information has intrinsic value that can be shared and leveraged within and between organizations. The term connotes that sharing information is a means of sharing power, supporting personnel, and optimizing working processes. [1] Information capital consists of the pieces of information that enable the exchange of knowledge capital.

Overview

In management, information capital is usually described as a set of data which is valuable for an organisation and can be accessed through different data storage systems, such as intranet and internet systems, computer databases, libraries, and information sharing networks. [2] Information capital can be used not only by organisations but by individuals as well. For example, if information capital enables an individual to analyse their spending on a certain type of product and determine how it compares with their spending on other products or with the spending of other people, this might affect their future purchasing decisions. [3] In the information era, efficient use of information capital depends heavily on IT readiness, since information capital is derived from an organisation's information systems; companies that invest more in IT systems may therefore gain a competitive advantage over other businesses. [4]
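
To make the individual-level example concrete, the following is a minimal, illustrative sketch in Python: it assumes a hypothetical table of transactions with person, category and amount columns (not taken from the sources above) and uses pandas to compare one person's spending on a product category with the rest of their spending and with other people's.

```python
# Illustrative sketch only: a hypothetical transactions table is used to
# compare one person's grocery spending with their other spending and
# with other people's grocery spending.
import pandas as pd

transactions = pd.DataFrame({
    "person":   ["alice", "alice", "alice", "bob", "bob"],
    "category": ["groceries", "travel", "groceries", "groceries", "travel"],
    "amount":   [120.0, 300.0, 80.0, 150.0, 90.0],
})

# Share of Alice's own budget that goes to groceries
alice = transactions[transactions["person"] == "alice"]
alice_share = (alice.loc[alice["category"] == "groceries", "amount"].sum()
               / alice["amount"].sum())

# Average grocery spend per person, for comparison with others
avg_groceries = (transactions[transactions["category"] == "groceries"]
                 .groupby("person")["amount"].sum().mean())

print(f"Alice spends {alice_share:.0%} of her budget on groceries")
print(f"Average grocery spend per person: {avg_groceries:.2f}")
```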

Information capital market

Information capital markets are commercial markets for the buying and selling of information and data. These markets connect data aggregators with organisations and individuals who need information for business, scientific or other purposes. Regulatory acts such as the Data Protection Act 1998 and the Data Protection Directive are imposed to control the information capital market and prevent inappropriate use of personal information by data aggregators or by any other individuals and organizations. Although information has been bought and sold since ancient times, the idea of an information marketplace is relatively recent. [5] The first information market formed around credit bureaus, organizations created for the exchange of personal information in the financial industry. [6] Since then, information markets have changed radically: they are now mainly hosted on electronic data aggregation systems, the vast majority of which are accessible to both governments and organizations in the corporate and other sectors. [7] Some information capital market platforms can be accessed directly by the public; for example, SocialSafe Ltd offers a social media backup tool that allows users to download their content from a variety of social networks to their own personal data store and then sell this information directly. [8]
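
As a rough illustration of the kind of record such a platform might exchange, the sketch below defines a hypothetical personal-data listing; the PersonalDataListing class, its fields and the consent check are invented for illustration and are not taken from SocialSafe or any real marketplace.

```python
# A hypothetical sketch of a personal-data listing on an information
# capital marketplace; all names and fields here are invented.
from dataclasses import dataclass
from typing import List

@dataclass
class PersonalDataListing:
    owner_id: str                # pseudonymous identifier of the data owner
    sources: List[str]           # e.g. social networks the data was exported from
    record_count: int            # items held in the personal data store
    asking_price: float          # price set by the owner
    consent_given: bool = False  # data protection rules require explicit consent

    def is_sellable(self) -> bool:
        # A listing may only be offered if the owner has consented
        # and it actually contains data.
        return self.consent_given and self.record_count > 0

listing = PersonalDataListing(
    owner_id="user-42",
    sources=["twitter", "facebook"],
    record_count=1280,
    asking_price=15.0,
    consent_given=True,
)
print(listing.is_sellable())  # True
```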

Era of Big data

Big data refers to collections of information so massive and complex that they are impossible to analyse using traditional data processing technologies and require specialised technologies instead. [9] Recent advances in big data analysis have the potential to change the way the information capital market operates, because if commercial organisations are able to analyse and structure information about millions of people in any part of the world, the value of information coming from any single individual or organization is reduced, while companies are able to make faster and more accurate data-driven decisions. Some scientists even predict that advances in big data analysis will have an even larger effect on the information capital market than the creation of the internet. [10]
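
As an illustration of the specialised technologies mentioned above, the sketch below uses Apache Spark's Python API (PySpark) to aggregate a purchase dataset assumed to be too large for a single machine; the file path, column names and data are placeholders, not part of the sources cited here.

```python
# Minimal PySpark sketch: aggregate purchase records assumed to be too
# large to process on one machine. The path and column names are
# placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("info-capital-demo").getOrCreate()

# Read a (hypothetical) large CSV of purchase records from distributed storage
purchases = spark.read.csv("hdfs:///data/purchases.csv",
                           header=True, inferSchema=True)

# Spending per customer segment, computed in parallel across the cluster
summary = (purchases
           .groupBy("segment")
           .agg(F.sum("amount").alias("total_spend"),
                F.countDistinct("customer_id").alias("customers")))

summary.show()
spark.stop()
```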

List of companies operating in big data analytics:

IBM - IBM offers DB2, Informix and InfoSphere database software, Cognos and SPSS analytics applications, as well as big data services through its Global Services division. [11]

HP - HP is a major provider of software tools for big data analysis.

Oracle - Oracle develops both hardware and software products for big data processing, including Oracle NoSQL Database, engineered systems that bundle Apache Hadoop, Oracle Data Integrator and many others.

SAP - SAP is one of the largest providers of software appliances for big data handling and analytics. [12]

Microsoft - Microsoft, in partnership with Hortonworks, offers the HDInsight service, which is used to analyse unstructured information provided by data aggregators.

Google - Google develops BigQuery, one of the first cloud-based big data processing platforms (a minimal query sketch follows this list).
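
As referenced in the Google entry above, the following is a minimal sketch of querying data with Google BigQuery's Python client; the SQL and the public dataset are illustrative, and running it requires a Google Cloud project with credentials configured.

```python
# Minimal sketch of querying a BigQuery public dataset; the query and
# dataset are illustrative and require configured Google Cloud credentials.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

# Submit the query and iterate over the result rows
for row in client.query(query).result():
    print(row["name"], row["total"])
```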

References

  1. Albert Wee Kwan Tan; Petros Theodorou (1 January 2009). Strategic Information Technology and Portfolio Management. Idea Group Inc (IGI). pp. 254–. ISBN 978-1-59904-689-1.
  2. Robert Kaplan; David P. Norton (30 December 2013). Strategy Maps: Converting Intangible Assets into Tangible Outcomes. Harvard Business Press. pp. 179–. ISBN 978-1-4221-6349-8.
  3. "Are You Maximizing Your Information Capital? | Innovation Insights | WIRED". www.wired.com. Archived from the original on 2014-04-07.
  4. Masanobu Kosuga; Yasuhiro Monden (1 January 2007). Japanese Management Accounting Today. World Scientific. pp. 125–. ISBN 978-981-277-973-1.
  5. Alessandro Ludovico (2012). Post-digital Print: The Mutation of Publishing Since 1894. Onomatopee. ISBN 978-90-78454-87-8.
  6. United States. Congress. Senate. Select Committee on Small Business (1980). The Impact of Commercial Credit Reporting Practices on Small Business: Hearings Before the Select Committee on Small Business, United States Senate, Ninety-sixth Congress, First Session ... October 31 and November 1, 1979. U.S. Government Printing Office.
  7. Commission on Physical Sciences, Mathematics, and Applications; Computer Science and Telecommunications Board; Committee on the Internet in the Evolving Information Infrastructure; National Research Council, Division on Engineering and Physical Sciences (22 January 2001). The Internet's Coming of Age. National Academies Press. pp. 136–. ISBN 978-0-309-06992-2.
  8. Fernando Francisco Padró (1 January 2004). Statistical Handbook on the Social Safety Net. Greenwood Publishing Group. pp. 1–. ISBN 978-1-57356-516-5.
  9. Anand Rajaraman; Jeffrey David Ullman (27 October 2011). Mining of Massive Datasets. Cambridge University Press. ISBN 978-1-139-50534-5.
  10. Viktor Mayer-Schönberger; Kenneth Cukier (2013). Big Data: A Revolution that Will Transform how We Live, Work, and Think. Houghton Mifflin Harcourt. ISBN 978-0-544-00269-2.
  11. Paul Zikopoulos (1 October 2011). Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data. McGraw Hill Professional. ISBN 978-0-07-179054-3.
  12. Financial Times (29 January 2013). Decoding Big Data. Penguin Books Limited. pp. 13–. ISBN 978-0-670-92384-7.