Information capital


Information capital is a concept which asserts that information has intrinsic value that can be shared and leveraged within and between organizations. The term connotes that sharing information is a means of sharing power, supporting personnel, and optimizing working processes. [1] Information capital consists of the pieces of information that enable the exchange of knowledge capital.



In management, information capital is usually described as a set of data which are valuable to an organisation and can be reached through different data storage systems, such as intranet and internet systems, computer databases, libraries, and information-sharing networks. [2] Information capital can be used not only by organisations but by individuals as well. For example, if information capital enables an individual to analyse his spending on a certain type of product and determine how it compares with his spending on other products, or with the spending of other people, this might affect his future purchasing decisions. [3] In the Information Era, efficient use of information capital depends heavily on information capital readiness, since information capital is derived from the readiness of an organisation's information systems. Companies which invest more in IT systems might therefore gain a competitive advantage over other businesses. [4]
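The personal spending analysis described above can be sketched in a few lines of code. This is only an illustration, not drawn from the cited sources; the categories and amounts are invented:

```python
from collections import defaultdict

def spending_by_category(purchases):
    """Aggregate (category, amount) purchases into per-category totals
    and each category's share of overall spending."""
    totals = defaultdict(float)
    for category, amount in purchases:
        totals[category] += amount
    grand_total = sum(totals.values())
    shares = {c: t / grand_total for c, t in totals.items()}
    return dict(totals), shares

# Hypothetical purchase records for one individual.
purchases = [
    ("groceries", 120.0),
    ("transport", 40.0),
    ("groceries", 80.0),
    ("entertainment", 60.0),
]
totals, shares = spending_by_category(purchases)
```

Comparing `shares` across periods, or against aggregated figures for other people, is the kind of analysis that might influence future purchasing decisions.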

Information capital market

Information capital markets are commercial markets for the buying and selling of information and data. These markets connect data aggregators with organisations and individuals who need information for business, scientific or other purposes. Regulating acts such as the Data Protection Act 1998 and the Data Protection Directive were imposed to control the information capital market and prevent inappropriate use of personal information by data aggregators or any other individuals and organisations. Although information has been bought and sold since ancient times, the idea of an information marketplace is relatively recent. [5] The first information market formed around credit bureaus, organisations for the exchange of personal information in the financial industry. [6] Since that time, information markets have changed radically: they are now mainly hosted on electronic data aggregation systems, the vast majority of which are accessible to governments as well as to organisations in the corporate and other sectors. [7] Some information capital market platforms can be accessed directly by the public; for example, SocialSafe Ltd is a social media backup tool that allows users to download their content from a variety of social networks to their own personal data store and then sell this information directly. [8]


The Data Protection Act 1998 is a United Kingdom Act of Parliament designed to protect personal data stored on computers or in an organised paper filing system. It enacted the EU Data Protection Directive 1995's provisions on the protection, processing and movement of data.

The Data Protection Directive was a European Union directive adopted in 1995 which regulates the processing of personal data within the European Union. It is an important component of EU privacy and human rights law.

The nature of such information markets is still evolving, which complicates the development of sustainable business models. However, certain attributes of these markets are beginning to be understood, such as diminished participation costs, opportunities for customization, shifting customer relations, and a need for order.

Era of Big data

Big data refers to amounts of information so massive and complex that they become impossible to analyse using traditional data processing technologies, requiring special technologies instead. [9] Recent advances in big data analysis have the potential to change the way the information capital market operates: if commercial organisations become able to analyse and structure information about millions of people in any part of the world, this will diminish the value of information that comes from any single individual or organisation, and will allow companies to make faster and more accurate data-driven decisions. Some scientists even predict that advances in big data analysis will have a larger effect on the information capital market than the creation of the internet. [10]

Big data: information assets characterized by such high volume, velocity and variety as to require specific technology and analytical methods for their transformation into value.

"Big data" is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many cases (rows) offer greater statistical power, while data with higher complexity may lead to a higher false discovery rate. Big data challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy and data sources. Big data was originally associated with three key concepts: volume, variety, and velocity. Other concepts later attributed to big data are veracity and value.
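One practical consequence of data being "too large to be dealt with by traditional software" is that it cannot be loaded into memory at once; a standard workaround is streaming (chunked) computation. The sketch below, purely illustrative, computes a mean over an arbitrarily long stream without materializing the data:

```python
def streaming_mean(values):
    """Compute the mean of an arbitrarily large stream of numbers
    while keeping only two running variables in memory."""
    count = 0
    total = 0.0
    for v in values:
        count += 1
        total += v
    return total / count if count else 0.0

# A generator stands in for a dataset too large to load at once.
mean = streaming_mean(x * 0.5 for x in range(1_000_000))
```

Real big data systems apply the same idea at cluster scale, splitting the stream across many machines and combining the partial totals.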

List of companies operating in big data analytics:

IBM - IBM offers DB2, Informix and InfoSphere database software, Cognos and SPSS analytics applications, and big data services through its Global Services division. [11]

HP - HP is a major provider of big data software analysis tools.

Oracle - Oracle develops both hardware and software products for big data processing, including Oracle NoSQL Database, a distribution of Apache Hadoop, Oracle Data Integrator and many others.

SAP - SAP is one of the largest providers of software appliances for big data handling and analytics. [12]

Microsoft - Microsoft, in partnership with Hortonworks, offers the HDInsight service, which is used to analyse unstructured information provided by data aggregators.

Google - Google developed BigQuery, one of the first cloud-based big data processing platforms.


Related Research Articles

Customer-relationship management (CRM) is an approach to manage a company's interaction with current and potential customers. It uses data analysis about customers' history with a company to improve business relationships with customers, specifically focusing on customer retention and ultimately driving sales growth.

Business intelligence (BI) comprises the strategies and technologies used by enterprises for the data analysis of business information. BI technologies provide historical, current and predictive views of business operations. Common functions of business intelligence technologies include reporting, online analytical processing, analytics, data mining, process mining, complex event processing, business performance management, benchmarking, text mining, predictive analytics and prescriptive analytics. BI technologies can handle large amounts of structured and sometimes unstructured data to help identify, develop and otherwise create new strategic business opportunities. They aim to allow for the easy interpretation of these big data. Identifying new opportunities and implementing an effective strategy based on insights can provide businesses with a competitive market advantage and long-term stability.


Social network analysis (SNA) is the process of investigating social structures through the use of networks and graph theory. It characterizes networked structures in terms of nodes and the ties, edges, or links that connect them. Examples of social structures commonly visualized through social network analysis include social media networks, meme spread, information circulation, friendship and acquaintance networks, business networks, collaboration graphs, kinship, disease transmission, and sexual relationships. These networks are often visualized through sociograms in which nodes are represented as points and ties are represented as lines. These visualizations provide a means of qualitatively assessing networks by varying the visual representation of their nodes and edges to reflect attributes of interest.
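The nodes-and-ties representation described above can be sketched directly in code. As a minimal illustration (the network and names are invented), the following computes degree centrality, one of the simplest SNA measures: a node's number of ties divided by the maximum possible:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Given undirected ties as (node, node) pairs, return each node's
    degree centrality: its number of ties divided by (n - 1),
    the maximum possible ties in a network of n nodes."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    return {node: len(adj) / (n - 1) for node, adj in neighbors.items()}

# A small friendship network: Ann knows everyone, Dan knows only Ann.
ties = [("Ann", "Bob"), ("Ann", "Cat"), ("Ann", "Dan"), ("Bob", "Cat")]
centrality = degree_centrality(ties)
```

Dedicated SNA libraries offer many richer measures (betweenness, closeness, clustering), but they all start from this same edge-list representation.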


Analytics is the discovery, interpretation, and communication of meaningful patterns in data; and the process of applying those patterns towards effective decision making. In other words, analytics can be understood as the connective tissue between data and effective decision making, within an organization. Especially valuable in areas rich with recorded information, analytics relies on the simultaneous application of statistics, computer programming and operations research to quantify performance.


SAS is a software suite developed by SAS Institute for advanced analytics, multivariate analysis, business intelligence, data management, and predictive analytics.

MicroStrategy Incorporated is a company that provides business intelligence (BI), mobile software, and cloud-based services. Founded in 1989 by Michael J. Saylor and Sanju Bansal, the firm develops software to analyze internal and external data in order to make business decisions and to develop mobile apps. It is a public company headquartered in Tysons Corner, Virginia, in the Washington metropolitan area. Its primary business analytics competitors include SAP AG Business Objects, IBM Cognos, and Oracle Corporation's BI Platform. Saylor is the CEO and chairman of the board.

Apache Hadoop is a collection of open-source software utilities that facilitate using a network of many computers to solve problems involving massive amounts of data and computation. It provides a software framework for distributed storage and processing of big data using the MapReduce programming model. Originally designed for computer clusters built from commodity hardware—still the common use—it has also found use on clusters of higher-end hardware. All the modules in Hadoop are designed with a fundamental assumption that hardware failures are common occurrences and should be automatically handled by the framework.
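The MapReduce model that Hadoop implements can be illustrated with a single-machine word-count sketch. This is a toy illustration of the programming model only; a real Hadoop job distributes the map, shuffle, and reduce phases across a cluster and handles machine failures automatically:

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in one input split.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Shuffle: group intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each key's values into a final count.
    return {key: sum(values) for key, values in groups.items()}

# Two invented input splits standing in for files in distributed storage.
splits = ["big data big ideas", "data lakes hold big data"]
intermediate = [pair for doc in splits for pair in map_phase(doc)]
counts = reduce_phase(shuffle(intermediate))
```

Because map and reduce operate on independent keys, each phase can run in parallel on different machines, which is what lets the model scale to massive data sets.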


Workday, Inc. is an on‑demand (cloud-based) financial management and human capital management software vendor. It was founded by David Duffield, founder and former CEO of ERP company PeopleSoft, and former PeopleSoft chief strategist Aneel Bhusri following Oracle's hostile takeover of PeopleSoft in 2005. In October 2012, it launched a successful initial public offering that valued the company at $9.5 billion.


Vertica Systems is an analytic database management software company. Vertica was founded in 2005 by database researcher Michael Stonebraker and Andrew Palmer. Palmer was the founding CEO. Ralph Breslauer and Christopher P. Lynch served as later CEOs.

HP Information Management Software is software from the HP Software Division, used to organize, protect, retrieve, acquire, manage and maintain information. The HP Software Division also offers information analytics software. The amount of data that companies have to deal with has grown tremendously over the past decade, making the management of this information more difficult. The University of California at Berkeley claims the amount of information produced globally increases by 30 percent annually. An April 2010 Information Management article cited a survey in which nearly 90 percent of businesses blame poor performance on data growth. The survey concluded that for many businesses their applications and databases are growing by 50 percent or more annually, making it difficult to manage the rapid expansion of information. Because of this Information explosion, IT companies have created technology solutions to help businesses manage this ever-expanding data.

Cloudera, Inc. is a US-based software company that provides a software platform for data engineering, data warehousing, machine learning and analytics that runs in the cloud or on premises.

Revolution Analytics is a statistical software company focused on developing open source and "open-core" versions of the free and open source software R for enterprise, academic and analytics customers. Revolution Analytics was founded in 2007 as REvolution Computing providing support and services for R in a model similar to Red Hat's approach with Linux in the 1990s as well as bolt-on additions for parallel processing. In 2009 the company received nine million in venture capital from Intel along with a private equity firm and named Norman H. Nie as their new CEO. In 2010 the company announced the name change as well as a change in focus. Their core product, Revolution R, would be offered free to academic users and their commercial software would focus on big data, large scale multiprocessor computing, and multi-core functionality.

The Oracle Big Data Appliance consists of hardware and software from Oracle Corporation sold as a computer appliance. It was announced in 2011, promoted for consolidating and loading unstructured data into Oracle Database software.

The third platform is a term coined by marketing firm International Data Corporation (IDC) for a model of a computing platform. It was promoted as inter-dependencies between mobile computing, social media, cloud computing, and information / analytics, and possibly the Internet of Things. The term was in use in 2013, and possibly earlier. Gartner claimed that these interdependent trends were "transforming the way people and businesses relate to technology" and have since provided a number of reports on the topic.

Platfora, Inc. is a big data analytics company based in San Mateo, California. The firm’s software works with the open-source software framework Apache Hadoop to assist with data analysis, data visualization, and sharing.

A data lake is a system or repository of data stored in its natural format, usually object blobs or files. A data lake is usually a single store of all enterprise data including raw copies of source system data and transformed data used for tasks such as reporting, visualization, analytics and machine learning. A data lake can include structured data from relational databases, semi-structured data, unstructured data and binary data.

The Cray Urika-XA extreme analytics platform, manufactured by supercomputer maker Cray Inc., is an appliance that analyzes the massive amounts of data—usually called big data—that supercomputers collect. Organizations that use supercomputers have traditionally used multiple smaller off-the-shelf systems for data analysis. But as organizations see a dramatic increase in the amount of data they collect—everything from research data to retail transactions—they need data analytics systems that can make sense of it and help them use it strategically. In a nod to organizations that lean toward open-source software, the Urika-XA comes pre-installed with Cloudera Enterprise Hadoop and Apache Spark.

Nandu Jayakumar is an information technology executive and computer programmer based in San Francisco, California. He joined Oracle Corporation as VP of Engineering in 2017. He is best known for his contributions to the field of Big Data and for speaking and writing about managing data at large scale.


  1. Albert Wee Kwan Tan; Petros Theodorou (1 January 2009). Strategic Information Technology and Portfolio Management. Idea Group Inc (IGI). pp. 254–. ISBN   978-1-59904-689-1.
  2. Robert Kaplan; David P. Norton (30 December 2013). Strategy Maps: Converting Intangible Assets into Tangible Outcomes. Harvard Business Press. pp. 179–. ISBN   978-1-4221-6349-8.
  3. {{cite web}}
  4. Masanobu Kosuga; Yasuhiro Monden (1 January 2007). Japanese Management Accounting Today. World Scientific. pp. 125–. ISBN 978-981-277-973-1.
  5. Alessandro Ludovico (2012). Post-digital Print: The Mutation of Publishing Since 1894. Onomatopee. ISBN   978-90-78454-87-8.
  6. United States. Congress. Senate. Select Committee on Small Business (1980). The Impact of Commercial Credit Reporting Practices on Small Business: Hearings Before the Select Committee on Small Business, United States Senate, Ninety-sixth Congress, First Session ... October 31 and November 1, 1979. U.S. Government Printing Office.
  7. Commission on Physical Sciences, Mathematics, and Applications; Computer Science and Telecommunications Board; Committee on the Internet in the Evolving Information Infrastructure; National Research Council, Division on Engineering and Physical Sciences (22 January 2001). The Internet's Coming of Age. National Academies Press. pp. 136–. ISBN 978-0-309-06992-2.
  8. Fernando Francisco Padró (1 January 2004). Statistical Handbook on the Social Safety Net. Greenwood Publishing Group. pp. 1–. ISBN   978-1-57356-516-5.
  9. Anand Rajaraman; Jeffrey David Ullman (27 October 2011). Mining of Massive Datasets. Cambridge University Press. ISBN   978-1-139-50534-5.
  10. Viktor Mayer-Schönberger; Kenneth Cukier (2013). Big Data: A Revolution that Will Transform how We Live, Work, and Think. Houghton Mifflin Harcourt. ISBN   0-544-00269-5.
  11. Paul Zikopoulos (1 October 2011). Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data. McGraw Hill Professional. ISBN   978-0-07-179054-3.
  12. Financial Times (29 January 2013). Decoding Big Data. Penguin Books Limited. pp. 13–. ISBN   978-0-670-92384-7.