Data collection

Figure: Example of data collection in the biological sciences. Adélie penguins are identified and weighed each time they cross the automated weighbridge on their way to or from the sea. [1]

Data collection or data gathering is the process of gathering and measuring information on targeted variables in an established system, which then enables one to answer relevant questions and evaluate outcomes. Data collection is a research component in all study fields, including physical and social sciences, humanities, [2] and business. While methods vary by discipline, the emphasis on ensuring accurate and honest collection remains the same. The goal for all data collection is to capture evidence that allows data analysis to lead to the formulation of credible answers to the questions that have been posed.


Regardless of the field of study or the preference for defining data (quantitative or qualitative), accurate data collection is essential to maintaining research integrity. Selecting appropriate data collection instruments (existing, modified, or newly developed) and providing clear instructions for their correct use reduce the likelihood of errors.

Methodology

Data collection and validation consist of four steps when it involves taking a census and seven steps when it involves sampling. [3]

A formal data collection process is necessary, as it ensures that the data gathered are both defined and accurate. This way, subsequent decisions based on arguments embodied in the findings are made using valid data. [4] The process provides both a baseline from which to measure and in certain cases an indication of what to improve.

Tools

Data collection system

Data management platform

Data management platforms (DMPs) are centralized storage and analytical systems for data, mainly used in marketing. DMPs exist to compile and transform large amounts of demand and supply data into discernible information. Marketers may want to receive and use first-, second-, and third-party data. DMPs enable this because they aggregate data from DSPs (demand-side platforms) and SSPs (supply-side platforms). DMPs are integral to optimizing current and future advertising campaigns.
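
The aggregation role described above can be sketched in code. This is a minimal, hypothetical illustration (all field names, source labels, and the merge rule are invented for the example, not taken from any real DMP product): records arriving from first-, second-, and third-party sources are merged into one profile per user ID.

```python
def aggregate_profiles(*sources):
    """Merge records from multiple (source_name, records) pairs, keyed by user_id."""
    profiles = {}
    for source_name, records in sources:
        for record in records:
            profile = profiles.setdefault(record["user_id"], {"sources": []})
            profile["sources"].append(source_name)
            for key, value in record.items():
                if key != "user_id":
                    # setdefault keeps an existing value: the first source wins
                    profile.setdefault(key, value)
    return profiles

# Invented example data: site analytics (first-party) plus a bought-in
# audience segment (third-party) for the same user.
first_party = [{"user_id": "u1", "page_views": 12}]
third_party = [{"user_id": "u1", "segment": "outdoor sports"},
               {"user_id": "u2", "segment": "cooking"}]

merged = aggregate_profiles(("first_party", first_party),
                            ("third_party", third_party))
```

After merging, `merged["u1"]` carries both the behavioral data and the purchased segment, which is the "discernible information" a campaign planner would query.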

Data integrity issues

A main reason for maintaining data integrity is to enable the detection of errors in the data collection process. Those errors may be made intentionally (deliberate falsification) or unintentionally (random or systematic errors). [5]

There are two approaches that may protect data integrity and ensure the scientific validity of study results: [6]

Quality assurance (QA)

QA's focus is prevention, which is primarily a cost-effective way to protect the integrity of data collection. Standardization of protocol, with comprehensive and detailed procedure descriptions for data collection, is central to prevention. The risk of failing to identify problems and errors in the research process is often caused by poorly written guidelines. Several examples of such failures are listed below:

  • Uncertainty of timing, methods and identification of the responsible person
  • Partial listing of items needed to be collected
  • Vague description of data collection instruments instead of rigorous step-by-step instructions on administering tests
  • Failure to recognize exact content and strategies for training and retraining staff members responsible for data collection
  • Unclear instructions for using, making adjustments to, and calibrating data collection equipment
  • No predetermined mechanism to document changes in procedures that occur during the investigation
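
The failure modes listed above can be inverted into a checklist that a written protocol should satisfy. The sketch below is hypothetical (the field names are invented for illustration, not drawn from any standard): a draft protocol is checked for the elements whose absence the list identifies as a QA risk.

```python
# Each required field corresponds to one failure mode from the list above.
REQUIRED_PROTOCOL_FIELDS = [
    "responsible_person",       # who collects the data
    "collection_schedule",      # timing and methods
    "items_to_collect",         # complete, not partial, listing
    "instrument_instructions",  # step-by-step, not vague descriptions
    "staff_training_plan",      # training and retraining content
    "equipment_calibration",    # using, adjusting, calibrating equipment
    "change_log_procedure",     # documenting mid-study procedure changes
]

def missing_fields(protocol: dict) -> list:
    """Return the required elements a draft protocol leaves unspecified."""
    return [f for f in REQUIRED_PROTOCOL_FIELDS if not protocol.get(f)]

draft = {"responsible_person": "field technician",
         "items_to_collect": ["mass_g"]}
gaps = missing_fields(draft)
```

Running the check on the draft flags the five unspecified elements, exactly the kind of gap that poorly written guidelines leave undetected.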

User privacy issues

There are serious concerns about the integrity of individual user data collected by cloud computing, because this data is transferred across countries that have different standards of protection for individual user data. [7] Information processing has advanced to the level where user data can now be used to predict what an individual is saying before they even speak. [8]

Quality control (QC)

Since QC actions occur during or after data collection, all the details can be carefully documented. A clearly defined communication structure is a precondition for establishing monitoring systems: a poorly organized communication structure leads to lax monitoring and limits the opportunities for detecting errors. Quality control is also responsible for identifying the actions needed to correct faulty data collection practices and to minimize such occurrences in the future. A team is unlikely to recognize the need for these actions if its procedures are written vaguely and are not informed by feedback or training.
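
A simple form of the error detection described above is a range check run during or after collection. The sketch below is illustrative only (the field name and plausibility limits are invented for the example): records whose values fall outside the range the protocol defines as plausible are flagged for review.

```python
def flag_out_of_range(records, field, low, high):
    """Return (index, value) pairs that fall outside [low, high] and need review."""
    return [(i, r[field]) for i, r in enumerate(records)
            if not (low <= r[field] <= high)]

# Invented weighbridge readings; adult Adélie penguins weigh roughly
# 3-6 kg, so 41000 g is a likely data-entry error (misplaced digit).
weights = [{"mass_g": 4100}, {"mass_g": 41000}, {"mass_g": 3950}]
suspect = flag_out_of_range(weights, "mass_g", 3000, 6500)
```

Flagged records are then routed back through the documented communication structure for correction rather than silently passed on to analysis.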

Some data collection problems necessitate prompt corrective action.


Related Research Articles

Research: systematic study undertaken to increase knowledge

Research is "creative and systematic work undertaken to increase the stock of knowledge". It involves the collection, organization, and analysis of evidence to increase understanding of a topic, characterized by a particular attentiveness to accounting for and controlling sources of bias and error. A research project may be an expansion of past work in the field. To test the validity of instruments, procedures, or experiments, research may replicate elements of prior projects or a project as a whole.

Marketing research is the systematic gathering, recording, and analysis of qualitative and quantitative data about issues relating to marketing products and services. The goal is to identify and assess how changing elements of the marketing mix impacts customer behavior.

Usability: capacity of a system for its users to perform tasks

Usability can be described as the capacity of a system to provide a condition for its users to perform tasks safely, effectively, and efficiently while enjoying the experience. In software engineering, usability is the degree to which software can be used by specified consumers to achieve quantified objectives with effectiveness, efficiency, and satisfaction in a quantified context of use.

Multimethodology

Multimethodology or multimethod research is the use of more than one method of data collection or research in a study or set of related studies. Mixed methods research is more specific in that it includes the mixing of qualitative and quantitative data, methods, methodologies, and/or paradigms in a study or set of related studies. One could argue that mixed methods research is a special case of multimethod research. Another applicable but less often used label for multimethod or mixed research is methodological pluralism. All of these approaches emphasize that monomethod research can be improved through the use of multiple data sources, methods, research methodologies, perspectives, standpoints, and paradigms.

Quantitative research: all procedures for the numerical representation of empirical facts

Quantitative research is a research strategy that focuses on quantifying the collection and analysis of data. It is formed from a deductive approach where emphasis is placed on the testing of theory, shaped by empiricist and positivist philosophies.

Educational research refers to the systematic collection and analysis of data related to the field of education. Research may involve a variety of methods and various aspects of education including student learning, interaction, teaching methods, teacher training, and classroom dynamics.

An assay is an investigative (analytic) procedure in laboratory medicine, mining, pharmacology, environmental biology and molecular biology for qualitatively assessing or quantitatively measuring the presence, amount, or functional activity of a target entity. The measured entity is often called the analyte, the measurand, or the target of the assay. The analyte can be a drug, biochemical substance, chemical element or compound, or cell in an organism or organic sample. An assay usually aims to measure an analyte's intensive property and express it in the relevant measurement unit.

Methodology: study of research methods

In its most common sense, methodology is the study of research methods. However, the term can also refer to the methods themselves or to the philosophical discussion of associated background assumptions. A method is a structured procedure for bringing about a certain goal, such as acquiring or verifying knowledge. This normally involves various steps, such as choosing a sample, collecting data from this sample, and interpreting the data. The study of methods involves detailed description and analysis of these processes, including evaluative comparison of different methods to assess their advantages and disadvantages and the research goals for which each may be used. These descriptions and evaluations depend on philosophical background assumptions, for example about how to conceptualize the studied phenomena and what constitutes evidence for or against them. Understood in its widest sense, methodology also includes the discussion of these more abstract issues.

In the context of software engineering, software quality refers to two related but distinct notions.

An information technology audit, or information systems audit, is an examination of the management controls within an Information technology (IT) infrastructure and business applications. The evaluation of evidence obtained determines if the information systems are safeguarding assets, maintaining data integrity, and operating effectively to achieve the organization's goals or objectives. These reviews may be performed in conjunction with a financial statement audit, internal audit, or other form of attestation engagement.

Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.
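
The "ability to function under stated conditions for a specified period" is often quantified with a standard textbook model not given in the source: under an assumed constant failure rate λ, reliability over a period t is R(t) = exp(−λt). The sketch below uses invented numbers purely for illustration.

```python
import math

def reliability(failure_rate_per_hour, hours):
    """R(t) = exp(-lambda * t) under a constant failure rate (exponential model)."""
    return math.exp(-failure_rate_per_hour * hours)

# Invented example: a component with one expected failure per 10,000
# hours of operation, run for a 1,000-hour mission.
r = reliability(1e-4, 1000)  # exp(-0.1), about a 90% chance of no failure
```

The constant-rate assumption is the simplest case; real components often follow more complex hazard curves (e.g. the "bathtub" pattern), for which this single-parameter model is only an approximation.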

A standard operating procedure (SOP) is a set of step-by-step instructions compiled by an organization to help workers carry out routine operations. SOPs aim to achieve efficiency, quality output, and uniformity of performance, while reducing miscommunication and failure to comply with industry regulations.

Internal control, as defined by accounting and auditing, is a process for assuring achievement of an organization's objectives in operational effectiveness and efficiency, reliable financial reporting, and compliance with laws, regulations, and policies. A broad concept, internal control involves everything that controls risks to an organization.

A test method is a method for a test in science or engineering, such as a physical test, chemical test, or statistical test. It is a definitive procedure that produces a test result. To ensure accurate and relevant results, a test method should be "explicit, unambiguous, and experimentally feasible", as well as effective and reproducible.

Software quality control is the set of procedures used by organizations to ensure that a software product will meet its quality goals at the best value to the customer, and to continually improve the organization’s ability to produce software products in the future.

Continuous auditing

Continuous auditing is an automatic method used to perform auditing activities, such as control and risk assessments, on a more frequent basis. Technology plays a key role in continuous audit activities by helping to automate the identification of exceptions or anomalies, analyze patterns within the digits of key numeric fields, review trends, and test controls, among other activities.
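
One common instance of the digit-pattern analysis mentioned above is a first-digit test against Benford's law, which many naturally occurring financial figures approximately follow; large deviations can flag a numeric field worth auditing. The sketch below is a hypothetical illustration with invented sample values, not a production audit routine.

```python
import math
from collections import Counter

def first_digit_shares(amounts):
    """Return (digit, observed_share, benford_share) for leading digits 1-9."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    n = len(digits)
    # Benford's law: P(d) = log10(1 + 1/d) for the first digit d.
    return [(d, counts[d] / n, math.log10(1 + 1 / d)) for d in range(1, 10)]

sample = [111, 120, 19, 25, 310, 47, 52, 68, 91]  # invented invoice totals
shares = first_digit_shares(sample)
```

In a continuous-audit pipeline, a run of this check on each new batch of transactions can automatically raise an exception when observed shares drift far from the Benford baseline.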

A physical test is a qualitative or quantitative procedure that consists of determination of one or more characteristics of a given product, process or service according to a specified procedure. Often this is part of an experiment.

Analytical quality control (AQC) refers to all those processes and procedures designed to ensure that the results of laboratory analysis are consistent, comparable, accurate and within specified limits of precision. Constituents submitted to the analytical laboratory must be accurately described to avoid faulty interpretations, approximations, or incorrect results. The qualitative and quantitative data generated from the laboratory can then be used for decision making. In the chemical sense, quantitative analysis refers to the measurement of the amount or concentration of an element or chemical compound in a matrix that differs from the element or compound. Fields such as industry, medicine, and law enforcement can make use of AQC.
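
A common AQC device for keeping results "within specified limits of precision" is a control chart: repeated measurements of a control sample establish limits, and later runs outside mean ± 3 standard deviations signal that the method is out of control. The numbers below are invented for illustration.

```python
import statistics

def control_limits(reference_runs):
    """Return (lower, upper) limits at mean +/- 3 sample standard deviations."""
    mean = statistics.mean(reference_runs)
    sd = statistics.stdev(reference_runs)
    return mean - 3 * sd, mean + 3 * sd

# Invented repeated analyses of the same control sample.
reference = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
low, high = control_limits(reference)

new_run = 11.5
in_control = low <= new_run <= high  # False: this run needs investigation
```

A result outside the limits does not say *why* the method drifted, only that results from that run should not be used for decision making until the cause is found.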

The UK Data Service is the largest digital repository for quantitative and qualitative social science and humanities research data in the United Kingdom. The organisation is funded by the UK government through the Economic and Social Research Council and is led by the UK Data Archive at the University of Essex, in partnership with other universities.

Willem Egbert (Wim) Saris is a Dutch sociologist and Emeritus Professor of Statistics and Methodology, especially known for his work on "Causal modelling in non-experimental research" and measurement errors.

References

  1. Lescroël, A. L.; Ballard, G.; Grémillet, D.; Authier, M.; Ainley, D. G. (2014). Descamps, Sébastien (ed.). "Antarctic Climate Change: Extreme Events Disrupt Plastic Phenotypic Response in Adélie Penguins". PLOS ONE. 9 (1): e85291. Bibcode:2014PLoSO...985291L. doi:10.1371/journal.pone.0085291. PMC 3906005. PMID 24489657.
  2. Vuong, Quan-Hoang; La, Viet-Phuong; Vuong, Thu-Trang; Ho, Manh-Toan; Nguyen, Hong-Kong T.; Nguyen, Viet-Ha; Pham, Hiep-Hung; Ho, Manh-Tung (September 25, 2018). "An open database of productivity in Vietnam's social sciences and humanities for public use". Scientific Data. 5: 180188. Bibcode:2018NatSD...580188V. doi:10.1038/sdata.2018.188. PMC 6154282. PMID 30251992.
  3. Ziafati Bafarasat, A. (2021). "Collecting and validating data: A simple guide for researchers". Advance. Preprint. https://doi.org/10.31124/advance.13637864.v1.
  4. Sapsford, Roger; Jupp, Victor. Data Collection and Analysis. ISBN 0-7619-5046-X.
  5. Northern Illinois University (2005). "Data Collection". Responsible Conduct in Data Management. Retrieved June 8, 2019.
  6. Most, Marlene M.; Craddick, Shirley; Crawford, Staci; Redican, Susan; Rhodes, Donna; Rukenbrod, Fran; Laws, Reesa (October 2003). "Dietary quality assurance processes of the DASH-Sodium controlled diet study". Journal of the American Dietetic Association. 103 (10): 1339–1346. doi:10.1016/s0002-8223(03)01080-0. PMID 14520254.
  7. Wang, Faye Fangfei (10 January 2014). Law of Electronic Commercial Transactions: Contemporary Issues in the EU, US and China. Routledge. p. 154. ISBN 978-1-134-11522-8.
  8. "Data, not privacy, is the real danger". NBC News. 4 February 2019.