Informal inferential reasoning

In statistics education, informal inferential reasoning (also called informal inference) refers to the process of making a generalization based on data (samples) about a wider universe (population or process), while taking uncertainty into account, without using formal statistical procedures or methods (e.g., p-values, t-tests, hypothesis tests, significance tests).

Like formal statistical inference, informal inferential reasoning aims to draw conclusions about a wider universe (population or process) from data (a sample). In contrast with formal statistical inference, however, formal statistical procedures or methods are not necessarily used.

In statistics education literature, the term "informal" is used to distinguish informal inferential reasoning from a formal method of statistical inference.

Informal Inferential Reasoning and Statistical Inference

Since everyday life involves making decisions based on data, making inferences is an important skill. However, a number of studies assessing students' understanding of statistical inference suggest that students have difficulty reasoning about inference. [1]

Given the importance of reasoning about statistical inference, and the difficulties students have with this type of reasoning, statistics educators and researchers have been exploring alternative approaches to teaching statistical inference. [2] Recent research suggests that students have some sound intuitions about data, and that these intuitions can be refined and nudged towards a prescriptive theory of inferential reasoning. [3] A more informal and conceptual approach, one that builds on big ideas and makes connections between foundational concepts, is therefore favorable. [1]

Recently, informal inferential reasoning has been a focus of research and discussion among researchers and educators in statistics education, as it is seen as having the potential to help build the fundamental concepts that underlie formal statistical inference. Many advocate that the underlying concepts and skills of inference be introduced early in a course or curriculum, as they can make formal statistical inference more accessible (see the published reaction of Garfield & Zieffler to [4]).

Three essential characteristics

According to the Statistical Reasoning, Thinking and Literacy research forums, three essential principles of informal inference are:

  1. generalizations (including predictions, parameter estimates, and conclusions) that go beyond describing the given data;
  2. the use of data as evidence for those generalizations; and
  3. conclusions that express a degree of uncertainty, whether or not quantified, accounting for the variability or uncertainty that is unavoidable when generalizing beyond the immediate data to a population or a process. [5] [6]
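The three characteristics above can be made concrete with a small simulation. The following sketch uses hypothetical survey data (not drawn from the cited sources) and bootstrap resampling to state a generalization, ground it in the data as evidence, and express the conclusion with uncertainty:

```python
import random

random.seed(1)

# Hypothetical sample: 1 = "yes" responses in a survey of 40 people.
sample = [1] * 26 + [0] * 14

# (1) A generalization beyond the data: estimate the population proportion.
estimate = sum(sample) / len(sample)

# (2) Use the data as evidence: resample the sample many times to see how
# much the estimate could vary from sample to sample (a bootstrap).
resampled = []
for _ in range(2000):
    boot = [random.choice(sample) for _ in sample]
    resampled.append(sum(boot) / len(boot))

resampled.sort()
low, high = resampled[50], resampled[-51]  # middle ~95% of resampled estimates

# (3) Express the conclusion with uncertainty rather than as a fixed fact.
print(f"Roughly {estimate:.0%} of the population, plausibly "
      f"between {low:.0%} and {high:.0%}.")
```

The bootstrap here stands in for the learner's informal sense of sample-to-sample variability; the point is the shape of the conclusion, not the particular resampling method.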

Core Statistical Ideas

Informal inferential reasoning involves a number of related statistical ideas. [3]

Bakker and Derry (2011) argue for using inferentialism as a philosophical foundation to develop informal inferential reasoning, and thereby to address three major challenges in statistics education: (1) avoiding students' inert knowledge (being unable to apply what they have learned to new problems), (2) avoiding atomistic approaches to teaching statistics, and (3) sequencing topics to create coherence in the curriculum from a student's perspective. [8]

Tasks that Involve Informal Inferential Reasoning

Zieffler et al. (2008) suggest three types of tasks that have been used in studies of students' informal inferential reasoning and its development.

  1. Estimate and draw a graph of a population based on a sample
  2. Compare two or more samples of data to infer whether there is a real difference between the populations from which they were sampled
  3. Judge which of two competing models or statements is more likely to be true. [2]
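As a sketch of the second task type, the following hypothetical example (invented scores, not data from the cited studies) uses label shuffling, a simulation approach common in informal-inference teaching, to judge whether an observed difference between two samples could plausibly be chance alone:

```python
import random

random.seed(2)

# Hypothetical scores from two groups taught with different methods.
group_a = [72, 85, 78, 90, 66, 81, 77, 88]
group_b = [65, 70, 74, 68, 80, 62, 71, 69]

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(group_a) - mean(group_b)

# Shuffle the group labels many times: if random relabelling often produces
# a difference as large as the observed one, the data are weak evidence of
# a real difference between the underlying populations.
pooled = group_a + group_b
trials = 5000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    if mean(pooled[:8]) - mean(pooled[8:]) >= observed:
        count += 1

print(f"Observed difference: {observed:.2f}")
print(f"Shuffled differences at least this large: {count} of {trials}")
```

The learner's task is the informal judgment at the end: whether the observed difference looks unusual against the shuffled ones, without invoking a formal test statistic or p-value.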

Tasks that involve "growing samples" [9] [7] are also fruitful for developing informal inferential reasoning. [10]
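A minimal sketch of a growing-samples activity (the success probability 0.6 is an arbitrary choice for illustration, not a value from the cited work) lets students watch an estimate settle down as the sample grows:

```python
import random

random.seed(3)

def draw(n):
    # Simulated process: each draw succeeds with probability 0.6.
    return [1 if random.random() < 0.6 else 0 for _ in range(n)]

# Draw increasingly large samples from the same process and compare estimates.
for n in (10, 100, 1000, 10000):
    sample = draw(n)
    print(f"n = {n:>5}: proportion = {sum(sample) / len(sample):.3f}")
```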

References

  1. Garfield, J. B., & Ben-Zvi, D. (2008). Learning to reason about statistical inference. In Developing students' statistical reasoning: Connecting research and teaching (pp. 261–288). New York, NY: Springer.
  2. Zieffler, A., Garfield, J., delMas, R., & Reading, C. (2008). A framework to support research on informal inferential reasoning. Statistics Education Research Journal, 7(2), 40–58. [Available online at http://www.stat.auckland.ac.nz/~iase/serj/SERJ7(2)_Zieffler.pdf]
  3. Rubin, A., Hammerman, J. K., & Konold, C. (2006). Exploring informal inference with interactive visualization software. In A. Rossman & B. Chance (Eds.), Proceedings of the Seventh International Conference on Teaching Statistics. Salvador, Bahia, Brazil: International Association for Statistical Education.
  4. Wild, C. J., Pfannkuch, M., Regan, M., & Horton, N. J. (2011). Towards more accessible conceptions of statistical inference. Journal of the Royal Statistical Society, Series A (Statistics in Society), 174(2), 247–295. [Available online at http://onlinelibrary.wiley.com/doi/10.1111/j.1467-985X.2010.00678.x/full]
  5. Makar, K., & Rubin, A. (2009). A framework for thinking about informal statistical inference. Statistics Education Research Journal, 8(1), 82–105. [Available online at http://iase-web.org/documents/SERJ/SERJ8(1)_Makar_Rubin.pdf]
  6. Wild, C. J., Pfannkuch, M., Regan, M., & Horton, N. J. (2010). Inferential reasoning: Learning to "make a call" in theory. In C. Reading (Ed.), Proceedings of the Eighth International Conference on Teaching Statistics. The Hague: International Statistical Institute. [Available online at http://www.stat.auckland.ac.nz/~iase/publications/icots8/ICOTS8_8B1_WILD.pdf]
  7. Konold, C., & Pollatsek, A. (2002). Data analysis as the search for signals in noisy processes. Journal for Research in Mathematics Education, 33(4), 259–289.
  8. Bakker, A., & Derry, J. (2011). Lessons from inferentialism for statistics education. Mathematical Thinking and Learning, 13(1–2), 5–26.
  9. Bakker, A. (2004). Reasoning about shape as a pattern in variability. Statistics Education Research Journal, 3(2), 64–83. [Available online at http://iase-web.org/documents/SERJ/SERJ3(2)_Bakker.pdf]
  10. Ben-Zvi, D. (2006, July). Scaffolding students' informal inference and argumentation. In Proceedings of the Seventh International Conference on Teaching Statistics. [Available online at http://iase-web.org/documents/papers/icots7/2D1_BENZ.pdf]