Author | Bryan Caplan |
---|---|
Audio read by | Allan Robertson |
Language | English |
Publisher | Princeton University Press |
Publication date | January 30, 2018 |
Publication place | United States |
Pages | 417 (hardcover) |
ISBN | 978-0691174655 |
The Case Against Education: Why the Education System Is a Waste of Time and Money [1] is a book written by libertarian economist Bryan Caplan and published in 2018 by Princeton University Press. Drawing on the economic concept of job market signaling and research in educational psychology, the book argues that much of higher education is very inefficient and has only a small effect in improving human capital, contrary to the conventional consensus in labor economics.
Caplan argues that the primary function of education is not to enhance students' skills but to certify their intelligence, conscientiousness, and conformity—attributes that are valued by employers. He ultimately estimates that approximately 80% of individuals' return to education is the result of signaling, with the remainder due to human capital accumulation.
The foundation of the drive to increase educational attainment across the board is the human capital model of education, which began with the research of Gary Becker. [2] The model suggests that increasing educational attainment causes increased prosperity by endowing students with increased skills. As a consequence, subsidies to education are seen as a positive investment that increases economic growth and creates spillover effects by improving civic engagement, happiness, health, etc.
Caplan argues against the model on the basis of several contradictions, though he does not dispute that higher educational attainment is strongly correlated with higher individual income. He notes that most adults retain little of what they were taught in school beyond what is relevant to their careers, with English and math as partial exceptions, and that even adult literacy and numeracy are often inadequate. He also analyzes the sheepskin effect: the largest increases in income from additional schooling accrue upon completing an academic degree, while those who drop out of college gain little despite usually having completed some courses. He finally criticizes educational inflation, the rising educational requirements for occupations whose tasks have not changed, as evidence that the payoff to educational attainment is largely positional and not nearly as beneficial for society as portrayed.
The simple human capital model tends to assume that knowledge is retained indefinitely, while a ubiquitous theme in educational interventions is that "fadeout" (i.e., forgetting) reliably occurs. [3] To take a simple example, we may compute the present value of a marginal fact that increases a person's productivity by \(X\) per year as:

\(PV = \frac{X}{r}\)

where \(r\) is the discount rate used to compute the present value. If \(X\) is $100 and \(r\) is 5%, then the present value of learning \(X\) is $2,000. But this is at odds with the concept of fadeout. To correct for this, assume that the time for which \(X\) is retained follows an exponential distribution with rate \(\lambda\), with the corresponding survival function \(S(t) = e^{-\lambda t}\). Then the present value of learning \(X\), accounting for fadeout, is given by:

\(PV = \int_0^\infty X \, e^{-\lambda t} e^{-rt} \, dt = \frac{X}{r + \lambda}\)

Since the expected value of an exponential distribution is \(1/\lambda\), we may tune this parameter based on assumptions about how long \(X\) is retained. Below is a table showing the present value with \(X = \$100\) and \(r = 5\%\) for various expected retention times of the fact:
Expected retention | 3 Months | 6 Months | 1 Year | 2 Years | 3 Years | 5 Years | 10 Years |
---|---|---|---|---|---|---|---|
Present value | $24.69 | $48.78 | $95.24 | $181.82 | $260.87 | $400.00 | $666.67 |
Regardless of the retention time assumption, the present value of learning is significantly reduced.
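The fadeout-adjusted present values above follow directly from the formula \(PV = X/(r+\lambda)\), where \(1/\lambda\) is the expected retention time in years. A minimal sketch reproducing the table (using the chapter's assumed values \(X = \$100\) and \(r = 5\%\)):

```python
# Present value of a marginal fact worth X = $100/year at discount rate r = 5%,
# with exponential fadeout: PV = X / (r + lam), where 1/lam is the expected
# retention time of the fact in years.
X = 100.0
r = 0.05

def pv_with_fadeout(retention_years):
    lam = 1.0 / retention_years  # exponential rate implied by expected retention
    return X / (r + lam)

for years in (0.25, 0.5, 1, 2, 3, 5, 10):
    print(f"{years:>5} yr: ${pv_with_fadeout(years):,.2f}")
```

Without fadeout (\(\lambda = 0\)) the same formula collapses back to \(X/r = \$2{,}000\), which makes the size of the correction easy to see.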
The main alternative to the human capital model of education is the signaling model of education. The idea of job market signaling through educational attainment goes back to the work of Michael Spence. [4] The model Spence developed suggested that, even if a student did not gain any skills through an educational program, the program can still be useful so long as the signal from completing the program is correlated with traits that predict job performance.
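Spence's insight can be illustrated with a small numerical sketch (the numbers here are hypothetical, chosen for clarity, and do not come from the book): education adds nothing to productivity, yet because it is cheaper for high-ability workers to acquire, a threshold level of schooling can still separate the two types in equilibrium.

```python
# A minimal sketch of a Spence-style separating equilibrium with hypothetical
# numbers: two worker types with productivities 1 and 2; acquiring e units of
# education costs e / ability. Education adds no productivity, but employers
# pay the high wage only to workers with at least e_star units of education.
a_low, a_high = 1.0, 2.0          # productivities (= wages employers offer)
theta_low, theta_high = 1.0, 2.0  # abilities (education is cheaper for the able)

def payoff(wage, e, theta):
    return wage - e / theta

def is_separating(e_star):
    # Low types prefer no education over mimicking the high types...
    low_honest = payoff(a_low, 0.0, theta_low) >= payoff(a_high, e_star, theta_low)
    # ...and high types prefer signaling over pooling at the low wage.
    high_honest = payoff(a_high, e_star, theta_high) >= payoff(a_low, 0.0, theta_high)
    return low_honest and high_honest

print(is_separating(1.5))  # True: the threshold separates the types
print(is_separating(0.5))  # False: low types would find mimicry worthwhile
```

In this toy setup any threshold between 1 and 2 supports separation; the signal is informative purely because of the cost difference, which is the core of Caplan's signaling account.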
Throughout the book, Caplan details a series of observations that suggest a significant role for signaling in the return to education.
Given the above signs of signaling, Caplan argues in ch. 5–6 [1] that the selfish return to education is greater than the social return to education, suggesting that greater educational attainment creates a negative externality (p. 198 [1] ). In other words, status is zero-sum; skill is not (p. 229 [1] ).
For many students, Caplan argues, most of the negative social return to pursuing further education comes from incurring student debt and forgoing employment opportunities, particularly for students who are unlikely to complete college (pp. 210–211, ch. 8 [1]). He suggests that these students would be better served by vocational education.
Caplan advocates two major policy responses to the problem of signaling in education:
The first recommendation is that governments should sharply cut education funding; public education spending in the United States across all levels tops $1 trillion annually. [13] The second recommendation is to encourage greater vocational education, because students who are unlikely to succeed in college should develop practical skills to function in the labor market. Caplan argues for an increased emphasis on vocational education similar in nature to the systems in Germany [14] and Switzerland. [15] [16]