# Truncate selection

Truncate selection may refer to truncation selection in breeding and genetics, truncation of samples in statistics, or truncation operations in geometry.

## Related Research Articles

In statistics, sampling bias is a bias in which a sample is collected in such a way that some members of the intended population have a lower sampling probability than others. It results in a biased sample, a non-random sample of a population in which not all individuals, or instances, were equally likely to have been selected. If this is not accounted for, results can be erroneously attributed to the phenomenon under study rather than to the method of sampling.

In geometry, the truncated icosahedron is an Archimedean solid, one of 13 convex isogonal nonprismatic solids whose faces are two or more types of regular polygons.

In geometry, the truncated tetrahedron is an Archimedean solid. It has 4 regular hexagonal faces, 4 equilateral triangle faces, 12 vertices and 18 edges. It can be constructed by truncating all 4 vertices of a regular tetrahedron at one third of the original edge length.

In geometry, the truncated octahedron is an Archimedean solid. It has 14 faces, 36 edges, and 24 vertices. Since each of its faces has point symmetry, the truncated octahedron is a zonohedron. It is also the Goldberg polyhedron GIV(1,1), containing square and hexagonal faces. Like the cube, it can tessellate 3-dimensional space, as a permutohedron.

In geometry, the truncated cube, or truncated hexahedron, is an Archimedean solid. It has 14 regular faces, 36 edges, and 24 vertices.

In geometry, the truncated cuboctahedron is an Archimedean solid, named by Kepler as a truncation of a cuboctahedron. It has 12 square faces, 8 regular hexagonal faces, 6 regular octagonal faces, 48 vertices and 72 edges. Since each of its faces has point symmetry, the truncated cuboctahedron is a zonohedron. The truncated cuboctahedron can tessellate with the octagonal prism.

In geometry, the truncated icosidodecahedron is an Archimedean solid, one of thirteen convex isogonal nonprismatic solids constructed by two or more types of regular polygon faces.

In animal and plant breeding, truncation selection is a standard method in selective breeding for choosing the animals to be bred for the next generation. Animals are ranked by their phenotypic value on some trait, such as milk production, and the top percentage is reproduced. The effects of truncation selection for a continuous trait can be modeled by the standard breeder's equation, using heritability and truncated normal distributions; for a binary trait, it can be modeled easily using the liability threshold model. It is considered an easy and efficient method of breeding.

In geometry, the truncated dodecahedron is an Archimedean solid. It has 12 regular decagonal faces, 20 regular triangular faces, 60 vertices and 90 edges.
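The breeder's equation described above can be sketched numerically. The following is a minimal stdlib-only illustration with hypothetical values: a normally distributed phenotype, an assumed heritability of 0.4, and selection of the top 10% of a simulated population.

```python
import random
import statistics

random.seed(42)

h2 = 0.4             # assumed narrow-sense heritability (hypothetical value)
pop_size = 10_000
top_fraction = 0.10  # breed only the top 10% by phenotype

# phenotypes for a trait such as milk production, drawn from N(100, 10)
phenotypes = [random.gauss(100, 10) for _ in range(pop_size)]

# truncation selection: rank by phenotypic value, keep only the top fraction
ranked = sorted(phenotypes, reverse=True)
selected = ranked[: int(pop_size * top_fraction)]

# breeder's equation: response R = h^2 * S, where the selection
# differential S is mean(selected) - mean(whole population)
S = statistics.mean(selected) - statistics.mean(phenotypes)
R = h2 * S

print(f"selection differential S = {S:.2f}")
print(f"predicted response     R = {R:.2f}")
```

For a normal trait, the selection intensity when keeping the top 10% is about 1.755 standard deviations, so S should come out near 17.5 here and the predicted per-generation response near 7.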

Selection is the stage of a genetic algorithm in which individual genomes are chosen from a population for later breeding.

In geometry, the small stellated dodecahedron is a Kepler-Poinsot polyhedron, named by Arthur Cayley, and with Schläfli symbol {5/2,5}. It is one of four nonconvex regular polyhedra. It is composed of 12 pentagrammic faces, with five pentagrams meeting at each vertex.
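Truncation selection is one way to implement the selection stage of a genetic algorithm. A minimal sketch (the function name and toy fitness are illustrative, not from any particular library):

```python
import random

def truncation_select(population, fitness, frac=0.3, rng=random):
    """Truncation selection: rank the population by fitness, keep only
    the top `frac`, and draw a parent uniformly from the survivors."""
    k = max(1, int(len(population) * frac))
    survivors = sorted(population, key=fitness, reverse=True)[:k]
    return rng.choice(survivors)

# toy usage: genomes are integers and fitness is the value itself,
# so the survivors of frac=0.3 on range(10) are 7, 8, and 9
parent = truncation_select(list(range(10)), fitness=lambda g: g, frac=0.3)
print(parent)
```

In a full genetic algorithm this call would sit inside the generation loop, paired with crossover and mutation of the chosen parents.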

In geometry, a truncated tesseract is a uniform 4-polytope formed as the truncation of the regular tesseract.

In geometry, a truncation is an operation in any dimension that cuts polytope vertices, creating a new facet in place of each vertex. The term originates from Kepler's names for the Archimedean solids.

Truncated regression models are a class of models in which the sample has been truncated for certain ranges of the dependent variable. That means observations with values of the dependent variable below or above certain thresholds are systematically excluded from the sample. Whole observations are therefore missing, so that neither the dependent nor the independent variables are known. This is in contrast to censored regression models, where only the value of the dependent variable is clustered at a lower threshold, an upper threshold, or both, while the values of the independent variables remain available.

In probability and statistics, the truncated normal distribution is the probability distribution derived from that of a normally distributed random variable by bounding the random variable from below, from above, or both. The truncated normal distribution has wide applications in statistics and econometrics. For example, it is used to model the probabilities of the binary outcomes in the probit model and to model censored data in the Tobit model.
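A small stdlib-only sketch of the truncated normal shows why ignoring truncation biases naive estimates. The rejection sampler below is an assumption of this illustration (adequate when the truncation interval is not far out in the tails, not an efficient general method): a standard normal truncated from below at 0 has mean sqrt(2/pi) ≈ 0.798, not 0.

```python
import math
import random

random.seed(1)

def sample_truncated_normal(mu, sigma, lower, upper):
    """Rejection sampler for a normal distribution truncated to [lower, upper].
    Simple, but acceptable only when the interval keeps decent probability mass."""
    while True:
        x = random.gauss(mu, sigma)
        if lower <= x <= upper:
            return x

# N(0, 1) truncated from below at 0: every draw is nonnegative, and the
# sample mean estimates sqrt(2/pi) ~ 0.798 rather than the untruncated mean 0
draws = [sample_truncated_normal(0.0, 1.0, 0.0, math.inf) for _ in range(20_000)]
mean = sum(draws) / len(draws)
print(f"sample mean = {mean:.3f}  (theory: {math.sqrt(2 / math.pi):.3f})")
```

Truncated regression estimators exist precisely to undo this kind of shift: fitting ordinary least squares to such a sample would attribute the truncation-induced offset to the model itself.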

The Heckman correction is a statistical technique to correct bias from non-randomly selected samples or otherwise incidentally truncated dependent variables, a pervasive issue in quantitative social sciences when using observational data. Conceptually, this is achieved by explicitly modelling the individual sampling probability of each observation together with the conditional expectation of the dependent variable. The resulting likelihood function is mathematically similar to the Tobit model for censored dependent variables, a connection first drawn by James Heckman in 1976. Heckman also developed a two-step control function approach to estimate this model, which reduced the computional burden of having to estimate both equations jointly, albeit at the cost of inefficiency. Heckman received the Nobel Memorial Prize in Economic Sciences in 2000 for his work in this field.

In statistics, truncation results in values that are limited above or below, resulting in a truncated sample. A random variable y is said to be truncated from below if, for some threshold value c, the exact value of y is known for all cases y > c, but unknown for all cases y ≤ c. Similarly, truncation from above means the exact value of y is known in cases where y < c, but unknown when y ≥ c.
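The distinction between truncation and censoring can be made concrete in a few lines. The data and threshold below are hypothetical values chosen for illustration:

```python
# illustrative data and threshold (hypothetical values)
data = [3.1, 7.4, 1.2, 9.8, 4.5, 0.6, 6.3]
c = 4.0

# truncation from below: cases with y <= c are missing from the sample entirely
truncated = [y for y in data if y > c]

# censoring from below: cases with y <= c are recorded as c itself,
# so the observation survives but its exact value is lost
censored = [max(y, c) for y in data]

print(truncated)  # [7.4, 9.8, 4.5, 6.3]
print(censored)   # [4.0, 7.4, 4.0, 9.8, 4.5, 4.0, 6.3]
```

The truncated sample is shorter than the original; the censored sample keeps every observation but piles the sub-threshold ones up at c, which is exactly the situation censored regression models are built for.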

A limited dependent variable is a variable whose range of possible values is "restricted in some important way." In econometrics, the term is often used when estimation of the relationship between the limited dependent variable of interest and other variables requires methods that take this restriction into account. For example, this may arise when the variable of interest is constrained to lie between zero and one, as in the case of a probability, or is constrained to be positive, as in the case of wages or hours worked.

In probability theory, the Mills ratio of a continuous random variable X is the function m(x) = (1 − F(x)) / f(x), where F is the cumulative distribution function and f is the probability density function of X.
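For the standard normal distribution, the Mills ratio is the survival function 1 − F(x) divided by the density f(x); its reciprocal-like companion, the inverse Mills ratio f(x)/F(x), is the correction term in the second stage of the Heckman procedure. A stdlib-only sketch (function names are illustrative):

```python
import math

def std_normal_pdf(x):
    """Density of the standard normal distribution."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def std_normal_cdf(x):
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def mills_ratio(x):
    """Mills ratio of the standard normal: survival function over density."""
    return (1 - std_normal_cdf(x)) / std_normal_pdf(x)

def inverse_mills_ratio(x):
    """lambda(x) = f(x) / F(x); the regressor added in Heckman's second stage."""
    return std_normal_pdf(x) / std_normal_cdf(x)

print(f"m(0)      = {mills_ratio(0):.4f}")          # 0.5 * sqrt(2*pi) ~ 1.2533
print(f"lambda(0) = {inverse_mills_ratio(0):.4f}")  # sqrt(2/pi) ~ 0.7979
```

At x = 0 both values follow directly from f(0) = 1/sqrt(2*pi) and F(0) = 1/2, which makes the functions easy to spot-check.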

Truncated Newton methods, also known as Hessian-free optimization, are a family of optimization algorithms designed for optimizing non-linear functions with large numbers of independent variables. A truncated Newton method consists of repeated application of an iterative optimization algorithm to approximately solve Newton's equations, to determine an update to the function's parameters. The inner solver is truncated, i.e., run for only a limited number of iterations. It follows that, for truncated Newton methods to work, the inner solver needs to produce a good approximation in a finite number of iterations; conjugate gradient has been suggested and evaluated as a candidate inner loop. Another prerequisite is good preconditioning for the inner algorithm.
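The structure described above, an outer Newton loop whose inner linear solve is a conjugate gradient run truncated after a fixed number of iterations, with Hessian-vector products obtained "Hessian-free" by finite-differencing the gradient, can be sketched on a toy convex quadratic. Everything here (the objective, iteration counts, tolerances) is an illustrative assumption, not a production solver:

```python
import math

def grad_quad(x):
    """Gradient of f(x) = 0.5 x^T A x - 1^T x, where A is the tridiagonal
    discrete Laplacian (2 on the diagonal, -1 off the diagonal). Toy objective."""
    n = len(x)
    g = []
    for i in range(n):
        gi = 2.0 * x[i] - 1.0
        if i > 0:
            gi -= x[i - 1]
        if i < n - 1:
            gi -= x[i + 1]
        g.append(gi)
    return g

def hess_vec(grad_f, x, v, eps=1e-6):
    """Hessian-free Hessian-vector product via a finite difference of the gradient."""
    g0 = grad_f(x)
    g1 = grad_f([xi + eps * vi for xi, vi in zip(x, v)])
    return [(a - b) / eps for a, b in zip(g1, g0)]

def truncated_cg(hv, b, max_iter):
    """Conjugate gradient for H d = b, truncated after max_iter iterations."""
    d = [0.0] * len(b)
    r = list(b)
    p = list(b)
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        hp = hv(p)
        curv = sum(pi * hi for pi, hi in zip(p, hp))
        if curv <= 0:
            break  # non-positive curvature: stop and keep the current iterate
        alpha = rs / curv
        d = [di + alpha * pi for di, pi in zip(d, p)]
        r = [ri - alpha * hi for ri, hi in zip(r, hp)]
        rs_new = sum(ri * ri for ri in r)
        if math.sqrt(rs_new) < 1e-12:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return d

def truncated_newton(grad_f, x0, outer=100, inner=10):
    """Outer Newton loop; the inner CG solve of H d = -g is truncated."""
    x = list(x0)
    for _ in range(outer):
        g = grad_f(x)
        if max(abs(gi) for gi in g) < 1e-8:
            break
        step = truncated_cg(lambda v: hess_vec(grad_f, x, v),
                            [-gi for gi in g], inner)
        x = [xi + si for xi, si in zip(x, step)]
    return x

x = truncated_newton(grad_quad, [0.0] * 20)
# exact minimizer of this objective: x_i = (i + 1)(20 - i) / 2 for 0-based i,
# e.g. x[0] = 10 and x[9] = 55
print(round(x[0], 3), round(x[9], 3))
```

The inner loop here runs at most 10 CG iterations in a 20-dimensional problem, so each Newton system is solved only approximately, which is the defining trade-off of the method; a practical implementation would add a line search or trust region and the preconditioning the text mentions.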