Tolerance analysis

Tolerance analysis is the general term for activities related to the study of accumulated variation in mechanical parts and assemblies. Its methods may also be applied to other systems subject to accumulated variation, such as electrical and electro-mechanical systems. Engineers analyze tolerances in order to evaluate geometric dimensioning and tolerancing (GD&T). Methods include 2D tolerance stacks, 3D Monte Carlo simulations, and datum conversions.

Tolerance stackups or tolerance stacks describe the problem-solving process in mechanical engineering of calculating the effects of the accumulated variation that is allowed by specified dimensions and tolerances. Typically these dimensions and tolerances are specified on an engineering drawing. Arithmetic tolerance stackups use the worst-case maximum or minimum values of dimensions and tolerances to calculate the maximum and minimum distance (clearance or interference) between two features or parts. Statistical tolerance stackups evaluate the maximum and minimum values based on the absolute arithmetic calculation combined with some method for establishing the likelihood of obtaining those extreme values, such as Root Sum Square (RSS) or Monte Carlo methods.
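
As a minimal sketch of an arithmetic (worst-case) stackup, consider a hypothetical one-dimensional stack in which a gap is formed between a housing and three parts stacked inside it; the dimensions, tolerances, and code below are illustrative only and are not drawn from any particular design.

```python
# Worst-case (arithmetic) 1D tolerance stackup: hypothetical housing and three stacked parts.
# The gap is the housing length minus the sum of the part thicknesses.
# All values are invented for illustration.

housing = (30.00, 0.10)                         # (nominal, +/- tolerance) in mm
parts = [(9.95, 0.05), (9.95, 0.05), (9.95, 0.05)]

nominal_gap = housing[0] - sum(nom for nom, tol in parts)

# In the worst case, every contributor sits at the limit that shrinks (or grows) the gap,
# so the individual tolerances simply add.
worst_case_tol = housing[1] + sum(tol for nom, tol in parts)

min_gap = nominal_gap - worst_case_tol
max_gap = nominal_gap + worst_case_tol

print(f"nominal gap = {nominal_gap:.3f} mm")
print(f"worst-case gap range = {min_gap:.3f} .. {max_gap:.3f} mm")
```

A negative minimum gap in such a calculation indicates that, in the worst case, the parts could interfere.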

Modeling

In performing a tolerance analysis, there are two fundamentally different analysis tools for predicting stackup variation: worst-case analysis and statistical analysis.

Worst-case

Worst-case tolerance analysis is the traditional type of tolerance stackup calculation. The individual variables are placed at their tolerance limits in order to make the measurement as large or as small as possible. The worst-case model does not consider the distribution of the individual variables, but rather assumes only that those variables do not exceed their respective specified limits. This model predicts the maximum expected variation of the measurement. Designing to worst-case tolerance requirements guarantees that 100 percent of the parts will assemble and function properly, regardless of the actual component variation. The major drawback is that the worst-case model often requires very tight individual component tolerances. The obvious result is expensive manufacturing and inspection processes and/or high scrap rates. Worst-case tolerancing is often required by the customer for critical mechanical interfaces and spare part replacement interfaces. When worst-case tolerancing is not a contract requirement, properly applied statistical tolerancing can ensure acceptable assembly yields with increased component tolerances and lower fabrication costs.

Statistical variation

The statistical variation analysis model takes advantage of the principles of statistics to relax the component tolerances without sacrificing quality. Each component's variation is modeled as a statistical distribution and these distributions are summed to predict the distribution of the assembly measurement. Thus, statistical variation analysis predicts a distribution that describes the assembly variation, not the extreme values of that variation. This analysis model provides increased design flexibility by allowing the designer to design to any quality level, not just 100 percent.
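
As a minimal sketch of the statistical approach, the Root Sum Square (RSS) calculation below reuses the same hypothetical stack; the assumption that each dimension is normally distributed with its ± tolerance at three standard deviations is an illustrative convention, not a requirement of the method.

```python
# RSS (root sum square) statistical stackup for the same hypothetical gap.
# Assumption for illustration: each dimension is normally distributed and its
# +/- tolerance corresponds to +/- 3 standard deviations.
import math
from statistics import NormalDist

housing = (30.00, 0.10)
parts = [(9.95, 0.05), (9.95, 0.05), (9.95, 0.05)]

nominal_gap = housing[0] - sum(nom for nom, _ in parts)

# Standard deviations of the contributors; for a linear stack the variances add.
sigmas = [housing[1] / 3.0] + [tol / 3.0 for _, tol in parts]
sigma_gap = math.sqrt(sum(s**2 for s in sigmas))

# Predicted yield against a hypothetical requirement that the gap stay positive.
yield_fraction = 1.0 - NormalDist(nominal_gap, sigma_gap).cdf(0.0)

print(f"RSS gap: {nominal_gap:.3f} +/- {3 * sigma_gap:.3f} mm (3-sigma)")
print(f"predicted fraction of assemblies with gap > 0: {yield_fraction:.4%}")
```

Compared with the worst-case range computed earlier, the 3-sigma RSS spread is noticeably narrower, which is what allows component tolerances to be relaxed for a given assembly requirement.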

There are two chief methods for performing the statistical analysis. In one, the expected distributions are modified in accordance with the relevant geometric multipliers within the tolerance limits and then combined using mathematical operations to provide a composite of the distributions. The geometric multipliers are generated by making small deltas to the nominal dimensions. The immediate value of this method is that the output is smooth, but it fails to account for geometric misalignment allowed by the tolerances; if a size dimension is placed between two parallel surfaces, it is assumed the surfaces will remain parallel, even though the tolerance does not require this. Because the CAD engine performs the variation sensitivity analysis, there is no output available to drive secondary programs such as stress analysis.
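
The sketch below imitates this sensitivity-based approach under the assumption that the assembly measurement can be written as an explicit function of the contributing dimensions; in practice the CAD model supplies that relationship, and the function, deltas, and values here are hypothetical.

```python
# Sensitivity-based statistical stackup: estimate geometric multipliers
# (sensitivities) by applying small deltas to each nominal dimension, then
# combine the contributor standard deviations using those multipliers.
# The measurement function and all numbers are illustrative.
import math

def gap(housing_len, t1, t2, t3):
    """Hypothetical assembly measurement: gap left in the housing."""
    return housing_len - (t1 + t2 + t3)

nominals = [30.00, 9.95, 9.95, 9.95]
sigmas = [0.10 / 3.0, 0.05 / 3.0, 0.05 / 3.0, 0.05 / 3.0]  # +/- tol taken as 3 sigma
delta = 1e-4  # small perturbation used to estimate each sensitivity

base = gap(*nominals)
sensitivities = []
for i in range(len(nominals)):
    perturbed = list(nominals)
    perturbed[i] += delta
    sensitivities.append((gap(*perturbed) - base) / delta)

# First-order combination: the gap variance is the sum of (sensitivity * sigma)^2.
sigma_gap = math.sqrt(sum((s * sig) ** 2 for s, sig in zip(sensitivities, sigmas)))

print("sensitivities:", [round(s, 3) for s in sensitivities])
print(f"predicted gap: {base:.3f} +/- {3 * sigma_gap:.3f} mm (3-sigma)")
```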

In the other, variation is simulated by allowing random changes to the geometry, constrained by the expected distributions within the allowed tolerances; the resulting parts are assembled, and measurements at critical places are recorded as if in an actual manufacturing environment. The collected data are analyzed to find a best fit to a known distribution, from which a mean and standard deviation are derived. The immediate value of this method is that the output represents what is acceptable, even when that comes from imperfect geometry, and, because the analysis uses recorded data, actual factory inspection data can be included to see the effect of proposed changes on real data. In addition, because the engine for the analysis performs the variation internally rather than relying on CAD regeneration, it is possible to link the variation engine's output to another program. For example, a rectangular bar may vary in width and thickness; the variation engine could output those numbers to a stress program, which passes back the peak stress as a result, and the dimensional variation can then be used to determine likely stress variations. The disadvantage is that each run is unique, so the output distribution and mean will vary from analysis to analysis, just as they would from a factory.
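
The sketch below imitates this simulation approach for the same hypothetical stack; the choice of normal distributions, the ±3σ reading of the tolerances, the fixed random seed, and the sample size are all illustrative assumptions.

```python
# Monte Carlo tolerance simulation: draw each dimension from an assumed
# distribution, "assemble" the parts, record the measurement, and summarize
# the results. All distributions and values are illustrative.
import random
import statistics

random.seed(0)  # repeatable runs for the example; a real study would vary this

def sample_dim(nominal, tol):
    """Assume a normal distribution with the +/- tolerance at 3 sigma."""
    return random.gauss(nominal, tol / 3.0)

N = 100_000
gaps = []
for _ in range(N):
    housing = sample_dim(30.00, 0.10)
    thicknesses = [sample_dim(9.95, 0.05) for _ in range(3)]
    gaps.append(housing - sum(thicknesses))

mean_gap = statistics.fmean(gaps)
std_gap = statistics.stdev(gaps)
fraction_interference = sum(g <= 0 for g in gaps) / N

print(f"simulated gap: mean = {mean_gap:.4f} mm, std dev = {std_gap:.4f} mm")
print(f"fraction of assemblies with interference (gap <= 0): {fraction_interference:.4%}")
```

Because each unseeded run draws different samples, the reported mean and standard deviation vary slightly from run to run, mirroring the run-to-run variation described above.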

While no official engineering standard covers the process or format of tolerance analysis and stackups, these are essential components of good product design. Tolerance stackups should be used as part of the mechanical design process, both as a predictive and a problem-solving tool. The methods used to conduct a tolerance stackup depend somewhat upon the engineering dimensioning and tolerancing standards that are referenced in the engineering documentation, such as American Society of Mechanical Engineers (ASME) Y14.5, ASME Y14.41, or the relevant ISO dimensioning and tolerancing standards. Understanding the tolerances, concepts and boundaries created by these standards is vital to performing accurate calculations.

Tolerance stackups serve engineers by quantifying the dimensional relationships within an assembly and the accumulated variation between the features of interest.

Concept of the tolerance vector loop

The starting point for a tolerance loop is typically one side of an intended gap, after pushing the various parts in the assembly to one side or the other of their loose range of motion. Vector loops define the assembly constraints that locate the parts of the assembly relative to each other. The vectors represent the dimensions that contribute to tolerance stackup in the assembly. The vectors are joined tip-to-tail, forming a chain that passes through each part in the assembly in succession. A vector loop must obey certain modeling rules as it passes through a part. It must:

  1. enter through a joint,
  2. follow the datum path to the Datum Reference Frame (DRF),
  3. follow a second datum path leading to another joint, and
  4. exit to the next adjacent part in the assembly.

Additional modeling rules for vector loops include:

  1. Loops must pass through every part and every joint in the assembly.
  2. A single vector loop may not pass through the same part or the same joint twice, but it may start and end in the same part.
  3. If a vector loop includes exactly the same dimension twice, in opposite directions, the dimension is redundant and must be omitted.
  4. There must be enough loops to solve for all of the kinematic variables (joint degrees of freedom); one loop is needed for each set of three variables.

The above rules will vary depending on whether a 1D, 2D, or 3D tolerance stackup method is used.
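
As a minimal sketch of the tip-to-tail idea, the loop below is assembled from hypothetical (length, angle) vectors; verifying that the loop closes at nominal, and observing the closure gap when one dimension is perturbed, is the essence of a 2D vector-loop model.

```python
# 2D vector loop sketch: vectors joined tip-to-tail through a hypothetical
# assembly. At nominal the loop should close (sum to zero); changing any
# dimension opens a closure gap that the stackup must account for.
# All lengths and angles are invented for illustration.
import math

def loop_residual(vectors):
    """Sum (length, angle_deg) vectors tip-to-tail and return the closure gap (x, y)."""
    x = sum(l * math.cos(math.radians(a)) for l, a in vectors)
    y = sum(l * math.sin(math.radians(a)) for l, a in vectors)
    return x, y

# Hypothetical rectangular loop: across the base, up one side, back across, down.
nominal_loop = [
    (50.0,   0.0),   # along part A
    (20.0,  90.0),   # up through part B
    (50.0, 180.0),   # back along part C
    (20.0, 270.0),   # down through the joint back to the start
]

print("nominal residual:", loop_residual(nominal_loop))  # ~ (0, 0): the loop closes

# Perturb one dimension (part C is 0.2 mm short) and observe the closure gap.
perturbed = list(nominal_loop)
perturbed[2] = (49.8, 180.0)
print("perturbed residual:", loop_residual(perturbed))   # ~ (0.2, 0): the gap to resolve
```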

Concerns with tolerance stackups

A safety factor is often included in designs because of concerns about uncertainty in the assumed tolerances and statistical distributions, and about variation in the manufacturing processes themselves.
