Structural reliability

[Figure: failure occurs when the load (S) is larger than the resistance (R)]

Structural reliability is the application of reliability engineering theory to buildings and, more generally, to structural analysis. [1] [2] Reliability is also used as a probabilistic measure of structural safety. The reliability of a structure is defined as the complement of its probability of failure; failure occurs when the total applied load exceeds the total resistance of the structure. Structural reliability has become established as a design philosophy in the twenty-first century, and it may replace traditional deterministic approaches to design [3] and maintenance. [2]

Theory

In structural reliability studies, both loads and resistances are modeled as random variables. Using this approach, the probability of failure of a structure is calculated. When the load and the resistance are explicit and statistically independent, the probability of failure can be formulated as follows. [1] [2]

$$P_f = P(R \le S) = \int_{-\infty}^{\infty} F_R(s)\, f_S(s)\, ds \qquad (1)$$

where $P_f$ is the probability of failure, $F_R$ is the cumulative distribution function of the resistance $R$, and $f_S$ is the probability density function of the load $S$.
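When the distributions are known but the integral has no convenient closed form, equation (1) can be evaluated numerically. A minimal Python sketch using only the standard library, assuming illustrative normal distributions for resistance and load (all parameter values below are made up for demonstration):

```python
import math

# Standard normal CDF and PDF built from math.erf / math.exp (stdlib only)
def norm_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def norm_pdf(x, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# Assumed illustrative parameters: R ~ N(50, 5), S ~ N(30, 8)
mu_r, sd_r = 50.0, 5.0
mu_s, sd_s = 30.0, 8.0

# Trapezoidal integration of F_R(s) * f_S(s) over a wide interval
lo, hi, n = mu_s - 10.0 * sd_s, mu_s + 10.0 * sd_s, 20000
h = (hi - lo) / n
total = 0.0
for i in range(n + 1):
    s = lo + i * h
    w = 0.5 if i in (0, n) else 1.0  # trapezoid end-point weights
    total += w * norm_cdf(s, mu_r, sd_r) * norm_pdf(s, mu_s, sd_s)
pf = total * h
print(f"P_f ~= {pf:.5f}")
```

For this all-normal example the numerical result can be checked against the closed-form solution of equation (3).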

However, in most cases the distributions of loads and resistances are not independent, and the probability of failure is defined by the following, more general formula.

$$P_f = \int_{G(\mathbf{X}) \le 0} f_{\mathbf{X}}(\mathbf{x})\, d\mathbf{x} \qquad (2)$$

where $\mathbf{X}$ is the vector of basic random variables, $f_{\mathbf{X}}$ is their joint probability density function, and $G(\mathbf{X})$ is the limit state function. Depending on the dimension of the problem, the failure domain $G(\mathbf{X}) \le 0$ may be bounded by a line, a surface, or a hypersurface, and the integral is taken over that domain.

Solution approaches

Analytical solutions

In some cases, when load and resistance are explicitly expressed (as in equation (1) above) and both are normally distributed, the integral of equation (1) has a closed-form solution:

$$P_f = \Phi\!\left(-\frac{\mu_R - \mu_S}{\sqrt{\sigma_R^2 + \sigma_S^2}}\right) = \Phi(-\beta) \qquad (3)$$

where $\Phi$ is the standard normal cumulative distribution function, $\mu_R$ and $\mu_S$ are the means, $\sigma_R$ and $\sigma_S$ the standard deviations of the resistance and the load, and $\beta$ is the reliability index.
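The closed-form solution for the normal case can be evaluated directly with the error function from the Python standard library. A small sketch; the parameter values are illustrative assumptions, not from the source:

```python
import math

def failure_probability_normal(mu_r, sd_r, mu_s, sd_s):
    """Closed-form P_f for independent, normally distributed R and S."""
    # Reliability index: distance (in standard deviations) of the mean
    # safety margin R - S from the failure boundary R - S = 0
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    # Phi(-beta), the standard normal CDF, via math.erf
    pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
    return pf, beta

# Assumed example: R ~ N(50, 5), S ~ N(30, 8)
pf, beta = failure_probability_normal(50.0, 5.0, 30.0, 8.0)
print(f"beta = {beta:.3f}, P_f = {pf:.5f}")
```

The reliability index beta is the quantity most design codes actually target; the probability of failure follows from it.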

Simulation

In most cases, load and resistance are not normally distributed, so the integrals of equations (1) and (2) cannot be solved analytically. Monte Carlo simulation can be used in such cases. [1] [4]
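A crude Monte Carlo estimate of the probability of failure simply counts the fraction of sampled (R, S) pairs for which the load exceeds the resistance. A minimal sketch, assuming an illustrative lognormal resistance and normal load for which no closed form exists (all parameters are made up):

```python
import math
import random

random.seed(0)  # fixed seed so the estimate is reproducible

N = 200_000  # number of Monte Carlo trials
failures = 0
for _ in range(N):
    # Assumed example distributions:
    # resistance R: lognormal, median 50, ~10% coefficient of variation
    # load S: normal, mean 30, standard deviation 8
    r = random.lognormvariate(math.log(50.0), 0.10)
    s = random.normalvariate(30.0, 8.0)
    if r <= s:  # limit state G = R - S <= 0 means failure
        failures += 1

pf = failures / N
print(f"Estimated P_f ~= {pf:.4f}")
```

The standard error of the estimate scales as sqrt(pf / N), so rare failure events (small P_f) need very many samples; variance-reduction techniques such as importance sampling are commonly used to make such estimates affordable.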

References

  1. Melchers, R. E. (2002). Structural Reliability Analysis and Prediction (2nd ed.). Chichester, UK: John Wiley.
  2. Piryonesi, S. M.; Tavakolan, M. (2017). "A mathematical programming model for solving cost-safety optimization (CSO) problems in the maintenance of structures". KSCE Journal of Civil Engineering. 21 (6): 2226–2234. doi:10.1007/s12205-017-0531-z.
  3. Choi, S. K.; Grandhi, R.; Canfield, R. A. (2006). Reliability-based Structural Design. Springer Science & Business Media.
  4. Okasha, N. M.; Frangopol, D. M. (2009). "Lifetime-oriented multi-objective optimization of structural maintenance considering system reliability, redundancy and life-cycle cost using GA". Structural Safety. 31 (6): 460–474.