Third medium contact (TMC) is an implicit formulation for contact mechanics. Contacting bodies are embedded in a highly compliant medium (the third medium), which becomes increasingly stiff under compression. This stiffening allows tractions to be transferred between the contacting bodies when the third medium between them is compressed. In itself, the method is inexact; however, in contrast to most other contact methods, the third medium approach is continuous and differentiable, which makes it suitable for applications such as topology optimization. [1] [2] [3] [4] [5] [6]
The method was first proposed by Peter Wriggers, Jörg Schröder, and Alexander Schwarz, who used a St. Venant–Kirchhoff material to model the third medium. [7] This approach requires explicit treatment of surface normals. A simplification was offered by Bog et al., who applied a Hencky material with the inherent property of becoming rigid under ultimate compression. [8] This property makes the explicit treatment of surface normals redundant, turning third medium contact into a fully implicit method, in contrast to the more widely used mortar or penalty methods. The addition of a new regularization by Bluhm et al. to stabilize the third medium further extended the method to applications involving moderate sliding, rendering it practically applicable. [1]
A material with the property that it becomes increasingly stiff under compression is augmented by a regularization term. In terms of strain energy density, this may be expressed as
$\tilde{\Psi}(\boldsymbol{u}) = \Psi(\boldsymbol{u}) + \mathbb{H}\boldsymbol{u} \,\vdots\, \mathbb{H}\boldsymbol{u}$,
where $\tilde{\Psi}(\boldsymbol{u})$ represents the augmented strain energy density in the third medium, $\mathbb{H}\boldsymbol{u} \,\vdots\, \mathbb{H}\boldsymbol{u}$ is the regularization term representing the inner product of the spatial Hessian of the displacement field by itself, and $\Psi(\boldsymbol{u})$ is the underlying strain energy density of the third medium, e.g. a Neo-Hookean solid or another hyperelastic material. The term $\mathbb{H}\boldsymbol{u} \,\vdots\, \mathbb{H}\boldsymbol{u}$ is commonly referred to as HuHu-regularization.
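As a rough illustration of how such an augmented energy can be evaluated (a minimal sketch, not taken from the cited works), the following Python snippet combines a compressible Neo-Hookean strain energy density with a HuHu-style regularization term on a 2-D grid. The grid resolution, the material parameters mu and lam, the regularization weight gamma, the finite-difference approximation of the spatial Hessian, and the sample displacement field are all assumptions chosen for demonstration only.

```python
# Sketch: augmented strain energy density  Psi_aug = Psi(u) + gamma * (Hu : Hu)
# for a 2-D third medium, with the spatial Hessian of the displacement field
# approximated by central finite differences. All parameters are illustrative.
import numpy as np

def neo_hookean_energy(F, mu=1.0, lam=1.0):
    """Compressible Neo-Hookean strain energy density for a 2-D deformation gradient F."""
    J = np.linalg.det(F)
    C = F.T @ F                      # right Cauchy-Green tensor
    I1 = np.trace(C)
    return 0.5 * mu * (I1 - 2.0 - 2.0 * np.log(J)) + 0.5 * lam * np.log(J) ** 2

def huhu_regularization(u, h, gamma=1e-3):
    """HuHu-style term gamma * (Hu : Hu): full contraction of the finite-difference
    Hessian of the displacement field u (shape: nx, ny, 2) with itself."""
    reg = np.zeros(u.shape[:2])
    for comp in range(u.shape[-1]):              # loop over displacement components
        duxx = np.gradient(np.gradient(u[..., comp], h, axis=0), h, axis=0)
        duyy = np.gradient(np.gradient(u[..., comp], h, axis=1), h, axis=1)
        duxy = np.gradient(np.gradient(u[..., comp], h, axis=0), h, axis=1)
        reg += duxx**2 + duyy**2 + 2.0 * duxy**2
    return gamma * reg

# Minimal usage: a smooth displacement field on a unit square that compresses the medium.
n, h = 21, 1.0 / 20
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
u = np.stack([-0.05 * x * y, -0.10 * y**2], axis=-1)

# Deformation gradient F = I + grad(u) at the grid centre, then underlying energy plus regularization.
grad_u = np.array([[np.gradient(u[..., i], h, axis=j)[n // 2, n // 2] for j in range(2)]
                   for i in range(2)])
F = np.eye(2) + grad_u
psi_aug = neo_hookean_energy(F) + huhu_regularization(u, h)[n // 2, n // 2]
print("augmented strain energy density at centre:", psi_aug)
```

In an actual finite element implementation the underlying energy and the regularization would be evaluated at quadrature points of the third medium elements; the sketch above only demonstrates how the two contributions of the augmented energy combine.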