| Developer(s) | OptiY GmbH |
| --- | --- |
| Operating system | Windows |
| Type | Technical computing |
| License | Proprietary |
| Website | www.optiy.eu |
OptiY is a design environment that provides modern optimization strategies and state-of-the-art probabilistic algorithms for uncertainty, reliability, robustness and sensitivity analysis, data mining and meta-modeling.
OptiY is a multidisciplinary design environment that provides direct and generic interfaces to many CAD/CAE systems and in-house codes. Furthermore, a COM interface and a user node with a predefined template are available, so that users can integrate external programs themselves. Any system can be inserted into an arbitrary process chain using the graphical workflow editor. Different classes of simulation models can be coupled, such as network models, finite element method, multi-body systems, material test benches, etc.
Data mining is the process of extracting hidden patterns from data. It identifies trends within data that go beyond simple data analysis. Through the use of sophisticated algorithms, non-statistician users can identify key attributes of processes and target opportunities. Data mining is becoming an increasingly important tool for transforming raw data into information. It is commonly used in a wide range of applications such as manufacturing, marketing, fraud detection and scientific discovery.
Local sensitivity measures such as correlation coefficients and partial derivatives can only be used if the relationship between input and output is linear. If the relationship is nonlinear, global sensitivity analysis has to be used, based on the variance relationship between the input and output distributions, such as the Sobol index. With sensitivity analysis, the system complexity can be reduced and the cause-and-effect chain can be explained. [1] [2]
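For illustration, the following is a minimal NumPy sketch of variance-based global sensitivity analysis: first-order Sobol indices estimated with the standard Saltelli pick-and-freeze scheme on the Ishigami test function. The function and estimator are textbook material, not OptiY's implementation, and all names are illustrative:

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Classic nonlinear test function with known Sobol indices."""
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

def first_order_sobol(model, dim, n=2 ** 16, low=-np.pi, high=np.pi, seed=0):
    """Monte Carlo estimate of first-order Sobol indices (Saltelli scheme)."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(low, high, (n, dim))
    B = rng.uniform(low, high, (n, dim))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(dim)
    for i in range(dim):
        AB = A.copy()
        AB[:, i] = B[:, i]  # resample only variable i, freeze the rest
        S[i] = np.mean(fB * (model(AB) - fA)) / var
    return S

print(first_order_sobol(ishigami, dim=3))  # approx. [0.31, 0.44, 0.00]
```

The index S_i is the fraction of output variance explained by input i alone; for the Ishigami function the third input contributes only through interactions, so its first-order index is near zero.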
Variability, uncertainty, tolerances and errors of technical systems play an important part in the product design process. They are caused by manufacturing inaccuracy, process uncertainty, environmental influences, wear, human factors, etc., and are characterized by stochastic distributions. A deterministic simulation cannot predict the real system behavior under input variability and uncertainty, because one model calculation shows only one point in the design space. Probabilistic simulation has to be performed instead: the output distributions are calculated from the input distributions based on the deterministic simulation model of any simulation system, and the realistic system behavior can be derived from these output distributions. [3] [4]
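A minimal sketch of such probabilistic simulation by Monte Carlo propagation, using a cantilever-beam deflection formula as a stand-in for the deterministic simulation model; all distributions and numbers are hypothetical:

```python
import numpy as np

# Deterministic model: tip deflection delta = F L^3 / (3 E I).
# In practice this call could be any external CAE solver.
def deflection(F, L, E, I):
    return F * L ** 3 / (3.0 * E * I)

rng = np.random.default_rng(1)
n = 200_000
F = rng.normal(1_000.0, 50.0, n)              # load [N], 5% scatter
L = rng.normal(2.0, 0.01, n)                  # length [m], manufacturing tolerance
E = rng.lognormal(np.log(2.1e11), 0.03, n)    # Young's modulus [Pa]
I = np.full(n, 8.0e-6)                        # second moment of area [m^4], nominal

d = deflection(F, L, E, I)                    # output distribution, not one point
print(f"mean = {d.mean() * 1e3:.2f} mm, std = {d.std() * 1e3:.2f} mm, "
      f"P99 = {np.percentile(d, 99) * 1e3:.2f} mm")
```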
The variability of parameters often causes a failure of the system. Reliability analysis (failure mode and effects analysis) investigates boundary violations of the outputs due to input variability. The failure mechanisms of components are known from the specification for the product development. They are identified by measurement, field data collection, material data, customer specifications, etc. In the simulation, the satisfaction of all product specifications is defined as constraints on the simulation results. System reliability is given if all constraints scatter inside the defined boundaries. Even if a nominal-parameter simulation shows that all constraint values lie within reliable boundaries, system reliability cannot be guaranteed, because of the input variability. The portion of the constraint variability that violates the defined boundaries is called the failure probability of the solution. Reliability analysis computes the failure probability of the single components and also of the total system at a given point in time. [5]
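A minimal reliability-analysis sketch on a simple resistance-versus-stress limit state; the distributions are hypothetical, and the analytic cross-check holds because the limit state happens to be normal here:

```python
import numpy as np
from scipy.stats import norm

# Limit state g = R - S: the design fails when the load effect S
# exceeds the resistance R. Both scatter, so P(g < 0) > 0.
rng = np.random.default_rng(2)
n = 1_000_000
R = rng.normal(320.0, 25.0, n)   # resistance, e.g. yield strength [MPa]
S = rng.normal(240.0, 30.0, n)   # stress caused by the operating load [MPa]

g = R - S
pf_mc = np.mean(g < 0.0)         # Monte Carlo failure probability
beta = np.mean(g) / np.std(g)    # reliability index (valid: g is normal here)
print(f"Pf (Monte Carlo) = {pf_mc:.4f}, Pf (analytic) = {norm.cdf(-beta):.4f}")
```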
Meta-modeling combines surrogate models and physics-informed neural networks in a process that obtains the mathematical relationship between input and output parameters. In this way of modeling, imperfect data and imperfect physical model components can be mixed to obtain an accurate meta-model for real-time computing. [6] [7]
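A minimal meta-modeling sketch using a Gaussian process surrogate from scikit-learn, one generic way to build a surrogate; OptiY's own meta-modeling algorithms are not shown here, and `expensive_model` is a hypothetical stand-in for a costly simulation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Stand-in for an expensive simulation; each call could be a full CAE run.
def expensive_model(x):
    return np.sin(3.0 * x) + 0.5 * x

X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)  # a handful of costly samples
y_train = expensive_model(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# The surrogate answers almost instantly and reports its own uncertainty,
# which is what makes it usable for real-time computing.
mean, std = gp.predict(np.array([[0.7], [1.3]]), return_std=True)
print(mean, std)
```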
Predicting material fatigue has been one of the most important problems in design engineering for reliability and quality. Fatigue predictions have several practical uses: rapid design optimization during the development phase of a product, predicting field-use limits, and failure analysis of products returned from the field or failed in qualification tests. Fatigue analysis focuses on thermal and mechanical failure mechanisms. Most fatigue failures can be attributed to thermo-mechanical stresses caused by differences in thermal and mechanical expansion coefficients. Fatigue failures occur when the component experiences cyclic stresses and strains that produce permanent damage.
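A minimal sketch of a cyclic-fatigue life estimate using a Basquin S-N curve with Palmgren-Miner linear damage accumulation, a common textbook approach; the material constants and load spectrum are hypothetical:

```python
# Basquin S-N curve: N(S) = (S / Sf) ** (1 / b) cycles to failure
# at stress amplitude S; Palmgren-Miner sums damage over the spectrum.
Sf, b = 900.0, -0.1   # hypothetical fatigue strength coefficient / exponent

def cycles_to_failure(S):
    return (S / Sf) ** (1.0 / b)

# Duty cycle: (stress amplitude [MPa], applied cycles per year)
spectrum = [(300.0, 2.0e4), (200.0, 2.0e5), (120.0, 1.0e6)]
damage = sum(n / cycles_to_failure(S) for S, n in spectrum)  # failure at ~1.0
print(f"damage/year = {damage:.3f}, predicted life = {1.0 / damage:.1f} years")
```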
In the development process of technical products, there are frequently design problems with many evaluation goals or criteria, such as low cost, high quality and low noise. Design parameters have to be found that minimize all criteria. In contrast to single-objective optimization, multi-objective optimization has a different order structure between the parameter and criteria spaces: the criteria conflict with one another, and trying to minimize one criterion may maximize others. There is not one single solution, but a frontier of Pareto-optimal solutions. Multi-objective optimization finds all Pareto solutions automatically in a single run. A decision-making support tool is also available to select the most suitable solution among them. [8]
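A minimal sketch of extracting the Pareto frontier from a set of already-evaluated candidate designs (a simple O(n²) non-dominated filter, not the optimizer OptiY itself uses):

```python
import numpy as np

def pareto_mask(costs):
    """Boolean mask of non-dominated rows; every objective is minimized."""
    mask = np.ones(len(costs), dtype=bool)
    for i in range(len(costs)):
        if not mask[i]:
            continue
        # Points no better in any objective and strictly worse in at least
        # one are dominated by design i and drop out of the front.
        dominated = np.all(costs >= costs[i], axis=1) & np.any(costs > costs[i], axis=1)
        mask &= ~dominated
    return mask

rng = np.random.default_rng(3)
designs = rng.random((200, 2))   # 200 candidates, two conflicting criteria
front = designs[pareto_mask(designs)]
print(f"{len(front)} Pareto-optimal designs out of {len(designs)}")
```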
Variability, uncertainty and tolerances have to be considered in the design process of technical systems to assure the required high quality and reliability. They are uncontrollable and unpredictable, and make the satisfaction of the required product specifications uncertain. The design goal is to assure the specified product functionality in spite of this unavoidable variability and uncertainty. The approach to solving this problem is robust design of the product parameters early in the design process (robust parameter design, RPD). Optimal product parameters should be found for which the system behavior is robust and insensitive: the given input variability and uncertainty then lead only to the smallest possible variability of the product characteristics, so that the required product specifications are always satisfied. [9]
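A minimal robust-parameter-design sketch: the design parameter is chosen to minimize a mean-plus-3-sigma criterion of a hypothetical performance function under input scatter, with fixed common random numbers for the inner Monte Carlo loop:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical performance: depends on design parameter p and a noise
# variable t (e.g. a manufacturing tolerance) scattering around zero.
# The 0.5*p*t coupling makes the output scatter grow with p.
def performance(p, t):
    return (p - 2.0) ** 2 + 0.5 * p * t + t ** 2

rng = np.random.default_rng(4)
t_samples = rng.normal(0.0, 0.3, 10_000)  # fixed sample: common random numbers

def robust_objective(p):
    y = performance(p, t_samples)
    return y.mean() + 3.0 * y.std()        # 3-sigma robustness criterion

res = minimize_scalar(robust_objective, bounds=(0.0, 4.0), method="bounded")
print(f"robust optimum p* = {res.x:.3f}")  # shifted below the nominal optimum p = 2
```

The robust optimum trades a slightly worse nominal value for lower sensitivity to the noise variable, which is the essence of RPD.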
Multi-disciplinary design optimization (MDO) is a field of engineering that uses optimization methods to solve design problems incorporating a number of disciplines. It is also known as multidisciplinary system design optimization (MSDO) and multidisciplinary design analysis and optimization (MDAO).
Sensitivity analysis is the study of how the uncertainty in the output of a mathematical model or system can be divided and allocated to different sources of uncertainty in its inputs. A related practice is uncertainty analysis, which has a greater focus on uncertainty quantification and propagation of uncertainty; ideally, uncertainty and sensitivity analysis should be run in tandem.
Design for Six Sigma (DFSS) is an engineering design process, business process management method related to traditional Six Sigma. It is used in many industries, like finance, marketing, basic engineering, process industries, waste management, and electronics. It is based on the use of statistical tools like linear regression and enables empirical research similar to that performed in other fields, such as social science. While the tools and order used in Six Sigma require a process to be in place and functioning, DFSS has the objective of determining the needs of customers and the business, and driving those needs into the product solution so created. It is used for product or process design in contrast with process improvement. Measurement is the most important part of most Six Sigma or DFSS tools, but whereas in Six Sigma measurements are made from an existing process, DFSS focuses on gaining a deep insight into customer needs and using these to inform every design decision and trade-off.
Methods engineering is a subspecialty of industrial engineering and manufacturing engineering concerned with human integration in industrial production processes.
Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period of time. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time.
Uncertainty quantification (UQ) is the science of quantitative characterization and estimation of uncertainties in both computational and real world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, etc., will lead to different results that can only be predicted in a statistical sense.
Probabilistic design is a discipline within engineering design. It deals primarily with the consideration of the effects of random variability upon the performance of an engineering system during the design phase. Typically, these effects are related to quality and reliability. Thus, probabilistic design is a tool that is mostly used in areas concerned with quality and reliability, for example product design, quality control, systems engineering, machine design, civil engineering and manufacturing. It differs from the classical approach to design by assuming a small probability of failure instead of using a safety factor.
Robustification is a form of optimisation whereby a system is made less sensitive to the effects of random variability, or noise, that is present in that system's input variables and parameters. The process is typically associated with engineering systems, but the process can also be applied to a political policy, a business strategy or any other system that is subject to the effects of random variability.
Electrical power system simulation involves power system modeling and network simulation in order to analyze electrical power systems using design/offline or real-time data. Power system simulation software is a class of computer simulation programs that focus on the operation of electrical power systems. These programs are used in a wide range of planning and operational situations for electric power systems.
Quantification of Margins and Uncertainty (QMU) is a decision support methodology for complex technical decisions. QMU focuses on the identification, characterization, and analysis of performance thresholds and their associated margins for engineering systems that are evaluated under conditions of uncertainty, particularly when portions of those results are generated using computational modeling and simulation. QMU has traditionally been applied to complex systems where comprehensive experimental test data is not readily available and cannot be easily generated for either end-to-end system execution or for specific subsystems of interest. Examples of systems where QMU has been applied include nuclear weapons performance, qualification, and stockpile assessment. QMU focuses on characterizing in detail the various sources of uncertainty that exist in a model, thus allowing the uncertainty in the system response output variables to be well quantified. These sources are frequently described in terms of probability distributions to account for the stochastic nature of complex engineering systems. The characterization of uncertainty supports comparisons of design margins for key system performance metrics to the uncertainty associated with their calculation by the model. QMU supports risk-informed decision-making processes where computational simulation results provide one of several inputs to the decision-making authority. There is currently no standardized methodology across the simulation community for conducting QMU; the term is applied to a variety of different modeling and simulation techniques that focus on rigorously quantifying model uncertainty in order to support comparison to design margins.
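A minimal sketch of the margin-to-uncertainty comparison at the heart of QMU for a single performance metric; the response distribution, threshold, and 3-sigma uncertainty band are all assumed for illustration:

```python
import numpy as np

# QMU-style confidence ratio for one metric:
# M = distance from the nominal response to the performance threshold,
# U = a quantified uncertainty band on the response (here 3 sigma).
rng = np.random.default_rng(6)
response = rng.normal(82.0, 2.5, 50_000)  # simulated performance metric (hypothetical)
threshold = 90.0                          # requirement: stay below this value

margin = threshold - response.mean()
uncertainty = 3.0 * response.std()
print(f"M = {margin:.2f}, U = {uncertainty:.2f}, M/U = {margin / uncertainty:.2f}")
# M/U > 1 suggests the design margin exceeds the quantified uncertainty.
```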
Worst-case circuit analysis is a cost-effective means of screening a design to ensure with a high degree of confidence that potential defects and deficiencies are identified and eliminated prior to and during test, production, and delivery.
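A minimal sketch of extreme-value (corner) worst-case analysis on a resistor voltage divider; because the output is monotonic in both resistances here, evaluating the tolerance corners bounds the true worst case (component values and tolerances are hypothetical):

```python
from itertools import product

# Corner analysis of a voltage divider: Vout = Vin * R2 / (R1 + R2),
# with every part pushed to a tolerance limit.
Vin = 5.0
nominal = {"R1": 10_000.0, "R2": 10_000.0}  # resistances in ohms (hypothetical)
tol = 0.05                                  # hypothetical 5% parts

def vout(R1, R2):
    return Vin * R2 / (R1 + R2)

corners = [vout(nominal["R1"] * (1 + s1 * tol), nominal["R2"] * (1 + s2 * tol))
           for s1, s2 in product((-1.0, 1.0), repeat=2)]
print(f"nominal = {vout(**nominal):.3f} V, "
      f"worst case in [{min(corners):.3f}, {max(corners):.3f}] V")
```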
A probability box is a characterization of uncertain numbers consisting of both aleatoric and epistemic uncertainties that is often used in risk analysis or quantitative uncertainty modeling where numerical calculations must be performed. Probability bounds analysis is used to make arithmetic and logical calculations with p-boxes.
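A minimal sketch of a parametric p-box: the aleatory part is a normal distribution, the epistemic part is an interval-valued mean, and the CDF bounds are the pointwise envelopes over all admissible means (numbers are illustrative):

```python
import numpy as np
from scipy.stats import norm

# X is normal with sigma = 1 but an epistemically uncertain mean
# anywhere in [mu_lo, mu_hi]. For a fixed x, the CDF decreases as the
# mean increases, so the extreme means give the envelope.
mu_lo, mu_hi = 1.0, 2.0
x = np.linspace(-3.0, 6.0, 7)

F_upper = norm.cdf(x, loc=mu_lo, scale=1.0)  # most left-shifted distribution
F_lower = norm.cdf(x, loc=mu_hi, scale=1.0)  # most right-shifted distribution

for xi, lo, hi in zip(x, F_lower, F_upper):
    print(f"P(X <= {xi:5.2f}) lies in [{lo:.3f}, {hi:.3f}]")
```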
Kimeme is an open platform for multi-objective optimization and multidisciplinary design optimization. It is intended to be coupled with external numerical software such as computer-aided design (CAD), finite element analysis (FEA), structural analysis and computational fluid dynamics tools. It was developed by Cyber Dyne Srl and provides both a design environment for problem definition and analysis and a software network infrastructure to distribute the computational load.
Red Cedar Technology is a software development and engineering services company. Red Cedar Technology was founded by Michigan State University professors Ron Averill and Erik Goodman in 1999. The headquarters is located in East Lansing, Michigan, near MSU's campus. Red Cedar Technology develops and distributes the HEEDS Professional suite of design optimization software. HEEDS is based on spin-out technology from Michigan State University. On June 30, 2013 Red Cedar Technology was acquired by CD-adapco. CD-adapco was acquired in 2016 by Siemens Digital Industries Software.
Optimus is a Process Integration and Design Optimization (PIDO) platform developed by Noesis Solutions. Noesis Solutions takes part in key research projects, such as PHAROS and MATRIX.
pSeven is a design space exploration (DSE) software platform developed by DATADVANCE that features design, simulation and analysis capabilities and assists in design decisions. It provides integration with third-party CAD and CAE software tools, multi-objective and robust optimization algorithms, data analysis, and uncertainty quantification tools.
optiSLang is a software platform for CAE-based sensitivity analysis, multi-disciplinary optimization (MDO) and robustness evaluation. It was originally developed by Dynardo GmbH and provides a framework for numerical Robust Design Optimization (RDO) and stochastic analysis by identifying variables which contribute most to a predefined optimization goal. This includes also the evaluation of robustness, i.e. the sensitivity towards scatter of design variables or random fluctuations of parameters. In 2019, Dynardo GmbH was acquired by Ansys.
Sensitivity analysis identifies how uncertainties in input parameters affect important measures of building performance, such as cost, indoor thermal comfort, or CO2 emissions. Input parameters for buildings fall into roughly three categories.
In regression analysis, an interval predictor model (IPM) is an approach to regression where bounds on the function to be approximated are obtained. This differs from other techniques in machine learning, where one usually wishes to estimate point values or an entire probability distribution. Interval predictor models are sometimes referred to as a nonparametric regression technique, because a potentially infinite set of functions is contained by the IPM, and no specific distribution is implied for the regressed variables.
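A minimal sketch of a linear interval predictor model posed as a linear program: two lines must enclose all observations while the average interval width at the data points is minimized. The data and the linear basis are illustrative; practical IPMs use richer bases and chance constraints:

```python
import numpy as np
from scipy.optimize import linprog

# Find lower/upper lines l(x) = al + bl*x and u(x) = au + bu*x that
# enclose all data points while minimizing the summed interval width.
rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0.0, 1.0, 30))
y = 2.0 * x + rng.normal(0.0, 0.2, 30)     # noisy data to be bounded

n = len(x)
# Decision variables: [al, bl, au, bu]; objective = sum_i (u(x_i) - l(x_i))
c = np.array([-n, -x.sum(), n, x.sum()])
A_ub, b_ub = [], []
for xi, yi in zip(x, y):
    A_ub.append([1.0, xi, 0.0, 0.0]); b_ub.append(yi)    # l(x_i) <= y_i
    A_ub.append([0.0, 0.0, -1.0, -xi]); b_ub.append(-yi) # u(x_i) >= y_i

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 4)
al, bl, au, bu = res.x
print(f"l(x) = {al:.3f} + {bl:.3f} x,  u(x) = {au:.3f} + {bu:.3f} x")
```

Every observation lies between the two lines by construction, so the model returns an interval rather than a point prediction for each new x.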