Title: Uncertainty analysis in engineering: Past, present, and future
Speaker: Isaac Elishakoff
Florida Atlantic University, USA
Uncertainty quantification has become a very extensive field of research in recent years, and great scientists and government officials alike point to its importance. According to Albert Einstein, "as far as the propositions of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality." According to Beatrix, former queen of the Netherlands, "if one thing today is certain, it is a feeling of uncertainty: a premonition that the future cannot be a simple extension of the present." Galileo advises us: "Measure what can be measured, and make measurable what cannot be measured."

We quantify uncertainty, roughly speaking, via three alternative approaches: the theory of probability and random processes; fuzzy-set-based approaches; and bounding approaches that seek the worst possible response over the uncertainty set, in combination with design that minimizes this worst-case response. The first two theories are associated with a given measure, such as a probability density or a membership function. The latter approach is known in the literature under various names: the guaranteed approach, convex modeling of uncertainty, information-gap theory, interval analysis, and so on. The worst-case scenario is the easiest to explain to one's boss or to laymen. The Roman poet Ovid (43 BCE-18 CE) advises us: "I see and approve better things, but follow the worse." William Shakespeare propagates an analogous idea: "Since the affairs of men rest still uncertain, let's reason with the worst that may befall." Naturally, the worst-case scenario may turn out to be very conservative; hence the necessity of minimizing the worst-case response. The lecture will delve into two sub-parts of uncertainty modeling. The first part discusses various approaches to stochastic linearization and demonstrates the advantages of recently proposed non-classical methodologies; it turns out that the energy-based linearization technique produces superb results.
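The classical stochastic-linearization idea can be illustrated with a minimal sketch (the classical force-error version, not the energy-based variant advocated in the lecture): for a Duffing oscillator under Gaussian white noise, the cubic stiffness is replaced by an equivalent linear stiffness chosen via Gaussian closure, and the stationary response variance is obtained by fixed-point iteration. All parameter values below are illustrative, not taken from the lecture.

```python
import math

def stochastic_linearization(k=1.0, c=0.2, eps=0.5, S0=0.1, tol=1e-10):
    """Classical stochastic linearization of the Duffing oscillator
        x'' + c x' + k x + eps x^3 = w(t),
    with w(t) Gaussian white noise of constant spectral density S0.
    Under Gaussian closure, E[x^4] = 3 sigma^4, so the cubic term is
    replaced by an equivalent linear stiffness
        k_eq = k + 3 * eps * sigma^2,
    while the linearized system has stationary variance
        sigma^2 = pi * S0 / (c * k_eq).
    The coupled pair is solved by fixed-point iteration."""
    k_eq = k
    for _ in range(200):
        sigma2 = math.pi * S0 / (c * k_eq)   # variance of linearized system
        k_new = k + 3.0 * eps * sigma2       # Gaussian-closure stiffness
        if abs(k_new - k_eq) < tol:
            break
        k_eq = k_new
    return k_eq, sigma2
```

The hardening cubic term raises the equivalent stiffness above k, which in turn lowers the response variance relative to the purely linear system: exactly the coupling the iteration resolves.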
The second part deals with the data-enclosure problem: bounding uncertain data with suitable rectangles, ellipsoids, or super-ellipsoids, the latter suggested independently by Gabriel Lamé and Piet Hein. We suggest choosing the enclosure of the data so that the maximum predicted response is minimal. Super-ellipsoidal modeling is shown to be superior to the other techniques. The example of a composite plate, with four-dimensional data enclosed in a super-ellipsoid, is considered in detail. General recommendations are made for uncertainty quantification in conjunction with the available data.
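As a two-dimensional illustration of the enclosure idea (a sketch of the geometry only, not the minimax fitting procedure of the lecture), the following fits an axis-aligned super-ellipse |x/a|^p + |y/b|^p <= 1 around scattered data and reports its area via the Lamé/gamma-function formula. Comparing the exponent p = 2 (ordinary ellipse) with a larger p shows why super-ellipsoids can enclose box-like data less conservatively.

```python
import math

def enclosing_superellipse(points, p=4.0):
    """Fit an axis-aligned super-ellipse |x/a|^p + |y/b|^p <= 1 around
    2-D data.  Center = midrange of the data; semi-axes = half-ranges,
    inflated just enough that every point satisfies the constraint.
    Returns (center, (a, b), enclosed area)."""
    xs = [pt[0] for pt in points]
    ys = [pt[1] for pt in points]
    cx, cy = (max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2
    rx = max((max(xs) - min(xs)) / 2, 1e-12)
    ry = max((max(ys) - min(ys)) / 2, 1e-12)
    # largest normalized "radius" among the data points
    s = max(abs((x - cx) / rx) ** p + abs((y - cy) / ry) ** p
            for x, y in points)
    scale = s ** (1.0 / p)          # inflate semi-axes to cover all points
    a, b = rx * scale, ry * scale
    # area of a super-ellipse (reduces to pi*a*b when p = 2)
    area = 4 * a * b * math.gamma(1 + 1 / p) ** 2 / math.gamma(1 + 2 / p)
    return (cx, cy), (a, b), area
```

For data clustered near the corners of a rectangle, increasing p shrinks the enclosed area, so the subsequent worst-case response analysis over the enclosure is less conservative; that is the geometric advantage the lecture exploits, there in four dimensions and with the exponent itself chosen to minimize the maximum response.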