"Although they play a fundamental role in nearly all
branches of mathematics, inequalities are usually obtained by ad hoc methods rather
than as consequences of some underlying 'theory of inequalities'. For certain kinds of inequalities. the notion of majorization leads to such a theory
that is sometimes extremely useful and powerful for deriving inequalities." (Albert
W Marshall, "Inequalities: Theory of Majorization and its Applications", 1979)
"Inequalities are useful for bounding quantities that might otherwise be hard to compute."
"The triangle inequality is perhaps the most important property for proving theorems involving distance. The name is appropriate because the triangle inequality is an abstraction of the property that the sum of the lengths of two sides of a triangle must be at least as large as the length of the third side."
"There are three reasons for the study of inequalities:
practical, theoretical and aesthetic. In many practical investigations, it is
necessary to bound one quantity by another. The classical inequalities are very
useful for this purpose. From the theoretical point of view, very simple
questions give rise to entire theories. […] Finally, let us turn to the
aesthetic aspects. As has been pointed out, beauty is in the eye of the
beholder. However. it is generally agreed that certain pieces of music, art, or
mathematics are beautiful. There is an elegance to inequalities that makes them
very attractive."
"[...] we may also look for representations which make inequalities obvious. Often, these representations are maxima or minima of certain quantities. […] we know that many inequalities are associated with geometric properties. Hence. we can go in either direction. We can find the geometric equivalent of an analytic result, or the analytic consequence of a geometric fact such as convexity or duality." (Claudi Alsina & Roger B Nelsen, "When Less is More: Visualizing Basic Inequalities", 2009)
"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)