"For a linear operator, the change of the basis of the underlying linear space corresponds to a similarity transformation of the matrices." (Eberhard Zeidler, "Quantum Field Theory I: Gauge Theory", 2006)
"Representations of symmetries with the aid of linear operators (e.g., matrices) play a crucial role in modern physics. In particular, this concerns the linear representations of groups, Lie algebras, and quantum groups (Hopf algebras)" (Eberhard Zeidler, "Quantum Field Theory I: Gauge Theory", 2006)
"Solvable Lie algebras are close to both upper triangular matrices and commutative Lie algebras. In contrast to this, semisimple Lie algebras are as far as possible from being commutative. By Levi’s decomposition theorem, any Lie algebra is built out of a solvable and a semisimple one. The nontrivial prototype of a solvable Lie algebra is the Heisenberg algebra." (Eberhard Zeidler, "Quantum Field Theory I: Gauge Theory", 2006)
"The Gaussian elimination method is a universal method for solving finite-dimensional linear matrix equations on computers." (Eberhard Zeidler, "Quantum Field Theory I: Gauge Theory", 2006)
"Although one speaks nowadays of the determinant of a matrix, the two concepts had different origins. In particular, determinants appeared before matrices, and the early stages in their history were closely tied to linear equations. Subsequent problems that gave rise to new uses of determinants included elimination theory (finding conditions under which two polynomials have a common root), transformation of coordinates to simplify algebraic expressions (e.g., quadratic forms), change of variables in multiple integrals, solution of systems of differential equations, and celestial mechanics." (Israel Kleiner, "A History of Abstract Algebra", 2007)
"Linear algebra is a very useful subject, and its basic concepts arose and were used in different areas of mathematics and its applications. It is therefore not surprising that the subject had its roots in such diverse fields as number theory (both elementary and algebraic), geometry, abstract algebra (groups, rings, fields, Galois theory), anal ysis (differential equations, integral equations, and functional analysis), and physics. Among the elementary concepts of linear algebra are linear equations, matrices, determinants, linear transformations, linear independence, dimension, bilinear forms, quadratic forms, and vector spaces. Since these concepts are closely interconnected, several usually appear in a given context (e.g., linear equations and matrices) and it is often impossible to disengage them." (Israel Kleiner, "A History of Abstract Algebra", 2007)
"Matrices are 'natural' mathematical objects: they appear in connection with linear equations, linear transformations, and also in conjunction with bilinear and quadratic forms, which were important in geometry, analysis, number theory, and physics. Matrices as rectangular arrays of numbers appeared around 200 BC in Chinese mathematics, but there they were merely abbreviations for systems of linear equations. Matrices become important only when they are operated on - added, subtracted, and especially multiplied; more important, when it is shown what use they are to be put to." (Israel Kleiner, "A History of Abstract Algebra", 2007)
"One of the current ideas regarding the Riemann hypothesis is that the zeros of the zeta function can be interpreted as eigenvalues of certain matrices. This line of thinking is attractive and is potentially a good way to attack the hypothesis, since it gives a possible connection to physical phenomena. [...] Empirical results indicate that the zeros of the Riemann zeta function are indeed distributed like the eigenvalues of certain matrix ensembles, in particular the Gaussian unitary ensemble. This suggests that random matrix theory might provide an avenue for the proof of the Riemann hypothesis." (Peter Borwein et al, "The Riemann Hypothesis: A Resource for the Afficionado and Virtuoso Alike", 2007)
"Anyone who has played with Rubik’s cube knows that twisting the top clockwise and then rotating the right hand side to the back gives a different pattern than if you did the two operations in the reverse order. It is easier to see this with a die. If you rotate a die clockwise and then about the vertical, it will be oriented differently to the case where you had first rotated about the vertical and then clockwise. This is why matrices have proved so useful in keeping track of what happens when things rotate in three dimensions, as the order matters." (Frank Close, "Antimatter", 2009)
"Many phenomena require more than just real numbers to describe them mathematically. One such generalization of numbers is known as ‘matrices’. These involve numbers arranged in columns or rows with their own rules for addition and multiplication. Ordi nary numbers correspond to having the same number all down the top left to bottom right diagonal [...]." (Frank Close, "Antimatter", 2009)
"Using matrices, Dirac was able to write an equation relating the total energy of a body to a sum of its energy at rest and its energy in motion, all consistent with Einstein’s theory of relativity. The fact that matrices keep account of what happens when things rotate was a bonus, as the maths was apparently saying that an electron can itself rotate: can spin! Furthermore, the fact that he had been able to solve the mathematics by using the simplest matrices, where a single number was replaced by two columns of pairs, implied a ‘two-ness’ to the spin, precisely what the Zeeman effect had implied. The missing ingredi ent in Schrodinger’s theory had miraculously emerged from the mathematics of matrices, which had been forced on Dirac by the requirements of Einstein’s theory of relativity." (Frank Close, "Antimatter", 2009