Blog Archive

Tuesday, September 28, 2021

09-27-2021-1825 - Eigenvalues and Eigenvectors: Vibration Analysis


In linear algebra, an eigenvector (/ˈaɪɡənˌvɛktər/) or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by λ,[1] is the factor by which the eigenvector is scaled.

Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed.[2] Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.

Formal definition

If T is a linear transformation from a vector space V over a field F into itself and v is a nonzero vector in V, then v is an eigenvector of T if T(v) is a scalar multiple of v. This can be written as

T(v) = λv,

where λ is a scalar in F, known as the eigenvalue, characteristic value, or characteristic root associated with v.

There is a direct correspondence between n-by-n square matrices and linear transformations from an n-dimensional vector space into itself, given any basis of the vector space. Hence, in a finite-dimensional vector space, it is equivalent to define eigenvalues and eigenvectors using either the language of matrices, or the language of linear transformations.[3][4]

If V is finite-dimensional, the above equation is equivalent to[5]

Au = λu,

where A is the matrix representation of T and u is the coordinate vector of v.
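As a concrete numerical illustration (an addition here, not part of the quoted article), the defining relation Av = λv can be checked directly with NumPy; the 2x2 matrix below is an arbitrary example:

import numpy as np

# A symmetric 2x2 matrix standing in for the transformation T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining relation A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)   # [3. 1.]: the factors by which each eigenvector is scaled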


https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors


Vibration analysis

Mode shape of a tuning fork at eigenfrequency 440.09 Hz

Eigenvalue problems occur naturally in the vibration analysis of mechanical structures with many degrees of freedom. The eigenvalues are the natural frequencies (or eigenfrequencies) of vibration, and the eigenvectors are the shapes of these vibrational modes. In particular, undamped vibration is governed by

mẍ + kx = 0

or

ẍ = −(k/m)x

that is, acceleration is proportional to position (i.e., we expect x to be sinusoidal in time).

In n dimensions, m becomes a mass matrix and k a stiffness matrix. Admissible solutions are then a linear combination of solutions to the generalized eigenvalue problem

kx = ω²mx

where ω² is the eigenvalue and ω is the (imaginary) angular frequency. The principal vibration modes are different from the principal compliance modes, which are the eigenvectors of k alone. Furthermore, damped vibration, governed by

mẍ + cẋ + kx = 0

leads to a so-called quadratic eigenvalue problem,

(ω²m + ωc + k)x = 0.

This can be reduced to a generalized eigenvalue problem by algebraic manipulation at the cost of solving a larger system.

The orthogonality properties of the eigenvectors allow decoupling of the differential equations, so that the system can be represented as a linear summation of the eigenvectors. The eigenvalue problem of complex structures is often solved using finite element analysis, which neatly generalizes the solution to scalar-valued vibration problems.
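As a sketch of how the undamped case is solved in practice (an addition to the quoted text; the two-degree-of-freedom system below is made up for illustration), SciPy's eigh handles the symmetric-definite generalized problem kx = ω²mx directly:

import numpy as np
from scipy.linalg import eigh

# Illustrative 2-DOF system: two unit masses, springs of unit stiffness.
M = np.array([[1.0, 0.0],
              [0.0, 1.0]])      # mass matrix
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])    # stiffness matrix

# Generalized eigenvalue problem K x = omega^2 M x; eigh accepts
# the pair (K, M) when K is symmetric and M is positive definite.
omega_sq, modes = eigh(K, M)

omega = np.sqrt(omega_sq)       # natural angular frequencies
print(omega)                    # approx [1.0, 1.732]
print(modes)                    # columns are the mode shapes

The damped, quadratic problem can be treated the same way after the first-order linearization mentioned above, at the cost of doubling the system size.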

Eigenfaces

Eigenfaces as examples of eigenvectors

In image processing, processed images of faces can be seen as vectors whose components are the brightnesses of each pixel.[51] The dimension of this vector space is the number of pixels. The eigenvectors of the covariance matrix associated with a large set of normalized pictures of faces are called eigenfaces; this is an example of principal component analysis. They are very useful for expressing any face image as a linear combination of some of them. In the facial recognition branch of biometrics, eigenfaces provide a means of applying data compression to faces for identification purposes. Research has also been done on eigen vision systems for determining hand gestures.

Similar to this concept, eigenvoices represent the general direction of variability in human pronunciations of a particular utterance, such as a word in a language. Based on a linear combination of such eigenvoices, a new voice pronunciation of the word can be constructed. These concepts have been found useful in automatic speech recognition systems for speaker adaptation.
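The eigenface construction above lends itself to a short sketch (added here, not part of the quoted article); the random data below merely stands in for a real set of normalized face images:

import numpy as np

# Stand-in data: 100 'images' of 32x32 pixels, one flattened image per row.
rng = np.random.default_rng(0)
faces = rng.random((100, 32 * 32))

# Center the data and form the covariance matrix over pixels.
mean_face = faces.mean(axis=0)
centered = faces - mean_face
cov = centered.T @ centered / (len(faces) - 1)

# Eigenvectors of the (symmetric) covariance matrix are the
# eigenfaces; eigh returns eigenvalues in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(cov)
eigenfaces = eigenvectors[:, -10:]        # top 10 eigenfaces

# Any face is approximated as the mean plus a linear combination
# of eigenfaces; the weights act as a compressed representation.
weights = centered[0] @ eigenfaces
reconstruction = mean_face + eigenfaces @ weights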

Tensor of moment of inertia

In mechanics, the eigenvectors of the moment of inertia tensor define the principal axes of a rigid body. The tensor of moment of inertia is a key quantity required to determine the rotation of a rigid body around its center of mass.

Stress tensor

In solid mechanics, the stress tensor is symmetric and so can be decomposed into a diagonal tensor with the eigenvalues on the diagonal and the eigenvectors as a basis. Because it is diagonal in this orientation, the stress tensor has no shear components; the components it does have are the principal components.
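Both this decomposition and the inertia-tensor case above come down to diagonalizing a symmetric 3x3 matrix. A brief NumPy sketch (the stress values are made up for illustration):

import numpy as np

# An illustrative symmetric stress tensor.
sigma = np.array([[ 50.0,  30.0,  0.0],
                  [ 30.0, -20.0,  0.0],
                  [  0.0,   0.0, 10.0]])

# eigh gives real eigenvalues (principal stresses) and an
# orthonormal basis of eigenvectors (principal directions).
principal_stresses, principal_axes = np.linalg.eigh(sigma)

# In the principal-axis basis the tensor is diagonal, i.e. the
# shear (off-diagonal) components vanish.
rotated = principal_axes.T @ sigma @ principal_axes
assert np.allclose(rotated, np.diag(principal_stresses))
print(principal_stresses)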

Graphs

In spectral graph theory, an eigenvalue of a graph is defined as an eigenvalue of the graph's adjacency matrix A, or (increasingly) of the graph's Laplacian matrix due to its discrete Laplace operator, which is either D − A (sometimes called the combinatorial Laplacian) or I − D^(−1/2)AD^(−1/2) (sometimes called the normalized Laplacian), where D is a diagonal matrix with D_ii equal to the degree of vertex v_i, and in D^(−1/2), the ith diagonal entry is 1/√deg(v_i). The kth principal eigenvector of a graph is defined as either the eigenvector corresponding to the kth largest or kth smallest eigenvalue of the Laplacian. The first principal eigenvector of the graph is also referred to merely as the principal eigenvector.

The principal eigenvector is used to measure the centrality of its vertices. An example is Google's PageRank algorithm. The principal eigenvector of a modified adjacency matrix of the World Wide Web graph gives the page ranks as its components. This vector corresponds to the stationary distribution of the Markov chain represented by the row-normalized adjacency matrix; however, the adjacency matrix must first be modified to ensure a stationary distribution exists. The second smallest eigenvector can be used to partition the graph into clusters, via spectral clustering. Other methods are also available for clustering.
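As an illustration of the PageRank idea (a sketch added here; the four-page link graph is made up, and the damping factor 0.85 is a conventional choice), the rank vector can be found by power iteration on the damped, row-normalized adjacency matrix:

import numpy as np

# Adjacency matrix of a small directed link graph (illustrative).
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Row-normalize to get the Markov transition matrix, then apply
# damping so that a stationary distribution is guaranteed to exist.
n = len(A)
P = A / A.sum(axis=1, keepdims=True)
d = 0.85                          # damping factor
G = d * P + (1 - d) / n           # the modified ("Google") matrix

# Power iteration: the stationary distribution is the principal
# left eigenvector of G, and its entries are the page ranks.
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = rank @ G
print(rank / rank.sum())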

Basic reproduction number

The basic reproduction number (R₀) is a fundamental number in the study of how infectious diseases spread. If one infectious person is put into a population of completely susceptible people, then R₀ is the average number of people that one typical infectious person will infect. The generation time of an infection is the time, t_G, from one person becoming infected to the next person becoming infected. In a heterogeneous population, the next generation matrix defines how many people in the population will become infected after time t_G has passed. R₀ is then the largest eigenvalue of the next generation matrix.[52][53]
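A minimal numerical sketch (the next generation matrix below is hypothetical, for a population split into two groups):

import numpy as np

# Entry (i, j): expected new infections in group i caused by one
# infected individual in group j.
K = np.array([[1.2, 0.5],
              [0.4, 0.8]])

# R0 is the largest eigenvalue (spectral radius) of this matrix.
R0 = max(abs(np.linalg.eigvals(K)))
print(R0)   # approx 1.49; R0 > 1 means the infection can spread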


https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#Vibration_analysis


History

Eigenvalues are often introduced in the context of linear algebra or matrix theory. Historically, however, they arose in the study of quadratic forms and differential equations.

In the 18th century, Leonhard Euler studied the rotational motion of a rigid body, and discovered the importance of the principal axes.[a] Joseph-Louis Lagrange realized that the principal axes are the eigenvectors of the inertia matrix.[11]

In the early 19th century, Augustin-Louis Cauchy saw how their work could be used to classify the quadric surfaces, and generalized it to arbitrary dimensions.[12] Cauchy also coined the term racine caractéristique (characteristic root), for what is now called eigenvalue; his term survives in characteristic equation.[b]

Later, Joseph Fourier used the work of Lagrange and Pierre-Simon Laplace to solve the heat equation by separation of variables in his famous 1822 book Théorie analytique de la chaleur.[13] Charles-François Sturm developed Fourier's ideas further, and brought them to the attention of Cauchy, who combined them with his own ideas and arrived at the fact that real symmetric matrices have real eigenvalues.[12] This was extended by Charles Hermite in 1855 to what are now called Hermitian matrices.[14]

Around the same time, Francesco Brioschi proved that the eigenvalues of orthogonal matrices lie on the unit circle,[12] and Alfred Clebsch found the corresponding result for skew-symmetric matrices.[14] Finally, Karl Weierstrass clarified an important aspect in the stability theory started by Laplace, by realizing that defective matrices can cause instability.[12]

In the meantime, Joseph Liouville studied eigenvalue problems similar to those of Sturm; the discipline that grew out of their work is now called Sturm–Liouville theory.[15] Schwarz studied the first eigenvalue of Laplace's equation on general domains towards the end of the 19th century, while Poincaré studied Poisson's equation a few years later.[16]

At the start of the 20th century, David Hilbert studied the eigenvalues of integral operators by viewing the operators as infinite matrices.[17] He was the first to use the German word eigen, which means "own",[7] to denote eigenvalues and eigenvectors in 1904,[c] though he may have been following a related usage by Hermann von Helmholtz. For some time, the standard term in English was "proper value", but the more distinctive term "eigenvalue" is the standard today.[18]

The first numerical algorithm for computing eigenvalues and eigenvectors appeared in 1929, when Richard von Mises published the power method. One of the most popular methods today, the QR algorithm, was proposed independently by John G. F. Francis[19] and Vera Kublanovskaya[20] in 1961.[21][22]


https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors#History

