
Eigenvectors of a Matrix and How to Find Eigenvectors of a Matrix


The unique set of vectors associated with a linear system of equations is called the set of eigenvectors (also known as latent vectors, proper vectors, or characteristic vectors). Eigenvectors are defined for square matrices: for an n × n matrix A, an eigenvector is a nonzero vector, usually denoted x, such that Ax = λx for some scalar λ, called the eigenvalue. When the linear transformation represented by the matrix is applied, the direction of an eigenvector remains unchanged; the vector is only scaled by λ.

The QR algorithm, developed in 1961, is among the most efficient general methods for finding all the eigenvalues and eigenvectors of a matrix. For large sparse Hermitian matrices, iterative methods such as the Lanczos algorithm are a better choice.

Eigenvector-eigenvalue identity

Consider a Hermitian matrix with eigenvalues λ1, …, λn. The squared modulus of the jth component of a normalized eigenvector can be found from the eigenvalues of the matrix and the eigenvalues of the corresponding minor matrix: the squared modulus of component j of the ith eigenvector, multiplied by the product of the differences between λi and every other eigenvalue of the matrix, equals the product of the differences between λi and the eigenvalues of the minor Mj.

Mj is the submatrix formed by eliminating the jth row and column from the original matrix.
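As a sketch, the identity can be checked numerically with NumPy; the 3 × 3 symmetric matrix below is an arbitrary example chosen for illustration, not one from the article:

```python
import numpy as np

# Arbitrary real symmetric (hence Hermitian) example matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
n = A.shape[0]

# Columns of eigvecs are the normalized eigenvectors.
eigvals, eigvecs = np.linalg.eigh(A)

i, j = 0, 1  # check component j of eigenvector i
# Minor M_j: delete the j-th row and the j-th column of A.
M_j = np.delete(np.delete(A, j, axis=0), j, axis=1)
minor_vals = np.linalg.eigvalsh(M_j)

# |v_{i,j}|^2 * prod_{k != i} (lam_i - lam_k)  =  prod_k (lam_i - lam_k(M_j))
lhs = abs(eigvecs[j, i]) ** 2 * np.prod(
    [eigvals[i] - eigvals[k] for k in range(n) if k != i])
rhs = np.prod(eigvals[i] - minor_vals)
print(np.isclose(lhs, rhs))  # True
```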

Steps to determine the eigenvectors of a matrix

The eigenvalues of the matrix have to be found first in order to find the eigenvectors. This article answers the question: how do you find the eigenvectors of a matrix? For any square matrix A, an eigenvalue λ and eigenvector X satisfy AX = λX

AX – λX = 0

(A – λI) X = 0…..(1)

The condition holds only when (A – λI) is singular. That is, |A – λI| = 0 …..(2)

(2) is the characteristic equation of the matrix.

The roots of the characteristic equation represent the eigenvalues of matrix A.

To find the eigenvectors, substitute each eigenvalue into (1) and solve the resulting homogeneous system (A – λI)X = 0 by Gaussian elimination: reduce the matrix to row echelon form, so that any rows consisting only of zeroes are at the bottom, and then solve the linear system of equations obtained.
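The steps above can be sketched in Python with NumPy. The 2 × 2 matrix here is an assumed example, and the null space of (A – λI) is extracted via the SVD rather than by hand elimination:

```python
import numpy as np

# Assumed 2x2 example matrix; any square matrix works the same way.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Step 1: the eigenvalues are the roots of the characteristic
# equation |A - lambda*I| = 0.
coeffs = np.poly(A)              # characteristic polynomial coefficients
eigvals = np.roots(coeffs).real  # the roots are real here: 2 and 5

# Step 2: substitute each eigenvalue and solve (A - lambda*I) X = 0.
# The last right-singular vector of the (singular) matrix A - lambda*I
# spans its null space, i.e. the eigenvector.
for lam in np.sort(eigvals):
    _, _, Vt = np.linalg.svd(A - lam * np.eye(2))
    x = Vt[-1]
    assert np.allclose(A @ x, lam * x)  # A x = lambda x holds
    print(lam, x)
```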

Some properties of eigenvalues include the following.

  • The eigenvalues of Hermitian and real symmetric matrices are real.
  • The eigenvalues of skew-Hermitian and real skew-symmetric matrices are purely imaginary or zero.
  • The eigenvalues of orthogonal and unitary matrices have unit modulus: |λ| = 1.
  • The eigenvalues of the transpose of matrix A are the same as the eigenvalues of A.
  • The sum of the diagonal elements (the trace) of matrix A equals the sum of its eigenvalues.
  • The determinant of matrix A equals the product of its eigenvalues.
  • An n × n matrix A has at most n distinct eigenvalues.
  • If A and B are two square matrices of the same order, then AB and BA have the same eigenvalues.
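Several of these properties can be spot-checked numerically. This NumPy sketch uses random 4 × 4 matrices, an assumption made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
eig_A = np.linalg.eigvals(A)

# The trace equals the sum of the eigenvalues.
assert np.isclose(np.trace(A), eig_A.sum())
# The determinant equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), eig_A.prod())
# A and its transpose have the same eigenvalues.
assert np.allclose(np.sort(eig_A), np.sort(np.linalg.eigvals(A.T)))
# AB and BA have the same eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(A @ B)),
                   np.sort(np.linalg.eigvals(B @ A)))
print("all properties hold")
```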

Generalized eigenvector

A generalized eigenvector is a vector that satisfies conditions more liberal than those for an ordinary eigenvector. For an n x n matrix A with eigenvalue λ, it is a nonzero vector x satisfying (A – λI)^k x = 0 for some positive integer k; an ordinary eigenvector is the special case k = 1.
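A minimal sketch of this definition, assuming the 2 × 2 Jordan-block matrix below as an example: it has the repeated eigenvalue 2 but only one ordinary eigenvector, so a generalized eigenvector is needed to complete a basis.

```python
import numpy as np

# Defective example matrix (a Jordan block): eigenvalue 2 is repeated,
# but there is only one ordinary eigenvector, (1, 0).
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
N = A - lam * np.eye(2)

x = np.array([0.0, 1.0])  # candidate generalized eigenvector

# x is not an ordinary eigenvector: (A - lam*I) x != 0 ...
assert not np.allclose(N @ x, 0)
# ... but it satisfies the more liberal condition (A - lam*I)^2 x = 0.
assert np.allclose(N @ (N @ x), 0)
# Applying (A - lam*I) once maps x onto the ordinary eigenvector.
assert np.allclose(N @ x, [1.0, 0.0])
print("x is a generalized eigenvector of rank 2")
```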

Applications of eigenvalues and eigenvectors

  • They appear in the study of linear differential equations, where they are used to find rates of change and the relationships between the given variables.
  • They are used in facial recognition techniques known as eigenfaces.
  • Eigenvalues and eigenvectors are the foundation of principal component analysis (PCA).
  • Eigenvectors can be used to rank the items in a given data set.
  • Eigenvalues are significant in regularisation.

There are numerous applications of eigenvalues and eigenvectors, a few of them are listed above for reference.
