Eigenvectors of a Matrix

JEE Crash Lite Course 2022

What are Eigenvectors of a Matrix?

The most significant difference between a matrix and a scalar is that a scalar is a single number, while a matrix is a rectangular arrangement of numbers in rows and columns. Besides knowing how to apply the four basic operations to matrices, you also need three additional operations that act on the rows (or columns) within a single matrix.


These operations are referred to as row operations, and they are as follows:

  • Within a matrix, any two rows can be switched.

  • Each entry in a row can be multiplied by a non-zero scalar.

  • A scalar multiple of one row can be added to another row.

Knowing how to perform these row operations is incredibly useful when working with matrices; one place their utility shows up is in finding the eigenvectors of a matrix. The subtopics below explain what eigenvectors are and how to find them for a given matrix.


However, you must first understand eigenvalues to comprehend eigenvectors: you cannot find the eigenvectors of a matrix without first knowing its eigenvalues.


What are Eigenvectors? 

To understand what eigenvalues and eigenvectors are, consider the following: when you multiply a matrix A by a vector v, the result is another vector y. Sometimes the vector you get is simply a scaled version of the original vector.


When you get such a scaled version of the starting vector, the scalar, denoted by the Greek letter lambda (λ), is an eigenvalue of matrix A, and v is an eigenvector associated with λ. The eigenvectors for a given eigenvalue are obtained by solving the equation (A − λI)v = 0 for v.


An eigenvector of a matrix is also known as a proper vector, latent vector or characteristic vector. Eigenvectors are defined with reference to a square matrix. A matrix is a rectangular array of numbers or other elements of the same kind, and it generally represents a system of linear equations.


Eigenvectors are a very useful concept related to matrices. They are also used in calculus to solve differential equations, among many other applications. An eigenvector of a matrix is a nonzero vector that the matrix merely scales. Through the example problems below, you can learn how to calculate the eigenvectors and eigenvalues of a matrix.


There are basically two types of eigenvectors:

  •     Left Eigenvector

  •     Right Eigenvector

Let us go ahead and understand eigenvectors, how to find the eigenvalues of a 2×2 matrix, the technique involved, and various other related concepts.


Eigenvector Method

The method of determining the eigenvector of a matrix is explained below:

Let A be an n×n matrix and let λ (lambda) be an eigenvalue associated with it. Then an eigenvector v is defined by:

Av = λv

If I is the identity matrix of the same order as A, this can be rewritten as

(A − λI)v = 0

The eigenvector associated with matrix A can be determined using the above method.

Here, v denotes the eigenvector belonging to each eigenvalue and is written as:


\[v = \begin{bmatrix} v_1 \\  v_2 \\ \vdots \\ v_n \end{bmatrix}\]
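As a quick numerical check of the defining relation Av = λv, here is a minimal NumPy sketch (the matrix below is a hypothetical example chosen purely for illustration, not taken from the text):

```python
import numpy as np

# Hypothetical 2x2 matrix used purely for illustration.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
lams, vecs = np.linalg.eig(A)

# Verify the defining relation A v = lambda v for each pair.
for lam, v in zip(lams, vecs.T):
    assert np.allclose(A @ v, lam * v)
```

Each column of the returned matrix pairs with the eigenvalue in the same position of the eigenvalue array.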


Eigenvector Equation

The equation corresponding to each eigenvalue of a matrix can be written as:

AX = λ X

It is formally known as the eigenvector equation.

In place of λ, we put each eigenvalue one by one and get the eigenvector equation which enables us to solve for the eigenvector belonging to each eigenvalue.

For example, suppose a 2×2 matrix A has the two eigenvalues λ1 = 0 and λ2 = 1.

Then, for λ1 = 0:

AX = λ1X  ⇒  AX = O ….. (1)

and for λ2 = 1:

AX = λ2X  ⇒  AX = X  ⇒  (A – I)X = O ….. (2)

Equations (1) and (2) are the eigenvector equations for the given matrix, where I is the identity matrix of the same order as A, O is the zero matrix of the same order as A, and X is the eigenvector, equal to \[\begin{bmatrix} x \\ y \end{bmatrix}\] (as A is of order 2).


How to Find Eigenvector

The following are the steps to find eigenvectors of a matrix:

Step 1: Determine the eigenvalues of the given matrix A using the equation det(A – λI) = 0, where I is the identity matrix of the same order as A. Denote the eigenvalues by λ1, λ2, λ3, …

Step 2: Substitute the value of λ1 into the equation AX = λ1X, or equivalently (A – λ1I)X = O.

Step 3: Calculate the eigenvector X associated with the eigenvalue λ1.

Step 4: Repeat steps 2 and 3 for the other eigenvalues λ2, λ3, … as well.
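The steps above can be sketched numerically. This is a minimal illustration, assuming NumPy; the matrix is hypothetical, and the null space required in steps 2–3 is read off from an SVD:

```python
import numpy as np

# Hypothetical symmetric 2x2 matrix for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = A.shape[0]

# Step 1: eigenvalues are the roots of det(A - lambda*I) = 0.
# For a 2x2 matrix the characteristic polynomial is
# lambda^2 - trace(A)*lambda + det(A).
eigenvalues = np.roots([1.0, -np.trace(A), np.linalg.det(A)])

# Steps 2-4: for each eigenvalue, solve (A - lambda*I) X = O.
# The last row of Vt from the SVD spans the (numerical) null space.
for lam in eigenvalues:
    _, _, Vt = np.linalg.svd(A - lam * np.eye(n))
    X = Vt[-1]                       # eigenvector for this eigenvalue
    assert np.allclose(A @ X, lam * X)
```

Using the SVD to extract the null space is one of several ways to solve (A − λI)X = O; by hand, row operations achieve the same thing.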


Operation on Rows

You might have spent your early school years learning the Fundamentals of Mathematical Operations such as addition, subtraction, multiplication, and division. You need to master these operations all over again in linear algebra. Until now, you might have worked with scalars, but now you need to deal with vectors and matrices.


Generalized Eigenvector

Generalized eigenvectors are not very different from ordinary eigenvectors. They are defined in the following way:

A generalized eigenvector associated with an eigenvalue λ of an n×n matrix A is a nonzero vector X satisfying:

(A − λI)ᵏX = 0

where k is some positive integer.

For k = 1, this reduces to (A − λI)X = 0.

Therefore, every ordinary eigenvector of matrix A is also a generalized eigenvector (with k = 1).
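A small numerical illustration of the k = 2 case, using a hypothetical defective matrix (one whose repeated eigenvalue has only a single ordinary eigenvector):

```python
import numpy as np

# Hypothetical defective matrix: eigenvalue 2 is repeated, but it
# has only one ordinary eigenvector, so a generalized eigenvector
# with k = 2 is needed.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
N = A - lam * np.eye(2)            # (A - lambda*I)

v = np.array([0.0, 1.0])           # candidate generalized eigenvector
assert not np.allclose(N @ v, 0)   # not an ordinary eigenvector (k = 1 fails)
assert np.allclose(N @ N @ v, 0)   # (A - lambda*I)^2 v = 0, so k = 2 works
```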


Eigenvector Orthogonality

A vector quantity possesses magnitude as well as direction. Orthogonality is the property of two eigenvectors of a matrix being at right angles to each other: when two eigenvectors are perpendicular to each other, they are said to be orthogonal eigenvectors.
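For a symmetric matrix this can be checked directly, since eigenvectors belonging to distinct eigenvalues have zero dot product. A minimal NumPy sketch with a hypothetical matrix:

```python
import numpy as np

# Hypothetical symmetric matrix; its eigenvalues are real and its
# eigenvectors (for distinct eigenvalues) are orthogonal.
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, vecs = np.linalg.eig(S)
v1, v2 = vecs[:, 0], vecs[:, 1]

# Perpendicular vectors have zero dot product.
assert abs(np.dot(v1, v2)) < 1e-10
```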


Left Eigenvector

An eigenvector that is represented in the form of a row vector is called a left eigenvector. It satisfies the following condition:

XLA = λXL

where A is a given matrix of order n and λ is one of its eigenvalues. XL is a row vector [x1 x2 ... xn].
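Numerically, a left eigenvector of A can be obtained as an ordinary (right) eigenvector of Aᵀ, since XLA = λXL is equivalent to AᵀXLᵀ = λXLᵀ. A minimal sketch with a hypothetical matrix:

```python
import numpy as np

# Hypothetical matrix for illustration.
A = np.array([[4.0, 6.0],
              [1.0, 5.0]])

# Right eigenvectors of A^T are the left eigenvectors of A.
lams, vecs = np.linalg.eig(A.T)
xL = vecs[:, 0]                     # treat as a row vector of A

# Verify the left-eigenvector condition  xL A = lambda xL.
assert np.allclose(xL @ A, lams[0] * xL)
```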


Right Eigenvector

In the same way as the left eigenvector, the right eigenvector is denoted by XR. It is defined as an eigenvector that is written in the form of a column vector, satisfying the condition given below:

AXR = λXR

where A denotes an n×n square matrix and λ represents one of its eigenvalues.

\[X_R = \begin{bmatrix} x_1 \\  x_2 \\ \vdots \\ x_n \end{bmatrix}\]


Power Method for Eigenvectors

The power method is another way of computing an eigenvector of a matrix. It is an iterative method used in numerical analysis, and it works as follows:

Assume A is a matrix of order n×n with eigenvalues λ1, λ2, …, λn, where λ1 is the dominant eigenvalue (largest in absolute value). Select an initial approximation X0 for a dominant eigenvector of A.

Then

X1 = AX0 ….. (1)

X2 = AX1 = A(AX0) = A²X0 ….. (using equation 1)

Similarly, we have

X3 = A³X0

and, in general,

Xk = AᵏX0
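The iteration above can be sketched in a few lines. This is a minimal illustration, assuming NumPy; the iterate is rescaled at every step so its entries stay bounded, and the Rayleigh quotient estimates the dominant eigenvalue:

```python
import numpy as np

def power_method(A, x0, iterations=50):
    """Iterate X_{k+1} = A X_k, rescaling at each step."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iterations):
        x = A @ x                    # X_{k+1} = A X_k
        x = x / np.linalg.norm(x)    # keep the iterate at unit length
    lam = x @ A @ x                  # Rayleigh quotient for unit x
    return lam, x

# Hypothetical matrix whose dominant eigenvalue is 7.
A = np.array([[4.0, 6.0],
              [1.0, 5.0]])
lam, v = power_method(A, np.array([1.0, 1.0]))
assert np.allclose(A @ v, lam * v)
```

The rescaling does not change the direction of the iterate, so the limit is still a dominant eigenvector; it only prevents overflow or underflow as k grows.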


Solved Examples

1. Evaluate the Eigenvalues for the Following Matrix:

\[A=\begin{bmatrix}4 & 6 \\1 & 5 \end{bmatrix}\]

Solution:

Given,

\[A=\begin{bmatrix}4 & 6 \\1 & 5 \end{bmatrix}\]

Therefore,

The characteristic equation is ∣A − λI∣ = 0:

\[\left | A-\lambda I \right |=\begin{vmatrix}4-\lambda & 6 \\1 & 5-\lambda \end{vmatrix} = 0\]

Expanding the determinant:

(4 - λ)(5 - λ) - 6 = 0

λ² - 9λ + 20 - 6 = 0

λ² - 9λ + 14 = 0

(λ - 7)(λ - 2) = 0

λ = 7 or λ = 2
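The result can be double-checked numerically (a quick NumPy verification, not part of the original worked solution):

```python
import numpy as np

# Verify Example 1: the eigenvalues of A should be 7 and 2.
A = np.array([[4.0, 6.0],
              [1.0, 5.0]])
eigenvalues = np.linalg.eig(A)[0]
assert np.allclose(np.sort(eigenvalues), [2.0, 7.0])
```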


2. Evaluate the Eigenvectors for the Following Matrix:

\[A=\begin{bmatrix}1 & 4 \\-4 & -7 \end{bmatrix}\]

Solution:

\[A=\begin{bmatrix}1 & 4 \\-4 & -7 \end{bmatrix}\]

The characteristic equation is ∣A − λI∣ = 0:

\[\left | A-\lambda I \right |=\begin{vmatrix}1-\lambda & 4 \\-4 & -7-\lambda \end{vmatrix} = 0\]

(1 - λ)(-7 - λ) - 4(-4) = 0

λ² + 6λ + 9 = 0

(λ + 3)² = 0

λ = -3, -3

Using the eigenvector equation with λ = -3:

AX = -3X

(A + 3I)X = O

\[(\begin{bmatrix}1 & 4 \\-4 & -7 \end{bmatrix}+\begin{bmatrix}3 & 0 \\0 & 3 \end{bmatrix})\begin{bmatrix}x \\y \end{bmatrix}=\begin{bmatrix}0 \\0 \end{bmatrix}\]

Which gives:

4x + 4y = 0

or

x + y = 0

Let us set x = k, then y = -k

Therefore, the required eigenvector is:

\[X=\begin{bmatrix}x \\ y\end{bmatrix}=k\begin{bmatrix}1 \\ -1\end{bmatrix}\]
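Again, a quick NumPy check (not part of the original solution) confirms the eigenvector found above:

```python
import numpy as np

# Verify Example 2: lambda = -3 (repeated) with eigenvector k*[1, -1].
A = np.array([[1.0, 4.0],
              [-4.0, -7.0]])
X = np.array([1.0, -1.0])          # the eigenvector found above (k = 1)

assert np.allclose(A @ X, -3.0 * X)                 # A X = -3 X
assert np.allclose((A + 3.0 * np.eye(2)) @ X, 0.0)  # (A + 3I) X = O
```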


Did You know?

Eigenvectors are applicable in many fields in real life. Some of the important ones are illustrated below:

  • Eigenvector decomposition is widely used in Mathematics in order to solve linear equations of the first order, in ranking matrices, in differential calculus etc.

  • Eigenvectors are used in Physics to study the normal modes of oscillation.

  • This concept is widely used in Quantum Mechanics and Atomic and Molecular Physics. In the Hartree-Fock theory, the atomic and molecular orbitals are defined by the eigenvectors of the Fock operator.

  • Eigenvectors are applied in almost all branches of engineering.

  • Eigenvectors and Eigenvalues are used in Geology and the study of glacial till.

  • The vibration analysis of mechanical structures with many degrees of freedom is done using eigenvalue problems. The eigenvalues denote the natural frequencies, also called the eigenfrequencies of vibration. The shapes of the vibration modes are denoted by the eigenvectors.


What is meant by an Eigenvalue? 

A scalar quantity associated with a linear transformation of a vector space is called an eigenvalue. Eigenvalues are the roots of the characteristic equation of the matrix, and they are central to the matrix diagonalization procedure. Equivalently, an eigenvalue is a scalar such that, for some nonzero vector, applying the transformation to that vector yields the same vector multiplied by the scalar.

If 'A' is a k×k square matrix and 'v' is a nonzero vector, then lambda is a scalar quantity such that:

Av = λv

Here, lambda is called an eigenvalue of matrix 'A'.

The following equation can also be written:

(A − λI)v = 0

Where "I" is the same-order identity matrix as A.

FAQs (Frequently Asked Questions)

1. Are all Eigenvectors always Orthogonal?

The eigenvectors of an arbitrary matrix are not always orthogonal. However, for a symmetric matrix, whose eigenvalues are always real, eigenvectors corresponding to distinct eigenvalues are always orthogonal. A matrix A multiplied by its transpose yields a symmetric matrix, whose eigenvectors are therefore orthogonal. Principal component analysis is applied to such a symmetric matrix, hence its eigenvectors will always be orthogonal.

2. Can a Real Matrix have Complex Eigenvectors?

Even if an n×n matrix has all real entries, its eigenvalues are not necessarily real numbers. The eigenvalues are the roots of the characteristic polynomial, and a polynomial with real coefficients need not have real roots. For example, if a matrix has the characteristic polynomial t² + 1, its eigenvalues are imaginary.
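A concrete instance is a 90° rotation matrix, whose characteristic polynomial is t² + 1 (a hypothetical example added for illustration):

```python
import numpy as np

# A real matrix with no real eigenvalues: rotation by 90 degrees.
# Its characteristic polynomial is t^2 + 1, with roots +i and -i.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
eigenvalues = np.linalg.eig(R)[0]

assert np.allclose(eigenvalues.real, 0.0)                  # purely imaginary
assert np.allclose(sorted(eigenvalues.imag), [-1.0, 1.0])  # the pair +/- i
```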

3. What is a left eigenvector?

If an eigenvector is represented in terms of a row vector, it is called a left eigenvector, and the condition that it follows is XLA = λXL, where 'A' is the given matrix of order 'n' and 'λ' is one of the eigenvalues of A. 'XL' denotes a row vector. So, by the above definition, an eigenvector written as a row vector can be called a left eigenvector.

4. What is a right eigenvector?

As left eigenvectors are represented by row vectors, a column vector similarly represents a right eigenvector. The condition followed by the right eigenvector is AXR = λXR, where 'A' is the given matrix of order 'n' and 'λ' is one of the eigenvalues of matrix A. 'XR' represents the column vector. So, when we write an eigenvector in terms of a column vector, it can then be called a right eigenvector.

5. What is the orthogonality of eigenvectors?

The concept of orthogonality is based on two quantities being perpendicular to each other. Similarly, in terms of eigenvectors, when two eigenvectors are perpendicular to each other, they are said to be orthogonal. In other words, when two eigenvectors are inclined to each other at an angle of 90°, they become orthogonal to each other and this principle is known as the orthogonality of eigenvectors. 
