How to prove this theorem? Let \[A\] be an \[n \times n\] matrix and \[\lambda \] an Eigenvalue of \[A\]. Then \[\lambda + \mu \] is an Eigenvalue of the matrix \[M = A + \mu I\] where \[I\] is the \[n \times n\] unit matrix.

Answer
Hint:
Here, we will use the definition of an Eigenvalue of a matrix. We will multiply the given matrix by an Eigenvector of \[A\], and then substitute the Eigenvalue to simplify the equation. Finally, we will use the linearity and identity properties to prove the statement.

Formula Used:
We will use the following Properties:
1) Linearity Property (distributivity of matrix-vector multiplication): \[\left( {A + B} \right)\nu = A\nu + B\nu \] and \[\left( {\lambda + \mu } \right)\nu = \lambda \nu + \mu \nu \]
2) Identity Property: \[AI = A = IA\]

Complete step by step solution:
We are given that \[\lambda \] is an Eigenvalue of the \[n \times n\] matrix \[A\].
We know that by definition, if \[A\nu = \lambda \nu \] for \[\nu \ne 0\], we say that \[\lambda \] is the Eigenvalue for \[\nu \] and \[\nu \] is an Eigenvector for \[\lambda \].
\[A\nu = \lambda \nu \] ……………………………………………………\[\left( 1 \right)\]
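As a quick illustrative aside (a sketch using NumPy with an arbitrary example matrix and vector chosen only for this illustration, not part of the original solution), the defining relation \[A\nu = \lambda \nu \] can be checked numerically:

import numpy as np

# Arbitrary 2x2 example: A has Eigenvalue 3 with Eigenvector (1, 0).
A = np.array([[3.0, 0.0],
              [0.0, 5.0]])
v = np.array([1.0, 0.0])
lam = 3.0

# A @ v should equal lam * v, matching the defining relation in equation (1).
print(np.allclose(A @ v, lam * v))  # prints True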
Now, we will prove that \[\lambda + \mu \] is an Eigenvalue of the given matrix \[M = A + \mu I\] where \[I\] is the \[n \times n\] unit matrix.
Now, let us consider the given matrix
\[M = A + \mu I\]
Multiplying both sides of this equation on the right by the Eigenvector \[\nu \], we get
\[ \Rightarrow M\nu = \left( {A + \mu I} \right)\nu \]
Using the Linearity Property \[\left( {A + B} \right)\nu = A\nu + B\nu \], we get
\[ \Rightarrow M\nu = A\nu + \mu I\nu \]
Now, it is given that \[I\] is the \[n \times n\] unit matrix, so by using the identity property (the unit matrix leaves any vector unchanged, i.e. \[I\nu = \nu \]), we get
\[ \Rightarrow M\nu = A\nu + \mu \nu \]
Now, by using the equation \[\left( 1 \right)\], we get
\[ \Rightarrow M\nu = \lambda \nu + \mu \nu \]
Again, by using the Linearity Property \[\left( {\lambda + \mu } \right)\nu = \lambda \nu + \mu \nu \] to factor out \[\nu \], we get
\[ \Rightarrow M\nu = \left( {\lambda + \mu } \right)\nu \]
So, \[\lambda + \mu \] is an Eigenvalue of the matrix \[M\], and \[\nu \] is a corresponding Eigenvector of \[M\] for this Eigenvalue.

Theorem: Let \[A\] be an \[n \times n\] matrix and \[\lambda \] an Eigenvalue of \[A\]. Then \[\lambda + \mu \] is an Eigenvalue of the matrix \[M = A + \mu I\] where \[I\] is the \[n \times n\] unit matrix.
Hence proved.
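The theorem can also be verified numerically (an illustrative NumPy sketch with an arbitrary matrix \[A\] and shift \[\mu \] chosen only for this example): shifting \[A\] by \[\mu I\] shifts every Eigenvalue by \[\mu \].

import numpy as np

# Arbitrary example: A is upper triangular, so its Eigenvalues (2 and 3) sit on the diagonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
mu = 4.0
M = A + mu * np.eye(2)  # M = A + mu*I

eig_A = np.sort(np.linalg.eigvals(A))
eig_M = np.sort(np.linalg.eigvals(M))

# Each Eigenvalue of M equals the corresponding Eigenvalue of A plus mu.
print(np.allclose(eig_M, eig_A + mu))  # prints True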

Note:
We know that an Eigenvector of a matrix \[A\] is a non-zero vector \[\nu \] in \[{{\bf{R}}^n}\] such that \[A\nu = \lambda \nu \] for some scalar \[\lambda \]. An Eigenvalue of a matrix is a scalar \[\lambda \] such that \[A\nu = \lambda \nu \] has a non-trivial solution. We should note that Eigenvalues and Eigenvectors are defined only for square matrices. Also, an Eigenvector is always non-zero, but an Eigenvalue may be equal to zero. A square matrix is a matrix in which the number of rows equals the number of columns.
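For instance (an illustrative NumPy sketch with a matrix chosen only for this note), a singular square matrix has \[0\] as an Eigenvalue, while its Eigenvectors are still non-zero:

import numpy as np

# A singular 2x2 matrix (determinant 0), so 0 appears among its Eigenvalues.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # contains 0 and 5 (in some order)
print(np.linalg.norm(eigenvectors, axis=0))  # each Eigenvector column has norm 1, so it is non-zero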