Find the inverse of the matrix $\left[ \begin{matrix}
   1 & 0 & 1 \\
   0 & 2 & 3 \\
   1 & 2 & 1 \\
\end{matrix} \right]$ by using elementary column transformation.

Answer
Hint: To solve this question, we will first recall what a matrix is, what elementary column transformations are, and which kinds of column operations we are allowed to apply. Then we will take the given matrix as A and write A = AI. By applying suitable elementary column transformations, we will reduce this equation to the form I = AB, where B will be the inverse of matrix A.

Complete step by step solution:
Before we start solving the question, we must know what a matrix is. A matrix is a rectangular array or table of numbers or expressions arranged in rows and columns. Elementary column transformations are operations performed on the columns of a matrix to transform it into a simpler form so that the calculations become easier. Only the following elementary column transformations can be applied (their standard notation is given after the list):
(i) We can interchange two columns of the matrix.
(ii) We can multiply an entire column by a non-zero number.
(iii) We can add to one column another column multiplied by a non-zero number.
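In the standard notation, which we will use to label each step below, these three operations are written as:
$\left( i \right)\text{ }{{C}_{i}}\leftrightarrow {{C}_{j}},\text{ }\left( ii \right)\text{ }{{C}_{i}}\to k{{C}_{i}}\left( k\ne 0 \right),\text{ }\left( iii \right)\text{ }{{C}_{i}}\to {{C}_{i}}+k{{C}_{j}}$
For example, subtracting the first column from the third column is written as ${{C}_{3}}\to {{C}_{3}}-{{C}_{1}}$, and this is the first operation we will apply.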
Now, let us assume that the matrix given in the question is A. Thus, we have:
$A=\left[ \begin{matrix}
   1 & 0 & 1 \\
   0 & 2 & 3 \\
   1 & 2 & 1 \\
\end{matrix} \right]$
Now, we can also write A as:
$\begin{align}
  & A=AI \\
 & \Rightarrow \left[ \begin{matrix}
   1 & 0 & 1 \\
   0 & 2 & 3 \\
   1 & 2 & 1 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 1 & 0 \\
   0 & 0 & 1 \\
\end{matrix} \right] \\
\end{align}$
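This form is used because every elementary column operation on a matrix is the same as post-multiplying it by a suitable elementary matrix, so if we apply the same column operation to the matrix on the left-hand side and to the identity factor sitting to the right of A, the equality is preserved at every step. For instance, applying ${{C}_{3}}\to {{C}_{3}}-{{C}_{1}}$ to $I$ gives the elementary matrix
$E=\left[ \begin{matrix}
   1 & 0 & -1 \\
   0 & 1 & 0 \\
   0 & 0 & 1 \\
\end{matrix} \right]$
and the product $AE$ is exactly the matrix obtained by applying ${{C}_{3}}\to {{C}_{3}}-{{C}_{1}}$ to $A$.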
Now, we will apply ${{C}_{3}}\to {{C}_{3}}-{{C}_{1}}$, i.e. subtract the first column from the third column, on the matrix on the left-hand side and on the identity factor on the right. Thus, we will get:
$\begin{align}
  & \Rightarrow \left[ \begin{matrix}
   1 & 0 & \left( 1-1 \right) \\
   0 & 2 & \left( 3-0 \right) \\
   1 & 2 & \left( 1-1 \right) \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & 0 & \left( 0-1 \right) \\
   0 & 1 & \left( 0-0 \right) \\
   0 & 0 & \left( 1-0 \right) \\
\end{matrix} \right] \\
 & \Rightarrow \left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 2 & 3 \\
   1 & 2 & 0 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & 0 & -1 \\
   0 & 1 & 0 \\
   0 & 0 & 1 \\
\end{matrix} \right] \\
\end{align}$
Now, we will apply ${{C}_{3}}\to {{C}_{3}}-\dfrac{3}{2}{{C}_{2}}$, i.e. subtract the second column multiplied by $\dfrac{3}{2}$ from the third column. Thus, we will get:
$\begin{align}
  & \Rightarrow \left[ \begin{matrix}
   1 & 0 & \left( 0-\dfrac{3}{2}\times 0 \right) \\
   0 & 2 & \left( 3-\dfrac{3}{2}\left( 2 \right) \right) \\
   1 & 2 & \left( 0-\dfrac{3}{2}\left( 2 \right) \right) \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & 0 & \left( -1-\dfrac{3}{2}\times 0 \right) \\
   0 & 1 & \left( 0-\dfrac{3}{2}\left( 1 \right) \right) \\
   0 & 0 & \left( 1-\dfrac{3}{2}\times 0 \right) \\
\end{matrix} \right] \\
 & \Rightarrow \left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 2 & 0 \\
   1 & 2 & -3 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & 0 & -1 \\
   0 & 1 & \dfrac{-3}{2} \\
   0 & 0 & 1 \\
\end{matrix} \right] \\
\end{align}$
Now, we will apply ${{C}_{2}}\to {{C}_{2}}+\dfrac{2}{3}{{C}_{3}}$, i.e. add the third column multiplied by $\dfrac{2}{3}$ to the second column. Thus, we will get:
$\begin{align}
  & \Rightarrow \left[ \begin{matrix}
   1 & \left( 0+\dfrac{2}{3}\left( 0 \right) \right) & 0 \\
   0 & \left( 2+\dfrac{2}{3}\left( 0 \right) \right) & 0 \\
   1 & \left( 2+\dfrac{2}{3}\left( -3 \right) \right) & -3 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & 0+\dfrac{2}{3}\left( -1 \right) & -1 \\
   0 & 1+\dfrac{2}{3}\left( \dfrac{-3}{2} \right) & \dfrac{-3}{2} \\
   0 & 0+\dfrac{2}{3}\left( 1 \right) & 1 \\
\end{matrix} \right] \\
 & \Rightarrow \left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 2 & 0 \\
   1 & 0 & -3 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1 & \dfrac{-2}{3} & -1 \\
   0 & 0 & \dfrac{-3}{2} \\
   0 & \dfrac{2}{3} & 1 \\
\end{matrix} \right] \\
\end{align}$
Now, we will apply ${{C}_{1}}\to {{C}_{1}}+\dfrac{1}{3}{{C}_{3}}$, i.e. add the third column multiplied by $\dfrac{1}{3}$ to the first column. Thus, we will get:
$\begin{align}
  & \Rightarrow \left[ \begin{matrix}
   1+\left( \dfrac{1}{3}\times 0 \right) & 0 & 0 \\
   0+\left( \dfrac{1}{3}\times 0 \right) & 2 & 0 \\
   1+\left( \dfrac{1}{3}\times \left( -1 \right) \right) & 0 & -3 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   1+\dfrac{1}{3}\left( -1 \right) & \dfrac{-2}{3} & -1 \\
   0+\dfrac{1}{3}\left( \dfrac{-3}{2} \right) & 0 & \dfrac{-3}{2} \\
   0+\dfrac{1}{3}\left( 1 \right) & \dfrac{2}{3} & 1 \\
\end{matrix} \right] \\
 & \Rightarrow \left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 2 & 0 \\
   0 & 0 & -3 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   \dfrac{2}{3} & \dfrac{-2}{3} & -1 \\
   \dfrac{-1}{2} & 0 & \dfrac{-3}{2} \\
   \dfrac{1}{3} & \dfrac{2}{3} & 1 \\
\end{matrix} \right] \\
\end{align}$
Now, we will apply ${{C}_{2}}\to \dfrac{1}{2}{{C}_{2}}$ and ${{C}_{3}}\to -\dfrac{1}{3}{{C}_{3}}$, i.e. divide the second column by 2 and the third column by $-3$. Thus, we will get:
$\begin{align}
  & \Rightarrow \left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 1 & 0 \\
   0 & 0 & 1 \\
\end{matrix} \right]=A\left[ \begin{matrix}
   \dfrac{2}{3} & \dfrac{-1}{3} & \dfrac{1}{3} \\
   \dfrac{-1}{2} & 0 & \dfrac{1}{2} \\
   \dfrac{1}{3} & \dfrac{1}{3} & \dfrac{-1}{3} \\
\end{matrix} \right] \\
 & \Rightarrow I=A\left[ \begin{matrix}
   \dfrac{2}{3} & \dfrac{-1}{3} & \dfrac{1}{3} \\
   \dfrac{-1}{2} & 0 & \dfrac{1}{2} \\
   \dfrac{1}{3} & \dfrac{1}{3} & \dfrac{-1}{3} \\
\end{matrix} \right]............\left( 1 \right) \\
\end{align}$
We know that:
$I=A{{A}^{-1}}..................\left( 2 \right)$
On comparing (1) and (2), we will get:
${{A}^{-1}}=\left[ \begin{matrix}
   \dfrac{2}{3} & \dfrac{-1}{3} & \dfrac{1}{3} \\
   \dfrac{-1}{2} & 0 & \dfrac{1}{2} \\
   \dfrac{1}{3} & \dfrac{1}{3} & \dfrac{-1}{3} \\
\end{matrix} \right]$
Thus, the above matrix is the required inverse matrix.
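As a quick check, which is not part of the required procedure, we can multiply A by the matrix obtained above; the product is the identity matrix, which confirms the answer:
$A{{A}^{-1}}=\left[ \begin{matrix}
   1 & 0 & 1 \\
   0 & 2 & 3 \\
   1 & 2 & 1 \\
\end{matrix} \right]\left[ \begin{matrix}
   \dfrac{2}{3} & \dfrac{-1}{3} & \dfrac{1}{3} \\
   \dfrac{-1}{2} & 0 & \dfrac{1}{2} \\
   \dfrac{1}{3} & \dfrac{1}{3} & \dfrac{-1}{3} \\
\end{matrix} \right]=\left[ \begin{matrix}
   1 & 0 & 0 \\
   0 & 1 & 0 \\
   0 & 0 & 1 \\
\end{matrix} \right]=I$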

Note: One important thing to remember is that the inverse does not exist for every matrix. The inverse exists only for those matrices whose determinant is non-zero. Thus, if we are asked to find the inverse of a matrix whose determinant is zero, then that matrix has no inverse.
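For the matrix in this question, we can confirm in advance that the inverse exists by expanding the determinant along the first row:
$\left| A \right|=1\left( 2\times 1-3\times 2 \right)-0\left( 0\times 1-3\times 1 \right)+1\left( 0\times 2-2\times 1 \right)=-4-0-2=-6$
Since $\left| A \right|=-6\ne 0$, the inverse computed above exists.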