If \[A(\alpha ,\beta ) = \left( {\begin{array}{*{20}{c}}
  {\cos \alpha }&{\sin \alpha }&0 \\
{ - \sin \alpha }&{\cos \alpha }&0 \\
  0&0&{{e^\beta }}
\end{array}} \right)\], then $A{\left( {\alpha ,\beta } \right)^{ - 1}}$ is equal to
A) $A( - \alpha , - \beta )$
B) $A( - \alpha ,\beta )$
C) $A(\alpha , - \beta )$
D) $A(\alpha ,\beta )$

Answer
Hint: First we find the determinant of the given matrix. Then we find its adjoint: we determine the cofactor of each entry, and the adjoint is the transpose of the resulting cofactor matrix. Finally, the inverse of the matrix is the adjoint divided by the determinant.

Formula used: $A{\left( {\alpha ,\beta } \right)^{ - 1}}$= $\dfrac{{adj(A(\alpha ,\beta ))}}{{|A(\alpha ,\beta )|}}$.
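For reference, this formula follows from the identity $A\,adj(A) = |A|\,I$, which holds for any square matrix; when $|A| \ne 0$, dividing the adjoint by the determinant gives the inverse.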

Complete step-by-step answer:
Given, \[A(\alpha ,\beta ) = \left( {\begin{array}{*{20}{c}}
  {\cos \alpha }&{\sin \alpha }&0 \\
{ - \sin \alpha }&{\cos \alpha }&0 \\
  0&0&{{e^\beta }}
\end{array}} \right)\]
$|A(\alpha ,\beta )| = {e^\beta }({\cos ^2}\alpha + {\sin ^2}\alpha ) = {e^\beta }$
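This value comes from expanding the determinant along the third row, whose only non-zero entry is ${e^\beta }$, with cofactor
\[\left| {\begin{array}{*{20}{c}}
  {\cos \alpha }&{\sin \alpha } \\
  { - \sin \alpha }&{\cos \alpha }
\end{array}} \right| = {\cos ^2}\alpha + {\sin ^2}\alpha = 1\]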
Now, $A{\left( {\alpha ,\beta } \right)^{ - 1}}$= $\dfrac{{adj(A(\alpha ,\beta ))}}{{|A(\alpha ,\beta )|}}$
Cofactors of \[A(\alpha ,\beta )\] = $\left( {\begin{array}{*{20}{c}}
  {{e^\beta }\cos \alpha }&{{e^{^\beta }}\sin \alpha }&0 \\
{ - {e^\beta }\sin \alpha }&{{e^\beta }\cos \alpha }&0 \\
  0&0&1
\end{array}} \right)$
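For example, the $(1,1)$ and $(2,1)$ cofactors are
\[{C_{11}} = + \left| {\begin{array}{*{20}{c}}
  {\cos \alpha }&0 \\
  0&{{e^\beta }}
\end{array}} \right| = {e^\beta }\cos \alpha ,\quad {C_{21}} = - \left| {\begin{array}{*{20}{c}}
  {\sin \alpha }&0 \\
  0&{{e^\beta }}
\end{array}} \right| = - {e^\beta }\sin \alpha \]
and the remaining entries follow in the same way.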
We know that the adjoint of a matrix \[A(\alpha ,\beta )\] is the transpose of the cofactor matrix of \[A(\alpha ,\beta )\].
adj (A ($\alpha ,\beta $)) = $\left( {\begin{array}{*{20}{c}}
  {{e^\beta }\cos \alpha }&{ - {e^{^\beta }}\sin \alpha }&0 \\
  {{e^\beta }\sin \alpha }&{{e^\beta }\cos \alpha }&0 \\
  0&0&1
\end{array}} \right)$
$A{\left( {\alpha ,\beta } \right)^{ - 1}}$= $\dfrac{1}{{{e^\beta }}}\left( {\begin{array}{*{20}{c}}
  {{e^\beta }\cos \alpha }&{ - {e^{^\beta }}\sin \alpha }&0 \\
  {{e^\beta }\sin \alpha }&{{e^\beta }\cos \alpha }&0 \\
  0&0&1
\end{array}} \right)$
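Dividing each entry by ${e^\beta }$ gives
\[A{\left( {\alpha ,\beta } \right)^{ - 1}} = \left( {\begin{array}{*{20}{c}}
  {\cos \alpha }&{ - \sin \alpha }&0 \\
  {\sin \alpha }&{\cos \alpha }&0 \\
  0&0&{{e^{ - \beta }}}
\end{array}} \right)\]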
We know that $\cos ( - \alpha ) = \cos \alpha $ and $\sin ( - \alpha ) = - \sin \alpha $.
$A{\left( {\alpha ,\beta } \right)^{ - 1}}$= $\left( {\begin{array}{*{20}{c}}
  {\cos ( - \alpha )}&{\sin ( - \alpha )}&0 \\
{ - \sin ( - \alpha )}&{\cos ( - \alpha )}&0 \\
  0&0&{{e^{ - \beta }}}
\end{array}} \right)$= A ($ - \alpha , - \beta $).
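As a quick check, multiplying the two matrices confirms the answer: the off-diagonal products cancel and each diagonal entry reduces to 1, so
\[A(\alpha ,\beta )\,A( - \alpha , - \beta ) = \left( {\begin{array}{*{20}{c}}
  {{{\cos }^2}\alpha + {{\sin }^2}\alpha }&0&0 \\
  0&{{{\sin }^2}\alpha + {{\cos }^2}\alpha }&0 \\
  0&0&{{e^\beta }{e^{ - \beta }}}
\end{array}} \right) = I\]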

So, option A is the correct option.

Note: A matrix is a rectangular arrangement of numbers into rows and columns; for example, a matrix may have 3 rows and 4 columns. Several operations are defined on matrices. Addition: we can add two matrices, but both matrices must have the same size, i.e., the number of rows and the number of columns must match. Subtraction: subtracting is defined as the addition of a negative matrix, A + (−B). The inverse of a matrix is possible only for a square matrix, and only when its determinant is non-zero.