A matrix which has the same number of rows and columns is called a square matrix. An identity matrix is a square matrix in which the principal diagonal elements are 1 and all other elements are zero. The transpose of a matrix is obtained by interchanging its rows and columns. A square matrix is said to be orthogonal if the product of the matrix and its transpose is equal to an identity matrix of the same order. The condition for an orthogonal matrix is stated below:

A ⋅ A\[^{T}\] = A\[^{T}\] ⋅ A = I

where A is any square matrix of order n x n,

A\[^{T}\] is the transpose of matrix ‘A’

I is the identity matrix of order n x n
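The condition above can be turned into a small numerical check. The sketch below uses NumPy (assumed to be available); the helper name `is_orthogonal` is illustrative, not a standard library function:

```python
import numpy as np

def is_orthogonal(A, tol=1e-10):
    """Return True when A @ A.T == A.T @ A == I within tolerance."""
    A = np.asarray(A, dtype=float)
    if A.ndim != 2 or A.shape[0] != A.shape[1]:
        return False  # an orthogonal matrix must be square
    n = A.shape[0]
    return bool(np.allclose(A @ A.T, np.eye(n), atol=tol)
                and np.allclose(A.T @ A, np.eye(n), atol=tol))

print(is_orthogonal([[0, 1], [1, 0]]))  # True
print(is_orthogonal([[1, 2], [3, 4]]))  # False
```

A tolerance is used because floating-point products rarely equal the identity exactly.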

Orthogonal Matrix Example 2 x 2:

Consider a square matrix of order 2 x 2 given as

A = \[\begin{bmatrix}a & b\\c & d \end{bmatrix}\]

Transpose of this matrix is written by interchanging the rows and columns.

A\[^{T}\] = \[\begin{bmatrix}a & c\\b & d \end{bmatrix}\]

An identity matrix of 2 x 2 is written as:

I = \[\begin{bmatrix}1 & 0\\0 & 1 \end{bmatrix}\]

For an orthogonal matrix example 2 x 2, consider:

A = \[\begin{bmatrix}0 & 1\\1 & 0 \end{bmatrix}\]

The transpose of this matrix is given as:

A\[^{T}\] = \[\begin{bmatrix}0 & 1\\1 & 0 \end{bmatrix}\]

For the matrix to be orthogonal, the condition for the orthogonal matrix should be satisfied.

A ⋅ A\[^{T}\] = A\[^{T}\] ⋅ A = I

A ⋅ A\[^{T}\] = A\[^{T}\] ⋅ A

= \[\begin{bmatrix}0 & 1\\1 & 0 \end{bmatrix}\] x \[\begin{bmatrix}0 & 1\\1 & 0 \end{bmatrix}\]

= \[\begin{bmatrix}0 \times 0 + 1\times1 & 0 \times 1+1\times 0 \\1 \times 0+0\times1 & 1 \times 1 + 0 \times 0 \end{bmatrix}\]

= \[\begin{bmatrix}1 & 0\\0 & 1 \end{bmatrix}\] = I
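The computation above can be verified numerically; a minimal NumPy sketch:

```python
import numpy as np

A = np.array([[0, 1],
              [1, 0]])

product = A @ A.T  # equals A.T @ A as well, since A is symmetric here
print(product)
print(np.array_equal(product, np.eye(2, dtype=int)))  # True
```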

Introduction to Matrices:

A matrix in Mathematics is a representation of numbers, expressions or symbols as a rectangular array in the form of rows and columns. Matrices are used to perform basic mathematical operations in a linear or any other complex system of equations. Various types of matrices include diagonal, triangular, square, identity, symmetric, skew symmetric, invertible, definite and orthogonal matrices. There are several important terms associated with matrices which are the prerequisites to understand the concept of orthogonal matrices and their properties.

Square Matrix: Any matrix that has the same number of rows as columns.

Symmetric Matrix: Any matrix that remains unchanged when its transpose is taken.

Identity Matrix: Any matrix in which the principal diagonal elements are equal to unity and all the other elements are equal to zero.

Multiplication of Matrices: Matrix multiplication is possible only when the number of rows in the second matrix is equal to the number of columns in the first matrix. Let us consider an orthogonal matrix example 3 x 3. It can be multiplied only with a matrix that has exactly three rows, neither more nor fewer, because the number of columns in the first matrix is 3. Matrix multiplication satisfies the associative property. However, matrix multiplication does not satisfy the commutative property of multiplication.
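These multiplication rules are easy to demonstrate. A minimal sketch using NumPy (the matrices chosen are arbitrary illustrations):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[2, 0], [0, 2]])

# Associative: (AB)C == A(BC)
print(np.array_equal((A @ B) @ C, A @ (B @ C)))  # True

# Not commutative in general: AB != BA
print(np.array_equal(A @ B, B @ A))  # False
```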

Diagonal Matrix: A matrix in which all the elements except principal diagonal elements are zero.

Inverse of a Matrix: Inverse of a matrix is obtained when the adjoint of the matrix is divided by its determinant.

Determinant of a Matrix: Determinant is a special number that is calculated in case of square matrices.

Orthogonal Matrix Properties:

By definition, orthogonal matrices are square matrices of order n x n.

All the elements of any orthogonal matrix are real in nature.

Orthogonal matrices need not be symmetric. (A symmetric matrix is a square matrix whose transpose equals the matrix itself; a rotation matrix, for instance, is orthogonal but not symmetric.)

Identity matrix of any order m x m is an orthogonal matrix.

When two orthogonal matrices are multiplied, the product thus obtained is also an orthogonal matrix.

The set of all orthogonal matrices of the same order forms a group called the orthogonal group.

The transpose of an orthogonal matrix is also orthogonal.

An orthogonal matrix of any order has its inverse also as an orthogonal matrix.

A diagonal matrix is orthogonal only when each of its principal diagonal entries is +1 or -1; a general diagonal matrix is not orthogonal.

All the orthogonal matrices of any order n x n have the value of their determinant equal to ±1.

Every eigenvalue of an orthogonal matrix has absolute value 1; the real eigenvalues are ±1, but complex eigenvalues (as for a rotation matrix) also occur. Eigenvectors corresponding to distinct eigenvalues are orthogonal to each other.
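Several of the properties listed above can be verified numerically. The following sketch (using NumPy; the angle 0.7 and the matrices chosen are illustrative) checks the determinant, the transpose condition, closure under multiplication, and the eigenvalue property:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])  # a rotation-type orthogonal matrix

# Determinant is +1 or -1
print(round(np.linalg.det(Q), 10))  # 1.0

# The transpose (which equals the inverse) is also orthogonal
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# The product of two orthogonal matrices is orthogonal
P = np.array([[0.0, 1.0], [1.0, 0.0]])
R = Q @ P
print(np.allclose(R @ R.T, np.eye(2)))  # True

# Every eigenvalue has absolute value 1
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0))  # True
```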

Orthogonal Matrix Example 2 x 2

Consider a 2 x 2 matrix defined by ‘A’ as shown below. Analyze whether the given matrix A is an orthogonal matrix or not.

A = \[\begin{bmatrix}\cos x & \sin x\\-\sin x & \cos x \end{bmatrix}\]

Solution:

A determinant of ±1 is a necessary condition for an orthogonal matrix, but it is not sufficient on its own. We therefore verify the condition A ⋅ A\[^{T}\] = I directly:

A ⋅ A\[^{T}\] = \[\begin{bmatrix}\cos x & \sin x\\-\sin x & \cos x \end{bmatrix}\] x \[\begin{bmatrix}\cos x & -\sin x\\ \sin x & \cos x \end{bmatrix}\]

= \[\begin{bmatrix}\cos^{2} x + \sin^{2} x & -\cos x \sin x + \sin x \cos x\\ -\sin x \cos x + \cos x \sin x & \sin^{2} x + \cos^{2} x \end{bmatrix}\]

= \[\begin{bmatrix}1 & 0\\0 & 1 \end{bmatrix}\] = I (using the trigonometric identity \[\cos^{2} x + \sin^{2} x = 1\])

Therefore, the given matrix is an orthogonal matrix.
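The same check can be carried out numerically for a sample angle (a minimal NumPy sketch; the angle 1.2 is arbitrary):

```python
import numpy as np

x = 1.2  # any sample angle
A = np.array([[np.cos(x), np.sin(x)],
              [-np.sin(x), np.cos(x)]])

print(np.allclose(A @ A.T, np.eye(2)))  # True
print(np.allclose(A.T @ A, np.eye(2)))  # True
```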

Orthogonal Matrix Example 3 x 3:

Sumanth says that the 3 x 3 matrix ‘P’ given below is not orthogonal. Without performing detailed calculations, state reasons to check whether Sumanth is correct.

P = \[\begin{bmatrix} 5 & 0 & 0\\ 0 & -3 & 0\\ 0 & 0 & 7 \end{bmatrix}\]

Solution:

The matrix ‘P’ is a diagonal matrix, so its transpose is the same as matrix P itself.

The product P ⋅ P\[^{T}\] is then a diagonal matrix with entries 25, 9 and 49, which is not the identity matrix. A diagonal matrix is orthogonal only when every principal diagonal entry is +1 or -1.

Therefore, from the properties of orthogonal matrices, the given matrix P is not an orthogonal matrix, and Sumanth is correct.
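This conclusion is easy to confirm numerically (a NumPy sketch with P as defined above):

```python
import numpy as np

P = np.diag([5.0, -3.0, 7.0])

print(P @ P.T)  # a diagonal matrix with entries 25, 9, 49: not the identity
print(np.allclose(P @ P.T, np.eye(3)))  # False
```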

Fun Quiz:

An orthogonal matrix example 3 x 3 is multiplied by its transpose. Which of the following statements is true in this case?

Transpose of the matrix is equal to a 3 x 3 identity matrix.

The product of transpose and inverse is a matrix of order 3 x 3 with all the elements except principal diagonal elements equal to 1.

Multiplication of a matrix and its transpose satisfies commutative law of multiplication.

Which of the following is not a property of the orthogonal matrix?

Determinant is always equal to ±1

Transpose of an orthogonal matrix is also orthogonal

Inverse is not equal to transpose

FAQ (Frequently Asked Questions)

1. Why are Orthogonal Matrices Important?

Orthogonal matrices find their importance in various calculations of Physics and Mathematics. They preserve the dot product of vectors under linear transformations. Orthogonal matrices also act as isometries of Euclidean space. Euclidean space is a two dimensional or three dimensional space in which Euclid’s axioms and postulates are valid. A few examples of such isometries are reflection, rotation and rotoreflection. Orthogonal matrices are also used in representing and computing unitary transformations. The group of orthogonal matrices is used to describe rotations and reflections. Hence orthogonal matrices are very important in both Physics and Mathematics.

2. How do You Know Whether the Matrix is Orthogonal or Not?

Matrices are generally a representation of an array of numbers, expressions or symbols in the form of rows and columns. There are several facts about orthogonal matrices which can be used to determine whether a given matrix is orthogonal or not.

In an orthogonal matrix, the number of rows and columns should be equal. However, not all square matrices are orthogonal.

All identity matrices are orthogonal. However, all orthogonal matrices need not be identity matrices.

The transpose of an orthogonal matrix is equal to its inverse; orthogonal matrices need not be symmetric.

Every orthogonal matrix has determinant equal to ±1, but a determinant of ±1 alone does not make a matrix orthogonal.

All square matrices which satisfy the condition for orthogonal matrices are orthogonal.

A ⋅ A\[^{T}\] = A\[^{T}\] ⋅ A = I
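These checks can be combined in a short numerical sketch (using NumPy; the shear matrix M is an illustrative counterexample showing that a determinant of ±1 alone does not guarantee orthogonality):

```python
import numpy as np

M = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # a shear matrix

print(round(np.linalg.det(M), 6))       # 1.0, so the determinant test passes
print(np.allclose(M @ M.T, np.eye(2)))  # False, so M is not orthogonal
```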