
Algebra of Matrices

Last updated date: 22nd Jul 2024

What is a Matrix?

A matrix is a rectangular array of numbers arranged in rows and columns, usually enclosed in square brackets. The horizontal lines of entries are called rows and the vertical lines are called columns. The size of a matrix is determined by its number of rows and columns: a matrix with m rows and n columns is called an m × n matrix, or an m by n matrix, and m and n are its dimensions.
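
To make the row/column bookkeeping concrete, here is a minimal Python sketch (the entries are arbitrary example values) that stores a matrix as a list of rows:

```python
# A matrix stored as a list of rows: this one has m = 2 rows and n = 3 columns,
# so its order (dimension) is 2 x 3.
A = [
    [1, 2, 3],
    [4, 5, 6],
]

m = len(A)     # number of rows (horizontal lines of entries)
n = len(A[0])  # number of columns (vertical lines of entries)
print(f"{m} x {n} matrix")  # -> 2 x 3 matrix
```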

Matrix algebra is the branch of mathematics that deals with operations on matrices and the linear maps between vector spaces that matrices represent. It grew out of the study of systems of linear equations and transformations of coordinate space.

Types of Matrices

  • Singular Matrix

A matrix is called singular if and only if its determinant equals zero. For example, if we take a matrix A whose first column consists entirely of zeros, its determinant is automatically zero.

By the rules and properties of determinants, the determinant of such a matrix is zero, so matrix A is definitely singular. A singular matrix is non-invertible, which means its inverse does not exist.
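
The zero-column example can be checked numerically. Below is a small Python sketch (the nonzero entries are arbitrary) that computes a determinant by cofactor expansion and confirms it vanishes when the first column is all zeros:

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    total = 0
    for j, a in enumerate(M[0]):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * a * det(minor)
    return total

# The first column of A is all zeros, so det(A) = 0 and A is singular.
A = [
    [0, 2, 3],
    [0, 5, 6],
    [0, 8, 9],
]
print(det(A))  # -> 0
```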

  • Non Singular Matrix

A non-singular matrix is defined as a square matrix whose determinant is not equal to zero; equivalently, it is a square matrix that has an inverse. Non-singular matrices are sometimes also known as regular matrices.

Thus, a square matrix is non-singular if and only if its determinant is nonzero. For example, the $3\times 3$ identity matrix $\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$ has determinant 1, so it is non-singular.
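
Conversely, a concrete non-singular case can be verified with the rule of Sarrus for $3\times 3$ determinants (the matrix below is an arbitrary example):

```python
def det3(M):
    """Determinant of a 3 x 3 matrix via the rule of Sarrus."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[2, 0, 1],
     [1, 3, 0],
     [0, 1, 4]]
print(det3(A))  # -> 25, nonzero, so A is non-singular and has an inverse
```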

There are many other types of matrices we come across in the algebra of matrices, and the method of solving problems varies from one matrix type to another.

  • Invertible Matrices

A matrix is a pattern of numbers arranged in rows and columns. The number of rows and columns in a matrix is called its dimension and is written $m\times n$, where m and n represent the number of rows and columns, respectively. You can perform basic operations on matrices, such as addition, subtraction, and multiplication. Now, let us discuss the inverse of a matrix.

The $n\times n$  dimensional matrix A is said to be invertible if and only if there is another matrix B of the same dimension such that AB = BA = I, where I is the identity matrix of the same order. B is known as the inverse of matrix A.

The inverse of matrix A is represented by the symbol $A^{-1}$. An invertible matrix is also called a non-singular matrix or a non-degenerate matrix.

For example, consider the matrices $A=\begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix}$ and $B=\begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}$.

Multiplying the matrices A and B, we get:

$AB=\begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}=I$

Similarly, multiplying the matrices B and A, we get:

$BA=\begin{bmatrix} 1 & -1 \\ -1 & 2 \end{bmatrix}\begin{bmatrix} 2 & 1 \\ 1 & 1 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}=I$

Therefore, AB = BA = I. Thus, $A^{-1}=B$: the inverse of matrix A is B, and vice versa.
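
This check is easy to automate. The Python sketch below verifies AB = BA = I for a pair of example matrices chosen so that the products work out to the identity (such a pair is, of course, not unique):

```python
def matmul(X, Y):
    """Multiply an m x k matrix X by a k x n matrix Y."""
    return [[sum(X[i][t] * Y[t][j] for t in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 1], [1, 1]]
B = [[1, -1], [-1, 2]]
I = [[1, 0], [0, 1]]

# Both products equal the identity, so B is the inverse of A (and vice versa).
print(matmul(A, B) == I and matmul(B, A) == I)  # -> True
```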

Invertible Matrix Theorem

Theorem 1:

Statement: If there exists an inverse of a square matrix, it is always unique.


Let us consider A to be a square matrix of order $n\times n$, and assume that both B and C are inverses of matrix A.

Then, we know that according to the invertible matrix condition, B will be inverse of A if and only if:

$\Rightarrow AB = BA = I$

Similarly, C will be the inverse of A if and only if:

$\Rightarrow AC = CA = I$

Now, using I = AC and the associativity of matrix multiplication, we know that:

$\Rightarrow B = BI$

$\Rightarrow B =B(AC)$

$\Rightarrow B = (BA)C = IC$

$\Rightarrow B = C$

Hence, it is proven that B = C, or in other words we can say B and C are the same matrices or identical matrices.

Theorem 2:

Statement: If A and B are invertible matrices of the same order, then $\left ( AB \right )^{-1}=B^{-1}A^{-1}$.


Now, we have to prove that $\left ( AB \right )^{-1}=B^{-1}A^{-1}$.

Now, according to the definition of the invertible matrix, we know that:

$\Rightarrow (AB)(AB)^{-1}=(AB)^{-1}(AB)=I$

Multiplying by $A^{-1}$ on the left of both sides:

$\Rightarrow A^{-1}(AB)(AB)^{-1}=A^{-1}I$

$\Rightarrow (A^{-1}A)B(AB)^{-1}=A^{-1}$

$\Rightarrow B(AB)^{-1}=A^{-1}$

Multiplying by $B^{-1}$ on the left of both sides:

$\Rightarrow B^{-1}B(AB)^{-1}=B^{-1}A^{-1}$

$\Rightarrow \left ( AB \right )^{-1}=B^{-1}A^{-1}$

Hence the proof.
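
The reversal law can also be confirmed numerically. Here is a minimal Python sketch using exact `Fraction` arithmetic to avoid rounding error (the two matrices are arbitrary invertible examples):

```python
from fractions import Fraction

def matmul(X, Y):
    """Multiply two 2 x 2 matrices."""
    return [[sum(X[i][t] * Y[t][j] for t in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2 x 2 matrix via the adjugate formula."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c  # must be nonzero for the inverse to exist
    return [[ d / det, -b / det],
            [-c / det,  a / det]]

A = [[Fraction(2), Fraction(1)], [Fraction(1), Fraction(1)]]
B = [[Fraction(1), Fraction(2)], [Fraction(3), Fraction(4)]]

# (AB)^{-1} equals B^{-1} A^{-1} -- note the reversed order.
print(inv2(matmul(A, B)) == matmul(inv2(B), inv2(A)))  # -> True
```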

Matrices and Linear Algebra

Linear algebra is the core of almost every field of mathematics. For example, linear algebra is the basis of advanced geometry, including the definition of basic objects such as lines, planes, and rotations. Furthermore, functional analysis is a branch of mathematical analysis, which can be considered as the application of linear algebra in functional space.

Linear algebra is also used in most fields of advanced science and engineering because it allows many natural phenomena to be modeled, and computations to be carried out efficiently with such models. For nonlinear systems that cannot be modeled directly, linear algebra is generally used for first-order approximations, using the fact that the differential of a multivariate function at a point is the linear map that best approximates the function near that point.

Matrix Algebra

Matrix algebra involves operations on matrices, such as addition, subtraction, and multiplication. Let's better understand matrix operations.

Matrix Addition/Subtraction:

  • Two matrices can be added/subtracted if and only if the number of rows and columns of the two matrices is the same, or the order of the matrices is the same.

  • For addition/subtraction, each element of the first matrix is added to/subtracted from the corresponding element of the second matrix.

  • For example, adding the two matrices $A=\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $B=\begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}$ gives $A+B=\begin{bmatrix} 6 & 8 \\ 10 & 12 \end{bmatrix}$.
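
Entry-wise addition is straightforward to write out. A short Python sketch (with arbitrary $2\times 2$ example matrices):

```python
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# Entry-wise addition: both matrices must have the same order.
C = [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]
print(C)  # -> [[6, 8], [10, 12]]
```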

Matrix Multiplication

Matrices can be multiplied in two ways:

  • Scalar multiplication: this involves multiplying a matrix by a scalar. Each element of the matrix is multiplied by the scalar to form a new matrix.


  • Multiplying with another matrix: two matrices can be multiplied if the number of columns in the first matrix equals the number of rows in the second matrix. Let A and B be two $2\times 2$ matrices; then their product AB is the $2\times 2$ matrix whose entry in row i and column j is the dot product of row i of A with column j of B.
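
Both kinds of multiplication can be sketched in a few lines of Python (the matrices and the scalar 3 are arbitrary examples):

```python
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

# Scalar multiplication: every entry is scaled by the same number.
kA = [[3 * a for a in row] for row in A]

# Matrix multiplication: entry (i, j) is the dot product of row i of A
# with column j of B.
AB = [[sum(A[i][t] * B[t][j] for t in range(2)) for j in range(2)]
      for i in range(2)]

print(kA)  # -> [[3, 6], [9, 12]]
print(AB)  # -> [[19, 22], [43, 50]]
```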

Did You Know?

Two matrices of different sizes can be multiplied as long as the number of columns in the first matrix equals the number of rows in the second. The result of the multiplication, called the product, is another matrix with the same number of rows as the first and the same number of columns as the second.
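
For instance, a $2\times 3$ matrix times a $3\times 2$ matrix is defined, and the product is $2\times 2$. A minimal Python sketch with arbitrary example entries:

```python
A = [[1, 2, 3],
     [4, 5, 6]]   # 2 x 3
B = [[1, 0],
     [0, 1],
     [1, 1]]      # 3 x 2

# Columns of A (3) match rows of B (3), so the product exists,
# and its shape is rows(A) x cols(B) = 2 x 2.
C = [[sum(A[i][t] * B[t][j] for t in range(len(B))) for j in range(len(B[0]))]
     for i in range(len(A))]
print(len(C), len(C[0]))  # -> 2 2
```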

Practice Question MCQs

1. How many elements will there be in the matrix if the order of the matrix is $m\times n$?

  1. 2mn

  2. m+n

  3. m×n

  4. mn²

Answer: C)

2. Which of the following is not true for matrix multiplication?

  1. Commutative Property

  2. Multiplicative Property

  3. Distributive Property

  4. Associative Property

Answer: A)


Matrices are an effective tool for representing, manipulating, and studying linear maps between finite-dimensional vector spaces (once a basis has been chosen). Matrices can also represent quadratic forms (for example, the Hessian matrix is used in analysis to examine the behavior of critical points). Matrices are extremely useful in 3D geometry (e.g., computer graphics): a single 4×4 matrix can describe several transformations at once (translation, rotation, scaling, and perspective or orthographic projection). In the broadest sense, matrices (and vectors, a very important special case of matrices) allow you to generalize from single-variable equations to equations in many variables. Some of the rules change along the way, which emphasizes the significance of knowing about matrices.


FAQs on Algebra of Matrices

1. How do we perform matrix operations in algebra?

  • Perform normal matrix operations (addition, subtraction, multiplication, etc.). 

  • Determine the rank of a matrix (and thereby distinguish between singular and non-singular matrices). 

  • Find the determinant and inverse of any square matrix. 

  • Solve simultaneous linear equations using matrix algebra.
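
As an illustration of the last point, here is a minimal Python sketch that solves the hypothetical system 2x + y = 5, x + 3y = 5 by inverting its coefficient matrix (exact `Fraction` arithmetic avoids rounding):

```python
from fractions import Fraction

# Write the system  2x + y = 5,  x + 3y = 5  as A [x, y] = b,
# then compute [x, y] = A^{-1} b for the 2 x 2 coefficient matrix A.
A = [[Fraction(2), Fraction(1)],
     [Fraction(1), Fraction(3)]]
b = [Fraction(5), Fraction(5)]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # 2*3 - 1*1 = 5, nonzero
inv = [[ A[1][1] / det, -A[0][1] / det],
       [-A[1][0] / det,  A[0][0] / det]]
x = [inv[0][0] * b[0] + inv[0][1] * b[1],
     inv[1][0] * b[0] + inv[1][1] * b[1]]
print(x[0], x[1])  # -> 2 1, i.e. x = 2, y = 1
```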

2. What are the matrices in linear algebra?

A matrix (plural: matrices) is a rectangular array of numbers, symbols, or expressions arranged in rows and columns. Matrices can be used to describe and manipulate systems of linear equations.

3. What are the five types of algebra?

There are five different types of algebra: elementary algebra, abstract algebra, advanced algebra, commutative algebra, and linear algebra. All of these branches have various formulae, applications, and purposes for determining variable values.