Tensor analysis is the branch of mathematics concerned with relationships and laws that remain valid regardless of the coordinate system in which the quantities are specified. Such relationships are usually called covariant. Tensors were originally developed to give a formal way of studying and manipulating the geometric quantities that arise in the analysis of curves and surfaces; in this sense they are an extension of vectors. Gregorio Ricci-Curbastro, together with his student Tullio Levi-Civita, was the first to develop tensor analysis in a form useful to physicists, which is why tensor calculus is also known as Ricci calculus. Albert Einstein later used it to formulate his general theory of relativity.
Before going deeper into tensor analysis, we need a proper introduction to vectors. A vector is any quantity having both magnitude and direction. It is represented by an arrow and obeys the parallelogram law of addition. A vector has a different set of components in every coordinate system, and when the coordinate system changes, the components change according to a mathematical transformation law.
This transformation has two important properties. First, vector relationships hold irrespective of the coordinate system. Second, after a sequence of coordinate changes that returns to the original coordinates, the components return to their starting values. The law treats all coordinates on an equal footing, so in n dimensions a vector has n components.
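As a minimal sketch of the first property (the numbers and the 45-degree rotation are our own illustration, not from the text), rotating the coordinate axes changes a vector's components but leaves coordinate-free quantities such as its length unchanged:

```python
import numpy as np

# Rotate 2-D coordinates by 45 degrees: the components transform,
# but the vector's magnitude is the same in both systems.
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([3.0, 4.0])   # components in the original coordinates
v_new = R @ v              # components in the rotated coordinates

print(np.linalg.norm(v))     # 5.0
print(np.linalg.norm(v_new)) # also 5.0 -- the relationship is coordinate-free
```

Rotating back by the inverse matrix recovers the original components exactly, which is the second property above.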
Tensor Analysis Overview
Now that you have an introduction to vector and tensor analysis, we can move on to its applications. A tensor can be defined as any entity whose components change according to a transformation law; this law is a more general version of the vector transformation law, but it has the same two properties mentioned above. Each tensor component is denoted by a letter carrying subscripts and superscripts, with the coordinates numbered from 1 to n.
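The generalized law can be sketched numerically (the matrix T and the angle are our own example). For a rank-2 tensor with two indices, each index picks up one factor of the transformation matrix, so under an orthogonal change of coordinates R the components transform as T' = R T Rᵀ, and scalar quantities built from the tensor stay fixed:

```python
import numpy as np

# An orthogonal change of coordinates (rotation by 30 degrees).
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# A rank-2 tensor: two indices, so two factors of R in the law
# T'_{ij} = R_{ik} R_{jl} T_{kl}, i.e. T' = R @ T @ R.T here.
T = np.array([[2.0, 1.0],
              [1.0, 3.0]])
T_new = R @ T @ R.T

# Scalars formed from the tensor (here, the trace) are invariant:
print(np.trace(T), np.trace(T_new))  # both 5.0
```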
Later we will see vector and tensor analysis with applications in more depth. Scalars and vectors are special cases of tensors: a vector has n components in each coordinate system, while a scalar has only one. No pictorial representation is needed, because if a linear tensor equation is proved valid in one coordinate system, it is valid in all of them, and therefore expresses an objective relationship that is free of any particular coordinate system.
Types of Tensors
Two types of tensors are of particular interest in tensor analysis: the metric tensor and the curvature tensor. Using a metric tensor, one can convert the components of a vector into the vector's magnitude. Let a vector V have components V1 and V2 in a simple two-dimensional plane with perpendicular coordinates. Then the magnitude of V is √(V1² + V2²). The 1s and 0s in this expression are not usually written, but writing it out in full as √(1·V1² + 0·V1V2 + 0·V2V1 + 1·V2²) makes the entire set of components of the metric tensor, (1, 0, 0, 1), visible. With oblique coordinates, the same expression generalizes to a quadratic form with different coefficients.
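This can be sketched directly (the 60-degree oblique basis and the component values are our own illustration): the magnitude formula |V| = √(g_ij V^i V^j) reduces to the Pythagorean result when the metric is (1, 0, 0, 1), and the same contraction still gives the true length when the basis is oblique.

```python
import numpy as np

# In perpendicular (orthonormal) coordinates the metric is the identity.
g_cart = np.eye(2)

# Oblique coordinates: basis vectors meeting at 60 degrees produce
# off-diagonal metric components g_ij = e_i . e_j.
e1 = np.array([1.0, 0.0])
e2 = np.array([np.cos(np.pi / 3), np.sin(np.pi / 3)])
g_oblique = np.array([[e1 @ e1, e1 @ e2],
                      [e2 @ e1, e2 @ e2]])

def magnitude(g, v):
    """|v| = sqrt(g_ij v^i v^j), a contraction with the metric."""
    return float(np.sqrt(v @ g @ v))

v = np.array([3.0, 4.0])
print(magnitude(g_cart, v))   # 5.0, the familiar Pythagorean result

# The same components in the oblique basis describe the vector 3*e1 + 4*e2;
# the metric formula reproduces its true Euclidean length.
w = 3 * e1 + 4 * e2
print(np.linalg.norm(w))
print(magnitude(g_oblique, v))  # same value, recovered via the metric
```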
The new coefficients become the components of the metric tensor in the oblique coordinates. The curvature tensor is a much more complicated tensor, constructed from the metric tensor itself; it describes the intrinsic curvature of the n-dimensional space in which it is defined. With the help of tensor calculus, many equations of physics can be written in a form that is independent of any coordinate system. This contrasts with ordinary infinitesimal calculus, whose formulas are typically tied to a particular coordinate system.
Application of Tensors
You have already learned in the introduction that Einstein used tensor analysis to derive the theory of relativity. Tensors have broad applications in physics and in mathematical geometry: the mathematical description of electromagnetism, for example, is formulated with tensors. Vector analysis acts as a primer for tensor analysis and relativity. Elasticity, quantum theory, machine learning, mechanics, and relativity all make use of tensors.
Did You Know?
A vector can be decomposed into an Einstein sum (a sum over a repeated index), which represents a contraction of tensors.
Every vector can be represented in two ways: with covariant components on a contravariant basis, or with contravariant components on a covariant basis.
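The two representations can be sketched with a non-orthogonal basis (the basis vectors and components here are our own example): the contravariant components are the expansion coefficients in the ordinary basis, the covariant components are the projections onto it, and each set reconstructs the same vector when paired with the other kind of basis.

```python
import numpy as np

# A non-orthogonal 2-D basis (columns of E).
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])
E = np.column_stack([e1, e2])

v = np.array([2.0, 3.0])          # the vector in ordinary Cartesian terms

# Contravariant components: coefficients in v = v^1 e1 + v^2 e2.
v_contra = np.linalg.solve(E, v)

# Covariant components: projections v_i = v . e_i.
v_co = np.array([v @ e1, v @ e2])

# Dual (contravariant) basis e^i, satisfying e^i . e_j = delta^i_j.
E_dual = np.linalg.inv(E).T

# Both expansions reconstruct the same geometric vector:
print(E @ v_contra)               # [2. 3.]
print(E_dual @ v_co)              # [2. 3.]
```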
A metric tensor can be written as a matrix of scalar elements. By contraction with the metric tensor, one can lower or raise an index on other tensors, converting contravariant components into covariant ones and back.
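A short sketch of index lowering and raising (the particular metric below is a hypothetical example, chosen symmetric and positive-definite): contracting with g_ij lowers the index, v_i = g_ij v^j, and contracting with the inverse metric g^ij raises it back.

```python
import numpy as np

# A hypothetical metric tensor as a matrix of scalar elements.
g = np.array([[1.0, 0.5],
              [0.5, 2.0]])
g_inv = np.linalg.inv(g)          # the inverse metric g^ij

v_up = np.array([1.0, 2.0])       # contravariant components v^i
v_down = g @ v_up                 # lowering: v_i = g_ij v^j
v_back = g_inv @ v_down           # raising recovers the original components

print(np.allclose(v_back, v_up))  # True
```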