Measurement is essential to our understanding of the external world. Over millions of years, we have developed a sense of measurement. Measurements require tools, which provide scientists with a quantity. The problem is that the result of every measurement made with any instrument contains some uncertainty, and this uncertainty is called an error.
Accuracy and precision are two essential factors to consider when taking any measurement. These terms describe how close a measurement is to a known or accepted value. Let us look at precision, accuracy, and the differences between them in detail.
Accuracy is the ability of an instrument to measure the true value; in other words, it is the closeness of the measured value to a true or standard value. Accuracy is improved by taking readings in small increments, which reduces calculation error. The accuracy of a system is classified into the three types given below.
Point Accuracy

The accuracy of an instrument at only one particular point on its scale is called point accuracy. It is important to note that this accuracy gives no information about the general accuracy of the instrument.
Accuracy as the Percentage of Scale Range
Here, accuracy is defined as a percentage of the instrument's uniform scale range. We can understand this better with the help of the example below:
Consider a thermometer with a scale range up to 500°C. If the thermometer has an accuracy of ±0.5 percent of the scale range, an increase or decrease of up to ±0.5 percent in the reading is negligible. But if the reading deviates by more than this limit, it is considered a high-value error.
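Under this interpretation, a percentage-of-full-scale accuracy fixes the same maximum absolute error everywhere on the scale. A minimal sketch (the helper name is our own; the 500°C range and ±0.5 percent figure come from the example above):

```python
def full_scale_error(scale_range, accuracy_percent):
    """Maximum absolute error implied by an accuracy quoted as a
    percentage of the instrument's full scale range."""
    return scale_range * accuracy_percent / 100.0

# ±0.5 percent of a 500 °C full-scale range:
max_error = full_scale_error(500.0, 0.5)  # 2.5 °C either side of any reading
```

The allowed error is the same at every point on the scale, which is what distinguishes this specification from point accuracy.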
Accuracy as the Percentage of True Value
This type of accuracy is defined by comparing the measured value with its true value. Deviations of the instrument of up to ±0.5 percent of the true value can be neglected.
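This check can be sketched in a few lines of Python (the function name and the sample reading of 99.8 are illustrative assumptions, not from the text; only the ±0.5 percent band is):

```python
def percent_error(measured, true_value):
    """Deviation of a measurement expressed as a percentage of the true value."""
    return abs(measured - true_value) / abs(true_value) * 100.0

# A reading of 99.8 against a true value of 100.0 deviates by about 0.2 percent,
# which falls inside the ±0.5 percent band that can be neglected.
within_band = percent_error(99.8, 100.0) <= 0.5
```

Note that, unlike the percentage-of-scale-range case, here the tolerance scales with the true value being measured rather than with the instrument's full range.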
The closeness of two or more measurements to each other is called precision. If a given substance is weighed five times and the result is 3.2 kg every time, the measurement is very precise but not necessarily accurate. Precision is independent of accuracy. The example discussed below explains how a measurement can be precise but not accurate, and vice versa. Precision can be separated into the following two types.
Repeatability: the variation that arises when conditions are kept identical and repeated measurements are taken over a short period of time.
Reproducibility: the variation that arises when the same measurement process is used by different operators, with different instruments, over longer periods of time.
Example of Accuracy and Precision
There are many real-life examples that illustrate the concepts of accuracy and precision. Let us discuss one of them.
Let us go through a good analogy to understand the difference between precision and accuracy. Imagine a football player shooting at the goal. If the player shoots the ball into the goal, he is said to be accurate. A player who keeps striking the same goalpost again and again is precise but not accurate.
Thus, a football player can be accurate without being precise if his shots land all over the place but still go in. A precise player will hit the same spot repeatedly, whether or not he scores. A player who is both accurate and precise will aim at a single spot and score every time.
(Image to be added soon)
In the image, the top-left target shows hits with both high accuracy and high precision. The top-right target shows high accuracy but low precision. The bottom-left target shows high precision but low accuracy. Finally, the bottom-right target shows both low accuracy and low precision.
Difference Between Accuracy and Precision in Tabular Form
The difference between accuracy and precision is tabulated below: