Difference Between Accuracy and Precision

Measurement is essential to our understanding of the external world, and over millions of years we have developed a sense for it. Measurements require instruments, which provide scientists with a quantity. The problem is that the result of every measurement made with any instrument contains some uncertainty, and this uncertainty is called an error.

Accuracy and precision are two of the most important factors to consider when taking any measurement. Both terms describe how close a measurement is to a known or accepted value. Let us look at precision, accuracy, and the differences between them in detail.


Accuracy

Accuracy is the ability of an instrument to measure a value close to the true value; in other words, it is the closeness of the measured value to a true or standard value. Accuracy is improved by taking small readings, since small readings reduce the calculation error. The accuracy of a system can be classified into three types, as given below.

Point Accuracy

Point accuracy is the accuracy of an instrument only at a particular point on its scale. It is important to note that this accuracy gives no information about the general accuracy of the instrument.

Accuracy as the Percentage of Scale Range

Here, accuracy is defined as a percentage of the instrument's uniform scale range. The following example makes this clearer:

Consider a thermometer with a scale range up to 500 °C and a quoted accuracy of ±0.5 percent of the scale range. An increase or decrease of up to ±0.5 percent of the full-scale value is considered negligible, but any deviation larger than that is treated as a significant error.

Accuracy as the Percentage of True Value

Here, the accuracy of the instrument is defined by comparing the measured value with its true value. Deviations of up to ±0.5 percent of the true value can be neglected.
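The two percentage-based definitions above can be contrasted in a short sketch. The 0.5 percent figure and the 500-unit range follow the thermometer example; the sample readings are assumed values for illustration.

```python
# Error band when accuracy is quoted as a percentage of full scale
# versus a percentage of the true value.

FULL_SCALE = 500.0   # scale range of the instrument (assumed units)
ACCURACY_PCT = 0.5   # quoted accuracy, in percent

def error_band_full_scale(full_scale, pct):
    """Allowed error when accuracy is a percentage of the scale range.
    The band is the same size everywhere on the scale."""
    return full_scale * pct / 100.0

def error_band_true_value(true_value, pct):
    """Allowed error when accuracy is a percentage of the true value.
    The band shrinks for small readings and grows for large ones."""
    return abs(true_value) * pct / 100.0

print(error_band_full_scale(FULL_SCALE, ACCURACY_PCT))  # 2.5 units anywhere on the scale
print(error_band_true_value(50.0, ACCURACY_PCT))        # 0.25 units at a reading of 50
print(error_band_true_value(400.0, ACCURACY_PCT))       # 2.0 units at a reading of 400
```

Note that for a full-scale specification the permitted error of 2.5 units is large relative to a small reading of 50, which is why the percentage-of-true-value specification is stricter at the low end of the scale.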


Precision

Precision is the closeness of two or more measurements to each other. If a given substance is weighed five times and the result is 3.2 kg every time, the measurement is very precise but not necessarily accurate. Precision is independent of accuracy. The example discussed below explains how a measurement can be precise but not accurate, and vice versa. Precision can be separated into the following two types.


Repeatability

Repeatability is the variation that arises when conditions are kept identical and repeated measurements are taken within a short period of time.


Reproducibility

Reproducibility is the variation that arises when the same measurement process is used by different operators, with different instruments, over longer periods of time.
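The weighing example above can be made concrete: precision corresponds to the spread of repeated readings, while accuracy corresponds to their offset from the true value. The 3.2 kg readings echo the example; the true mass of 3.0 kg is an assumed value for illustration.

```python
# Precision as the spread (standard deviation) of repeated readings,
# accuracy as the offset of their mean from the true value.
import statistics

readings = [3.2, 3.2, 3.2, 3.2, 3.2]  # five weighings, identical results (kg)
true_mass = 3.0                        # assumed true value (kg)

spread = statistics.pstdev(readings)               # zero spread -> highly precise
bias = abs(statistics.mean(readings) - true_mass)  # about 0.2 kg -> not accurate

print(f"precision (standard deviation): {spread} kg")
print(f"accuracy (offset from true value): {bias:.1f} kg")
```

A spread of zero with a nonzero bias is exactly the "precise but not accurate" case described above.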


Example of Accuracy and Precision

There are many real-life examples that illustrate the concepts of accuracy and precision. Let us discuss one of them.

A good analogy for understanding the difference between precision and accuracy is a football player shooting at the goal. If the player shoots the ball into the goal, he is said to be accurate. A player who keeps striking the same goalpost, on the other hand, is precise but not accurate.

Thus, a player can be accurate without being precise if his shots land all over the goal but still go in. A precise player will hit the same spot repeatedly, whether or not he scores. A player who is both accurate and precise will aim at a single spot and also score the goal.

(Image to be added soon)

In the image, the top-left target shows hits with both high precision and high accuracy. The top-right target shows hits with high accuracy but low precision. The bottom-left target shows hits with high precision but low accuracy. Finally, the bottom-right target shows hits with both low accuracy and low precision.
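The four cases in the image can be labelled programmatically: accuracy depends on how far the mean of the hits is from the target, and precision depends on how scattered the hits are. This is a sketch; the tolerance of 0.5 and the sample readings are arbitrary assumptions chosen for illustration.

```python
# Label a set of readings as high/low accuracy and high/low precision,
# mirroring the four targets described above.
import statistics

def classify(readings, true_value, tolerance=0.5):
    """Return an 'accuracy, precision' label for a set of readings."""
    bias = abs(statistics.mean(readings) - true_value)  # distance from the target
    spread = statistics.pstdev(readings)                # scatter of the hits
    accuracy = "high accuracy" if bias <= tolerance else "low accuracy"
    precision = "high precision" if spread <= tolerance else "low precision"
    return f"{accuracy}, {precision}"

true_value = 10.0
print(classify([10.1, 9.9, 10.0, 10.05], true_value))   # tight and centred on the target
print(classify([12.0, 12.1, 11.9, 12.05], true_value))  # tight but offset from the target
print(classify([8.0, 12.0, 9.5, 10.5], true_value))     # centred on average but scattered
```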

Difference Between Accuracy and Precision in Tabular Form

The difference between accuracy and precision is tabulated below:

Accuracy: The level of agreement between a measured value and the true or accepted value.
Precision: The level of variation among the values of several measurements of the same quantity.

Accuracy: Shows how closely the results agree with the standard value.
Precision: Shows how closely the results agree with one another.

Accuracy: Can be assessed from a single measurement.
Precision: Requires multiple measurements.

Accuracy: A measurement can occasionally be accurate by chance, but to be accurate consistently it must also be precise.
Precision: Results can be precise without being accurate; they can also be both accurate and precise.

Accuracy: Not dependent on precision.
Precision: Not dependent on accuracy.

FAQs (Frequently Asked Questions)

1. Give an Example of Accuracy and Precision?

Let us discuss a simple, real-life example of accuracy and precision.

If a thermometer reads 28 °C when the actual outside temperature is 28 °C, the measurement is said to be accurate. If the thermometer also registers the same temperature consistently over several days, the measurement is said to be precise as well.

2. How to Determine the Percent Error?

Percent error can be determined with the following steps.

Step 1 - Take the difference between the accepted and experimental values.

Step 2 - Take the absolute value of the result of step 1.

Step 3 - Divide the result of step 2 by the accepted value.

Step 4 - Multiply the result of step 3 by 100.

Step 5 - Finally, add the percent (%) symbol to express the answer as a percentage.
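The five steps above can be written as a small function. The variable names and sample values are assumptions for illustration; the formula itself is the standard percent-error formula.

```python
def percent_error(experimental, accepted):
    """Percent error of an experimental value against the accepted value."""
    # Steps 1-2: absolute difference between the accepted and experimental values
    difference = abs(accepted - experimental)
    # Step 3: divide by the accepted value
    fraction = difference / accepted
    # Step 4: multiply by 100 to express the result as a percentage
    return fraction * 100

# Step 5: append the % symbol when displaying the result
print(f"{percent_error(9.6, 10.0):.1f}%")  # prints 4.0%
```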

3. What is the Difference Between Accuracy and Precision of Measurements?

Accuracy - The closeness of a single measurement to its true value.

Precision - The closeness of a set of values obtained from identical measurements of a quantity.