
What is the standard deviation of the standard normal distribution?

Answer
Hint: To solve this question we need basic concepts of statistics and probability. The standard deviation measures how much the values of a data set deviate from their mean, and the standard normal distribution is the normal distribution with mean $0$ and variance $1$.

Complete step-by-step answer:
The question asks us to find the standard deviation of the standard normal distribution. In probability theory, the normal distribution is a continuous probability distribution for a real-valued random variable. The standard deviation is a measure of the amount of variation in a set of values: a low standard deviation indicates that the values tend to be close to the mean (the expected value) of the set, while a high standard deviation indicates that the values are spread out over a wider range.
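The contrast between low and high standard deviation can be seen numerically. The following is a minimal sketch (the data values are made up for illustration, and NumPy is assumed to be available): two small data sets share the same mean, but the one with widely scattered values has a much larger standard deviation.

```python
import numpy as np

# Both data sets have mean 10, but differ in how far values stray from it.
tight = np.array([9.8, 10.0, 10.2, 9.9, 10.1])   # values close to the mean
wide  = np.array([2.0, 18.0, 5.0, 15.0, 10.0])   # values spread out widely

print(tight.mean(), tight.std())  # mean 10.0, standard deviation ~0.14 (low spread)
print(wide.mean(),  wide.std())   # mean 10.0, standard deviation ~5.97 (high spread)
```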
Now consider the standard deviation in the case of the standard normal distribution. For the standard normal distribution, the mean of the data is zero and the variance is $1$. The standard deviation is defined as the square root of the variance, so applying this formula we get:
$\Rightarrow \text{standard deviation = }\sqrt{\text{variance}}$
On substituting the value we get:
$\Rightarrow \text{standard deviation = }\sqrt{1}$
Square root of $1$ is $1$.
$\Rightarrow \text{standard deviation = 1}$
$\therefore $ The standard deviation of the standard normal distribution is $1$.
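This result can also be checked empirically. The sketch below (assuming NumPy is available) draws a large random sample from the standard normal distribution and confirms that the sample mean is close to $0$ and the sample standard deviation is close to $1$.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.standard_normal(1_000_000)  # draws from the standard normal distribution

print(round(sample.mean(), 3))  # close to 0, the mean of the standard normal
print(round(sample.std(), 3))   # close to 1 = sqrt(variance) = sqrt(1)
```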

Note: In mathematical texts, the standard deviation is usually denoted by the lowercase Greek letter sigma, written $\sigma$. An important property of the standard deviation is that, unlike the variance, it has the same unit as the data.