## Introduction

Probability is an interesting topic of Mathematics that deals with the outcome of a random event. The term "probability" refers to the chance or possibility of a particular result; it describes how likely a specific event is to occur. We frequently use phrases like 'it will most likely rain today', 'he will most likely pass the test', 'there is very little chance of a storm tonight', and 'most likely the price of onions will rise again'. In all of these statements, we use the word probability in place of words like chance, doubt, maybe, and likely. Probability is the ability to forecast an event based on historical data or on the number and types of possible outcomes.

### Important Terms in Probability

Experiment

Random Experiment

Sample Space

Event

Elementary Event

Compound Event

Trial

Conditional Probability

Independent Events

Partition of a Sample Space

Bayes’ Theorem

Total Probability Theorem

### Different Types of Events

Complementary Event: For every event A, there corresponds another event A’, called the complementary event to A. It is also called the event ‘not A’.

For example, take the experiment ‘of tossing three coins’. An associated sample space is

S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}

Let A = {HTH, HHT, THH} be the event ‘only one tail appears’.

Thus, the complementary event ‘not A’ to the event A is

A’ = {HHH, HTT, THT, TTH, TTT}

Thus, $A'=\left\{ \omega :\omega \in S \text{ and } \omega \notin A\right\}= S-A$

$\Rightarrow A'$ is also denoted by $A^{c}$ or $\overline{A}$.

The Event ‘A or B’: When the sets A and B are two events associated with a sample space, then $A\cup B$ is the event ‘either A or B or both’ also sometimes called ‘A or B’.

The Event ‘A and B’: If A and B are two events, then the set $A\cap B$ denotes the event ‘A and B’.

The Event ‘A but not B’: The set $A-B$ denotes the event ‘A but not B’. Also, $A-B$ is the same as $A\cap B'$ or $A-\left ( A\cap B \right )$.
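These event operations are just set operations, so they can be checked directly with Python's built-in `set` type. The sketch below reuses the three-coin sample space from the complementary-event example; event B ('exactly two tails') is an assumption added for illustration.

```python
# Sample space for tossing three coins (from the complementary-event example)
S = {"HHH", "HHT", "HTH", "THH", "HTT", "THT", "TTH", "TTT"}
A = {"HTH", "HHT", "THH"}   # 'exactly one tail appears'
B = {"HTT", "THT", "TTH"}   # 'exactly two tails appear' (illustrative)

A_complement = S - A        # the event 'not A'
A_or_B = A | B              # the event 'A or B'
A_and_B = A & B             # the event 'A and B'
A_but_not_B = A - B         # the event 'A but not B'

print(sorted(A_complement))
print(A_and_B)              # empty set: these A and B are mutually exclusive
```

Note that since no outcome has exactly one tail and exactly two tails at once, these particular A and B are mutually exclusive, so 'A and B' comes out empty.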

Equally Likely Events: Two or more events are said to be equally likely if, after taking into consideration all relevant evidence, none can be expected in preference to another. Simple events connected with a random experiment are always equally likely, but compound events may or may not be. Thus, the simple events 'Head' and 'Tail' connected with the random experiment of tossing an unbiased coin are equally likely, as are the six simple events 'one', 'two', ..., 'six' connected with the random experiment of throwing an unbiased die. Compound events A and B are equally likely when P(A) = P(B).

Exhaustive Events: Consider the experiment of throwing a die. Then the sample space is

S = {1, 2, 3, 4, 5, 6}. Define the following events:

A : a number less than 3 appears,

B : a number greater than 2 but less than 5 appears,

C : a number greater than 4 appears

Then A = {1, 2}, B = {3, 4}, and C = {5, 6}

$\Rightarrow A\cup B\cup C =$ {1, 2, 3, 4, 5, 6} = S

In general, if $E_{1},\, E_{2}, ..., \,E_{n}$ are n events of a sample space S and if $E_{1}\cup E_{2}\cup .....\cup E_{n}= \bigcup_{i=1}^{n} E_{i}=S$, then $E_{1},\, E_{2}, ..., \,E_{n}$ are called exhaustive events.

Mutually Exclusive Events: If A and B are two events such that $A\cap B=\phi$, then the events are said to be mutually exclusive.

### Axiomatic Approach to Probability

Probability is the measure of the likelihood that an event will occur, quantified as a number between 0 and 1. The axiomatic approach is a way of describing the probability of an event.

In this approach, some axioms or rules are depicted to assign probabilities.

For any event E, $0\leq P(E)\leq 1$

P(S) = 1

If E and F are mutually exclusive events, then $P\left ( E\cup F \right )=P\left ( E \right )+P\left ( F \right )$

If $S=\left\{ \omega _{1},\, \omega _{2},\,...,\,\omega _{n}\right\}$, then $P\left ( \omega _{1}\right )+P\left ( \omega _{2}\right )+... + P\left ( \omega _{n}\right )=1$

Let S be a sample space and E be an event, with n(S) = n and n(E) = m. If each outcome is equally likely, then it follows that

$P(E)=\frac{1}{n}+\frac{1}{n}+....+\frac{1}{n} \left (\text{ m times }\right )=\frac{m}{n}$

### Mathematical or Classical Definition of Probability

In the axiomatic approach to probability, we learned that

$\Rightarrow P(E)=\frac{1}{n}+\frac{1}{n}+....+\frac{1}{n} \left (\text{ m times} \right )=\frac{m}{n}$

$=\frac{n(E)}{n(S)}=\frac{\text{Number of elements in E}}{\text{Number of elements in S}}$

$=\frac{\text{Number of cases favourable to event E}}{\text{Total number of cases}}$

If E is any event and E’ is the complement of event E, then $P(E)+P(E')=1$.

For example, when two coins are tossed, sample space S = {HH, HT, TH, TT} and let E be the event of occurrence of at least one tail. Then E = {HT, TH, TT}.

$\therefore P(E)=\frac{n(E)}{n(S)}=\frac{3}{4}$
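As a sketch, the same two-coin calculation can be done by enumerating the sample space in Python and applying the classical ratio n(E)/n(S):

```python
from itertools import product
from fractions import Fraction

# Enumerate the sample space for tossing two coins
S = list(product("HT", repeat=2))   # ('H','H'), ('H','T'), ('T','H'), ('T','T')
E = [outcome for outcome in S if "T" in outcome]  # at least one tail

P_E = Fraction(len(E), len(S))      # n(E) / n(S)
print(P_E)                          # 3/4
```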

### Addition Theorem of Probability

$P\left ( A\cup B \right )=P(A)+P(B)-P\left ( A\cap B \right )$

If A and B are mutually exclusive events, then $A\cap B=\phi$ and hence $P(A\cap B)=0$.

$P\left ( A\cup B \right )=P(A)+P(B)$

If A, B, and C are any three events in a sample space S, then

$P(A\cup B\cup C)=P(A)+P(B)+P(C)-P(A\cap B)-P(B\cap C)-P(A\cap C)+ P(A\cap B\cap C)$

If A, B, and C are mutually exclusive events,

$\therefore P(A\cup B\cup C)=P(A)+P(B)+P(C)$
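A numerical check of the three-event formula on one throw of a die, with three overlapping events chosen here purely for illustration:

```python
from fractions import Fraction

S = set(range(1, 7))                 # one throw of a fair die

def P(event):
    # Classical probability on the equally likely sample space S
    return Fraction(len(event & S), len(S))

A = {1, 2, 3}                        # 'at most 3'
B = {2, 4, 6}                        # 'even'
C = {5, 6}                           # 'greater than 4'

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(B & C) - P(A & C)
       + P(A & B & C))
print(lhs, rhs)                      # both equal 1 here, since A∪B∪C = S
```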

If A and B are any two events, then $\left ( A-B\right )\cap \left ( A\cap B \right )=\phi$ and $A=\left ( A-B \right )\cup \left ( A\cap B \right )$. Hence, $P(A)=P(A-B)+P(A\cap B)$, i.e., $P(A-B)=P(A)-P(A\cap B)$.

### Conditional Probability

Conditional Probability is a measure of the probability of an event given that another event has occurred. If the event of interest is A and the event B is known or assumed to have occurred, “the conditional probability of A given B”, or “the probability of A under the condition B”, is usually written as $P(A|B)$, $P(A/B)$, or $P\left ( \frac{A}{B} \right )$.

$\Rightarrow P(A/B)=\frac{n(A\cap B)}{n(B)}=\frac{P(A\cap B)}{P(B)},\,\ P(B)\neq 0$

### Multiplication Theorem of Probability

We know that the conditional probability of event A given that B has occurred is denoted by P(A/B) and is determined by

$\Rightarrow P(A/B)=\frac{n(A\cap B)}{n(B)},\,\ P(B)\neq 0$

$\therefore P(A\cap B)=P(A/B) . P(B)$ ……………. (i)

$\Rightarrow P(B/A)=\frac{n(B\cap A)}{n(A)},\,\ P(A)\neq 0$

$\therefore P(A\cap B)=P(B/A) . P(A)$ ……………. (ii)

From eq (i) and (ii), we get

$\therefore P(A\cap B)=P(A/B) . P(B)= P(B/A) . P(A)$. If $P(A)\neq 0,\,\ P(B)\neq 0$
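A quick check that both factorisations give the same $P(A\cap B)$, using two illustrative die events (the events are assumptions for this sketch):

```python
from fractions import Fraction

S = set(range(1, 7))                 # one throw of a fair die
A = {2, 4, 6}                        # 'even'
B = {4, 5, 6}                        # 'greater than 3'

def P(event):
    return Fraction(len(event), len(S))

P_A_given_B = Fraction(len(A & B), len(B))   # n(A∩B)/n(B)
P_B_given_A = Fraction(len(A & B), len(A))   # n(B∩A)/n(A)

# Both forms of the multiplication theorem agree
assert P(A & B) == P_A_given_B * P(B) == P_B_given_A * P(A)
print(P(A & B))                              # 1/3
```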

### Total Probability Theorem

Let $\left\{A_{1}, \,\ A_{2},..., A_{n} \right\}$ be a partition of the sample space S, and suppose that each of the events $A_{1},\, A_{2},\,...,A_{n}$ has non-zero probability of occurrence.

Let A be any event associated with S. Then

$\Rightarrow P(A)= P(A_{1})P\left ( A/A_{1} \right )+ P(A_{2})P\left ( A/A_{2} \right )+....+P(A_{n})P\left ( A/A_{n} \right )$

$= \sum_{i=1}^{n}P(A_{i}). P(A/A_{i})$

This is called the total probability theorem.
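As a minimal sketch of the theorem, consider a hypothetical two-box setup (the box contents below are assumptions chosen for illustration): a box is picked uniformly at random, then a ball is drawn from it.

```python
from fractions import Fraction

P_A1 = Fraction(1, 2)               # choose box 1
P_A2 = Fraction(1, 2)               # choose box 2
P_white_given_A1 = Fraction(3, 5)   # box 1: 3 white balls out of 5
P_white_given_A2 = Fraction(2, 5)   # box 2: 2 white balls out of 5

# Total probability theorem: P(A) = sum over the partition of P(Ai) P(A|Ai)
P_white = P_A1 * P_white_given_A1 + P_A2 * P_white_given_A2
print(P_white)                      # 1/2
```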

### Bayes’ Theorem

Let $A_{1},\, A_{2},\,...,A_{n}$ be n non-empty events which constitute a partition of the sample space S, i.e., $A_{1},\, A_{2},\,...,A_{n}$ are pairwise disjoint and $A_{1}\cup A_{2}\cup ....\cup A_{n} = S$. If A is any event of non-zero probability, then

$\Rightarrow P(A_{i}/A)=\frac{P(A_{i})\,\ P(A/A_{i})}{\sum_{j=1}^{n}P(A_{j})\,\ P(A/A_{j})}$
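As a sketch, take a hypothetical two-box setup (contents assumed for illustration): each box is chosen with probability 1/2, box 1 holds 3 white balls out of 5, and box 2 holds 2 out of 5. Bayes' theorem then inverts the conditioning to give the probability that box 1 was chosen, given that a white ball was drawn:

```python
from fractions import Fraction

priors = [Fraction(1, 2), Fraction(1, 2)]        # P(A1), P(A2)
likelihoods = [Fraction(3, 5), Fraction(2, 5)]   # P(white | Ai)

# Denominator is the total probability of drawing white
total = sum(p * l for p, l in zip(priors, likelihoods))
posterior_box1 = priors[0] * likelihoods[0] / total  # P(A1 | white)
print(posterior_box1)                            # 3/5
```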

### Bernoulli Trial and Binomial Distribution

The Bernoulli trial is also known as a binomial trial where only two outcomes of a given experiment are possible. We call these outcomes “success” and “failure”.

Characteristics of Bernoulli trial are as follows:

It can have only two outcomes that can be labelled as success and failure.

Probabilities of success and failure remain unchanged through each trial.

The trials are independent of each other.

Number of trials is fixed.

If p is the probability of success, then the probability of failure is $q= 1-p$.

Consider an experiment consisting of n Bernoulli trials, in which the probability of success in each trial is p.

Then the probability of exactly r successes is given by the $(r+1)^{th}$ term in the expansion of $(q+p)^{n}$

$\Rightarrow P(X=r)=\,\ ^{n}C_{r}\,\ p^{r}\,\,q^{n-r}$

Or $P(X=r)=\,\ ^{n}C_{n-r}\,\ p^{r}\,\,q^{n-r}$

where $p+q=1$ and r = 0, 1,2, 3, ……n

In the experiment, probability of

At least ‘r’ successes, $P(X\geq r)=\sum_{\lambda =r}^{n} \,\ ^{n}C_{\lambda }\,\ p^{\lambda }\,\,q^{n-\lambda }$

At most ‘r’ successes, $P(X\leq r)=\sum_{\lambda =0}^{r} \,\ ^{n}C_{\lambda }\,\ p^{\lambda }\,\,q^{n-\lambda }$
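These formulas can be sketched using `math.comb` for the binomial coefficients; the five-toss fair-coin example below is an illustration, not from the text:

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, r, p):
    """P(X = r) for n Bernoulli trials with success probability p."""
    return comb(n, r) * p**r * (1 - p)**(n - r)

n, p = 5, Fraction(1, 2)                     # five fair-coin tosses

# At least 2 heads: sum the pmf from r = 2 up to n
P_at_least_2 = sum(binom_pmf(n, r, p) for r in range(2, n + 1))
# At most 2 heads: sum the pmf from r = 0 up to 2
P_at_most_2 = sum(binom_pmf(n, r, p) for r in range(0, 3))
print(P_at_least_2)                          # 13/16
print(P_at_most_2)                           # 1/2
```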

### Solved Examples

Example 1: A box B1 contains 1 white ball, 3 red balls, and 2 black balls. Another box B2 contains 2 white balls, 3 red balls, and 4 black balls. A third box B3 contains 3 white balls, 4 red balls, and 5 black balls.

1. If a ball is drawn from each of the boxes B1, B2, and B3, the probability that all 3 drawn balls are of the same colour is

82/648

90/648

558/648

566/648

Solution: (a)

Here, required probability P = P( all are white) + P(all are red) + P(all are black)

$= \frac{1}{6}\times \frac{2}{9}\times \frac{3}{12}+\frac{3}{6}\times \frac{3}{9}\times \frac{4}{12}+\frac{2}{6}\times \frac{4}{9}\times \frac{5}{12}$

$=\frac{6}{648}+\frac{36}{648}+\frac{40}{648}$

$=\frac{82}{648}$
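The arithmetic above can be verified exactly with Python's `fractions` module:

```python
from fractions import Fraction

# Ball counts (white, red, black) in boxes B1, B2, B3
boxes = [(1, 3, 2), (2, 3, 4), (3, 4, 5)]
totals = [sum(box) for box in boxes]         # 6, 9, 12

# Sum over the three colours of the product of per-box probabilities
P_same_colour = sum(
    Fraction(boxes[0][c], totals[0])
    * Fraction(boxes[1][c], totals[1])
    * Fraction(boxes[2][c], totals[2])
    for c in range(3)                        # colour index: white, red, black
)
print(P_same_colour)                         # 82/648 in lowest terms, i.e. 41/324
```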

2. If 2 balls are drawn (without replacement) from a randomly selected box and one of the balls is white and the other ball is red, the probability that these 2 balls are drawn from box B2 is

116/182

126/181

65/181

55/181

Solution: (d)

Let A : one ball is white and the other is red.

$E_{1}$ : both balls are from box $B_{1}$ ,

$E_{2}$ : both balls are from box $B_{2}$,

$E_{3}$ : both balls are from box $B_{3}$

Here, P(required) = $P\left ( \frac{E_{2}}{A} \right )$

$=\frac{P\left ( \frac{A}{E_{2}} \right ). P(E_{2})}{P\left ( \frac{A}{E_{1}} \right ). P(E_{1})+P\left ( \frac{A}{E_{2}} \right ). P(E_{2})+P\left ( \frac{A}{E_{3}} \right ). P(E_{3})}$

$=\frac{\,\ \frac{\,\ ^{2}C_{1}\times ^{3}C_{1}}{\,\ ^{9}C_{2}}\times \frac{1}{3} }{\,\ \frac{\,\ ^{1}C_{1}\times ^{3}C_{1}}{\,\ ^{6}C_{2}}\times \frac{1}{3}+\,\ \frac{\,\ ^{2}C_{1}\times ^{3}C_{1}}{\,\ ^{9}C_{2}}\times \frac{1}{3}+\,\ \frac{\,\ ^{3}C_{1}\times ^{4}C_{1}}{\,\ ^{12}C_{2}}\times \frac{1}{3}}$

$=\frac{\frac{1}{6}}{\frac{1}{5}+\frac{1}{6}+\frac{2}{11}}$

$=\frac{55}{181}$
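The Bayes computation above can be checked exactly in Python, with the box contents as given in the problem:

```python
from fractions import Fraction
from math import comb

# (white, red, total) counts for boxes B1, B2, B3
boxes = [(1, 3, 6), (2, 3, 9), (3, 4, 12)]

def p_one_white_one_red(w, r, total):
    # P(A | Ei): two balls drawn without replacement, one white and one red
    return Fraction(comb(w, 1) * comb(r, 1), comb(total, 2))

likelihoods = [p_one_white_one_red(*box) for box in boxes]
prior = Fraction(1, 3)                       # each box equally likely

# Bayes' theorem: P(E2 | A)
posterior_B2 = (prior * likelihoods[1]) / sum(prior * l for l in likelihoods)
print(posterior_B2)                          # 55/181
```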

### Solved Problems of Previous Year Question

1. Four persons independently solve a certain problem correctly with probabilities $\frac{1}{2}, \,\ \frac{3}{4}, \,\ \frac{1}{4}, \,\ \frac{1}{8}.$ Then the probability that the problem is solved correctly by at least one of them is

$\frac{235}{256}$

$\frac{21}{256}$

$\frac{3}{256}$

$\frac{253}{256}$

Ans: (a)

P(at least one of them solves correctly) = 1 - P(none of them solves correctly)

$= 1 - \left ( \frac{1}{2}\times \frac{1}{4}\times \frac{3}{4}\times \frac{7}{8} \right )= \frac{235}{256}$

Hence, option (a) is the correct answer.
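A quick exact check of this calculation:

```python
from fractions import Fraction

solve_probs = [Fraction(1, 2), Fraction(3, 4), Fraction(1, 4), Fraction(1, 8)]

# P(at least one solves) = 1 - product of the individual failure probabilities
P_none = Fraction(1)
for p in solve_probs:
    P_none *= 1 - p
P_at_least_one = 1 - P_none
print(P_at_least_one)                        # 235/256
```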

2. Let E and F be two independent events. The probability that exactly one of them occurs is 11/25 and the probability that none of them occurs is 2/25. If P(T) denotes the probability of occurrence of the event T, then

P(E) = ⅘ , P(F) = ⅗

P(E) = ⅕, P(F) = ⅖

P(E) = ⅖ , P(F) = ⅕

P(E) = ⅗ , P(F) = ⅘

Answer: a, d

Let P(E) = x and P(F) = y

$\Rightarrow P(E\cup F)-P(E\cap F)= \frac{11}{25}$

$\Rightarrow x + y -2xy =\frac{11}{25}$ …………(i)

And $P\left ( \overline{E}\cap \overline{F} \right )=\frac{2}{25}$

$\Rightarrow (1-x)(1-y)=\frac{2}{25}$

$\Rightarrow 1-x-y+xy = \frac{2}{25}$ ……………(ii)

From eq (i) and (ii),

$\Rightarrow xy = \frac{12}{25}$ and $x + y = \frac{7}{5}$

On solving, we get,

$\Rightarrow x =\frac{4}{5},\,\ y = \frac{3}{5}\,\ or\,\ x =\frac{3}{5},\,\ y = \frac{4}{5}$

Hence, option (a) and (d) are correct.
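The last step can also be done programmatically: since x and y are the roots of $t^{2}-(x+y)t+xy=0$, a sketch is:

```python
from fractions import Fraction

s, prod = Fraction(7, 5), Fraction(12, 25)   # x + y and xy from the derivation
disc = s * s - 4 * prod                      # 49/25 - 48/25 = 1/25
root = Fraction(1, 5)                        # exact square root of 1/25
assert root * root == disc                   # confirm the root is exact

x, y = (s + root) / 2, (s - root) / 2        # quadratic formula
print(x, y)                                  # 4/5 3/5
```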

3. The minimum number of times a fair coin needs to be tossed so that the probability of getting at least two heads is at least 0.96 is …….

Ans: 8

Let the coin be tossed n times,

P(at least two heads) = 1 - P(no heads) - P(exactly one head)

P(at least two heads) $=1 - \left ( \frac{1}{2} \right )^{n}-\,\ ^{n}C_{1}.\left ( \frac{1}{2} \right )^{n}\geq 0.96$

$\Rightarrow \frac{4}{100}\geq \frac{n+1}{2^{n}}$

$\Rightarrow \frac{2^{n}}{n+1}\geq 25$

Therefore, the least value of n is 8.
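The minimal n can also be found by direct search:

```python
from fractions import Fraction

def p_at_least_two_heads(n):
    # 1 - P(no heads) - P(exactly one head) for n tosses of a fair coin
    return 1 - Fraction(1, 2**n) - Fraction(n, 2**n)

n = 1
while p_at_least_two_heads(n) < Fraction(96, 100):
    n += 1
print(n)                                     # 8
```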

### Practice Question

1. A box contains tickets numbered from 1 to 20. Three tickets are drawn from the box with replacement. The probability that the largest number on the tickets is 7 is

$1-\left ( \frac{19}{20} \right )^{5}$

$\left ( \frac{19}{20} \right )^{5}$

$\left ( \frac{3}{4} \right )^{5}$

$90\left ( \frac{1}{4} \right )^{5}$

Answer: 1. (d)

2. The probability that a 50-year-old man will be alive at 60 is 0.83 and the probability that a 45-year-old woman will be alive at 55 is 0.87. Then

The probability that both will be alive is 0.7221.

At least one of them will be alive is 0.9779.

At least one of them will be alive is 0.8230.

The probability that both will be alive is 0.6320.

Answer: (a), (b)

### Conclusion

In this chapter, we have elaborated on concepts and solutions to questions on the topic of Probability. Everything you're looking for is available in a single location. Students can carefully read through the concepts, definitions, and questions in the PDFs, which are also free to download, and understand the concepts used to solve these questions. This will be extremely beneficial to the students in their exams.

## FAQs on JEE - Probability

1. What is the probability of an event?

The probability of an event is the measure of the chance of occurrence of that event. An event that is sure to occur is called a certain event, and an event that has no chance of occurrence is called an impossible event. The probability of any event ranges from 0 to 1, zero being the minimum and one being the maximum value. The probabilities of all elementary events in a sample space sum to unity. The probability of occurrence of any event is equal to the ratio of the number of favourable outcomes to the total number of outcomes. For example, if a die is thrown, the total number of possible outcomes is 6 (1, 2, 3, 4, 5, 6). The probability of getting an even number is calculated as:

Probability = No. of favourable outcomes / Total number of outcomes

= 3/6 = ½

2. How is the Bernoulli trial related to the binomial distribution?

Both are discrete probability distributions that give the probability of success in an outcome, and the trials in each are independent. The Bernoulli distribution is a distribution with only two possible outcomes: “yes” with probability p and “no” with probability 1 - p. For example, tossing a coin has two possible outcomes: Head, which can be referred to as “yes”, or Tail, which can be referred to as “no”. The Bernoulli distribution is used only for a single trial. If n independent Bernoulli trials are carried out, each with probability of success p, the distribution of the number of successes becomes binomial. For example, tossing a coin five times is a binomial experiment.

3. Can the conditional probability be used in real life?

Conditional probability has been in use for many years and is actively applied in diverse fields such as insurance, calculus, and politics. A typical real-life example of conditional probability would be the re-election of a ruling political party, which depends on the voting preference of voters, perhaps a successful marketing campaign, and even the probability of the opponent party making blunders during press conferences!

The weather forecast team might announce that your area has a 60% probability of getting a cyclone. Even so, this figure is conditional on various factors, such as the probability of:

A large system of sufficient warm, moist winds over the ocean rising from near the surface.

Rise of the warm wind in your area

High humidity

An area of lower air pressure below

We can state that the conditional probability for a cyclone to form depends on all the above events.