
An object is thrown vertically upward from a height of 2m at $1m{{s}^{-1}}$. How long will it take for the object to hit the ground?

Answer
Hint: The object is thrown vertically upward with some initial velocity, so this is not a case of an object falling freely from rest. However, since the object eventually hits the ground, it is acted upon by the force of gravity throughout its motion. We shall analyse the velocity of the object at various points of its motion and apply the equations of motion at those points.

Complete answer:
We mark three points in the journey of the object: A (the point of projection), B (the highest point) and C (the ground). We also break the motion of the object into two parts, from A to B and from B to C.
(Figure: the path of the object with points A, B and C marked.)

We shall first look into the motion of the object from A to B.
At point A, the object has an initial velocity of $1m{{s}^{-1}}$. Since it is thrown vertically upward, it does not fall downward directly; rather, it first rises to its maximum height at point B, where its velocity becomes zero. Because gravity opposes this upward motion, the acceleration during this part is $-10m{{s}^{-2}}$.
Applying the equation of motion, $v=u+at$ and substituting values, $v=0m{{s}^{-1}},u=1m{{s}^{-1}},a=-10m{{s}^{-2}},t={{t}_{1}}$, we get
$\Rightarrow 0=1+\left( -10 \right){{t}_{1}}$
$\Rightarrow 10{{t}_{1}}=1$
Dividing both sides by 10, we get
$\Rightarrow {{t}_{1}}=\dfrac{1}{10}s$
Also, from ${{v}^{2}}={{u}^{2}}+2a{{h}_{1}}$, the height gained above the launch point is ${{h}_{1}}=\dfrac{{{v}^{2}}-{{u}^{2}}}{2a}$
$\begin{align}
  & \Rightarrow {{h}_{1}}=\dfrac{0-{{1}^{2}}}{2\left( -10 \right)} \\
 & \Rightarrow {{h}_{1}}=\dfrac{1}{20}m \\
\end{align}$
$\Rightarrow {{h}_{1}}=0.05m$
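As a quick numerical check of this first phase, here is a minimal Python sketch (the variable names are ours, not part of the original solution):

# Phase A to B: thrown upward at u = 1 m/s, decelerating at g = 10 m/s^2
u = 1.0   # initial speed in m/s (upward taken as positive)
g = 10.0  # magnitude of acceleration due to gravity in m/s^2
t1 = u / g           # from v = u + a*t with v = 0 and a = -g
h1 = u**2 / (2 * g)  # from v^2 = u^2 + 2*a*s with v = 0 and a = -g
print(t1, h1)        # prints 0.1 0.05, matching t1 = 0.1 s and h1 = 0.05 m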
Now, in the motion from point B to C, the object falls freely under gravity from its maximum height, which is $\left( 2+{{h}_{1}} \right)m=2.05m$ above the ground. Applying the equation of motion, $s=ut+\dfrac{1}{2}a{{t}^{2}}$ and substituting the values $s=2.05m,u=0m{{s}^{-1}},a=10m{{s}^{-2}},t={{t}_{2}}$, we get
$\Rightarrow 2+0.05=\left( 0 \right){{t}_{2}}+\dfrac{1}{2}\left( 10 \right)t_{2}^{2}$
$\Rightarrow 2.05=5t_{2}^{2}$
$\Rightarrow {{t}_{2}}=\sqrt{\dfrac{2.05}{5}}s$
 Thus, the total time of motion of the object is ${{t}_{1}}+{{t}_{2}}$.
$\begin{align}
  & \Rightarrow t=\dfrac{1}{10}+\sqrt{\dfrac{2.05}{5}} \\
 & \Rightarrow t=0.1+0.64 \\
\end{align}$
$\Rightarrow t=0.74s$
Therefore, the time taken by the object to hit the ground is 0.74 seconds.
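The whole calculation can be reproduced end to end with a short Python sketch (assuming, as in the solution, a launch height of 2 m and $g=10m{{s}^{-2}}$):

import math

u = 1.0   # launch speed in m/s, directed upward
g = 10.0  # acceleration due to gravity in m/s^2
H = 2.0   # height of the launch point above the ground in m

t1 = u / g                        # time to reach the highest point (A to B)
h1 = u**2 / (2 * g)               # extra height gained above the launch point
t2 = math.sqrt(2 * (H + h1) / g)  # free fall from the highest point (B to C)
print(round(t1 + t2, 2))          # prints 0.74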

Note:
In the motion from A to B, we have taken the acceleration of the object as $-10m{{s}^{-2}}$ because the object moves in the direction exactly opposite to the acceleration due to gravity. However, in the motion from B to C, we have taken the acceleration to be $10m{{s}^{-2}}$ because the object then moves in the same direction as the acceleration due to gravity.
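As a cross-check of this sign convention, the entire journey can also be solved in one step with the upward direction taken as positive throughout, so that the net displacement to the ground is $-2m$. This is only a verification sketch in Python, not part of the original solution:

import math

u = 1.0    # initial velocity in m/s (upward positive)
a = -10.0  # acceleration in m/s^2 (gravity acts downward)
s = -2.0   # net displacement in m (the ground is 2 m below the launch point)

# s = u*t + (1/2)*a*t^2 gives 5*t^2 - t - 2 = 0; keep the positive root
t = (u + math.sqrt(u**2 + 2 * a * s)) / (-a)
print(round(t, 2))  # prints 0.74, the same total time as above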