I have encountered the following problem and am not sure how to solve it. Could someone help me work through it?

Problem:
A dart is thrown onto a circular plate of unit radius. Let X be the random variable representing the distance from the origin (centre) of the plate to the point where the dart lands.
Assume that the dart always lands on the plate and that the dart is equally likely to land
anywhere on the plate.

Find (i) P(X < a) and (ii) P(a < X < b), where a < b <= 1.
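
My attempt so far (I am not sure it is correct): if "equally likely to land anywhere on the plate" means the landing point is uniformly distributed over the area of the plate, then probabilities should be ratios of areas, so for $0 \le a \le 1$

$$P(X < a) = \frac{\text{area of the disc of radius } a}{\text{area of the unit disc}} = \frac{\pi a^2}{\pi \cdot 1^2} = a^2,$$

and therefore $P(a < X < b) = P(X < b) - P(X \le a) = b^2 - a^2$ for $a < b \le 1$.

To sanity-check this I also tried a quick Monte Carlo simulation in Python; the values a = 0.5 and b = 0.8 are just arbitrary test values I picked, and the uniform-over-area reading of the problem is my own assumption:

```python
import random

# Monte Carlo sanity check of the area-ratio answer.
# Assumption (mine, not stated in the problem): "equally likely to land
# anywhere" means the landing point is uniform over the area of the unit disc.

def estimate(a, b, n=200_000):
    below_a = 0   # count of throws with distance < a
    between = 0   # count of throws with a < distance < b
    landed = 0
    while landed < n:
        # Rejection sampling: draw from the square [-1, 1]^2 and keep only
        # points inside the unit disc, so accepted points are uniform on it.
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        r2 = x * x + y * y
        if r2 > 1.0:
            continue  # outside the plate; the dart always lands on the plate
        landed += 1
        r = r2 ** 0.5
        if r < a:
            below_a += 1
        if a < r < b:
            between += 1
    return below_a / n, between / n

a, b = 0.5, 0.8
print(estimate(a, b))           # simulated P(X < a) and P(a < X < b)
print(a ** 2, b ** 2 - a ** 2)  # conjectured values a^2 and b^2 - a^2
```

The simulated frequencies come out close to a^2 and b^2 - a^2, which makes me think the area-ratio argument is on the right track, but I would appreciate confirmation or a more rigorous derivation.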