I'm stuck on this problem:

What is the probability that a randomly selected observation exceeds the

a. Mean of a normal distribution?

b. Median of a normal distribution?

c. Mean of a nonnormal distribution?

d. Median of a nonnormal distribution?

So far, I believe the answer to parts b and d, the median of both normal and nonnormal distributions, is .5, since the median is the 50th percentile of the distribution. I also think it's .5 for the mean of a normal distribution, since a normal distribution is symmetric, so its mean equals its median. But what about the mean of a nonnormal distribution? I want to say it depends on the distribution's skewness, but I'm not sure.
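To test the skewness idea, I tried a quick simulation sketch (assuming Python with NumPy) using a right-skewed exponential distribution, where the mean sits above the median:

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(scale=1) is right-skewed: mean = 1, median = ln 2 ~ 0.693
x = rng.exponential(scale=1.0, size=1_000_000)

# Fraction of observations above the sample mean and sample median
p_above_mean = np.mean(x > x.mean())
p_above_median = np.mean(x > np.median(x))

# For this distribution, P(X > mean) = exp(-1) ~ 0.368, not 0.5,
# while P(X > median) is 0.5 by definition of the median.
print(f"P(X > mean)   ~ {p_above_mean:.3f}")
print(f"P(X > median) ~ {p_above_median:.3f}")
```

If the skew went the other way (a left-skewed distribution), the fraction above the mean would instead be greater than .5, which seems to support the "depends on skewness" intuition.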

Any sort of help would be appreciated. Thanks!