I'm very new to statistics as well as this forum, so please bear with me.

I've tried solving this for about 30 minutes now and I've made pretty much 0 progress on it. The question is:

Suppose that events A and C are mutually exclusive, P(not A) = 0.7, and P(CB) = 0.4. What are the maximum and minimum values for P(not C)?

An answer is helpful, but what I'm really looking for is detailed steps on how to approach, simplify, and solve this problem.

Thanks in advance.

I wonder if someone can kindly help. I have a dataset which I've uploaded and I'm trying to work out a sensible distribution. It represents the number of throws a darts player needs before he can aim for a double. The minimum possible is 8 and the maximum possible is theoretically infinite although good players would very rarely go beyond 30 or so.

I thought a lognormal distribution might fit best, but I would be very grateful for a second opinion. You will see that the data peaks at certain dart counts, presumably because certain scores (e.g. 180) are more common than others: darts players have particular habits, so the scoring is not random.
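For what it's worth, here is a rough sketch of how one might test the lognormal idea in Python with scipy, using made-up counts in place of the real dataset. Since the minimum possible is 8 darts, the location parameter is fixed just below 8 so the fitted support starts near there; the sample below is purely hypothetical.

```python
# Hedged sketch: fitting a shifted (3-parameter) lognormal to hypothetical
# darts-count data. The sample here is simulated, not the real dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical sample: darts thrown before the player can aim for a double
# (the minimum possible is 8, so we shift a lognormal up by 8)
darts = 8 + rng.lognormal(mean=2.0, sigma=0.5, size=500).astype(int)

# Fix the location just under the minimum of 8 so the support starts there
shape, loc, scale = stats.lognorm.fit(darts, floc=7.5)
print(f"shape={shape:.3f}, loc={loc}, scale={scale:.3f}")

# Rough goodness-of-fit check via the Kolmogorov-Smirnov test
ks = stats.kstest(darts, "lognorm", args=(shape, loc, scale))
print(f"KS statistic={ks.statistic:.3f}, p-value={ks.pvalue:.3f}")
```

Note that a plain lognormal fit like this cannot capture the peaks at particular dart counts you describe; it can only tell you whether the overall shape is plausible.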

Any thoughts/advice would be most welcome and appreciated.

Thanks in advance.