I'm trying to figure out how a distribution of scores can be normally distributed when the scores can't go below zero.

For example, take exam scores: even if the scores are normally distributed, nobody can get less than a zero on the exam.

The normal distribution is supposed to be asymptotic - the curve never touches the X-axis. So how can that be in the case above? Once you get down to zero, wouldn't the curve just end?
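To make the question concrete, here's a small sketch (the mean of 70 and SD of 10 are made-up numbers, just for illustration) computing how much probability a normal curve actually places below zero, using only the Python standard library:

```python
import math

def normal_cdf(x: float, mean: float, sd: float) -> float:
    """P(X <= x) for a normal distribution, via the complementary error function."""
    z = (x - mean) / sd
    return 0.5 * math.erfc(-z / math.sqrt(2))

# Hypothetical exam: mean score 70, standard deviation 10.
# How much of the normal curve lies below the impossible score of zero?
p_below_zero = normal_cdf(0, mean=70, sd=10)
print(p_below_zero)  # on the order of 1e-12, i.e. effectively nothing
```

So even though the normal curve technically extends below zero forever, in a case like this the area it puts there is astronomically small, which is part of why the normal model can still be a reasonable approximation for scores that can't actually be negative.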

Thanks to anyone who can help clarify this for me.

Thanks,

Frodo