Does anyone know how to calculate predicted probabilities for a discrete-time event history analysis? Just in case the term "predicted probabilities" isn't clear, I'll provide an example based on OLS regression.

Suppose I estimate the effect of education on income (setting aside the obvious problems with using income as a dependent variable in OLS) and obtain the following result:

INCOME = 10 + (4 * EDUCATION)

I would therefore predict that someone with ten years of education has an income of 50:

INCOME = 10 + (4 * 10) = 50
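As a quick sanity check, the arithmetic above can be reproduced in a few lines (the intercept of 10 and slope of 4 are just the made-up coefficients from this example):

```python
# Hypothetical OLS coefficients from the example above.
intercept = 10
slope = 4

def predicted_income(years_of_education):
    """Plug years of education into the fitted regression line."""
    return intercept + slope * years_of_education

print(predicted_income(10))  # -> 50
```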


Now, how might I obtain predicted values from a discrete-time event history analysis? The problem, of course, is that the mean of my dependent variable becomes very small once I construct the person-month data file: many respondents are observed for hundreds of months before the event I'm modeling actually occurs. In the data I'm analyzing, where respondents are observed for a long time, the mean of the dependent variable in the person-month file is .0025. The resulting hazard ratios are meaningful, but any attempt at constructing predicted probabilities yields absurdly tiny values.
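To make the scale of the issue concrete: a per-month hazard can look negligible while still implying a substantial probability of experiencing the event at all, via the standard discrete-time survival identity P(event by month T) = 1 − ∏(1 − h_t). A minimal sketch, using the .0025 mean hazard from above and a hypothetical 300-month observation window:

```python
# Per-month hazard: the mean of the dependent variable in the
# person-month file, taken from the example above.
hazard = 0.0025

# Hypothetical observation window of 300 person-months.
months = 300

# Probability of "surviving" (not experiencing the event in)
# every one of the 300 months, assuming a constant hazard.
survival = (1 - hazard) ** months

# Complement: probability of experiencing the event at some
# point during the window.
prob_event_by_T = 1 - survival

print(round(prob_event_by_T, 3))  # roughly 0.528
```

So a hazard of .0025 per month, compounded over 300 months, corresponds to better-than-even odds of the event occurring, which is one way of seeing that the tiny per-month predictions are not meaningless in themselves.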

Instead, I'd like to construct meaningful predicted probabilities. Does anyone have any thoughts? The only idea that comes to mind is weighting each predicted probability I calculate by the number of person-months I have in the data for that respondent. Would this work?

Thank you for your time. I welcome all suggestions.