Hazard ratios are frequently interpreted as relative risks. I know the hazard ratio is a function of time that describes the instantaneous ratio of event rates. But in a Cox model we assume proportional hazards, which means the hazard ratio is constant over time. For example, if HR = 2 at all times, I think that in the end we will have twice as many events in one group, which gives us a relative risk of 2?

I know the difference between HR and RR, but I'm not sure about this. Is it still incorrect to interpret the hazard ratio as a relative risk under the assumption of proportional hazards?

Update: I think I found the answer. It is incorrect to say that in the end we will have twice as many events in one group, even if HR = 2 (constant). The risk set is updated over time, so the ratio of cumulative event counts will not be constant.
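A minimal sketch of this point, assuming exponential survival times (a hypothetical baseline rate of 0.1 per year, so the two groups have hazards 0.1 and 0.2 and HR = 2 at every instant). The cumulative risk by time t is 1 − exp(−rate·t), and the ratio of those risks (the RR) starts near 2 but drifts toward 1 as follow-up lengthens:

```python
import math

# Assumed for illustration: exponential event times, baseline hazard 0.1/yr.
LAM = 0.1  # group 1 hazard; group 2 hazard is 2*LAM, so HR = 2 at all times

def risk(rate: float, t: float) -> float:
    """Cumulative risk: probability of an event by time t."""
    return 1.0 - math.exp(-rate * t)

for t in [1, 5, 10, 30]:
    rr = risk(2 * LAM, t) / risk(LAM, t)  # relative risk at follow-up time t
    print(f"t = {t:>2}: RR = {rr:.3f}")
```

Even though the hazard ratio is exactly 2 throughout, the printed RR falls from roughly 1.9 at t = 1 toward 1.05 at t = 30, because the higher-hazard group depletes its risk set faster.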
