I am working on a project where a machine has a part that is replaced either after 500 hours of running time or when it fails, whichever comes first. I have been asked to determine whether lowering the scheduled replacement interval to 100 hours would improve the machine's reliability. I have plenty of failure data under the current policy, but no data on what happens if the part is replaced every 100 hours. My initial thought, after some research, is to build my own simulation using bootstrapping, but I don't know if this is statistically valid. Experts: how would you decide whether the change should be made, and what statistical methods would you use?
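To make my bootstrapping idea concrete, here is a minimal sketch of what I had in mind, using entirely made-up data. The key observation (if I understand it correctly) is that the current data is right-censored at 500 hours, but any failure time below 100 hours is fully observed, so the empirical fraction of parts failing before 100 hours should be an unbiased estimate of the in-service failure probability under the proposed policy, and the bootstrap can put a confidence interval on it. The dataset and function names are hypothetical:

```python
import random

# Hypothetical observations under the current 500-hour policy:
# (time, failed) pairs; failed=False means the part survived to the
# scheduled replacement, i.e. it is right-censored at 500 hours.
data = [(62, True), (95, True), (130, True), (210, True), (340, True),
        (410, True), (500, False), (500, False), (500, False),
        (500, False), (500, False), (500, False)]

def p_fail_before(obs, t):
    """Empirical probability that a part fails in service before time t.

    Valid for t <= 500 here, because every unit is observed at least
    until min(failure time, 500), so no failure before t is censored.
    """
    return sum(1 for time, failed in obs if failed and time < t) / len(obs)

def bootstrap_ci(obs, t, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for p_fail_before(obs, t)."""
    rng = random.Random(seed)
    stats = sorted(
        p_fail_before([rng.choice(obs) for _ in obs], t)
        for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

point = p_fail_before(data, 100)          # failure risk under 100-h policy
lo, hi = bootstrap_ci(data, 100)
print(f"P(fail before 100 h) ≈ {point:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Comparing that interval against the observed failure rate under the current 500-hour policy is as far as my reasoning goes; I am unsure whether this simple resampling is sound, or whether the censoring means I should instead fit a parametric lifetime model (e.g. Weibull) or use a Kaplan-Meier estimate.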