How do you prove something cools "too fast"? Here is the full question:
For simulated annealing, some temperature schedules decrease excruciatingly slowly. It is reasonable to ask whether we could decrease the temperature faster and still retain a guarantee of convergence in distribution to global optima. Let b and c be positive numbers, and consider performing simulated annealing with the cooling schedule Tn = b·n^(−c). Can you give an example that shows that such a schedule decreases too fast, in the sense that the process has positive probability of getting stuck in a local minimum forever? Thus, even Tn = n^(−0.0001) cools “too fast”!
I have no idea how to go about solving this problem, but someone suggested comparing how fast Tn goes to 0 as n goes to infinity. Any help would be greatly appreciated. Thanks in advance!
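To make that suggestion concrete, here is a sketch of the kind of example I think the exercise wants. I'm not sure it's right, and everything in it is an assumption on my part: a Metropolis-style chain that accepts an uphill move of height Delta at step n with probability exp(−Delta/Tn), a three-state landscape, and the barrier height Delta = 1.

```latex
% Sketch only: assumes the Metropolis rule (accept an uphill move of height
% Delta at step n with probability exp(-Delta/T_n)); the three-state landscape
% and Delta = 1 are illustrative choices, not given in the original exercise.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Take states $\{1,2,3\}$ with $f(1)=1$, $f(2)=2$, $f(3)=0$ and neighbors
$1 \leftrightarrow 2 \leftrightarrow 3$, and start the chain at the local
minimum $1$. To ever leave $1$, the chain must accept the uphill move
$1 \to 2$ of height $\Delta = 1$; with $T_n = b\,n^{-c}$ this is accepted at
step $n$ with probability
\[
  p_n = e^{-\Delta/T_n} = e^{-n^c/b}.
\]
Since $e^{-n^c/b} \le n^{-2}$ for all large $n$, we get $\sum_n p_n < \infty$.
From state $1$ the only proposed move is to $2$, and rejections at different
steps are independent, so
\[
  \Pr(\text{stuck at $1$ forever}) = \prod_{n=1}^{\infty}(1-p_n) > 0,
\]
using the fact that an infinite product $\prod_n (1-p_n)$ with $0 \le p_n < 1$
is strictly positive if and only if $\sum_n p_n$ converges.

\end{document}
```

If this is on the right track, the rate comparison the hint points at would be this: exp(−n^c/b) is summable, whereas under the classical logarithmic schedule Tn = d/log(n) the corresponding escape probabilities are n^(−Delta/d), which are not summable once d ≥ Delta, so the product is 0 and the chain escapes the local minimum with probability 1.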