Calibrating time-series simulations / backtests

Suppose I have a time-series data-generating process that I am trying to simulate. The simulations are controlled by two discrete variables: -3 <= A <= 3 and 0 <= B <= 10. When backtesting, the simulations perform reasonably well, but I'd like to find the optimal values of A and B for calibration. I have simulated all 77 possible combinations. Can I simply fit a regression lm(Sim ~ Real) for each of the 77 combinations and select the one with the highest R^2 value?
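To make the proposed procedure concrete, here is a minimal sketch of the 77-combination sweep. The `real` series and the `simulate(A, B)` function are hypothetical stand-ins for the actual data and simulator, which are not given in the post; also note that for a simple regression lm(Sim ~ Real), R^2 is just the squared Pearson correlation between the two series, which is what the sketch computes. (Written in Python with NumPy rather than R for self-containment.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real" observed series (placeholder for the actual data).
real = np.cumsum(rng.normal(size=200))

def simulate(A, B):
    # Toy stand-in simulator: produces a series whose agreement with
    # `real` varies with the parameters A and B.
    noise = rng.normal(scale=1.0 + abs(A) + B, size=real.size)
    return real + noise

best = None
for A in range(-3, 4):       # 7 discrete values of A
    for B in range(0, 11):   # 11 discrete values of B -> 7 * 11 = 77 combos
        sim = simulate(A, B)
        # R^2 of the simple regression Sim ~ Real equals the squared
        # Pearson correlation between sim and real.
        r2 = np.corrcoef(sim, real)[0, 1] ** 2
        if best is None or r2 > best[0]:
            best = (r2, A, B)

print("best (R^2, A, B):", best)
```

With this toy simulator the sweep naturally favors small |A| and small B, since those produce the least noise; with the real simulator the selected (A, B) would of course depend on the actual process.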