Question about finding the asymptotic standard error of the MLE in an AR(1)-GARCH(3,1) model in SAS

The AR(1)-GARCH(3,1) model is as follows:

Y(t) = e(t) - phi * Y(t-1) ---> AR(1) part, where phi is the AR parameter
e(t) = sqrt(h(t)) * error(t) ---> error(t) is iid N(0,1)
h(t) = w + alpha * e(t-1)**2 + gamma1 * h(t-1) + gamma2 * h(t-2) + gamma3 * h(t-3) ---> GARCH(3,1) part (P=3 GARCH lags, Q=1 ARCH lag in SAS's GARCH=(P,Q) convention), where w, alpha, gamma1, gamma2, gamma3 are the parameters
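
To make the setup concrete, this is roughly how I generate each series (a sketch only: the parameter values below are placeholders rather than the ones I actually use; lagged h is initialized at the unconditional variance, and a burn-in period is discarded):

%let phi    = 0.5;
%let w      = 0.1;
%let alpha  = 0.2;
%let gamma1 = 0.2;
%let gamma2 = 0.15;
%let gamma3 = 0.1;

data simulated;
   call streaminit(12345);
   /* initialize the lagged h(t) at the unconditional variance of e(t) */
   h1 = &w / (1 - &alpha - &gamma1 - &gamma2 - &gamma3);
   h2 = h1;  h3 = h1;
   e1 = 0;  ylag = 0;
   do t = -199 to 500;                         /* 200 burn-in draws */
      h = &w + &alpha*e1**2 + &gamma1*h1 + &gamma2*h2 + &gamma3*h3;
      e = sqrt(h) * rand('NORMAL');
      y = e - &phi*ylag;                       /* sign convention as above */
      if t >= 1 then output;                   /* keep 500 observations */
      h3 = h2;  h2 = h1;  h1 = h;
      e1 = e;  ylag = y;
   end;
   keep t y;
run;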

All of the parameters are known, and I use them to simulate series Y of length 500 in SAS (so Y follows this AR(1)-GARCH(3,1) model by construction). Given a simulated series, I use PROC AUTOREG to obtain the maximum likelihood estimates and their estimated standard errors. I generate 10000 such series, which gives me 10000 MLEs and their standard errors, and from these I draw an empirical CDF plot of the MLEs to see their distribution.

Here is my question: to compare against this empirical CDF, I need a second CDF plot showing the asymptotic distribution of the MLE. Theory states that the MLE is asymptotically normal with mean equal to the true parameter and variance equal to the inverse of the per-observation Fisher information, evaluated at the true parameters, divided by N. So this theoretical standard error should be computable from only the given parameter values and the sample size (here N = 500). But the likelihood of this model is only defined recursively through h(t), so I cannot derive the Fisher information, and hence the asymptotic standard error of the MLEs, in closed form.
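
For reference, each replication is estimated with something like the following (sketch only; the macro loop over the 10000 seeds is omitted, and the ODS table and column names are the ones my SAS output reports):

ods output ParameterEstimates=est;
proc autoreg data=simulated;
   /* ML estimation of the AR(1)-GARCH(3,1) model, no intercept */
   model y = / noint nlag=1 garch=(p=3, q=1);
run;

After stacking the 10000 ParameterEstimates tables into one dataset (say all_est, where the AR coefficient is labeled AR1 in my output), the empirical CDF I mentioned comes from something like:

proc univariate data=all_est;
   where Variable = 'AR1';
   cdfplot Estimate;
   /* what I want to overlay: / normal(mu=<true phi> sigma=<asymptotic SE>),
      where sigma is exactly the quantity I do not know how to compute */
run;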

If anyone is familiar with SAS and has sound knowledge of GARCH, please leave me a message or drop me an email at shenxw0310@gmail.com; I can share my code and the dataset.

Thank you!!!