**[SOLVED] CFA by 'hand'**

Ok so I am trying to do a CFA by hand, following the tutorial found here

The central equation is the model-implied covariance matrix [TEX]\Sigma = LL^T + E[/TEX], where L is the matrix of loadings and E is the matrix of errors. The idea is to use maximum likelihood to estimate L and E so as to minimise the discrepancy between the implied and observed covariance matrices. The discrepancy function to be minimised is [TEX]F = \log\left|\Sigma\right| + \mathrm{tr}(S\Sigma^{-1}) - \log\left|S\right| - p[/TEX], where [TEX]\Sigma[/TEX] is the implied covariance matrix, S is the observed covariance matrix, and p is the number of indicator items.
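As a quick numerical sanity check (my own, not from the tutorial): with all six loadings started at .7 and all error variances at .3, the implied matrix should have .7^2 + .3 = .79 on the diagonal and .7 * .7 = .49 everywhere off the diagonal:

```r
L <- rep(.7, 6)          # starting loadings
E <- diag(.3, 6, 6)      # starting error variances
sigma <- L %*% t(L) + E  # implied covariance matrix
sigma[1, 1]              # 0.79 = .7^2 + .3
sigma[1, 2]              # 0.49 = .7 * .7
```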

Ok so far in R I have got this far:

Code:

```
dat <- '.804 .399 .500 .367 .451 .510 .487 .833 .433 .283 .372 .377 .621 .529 .805 .339 .551 .543 .478 .362 .441 .733 .332 .341 .570 .461 .695 .438 .780 .556 .626 .455 .667 .438 .693 .825'
dat <- as.numeric(unlist(strsplit(dat, ' ')))
covar1 <- t(matrix(dat, nrow = 6, ncol = 6))  # observed covariance matrix for the one-factor CFA
L1 <- rep(.7, 6)  # starting values for the loadings
# LF (the latent covariance matrix) is not needed here: in a one-factor model
# with the latent variance constrained to 1 it is an identity matrix
E1 <- diag(.3, 6, 6)  # starting values for the error variances
discrepancy <- function(covar, L, E) {
  # discrepancy function -- I need to apply ML to this to find L and E
  sigma <- L %*% t(L) + E
  log(det(sigma)) +
    sum(diag(covar %*% solve(sigma))) -
    log(det(covar)) -
    nrow(covar)
}
```

1. Have I correctly reproduced the discrepancy function here? My initial results for the implied covariance matrix based on the starting values differ from the tutorial's, but I think that is because they got it wrong.

2. How do I get the ML estimates for L and E? I have tried `mle` from the `stats4` package but with no success.
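A sketch of one possible approach (my own parameterisation, not necessarily the tutorial's): pack the six loadings and six error variances into a single parameter vector and minimise the discrepancy directly with `optim()`, using a box constraint to keep the error variances positive. The function name `fml` and the bounds are my own choices:

```r
# Read the observed covariance matrix (same data as above)
dat <- scan(text = '.804 .399 .500 .367 .451 .510 .487 .833 .433 .283 .372 .377
                    .621 .529 .805 .339 .551 .543 .478 .362 .441 .733 .332 .341
                    .570 .461 .695 .438 .780 .556 .626 .455 .667 .438 .693 .825')
S <- t(matrix(dat, nrow = 6, ncol = 6))

# Discrepancy as a function of one parameter vector:
# theta[1:6] are the loadings, theta[7:12] the error variances
fml <- function(theta, S) {
  L     <- theta[1:6]
  E     <- diag(theta[7:12], 6, 6)
  sigma <- L %*% t(L) + E
  log(det(sigma)) + sum(diag(S %*% solve(sigma))) - log(det(S)) - nrow(S)
}

fit <- optim(par    = c(rep(.7, 6), rep(.3, 6)),   # starting values as above
             fn     = fml, S = S,
             method = "L-BFGS-B",
             lower  = c(rep(-1, 6), rep(1e-3, 6))) # keep error variances positive

fit$par[1:6]   # estimated loadings
fit$par[7:12]  # estimated error variances
```

Because the discrepancy is (up to a constant) proportional to the negative log-likelihood, minimising it with `optim()` gives the same point estimates as a full ML fit would.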