
Thread: least square estimator of a random variable

  1. #1

    least square estimator of a random variable




    Suppose we have an observation y = c + e, where c is an unknown constant and e is an error with pdf f(e) = exp(-(e+1)) for e > -1. We want to determine the least squares estimator of c, i.e. the c* that minimizes the error cost function E(c) = 0.5(y - c)^2.

    Minimizing the error cost is done by setting the derivative with respect to c to zero: dE/dc = -(y - c) = 0, which gives c* = y.
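    A quick numerical sanity check of the scalar case (the observation value y = 3.7 is made up purely for illustration):

    ```python
    import numpy as np

    # Scalar case: minimize E(c) = 0.5*(y - c)^2 for a single observation y.
    y = 3.7  # illustrative value, not from the problem statement

    cs = np.linspace(y - 2, y + 2, 4001)   # grid of candidate values for c
    E = 0.5 * (y - cs) ** 2                # error cost at each candidate
    c_star = cs[np.argmin(E)]              # numerical minimizer

    print(c_star)  # equals y, consistent with dE/dc = -(y - c) = 0
    ```

    The grid minimizer lands exactly on y, matching the calculus result c* = y.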

    I understand the matrix case: with y = Hc + e, the cost is E(c) = T(e)e, where T( ) is the transpose, so

    E(c) = T(y - Hc)(y - Hc) = T(y)y - T(c)T(H)y - T(y)Hc + T(c)T(H)Hc.
    Setting the derivative wrt c to zero, -2T(H)y + 2T(H)Hc = 0 => c* = inverse(T(H)H)*T(H)y.

    I guess I am just confused about what to do in the scalar case.

  2. #2
    TS Contributor

    Re: least square estimator of a random variable


    I am confused as to what the question is...
