
Thread: proving regression with dummy variables gives the same estimates as separate models

    See attachment for problem description:

To prove this I started by defining, for each category k: the vector of beta estimates [Bk], the design matrix [Xk], and the response vector [Yk].
1. Find the betas for a single category:
For category a, [Ba] = (Xa' * Xa)^{-1} * (Xa' * Ya), and similarly for each other category.
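A minimal numerical sketch of this step (the data below is made up for illustration; Xa and Ya are placeholders for one category's design matrix and response, not values from the attachment):

Code:
# Single-category OLS via the normal equations: Ba = (Xa'Xa)^{-1} Xa'Ya
import numpy as np

rng = np.random.default_rng(0)
Xa = rng.normal(size=(20, 3))   # 20 observations, 3 predictors for category a
Ya = rng.normal(size=(20, 1))   # response vector for category a

# solve (Xa'Xa) Ba = Xa'Ya rather than explicitly inverting
Ba = np.linalg.solve(Xa.T @ Xa, Xa.T @ Ya)
print(Ba.ravel())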

2. Find the betas for the combined model (all categories at once) in the general case.
In block diagonal form:

[X] =
[ Xa  0  ... 0  ]
[ 0   Xb ... 0  ]
[ 0   0  ... Xk ]

[Y] =
[ Ya ]
[ Yb ]
[ ... ]
[ Yk ]

and you do the same multiplication: B = (X' * X)^{-1} * (X' * Y).
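To make the combined setup concrete, here is a small sketch (same made-up data idea as above): X is block diagonal in the per-category design matrices and Y stacks the per-category responses, so X'X comes out block diagonal as well.

Code:
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(0)
Xa, Xb = rng.normal(size=(20, 3)), rng.normal(size=(15, 3))
Ya, Yb = rng.normal(size=(20, 1)), rng.normal(size=(15, 1))

X = block_diag(Xa, Xb)          # [[Xa, 0], [0, Xb]]
Y = np.vstack([Ya, Yb])         # [Ya; Yb]

print(X.shape, Y.shape)                     # (35, 6) (35, 1)
print(np.allclose((X.T @ X)[:3, 3:], 0))    # off-diagonal block of X'X is zero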

Multiplying this out I get

(X' * X)^{-1} =
[ (Xa'Xa)^{-1}   0             ... 0            ]
[ 0              (Xb'Xb)^{-1}  ... 0            ]
[ 0              0             ... (Xk'Xk)^{-1} ]

and X' * Y (a stacked vector, since Y is stacked):
[ Xa'Ya ]
[ Xb'Yb ]
[ ...   ]
[ Xk'Yk ]

and eventually I get that the block of B corresponding to category a is (Xa'Xa)^{-1} * Xa'Ya.

Now that I'm this far, I am not sure how to argue in general that this block really equals the single-category estimate Ba from step 1 without plugging in specific numbers. Does anyone know how to proceed? Any help would be appreciated.
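This is not the algebraic argument being asked for, but as a sanity check of the claim one can verify numerically (again with made-up data) that the combined block-diagonal fit just stacks the separate per-category estimates:

Code:
# Numerical sanity check only, not a proof: fit each category separately,
# then fit the combined block-diagonal model, and confirm the combined
# coefficient vector is the separate estimates stacked together.
# The block-diagonal X plays the role of a fully interacted dummy-variable
# design (each category gets its own set of columns).
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(1)
Xs = [rng.normal(size=(n, 3)) for n in (20, 15, 30)]   # Xa, Xb, Xc
Ys = [rng.normal(size=(n, 1)) for n in (20, 15, 30)]   # Ya, Yb, Yc

# separate per-category estimates Bk = (Xk'Xk)^{-1} Xk'Yk
B_sep = [np.linalg.solve(X.T @ X, X.T @ Y) for X, Y in zip(Xs, Ys)]

# combined model: B = (X'X)^{-1} X'Y with block-diagonal X and stacked Y
X, Y = block_diag(*Xs), np.vstack(Ys)
B_all = np.linalg.solve(X.T @ X, X.T @ Y)

print(np.allclose(B_all, np.vstack(B_sep)))   # True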
