Proving that regression with dummy variables gives the same estimates as separate models

#1
See the attachment for the problem description.

To prove this I started by setting up notation: let [Bk] be the vector of beta estimates for category k, [Xk] the design matrix for category k, and [Yk] the response vector for category k.
1. Find the betas for a single category:
For category a alone, [Ba] = (Xa' * Xa)^-1 * (Xa' * Ya).

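As a sanity check on step 1, here is a small numpy sketch of the single-category fit; the data, sizes, and names (n_a, X_a, Y_a) are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data for category a alone (sizes and coefficients are arbitrary).
n_a, p = 50, 3
X_a = np.column_stack([np.ones(n_a), rng.normal(size=(n_a, p - 1))])
Y_a = X_a @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=n_a)

# Ba = (Xa' * Xa)^-1 * (Xa' * Ya); solve() avoids forming the explicit inverse.
B_a = np.linalg.solve(X_a.T @ X_a, X_a.T @ Y_a)
print(B_a)
```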
2. Find the betas for the combined model over all categories (the general case).
In block diagonal form:

[X] becomes

[ Xa   0  ...   0 ]
[  0  Xb  ...   0 ]
[        ...      ]
[  0   0  ...  Xk ]

and [Y] becomes the stacked vector

[ Ya ]
[ Yb ]
[ ...]
[ Yk ]
and you do the same multiplication, B = (X' * X)^-1 * (X' * Y).

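Before multiplying this out symbolically, here is a quick numerical check that the combined fit reproduces the separate fits; the toy data and the helper make_category are just assumptions for illustration, and scipy.linalg.block_diag is only used to build the block-diagonal X:

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(0)

def make_category(n, p, coefs):
    """Toy data for one category: an intercept plus (p - 1) random predictors."""
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
    Y = X @ coefs + rng.normal(scale=0.1, size=n)
    return X, Y

# Three categories with different (made-up) coefficient vectors.
cats = [make_category(40, 3, np.array([1.0, 2.0, -0.5])),
        make_category(55, 3, np.array([0.3, -1.0, 4.0])),
        make_category(30, 3, np.array([2.0, 0.0, 1.5]))]

# Separate fits: Bk = (Xk' * Xk)^-1 * Xk' * Yk for each category.
B_sep = [np.linalg.solve(X.T @ X, X.T @ Y) for X, Y in cats]

# Combined fit: block-diagonal X, stacked Y, then B = (X' * X)^-1 * X' * Y.
X = block_diag(*[X for X, _ in cats])
Y = np.concatenate([Y for _, Y in cats])
B = np.linalg.solve(X.T @ X, X.T @ Y)

# The combined estimate is the separate estimates stacked on top of each other.
print(np.allclose(B, np.concatenate(B_sep)))   # expected: True
```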
Multiplying this out symbolically, X' * X is block diagonal with blocks Xk' * Xk, so

(X' * X)^-1 =
[ (Xa'Xa)^-1       0       ...       0      ]
[      0      (Xb'Xb)^-1   ...       0      ]
[                  ...                      ]
[      0           0       ...  (Xk'Xk)^-1  ]

and X' * Y is the stacked vector

[ Xa'Ya ]
[ Xb'Yb ]
[  ...  ]
[ Xk'Yk ]
and multiplying these together, the first block of B should come out as (Xa' * Xa)^-1 * Xa' * Ya, which is exactly Ba.

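Written out block by block, I think the last multiplication should go like this (a sketch, assuming each Xk' * Xk is invertible):

```latex
\[
B
= (X'X)^{-1} X'Y
= \begin{bmatrix}
    (X_a'X_a)^{-1} &        &                 \\
                   & \ddots &                 \\
                   &        & (X_k'X_k)^{-1}
  \end{bmatrix}
  \begin{bmatrix} X_a'Y_a \\ \vdots \\ X_k'Y_k \end{bmatrix}
= \begin{bmatrix} (X_a'X_a)^{-1} X_a'Y_a \\ \vdots \\ (X_k'X_k)^{-1} X_k'Y_k \end{bmatrix}
= \begin{bmatrix} B_a \\ \vdots \\ B_k \end{bmatrix}.
\]
```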
Now that I'm this far, I am not sure how to prove rigorously, without plugging in specific numbers, that each block of B equals the corresponding single-category estimate Bk. Does anyone know how to proceed? Any help would be appreciated.