Ok. So now you are given

\( \Pr\{Y = 1\} = p, \Pr\{Y = 2\} = q, \Pr\{Y = 3\} = 1 - p - q \)

The probability mass function of \( Y \) is

\( f_Y(y) = \begin{cases} p & {\rm if} ~ y = 1 \\
q & {\rm if} ~ y = 2 \\
1 - p - q & {\rm if} ~ y = 3 \\
0 & {\rm otherwise} \end{cases} \)

However, you can write a more condensed product form:

\( f_Y(y) = \begin{cases}
p^{\frac {(y-2)(y-3)} {2}} \, q^{-(y-1)(y-3)} \, (1 - p - q)^{\frac {(y-1)(y-2)} {2}} & {\rm if} ~ y = 1, 2, 3 \\
0 & {\rm otherwise} \end{cases} \)
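A quick sanity check of the product form (a small sketch; the helper name `exponents` is just for illustration): at each outcome, exactly one exponent is 1 and the other two are 0, so the product collapses to the right probability.

```python
# Sanity check: the exponents in the product form select exactly one
# factor for each outcome y in {1, 2, 3}.

def exponents(y):
    """Return the exponents of p, q, and (1 - p - q) for a given y."""
    e_p = (y - 2) * (y - 3) // 2      # exponent of p
    e_q = -(y - 1) * (y - 3)          # exponent of q
    e_r = (y - 1) * (y - 2) // 2      # exponent of (1 - p - q)
    return e_p, e_q, e_r

for y in (1, 2, 3):
    print(y, exponents(y))
# y = 1 -> (1, 0, 0), so f_Y(1) = p
# y = 2 -> (0, 1, 0), so f_Y(2) = q
# y = 3 -> (0, 0, 1), so f_Y(3) = 1 - p - q
```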

so that the joint log-likelihood for a sample of size \( n \) is

\( L(p,q;y_1, ..., y_n) = a \ln p + b \ln q + c \ln (1 - p - q) \)

where \( a = \frac {1} {2} \sum_{i = 1}^n (y_i - 2)(y_i - 3),

b = -\sum_{i = 1}^n (y_i - 1)(y_i - 3),

c = \frac {1} {2} \sum_{i = 1}^n (y_i - 1)(y_i - 2)

= n - a - b \)
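A short numerical check (the sample values below are made up) that these sums match the exponents of \( p \), \( q \), and \( 1 - p - q \) in the product form, and that they satisfy \( a + b + c = n \):

```python
from collections import Counter

sample = [1, 2, 2, 3, 1, 3, 3, 2, 1, 1]  # made-up sample of size n = 10

# The exponent sums from the product form of the pmf:
a = sum((y - 2) * (y - 3) for y in sample) // 2   # coefficient of ln p
b = -sum((y - 1) * (y - 3) for y in sample)       # coefficient of ln q
c = sum((y - 1) * (y - 2) for y in sample) // 2   # coefficient of ln(1 - p - q)

counts = Counter(sample)
print(a, b, c)                              # matches the raw outcome counts
print(counts[1], counts[2], counts[3])
print(a + b + c == len(sample))             # True: each y_i contributes exactly 1
```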

which can be readily obtained from your sample.

Now take the partial derivatives of \( L \) with respect to the parameters

and set them equal to 0:

\( \left.\frac {\partial L} {\partial p} \right|_{p = \hat{p}, q = \hat{q}}

= \frac {a} {\hat{p}} - \frac {c} {1 - \hat{p} - \hat{q}} = 0 \)

\( \left.\frac {\partial L} {\partial q} \right|_{p = \hat{p}, q = \hat{q}}

= \frac {b} {\hat{q}} - \frac {c} {1 - \hat{p} - \hat{q}} = 0 \)

Simplifying, we obtain a \( 2 \times 2 \) linear system in \( \hat{p}, \hat{q} \):

\( \left\{\begin{matrix} (a+c)\hat{p} + a\hat{q} = a \\

b\hat{p} + (b+c)\hat{q} = b \end{matrix}\right. \)

Eventually, we have

\( \hat{p} = \frac {a} {n}, \hat{q} = \frac {b} {n} \)

Again, note that \( a + b + c = n \).

In fact, \( a \), \( b \), and \( c \) are just the counts of the outcomes 1, 2, and 3 in the sample.

So the MLEs are simply the ratios of the outcome counts to the total sample size.
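The whole solution can be verified numerically (a sketch with made-up counts): solving the \( 2 \times 2 \) system by Cramer's rule reproduces the closed form \( \hat{p} = a/n, \hat{q} = b/n \).

```python
# Solve the 2x2 linear system for (p_hat, q_hat) via Cramer's rule and
# confirm it matches the closed form a/n, b/n. Counts are made up.
a, b, c = 4, 3, 3          # hypothetical counts of outcomes 1, 2, 3
n = a + b + c

# System: (a + c) p + a q = a,  b p + (b + c) q = b
det = (a + c) * (b + c) - a * b        # determinant; simplifies to c * n
p_hat = (a * (b + c) - a * b) / det    # Cramer's rule, first unknown
q_hat = ((a + c) * b - b * a) / det    # Cramer's rule, second unknown

print(p_hat, q_hat)                    # 0.4 0.3, i.e. a/n and b/n
```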