# Probability in Hidden Markov Model

#### PlpPlp

##### New Member
Hello,

I'm considering a Hidden Markov Model as follows:

$$X_{n+1} = F_n(X_n,\Theta,\eta_n)$$
$$Y_n = G_n(X_n,\Theta,\xi_n)$$

where the $$Y_n$$ are the observations and the $$X_n$$ the hidden states. At some point, I have to deal with this probability

$$p(X_k = x_k | X_{k-1} = x_{k-1}, Y_{0:N} = y_{0:N})$$

where $$0 < k < N$$ and $$Y_{0:N}$$ denotes the sequence $$Y_0, \dots, Y_N$$. If it were

$$p(X_k = x_k | X_{k-1} = x_{k-1}, Y_{0:k} = y_{0:k})$$

I would know how to deal with it (I think), but I wonder how conditioning on the observations at all future times changes things. Any idea how I could handle this probability, perhaps by expressing it in terms of the usual, known quantities?
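My best guess so far (treating $$\Theta$$ as fixed, and writing $$\beta_k(x_k) = p(Y_{k:N} = y_{k:N} \mid X_k = x_k)$$ for the usual backward quantity; that notation is mine) is that the Markov structure should give something like

$$p(x_k \mid x_{k-1}, y_{0:N}) = p(x_k \mid x_{k-1}, y_{k:N}) \propto p(y_{k:N} \mid x_k)\, p(x_k \mid x_{k-1}) = \beta_k(x_k)\, p(x_k \mid x_{k-1}),$$

using that, given $$X_{k-1} = x_{k-1}$$, the pair $$(X_k, Y_{k:N})$$ is conditionally independent of $$Y_{0:k-1}$$, and the proportionality is in $$x_k$$. But I'm not certain this conditional independence argument is right, especially if $$\Theta$$ is itself random.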

Is there something I'm completely missing here?

Thanks a lot,
