It's easy to show that, among continuous distributions with a given variance, a normal PDF maximizes Shannon entropy. My data are discrete, so I am working with a discrete normal PMF instead. I know the variance of the data, but I have only one to three actual observations. Can I infer a mean that maximizes entropy? It is NOT simply the sample mean of the observations: for example, if the known variance is infinite, then the mean has to be zero regardless of the observations. Ideas would be much appreciated.
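To make the setup concrete, here is a small sketch (my own construction, not an established library routine) of what I mean by a "discrete normal" PMF: Gaussian weights on a truncated integer support, normalized to sum to one, together with its Shannon entropy:

```python
import math

def discrete_normal_pmf(mu, sigma, support):
    # Unnormalized Gaussian weights on an integer support,
    # renormalized so they sum to 1 (a "discrete normal" PMF).
    w = [math.exp(-(k - mu) ** 2 / (2 * sigma ** 2)) for k in support]
    total = sum(w)
    return [wi / total for wi in w]

def entropy(pmf):
    # Shannon entropy in nats, skipping zero-probability atoms.
    return -sum(p * math.log(p) for p in pmf if p > 0)

support = range(-50, 51)              # wide truncation of the integers
pmf = discrete_normal_pmf(0.0, 3.0, support)
print(sum(pmf))                       # ~1.0
print(entropy(pmf))                   # close to 0.5 * ln(2*pi*e*sigma^2)
```

For a moderate sigma like this, the discrete entropy is very close to the continuous value 0.5 ln(2πeσ²), which is part of why I expect the maximum-entropy reasoning to carry over to the discrete case.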