mean and variance of a beta distribution

#1
I'm trying to do point (b) of exercise 3.30 of the book "Statistical Inference" (Casella & Berger).
The exercise says:
"Use the identities of Theorem 3.4.2. to
(a) calculate the variance of a binomial random variable.
(b) calculate the mean and variance of a beta(a,b) random variable."
Theorem 3.4.2 says the following.
If X is a random variable with pdf or pmf of the form
\(f(x|\theta) = h(x)c(\theta)\exp\left( \sum_{i=1}^k w_i(\theta)t_i(x) \right)\)
(exponential family), then
\(E \left( \sum_{i=1}^k \frac{\partial w_i(\theta)}{\partial \theta_j} t_i(X) \right) = - \frac{\partial}{\partial\theta_j}\log c(\theta)\)
and
\(Var \left( \sum_{i=1}^k \frac{\partial w_i(\theta)}{\partial \theta_j} t_i(X) \right) = - \frac{\partial^2}{\partial\theta_j^2}\log c(\theta) - E \left( \sum_{i=1}^k \frac{\partial^2 w_i(\theta)}{\partial \theta_j^2} t_i(X) \right)\).

Point (a) is doable, but what about point (b)? First of all, since we have two parameters (a and b), do we get a system of two equations? My difficulty is that I end up with a cumbersome E[log X] and then have to take derivatives of B(a,b). It's a nightmare... If I'm not mistaken,
\(c(a,b) = 1/B(a,b)\)
\(t_1(x) = \log x, w_1(a,b) = a-1\)
\(t_2(x) = \log (1-x), w_2(a,b) = b-1\).
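If I push the identities through with this identification (just a sketch, assuming it is correct), then differentiating with respect to a gives \(\partial w_1/\partial a = 1\) and \(\partial w_2/\partial a = 0\), so the first identity reduces to
\(E[\log X] = -\frac{\partial}{\partial a}\log c(a,b) = \psi(a) - \psi(a+b)\),
since \(\log c(a,b) = \log\Gamma(a+b) - \log\Gamma(a) - \log\Gamma(b)\), where \(\psi\) is the digamma function. The second identity reduces to
\(Var(\log X) = -\frac{\partial^2}{\partial a^2}\log c(a,b) = \psi'(a) - \psi'(a+b)\),
because the second-derivative term vanishes (the \(w_i\) are linear in \(a\)). Differentiating with respect to b gives the analogous equations for \(\log(1-X)\). So the theorem hands me moments of \(\log X\) rather than of \(X\) itself, and I don't see how to get E[X] and Var(X) from there.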
Any help?
 

BGM

TS Contributor
#2
In my copy of Casella and Berger, part (b) asks for a \( \text{Poisson}(\lambda) \) random variable instead of a beta distribution, so I'm not sure whether this method is meant for the beta. Anyway, you may take a look at

http://en.wikipedia.org/wiki/Beta_distribution#Geometric_mean

where there are lots of calculations for \( E[\ln X] \) and other related terms. As you expected, you will need to express the answers in terms of non-elementary functions like the digamma function, which is the derivative of the log-gamma function.
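
For example, here is a quick numerical sanity check of those digamma formulas (just a sketch; I'm assuming numpy and scipy are available, and the variable names are my own):

import numpy as np
from scipy.special import digamma, polygamma

a, b = 2.5, 4.0

# Closed forms from the exponential family identities:
mean_logx = digamma(a) - digamma(a + b)            # E[log X] = psi(a) - psi(a+b)
var_logx = polygamma(1, a) - polygamma(1, a + b)   # Var(log X) = psi'(a) - psi'(a+b)

# Monte Carlo check against samples from a beta(a, b)
rng = np.random.default_rng(0)
x = rng.beta(a, b, size=1_000_000)
print(mean_logx, np.log(x).mean())  # should agree to about 3 decimals
print(var_logx, np.log(x).var())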