The exercise says:

"Use the identities of Theorem 3.4.2 to

(a) calculate the variance of a binomial random variable.

(b) calculate the mean and variance of a beta(a,b) random variable."

Theorem 3.4.2 says the following.

If X is a random variable with pdf or pmf of the form

\(f(x|\theta) = h(x)c(\theta)\exp\left( \sum_{i=1}^k w_i(\theta)t_i(x) \right)\)

(exponential family), then

\(E \left( \sum_{i=1}^k \frac{\partial w_i(\theta)}{\partial \theta_j} t_i(X) \right) = - \frac{\partial}{\partial\theta_j}\log c(\theta)\)

and

\(Var \left( \sum_{i=1}^k \frac{\partial w_i(\theta)}{\partial \theta_j} t_i(X) \right) = - \frac{\partial^2}{\partial\theta_j^2}\log c(\theta) - E \left( \sum_{i=1}^k \frac{\partial^2 w_i(\theta)}{\partial \theta_j^2} t_i(X) \right)\).
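For part (a), here is how I understand the identities playing out for the binomial, checked numerically. The decomposition I am assuming is \(h(x) = \binom{n}{x}\), \(c(p) = (1-p)^n\), \(w_1(p) = \log(p/(1-p))\), \(t_1(x) = x\); all derivatives below are worked out by hand from that choice:

```python
from math import comb

# Sanity check of Theorem 3.4.2 for the binomial(n, p) family (part (a)).
# Assumed exponential-family decomposition:
#   h(x) = C(n, x),  c(p) = (1-p)**n,  w1(p) = log(p/(1-p)),  t1(x) = x
n, p = 10, 0.3

# Exact moments of X under binomial(n, p), by direct summation over the pmf.
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
EX = sum(x * q for x, q in zip(range(n + 1), pmf))
EX2 = sum(x * x * q for x, q in zip(range(n + 1), pmf))
VarX = EX2 - EX**2

# First identity: w1'(p) E[X] = -(d/dp) log c(p) = n/(1-p)  =>  E[X] = n p
w1p = 1 / (p * (1 - p))                  # w1'(p)
assert abs(w1p * EX - n / (1 - p)) < 1e-9
assert abs(EX - n * p) < 1e-9

# Second identity: w1'(p)^2 Var(X) = -(d^2/dp^2) log c(p) - w1''(p) E[X]
# with -(d^2/dp^2) log c(p) = n/(1-p)^2  =>  Var(X) = n p (1-p)
w1pp = -(1 - 2 * p) / (p * (1 - p))**2   # w1''(p)
rhs = n / (1 - p)**2 - w1pp * EX
assert abs(w1p**2 * VarX - rhs) < 1e-9
assert abs(VarX - n * p * (1 - p)) < 1e-9
print("binomial: E[X] = np, Var(X) = np(1-p) confirmed")
```

So for a one-parameter family the machinery reduces to two scalar equations, which is why (a) goes through cleanly.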

Point (a) is doable, but what about point (b)? First of all, since we have two parameters (a and b), do we get a system of two equations, one from differentiating with respect to each parameter? My difficulty is that I end up with a cumbersome E[log X], and then I have to take derivatives of B(a,b). It's a nightmare... If I'm not mistaken,

\(c(a,b) = 1/B(a,b)\)

\(t_1(x) = \log x, w_1(a,b) = a-1\)

\(t_2(x) = \log (1-x), w_2(a,b) = b-1\).
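With this decomposition, \(w_1\) and \(w_2\) are linear in \((a,b)\), so the second-derivative term in the variance identity vanishes, and differentiating with respect to a the two identities should reduce to \(E[\log X] = \partial_a \log B(a,b) = \psi(a) - \psi(a+b)\) and \(\mathrm{Var}(\log X) = \partial_a^2 \log B(a,b) = \psi'(a) - \psi'(a+b)\) (so, as stated, they give moments of \(\log X\), not of \(X\)). A numeric check of my reading, using scipy:

```python
import numpy as np
from scipy import integrate
from scipy.special import digamma, polygamma
from scipy.stats import beta as beta_dist

# Check the identities under the decomposition c(a,b) = 1/B(a,b),
# t1 = log x, w1 = a-1, t2 = log(1-x), w2 = b-1. The w's are linear
# in (a, b), so the E[second-derivative] term drops out and we expect
#   E[log X]   = psi(a) - psi(a+b)
#   Var(log X) = psi'(a) - psi'(a+b)
a, b = 2.5, 4.0  # arbitrary test values

ElogX, _ = integrate.quad(lambda x: np.log(x) * beta_dist.pdf(x, a, b), 0, 1)
Elog2X, _ = integrate.quad(lambda x: np.log(x)**2 * beta_dist.pdf(x, a, b), 0, 1)
VarlogX = Elog2X - ElogX**2

assert abs(ElogX - (digamma(a) - digamma(a + b))) < 1e-8
assert abs(VarlogX - (polygamma(1, a) - polygamma(1, a + b))) < 1e-8
print("beta: identities give E[log X] and Var(log X) in terms of digamma/trigamma")
```

(`polygamma(1, .)` is the trigamma function \(\psi'\).) If this is right, the identities hand me digamma expressions rather than E[X] and Var(X) directly, which is part of what I find confusing about the exercise.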

Any help would be appreciated.