Posterior of a Lognormal Shape Parameter, in Terms of Prior Location & Shape

I am setting up a Bayesian analysis of skewed data using a lognormal distribution. I want to estimate the "standard deviations" of samples taken from different bins, where the sample sizes available from some bins are very large and from others very small. I have reliable prior information about the overall distribution of "standard deviation" values, which I want to build into the analysis.

I consulted Daniel Fink's paper on conjugate priors to get the posterior distribution of the lognormal "standard deviation" when the "mean" is known (p. 21):
https://www.johndcook.com/CompendiumOfConjugatePriors.pdf

("Mean" and "standard deviation" are in quotes to remind us that we are dealing with the mean and standard deviation of the log of our random variable, and not of the random variable itself).

According to Fink, the prior and posterior distributions are gamma distributions, with parameters a' = a + n/2 and b' = b + L/2, where L = SUM_i (ln x_i - u)^2. (Note that Fink uses the version of the gamma distribution where the term within the exponential is -arg/b.)
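To keep my numbers reproducible, here is a minimal sketch of that update rule, written exactly as quoted above; the prior values and data are made up purely for illustration, and I am simply treating b as the second gamma parameter without resolving the scale-vs-rate question:

```python
import numpy as np

def posterior_params(a, b, log_x, u):
    """Apply the quoted update rule: a' = a + n/2, b' = b + L/2,
    where L = sum((ln x_i - u)^2) and log_x already holds ln(x_i)."""
    n = len(log_x)
    L = np.sum((log_x - u) ** 2)
    return a + n / 2, b + L / 2

# hypothetical prior and data, purely for illustration
a, b, u = 2.0, 1.0, 0.0
log_x = np.array([0.5, -0.3, 0.8])
a_post, b_post = posterior_params(a, b, log_x, u)
# a_post = 3.5, b_post ≈ 1.49
```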

Since the prior and posterior are both gamma distributed, I thought it would be easy enough to convert the a's and b's into observable means and standard deviations, since for both the primed and unprimed quantities the mean is u = ab and the variance is s^2 = ab^2.
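In code, the conversion I have in mind is just this (shape-scale convention, so mean = ab and variance = ab^2; the numbers are arbitrary):

```python
def gamma_to_moments(a, b):
    """Mean and variance of a gamma(shape=a, scale=b) variable."""
    return a * b, a * b ** 2

def moments_to_gamma(u, s2):
    """Invert the above: b = s2/u, a = u^2/s2 (requires u > 0)."""
    return u ** 2 / s2, s2 / u

u, s2 = gamma_to_moments(4.0, 0.5)   # gives (2.0, 1.0)
a_back, b_back = moments_to_gamma(u, s2)  # round-trips to (4.0, 0.5)
```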

This premise seems to be badly flawed.

Solving for s'^2: since s'^2 = a'b'^2 = u'b' = u'(b + L/2), and b = s^2/u, I get s'^2 = (u'/u)s^2 + u'L/2. Now consider the case where I do know the means and u' = u: since (for a positive mean) all the terms on the right-hand side are positive, the result is that the more observations I make, the larger my posterior standard deviation gets.
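A quick numerical check of the formula from my derivation, using made-up prior values and simulated standard-normal values for ln(x), shows the same behaviour: the posterior variance keeps growing as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 1.0            # hypothetical prior: gamma(shape a, scale b)
u, s2 = a * b, a * b ** 2  # prior mean u = ab, prior variance s^2 = ab^2
mu_log = 0.0               # the known "mean" of ln(x)

post_vars = []
for n in (10, 100, 1000):
    log_x = rng.normal(mu_log, 1.0, size=n)        # simulated ln(x) samples
    L = np.sum((log_x - mu_log) ** 2)
    u_post = (a + n / 2) * (b + L / 2)             # u' = a'b'
    s2_post = (u_post / u) * s2 + u_post * L / 2   # s'^2 from my derivation
    post_vars.append(s2_post)
# post_vars comes out strictly increasing: more data, larger posterior variance
```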

???

If the posterior is a gamma distribution, and I'm not dealing with zeros in any odd places, then the relations between the a's, b's, u's and s's are all well defined. So what am I doing wrong to get these apparently nonsensical results?