# Stupid question: deriving standard dev from mean, C.I., and sample size

#### Syriloth

##### New Member
So I'm trying to get a standard deviation value for a table of statistics that includes a mean, a 95% confidence interval, and a sample size. I'm actually unsure whether the sample size is correct — more on that later. It's been a while since I've taken statistics, and attempting to research this myself has only left me more confused.

I'll use an example from my dataset:

mean: 0.65
95% C.I.: 0.244 - 0.485
sample size: 68

I'm not sure what to do with this. I feel like the standard deviation ought to be recoverable, but when I try to research it I just get more confused. I can find plenty of guides for calculating a standard deviation and then building a confidence interval from it, and it feels like it should be straightforward to run that backward, but I'm lost.
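For what it's worth, here is my current (possibly wrong, hence this post) understanding of the backward calculation: the CI half-width divided by the critical value gives the standard error, and multiplying that by sqrt(n) gives the SD. This sketch assumes the paper built the interval as mean ± 1.96 × SD/√n (a normal-approximation 95% CI; if they used a t critical value instead, the numbers change slightly):

```python
import math

def sd_from_ci(lower, upper, n, z=1.96):
    """Back-calculate SD from a 95% CI, assuming the interval was
    constructed as mean +/- z * SD / sqrt(n)."""
    half_width = (upper - lower) / 2   # half the CI width
    se = half_width / z                # standard error of the mean
    return se * math.sqrt(n)           # SD = SE * sqrt(n)

# My example values: CI 0.244 - 0.485, n = 68
print(sd_from_ci(0.244, 0.485, 68))   # roughly 0.507
```

Is that the right idea, or am I missing something?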

Sorry for being so stupid. I know this should be easy, but I think I've just managed to make things worse for myself and I really would appreciate somebody explaining this to me.

Secondly, I'm not sure the sample size given in my dataset is actually correct. The paper I'm taking this data from lists several different sample sizes (I'll spare you the details as to why), and it isn't clear which one was used to compute the confidence intervals they list. If I can't confidently identify the correct sample size, is it still possible to get an estimate of the standard deviation, even if it can't be derived exactly? Close would be better than nothing in this case.
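In case it helps anyone answer that second part: if my understanding is right that SD scales with √n, then I can at least bracket the answer by plugging in each candidate sample size and seeing how much the result moves. The candidate values of n below are hypothetical placeholders, not the ones from the paper, and this again assumes a normal-approximation 95% CI:

```python
import math

def sd_from_ci(lower, upper, n, z=1.96):
    # Assumes the interval was built as mean +/- z * SD / sqrt(n).
    return (upper - lower) / 2 / z * math.sqrt(n)

# Hypothetical candidate sample sizes (the real candidates are in the paper):
for n in (50, 68, 100):
    print(n, round(sd_from_ci(0.244, 0.485, n), 3))
```

Since the SD estimate grows like √n, the spread between the smallest and largest plausible n tells me how far off "close" might be.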