Hello all. I'd love some help with a problem I'm having, which is not for homework but for my research (I'm a new graduate student in biophysics). Here is the problem:
I'm measuring the number of times I "observe" a particular gene in an experiment. Say I observe it N times, and I want to compute the error on that measurement. However, there is a twist: because of the experimental details, each observation of the gene comes with a fractional score (less than 1) reflecting how certain I am that I actually saw it. So for each gene I have a series of scores s1, s2, s3, ..., sN, and what I really want is the error on the total score S = s1 + s2 + ... + sN.
The way I've been thinking about it: if each observation simply counted as 1, this would be an ordinary counting problem, with N measurements and an error of sqrt(N). With the fractional scores, should the error perhaps be S/sqrt(N), i.e., the counting error sqrt(N) scaled by the mean score S/N (the "standard" error)?
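In case it helps make the question concrete, here is a toy simulation of the kind of sanity check I have in mind. The Poisson-distributed number of observations and the uniform score distribution are just placeholders, not my real data; the idea is simply to compare the candidate error formulas against the spread of S over many simulated repeats of the experiment.

```python
import numpy as np

# Toy model of one "experiment" (placeholder assumptions, not my real data):
# the number of observations N is Poisson-distributed, and each observation's
# score is drawn uniformly between 0 and 1.
rng = np.random.default_rng(0)
mean_N = 50
n_trials = 100_000

# Repeat the whole experiment many times and record the total score S each time.
totals = np.array([
    rng.uniform(0.0, 1.0, size=rng.poisson(mean_N)).sum()
    for _ in range(n_trials)
])

# One representative experiment, to plug into the candidate error formulas.
scores = rng.uniform(0.0, 1.0, size=rng.poisson(mean_N))
N, S = len(scores), scores.sum()

print("empirical spread of S over repeats:", totals.std())
print("candidate 1: sqrt(N)              :", np.sqrt(N))
print("candidate 2: S/sqrt(N)            :", S / np.sqrt(N))
```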
Thank you very much for your help.