I'm evaluating the precision of angle measurements in an experiment: the object lies in a Petri dish at a certain angle to the 'north' position. I measured the angle from a photo of the object once a day over 15 different days, so I have 15 values (mean and median = 336.6 degrees) with a measurement range of 335.5-337.0 degrees, i.e. a span of 1.5 degrees. The sample standard deviation is 0.39 degrees.

My question is: do I understand correctly that the relative error of the measurement should be (SD/mean)*100% (i.e. about 0.1%), and not (1/2*range/mean)*100% (i.e. about 0.22%)?
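For what it's worth, the two candidate error estimates can be computed directly from the summary statistics quoted above (the 15 raw angle values themselves are not reproduced here, so this just plugs in the reported mean, SD, and range endpoints):

```python
# Relative error computed both ways from the reported summary statistics.
mean = 336.6            # degrees (mean = median of the 15 measurements)
sd = 0.39               # degrees (sample standard deviation, as reported)
lo, hi = 335.5, 337.0   # observed range; span = 1.5 degrees

rel_err_sd = sd / mean * 100                   # SD-based relative error, in %
rel_err_range = 0.5 * (hi - lo) / mean * 100   # half-range-based relative error, in %

print(f"SD/mean:         {rel_err_sd:.2f}%")   # ~0.12%
print(f"half-range/mean: {rel_err_range:.2f}%")  # ~0.22%
```

Note that with the rounded SD of 0.39 degrees, the SD-based figure comes out to about 0.12% rather than exactly 0.11%; the small discrepancy is just rounding of the SD itself.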

Thank you.