statistics question re margin of error, standard deviation

I took a test for which the score range was 70-130. A score of 100 represents the average score of my peer group. The standard deviation was 15. My score was 105.

I am wondering how accurate the test is. That is, on a hypothetical perfectly devised test, could a reported score of 100 truly correspond to anything between 90 and 110? Between 95 and 105? I am thinking that the quantity I want to know is the "standard deviation". Is that correct?
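
To put numbers on what I mean, here is a quick Python sketch, purely illustrative arithmetic using only the figures above, comparing the intervals I have in mind with the test's reported standard deviation of 15:

```python
# The intervals I am wondering about, written as 100 plus/minus a half-width,
# compared with the test's reported standard deviation of 15.
mean_score = 100
sd = 15

candidate_intervals = {"90 to 110": 10, "95 to 105": 5}
for label, half_width in candidate_intervals.items():
    print(f"{label} is {mean_score} +/- {half_width}; "
          f"one SD would be +/- {sd}, i.e. {mean_score - sd} to {mean_score + sd}")
```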

But if so, I have also read somewhere that the margin of error is usually twice the standard deviation. Would that mean that in this case a score of 100 could truly reflect anything from 70 to 130?!?
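
Spelling out the arithmetic behind that worry (again just a sketch, taking the "margin of error = twice the standard deviation" rule of thumb at face value):

```python
# If the margin of error really were two standard deviations, a reported 100
# would come with this interval:
mean_score = 100
sd = 15

margin_of_error = 2 * sd                    # 30 points
low = mean_score - margin_of_error          # 70
high = mean_score + margin_of_error         # 130
print(f"100 +/- {margin_of_error} -> {low} to {high}")  # the entire 70-130 score range
```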