Gallup poll

Hello everyone, my first post!

I haven't really taken a class in statistics, so forgive me if the question seems silly.

Here's an article from the Gallup website describing the methodology of their polls: http://media.gallup.com/PDF/FAQ/HowArePolls.pdf. The following is taken from the article.

____________________

For example, with a sample size of 1,000 national adults, (derived using careful random selection procedures), the results are highly likely to be accurate within a margin of error of plus or minus three percentage points. Thus, if we find in a given poll that President Clinton's approval rating is 50%, the margin of error indicates that the true rating is very likely to be between 53% and 47%. It is very unlikely to be higher or lower than that.

To be more specific, the laws of probability say that if we were to conduct the same survey 100 times, asking people in each survey to rate the job Bill Clinton is doing as president, in 95 out of those 100 polls, we would find his rating to be between 47% and 53%. In only five of those surveys would we expect his rating to be higher or lower than that due to chance error.

_____________________
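
For what it's worth, the plus or minus three-point figure for a sample of 1,000 seems to match the usual normal-approximation formula for a proportion at a 95% confidence level, assuming a worst-case proportion of 50%. This is just my own back-of-the-envelope check, not necessarily Gallup's exact calculation:

```python
import math

# 95% margin of error for a proportion via the normal approximation:
# MOE = z * sqrt(p * (1 - p) / n), with z ~ 1.96 for 95% confidence.
n = 1000   # sample size
p = 0.5    # worst-case (maximum-variance) proportion
z = 1.96   # 95% normal critical value

moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {moe:.3f}")  # about +/- 0.031, i.e. roughly 3 points
```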


However, I don't agree with Gallup's interpretation. As far as I understand, the correct statement is that in approximately 95 out of 100 polls, the true rating falls inside that particular poll's own margin-of-error interval (its estimate plus or minus three points), not inside the first survey's specific interval of 47% to 53%.

Here's a scenario in which Gallup's statement is clearly wrong. It's conceivable that the first poll completely missed the true rating (which should happen about 5% of the time). In that case, there's essentially zero probability that 95 out of 100 polls would find the rating to be between 47% and 53%.
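
To make my point concrete, here's a quick simulation sketch. It assumes a hypothetical true rating of 50% and simple random sampling of 1,000 respondents per poll, so the exact numbers are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

true_p = 0.50      # hypothetical true approval rating (assumption for illustration)
n = 1000           # respondents per poll
moe = 0.031        # +/- 3.1 points, the normal-approximation margin of error
num_polls = 100_000

# Simulate each poll's estimated rating.
estimates = rng.binomial(n, true_p, size=num_polls) / n

# Interpretation I believe is correct: each poll's own interval
# (estimate +/- moe) contains the true rating about 95% of the time.
covers_truth = np.abs(estimates - true_p) <= moe
print("Share of polls whose own interval contains the true rating:",
      covers_truth.mean())   # roughly 0.95

# Gallup's wording: suppose the *first* poll happened to miss badly,
# e.g. it estimated 54%, giving an interval of roughly 51% to 57%.
first_low, first_high = 0.51, 0.57
inside_first = (estimates >= first_low) & (estimates <= first_high)
print("Share of polls whose estimate falls inside that first interval:",
      inside_first.mean())   # far below 0.95
```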

Am I missing something here?