I think I'm going to commit the cardinal internet-forum sin and make my first post a possibly boneheaded question for people who have better things to do... but here we go anyway. :)

I have a CS background with a very minimal amount of stats and have a question related to internet gaming statistics. My question is:

How can I determine how large a sample size I'd need to estimate a single player's average win rate, without that player actually playing thousands of games? Is 100 games enough? This is a 15-vs-15 game. Assume players of varying skill levels are placed on completely randomized teams, and the outcome of each game depends only on the skills of the individual players.

ex) Player A plays 100 games and wins 55%. This is +/- X% of what that player would see if they were to play 10,000 games.

What sort of statistics problem does this translate into, and how do I represent it in notation and solve it?

If this is too basic perhaps someone can point me toward a good resource for figuring this out myself?

Thanks!

Sean