Entropy

Can someone give me a rundown on information entropy? I've read the wiki article on it, so I understand that it's a measure of the uncertainty associated with a random variable, but I don't understand how maximising it is beneficial. Judging by the coin example (i.e. if you're tossing a coin, to maximise the entropy you need to ensure that the coin is fair), that's the thing: in order to benefit from a random event, wouldn't the best idea be to minimise it rather than maximise it?

So, thinking in Bayesian terms, i.e. where you have prior information on certain data/events, how can it make sense to maximise entropy if, by knowing past data and assuming a continuation of the same general trend in that data, you can win a lot more? Using the coin example: assuming the coin is biased and doesn't get switched for a different coin with a different entropy, you could play a game and win a lot more if the entropy is low!
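To make what I mean concrete, here's a rough Python sketch (the function name coin_entropy and the probabilities are just my own made-up illustration, not from anywhere official):

import math

def coin_entropy(p):
    """Shannon entropy (in bits) of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.5, 0.9):
    # Best strategy: always guess the more likely side, so you win max(p, 1-p) of the time.
    print(f"p = {p}: entropy = {coin_entropy(p):.3f} bits, "
          f"best-guess win rate = {max(p, 1 - p):.0%}")

The fair coin (p = 0.5) has the maximum entropy of 1 bit but you only win 50% of the time, while the biased coin has lower entropy (about 0.47 bits) and you win 90% of the time, which is exactly why I don't see why maximising entropy is supposed to be a good thing.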

P.S. Sorry for posting in both the probability and the statistics forums; I'm not sure which heading this falls under.