My grandson is a huge football fan. His team, the Seattle Seahawks, made the playoffs. Sadly mine, the 49ers, didn't.

For the last two years, we have had a little playoff competition to see who is the better prognosticator. We each give odds and score predictions for each playoff game and then compare the results.

This year, I'd like to analyze the results mathematically. He's showing some interest in math and I'd like to encourage that.

If I have each of our odds and score predictions for the 4 games this weekend, is there a formula I can use to calculate the accuracy of those predictions after we have the final scores?

For each game, I will have his and my predictions of both the odds and the final score. I'm thinking of making a spreadsheet something like this:

Code: 
                  Him          Me
              Score  Odds    Score  Odds
  Kansas City    16   60%       28   65%
  Houston        12   40%       21   35%
  Pittsburgh      9   53%       13   48%
  Cincinnati      7   47%       16   52%
  Green Bay      21   44%       28   51%
  Washington     30   56%       27   49%
  Seattle        24   90%       21   45%
  Minnesota      23   10%       24   55%


Given this data, how can I calculate some sort of accuracy rating once I get the final scores?

I can count how many predictions each of us got right or wrong, which gives me one statistic.

For the scores, I can add up the differences between the predictions and the actual scores. Is there a better way?
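Just to show what I mean by adding up the differences, here it is as a little Python sketch (only because it's easy to show the arithmetic; in the spreadsheet it would just be a couple of columns). The final scores here are invented purely for illustration, and I take absolute values so over- and under-predictions don't cancel out:

Code: 
  # Sketch: total score error as the sum of absolute differences
  # between predicted and actual points.
  predictions = {
      # team: (his predicted points, my predicted points)
      "Kansas City": (16, 28),
      "Houston":     (12, 21),
  }
  actual = {
      # hypothetical final scores, purely for illustration
      "Kansas City": 27,
      "Houston":     20,
  }

  his_error = sum(abs(predictions[t][0] - actual[t]) for t in actual)
  my_error  = sum(abs(predictions[t][1] - actual[t]) for t in actual)

  print("His total error:", his_error)   # 11 + 8 = 19 here; lower is better
  print("My total error:", my_error)     # 1 + 1 = 2 here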

But how can I evaluate the odds? If he gives a team a 90% chance and I give them a 70% chance and the team wins, he should get the better score; if the team loses, I should. I could just add up the differences between the stated odds and the actual outcome (100% for a win, 0% for a loss), like with the scores, but it seems like the scoring ought to be more logarithmic (less linear).
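To make the "logarithmic" idea concrete, here is a small sketch comparing a plain linear difference with a log-based score for that 90% vs. 70% example. The log rule is only my guess at what "more logarithmic" could mean; the property I'm after is that it punishes a confident wrong call much more heavily than a linear difference does:

Code: 
  import math

  def linear_penalty(prob_win, team_won):
      # gap between the stated probability and the outcome (1 = win, 0 = loss)
      outcome = 1.0 if team_won else 0.0
      return abs(prob_win - outcome)      # smaller is better

  def log_score(prob_win, team_won):
      # log of the probability assigned to what actually happened
      p = prob_win if team_won else 1.0 - prob_win
      return math.log(p)                  # closer to 0 is better

  for prob in (0.90, 0.70):
      print(f"prob={prob:.2f}"
            f"  win: linear={linear_penalty(prob, True):.2f},"
            f" log={log_score(prob, True):.3f}"
            f"  loss: linear={linear_penalty(prob, False):.2f},"
            f" log={log_score(prob, False):.3f}")

With 90% on a team that loses, the log score drops to about -2.3 versus about -1.2 for 70%, while the linear gaps are only 0.90 versus 0.70, which is the kind of nonlinearity I have in mind.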