Cohen's Kappa Problem

I am aiming to check the inter-rater reliability of a scale using Cohen's kappa. I have entered the 5 scores as their own variables for Rater A and the same again for Rater B (1A, 2A, 3A, 4A, 5A and 1B, 2B, 3B, etc.).

In each variable, the scores range from 0 to 40 and are not categories - just raw scores. (It would be impossible to categorise them due to the wide variation in scores.)

I run Cohen's kappa in SPSS, and for 2A*2B and 3A*3B I get an answer no problem. My problem is when I run it on 1A*1B, 4A*4B and 5A*5B. I get the error message "Kappa statistics cannot be computed. They require a symmetric 2-way table in which the values of the first variable match the values of the second variable."

Is this because the two scorers have not used exactly the same numbers? (For example, rater A gives a score of 8 somewhere in the results and rater B does not give that score at all?)
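If it helps to see what I mean, here is a minimal sketch (with made-up scores, not my real data) of how one rater using a value the other never uses makes the crosstab non-square, which is what the SPSS message seems to be complaining about:

```python
# Hypothetical scores: rater A gives an 8 somewhere, rater B never does,
# so the A-by-B crosstab is not square.
a = [5, 8, 12, 12, 20]
b = [5, 9, 12, 13, 20]

rows = sorted(set(a))  # score values rater A actually used
cols = sorted(set(b))  # score values rater B actually used
print(rows)  # [5, 8, 12, 20]
print(cols)  # [5, 9, 12, 13, 20]

# Unweighted kappa needs rows == cols (a square, symmetric table);
# here the table would be 4x5, so the statistic cannot be computed.
```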

Any way to overcome this? I can't quite understand why it works for 2 of the variables and not the other 3. I've read somewhere about weighted kappa(?), but I'm not sure how to go about it or whether it would work.
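In case it's relevant, outside SPSS a weighted kappa can be computed directly - for example scikit-learn's `cohen_kappa_score` pools the labels from both raters (so it doesn't hit the asymmetric-table problem) and supports quadratic weights, which penalise big disagreements more than near-misses on an ordinal scale. A sketch with made-up scores:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings on the 0-40 scale; rater B's values don't all
# match rater A's, which is what breaks unweighted kappa in SPSS.
a = [5, 8, 12, 12, 20]
b = [5, 9, 12, 13, 20]

# Quadratic-weighted kappa treats a disagreement of 1 point as far
# less serious than a disagreement of 10 points.
k = cohen_kappa_score(a, b, weights="quadratic")
print(round(k, 3))
```

With close-but-not-identical scores like these, the weighted kappa comes out high, whereas unweighted kappa would treat every 1-point difference as a total disagreement.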

Any help would be appreciated. :)