STDEV at Different Means

I'm trying to measure 'consistency' in baseball run scoring between two different sample groups.

For example, in samples of 162 games each, one team had a mean of 5.06 runs scored with a standard deviation of 3.43, and the other had a mean of 3.94 with a standard deviation of 2.62.

Which team has been more 'consistent'? I don't know what the real method is here. I don't think it's linear.... like, stdev/mean. What I'd like to be able to say is that one team was more or less consistent than the other, and I wonder how I'd arrive at that answer.

Thank you.
If something is consistent, it varies little or follows a known pattern. You are not talking about patterns or sequences here, so the issue is variability. One would immediately say stdev/mean is the measure, but you rule this out, saying "I don't think it's linear.... like, stdev/mean". Do you have an explanation for this? I know stdev/mean as the coefficient of variation (standardized variation), which is used for comparing the variability of two or more variables, especially when they are measured on different scales. The difference in the means here is significant (you can test it), which warrants using stdev/mean rather than comparing the raw standard deviations.
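A quick sketch of what the comparison looks like with the figures from your question (plain Python; nothing assumed beyond the means and standard deviations you gave):

```python
def coefficient_of_variation(stdev, mean):
    """CV = stdev / mean: variability relative to the mean."""
    return stdev / mean

# Figures from the question: (stdev, mean) per team
team_a = coefficient_of_variation(3.43, 5.06)
team_b = coefficient_of_variation(2.62, 3.94)

print(f"Team A CV: {team_a:.3f}")  # ~0.678
print(f"Team B CV: {team_b:.3f}")  # ~0.665

# The team with the smaller CV has less variability relative to its
# scoring level, i.e. it has been the more 'consistent' scorer by this measure.
```

By this measure the second team comes out slightly more consistent, though the two CVs are close enough that the difference may not be meaningful.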