"Averaging" linear trends = linear regression slopes ?

Hello everyone,

I was not able to find a solution to the following question, so let me explain it with a simple example: We have N = 12 weather boxes measuring the air temperature every minute, distributed over a fairly large area (e.g. 500 km^2). We measure the temperatures for 30 years, calculate the linear regression slope for each box, and see a small positive trend. But now someone asks us about the area as a whole: how did the temperature develop (linearly) OVERALL? We have more than 60 000 measurement points per weather box, over which we calculate each slope (and the residuals are more or less normally distributed).

Can we simply AVERAGE these slopes to get an OVERALL slope? I guess not; perhaps they should be transformed and back-transformed first, as is done with correlations (Fisher's z-transformation)? Or should we just use the median?

I could not find anything about this question, but it should be a common, well-known problem whenever we want to "sum up" linear temporal trends over different samples.

Can anyone help me with how to "average" linear trends, i.e. linear regression slopes?


Cookie Scientist
Yes, averaging the 12 individual slopes is a sensible thing to do. Technically, a more sophisticated approach would be to fit a multilevel model with time points nested in weather boxes, but as long as (1) the weather boxes are all observed over essentially the same time frame, and (2) there are similar total numbers of measurements for each box (i.e., no weather box has a lot more missing data than the others), the results of the multilevel model will end up very close to the simple average of the 12 slopes.
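A minimal sketch of why this works, under the balanced-design assumption above (every box observed at the same time points): the slope of a single pooled regression over all boxes' data is then exactly the simple average of the per-box slopes. The box count, time grid, and temperature values below are made-up illustration numbers, not the poster's data:

```python
import numpy as np

rng = np.random.default_rng(0)

n_boxes = 12
t = np.linspace(0, 30, 360)  # a common time grid for all boxes (e.g. monthly means over 30 years)

# Simulate each box: its own baseline, a slope near +0.02 deg/year, plus noise
intercepts = rng.normal(10.0, 2.0, n_boxes)
slopes_true = rng.normal(0.02, 0.005, n_boxes)
temps = intercepts[:, None] + slopes_true[:, None] * t \
        + rng.normal(0, 0.5, (n_boxes, t.size))

# Per-box OLS slopes, then their simple average
per_box_slopes = np.array([np.polyfit(t, y, 1)[0] for y in temps])
avg_slope = per_box_slopes.mean()

# One pooled OLS regression over all boxes at once
pooled_slope = np.polyfit(np.tile(t, n_boxes), temps.ravel(), 1)[0]

print(avg_slope, pooled_slope)  # identical up to floating-point error
```

With unbalanced data (boxes with very different numbers of observations) the pooled fit implicitly weights boxes by their data volume and the two estimates diverge, which is where a multilevel model earns its keep.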