In a rectangle of length L and breadth B, you choose two points uniformly at random (i.e. you draw a uniform random X and a uniform random Y for each point).

You then compute the Euclidean distance d between them (Pythagorean theorem on the x and y differences).

By simulation I can estimate the mean value of d for given L and B, but I need a general formula for it.

Does anyone have an idea how to do this?

(From simulation, for the unit square the mean d ≈ 0.52; for rectangles I couldn't find any pattern.)
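For reference, a minimal Monte Carlo sketch of the simulation described above (assuming NumPy; the function name `mean_distance` and the sample count are my own choices):

```python
import numpy as np

def mean_distance(L, B, n=1_000_000, seed=0):
    """Monte Carlo estimate of the mean distance between two
    points drawn uniformly at random in an L x B rectangle."""
    rng = np.random.default_rng(seed)
    # Each row is one random point (x, y) with x ~ U(0, L), y ~ U(0, B).
    p = rng.uniform([0.0, 0.0], [L, B], size=(n, 2))
    q = rng.uniform([0.0, 0.0], [L, B], size=(n, 2))
    # Euclidean distance per point pair, then average.
    return np.linalg.norm(p - q, axis=1).mean()

print(mean_distance(1, 1))  # ≈ 0.52 for the unit square
```

With a million samples the estimate is stable to about three decimal places, which is enough to check any candidate closed-form formula.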