A long time ago I read somewhere that a regression can tolerate a certain amount of heteroskedasticity (variance inhomogeneity).

As far as I remember, it worked roughly like this: the variance is calculated over certain intervals of the target variable. If the highest variance was at most m times larger than the lowest variance, one could assume that the estimated regression coefficients are still usable, or something along those lines. Unfortunately, I cannot find the source again.
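For context, here is a minimal sketch of how I imagine such a check would work, assuming it compares residual variances across intervals of the fitted values; the number of intervals `k` is my own placeholder, and the threshold m is exactly the value I am asking about:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with mild heteroskedasticity: the noise SD grows with x.
x = np.linspace(1.0, 10.0, 200)
y = 2.0 + 0.5 * x + rng.normal(scale=0.1 * x, size=x.size)

# Ordinary least-squares fit.
slope, intercept = np.polyfit(x, y, deg=1)
fitted = intercept + slope * x
residuals = y - fitted

# Split the observations into k intervals along the fitted values
# and compute the residual variance inside each interval.
k = 4
groups = np.array_split(np.argsort(fitted), k)
variances = [np.var(residuals[idx], ddof=1) for idx in groups]

# The rule of thumb, as I remember it, compares the largest
# variance to the smallest one against some threshold m.
ratio = max(variances) / min(variances)
print(f"max/min variance ratio: {ratio:.2f}")
```

The open question is what value of m (the acceptable max/min ratio) such a rule would use.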

Does anyone know this rule of thumb for dealing with unequal variances?