Here is more background on this question; I apologize for the length. Assume an untested groundwater well (well x) is near a contamination source. We have data on 20 water quality parameters for 100 wells in the area. The idea is that if well x has a value one standard deviation above the 100-well mean for a single parameter, this would not indicate contamination: under a normal distribution, a value one standard deviation above the mean sits at roughly the 84th percentile, well within ordinary variation. However, if well x is one standard deviation above the respective 100-well means for 5 parameters, then (assuming the parameters are independent) we could multiply the probabilities of each value exceeding the baseline. For example: 0.16^5 = 0.0001048576. In other words, we would expect a sample to exceed one standard deviation above the baseline mean for all five parameters at the same time in only about 105 cases out of a million.
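As a sanity check of my arithmetic, here is a small sketch using only the Python standard library (the 0.16 figure is the rounded one-sided normal tail probability P(Z > 1)):

```python
import math

# One-sided tail probability of a standard normal beyond +1 SD:
# P(Z > 1) = 0.5 * erfc(1 / sqrt(2)), roughly 0.1587
p_single = 0.5 * math.erfc(1 / math.sqrt(2))

# Joint probability that 5 INDEPENDENT parameters all exceed +1 SD
p_joint_indep = p_single ** 5

print(p_single)       # about 0.1587
print(p_joint_indep)  # about 1.0e-4, close to the 0.16^5 figure
```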

Does this logic make sense?

Also, some of these 20 parameters are highly correlated, and several would violate the assumption of normality.
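To illustrate why the correlation worries me, here is a simulation sketch (an assumed toy setup, not my actual data: 5 standard normal parameters with a pairwise correlation of 0.8, using numpy):

```python
import numpy as np

rng = np.random.default_rng(0)

k, rho, n = 5, 0.8, 200_000

# Equicorrelated covariance matrix: 1 on the diagonal, rho elsewhere
cov = np.full((k, k), rho)
np.fill_diagonal(cov, 1.0)

samples = rng.multivariate_normal(np.zeros(k), cov, size=n)

# Fraction of samples where ALL 5 parameters exceed +1 SD simultaneously
p_joint_corr = (samples > 1.0).all(axis=1).mean()

p_joint_indep = 0.16 ** 5  # the back-of-envelope independence figure

print(p_joint_corr)   # far larger than 0.16^5 when rho = 0.8
print(p_joint_indep)
```

With strong positive correlation, the five exceedances tend to happen together, so the true joint probability can be orders of magnitude larger than the independence product, and multiplying the marginal tail probabilities would overstate the evidence for contamination.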

Any help is much appreciated!!!