I am trying to calculate an error rate, but I am having trouble deciding which averaging method would be the most accurate.
Here is the case:
Producing 1000 items
First Check Point:
Reviewed: 20% of total produced (200 items)
Errors found: 10
Internal error rate: 10/200 = 5%
Second Check Point:
There are cases where somebody finds an error right after it occurs. I need to count such an error, but I cannot add it to the internally reviewed items. A calculation like that would not be accurate: say 200 items have been reviewed at random and 10 errors have been found. I cannot add this new error to the 10 found internally and assume 201 items were reviewed, because the last item was not picked at random.
So I calculate the second error rate based on the total number of items produced:
In this case, 1/1000 = 0.1%
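To make the point about not pooling the non-random error concrete, here is a minimal sketch (the variable names are mine; the counts just restate the numbers above):

```python
# Counts from the example above
reviewed = 200        # items sampled at random at the first check point
errors_internal = 10  # errors found in that random sample
errors_spotted = 1    # error caught right after it occurred (NOT a random pick)
total_produced = 1000

# Naive pooling treats the non-random item as part of the sample.
# This biases the rate upward: the extra item was included *because*
# it was an error, so it is not a fair draw from production.
naive_rate = (errors_internal + errors_spotted) / (reviewed + 1)  # 11/201, about 5.5%

# Keeping it separate, as in the post: rate the spotted error
# against everything produced.
second_rate = errors_spotted / total_produced  # 1/1000 = 0.1%
```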
Final Check Point:
Errors Found: 2
Since the information about these errors comes from end users, I assume that out of the full 1000 items, only these 2 errors were found.
With this assumption I calculate the third error rate:
2/1000 = 0.2%
Internal error rate 1: 5%
Internal error rate 2: 0.1%
External error rate: 0.2%
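For reference, the three rates above can be computed in a few lines (a sketch restating the post's own arithmetic, nothing more):

```python
# 1000 items produced in total
total = 1000

internal_rate = 10 / (0.20 * total)  # 10 errors in the 20% random sample -> 5%
second_rate = 1 / total              # 1 error caught immediately -> 0.1%
external_rate = 2 / total            # 2 errors reported by end users -> 0.2%

print(f"{internal_rate:.1%}  {second_rate:.1%}  {external_rate:.1%}")
```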
What would be the most accurate estimate of the overall error rate based on these three calculations?
I think a simple average is not the right calculation.
I really count on your help!