The Overall Equipment Effectiveness (OEE) metric, used by many of today’s companies, has been around for a long time. Essentially, OEE reports how effectively a manufacturing operation is utilized. The conventional formula for computing this performance metric is: OEE = Availability x Performance x Quality.
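As a quick illustration, here is a minimal sketch of the calculation in Python; the component values are purely made-up examples, not figures from any particular operation:

```python
# Minimal sketch: computing OEE from its three components.
# The component values below are illustrative only.
availability = 0.90   # fraction of scheduled time the equipment actually ran
performance = 0.95    # actual output rate vs. the ideal output rate
quality = 0.99        # fraction of good units out of total units produced

oee = availability * performance * quality
print(f"OEE = {oee:.4f}")   # about 0.846, i.e. roughly 84.6%
```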
A cursory review of the literature reveals that an OEE of 85% is considered world-class performance. Assuming a uniform weighting, we can reverse-compute the percentage required of each of the three OEE components: 0.85^(1/3) = 0.9473, or about 94.73%. To confirm this, we simply compute 0.9473 x 0.9473 x 0.9473 = 0.85, or 85%. In other words, if Availability, Performance and Quality are each 94.73%, then OEE = 85%.
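The same reverse calculation can be sketched in a couple of lines (Python shown purely for illustration):

```python
# Minimal sketch: reverse-computing the per-component level implied by a
# world-class OEE target, assuming all three components are weighted equally.
target_oee = 0.85

component = target_oee ** (1 / 3)        # cube root of 0.85
print(f"{component:.4f}")                # 0.9473, i.e. about 94.73%

# Sanity check: multiplying the three equal components back together
# recovers the original 85% target.
print(f"{component ** 3:.4f}")           # 0.8500
```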
Of particular interest, let’s zero in on the Quality component. If the Quality portion of the OEE metric is 94.73%, we can convert this to an equivalent Sigma value; doing so reveals Sigma = 3.12. Certainly, this level of Sigma is not world-class. In fact, it’s not even average (4 Sigma). Also of interest, a great many sources on the OEE metric suggest that a quality rate of 60% is quite typical (i.e., average). Interestingly, this converts to 2.5 Sigma. However, after more than 30 years of detailed quality benchmarking, we know the typical level of quality is about 4 Sigma.
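For readers who want to reproduce the conversion, the sketch below assumes the common short-term Sigma convention (the inverse standard-normal of the yield plus the customary 1.5-sigma shift). That convention is an assumption on my part, but it does reproduce the 94.73% quality rate converting to roughly 3.12 Sigma as cited above:

```python
# Minimal sketch: converting a quality rate (yield) to a Sigma value,
# assuming the conventional 1.5-sigma shift used in most Six Sigma tables.
from statistics import NormalDist

def sigma_level(yield_fraction, shift=1.5):
    """Short-term Sigma level for a given first-pass yield."""
    return NormalDist().inv_cdf(yield_fraction) + shift

print(f"{sigma_level(0.9473):.2f}")   # about 3.12 (the 94.73% quality component)
print(f"{sigma_level(0.9938):.2f}")   # about 4.00 (roughly the yield at 'typical' 4 Sigma)
```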
To reinforce the latter point, simply search the internet for the published defect rates of common products and services. For example, consider airline baggage handling. Today, the number of mishandled bags is about 8.83 per 1,000 (reference: http://www.rfidjournal.com/articles/view?10992). This means the quality rate is currently about 1 – (8.83/1,000) = 0.99117, or about 99.12%, which converts directly to a Sigma value of 3.87.
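Running the baggage-handling figure through the same conversion (again assuming the 1.5-sigma shift convention) gives the 3.87 value:

```python
# Minimal sketch: the mishandled-baggage rate expressed as a yield and a Sigma value.
from statistics import NormalDist

mishandled_per_thousand = 8.83
quality_rate = 1 - mishandled_per_thousand / 1000        # 0.99117
sigma = NormalDist().inv_cdf(quality_rate) + 1.5          # conventional 1.5-sigma shift

print(f"{quality_rate:.4%}  {sigma:.2f}")                 # about 99.12% and 3.87
```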
So why is there such a large discrepancy between OEE’s idea of a typical level of quality and that validated through years of empirical benchmarking? How does one reconcile the chasm?
What are your thoughts and opinions?