The Shifty Business of Process Shifts: Part 2


Shift and Drift

The Historical Context

It is often said that the projection of history can forecast the future.  Well, this may be true in some areas of life, but certainly not with respect to today’s common notion of the 1.5 sigma shift factor (as associated with Motorola’s Six Sigma initiative).  In retrospect, there was a fork in the road at which quality professionals and many contemporary Six Sigma practitioners took a wrong turn.

By tracing the technical origins of Six Sigma, we will come to understand why the standardized shift factor was vital to certain aspects of Six Sigma.  More specifically, we will learn that the shift factor was originally intended for use during the course of product design, not real-time process control (as many falsely believe).

On the subject of learning, Dr. George E. P. Box published a key article in 1976 entitled “Science and Statistics.”  In this article he emphasized the importance of balancing theory and practice.  He stated: “One important idea is that science is a means whereby learning is achieved, not by mere theoretical speculation on the one hand, nor by the undirected accumulation of practical facts on the other, but rather by a motivated iteration between theory and practice.”

In light of such wisdom, the 1.5 sigma shift factor was established as a point at which statistical theory was harmonized with the empirical evidence in a purposeful and rational way.  In this context, the shift factor must be viewed as an engineering model, one that can be used to purposefully and rationally compensate a product design analysis for the inevitable influence of random process centering errors.

Dr. Box goes on to say: “For the theory-practice iteration to work, the scientist must be, as it were, mentally ambidextrous; fascinated equally on the one hand by possible meanings, theories, and tentative models to be induced from data and the practical reality of the real world, and on the other with the factual implications deducible from tentative theories, models and hypotheses.”

In the same article, Dr. Box encourages us to always remember that “all models are wrong, but some are useful.”  Here again, such guidance is highly applicable to the 1.5 sigma shift factor.

In this sense, the standardized 1.5 sigma shift factor is a first-order approximation and should never be treated as some type of physical law or theoretical constraint that must be applied in every situation.  Again, the shift factor is a first-order engineering model, not a precise number that should somehow be used to control and report the behavior of a production process.
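To make the first-order nature of this model concrete, here is a minimal sketch (in Python, not part of the original text) of the conventional normal-tail arithmetic behind the familiar figures: a centered six sigma process yields roughly two defects per billion, while the same process compensated by the 1.5 sigma shift yields the well-known 3.4 defects per million.

```python
from scipy.stats import norm

z_design = 6.0   # short-term (design) capability in sigma units
shift = 1.5      # standardized mean-shift compensation

# Centered case: both tails beyond +/- 6 sigma (~0.002 ppm, i.e., ~2 ppb)
ppm_centered = 2 * norm.sf(z_design) * 1e6

# Shifted case: the dominant tail after a 1.5 sigma mean offset (~3.4 ppm)
ppm_shifted = norm.sf(z_design - shift) * 1e6

print(f"centered: {ppm_centered:.4f} ppm, shifted: {ppm_shifted:.2f} ppm")
```

Note that the shift enters this arithmetic as a design-side compensation applied before the fact, not as a quantity measured from live process data.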

Review of the Literature

A cursory review of today’s literature on this topic will reveal that many quality and Six Sigma practitioners attempt to treat the shift factor as a constant when assessing and reporting process capability.  Those who attempt such a misguided use of the standardized shift model usually wind up with a nonsensical array of performance measures.

Owing to their highly suspect outcomes, they often formulate arguments against the shift factor.  Of course, when the outcomes of such misguided applications occur, it’s much easier for them to lay blame at the doorstep of the model than to accept their own misunderstanding of how, why and when to apply it.

Without due consideration, the outcomes of such unsuccessful applications are often broadcast throughout the larger Six Sigma community by way of articles, white papers and discussion boards.  Unfortunately, their authors never stop to question the context of their applications.  As location is to real estate, context is to the shift model.  If the context of application is inappropriate, the resulting conclusions will likely prove faulty.

Again, the shift factor is only intended for use during the course of product design (and certain benchmarking exercises).  It was never intended for use during the course of real-time process control and reporting.  Consequently, the current and next generations of Six Sigma and quality practitioners become tainted by such seemingly legitimate arguments against the shift factor.  The latter points cannot be overemphasized.

From this perspective, opponents of the shift factor are absolutely correct – the shift factor has no place in the world of statistical process control (SPC) or statistical process monitoring (SPM).  Even the world-renowned Dr. Donald J. Wheeler of SPC fame got it wrong when he labeled the 1.5 sigma shift “goofy.”  He simply viewed it from the wrong angle.  In short, he had the right answer to the wrong question.

However, on the flip side, the 1.5 sigma shift factor has great relevance in the world of product design analysis and optimization.  To fully understand the different contexts and better appreciate their consequential implications, we will survey the technical evolution of Six Sigma.  In this way, we can better understand where (and why) the field of quality and Six Sigma “ran off the road” when it comes to Motorola’s standardized shift factor.

The Early Days of Six Sigma

Prior to the innovation of Six Sigma, Motorola was struggling with how to make a quantum change in the quality of its products and services.  Several different quality improvement goals were established and then later elevated.  Thus, the birth of Six Sigma.

In the beginning, there were only a handful of us (at Motorola) directly involved in the initial formulation of Six Sigma (1982–1985), chief among them Mr. Bill Smith and this author – see the table on page 83 of “How Management Innovation Happens,” MIT Sloan Management Review.

During this period of time (1982-1985), we tossed around many ideas and approaches that could be used to achieve a Six Sigma level of quality by 1992.  At day’s end, our developmental goal was to create a methodology that could link product reliability targets to design tolerances and process capability; and do so in a more predictive manner.

In this way, we could “design in” quality rather than becoming masters at “detecting and fixing” defects.  Thus, we could greatly improve customer satisfaction while concurrently reducing the company’s total cost structure.  It was during 1986 that Motorola officially adopted Six Sigma.  Correspondingly, this author published the first definitive essay on the subject, “The Nature of Six Sigma Quality.”

This booklet initially served as the definitive source for Six Sigma.  It also presented and discussed several innovative manufacturing and engineering concepts.  According to Motorola University Press, over 500,000 copies of this booklet were printed and distributed throughout the world.

As Six Sigma was still in a fluid state until about 1987, several grassroots initiatives got underway across Motorola.  The aim of such initiatives was to further solidify the form, fit and function of Six Sigma, as well as to find ways it could be expanded in terms of scope and depth of application.  Over time, the cream of such grassroots efforts rose to the top.

As a matter of informal protocol, the top technical ideas and innovations from around the company were often first passed through a local filter to evaluate things like feasibility and utility.  After this, the surviving ideas and innovations made their way to one or more of the various corporate oversight and policy groups, like the Motorola Corporate Quality Council and Motorola Science Advisory Board.

Of course, the purpose of the corporate-level filter was to review and adopt the best-of-the-best, so to speak.  After passing through this filter, the top ideas were passed on to the Motorola Training and Education Center (later designated as Motorola University in 1989).  Essentially, Motorola University would package and distribute the new knowledge in the form of books, manuals and training programs.

At the end of the line, Motorola executives, managers, engineers, technicians and general employees would make use of the new knowledge (to varied degrees of application).  Naturally, the feedback from such field applications was passed back through the aforementioned filters.  In this way, Motorola was able to quickly deploy and implement new technical knowledge to the benefit of its stakeholders.

Of interest, 1988 was a pivotal year for Motorola and Six Sigma.  During this time, Dr. Ron Lawson and this author published our initial work on the subject of Six Sigma Producibility Analysis and Process Characterization, which quickly migrated to the US Navy.  Also during 1988, this author worked in conjunction with Mr. Reigle Stewart to publish a textbook and supporting software on the subject of Six Sigma Mechanical Design Tolerancing.

The crown jewel for this year was Motorola winning the first Malcolm Baldrige National Quality Award, for which Six Sigma was credited as playing a major role.

The Center of Development

With the aim of consolidating and centralizing the grass-root efforts, this author prepared a formal proposal in late 1989.  The proposal was submitted to the senior executive team of corporate Motorola in January of 1990.  The proposal was subsequently approved by Mr. Bob Galvin (then CEO and Chairman of Motorola).

To implement the proposal’s intent, the Six Sigma Research Institute (SSRI) was formed and operationalized in 1990 by this author.  Through this strategic thrust, Six Sigma began to find its way into the professional head-space of executives, managers, engineers and technicians.  In 1992, this author founded the Six Sigma Technical Institute (SSTI) and launched its first program, which was the technical corollary of the Motorola Management Institute (MMI).

In about the same timeframe, Six Sigma managed to secure its place within the MMI curriculum.  Then in 1992, under the direction of this author, SSRI began creating the Black Belt infrastructure at Motorola, Kodak, Texas Instruments, IBM, and Digital – all of which were members of SSRI.

In short, Six Sigma was quickly changing the company’s perspective of quality.  Before Six Sigma, the company mindset was built around the “Business of Quality.”  After solidifying Six Sigma, the mindset switched to the “Quality of Business.”  Indeed, this was a huge philosophical shift with many implications.

An excellent summary of Six Sigma’s early times appeared in a Quality Digest article prepared by Dr. John S. Ramberg, a prestigious engineering professor at the University of Arizona.  Dr. Ramberg’s highly informative article was entitled “Six Sigma: Fad or Fundamental?”

Owing to his direct involvement with Motorola on this subject, Dr. Ramberg’s narrative provides a crisp, clear and highly insightful look at the technical side of Six Sigma, as well as the early role it played at Motorola.  Of particular interest, the reader is strongly encouraged to read his sidebar discussion: “Origin of Six Sigma: Designing for Performance Excellence.”  Another highly accurate chronology of Six Sigma’s history and evolution is provided by Process Quality Associates, Inc.

The Solid State

By 1993, the Six Sigma program started to build significant momentum within the mainstream of Motorola, as well as several other companies.  From this perspective, it’s easy to see that Six Sigma had transitioned from a liquid to a virtually solid state.

Elsewhere in the company, we were steadily applying the Six Sigma principles and methods to design engineering problems, as well as research and development activities.  Even seemingly diverse areas such as the corporate law department and the cafeteria started using Six Sigma as a means of improvement.

In terms of tools, SSRI had developed and rolled out an electrical design analysis and optimization methodology for both analog and digital circuits.  Also in the toolbox was a statistical tolerancing system for mechanical design engineers.  In short, the Six Sigma toolbox was getting bigger and the applications were many, especially in the area of design engineering.

Also about 1993, Motorola’s quality engineers were consistently using Six Sigma methods in the design and development of new manufacturing processes and factories.  In parallel to this, reliability engineers began systematically working with product designers to reverse engineer certain product component specifications based on a desired reliability model.

For example, a reliability engineer might elect to start with the targeted instantaneous failure rate of a product.  From here, the engineer would reverse compute the pre-screen defect rate which, in turn, would be rationally allocated back to the product system components.

From this information and data, the quality engineer would develop an appropriate process capability model for both the centered and shifted cases.  In this manner, the quality engineers, reliability engineers and product engineers would work together as a team to concurrently design a product that would be robust to process variations in both bandwidth and centering.  Such is the Six Sigma way of thinking.
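As a rough illustration of this reverse-engineering flow, consider the minimal sketch below.  The numbers and the equal-allocation scheme are hypothetical simplifications chosen for clarity, not a record of any actual Motorola procedure:

```python
from scipy.stats import norm

# Hypothetical reliability-driven budget (illustration only)
system_defect_rate = 0.001   # allowable pre-screen defects per unit
n_components = 20            # independent defect opportunities

# Rationally allocate the defect budget equally across components
p_component = 1 - (1 - system_defect_rate) ** (1 / n_components)

# Long-term sigma level implied by the per-component defect rate
z_long_term = norm.isf(p_component)

# Short-term (design) requirement after compensating for centering error
z_short_term = z_long_term + 1.5

print(f"per-component defect rate: {p_component:.2e}")
print(f"long-term Z: {z_long_term:.2f}, short-term Z: {z_short_term:.2f}")
```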

Unfortunately, it was during 1993 that Mr. Bill Smith passed away in the company cafeteria.  His achievements at Motorola were numerous, with Six Sigma being his crown jewel.  It goes without saying that his loss represented a setback in the momentum toward Six Sigma, but through his wisdom and teachings, the continuance of Six Sigma was assured.

By 1994, the principles and methods of Six Sigma had even pierced the veil of software engineering.  At about this time, the core concepts and practices of Six Sigma were being slowly integrated into the areas of financial risk assessment and abatement.  Overall, Six Sigma was rapidly moving toward its destiny as a world-class system of business management.



4 Responses to The Shifty Business of Process Shifts: Part 2

  1. jaggujee says:

    Mikel,
    Do you guarantee Six Sigma yield in every process that you consult for under the banner of SS? If not, SS is a misnomer, mislabel, misrepresentation and a mistake on the part of those who buy into it.

    From the Probability and Statistics I learned, SS is the conclusion from the data crunching only if the data are 100% within the specs. Repackaging the tried and true methods and labeling them Six Sigma does not guarantee SS yield.

    I see that you consulted for GE. GE saved billions not because of SS methodology, but because of the opportunities that existed. Any Cost Improvement drive under any name would have resulted in savings.

    SS does nothing more than raise hopes. All the contents of SS were covered in a couple of courses in the IE curriculum of the last century. I discovered this while getting certified, because certification was required at a company where I worked.

    Jaggujee

    • Mikel Harry says:

      jaggujee:

      Thank you for your comments. As to your first question (paragraph 1), it would appear that you answered your own question in paragraph 2. With the greatest of respect, I must humbly disagree with both the question and the answer you provided. In terms of your commentary in paragraph 3 about General Electric, the literature clearly shows that Jack Welch (via his books on the subject) would strongly disagree with your conclusions about GE. If any improvement program would have brought about GE’s tremendous savings, then why did GE hold back until Six Sigma came along? We do agree that Six Sigma raises hopes; and justifiably so. In support of this assertion, I have included a link to financial reports from top Wall Street analysts concerning the true impact of Six Sigma. Within this report, you will notice that many of the top investment firms believe that Six Sigma produces verifiable bottom-line benefits for the shareholders. Of course, when the money talks, you know it’s real. Again, I respect your viewpoint of Six Sigma; however, in my opinion, your experience with Six Sigma seems quite inconsistent with the published experiences of some of the world’s top business executives.

      Mikel J. Harry, Ph.D.
      Co-Creator of Six Sigma
      National Best Selling Author
      Consultant to World’s Top Executives
      Working Philanthropist

  2. Gabriel Jones says:

    Dear Mikel,

    I read the post with interest. I have a question around the shift. In “Resolving the Mysteries of Six Sigma,” you point out that it is an “equivalent” shift, not a “literal” shift, and that you have to start with a 6 sigma capability to ensure a 4 sigma design. Can this point – that you have to start with a 6 sigma capability to ensure a 4 sigma design – be further elucidated, perhaps with an example?

    Cheers,
    Gabriel

    • Mikel Harry says:

      Gabriel:

      Thank you for your question. I certainly understand how the subject matter associated with your question might be somewhat confusing to those who lack experience with Six Sigma, especially in the areas of design engineering and process qualification.

      Obviously, by the nature of your question, it would seem you are fairly well experienced in the practices of Six Sigma; consequently, I’ll spend less time defining terms (which you probably already know) and go straight to the answer.

      However, just to make sure we’re on the same page (from a context point of view), it would appear that your question was generated from the following extracts (as provided in the book you referenced):

      “Since the analyst’s goal was to net Cp = 1.33 [4 Sigma] with at least 99.5 percent certainty, she determined that a process capability of Cp = 2.0 [6 Sigma] or greater would have to be discovered upon execution of the DPQ (based on df = 29).”

      This point was later reinforced:

      “As the reader should now understand, a short-term sampling capability of ±6 sigma must be realized (during qualification) in order to be at least 99.5 percent confident that the net effect of random sampling error (as accumulated over an extended period of time) will not compromise or otherwise degrade the idealized design capability of ±4 sigma, given that the DPQ was founded on n = 30 random samples collected over a relatively short period of time.”

      The context of your question is set in an example where an engineer has selected a process that she believes can “meet the design specifications” for a certain product performance characteristic.

      This means that her management wants to qualify the selected process before granting final approval to bring it “online” for full scale production.

      At this point, she knows that the design specification is a symmetrical bilateral tolerance. Owing to its symmetry, she decided to work exclusively with the top end (USL – T). In this case, the upper specification limit was given as USL = 130 and the target value as T = 100.

      Furthermore, the design margin M was specified as M = .5, or 50%. This is what we would call a Six Sigma Margin of Safety, whereas a Four Sigma Margin of Safety would be M = .25, or 25%.

      At this point, she made a first-order approximation of the population process standard deviation, which was given as S = (1 – M)(USL – T)/3 = (1 – .50)(130 – 100)/3 = 5.0.

      Thus, she established the best-case process capability requirement as Z.usl = (130 – 100)/5.0 = 6.0. Following this, she reviewed the qualification sampling requirement, which specified n = 30 and 1 – alpha = .995, or 99.5%.

      Using only one side of the chi-square distribution, the upper confidence limit of the first-order standard deviation (re: S = 5.0) was determined to be S = sqrt((30 – 1)(5²)/13.12) = 7.43, or about 7.5. Thus, she determined the worst-case process capability expectation to be Z.usl = (130 – 100)/7.5 = 4.0.

      This is to say that if she targets a 6 Sigma level of capability and her sample of n = 30 reveals a capability of 6 Sigma, she would be at least 99.5% certain that her worst-case scenario would be a true capability of 4 Sigma, which is a conventional level of performance.

      Generally speaking, this is to say she must start with a 6 Sigma capability (as estimated from the sample) in order to ensure a minimum capability of 4 Sigma or greater (in accordance with the qualification criteria).
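      For readers who wish to reproduce the arithmetic, the following is a minimal sketch of the qualification calculation in Python (the chi-square quantile comes from scipy; the figures simply mirror the worked example above):

      ```python
      import math
      from scipy.stats import chi2

      n = 30                       # qualification sample size
      confidence = 0.995           # required level of certainty
      usl, target = 130.0, 100.0   # upper specification limit and target
      margin = 0.50                # Six Sigma margin of safety

      # First-order approximation of the process standard deviation
      s = (1 - margin) * (usl - target) / 3             # = 5.0

      # Best-case process capability requirement
      z_best = (usl - target) / s                       # = 6.0

      # One-sided upper confidence bound on the standard deviation
      chi2_lower = chi2.ppf(1 - confidence, df=n - 1)   # ~= 13.12
      s_upper = math.sqrt((n - 1) * s**2 / chi2_lower)  # ~= 7.43

      # Worst-case process capability expectation
      z_worst = (usl - target) / s_upper                # ~= 4.0

      print(f"best case Z = {z_best:.1f}, worst case Z = {z_worst:.2f}")
      ```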

      As a footnote to my response, you might like to know that I will be publishing another white paper on the Six Sigma Model (SSM), which includes a complete description of the 1.5 sigma shift, as well as where it comes from and what it’s used for.

      The following are a few highlights from the paper that might pique your interest:

      “My primary reason for publishing this paper is attributable to the mass of misunderstandings that are currently being propagated about the Six Sigma Model within the Six Sigma community. Most practitioners are not aware that the SSM is actually a set of performance objectives, where those objectives are established on the basis of human judgment, not sampling statistics or mathematics. In reality, idealized objectives are not subject to any type or form of “proof.” Objectives stand on their own merits. Such things as sampling statistics, mathematics and personal experience can be used to facilitate setting objectives, but not used as tools to authenticate or otherwise validate the worthiness of those objectives. For example, much time and effort has been wasted trying to prove or disprove the shift factor. The shift factor is not universal and should not be used as a surrogate for actual data, yet many attempt to do so.”

      I hope this response has answered your insightful question. I wish you the best.

      Respectfully,

      Mikel J. Harry, Ph.D.
