Friday, May 28, 2021

Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts


If you have not read the book Mistakes Were Made (but Not by Me), it is a must read. Below is a story from the book, one of my favorites, followed by the authors' definition of cognitive dissonance, the engine of "self-justification." Below that is a link to a very informative interview with the authors. Enjoy!

"More than half a century ago, a young social psychologist named Leon Festinger and two associates infiltrated a group of people who believed the world would end on December 21, 1954. They wanted to know what would happen to the group when (they hoped!) the prophecy failed. The group’s leader, whom the researchers called Marian Keech, promised that the faithful would be picked up by a flying saucer and elevated to safety at midnight on December 20. Many of her followers quit their jobs, gave away their houses, and disbursed their savings in anticipation of the end. Who needs money in outer space? Others waited in fear or resignation in their homes. (Mrs. Keech’s husband, a nonbeliever, went to bed early and slept soundly through the night, while his wife and her followers prayed in the living room.) Festinger made his own prediction: The believers who had not made a strong commitment to the prophecy—who awaited the end of the world by themselves at home, hoping they weren’t going to die at midnight—would quietly lose their faith in Mrs. Keech. But those who had given away their possessions and waited with other believers for the spaceship, he said, would increase their belief in her mystical abilities. In fact, they would now do everything they could to get others to join them. At midnight, with no sign of a spaceship in the yard, the group was feeling a little nervous. By 2:00 a.m., they were getting seriously worried. At 4:45 a.m., Mrs. Keech had a new vision: The world had been spared, she said, because of the impressive faith of her little band. “And mighty is the word of God,” she told her followers, “and by his word have ye been saved—for from the mouth of death have ye been delivered and at no time has there been such a force loosed upon the Earth. Not since the beginning of time upon this Earth has there been such a force of Good and light as now floods this room.” The group’s mood shifted from despair to exhilaration. Many of the group’s members who had not felt the need to proselytize before December 21 began calling the press to report the miracle. Soon they were out on the streets, buttonholing passersby, trying to convert them. Mrs. Keech’s prediction had failed, but not Leon Festinger’s.

"The engine that drives self-justification, the energy that produces the need to justify our actions and decisions—especially the wrong ones—is the unpleasant feeling that Festinger called “cognitive dissonance.” Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.” Dissonance produces mental discomfort that ranges from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn’t really so harmful, that smoking is worth the risk because it helps her relax or prevents her from gaining weight (after all, obesity is a health risk too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways."

As humans we are adept at self-justification. The more intelligent we are, the better we can deceive ourselves. After all, we have to live with ourselves.

Reference: Tavris, Carol, and Aronson, Elliot. Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts (pp. 15-16). HMH Books.

For a great interview with the authors, please access this link:

 #130 - Carol Tavris, Ph.D. & Elliot Aronson, Ph.D.: Recognizing and overcoming cognitive dissonance - Peter Attia (peterattiamd.com)

Richard Feynman on the Scientific Method

Richard Feynman, the Nobel Prize-winning physicist, gave some very famous lectures. This one, delivered at Cornell University, focused on the scientific method. Note his use of language: he starts simple and then builds the theory. Enjoy: https://fs.blog/2009/12/mental-model-scientific-method/



Tuesday, May 1, 2018

Time to Retire the 16th Century "Root Cause" Phrase and Thinking: An Update


Systems thinking has destroyed the 16th century idea of single-cause thinking, and it has been on a roll since Bertalanffy published General System Theory in 1968. In spite of systems thinking, the phrase "root cause" persists.

Psychologically, there is an upside and a downside to the phrase "Root Cause Analysis" (RCA). On the upside, the illusion of single-cause thinking gives people hope: it sends the message that one thing is going on and they can handle that one thing. On the downside, the mental image of a "root cause" leads people to stop at finding a single cause. I once watched in horror as a Master Black Belt (MBB) led a group of engineers in a high-tech company through a multi-voting exercise on an Ishikawa diagram. Once the MBB had all the votes, the group focused on the "top cause," and a short plan was put together to investigate this single "cause." Since I was visiting, I stayed silent until someone asked me what I thought. I asked a question about the possible covariance of factors for the application being discussed. After one engineer confirmed that the factors do indeed interact, the group got back to reality: rather than one factor, they needed to consider multiple factors in a designed experiment.

There is hope. People are waking up! In 2015, the National Patient Safety Foundation exposed many of the problems with the myth of "Root Cause Analysis." From the report:

"RCA itself is problematic and does not describe the activity’s intended purpose. First, the term implies that there is one root cause, which is counter to the fact that health care is complex and that there are generally many contributing factors that must be considered in understanding why an event occurred. In light of this complexity, there is seldom one magic bullet that will address the various hazards and systems vulnerabilities, which means that there generally needs to be more than one corrective action. Second, the term RCA only identifies its purpose as analysis, which is clearly not its only or principal objective, as evidenced by existing regulatory requirements for what an RCA is to accomplish. The ultimate purpose of an RCA is to identify hazards and systems vulnerabilities so that action scan be taken that improve patient safety by preventing future harm.

"The term RCA also seems to violate the Chinese proverb “The beginning of wisdom is to call things by their right names,” and this may itself be part of the underlying reason why the effectiveness of RCAs is so variable. While it might be better not to use the term RCA, it is so imbedded in the patient safety culture that completely renaming the process could cause confusion."


The last line is tragic; unlearning is usually the first step in learning (though some try to avoid it at all costs). With this line, the authors are in effect protecting people from learning. Cognitive dissonance is a natural part of how we learn, adapt, and change. The paper on RCA2 can be found here:

http://www.ihi.org/resources/Pages/Tools/RCA2-Improving-Root-Cause-Analyses-and-Actions-to-Prevent-Harm.aspx

The effort to restore systems thinking and 21st century science continued in February 2017 with the publication of a paper by Kiran Gupta, MD, MPH, and Audrey Lyndon, PhD, entitled Rethinking Root Cause Analysis. This paper has some great tables that describe the various problems associated with RCA. The authors build on the 2015 paper referenced above.
Recently, Duncan Mackillop observed on LinkedIn: "I wonder what the 'root cause' is of something that's gone right? After all, things go right and things go wrong for the same reasons." Typically, "root cause" thinking is invoked only when something has gone wrong. Duncan's reframing shows how limited single-cause thinking is for any endeavor beyond the very simple.


References:

1. http://www.ihi.org/resources/Pages/Tools/RCA2-Improving-Root-Cause-Analyses-and-Actions-to-Prevent-Harm.aspx



Tuesday, March 21, 2017

Part 2: Complexity of the Real World Has Outpaced the Myth of the Linear Method – The Need for a Science-Based Methodology Based on Questions.


Part 1 (see link to Part 1 here) presented some information on the use of linear models that reflect the ideal path of a development or learning journey. Unfortunately, the real world typically deviates from this ideal view. What sort of model would better reflect our actual learning journey?
Learning from Deming – The Analytic Study
Deming offers some help in distinguishing two types of studies for learning from data and from our experience in the real world. He classified studies by the type of action that will be taken (1975)[i]:
1.       Enumerative study: one in which action will be taken on the universe that was studied (e.g., conducting a census, or sampling materials for a decision on acceptance or pricing).
2.       Analytic study: one in which action will be taken on a cause system to improve performance of a product, process, or system in the future (e.g., a study to select a future raw-material supplier or using a Shewhart control chart to learn and improve a process).
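To make the analytic idea concrete, here is a minimal sketch, mine rather than Deming's or the references', of the individuals (XmR) form of the Shewhart chart named in the second definition. The yield numbers are invented for illustration; the point is that the chart is used to learn about the cause system so we can act on it to improve future performance.

```python
# A minimal XmR (individuals and moving range) control chart sketch.
# The daily yield values are invented for illustration.
daily_yield = [71.2, 70.8, 72.5, 71.9, 70.1, 73.0, 71.4, 72.2, 70.6, 71.8]

mean = sum(daily_yield) / len(daily_yield)
moving_ranges = [abs(b - a) for a, b in zip(daily_yield, daily_yield[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Standard XmR limits: mean +/- 2.66 times the average moving range.
ucl = mean + 2.66 * mr_bar
lcl = mean - 2.66 * mr_bar

for x in daily_yield:
    note = "  <-- outside limits, investigate the cause system" if x > ucl or x < lcl else ""
    print(f"{x:6.1f}{note}")
print(f"mean = {mean:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
```

Points outside the limits are signals to learn from. Action is taken on the underlying cause system for the future, not on the universe already sampled, which is what makes this an analytic study.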
Note the key word in the definition of an analytic study: future. For Deming, the problem is prediction. In the Foreword to the book Quality Improvement Through Planned Experimentation by Moen, Nolan, and Provost (2009), Deming discusses how the results of studies for improvement are used to predict:
Why does anyone make a comparison of two methods, two treatments, two processes, two materials? Why does anyone carry out a test or an experiment? The answer is to predict; to predict whether one of the methods or materials tested will in the future, under a specified range of conditions, perform better than the other one. Prediction is the problem, whether we are talking about applied science, research and development, engineering, or management in industry, education, or government. The question is, what do the data tell us? How do they help us to predict?
What sort of model should we follow that helps us with the problem of prediction? When in doubt, we can fall back on the methods used in science, where questions become our guideposts for learning as we make predictions about the effects of our changes. Questions are powerful.

Science begins and ends in questions, but does not end in the same question in which it began.[ii]
Thinking about how to phrase useful questions is no easy feat. Dennett (2006) has offered a good description of the challenge of getting to useful questions[iii]:
“…anybody who has ever tackled a truly tough problem knows that one of the most difficult tasks is finding the right questions to ask and the right order to ask them in. You have to figure out not only what you don’t know, but what you need to know and don’t need to know, and what you need to know in order to figure out what you need to know, and so forth. The form our questions take opens up some avenues and closes off others, and we don’t want to waste time and energy barking up the wrong trees.”

This quote leads us to some observations about the utility and power of questions as we make changes and improvements:

·         When making improvements, questions lay out the journey of learning to develop, test, and implement changes.

·         A good question gives us the opportunity to view many possible answers. People in the same process have different experiences and perspectives, and these differences may produce very different answers and predictions relative to a well-crafted question.

·         Curiosity is critical to the crafting of questions. People who are wedded to the current way of doing things can have their thinking changed as they journey with others through the process of answering the questions. When technical change comes, social consequences follow; questions help the adult learner with self-discovery, a powerful process for social change.

·         Questions expand our thinking and lead to more questions and possibilities for making changes. Answers usually end this process of thinking, discovery, and learning.

PDSA – Learning Engine for Analytic Studies

Moving from linear methods, which do not include questions and corresponding predictions, to a methodology that is useful for an analytic study is essential. One such methodology is Deming’s PDSA cycle. Deming’s learning journey toward the PDSA cycle began with his study of C. I. Lewis’s book Mind and the World Order (1929) and ended with the publication of the PDSA cycle in Deming’s book The New Economics (1994). Deming honored Dr. Walter Shewhart by calling the PDSA cycle the Shewhart Cycle for Learning and Improvement[iv]. Deming’s journey of learning that led to the PDSA cycle was documented by Moen and Norman (2010) in the paper Circling Back: Clearing Up Myths About the Deming Cycle and Seeing How It Keeps Evolving[v].

In 1994, Associates in Process Improvement (API) added the rigor and importance of questions to the Plan phase of the PDSA cycle[vi]. Figure 1 describes this contribution:

Figure 1: PDSA Cycle


When developing questions for a PDSA cycle, an individual or team is sometimes tempted to use yes/no questions. Table 1 describes moving from a yes/no question to an inquiry-based question[vii].


Table 1: Moving the Yes/No Question to an Inquiry Type Question
Why is it important to pose the inquiry-based question? When we convene a team of people to make an improvement, these people have had different experiences and interpretations of the system. More importantly, they may come from very different parts of the system. If a yes/no question is posed, people could all answer yes or no for various reasons, based on their experiences, and miss the opportunity to share those experiences and interpretations. An inquiry-type question forces the sharing of different perspectives during the act of making predictions, as people explain the theories behind their predictions, which in turn could change how we collect data or carry out a test. Consider an example:

In an exercise to teach the use of the PDSA cycle, questions were developed and people were asked to predict what impact exercise would have on blood pressure. A test was then carried out by having people run up and down stairs and collecting data on the resulting blood pressures. As predictions were discussed, a nurse posed the theory that women generally have higher blood pressures than men. This led to the need to stratify the data into male and female categories to test the theory. During the Do part of the PDSA cycle, we observed that the men, some with beer bellies, were huffing and puffing during the brief experiment, while the women were all doing very well; it was obvious that the women had been working out. During Study, the theory that women have higher blood pressures than men was updated with a caveat: it depends on the health and physical condition of those in the experiment. Act: in future cycles, further stratify the data by history of physical exercise and health.
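As a minimal sketch of the stratification idea, with invented numbers since the exercise above did not report its actual data, the readings can be grouped first by sex and then by exercise history:

```python
# Invented post-exercise readings: (sex, exercises_regularly, systolic_bp).
readings = [
    ("F", True, 128), ("F", True, 132), ("F", True, 126),
    ("M", False, 152), ("M", False, 158), ("M", True, 134),
]

def mean_bp(group):
    return sum(bp for _, _, bp in group) / len(group)

# First stratification: by sex, to test the nurse's theory.
for sex in ("F", "M"):
    group = [r for r in readings if r[0] == sex]
    print(f"sex={sex}: mean systolic after stairs = {mean_bp(group):.1f}")

# Act step: further stratify by exercise history. In this made-up data,
# fitness, not sex alone, drives the difference.
for fit in (True, False):
    group = [r for r in readings if r[1] == fit]
    print(f"exercises_regularly={fit}: mean systolic = {mean_bp(group):.1f}")
```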

API adopted the PDSA cycle as an engine to drive learning. The Plan includes a plan for data collection. In Do, data are plotted during the collection or test, providing information. During Study, the predictions are considered relative to the information collected in Do, creating knowledge. Action is then based on knowledge.

Built into the journey from data to knowledge in the PDSA cycle is the idea of deductive and inductive learning[viii]. From Plan to Do is the deductive approach: a theory is tested with the aid of a prediction. In the Do phase, observations are made and departures from the predictions are noted. From Do to Study, the inductive learning process takes place: gaps and surprises (anomalies) relative to the prediction are studied, and the theory is updated or discarded as needed. Action is then taken on the new learning to improve our ability to predict, the essence of the analytic approach. Figure 2 describes this iterative process of learning.
Figure 2: PDSA – The Iterative Nature of Learning and Improvement
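The deduction/induction loop can be sketched in a few lines of code. This is my own illustration, not anything published by API or Deming; the "theory" is reduced to a single predicted effect, and Study shrinks the gap between prediction and observation:

```python
# One PDSA cycle as a deduce-test-induce loop (illustrative only).
def pdsa_cycle(theory, run_test):
    prediction = theory["predicted_effect"]     # Plan: deduce a prediction from the theory
    observed = run_test()                       # Do: carry out the test, note departures
    gap = observed - prediction                 # Study: compare observation to prediction
    theory["predicted_effect"] += 0.5 * gap     # Act: induce an updated theory
    theory["history"].append((prediction, observed))
    return theory

theory = {"predicted_effect": 10.0, "history": []}  # e.g., predicted +10 mmHg after stairs
fake_results = iter([16.0, 14.0, 15.0])             # invented observations
for _ in range(3):
    theory = pdsa_cycle(theory, lambda: next(fake_results))
print(theory)  # the prediction moves toward what the tests actually show
```

Each pass through the loop improves our ability to predict, which is the essence of the analytic study.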

 Model for Improvement and PDSA

The PDSA cycle is designed around the very tactical idea of developing, testing, and implementing proven changes. The Model for Improvement uses three broad questions to frame a project, and the PDSA cycle is driven by the third question, "What change can we make that will result in improvement?", which allows iterative PDSAs for learning from changes. Often the PDSA is misused by trying to frame a whole project as a single large PDSA instead of multiple, iterative PDSAs; the latter are far more effective for learning about the impact of changes as we work toward the overall objective of the project, which is captured in the first question of the Model, "What are we trying to accomplish?" The second question, "How do we know a change is an improvement?", provides the measures that tell us whether our changes are accomplishing the objective framed by the first question. Figure 3 describes the Model for Improvement.

Figure 3: Model for Improvement – Three Questions and PDSA Cycle
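As a minimal sketch, with invented field names and example text, here is one way to see how the three questions frame a project and how iterative PDSA cycles hang off the third question:

```python
# Illustrative charter structure for the Model for Improvement.
charter = {
    "aim": "What are we trying to accomplish? "
           "(e.g., reduce clinic waiting time by 20% by Q4)",
    "measures": "How do we know a change is an improvement? "
                "(e.g., weekly median minutes from check-in to exam room)",
    "changes": "What change can we make that will result in improvement?",
    "pdsa_cycles": [
        {"cycle": 1, "change": "second check-in desk", "scale": "1 day, 1 clinic"},
        {"cycle": 2, "change": "second check-in desk", "scale": "2 weeks, 3 clinics"},
        # ...each cycle tests the change at a larger scale before implementation
    ],
}

for key in ("aim", "measures", "changes"):
    print(charter[key])
for cycle in charter["pdsa_cycles"]:
    print(cycle)
```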

 Conclusion

Part 1 presented some information on the use of linear models that reflect the ideal path of a development or learning journey. Unfortunately, the real world typically deviates from this ideal view.

·         The work of Dr. Jeff Conklin described how two developers following the same model did not perceive the world in the same way at the same time. One saw a problem, the other a solution.

·         Margaret Wheatley has observed that the fixation on linear models persists because, after the fact, people report their real-world results in terms of the prescribed model. The myth of the linear model working continues.

·         The distinction between complicated and complex systems: complicated things can be figured out with time, like how a calculator works, while complexity operates in open systems whose dynamics are changing, as is the experience of the observer. This idea was captured in the quote from Heraclitus: It is impossible to step into the same river twice.

Part 2 has presented Deming’s distinction between analytic and enumerative studies. As Deming observed, “the problem is prediction.” In moving from a linear model to one that addresses the dynamics of the analytic environment, we are better prepared to deal with the discoveries and challenges presented by the real world.

In recent years, various models for improvement have been developed, some focused on using various improvement tools in a prescribed order. This practice usually leads to a waste of valuable time applying tools rather than targeting the learning around useful questions and making improvements quickly. The good news: people are becoming aware of prescriptive models that emphasize the use of tools rather than focusing on improvement[ix]. Improvement professionals should encourage:

·         The use of methods that are useful in analytic studies. How do these methods lead to questions and enhance our ability to predict?

·         A method that employs questions that lead to more appropriate data collection and tests of changes.

·         Testing as the foundation of science and critical to carrying out an analytic study. Models that put an emphasis on implementation without adequate testing should be avoided.

·         The use of the scientific method which includes the ideas of deductive and inductive learning. Models that have the scientific method built in should be encouraged. PDSA was developed with this idea in mind.  

·         Avoidance of focusing on the use of tools rather than learning and improvement.

Finally, I am reminded of a conversation with a very bright individual who had led several successful improvement efforts: “Cliff, following an array of tools is easy. Developing the necessary questions is hard.” I couldn’t agree more. Thinking is very difficult. Being caught in the activity trap of using tools creates excitement and the illusion of progress. Ultimately, management will ask, “What have you done for me lately?” When asked, you will want to show a list of improvements that were executed quickly. If all you can show is activity with no results, the consequence is called a reduction in force; far too many improvement professionals were forced out of a job in the last recession. President Truman observed, “History does not repeat itself, we just keep on making the same mistakes.” It is time to get focused on the science of improvement and the questions that drive our tackling of the analytic problem and our ability to impact the future.


References:




[i] Moen, Ronald D., Nolan, Thomas W., and Provost, Lloyd P. Quality Improvement Through Planned Experimentation, Third Edition. McGraw-Hill, 2009.
[ii] Feynman, Richard P. Six Easy Pieces: Essentials of Physics Explained by Its Most Brilliant Teacher. Audible (abridged). ©1963, 1989, 1995 The California Institute of Technology; (P)2005 Perseus Publishing. Note: I remember hearing these lectures and have attributed the quote to Feynman, but have been unable to nail the source down. I would appreciate a more accurate attribution; please contact me: cnorman@apiweb.org
[iii] Dennett, Daniel C. Breaking the Spell: Religion as a Natural Phenomenon. Viking Press, 2006, p. 19.
[iv] Deming, W. Edwards. The New Economics for Industry, Government, Education, Second Edition. MIT, 1994, p. 132.
[v] Moen, Ronald D., and Norman, Clifford L. "Circling Back: Clearing Up Myths About the Deming Cycle and Seeing How It Keeps Evolving." Quality Progress, American Society for Quality, November 2010.
[vi] Langley, Gerald J., Nolan, Kevin M., and Nolan, Thomas W. "The Foundation of Improvement." Quality Progress, ASQC, June 1994, pp. 81-86.
[vii] Langley, Gerald J., Moen, Ronald D., Nolan, Kevin M., Nolan, Thomas W., Norman, Clifford L., and Provost, Lloyd P. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance, Second Edition. Jossey-Bass, 2009.
[viii] Langley, Gerald J., Moen, Ronald D., Nolan, Kevin M., Nolan, Thomas W., Norman, Clifford L., and Provost, Lloyd P. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance, Second Edition. Jossey-Bass, 2009.
[ix] Pandey, Satyarth. "RIP SIX SIGMA !!!!!" LinkedIn post, March 2, 2017.

Wednesday, February 15, 2017

Part 1: Complexity of the Real World Has Outpaced the Myth of the Linear Method

Dr. Russell Ackoff spoke often of how we were prepared in school to deal with the real world. We were usually presented with a “case study,” would busy ourselves reading and developing ideas around it, and would then turn in or present the case to the teacher for a grade. Ackoff noted that in the real world, problems do not come in the form of a case study but as a “mess.” Part of our journey is understanding the mess and developing a statement of the challenge. Laurence J. Peter warns us:
“Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them.”
The challenge before most of us trying to make improvements is the complexity of the environment in which we are working: systems that are not closed but open, dynamic, and accompanied by influences of which we may or may not be aware. Ramo (2016) draws the distinction between complicated and complex systems:
Complicated mechanisms can be designed, predicted, and controlled. Jet engines, artificial hearts, and your calculator are complicated in this sense. They may contain billions of interacting parts, but they can be laid out and repeatedly, predictably made and used. They don’t change. Complex systems, by contrast, can’t be so precisely engineered. They are hard to fully control. Human immunology is complex in this sense. The World Wide Web is complex.
Complicated systems have the property of being closed, while complex systems operate in an open environment that is very dynamic. Many of us working to make improvements in our organizations can readily relate to the idea of a complex system. Into this fray, many of us are handed problem-solving methods that are linear. We are presented with the idea that if we follow the method, we will be led to a solution. Dr. Jeff Conklin presented some research around the so-called “Waterfall” method commonly used to develop software. Figure 1 describes this method:
Figure 1: Waterfall Method of Problem Solving

In this method, we are to gather data, analyze the data, formulate a solution, and implement the solution. How does the real world react to this linear path? Conklin presented the experience of one designer following the method. Figure 2 describes how the designer’s perception vacillates between problem and solution over the course of using the method:
Figure 2: Waterfall Method with One Designer

Conklin referred to this vacillation as a “wicked journey.” If you have worked on an improvement effort of any complexity, you can appreciate it. One day you are filled with hope and a solution; the next day, frustrated by an unintended consequence of your change, you face the challenge of adapting to new circumstances. More work to do!
Life would be good if we could handle complex challenges by ourselves (the lone designer), as in Figure 2. But complex challenges usually require the subject matter knowledge of other people. What happens as we add other people? Do they share our perceptions of the problem and solution? Figure 3 describes the journey with two designers:
Figure 3: Waterfall Method with Two Designers

From Figure 3, we can readily see that the perceptions of the two designers track at times and are very different at other times. Improvement teams usually have 3-5 people. Conklin refers to this addition of people as “social complexity.” Personality research tells us that people are very different; their perceptions of the same events and data may be very different, given how they learn and their subject matter knowledge.
We had an improvement team in an international tech company that used a method called Understand, Develop Changes, Test Changes, and Implement Changes (UDTI). Within each of the defined phases, Plan-Do-Study-Act (PDSA) cycles were used. The team referred to their journey as “wicked.” Figure 4 describes this team’s journey:
Figure 4: Using UDTI with PDSA Cycles to Make an Improvement

Following the PDSA cycles in the figure, you can imagine the frustration of the team at PDSA 12, when a rush to implementation led to failure and required a visit back to the “Understand” phase. After this learning, testing was always done before implementation. The six cycles of implementation at the end were the spread of proven changes to other regional groups.
What is the downside of the vacillation between the stages of the linear method? When an improvement team discovers an unintended consequence of a test, they must go back to a prior stage of the linear model, and many see this as a failure. Experienced improvement people understand that when addressing complex challenges, learning and unlearning are natural parts of the journey. However, the same organization that used the UDTI method had one team in Europe that removed all the failed PDSA cycles from its account of the improvement journey, forcing a perfect match to the method. Unfortunately, this sort of practice, while it may help self-esteem, has nothing to do with the science of improvement.
People who use such linear methods often describe how the actual journey vacillated from the hoped-for path. One of my colleagues is looking for the first project of any complexity that follows such a method; so far, we have not found one. Margaret Wheatley once commented on why the myth of success with linear methods continues: “After the fact, people usually report their journey by the prescribed method, thereby reinforcing their use.” Dr. Jeff Conklin and the UDTI team have given us some insight into the use of such methods as they encounter a world of complexity. Hopefully, we won’t be surprised when the real world does not cooperate.
In Part 2, we will examine some methods based on the science of improvement, the importance of questions in addressing complex systems, and how questions help address the social consequences of technical change.

References:
1. Conklin, Jeff. "Wicked Problems and Social Complexity" (2008). This paper is Chapter 1 of Dialogue Mapping: Building Shared Understanding of Wicked Problems, by Jeff Conklin, Ph.D., Wiley, October 2005. For more information see the CogNexus Institute website, http://www.cognexus.org. © 2001-2008 CogNexus Institute. Rev. Oct 2008.
2. Ramo, Joshua Cooper. The Seventh Sense: Power, Fortune, and Survival in the Age of Networks. Little, Brown and Company, 2016, p. 137.
3. Wheatley, Margaret. Leadership and the New Science. Berrett-Koehler Publishers, San Francisco, 1992. Note: In searching this book, we were not able to locate the quote from Wheatley; in communication with her staff, we were told to attribute it to her. The reader may find this reference useful. Find Part 2 here



Acknowledgment
Jesse Trevino and Jane Norman offered comments to make this more readable. Ron Moen added suggestions for including Deming’s concepts of enumerative and analytic studies. Bruce Boles, Jonathan Merrill, and Dave Hearn reviewed. Many thanks to all for the help!