
HRO 13: Introduction To Questioning Attitude

Introduction

A questioning attitude is necessary to enact the mindfulness associated with High Reliability Organizing (HRO). It is fundamental to safe operations and resilience in complex systems. Having a questioning attitude and expressing dissent openly operationalizes the belief that reliability is enhanced when members of the organization take nothing for granted. When leaders value dissenting opinions, people feel empowered to speak up about potential problems (Nemeth & Goncalo, 2010).

In the U.S. Navy, people talk endlessly about the importance of having a questioning attitude. I have discussed it in formal training and hundreds of small-group seminars. It is a top organizational value, so I wanted to write about it. As with many of the HRO concepts I explore in my blog, I was dissatisfied with the way it is typically discussed: “It is important. It makes us safer. Do it more.” I wanted to understand why it works. I know that as an individual skill it is hard to practice, but the reluctance to speak up has to be overcome.

In this post, I define questioning attitude and refer to research that suggests, like dissent, it opens decision makers to new possibilities and leads to better decisions even if it is discounted. Questioning attitude helps people in the organization listen to weak signals of problems. Questioning attitude shares all seven of Weick’s characteristics of sensemaking (Weick, 1995), a key means of developing shared understandings of conditions and safety.

Questioning Attitude: Definition

At its most basic level, having a questioning attitude means accepting nothing at face value. It requires verifying everything you see, hear, and are told, independently whenever possible. You employ your technical knowledge and experience to adopt a skeptical stance toward plans, operations and processes to answer questions like:

  • “Is this result what I expected?”

  • “Does anyone else know about this?”

  • “Why did this happen?”

  • “What external forces could impact what we are about to do?”

  • “Is this action in accordance with the procedure?”

  • “Do we meet the required initial conditions for this procedure?”

  • “What if this other thing happens?”

In certain communities of the U.S. Navy, operators are expected to adopt a questioning attitude and speak up or challenge others, including superiors, whenever things don’t “look right” or they don’t understand something. Things that don’t look right might be misalignment between action taken and results, a record in a log, or a report that doesn’t match what a person has observed. Things not understood can be a course of action, a choice of procedure, or the diagnosis of a problem.

Questioning Attitude as Dissent

Asking questions when you suspect something isn’t what you expected it to be is a form of dissenting from the consensus of the group or status quo (“it’s always been that way”).

Group consensus can be comforting and harmonious for those on the “inside.” Leaders often prefer to feel “everyone is on the same page.” This makes consensus efficient as a means of motivating coordinated action. The first downside of consensus is that it can facilitate bad decisions (Asch, 1956; Bond & Smith, 1996; Janis, 1972). Expressing dissent can be irritating to the group and can lead to ostracism or severe criticism for the dissenter.

The second big problem with conformity is that we are frequently unaware that we are adapting our views to agree with the majority. Participants in many research studies just described it as feeling like the right thing to do. We follow the majority primarily for two reasons, both typically unconscious (Nemeth, 2018). First, we assume that truth lies in the number of people agreeing or in the person with authority. We are more likely to believe something is wrong with our own perception than to think everyone else is an idiot. If the Commanding Officer wants to surface the submarine rapidly before getting the required evaluation from the sonar room, the sonar operators conclude he must know what he is doing. That didn’t work out so well for the USS GREENEVILLE and the crew of the EHIME MARU (Roberts & Tadmor, 2002; Shattuck & Miller, 2006). The second reason for conforming is our unconscious desire to be a member of the group, to be part of the team. Humans are social animals.

Dissent motivated by a questioning attitude is exactly what people need to make better decisions. Even when dissent isn’t welcomed, research suggests that it causes decision makers to reevaluate their views, consider other data, and make better decisions. It leads to divergent thinking on teams despite the criticism meted out to the dissenter. Instead of a linear thought process for problem-solving, people begin to weigh the pros and cons of the initial view as well as the dissent. The challenge opens decision makers to new possibilities they wouldn’t have considered otherwise (Nemeth, 1995). In this way, “dissent is a liberator” (Nemeth, 2018, p. 197).

“Listening” for Weak Signals

As system complexity and coupling between components and subsystems increase, there is a tendency for interconnections between subsystems to be less transparent (Perrow, 1981). In complex systems like nuclear power plants, some problems develop slowly in subsystems or exist but are not visible until the “right” circumstances bring them to light. Once “triggered,” long-standing problems can appear suddenly and spread rapidly, impacting the operation of the entire system in undesirable ways (Perrow, 1984). The low visibility of some problems places priority on identifying and responding to them when they are small.

Small problems can be indications of deeper, real problems. They can result from operators making unobserved mistakes, equipment not operating as expected, or component configurations that reduce reliability, like operating the ship’s steering control system in backup manual mode (NTSB, 2019). That configuration ruined the day for ten sailors on the USS JOHN S MCCAIN.

When problems are small, you have more options for responding to them. Small problems require fewer resources, don’t spill over into interconnected systems or operations as quickly, and are not as intractable (Weick & Sutcliffe, 2015). Another name for small problems is “weak signals” (Vaughan, 1996): subtle and equivocal indications of problems. Their smallness makes them easier to ignore or dismiss as unimportant, a challenge for recognition and action.

When problems are small, indications suggesting their existence can manifest without alarms or warning lights. A weak signal may be visible only to a single person, often the least experienced person in a group of senior operators. People with less experience and low status are the least likely in the organization to speak up, and when they do, their low status means they are often not seen as credible observers (Nemeth, 2018).

Even when noted, small problems are subject to equivocal interpretations. Not all weak signals are signs of actual problems. A junior operator’s undeveloped intuitions can lead them to flag something they don’t recognize as a problem when it is actually normal. This is another reason why speaking up can be difficult. People don’t want to look foolish.

Sensemaking and Questioning Attitude

Formulating ideas of what is going on and making sense of one’s surroundings so that one can speak up about things one doesn’t understand is no easy accomplishment (Weick, 1993). Like the skeptical stance of a questioning attitude toward ongoing events in a complex system, sensemaking is about building contextual rationality. It can start with unease, vague questions, murky answers, and hunches in search of data to rationalize what people are observing (Weick, 1995). Like sensemaking, questioning attitude is a process of retrospectively making sense of reality, and it shares sensemaking’s seven properties.

Identity. Someone notices something that seems odd, out of place, or unexpected with respect to their understanding of what is normal. They subsequently make a personal decision, a hunch, a guess about what it might mean.

Retrospective. Noticing occurs in the “flow” of events, but it takes time to process what is noticed. It might have happened seconds, minutes, or hours ago like a log reading. Often, other things have to happen after the event to create a clearer context for noticing.

Enactment. Organizations conducting complex, possibly high-risk work have numerous procedures that must be executed in sequence. Orders are given and actions taken that are expected to produce observable results. The results are made sensible by the procedures. “I do this, then this should happen.” Through deep system knowledge and a questioning attitude, operators become tuned to noting the system outcomes from their actions and spotting deviations from “normal.”

Social. Questioning attitude might begin with what an individual notices, but its logic requires that doubt be verbalized to be evaluated by others. Operators need to describe their puzzlement or surprise and explain it using their technical knowledge. For example, an operator may record data hourly for some piece of machinery, a structural component, or an electronic sensor. If that reading is out of specification, it is first documented in a record (these are called “logs” in the Navy). In the Navy, the operator is required to note in the remarks section of their log who they informed and when. Supervisors look for this when they review logs.

Ongoing. In most operating complex systems, machinery and sensor readings are always changing. This means understandings of what is going on must be constantly updated. What was a normal indication five minutes ago might not be normal after a change. Equipment parameters and visual data change when starting or shutting down systems, when doing maintenance and testing, and during transient events like changing the ship’s speed or course. Questioning attitude never stops.

Extracted Cues. Questioning attitude revolves around the cues people extract from systems related to their work. This might be Bridge equipment (sensor displays, control systems) or the components of the propulsion plant. The cues extracted by the operators in the midst of all the things going on are the very things they might need to question. It might be how some system is functioning, an order given by a superior, or a problem they have noticed.

Plausibility. Questioning attitude is driven by plausibility rather than accuracy. Operators make a guess that something doesn’t “look” right or is not operating as expected. They proceed to collect more data to see if their hunch was correct. What operators notice may actually be acceptable, but they shouldn’t make that decision without verification either with data or by talking to a superior. Explanations they receive from others can further tune their understanding.

Conclusion

A questioning attitude is a key skill for enacting the mindfulness employed in High Reliability Organizing. Having a questioning attitude means accepting nothing at face value: whenever possible, confirm everything with data. Voicing a dissenting opinion isn’t easy because we can be blinded by our desire to conform to group consensus. Even when not blinded, there are strong social pressures to conform to the majority view. A questioning attitude amplifies weak signals of problems that could grow. The key features of questioning attitude map well to the seven properties of sensemaking as articulated by Weick (1995).

In my next post on questioning attitude, I argue that it is both a skill and a decision. As a skill, it can be improved through practice. As a decision, it reflects the choice to be suspicious that things are never quite what they seem and to speak up when one has the data to show that. I give many suggestions for how to operationalize a questioning attitude in day-to-day operations. Finally, I offer some suggestions for overcoming resistance to “sticking out” and speaking up, which, though scary, is essential for questioning attitude to improve resilience and safety.

References

Note to readers: the references below belong to both of the questioning attitude posts (this one and the one that will come after).

Asch, S. E. (1956). Studies of independence and conformity: A minority of one against a unanimous majority. Psychological Monographs, 70(9), 1-70.

Bond, R., & Smith, P. B. (1996). Culture and conformity: A meta-analysis of studies using Asch’s (1952b, 1956) line judgment task. Psychological Bulletin, 119, 111-137.

Day, D. V., Harrison, M. M., & Halpin, S. M. (2008). An integrative approach to leader development: Connecting adult development, identity, and expertise. Routledge.

DiGeronimo, M., & Koonce, B. (2016). Extreme operational excellence: Applying the US nuclear submarine culture to your organization. Outskirts Press, Inc. Kindle Edition.

Dörner, D. (1996). The logic of failure: Recognizing and avoiding error in complex situations. Perseus Press.

Helmreich, R. L. (1999, August). Building safety on the three cultures of aviation. In Proceedings of the IATA human factors seminar (pp. 39-43). Retrieved from https://www.pacdeff.com/pdfs/3%20Cultures%20of%20Aviation%20Helmreich.pdf

Janis, I. L. (1972). Victims of groupthink. Houghton Mifflin.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

National Transportation Safety Board (NTSB). (2019). Maritime accident report: Collision between US Navy destroyer John S McCain and tanker Alnic MC, Singapore Strait, 5 miles northeast of Horsburgh Lighthouse, August 21, 2017 (NTSB/MAR-19/01, PB2019-100970). https://www.ntsb.gov/investigations/accidentreports/reports/mar1901.pdf

Nemeth, C. J. (1995). Dissent as driving cognition, attitudes, and judgments. Social Cognition, 13(3), 273-291.

Nemeth, C. J. (2018). In defense of troublemakers: The power of dissent in life and business. Hachette UK.

Nemeth, C. J., & Goncalo, J. A. (2010). Rogues and heroes: Finding value in dissent. In J. Jetten & M. Hornsey (Eds.), Rebels in groups: Dissent, deviance, difference, and defiance (pp. 17-35). John Wiley & Sons.

Nikunen, K. (2014). Losing my profession: Age, experience and expertise in the changing newsrooms. Journalism, 15(7), 868-888.

Perrow, C. (1981). Normal accident at three mile island. Society, 18(5), 17-26.

Perrow, C. (1984). Normal accidents: Living with high risk technologies. Princeton University Press.

Roberts, K. H., & Tadmor, C. T. (2002). Lessons learned from non-medical industries: The tragedy of the USS Greeneville. Quality and Safety in Health Care, 11(4), 355-357.

Salamouris, I. S. (2013). How overconfidence influences entrepreneurship. Journal of Innovation and Entrepreneurship, 2(8). https://doi.org/10.1186/2192-5372-2-8

Shattuck, L. G., & Miller, N. L. (2006). Extending naturalistic decision making to complex organizations: A dynamic model of situated cognition. Organization Studies, 27(7), 989-1009.

Skala, D. (2008). Overconfidence in psychology and finance-An interdisciplinary literature review. Bank I Kredyt, (4), 33-50.

Weick, K. E. (1988). Enacted sensemaking in crisis situations [1]. Journal of Management Studies, 25(4), 305-317.

Weick, K. E. (1993). The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38(4), 628-652.

Weick, K. E. (1995). Sensemaking in organizations (Foundations for organizational science series). Sage Publications.

Weick, K. E., & Sutcliffe, K. M. (2015). Managing the unexpected: Assuring high performance in an age of complexity (3rd ed.). Jossey-Bass.

Whitmore, P. G., & Fry, J. P. (1974). Soft skills: Definition, behavioral model analysis, training procedures (Professional Paper 3-74). ERIC Number: ED158043.
