February 2015
Theodore Roosevelt and John Muir on Glacier Point, Yosemite Valley, California, in 1903.

by Ivan Pupulidy

Introduction: “Daring greatly” and a shift to learning

“It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcoming; but who does actually strive to do the deeds; who knows great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly.” — Teddy Roosevelt, 1910

Roosevelt’s quote captures the need to innovate when the work environment delivers the unexpected. Innovation may well be the only way to deal with uncertainty, and when it works, we often reward the person who took the risk. However, when the outcome is considered a failure, it is easy to hold individuals accountable and judge their actions to be errors. Our perspective must include the idea that failure, error and falling short are all part of the grand experiment called normal work. Having the strength to be vulnerable in our normal work allows us to question our beliefs and assumptions. Once we embrace the idea that we don’t know and can’t control everything, we create the opportunity to learn.

Accident reviews, Facilitated Learning Analyses and investigations have shown an interesting general trend: those involved in the incidents were doing what was considered normal work, and their actions and decisions made sense to them at the time. Most frequently, these actions led to success and safety. This tells us that incidents and accidents are more typically the result of everyday influences on decision-making than of the erratic behavior of faulty individuals (Dekker, 2006; Reason, 1997; Woods et al., 1994). People do what makes sense given the situational indications, operational pressures, and organizational norms at the time. We can refer to these dynamic indicators as the conditions that influenced decisions and actions. Investigations and reviews must, therefore, devote attention to capturing and presenting these conditions, because they place the actions in context. It is through this context that we can begin to shift our focus from judgment of error to learning from actions and decisions.

Approaches to Error

There is an apparent contrast in approaches to error, which may result from our inconsistent ability to apply cause-effect relationships. In situations where there are normative standards of performance or measurable system parameters (e.g., mechanical systems, limited human-machine interfaces, or ergonomics), the identification, modeling and trending of errors can be of value. These situations are consistent with simple and complicated systems, where mathematical reduction works because the system delivers what is expected. In simple and complicated systems, compliance with rules, regulations, policies and procedures may deliver the desired outcomes. However, such procedures may not achieve the goals we seek when we are involved in complex systems.[1]

Predictability offers the opportunity for control, and control is what managers are supposed to deliver. Not surprisingly, traditional safety practices strive to achieve compliance based on predictability. Organizations like the military have recognized the tension created in a workplace when compliance is required but cannot be delivered uniformly. To help relieve this tension, these organizations have adopted doctrinal approaches instead of rule-based ones. For instance, the Naval Air Training and Operating Procedures Standardization (NATOPS) program demands compliance with stipulated manual procedures, but deviations are allowed per the following statements, found in all NATOPS manuals:

“NATOPS must be dynamic and stimulate rather than suppress individual thinking. Since aviation is a continuing, progressive profession, it is both desirable and necessary that new ideas and new techniques be expeditiously evaluated and incorporated if proven to be sound. To this end, commanding officers of aviation units are authorized to modify procedures contained herein for the purpose of assessing new ideas prior to initiating recommendations for permanent changes.”

“NATOPS manuals provide the best available operating instructions for most circumstances, but no manual is a substitute for sound judgment. Compound emergencies, available facilities, adverse weather or terrain, or considerations affecting the lives and property of others may require modification of the procedures contained herein. Read this manual from cover to cover. It is the air crewman’s responsibility to have a complete knowledge of its contents.”

“NATOPS is not intended to cover every contingency that may arise nor every rule of safety and good practice. Aviation personnel are expected to study and understand all applicable portions of the program.”

The military commonly faces situations that are uncertain, such as when the enemy responds to actions with its own innovations. Field Marshal Helmuth Graf von Moltke of the Prussian army (1800-1891) is credited with the statement, “Plans are nothing; planning is everything … no plan survives first contact with the enemy.” This concept draws heavily on his belief in doctrine, or centralized command with decentralized execution, which allows field commanders to act without direct orders, yet within the intent of leadership.

Complex systems demand adaptation and innovation because the system delivers the unexpected or the unpredictable (McDaniel, 2007; Weick & Sutcliffe, 2007; Morin, 2008). Rules, regulations, policies or procedures cannot be written to address all the situations that people may face. “The process of understanding is not automatically driven by the forces of nature, but is the result of an active cooperative enterprise of persons in relationship” (Gergen, 2003). In complex systems, sensemaking leads to critical thinking and innovation, which are required for workers to be successful when unpredictable situations emerge (Weick & Sutcliffe, 2007).

The situation faced by the military in combat is not unlike wildland fire operations. In firefighting, fire is the perceived enemy, and it will perform in unexpected ways. Sometimes it will behave as expected; however, it is opportunistic and will deliver unexpected outcomes as part of its nature. Fire can also react unpredictably when it is acted upon, as it adapts to conditions that may be unseen or unseeable by firefighters. Firefighters, in turn, are constantly adapting strategy and tactics to deal with the changing fire environment. In this way, firefighting and military operations share identities as complex adaptive systems.

Experts and novices

Within complex adaptive systems there are variable demands for compliance during mission performance. Some parts of the system will perform according to normative standards and will be predictable, while others will not. This can tease workers into believing that more things are predictable than actually are. Expertise is required to recognize when the unexpected is present or may arise. As a result, our expectations of the performance of experts are different from our expectations of novices (see Table 1).

 

Table 1. Expectations of Novice and Expert (adapted from Flyvbjerg, 2001).

Following a successful mission in a complex adaptive system, the expert worker will likely be rewarded for adapting rules, regulations, policies or procedures to fit the observed conditions. This is often referred to as “thinking outside the box.” However, the same innovation or adaptation in a mission that is deemed unsuccessful will often result in admonishment of the individual. A sleight-of-hand can take place after an adverse outcome event, as suddenly the reviewer, leader or even the investigator may hold the expert accountable to the inflexible standards of a novice.

Stuart and Hubert Dreyfus (in Flyvbjerg, 2001) create an image of the difference between the novice and the expert. The novice can be seen as one who follows rules rigidly, with little situational perception and no discretionary judgment.

Table 2. Five levels of learning: Novice to Expert (adapted from Dreyfus & Dreyfus in Flyvbjerg, 2001).

Dreyfus and Dreyfus describe five levels of human learning (Table 2), which are very similar to Rasmussen’s (1987) concept of ‘skill-based’, ‘rule-based’ and ‘knowledge-based’ performance, on which Reason (1990) draws heavily for his Swiss cheese model. For example, there is a strong similarity between ‘skill-based’ and ‘expert-level’ performance. “The Dreyfus model contains a qualitative jump from the first three levels, to the fourth and fifth levels. The jump implies an abandonment of rules-based thinking as the most important basis for action, and its replacement by context and intuition” (Klein, 1999; Flyvbjerg, 2001). “The boundary between skill-based and rule-based performance is not quite distinct, and much depends on the level of training and on the attention the person devotes to the mission or task. In general, the skill-based performance rolls along without the person’s conscious attention, and he will be unable to describe how he controls and on what information he bases the performance” (Rasmussen, 1983). Klein (1999) strengthens the connection between intuitive responses and expertise when he describes Recognition-Primed Decision Making.

Simply judging actions and decisions made in “expert” or “skill-based” modes as errors will not account for the ways that expertise is expected to be applied to situations. A more appreciative approach would instead draw on local rationality, or bounded rationality theory. Rather than focusing on a defective character trait, human failure or absent defenses, an appreciative approach would presuppose that expert-level behavior is based on pragmatic social heuristics (learning and discovery techniques based on experience) and on the interplay between the mind and the environment (Gigerenzer, 2010).

Flyvbjerg (2001) points to “context, judgment, practice, trial and error, experience, common sense, intuition and bodily sensation” as higher levels in the learning process that take over from analysis and rationality. When the expert is dealing with the unexpected, a certain degree of experimentation is expected – and experimentation does not always provide perfect, error-free performance.

Error eradication models miss this key point. They are limited to hierarchical or sequential decision-making (elements-rules-goals-plans-decisions) and are therefore ill-equipped to address the way decisions are made by experienced people in complex environments. The evaluation of expert-level performance has to focus on the conditions that influenced decisions and actions, in order to develop an understanding of the context that shaped those decisions or actions. Context eclipses the action – or error – in importance, which is consistent with Reason’s focus on “error producing factors” (Reason, 1997).

Summary: A learning approach to prevention

The first article in this series (“Recognize error to prevent accidents,” August 2014) discussed the journey from novice to expert. This article introduces the importance of understanding the differences in perspective between novice and expert, and it shows how experts, acting in a complex system, can find themselves in unfamiliar situations. Recognizing that the environment is delivering the unexpected may be easiest for the expert; however, this requires humility and the willingness to be vulnerable enough to admit that not all the answers are known.

There is also a growing understanding that the novice has a very important perspective to offer, which can only be useful if we are all humble enough to ask, listen and engage in group sensemaking. Both leaders and followers must consider the importance of “upward voice”: beginners and novices should embrace the discomfort of vulnerability and speak up; experts should embrace the discomfort of vulnerability and create a safe environment for the powerless to speak up when they see something “dumb, dangerous or different.”

Simple, complicated and complex systems produce very different results, ranging from the predictable to the highly uncertain. As a result, our interaction with these systems, both in the moment and during reviews, has to be very different. While simple systems may respond to linear management processes, complex systems require sensemaking, learning and improvisation instead of command, control and checklists (McDaniel, 2007). Expertise improves a leader’s ability to recognize complex situations that require adaptive responses. Understanding the difference between these systems and the role of error recognition must be reflected in the review of incidents and accidents, so that we can include learning in our approaches to prevention.

1. Complexity at first appears to be a weaving together of unique constituents that influence each other in varying ways. In this sense, complex systems are like a fabric made up of events, actions, interactions, retroactions, determinations, and chance that together make up the system being considered (Morin, 2008). This system consists of diverse, autonomous, interconnected and interactive components that have the capacity to adapt to change or to learn (Page, 2011).

Bibliography

Dekker, S. (2006). The field guide to understanding human error. Burlington, VT: Ashgate.

Flyvbjerg, B. (2001). Making social science matter: Why social inquiry fails and how it can succeed again. Cambridge, UK: Cambridge University Press.

Gigerenzer, G. (2010). Moral Satisficing: Rethinking Moral Behavior as Bounded Rationality. Topics in Cognitive Science, 2, 528-554.

Klein, G. (1999). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

McDaniel Jr., R. R. (2007). Management Strategies for Complex Adaptive Systems. Performance Improvement Quarterly, 20, 21.

Morin, E. (2008). On Complexity. Cresskill, NJ: Hampton Press, Inc.

Page, S. E. (2011). Diversity and complexity. Princeton, NJ: Princeton University Press.

Rasmussen, J., Nixon, P., & Warner, F. (1990). Human error and the problem of causality in analysis of accidents [and discussion]. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 327, 449-462.

Rasmussen, J., Duncan, K., & Leplat, J. (Eds.). (1987). New technology and human error. New York: J. Wiley.

Reason, J. (1997). Managing the risks of organizational accidents. Aldershot, UK: Ashgate.

Reason, J. (1990). Human error. New York: Cambridge University Press.

Weick, K. E., & Sutcliffe, K. M. (2007). Managing the unexpected: Resilient performance in an age of uncertainty. San Francisco, CA: Jossey-Bass.

Woods, D. D., Johannesen, L. J., Cook, R. I., & Sarter, N. B. (1994). Behind human error: cognitive systems, computers and hindsight. Dayton, OH: CSERIAC.