Friday, August 22, 2014

Swiss-Cheese Model (SCM) and Margin

[Figure: Concept of Margin]
[Figure: James Reason’s Swiss Cheese Model]
Authors: Ivan Pupulidy, Director, and Matt Carroll, Human Factors Specialist; USDA Forest Service Office of Learning

Comparing the Swiss Cheese Model (SCM) and Margin is not a direct “apples to apples” comparison. The SCM was introduced to the wildland fire community through the L-380 curriculum and is intended primarily as an “innovative framework for thinking about human error…[that] scrutinizes all levels in an organization when looking for the causes of human error” (Mission-Centered Solutions Inc., 2007, p. 55). It was designed to pinpoint the causes of an accident or error by describing the holes in defenses that, when aligned across multiple levels, create an error chain. Margin, by contrast, focuses on the influence that conditions have on decisions and actions; it does not attempt to describe a linear causal relationship among conditions at various levels but instead describes the collective influence of those conditions. The focus can then shift from cause to understanding the system’s capacity to cope with uncertainty, error, and surprise. SCM is intended to make sense of an accident after it happens, whereas Margin, while it can be used in accident analysis, is designed to help users describe the system’s potential to do harm before an accident occurs.
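To make the SCM’s linear logic concrete, consider the toy sketch below (ours, not part of the L-380 materials or Reason’s work). It treats each defensive layer as having some probability of a “hole”; harm reaches the system only when holes happen to align across every layer. The layer names and probabilities are purely hypothetical.

```python
import random

def scm_accident_rate(hole_probs, trials=100_000, seed=1):
    """Estimate how often a hazard penetrates ALL defensive layers.

    In the SCM picture an accident requires holes to line up in every
    layer at once, so the expected rate is roughly the product of the
    individual hole probabilities.
    """
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < p for p in hole_probs)  # holes align in every layer
        for _ in range(trials)
    )
    return accidents / trials

# Hypothetical layers: organizational policy, supervision, crew-level defenses.
print(scm_accident_rate([0.05, 0.10, 0.20]))  # roughly 0.001 (rare alignment)
```

Note what this framing invites us to do: shrink any one layer’s hole probability (plug a hole) and the product drops, which is exactly the find-and-fix response discussed below.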

While these differences may seem academic, they are not; they have a significant effect on practical application and learning. In accident investigation, for example, the model we choose can influence what we look for and, in turn, what we ‘see’, or determine what is relevant to the investigation. Erik Hollnagel warns us that a model can bias the perspective of analysts when he describes the “What-You-Look-For-Is-What-You-Find” (WYLFIWYF) principle (Lundberg, et al., 2009). This principle suggests that the model used to review an event will determine what you find and, more importantly for us, what you fix[1]. This in turn affects what is learned and how that learning is applied to improve system performance. In light of this, we should focus on how these models determine what we ‘see’ and therefore what we fix. We should ask what it is we want to do: “Do we simply want to find and fix defenses (i.e., plug holes, fix or add barriers), or do we want to find ways to increase the system’s capacity to weather error, surprise, and uncertainty without consequence?”

SCM opened our eyes to the influence of upstream failures, or holes in defenses, but it is limited in that it is best used after the fact and it focuses on error or absence (it looks for holes). By drawing an error chain, we lose the ability to talk about the influence (good or bad) of other conditions throughout the system. Calling something an error, weakness, omission, or failure (Mission-Centered Solutions Inc., 2007, p. 56) (Reason, 1990) artificially simplifies the nature of conditions, because every error was likely a solution to, or influenced by, something else. SCM results in plugging the detected holes in the failed barriers to avoid some downstream event. This process is designed to make systemic corrections at a managerial level (leadership adds more barriers or defenses, e.g., rules, regulations, policies, procedures, or PPE). These corrections, however, address only part of the problem of prevention: SCM suggests that the responsibility for creating safety lies upstream, with organizational leadership.

Margin opens the discussion to include the role of workers in the creation of safety. It provides a means of describing and detecting the capacity of a system by focusing on the margin available for action. After an accident, conditions play a different role in the Margin concept: they are intended to be mapped, or recognized, as influences on decisions and actions. Describing these conditions places actions and decisions in context and allows us to move beyond fixes.
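To contrast this with the SCM sketch above, here is an equally rough sketch (again ours, with made-up condition names and weights) of the Margin view: conditions are not links in a causal chain; each one builds or erodes the margin available to absorb error, surprise, and uncertainty.

```python
# Hypothetical conditions and weights, chosen purely for illustration:
# positive values build margin, negative values consume it.
conditions = {
    "experienced crew": +2.0,
    "good communications": +1.0,
    "fatigue": -1.5,
    "time pressure": -1.0,
    "unfamiliar terrain": -0.5,
}

margin = sum(conditions.values())   # collective influence, not an error chain
surprise_demand = 1.5               # capacity a surprise event would consume

print(f"available margin: {margin:+.1f}")
print("can absorb the surprise" if margin >= surprise_demand
      else "margin exhausted; system is brittle")
```

No single condition here ‘causes’ anything; the question the model asks is whether the collective margin can absorb what the day throws at the crew, and that question can be asked before an accident, not only after.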

For an introduction to the concept of Margin, please refer to the following short video:

[Embedded video]

Works Cited
Lundberg, J., Rollenhagen, C. & Hollnagel, E., 2009. What-You-Look-For-Is-What-You-Find: The consequences of underlying accident models in eight accident investigation manuals. Safety Science, Volume 47, pp. 1297-1311.

Mission-Centered Solutions Inc., 2007. Fireline Leadership (L-380). Missoula (MT): Mission-Centered Solutions Inc.

Reason, J., 1990. The Contribution of Latent Human Failures to the Breakdown of Complex Systems. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 12 April, 327(1241), pp. 475-484.

[1] Based on the corollary principle of “What-You-Find-Is-What-You-Fix” (Lundberg, Rollenhagen, & Hollnagel, 2009).
