Volume 89, Number 1, 2011


Counterheroism, Common Knowledge, and Ergonomics: Concepts from Aviation That Could Improve Patient Safety

Geraint H. Lewis, Rhema Vaithianathan, Peter M. Hockey, Guy Hirst, and James P. Bagian

Nuffield Trust; University of Auckland; NHS South Central; Atrainability Limited; University of Michigan


Context: Many safety initiatives have been transferred successfully from commercial aviation to health care. This article develops a typology of aviation safety initiatives, applies this to health care, and proposes safety measures that might be adopted more widely. It then presents an economic framework for determining the likely costs and benefits of different patient safety initiatives.

Methods: This article describes fifteen examples of error countermeasures that are used in public transport aviation, many of which are not routinely used in health care at present. Examples are the sterile cockpit rule, flight envelope protection, the first-names-only rule, and incentivized no-fault reporting. It develops a conceptual schema that is then used to argue why analogous initiatives might be usefully applied to health care and why physicians may resist them. Each example is measured against a set of economic criteria adopted from the taxation literature.

Findings: The initiatives considered in the article fall into three themes: safety concepts that seek to downplay the role of heroic individuals and instead emphasize the importance of teams and whole organizations; concepts that seek to increase and apply group knowledge of safety information and values; and concepts that promote safety by design. The salient costs to be considered by organizations wishing to adopt these suggestions are the compliance costs to clinicians, the administration costs to the organization, and the costs of behavioral distortions.

Conclusions: This article concludes that there is a range of safety initiatives used in commercial aviation that could have a positive impact on patient safety, and that adopting such initiatives may alter the safety culture of health care teams. The desirability of implementing each initiative, however, depends on the projected costs and benefits, which must be assessed for each situation.

Keywords: Medical error; safety management; health knowledge, attitudes, practice; human engineering; costs and cost analysis.

The comparative safety records of commercial aviation and health care have been widely publicized, and proposals to borrow safety concepts from aviation abound. In modern aviation, only one passenger’s life is lost per 10 million flights, compared with one iatrogenic death for every one hundred to three hundred hospital admissions (Hall 2006; Levinson 2010). Moreover, when safety concepts have been systematically adopted from aviation, the impact on patient safety has sometimes been substantial. For example, the Surgical Safety Checklist of the World Health Organization (WHO) is based on a concept introduced in aviation seventy years earlier (Godlee 2009). An evaluation found that it was relatively quick and cheap to implement and that in some settings, it reduced deaths and complications for surgical patients by more than a third (Haynes et al. 2009). Likewise, catheter-related bloodstream infections were eliminated from a surgical intensive care unit with a package of interventions used in aviation, including a checklist, a common layout for equipment, and the empowerment of junior staff to call for a procedure to be abandoned if guidelines were violated (Berenholtz et al. 2004).

Given these apparent successes, patient safety organizations such as the U.S. Department of Veterans Affairs (VA) National Center for Patient Safety and Britain’s National Patient Safety Agency (NPSA) have been championing a range of initiatives drawn from aviation, including checklists, crew resource management (CRM), team self-review, and close-call reporting (Neily, Dunn, and Mills 2004; NPSA 2004). Might there, however, be other error countermeasures used in aviation that could usefully be borrowed?

As an industry, health care is clearly unique in several respects, so researchers and policymakers should proceed with caution. Nonetheless, Marshall noted that successful quality and safety initiatives in many high-risk industries have the same set of underlying principles (Marshall 2009). He argued that the health care sector should continue borrowing from other industries because clinicians are able both to identify which concepts will transfer well and to make any adaptations that may be necessary.

In this article, we describe fifteen safety practices used routinely in aviation. Each initiative might have applications in health care, but only some of these examples are currently used widely in hospitals. From this list we identify three themes that we use to create a conceptual framework for classifying error countermeasures. As with all policy prescriptions, not all proposals are applicable to all situations, so we turn to the economics literature for a cost-benefit analysis and to taxation policy for a framework for gauging which safety initiatives are most likely to transfer well from aviation to health care. Finally, we return to our classification to consider the different reasons why initiatives borrowed from aviation may cause disinterest or antipathy among some doctors.

Safety Strategies from Aviation

Passengers flying on a modern commercial aircraft are protected by a plethora of advanced safety measures. Some of these practices and devices would appear to have little or no direct relevance to health care; for example, those relating to hijacks and other forms of sabotage. Using our combined aviation and clinical experiences, however, we have drawn up a list of fifteen airline safety measures that we believe may be applicable to health care (see table 1). Some of these initiatives, such as safety checklists and crew resource management, are already used by many health care providers around the world. But others, such as incentivized no-fault reporting and the obligatory use of first names among clinicians, are not currently in widespread use.

Conceptual Framework

While analyzing the fifteen examples listed in table 1, we identified three pervasive themes for safety initiatives used in public transport aviation: counterheroism, common knowledge, and ergonomics.

Counterheroism. Many of the aviation safety initiatives listed in table 1 are specifically designed to minimize the responsibility of individual pilots and instead to emphasize the importance of the team and the system as a whole for ensuring safety (Reason 2000).

The field of health care may often be characterized by a culture of individual “heroism.” For example, in their ethnographic study of operating rooms, Waring and colleagues observed that surgeons would respond to system deficiencies by finding ways around each problem they encountered (Waring, Harrison, and McDonald 2007). For instance, when items of surgical equipment were missing, surgeons modified, reshaped, or adjusted equipment designed for other uses. The surgeons described these actions as “adventurous,” “daring,” and necessary for “getting the job done.” In contrast, other members of the operating room team said they felt anxious about these “heroic” modifications of established practice. However, the frequency of system failures had bred a culture in which complaining about them was regarded as a criticism of the surgeons’ ability to innovate rather than as a condemnation of the failures themselves.

Amalberti and colleagues found that the frequency of unforeseen events in an operating room limits the degree of safety that can be provided through good systems, since individuals will inevitably be called on to respond to unpredicted and unpredictable events (Amalberti, Berwick, and Barach 2005). In contrast, Woods asserted that an organization’s ability to react to surprise events is, in fact, an important characteristic of a dynamically safe system (Woods 2006). Our contention is that the manner of a physician’s response to an error or threat to safety is closely tied to medicine’s current culture of heroism. We further contend that system failures and a culture of heroism may be self-reinforcing. In aviation, the team and the system are central to the safety culture. Fostering a comparable culture in health care therefore requires curbing individual heroism, which may be achieved through increased codification and other measures that downplay the role of individuals in ensuring safety. These restrictions will inevitably be seen as challenging the current “heroic” culture, and so opposition from some doctors is to be expected.

In modern aviation, rules and protocols are so deeply ingrained in the culture that if a pilot or a mechanic encounters an unusual situation, his or her response is not to try jury-rigging a way around the problem but, rather, to follow set procedures and to report the problem through official channels. Gawande observed that as a result, the “rock star status” of the first daring aviators has been diminished by tight regulations such as preflight checklists, and he noted that a similar transformation is now under way in medicine (Gawande 2007). Some of the mechanisms the aviation industry uses to encourage this nonheroic culture, such as promoting the use of first names only among staff, would be inherently challenging to clinicians. We believe, however, that the increased use of counterheroic measures in health care might foster a less obsequious workplace culture in the long run. In a less deferential, less hierarchical workplace, senior staff may not feel so pressured to make ad hoc adjustments in response to system failures, and junior staff may feel less awkward about raising concerns when they believe their seniors are about to make a mistake.

It is important to note that by proposing curbs on heroism, we are referring not to “heroic” treatments for patients with low survival probabilities, to “heroic” personal sacrifices by clinicians, or indeed to the systematic development of novel practices. Rather, we are proposing constraints only on makeshift adjustments made by individuals in response to hazards and adverse events.

Common Knowledge. In economics, the concept of common knowledge is used to denote values or information that not only are known to members of a group but also are known to be known, and are known to be known to be known, ad infinitum (Lewis 1969). For instance, the joint safety briefing for the pilots and cabin crew before a flight creates common knowledge because all members of the team are reminded not only of what they should do in the event of an emergency but also what is expected of them by their colleagues. After a rule has become common knowledge, if one member of the team is seen to violate that rule, then the other members of the team are “authorized” to bring this to the attention of the violator, whereas previously they might have felt less confident in doing so.

Publicizing a rule creates common knowledge and, in so doing, causes an important cultural change, even for rules that already have been tacitly acknowledged by all. For example, the European Aviation Safety Agency (EASA) mandates that all pilots be assessed on their CRM skills based on a detailed description of CRM methods and terminology (EASA 2009). It is our belief that these assessment criteria—being tightly codified and known to pilots, their colleagues, trainers, and employers—make it easier to insist on acceptable behavior in the cockpit. Other examples of common knowledge generated by the aviation sector are publicizing the “sterile cockpit” rule and the “bottle-to-throttle” rule and publishing detailed minimum safety requirements.

Ergonomics. Ergonomics, also called human factors engineering (HFE), is the science of designing products, processes, systems, and environments that take explicit account of the capabilities and behaviors of the people who will interact with them (Gosbee 2002). The aviation industry uses HFE extensively: examples are mistake-proofing, forcing functions, and flight envelope protection.

HFE is already used in health care as well, a notable example being medical gas connectors that are designed to prevent mistakes. Nonetheless, we believe that ergonomics can be applied much more widely to health care. For example, automatic identification technologies such as bar coding and radio frequency identification (RFID) could be used more extensively to reduce wrong-patient, wrong-drug, and wrong-dose errors. Other ways in which the aviation sector improves safety by design are standardizing instrument layouts and using flight recorders (“black boxes”) to encourage safe behaviors.

Clearly, many of the initiatives listed in table 1 involve more than one of the three themes we have identified, and table 2 shows what we believe is the relative importance of each theme for the fifteen initiatives listed in table 1.

An Economic Framework for Assessing Safety Initiatives

Although at an individual level, clinicians have an obligation to do no harm to their patients, the initiatives we discuss in this article concern changes at the organizational level. Given the infinite demands placed on health care budgets and the high costs of implementing certain safety initiatives, the opportunity costs may be substantial. So, to be acceptable to health care organizations, the safety initiatives advocated in this article would have to provide better value than other competing medical interventions. In other words, these safety initiatives must be cost-effective. Economists would regard a safety initiative as worth adopting so long as its marginal safety benefit exceeded its marginal implementation cost. In this section, we describe a framework for considering the cost-effectiveness of patient safety initiatives.

The benefits of the safety initiatives listed in table 1 depend on (1) the frequency and severity of the safety problem they are designed to address, (2) their effectiveness in mitigating that safety threat, and (3) the value that society places on safety in that context. One reason why commercial aviation is an atypical industry is that the reputational harm ensuing from an aircraft disaster is so great that the benefits of aviation safety interventions are almost infinite. As a result, the optimal level of safety in public transport aviation approaches complete safety.1 In contrast, other industries, such as ground-based forms of public transport like trains and taxis, do not go to such extreme lengths to eliminate risks. For example, train passengers are generally not required to wear seat belts, and taxi passengers are not issued with crash helmets. This suggests that for ground-based forms of public transport, officials place a greater weight on price, convenience, and quality when determining safety policy.

We believe that in health care, patient safety policy is similarly multidimensional. For this reason, it is unlikely to be optimal to attempt to eliminate all iatrogenic risks because the opportunity costs would be so great. Therefore, the benefits from introducing any patient safety initiatives need to be tested against the benefits of other forms of health care, as well as against their costs.

Calculating benefits. In aviation, officials at the Federal Aviation Administration (FAA) use a system called Security Risk Management to estimate the cost-effectiveness of prospective safety programs (FAA 2002). In health care, one way to calculate the relative benefits of a patient safety initiative is to determine its incremental cost-effectiveness ratio, which is measured in units of cost per quality-adjusted life year (QALY) gained. An alternative method, given the high cost of many iatrogenic complications, is to quantify the benefits as the costs of averted adverse incidents (Semel et al. 2010).
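The incremental cost-effectiveness ratio is simply the extra cost of an initiative divided by the extra QALYs it yields relative to current practice. The following sketch shows the arithmetic; the function name and all figures are hypothetical illustrations, not data from any study cited here.

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical figures: suppose a safety initiative raises a hospital's
# annual costs from $20,000 to $50,000 while increasing total health
# output across the affected cohort from 100 to 110 QALYs.
extra_cost_per_qaly = icer(50_000, 20_000, 110, 100)
print(extra_cost_per_qaly)  # 3000.0, i.e., $3,000 per QALY gained
```

A decision maker would then compare this ratio against a willingness-to-pay threshold, or against the ratios of competing uses of the same budget.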

Calculating costs. To calculate the costs of introducing a patient safety initiative, one approach might be to apply the criteria that economists use for assessing the implementation costs of taxes—taxes and patient safety initiatives being analogous in that both involve encumbrances borne by individuals for the greater good. These criteria are the compliance costs, the administration costs, and the costs of any ensuing behavioral distortions (Musgrave and Musgrave 1973). Table 3 sets out what these different costs might be for each of the fifteen safety initiatives listed in table 1. In the discussion that follows, we use the WHO’s Surgical Safety Checklist for purposes of illustration.

Compliance Costs

Compliance costs are the costs borne by people in complying with the rules. For a tax, these may include the costs to an individual of producing accounts and filing tax returns.2 Similarly, patient safety initiatives that require changes in clinical behavior impose compliance costs on the individual practitioner.

Compliance costs can be divided into one-off and ongoing costs. One-off compliance costs are usually associated with training and with changes in procedures or processes and often require considerable time and effort. Such costs can be minimized by simplifying protocols, by increasing flexibility so that individuals can comply in a wide variety of ways that are most convenient to them, and by providing a range of learning tools and resources.

Ongoing compliance costs are associated with demonstrating compliance, for example, by filling out forms. Recent advances in behavioral economics suggest that the best way to improve compliance may not necessarily be by increasing the sanctions on noncompliant behavior. Rather, ongoing compliance may be encouraged by creating social norms and group censure for noncompliant behavior.

In health care, policymakers should consider the costs of a proposed new patient safety initiative in regard to the time and effort required for the staff to comply with it. For instance, the surgical safety checklist has relatively low compliance costs because it is a simple procedure that takes only a few minutes per operation to complete.

Administration Costs

Administration costs refer to those costs incurred by an organization in imposing a particular initiative. In taxation policy, this refers to the costs borne by the Internal Revenue Service to collect taxes in a fair, efficient, and equitable manner. For a patient safety initiative, administration costs are those costs borne by a health care provider for ensuring that practices and processes are changed and for ongoing monitoring to ensure compliance. In health care, the administration costs of patient safety initiatives therefore relate to dissemination, training, monitoring, and auditing. Our example, the surgical safety checklist, has been found to have relatively low administration costs because training was straightforward, educational materials such as a training video were made available centrally, and copies of the checklist cost only a few cents per patient. But some of the other interventions listed in table 1 might have considerable administrative costs (see table 3).

Behavioral Distortions

Policymakers should be aware of two categories of behavioral distortion that may ensue from the introduction of a new patient safety intervention. First, people may deliberately undermine the efficacy of an initiative, which in the tax literature is referred to as evasion. In taxation, this is the cost imposed by individuals who restructure their affairs to avoid paying a tax, for instance, by resorting to black-market transactions in order to avoid paying sales taxes. Second, individuals may change or stop other behaviors, which we refer to as deadweight losses. For example, because a tax on labor discourages people from working, the result will be an avoidable reduction in economic activity.

Both types of behavior can be difficult to predict, and their detection requires long-term monitoring. For instance, opportunities for evading the surgical safety checklist may be relatively limited because it is used in a public place—the operating room—compared with the opportunities for evading safety initiatives applied in the privacy of the examination room, such as physicians washing their hands before seeing the next patient. This means that the checklist may be less likely to be ignored or poorly administered, although at least one team member must still step up and insist on the checklist if the others fail to start it. Vats and colleagues documented several ways in which the WHO checklist can be evaded, such as by completing the checklist when some key members of the team are absent or by providing dismissive answers (Vats et al. 2010). Although going through the checklist in a perfunctory way might speed things up, this also would clearly reduce its efficacy (Neily et al. 2009). Deadweight losses might result if, for example, the surgical and anesthesia teams dispensed with other safeguards that they previously used in the operating room before the checklist was introduced.

The lesson from taxation policy is that some degree of behavioral distortion generally is tolerable in that it is unlikely to be cost-effective to eradicate every last behavioral distortion. The challenge, therefore, is to minimize these costs rather than to eradicate them.

Resistance from Physicians

Given the large volume of safety research that has been conducted in aviation over many decades, we believe that numerous ideas might usefully be adopted and disseminated more widely across the health care sector (Loukopoulos, Dismukes, and Barshi 2009). Nonetheless, we would expect resistance from both individual doctors and organized medicine. The framework we have presented for classifying patient safety initiatives may help explain the reasons for such resistance in the following ways.

Counterheroism

Safety initiatives that discourage individual heroism may be unpopular with doctors because they downplay the role of decisive, autonomous decision making in the face of uncertainty, which is seen as a core component of medical professionalism (Royal College of Physicians 2005). Given the prevailing culture of heroism, doctors might perceive any attempts to introduce tighter codification in health care, such as through more standardized layouts, as restrictions on their ability to innovate and hence as a threat to their professional status.

Common Knowledge

One of the hallmarks of a profession is the existence of a clearly defined, specialist knowledge base that is common knowledge to members of the profession but not to outsiders. Any measures that expand the circle of people who share that knowledge base so that it includes, for example, nurses, managers, patients, and/or visitors, thus may be viewed as undermining the status of the medical profession. At a time when doctors are facing many other challenges to their professional standing, such as through independent prescribing by nurses and pharmacists, any new measures that increase common knowledge may be unwelcome if they are seen as yet another threat to a physician’s status.

An example is a video produced by the U.S. Centers for Disease Control and Prevention (CDC), which is designed to encourage patients to ask their physician or nurse to wash their hands (CDC 2008). In the video, a patient is seen asking a physician to wash his hands. The physician is seen complying and explaining that he does not mind being reminded. The point of the video is to make the importance of hand washing common knowledge between doctors and patients, with the intention that this shared knowledge will encourage doctors to wash their hands even without being reminded.

Ergonomics

Finally, physicians may respond negatively to calls for better safety through design if they feel powerless to instigate the necessary design changes themselves. In the United Kingdom, the British Medical Journal is attempting to address this problem through an initiative in which doctors are encouraged to “pitch” their patient safety ideas to a panel, with the winning entries being published in the journal and considered for implementation across the health service by the National Patient Safety Agency (NPSA).

Conclusion

One drawback to borrowing concepts from other industries is that their implementation is affected by the deep-seated cultural attitudes of health care staff toward safety (Bosk et al. 2009). But we believe that the process of adopting safety concepts from aviation could in itself alter the culture of health care teams if the interventions dissuaded heroic actions, increased common knowledge, or encouraged safety by design. In other words, some degree of initial resistance from physicians is predictable, but it may be expected to give way over time to a change in workplace culture simply by virtue of the intervention itself. Such a cultural shift recently was demonstrated in a study of crew resource management programs in health care, in which personal behaviors and empowerment were seen to improve over the course of several years (Sax et al. 2009).

Clearly, the fact that a safety concept works well in the aviation industry does not necessarily mean that it is either needed or will be effective or indeed cost-effective in health care. For example, some of the suggestions listed in this article might be too draconian or too simplistic for health care, or they might create an atmosphere of mistrust. Although we listed in table 3 some possible drawbacks to our examples, we suspect that there will be many others. Very careful consideration, troubleshooting, evaluation, and monitoring therefore will be required before any of our examples are adopted and promoted more systematically.

Endnotes

1. This may be due to the public’s “extreme” reaction to aircraft disasters, based on an element of cognitive dissonance.

2. Compliance costs do not include the actual costs of the taxes themselves, since these were the direct intention of the policymakers rather than an unfortunate side effect.

References

Air Accidents Investigation Branch. 1990. Report on the Accident to Boeing 737-400, G-OBME, Near Kegworth, Leicestershire, on 8 January 1989. Report 4/1990. Available at http://www.aaib.gov.uk/publications/formal_reports/4_1990_g_obme.cfm (accessed December 15, 2010).

Amalberti, R., D. Berwick, and P. Barach. 2005. Five System Barriers to Achieving Ultrasafe Health Care. Annals of Internal Medicine 142:756–64.

Association of Anaesthetists of Great Britain and Ireland. 2004. Checklist for Anaesthetic Equipment. Available at http://www.aagbi.org/publications/guidelines/docs/checklista404.pdf (accessed December 15, 2010).

Bell, C.M., and D.A. Redelmeier. 2001. Mortality among Patients Admitted to Hospitals on Weekends as Compared with Weekdays. New England Journal of Medicine 345(9):663–68.

Berenholtz, S.M., P.J. Pronovost, P.A. Lipsett, D. Hobson, K. Earsing, J.E. Farley, S. Milanovich, et al. 2004. Eliminating Catheter-Related Bloodstream Infections in the Intensive Care Unit. Critical Care Medicine 32(10):2014–20.

Besco, R.O. 1999. PACE: Probe, Alert, Challenge, and Emergency Action. Business and Commercial Aviation 84(6):72–74.

Borg, M.A. 2003. Bed Occupancy and Overcrowding as Determinant Factors in the Incidence of MRSA Infections within General Ward Settings. Journal of Hospital Infection 54(4):316–18.

Bosk, C.L., M. Dixon-Woods, C.A. Goeschel, and P.J. Pronovost. 2009. Reality Check for Checklists. The Lancet 374(9688):444–45.

CDC (Centers for Disease Control and Prevention). 2008. Hand Hygiene Saves Lives: Patient Admission Video. Available at http://www2c.cdc.gov/podcasts/player.asp?f=9467# (accessed December 15, 2010).

Civil Aviation Authority. 2010. Standards Document 24 (version 08). U.K. Civil Aviation Authority. Available at http://www.caa.co.uk/docs/33/srg_l&ts_Stds%20Doc%2024_v8.pdf (accessed December 15, 2010).

Collins, W.E., H.W. Mertens, and E.A. Higgins. 1987. Some Effects of Alcohol and Simulated Altitude on Complex Performance Scores and Breathalyzer Readings. Aviation, Space, and Environmental Medicine 58(4):328–32.

Cooper, G.E., M.D. White, and J.K. Lauber. 1979. Resource Management on the Flight Deck (NASA Conference Publication 2120). Moffett Field, CA: NASA Ames Research Center.

Croskerry, P., K.S. Cosby, S.M. Schenkel, and R.L. Wears. 2008. Patient Safety in Emergency Medicine. Philadelphia: Lippincott Williams & Wilkins.

Dunn, E.J., P.D. Mills, J. Neily, M.D. Crittenden, A.L. Carmack, and J.P. Bagian. 2007. Medical Team Training: Applying Crew Resource Management in the Veterans Health Administration. Joint Commission Journal on Quality and Patient Safety 33(6):317–25.

EASA (European Aviation Safety Agency). 2009. Notice of Proposed Amendment No. 2009-02C, Sections AMC OR.OPS.030.FC and OR.OPS.130.FC. Available at http://www.easa.europa.eu/ws_prod/r/doc/NPA/NPA%202009-02C.pdf (accessed January 7, 2011).

FAA (Federal Aviation Administration). 1981. Code of Federal Regulations, Title 14, Part 121, Section 121.542; and Part 135, Section 135.100. Available at http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgFAR.nsf/MainFrame?OpenFrameSet (accessed December 15, 2010).

FAA (Federal Aviation Administration). 2002. Security Risk Management Guide. Available at http://fast.faa.gov/Riskmgmt/Secriskmgmt/secriskmgmt.htm (accessed December 15, 2010).

FAA (Federal Aviation Administration). 2006. Code of Federal Regulations, Title 14, Part 91, Section 91.17. Available at http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgFAR.nsf/MainFrame?OpenFrameSet (accessed December 15, 2010).

Gawande, A. 2007. The Checklist: If Something So Simple Can Transform Intensive Care, What Else Can It Do? New Yorker, December 10. Available at http://www.newyorker.com/reporting/2007/12/10/071210fa_fact_gawande (accessed December 15, 2010).

Gladwell, M. 2008. Outliers: The Story of Success. Boston: Little, Brown.

Godlee, F. 2009. Human as Hero. BMJ 338:b238.

Gosbee, J. 2002. Human Factors Engineering and Patient Safety. Quality and Safety in Health Care 11:352–54.

Gough, I. 2010. A Surgical Safety Checklist for Australia and New Zealand. ANZ Journal of Surgery 80(1/2):3–5.

Hall, S. 2006. Medical Error Death Risk 1 in 300. The Guardian, November 7. Available at http://www.guardian.co.uk/society/2006/nov/07/health.lifeandhealth (accessed December 15, 2010).

Haynes, A.B., T.G. Weiser, W.R. Berry, S.R. Lipsitz, A.H. Breizat, E.P. Dellinger, T. Herbosa, et al. 2009. A Surgical Safety Checklist to Reduce Morbidity and Mortality in a Global Population. New England Journal of Medicine 360(5):491–99.

Healey, A.N., C.P. Primus, and M. Koutantji. 2007. Quantifying Distraction and Interruption in Urological Surgery. Quality and Safety in Health Care 16:135–39.

Helmreich, R.L., and A. Merritt. 1998. Culture at Work in Aviation and Medicine: National, Organizational, and Professional Influences. Brookfield, VT: Ashgate.

Lande, K. 1997. Standardization of Flight Decks—Operational Aspects. In Aviation Safety, ed. H. Soekkha, 189–201. Ridderkerk: Ridderprint.

Lauda Air. 1999. Training Manual for Boeing 767 Crews. Available at http://www.aero-pack.de/aviation/767Manuals/767LaudaAir_Training%20Manual.pdf (accessed December 15, 2010).

Levin, A. 2009. Cockpit Chatter Cited in Six Crashes. USA Today, October 1. Available at http://www.usatoday.com/news/nation/2009-10-01-pilot-speak_N.htm (accessed December 15, 2010).

Levinson, D.R. 2010. Adverse Events in Hospitals: National Incidence among Medicare Beneficiaries. Washington, DC: U.S. Department of Health and Human Services, November. Available at http://oig.hhs.gov/oei/reports/oei-06-09-00090.pdf (accessed December 15, 2010).

Lewis, D.K. 1969. Convention: A Philosophical Study. Cambridge, MA: Harvard University Press.

Loukopoulos, L.D., R.K. Dismukes, and I. Barshi. 2009. The Multitasking Myth: Handling Complexity in Real-World Operations. Farnborough: Ashgate.

Marshall, M. 2009. Applying Quality Improvement Approaches to Health Care. BMJ 339:b3411.

Musgrave, R., and P. Musgrave. 1973. Public Finance in Theory and Practice. New York: McGraw-Hill.

NASA (National Aeronautics and Space Administration). 2007. Aviation Safety Reporting System: Confidentiality and Incentives to Report. Available at http://asrs.arc.nasa.gov/overview/confidentiality.html (accessed December 15, 2010).

NASA (National Aeronautics and Space Administration). 2009. Patient Safety Reporting System. Available at http://www.psrs.arc.nasa.gov/web_docs/PSRS_Brochure09.pdf (accessed December 15, 2010).

Neily, J., E. Dunn, and P.D. Mills. 2004. Medical Team Training—An Overview. Topics in Patient Safety 4(5):1–3. Available at http://www.patientsafety.gov/TIPS/Docs/TIPS_NovDec04.pdf (accessed December 15, 2010).

Neily, J., P.D. Mills, N. Eldridge, E.J. Dunn, C. Samples, J.R. Turner, A. Revere, R.G. DePalma, and J.P. Bagian. 2009. Incorrect Surgical Procedures Within and Outside of the Operating Room. Archives of Surgery 144(11):1028–34.

Neily, J., P.D. Mills, Y. Young-Xu, B.T. Carney, P. West, D.H. Berger, L.M. Mazzia, D.E. Paull, and J.P. Bagian. 2010. Association between Implementation of a Medical Team Training Program and Surgical Mortality. JAMA 304(15):1693–700.

NPSA (National Patient Safety Agency). 2004. Seven Steps to Patient Safety: The Full Reference Guide. Available at http://www.nrls.npsa.nhs.uk/EasySiteWeb/getresource.axd?AssetID=59971&type=full&servicetype=Attachment (accessed December 15, 2010).

NPSA (National Patient Safety Agency). 2009a. Alerts. Available at http://www.nrls.npsa.nhs.uk/resources/type/alerts/ (accessed December 15, 2010).

NPSA (National Patient Safety Agency). 2009b. National Reporting and Learning System. Available at http://www.nrls.npsa.nhs.uk/report-a-patient-safety-incident/about-reporting-patient-safety-incidents/ (accessed December 15, 2010).

NPSA (National Patient Safety Agency). 2010. Safer Surgery Checklist for Cataract Surgery Only. Available at http://www.nrls.npsa.nhs.uk/EasySiteWeb/getresource.axd?AssetID=74125&type=full&servicetype=Attachment (accessed December 15, 2010).

NTSB (National Transportation Safety Board). 2006. Additional Flight Crew-Related Accident Information. In Aircraft Accident Report NTSB/AAR-06/01 (chapter 1.18.2). Available at http://www.ntsb.gov/publictn/2006/AAR0601.pdf (accessed December 15, 2010).

Oksaar, E. 1998. Social Networks, Communicative Acts and the Multilingual Individual: Methodological Issues in the Field of Language Change. In Language Change: Advances in Historical Sociolinguistics, ed. E.H. Jahr, 3–20. Berlin: Mouton de Gruyter.

Pape, T.M. 2003. Applying Airline Safety Practices to Medication Administration. Medical-Surgical Nursing 12:77–93.

Pownall, M. 2009. Complex Working Environment, Not Poor Training, Blamed for Drug Errors. BMJ 339:b5328.

Pronovost, P.J., C.A. Goeschel, K.L. Olsen, J.C. Pham, M.R. Miller, S.M. Berenholtz, J.B. Sexton, et al. 2009. Reducing Health Care Hazards: Lessons from the Commercial Aviation Safety Team. Health Affairs 28(3):w479–89.

Reason, J. 2000. Human Error: Models and Management. BMJ 320(7237):768–70.

Royal College of Physicians. 2005. Doctors in Society: Medical Professionalism in a Changing World. London: Royal College of Physicians.

Sax, H.C., P. Browne, R.J. Mayewski, R.J. Panzer, K.C. Hittner, R.L. Burke, and S. Coletta. 2009. Can Aviation-Based Team Training Elicit Sustainable Behavioral Change? Archives of Surgery 144(12):1133–37.

Semel, M.E., S. Resch, A.B. Haynes, L.M. Funk, A. Bader, W.R. Berry, T.G. Weiser, and A.A. Gawande. 2010. Adopting a Surgical Safety Checklist Could Save Money and Improve the Quality of Care in U.S. Hospitals. Health Affairs 29(9):1593–99.

Sims, J. 2010. Flight Recorder: The Witness Box. The Independent, September 22. Available at http://www.independent.co.uk/lifestyle/gadgets-and-tech/features/flight-recorder-the-witness-box-2085594.html (accessed December 15, 2010).

Söder, J.C.M. 1991. Evaluatie Onderzoek VVN-Campagne Alcohol in Het Verkeer 1986–1991. VK 91–10. Haren: Traffic Research Centre, University of Groningen. Quoted in European Transport Safety Council. 1995. Reducing Traffic Injuries Resulting from Alcohol Impairment. Brussels. Available at http://www.etsc.eu/documents/Reducing%20traffic%20injuries%20resulting%20from%20alcohol%20impairment.pdf (accessed December 15, 2010).

Vats, A., C.A. Vincent, K. Nagpal, R.W. Davies, A. Darzi, and K. Moorthy. 2010. Practical Challenges of Introducing WHO Surgical Checklist: UK Pilot Experience. BMJ 340:b5433.

Waring, J., S. Harrison, and R. McDonald. 2007. A Culture of Safety or Coping? Ritualistic Behaviours in the Operating Theatre. Journal of Health Services Research and Policy 12(supp. 1):3–9.

WHO (World Health Organization). 2008a. Implementation Manual Surgical Safety Checklist. 1st ed. Available at http://www.who.int/entity/patientsafety/safesurgery/tools_resources/SSSL_Manual_finalJun08.pdf (accessed December 15, 2010).

WHO (World Health Organization). 2008b. Surgical Safety Checklist. 1st ed. Available at http://www.who.int/entity/patientsafety/safesurgery/tools_resources/SSSL_Checklist_finalJun08.pdf (accessed December 15, 2010).

WHO (World Health Organization). 2008c. The WHO Surgical Safety Checklist: Adaptation Guide. Available at http://www.who.int/patientsafety/safesurgery/checklist_adaptation.pdf (accessed December 15, 2010).

Wiese, J.G., M.G. Shlipak, and W.S. Browner. 2000. The Alcohol Hangover. Annals of Internal Medicine 132(11):897–902.

Williamson, M. 2010. David Warren: Inventor and Developer of the “Black Box” Flight Data Recorder. The Independent, July 31. Available at http://www.independent.co.uk/news/obituaries/david-warren-inventor-and-developer-of-the-black-box-flight-data-recorder-2040070.html (accessed December 15, 2010).

Woods, D.D. 2006. Essential Characteristics of Resilience. In Resilience Engineering: Concepts and Precepts, ed. E. Hollnagel, D.D. Woods, and N. Leveson, 21–34. Farnham: Ashgate.

Yesavage, J.A., and V.O. Leirer. 1986. Hangover Effects on Aircraft Pilots 14 Hours after Alcohol Ingestion: A Preliminary Report. American Journal of Psychiatry 143:1546–50.



Acknowledgments: Jennifer Dixon, Alan Garber, Sue Osborn, Martin Marshall, and three anonymous reviewers provided helpful comments on earlier drafts of this article.

Address correspondence to: Geraint H. Lewis, The Nuffield Trust, 59 New Cavendish Street, London W1G 7LP, United Kingdom (email: geraint.lewis@nuffieldtrust.org.uk).




The Milbank Memorial Fund is an endowed operating foundation that engages in nonpartisan analysis, study, research, and communication on significant issues in health policy. In the Fund's own publications, in reports, films, or books it publishes with other organizations, and in articles it commissions for publication by other organizations, the Fund endeavors to maintain the highest standards for accuracy and fairness. Statements by individual authors, however, do not necessarily reflect opinions or factual determinations of the Fund.

©2011 Milbank Memorial Fund. All rights reserved. This publication may be redistributed electronically, digitally, or in print for noncommercial purposes only as long as it remains wholly intact, including this copyright notice and disclaimer.

Printed in the United States of America.
