My Interview With Capt. Sully Sullenberger: On Aviation, Medicine, and Technology

The story of Chesley “Sully” Sullenberger – the “Miracle on the Hudson” pilot – is a modern American legend. I’ve gotten to know Captain Sullenberger over the past several years, and he is a warm, caring, and thoughtful person who saw, in the aftermath of his feat, an opportunity to promote safety in many industries, including healthcare.

Continuing the series of interviews I conducted for my upcoming book, The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, here are excerpts from my interview with Sully, conducted at his home in San Francisco’s East Bay on May 12, 2014.

Bob Wachter: How did people think about automation in the early days of aviation?

Sully Sullenberger:  When automation became possible in aviation, people thought, “We can eliminate human error by automating everything.” We’ve learned that automation does not eliminate errors. Rather, it changes the nature of the errors that are made, and it makes possible new kinds of errors. The paradox of cockpit automation is that it can lower the pilot’s workload in phases of flight when the workload is already low, and it can increase the workload when the workload is already high.

If you’re in the cruise portion of a long distance flight, it’s possible to have the airplane programmed to fly a specified vertical and horizontal path for many hours without much intervention at all. That relegates the pilots to the role of monitor, something that humans are not good at. Witness the TSA security agents, who are screening countless passengers looking for that one time in thousands where there’s a threat. Humans are much better doers than monitors.

The other problem with technology is, at least for now, it can only manage what has been foreseen and for which it’s been programmed. So one of the weaknesses of technology is that it has a hard time handling “black swan” events, like our flight.

BW: Why are computers so good in aviation and so mediocre in healthcare?

SS:  I think there are a number of reasons. First we have a long history of putting safety as one of our highest priorities. From the very beginning we’ve known that aviation is an inherently risky endeavor. And, in the media age, we all see the video of the huge orange fireball as the jet fuel explodes. That gives us a sense of urgency.

The second component is that we take a systems approach. We look at it from end to end. We look at connectedness, the interrelatedness of all the things that we do, of the systems and the airplane, the human system, the technology system, and the air traffic control system. So we have to be the absolute master of the machine and all the systems, of the passengers, of the environmental conditions… All simultaneously, all continuously.

And third, we have this formal lessons-learned process that does root-cause analysis, makes findings about fact, causes and contributing factors, and makes recommendations for improving the system in terms of the designs, the policies, procedures, training, human performance, and standards. It’s a self-correcting mechanism.

BW:  In the early days of aviation, was there an overarching philosophy about the relationship between the people and the machines?

SS:  Yes. That the pilot should absolutely be informed about everything going on in his or her aircraft, and ultimately able to affect the outcome, to change it, to control it.

BW:  So the pilot would be in charge?

SS:  Always be in charge.

I asked whether there was another camp in aviation – folks who believed that the technology could, and should, fly the plane.

SS: Airbus was, and still is, different than Boeing in their automation philosophy. Airbus tended to embrace letting the airplane do a lot of things, where Boeing’s philosophy was that the pilot is and should be in control of every part of the process. For example, Airbus would set hard limits in the digital flight control system laws, the fly-by-wire laws that would prohibit the pilot from exceeding certain limitations in how fast or slow the airplane could fly, how steeply it could bank, how many degrees you could tilt the angle of the wing from horizontal or raise the nose above horizontal. Whereas Boeing takes the approach that there are rare occasions where, to avoid colliding with another airplane or crashing into the ground, you need to pull harder than the flight control system might otherwise allow. I tend to favor having the pilot directly and completely in control of the airplane. The downside of that is that every pilot has to be trained well, be highly experienced, and have a deep understanding of airplanes and how they work.
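The two philosophies Sully contrasts can be caricatured in a few lines of code. This is purely an illustration, not real flight-control software: the limit values, the "breakout force" idea, and all names below are invented for the example.

```python
MAX_PITCH_DEG = 30.0        # assumed hard pitch-up limit; not a real Airbus number
BREAKOUT_FORCE_LBS = 50.0   # assumed force needed to pull through a soft limit

def hard_limit_command(pilot_pitch_deg: float) -> float:
    """'Hard limit' law: the computer clamps the command no matter how
    hard the pilot pulls."""
    return max(-MAX_PITCH_DEG, min(MAX_PITCH_DEG, pilot_pitch_deg))

def soft_limit_command(pilot_pitch_deg: float, pull_force_lbs: float) -> float:
    """'Soft limit' law: resistance rises near the limit, but a pilot who
    pulls hard enough can exceed it in an emergency."""
    if abs(pilot_pitch_deg) <= MAX_PITCH_DEG or pull_force_lbs > BREAKOUT_FORCE_LBS:
        return pilot_pitch_deg
    return max(-MAX_PITCH_DEG, min(MAX_PITCH_DEG, pilot_pitch_deg))

# A pilot commanding 35 degrees nose-up:
print(hard_limit_command(35.0))        # clamped to 30.0
print(soft_limit_command(35.0, 60.0))  # 35.0: the pilot pulled through
```

The point of the contrast: under the hard-limit law the clamp is unconditional, while the soft-limit law lets a sufficiently determined pull override it. That is the trade-off Sully describes, and it is why the second approach demands deeply trained, experienced pilots.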

BW:  But you had that experience because you had a whole career before all of this technology. How does a young person get that? And how do we prevent the erosion of skills as the technology gets better?

SS: As we use the technology more and more – and we’re encouraged to do so by our airlines because it’s so efficient – then we get the sense that it’s almost infallible. And, because we haven’t done much manual handling of the airplane, we lose confidence in our ability to manually control the airplane as well as the automation can. That sometimes makes some pilots reluctant to quickly and effectively intervene when they see things going wrong or when the automation isn’t doing what they expect.

BW:  Is the solution forcing them to fly manually more of the time?

SS:  Yes. We have to design our systems to require our engagement. We cannot design a system that’s so hands off that we are simply required to sit there and watch it for 14 hours. That’s simply not going to work.

Another piece of the problem is that our systems should offer more options than all or nothing. I’ve been proposing an a la carte menu, with increasing or decreasing levels of technology they can use. The only question we have to ask ourselves is what level of technology is most appropriate for this phase of flight. The answer is the one that keeps us engaged and aware and able to quickly and effectively intervene, and also keeps our workload neither too high nor too low.
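Sully's "a la carte menu" could be imagined as a simple mapping from phase of flight to automation level. Everything below is hypothetical: the level names, phases, and workload heuristic are invented for illustration and are not any airline's actual policy.

```python
# Hypothetical "a la carte" automation selector. Level names, phases, and
# the workload mapping are all invented for illustration.
PHASE_WORKLOAD = {
    "takeoff": "high",
    "climb": "medium",
    "cruise": "low",
    "descent": "medium",
    "approach": "high",
}

def pick_level(phase: str) -> str:
    """Choose the automation level that keeps the pilot engaged and the
    workload neither too high nor too low: guidance cues (but hand-flying)
    in the busy phases, autopilot elsewhere, and deliberately never a
    fully hands-off mode that the crew would only sit and watch."""
    workload = PHASE_WORKLOAD[phase]
    if workload == "high":
        return "flight_director"  # pilot flies, automation advises
    return "autopilot"            # automation flies, pilot stays in the loop

print(pick_level("approach"))  # flight_director
print(pick_level("cruise"))    # autopilot
```

The design choice worth noticing is that no phase returns a "full automation" level at all, mirroring Sully's point that the system should be designed to require the pilot's engagement.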

We turned to January 15, 2009, the day Sully safely landed an Airbus A320 on the Hudson River, within a few minutes of a bird strike that disabled both the plane’s engines.

SS: What happened to us was a very rare event. And it happened late in my career, after I’d been an airline pilot for 29 years, a captain for 21 of those. In all that time, 42 years of flying, I had never experienced in flight the actual failure of even a single engine. That’s how reliable our propulsion technology is. And, in all that time I had never been in a situation in which I had doubted the outcome of a flight. I had never been challenged by a situation that I didn’t think I could handle.

BW:  Is there a strange part of one’s brain that says, “I hope that something like that would happen because I’ve been training for it my whole life”?

SS:  No.

BW:  Is that too crazy?

SS: No, it’s not, because it happens a bit in combat. I served in the Vietnam era but I was never in combat. I always wondered if I would have been up to the challenge. I think it’s just natural when you train for that to wonder: Would I be brave enough? Would I be resilient enough? Would I be a good enough leader? Would I be clever enough, strong enough to complete the mission and keep the people I was responsible for safe? I was prepared never to know the answer to that question. But I always wondered about it.

But in the airline world, it’s different. Our job is to make it calm and predictable. We make it look easy. We work so hard to plan and anticipate and have alternatives for every course of action.

And so on this flight, just 100 seconds after takeoff, I saw the birds about two seconds before we hit them. At that point we’re travelling at 316 feet per second, and there was not enough time or distance to maneuver a jet airplane away from them. It was like a Hitchcock film. I saw the birds fill the windscreen, I could hear the thumps and thuds as we struck them. And as the birds entered the center of both jet engines and began to damage them, I heard terrible noises I’d never heard before, severe vibrations and then within seconds, the burning smell came into the cabin.

We had a great advantage in that there was no ambiguity. I knew what the cause was and what that entailed, what that meant for us, and I knew it was going to be, after such a routine career, the challenge of a lifetime. Yet it was such a sudden shock, the startle factor was enormous. I could feel my blood pressure shoot up, my pulse spike, my perceptual field narrow. It was really distracting, and it was marginally debilitating. It absolutely interfered with my cognitive processing. It didn’t leave me with the ability to do the math on altitude and distance.

I had to do three things that, in retrospect, made the difference. First, I had to summon up from somewhere within me this professional calm that really isn’t calm at all. It meant having the discipline to compartmentalize and focus on the task at hand in spite of the stress.

Second, even though we had never trained for this, because I had such a well-defined paradigm in my mind about how to solve any aviation emergency, I was able to impose that paradigm on this situation and turn it into a problem that I could solve.

Finally, because of the extreme workload and time pressure I knew that there was not time for me to do everything I really needed to do. I chose to do the most important things, and try to do them very, very well. And I had the discipline to ignore everything else that I did not have time to do. Calming myself, setting clear priorities, managing the workload, load shedding… those were the key.

BW:  Did you do all that consciously, as in: “Here are all the things one might do, but here are the six that I’m going to do?” Or was it all instinct?

SS: I would say that I was able to quickly synthesize a lifetime of training and experience and intuitively understand that that was the approach I needed to take. It was partly a result of my military flight training, from being a fighter pilot. It was all very intuitive. Fly the airplane first. Make good choices. Pick the best place to land. That sort of process.

But I immediately did two very specific things that I remembered from the dual engine flameout checklist. I turned on the engine ignition so, if the engines could recover thrust, they would. And I started our aircraft’s auxiliary power unit, which is a small jet engine, a small turbine that has its own electrical generator. In a fly-by-wire airplane like the Airbus, it’s especially important to have an uninterrupted supply of electrical power because the plane does not have a mechanical, wires and pulleys connection between the flight control stick or wheel in the cockpit and the flight control surfaces on the wings and tail. Instead the pilot’s inputs are interpreted and mediated by flight control computers, which send electrical impulses to actuators that move the flight control surfaces of wings and tail. And that requires electric and hydraulic power.

Without that power, our systems and our information displays would have degraded. And the power also allowed us to keep intact all the flight envelope protection, which prevented us from getting too fast or too slow or banking too steeply.
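The dependence of a fly-by-wire airplane on electrical power can be sketched minimally. Again, this is an invented illustration, not Airbus software: the function names, the ±1.0 command limits, and the behavior are all hypothetical.

```python
from typing import Optional

def flight_control_computer(stick_input: float, powered: bool) -> Optional[float]:
    """Interpret the pilot's side-stick input and emit an actuator command.
    No electrical power means no command path at all, which is why keeping
    the generators (or the APU) running matters in a fly-by-wire airplane."""
    if not powered:
        return None
    # Envelope protection: clamp the command to assumed limits of +/-1.0.
    return max(-1.0, min(1.0, stick_input))

def move_surface(command: Optional[float]) -> str:
    """Actuator stage: deflect the control surface, or report no response."""
    if command is None:
        return "surfaces unresponsive"
    return f"elevator deflected {command:+.2f}"

print(move_surface(flight_control_computer(0.6, powered=True)))
print(move_surface(flight_control_computer(0.6, powered=False)))
```

The sketch captures the two facts Sully emphasizes: the pilot's input reaches the surfaces only through powered computers, and those same computers impose the envelope protection on whatever the pilot commands.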

As we’re heading toward the water, I’m the one controlling the airplane, but my first officer, Jeff Skiles, is monitoring the performance of the airplane and my performance. I didn’t have time to tell him how to help me. Right before landing he intuitively understood that he needed to help me judge our altitude. And so he stopped trying to use the checklist to regain thrust from what turned out to be irreparably damaged engines and he began to call out to me altitude and airspeed. You’re 200 feet, you’re 100… that kind of stuff. And if I had misjudged that height by even a fraction of a second – I might have raised the nose too early, got too slow, lost lift and then dropped it in hard. If I waited too long we would hit nose first way too fast with too great a rate of descent and that could be very bad.

And even though I was not at the minimum speed at which the Airbus flight control protections would have prevented me from raising the nose further no matter how hard I pulled, there was a little-known feature of the Airbus software – that no airline pilots, no airlines knew about, only a few Airbus engineers knew about – that prevented me from getting that last bit of maximum aerodynamic performance out of the wing. This was discovered very late in the investigation.

BW:  You didn’t perceive it?

SS:  Well, I’m not sure.

BW:  You had other things on your mind.

SS:  I wasn’t sure. But it turns out at the end I was not quite at the maximum angle the wing could be allowed to try to create lift. And yet even though I was pulling back, commanding full nose up on the side stick, the flight control computers prevented me from getting any more performance. That was something the investigators debated.

BW:  It’s interesting that you spoke about how you were glad you were in this automated envelope of protection. I might have guessed that you would have wanted at that moment to be completely off cruise control and have total control yourself?

SS:  Well, it’s not at all cruise control. The automation was not flying the defined path for me. But it put guardrails out there. So it kept me from making a gross error. It kept me from stalling the airplane.

BW:  So you were happy that it was there at that moment?

SS:  It was not something I consciously thought about but it was a good thing to have the airplane be fully operational. But it turned out that because of this little known aspect of the flight control system, we hit a little bit harder than we would have had we been able to get that last little bit of lift out of the wing right before we landed. The NTSB recommended that the airlines teach the pilots about this feature, because we didn’t know about it.

I asked Sully what lessons he drew about the relationship between people and machines from the Hudson landing.

SS:  I don’t think there’s any way that technology could have done what we did that day.  Absolutely no way.

BW:  You can’t envision a future in which the engineers have built in a “both engines fail” mode and the technology perceives it and starts doing X and Y?

SS:  I think that would be possible, but then that doesn’t take into account the whole divert decision. About where your flight path is going to intersect the earth’s surface in 208 seconds. And someone would have had to have anticipated this specific circumstance, this particular black swan event and programmed it to do that.

Theoretically, maybe technology could do some of these things. But to handle the whole thing from start to finish required a lot of innovation, a lot of – it required us to take all the things that we had learned, adapt it, apply it in a new way to solve a problem we never anticipated and never trained for and get it right the first time. In 208 seconds.

BW: I can’t help but think that one of the riskiest things about technology is that it seems inconceivable that a young pilot today will have your skills and instincts, because they’ve grown up in such a different environment.

SS:  And that’s another great challenge – we have to find a way to pass on this institutional knowledge. And not just the what, but also the why. You see pilots of my generation, especially ones who have wanted to fly since we were five years old – we just couldn’t get enough. We couldn’t learn enough about the history of our profession, about historic accidents, about why we do what we do. Because almost every procedure we have, almost every rule in the Federal Aviation rulebook, almost every bit of knowledge we have is because someone, somewhere died. Often many people did. And so we have learned these important lessons at great cost, literally bought with blood. We dare not forget and have to relearn them.

Sully then discussed the 1989 landing of United Flight 232 in Sioux City, Iowa, after a catastrophic engine failure severed all of the DC-10’s hydraulic systems, rendering the plane’s flight controls completely inoperable. A pilot named Al Haynes and his crew miraculously conjured up a system of using differential thrust to control the plane sufficiently to save 185 of the 297 people on board.

SS: I remember having that radio communication back and forth with Patrick Harten, the air traffic controller. I said to Patrick, “I’m not sure we can make any runway. We may end up in the Hudson.” At that moment, I was thinking, that’s just like what Al Haynes said when the air traffic controller at Sioux City said …

BW:  Did Sioux City actually cross your mind at that moment?

SS:  Yeah, I thought about that flight, because at one point as they were approaching the Sioux City airport, the controllers want to give you the widest discretion they can to solve the problem any way you think is possible. And there were multiple runways of different directional orientations at Sioux City. The controller said to Haynes, “You’re clear to land on any runway.” At that point the control of the airplane was still so tenuous that Al said, almost laughing, “You want to be particular and make it a runway, huh?” And that thought – of Al’s comment – crossed my mind.

And so, yeah, we have to pass on this knowledge.

10 Responses to “My Interview With Capt. Sully Sullenberger: On Aviation, Medicine, and Technology”

  1. Davis Liu, MD February 23, 2015 at 3:11 pm #

    Bob – great interview! Your book is another must read!

    Captain Sullenberger is certainly one of my heroes and the issues you bring up are so timely and relevant in health care. My concerns about the next generation of doctors appear to be something aviation is also worried about with its pilots.

    With the Asiana accident in San Francisco, the Air France accident over the Atlantic, and the AirAsia accident, one has to wonder whether that era of different errors has arrived because this generation of pilots, given their training and the automation of their aircraft, is fundamentally different from Sullenberger’s era. If so, what can they do about it, how will they go about doing it, and when will they know they made an impact?

    For us, when will this issue occur in health care and what might we do about it now that we know?

    Keep up the amazing work!


    Davis Liu, MD

  2. Karthik Raghunathan February 23, 2015 at 6:24 pm #

    Bob:
    Great interview. This back-and-forth – between automation and human control – in aviation and medicine is fascinating.

    Automation may allow for the precise ‘setting of parameters’ and ‘measurement of responses’ – probably impossible to do with human performance. Our system (in medicine) of education and training may do a decent job of producing people with ‘necessary’ skill and knowledge – but probably not ‘sufficient’ for rare black swan scenarios. How can one train for specific rare events that are only fully understood in hindsight (rhetorical question)?

    What may be needed in these “rara avis in terris” situations is the right person (like Sully Sullenberger was, to pull off the miracle on the Hudson). Captain Sullenberger actively sought out and let Jeff Skiles and ATC work with him! We need ‘people’ like this. Medical School, Residency, and Fellowship selection processes appear geared toward picking brilliant people – not necessarily team players …

    Who knows what would have saved Libby Zion … but even a Dr. Sherlock Holmes in the new ACGME duty hour restricted world would need to actively seek support – since there are multiple teams caring for any one person.

    How do we build an inherently collaborative culture in Medicine ?


    Karthik Raghunathan, MD MPH

  3. RW February 25, 2015 at 7:23 pm #

    How much of his time in the cockpit does a pilot spend on data entry chores? Do his passengers feel safe in their homes? Do they wear helmets when biking? Do they have any loose area rugs?

    If our HIT experts were involved in the aviation industry, Captain Sullenberger would be back in the cabin during take-off trying to determine who ordered the fruit plate.

  4. Desmond Shapiro February 28, 2015 at 12:49 am #

    Thank you. Enjoyable interview indeed.
    Seems to me that the big difference between the aviation and medical safety models is that the pilot has skin (all of it) in the game. Getting buy-in from physicians is much more difficult.
    Also, the use of simulators and recertification is much more rigorous and prevalent in aviation.
    Changing physician behavior and priorities tends to rely on the following:
    Financial incentive.
    Financial disincentive.
    State regulation.
    Transparency (see cockpit voice recorders).
    How soon before we have video recorders in the OR and ??

    • Todd Fraser March 13, 2015 at 7:18 am #

      What a fantastic interview, and a great topic of discussion. I’ve enjoyed reading all views presented.

      I am particularly interested in the issue of recertification, as Desmond mentioned above. I know this is something like the third rail in the US at the moment, but the stark difference between the aviation approach to credentialing and that in healthcare is impossible to ignore. I wrote about my limited exposure to this recently –

      Surely we, as an industry, need to take the bull by the horns and accept that we need to be more transparent in our performance monitoring? How can we expect to advance our practice if we a) have no idea how we are performing and b) refuse to openly analyse our deficiencies and share the findings on a global basis?

    • Dave Slaughter March 14, 2015 at 5:06 pm #

      Physicians have quite a bit of “skin in the game”. Getting buy-in from physicians depends on the “buy-in” contributing to their ability to perform at the “game”: i.e., the job of taking care of patients. I was struck in this interview by a couple of comments. First, at some point Captain Sullenberger realized he could not do everything he was supposed to do, so he focused on flying the plane, his primary task. Second, the technology actually prevented him from doing that as efficiently as possible at a critical moment.

      From the standpoint of a physician, much of the technology recently has caused us to focus on documentation rather than on patient care, using systems that are designed for billing and coding purposes rather than patient care (in fact are detrimental to patient care) and that have not had the bugs worked out before release. I would bet that if similar systems were implemented in a cockpit, the pilots would refuse to fly.

      Much of the recertification that has been pushed on physicians seems to serve no useful purpose other than making somebody money. As an example, I have been recertified for basic and advanced cardiac life support every two years for decades, learning and unlearning numerous permutations of algorithms. For in-hospital cardiac arrest, the only review I have seen about the efficacy of this was in the NEJM a few years back, comparing survival rates from 1982 until just prior to the last ACLS revision. There was NO difference in survival. (I would have expected some change due to a better selection of patients who are on DNR status, and I think mock codes are a good exercise, for the record, but I’m not sure the recertification itself makes much difference.) At least we have pared down the number of useless things we do in a code. Even the ABIM seems belatedly to realize that some of their requirements have been poorly thought out.

      It doesn’t take a genius to recognize that in a field as comprehensive as medicine, spending CME time on things that do not improve patient care or practical knowledge is not only useless but detrimental. Small wonder physicians don’t buy in. Finally, the same can be said of some of the performance measures foisted on us. An excellent example is the “four-hour rule” for pneumonia, which hasn’t improved the treatment of pneumonia but sure has resulted in many patients getting antibiotics they didn’t need.

      I am going to end with a quote from last week’s NEJM. It is about the British medical system but could apply anywhere:
      “General practice is being criticized for its small scale, isolation, and lack of accountability. Structures that used to be praised as entrepreneurial and flexible are now described pejoratively as a “cottage industry.” Unacceptable performance in a small number of practices is being highlighted rather than ignored. But most important, this new focus reflects a growing concern that the majority of general practices have neither the capacity nor the capability to respond to the increasing expectations placed on them by the public and policymakers — expectations that they will be able to cope with both the increase in the consultation (visit) rate from 3.9 consultations per patient per year in 1995 to 5.5 in 20092 and the increasing complexity of most of these consultations; that they will assume responsibility for the growing number of activities that are being shifted out of hospitals and into the community; that they will adopt the lead role in commissioning health services, with a more proactive, integrated, and population-based orientation; and that they will be at the forefront of technological and social innovation in care delivery.
      These growing expectations are hitting general practice at a difficult time. Overall spending on the NHS has risen by 18% since the 2005–2006 fiscal year, but the proportion of the budget allocated to general practice has dropped by 8% over the same period.3 In the past year, there has been a 15% decrease in applications for general practice training posts4; practices are finding it increasingly difficult to fill vacant equity-holding partnerships and salaried posts, and they are struggling to hold onto the growing number of older doctors who are considering early retirement. A specialty that has historically had an admirable reputation for simply absorbing whatever it was asked to do is now struggling to deal with the growing expectations that are being placed on it. Some commentators are saying that general practice is in crisis.”

      Bottom line: practicing physicians could use some help. That means financial incentives that enable physicians to spend more thoughtful time with patients; a record system that facilitates patient care rather than detracting from it, and allows doctors to focus on patients rather than documentation; and performance measures that are well thought out and make sense (and do not cause a deterioration of care in areas not being measured). Do this and I think you’ll find physicians will “buy in”.

      I don’t expect any of this from our current physician leaders or politicians.

  5. Which Doctor? February 28, 2015 at 12:08 pm #


    “Rather, it changes the nature of the errors that are made, and it makes possible new kinds of errors”

    Professor Ross Koppel reported this about EHRs and CPOE in JAMA nearly a decade ago. The same problems he described then are still happening. Despicable.

    Such is NOT the case in aviation, where there is aggressive surveillance of, and remedy for, unintended adverse events.

    In Medicine, Congress is rewriting the FD&C Act to exempt EHRs and CPOE from oversight. Now that is really helpful (to the vendors and their stockholders).

    The bill is called the “Cures Bill,” and it does anything but.

  6. Which Doctor? March 22, 2015 at 1:06 pm #

    I am thrilled to see that you have taken my comments and those of Koppel and Menoalittle and incorporated them into this articulate NY Times op-ed.

    I agree that med tech is transformative, but as I see it, in the negative direction.

    • Menoalittle March 22, 2015 at 8:28 pm #

      Which Doctor?,

      Thanks for the shout out.



      I read your provocative commentary at the link provided. Considering your defense of HIT that permeates your blog posts, you seem to have had an epiphany. Tell us what it was.

      Best regards,



  1. Wachter: AMCs Must Prepare “Digital Doctors” | Wing of Zock - March 3, 2015

    […] one, however, he undertook a more journalistic approach, interviewing nearly 100 people, from pilot Captain “Sully” Sullenberger to hospital chief innovation officer John […]
