There’s a renowned fictional case study (originated by Jack Brittain and Sim Sitkin) used in business schools to help students understand the risks of poorly informed decision-making. Students are asked to imagine that they are John Carter, the founder of a car racing team. The team is coming up to a very important race – important because there is big prize money at stake, and because a good result would bring great publicity and a much better chance of winning a badly needed new sponsorship deal.
The problem, however, is that in 7 of the last 24 races the engine in the Carter racing car has failed, forcing an early retirement. If the engine blows out in this race it will happen on live television, jeopardising not only the prize money and the sponsorship deal but potentially the safety of the driver too. The head mechanic has a hunch that the engine fails when the weather is cooler, so the team makes a graph plotting the temperatures for every race in which the engine has failed. The graph shows a reasonably broad range of temperatures across those races (from around 55 to 75 degrees Fahrenheit).

The chart does not give the team a definitive answer, but the temperature for the upcoming race is forecast to be especially cool, at around 40 degrees. So with the stakes high, should John Carter go ahead and race or not?
Business students typically opt to enter the race, since the data is inconclusive and there is much to be gained from a good result. The lesson of the case study comes from the fact that few students ask to see the rest of the data – the temperatures from the races where the engine did not blow out (represented by the blue dots below). Include this data on the graph and the story looks very different: every race the car finished successfully was run at a temperature above 65 degrees Fahrenheit. Put another way, the failure rate above 65 degrees was 15%; below that temperature it was 100%.
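If you want to check the arithmetic, here is a minimal sketch in Python. The exact split of the seven failures either side of 65 degrees is not stated above, so the split used here (four below, three above) is an assumption chosen to be consistent with the quoted rates.

```python
# A minimal sketch of the failure-rate arithmetic from the case study.
# The split of the seven failures either side of 65°F is an assumption
# for illustration (four below, three above), chosen to be consistent
# with the rates quoted in the text.

failures_below_65 = 4    # assumed: failures in races run below 65°F
failures_above_65 = 3    # assumed: failures in races run at/above 65°F
successes_above_65 = 17  # all 17 successful races were run above 65°F
successes_below_65 = 0   # no race below 65°F finished successfully

rate_above = failures_above_65 / (failures_above_65 + successes_above_65)
rate_below = failures_below_65 / (failures_below_65 + successes_below_65)

print(f"Failure rate above 65°F: {rate_above:.0%}")  # -> 15%
print(f"Failure rate below 65°F: {rate_below:.0%}")  # -> 100%
```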

The head mechanic was in fact right, but without the full data and the full story being presented, it is surprisingly easy to make a poor decision.
As the mathematician Hannah Fry points out, this scenario is not just a fictional business school case study but a representation of a very real, and very tragic, situation – the Space Shuttle Challenger disaster of January 1986. The disaster was found to be the result of failing ‘O-ring’ pressure seals on the solid-fuel booster rockets used in the launch. The two boosters attached to the massive external fuel tank were so big that they couldn’t be assembled in one go, so they were built in round sections that fitted together, with the joints between sections sealed by rubber O-rings that flexed to form a solid seal.

The launch day was forecast to be a particularly cold 36 degrees. A hurriedly assembled graph showing the temperatures at which problems had previously occurred (like the first graph above) was faxed to NASA at the Kennedy Space Center and discussed at an emergency teleconference right before the launch. Because the data appeared inconclusive, no direct link between low temperatures and launch problems was drawn, and the decision to go ahead was confirmed. Tragically, the unusually low temperatures had reduced the flexibility of the rubber O-rings; hot gases escaped through the imperfect seal, igniting the fuel in the external tank and destroying the Shuttle. As Hannah Fry says:
‘The chart implicitly defined the scope of relevance—and nobody seems to have asked for additional data points, the ones they couldn’t see. This is why the managers made the tragic decision to go ahead despite the weather.’
Had the managers on that call seen the full data, they would very likely have come to a different conclusion. It’s a salutary lesson about the importance of seeing the whole picture.
But there is another, perhaps even more salutary, lesson to be drawn from what happened around that launch on 28th January 1986. The night before, several engineers at Morton Thiokol, the contractor NASA had hired to build the rocket boosters, tried in vain to stop it from going ahead. They were overruled by managers from their own company and from NASA. Having access to the full data, Morton Thiokol engineers Bob Ebeling and Roger Boisjoly knew that at particularly low temperatures the rubber O-rings were at risk of not sealing properly.
There was a lot of pressure on NASA not to delay the launch. One of the crew on board Challenger, Christa McAuliffe, was due to be the first school teacher in space, and the launch was scheduled to be broadcast into hundreds of American schools. President Ronald Reagan was due to mention the Shuttle mission in his upcoming State of the Union address. After hearing what Ebeling had to say about the risk, Morton Thiokol’s bosses initially called NASA to halt the launch, but NASA’s bosses did not react well to the news.
Wanting to provide more information and placate their largest client, the Morton Thiokol leaders sent NASA the first chart, which appeared to show only an inconclusive link between temperature and O-ring problems. The space agency pushed back, and the Morton Thiokol executives decided to approve the launch. Hearing that it was still going ahead, Ebeling jumped in his car and drove to Morton Thiokol’s headquarters to try to get the executives to change their minds, but in vain.
Tragically, Ebeling lived with depression and guilt for years after the disaster. He and his colleague Roger Boisjoly spoke anonymously to NPR in the weeks that followed to reveal the true story of what led to the tragedy that day.
There are many lessons for leadership in this tragic story. There is the lesson about how important it is to take the whole picture into account, and not to make critical decisions based on partial data. There is the lesson about listening carefully to the specialists who really understand the detail of a situation. And finally, there is the lesson about not letting misaligned incentives or outside pressure get in the way of good decision-making.
Such a sad story, but one from which we can definitely learn.