The Danger of Blindly Following Machines


This episode of Tim Harford's podcast series Cautionary Tales has some brilliant stories about the dangers of blindly following technology. Tim talks about the phenomenon of what rangers in Death Valley, California, have termed 'death by GPS' – genuine examples of people ending up in perilous situations by uncritically following the turn-by-turn directions of a GPS rather than paying attention to what was in front of them.

Like the tragic story of Alicia Sanchez, the 28-year-old nurse who, while driving with her six-year-old son through Death Valley, followed her GPS for 20 miles down a poorly defined track until her car became stuck. A week later a ranger discovered her Jeep buried axle-deep in sand, with SOS spelled out in medical tape on the windshield. Fortunately, Alicia survived the ordeal but, tragically, her son did not. Then there's the couple who blindly followed their GPS, driving past concrete barriers, orange barrels and road-closed signs before driving off the edge of an unfinished bridge and plummeting forty feet. Or the man whose GPS took him and his car down a steep, narrow path in the Pennines until he was left teetering on a cliff edge. Or the Swedish tourists who, whilst on holiday in Italy, drove 400 miles off course and ended up in Carpi rather than Capri.

The problem is often not that the technology is wrong, but that it is too right. In calculating the most direct route it fails to take account of roads that are really just tracks, of roads that are now abandoned and not suitable for a car, or of the local knowledge that would tell you that taking that turning was a bad idea. Yet, as Tim says, a large part of the problem with phenomena like death by GPS is how we sometimes unquestioningly follow the technology and fail to simply look out of the window. And that can have pretty huge consequences. Like the example of AIG, which lost $60 billion in one quarter during the 2008 financial crisis as a result of betting everything on a financial probability formula called the Gaussian copula function.
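That failure mode – a route that is optimal on paper but wrong in practice – can be sketched in a few lines of code. This is only a toy illustration, not how any real sat nav works: it runs a standard shortest-path search (Dijkstra's algorithm) over a small hypothetical road network, first weighting edges by distance alone, then with an assumed penalty on an unsuitable desert track. The first search confidently recommends the track; the second avoids it.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm: return (total_cost, path) minimising edge weights."""
    queue = [(0, start, [start])]  # (cost so far, current node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return float("inf"), []

# Hypothetical network, edges weighted by distance in miles only.
# The "track" route is shorter, so a distance-only search picks it.
by_distance = {
    "start":   [("track", 5), ("highway", 8)],
    "track":   [("goal", 5)],
    "highway": [("goal", 8)],
}
print(shortest_path(by_distance, "start", "goal"))  # routes via the track

# Same network, but unsuitable-road edges carry a large assumed penalty,
# standing in for the missing local knowledge about road condition.
PENALTY = 100
by_suitability = {
    "start":   [("track", 5 + PENALTY), ("highway", 8)],
    "track":   [("goal", 5 + PENALTY)],
    "highway": [("goal", 8)],
}
print(shortest_path(by_suitability, "start", "goal"))  # routes via the highway
```

The algorithm is not wrong in either case – it faithfully minimises whatever cost it is given. The danger lies entirely in what the cost function leaves out.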

It's interesting to think about the psychological effect that using technology like sat nav has. A study by Toru Ishikawa at the University of Tokyo, for example, found that people who used a GPS to complete a navigational task had reduced recall of the surroundings on the route they had just navigated. Another study last year by researchers at the University of Zurich and Umeå University in Sweden similarly concluded that whilst automated navigation assistance helped people to navigate better, its use also 'negatively affects attention to environment properties, spatial knowledge acquisition, and retention of spatial information'. The point, of course, is that using technology in conjunction with human critical thinking is often better than technology alone, and definitely better than always blindly following what the machines or algorithms tell us to be true.

I think there's also something in these stories that speaks to the idea of plan continuation bias (or what pilots call 'get-there-itis'), which is the tendency to continue with an original course of action in spite of changing conditions, even when that plan is no longer viable. In the case of death by GPS this may be present in the feeling that the technology must be right and that we'll find a better road or a way out just around the next corner. After all, in death by GPS cases that don't end with driving off a cliff or a half-finished bridge, the situation tends to get worse slowly, but there's still the possibility that salvation is over the brow of the next hill.

Similarly, in business initiatives plan continuation bias can result from being emotionally invested in a way of solving a problem that we have believed to be right from the beginning. The longer we continue with the project, the more we want to be proved right and the harder it becomes to be proved wrong, especially when significant resources have already been invested in getting this far. A study by NASA of nine major U.S. airline accidents between 1990 and 2000, in which crew errors were thought to be among the probable causes, found that plan continuation bias amongst pilots and crew tends to get stronger the closer they are to reaching their destination. In other words, the closer the plane was to final approach and landing, the more likely the crew were to continue with a course of action even when elements of the situation were changing.

Similarly, in business it's easy to get emotionally invested in a particular solution or path. We don't want to look foolish. The deeper you get into a project, the higher the cost of change, both in terms of tangible sunk costs and the social cost involved. In extreme cases, even when a team or individual realises that the course of action they've taken could be wrong, feelings of entrapment can lead to even more commitment and investment in the failing course of action, in order to justify the amount of time and effort that has already been put in.

In a separate post Tim Harford relates the story of the Torrey Canyon disaster in 1967. At 6.30 in the morning on March 18th, the First Officer realised that the ship, sailing past the tip of Cornwall with 119,000 tonnes of crude oil, was further east than expected and, instead of passing west of the Scilly Isles, was actually heading through the treacherous channel between Cornwall and the islands. He changed course, but when the Captain, Pastrengo Rugiati, was woken he overrode the order, deciding that it was better to carry on than risk losing time. This, combined with a number of other smaller factors, resulted in the ship running aground and one of the worst oil spills in history.

In my second book I describe the mindset challenges that can get in the way of shifting from entrenched, linear, waterfall processes and ways of working to truly adaptive, agile ways of generating value. One of the key challenges is the potential for plan continuation bias in linear processes: the deeper you get into a course of action, the easier it is to become path-dependent, building up risk and shutting down the potential for adaptability.


Truly agile processes celebrate change, even late in a process, but they also celebrate learning. Simple human biases have the potential to totally derail projects and initiatives, but so does blindly following algorithms and machine-driven outputs. It's so important for teams to regularly check in with each other; to have the humility to recognise when they have gaps in knowledge; but also to have the will and the critical thinking to question and challenge, and the openness to explore, look around them and adapt where necessary. When so much of what we do is informed by data and algorithms, this is more important than ever.

Photo by Markus Spiske on Unsplash