‘Gigantomania’ and why big projects fail

I’m looking forward to reading Bent Flyvbjerg’s new book ‘How Big Things Get Done’. Bent, an academic at Oxford, has compiled a database of over 16,000 sizeable projects (including infrastructure and systems investments) which reveals that only 8.5% of projects deliver to their original cost and time projections, and only 0.5% deliver to their initial cost, time and benefit forecasts. The reasons, argues Bent, are the many psychological and political biases baked into those projections. An over-reliance on intuition rather than data, the planning fallacy and rushed planning all combine with what he calls ‘strategic misrepresentation’ – the deliberate under-estimating of costs simply to get a project signed off and started. The sunk-cost fallacy then means that once they have begun, large-budget projects are much less likely to be stopped. And as delays to large projects set in, the risk of things going wrong and overspends growing even larger only increases.

You see this not just in government projects but also in significant investments by large companies, particularly where there is a preference for what you might call ‘gigantomania’ – centrally-controlled grand projects of significant scale, often notable for also wanting to use the latest technology. In these environments it becomes harder to argue for a more measured, incremental approach that allows better learning to take place. I wrote about ‘gigantomania’ in my latest book on agile marketing, in the context of safe-to-fail experimentation. Here’s one of my favourite stories related to it (a short extract taken from the book):

‘In the late 19th and early 20th centuries the engineer Peter Palchinsky played an important role in introducing more scientific methodology to Russian industry. He believed that Russian engineers were poorly equipped to deal with challenges in a competitive world because, rather than take an adaptive, scientific approach to generating solutions, they viewed every problem as a technical one and assumed that if a proposed solution used the latest science and technology it was automatically the best solution. He had observed a number of huge Soviet infrastructure projects go disastrously wrong (like the introduction of large irrigation systems which contributed to turning part of the Aral Sea into a desert). In response he proposed three simple rules for industrial design that would enable greater adaptability whilst mitigating risk:

  1. Variation: seek out ideas and try new things
  2. Survivability: where trying something new, do it on a scale where failure is survivable
  3. Selection: seek out feedback and learn from your mistakes as you go along

A good example of Palchinsky’s thinking was his reaction to seeing the plans for a huge new hydroelectric dam on the Dnieper River. He recommended that instead of one large dam, a series of smaller dams and coal-fired power plants be built to solve the area’s energy problem. This would enable the engineers to learn from the dams built first and to adapt plant and dam designs to best serve the need for electricity, whilst also providing water and power control closer to the point of use and limiting the floodplain. The power and electrification requirements of the local area would survive the abandonment of a single small dam. His views were ignored and the huge dam was built near the southern city of Zaporizhzhya (in modern-day Ukraine). As the project proceeded it fell behind schedule and vastly exceeded projected costs. Thousands of farmers were forced to abandon their farmland with little or no compensation, and many were made to work on the project in terrible conditions. In 1941, as Nazi German troops swept through Soviet Ukraine, Stalin’s secret police blew up the Dnieper Dam, flooding many villages on the banks of the river and killing thousands of civilians.

Unfortunately, Palchinsky and his principles fell increasingly foul of Stalin’s preference for huge projects controlled centrally from Moscow, with little regard for local conditions and with safety sacrificed on the altar of output. Stalin believed that all Soviet industrial and engineering projects should be great in size, and perhaps even the biggest in the world (a philosophy which came to be known as ‘gigantomania’ by Western observers). This approach led to high accident rates and often to poor-quality production.

Eventually Palchinsky was arrested and, sadly, executed in 1929. Yet his principles live on as a reminder of the dangers of ‘gigantomania’. Variation, survivability and selection. Always seek out and try new things. Always try them on a scale where failure is survivable. Always seek feedback and learn from your failures as you go along.’

Bent Flyvbjerg notes that the risk of massive budget and time overruns can be mitigated by rigorous planning, modularisation and standardisation: in other words, spending time at the beginning to get the plan right, but then breaking a big project down into smaller component parts which can be more easily put together, reprioritised or amended. It seems we have often yet to learn the lessons that Peter Palchinsky championed all those years ago.

Image: Нет данных (author unknown), Public Domain
