The Pessimistic Optimist's Guide to Change

David Lewis and Keith Harper report on lessons from a uniquely successful change programme at one of the UK’s best-known boroughs.

The Kensington Programme was a four-year change programme designed to create a radically different working environment, reduce office accommodation costs and increase staff productivity at the Royal Borough of Kensington and Chelsea.  With a population of 158,000, the borough is one of the most densely populated in the UK and includes some of London’s most famous buildings (the Albert Hall, the Natural History Museum et al) and areas (including Portobello Road and Kensington High Street).

 The programme involved changing: 
• How people work 
• Where people work
• Business processes 
• Information technology and information management 
• Office layout 
• Mechanical and electrical infrastructure

All of this may sound familiar.  Countless other change programmes have set off with similar objectives.  The results were more unusual.

The planned programme budget was £24 million; the final cost was £23 million. The planned one-off savings were £6.1 million; the actual one-off savings were £7.6 million. The post programme planned annual savings were £2.65 million; the actual post programme annual savings were £3.8 million.

We were deeply involved in the Kensington Programme, one as programme manager (Keith Harper) and the other (David Lewis) as a programme consultant.  For us, it was a powerful experience and one which, we believe, provides vital lessons in how to make change programmes work. The importance of this is made clear by the daunting and depressing fact that the 70 per cent failure rate in change programmes has been constant for years. 

The headline from our experience with the Kensington Programme is that it pays to be a pessimist and an optimist -- a pessimist to construct a realistic cost budget; an optimist to maximise the benefits. This needs to be a deliberate approach. Without conscious effort, planning assumptions and creative ideas are limited by the lazy ways our minds tend to work. Success is a matter of psychology.

The planning fallacy

Daniel Kahneman, psychologist and Nobel Prize winner, in his seminal book, Thinking, Fast and Slow, explains our tendency to be over-optimistic about how much time and effort it will take to achieve a task.  We make our estimates based on a best-case scenario, one in which everything goes according to plan and the plan is right.  This is the planning fallacy.

Behind this fallacy are two mental biases: the overconfidence bias and the not invented here bias.  Psychologists refer to the shortcuts our minds take to save mental effort as mental biases.  We all have them. We would not have evolved into the highly successful species that we are without them.  We would still be waiting for conclusive evidence that the newly invented wheel was safe to use.

Biases have evolved to conserve the precious resources of the brain.  They are mechanisms our brains use to predict what will happen, or understand what is happening, without having to examine all the facts and go back to first principles.  In our daily business they normally serve us well.  But when we are faced with new situations, such as change programmes, the shortcuts don’t work.

With respect to the planning fallacy, the overconfidence bias is the belief that our estimates are better than they actually are. This bias is demonstrated in the 90 per cent confidence interval test. The test invites you to give a range within which you are 90 per cent confident the answer to a question lies.

Joshua Lewis, a researcher at Warwick University, applied the test to investment bankers. He asked them to give their 90 per cent confidence interval for how many times a particular swear word is used in the film The Wolf of Wall Street. Less than 25 per cent of participants gave a range that included the answer (which was 506, almost three times a minute).  This experiment has been repeated many times with different questions and similar results.  We all have a tendency to think our estimates are better than they are.

The not invented here bias, or the we are different/better bias, can occur in two ways: by failing to ask whether anybody has done this, or something similar, before, and if so, what we can learn from them; or by assuming that we will learn little from others because they are not as good as us and our situation is special. Consequently, our estimates of the effort and time required to achieve our goals are much less well informed.

The planning fallacy at work

The planning fallacy means we underestimate the costs of the programme.  We then build a business case based on the underestimated costs, with a benefits side just sufficient to produce the required return on investment (ROI) relative to that understated cost.   The problem comes about halfway through the programme, when most of the budget has been spent and more money is needed to complete it.  Frequently, this is double or triple the original budget.

The situation is compounded by another bias, the sunk cost bias or loss aversion bias. The argument goes that if we don’t increase the budget we will have wasted the money already spent and we will not get the benefits.  Decision makers, feeling they have a gun to their head, increase the budget without a corresponding increase in benefits, thus undermining the business case. The business case collapses but the programme continues.

The sunk cost bias blinds us to the realisation that the decision on whether to spend the additional money should not be based solely on how much has been spent -- it has gone, it is sunk -- but on the benefits that could accrue from spending the extra money on something else.  If those alternative benefits outweigh the benefits of completing the programme, then don’t increase the budget. Rescue any value you can and move on.

In parallel to an assessment of alternative uses for the additional money, we should also consider how to increase the benefits of the existing programme.  We should ask, if I increase the budget how could I increase the benefits to maintain or even improve the existing programme’s ROI? However, the problem in doing this, at this late stage, is that many things are set in stone, significantly reducing the options for new benefits. 
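The arithmetic behind this question can be sketched with hypothetical numbers (the figures and the simple ROI formula below are illustrative assumptions, not the programme's actual business case):

```python
# Illustrative sketch (hypothetical figures, in £ millions): why a
# mid-programme budget increase without new benefits undermines the
# business case, and what the informed optimist must ask instead.

def roi(benefits: float, costs: float) -> float:
    """Simple return on investment: net benefit as a fraction of cost."""
    return (benefits - costs) / costs

# Original business case: benefits sized just to clear the required ROI.
original = roi(benefits=15.0, costs=12.0)      # 0.25, i.e. 25 per cent

# Budget doubles halfway through, benefits unchanged: ROI turns negative.
overrun = roi(benefits=15.0, costs=24.0)       # -0.375

# The informed optimist's question: what benefits would the larger
# budget need to deliver to restore the original ROI?
required_benefits = 24.0 * (1 + original)      # 30.0
```

Under these assumed numbers, doubling the budget means the benefits side must double too just to keep the original business case intact, which is exactly why options set in stone late in a programme are so costly.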

The answer to the planning fallacy and the way to avoid the messy situation above is to become a pessimistic optimist: to adopt an informed pessimist’s approach to the cost side of the business case, and an informed optimist’s approach to the benefits side of the business case, and to do this from the start and throughout the programme.

The informed pessimist

The advantage of being an informed pessimist when constructing the costs of a business case is that it forces a much more creative and far reaching approach to identifying benefits, in order to achieve the required ROI. 

The word informed is important.  An informed pessimist avoids the best-case scenario trap by assuming the worst case and, at the same time, contains the extent of the worst case by drawing on information from their own experience and the experience of others, thus mitigating the impact of both the overconfidence bias and the not invented here bias.

The result is a realistic assessment of the time and effort required to deliver the programme. This, in turn, forces a more thorough approach to maximising the potential benefits to achieve an appropriate ROI.

The informed optimist

Change programmes have many enemies and naysayers. This has been true since time immemorial.  In the face of such resistance only an optimist would seek to drive the change even deeper in the search for more benefits. But that is exactly what we need to do.

Optimists have more luck.  Psychologist Richard Wiseman demonstrated this in a simple experiment. He asked volunteers to define themselves as either lucky or unlucky. He then asked them to read through a newspaper and count how many photos were in it. The people who claimed to be lucky took on average a few seconds to accomplish the task.  Those claiming to be unlucky took on average two minutes. So what was the difference?

On the second page of the newspaper a large message read: “Stop counting, there are 43 photos in this newspaper”.  The unlucky failed to see the notice and continued counting. The lucky people saw it, thanked their good luck, and stopped looking. As a further test, Wiseman put another notice halfway through the paper:  “Stop counting and tell the experimenter that you’ve seen this notice and collect £250.” Again the majority of the unlucky participants failed to spot the notice.

Based on his research, Wiseman developed advice on how to adopt the mind-set of the lucky.


He identifies four attitudes and behaviours that can be learned:

• Create and notice chance opportunities
• Make lucky decisions by listening to your intuition
• Create self-fulfilling prophecies via positive expectations
• Adopt a resilient attitude that transforms bad luck into good

Lucky people adopt a less focused and more relaxed approach. This enables them to notice things beyond what others seem to be fixated on or worrying about. They see things that others don’t and thereby create more scope for creativity. It is this creativity that is the essential ingredient in maximising the potential for benefits.

Kensington planning fallacy application

The planning fallacy raised its ugly head from the beginning of the Kensington Programme. The programme manager was under pressure to construct a budget based on a two-year-old estimate of the cost of the required mechanical and electrical refurbishment, plus a little extra to cover all other aspects of the programme -- IT, information management, new ways of working, design work, organisational and process changes. This amounted to a budget of £12 million. At this juncture the programme manager could easily have created a business case based on savings on energy bills and more efficient use of space, taken the money and run.

But he was in a pessimistic mood. His mind was filled with all the things that could go wrong, the resistance from staff and all the opportunities that would be missed. He also adopted an approach best described by Pablo Picasso: “Good artists imitate; great artists steal.”

He systematically approached similar organisations that had gone through one or more aspects of the planned programme. He took not only the programme team, to learn (and steal) from others, but also groups of staff, so they could talk to and engage with people going through similar changes.

It is worth recognising some of the reactions from senior people in the organisation to this “pessimistic” attitude. For some, it was hard to understand why somebody who had been given a considerable sum of money, to undertake an important and beneficial programme, refused to accept it and instead kept coming back with problems and demands for more money. In many organisations this frustrated and disappointed reaction from senior figures is enough to put an end to discussion.

Fortunately, prior to the programme, an enlightened advocate, Tim Ellis, from within the organisation had been educating senior staff on the role of programme sponsorship and the benefits of the MSP (Managing Successful Programmes) approach. The iterative process for building the business case, encapsulated in the identification and definition phase of MSP, reinforces and gives legitimacy to the informed pessimist approach. This is an example of where a good method combined with the right mind-set is effective.

At Kensington, the result was the development of an informed evidence-based budget of £24 million.  This was double the initial guesstimate. The important point is that while the budget doubled, none of it had been spent.  And nothing was as yet set in stone, thus leaving the door open for the creative process of maximising the benefits side of the business case. 

Biases at work

The biggest barrier to identifying benefits in change programmes is not resistance to change. It is the failure to recognise assumptions, challenge assumptions and question established practice.  These failings are human. There are reasons why we find it difficult to see things differently to the way we currently see them. Again, it is our mental biases that hold us back.  In this case the confirmation bias and the functional fixedness bias. 

The confirmation bias describes how, once we have decided what is right or how things should be done, we fail to notice or ignore any evidence that undermines our view. We have evolved this tendency to avoid the mental effort that would be involved in constantly challenging everything we think. Without this bias nothing much would ever happen. 

The functional fixedness bias describes our inability to see alternative uses for familiar objects. Again, if we were constantly challenging what the purpose and function of everything was we would never get on and use them.

Despite the value that biases provide, they are an impediment in times of change.  This is where the optimist approach comes in.  Optimists are not simply people walking around in blissful ignorance, assuming that everything will turn out for the best. Anyone adopting this attitude in life would soon be converted to pessimism faced with repeated blunders and failures. 

Being an optimist is both a frame of mind and a set of behaviours.  As discovered by Richard Wiseman, optimists create and notice chance opportunities and listen to their intuition. Noticing things and listening to your intuition are key behaviours for overcoming the confirmation bias and the functional fixedness bias.

At Kensington we saw this very clearly. One of the major benefits delivered by the programme was achieved by replacing all desktop computers and desk phones with laptops and mobile phones.  When you see it written down it doesn’t sound that radical. Yet the process by which the idea was generated is instructive. 

For the IT specialists, the confirmation bias got in the way.  From their perspective, desktops were cheaper and easier to maintain.  Laptops were expensive, inclined to get stolen, lost or broken, and really only for use when you couldn’t get access to a desktop.  For staff, mobile phones were for use when away from the desk. They were not substitutes for desk phones. This is an example of functional fixedness. A mobile phone is for when you are on the move; a desk phone is for when you’re sitting at a desk. 

Occupied, as they should be, with delivering their day-to-day business, neither the IT specialists nor the staff thought to suggest eliminating desktops and desk phones. Why would they? When it was suggested, they responded through a lens shaped by their confirmation and functional fixedness biases. For IT, it was seen as a more expensive solution; for staff, it was seen as an inferior solution.

The ideas came from the programme team. They noticed things, followed their intuition, asked questions and challenged assumptions. What they noticed was that many members of staff had both a desktop and a laptop. In fact some had as many as three PC devices.  They also noticed that while there was a desk phone on each desk many people also had mobile phones. So they asked, what would happen if we took away all desktops and the desk phones and equipped each member of staff with a laptop and a mobile? 

They also noticed that people behave differently at work to how they behave at home. People at home spent their evenings on mobile phones talking to family and friends and using laptops or tablets for social networking and entertainment. They did not sit at a desk with a desktop and desk phone. 

By noticing these things and asking questions, new insights, answers and data emerged. Since many people had more than one PC device, equipping each person with a single laptop reduced costs. Furthermore, it enabled people to work not just at desks but in other work settings, in the office and outside, increasing flexibility and reducing costs further. The IT team, initially sceptical of the switch to mobile phones, discovered through asking questions that the mobile phone company offered a system enabling free calls within office locations and the surrounding area. By questioning assumptions you discover things. It takes the informed optimist’s approach to challenge assumptions and ask the questions that lead to new benefits.

Diversity and persistence

It is hard to overstate how important it is to engage a team of optimists with diverse experiences and thinking styles in order to maximise benefits.   The programme construct provides just this opportunity.  It is by definition a multidisciplinary team.  If led by a person who is able to adopt an optimistic approach and encourage the same in others, great questions and great ideas will emerge.

It is said that a business case is a living document. Yet, it is often ignored. The informed optimist never stops looking for ways to improve the business case by repeatedly asking the same four questions:

• How can I reduce the costs?
• How can I increase the benefits without increasing the costs?
• Are there new benefits possible that justify increasing the budget?
• Are there benefits we don’t need any more that save costs?

The 70 per cent failure rate in change programmes has been a constant for many years. When we see a repeating negative pattern as an outcome of human activity we need to look to our understanding of the human mind to design solutions.  It is not enough to develop sophisticated programme methodologies. While these are important, without recognising and working effectively with the way people think they will have little impact.

David Lewis is Programme Director for Executive Education at London Business School.

Keith Harper is Programme Manager at the Royal Borough of Kensington and Chelsea. He was the Programme Manager of the Kensington Programme.
