Remember all the fuss over those inauguration photos? On the day that Donald Trump was officially sworn in as US president, the National Park Service tweeted photos comparing the crowd size at Trump’s inauguration to the larger crowd at former president Barack Obama’s 2009 inauguration. Trump had boasted that he’d get an “unbelievable, perhaps record-setting turnout” – but he didn’t.
Over the next few days, White House Press Secretary Sean Spicer falsely claimed that Trump’s crowd was “the largest audience to ever witness an inauguration, period”. What happened next caught the interest of Daniel Effron, Associate Professor of Organisational Behaviour at London Business School. At some point, Trump’s people stopped defending the literal truth of that claim and started implying that it would have been true, if only something had been different – for example, the inauguration would have been bigger if the weather had been nicer.
“I found it very interesting that Trump’s spokespeople were making these statements about how the inauguration could have been bigger,” says Dr Effron. “This type of statement doesn’t change the literal truth of a false claim but, psychologically, it may bring the claim a little closer to the truth. Falsehoods that feel close to reality may be perceived as less dishonest when people imagine how they could have been true in alternative circumstances.”
When a claim is shown to be false, you can try to defend its truth – you can say, for instance, that Trump’s inauguration actually was bigger and that there’s simply a media conspiracy against him. But Dr Effron wanted to investigate how ethical people think it is to make a statement that they know to be false. He reasoned that inviting people to imagine scenarios in which a falsehood could have been true would make them see the falsehood as less ethically problematic. So falsely claiming that Trump’s inauguration was the largest in history might seem less morally wrong after you hear someone suggest that nicer weather could have increased the turnout.
Trump of course isn’t the first politician to tell falsehoods. “As political theorist Hannah Arendt put it, no-one has ever doubted that truth and politics are on rather bad terms with each other,” says Dr Effron. “In the US, presidential candidates from both sides of the aisle made all sorts of statements that were roundly debunked.”
So did their supporters. A Time magazine reporter claimed that Trump had moved the bust of Martin Luther King Jr out of the Oval Office but it turned out the reporter just hadn’t seen it. Dr Effron wondered whether imagining how a falsehood could have been true would lead people to let politicians across the political spectrum off the hook for telling it.
The findings from three experiments confirmed his hunch: when people were told to imagine how a falsehood could have been true, they judged telling it as less unethical – even though they still recognised that it was false.
However, the effect only occurred if the falsehood aligned with a person’s political views. Imagining how Trump’s inauguration could have been bigger only affected Trump supporters, not Clinton supporters. But that’s not because Clinton supporters are immune to the phenomenon: when a falsehood made Clinton look good or Trump look bad, Clinton supporters similarly judged it to be less problematic.
Consider the false claim that Trump removed Martin Luther King’s bust. “If you’re a Clinton supporter and I ask you to consider whether Trump would have moved the bust if he could have got away with it, you might think, ‘Yes, I can see that.’ Then even though you know Trump didn’t really move the bust, you think it’s not so bad to falsely say he did because you can now imagine it happening,” Dr Effron says. “If you’re a Trump supporter and I tell you the same thing, you might think, ‘There’s no way Trump would have moved the bust, even if he could have got away with it.’ You’ll continue to think it’s pretty bad to falsely say he did because you can’t easily imagine it.”
Why does this matter? Firstly, because once you’ve imagined how something could have been true, you hold the person uttering the statement to laxer ethical standards. “To reduce the moral condemnation you receive for telling a falsehood, you don’t need to convince your supporters that what you said was actually true, you just need to invite them to imagine that it could have been true.”
Secondly, this act of imagining increases political polarisation, and it’s hard to get anything done as a country if you can’t agree on basic moral judgements. Those judgements are already politically polarised – people who oppose a politician judge them more harshly for telling a lie than people who support that politician do.
Dr Effron’s research shows that imagining how a lie could have been true deepens this political polarisation: inviting people to consider how a falsehood could have been true widens the gap between supporters’ and opponents’ moral judgements. “It’s hard enough to talk across a division that’s purely political; it’s harder to talk across a moral division,” says Dr Effron. “Anything that increases differences in how moral we think people are makes me worried because it makes it a lot harder to work together.”
In Dr Effron’s experiments, he showed 2,783 people some false statements made in the context of the 2016 US election (on both sides of the political divide) such as, “No Trump products are made in the US” (falsely claimed by Hillary Clinton) and “Trump won the popular vote” (falsely claimed by Trump voters). Study participants were told that each statement was false and were then asked how unethical it was to make the statement. Half the participants were asked to reflect on a specific way in which each falsehood could have been true – for instance, “Trump would have made his products outside the US if he could have done so more cheaply.”
People asked to imagine how the falsehoods could have been true rated making the claims as less unethical. This effect only occurred reliably when people were judging falsehoods that aligned with their political beliefs.
Because the counterfactual only reliably affected people when they judged falsehoods that fit their political views, it increased political polarisation: imagining how a falsehood could have been true strengthened the tendency to judge the falsehoods people liked as less unethical than the falsehoods they didn’t.
When Trump recently retweeted inflammatory anti-Muslim videos posted by Britain First, a far-right extremist group, people wanted to know if they were fabricated. Sarah Huckabee Sanders, the current US press secretary, said that it didn’t matter. What was important, in her view, was if they pointed to a larger problem that was true.
“Whether it’s a real video, the threat is real,” she told reporters. In essence, she seemed to be asserting that posting the video was not problematic because it could have shown real events, even if it didn’t. Dr Effron’s research suggests this assertion would be most persuasive to Trump supporters.
We know that people attend more to evidence that supports what they want to believe than to evidence that doesn’t – it’s called confirmation bias. “What I find so interesting about events that could have been true is that, by definition, they didn’t actually happen,” says Dr Effron. “If people can justify coming to their moral conclusions simply by imagining what they want to imagine, that’s pretty scary.”
It also suggests that when public figures are caught saying something that’s false, their supporters set a pretty low moral standard for what it takes to let the person who tells the lie off the hook. Dr Effron hopes his research might nudge people to look more closely at our tendency to excuse those leaders we like.
“We should be wary of our ability to imagine whatever we want. Imagination is a powerful human capacity. It allows us to do all sorts of great things – to write novels, invent things, learn from our mistakes – but this research suggests the dark side of imagining how falsehoods could have been true. It allows us to hold the people we admire to lower moral standards than the people we don’t.” It also suggests people are pretty easy to manipulate.

This phenomenon applies equally in business, says Dr Effron. “Leaders often make statements that are demonstrably false. The research suggests that if we try to imagine how these statements could have been true, we’re more likely to excuse the leaders – but only if we like their lies.”
“People can usually point to ways their lies could have been true, which raises the concern that we can easily justify our own dishonesty – ‘Okay, we fudged the numbers a little bit, but those numbers would have been accurate if the economy hadn’t tanked, or if that client hadn’t reneged on their promise at the last minute.’” Next time your boss makes a comment like that, you might want to check the facts.
If you’d like to read more on this from Daniel Effron, read his article in The New York Times.