Originally published January 21, 2015.

In this article, research consultant Nicolas Gunkel takes a closer look at the World Bank's 2015 World Development Report and explores the hidden biases of development professionals. He outlines how these biases can compromise the quality of development work and suggests strategies to counteract them.

Behavioral economics is en vogue right now, but the behavioral economics of poverty is even sexier. A visit to your local bookshop will easily prove this point; in fact, the evidence will be lined up nicely on one bookshelf. Scan the front covers and you’ll see a plethora of best sellers. In all likelihood, there will be several copies of Thaler and Sunstein’s “Nudge: Improving Decisions about Health, Wealth and Happiness”; next to that you’ll find Duflo and Banerjee’s “Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty” and Ariely’s “Predictably Irrational: The Hidden Forces That Shape Our Decisions”.

Together, these researchers fill the big footsteps left behind by economics Nobel Prize winner Daniel Kahneman, whose work on human psychology and decision-making made significant strides in debunking the myth of human rationality by exposing the heuristics and biases that structure our thinking. Recent publications have focused increasingly on the implications these mental models have for people with restricted economic resources. An individual’s cognitive ability is, for instance, shown to be severely constrained in contexts of scarcity (the ‘bandwidth tax’), resulting in a pronounced short-term focus that may lead affected people to forgo crucial health or education investments.

Academic jargon aside, people with limited financial means experience emotional stress that affects their decision-making. Rather than focusing on the big picture and what might increase the quality of their lives in the long term, their decision-making is (understandably) dominated by things that will bring them immediate relief or distraction.

Mind, Society and Behavior

While lately a lot of buzz has surrounded research on how best to help people escape poverty traps by ‘nudging’ them towards sounder economic decisions, a similarly critical examination of the decision-making biases and shortcuts of development professionals has lagged behind. In light of this, the widely anticipated World Development Report 2015 (WDR 2015), published by the World Bank Group, holds up the mirror in a surprisingly self-critical way to those who, as policy makers, consultants or development officials, develop programs with the objective of alleviating the strains of poverty.

Early in Chapter Ten of the report, which carries the overall title “Mind, Society and Behavior”, the authors deliver their main message, albeit in clinical, abstract language, presumably to remain as diplomatic as possible despite the scathing critique it implies. It reads:

“[…] development professionals are susceptible to a host of cognitive biases, are influenced by their social tendencies and social environments, and use deeply ingrained mindsets when making choices.” (WDR 2015, p.181)

The punchline is clear: development professionals are no better than anyone else, rich or poor, at guarding themselves against subtle influences on their decisions. They too follow social imperatives created by their surroundings, develop groupthink, fall prey to lethargy and are reluctant to admit failure. The authors dig deeper into four common behavioral biases and replicate famous experiments from the literature with their own headquarters and country office staff. In some cases they compare their results to the answers given by people “in the bottom, middle and top thirds of the wealth distribution in the capital cities of selected developing countries” (p. 188). Though the methodology of these experiments remains partly obscure (no confidence intervals are reported, no sample sizes are given for the country samples, non-response bias is not convincingly ruled out, etc.), they still merit attention, because they point in the direction that development practitioners are far from infallible.

The Confirmation Bias

One of the prejudices analyzed in depth in the report is the ‘confirmation bias’: the tendency to read any new information as supporting one’s previously held beliefs, even when the evidence shows the contrary. In a slight modification of a popular study by Kahan et al. (2013), the authors confront their sample of World Bank professionals with two sets of identical numerical data, one describing the effectiveness of a skin cream at eliminating a rash, the other presenting evidence on the effect of minimum wages on the reduction of poverty. While around 65% of the staffers interpreted the numbers in the skin cream frame correctly, only around 45% of the respondents were right when the same numbers were shown embedded in the minimum wage frame. The authors go on to establish a relationship between interpretation accuracy and other survey responses on each respondent’s attitude towards income equality. They demonstrate that World Bank professionals who support income equality were much less accurate in their interpretations when the figures indicated that minimum wages raise poverty levels (‘ignoring evidence contrary to their ideology and seeing confirmation of their beliefs in the data’) than when the numbers showed that minimum wages decrease poverty levels.
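The arithmetic behind this kind of task is worth spelling out. In Kahan-style experiments, respondents see a 2x2 table of outcomes and must judge whether the treatment worked; the trap is comparing raw counts across groups instead of improvement rates within each group. A minimal sketch with hypothetical counts (not the figures used in the report or in Kahan et al.):

```python
# Illustrative 2x2 interpretation task of the kind used in Kahan-style
# studies. The counts below are hypothetical, chosen only to show the trap.
treated = {"improved": 200, "worsened": 100}  # e.g. used the skin cream
control = {"improved": 80, "worsened": 20}    # e.g. did not use it

def improvement_rate(group):
    """Share of a group that improved: improved / (improved + worsened)."""
    return group["improved"] / (group["improved"] + group["worsened"])

# The intuitive (wrong) reading compares raw counts: 200 > 80, so the
# treatment "looks" effective. The correct reading compares rates.
rate_treated = improvement_rate(treated)  # 200/300 ≈ 0.67
rate_control = improvement_rate(control)  # 80/100 = 0.80

print(f"treated rate: {rate_treated:.2f}, control rate: {rate_control:.2f}")
print("treatment effective?", rate_treated > rate_control)  # prints False
```

With these numbers the treated group improves less often than the control group, so the correct answer runs against the first impression given by the raw counts; that tension between intuition and ratio is exactly what lets ideology tip the interpretation in the minimum wage frame.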

What can be done to prevent one’s pre-existing conceptions from clouding immediate judgment? This is a question worth asking not only in the context of development work. The report suggests that by “expos[ing] people to opposing views and invit[ing] them to defend their own” (p. 183), people with a shared interest in the truth of the matter may tame the reflex to fall back on their prior notions. Beyond that, simply asking oneself how easy it was to find information in support of one’s own perspective may already trigger a process of revalidation. The next step could be to play through the counterarguments to one’s personal view - like a good lawyer does - and try to find support for them.

The Sunk Cost Bias

[Chart: World Bank staff responses on committing further funds to a failing project, for themselves and for their colleagues]

Another bias covered in the report has probably been experienced by anyone who has invested time, money or affection into something dear to their heart. It is called the ‘sunk cost bias’, and it describes people’s escalating commitment to a course of action once an initial investment has been made, even if that course proves unsuccessful. Such a situation may arise for a lover who cannot get over the unrequited love he has pursued for so long; a broker who does not sell her stocks because of all the time and effort spent building up the portfolio; or a development professional whose decision about continuing or stopping a project is largely contingent on the money already spent on it. The last scenario is borne out by the World Bank’s internal survey data. Interestingly, as the chart indicates, World Bank staffers reported that their colleagues were generally more prone than themselves to commit funds to a failing project. This can be read, as the authors do, as an indication of an entrenched social norm against canceling projects, or as a consistent attempt by the respondents to portray themselves as more strictly rational and calculating than the rest.
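The underlying decision logic is simple enough to write down. Money already spent is the same whether the project continues or stops, so a rational continue/stop rule compares only what the project will still cost against what it will still deliver. A minimal sketch with hypothetical figures (not numbers from the WDR 2015 survey):

```python
# Why sunk costs should not enter the continue/stop decision.
# All figures are hypothetical, for illustration only.
def should_continue(remaining_cost, expected_benefit):
    """Forward-looking rule: continue only if what the project will still
    deliver exceeds what it will still cost. Sunk spending is absent on
    purpose: it is identical under both choices and cancels out."""
    return expected_benefit > remaining_cost

# A project has already absorbed 5.0 (sunk), needs 2.0 more to finish,
# and is now expected to deliver benefits worth only 1.5.
sunk = 5.0  # deliberately unused by the decision rule
print(should_continue(remaining_cost=2.0, expected_benefit=1.5))  # prints False
```

The bias the report describes is precisely the pull to let `sunk` back into that comparison: the larger the amount already spent, the harder it feels to accept that the rule says stop.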

How can active resistance to the ‘sunk cost bias’ be encouraged? Organizations may reward rather than punish employees who have the courage to speak up when they detect underperformance, as long as they make constructive suggestions about how to improve the situation. Furthermore, people may feel emboldened to pull the plug if there is a general organizational acknowledgement that failure, experimentation and risk-taking are inextricably linked to bold ideas in a high-volatility environment such as development. Clear guidance on which administrative steps to take when a project is cancelled may also facilitate the process.

Closing the Gap

Finally, the most damning conclusion the report draws, and one its own survey results with World Bank professionals seem to corroborate, is that development policy makers too often lead an existence entirely parallel to that of the people they set out to help. The report explicitly states:

“Development policy makers and professionals usually are not familiar with the mental models and mindsets that poor people use. Policy makers are likely to live in different places from the poor, to send their children to different schools, to receive medical treatment in different hospitals, to travel on different modes of transport, and to have much stronger incentives to socialize with and listen to those who are more likely to be able to support their policy agenda and political career.” (WDR 2015, p.187)

[Chart: predicted and actual agreement with the statement ‘What happens to me in the future mostly depends on me’ in Jakarta, Nairobi and Lima]

As a case in point, the authors refer to their survey data on World Bank staff predictions about how strongly people in Jakarta, Nairobi and Lima agree with the statement: ‘What happens to me in the future mostly depends on me’. Compared to the actual responses in these cities, the professionals greatly underestimated how much of the population, and particularly its poorest third, believes itself to be in control of its fate. Curiously, in these samples World Bank staff come out on average as the group least convinced that they can shape their own future (cf. chart).

Breaking down this divide between mental models may begin with learning to live with and among the people one serves. As I have previously written, cultural immersion is essential for gaining a richer and truer understanding of other people. It fosters knowledge, trust and empathy – invaluable not only for fighting entrenched mental categories, but also for finding things that unite rather than separate us.

Though it appears considerably less sexy to work on your own decision traps and mental models, the end result may be more beneficial to the people you work for than endlessly agonizing over the intricacies of their cognition. By identifying your common biases as a development professional, you can also take concrete steps to deconstruct them and, in doing so, improve the quality of your work and, ultimately, the quality of lives.


Nic Gunkel

Nicolas Gunkel is a research consultant who has conducted operational and ethnographic research on malnutrition. He tweets about development policy, international politics and law, as well as art history.