August 21, 2018

Too difficult, too disruptive, and too slow?

Evaluating the impact of humanitarian interventions is often dismissed as an impractical luxury. Lessons from seven evaluations in different contexts prove otherwise.

4 min read

More than 200 million people across the world are in urgent need of humanitarian assistance today. In 2017, UN-coordinated appeals reported a funding shortfall of 41 percent, despite receiving a record amount of funding. As the demands on these limited funds grow, so does the need for high-quality evidence on the most effective ways to improve humanitarian programming.

Impact evaluations contribute to high-quality evidence on what works, how, and why, by examining causal links between interventions and outcomes. Answering these questions of attribution can be extremely challenging in humanitarian settings; a 3ie scoping paper highlights some areas where evidence is lacking.

An impact evaluation is often dismissed as an impractical luxury: too difficult to conduct, too disruptive to ongoing interventions and teams on the ground, and too slow to yield practical insights. A frequently raised question is, ‘How can you set aside funds for evaluations when people’s lives are at stake?’ But when the stakes are so high, it is all the more important to have evidence on the effectiveness of what we are doing.



3ie’s Humanitarian Assistance Evidence Programme, with support from Danida, DFID, UNOCHA, USAID and WFP, set out to address evidence gaps in the humanitarian sector by supporting the production of rigorous, high-quality impact evaluations related to food security, multi-sectoral humanitarian programming and interventions targeting malnutrition.

The seven impact evaluations under this programme have been conducted in Chad, Democratic Republic of Congo (DRC), Mali, Niger, Pakistan, Sudan and Uganda. They were undertaken in close consultation with implementing agency partners and employ a range of innovative research methods.


One of the main lessons from the 3ie-funded studies is that solutions are often at hand, even for the most unanticipated challenges that arise while conducting impact evaluations in humanitarian contexts.

1. Why are impact evaluations considered too difficult in humanitarian settings?

Getting buy-in from programme staff for an impact evaluation can be an uphill battle. A 3ie-supported study in DRC faced a long and at times difficult process of securing approval from the implementing agency. Significant staff turnover, agency scepticism towards the research approach, and a tense socio-political climate led to delays in starting the evaluation and posed significant risks to its completion.

In these difficult circumstances, involving programme staff in the design of the study and having open conversations about the advantages of a rigorous, independent evaluation can go a long way. In this particular case, the implementing agency not only eventually agreed to the suggested design, but also added its own resources to cover the costs of randomising additional households into the programme.
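For readers unfamiliar with what randomisation involves in practice, the sketch below shows one simple way households might be randomly assigned to a programme. It is purely illustrative: the household IDs, sample size, and fifty-fifty split are our own assumptions, not details of the DRC study.

```python
# Illustrative sketch of simple household randomisation (hypothetical;
# not the DRC study's actual procedure or data).
import random

random.seed(7)  # fixed seed so the assignment is reproducible and auditable

# Hypothetical identifiers for 200 eligible households.
household_ids = [f"HH-{i:04d}" for i in range(1, 201)]
random.shuffle(household_ids)

# Assign half to the programme (treatment) and half to a comparison group.
midpoint = len(household_ids) // 2
treatment = household_ids[:midpoint]
comparison = household_ids[midpoint:]

print(f"{len(treatment)} treatment households, {len(comparison)} comparison households")
```

Because every household has the same chance of being assigned to the programme, the two groups are comparable on average, which is what allows the evaluation to attribute differences in outcomes to the intervention.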


A female doctor with the International Medical Corps examines a young boy at a mobile health clinic in the village of Goza, near Dadu, in Pakistan’s Sindh province | Photo courtesy: 3ie

2. Are impact evaluations in humanitarian contexts too disruptive?

Another concern with impact evaluations is that they add to the workload of programme staff, since they typically involve engagement with an external, independent evaluator. However, a well-designed impact evaluation and an experienced research team can help streamline processes, resolve administrative data issues, and improve the quality of programme data and monitoring and evaluation (M&E) systems.

In Pakistan, the study team forged a close collaboration with the programme implementers to improve the M&E systems and to disseminate results on the effectiveness of the resilience programme to other actors and to the Pakistan government. Best practices in data management introduced by the researchers, such as spot checks and debriefing sessions, were passed on to other projects, and the agency now plans to include them in its regular protocols once testing is complete.


An independent evaluator can also highlight and help resolve implementation issues. The study in Uganda focused on strengthening supportive supervision of community health workers to improve community screening of malnourished children. This resulted in more children accessing care at the intervention health facilities and an increased workload for the health workers.

The study team maintained close communication with the staff to encourage them to provide high-quality care. They also supported inventory management at health centres and conducted refresher trainings and experience-sharing sessions for staff to improve case and data management. Finally, they worked within the existing system to improve data collection and monitoring at both treatment and control facilities.

3. Are impact evaluations too slow for humanitarian contexts?

Impact evaluations typically take two to three years to complete, which can be an unacceptably long time in a fast-changing humanitarian setting. Yet four studies in 3ie’s humanitarian evidence programme that examine the effectiveness of the World Food Programme’s nutrition interventions provided impact estimates in 14–18 months. Good preparatory work, early engagement, and innovative use of data and econometric tools ensured that these impact evaluations could be completed within such reduced timeframes.

Collecting data in a challenging humanitarian setting on a tight timeline is a difficult task. The study in Niger combined pre-existing administrative data with quantitative techniques to assess impact, and thus involved no additional data collection. Similarly, to shorten the timeline, the study in Sudan collected no baseline data, instead taking advantage of its flexible sequential roll-out design to assess impact, as the sketch below illustrates.
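To make the intuition behind that roll-out approach concrete, here is a minimal, hypothetical sketch of how a sequential roll-out can substitute for a baseline survey: households randomly assigned to a later phase serve as the comparison group for those treated earlier, so a single endline survey is enough to estimate impact. The simulated outcomes, sample sizes, and effect size below are our own assumptions, not the Sudan study’s data or method.

```python
# Hypothetical illustration of impact estimation under a sequential
# roll-out (simulated data; not the Sudan study's actual figures).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated endline outcomes (e.g. a food-security score) for 500
# early-phase (already treated) and 500 late-phase (not-yet-treated)
# households; we assume a true programme effect of +2 points.
treated = rng.normal(loc=52.0, scale=10.0, size=500)
comparison = rng.normal(loc=50.0, scale=10.0, size=500)

# Because phase assignment was random, a simple difference in endline
# means is an unbiased estimate of the programme's impact.
impact = treated.mean() - comparison.mean()
t_stat, p_value = stats.ttest_ind(treated, comparison)

print(f"Estimated impact: {impact:.2f} points (t = {t_stat:.2f}, p = {p_value:.3f})")
```

The design choice here is what saves time: since the late-phase group has not yet been treated at endline, no pre-intervention survey is needed to establish a counterfactual.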

A pitch for producing and synthesising evidence, sharing best practices and programme designs in humanitarian contexts

Impact evaluations in humanitarian contexts can be very challenging, but when well planned and implemented, they provide extremely valuable evidence for improving programme design and addressing implementation issues. They are also a key tool for assessing impact and making informed decisions about the expansion, redesign, or discontinuation of programmes.

Sharing successes, failures, and best practices can go a long way in strengthening the evidence base and eventually leading to more effective programmes. To the sceptics, we would respond: how can we afford not to improve the evidence base when people’s lives are at stake?

With inputs from Marie Gaarder, Durgadas Menon and Kanika Jha.

This article was originally published on 3ie’s blog Evidence Matters. You can read it here.

ABOUT THE AUTHORS
Tara Kaul

Tara Kaul reviews research and manages funded impact evaluation grants at 3ie. Tara is a development economist and holds a doctorate in Economics from the University of Maryland, USA. She has a master’s in Economics from the Delhi School of Economics and a bachelor’s in Economics from Lady Shri Ram College, University of Delhi. Tara’s current research focuses on the nutritional impact of food subsidies, and intra-household gender discrimination in educational expenditures in India.

Samidha Malhotra

Samidha provides research support for 3ie’s thematic grant programmes on Humanitarian Assistance and Agriculture Innovation. Prior to joining 3ie, Samidha worked at the Lal Bahadur Shastri Research Centre for Public Policy and Social Change. She has also worked with the Planning Commission of India, the Indian Council for Research on International Economic Relations, and the Institute for Human Development. Samidha has a bachelor’s in Economics from the University of Delhi and a master’s degree in Environmental Economics from the Madras School of Economics.
