‘How DFID Learns’. Or doesn’t. UK aid watchdog gives it a ‘poor’ (but the rest of us would probably do worse)

April 4, 2014

By Duncan Green

The UK Department for International Development’s independent watchdog, the Independent Commission for Aid Impact (ICAI), has a report out today on ‘how DFID learns’. Or doesn’t. Because the report is critical and gives DFID an overall ‘amber-red’ assessment, defined as ‘programme performs relatively poorly overall against ICAI’s criteria for effectiveness and value for money. Significant improvements should be made’.

I’m not gloating here – in my brief time working at DFID, I was struck by its investment in staff training and the general level of curiosity and intellectual enquiry. I suspect ICAI would be even more critical of NGOs and other aid organizations, so anyone working in research and development should probably spend a few minutes skimming the report.

Here’s the overall assessment:

‘DFID has allocated at least £1.2 billion for research, evaluation and personnel development (2011-15) [see graph]. It generates considerable volumes of information, much of which, such as funded research, is publicly available. DFID itself is less good at using it and building on experience so as to turn learning into action. DFID does not clearly identify how its investment in learning links to its performance and delivering better impact. DFID has the potential to be excellent at organisational learning if its best practices become common. DFID staff learn well as individuals. They are highly motivated and DFID provides opportunities and resources for them to learn. DFID is not yet, however, managing all the elements that contribute to how it learns as a single, integrated system. DFID does not review the costs, benefits and impact of learning. Insufficient priority is placed on learning during implementation. The emphasis on results can lead to a bias to the positive. Learning from both success and failure should be systematically encouraged.’

Recognize any of that? Thought so. And here are the report’s recommendations, some of which probably also sound pretty familiar:

1: DFID needs to focus on consistent and continuous organisational learning based on the experience of DFID, its partners and contractors and the measurement of its impact, in particular during the implementation phase of its activities.

2: All DFID managers should be held accountable for conducting continuous reviews from which lessons are drawn about what works and where impact is actually being achieved for intended beneficiaries.

3: All information commissioned and collected (such as annual reviews and evaluations) should be synthesised so that the relevant lessons are accessible and readily useable across the organisation. The focus must be on practical and easy-to-use information. Know-how should be valued as much as knowledge.

4: Staff need to be given more time to acquire experience in the field and share lessons about what works and does not work on the ground.

5: DFID needs to continue to encourage a culture of free and full communication about what does and does not work. Staff should be encouraged always to base their decisions on evidence, without any bias to the positive.

Some other interesting extracts from the main 40-page report:

Staff turnover: ‘Staff are continuously leaving and joining DFID (sometimes referred to as ‘churn’). Fragile states are particularly vulnerable to high staff turnover by UK-based staff. For instance, in Afghanistan, DFID informed us that staff turnover is at a rate of 50% per year. We are aware of one project in the Democratic Republic of Congo having had five managers in five years.

This process represents both a constant gain and loss of knowledge to DFID. When staff depart DFID, their knowledge should be retained in the organisation. Similarly, when staff join DFID, their prior knowledge should be made available to others.’

Learning by failing: ‘During 2013, DFID began to discuss failure in a more open and constructive way than it had previously done. This began substantially with the February blog of the Director General for Country Programmes. Following this, a short video was produced by DFID staff in the Democratic Republic of Congo that discussed failures in a water supply improvement project. This internal video has been catalytic in stimulating discussion about how DFID should be more honest about failure. It has resulted in the introduction of ideas, such as the need to fail fast. During 2013, DFID’s Research and Evidence Division has piloted approaches to discussing failure in ‘fail faires’, where staff come together to identify what can be improved. It is too early to say whether these will support a change of culture in DFID in its attitude to learning from failure, albeit they appear to be a positive innovation.’

(not) Listening to staff and partners: ‘Junior (and even senior) locally employed DFID staff generally ‘only give our opinion if asked’. Generalist, administrative and locally employed staff are not being listened to sufficiently by DFID’s specialists. They often have much experience of how aid is delivered: know-how… DFID staff do not appear to prioritise how they listen to others. This applies to learning internally and from external sources… Staff believe that DFID remains too much in a mode of trying to manage or change others rather than listen to and support them.’

All good stuff, but what is lacking is any discussion of the institutional constraints on DFID’s ability to implement these recommendations – after all, there must be reasons why it hasn’t done so already – see Neil McCulloch’s recent piece on the political economy of donors.

Finally, props to the UK government for setting up ICAI in the first place. Really impressive example of rigour, transparency and accountability (more on that topic here next week, when some T&A gurus discuss its limitations).
