
8 mistakes to avoid in reporting an INGO’s contributions to the SDGs

August 19, 2021

By Duncan Green

Guest post from Ximena Echeverria Magariños and Jay Goulden, of CARE International

INGOs have for many years felt the need to communicate how many people their programs reach in a year, but the numbers of people our programs “touch” don’t tell us anything about the difference they make in people’s lives. Increasingly, INGOs are seeking to report the numbers of people whose lives change as a result of a program’s contribution (people impacted) and to learn about how those changes occur. As a vision of a world of dignity, equality, sustainability and prosperity for all, the Sustainable Development Goals (SDGs) provide a common language for such reporting, and some INGOs are framing their impact reporting as contributions to the SDGs, at country level (BRAC) or global level (CRS).

CARE recently did so too, reporting CARE’s and our partners’ contributions to change for 157m people across 11 SDGs (we reported against 8 SDGs in 2019). But we’ve made many mistakes in this process, and others might find it useful to have a list of the top things to avoid. So, what have we got wrong?

  • Privileging numbers over everything else: while we did collect some data on how projects had contributed to impact, the system primarily quantified numbers of people impacted. We were not yet systematically gathering qualitative data, or data from participatory Monitoring, Evaluation and Learning (MEL) systems in which project participants themselves described the impacts they were experiencing. Lesson: feed qualitative and participatory evidence into your system, as well as quantitative evidence.
  • Some overly complex indicators: we selected a set of 25 global indicators for programs to report their impacts against, but some indicators were hardly used due to the complexity or cost of reporting. Only five projects reported impacts on the adolescent birth rate, and only 13 on the international poverty line. Lesson: when selecting indicators, consider how easy or difficult they are to report against and remove those that aren’t being used.
  • Gender metrics were not prioritized: despite gender equality being a core approach across all our work, we generated insufficient evidence of our global contributions to gender equality. The section to report on indicators of gender equality was at the bottom of a 280-row Excel form – 🤦🏻‍♂️. Not surprisingly, few projects reported against these. We also framed gender equality as a cross-cutting approach, rather than an impact target to be accountable towards. Lesson: make your priorities a core target and put them at the top of your data collection system.
  • Inconsistent learning: while we did identify the learning behind the numbers for some areas of our work – e.g. the key strategies behind our most significant impacts on stunting or preventing GBV – we did not have a defined learning agenda. We have used our latest SDG report to generate key lessons from the last five years, but should have been more proactive. Lesson: define a small set of learning questions, and structure your evidence and data to respond to these.
  • Many projects slipped through the system: we didn’t track why projects were ending without reporting impact. This could have been for good reasons: to avoid double-counting impacts reported under other projects, or evaluations postponed due to COVID-19 restrictions. But in other cases, failures in MEL systems could have been addressed or data gaps filled. In fact, proactively following up with large projects that had not yet reported impacts brought evidence of impact for a further 22 million people that we would otherwise have missed. Lesson: ensure you systematically track whether projects are reporting impacts and take action to address any gaps.
  • Using complicated, offline tools: initially we collected data in Excel, which caused complications for colleagues across the world working with different versions of the software (though we could easily iterate and adjust based on feedback). We have moved to an online system in the past year. Lesson: plan for user-friendly online data collection tools that will work in low-bandwidth environments, but that you can keep adjusting as needed.
  • Investing more in data collection than in its use: at the beginning, we put greater effort into training teams to input data than to use it. It took a while to help people see that this system was something they could use for program management decisions, as well as for advocacy or fundraising. Once teams could see how to use the system, through better data visuals, and people throughout the organization were paying attention to the data reported, the inputs got better because people felt that data quality mattered. Lesson: remember that use of the data is as important as getting the data reported. Think of the different “personas” of users, and plan to get the system to give them what they need.
  • There was no unique identifier for projects: this one is a bit “geeky”, but until this past year, we didn’t set up a unique code that identifies projects in our global impact system, to track their data from one year to another. Different spellings of project names in different years, or in separate forms for reach and impact data, made it difficult to analyse how program quality related to impact – let alone have access to accurate financial data on cost per impact. Lesson: just do this from the outset – it will save a lot of time in the long run! (See the short sketch after this list.)
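To make the unique-identifier point concrete, here is a minimal sketch (in Python with pandas, using entirely hypothetical project codes, names and column headings – not CARE’s actual system) of why a stable project ID matters: records of the same project entered with slightly different names in different years cannot be matched on the name, but they join cleanly on the ID.

```python
import pandas as pd

# Hypothetical reach data reported in one year, keyed by a stable project code.
reach_2020 = pd.DataFrame({
    "project_id": ["PRJ-0042", "PRJ-0107"],
    "project_name": ["Water for Life", "Girls' Education Phase II"],
    "people_reached": [120_000, 45_000],
})

# Hypothetical impact data reported the following year; the same projects were
# typed in with slightly different names, as often happens without a shared code.
impact_2021 = pd.DataFrame({
    "project_id": ["PRJ-0042", "PRJ-0107"],
    "project_name": ["Water 4 Life", "Girls Education (Phase 2)"],
    "people_impacted": [80_000, 30_000],
})

# Joining on the free-text name would match nothing; joining on the ID works.
merged = reach_2020.merge(
    impact_2021[["project_id", "people_impacted"]],
    on="project_id",
    how="left",
)

print(merged[["project_id", "project_name", "people_reached", "people_impacted"]])
```

The same code could then also key financial records, which is what makes analysis such as cost per impact possible at all.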

So, is reporting your contributions to the SDGs even worth it, Duncan might ask? Our experience over the past six years is that it is. Firstly, we as INGOs should understand what impact we are having, beyond the level of individual projects. After decades of investment, partnership and implementation, it is shocking that we mostly cannot say more than how many people our programs “touch”. These efforts to aggregate our global impacts have also helped us identify what we have learned and what we need to do differently, as well as to improve our MEL capacities throughout the organisation (e.g. reporting the impacts of our advocacy and influencing work).

Secondly, we should share this contribution publicly. Duncan might ask ‘to whom are we accountable?’, and for us the answer is many: our project participants, peers, partners, institutional donors, taxpayers who contribute to governmental donors, individual donors who support us, and the governments in the countries where we work. One way to be more accountable to all of these is to know what difference we are making and to report this publicly, and not only for individual projects. Thirdly, in our experience, learning and accountability are enhanced by a collective framework; and the SDGs provide this at global level. They are not perfect, but they have galvanised collective commitment, measures and action. If we expect countries to report on progress towards the SDGs, or companies on how they are contributing, surely INGOs should also be transparent about their contributions to these shared global goals.

Is anyone listening? Do those we are accountable to find an SDG framework more useful than a simple M&E report? (Our imaginary Duncan asks a lot of questions!) The people and partners we work with on individual projects most likely don’t care, but donors, governments and UN agencies do welcome seeing our reported impact framed within the SDGs. After all, reporting against an INGO’s self-defined goals from its own strategic plan is largely irrelevant to everyone except that INGO. At least the SDGs matter to others as well. Being invited to be part of the opening session of the 2021 High-Level Political Forum was most likely a result of our reporting our contributions towards the SDGs and sharing that with the UN. We also know that at national level, having evidence of contributions to national priorities such as the SDGs opens doors to influencing, and so ultimately to greater potential impact.

So we will certainly continue to do this over the next decade. We think we can avoid repeating the mistakes above, but will no doubt make some more! We hope others can share their lessons too, so we can avoid falling into some other traps over the coming years.
