How Data Analytics can Unlock the Knowledge in Development Organisations

Guest blog by Itai Mutemeri (@tyclimateguy), Head of Analytics at London-based Senca Research

In September 2017, I headed up to the Oxfam head office in Oxford to present our research paper: Big Data Opportunities for Oxfam – Text Analytics. Like all good research titles, it’s a mouthful. The paper explored the potential application of text analytics, in response to Oxfam’s call for proposals on how big data tools could aid the organisation. Given that we use text analytics software daily, it seemed a good fit. I began with a primer on text analytics and why it is useful.

Data lives on a spectrum between structured and unstructured. Structured data is what it sounds like – basically anything you can neatly arrange in a spreadsheet. Unstructured data is much harder to analyse and comes in the form of social media posts, pictures and archived documents, to name a few.
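To make the distinction concrete, here is a minimal sketch in Python (the project records and the field note are invented for illustration): structured data can be queried in one line, while unstructured text first needs some processing before it tells you anything.

```python
# Structured data: already arranged in rows and columns, ready to query.
structured = [
    {"project": "WASH-014", "country": "Kenya", "budget_usd": 120_000},
    {"project": "EDU-202", "country": "Uganda", "budget_usd": 80_000},
]

# Unstructured data: free text that a spreadsheet cannot summarise directly.
unstructured = (
    "Field notes, 3 March: the second borehole pump failed again; "
    "the community has asked for training on basic maintenance."
)

# A structured question is a one-liner...
total = sum(row["budget_usd"] for row in structured)

# ...while the text needs processing first (here, just a crude word count).
word_count = len(unstructured.split())
print(total, word_count)
```

Anything beyond counting words – extracting the fact that a pump failed, or that training was requested – is where text analytics tools come in.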

The general rule of thumb is that most of the data organisations have access to is unstructured. For example, research indicates that 80% of the data the US government has access to is in unstructured text, more commonly known as normal human language. But unstructured text poses problems for humans and machines alike; for the lowly analyst, the sheer volume of data can be daunting, and for man’s intellectual steed – the computer – the unstructured nature of the text is difficult to decipher. In development, we are making strides toward data-driven decision making, so it makes sense to have tools that will help you analyse more than 20% of your data.

But analysis for its own sake is pointless, so our paper focused on the use of text analytics software to solve the two most prevalent problems we have observed while helping other organisations use their text data.

Problem 1: People don’t tap into the organisation’s collective wisdom. Put simply, we generally start planning projects as if no-one within the organisation has ever worked on something similar. We look outside for useful information and signals, instead of starting from the existing internal knowledge. But we ignore institutional wisdom at our peril – we end up repeating mistakes that would have been easily avoided if we had learnt from the experience of others. I watched a large project fail because of something that we could have known and avoided – it is a crushing feeling.

There are many reasons we don’t use our institutional wisdom. The ODI cites, amongst other factors, an incentives system that rewards staff for new ideas, leading to a neglect of lessons previously learned. I once worked with an organisation whose internal database was so difficult to search that no one ever used it, except to deposit yet more project documents that no one would ever read. Even if they were able to find the relevant documents, reading them all to synthesize the key lessons would be a monumental task. This is especially true in large organisations like Oxfam, which has offices around the world. For this problem, we introduced a text analytics solution that makes it easier to search through internal documents.
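To give a flavour of what such a search tool does under the hood, here is a toy TF-IDF ranking in plain Python. The file names and contents are invented, and a real deployment would use a proper search engine or library – this is only a sketch of the idea: score each document by how often it uses the query terms, weighted by how rare those terms are across the whole collection.

```python
import math
from collections import Counter

# A toy corpus standing in for an internal project-document database
# (names and contents invented for illustration).
docs = {
    "wash_kenya_2015.txt": "borehole maintenance failed pump spare parts supply",
    "wash_uganda_2016.txt": "community training hygiene handwashing schools",
    "cash_transfer_2017.txt": "mobile money cash transfer drought response",
}

def tf_idf_search(query, docs):
    """Rank document names against a query with a minimal TF-IDF score."""
    tokenised = {name: text.lower().split() for name, text in docs.items()}
    n_docs = len(tokenised)
    scores = {}
    for name, words in tokenised.items():
        counts = Counter(words)
        score = 0.0
        for term in query.lower().split():
            tf = counts[term] / len(words)                   # term frequency
            df = sum(1 for w in tokenised.values() if term in w)
            idf = math.log((1 + n_docs) / (1 + df)) + 1      # smoothed rarity weight
            score += tf * idf
        scores[name] = score
    return sorted(scores, key=scores.get, reverse=True)

print(tf_idf_search("borehole pump", docs))
```

A searcher asking about borehole pumps gets the Kenya WASH report first, even though they never knew it existed – which is exactly the "tap into collective wisdom" problem described above.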

Problem 2: Organisations struggle to study their own behaviour. Put more simply, figuring out what’s working and what isn’t, particularly in large organisations, is extremely challenging. The directors and managers of development organisations should be able to ask (and get answers to) questions like “What is working well across our WASH initiatives?” The answers inform decisions on where resources should be allocated.

Considering the mountain of paperwork our sector produces in the form of proposals, project plans, project completion reports and monitoring, evaluation, accountability and learning documents – such data should exist. The problem here, as it so often is, is one of resources. Analysing all these documents would require a team of researchers that most organisations could not afford. One of the outcomes of technology should be to allow us to do more with less. With that in mind, I presented to the Oxfam staff a text analytics powered tool that, with some customisation, would allow analysis of a large corpus of documents to get to those answers.
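As a minimal sketch of that kind of corpus-level question, the snippet below counts how often a phrase of interest appears in each programme area's reports. The report snippets and programme areas are invented, and a real tool would go far beyond phrase matching (synonyms, topic models, sentiment), but the shape of the analysis is the same: aggregate a signal across many documents at once.

```python
import re

# Invented snippets standing in for project completion reports,
# grouped by programme area.
reports = {
    "WASH": [
        "Community-led maintenance worked well; chlorination worked well.",
        "Spare-part supply chains remain a recurring failure.",
    ],
    "Cash": [
        "Mobile money delivery worked well despite network outages.",
    ],
}

def count_phrase(texts, phrase):
    """Count case-insensitive occurrences of a phrase across a list of texts."""
    pattern = re.compile(re.escape(phrase), re.IGNORECASE)
    return sum(len(pattern.findall(t)) for t in texts)

for area, texts in reports.items():
    print(area, count_phrase(texts, "worked well"))
```

A director asking "what is working well across our WASH initiatives?" gets a first, rough answer in seconds rather than after weeks of reading.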

Examples of text analytics in development

Using text analytics to create new knowledge is not limited to the analysis of documents. The UN Global Pulse Lab in Kampala created a toolkit that makes talk radio broadcasts machine-readable through the use of speech recognition technology and translation tools that transform radio content into text. Once the conversation is in text, one can use the tool to explore relevant public conversations on topics of interest, in this case:

  • Early warning and influx of refugees in Uganda
  • Recording losses associated with local disasters
  • Local governance and public health service delivery

On the first topic, the lab was able to uncover the main topics of conversation related to refugees during the month of analysis, as well as emerging vulnerabilities related to refugees later in the year.

The goal of the project is to support the Government of Uganda and development partners in incorporating the voices of Ugandan citizens into the development process. With an estimated 7.5 million words spoken on Ugandan radio every day, this kind of analysis would be impossible without text analytics. For those interested in other tools for working with qualitative data in development, I highly recommend this overview.
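Once the speech recognition and translation steps have produced text, the simplest possible version of "exploring relevant conversations" is keyword spotting over the transcripts. The transcript lines and the term list below are invented, and the Pulse Lab's actual toolkit is far more sophisticated, but the sketch shows why text is the pivotal format: 7.5 million words a day become filterable in milliseconds.

```python
# Hypothetical transcript lines, standing in for machine-transcribed,
# machine-translated radio content (all text invented for illustration).
transcript = [
    "callers discussed refugee arrivals at the border reception centre",
    "the council debated road repairs after the floods",
    "a nurse described long queues at the health centre",
]

# Terms of interest for the refugee early-warning topic.
REFUGEE_TERMS = {"refugee", "refugees", "asylum", "displacement"}

def flag_lines(lines, terms):
    """Return the transcript lines that mention any term of interest."""
    return [line for line in lines if terms & set(line.lower().split())]

print(flag_lines(transcript, REFUGEE_TERMS))
```

Analysts would then read only the flagged lines in context, instead of listening to every broadcast.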

I concluded the presentation with recommendations on how to choose the right analytics solution:

  1. Focus on the highest value opportunities

This is the best way to get buy-in from all the people whose work you are going to disrupt, because they can see the value of what you are proposing.

  2. Start with questions, not data or solutions

Figure out what the problem or question is, then try to assess whether you have the data to answer that question, and only then should you pick an appropriate solution.

  3. Run short, inexpensive experiments

The most important thing is to quickly validate, through the organisation’s own experiments, which solutions can work.

In my opinion, data is a fancy word for history. The more tools we have to study, understand and learn from our history the better.

We would love to hear from other organisations on how they are using data to improve their project work.



9 Responses to “How Data Analytics can Unlock the Knowledge in Development Organisations”
  1. George

    Thanks for this thought-provoking post. I definitely agree that some user-friendly tools to help us get to grips with that 80% of data that currently sits unused would be a step forward. However, I disagree with the final point made here: “The more tools we have to study, understand and learn from our history the better.” I don’t think the issue is having more and more tools. Sure we certainly need some new tools, but I think just as crucial is a need for organisations to carve out the space for staff to critically reflect on the information/data/analysis conducted. In my experience, organisations need to proactively create a culture where staff are given the time and space to do this – otherwise we’ll just have more findings (based on more data) that get filed and not used. I often find that organisations and people promoting tech-based ‘solutions’ to M&E/MEL issues in the development sector tend to stop at the point where a tech solution has been proposed/developed/used, but actually for me the key thing is getting people to critically reflect on the findings that have emerged from the tech solution’s analysis. People have the capacity and the willingness to do this, but they need to be given the time and space within their organisations for it. For me this is just as important as new tools.

    • I completely agree with you George, tools or the analysis that we produce with them are just one step. I would take it a step further and say that if anything useful has been learned, we should reflect and also adjust our actions accordingly. What kind of barriers have you faced with getting time and space to reflect?

  2. Heather Marquette

    As a researcher and someone who was director of a knowledge management centre for years, I’m a big fan of data and evidence obviously. Do you have a theory of change, though, for how the possession of this data is going to lead to better decisions and, hopefully, outcomes? What kind of high level/political buy-in and new systems do you find are needed to make use of new data? Like you said, there’s an awful lot of data already out there that could be being used already. Or is your last sentence your shout out for other people to let us know how it worked for them?

    • You’re right about my last sentence, I really want to learn what has and hasn’t worked for individuals and organisations in the past. How did your knowledge management centre collect evidence and disseminate it? Do any tools or approaches that worked particularly well stand out from that experience?

      With regards to the theory of change: from a decision-making standpoint, what good analysis of our data should do is inform our discussions and decisions. During the decision-making process things like “political realities” and individual agendas will still appear, but at the very least data allows us to have a source of objective truth. Personally, I’ve found that to be a particularly useful tool for guiding and informing discussions, especially in meetings where some people have strong opinions and are very vocal on theories they hold based on their experiences (those people tend to get their way). The outcome from all of this should be an organisation that uses the evidence to support or invalidate the hypotheses we’ve operationalised. The idea is to follow what works, acknowledge and adjust what doesn’t, and that should lead to better outcomes. Admittedly, this is all easier said than done.

      In terms of getting political buy-in, I think the most important thing is to be empathetic to leaders when you approach them to start a new initiative. I always find a 2×2 matrix with Interests and Positions on either axis really useful for putting myself in someone else’s shoes. Also, Aid Data recently published some research that I’ve found really useful on this topic:

  3. Jayakumar Christian

    Fully agree with the 2 issues (problems) the article identifies. In addition to the complexity of these large organisations, very often large organisations lose their ability to listen to the grassroots where these ‘data’ & wisdom reside. We begin to believe the ‘internal speak’ at the highest levels of analysis and have more lessons learnt workshops (which should rightly be named lessons not-learnt workshops). More than ‘more tools and analysis’ I think complex organisations need the political will to listen to the grass-root margins.

    • Having sat through a few lessons not-learnt workshops, I completely understand. I think that good tools and organisational processes should result in managers/leaders having conversations that are informed by ground truth. Your point on the high-level analysis being translated into “internal speak” is also a particularly salient one; we turn wisdom and data into these succinct stories because it’s hard to have a conversation or make a presentation where we address every nuance. Stories also resonate intellectually and emotionally, which is why we lean on them so much. Luckily, on the Internet at least, we have a comments section where we can add that nuance to the conversation.
      I’d love to know more about your experiences/frustrations in getting your organisation to listen. Is it an information issue? Is it an agenda issue (where the goals of management override the information you’re getting from the margins)?

  4. Sam

    A very stupid but useful tool: the word cloud. As a diplomat, I analyse and report on speeches. However, this kind of analysis suffers from selection bias: we see what we find important, what we know to be important, while the rest is ignored. I remember analysing the speech to the party congress in Vietnam, and the only word the word cloud showed, notwithstanding the deep analysis and directions found, was “PARTY”. Probably the only priority was to strengthen the hold of the party, and everything else was just mentioned.