
Engaging with evidence and uncertainty: choosing where to start

Guest post by Charlotte Maugham and Sandy Oliver

There are two powerful trends playing out in the development and humanitarian world: the push to make better use of research evidence to produce viable policy options, and the localisation agenda. The two are sometimes treated as mutually exclusive – “I mistrust any decision made without reference to critically appraised evidence.” vs “Ah, but my operating context is so unique that only knowledge held within the boundaries of this village has any applicability for my decisions.”

This widening divide is driven partly by the realities of context, but also by unchallenged wisdom within our comfortable silos, and by our own proclivities.

The truth is that, whatever we are most comfortable with, we should probably all be doing a bit more of the other. There are very few contexts where we can’t apply some form of generalisable evidence (a systematic review, meta-analysis etc.). And there are even fewer where we can’t learn with and from the people closest to the problem about how our attempts to solve it might be improved.

It is a truth (almost) universally acknowledged that evidence of any kind rarely provides certainty. But getting the balance right between methodological rigour and local knowledge can boost the overall usability of evidence – so we can confidently inform decisions, while acknowledging uncertainty. This is particularly true when we want to use evidence to speak truth to power. The influence of evidence can only ever be as strong as our understanding of how it might be received… and vice versa.

So where do we start in getting this balance right?

Over the last two years, a group of academics, evaluators, and development and humanitarian practitioners have developed a toolkit that helps to identify the most appropriate mix of methods for engaging stakeholders with evidence in any intervention. It all starts with pushing past our own prejudices. And to do this, we first have to recognise them as such!

When interviewing practitioners, we found this 2×2 matrix (figure 1), inspired by Duncan Green, helpful for exploring some of the hidden dynamics in how different types of evidence are used and exchanged.

Figure 1: How the context of decision making differs

The matrix works by helping us to situate decision-making in relation to:

  1. Availability of generalisable evidence; and
  2. Understanding among stakeholders of the operational context.

Doing so can reveal deep-held (and often ill-informed) preferences for particular forms of evidence. It also goes some way towards explaining why these preferences might be holding us back… and how we might get a better balance.

To illustrate all this we’ve populated the matrix using two real-world examples related to clean cookstoves and child nutrition.

Figure 2: How the context of decision making differs: real world examples

Example 1: Cookstoves: from laboratories and epidemiology to home cooking

Matrix Top Right: Linear model of knowledge to action (Knowledge Transfer)

Generalisable evidence from engineering and physiology convinced a small number of practitioners, policymakers, donors, academics and business networks that cooking indoors with solid fuel pollutes the air, harms the lungs and increases mortality. Working with WHO, this group recommended promoting the use of better stoves with cleaner fuels.

Matrix Bottom Right: Relational model of knowledge to action (Knowledge Exchange)

But… despite promotion of evidence linking solid fuels to poor health, campaigns did not result in widespread adoption of clean cookstoves. So WHO brought together research and policy experts from health, engineering, air pollution and economics to share what they knew and develop guidance to influence cookstove markets, communities and homes. One “radical” and effective solution to this problem saw engineers and anthropologists working with household cooks to redesign, test and set standards for stoves that would appeal to home cooks.

So what? For those of us who prize the rigours of the generalisable evidence base (sometimes above all else), this example acts as an important caution. The linear model for encouraging the uptake of such evidence is all well and good when the context is well understood… but bring uncertainty into the picture (which, let’s be honest, applies to most development interventions) and this model quickly breaks down. By contrast, the relational model, in which we encourage dialogue between different types of stakeholders, is more effective.

Woman using Philips stove, India. Credit: Nigel Bruce

Example 2: Children’s healthy eating: in the school system and at home

So far, none of this is ground-breaking – by now we’re thankfully all well-versed in the “context matters” narrative. But it does serve as an important reminder that making sense of generalisable evidence in any given context means giving equal value to other types of evidence – tacit know-how, lived experience etc.

The left-hand side of the matrix is where things start to get really interesting…

Matrix Top Left: System model of knowledge to action

With the best will in the world, generalisable evidence is sometimes thin on the ground. In some situations, while the context may be familiar and well understood, relevant studies may be lacking or offer uncertain findings.

This has been the case for leaders in the education system seeking to support children’s healthy eating in schools. While these leaders have a sound understanding of, and influence within, the school system, they have found that the evidence from trials of school feeding programmes varies. In this situation, the most effective solutions have involved developing programmes with local teams rather than experts from elsewhere, possibly drawing on local ingredients and cooking methods.

Matrix Bottom Left: Emergent model of knowledge to action (Knowledge Mobilisation or Co-production)

More challenging still is when understanding of the context is also lacking!  This has been the case for decision makers whose goal is for children (particularly those living in poverty) to eat healthily at home. A child’s home is a context of which these leaders have little understanding, and in which they exert little influence.

In this type of situation, it makes sense to turn to the people holding the relevant knowledge: those feeding their children. Here the engagement method of choice is to learn from poor families whose children are well nourished and to share that learning with families whose children are less well nourished.

So what? Without shared understanding of either generalisable evidence or the social system, the first step is asking around until useful evidence begins to emerge.

Cultural norms and politics: working with alliances

Making decisions with evidence doesn’t happen in a vacuum. Returning to the cookstoves story, the sustainable technology NGO, Practical Action, also recommends a more politically aware approach. This approach recognises how social norms hinder women from participating in energy markets in personal, technical or leadership roles. It values the political will for setting ambitious national targets. It recommends coordinating the efforts of industry associations, civil society forums and consumers, particularly women, to activate the market; and working with finance institutions to develop business models more suited to women and the poorest households.

So what? Using research evidence is not only a technical exercise, but also a social endeavour that benefits from some political awareness to anticipate challenges and develop alliances.

Where to turn when engaging with evidence?

Working through the 2×2 in relation to your own work on any particular project, you’ll probably find you don’t fit neatly into one square. But there will likely be one or two that align more closely with your situation than the others… and you might be surprised at which! Just getting to this stage can help build your trust in the relevance of different forms of evidence, while also helping you to plan which models for engaging stakeholders with evidence make most sense.

Now that you know which models might work best, our toolkit can help you locate appropriate methods for engaging stakeholders with evidence. The toolkit is intentionally a live document and we are always searching for more methods to add – so do get in touch.



6 Responses to “Engaging with evidence and uncertainty: choosing where to start”
    • Thanks for this insight, Rick. I can see the similarities. I think many of us find visualising the focus and degree of consensus helps our thinking about how to work with mixed groups of people. There seem to be a few of these sorts of matrices around, with small-ish differences in how their axes and content are described to suit the task in hand, whether this is programming, evaluation or use of research, for instance, and now I’ve seen yours on theory of change. Our work built on ideas about adaptive management for development with the orientation chosen to mirror what we’d found when developing participative approaches to research synthesis.

  1. Anita Makri

    This looks useful, thanks for sharing. Is there a case-study of how it’s been applied? Interested to get a sense of what it looks like in practice, e.g. in the context of a workshop or another way of engaging policymakers/stakeholders on evidence they can use.

    • Thanks for your interest, Anita. All our work so far has been retrospective, discussing the matrix with people in government, NGOs and academia, listening to them reflect on their past work. Their experience and reflections helped us to articulate different ways of engaging with generalisable knowledge and local knowledge, either separately or together. We discussed this way of seeing engagement options in a workshop at the What Works Global Summit, in Mexico City, 2019, where participants found the framework made sense to them and quickly located their own activities within it. Hearing from them and others helped us clarify the language and presentation when discussing past choices about how to engage stakeholders. We’ll be looking now for opportunities to use the matrix prospectively.
