9 bad things you do (but know you shouldn’t) in research communications

Guest post by Caroline Cassidy and Louise Ball

Over the years, at ODI’s Research and Policy in Development (RAPID) programme, we have worked with an array of researchers, communicators, practitioners and policy-makers, trying to make head or tail of how to get evidence to influence or inform policy. Reflecting on how far we’ve come, we realised that there’s a ton of good advice out there on communicating research for policy. But what we often don’t talk about are the things that go against that advice, which… cough cough… if we’re honest, we all do. At least sometimes. For example…

Describe your research audience as ‘policy-makers, practitioners and the public’. You know you should be more specific, but no one seems to be questioning it. And, well, we want to reach influential ‘people’. Next time, try asking: if you could get five people to read it, who would they be?

Publish a long list of recommendations that are highly general, unrealistic, or both. Another common recommendation ‘faux pas’ is to tell policy-makers what they ‘should’ do. Read these three tips for presenting research to governments – and why it’s sometimes like being nice about an ugly baby!

Sign up to produce 20 publications for a 12-month project. Sometimes researchers are under such pressure to publish, publish, publish that in the rush to get strategies and plans locked down, you promise the world. Quality over quantity, anyone? As our colleague says, ‘not every policy problem has a report-shaped solution’.

Every other word is an acronym. As long as you spell them out in full the first time, it’s OK, right?! If we had one wish from the communications fairy, it would be to kill acronyms for good…

Send your research to 100s of people, and never follow up. There is no denying that this happens all the time. We vote to limit the word ‘dissemination’ in international development – good communication is about engagement, and it is multi-directional. If we all just followed up a little bit more, surely we might see more change happen?

Hail download numbers and other vanity metrics when reporting on your research ‘successes’. You know it doesn’t tell you (or your donor) very much, but it’s so easy to get the stats and, well… surely page views mean that someone out there is interested!? Try this toolkit on monitoring and learning from communications for other indicators that could tell you more.

Produce a video, podcast or infographic because ‘we have left-over budget and/or it would be cool’. Now, we don’t want to discourage doing this type of thing – it can be a totally brilliant way to reach your target audience – but it’s never a good idea to do any type of communications just for the sake of it! Strategic is the word.

Death by PowerPoint. Many have written about this over the years, and many have tried to make things better (cue the birth of Prezi and TED Talks). And yet… how many times did you want to take a nap at the last conference you attended? Less is definitely more with presentations, particularly when it comes to the text.

Whack in a photo – you haven’t got time to look properly, this one will do. This is something that often passes by without any scrutiny. And yet, as this Twitter chat showed, photography use is SO important, and bad use can reinforce outdated development clichés.

This is just a short list we have put together – there must be more out there… use the comments box for any examples you have!



3 Responses to “9 bad things you do (but know you shouldn’t) in research communications”
  1. Thea Hilhorst

    Ouch, how do I recognize these don’ts from research programmes I have been involved in. I bet all of us researchers have been guilty of these don’ts in many instances.
    Yet, there is also a more fundamental issue when we continue to frame the problem as ‘communicating research’. Communicating research may have much less impact than we hope – insights from reports are not often shared with colleagues and are easily forgotten, and who really believes that policy responds to evidence, when we can often see how policy-makers cherry-pick research findings to retrofit their chosen pathways for change? Co-produced knowledge has been shown to be more effective in bringing about change, and there is now a huge trend of involving prospective audiences in the different stages of research, and importantly in the formulation of recommendations. It is also interesting to consider how research uptake can become more durable for development. In a recent blog http://bit.ly/2ERw17v we give some examples of how we tried to make research durable in the DRC programme of the Secure Livelihoods Research Consortium, amongst others by ensuring solid involvement of researchers from the global South and by strengthening research institutes in the country.

  2. Steve

    On the day a story broke about prostitution and Oxfam in Haiti, I got this blog titled “9 things we do that we know we shouldn’t”! I thought the content would be about our (aid workers’) moral behaviour!