
Quality control: a few options for think tanks


[Editor’s note: you can read more about peer review systems for think tanks and research quality control.]

Last month I visited the Think Tank Fund in Hungary. It was my first time there and I got to meet the whole team (I think). The week turned out to be a mini festival of discussion and debate on think tanks that I hope will be repeated soon.

Throughout the week we kept coming back to a few issues and questions. One was research quality control (to which I would add communications quality control and, why not, management too: just “quality control”, then). As a way of kicking off the conversation, I thought I would share a short reflection on the subject.

A few years ago I had to interview candidates for a communications manager role. We asked all the candidates about quality control: what would you do to ensure the high quality of the outputs produced? We expected (and got) answers that focused on systems and processes. One candidate (who got the job) said: if you want good-quality outputs, hire competent people; I will make sure my team is made up of competent people. There is no substitute for having good researchers, communicators and managers. All the systems and processes in the world will be useless without competence.

So what can organisations do to address quality control? In particular, what can small and resource-strapped organisations do? Here are some ideas:

  • In-house quality control: Many think tanks rely on in-house systems to ensure that their outputs and activities fulfil some minimum standards. This demands a few things:
    • Capacity within the organisation: someone needs to have the right skills and experience to act as a quality controller. This may be more likely for research quality but communicators and managers are often on their own.
    • Time to provide feedback and support to researchers, communicators and managers. It is all very well to say that something is not up to standard, but what really matters is that this is followed up by advice on how to improve it.
  • A combination of in-house and outsourced quality control: Most think tanks would prefer to complement their in-house quality checks with external input from peer reviewers or even academic boards. This can be useful for some of their outputs: e.g. journal papers, working papers and perhaps flagship reports. It is appropriate but more expensive, and think tanks need to make sure they budget accordingly. External reviewers also demand that think tanks have the capacity to manage the process, as reviewers are very likely to take longer than expected to deliver. And if there is one thing think tanks cannot afford, it is to miss their deadlines. Missing a deadline can mean missing a one-hour window of opportunity; years of work down the drain.

The following ideas are a bit more experimental. I am just putting them out there to see if there is any reaction to them:

  • Peer-to-peer peer review (think tank level): Think tanks in a country or region could come together to offer each other peer review support: think tank A could review think tank B’s work in exchange for the reciprocal service. This is actually how it works in many developed countries, so I am always surprised by claims that think tanks would be put off by fear that their competitors would steal their ideas. More likely, opposition to this idea has to do with embarrassment. For this to work, however, there has to be a strong relationship of trust between the think tanks and the researchers involved.
  • Peer-to-peer peer review (researcher level): I think this is more likely to work, as researchers tend to know each other and often support each other’s work, even if they compete for funding and access. Think tank communities in developing countries are too small for researchers to afford not to be nice to each other. Much harder to find is the support that directors and managers could get from their peers in the think tank community. Few directors or managers, in fact, ever reflect on their own work or would consider asking for help.
  • Learn from others: An alternative to asking for advice is to search for it. All think tanks have access to other think tanks’ work (locally and internationally). It should therefore be possible for a researcher or communicator to know whether his or her work is up to standard. Managers may have to seek out peers more directly, but this is not impossible. In all cases, this approach demands honesty and humility. Some ways to do this include:
    • Find a few ideal think tanks (domestic and international) and compare the quality of their work to your own. Place a paper on housing policy by NIESR or Brookings (or a local think tank) next to a paper on housing policy by your own think tank and compare. You may not be able to be as ‘good’, but you can certainly strive to be and move in the right direction.
    • Watch or attend their events, and keep an eye on the best publication designs, websites and uses of digital tools among the most progressive and active think tanks. Think tanks always know which of their peers are ‘very good at communications’, so it should not be hard for them to find some to compare their own work to.
    • Funders could (though think tanks themselves should) support the formation of a culture of public events that maximises the opportunities for think tanks in a city to come together and exchange ideas, ‘spy’ on each other, build relationships, etc. These events (like the ones any think tanker can attend in London or Washington on a daily basis) provide an invaluable opportunity to learn from each other.
  • Practice, practice, practice (repeat, repeat, repeat): I have written about PMRC in Zambia. They have a great range of publications and other communication tools, developed over a year of intensive practice. The model we developed together was simple: a few outputs that would be produced on a monthly basis for every project. By repeating the same outputs over and over again, they got to practise and continuously improve their quality. Each time, too, they were able to introduce a new channel or tool. The best way to learn how to do research and write a good paper is to practise doing research and writing papers. Young researchers need practice. Too many think tanks I have come across favour long-term research projects in which their younger staff have little or nothing to do, when instead they could be:
    • Writing literature reviews (and annotated bibliographies): These give researchers a good understanding of several issues, help with future research (by them, their bosses or other researchers), serve as communication tools in their own right, and help them learn from others (by reading others’ work). Any downtime in a young researcher’s week should be focused on something like this.
    • Playing with databases: When I started working in research about 15 years ago, my colleagues and I used to spend quite a bit of our time putting together cross-tabs and running regressions on databases just for the fun of it. And I mean for the fun of it: what is the relationship between the mother’s education and the father’s education per quintile? What is the relationship between years of education for urban adults and household size? We had no use for these, but it helped us learn how to use Stata and how to ask questions.
    • Allowing communicators and young researchers to develop data visualisations, manage personal or organisational Twitter handles, organise short one- to two-hour lunchtime events, etc. These things do not cost much (I recently organised a couple of events in Peru with local think tanks that cost peanuts), and even tiny events for a few people allow communicators to practise and get better for those big and important moments.
9 Comments
  1. Jeff Knezovich


    An interesting post and something good to think about. I understand the point about needing competent people to ensure quality, and that all the processes in the world can’t assure quality if the people involved aren’t competent. However, I do feel that this post conflates quality with capacity (or, in your words, ‘competency’). And many of your ‘more experimental’ methods relate more to strengthening capacity than to quality control.

    But in that vein, doing a ‘competitor’s analysis’ (as I suggested in this post) is certainly a great way to benchmark and define ‘quality’. It’s difficult to control for quality without understanding what that means, after all!

    In terms of building capacity and upping the quality, I would also throw into the mix a few other concepts:

    * Participate in competitions: For the On Think Tanks Data Visualisation Competition, judges provide private feedback on each of the visualisations as they select the winners and finalists. That, combined with public feedback, is a great way of understanding what works, what doesn’t, and why.

    * Build collaborations: In last week’s post on audiences I perhaps implied, but did not explicitly state, that different think tanks will have different strengths in terms of reaching international or local audiences. Try creating partnerships with complementary organisations and see how they work, what their expectations are, and what their review processes include. It’s another very useful way of setting benchmarks.

    * Go for workshop+ models: I’ve done my fair share of workshops over time, but the ones I always found most valuable were intensive trainings with follow-up support. So, for example, we’ve done trainings on writing policy briefs and then worked with organisations to develop the final briefs over a three-month period. Going through the whole process with a helping hand can be a great way of building quality.



    October 7, 2013
    • Jeff, all good points. What I meant to suggest is that to improve quality (and make sure that think tanks deliver quality work) they could seek to improve their capacity.

      And to do this they can seek out opportunities to learn from others or from themselves rather than simply relying on external ‘peer reviewers’.

      Competitions, you are right, are another very interesting and useful option.


      October 7, 2013
