June 5, 2013

Case study

Supporting think tanks series: “Fact before Think” – The Case for Data-Focused Think Tanks

[Editor’s note: This is the second of a series of posts based on 5 think pieces prepared for the evaluation of pilots for the Indonesian Knowledge Sector Initiative. The views expressed in these publications are those of the author(s) and not necessarily those of the Commonwealth of Australia. The Commonwealth of Australia accepts no responsibility for any loss, damage or injury resulting from reliance on any of the information or views contained in this publication.]

Think tanks can seize a great opportunity by focusing on data, in addition to their role of arguing for improved policies. Practical examples, from the US and beyond, illustrate the advantages of this “fact tank” approach, which has not always received sufficient attention in the promotion of evidence-based policymaking.


Evidence-based policies require quality data. Mostly taken for granted in developed countries, data is less reliable where state institutions are weak. Quality data and data analysis thus fill a gap and make a critical contribution to informing decision-makers, improving public debate, and tracking the actual implementation of policy. Drawing on the experience of running the Caucasus Research Resource Centers (CRRC), which focused on data, this think piece highlights some key lessons.

Challenge of Evidence, for Evidence-Based Policies

Senator Daniel Patrick Moynihan is often cited as saying that “everyone is entitled to their own opinion, but not to their own facts.” Put differently, you need shared facts to move beyond partial points of view. Providing quality data is thus a starting point for constructive approaches to policymaking.

In many countries there remains a pressing need for basic data that tells us how citizens are doing. Have their lives improved? Do they have access to basic services? Are women doing as well as men? What do families struggle with? Do parents think their children’s lives will be better than their own? Too often these basic questions, and many other ones, are less easy to answer than they should be. The data is simply not there, or not of sufficient quality.

Providing data is typically the task of National Statistics Departments. Yet in many countries these agencies are weak and under-resourced. For example, the head of one national statistics agency once told us: “I just don’t have enough funding. Sweden has more than 70 people calculating their Gross Domestic Product, Lithuania 45, and I have 4 people on the job, with an average salary of $500 per month.” In other words, the country was spending less than $30,000 a year aggregating its GDP data.

In another country which I followed, the national census had a budget of around $5 million, while hoping to reach every household. Per capita, the census invested less than $2, whereas the United States spends about $42 per citizen. Underpaid enumerators, in a census and other household surveys, often cut corners. As a result, Statistics Departments, uncertain about the quality of their own data, are not necessarily enthusiastic about making it accessible.

Even where the data is good, Statistics Departments are unlikely to highlight findings that put the government in an unfavorable position. As they rely on the government for funding, they have strong incentives to keep a low profile.

Statistics Departments can also inherit legacies that are difficult to change: a few years ago our team in Georgia established that unemployment, measured by international standards, was at 31%. However, official figures put unemployment at 16%. Privately, government officials said that the higher numbers were right. “We inherited the 16% from the previous government, and by the time we figured out what was going on, we couldn’t change it and suddenly double unemployment under our watch.” Many international agencies, including the World Bank, often use and sometimes even republish such flawed data, perhaps because they do not want to challenge their host government.

Given these constraints, many people don’t trust official data. Even reliable data can be contaminated by distrust, and people as well as decision-makers often find an anecdote as persuasive as official statistics. Disbelieving data, the government lacks a measure of its own impact, and the opposition does not learn how to work with metrics, and thus gets little preparation for governing. Quality data does not by itself ensure sound decision-making, as established democracies illustrate, but it offers political actors a better opportunity to anchor debate constructively.

Solution: “Fact Before Think”

Think tanks and independent research organizations can play a constructive role in overcoming this gap in data. Data provision and analysis is a unique niche for think tanks. More flexible than researchers with teaching schedules, less pressed by deadlines than journalists, and more independent than statistics officials, think tanks are in a great position to contribute numerical insight to public debate.

In generating the data, there are a number of workable approaches. At CRRC, we collected the data ourselves. Doing the fieldwork gave us confidence in its quality, and more expertise in the nuances of survey implementation. (One example: through cognitive interviews, we realized that the Georgian word for household, shinameurnoba, was understood by our respondents as referring to people of working age and livestock, but not to children.) By having fieldwork teams, we could deliver reliably, not depending on subcontractors. The expertise in handling complex data subsequently took us into other fields, such as media monitoring during high-stakes elections, or the provision of SMS-based reporting systems to enhance community security in volatile regions. Moreover, the “vertical integration” was attractive financially and helped cover overheads.

Contracting data generation out is another approach. The Pew Research Center, our role model and the world’s premier “fact tank”, hires highly respected survey firms with broad international reach. This allows Pew to concentrate on conceptualizing research, analyzing, and then communicating findings through a range of attractive channels. Pew’s focus seems to work well, even in a field as competitive as that of DC think tanks. In their innovative analysis, David Roodman and Julia Clarke ranked Pew 2nd by its per-dollar reach, among all US think tanks.

Secondary data analysis is also an attractive option, since there is so much information available. One Indian think tank that contributed significant chunks to the government’s national planning reported that the data they received from various ministries was sketchy. The think tank said that while synthesizing fragmented data was challenging, it provided “a great opportunity for us to showcase our analytical skills”, and generated clarity where previously there had been contradiction.

Similarly, a colleague, Enrique, who runs a small consulting outfit, had his team compare district-level World Bank data on poverty with governmental data on targeted social assistance. The comparison showed significant discrepancies, challenging the government and the World Bank to check their data. Enrique also pointed me to an institution in Zambia, the Jesuit Centre for Theological Reflection. It manages a monthly Basic Needs Basket survey across the country that is published in an easy-to-read one-page format. The data fills an information gap but also generates opportunities for public debate.

In all these cases, good data contributes to policymaking, by offering a better understanding of what is going on. As Richard Rose, a leading survey expert, has said, “counting people makes them count”. Counting also made CRRC count. The data was quoted widely, in the national and international media and also by leading national politicians.

Quality data allowed us to add nuance and to unpack concepts that otherwise remained abstract: following Richard Rose’s suggestion, we introduced survey questions that measure destitution, to get an understanding of poverty that is more nuanced, and more telling, than official definitions.

Using such measures of destitution, we could illustrate, for example, that by 2011 Azerbaijan’s rush of oil wealth had left many citizens behind: 90% of the population said they did not have enough money to afford buying durables, such as a fridge or a washing machine; 38% stated they could afford food, but not new clothes; and 22% said they didn’t even have enough money for food. By contrast, the World Bank put poverty in Azerbaijan at 15.8%. Its official online definition: “National poverty rate is the percentage of the population living below the national poverty line. National estimates are based on population-weighted subgroup estimates from household surveys.”

Arguably it is more illuminating to hear that 19% of Azerbaijanis say that over the previous six months they repeatedly borrowed money to pay for food, with another 31% borrowing money at least once. Agile data collection by independent organizations thus is an extremely valuable complement to official sources, especially since the Azerbaijan page of the World Bank in March 2013 still drew on 2008 data.

Evidence is the first step towards better policy, and it’s also an easier step for a think tank that does not have Brookings’ lineup of 300+ scholars, or its USD 90+ million annual budget. When the funding for municipal garbage collection became a controversial question in Tbilisi, we could say with confidence that more than 80% of citizens were highly concerned about linking garbage fees to monthly electricity bills. This finding was a contribution to informing the debate, even though we had never looked into all the alternative ways of funding municipal services.

In an excellent piece a few years ago, Goran Buldioski argued “think instead of tanks”, stressing the need for local think tanks to identify their niche. One could suggest, similarly, “fact before think”. It’s a clunky phrase, but it highlights a huge opportunity. This entails, of course, a thorough understanding that facts are construed in different ways, as well as showing how different groups see issues of concern. The way people frame issues, after all, determines to what extent they engage and comply with policies.

Emerging lessons: Opportunity for Think Tanks & Donors

A number of lessons stand out, based on this experience:

  1. Too often think tank professionals (and donors) rush toward wanting a sophisticated solution, when the policy problem itself is insufficiently understood. Quality data is the first step towards an evidence-based approach, and independent research organizations, residing locally, play a critical and constructive role in providing this evidence, in ways that other institutions cannot.
  2. The Internet offers exceptional opportunities for making data accessible. Online data analysis tools are now easy to provide and maintain. Such tools vastly enhance transparency and accountability, and the possibility for an evidence-based debate.
  3. Independent data generation and analysis creates accountability for the Government as well as the Statistics Departments, and an opportunity for citizens to review the accuracy of information they pay for. It thus complements other efforts, such as the Open Government Partnership, or the great set of tools put forward by MySociety.
  4. Role models in the US, and beyond, illustrate the huge potential for think tanks that position themselves as fact tanks. There is a real opportunity for entrepreneurial institutions, as well as donors seeking to make a transformative investment.
  5. To improve research capacity, one has to get researchers to work together in teams, and move away from the still-popular notion of the Grand Intellectual, in the mold of Émile Zola. Data-gathering fosters such teamwork, since it’s a complex production process that requires input from diverse specialists, and rigorous quality control. (My colleague Koba Turmanidze’s great maxim for internal vigilance: “if the preliminary results look really interesting, they probably are wrong.” See here how a simple labeling error in an international survey led to downstream woes, affecting the Washington Post.) Survey work helps to build great – and fun – teams that later can go on to tackle other challenges of generating critical evidence.
  6. Building such institutions should follow successful practices of investing into and fostering startups, and not rely on “getting money out the door” grant-making instruments. Good donor engagement can make a huge difference, and CRRC was lucky to have the support of the Carnegie Corporation of New York, both in providing core funds and in giving critical guidance. Illustrating great practice, reporting was succinct, focusing on the substance, to help management step back and consider the bigger picture.

In sum, an emphasis on data is needed, and it works: think tanks and donors can seize a transformative opportunity by focusing on data. This can make people’s lives better by contributing to truly improved debate, policy and implementation.

About the author:

Hans Gutbrod: Executive Director at Transparify

Read more from: Hans Gutbrod