
Posts tagged ‘Brazil’

Video of Think Tanks Data visualisations event

On Think Tanks and WonkComms organised an event to close the On Think Tanks Data Visualisation Competition, supported by the Think Tank Fund. The event, held in London, featured the Brazilian, Mexican and Czech experts whose visualisations won awards in the competition. Watch the video and join the conversation.


The onthinktanks interview: Robert Muggah, author of Mapping Arms Data

The first round of the On Think Tanks Data Visualisation Competition was won by the Mapping Arms Data visualisation. Submissions for Round 2 of the competition are open until 2 October 2013, and I wanted to find out more about how this technically advanced, visually stunning and information-packed data visualisation came into being, in the hope of inspiring other think tanks to consider putting together their own (maybe not so advanced!) visualisations. So I sat down with Robert Muggah, one of the visualisation's creators, for an interview.


Results of the On Think Tanks Data Visualisation Competition, Round 1

The votes are in. The judging is in. And we can now officially announce the winner of the first round of the On Think Tanks Data Visualisation Competition. Drumroll please! The first place winner of $500 and a chance to compete in the finals is…


The onthinktanks interview: Sandra Polonia Rios on Brazilian funding models

In this interview, Sandra Polonia Rios, Director of the Centro de Estudos de Integração e Desenvolvimento in Brazil, discusses how different funding models can affect think tanks' influence.


Carnival Month: Dancing like Brazilian think tanks

Clara Richards writes about Brazilian think tanks in the first of a series of posts to celebrate the Carnaval. She finds that the cluster model, which combines several business models, is particularly popular in Brazil.


A new BRICS think tank network

The BRICS bloc has risen over the last decade, since Goldman Sachs conceived of it in 2001 as an economic group to counterbalance the G7 on the world scene. Its members (Brazil, Russia, India, China and, a later addition, South Africa) have cooperated increasingly, especially on economic and diplomatic grounds, and have built an institutional framework, having already held four summits, the most recent in March in New Delhi. There is more trade within the bloc, estimated to reach USD 500 billion in 2015, and contact between their governments keeps growing. However, the BRICS countries differ greatly in their political and cultural values, in the composition and outreach of their economies and, above all, in their lack of a common history (with the exception of some bilateral relations). Nonetheless, even if the links between these countries are questionable, the group has been consolidating over the last five years.

The recent publication of The BRICS Report, on the occasion of the last summit, calls for a harmonisation of economic and diplomatic policies, as well as for forging stronger links between the five countries. At the Sanya summit in 2011, the declaration included the need for research cooperation and the formation of meeting groups for think tanks. In November 2011, the BRICS Trade & Economic Research Network was launched in Shanghai by five think tanks:

Although all five of them are focused on different subjects in their own countries, in this agreement they have focused on three objectives related to trade and economics:

  • Promotion of fair markets,
  • Inclusive growth, and
  • Sustainable development.

As reported in their strategy paper, their work will consist of publications, policy research and advocacy; the paper also highlights the role of government funding in the growth of their activities. It is clear that trade tariffs and conditions are a key matter for the BRICS countries, as they face protectionist measures from developed countries in sectors like agriculture or manufacturing, where they are actually more competitive. These agreements for a BRICS research group were confirmed at this year's New Delhi summit, where talks about greater public policy research were on the agenda.

There are other efforts that seek a common BRICS policy, and commitment to its development within these countries has been growing ever stronger. In Brazil, the BRICS Policy Center (BPC), founded by PUC-Rio and the City of Rio de Janeiro, is dedicated to BRICS studies through analysis and the promotion of cooperation between the bloc's governments and societies. The BPC receives visiting researchers and fellows from the other BRICS countries and has a very active agenda on economic, commercial, political and cultural subjects, publishing research papers, organising conferences, carrying out monitoring work, etc.

This is an interesting transnational initiative in which think tanks have been given a key role by their respective governments. Do think tank networks in other regions play similar roles?

Public funds for public policy research in Latin America: a study by Lardone and Roggero

Think tanks in Latin America are mostly dependent on private and foreign funding, while governments have no policy for funding them or the social sciences as a whole. This is the conclusion that Martín Lardone and Marcos Roggero reach in Vínculos entre conocimiento y política: el rol de la investigación en el debate público en América Latina (edited by Norma Correa and Enrique Mendizabal). In their study of the government's role in funding public policy research in Latin America, they found that governments in the region take a narrow view of research promotion: regular public funding is directed mainly at the "hard sciences" (biochemistry, medicine, agriculture, etc.), leaving a marginal share for the social sciences.

Lardone and Roggero identified two clear research funding mechanisms:

  • on-budget, programmatic financing on a stable, systematic and structural basis, which works along a long-term, permanent policy on research as a tool for a country's development and has a fixed allocation in the national budget through ministries, public agencies or universities; and
  • non-programmatic financing that works in an unstable and non-systematic way, funding researchers on a project-to-project basis.

The authors concluded that programmatic financing tends to favour research done in universities, well-established entities with fixed budgets, and “hard scientific” research. As an example, only 10% of projects approved by Colombia’s COLCIENCIAS, the national agency responsible for science and technologies, are related to social sciences and education, while the remaining 90% funds natural and exact sciences, engineering, medicine, agriculture, etc.

It is not surprising, either, to find efforts to promote in-house public policy research inside ministries and public agencies, such as in Indonesia, which follows a policy of Balitbangs, or government research units.

In Latin America, the large majority of think tanks are private and their finances are weak. They depend on private and foreign funding from international cooperation agencies and foundations abroad. As mentioned before, think tanks have difficulty accessing public funding. The most common way of getting public funds is by offering their services through short-term contracts and agreements, or sometimes by bidding for work on government projects. Unfortunately, this often means projects last no longer than a year, because governments are subject to one-year budget processes. On the other hand, many think tanks in Latin America prefer to keep their distance from government funds, citing autonomy and an independent agenda as key factors for their work.

Nonetheless, various new types of public financing for public policy research are appearing in Latin America, for example:

  1. Governments allocate funds coming from multilateral financers and international organisations (e.g. IADB, World Bank, UNDP) to research.
  2. Governments manage incoming funds from international cooperation and channel them to organisations (among them think tanks) through a bidding process. This type of management is being used in Bolivia, through the Vice-Ministry of Public Investment and External Financing. A similar system is employed in Colombia for projects of the Presidential Agency for Social Action and International Cooperation, where organisations participate on a voluntary basis.
  3. Governments channel their own funds through specialised public agencies, as in Costa Rica (that no longer relies on foreign funds) and Brazil, which has an ad-hoc agency for public policy research.
  4. Governments centralise demands for monitoring and evaluation and outsource this work to think tanks as a permanent policy. This is the case of Mexico's CONEVAL, the social development evaluation agency, and a similar approach is used in Chile.
  5. Parliaments decide which think tanks to finance as advisers to groups or commissions. This system has been criticised for benefitting think tanks that are related to political parties, but at the same time it is a way of compensating for the governing party’s access to public agencies and information. This system is applied in Chile.

Another year, another ranking of think tanks (and surprise surprise, Brookings is still the best)

I’ll accept that James McGann’s effort to identify and rank all the think tanks in the world has some positive outcomes. First of all, it has people talking about think tanks, and some think tanks are even becoming aware that there is a debate out there about themselves. Second… no, that is it. [Also have a look at Goran Buldioski's blog on the same subject.]

I am still of the opinion that going beyond the counting and study of individual think tanks (and their immediate systems) is useless and misleading. Here are five reasons why I do not support this ranking, and then a longer semi-rant at the document.

  1. Think tanks cannot be de-linked from their political, social and economic environment, since think tanks define themselves in relation to the other players in the system. Brookings cannot be described without reference to US bipartisanship: when we say independent research in the US, we mean independent of either party (as well as of other interests). But independent means something entirely different in China, India, Brazil or Argentina. Global and regional rankings are therefore unhelpful when the focus of think tanks is local (not local as in this town or neighbourhood, but local to their direct interactions).
  2. The list is too diverse to be relevant. The definition of ‘think tanks’ has improved since I last commented on it to include politics. But he has now included some organisations that cannot possibly be compared with the rest. Let’s put it this way: if I define a mobile phone as a device that allows me to make phone calls while on the move, I could be tempted to include laptops (after all, I can make Skype calls ‘on the move’), but I wouldn’t, because it would be confusing and unhelpful. A mobile is one thing and a laptop is another. Each may do things the other can also do, but that does not make them the same thing. Amnesty International, Human Rights Watch, Transparency International and the various foundations (funders rather than researchers) are all included… how useful is it to compare them with IPAR in Rwanda or GRADE in Peru?
  3. It is still based on perception rather than thoughtful analysis. Thoughtful analysis would have required the development of a database with answers to all the questions or criteria presented on page 56. These are good questions, but the nominators were not asked to provide answers to them, only to use them to think about their nominations. This means that it is all about presentation rather than content: still a popularity contest among people who clearly cannot know about every context and must therefore rely on what is accessible to them (this is obvious when one realises that most of the top non-US think tanks either focus on, or work under the banner of, international development, security and foreign affairs). The kind of analysis that I am attempting, and that Goran Buldioski, for instance, is undertaking in Eastern Europe, is absent.
  4. A ranking must have a clear definition of what the top spot implies: top 25 by revenue, by number of staff, by number of publications, by happiness of their staff, etc. It is the same as with sport: Usain Bolt is the fastest sprinter. The Ballon d’Or on the other hand is a perception based award given to the best football player according to the votes of coaches and captains of international teams, as well as journalists from around the world. So you either define why one wins or you define who votes; but you cannot keep both unclear or hidden.
  5. It is dangerous. It creates incentives towards investing in profile raising and visibility rather than focusing on research and research capacity. The director of a think tank that is not on the list emailed me, worried about their absence, to ask: what should we do? Given that they are one of the most influential think tanks in their country, undertake research of the highest quality and are running groundbreaking and innovative initiatives (copied all over the world), my answer is: nothing. And those who make it onto the list because they are popular rather than good are incentivised against doing anything about it, because they may believe that the list confers credibility on them.

My recommendation (if some sort of ranking is what we want) therefore continues to be the promotion of national think tank awards like the one run by Prospect Magazine. It is a shame, really: this project has the potential to collect fantastic data on think tanks, but because of the focus on the ranking a huge opportunity is being lost.

On the report itself, here are some preliminary comments after a single read (I promise to give it another go):

The first thing I notice is that at the top of the list are Brookings and Chatham House. I often go to their websites to find out a bit more about them, and I see that, yes, they have fantastic research and a wide range of products and are clearly at the top of their game. And when I can, I go to Chatham House events. So far so good, I guess. But then, second and third are Amnesty International and Transparency International. I know these organisations well. They are quite active in my country (Peru), but they are international campaigning NGOs, not think tanks. Transparency International participates in electoral processes as an observer. Is this the role of a think tank? Amnesty International campaigns for human rights and against their violation. I don’t think that researchers lobbying for more funds and freedom for think tanks in many developing countries would like their governments to think that this means more space for TI and AI to operate there too. Apples and oranges?

Then I remember that the winner of Prospect Magazine’s 2010 Think Tanks Award was the Institute for Government; I check the top non-US think tanks but find that, while there are other UK think tanks on the list, the Institute for Government is nowhere to be found. In fact, it is not mentioned in the whole document. That is odd, but OK, not all rankings have to agree. What about Policy Exchange? Policy Exchange was set up by supporters and members of the Conservative Party and was instrumental in developing the ideas that shaped the arguments that won the 2010 election and that are guiding the new government’s policy agenda. There is a fantastic indirect account of this in Peter Snowdon’s book, Back from the Brink. No, Policy Exchange is not listed either.

To make sure I am not missing anything I jump to the table for Europe (page 31) but no luck. They are not there. But the Overseas Development Institute is.

Now, as much as I like ODI, I am sure that it is not more influential than Policy Exchange. So, wait a minute: maybe this ranking is not about influence but about worth? About value? Reputation? Is it about finding the ones most capable of speaking truth to power? But why then have an index every year? What can change year on year to get a new one into the ranking? An annual index suggests that think tank quality can change in a short period of time, and that it is therefore possible for an unknown organisation to make it to the top if it happens to do all the right things. Is that possible in this ranking? CGD did it, more or less, on the basis of a good combination of research and communications. But is it possible for think tanks in small countries focusing on local issues? And is it really a worthy end?

The more I see Chatham House and other security and international relations think tanks, the more it feels as if the theme of this year’s ranking is foreign policy or international development; maybe that is what this year was about. Or maybe this is what the annual ranking should be about: focusing on a single theme so that more and better analysis can be done for each think tank.

Never mind, let’s get back to it. On to Latin America, which I know a bit. The list includes the Centro de Estudios Publicos (CEP) from Chile, the Centro de Implementacion de Politicas Publicas para la Equidad y el Crecimiento (CIPPEC) in Argentina, the Instituto Libertad y Democracia (ILD) in Peru (which, by the way, appears at both 15 and 24), and CEPAL (the UN’s Economic Commission for Latin America and the Caribbean, or ECLAC in English). This is interesting. CEPAL is the only truly regional policy research centre on the list, but it is a UN body. CEP and CIPPEC are clearly focused on their own countries, and they are certainly influential there, but not in my country, Peru. And ILD was influential (granted, it has been one of the most influential organisations in the world, led by its director Hernando de Soto), but it has almost no public presence in Peru and cannot really be compared with other Peruvian and Latin American think tanks if one quickly browses through their work and publications. ILD is a fantastic analysis-based consultancy working across the developing world on the basis of some research done in the 1980s. If they make it to the top of the list, it is far more interesting to find out why this is the case than to note their place in the ranking: is it because this is what policymakers value, or were the respondents from Africa or Asia, where they do most of their work?

In any case, policy in Peru is influenced by (among think tanks) CIUP, GRADE (which is mentioned), IEP, and others that are not on the list. This is a perfect example of visibility: it is my impression that GRADE is quite successful in reaching audiences in DC and London and is therefore well known globally, while IEP and CIUP may be more focused on domestic policy debates and hence less well known beyond the country or region, or beyond certain research communities. This probably reflects their origins, mandates and business models. So even within a country, comparison is difficult. Who is to say, though, whether one is better than the other based on its choice of audiences? [This section has been edited; see comments below.]

Back to Latin America (and, for that matter, Europe). In Latin America there isn’t a regional government, so what is the point of a regional ranking? Say the top think tank is Brazilian: is it informing the Chilean government? Is it valuable for Colombia? Maybe in Europe ‘European think tanks’ make more sense, but then is this why domestically focused think tanks are not mentioned? Clearly, international reviewers would not know who the movers and shakers of Peruvian, British or Spanish policies are. (Again, a point in favour of national awards.)

So maybe the regional focus has little to do with where the think tanks do their influencing and more to do with, quite simply, where they are based. But if this is the case, then once again we’d be separating think tanks from their context, and this is not right.

And now on to Africa. This list looks a bit messy, to say the least. The first seven are from South Africa (no surprises there). But number 8 is a regional research network made up of researchers based in think tanks across Africa; I’d like to call it a think tank, but I am not sure how it compares with the others. And then the list includes a few organisations that can hardly be called organisations at all and are only popular or known because they are among the only ones in their countries. Others are in the process of getting there, but are not there yet. A tiny bit of analysis would have provided sufficient information to disqualify them from any ranking, and to identify many others who may be more worthy of a mention.

Anyway, what is the point of saying that organisation xyz is among the top 25 in Africa? How does it compare with the Latin American ones, for instance?

What happened to the debate on think tanks in South Asia? I’ve been avidly following a great debate in Indian newspapers on think tanks that suggests a fantastic opportunity for a study such as this one. And how useful is it to compare them with think tanks in East and Southeast Asia? In fact, how useful is it to compare think tanks in China or Vietnam with those in Japan, Indonesia and South Korea? Our overview study on think tanks and politics in the region showed foundational differences between them that merit more rather than less national focus.

The lack of analysis is telling of the limits of this type of research. A country or region focused study (rather than ranking) would have been much richer and useful.

The thematic rankings are also quite interesting. The fact still remains that one cannot separate theme from politics -and politics are always local.

I would have loved an explanation for Chatham House coming ahead of IDS in the ranking on international development. Chatham House is by far a better think tank than ODI and IDS on foreign policy (and, let’s face it, it is a fantastic think tank in general and its contribution to the international development debate is invaluable), but given that international development policy is still largely dominated by DFID, that DFID’s research programme is dominated by IDS and ODI (and not Chatham House), and that IDS alumni roam the corridors of DFID, I cannot understand the ranking. More explanation is needed, please.

Also, why is Fundacao Getulio Vargas included in this table? It is not focused on international development policy; its focus is simply on policies. The international development prefix is added by ‘northern’ organisations to describe policies for or of developing countries. FGV deals with economic, business and legal research for the development of Brazil. How is this different from the research done by Brookings or IPPR for the development of the US and the UK respectively? (Patronising?)

Also, FGV is included at the foundation level, not at the level of its centres or programmes; however, the Pew Research Center rather than the Pew Charitable Trusts is included. Why? I would suggest that it has to do with the narrow and shallow focus on a global index instead of a desire to understand the richness of these organisations’ histories.

Then it gets confusing: think tanks appear in more than one category but at totally different levels, and others one would expect to find are gone. Yes, this is all possible, as most think tanks are good at one thing rather than at everything; but Chatham House, for example, is the top UK think tank in most lists yet sits behind the International Institute for Strategic Studies when it comes to its core area of expertise: foreign policy and security. This makes no sense.

The potentially most useful list (domestic economic policy) ends up being a US focused one. This further illustrates the limitations of a global ranking and its bias towards international development and foreign affairs think tanks that are more easily identifiable in the blogosphere or more popular communication channels than domestically focused ones.

Then the special categories. Most innovative policy idea: a great category, but what have they been nominated for? What was the idea that got Brookings to the top? Again, another missed opportunity to provide intelligent insights into the rich and complex reality of think tanks. The same goes for the outstanding policy research programme category. Which programme got ODI 15th place? ODI has quite a lot of programmes, and also projects that we call programmes because they are larger than the usual small projects we run. So which one was it? The Africa Power and Politics Programme? The Research and Policy in Development Programme? The Humanitarian Policy Group’s Integrated Programme? The Chronic Poverty Research Centre? It is important to know, because some of these are delivered with other organisations, so ODI could not take all the credit.

I got a bit bored and jumped over some tables until I got to the best government-affiliated think tank: WBI? Nice to know that the World Bank is considered a government. If the WB is a ‘government’, would the UN not be one too? (UNU-WIDER and CEPAL are in the other tables.) What about think tanks entirely (or almost entirely) funded by their governments or by international cooperation?

And then, party-affiliated think tanks, an important addition to any work on think tanks that merits an entirely different post. What does affiliated mean? Does this include conservative think tanks in the United States like Heritage, or the Conservative Party’s Central Research Department? And wouldn’t CASS and VASS (the Vietnamese equivalent of CASS) be part of this category? After all, they are affiliated to the Communist Party, and Chinese and Vietnamese line ministries have their own think tanks.

I don’t want this to be a totally anti-Go-to-Think-Tank-of-the-Year rant. As I said before, the ranking has created an opportunity for debate and discussion on think tanks and this is good. But this ought to lead to a proper discussion about think tanks, the roles they play and how they may be able to contribute to their contexts (local and/or global).

The list of questions and criteria on page 56 is the best part of the document and an important contribution to the think tanks debate. It provides a guideline of sorts for studying think tanks in greater detail and promoting a more intelligent debate. Focusing on the list and the ranking, I think, robs us of James McGann’s and his team’s undeniable capacity to do this and leaves us with a bitchy Oscar-nominations season for researchers.
