In conversation with Carlos Castaneda: promoting and using evidence in Colombia

By Hub LAC
16 January 2024
SERIES Voices of evidence users

Welcome to the ‘Voices of evidence users’ interview series, offering first-hand insights from the people who use evidence in decision-making.

In this conversation, Hub LAC talks to Carlos Castaneda about the role and mechanisms of evidence use and generation in Colombia, and in particular the synergy between academia, citizens and decision-makers.

Carlos is an economist from the University of Antioquia. He has a master’s degree in economics from the University of Los Andes and a master’s degree in public and social policy from Pompeu Fabra University. He is currently the Director of Public Policy Monitoring and Evaluation at the National Planning Department of Colombia (DNP).

1. What are the priority issues or problems in your organisation? 

The priority issues are the evaluation of strategic programmes linked to the country’s National Development Plan (NDP). Indeed, one of the responsibilities of the DNP is to draw up this plan, together with the Ministry of Finance and the other sectors of the national government.

The DNP defined five strategic lines and two cross-cutting lines on which we believe the country should focus for development in the medium and long term. These lines are:

1) Land management around water and environmental justice (climate change)

2) Human security and social justice

3) Human right to food (transformation of the agricultural sector, food production and food security)

4) Productive transformation, internationalisation and climate action

5) Regional convergence 

All of this is framed within two cross-cutting themes: comprehensive peace and macroeconomic stability. We also work within the framework of the 2030 Agenda and its Sustainable Development Goals (SDGs).

2. What role has scientific evidence played in addressing these priority problems or issues, and what partnerships or synergies have supported or enabled this process?

A concrete example is the use of evidence gap maps to identify which actions have worked around the world in relation to citizen-income programmes.

I mention this because Colombia intends to migrate from conditional cash transfers to unconditional cash transfer strategies. This is one of the ways we make evidence available to decision-makers for future programmes.

Another example is the Mi Casa Ya programme, a housing programme that subsidised middle- and lower-middle-class households so that they could obtain housing.

We conducted an impact evaluation about two years ago, and its results enabled the Ministry of Housing to make decisions that allowed the programme to be restructured.

Regarding the types of evidence, these can differ depending on the degree of maturity of the programmes. We try to be innovative in the way we provide or transmit evidence to decision-makers: sometimes an evidence map is more useful, sometimes an infographic or a video.

All the evidence we generate is taken to the decision-making bodies, to the council of ministers and to the medium-term expenditure framework committees, which is where the budget allocated to each sector is defined.

We work in several stages to identify where evidence is needed and, as mentioned above, we rely on the different agencies within the ministries, on consulting firms, and on national institutions and think tanks such as Fedesarrollo, the National University of Colombia and the Javeriana University.

We also partner with international agencies and institutions such as the Global Evaluation Initiative, the University of California, Berkeley, the University of Toronto, and professors associated with the University of Chicago.

Regarding the involvement of the citizens’ perspective, we have a citizen perception survey that is conducted every year, through which we seek to understand how evidence and the government’s offer are used. This allows us to monitor the gaps between what the government states it provides and what citizens perceive.

For example, data from these surveys about five years ago, on the follow-up of Ministry of Education actions, showed that reported primary school coverage was 90–95%. Yet when we asked mothers at the time of the survey whether they had been able to enrol their school-age children, 50% said no.

This is a gap between what citizens perceive and the government’s real offer, which means that other strategies must be adopted.

Everything we generate is public: citizens can download reports, data, surveys and so on, to replicate exercises or to go deeper into the results. We want citizens to come closer, and for the government to participate as well.

3. What are the main challenges and opportunities you identify for institutionalising and systematically incorporating evidence into the decision-making process?

Among the existing challenges, developing rigorous evaluations requires more time than we actually have.

Each year, we receive between 170 and 180 requests for evaluations; we do not have the capacity to carry them all out, so we prioritise according to the criteria defined in CONPES 4083 (2022) and select between 15 and 17 evaluations per year.

To support this process, we try to maintain databases that let us perform exercises quickly. In this sense, when there is no academic research available, we incorporate faster tools, such as machine learning, that allow us to generate evidence in a timely manner.

Another challenge is that the use of evidence has been anecdotal, as there is no systematic measurement of how it is used. We are developing an evaluation use index that will allow us to measure how and when the evidence we generate is being used. This will let us see when recommendations are implemented and how changes take place.

On the other hand, there is not always political will among decision-makers; in other words, the guidelines of a programme may not require it to be evaluated.

Another major concern is finding the best way to communicate evidence results, because the documents are often very long and risk not being read. We are already displaying evidence on different platforms to reach the general public with short, eye-catching information.

Among the opportunities, we have robust tools with the technical quality needed to inform decision-making, and we share them with countries such as Costa Rica and Peru in order to keep up to date with what is happening in neighbouring countries.

And, finally, in Colombia we have CONPES 4083 of 2022 on strengthening the use and institutionalisation of evaluations for decision-making. This gives us a legal framework that obliges us to implement all actions related to evidence, and it offers an overview of the use of evidence in the country, along with improvement actions to consolidate our monitoring and evaluation system (Sinergia).

4. What advice would you give to researchers and decision-makers who want to improve the impact and use of scientific evidence in policy decisions?

My advice to researchers is to design evaluations so that their policy recommendations can actually be used, and to employ mechanisms that communicate what is most relevant. For example, a matrix for the use of recommendations is advisable, as it encourages presenting results to experts in a clear and simple way.

My advice to decision-makers is to better understand what evaluations and the different types of evidence are reporting. For example, around 90% of decision-makers claim that what they need is an impact evaluation, yet when we clarify what such an evaluation involves, they realise that they actually need a different type of evidence.

Decision-makers need to be better informed about the evaluation method that their programme requires in order to reach a common agreement with those in charge of the research. 

Finally, it is necessary to mention the need to establish synergies between researchers and decision-makers to discuss and analyse the different types of evidence.

Contact: [email protected]