Beyond the Hype: Lessons and Challenges from MineduLab’s Quest for Evidence-Informed Policy-Making

4 November 2024

MineduLab, launched in 2013 within Peru’s Ministry of Education, is often lauded as a pioneering model of a policy lab aimed at enhancing evidence-based policy-making in education. With support from the Abdul Latif Jameel Poverty Action Lab (J-PAL) and Innovations for Poverty Action (IPA), it sought to test, evaluate, and scale educational interventions. While its establishment was seen as a significant step towards institutionalising evidence use in Peru, the narrative of its unqualified success requires closer examination. 

Most of what has been written about the MineduLab is based on its original design and intentions.  

As I wish to argue in this article, MineduLab’s experience provides a nuanced understanding of both the potential and the limitations of the policy lab model, revealing critical insights that challenge the prevailing notion of its effectiveness.

By critically examining these challenges, we can better understand how to design policy labs that not only generate evidence but also ensure its effective and ethical use in shaping educational policy and practice.

The promise of policy labs in education

Policy labs like MineduLab are designed to bridge the gap between research and policy by providing a structured environment to test innovative solutions using rigorous evaluation methods such as randomised controlled trials (RCTs). The model has evolved since, of course, and these embedded units now use a broader set of research tools and approaches. 

The context for the formation of the MineduLab was promising. The Ministry of Education at the time was led by Jaime Saavedra, a former researcher and World Bank education expert with close ties to the global education research community. His team was riding the wave of evidence-informed policy in Peru, and several had previously worked in the newly launched MIDIS (Ministry of Development and Social Inclusion), which had embedded evidence use into its design. Peru’s government and research sector was also, by global standards, ahead in its capacity to generate socio-economic and administrative data.  

Precisely by leveraging pre-existing administrative data, MineduLab was able to quickly pilot various interventions, such as programs aimed at improving students’ growth mindset or the maintenance of school infrastructure through SMS-based systems. This approach facilitated the rapid generation of evidence on the effectiveness of these interventions without the need for new data collection, showcasing the lab’s agility in responding to educational challenges.

MineduLab’s operational model aligns with the global goals of policy labs, which aim to institutionalise evidence use within government structures. In a recent review of policy labs in education, OTT highlighted the importance of these labs in strengthening the connection between evidence and policy implementation. This involves not only generating relevant evidence but also promoting its use and building the capacity of policymakers to integrate findings into practice.

Scaling: The myth of seamless implementation

In 2016, MineduLab was recognised with the “Think Tank to Watch” award by the judges of the Premio Poder al Think Tank del Año—an award promoted by OTT.

Despite the lab’s initial successes, the reality of scaling its innovations has also been fraught with challenges. 

An early warning came from a conversation I had with a long-time civil servant at the Ministry of Education. He approached me during a meeting of Peru’s Alliance for the Use of Evidence (now Semana de la Evidencia) and explained how MineduLab had been parachuted into the ministry with little attention to the efforts to promote the use of evidence that already existed. He complained that a whole team already dedicated to generating and using evidence from research and evaluation had not been consulted and was now excluded from the policymaking space. And, he argued, the MineduLab was disconnected from the rest of the sector and from where implementation took place.

While it is true that MineduLab was committed to scaling successful interventions, the process has been anything but straightforward. For example, the “Expand Your Mind” programme, aimed at encouraging a growth mindset among students, showed positive results in trials but struggled to achieve broader implementation. When I spoke with former senior members of the MineduLab team in 2022, they reported that of the 21 innovations tested by the time they left the Ministry, only four had been scaled, with scaling primarily dependent on opportunity, timing, and leadership changes.

This issue reflects a broader challenge faced by many policy labs: the lack of a dedicated team or an institutional strategy for scaling successful interventions. 

The struggle with scaling is not unique to MineduLab. The broader literature on policy labs (and evidence-informed policy more generally) points to the need for alignment with national budget cycles, the involvement of local stakeholders, and the establishment of robust implementation frameworks as critical factors for successful scaling. Without these, even the most promising interventions may remain confined to the pilot stage, unable to make a tangible impact on the educational system at large.

MineduLab, given its origin and focus on RCTs, never quite managed to get into the rhythm of a highly political and complex sector like education. 

The role of context and institutional integration

One of the key lessons from MineduLab’s experience is the importance of institutional integration. Despite being housed within the Ministry of Education, MineduLab’s influence on broader policy decisions has been limited. After an initial golden period when the ministry was led by Jaime Saavedra and two other technocrats who followed his lead, the lab has been increasingly isolated and disconnected from the core political, policy and budgetary processes that drive educational reform in the sector. As hard evidence of this, the most recent publication on the MineduLab’s website is from 2018.  For soft evidence, I’ve relied on conversations with education policy experts in Peru. 

The notion of a policy lab as an “embedded” unit within government is central to its potential impact, but MineduLab’s experience shows that mere physical or organisational embedding is insufficient.

The recent review by OTT for the Jacobs Foundation on policy labs underscores this point, noting that effective policy labs must be strategically positioned within the organisational structure and the political and policymaking ecosystems themselves. This means being involved in the processes where key educational policies are debated and decided rather than functioning as an adjunct unit focused solely on the generation of evidence.

This calls for something more akin to the No. 10 Policy Unit, which operated at arm’s length from political power at the heart of the British government.

There is a parallel case in Peru. Not far from the Ministry of Education is the Ministry of Development and Social Inclusion (MIDIS). MIDIS has a similar lab hosted by a larger evidence division, Evidencia Midis. This team runs the ministry’s efforts to generate and use evidence from research and evaluation. Unlike the MineduLab, this team, including its lab, has maintained its position within the Ministry—even as its political leadership experienced the same churn as in the education sector. Part of the explanation is that its function is outlined in the Ministry’s organic law. In other words, it is part of its DNA—not an add-on.

Ethical considerations 

Another aspect that has been under-explored in the dominant narratives around MineduLab is the ethical dimension of its interventions. Early on, the lab’s focus on rapid results sometimes overshadowed critical ethical questions: a rather informal approach to sharing government data, consent from participants, and equitable access to the benefits of interventions. For example, some programmes provided additional resources or opportunities to a select group of students, raising questions about fairness and inclusivity, and researchers were given access to the mobile numbers of teachers and parents to test the impact of SMS messages.

These ethical considerations are crucial for the legitimacy and acceptance of policy labs and must be integrated into their design from the outset. According to former staff I spoke to, if these issues had been addressed at the time, many of the experiments could not have been run.

What about local research?

I was at a conference in Santo Domingo when one of the model’s international proponents spoke to a conference hall full of researchers and evaluators. He presented the model through which J-PAL was able to deliver high-quality evidence to governments and spoke at length about the MineduLab and how it allowed the Peruvian government to access this evidence from researchers from the best universities in the world ‘for free’. 

The audience gasped. Afterwards, a participant asked me: ‘How are we meant to compete?’

In Peru, local researchers have often complained that MineduLab’s design and operation in its early years made their participation impossible. 

First, the advisory board of the MineduLab was made up of international researchers. Second, the “innovation window” opportunities were shared through the J-PAL network rather than publicly or via local education research networks – e.g. Peru’s Sociedad de Investigación Educativa Peruana. At its launch, the small MineduLab team simply did not have the capacity to manage an open call themselves. 

Third, even when the opportunities were more widely shared and open to anyone, the Peruvian government did not pay for the experiments. Funding had to be secured by the researchers themselves. Peruvian researchers, who had the capacity to undertake these experiments, struggled (and still struggle) to access the necessary funds; something that researchers affiliated to the J-PAL network and from well-funded universities had no problem with. 

This changed over time. The MineduLab team responded by launching more innovation windows locally, and some research teams have involved Peruvian researchers. 

But this initial period was an important lost opportunity to build a constituency that could have supported the MineduLab and its objectives, as changes in the ministry’s leadership pushed it out of the decision-making spaces it was meant to inform. 

Its initial exclusionary design, therefore, undermined the opportunity to help strengthen local research capacities and secure the MineduLab’s own sustainability.

Not one Lab

A crucial element of the story is that the MineduLab is not a single model. Since its inception, it has adapted to its context and has served many purposes. 

In my opinion, it has served multiple purposes for different actors concurrently, prioritising them according to the relative power of the actors involved. For instance, in its early days the MineduLab, like other efforts to support RCTs and impact evaluations, was just that: a vehicle to channel funding to researchers interested in this type of research. At the same time, it was presented as a vehicle to help policymakers in the government incorporate evidence into programme design.

Consequently, the initial design prioritised the delivery of high-quality impact evaluations, even as the challenges described above became apparent to the Lab’s local team.

As the initiative’s original academic and political sponsors left and the local team grew stronger and more experienced, changes were made to the model: more openness to local researchers, for instance. 

However, the rapid re-politicisation of the Ministry of Education’s top brass after 2016 brought about new challenges that saw the Lab’s influence and capacity wane.

Moving beyond the success narrative: lessons for the future

This is not an account of failure. The very fact that the Lab was established and that it has survived for so long in Peru’s weak institutional context is a sort of success, even if many things did not work as planned. MineduLab’s journey offers several lessons for designing and implementing future policy labs in education. For example: 

  1. Data infrastructure and management: Peru’s prior investments in administrative data made it possible to deliver early wins for the Lab. But, while leveraging existing data can facilitate rapid testing of interventions, a more comprehensive approach to data management that includes non-state actors is necessary to address a wider range of educational issues. 
  2. Strategic political integration: Policy labs must be embedded not just physically but strategically within the political and policymaking ecosystem. This involves active participation in policy discussions and alignment with budgetary and administrative processes. MIDIS’ experience offers lessons for future EdLab design. 
  3. Scaling requires thinking and planning: Successful scaling of innovations is not guaranteed by evidence alone. It requires a dedicated strategy, resources, and alignment with broader institutional priorities. What is meant by scaling, too, needs to be properly discussed: is it reaching more beneficiaries or deepening the impact of an intervention? The Lab may not be best placed to lead this. 
  4. Ethical frameworks: Incorporating ethical considerations into the design and implementation of interventions is crucial for their legitimacy and acceptance. With time, these ethical considerations will likely become harder and more costly to address.
  5. Broader engagement with the local policy research community: Efforts must be made to actively involve local researchers and stakeholders, moving beyond reliance on international partnerships to build sustainable, locally-owned research capacities.

In conclusion, while MineduLab has made significant strides in promoting evidence-based policy in Peru’s education sector, its experience also highlights the complexities and challenges of institutionalising evidence use. 

We believe that more independent research and evaluation of these efforts is needed to help ensure that future investments are effectively informed by evidence. OTT is committed to understanding why interventions worked and not just whether they did.