What factors influence effective evidence use in teacher training institutions?

26 March 2024

Despite the powerful contribution that good data can make, it is well documented that the use of evidence in decision-making remains low.

Specifically, how teacher training institutions use evidence has yet to be thoroughly studied. In Argentina, the teacher training system is generally considered opaque and under-studied.

Given the challenges raised in the literature from 2011 and 2014 about the system’s social utility, quality and excessive scale, it’s surprising that literature about how Argentinian teacher training institutions use evidence is so scarce. 

However, previous research (from 2009, 2012 and 2013) has shown that certain preconditions are necessary for educational institutions to use evidence effectively.

In 2021, I conducted a study into how these conditions are related to evidence-use in eight teacher training institutions in Buenos Aires.  

I took advantage of the fact that the National Ministry of Education had produced individual reports for each institution, which detailed the results of the first national assessment of trainee teachers.  

In my study, I investigated how the eight institutions have used these reports.  

Key findings 

Overall, I found that the level of appropriation and use of evaluative evidence was not homogeneous across the eight teacher training institutions in this study: some institutions ranked higher than others in their use of evidence. 

Following the theoretical framework, I found that the institutions that benefitted the most from the reports were characterised by the following distinctive features:   

  1. Employing individuals experienced in using data
  2. Having leaders with evidence-oriented mindsets  
  3. Having a collaborative culture among professors
  4. Giving professors the autonomy to implement changes
  5. Providing information and supportive internal structures
  6. Accepting externally produced evidence
  7. Habitually using data in decision-making
  8. Personalising their institutional projects according to context
  9. Promoting a strong sense of belonging among staff
  10. Utilising effective communication among all staff 

Numbers 1–7 reflect the preconditions referenced in the existing literature, under three of the dimensions described therein: the individual, institutional and evidence-properties dimensions. 

Numbers 8–10 are factors that I observed during my field work.   

All of these features are exemplified in the results below.  

My findings are consistent with the common reality of public policy implementation: what is planned centrally, at the macro level, does not always translate into concrete changes in the habits and behaviours of actors at the micro level.  

In this case, the minor behavioural changes recommended in the national Ministry of Education’s reports were only implemented by a group of institutions with the aforementioned characteristics. 


My study

To answer my research question – How have these institutions used these reports? – I designed a stratified sample based on the institutions’ size (large/small) and management type (private/public).  

I then conducted three interviews per institution with the deans, professors and other staff.  

My analysis allowed me to categorise the institutions as either high or low users of evaluative evidence. The indicators behind this categorisation, roughly ordered from most to least intensive use, included the following: 

  1. The development of a project after receiving the government report  
  2. The use of group discussions about the report’s findings
  3. The individual professors’ use of the report in class
  4. The dissemination of the report among the professors, but with no discussion among them
  5. Satisfaction with the results/assessment in the report but no further use 
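The categorisation above can be thought of as a simple scoring rule. The sketch below is purely illustrative: the indicator names, weights and threshold are my own assumptions for the example, not the actual coding scheme used in the study.

```python
# Illustrative sketch (not the study's real coding scheme): classifying an
# institution as a high or low user of evaluative evidence, based on which
# of the five indicators above were observed there.

# Assumed weights reflect intensity of use: developing a project after the
# report signals deeper engagement than dissemination without discussion,
# and satisfaction with no further use signals none at all.
WEIGHTS = {
    "developed_project": 3,   # indicator 1
    "group_discussions": 2,   # indicator 2
    "used_in_class": 2,       # indicator 3
    "disseminated_only": 1,   # indicator 4
    "satisfied_no_use": 0,    # indicator 5
}

def classify(observed_indicators, threshold=3):
    """Return 'high user' or 'low user' given the observed indicators."""
    score = sum(WEIGHTS.get(ind, 0) for ind in observed_indicators)
    return "high user" if score >= threshold else "low user"

print(classify(["developed_project", "group_discussions"]))  # high user
print(classify(["disseminated_only"]))                       # low user
```

In a qualitative study such as this one, the actual judgement is of course interpretive rather than numeric; the sketch only makes explicit the intuition that some indicators carry more weight than others.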


According to my classification, I found that three out of the eight institutions were intensively using these reports.  

The aforementioned features of the highly rated institutions are exemplified below. 

1. Technical capacities in the use of data

The highly rated institutions employed individuals who were experienced in using data.  

In the following example, I spoke to a professor who already had experience in using evaluation reports in the primary and secondary education system:  

“Because, that is to say, I am working now with… with the Aprender tests in Mathematics. That is to say, I work, I am a generalist trainer [for using reports in schools].”

2. Institutional leadership 

Having leaders with evidence-oriented mindsets was found to be a common feature in the institutions that used evidence effectively. 

One positive example of this came from an interview with one of the deans of a public institution. They told me about the training work that they conducted with their professors after receiving the report: 

“With a team that I insisted a lot on transforming…one of the priority issues where they put a lot of emphasis was evaluation.”  

Their decision made it possible to work on an issue that was prioritised for their institution in the government report: evaluation. 

3. Collaboration among professors  

 A collaborative culture was noted among the professors in the highly rated institutions.  

One positive example of collaboration can be seen through the following quote from a professor that I interviewed: 

“Sometimes we get here early, at five o’clock. And between five and five thirty we have a little while, and well, we talk about these things, we talk about the students, what difficulties they have, what happened with this one, sometimes we go to look for a student who drops out.”  

 One point to bear in mind, however, is that for collaborative practices to work most effectively they should take place within the working day. 

 4. Perception of autonomy to implement changes  

 Effective evidence use was seen to be higher when the professors perceived that they had the freedom to address issues and implement any necessary changes.  

For example, one professor from a private institution described how the staff identified a problem in their assessment practices, came up with a solution and implemented a change: 

“So, one idea was not only to target teaching practices…What we were lacking a bit was the issue of learning assessment. So that’s where we decided to aim for.”  

 5. Available support and infrastructure  

Readily available information and internal structures that support evidence use were also found to be a key characteristic of the highly rated institutions in my study. To collect data about this, I used observation guides rather than interviews.  

In one positive example, timetables, articles and other documents related to the management of the institution covered the walls of the dean’s office. The institutional project was also readily available.  

In another example, the institution had a bulletin board with information on daily activities and an up-to-date website and social networks. In addition, the dean had a notebook and a large table as a desk, which was also used for team meetings.  

In contrast, the walls of one small public institution’s meeting room were covered with black-and-white pictures of former deans from the last 30 years, an environment far less conducive to effective evidence use. 

 6. Externally produced evidence  

The highly rated institutions did not dismiss the report’s validity simply because it was externally produced, or produced by a governing party with an ideology opposed to their own, whereas other institutions did.  

For example, in one institution, the issuer of the measurement tools and report – the national Ministry of Education – was seen as illegitimate; accordingly, that institution did not receive a high rating for evidence use.  

One professor told me, “It seemed as if the person who sent the assessment tools was not internalised with the level.”  

In addition, a secretary from a private institute argued that teacher evaluations should be implemented by the provinces rather than the national state, as the provinces tend to be closer to the institutions, which depend on them functionally.  

7. Use of multiple sources of evidence  

In the highly rated institutions, I observed that the deans and professors habitually used multiple sources of evidence in their decision-making: surveys, regular observations of classes, and statistical analyses of internal student-performance data.  

For example, one vice dean described the student surveys that they carry out every year to identify what aspects of the course need to be improved. 

 8. Relevant, updated institutional project  

Another characteristic of highly rated institutions was that they actually discussed and personalised their institutional projects, rather than just adopting a project that was acceptable as a matter of form.  

For example, one professor reported that their institutional project is a document that was constructed by consensus and for the specific context of that institute:  

“There’s no chance of using it as a ready-made recipe. Like saying ‘Oh, this is a nice institutional project’. Well, yes, that is no good for us.” 

9. Sense of belonging  

In the institutes that were highly rated for their evidence use, a recurring theme among the staff was that they feel part of the institution.  

For example, one professor said, “yes, of course, of course, here we do everything like this”, when asked about an action that referred to him specifically, indicating a team mentality. 

 10. Fluid and articulated communication 

Effective communication was the most striking difference between the training institutions that used evidence effectively and those that did not.  

Where communication was found to be ongoing, useful and inclusive of the whole staff – the deans, teachers, professors and clerical staff – the use of evaluative evidence was clear.  

For example, in one highly rated institution, the dean told me, “We have regular meetings with professors and we strengthen each area. I work in the pedagogical part. We complement each other a lot in the team.” 

Conversely, in one institution that was rated as a low evidence user, I observed a lack of communication. One professor told me, “You just find out what has been built (in the institutional project) or you are notified in the minutes book.” 