Evidence. Influence. Impact. & a good story to tell

Last week, the International Initiative for Impact Evaluation (3ie) released its annual report under the well-established title Evidence. Influence. Impact. The report once more brings together a rich collection of evidence products and accounts of policy influence in the quest to improve lives through impact evaluation. It sheds light on the main successes and lessons of the NGO's work in fostering evidence-informed development policies and programming in 2014. I recommend the report in general but want to single out, for our South African Network members, the case study on page 30 of the report: A youth wage subsidy experiment for South Africa.

This case study should be textbook material for anybody interested in the policy/research nexus in South Africa. It discusses the findings and subsequent policy uptake of an RCT on the impact of youth wage subsidies in South Africa (Levinsohn et al. 2014). Alas, while the findings of the study have received extensive coverage, few people are aware that Levinsohn and colleagues’ youth wage subsidy RCT also marks a milestone for the evidence-informed policymaking community in the country. When the South African parliament adopted the Employment Tax Incentive Act on 31 October 2013, the South African political economy was turned upside down: the ANC as the ruling party backed the policy despite heavy opposition from its trade union alliance partner in government, in effect openly siding with the official opposition party, which had implemented a similar policy at provincial level just months before. What watershed moment could have turned the tables in government so starkly? As it appears, rigorous and context-aware research evidence might have had a finger in the pie.

In a nutshell, the study evaluated whether wage subsidy vouchers could help increase youth employment. The RCT design itself is fascinating, but so are the findings: ‘one year after allocation, young people with the voucher were seven percentage points more likely to be in wage employment than those without the voucher’. These effects persisted even a year after the experiment ended, and the bottom line strongly supports the conclusion that youth wage subsidies do work to increase youth employment in South Africa. So far, so good; where it gets really interesting, though, is once one considers the political economy context in which these findings managed to inform policy.

From the onset of the study, it was obvious that the question of whether youth wage subsidies work had already been answered by politicians. Left-wing and trade union factions of the ruling party made it clear that they regarded the policy proposal as part of a wider ‘neo-liberal agenda’ aiming to undermine the power of employed workers in the country. On the other side, the official opposition party as well as employer organisations supported the subsidy, believing it could create more jobs in the country. The government, at the same time, faced public pressure to address the high rates of youth unemployment without yet having tabled explicit policy options or programmes. Youth wage subsidies enjoyed support within the National Treasury in particular. Arguments between these different factions were heated, even physical at times. This was the policy climate in which Levinsohn and peers’ experiment was supposed to compete for attention and, even more, achieve policy influence. Evidence to the rescue – a rather daunting task for a single research study!

Yet, in what was certainly not a supply-driven model of evidence uptake, the research team produced a marvellous piece of evidence; introduced it in what can only be labelled a *challenging* policy context; and the outcome (without confusing an anecdote with evidence of direct attribution) was a national policy suspiciously similar to what the research team had recommended. I think there is good reason to regard this as evidence-informed policymaking par excellence, and that as advocates of evidence use we can take away three and a half main lessons from this successful case of policy influence:

  • Half a lesson goes to the formulation of an explicit Policy Influence Plan (PIP) prior to the design of the study, and to its constant updating throughout the research. Here is Levinsohn and peers’ PIP; the formulation of such a plan is in fact mandatory for all 3ie-funded evidence products, making this practice increasingly common.
  • Invest in a rigorous research design: Knowing that the findings of the experiment would be hotly disputed by either political faction, the study applied the most rigorous research design possible and communicated each step in the research process transparently. This rigour and transparency safeguarded against the dismissal of the findings on methodological grounds and paved the way for the study’s results to enter the ongoing policy debate.
  • Ensure high-level legitimacy and dissemination: The research team was perceived by most factions as an independent actor in the debate. While funded by the National Treasury, the research was operationally independent. At the same time, each actor in the debate was aware of the conduct of the study, the rigour of its design, and the transparency of its publication. In practice, the research team had to go to great lengths to ensure its independence, as an example from the PIP shows: ‘In order to mitigate this [trade unions’] explicit opposition we have invited representatives of organised labour to be on the steering committee for this project. They refused. We therefore had to drop organised business from the steering committee to avoid criticism of favouritism’. With awareness raised at such a high level, the dissemination of the study results could be targeted at the key actors in the debate. This also ensured that major media outlets were interested in covering the research.
  • Embrace the political economy: The research team actively engaged with the political economy factors likely to limit the relevance of the study’s findings in the policy arena. For this purpose, the team investigated the main arguments of the parties opposed to the youth wage subsidy, in this case ‘that subsidies for younger workers will displace older workers, and that since the subsidy is temporary young workers will be discarded once their subsidies end and will be replaced by other subsidy holders’ (taken from the above PIP). The research design was then adapted to gather evidence explicitly on these two points of criticism, in addition to the main research question. Instead of merely showing that the youth wage subsidy does indeed create jobs, the research team was also able to engage with the main critics, illustrating that this job creation did not displace older workers and that firms did not lay off young workers once the subsidy had stopped.

In this year’s state of the nation address, President Jacob Zuma highlighted the Employment Tax Incentive and its impact on the employment of young workers as part of the government’s ‘good story to tell’. I would argue that the adoption and implementation of this act is similarly a ‘good story to tell’ for everyone interested in evidence-informed policymaking in the country.