At Simavi we are keen to take full responsibility for demonstrating the impact of our work. But what exactly is this ‘impact’? And how do we judge whether a project has impact or not? We are implementing a new approach to answer exactly these questions.
Simavi’s goal is simple: a healthy life for all. The sheer size of this ambition requires us to use our resources in the most effective way possible. Over the past few years we have developed a comprehensive theory of change that sets out what we believe to be the most effective way to achieve our mission. We translate this theory of change into all of our programmes – but it’s not set in stone. With each and every programme we set up, we want to be increasingly confident that it will effectively contribute to achieving our goal, so we have to be willing to embrace change.
When monitoring programmes, the focus is most often on intermediate effects (e.g. the number of people attending meetings, or the number of community health workers trained), so we measure evidence of performance (outputs and immediate outcomes). Sometimes programmes incorporate international standards or follow certain protocols, but this tells us little about how what we are doing leads to our overall goal. At a time when development aid resources are shrinking and public pressure to demonstrate effectiveness and impact is increasing, we are all – donors and NGOs alike – under growing pressure to generate evidence of what we are doing. Faced with this challenge, in a climate of scarce resources, doesn’t it make sense to do everything we can to achieve the maximum impact on people’s lives? Indeed, don’t we have a responsibility to do exactly that?
Simavi’s answer is ‘yes’. We have therefore embarked on a journey to maximise the impact of our new programmes by developing an evidence-informed programme design method that uses all the available evidence to ensure that interventions are effective and efficient. The extra time and effort we dedicate to monitoring, research and evaluation will allow us, in the long run, to assess more accurately both the breadth and the depth of our impact.
In 2016 we laid the groundwork needed to learn more about the impact of our new programmes. Through collaboration between Simavi and academic partners – including the Impact Centre Erasmus and the universities of Maastricht and Oxford – we have already introduced a stronger, evidence-informed planning process for the core activities in our Going for Gold and Ritu programmes. Both programmes now take a combined approach to SRHR and WASH right from the start, which has already provided valuable feedback on the way we work and how our programmes are built.
The results generated with evidence-informed programming will not only demonstrate the effectiveness of one specific programme, but also contribute to the sector-wide evidence base on the impact of SRHR and WASH interventions. We aim to limit the costs of impact measurement by focusing our efforts on a specific set of Simavi interventions and by seeking collaboration with others in the sector to jointly build a stronger evidence base. We believe this will ultimately result in a more efficient use of funds and a greater overall impact towards our goal of achieving a healthy life for all.
Dr Kellie Liket, researcher at the Impact Centre Erasmus, is on a mission to change how civil society and donors (mis)understand impact measurement. She argues that people limit themselves by thinking about impact solely as an accountability question.
Dr Kellie Liket: “In the impact discussion, we focus on measurement. How many water pumps? How many midwives? We do something, measure it and say whether it is a lot, a little or nothing. Simavi dares to ask itself: how can we have the most impact? What are we going to do to have more? It’s then that the real power of impact is unleashed… impact with aspiration.”
In 2017 we will work together with Kellie Liket and many others towards achieving more impact by continuing our learning process on evidence-informed programming. For a more detailed explanation of our approach to evidence-informed programming, please read the annual report here.