CDI’s network has been growing fast. Last month we conducted our first-ever survey of the 3,500 email subscribers in our community (from 9th to 22nd August) – a huge thank you to all those who participated.
This survey has given us an opportunity to reflect on CDI’s community and their priorities in terms of research, learning and evaluation practice. It will help us to ensure that insights on impact evaluation designs and methods from CDI research remain relevant, timely, and distributed in the best way possible to stimulate debate and take the field further. It has also allowed us to take stock of our efforts so far, and obtain useful feedback so that we can continue to encourage debate, promote innovation and foster new alliances.
We received almost 200 responses from 57 different countries, which was encouraging, particularly as the survey took place during August (when many individuals may have been on holiday). As CDI is a partnership between academia, policy researchers and evaluators, we were very pleased to see that we are reaching a similarly diverse audience. We were heartened that individuals took time in the survey to provide detailed qualitative feedback, including suggestions that we publish our results – which we are pleased to do. We hope you find the feedback as interesting as we do.
Given that CDI was set up to span researchers and practitioners, evaluators and non-evaluators, government and non-government, the spread of responding individuals and their organisations was interesting and encouraging. Over a quarter were from NGOs (27%) and another quarter from Universities (25%), with a fifth (21%) from Government, and many from consultancy (14%).
Understandably, most individuals engaging with CDI were in researcher or analyst positions, though many also held senior roles such as CEO, director or head of country office. More unexpected responses included parish priests, bankers and communication officers, with teachers featuring as well.
Notably, nearly half of respondents were from Africa (48%). This is positive: it shows we are not simply UK- or European-facing, and that we are engaging with practitioners of impact measurement, not just funders. It also presents a challenge, placing different informational demands on us, and suggests that our communications strategy may need further work to determine how we engage effectively with this more diverse audience.
In aggregate, respondents took a broader perspective on evaluating impact than we had anticipated, covering both experimental and non-experimental approaches. Broader methods of evaluating impact were slightly more prevalent than experimental methods. Looking more closely at the data, consultants and university respondents were more likely than NGO and Government employees to apply broader ways of evaluating impact in their work.
When it came to the core themes that our community is interested in, it was interesting to find that ethics came out top in terms of people’s rankings (61% rated it very high on their priorities), but in terms of average scores, it was the ‘use of evaluation evidence’ that garnered the greatest interest across the whole community. There was in fact high interest across all of CDI’s themes, particularly the application of evidence and methodological innovations. We now wonder whether this interest reflects insufficient support in these areas at the organisational level, or whether these are persistent issues faced by evaluators and researchers, and therefore in need of constant refinement.
In feedback, there were also some concerns expressed around the language used in evaluation. We know that language and methods can be confusing when identifying, designing and piloting evaluation solutions, particularly with new frameworks, so we try to make our language as clear as possible to promote learning. As CDI develops, we will attempt, through our newsletters, seminars, and publications, to place further emphasis on clarity of language, so that the awareness of the new tools and methodologies can span beyond evaluators, researchers and academics.
E-newsletters, seminars and events were favoured as means of communication by participants. Noting our diverse geography, we are aware that CDI needs to put more thought into how we can further engage with our base and present information in the best forms for them. We intend to keep our quarterly newsletter (which you can sign up to here), but we also intend to think about how we alert our members to new seminars, information and learning that may be of interest to them. We are already making a start on this with our new CDI Twitter page, which although not hugely demanded by our community, presents the opportunity for a more consistent flow of information to members. Similarly, seminars are clearly important, and while we try to podcast all seminars, we will work on other ways we can engage with individuals, for example exploring opportunities such as livestreaming and webinars.
[Chart: In which format do you prefer to receive and share information on impact evaluation evidence and debate? ‘Other’ responses included webinars, emails, conferences, academic articles, seminars and Facebook.]
Additional comments/feedback
There was also a range of qualitative feedback from respondents, and we are happy with how positively our work has been received. What has become clear is that our members want us to extend our operations and make our findings more widely accessible, thereby helping to influence policymakers. The frequency of communications is an important element of achieving this. Similarly, there appears to be demand for training opportunities in broader methods of impact evaluation, including through online courses, which is something CDI may have to consider. We are running our first face-to-face course in January 2017 (on “Designing Effective Ways to Evaluate Impact”), after which we will look at blended and online options.
As CDI continues to develop, we aim to continue producing high-quality research, developing and broadening the range of evaluation designs and methods, and providing funders and evaluators alike with valuable insights and tools to measure and understand impact.
Once again, we would like to thank all our CDI members for their time and support, and if you have any more feedback as to how CDI should engage with its base, please feel free to contact us at CDI@ids.ac.uk.
Photo Credit: Curt Carenmark/World Bank