Library – Adaptive Management in International Development - Custom feed
https://docs.adaptdev.info/lib/
Atom feed: https://docs.adaptdev.info/lib/atom.xml?type=document (generated by Kerko)

Minister of International Development and La Francophonie Mandate Letter (November 12, 2015)
https://docs.adaptdev.info/lib/R3QJS23W
Office of the Prime Minister, 12 November 2015.

Dear Minister: I am honoured that you have agreed to serve Canadians as Minister of International Development and La Francophonie. You will be part of a strong team of ministers led by the Minister of Foreign Affairs.

Reflections on the Utilization-Focused Evaluation (UFE) Process
https://docs.adaptdev.info/lib/7UJ7SGQT

This presentation provides an overview of how UFE was used in the Strengthening ICTD Research Capacity in Asia (SIRCA) programme. It was presented at the Evaluation Conclave 2010, New Delhi, India.
The key objectives of the program are to:
Enhance research capacity in Asia through rigorous academic research
Create a space for dialogue on ICT4D social science research issues in Asia
Create linkages through a mentorship program
Disseminate findings in publications and conferences
Contents
SIRCA Programme
SIRCA Key Objectives
SIRCA Evaluation
UFE Learnings
UFE Challenges
Evaluation is over… but there’s a lasting outcome…
Lim, Yvonne; Mizumoto, Ann. 28 October 2010. Strengthening ICTD Research Capacity in Asia (SIRCA) Programme.

10 Things to Know About Evaluation
https://docs.adaptdev.info/lib/DKSRP8BW

Evaluation is essential to good development. But there are still many myths and misconceptions about what it is - and how it should be used.
ODI's Research and Policy in Development Programme (RAPID) has many years' experience supporting evaluation in complex development contexts.
In support of the International Year of Evaluation 2015, we've put together our essential 'things to know' about evaluation in 10 infographics.
Available in English and French.
Buffardi, Anne; Hearn, Simon; Pasanen, Tiina; Price, Clare; Ball, Louise. June 2015. RAPID.

Template concept note for an impact evaluation
https://docs.adaptdev.info/lib/JXEHQ4LG
Methods Lab, November 2015.

Sample criteria to select case studies for evaluation
https://docs.adaptdev.info/lib/HZNSDYVZ

Time and budget constraints often mean that organisations are unable to evaluate all of their programmes,
and large programmes, operating in multiple locations, are unable to evaluate all project sites. This tool
introduces two sets of criteria to support evaluators and programme managers to select case studies or
programmes for evaluation: i) information about how relevant or feasible evaluation is for individual
programmes, and ii) across the overall portfolio, strategic thinking around what types of cases are most
important to understand. This tool was developed by Anne Buffardi, Irene Guijt, Simon Hearn and Tiina
Pasanen for use in The Methods Lab projects.
Methods Lab, November 2018.

Guidance on tasks and deliverables for different evaluation phases
https://docs.adaptdev.info/lib/HP67ZX88

This tool describes the five key phases of evaluation, from planning and design, to implementation and
communication of results. It provides a list of the main tasks and deliverables for each phase, intended for
use by anyone managing an impact evaluation. This tool was developed by Irene Guijt, Simon Hearn,
Tiina Pasanen and Patricia Rogers for use in Methods Lab projects. It follows to some extent the
BetterEvaluation Rainbow Framework.
Methods Lab, November 2015.

Guiding questions to help narrow the scope of an evaluation
https://docs.adaptdev.info/lib/97BJKP8R

Time and budget constraints can mean that programmes are not able to assess all possible evaluation
questions; this is especially true for multi-component or multi-site programmes operating in challenging
environments. This tool identifies areas of enquiry to help programmes prioritise the number of questions
and measurement indicators used. This tool was developed by Anne Buffardi for use in Methods Lab
projects.
Methods Lab, November 2015.

Report template on integrating impact into an existing monitoring and evaluation system
https://docs.adaptdev.info/lib/N2HTGN5B

Many development programme staff will commission an impact evaluation towards the end of a project or
programme, only to find that the monitoring system did not provide adequate data about implementation,
context, baselines or interim results. This tool provides a template outline for a report making
recommendations on how to integrate a focus on impact into a programme’s existing monitoring and
evaluation system, as the programme moves into a new phase. This template was developed by Anne
Buffardi and Tiina Pasanen for use in Methods Lab projects.
Methods Lab, November 2015.

Report templates for evaluability assessment
https://docs.adaptdev.info/lib/2QJIC95P

An evaluability assessment aims to assess the extent to which, and how best, a project can be evaluated in
a reliable and credible fashion. These templates are intended to help anyone conducting an evaluability
assessment to structure the final report. This tool was developed by Anne Buffardi and Bronwen
McDonald for use in Methods Lab projects. It accompanies The Methods Lab publication ‘Evaluability
assessment for impact evaluation: guidance, checklists and decision support’.
Methods Lab, November 2015.

Sample agendas for an evaluability assessment stakeholder workshop
https://docs.adaptdev.info/lib/7T6X29Z7

An evaluability assessment aims to assess the extent to which, and how best, an intervention can be
evaluated in a reliable and credible fashion. These sample agendas are intended for people convening key
stakeholders (such as project implementation staff and managers, donors and government officials) to
discuss the purpose and scope of an impact evaluation and to identify key evaluation questions. This tool
was developed by Bronwen McDonald, Anne Buffardi and Irene Guijt for use in Methods Lab projects. It
accompanies the Methods Lab publication ‘Evaluability assessment for impact evaluation: guidance,
checklists and decision support’.
Methods Lab, November 2015.

Sample interview questions for evaluability assessment
https://docs.adaptdev.info/lib/KXPALNV2
Methods Lab, November 2015.

Adaptive Management Self-assessment tool
https://docs.adaptdev.info/lib/PRIRKYXP

The Adaptive Management self-assessment tool has been designed to help teams assess the extent to which they have a supportive environment for adaptive management within their country program. It helps you think about five areas that have been identified as important for supporting adaptive management:
1. Culture & leadership
2. Dynamic teams
3. Appropriate analysis
4. Responsive implementation & operations
5. Enabling environment (for example, donor funding and relationships)
Mercy Corps, 2016.

CLA Maturity Tool: Example Spectrum Cards
https://docs.adaptdev.info/lib/LNAMNLZJ
USAID, October 2016.

Interview protocol: Most significant turning points
https://docs.adaptdev.info/lib/II6BFMFH

This interview protocol was used for a research project on adaptiveness in technology for governance initiatives in Kenya. For more information, please read the research report at:
Prieto Martin, P.; Hernandez, K.; Faith, B. and Ramalingam, B. (2017) Doing Digital Development Differently: Lessons in adaptive management from technology for governance initiatives in Kenya, MAVC Research Report, Brighton: Institute of Development Studies, ids.ac.uk/project/making-all-voices-count
Prieto Martin, Pedro; Faith, Becky. October 2017. Institute of Development Studies.

Navigating the Evidence on Transparency, Participation and Accountability: What Insights Have Emerged? What Gaps Remain? - Terms of reference for the Consultant Author(s)
https://docs.adaptdev.info/lib/JJ5UJ5BN
Halloran, Brendan, 2015.

Example of Terms of Reference for a Report on TAP.