Consultancy Summary:
The objective of this consultancy is to conduct an independent external evaluation of the program’s pilot phase across the five target provinces of Nangarhar, Laghman, Bamyan, Baghlan, and Kunduz. The evaluation will examine the program’s performance using the OECD‑DAC evaluation criteria of relevance, effectiveness, efficiency, impact, and sustainability, with a particular focus on the quality of teaching, the use and effectiveness of digital learning tools, and the program’s contribution to improved learning outcomes.
Specifically, the evaluation will assess the relevance of the EdTech approach and content to the needs, context, and capacities of the target groups; the effectiveness of the program in achieving intended educational outcomes such as learning gains, learner engagement, and digital skill acquisition; and the efficiency of resource use in delivering these results. It will also examine the extent to which the pilot contributed to gender‑sensitive and inclusive outcomes (impact), as well as the potential for the continued and sustained use of digital learning tools, pedagogical practices, and capacities beyond the project lifecycle (sustainability).
The evaluation will build on and validate findings from the recently completed endline assessment, while generating deeper learning on the factors that influenced program performance. By combining rigorous analysis with user‑centered evidence, the consultancy will produce actionable insights to inform program refinement, support sustainability planning, and strengthen future fundraising efforts.
Approach and Deliverables:
The evaluation will use a focused, mixed methods approach, combining rigorous qualitative inquiry, structured analysis of existing data, and a light participatory visual component. This approach ensures both methodological credibility and the generation of compelling evidence suitable for external communication and fundraising.
The suggested framework for achieving the objectives of the evaluation is outlined below. However, the consultancy team is expected to review, refine, and/or propose their own approach to undertaking the evaluation. The consultancy team will develop and finalize the methods and tools in line with the objectives of this evaluation, in collaboration with War Child Canada's staff.
Desk Review
• Review of the pilot phase documentation, including the endline dataset and report, monitoring data, training materials, digital platform usage logs (if accessible), and program design documents.
• Assessment of the strengths, limitations, and credibility of the endline findings.
Teaching Quality Assessment
• Structured KIIs with instructors/trainers and other stakeholders to examine training adequacy, classroom environment, support systems, and challenges faced by instructors.
• In-person (where feasible) lesson observations using a simplified teaching quality rubric adapted to local context.
Digital Tool Use and Effectiveness Review
• Analysis of how learners and instructors engaged with digital tools, including accessibility, relevance, usability, and perceived learning value.
• Interviews or FGDs with learners in a selection of provinces.
• Verification of digital usage data.
Participatory Photovoice (in selected provinces)
• A small photovoice exercise conducted in a subset of provinces, selected based on accessibility.
• Participants submit photos and short captions illustrating how digital learning has influenced their lives or learning processes.
• Follow-up individual interviews with selected participants to deepen interpretation.
• Photos will be incorporated into the final evaluation report as case examples or evidence narratives.
Validation
• A remote validation meeting with program staff and key stakeholders to confirm findings and refine recommendations.
Below is a summary of the suggested approach and a tentative timeline over a six (6) week period:
Phase | Activities | Timeline
Phase 1: Inception | Desk review, refinement of methods, development of tools, selection of photovoice provinces | 1 week
Phase 2: Data Collection | KIIs, FGDs, lesson observations, photovoice implementation, digital tool review | 2 weeks
Phase 3: Analysis | Data analysis and synthesis of findings | 1 week
Phase 4: Reporting and Validation | Draft report, validation presentation, revisions, final report | 2 weeks
The consultancy team will be responsible for the design of the evaluation and data collection tools, coordination and implementation of data collection activities, analysis of data, and reporting. The consultancy team will provide the following deliverables:
• Inception report (no more than 10 pages), including the refined evaluation approach, final methodology and data collection tools, sampling strategy, and workplan and timeline;
• Photovoice package, including 10-15 curated photos and a short analytic summary (2-3 pages), together with all required ethical documentation;
• Draft report (no more than 25 pages, excluding annexes);
• Validation presentation summarizing key findings and recommendations, for validation by the project's management, M&E, and other key project staff;
• Final report incorporating feedback from the validation meeting.
The consultancy team will present an evaluation report that includes:
I. An executive summary that includes a summary of key findings and recommendations
II. Evaluation Design, Methodology and Limitations
III. Findings (Teaching Quality, Use and Effectiveness of EdTech Tools)
IV. Case studies and photovoice outputs
V. Recommendations
VI. Annex of final tools, detailed tables and summary notes from qualitative data collection, list of stakeholders interviewed/consulted, bibliography of any supporting documentation reviewed, and any photos/videos with credit and consent forms (if any)
Confidentiality and data protection
All outputs produced under this assignment, including reports, datasets, etc., may not be disseminated in whole or in part without the express written authorization of War Child Canada. Accordingly, the consultant shall not release these materials in any form (electronic, hard copy, etc.) to any third party without written permission from War Child Canada.
Ethical Standards and Protocols
The assessment approach must prioritize the safety of respondents, especially children, at all stages of the assessment. Participants must be clearly informed of the purpose of the exercise and how their information will be used, in line with confidentiality standards for the research study. Particular attention must be given to the protection of children throughout all stages, including the hiring and training of enumerators, field data collection, data analysis, and report writing. Ethical standards, good practices, and WCC core values for the study and data collection must be strictly followed at all levels. At a minimum, the study must be based on the principle of 'do no harm'.
Assessment proposals should not exceed USD 11,000.
The assessment is planned to occur from May 10, 2026, to June 20, 2026, with final deliverables provided no later than June 25, 2026.