The evaluation will assess project performance against expectations set out in the project results framework. The TE will assess results according to the criteria outlined in the UNDP Evaluation Guidelines.
The evaluation will consider the pertinent outcomes and outputs as stated in the project document, which focus on advancing medium- to long-term planning in climate-sensitive sectors in relation to Country Programme Outcome #3: Inclusive Growth. Under this outcome, UNDP will support the Government to meet its obligations under the Paris Agreement by strengthening policy and legislative capacities, building partnerships for climate action, particularly with the private sector, and mobilizing national and global finance. Mainstreaming environmental considerations into national policy and planning to ensure climate justice for women and marginalized groups will remain a priority.
As described in the background, the NAP programme is structured around four outcomes. An analysis of achievements across all four outcomes is expected:
NAP PROGRAMME OUTCOME 1 | Strengthened institutional coordination and climate change information and knowledge management for medium- to long-term planning
NAP PROGRAMME OUTCOME 2 | Adaptation options appraised and prioritized, and National Adaptation Plan formulated
NAP PROGRAMME OUTCOME 3 | Climate risk informed decision-making tools developed and piloted by planning and budget departments at national and sectoral levels
NAP PROGRAMME OUTCOME 4 | Nationally appropriate participatory adaptation investments tracking mechanism and financial plan for mid- and long-term CCA implementation set up
The following reports and deliverables are required for the evaluation:
- TE Inception report
- Draft Evaluation Report
- Presentation at the validation workshop with key stakeholders (partners and beneficiaries)
- Final Evaluation report
One week after contract signing, the evaluation team will produce an inception report clarifying the objectives, methodology and timing of the evaluation. The inception report must include an evaluation matrix presenting the evaluation questions, data sources, data collection, analysis tools and methods to be used. Annex 3 provides a simple matrix template. The inception report should detail the specific timing for evaluation activities and deliverables and propose specific site visits and stakeholders to be interviewed. Protocols for different stakeholders should be developed. The inception report will be discussed and agreed with the UNDP Country Office before the evaluation team proceeds with site visits.
The draft evaluation report will be shared by the evaluation team with the UNDP Country Office, which will circulate the draft to stakeholders. The evaluation team will present the draft report in a validation workshop that the UNDP Country Office will organise. Feedback received from these sessions should be considered when preparing the final report. The evaluators will produce an ‘audit trail’ (Annex Z) indicating whether and how each comment received was addressed in revisions to the final report.
Evaluation Questions
The evaluation seeks to answer the following questions, focused around the evaluation criteria of relevance, effectiveness, efficiency, sustainability and impact:
Relevance
- How well has the programme aligned with government and agency priorities?
- To what extent has NAP’s selected method of delivery been appropriate to the development context?
- Has the NAP programme influenced national policies on climate change adaptation?
- To what extent was the theory of change presented in the outcome model a relevant and appropriate vision on which to base the initiatives?
- To what extent was the project in line with the UNDP Strategic Plan, CPD, United Nations Sustainable Development Cooperation Framework (UNSDCF), SDGs, and GCF strategic programming?
Effectiveness
- What evidence is there that the programme has contributed towards an improvement in national government capacity, including institutional strengthening?
- Has the NAP programme been effective in helping improve climate change adaptation planning in Bangladesh?
- To what extent have outcomes been achieved, or has progress been made towards their achievement?
- What has been the contribution of partners and other organizations to the outcome, and how effective have the programme partnerships been in contributing to achieving the outcome?
- What were the positive or negative, intended or unintended, changes brought about by NAP’s work?
- What factors have enhanced or impeded NAP's performance?
- To what extent did the project contribute to gender equality, the empowerment of women, and/or a human-rights based approach?
Efficiency
- Are NAP's approaches, resources, models and conceptual framework relevant to achieving the planned outcomes?
- To what extent were quality outputs delivered on time?
- Has there been an economical use of financial and human resources and strategic allocation of resources (funds, human resources, time, expertise, etc.)?
- Did the monitoring and evaluation systems that NAP has in place help to ensure that activities and outputs were managed efficiently and effectively?
- Were alternative approaches considered in designing the programme?
Sustainability
- What is the likelihood that the NAP programme interventions are sustainable?
- What mechanisms have been set in place by NAP to support the government of Bangladesh to sustain improvements made through these interventions?
- To what extent has a sustainability strategy, including capacity development of key national stakeholders, been developed or implemented?
- To what extent have partners committed to providing continuing support?
- What indications are there that the outcomes will be sustained, e.g., through requisite capacities (systems, structures, staff, etc.)?
- What opportunities for financial sustainability exist?
- How has the project developed appropriate institutional capacity (systems, structures, staff, expertise, etc.) that will be self-sufficient after the project closure date?
Impact
- What has happened as a result of the programme or project?
- What real difference has the activity made to the beneficiaries?
- How many people (women/men) have been affected?
- Were there contributions to changes in policy/legal/regulatory frameworks, including observed changes in capacities (awareness, knowledge, skills, infrastructure, monitoring systems, etc.) and governance architecture, including access to and use of information (laws, administrative bodies, trust building and conflict resolution processes, information-sharing systems, etc.)?
- Were there contributions to changes in socio-economic status (income, health, well-being, etc.)?
- Discuss any unintended impacts of the project (both positive and negative) and assess their overall scope and implications.
- Identify barriers and risks that may prevent further progress towards long-term impact.
- Assess any real change in gender equality, e.g. access to and control of resources, decision-making power, division of labor, etc.
The evaluation must also include an assessment of the extent to which programme design, implementation and monitoring have taken the following cross-cutting issues into consideration:
Human rights
To what extent have poor, indigenous and tribal peoples, women and other disadvantaged and marginalized groups benefitted from NAP’s interventions?
Gender Equality
- To what extent has gender been addressed in the design, implementation and monitoring of the NAP programme?
- To what extent has NAP programme promoted positive changes in gender equality? Were there any unintended effects?
- How did the programme promote gender equality, human rights and human development in the delivery of outputs?
The evaluation team will include a summary of the main findings of the evaluation report. Findings should be presented as statements of fact that are based on analysis of the data.
A section on conclusions will be written in light of the findings. Conclusions should be comprehensive and balanced statements that are well substantiated by evidence and logically connected to the evaluation findings. They should highlight the strengths, weaknesses and results of the project, respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to project beneficiaries, UNDP and the GCF, including issues in relation to gender equality and women’s empowerment.
Recommendations should provide concrete, practical, feasible and targeted recommendations directed to the intended users of the evaluation about what actions to take and decisions to make. The recommendations should be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation.
The evaluation report should also include lessons that can be taken from the evaluation, including best and worst practices in addressing issues relating to relevance, performance and success that can provide knowledge gained from the particular circumstance (programmatic and evaluation methods used, partnerships, financial leveraging, etc.) that are applicable to other GCF and UNDP interventions. When possible, the evaluation team should include examples of good practices in project design and implementation.
It is important for the conclusions, recommendations and lessons learned of the TE report to include results related to gender equality and empowerment of women.
Methodology
The evaluation report must provide evidence-based information that is credible, reliable and useful.
The evaluation will be carried out by an external team of independent evaluators and will follow a participatory and consultative approach ensuring close engagement with a wide array of stakeholders and beneficiaries, including national and local government officials and staff, donors, beneficiaries from the interventions, and community members.
Evidence obtained and used to assess the results of NAP's interventions must be triangulated from a variety of sources, including verifiable data on indicator achievement, existing reports, evaluations and technical papers, stakeholder interviews, focus groups, surveys and site visits. In the event that field missions are not possible due to COVID-19, remote interviews may be conducted by telephone or online (Teams, Zoom, etc.). In that situation, site visits will be carried out by the national consultants. These formalities will be agreed upon during contract discussions and finalized in the inception meeting. The specific design and methodology for the evaluation should emerge from consultations between the evaluation team and the above-mentioned parties regarding what is appropriate and feasible for meeting the evaluation purpose and objectives and answering the evaluation questions, given limitations of budget, time and data. The evaluation team must use gender-responsive methodologies and tools and ensure that gender equality and women's empowerment, as well as other cross-cutting issues and SDGs, are incorporated into the evaluation report.
The final methodological approach including interview schedule, site visits and data to be used in the evaluation must be clearly outlined in the evaluation Inception Report and be fully discussed and agreed between UNDP, stakeholders and the evaluation team.
The final report must describe the full evaluation approach taken and the rationale for the approach making explicit the underlying assumptions, challenges, strengths and weaknesses about the methods and approach of the evaluation.
The following steps in data collection are anticipated:
Desk Review
A desk review should be carried out of the key strategies and documents underpinning the project's scope of work. This includes the project document, progress and other reports, the country programme document, and any monitoring and other documents provided by the project and the Commissioning Unit.
Field Data Collection
Following the desk review, the national evaluator will build on the documented evidence through an agreed set of field and interview methodologies, including:
- Interviews with key partners and stakeholders
- Field visits to project sites and partner institutions
- Survey questionnaires where appropriate
- Participatory observation, focus groups, and rapid appraisal techniques
EXPECTED DELIVERABLES AND OUTPUTS:
Deliverables/Outputs | Estimated Duration (days) | Target Due Dates (indicative) | Review and Approvals Required
---|---|---|---
TE Inception report | 4 | One week after signing the contract | Programme Specialist
Draft Evaluation Report and presentation at the validation workshop | 13 | Within three weeks of signing the contract | Programme Specialist
Final Evaluation report | 5 | Within the eighth week after signing the contract | Programme Specialist
Total | 22 | |
SUPERVISION AND PERFORMANCE INDICATORS
The International Evaluation Consultant will work closely with the Programme Specialist (Nature, Climate & Energy) of UNDP, the National Project Director (NPD) and the Deputy Project Director of NAP. His/her performance evaluation will be conducted by the supervisor. The Consultant will ensure a results-based system is in place to capture key results that contribute towards the achievement of project outputs and outcomes.
IMPLEMENTATION ARRANGEMENTS
The UNDP CO will select the evaluation team through standard UNDP procurement processes and will be responsible for the management of the evaluators. The Head of Unit/Deputy Resident Representative Programme (DRR/P) will designate a focal point for the evaluation who will work with the M&E Specialist and Programme Manager to assist in facilitating the process (e.g., providing relevant documentation, arranging visits/interviews with key informants, etc.). The CO Management will take responsibility for the approval of the final evaluation report. The M&E Specialist or designate will arrange introductory meetings within the CO, and the DRR/P or her designate will establish initial contacts with partners and project staff. The consultants will take responsibility for setting up meetings and conducting the evaluation, subject to advance approval of the methodology submitted in the inception report. The CO management will develop a management response to the evaluation within two weeks of report finalization.
The Task Manager of the Project will convene an Advisory Panel comprising technical experts to enhance the quality of the evaluation. This Panel will review the inception report and the draft evaluation report and provide detailed comments on the quality of the methodology, evidence collected, analysis and reporting. The Panel will also advise on the conformity of evaluation processes to UNEG standards. The evaluation team is required to address all comments of the Panel completely and comprehensively. The Evaluation Team Leader will provide a detailed rationale to the Advisory Panel for any comments that remain unaddressed.
The evaluation will use a system of ratings, proposed by the evaluators in the inception report, to standardise assessments. The evaluation acknowledges that ratings cannot be standalone assessments and that it will not be feasible to entirely quantify judgements. Performance ratings will be given for the four evaluation criteria: relevance, effectiveness, efficiency and sustainability.
While the Country Office will provide some logistical support during the evaluation, for instance assisting in setting interviews with senior government officials, it will be the responsibility of the evaluators to logistically and financially arrange their travel to and from relevant project sites and to arrange most interviews. Planned travels and associated costs will be included in the Inception Report and agreed with the Country Office.
INPUTS
The Consultant will use his/her own personal laptop. The project office will provide the office space for the consultant.
EVALUATION TEAM COMPOSITION AND REQUIRED COMPETENCIES:
The evaluation will be undertaken by a team of two external evaluators: a Team Lead (international consultant) and a National Evaluator (national consultant). The national consultant will support the entire evaluation process and contribute to producing the final product. The national consultant will work under the guidance of the international consultant. In addition to his/her direct reporting line to the international consultant, the National Consultant will rely on the project staff, partners and stakeholders to prepare the ground for effective and efficient implementation of the evaluation.
The evaluators cannot have participated in the project preparation, formulation and/or implementation (including the writing of the project document) and should not have a conflict of interest with the project’s related activities.
UNDP Evaluation Guidelines: http://web.undp.org/evaluation/guidance.shtml#handbook