Kyiv city, Kyiv, UA
Final Evaluation of Integrated Emergency Response for Crisis-Affected Persons in Ukraine - Consultant

Background of the project  

This TOR is intended to guide a consultancy service to conduct an evaluation of the BHA-funded Multisectoral Emergency Response Project (GD056). The project started on August 12, 2022, and will end on December 11, 2024. The project has been implemented by the IRC and local partners to address the life-threatening and harmful consequences of the ongoing Ukraine crisis, delivering urgently needed Protection and Health assistance to people who remained in previously occupied and/or currently hard-to-reach locations, host communities, and internally displaced persons (IDPs), reaching approximately 187,699 individuals (123,543 female and 64,156 male) in Kherson, Kharkiv, Vinnytsia, Dnipropetrovsk, Poltava, and Zaporizhzhia Oblasts. Given the proposed extension of the award timeframe, Donetsk Oblast was also added to the list of award operating locations to allow for flexible and timely delivery of assistance as areas became accessible and prioritized for humanitarian programming.

Whilst the IRC delivered community-level services directly in targeted locations, partnerships formed the cornerstone of IRC Ukraine’s work. The IRC partnered with six local organizations, namely the Ukrainian Deminers Association (UDA), Right to Protection (R2P), Charitable Foundation Peaceful Sky of Kharkiv, Martin Club, Green-Landiya, and Stellar Ukraine, supporting them both to scale up their emergency response to the crisis and to return to their specialized service provision. These partnerships supported the transition and sustainability of activity achievements by strengthening local ownership of interventions, and reinforced client feedback mechanisms to enable conflict-affected populations to shape the services they received.

The program targeted and reached those immediately affected by a deterioration in the protection environment, as well as those who are vulnerable due to displacement and marginalization, to improve their access to primary health care, support their recovery, and strengthen their resilience. Using a needs-based approach, the intervention bolstered timely humanitarian assistance, prioritizing the unmet needs of vulnerable households while maintaining flexibility, cost-effectiveness, and value for money.

Scope of Work

This consultancy is a short-term assignment to undertake the final evaluation of the project titled “Integrated Emergency Response for Crisis-Affected Persons in Ukraine”. The evaluation will employ a mixed-methods approach that combines quantitative and qualitative data collection and analysis techniques, meeting scientific standards of credibility and reliability so that its findings can support and inform ongoing and future emergency and recovery programming.

The evaluation will cover the project’s achievements, with a focus on the evaluation criteria of the Organisation for Economic Co-operation and Development’s Development Assistance Committee (OECD-DAC). The evaluation is expected to generate evidence from mixed sources to inform decision makers on whether the project has been successful in achieving the objectives initially envisaged in the project proposal document. In addition, the project’s key indicators shown in the logframe will be assessed through primary and secondary data sources from the target locations. Further scoping of the evaluation work will be developed during the inception meeting with the selected consultant, which the IRC will organize upon selection.

The evaluation will focus on the following evaluation questions based on OECD-DAC criteria as a framework for achieving the intended objectives:

a. Relevance/Appropriateness:

Were interventions appropriate and effective for the target group based on their needs? And which target groups and individuals were reached by the interventions?

b. Efficiency:

Are the most efficient approaches being used to convert inputs into outputs?

To what extent have the activity’s interventions adhered to planned implementation schedules?

What was the level of efficiency and timely delivery of the goods or services?

c. Effectiveness:

Is the intervention achieving its objectives, and does it do so in a timely manner?

To what extent do the activity’s interventions appear to have achieved their intended outputs and outcomes?

To what extent did the activity help prevent individuals and households from adopting negative coping strategies?

How were IRC’s and partners’ health and protection activities integrated with each other, and did that approach to integration increase the effectiveness of the interventions?

d. Impact:

What are the wider effects (intended and unintended, positive and negative) of the interventions (social, economic, and environmental) on individuals, gender and age groups, communities, and institutions? What factors appear to facilitate or inhibit these changes?

Which interventions appeared to be more or less important to achieving activity outcomes? How did these changes correspond to those hypothesized by the activity’s Theory of Change?

e. Coordination:

How were the interventions integrated into the wider response provided by other humanitarian actors and by local and national authorities?

f. Protection:

How do the interventions mainstream the protection of all groups that comprise the affected population?

Deliverables:

The selected consultant is expected to provide a detailed proposal indicating the methodologies that will be used to evaluate the project components. A mix of quantitative and qualitative methods is expected. To answer the key evaluation questions set out in this TOR, the consultant should propose a design and methods for collecting quantitative and qualitative data from primary and secondary sources, including sampling techniques, data collection tools and procedures, and methods for compiling, analyzing, and systematically presenting the data. The IRC highly encourages the use of electronic data collection, especially for quantitative data.

The methodology should address the sampling technique, sampling frame, and sample size; quantitative and qualitative data collection techniques; secondary data inputs; measures for data quality assurance, data cleaning, and data management; the data analysis techniques and software to be used; ethical considerations and limitations; and report generation.

The methodology and framework for the evaluation should be designed around a participatory approach, with inputs sought from stakeholders and the views of target beneficiaries captured, including the perspectives of vulnerable communities and groups. Moreover, the consultant is expected to communicate with and gather views from the beneficiary community, government stakeholders, and health facilities, using the relevant tools.

Data Collection Procedure: A mixed-methods approach to evaluation data collection is recommended; the data collection process should follow the procedures and steps below.

Desk review of key documents: Before the start of the fieldwork, the consultant must review the available key project documents, mainly the project proposal, periodic reports, and related studies, to understand the context and nature of the project as well as the key thematic areas and the logframe.

Field-based data collection: The consultant is expected to conduct field-based engagement with key project stakeholders, clients, and government representatives, including, among others:

•            IRC technical and field-based staff.

•            Key government stakeholders, such as sectoral regional government offices, including health offices and facilities, social protection offices, etc.

•            Clients, including a sample of host community members, returnees, and IDPs, disaggregated by women, men, boys, girls, and other vulnerable groups.

However, more detailed evaluation methods are expected from the applicants/selected consultant as part of the inception report, covering:

The evaluation design

Direct consultation with relevant actors/experts

Data collection process (Sampling method, sampling frame, instruments, protocols, and procedures)

Procedures for analyzing quantitative and qualitative data

Data presentation/dissemination methods.

Report writing and sharing, etc.

Sampling Strategies:

This evaluation will employ a mixed-methods approach that combines quantitative and qualitative data collection from both primary and secondary sources, allowing more comprehensive coverage and in-depth analysis through triangulation of data from various sources. For the quantitative data collection, the consultant will follow the recommendations of BHA_emergency_M&E_guidance_February_2022. Accordingly, a one-stage simple random sampling (SRS) strategy will be used, as the list of all beneficiaries is available. This approach does not require advanced knowledge of survey statistics, and sampling weights are not needed, making it well suited to emergency contexts. The IRC and the consultant will also analyze the evolving security situation and its impact on the geographical locations to be included in the sampling strategy; depending on the security situation in the target locations, the sampling strategy may be changed to a two-stage cluster sampling strategy. Overall, the consultant is required to use an appropriate sampling strategy that is statistically representative of the population of interest (POI); the actual sample size will be calculated accordingly (at a 95% confidence level and a 5% margin of error).
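To illustrate the parameters above, the sketch below computes the sample size they imply using Cochran's formula with a finite population correction, then draws a one-stage simple random sample from a beneficiary list. The list size matches the reach figure quoted earlier in this TOR, but the ID format and the choice of Cochran's formula are illustrative assumptions, not a prescribed method.

```python
import math
import random

def sample_size(population: int, z: float = 1.96, margin: float = 0.05, p: float = 0.5) -> int:
    """Cochran's formula with finite population correction.

    z=1.96 corresponds to a 95% confidence level; p=0.5 is the most
    conservative assumption about the indicator's prevalence.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# One-stage SRS from the beneficiary list (IDs are placeholders).
beneficiary_ids = [f"HH-{i:06d}" for i in range(1, 187700)]
n = sample_size(len(beneficiary_ids))
sample = random.sample(beneficiary_ids, n)  # sampling without replacement
```

With these parameters the formula yields a sample of roughly 384 beneficiaries; in practice the consultant would also add a buffer for non-response.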

 

For the qualitative data collection, purposive sampling will be used to enroll participants in the FGDs and KIIs. In addition, observation and document review will be used as part of the qualitative methods. The evaluation team will conduct KIIs to assess the evaluation questions (relevance, effectiveness, efficiency, impact, coordination, and protection). At least one key informant will be selected per stakeholder group in the targeted oblasts/regions (migration offices, administrative service offices, IDP councils, municipal and city councils, departments of social policy and social protection, centers for social services for families, National Police departments, departments of education, centers for the provision of administrative services, paramedic and obstetric stations, health points, etc.). Moreover, FGDs will be conducted with various groups of targeted beneficiaries (men, women, boys, girls, persons with disabilities, and elderly people) to collect qualitative information on the emergency health and protection interventions. Overall, the evaluation team will undertake 20 FGDs, distributed across target groups and locations, with a maximum of six types of community groups in the targeted locations/oblasts.

The above methods were designed in view of the fact that this project is implemented in an emergency setting with a rapidly changing context and frequent movement of people. According to the M&E plan, there are four outcome indicators, all of which are custom indicators suitable for an emergency setting.

The outcome indicators under this project are:

·       % of clients receiving psychosocial support or case management who report an improved sense of wellbeing
·       % of surveyed children participating in IRC’s CP programs who report an improvement in their sense of safety and well-being
·       % of surveyed women and girls participating in WPE programs who report knowing where someone can get support if they experience violence
·       % of clients who are satisfied with IRC services per the standard satisfaction survey

Data analysis and triangulation method:

This evaluation will employ a mixed-methods approach that combines quantitative and qualitative data collection and analysis techniques. The quantitative data will consist of survey records, whereas the qualitative data will consist of key informant interviews, focus groups, observations, and document review.

The mixed-methods approach will integrate quantitative and qualitative data to answer the evaluation questions, providing a more comprehensive and holistic picture of the evaluation context, process, and outcomes than either method alone. To elaborate on the analysis requirements, the evaluator will ensure that data from the target-based surveys is analyzed quantitatively using standard statistical software to generate descriptive statistics such as frequencies, proportions, and means. Thematic analysis, grouping responses by key themes related to the evaluation objectives, will be employed for the qualitative data.
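As a minimal sketch of the descriptive statistics named above (frequencies, proportions, and means), assuming a 1-5 client satisfaction score invented purely for illustration:

```python
import pandas as pd

# Hypothetical 1-5 satisfaction scores from a client survey.
scores = pd.Series([4, 5, 3, 5, 4, 4], name="satisfaction")

frequencies = scores.value_counts().sort_index()                 # count per score
proportions = scores.value_counts(normalize=True).sort_index()   # share per score
mean_score = scores.mean()                                       # average score
```

The same pattern extends to any survey variable once responses are loaded from the digital data collection export.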

The data analysis includes triangulation of quantitative data with qualitative findings as well as data from primary source with secondary data. The triangulation step involves multiple data sources and methods to validate and enrich the findings of the evaluation.

The steps that will be taken to triangulate data from different sources are:

·       Data will be collected from multiple sources and methods (qualitative and quantitative), as described in the methodology section: surveys, key informant interviews, focus group discussions, observations, and document review.

·       Data from different stakeholders, such as beneficiaries, staff (IRC and partner), and government offices will be collected to capture diverse perspectives.

·       During the data analysis step, quantitative and qualitative data will be analyzed separately using appropriate methods for each (statistical analysis for quantitative data and thematic analysis for qualitative data).

·       After the initial analysis, the results from the different data sources and methods will be compared to identify converging themes or patterns.

·       The findings from the different data sources and methods will be integrated into a coherent narrative, in a way that highlights how the different pieces of data complement each other and provide a full understanding of the research questions.

·       Once the report is generated, the findings and interpretations will be validated and cross-checked with stakeholders, including a peer review by IRC technical advisors, to confirm that the analysis and conclusions are well supported by the evidence.

Data quality control measures:

·       To ensure the accuracy and reliability of the collected data, several quality control measures will be implemented at different stages (survey design, data collection, and data cleaning).

·       Quality control measures will be implemented at the survey design stage by creating a comprehensive plan that outlines the data collection methods, tools, timelines, etc., and by ensuring the alignment of the survey design with the evaluation questions.

·       During the data collection stage, enumerators/data collectors will be trained on the questionnaire and tools, and all tools will be pilot tested before actual data collection.

·       For the beneficiary survey, a digital data collection platform (KoboToolbox/CommCare) will be used to ensure data quality during collection by implementing validation criteria and enabling real-time data quality checks.

·       Supervisors will be assigned to oversee data collection in the field, providing real-time guidance and support to data collectors.

·       The data collection process will be regularly monitored to ensure adherence to protocols, and the collected data will be reviewed regularly during the data collection phase to identify and correct errors, inconsistencies, or missing data early in the process.

·       Finally, after data collection, a thorough data cleaning process will be performed to address any errors, outliers, or inconsistencies. Data cleaning actions will include removing duplicate entries, correcting incorrect entries, and standardizing data formats.
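The cleaning actions just listed (deduplication, correcting bad entries, standardizing formats) could be sketched in pandas as follows; the column names and records are hypothetical placeholders, not the project's actual data structure.

```python
import pandas as pd

# Hypothetical survey export: respondent 002 appears twice,
# and respondent 003 has an impossible age value.
df = pd.DataFrame({
    "respondent_id": ["001", "002", "002", "003"],
    "oblast": ["Kherson ", "kharkiv", "kharkiv", "Poltava"],
    "age": [34, 51, 51, -1],
})

df = df.drop_duplicates(subset="respondent_id")        # remove duplicate entries
df["oblast"] = df["oblast"].str.strip().str.title()    # standardize formats
df["age"] = df["age"].where(df["age"] >= 0)            # flag impossible entries as missing
```

Flagged-as-missing values would then be followed up with the field team rather than silently dropped.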

Data disaggregation:

The evaluator will ensure that the different groups within the target beneficiaries are carefully sampled and represent the diverse groups that benefited from the project, in order to generate meaningful insights and learn about the specific needs of different subgroups. To this end, careful planning during the design stage will ensure that the diverse groups within the target population are represented (by gender, age, disability status, profile (IDP, host community), geographic location, etc.). In the analysis stage, the data will be analyzed separately for each group to gain more insight into the perspectives of the different groups.
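A disaggregated analysis of this kind might look like the following sketch, where the profile, gender, and satisfaction fields are hypothetical placeholders:

```python
import pandas as pd

# Illustrative survey responses; field names and values are assumptions.
responses = pd.DataFrame({
    "profile":   ["IDP", "host community", "IDP", "host community", "IDP"],
    "gender":    ["F", "M", "F", "F", "M"],
    "satisfied": [1, 1, 0, 1, 1],   # 1 = satisfied, 0 = not satisfied
})

# Satisfaction rate disaggregated by profile and gender.
rates = responses.groupby(["profile", "gender"])["satisfied"].mean()
```

The same groupby pattern applies to any indicator and any disaggregation variable (age band, disability status, oblast, etc.).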

Schedule:
The consultant will evaluate the entire life of the project up until the period of evaluation. Required deliverables are the following:
Secondary Data: Review of secondary data from varied and relevant sources that must be specified in the consultant’s proposal, including, but not limited to, relevant project documents. The evaluator should also identify other key secondary documents relevant to the target localities. Activity documents available for evaluators include:
Proposal documents (technical proposal narrative, budget, work plan, M&E plan, etc.)
Program reports to date
Review of the performance monitoring data
Documents as requested by the evaluator
Inception Report: This will be required within ten working days of the contract award; the consultant must submit a detailed inception report indicating the evaluation design and operational work plan, which must include the methodology, sample size, proposed data collection and analysis methods to address the key questions of the evaluation. The inception report shall also include questionnaires and interview protocols.
Evaluation Design: This will include the development of tools, data entry, and data analysis plan for designing the entire evaluation study.
Tools and Methodologies: The evaluator will be expected to share the tools and methodology to be employed in this evaluation with IRC, and incorporate any feedback provided.
Data Quality: The evaluator will ensure that, throughout the process of data collection, all necessary measures are undertaken to ensure that data quality management is adhered to. The proposal must clearly outline these quality control measures. The evaluator should also propose a digital data collection methodology to be used. All datasets and syntax files used in software analysis packages will be submitted to the IRC.
Data Analysis and Interpretation: The evaluator will have to ensure proper analysis of all quantitative and qualitative data/information collected. Data needs to be disaggregated by household profile, gender, and any other socio-economic categories, as applicable.
Regular Coordination: Throughout the study, the evaluator will maintain regular contact with the IRC team contact person to ensure the organization is aware of progress throughout the evaluation period.
Dissemination of Results: The evaluator, with support from IRC, will conduct three dissemination meetings. The first involves key staff, the second meeting focuses on review of the draft report to ensure transparency and consensus, and the final meeting includes additional stakeholders for final validation.
Evaluation Report: The evaluator will submit detailed and accurate reports (draft and final) as per the evaluation objectives to the IRC within the agreed timeframe, in the specified format outlined below.
The evaluation is expected to begin no later than September 23, 2024. This date and those below for the key deliverables under this consultancy may be adjusted in the final agreement that will be developed later, in consultation with the evaluator hired. The final evaluation report and summary and/or infographic brief(s) should be submitted to the Senior MEAL Coordinator no later than November 29, 2024. 

Key Evaluation Activities and Due Dates:
Kickoff meeting: 23-Sep-24
Draft inception report (including tools): 4-Oct-24
Final inception report and site selection: 8-Oct-24
Field work (data collection): 9-Oct-24
Presentation of the draft evaluation findings (first meeting): 15-Nov-24
Draft final evaluation report and infographic brief(s): 21-Nov-24
Second meeting, with the project technical team in the country office and field office and partner representatives: 25-Nov-24
Third meeting, with IRC headquarters technical advisors, the country office senior management team, and the donor (as required): 26-Nov-24
Final evaluation report and infographic brief(s): 29-Nov-24


Payment Rate: Applicants should provide a financial proposal for executing the deliverables, with a base rate and total amount.

 

