KNOWLEDGE SHARING WORKSHOP FOR EVALUATION MANAGERS:

How to design and conduct a country programme evaluation
Structure & content
WORKSHOP FOR THE AFRICA REGION
Johannesburg, South Africa
June 25-29, 2012
Africa Regional Office and
Evaluation Branch at Division for Oversight Services
United Nations Population Fund
STRUCTURE & CONTENT OF THE WORKSHOP
The delivery of this workshop is part of the launch and implementation of the new
methodology for UNFPA country programme evaluations (CPEs) developed by the
Evaluation Branch at the Division for Oversight Services (DOS). This document
presents the outline of the course in terms of both structure and content.
TARGET GROUP
The main target group for this workshop consists of evaluation managers in UNFPA
country offices. The first workshop will be delivered in the Africa region.
DURATION
Four days, organised in two-session modules. The second half of the last day will be devoted to
Evaluation Quality Assessment (EQA).
OBJECTIVES
The overall objective of the workshop is for evaluation managers to be qualified to
perform their roles in the conduct of a CPE. To achieve that goal, the workshop has been
designed to familiarise evaluation managers with the process of designing and
conducting CPEs. The workshop has three main specific objectives:

 Evaluation managers are acquainted with the overall CPE process and with the main
features of every phase of the CPE.
 Evaluation managers are well informed about and understand the roles they will play in
future CPEs; in particular, they are fully acquainted (i) with the CPE tools they will
have to provide input to, and (ii) with the quality standards of the tools evaluators
will use.
 Evaluation managers have a full understanding of the quality assurance mechanisms
they will have to apply to ensure both the quality of the entire process in general,
and the quality of the final report in particular.
WORKSHOP METHODOLOGY
The presentations delivered in the workshop are mostly based on the Handbook on how
to design and conduct Country Programme Evaluations.
The handbook is essentially targeted at the evaluation team members who will actually
design and conduct the evaluation. Therefore, the presentations adjust the content of the
Handbook to the workshop’s target group: the evaluation managers.
The workshop will consist of a series of brief presentations on methodological aspects,
supported by examples and followed by exercises.
The workshop approach will also rely on debates, that is, dedicated discussion slots placed at
the end of some of the modules.
Most of the examples provided in the workshop will be based on the pilot
implementation of the CPE methodology in Bolivia, Cameroon and Madagascar.
WORKSHOP MATERIALS
Materials will include:
 The PowerPoint slides organised by module
 The Handbook on how to design and conduct Country Programme Evaluations,
which includes all the tools and templates that will be used throughout the five
phases of the CPE. The handbook can be accessed here:
http://www.unfpa.org/public/home/about/Evaluation/Methodology
 The following hand-outs:
o Supporting materials complementing the exercises, e.g. print-outs of parts of CPE
Final Reports, EQAs, etc.
WORKSHOP STRUCTURE - Summary
The workshop is structured around five modules: introduction, preparatory phase,
design phase, conducting the evaluation and reporting phase. The main workshop units,
learning points, exercises and the timings are summarised in the tables below.
Day 1

MODULE 1 – INTRODUCTION

8:30   Registration
9:00   Opening remarks
9:30   Introduction
       Learning points: Presentation of participants.
9:40   Course presentation
       Learning points: Presentation of the objectives of the training session as well as the Handbook.
10:00  Overview of the phases of a CPE
       Learning points: Presentation of the five phases of the CPE process. Brief presentation of the features of each phase, how the phases interrelate and the role of the evaluation managers in each phase.
       Examples: Examples of key issues not to be omitted in each phase and which have important repercussions on the quality of subsequent phases.
10:30  Debate
       Learning points: Questions and answers.
11:00  Coffee break
MODULE 2 – PREPARATORY PHASE

11:20  Drafting the ToR
       Learning points: Brief ToRs overview: elements to be included in the ToRs; the consultation process associated with drafting the ToRs; considerations on quality.
       Examples: Examples of features that should not be omitted when drafting ToRs.
11:45  Selecting and recruiting the evaluation team
       Learning points: The main aspects to take into account when selecting and recruiting the evaluation team; presentation and succinct study of the Ethical Code of Conduct for UNEG/UNFPA Evaluations; focus on aspects such as independence and conflict of interest.
       Examples: Examples of team composition.
12:00  Establishing the reference group and the quality assurance process
       Learning points: Aspects to take into account when establishing the Reference Group, i.e. members, representativeness, and operational mechanisms. How to design the quality assurance process: identification of quality assurance process milestones, and the role of the Reference Group in quality assurance.
12:15  Debate
       Learning points: Questions and answers.
13:00–14:30  Lunch break
14:30  Preparing initial documentation
       Learning points: Inventory of the required documentation: where to find it and how to obtain it. Exercise: prepare a list of documents.
       Examples: Introduction to tool 9: checklist for the documents to be provided by the CO to the evaluation team.
15:00  Preparing the list of Atlas projects by CPAP output and Strategic Plan outcome
       Learning points: Steps involved in the preparation of the list of Atlas projects. Why this list is important and how evaluators will use it.
       Examples: Introduction to tool 3 and template: List of Atlas projects by CPAP output and Strategic Plan outcome.
       Preparing the stakeholders mapping
       Learning points: Steps to be carried out in the preparatory phase in order to provide input to the stakeholder sample selection process in the design phase. Rationale for selecting a sample of stakeholders rather than a sample of projects. Exercise: prepare a stakeholders' mapping table.
       Examples: Introduction to tool 4 and template: Stakeholders' mapping table.
15:45  Coffee break
16:00–17:00  Debate
       Open discussion on the role of the evaluation managers in the preparatory phase. Opening questions: do current resources, organisational structures and internal capacity at country offices allow evaluation managers to carry out their role in the preparatory phase? What are the weaknesses in this regard and how could they be overcome?
Day 2

MODULE 3 – DESIGN PHASE

9:00   CPE wider framework
       Learning points: Main elements of the CPE evaluation framework: CPE mandate.
       The CPE internal framework (ToRs)
       Learning points: The three CPE components; the objectives of a CPE; the scope of a CPE.
       Examples: See the template for the ToRs.
9:30   Country context and UNFPA response
       Learning points: Overview of what evaluators should know to understand the country context and the UNFPA response. Review of the main documents. The questions that evaluators may ask about the UNFPA response and programmatic flow.
       Examples: Overview of the basic documents that the evaluation team should read (tool 9) and where evaluation managers could find them.
       Intervention logic
       Learning points: The elements of the intervention logic (needs, objectives, inputs, activities, outputs, outcomes, impact).
       Examples: See elements of theory (Part 3, 3.4.1); examples of the Cameroon and Madagascar effects diagrams.
       Evaluation criteria
       Learning points: What the evaluation criteria are; definitions and rationale of the evaluation criteria for the programmatic areas and for strategic positioning; differences between criteria, e.g. efficiency and effectiveness; responsiveness and relevance.
       Examples: Differences between key terms – tips to avoid confusion: activities and Annual Work Plans; outputs and outcomes; efficiency and effectiveness; relevance and responsiveness.
10:30  Debate
       Learning points: Questions and answers.
11:00  Coffee break
11:20  Evaluation questions
       Learning points: What the evaluation questions are and why they are so important; the link between evaluation criteria and evaluation questions; the process of selecting priority evaluation questions.
       Examples: Examples of evaluation questions: how to distinguish well formulated from inadequately formulated questions. Strategic Plan issues: tool 5 (the corporate alignment matrix) and table 5 (corporate mandate aspects that should be assessed and the criteria under which they could be assessed).
12:00  Debate
       Learning points: Questions and answers.
12:45–14:15  Lunch break
14:15  From a sample of stakeholders to the CPE overall agenda
       Learning points: The role of the evaluation manager in drawing up the agenda. How to develop individual agendas out of the overall agenda. The supporting and quality assurance role of the evaluation manager in drawing up individual agendas.
       Examples: Tips and suggestions for a realistic agenda (briefing and debriefing allocations; independence of the team in field visits; etc.).
14:30  Data collection methods
       Learning points: Brief analysis of quality issues for evaluation managers to take into account when assessing the appropriateness of the data collection methods chosen by the evaluation team.
       Examples: Examples of situations in which it is more appropriate to use one type of data collection method and the reasons why.
       Validity of findings
       Learning points: Brief analysis of quality issues evaluation managers should take into account when assessing the validation mechanisms and the triangulation techniques.
15:00  The Evaluation Matrix
       Learning points: What the Evaluation Matrix is and how to prepare it. How evaluation managers can assess the quality of the Evaluation Matrix.
       Examples: Examples of Evaluation Matrixes.
15:45  Coffee break
16:00–17:00  Group exercise: participants will assess the quality of an Evaluation Matrix.
Day 3

MODULE 3 – DESIGN PHASE (continued)

9:00   Group presentations: results of the Evaluation Matrix exercise will be presented and discussed with the entire group. Discussions will be facilitated and guided by trainers and will conclude with the main learning points.
10:30  Design Report
       Learning points: Brief presentation of the detailed outline of a design report; structure, minimum content and quality aspects.
       Examples: One or two real reports will be presented to illustrate what a complete design report should look like.
11:00  Coffee break
       Debate: Open discussion on the role of the evaluation managers in the design phase.
12:45–14:15  Lunch break

MODULE 4 – CONDUCTING THE EVALUATION

14:15  Starting the field mission – overview
       Learning points: Evaluators' internal team meeting; security briefing; general briefing with the country office and individual briefings with the CO's programme officers (individual agendas, logistics, programme overview).
       Examples: Examples (Cameroon and Bolivia) of presentations of programme overviews to be made by programme officers to the evaluation team, and suggestions on their minimum content.
       General briefing (plenary session)
       Learning points: Rationale and main issues to be covered: presentation of the CPE methodology; briefing on general political and technical aspects; logistics; and two core aspects: refining and adjusting the Evaluation Matrix (if necessary).
       Data collection – practical considerations
       Learning points: Practical aspects that influence the quality of interviews and group discussions. Practical aspects regarding changes in the individual agendas. Role of the evaluation manager in the preparation and organization of focus groups; quality aspects.
       Examples: Examples of factors that influence the quality of the data collected (especially when conducting interviews). Tool 13 – Interview logbook; template 8 – Note of the results of focus group.
15:45  Coffee break
16:00  Data analysis – practical considerations
       Learning points: Practical aspects to be considered to ensure the quality of the data analysis process. This unit will also cover the chain of reasoning of the evaluation and the approach to data analysis (evidence-based reasoned judgements).
       Examples: Examples of triangulation techniques and validation mechanisms.
16:45  Presenting preliminary results
       Learning points: How evaluators will prepare the plenary debriefing; what evaluation managers should expect from the presentation; the double objective of plenary sessions; who should attend; sequence of the presentation.
       Examples: Example of a PowerPoint presentation used to present preliminary results in a plenary briefing.
       Debate: Open discussion on the role of the evaluation managers in the field phase.
Day 4

MODULE 5 – REPORTING PHASE

9:00   Assessment of the M&E system of the CO
       Learning points: Overview of the main features of the assessment of the M&E system.
       Examples: Tool 16 – CPAP indicator quality assessment grid.
9:45   Group exercise: participants will assess the quality of the indicators of a results framework.
       Group presentations: results of the exercise will be presented and discussed with the entire group.
11:00  Coffee break
11:15  The Evaluation Report – structure and template
       Learning points: Overview of the outline, structure and contents of a final report.
       Examples: Examples of completed evaluation reports.
11:40  Evaluation Report – review process
       Learning points: Explanations on the review process; types of observations; how to fill in the audit trail.
       Examples: Examples of audit trails.
12:30  Evaluation Quality Assessment (EQA)
       Learning points: An explanation of what the EQA is, what the quality criteria are, when it should be filled out, how it is structured, how to fill it out, and how to apply the scoring and the weighting.
       Debate: Questions and answers.
13:00  Lunch break
14:30  Group exercise: participants will assess the quality of one CPE report.
       Examples: Examples of EQAs.
16:00  Group presentations: results of the exercise will be presented and discussed with the entire group.
16:15  Coffee break
16:20  Beyond the reporting phase: management response, dissemination and follow-up
       Learning points: The evaluation report is distributed to stakeholders in the country and at UNFPA headquarters. The CO and relevant services prepare a management response to the evaluation recommendations. The report is made available to the UNFPA Executive Board by the time of approving a new CPD. The report, the Evaluation Quality Assessment and the management response will be published on the UNFPA evaluation webpage: http://web2.unfpa.org/public/about/oversight/evaluations/. Follow-up of recommendations one year later.
       Examples: Presentation of a concrete example: a completed management response sheet.
16:40–17:15  Debate: Open discussion on the role of the evaluation managers in the reporting and dissemination phases.