Establishing a process of Monitoring, Evaluation and Learning

How do we Evaluate Programmes?

 

An organization aiming at programme quality should establish a system of Monitoring and Evaluation. In fact, to ensure programme quality, programme managers need to use the feedback from monitoring and evaluation.

Attention: do not confuse project monitoring with project evaluation. 

Project monitoring is equivalent to project performance measurement and belongs to the execution and control phase of project management. Monitoring and reporting project performance means reviewing project progress against expected milestones, timelines and costs. The purpose of reporting is to share the information required to manage CSSQ (Cost, Scope, Schedule, and Quality). With reference to tasks, actions and projects, monitoring looks at what is currently happening. It serves the objective of delivering the project outputs with the expected quality and within the time and cost constraints defined in the project plan document. The deliverable of project monitoring is the Project Status (or progress) report (see project execution templates).
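By way of illustration only (the milestone names, fields and thresholds below are hypothetical, not drawn from the project execution templates referenced here), the variance check behind a simple Project Status report can be sketched in Python:

```python
from dataclasses import dataclass

@dataclass
class Milestone:
    name: str
    planned_cost: float   # budgeted cost for work planned to date
    actual_cost: float    # cost actually incurred to date
    planned_days: int     # elapsed days planned at this checkpoint
    actual_days: int      # elapsed days actually used

def status_report(milestones):
    """Compare actual progress against the plan and flag cost/schedule variances."""
    lines = []
    for m in milestones:
        cost_var = m.actual_cost - m.planned_cost      # positive = over budget
        schedule_var = m.actual_days - m.planned_days  # positive = behind schedule
        flag = "ON TRACK" if cost_var <= 0 and schedule_var <= 0 else "ATTENTION"
        lines.append(f"{m.name}: cost variance {cost_var:+.0f}, "
                     f"schedule variance {schedule_var:+d} days [{flag}]")
    return "\n".join(lines)

report = status_report([
    Milestone("Baseline survey", 10_000, 9_500, 30, 28),
    Milestone("Training rollout", 25_000, 27_000, 60, 66),
])
print(report)
```

The point of the sketch is only that monitoring compares actuals against the plan and flags deviations for the status report; real templates will track scope and quality as well.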

Project evaluation is not part of project execution but a different phase of the programme cycle. Project evaluation is done not by the project manager but by the project sponsor (e.g. the Programme Manager) or by someone supporting her. It requires Project Status reports as a fundamental input. Its outputs are project evaluation reports and programme lessons learned. Evaluation is based on monitoring but adds a "judgment" on the correlation between activities performed, outputs delivered, changes induced, objectives achieved and impacts obtained upon the factors generating the problems and opportunities that motivated the project. While the delivery of project outputs is the responsibility of the project team, the achievement of project outcomes and objectives depends also on the way stakeholders use the project outputs to interact with the rest of the community and contribute to achieving the project objectives.

 

Monitoring and evaluation can be done by designated offices and persons within the organization or by external consultants (individuals or specialized organizations) (see job profile of an evaluation expert in a development organization).

As stakeholders increasingly demand accountability and transparency, it is becoming imperative that Impact Reports provide clear evidence of programme results.

M&E, when embedded at all levels of the organizational structure, is also strategic in making each team member more accountable for their specific tasks and in making teams accountable for their capacity to generate the communication climate that empowers each member to fulfil her/his role.

Because of the cyclical structure of programming, evaluation is the last stage of each ending cycle and the first stage of each new cycle, with the results of evaluation feeding into the other stages. In fact, programme managers need to use the feedback from monitoring and evaluation in order to: check whether the programme or project is being implemented according to plan; assess whether the programme/project leads to the changes or impacts that were anticipated; consider how sustainable the programme or project impact is likely to be; and identify key learning and action points to feed back into the programme or project and to inform future projects, programmes and policy.


Other resources:  IPTET Evaluation Ethics, Politics, Standards, and Guiding Principles

 

 

There are several reasons that prompt an organization to set up an evaluation team that is, to some degree, distinct from the programme team: avoiding conflicts of interest, generating transparency, fostering learning, and empowering staff.

 



However, we think we can, and should, do a lot better, particularly in the area of involving partners and the people with whom we work. To date there has not been much sharing and cross-fertilisation of ideas. And we are missing valuable opportunities for learning, sharing knowledge, and improving the quality of our programmes all over the world. This is why we have built on current practices to devise a new system to improve the way that we capture and share learning at all levels of our programme and to provide more robust and consistent evidence of our impact.

In the past, we carried out Annual Impact Reporting, a process that used set criteria (7 impact questions) to evaluate a selection of projects, with the findings pulled together into a global Programme Impact Report. This process served us well, but there is now a much stronger need for our organization to show more evidence of the quality, effectiveness, and impact of all of our work - to donors and other stakeholders. It’s also a huge opportunity for our organization to invest in programming, as it necessitates building in time at all levels to consult, reflect, and ask challenging questions about whether there is tangible evidence of our organization achieving significant change in the lives of poor people.

This new approach is called the ‘Monitoring, Evaluation and Learning (MEL) System’. It’s a ‘system’ through which we can build on what we do well and improve in areas where we fall short. It will ensure that we are more open to listening and learning from partners and the people with whom we work, and that we actively seek feedback and improve our programmes as a result. All of this learning, happening on different levels, will enable us to make more informed decisions about the future direction of our overall programme.

The new MEL system started this year with each region nominating two programmes to pilot a new way of monitoring programmes. The emphasis here is on improving the dialogue that we have with partners, and people directly affected by our programmes, in ways that promote listening and feedback, using formal and informal channels to reinforce trust, collaboration and genuine dialogue. We will try out new ways of assessing the effectiveness of programmes, starting with involving others in decisions about how best to do this, and through considering the use of video and photography as additional ways of sharing learning.

Each programme chosen will conduct a ‘Monitoring Review’ by March 2007 that will focus on reflecting upon and analysing our work. What we learn from this will be fed back into improving programmes and monitoring processes, and developing support materials for our organization staff and partners. The number of programmes delivering monitoring reviews will be significantly scaled up year on year, with a target of all programmes conducting regular reviews by the end of April 2010.

Country Learning Reviews will be held every year from 2007/08. These provide an opportunity for our organization staff, partners and OI affiliates to learn from knowledge gained from programme evaluations, monitoring reviews and other sources and to amend programmes as necessary. Regional Learning Reviews will be held every two years (starting in 2008/9) to share learning and knowledge from the countries and regional programmes, providing the opportunity to re-think and challenge existing approaches, and to make strategic decisions about the direction of future programmes. A key output of these Reviews will be the compilation and delivery of written reports that, together with other regional reports, will provide a strong basis for providing evidence of impact and accountability. An idea under discussion is a global forum, called our organization Reflects, which will be held every three years as an opportunity for staff, partners and external challengers to make decisions about the future direction of our overall programme.

It’s important that we measure ourselves through being aware of external standards and trends. This will be achieved through investigating the development and use of indicators as part of an Indicator Feasibility Study to enable us to make more informed decisions about how to measure success. We will also deliver Strategic Evaluations that focus on important issues that cut across all of our programmes, and Impact Assessments examining the difference our organization makes, for example in our HIV and AIDS work, or Watsan scale-up programme. This year’s strategic evaluation is on the theme of ‘partnership’, with a survey of approximately 900 partnerships and external consultants recruited to deliver field research and in-depth evaluations of work with 11 partner organisations. The learning from this piece of work, together with the partnership policy, will help to shape the way that our organization works with partners in the future.

Why are we doing all of this? To move our organization towards a stronger culture of learning and accountability, through putting in place new, joined-up monitoring and evaluation processes that enable us to learn and to assess the impact of what we do. These processes will be integrated with current activities, the aim being to improve the way that we do things, rather than changing things radically. This will result in monitoring and evaluation being seen as less of a separate activity and a chore, and more as adding real value to our programming through reflecting, listening to others, and challenging ourselves with new ideas and ways of doing things. There’s lots of good work going on already. We need to tap this knowledge and use what we have learned to increase our confidence and competence in our programme and improve the quality of our work, judging it by the outcomes and impact we achieve together with others.

 

A combination of the processes and outputs of the Monitoring, Evaluation and Learning System will enable continuous and collaborative learning about our programme amongst the range of our organization's stakeholders (poor people, partners, programme staff, managers, trustees and donors). This learning will be used to inform our strategic direction and transform the quality of our programmes.

Over a period of 5-10 years, the system will provide our organization with the body of evidence that we need to assess our performance and demonstrate our programme impact.

It is the means by which we will hold ourselves accountable for delivering changes in the lives of poor people.

 

Supporting country programmes to develop structured and systematic processes that are aligned with the organisation's Monitoring, Evaluation and Learning plans and processes, and recommending steps towards this end

 

The Programme Quality Advisor:

a)      Explains the Monitoring, Evaluation and Learning System and processes to the programme and project managers

b)      Assists managers in establishing these plans, systems and processes and monitors their implementation

 

 

 

 

Challenges (learning and accountability)

Considerations

Basic principles of programme quality in aid organizations are:

 

 

A proposal for establishing an M&E system: FMPS

See also

 

What is "quality" in development/humanitarian programmes and how can we set the "standards"?

What does it mean "putting gender equality at the heart of development and humanitarian work" and what does it imply at programme management level?

What are the links between knowledge management, learning and institutional development?

How can we manage collaborations so as to generate a sense of partnership and solidarity with key development stakeholders?

How can we manage collaborations  so as to improve one's own organizational capacity?

How can we manage collaborations  so as to  achieve high reputation?

Basic Concepts of Programme Cycle Management

How do we Develop Programmes

How do we Implement Programmes: Designing and Executing Projects

How do we Evaluate Programmes?

Programme/project financial management

The ability to support and understand the many different contexts/cultures in the developing world

Leading and Managing  -  Team Building  -  Motivating the Project Team

Training as a Communication Strategy

Communication Skills  -  Tasks, tools and elements of communication  -  Guideline: Why do organisations need to plan and manage their communication?

 

 

What is participation  -  Participation as broad-based concertation

The Six Steps in the partnership process

Identify project stakeholders

Involving Stakeholders

Introduction

Building Trust

Involving Directly Affected Stakeholders

Seeking Feedback

Involving the Voiceless

Involving the Opposition

Participatory Planning and Decision-Making

What Do Participatory Techniques Achieve?

Creating a Learning Mood

Building Community Capacity

Understanding Community Organizations

Building the Capacity of Community Organizations

Participation Methods and Tools

Introduction Methods and Tools

Appreciation-Influence-Control (AIC)

Objectives-Oriented Project Planning (ZOPP)

TeamUP

Beneficiary Assessment

Glossary of tools

-----------------------

See also: The Strategy Challenges

Communication and Evaluation

Communication and Knowledge Management

 

 

 

 

 

Link between Evaluation Criteria and the Logframe

(See also: the 3-level hierarchy of project/programme objectives).

 

 

 

 

 

 

 

Guidelines: How to conduct a useful M&E action 

Common Myths and Misconceptions About Evaluation

The ability to support and understand the many different contexts/cultures in the developing world

 

 


 

Examples of evaluation reports: http://www.asiandevbank.org/Evaluation/default.asp ; http://www.europeanevaluation.org

Other resources

A web site on evaluation for the francophone community: http://evaluation.francophonie.org/

Methodologies: http://www.netricerche.net/ ; http://www.delfo.net/methods/methods.htm

http://earth.prohosting.com/elecon/evaldevel/evaldevelopment.html

 

 

Actors: http://www.assirm.it/associ.htm ; http://www.esomar.nl/directory.htm ; http://www.ipsos-explorer.it/cosa/frame.htm (at the bottom right, click on "methods and tools"); http://www.gpf.it/cosafa.html

-------------

 EU resources