At Southern Hemisphere we have always adopted an organisational development and organisational learning approach to monitoring and evaluation. We are excited that learning is now emerging as a central consideration in much of the current thinking in our field.

We conceptualise the Monitoring and Evaluation for Learning (MEL) cycle as shown in the diagram below. The cycle can run quickly (e.g. after a series of events) or over a longer period, such as a three-year project.

The evidence collected through monitoring and evaluation provides the opportunity to review and reflect, and to motivate, account, advocate, showcase, correct and plan.

But how do we ensure that the results are understood, believed and internalised? How do we build buy-in and commitment to the new ways of being or operating? How do we ensure information is not only used for reporting, but also transforms organisations into learning organisations that are open, transformational, self-critical and responsive?

It is important to acknowledge that the degree to which monitoring and evaluation results are received and utilised is highly dependent on the type of organisational culture that exists: the leadership style, the developmental approach, perceptions of criticism and “failure”, communication styles, and so on. However, there are some things that evaluators and development practitioners can do to optimise learning:

  • Monitor and evaluate with the end in mind, or, as Patton put it, with the “intended use by intended users” in mind. Be clear from the outset about what data is needed, for whom, and how it should be packaged to facilitate use and learning.
  • Provide a solid foundation for decision making by producing evidence that is valid and reliable. This creates confidence in the data and strengthens the arguments and recommendations made.
  • Speak about, review and reflect on the data. A report alone is not enough. People need to process findings through questioning, sharing and reflecting. It is crucial to involve stakeholders in developing or reviewing recommendations.
  • Plan for the future. Key recommendations should be prioritised collectively by stakeholders. These recommendations should then be articulated in programme or strategic plans, which should be monitored.
  • Expect a human reaction and help stakeholders to work through it. Learning from monitoring and evaluation often requires change management, so be prepared for resistance from staff and work towards getting their buy-in to the change.
  • Facilitation skills are crucial! Ask the right questions, and remember that the key is not always to convince people but to facilitate them towards finding solutions themselves. Ask questions that explore both the facts and the feelings about the situation:
    • What does the evidence say?
    • What will happen if we do not address this issue – what are the potential risks?
    • What are the potential advantages of welcoming the change?
    • What makes you apprehensive about the change?
  • Reinforce the need for evidence-based decision making, especially when staff or participants want to revert to old ways of being and doing.
  • Include the right people in the learning space, as learning happens through being involved.

Remember that having good M&E plans, good-quality data and good reports is only half the job! Helping organisations and people navigate their way through information, past their own assumptions, beliefs, theories and attachments, so that they can transform, rediscover, expand and collapse, is a job well done!

Author: Wilma Wessels, partner and senior consultant at Southern Hemisphere Consultants.


Additional Resources

  • Dena Lomofsky (Southern Hemisphere Consultants) talks about learning as the new challenge for think tanks at the OTT Conference.
  • BetterEvaluation’s blog series on Learning and Adaptive Management.
  • Davidson, Jane (2015). “Question-Driven Methods or Method-Driven Questions? How We Limit What We Learn by Limiting What We Ask.” Journal of MultiDisciplinary Evaluation, 11(24).
  • Salib, Monalisa (2016). “Adaptive Management: If Not Now, When?” USAID Learning Lab Blog, 15 August 2016.
  • Guijt, Irene (2010). “Exploding the Myth of Incompatibility between Accountability and Learning.” In Ubels, J. et al. (2010). Capacity Development in Practice: Improving on Results. SNV.
  • Patrizi et al. (2013). “Eyes Wide Open: Learning as Strategy Under Conditions of Complexity and Uncertainty.” The Foundation Review.
  • Weyrauch, V. et al. (2010). Learners, Practitioners and Teachers: Handbook on Monitoring, Evaluating and Managing Knowledge for Policy Influence. Buenos Aires: CIPPEC.

This article was initially shared via Southern Hemisphere Consultants’ May 2017 newsletter and has been reposted with permission.