
Doing things right or doing the right things? Lessons from participation in EES, 2014

Participating in EES 2014 was an excellent opportunity to rethink assumptions, meet old friends again and see bits of Dublin.

What did I learn from the four days I spent at EES? From the 'systems thinkers' I learnt the distinction between doing things right and doing the right things. If one has the wrong theory of change but a plan of action, one can end up doing the planned things right, but not the right things to address the issue one wants to address, be it poverty, HIV/AIDS, equity, etc. At the same time, there is a need for reflection on what is "right", which can itself be contested. Looking back, it is easier in evaluations to assess whether planned things are being rightly implemented than to ask "are the right things being done?" and "whose views are reflected in determining what is right?" (Hummelbrunner, 2014)

Yet another lesson is the need to root our evaluations in theories. While theories of evaluation are becoming popular, the panel on 'what we can learn from classics' made a brilliant case that we root our evaluation work in theories of sociology, political science, economics, etc. (including Karl Marx!). In fact, the multidisciplinary exploration of years gone by contrasts with the current tendency to specialise. However, nobody made the case for delving into what Simone de Beauvoir and other feminists of that era wrote.

The panel on real-time evaluations was brilliant and pointed to how agencies and individuals can do rapid assessments (3-4) in humanitarian situations and plan forward; a detailed evaluation does not make sense when the situation is changing rapidly. The importance of meeting diverse stakeholders (including men and women from the affected community) during real-time assessments was emphasised, but not that of involving primary stakeholders in planning ahead. In fact, accountability to marginalised groups in evaluations posed a challenge in almost all the sessions, plenaries and workshops that I attended.

While one tends to look at evaluations at the micro level, there were others evaluating the European regional cohesion programme. The indicators and the evaluation challenges were different: for example, data on cohesion was not always available at the regional level, and contribution is more difficult to assess. Yet, as women's groups, when we launch regional campaigns on women's rights, minority rights or LGBT rights, we need to learn how to assess regional advocacy and programmes.

I was taught that evaluation comes at the end of the project period. The workshop on evaluability assessments pointed to the need to examine, right after design, whether the theory of change is evaluable, whether the proposed MIS would generate the data needed to evaluate the project, and so on. Evaluability of projects and programmes is important, and it has to be factored in from the beginning.

On the whole, the EES 2014 committee not only 'did things right', but also 'did the right things'. Nevertheless, one lesson is not to have gender-exclusive panels but to integrate gender-specific presentations into the mainstream. Greater effort also needs to be made to bring developing-country voices into the large plenaries; efforts from both developed and developing countries are essential. It would then perhaps not be European, but perhaps the time is right for a Global Evaluation Society!

Reference:

Hummelbrunner, R. (2014) 'Systems Thinking, Learning and Values in Evaluation' (presentation), EES 2014, Dublin, 2 October 2014.


Comment by Rituu B Nanda on October 10, 2014 at 22:37

Response from our member Dr Suresh Sundar through email

Thanks.

Excellent take home messages presented by Ranjani.
If there is a good monitoring process in place, accompanied by a concurrent evaluation system with stakeholder participation, there will be ample scope for mid-course corrections, leaving very few surprises during the mandatory terminal evaluations.
With best wishes,  

Suresh Sundar

Comment by Barbara Befani on October 10, 2014 at 19:15

Hi, thanks for sharing this. As a participant in the conference, as well as a member of the organising committee and an EES board member, I am particularly glad you enjoyed the conference.

I wanted to flag that Richard Hummelbrunner has written an article on the same topic as this presentation, which will be published in IDS Bulletin 46.1, due out in January 2015 in a special issue edited by myself, Ben Ramalingam and Elliot Stern. The IDS Bulletin is not technically open access, but for this special issue Wiley will make an exception and it will be freely downloadable (including Richard's article).
