Reflections on Kathmandu Evaluation Conclave - 2013

It was a really happy moment for me when my participation in the Evaluation Conclave was confirmed, and I started thinking about how best I could learn from the Conclave as well as contribute both my personal and ‘Feminist Evaluation’ experience.

For me, it was the first time being part of such a colossal Conclave devoted entirely to discussing, learning, thinking and planning about Evaluation! The Conclave was designed so that one could learn the maximum in a single event, through panels, workshops, breakaway sessions, coffee-shop meetings and expert lectures. It was a really well-designed and well-planned assembly, in which one could decide in advance which sessions she or he would attend.

The first day started with the keynote address by Katherine Hay, emphasizing ‘How does Evaluation matter?’ and ‘How do societies change?’ I really liked her analysis of the treasure of a country and ‘How to MEASURE the TREASURE?’ Another good quote was from Marco Segone, who said that Evaluation should not be seen as a funded project but as a MOVEMENT, involving participation from all segments of society. He also said that Evaluation should be an equitable movement and a tool to achieve the goal of ‘EQUITY FOR ALL’.

After the tea break, the breakaway panels started, and I attended the panel on gender. Prof. Chandra Bhadra, an academic activist from Nepal, was fabulous, and she shared her experience of women’s participation in the evaluation of some projects. She highlighted some major issues, such as women’s time, the use of language, allowing women to express their voices, their privacy and their illiteracy, along with other issues like poverty, violence against women, and ethnic, religious and cultural barriers. She said that development evaluators should now advocate for gender equality and be engaged in activism for women’s rights.

The next panellist was Dr. Ranjani K. Murthy, who shared her experience of participatory evaluation and the tools and methods used in it. Her presentation mainly focused on gender intensity, the specific constraints on women’s participation in evaluation, and possible solutions. Among all the issues, I liked her discussion of Happiness Mapping, Body Mapping, GDOL mapping, poverty ranking, and Gender-based Access to and Control over Resources.

The last panellist in the South Asian gender panel was Ms. Lama Kanchan, who shared her experience of measuring gender impacts in poverty alleviation programmes. She said that some of the challenges are:

  • Underestimation of social indicators
  • Gender-specific dimensions often get diluted in M&E, specifically in programmes related to poverty alleviation, and
  • The economic acceleration of women is often viewed as a threat!

 

The second half of the first day was devoted to workshops, and I participated in the ‘Real World Evaluation’ workshop, led by Dr. Jim Rugh. The workshop had two objectives:

 

  • To discuss how to promote a real-world and holistic approach to impact evaluation
  • To learn how to design evaluations under budget, time, data and political constraints.

 

The workshop started with a presentation on what real-world impact evaluation is, covering various design options, ways to reconstruct baseline (BLS) data, and alternative counterfactuals. He also emphasised mixed-method evaluation and different logic models.

 

He divided the participants into small groups and gave them different case stories. The participants mainly did activities on:

 

  1. Scoping
  2. Design
  3. Logic Model
  4. Reconstructing Baseline data
  5. Alternative Counterfactuals
  6. Realistic and holistic impact evaluation, and
  7. Negotiating Terms of Reference

 

[see details in www.RealWorldEvaluation.org]

 

The next day started with a lecture by Mr. Robert Chambers on Opportunities and Challenges for Participation in Evaluation. It was a very informative and interesting discussion, with a question-and-answer session. The panel on Impact Evaluation as a Tool for Gender-based Policy, by John Floretta, was also very interesting.

 

After the tea break, I joined the panel on ‘How to design and conduct gender responsive evaluations’, chaired by Ms. Shreyoshi Jha, UN WOMEN. The three panellists were Dr. Shiv Kumar, National Advisory Council, India; Ms. Rebecca Miller, Mahidol University, Bangkok; and Dr. Yamini Atmavilas, ASCI, India. Dr. Shiv Kumar shared a study by the Indian Institute of Population Studies on the young population, which showed that 47% of married girls said they were very scared or unhappy on the day of their wedding, one of the reasons being that they had heard that boys slap girls after marriage! He raised very valid points, such as: if we want to evaluate a programme for women, how do we proceed in a patriarchal society like India’s? And what should the design of the interventions be so as to take care of all the circumstances affecting women’s rights?

 

Rebecca shared her experience of ‘Safe Cities’, a UN WOMEN flagship programme in different countries to make places free of violence against women and girls. She also discussed the strategies and challenges of measuring sexual violence accurately, threats to validity, and ethical questions. She said that right now we need an integrated, innovative and participatory approach, which involves women and local organisations as equal partners.

 

Yamini, Associate Professor & Chair, Gender Studies Centre for Human Development, Administrative Staff College of India, has vast experience of evaluation from a gender point of view, and she shared her experience of the impact evaluation of a conditional cash transfer scheme of the Government of India, known as the Indira Gandhi Matritya Suraksha Yojana [IGMSY], which is being piloted in 50 districts of India across 12 states. There were two main questions for the impact evaluation: 1] whether women are experiencing an improvement in health, and 2] whether women are experiencing a financial benefit.

 

Other questions were whether women can negotiate time off from work during the critical period of pregnancy or lactation; whether they have time to access services, take rest, and listen to sessions on maternal and child health given by the service providers in the village; and whether they used the incentives in the way they wanted to.

For me, this study is very important from the gender-equity point of view, and as the scheme is now in its pilot phase, the findings are also very important to share with stakeholders for future scaling up.

 

After lunch there were six parallel workshops, among which I participated in a very new one, on the ‘Collaborative Learning Method’, led by Tessie Tzavaras Catsambas of EnCompass LLC. The workshop had five basic objectives, which are as follows:

 

1. Develop a basic understanding of Quality Improvement and Creative Design principles

2. Select a topic for collaborative learning in CoE

3. Use peer coaching for generating creative solutions and identifying best practices

4. Embed measurement and regular monitoring in Collaborative Learning

5. Plan for Collaborative Learning within CoE

 

According to the collaborative learning method, an ‘Improvement Collaborative’ can be formed through three steps – quality improvement, improvement in collaboration and creative design. An Improvement Collaborative is an organised network of a large number of sites (e.g. districts, facilities or communities) that work together for a limited period of time, usually 9 to 24 months, to rapidly achieve significant (often dramatic) improvements in a focused topic area through shared learning and intentional spread methods. The aim is to improve the system, processes, quality and efficiency of services.

The four basic principles of quality improvement are:

 

1] Understand client needs

2] Understand the system and processes of care

3] Teamwork, and  

4] Measure results

And No. 5 is the most important… MAKE CHANGES!
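
To make these principles a little more concrete, here is a very small sketch, entirely my own and not from the workshop, of how a handful of hypothetical sites in an improvement collaborative might measure results on a shared indicator and identify a best practice to spread; every site name, figure and variable in it is made up.

# Purely illustrative sketch (my own, not from the workshop): a tiny
# "improvement collaborative" in which several hypothetical sites measure
# a shared indicator, compare results, and spread the best-performing change.

from statistics import mean

# Hypothetical sites with baseline and post-change values of one shared
# indicator (say, % of clients counselled on maternal health).
sites = {
    "District A": {"baseline": 42, "after_change": 61, "change": "group counselling days"},
    "District B": {"baseline": 38, "after_change": 55, "change": "home-visit checklist"},
    "District C": {"baseline": 45, "after_change": 48, "change": "poster campaign"},
}

# Principle 4: measure results at every site.
for name, data in sites.items():
    data["improvement"] = data["after_change"] - data["baseline"]
    print(f"{name}: {data['baseline']}% -> {data['after_change']}% "
          f"(+{data['improvement']} points, change tried: {data['change']})")

# Shared learning and friendly competition: find the change that worked best
# and treat it as a candidate practice for the other sites to adopt.
best_site = max(sites, key=lambda s: sites[s]["improvement"])
print(f"\nAverage improvement: {mean(d['improvement'] for d in sites.values()):.1f} points")
print(f"Best practice to spread: '{sites[best_site]['change']}' from {best_site}")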

 

She shared that the basic model for improvement in collaborative learning is the PDSA (Plan, Do, Study, Act) Cycle, which is as follows:

 

Plan:

•Objectives
•Questions and predictions (why?)
•Planning (who, what, where, when)
•Plan for data collection
•Communicate the change, engage

Do:

•Carry out the plan
•Document problems and unexpected observations
•Begin data analysis

Study:

•Complete the analysis of the data (impact of intervention?)
•Compare data to predictions
•Summarize what was learned

Act:

•Take action based on results
•What changes are to be made?
•Next cycle?
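
For readers who find it easier to follow a process written out as code, here is a minimal sketch of one turn of the PDSA cycle. It is purely my own illustration, not material from the workshop, and every function name, figure and message in it is hypothetical.

# Purely illustrative sketch (my own, not from the workshop): one turn of a
# PDSA cycle expressed as plain Python functions. All names and numbers here
# are hypothetical.

def plan():
    """Plan: objectives, questions and predictions, who/what/where/when,
    and how data will be collected."""
    return {
        "objective": "reduce average clinic waiting time",
        "prediction": "a triage desk cuts waiting time by 20%",
        "data_plan": "record waiting time (minutes) for a sample of visits",
    }

def do(the_plan):
    """Do: carry out the plan, document problems and unexpected
    observations, and begin collecting data."""
    waiting_minutes = [35, 28, 30, 25, 22]          # made-up measurements
    problems = ["triage desk unstaffed on market day"]
    return waiting_minutes, problems

def study(the_plan, waiting_minutes):
    """Study: complete the analysis, compare the data with the prediction,
    and summarise what was learned."""
    average = sum(waiting_minutes) / len(waiting_minutes)
    return (f"average waiting time was {average:.0f} minutes; "
            f"prediction was: {the_plan['prediction']}")

def act(learning):
    """Act: take action based on results and decide on the next cycle."""
    print("Learned:", learning)
    print("Decide: adopt, adapt or abandon the change, then plan the next cycle.")

# One turn of the cycle: Plan -> Do -> Study -> Act
p = plan()
data, issues = do(p)
print("Problems noted:", issues)
act(study(p, data))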


According to my understanding, there are a few things in the collaborative learning method that are quite new compared with traditional approaches; collaboratives are built on:

 

  • Common problem, shared by all members
  • Structured cross-learning
  • Friendly competition
  • Shared best practices
  • Rapid spread

For details, please visit www.encompassworld.com.

She also facilitated group work with four groups of participants. Each group selected a case and practised the collaborative learning method. Her ‘Peer Coaching Approach’ tool is very interesting, and CINI will use it in its adolescent and young people’s programmes.

The next day started with the keynote speech by Michael Quinn Patton on Innovative Directions for Evaluation of Development, followed by six workshops, among which I participated in the one on Participatory Evaluation, led by Robert Chambers.

Other panels worth mentioning were ‘Evaluation of Socially Responsible Business Models’ by Ravi Verma, ICRW, and ‘Use and Usability of Evaluation’ by Murray Saunders, among others.

Both as an individual and on behalf of my organisation, I have come back with a wealth of knowledge, new ideas, clarification on some issues, and avenues and networks through which we can promote participatory feminist evaluation in India, especially in eastern India. As CINI works both at the grassroots level with a network of women’s CBOs and at the state and national levels on policy advocacy, it was a great opportunity to promote feminist and gender-friendly evaluation approaches. I have already become a member of the Community of Evaluators and a Task Group member of the Institutional Strengthening group.

I hope that with all this learning and experience, we can promote feminist and participatory evaluation, both as individuals and as part of the organisation!



Comment by Saeid Nouri Neshat on May 1, 2013 at 16:06

This model of PDSA is so interesting; it is the basic model for improvement in collaborative learning, as you have mentioned. I knew there was PDCA, but here, instead of C for Check, you have S for Study. Of course, both Check and Study are thinking phases, meaning that you collect data and start to analyse it. Anyway, thank you. Effective and useful.
