Comparing experiences with MSC, 5C Capability, V&A, Keystone survey

Dear all,

I am reviewing the history of a number of tools that my client organisation has used and adapted over the past 3-4 years. I would like to hear from other (particularly smaller) organisations that have used the following tools for monitoring and evaluating/demonstrating the impact of their work, and to share experiences regarding successes and challenges, and how the original tools were adapted (or eventually replaced) to fit the organisational context.

Most Significant Change/Stories of Change

5 Core Capability Framework

Voice & Accountability

Keystone Survey

Thank you so much and looking forward to hearing from you.

Karen


Replies to This Discussion

Have you heard about outcome harvesting? We are using it for an internal evaluation of an advocacy programme. It is a participatory method that codifies what I would consider good-practice evaluation for situations where you lack a baseline and outcomes are hard to measure with the typical indicator approach; on top of that, it allows you to capture unplanned outcomes. There is a guidebook: http://www.managingforimpact.org/sites/default/files/resource/outom...

and there are some published cases from the World Bank:  http://wbi.worldbank.org/wbi/Data/wbi/wbicms/files/drupal-acquia/wb...

As an evaluator, I helped map the impacts of educational projects in India and the Czech Republic using the Most Significant Change technique (in triangulation with other methods). I have also used it with children. Recently, a project on human rights education and another on Roma inclusion in preschools applied this method. The implementers, small NGOs, were also interested in using it in their social work. I have some notes at www.evaluace.com; please let me know if you need something specific!
Thanks for the other tips. :)
Inka

Dear Karen,

We have used MSC to gauge individual and collective efficacy and leadership skills among elected women representatives in local governance. Let me know your specific queries.

Regards

Manju

Hi Karen,

I have not used MSC fully in its purist form as described by Dart and Davies.

I facilitated an approximately nine-month-long community engagement project using a strength-based approach (community life competence) with youth and village communities on HIV and drug use in Nagaland, India. We ended the process with participatory action research. In the questionnaire we added the question: what was the most significant change? The data was collected by a core group from the community through focus group discussions with different groups (elders, drug users, youth, women), home visits, etc. The question provoked a lot of discussion on what the groups understood by change, and we also gathered many stories of change.

When the core group analysed the data (we facilitated the process), they chose 2-3 stories from among those collected.

I found MSC a good tool for reflection and learning. 

Thank you, Rituu - I think most tools need to be adapted to the context and are rarely used in a pure form. I think the discussion around specific issues like "most significant" or the meaning of change is the most rewarding, because it generates so much information about values, perceptions and expectations. I would agree it is great for reflection and learning.

Dear Karen, as for the Keystone Development Partnership Survey, you may check with acodev.be. They used it to survey relations between Belgian and Southern NGOs and had some interesting findings. They were also interested in engaging more development NGO platforms to build a relevant benchmark. As far as I know, the Czech platform was not interested, for financial reasons.

Thank you, Inka

Our project is using Rick Davies and MandE's Most Significant Change approach in schools across Turkey. A major stumbling block has been enabling school coordinators to understand what MSC stories should contain. "MSC" as a description of the activity should never have been used, because potential voices were lost to the daunting task of expressing something "significant". "Every little change" is perhaps better at the author level. I think the point also resonates with evaluators needing to ensure all voices are heard, in their rich diversity.

Our teacher coordinators saw it as a competition to get a really "significant" story, not as the rich tapestry you describe. This not only defeats the purpose but also creates a barrier to participation in the monitoring. For the purpose of story collection, "every little change" helps encourage the wider and more inclusive participation we are striving for. It is all in the communication, because as evaluators we of course understand the value of all the stories, negative and positive as you say.
At the same time, the teacher coordinators really want to understand the process they are involved in. I have found that presenting the index of all the stories and indicating how it enables us to identify trends of changes across different provinces (e.g. greater self-confidence in 5% of all stories, but 15% in a particular region) or unexpected changes that are actually widespread, and so on, really explains the process.

Thank you, Malcolm - this is interesting. I understand the dilemma about telling something "significant". I had a similar experience - or rather the opposite: there was not much reflection about significance at all. Your index of stories for identifying trends sounds like a very good idea.

Hi Karen, would you be able to give a bit more information on the specific tool you are referring to for Voice and Accountability? Thanks

This is a tool developed by CAFOD to measure partners' capacity to advocate and influence at the government, constituency, and corporate levels.

http://www.e-alliance.ch/fileadmin/user_upload/docs/Advocacy_Capaci...

Karen, many thanks! This is very interesting - have you ever applied it yourself? Thank you! :)

© 2024   Created by Rituu B Nanda.