The why and the how of Mixed Methods evaluation designs, with Donna Mertens.

It’s early morning in Minneapolis and the third day of the educational programme of the AEA 2019 conference is about to begin. A rather large group of people has gathered to be guided through mixed methods in evaluation by Donna Mertens, whose many roles have included editor of the Journal of Mixed Methods Research and president of the American Evaluation Association. For the majority of participants, the topic is not new. Indeed, after years of debate on the limitations of relying exclusively on quantitative or qualitative evaluation approaches, the mixed methods approach has become the first choice for many evaluators, particularly for evaluations in complex contexts where integrating community perspectives into the inquiry process by collecting qualitative data is essential.

Yet, for many of us waiting to attend this workshop, questions still remained about how to develop and correctly apply mixed methods in evaluation. Donna was quick to encourage her learners, reaffirming that there is no one right way to mix methods in evaluation. What shapes the choice of a mixed methods design are the different philosophical assumptions guiding the evaluator’s thinking about ethical practice, the nature of knowledge, and the nature of reality, which in turn lead to different stances in terms of methodological frameworks. Mixed methods can be used within any of these methodological frameworks.

What is important and, according to Donna, not always followed through in mixed methods evaluation designs, is the synergy between the quantitative and qualitative aspects of the evaluation methodology. In other words, combining quantitative and qualitative data should provide an evaluation with insights that go beyond those that either type of data could offer alone.

The mixed methods designs presented in this training and described in Donna’s new book, Mixed Methods Design in Evaluation, are guided by evaluation paradigms and their related branches as defined by Marvin C. Alkin in his book Evaluation Roots: the Methods, Use and Values branches. Mertens and Wilson added a fourth branch: Social Justice.

Briefly, the positivist and postpositivist paradigm is associated with the Methods Branch and is characterised by a primary concern with the effectiveness of interventions and the application of experimental evaluation designs. Quantitative methods, and randomized controlled trials (RCTs) in particular, tend to dominate this branch.

By contrast, the Values Branch and the constructivist paradigm place importance on cultural context, targeted populations and the multiple realities they represent. Qualitative methods, used to study the context and the meaning of the problem, prevail over quantitative ones in the design of value-driven evaluations.

The Use Branch and the pragmatic paradigm are commonly adopted by evaluators who are primarily concerned with the evaluation’s use by a particular group of stakeholders. In this context, evaluators design methodologies viewed as credible by this group, prioritising the most practical ways of answering the evaluation questions.

Finally, the Social Justice Branch and the transformative paradigm are explicitly concerned with issues of power dynamics, systemic discrimination and social transformation, and emphasize human rights perspectives and the voices of marginalized groups. Feminist theory, disability rights theory, and critical race theory, among others, can effectively guide transformative evaluation approaches.

Of the four main paradigms, the transformative paradigm most strongly defines Donna as an evaluator and guides her evaluation designs, their implementation and use. In this blog, Donna and Stephen Porter compiled a valuable list of resources on transformative evaluation approaches that could be relevant for many readers of Gender and Evaluation.

After a quick run-through of the theory, workshop participants practice creating mixed methods designs along each of the major evaluation branches described above.

At the risk of oversimplifying the training content, here are the best practices that Donna suggested prospective evaluators follow:

  • Be explicit about the mixed methods design being used.
  • Determine that the study does use both quantitative and qualitative data.
  • Follow the criteria available for judging quantitative and qualitative studies in Research and Evaluation in Education and Psychology. For example, different criteria would be used to assess the quality of a randomized controlled trial than of a survey or an ethnographic case study.
  • Examine the points at which qualitative and quantitative methods are integrated in the study. Note how this is done and how the integration results in a stronger study than would be possible by one approach alone.
  • Situate the work within the existing literature on mixed methods approaches and indicate how this approach expands methodological understanding.
  • Examine the philosophical framing claimed for the study and determine the extent to which the study reflects the assumptions of the chosen framework.

For those who are interested in knowing more, here are links to a case study, evaluation report, and a transformative evaluation toolkit generously shared by Donna:

Comment by Fabiola Amariles on February 11, 2020 at 1:26

Thank you, Alena, for summarizing the teachings of Donna Mertens in her workshop. And thanks to Donna for continuing to spread not only the methods but also the principles for applying the transformative paradigm to evaluation.

It is a privilege to participate in the knowledge exchange events that Donna offers! Gradually the principles of social justice and human rights permeate the evaluation through the mixed methods she teaches. I have also participated in her workshops and the lessons are huge!

I am glad that you also shared academic resources to learn more about mixed methods and transformative evaluation. 

Hugs to the two of you! 

Comment by Azza on February 10, 2020 at 21:03

Very useful report

Comment by Rituu B Nanda on February 10, 2020 at 17:53

Thanks Alena! Thanks Donna!

Donna, I have had the honour of listening to you face to face and hearing how you bring out the inequities (differently abled, race, gender, location) and the nuances around them. Ownership by those affected in the evaluation has been a game changer in my experience.

Comment by Donna Mertens on December 10, 2019 at 2:18

Conducting this workshop always keeps me motivated to want to improve how we as a community do evaluations. The participants come with such critically important issues and I love seeing how they apply the principles of transformative evaluation to their evaluation projects. We had participants who focused on improving safety for sex workers in Mozambique and reducing cyber bullying of members of the LGBT community, among others. 
