It’s early morning in Minneapolis and the third day of the educational programme of the AEA 2019 is about to begin. A rather large group of people has gathered to be guided through mixed methods in evaluation by Donna Mertens, whose multiple affiliations include former editor of the Journal of Mixed Methods Research and past president of the American Evaluation Association. For the majority of participants, the topic is not new. Indeed, after years of debate on the limitations of exclusive reliance on quantitative or qualitative evaluation approaches, the mixed methods approach has become the first choice for many evaluators, particularly for evaluations in complex contexts where integrating community perspectives into an inquiry process by collecting qualitative data is essential.
Yet, for many of us waiting to attend this workshop, questions remained about how to develop and correctly apply mixed methods in evaluation. Donna was quick to encourage her learners, reaffirming that there is no one right way to mix methods in evaluation. What does shape the choice of mixed methods is the set of philosophical assumptions guiding the evaluator’s thinking about ethical practice, the nature of knowledge, and the nature of reality, which in turn lead to different stances in terms of methodological frameworks. Mixed methods can be used within any of these methodological frameworks.
What is important and, according to Donna, not always followed through in mixed methods evaluation design, is the synergy between the quantitative and qualitative aspects of the evaluation methodology. In other words, combining quantitative and qualitative data should provide an evaluation with insights that go beyond those coming from either type of data alone.
The mixed methods designs presented in this training and described in Donna’s new book Mixed Methods Design in Evaluation are guided by evaluation paradigms and their related branches (Methods, Use and Values) as defined by Marvin C. Alkin in his book Evaluation Roots. Mertens and Wilson added a fourth branch: Social Justice.
Briefly, the positivist and postpositivist paradigm is associated with the Methods Branch and is characterised by a primary concern with the effectiveness of interventions and the application of experimental evaluation designs. Quantitative methods and the choice of randomized controlled trials (RCTs) tend to dominate this branch.
By contrast, the Values Branch and the constructivist paradigm place value on cultural context, targeted populations and the reflection of the multiple realities they represent. Qualitative methods to study context and the meaning of the problem prevail over quantitative ones in the design of value-driven evaluations.
The Use Branch and the pragmatic paradigm are commonly adopted by evaluators who are primarily concerned with the evaluation’s use by a particular group of stakeholders. In this context, evaluators design methodologies viewed as credible by this group, prioritising the most practical ways of responding to the evaluation questions.
Finally, the Social Justice Branch and the transformative paradigm are explicitly concerned with issues of power dynamics, systemic discrimination and social transformation, and emphasize human rights perspectives and the voices of marginalized groups. Feminist theory, disability rights theory, and critical race theory, among others, can effectively guide transformative evaluation approaches.
Of the four main paradigms, transformative evaluation most strongly defines Donna as an evaluator and guides her evaluation designs, implementation and use. In this blog, Donna and Stephen Porter compiled a valuable list of resources on transformative evaluation approaches that could be relevant for many readers of Gender and Evaluation.
After a quick run through the theory, workshop participants practice creating mixed methods designs along the lines of the major evaluation branches described above.
At the risk of oversimplifying the training content, the following best practices suggested by Donna should be followed by prospective evaluators:
For those who are interested in knowing more, here are links to a case study, evaluation report, and a transformative evaluation toolkit generously shared by Donna: