Evaluation of UN Women's Work on the Care Economy in East and Southern Africa - Evaluation Report
A regional study of gender equality observatories in West and Central Africa, carried out by Claudy Vouhé for UN Women
Sources: UN Women
This regional study offers an inventory and analysis of the legal frameworks of gender observatories, including their mandates, functions and missions. It is based on exchanges with 21 countries, in particular the eleven that have created observatories. It compares the observatories' internal organisation and budgets across countries, examines operational practices, notably the degree of involvement in the collection and use of data, and identifies obstacles and good practices in influencing public policies for gender equality. Finally, the study sets out strategic recommendations for the observatories, their supervisory bodies, and technical and financial partners.
MSSRF Publication - November 2025 - Shared by Rajalakshmi
Ritu Dewan - EPW editorial comment on Labour Codes
Eniola Adeyemi - Articles on Medium, 2025
An analysis of the “soft life” conversation as it emerges on social media, unpacking how aspirations for ease and rest intersect with broader socio-economic structures, gendered labour expectations, and notions of dignity and justice
Tara Prasad Gnyawali - Article, 2025
This article focuses on a community living in a wildlife corridor that links India and Nepal: the Khata Corridor, which connects Bardiya National Park in Nepal with Katarnia Wildlife Sanctuary in Uttar Pradesh, India.
It shows how wildlife movement through the corridor affects community livelihoods, mobility, and social inclusion, with differential impacts on farming and marginalised communities.
Lesedi Senamele Matlala - Recent Article in Evaluation Journal, 2025
Vacancy | GxD hub, LEAD/IFMR | Research Manager
Hiring a Research Manager to join us at the Gender x Digital (GxD) Hub at LEAD at Krea University, Delhi.
As a Research Manager, you will lead and shape rigorous evidence generation at the intersection of gender, AI, and digital systems, informing more inclusive digital policies and platforms in India. This role is ideal for someone who enjoys geeking out over measurement challenges, causal questions, and the nuances of designing evaluations that answer what works, for whom, and why. We welcome applications from researchers with strong mixed-methods expertise, experience designing theory- or experiment-based evaluations, and a deep commitment to gender equality and digital inclusion.
Must-haves:
• 4+ years of experience in evaluation and applied research
• Ability to manage data quality, lead statistical analysis, and translate findings into clear, compelling reports and briefs
• Strong interest in gender equality, livelihoods, and digital inclusion
• Comfort with ambiguity and a fast-paced environment, as the ecosystem evolves and pivots to new areas of inquiry
📍 Apply here: https://lnkd.in/gcBpjtHy
📆 Applications will be reviewed on a rolling basis until the position is filled.
So the sooner you apply, the better!
Hi Gender and Eval Community!
Over on my blog, free-range evaluation, I recently shared some thoughts on how my colleagues and I have been working to support and promote "evaluative thinking," especially among "non-evaluators" (i.e., program implementers who don't see themselves as evaluators, or who maybe even dislike evaluation).
Here, I share some of those thoughts, in the hope of learning from you all what you do to foster evaluative thinking with the people with whom you interact.
I was inspired by a recent post on the Stanford Social Innovation Review blog that mentioned the importance of evaluative thinking. The post, “How Evaluation Can Strengthen Communities,” is by Kien Lee and David Chavis, principal associates with Community Science.
They describe how—in their organization’s efforts to build healthy, just, and equitable communities—supporting evaluative thinking can provide “the opportunity for establishing shared understanding, developing relationships, transforming disagreements and conflicts, engaging in mutual learning, and working together toward a common goal—all ingredients for creating a sense of community.” Along with Jane Buckley and Guy Sharrock, in our work to promote evaluative thinking in Catholic Relief Services and other community development organizations, we have definitely seen this happen as well.
But how does one support evaluative thinking? On aea365 and in an earlier post here, we share some guiding principles we have developed for promoting evaluative thinking. Below, I briefly introduce a few practices and activities we have found to be successful in supporting evaluative thinking (ET). Before I do that, though, I must first give thanks and credit to both the Cornell Office of Research on Evaluation, whose Systems Evaluation Protocol guides the approach to articulating theories of change which has been instrumental in our ET work, and to Stephen Brookfield, whose work on critical reflection and teaching for critical thinking has opened up new worlds of ET potential for us and the organizations with which we work! Now, on to the practices and activities:
What other techniques and practices have you used to promote and support evaluative thinking?
A theory of change 'pathway model' from CRS Zambia, helping practitioners to identify and critically reflect on assumptions.
Note: The ideas above are presented in greater detail in a recent article in the American Journal of Evaluation:
Buckley, J., Archibald, T., Hargraves, M., & Trochim, W. M. (2015). Defining and teaching evaluative thinking: Insights from research on critical thinking. American Journal of Evaluation. Advance online publication. doi:10.1177/1098214015581706
----------------------------
References:
Brookfield, S. (2012). Teaching for critical thinking: Tools and techniques to help students question their assumptions. San Francisco, CA: Jossey-Bass.
De Bono, E. (1999). Six thinking hats. London: Penguin.
Shadish, W., Cook, T., & Campbell, D. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.
Trochim, W., Urban, J. B., Hargraves, M., Hebbard, C., Buckley, J., Archibald, T., Johnson, M., & Burgermaster, M. (2012). The Guide to the Systems Evaluation Protocol (V2.2). Ithaca, NY. Retrieved from https://core.human.cornell.edu/research/systems/protocol/index.cfm.
Dear Tom, Thanks indeed for your nice effort. Really excellent.
Thanks for sharing Tom. Lots of great ideas and links to really useful information. Much appreciated!
Thanks for this valuable sharing, Tom. I love evaluation for the reflection and subsequent learning. Participatory action research is one tool I have found very helpful to promote reflection and action. I love it!
Innovative ideas to share and promote "evaluative thinking" among "non-evaluators". Looking forward to reading the detailed article in the American Journal of Evaluation.