Monthly Corner

 IDH Publication, 2026

Gender-Based Violence (GBV) is not just a social issue; it's a systemic challenge that undermines agricultural value chains.

In rural and isolated areas, GBV threatens women’s safety, limits their economic participation, and weakens food security. When women cannot work safely, entire communities lose resilience, and businesses lose productivity. Climate resilience strategies that overlook gendered risks leave communities exposed and women vulnerable.

Ending GBV is essential for building equitable, sustainable, and climate-resilient agri-food systems. It's not only a human rights imperative but also central to climate adaptation and economic stability.

The good news? Solutions work. Programs like the Women’s Safety Accelerator Fund (WSAF) demonstrate that addressing GBV can enhance productivity, strengthen workforce morale, and improve brand reputation. Safe, inclusive workplaces aren’t just good ethics; they’re smart business.

Gurmeet Kaur Articles

Luc Barriere-Constantin Article

This article draws on the experience gained by The Constellation over the past 20 years. It also proposes a new M&E and Learning framework that community-focused organisations can adopt and adapt in future projects.

Devaka K.C. Article

Sudeshna Sengupta Chapter in the book "Dialogues on Development", edited by Prof Arash Faizli and Prof Amitabh Kundu

Vacancies

  • We’re Hiring: National Evaluation Consultant – Bangladesh

UN Women is recruiting a National Evaluation Consultant (Bangladesh) to support the interim evaluation of the Joint Regional EmPower Programme (Phase II).

This is a great opportunity to work closely with the Evaluation Team Leader and contribute to generating credible, gender-responsive evidence that informs decision-making and strengthens programme impact.

📍 Location: Dhaka, Bangladesh (home-based with travel to project locations)
📅 Apply by: 24 February 2026, 5:00 PM
🔗 Apply here: https://lnkd.in/gar4ciRr

If you are passionate about feminist evaluation, gender equality, and rigorous evidence that drives change, or know someone who is, please apply or share within your networks.

  • Seeking Senior Analyst - IPE Global

About the job

IPE Global Ltd. is a multi-disciplinary development sector consulting firm offering a range of integrated, innovative and high-quality services across several sectors and practices. We offer end-to-end consulting and project implementation services in the areas of Social and Economic Empowerment, Education and Skill Development, Public Health, Nutrition, WASH, Urban and Infrastructure Development, Private Sector Development, among others.

Over the last 26 years, IPE Global has successfully implemented over 1,200 projects in more than 100 countries. The group is headquartered in New Delhi, India, with five international offices in the United Kingdom, Kenya, Ethiopia, the Philippines, and Bangladesh. We partner with multilateral and bilateral agencies, governments, corporates, and not-for-profit entities to anchor development agendas for sustained and equitable growth. We strive to create an enabling environment for path-breaking social and policy reforms that contribute to sustainable development.

Role Overview

IPE Global is seeking a motivated Senior Analyst – Low Carbon Pathways to strengthen and grow its Climate Change and Sustainability practice. The role will contribute to business development, program management, research, and technical delivery across climate mitigation, carbon markets, and energy transition. This position provides exceptional exposure to global climate policy, finance, and technology, working with a team of high-performing professionals and in collaboration with donors, foundations, research institutions, and public agencies.

For more details, please go through the full job posting.

Hi Gender and Eval Community!

Over on my blog, free-range evaluation, I recently shared some thoughts on how my colleagues and I have been working to support and promote "evaluative thinking," especially among "non-evaluators" (i.e., program implementers who don't see themselves as evaluators, or who maybe even dislike evaluation). 

Here, I share some of those thoughts, in the hope of learning from you all what you do to foster evaluative thinking with the people with whom you interact.

I was inspired by a recent post on the Stanford Social Innovation Review blog that mentioned the importance of evaluative thinking. The post, “How Evaluation Can Strengthen Communities,” is by Kien Lee and David Chavis, principal associates with Community Science.

They describe how—in their organization’s efforts to build healthy, just, and equitable communities—supporting evaluative thinking can provide “the opportunity for establishing shared understanding, developing relationships, transforming disagreements and conflicts, engaging in mutual learning, and working together toward a common goal—all ingredients for creating a sense of community.” Along with Jane Buckley and Guy Sharrock, in our work to promote evaluative thinking in Catholic Relief Services and other community development organizations, we have definitely seen this happen as well.

But how does one support evaluative thinking? On aea365 and in an earlier post here, we share some guiding principles we have developed for promoting evaluative thinking. Below, I briefly introduce a few practices and activities we have found to be successful in supporting evaluative thinking (ET).

Before I do that, though, I must first give thanks and credit to both the Cornell Office of Research on Evaluation, whose Systems Evaluation Protocol guides the approach to articulating theories of change that has been instrumental in our ET work, and to Stephen Brookfield, whose work on critical reflection and teaching for critical thinking has opened up new worlds of ET potential for us and the organizations with which we work! Now, on to the practices and activities:

  • Create an intentional ET learning environment
    • Display logic models or other theory of change diagrams in the workplace—in meeting rooms, within newsletters, etc.
    • Create public spaces to record and display questions and assumptions.
    • Post inspirational questions, such as, “How do we know what we think we know?” (as suggested by Michael Patton here).
    • Highlight the learning that comes from successful programs and evaluations and also from “failures” or dead ends.
  • Establish a habit of scheduling meeting time focused on ET practice
    • Have participants “mine” their logic model for information about assumptions and how to focus evaluation work (for example, by categorizing outcomes according to stakeholder priorities) (Trochim et al., 2012).
    • Use “opening questions” to start an ET discussion, such as, “How can we check these assumptions out for accuracy and validity?” (Brookfield, 2012, p. 195); “What ‘plausible alternative explanations’ are there for this finding?” (see Shadish, Cook, & Campbell, 2002, p. 6).
    • Engage in critical debate on a neutral topic.
    • Conduct a media critique (critically review and identify assumptions in a published article, advertisement, etc.) (an activity introduced to us by evaluation capacity building pioneer Ellen Taylor-Powell).
  • Use role-play when planning evaluation work
    • Conduct a scenario analysis (have individuals or groups analyze and identify assumptions embedded in a written description of a fictional scenario) (Brookfield, 2012).
    • Take on various stakeholder perspectives using the “thinking hats” method in which participants are asked to role play as a particular stakeholder (De Bono, 1999).
    • Conduct an evaluation simulation (simulate data collection and analysis for your intended evaluation strategy).
  • Diagram or illustrate thinking with colleagues
    • Have teams or groups create logic and pathway models (theory of change diagrams or causal loop diagrams) together (Trochim et al., 2012).
    • Diagram the program’s history.
    • Create a system, context and/or organization diagram.
  • Engage in supportive, critical peer review
    • Review peer logic models (help identify leaps in logic, assumptions, strengths in their theory of change, etc.).
    • Use the Critical Conversation Protocol (a structured approach to critically reviewing a peer’s work through discussion) (Brookfield, 2012).
    • Take an appreciative pause (stop to point out the positive contributions, and have individuals thank each other for specific ideas, perspectives or helpful support) (Brookfield, 2012).
  • Engage in evaluation
    • Ensure that all evaluation work is participatory and that members of the organization at all levels are offered the opportunity to contribute their perspectives.
    • Encourage members of the organization to engage in informal, self-guided evaluation work.
    • Provide access to the tools and resources necessary to support all formal and informal evaluation efforts (including the support of external evaluators, ECB professionals, data analysts, etc.).

What other techniques and practices have you used to promote and support evaluative thinking?

A theory of change 'pathway model' from CRS Zambia, helping practitioners to identify and critically reflect on assumptions.

Note: The ideas above are presented in greater detail in a recent article in the American Journal of Evaluation:

Buckley, J., Archibald, T., Hargraves, M., & Trochim, W. M. (2015). Defining and teaching evaluative thinking: Insights from research on critical thinking. American Journal of Evaluation. Advance online publication. doi:10.1177/1098214015581706

----------------------------

References:

Brookfield, S. (2012). Teaching for critical thinking: Tools and techniques to help students question their assumptions. San Francisco, CA: Jossey-Bass.

De Bono, E. (1999). Six thinking hats. London: Penguin.

Shadish, W., Cook, T., & Campbell, D. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.

Trochim, W., Urban, J. B., Hargraves, M., Hebbard, C., Buckley, J., Archibald, T., Johnson, M., & Burgermaster, M. (2012). The Guide to the Systems Evaluation Protocol (V2.2). Ithaca, NY. Retrieved from https://core.human.cornell.edu/research/systems/protocol/index.cfm


Comment by Md. Safiur Rahman on June 25, 2015 at 9:20

Dear Tom, Thanks indeed for your nice effort. Really Excellent.

Comment by Albie Colvin on June 25, 2015 at 3:02

Thanks for sharing Tom. Lots of great ideas and links to really useful information. Much appreciated!

Comment by Rituu B Nanda on June 24, 2015 at 21:46

Thanks for this valuable sharing, Tom. I love evaluation for the reflection and subsequent learning. Participatory action research is one tool I have found very helpful to promote reflection and action. I love it!

Comment by Rajib Nandi on June 24, 2015 at 15:25

Innovative ideas to share and promote "evaluative thinking" among "non-evaluators". Looking forward to reading the detailed article in the American Journal of Evaluation.
