We believe that systems thinking has a place in evaluation because it lets us think strategically about complexity and the multiple intersectional influences that impact an intervention. What do you think?
I support the statement; nevertheless, I see systems thinking in evaluation being undermined by weak monitoring and a lack of thematic studies on context changes during implementation; reliable information about shifts in decision-making positions, from the household to the institutional level, is almost entirely absent. Combined with this, the research time allotted for an evaluation is too short, which builds in bias from the start in geographic spread, in the actors included, in timing, and so on.
Finally, systems thinking is too complex for most professionals to handle and to translate into their daily work routine, so such an evaluation might disempower the end users of the evaluation results.
Thanks Jolanda for your comments. The more I consider systems thinking in evaluation processes, the more I realize how much there is to learn about how to leverage and use its methodologies. For me, this exploration translates into increased opportunities.

A systems thinking approach uses boundary concepts as the fundamental, iterative element of any analysis. Boundaries around systems are physical, personal and/or social constructs and worldviews (perspectives). They define the limits of something without necessarily fixing those limits, while still marking the inclusion or exclusion of ideas or stakeholders, along with the reasoning behind those decisions.

The UN Women IOE guidance on Gender Environment Marginalizing and Systemic Evaluation (GEMSE is a working name; the approach is currently being drafted) is looking at how to support evaluators as they work with the new SDGs, which have an explicit interest in building local capacity to measure communities' own challenges and successes. To that end, I would hope to see a gradual paradigm shift in development work away from 'planned interventionism' (assuming we can measure change as the result of coordinated and planned action) toward an acceptance that all systems are inherently complex and emergent, and toward capturing and learning from both intended and unintended outcomes. As you point out, though, current timeline norms (and therefore funding) also need to shift to allow for more participatory, reflective and iterative data-gathering cycles.