Handy Tools for Peacebuilders

2013 January 16

Creating a culture and community of learning is central to the work that we do at Search for Common Ground. This was one of the key motivations for the launch of the Learning Portal for DM&E for Peacebuilding, an online community of practitioners that also serves as a comprehensive resource for everything related to design, monitoring, and evaluation. The Learning Portal aims to build the capacity of specialists in the field and to ensure that our staff has access to the most cutting-edge ideas when it comes to measuring our impact in complex settings.

Acknowledging the need for practical, action-oriented guides, the Institutional Learning Team at Search for Common Ground has put together 32 diverse modules that serve as basic, self-guided manuals on how to conduct design, monitoring, and evaluation. The modules are meant to enable peacebuilders to learn simple, practical, and effective tools to design high-quality programs and follow them up with top-notch monitoring and evaluation.

In June 2012, the authors of Search for Common Ground’s groundbreaking design, monitoring and evaluation for peacebuilding handbook, Designing for Results: Integrating Monitoring and Evaluation into Peacebuilding Activities, released a study that illustrated the crucial role of resource materials in building evaluation capacity. There were three important takeaways from the study.

1) Practitioners in the field often indicate that there is a huge gap in practical, hands-on resources that can help build evaluation capacity;

2) Practitioners struggle more with the practical application of monitoring than with creating and conducting evaluations; and lastly,

3) While there are a number of brilliant documents supporting this process, practitioners still need assistance with creating monitoring systems that are adaptable to the dynamic contexts of conflict settings.

The modules we have developed offer a variety of tools and methodologies for measuring impact and creating space for learning. There are also instructions and tips on cross-cutting methodologies and research practices that are useful at any stage of the project cycle. They are all bite-size pieces of the project cycle that are easily accessible and designed to work with the hectic schedules of monitoring and evaluation work in the field. Each module also includes helpful tips and real-life examples about process and the potential pitfalls that should be avoided for the specific subject at hand.

When creating these documents, we aspired to demystify the whole process of monitoring our impact and capturing the transformative power of our programs. People often think of monitoring and evaluation as a cumbersome exercise that requires much time and expertise, when in fact these modules demonstrate handy tips and exercises that make the process more participatory and make monitoring and evaluation in conflict settings more time- and cost-effective. Many of the examples draw from SFCG’s own experience to bring these lessons to life.

And here’s the best part: most of the modules are small documents that can be further broken up into smaller sections to address specific issues and opportunities that come up while making evaluative decisions. Most of the modules include handy tools, interactive exercises, and even quizzes to pre-test your knowledge. Some are narrowly targeted (e.g., How do I conduct an interactive and participatory focus group? What is the correct procedure for creating and conducting a survey, and how can I use this information in a variety of ways?), while others address larger issues. For instance, one of the modules examines the ethics of conducting research on children and youth in conflict settings.

Our modules are designed so that anyone in the field, with no prior experience in evaluation or monitoring, can pick up from any point and implement evaluative decision-making at every step of the project cycle! So what are you waiting for? Check out the DM&E Modules now at

Tanya D’Lima is a Design, Monitoring, and Evaluation Intern on the Institutional Learning Team at Search For Common Ground in our Washington, D.C. office. She has a Masters degree in International Development and Social Change from Clark University, Worcester, Massachusetts. She is interested in gender sensitive programming and evaluation in conflict settings and has written some of the above modules based on feedback from SFCG’s field experience.