Browse Resources: Purpose
A Guide to Measuring Advocacy and Policy This guide provides some perspective on where the field of philanthropy has been with regard to evaluation of advocacy and policy and also acknowledges the unique issues and challenges associated with measuring these efforts. In addition, this guide serves as an invitation to grantmakers to engage in and expand thinking about evaluation as it relates to advocacy and policy efforts. As seriously as many grantmakers take their investments in this area, foundations should also take seriously the need to advance evaluation of advocacy and policy work.
Author: Organizational Research Services (prepared for the Annie E. Casey Foundation) Type: Research & Reports Date: Jan 1, 2007 Download (255.04 KB)
Advocacy & Public Policy Grantmaking: Matching Process to Purpose Building on research conducted in 2007 by Coffman and Campbell, this brief summarizes advocacy and public policy grantmaking approaches and their implications for grant portfolio composition and management, auxiliary supports, and evaluation. “Advocacy and public policy grantmaking” refers to grantmaking in support of a wide range of advocacy activities that are intended to trigger, block, maintain, support, and/or monitor changes in public policy at any level of government. Author: Tanya Beer, Pilar Stella Ingargiola, and Meghann Flynn Beer Type: Research & Reports Date: Aug 5, 2012 Download (1.17 MB)
Advocacy & Public Policy Grantmaking: Matching Process to Purpose The Colorado Trust, in an effort to better understand the field of funding advocacy and public policy work, commissioned this publication to describe strategies funders use; advantages and tradeoffs to funding advocacy; implications for funders, advocates, and evaluation; and implications for the outcomes the Colorado Trust is working towards. It identifies three buckets of funding strategies that have emerged: the policy target approach, the advocacy niche approach, and the field building approach.
Author: Tanya Beer, Pilar Stella Ingargiola, Meghann Flynn Beer Type: Research & Reports Date: Aug 1, 2012 Download (379.48 KB)
Evaluating Social Innovation In this paper, the authors explore ways that common evaluation approaches and practices constrain innovation and offer lessons about an emerging evaluation approach—developmental evaluation—which supports the adaptation that is so crucial to innovation. For what kinds of grantmaking strategies should funders consider using developmental evaluation? What organizational conditions are necessary for it to work? How can grantmakers grapple with the challenging questions that developmental evaluation raises about innovation, accountability, rigor, and adaptation? Author: Hallie Preskill and Tanya Beer Type: Research & Reports Date: Aug 1, 2012 Point K Pick Download (341.7 KB)
Evaluating System Change: A Planning Guide This methods brief provides guidance on planning effective evaluations of system change interventions. It begins with a general overview of systems theory and then outlines a three-part process for designing system change evaluations. This three-part process aligns (1) the dynamics of the targeted system or situation, (2) the dynamics of the system change intervention, and (3) the intended purpose(s) and methods of the evaluation.
Author: Margaret B. Hargreaves Type: Research & Reports Date: Apr 1, 2010 Download (1.63 MB)
Evaluation for the Way We Work In this article, Michael Quinn Patton describes the developmental evaluation approach.
Author: Michael Quinn Patton Type: Newsletters & Periodicals Date: Mar 21, 2006 Download (886 KB)
Evaluation Principles and Practices: An Internal Working Paper The purpose of this document is to advance the Foundation’s existing work so that our evaluation practices become more consistent across the organization. We hope to create more common understanding of our philosophy, purpose, and expectations regarding evaluation as well as clarify staff roles and available support. With more consistency and shared understanding, we expect less wheel re-creation across program areas, greater learning from each other’s efforts, and faster progress in designing meaningful evaluations and applying the results.
Author: Fay Twersky & Karen Lindblom Type: Research & Reports Date: Jan 22, 2013 Download (1.42 MB)
Evaluation: Finding a Common Ground [Slides] While common frameworks and approaches for evaluation have been developed across multiple fields, regional associations for grantmakers have, for the most part, been left out of this dialogue. The purpose of this session is to highlight the common threads that distinguish regional associations from other organizational genres in the social sector. Regional associations promote effectiveness in philanthropy by providing grantmakers with opportunities to engage with others, share ideas, and generate best practices that support both the individual and collective impact of philanthropy.
Author: Veena Pankaj and Ann K. Emery Type: Presentation Slides Date: Jul 30, 2013 Web Link
Funder Collaboratives: Why and How Funders Work Together When it comes to funder collaboratives, is the whole truly greater than the sum of its parts? Can foundations make a bigger impact with grant dollars by working together than by going it alone? Yes, grantmakers say, as long as members define their goals, set clear operational guidelines, and work from the start to make the collaborative function well for grantees. In this guide, contributors share strategies for structuring a collaborative to fit its purpose, building strong relationships and resolving conflicts, and figuring out if the collaborative you're in is working. Author: GrantCraft Type: Research & Reports Date: Jan 1, 2010 Download (306.83 KB)
Guidance Note #3: Introduction to Mixed Methods in Impact Evaluation Mixed methods (MM) evaluations seek to integrate social science disciplines with predominantly quantitative (QUANT) and predominantly qualitative (QUAL) approaches to theory, data collection, data analysis and interpretation. The purpose is to strengthen the reliability of data, validity of the findings and recommendations, and to broaden and deepen our understanding of the processes through which program outcomes and impacts are achieved, and how these are affected by the context within which the program is implemented.
Author: Michael Bamberger Type: Workbooks & Guides Date: Sep 5, 2012 Web Link
How to Design a Monitoring and Evaluation Framework for a Policy Research Project This guidance note focuses on the designing and structuring of a monitoring and evaluation framework for policy research projects and programmes.
The primary audience for this guidance note is people designing and managing monitoring and evaluation. However, it will also be a useful tool for anyone involved in monitoring and evaluation activities.
The framework presented in this guidance note is intended to be used in a flexible manner depending on the purpose and characteristics of the research project.
Author: Methods Lab Type: Workbooks & Guides Date: Jan 1, 2016 Download (346.06 KB)
Learning to Love Your Logic Model In this recorded webinar, Tom Chapel, Chief Evaluation Officer of the CDC, provides an overview of the purpose of logic models, how to use them, and common logic model components.
Summary from the CDC website:
It’s fun to make fun of logic models. While some of the criticism is justified, much is directed at a caricature of logic models that no model fan would recognize.
Author: Thomas J. Chapel, Chief Evaluation Officer, CDC Type: Websites & Online Tools Date: Jan 1, 2017 Point K Pick Web Link
Measuring Up: HIV-Related Advocacy Evaluation Training Pack
This evaluation training pack is published by the Alliance and the International Council of AIDS Service Organizations (ICASO) and consists of two guides - a guide for facilitators and a guide for learners. They are designed for advocacy and monitoring and evaluation staff of civil society organisations (CSOs) (including networks) that are involved in designing, implementing, and assessing advocacy projects at different levels. The purpose of these guides is to increase users’ capacity to evaluate the progress and results of their advocacy work.
Author: Nicky Davies and Alan Brotherton Type: Workbooks & Guides Date: Nov 9, 2010 Web Link
Nudging the Giant: The Story of the POLICY Project/Nigeria 1999 - 2004 The POLICY Project began in 1999, following the first civilian elections Nigeria had seen in sixteen years. The Project's purpose was "to strengthen the policy process in population, reproductive health, and HIV/AIDS as a basis for improved services [...] In 2002, the scope of the project was expanded from HIV/AIDS, population, and reproductive health to include child survival." This evaluation report was prepared by Ann R. Author: The POLICY Project Type: Research & Reports Date: Jun 1, 2005 Download (870.71 KB)
Outcomes-Based Planning and Evaluation Course This resource is an online course about Outcomes Based Planning and Evaluation ("OBPE"). It is designed for museum professionals and librarians. Modules include:
- Introduction: An introduction to OBPE, including program examples, OBPE benefits, and a list of resources
- Plan: Planning an OBPE approach to your program, including assessing audience needs, defining your solution and your own definitions of success, considering other stakeholders, and articulating your program's purpose
Author: Elizabeth Kryder-Reid, Helen J. Schwartz, Annette Lamb, et al. Type: Websites & Online Tools Date: Jan 1, 2006 Point K Pick Web Link
Pathfinder Advocate Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides advocates through the advocacy evaluation process from start to finish. Editions for evaluators and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.12 MB)
Pathfinder Evaluator Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides evaluators through the advocacy evaluation process from start to finish. Editions for advocates and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.4 MB)
Pathfinder Funder Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides funders through the advocacy evaluation process from start to finish. Editions for advocates and evaluators are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.17 MB)
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results?
Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB)
Project Evaluation Guide: Module 7, Culturally Responsive Evaluation The purpose of this module is to alert users to the importance of culturally responsive evaluation and to explain some of its key components. It discusses strategies that have been found to be useful in conducting evaluations that are responsive to all cultures.
Author: National Science Foundation Type: Workbooks & Guides Date: Nov 9, 2010 Point K Pick Web Link
Rapid Evaluation The purpose of this guide is to introduce the basic concepts and methods used in rapid evaluations (REs), and to demonstrate how this approach can be applied to the various stages of program development and implementation.
It covers important terms and definitions, when to use REs, the advantages and disadvantages of REs, and the five elements of an RE, and provides additional reading resources.
Author: The International Training and Education Center for Health (I-TECH) Type: Workbooks & Guides Date: Jan 1, 2008 Download (272.44 KB)
Software for Nonprofit Evaluation and Case Management Many systems exist to manage the wealth of data nonprofits collect. As part of our evaluation consulting work, Innovation Network team members are often asked to recommend systems to our nonprofit clients. The purpose of this document is to share with the field our most recent scan of existing software (as of January 2010). We hope this information is helpful to you, and we encourage you to contact us with feedback. Author: Johanna Morariu, Innovation Network, Inc. Type: Workbooks & Guides Date: Jan 31, 2010 Point K Pick Download (86.51 KB)
Sourcebook for Evaluating Global and Regional Partnership Programs: Indicative Principles and Standards The purpose of the indicative principles and standards contained in this Sourcebook is to help improve the independence and quality of program-level evaluations of GRPPs in order to enhance the relevance and effectiveness of the programs. The principal audiences for the Sourcebook are the governing bodies and management units of GRPPs, as well as professional evaluators involved in the evaluation of these programs. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Jan 1, 2007 Download (833.37 KB)
State of Evaluation 2010: Evaluation Practice and Capacity in the Nonprofit Sector Nonprofits hear a lot of talk about evaluation these days—metrics and measurements, indicators and impact, efficiency and effectiveness. Everyone, from donors to board members, seems to want evaluation results. But what are nonprofits really doing to evaluate their work? How are they really using evaluation results? What support are they getting? What else do they need?
Author: Johanna Morariu and Ehren Reed, Innovation Network, Inc. Type: Research & Reports Date: Oct 1, 2010 Point K Pick Download (5.55 MB)
Strengthening Anti-Hunger Advocacy in California: Evaluation of the California Nutrition Initiative 1998-2001 MAZON launched the California Nutrition Initiative (CNI) in 1998. The CNI’s purpose was "to improve the nutritional health and well-being of low-income Californians by strengthening the capacity of the state’s non-profit and anti-hunger network." This evaluation report on the project's first four years identifies strategies that could be used to replicate MAZON's successes, including network development and awareness-building. Author: Leventhal/Kline Management Inc. Type: Research & Reports Date: Jun 1, 2003 Download (1.32 MB)