Browse Evaluation Resources
Paying More Attention to Paying Attention (From Introduction)
In 1998 I wrote Paying Attention: Visitors and Museum Exhibitions, a book supported by a National Science Foundation (NSF) grant called “A Meta-analysis of Visitor Time/Use in Museum Exhibitions.” The grant accomplished three main goals:
Author: Beverly Serrell Type: Research & Reports Date: Jan 1, 2010 Web Link
Performance Management and Evaluation: Two Sides of the Same Coin (Presentation slides) Performance management and evaluation: what's the difference? With an increasing emphasis on measurement and impact, service providers and their funders are pushing for increasingly sophisticated evaluation approaches such as experimental and quasi-experimental designs. However, experimental methods are rarely appropriate, feasible, or cost-effective for the majority of organizations and service providers. Author: Isaac Castillo, Ann K. Emery Type: Presentation Slides Date: Oct 16, 2013 Point K Pick Download (1.42 MB)
Performance Monitoring And Evaluation Tips A guide on focus groups from the U.S. Agency for International Development (USAID). Author: USAID Center for Development Information and Evaluation Type: Workbooks & Guides Date: Jan 1, 1996
Web Link
Performance Monitoring Framework for Conservation Advocacy, A This report was written in response to the New Zealand Department of Conservation’s environmental advocacy work. It "sets out a framework for monitoring the effectiveness of conservation advocacy programmes in increasing public awareness about, and involvement in, conservation." The report offers advocacy monitoring and evaluation guidelines applicable beyond the environmental advocacy field. Author: James, Bev Type: Research & Reports Date: Mar 1, 2001 Download (229.7 KB)
Philanthropic Freedom: A Pilot Study Hudson Institute’s Center for Global Prosperity (CGP) is pleased to announce the publication of Philanthropic Freedom: A Pilot Study, the first time that the ease of giving has been fully measured and compared across 13 countries. The pilot study and each of the detailed country reports can be downloaded for free from www.Hudson.org/PhilanthropicFreedom.
Author: Hudson Institute Center for Global Prosperity Type: Research & Reports Date: Mar 28, 2013 Download (1.31 MB)
Philanthropic Strategies and Tactics for Change: A Concise Framework This article discusses the various tactics grantmakers rely on to create impact. The author describes theories of change, theories of leverage, programmatic tactics, and grantmaking tactics. Within programmatic tactics he discusses the need for evaluations to allow for ongoing program adjustments and to inform future efforts. Author: Frumkin, Peter Type: Research & Reports Date: Aug 31, 2002 Web Link
Portfolio Evaluation vs. Grant Evaluation In this webinar Johanna Morariu and Ehren Reed discuss four levels of evaluation: grant-level, portfolio-level, foundation-level, and issue-level. The presentation addresses the pros and cons of these four levels of evaluation, and when one level may be more appropriate than another. Also included are considerations for right-sizing your evaluation approach—design, data collection, analysis, and reporting differences between grant and portfolio evaluations. Author: Johanna Morariu and Ehren Reed Type: Websites & Online Tools Date: Feb 22, 2012 Web Link
Power, Participation, and State-based Politics: An Evaluation of the Ford Foundation's Collaborations that Count Initiative The Applied Research Center (ARC) conducted a two-year participatory evaluation to provide an account of the Ford Foundation's "Collaborations That Count Initiative." The report identifies areas in which the 11 statewide collaborations succeeded, and draws attention to ways in which support to collaboration might be more effectively provided in the future. Author: Applied Research Center Type: Research & Reports Date: Apr 21, 2004 Download (1.85 MB)
Program Development and Evaluation This site provides a comprehensive set of resources on planning and implementing an evaluation. Some of their tools require a free login. Author: University of Wisconsin - Extension Type: Websites & Online Tools Date: Jan 18, 2008 Web Link
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results? Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB)
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. (1.15 MB .pdf file; may take some time to download.) Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Download (1.15 MB)
Program Evaluation: Igniting the Untapped Power An introductory piece about the power of evaluation, including Innovation Network's approach and things to consider before beginning an evaluation effort. Author: Innovation Network, Inc. Type: Opinion (blog, editorial) Date: Mar 1, 2002 Download (227.62 KB)
Project Evaluation Guide: Module 7, Culturally Responsive Evaluation The purpose of this module is to alert users to the importance of culturally responsive evaluation and to explain some of its key components. It discusses strategies that have been found to be useful in conducting evaluations that are responsive to all cultures.
Author: National Science Foundation Type: Workbooks & Guides Date: Nov 9, 2010 Point K Pick Web Link
Proofiness: The Dark Arts of Mathematical Deception and the Evaluation Profession Johanna Morariu describes how she explains the merit and appropriateness of qualitative designs when helping individuals and organizations design an evaluation approach or when presenting qualitative findings. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jan 18, 2012 Web Link
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers and technical and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported, and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB)
Pros and Cons of Evaluation A solid two-page overview of why foundations should get involved with evaluation presented in a straightforward Pro vs. Con fashion. Author: Janet Carter Type: Opinion (blog, editorial) Date: Jan 1, 2003 Download (57.57 KB)
Prove & Improve – A Self-Evaluation Resource for Voluntary and Community Organisations This document is intended to provide a starting point for exploring evaluation, and in particular self-evaluation of outcomes, and for thinking about some of the issues involved in this essential area of developing and running a successful project. Author: Community Evaluation Northern Ireland Type: Websites & Online Tools Date: Oct 1, 2008
Download (508.77 KB)
Public Communication Campaign Evaluation First in the series of five papers from the Communications Consortium Media Center (q.v.), this paper is a "scan of challenges, criticisms, practice, and opportunities." Author Julia Coffman:
- Discusses recent events in the field of advocacy evaluation;
- Examines evaluation challenges, criticisms, and practice; and
- Includes sections on relevant theory, outcomes, and evaluation design.
Author: Coffman, Julia Type: Research & Reports Date: May 1, 2002 Download (190.47 KB)
Public Will Building: What it Means and How to Evaluate It [Handout] Strategic Partner Julia Coffman (of the Center for Evaluation Innovation) and Innovation Network Director Ehren Reed led a discussion on Public Will Building for The Connecticut Health Foundation's Leadership Fellows. Author: Julia Coffman and Ehren Reed Type: Research & Reports Date: Dec 14, 2011 Web Link
Putting the system back into systems change: A framework for understanding and changing organizational and community systems This paper provides one framework—grounded in systems thinking and change literatures—for understanding and identifying the fundamental system parts and interdependencies that can help to explain system functioning and leverage systems change. The proposed framework highlights the importance of attending to both the deep and apparent structures within a system as well as the interactions and interdependencies among these system parts. Author: Pennie G. Foster-Fishman, Branda Nowell, Huilan Yang Type: Research & Reports Date: May 18, 2007 Download (381.29 KB)
Rapid Evaluation The purpose of this guide is to introduce the basic concepts and methods used in rapid evaluations (REs), and to demonstrate how this approach can be applied to the various stages of program development and implementation.
It covers important terms and definitions, when to use REs, the advantages and disadvantages of REs, and the five elements of an RE, and it provides additional reading resources.
Author: The International Training and Education Center for Health (I-TECH) Type: Workbooks & Guides Date: Jan 1, 2008 Download (272.44 KB)
Readiness for Evaluation and Learning: Assessing Grantmaker and Grantee Capacity When undertaking a new organizational or program approach to evaluation, begin with questions of readiness. What is the existing EVALUATION PRACTICE of my organization or program? What is the existing EVALUATION CAPACITY of my organization or program? Author: Johanna Morariu, Innovation Network, Inc. Type: Research & Reports Date: Apr 1, 2012 Point K Pick Download (375.25 KB)
Remarks made at the Environmental Evaluators’ Network Forum: NAVIGATING EVALUATIVE COMPLEXITY IN THE AGE OF OBAMA The author draws on her vast evaluation experience, especially in federal evaluation, to confront issues of complexity in evaluation. She offers the idea of using comprehensive checklists, and supplies her own example.
Author: Eleanor Chelimsky Type: Opinion (blog, editorial) Date: Jun 8, 2010 Point K Pick Download (81.2 KB)
Report from "Planning, Assessing and Learning from Advocacy Workshop" This report summarizes a 4-day workshop held in Accra, Ghana, in April 2006. The workshop was a collaboration between INTRAC and ActionAid. INTRAC was seeking to understand M&E as practiced on the ground as part of its preparation for an international conference, and ActionAid was motivated by a desire to present findings from three years of the Action Research Project and give participants a forum to express themselves. Author: INTRAC Type: Research & Reports Date: Apr 30, 2006 Download (47.11 KB)
Report: "Ten Considerations for Advocacy Evaluation Planning: Lessons Learned from KIDS COUNT Grantee Experiences" The Annie E. Casey Foundation and Organizational Research Services, Inc. detail ten lessons learned from an evaluation of five KIDS COUNT grantees that began in 2007. The evaluation was designed to test some of the ideas presented in "A Guide to Measuring Advocacy and Policy", a report produced by AECF and ORS in 2006. Author: Organizational Research Services Type: Research & Reports Date: Jan 1, 2009 Download (289.93 KB)