Browse Data Analysis Resources
-
OUT OF SCHOOL TIME (OST) OBSERVATION INSTRUMENT The observation instrument provides site visitors at out-of-school time program sites with a framework to capture and rate essential, observable indicators of positive youth development. The observation instrument includes the following:
Cover Sheet: a checklist for capturing basic facts about the observed activity, such as activity type, staff roles, number of participants, and grouping patterns. Author: Policy Studies Associates Type: Templates & Samples Date: Dec 1, 2005 Download (133.97 KB) -
Out-of-School Time Program Research and Evaluation Database This is a rich source of promising practices and evaluation reports on a wide variety of out-of-school time programs. Author: Harvard Family Research Project Type: Websites & Online Tools Date: Aug 1, 2007 Point K Pick Web Link -
Outcomes Based Evaluations Using the Logic Model A training program guide about logic models and evaluation. Though developed by the Substance Abuse and Mental Health Services Administration, the concepts and advice in the guide are applicable to programmatic areas outside health and mental health. Author: Center for Substance Abuse Prevention Type: Workbooks & Guides Date: Mar 1, 2002 Download (866.13 KB) -
Outcomes-Based Planning and Evaluation Course This resource is an online course about Outcomes-Based Planning and Evaluation (OBPE), designed for museum professionals and librarians. Modules include:
- Introduction: An introduction to OBPE, including program examples, OBPE benefits, and a list of resources
- Plan: Planning an OBPE approach to your program, including assessing audience needs, defining your solution and your own definitions of success, considering other stakeholders, and articulating your program's purpose
Author: Elizabeth Kryder-Reid, Helen J. Schwartz, Annette Lamb, et al. Type: Websites & Online Tools Date: Jan 1, 2006 Point K Pick Web Link -
Participatory Analysis: Expanding Stakeholder Involvement in Evaluation Veena Pankaj and Myia Welsh described Innovation Network's participatory approach to evaluation, highlighting how stakeholders can be involved in the analysis and interpretation of data. They also shared tips from Innovation Network's white paper titled "Participatory Analysis: Expanding Stakeholder Involvement in Evaluation." Author: Veena Pankaj and Myia Welsh Type: Opinion (blog, editorial) Date: Jun 6, 2011
Web Link -
Participatory Analysis: Expanding Stakeholder Involvement in Evaluation This paper shares some techniques we have used to give evaluation participants a more active role in the review and analysis of their evaluation data. It also discusses the benefits and challenges of the participatory analysis process. Author: Veena Pankaj, Myia Welsh, and Laura Ostenso of Innovation Network, Inc. Type: Research & Reports Date: Apr 1, 2011 Download (429.45 KB) -
Participatory Asset Mapping Toolkit Healthy City supports communities in identifying, organizing, and sharing their collective voice with decision makers at the local and state levels. Through their Community Research Lab, Healthy City shares best practices and methods for Community-Based Organizations (CBOs) interested in supporting their strategies with research that combines community knowledge with Healthy City technologies. Toward this aim, they have developed the Community Research Lab Toolbox. The toolbox presents research concepts, methods, and tools through topical guides and toolkits. Author: Healthy City Type: Tipsheets & Paper Tools Date: Apr 8, 2012
Download (2.22 MB) -
Pathfinder Advocate Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides advocates through the advocacy evaluation process from start to finish. Editions for evaluators and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.12 MB) -
Pathfinder Evaluator Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides evaluators through the advocacy evaluation process from start to finish. Editions for advocates and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.4 MB) -
Pathfinder Funder Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides funders through the advocacy evaluation process from start to finish. Editions for advocates and evaluators are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.17 MB) -
Paying More Attention to Paying Attention (From Introduction)
In 1998 I wrote Paying Attention: Visitors and Museum Exhibitions, a book supported by a National Science Foundation (NSF) grant called “A Meta-analysis of Visitor Time/Use in Museum Exhibitions.” The grant accomplished three main goals:
Author: Beverly Serrell Type: Research & Reports Date: Jan 1, 2010 Web Link -
Performance Management and Evaluation: Two Sides of the Same Coin (Presentation slides) Performance management and evaluation: what's the difference? With an increasing emphasis on measurement and impact, service providers and their funders are pushing for increasingly sophisticated evaluation approaches such as experimental and quasi-experimental designs. However, experimental methods are rarely appropriate, feasible, or cost-effective for the majority of organizations and service providers. Author: Isaac Castillo, Ann K. Emery Type: Presentation Slides Date: Oct 16, 2013 Point K Pick Download (1.42 MB) -
Philanthropic Freedom: A Pilot Study Hudson Institute’s Center for Global Prosperity (CGP) is pleased to announce the publication of Philanthropic Freedom: A Pilot Study, the first study to fully measure and compare the ease of giving across 13 countries. The pilot study and each of the detailed country reports can be downloaded for free from www.Hudson.org/PhilanthropicFreedom.
Author: Hudson Institute Center for Global Prosperity Type: Research & Reports Date: Mar 28, 2013 Download (1.31 MB) -
Portfolio Evaluation vs. Grant Evaluation In this webinar Johanna Morariu and Ehren Reed discuss four levels of evaluation: grant-level, portfolio-level, foundation-level, and issue-level. The presentation addresses the pros and cons of these four levels of evaluation, and when one level may be more appropriate than another. Also included are considerations for right-sizing your evaluation approach: design, data collection, analysis, and reporting differences between grant and portfolio evaluations. Author: Johanna Morariu and Ehren Reed Type: Websites & Online Tools Date: Feb 22, 2012 Web Link -
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results? Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB) -
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. (1.15 MB .pdf file; may take some time to download.) Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Download (1.15 MB) -
Proofiness: The Dark Arts of Mathematical Deception and the Evaluation Profession Johanna Morariu describes how she explains the merit and appropriateness of qualitative designs when helping individuals and organizations design an evaluation approach or when presenting qualitative findings. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jan 18, 2012 Web Link -
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers and technical and M&E staff to guide their work with partners and communities, describing how to develop an M&E system in which data are systematically collected, reported, and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB) -
Public Will Building: What it Means and How to Evaluate It [Handout] Strategic Partner Julia Coffman (of the Center for Evaluation Innovation) and Innovation Network Director Ehren Reed led a discussion on Public Will Building for The Connecticut Health Foundation's Leadership Fellows. Author: Julia Coffman and Ehren Reed Type: Research & Reports Date: Dec 14, 2011 Web Link -
Remarks made at the Environmental Evaluators’ Network Forum: NAVIGATING EVALUATIVE COMPLEXITY IN THE AGE OF OBAMA The author draws on her vast evaluation experience, especially in federal evaluation, to confront issues of complexity in evaluation. She offers the idea of using comprehensive checklists, and supplies her own example.
An excerpt:
Author: Eleanor Chelimsky Type: Opinion (blog, editorial) Date: Jun 8, 2010 Point K Pick Download (81.2 KB) -
Sample Size Calculator This online sample size calculator can help you rapidly estimate how large your sample needs to be for a given margin of error and confidence level. Free and easy to use, the page also includes good introductory information on how to apply the resource. Author: Raosoft Type: Websites & Online Tools Date: Jan 1, 2004 Web Link
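For readers who want to reproduce the arithmetic offline, here is a minimal Python sketch of the standard sample size formula for estimating a proportion (normal approximation with a finite population correction). The function name and defaults are illustrative assumptions, and the exact formula behind the Raosoft calculator may differ in its details.

```python
import math

def sample_size(population, margin_of_error=0.05, confidence=0.95, p=0.5):
    """Estimate the number of respondents needed for a proportion estimate.

    Standard normal-approximation formula with a finite population
    correction; p = 0.5 is the most conservative response distribution.
    (Illustrative sketch, not the Raosoft implementation.)
    """
    # z-score for the chosen two-sided confidence level
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    # required sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # adjust downward for a finite population
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Example: population of 2,000, 5% margin of error, 95% confidence -> about 323
print(sample_size(2000))
```
-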
Seeing the Forest (Beyond the Trees): Learning Across the Experiences of Seven Advocacy Evaluators [Slides] Advocacy and policy change evaluation continues to evolve and mature, from a fledgling field a few years ago to the flourishing field of today. Evaluators are advancing as well, developing an increasingly robust collective understanding about what works for advocacy evaluation. In this session a diverse group of seven advocacy evaluators explored and synthesized observations drawn from an array of real-world experiences. Panelists spoke to targeted questions, weaving in their wealth of experience and examples. Author: Johanna Morariu, Jara Dean-Coffey, Tom Kelly, Claire Hutchings, David Devlin-Foltz, Robin Kane, Jared Raynor, Anne Gienapp Type: Presentation Slides Date: Oct 19, 2013 Web Link -
Social Research Update: Photo-Interviewing for Research Rosalind Hurworth offers insight into multiple uses of photography in evaluation. Author: Hurworth, Rosalind Type: Research & Reports Date: Mar 1, 2003 Download (156.06 KB) -
Sourcebook for Evaluating Global and Regional Partnership Programs: Indicative Principles and Standards The purpose of the indicative principles and standards contained in this Sourcebook is to help improve the independence and quality of program-level evaluations of GRPPs in order to enhance the relevance and effectiveness of the programs. The principal audiences for the Sourcebook are the governing bodies and management units of GRPPs, as well as professional evaluators involved in the evaluation of these programs. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Jan 1, 2007 Download (833.37 KB) -
State of Evaluation - Russian translation The Project: Nonprofits hear a lot of talk about evaluation these days: metrics and measurements, indicators and impact, efficiency and effectiveness. Everyone, from donors to board members, seems to want evaluation results. But there is a big knowledge gap around evaluation practice: What are nonprofits really doing to evaluate their work? How are they really using evaluation results? What support are they getting? What else do they need?
Author: Innovation Network Type: Research & Reports Date: Feb 3, 2011 Download (6.78 MB)