Browse Learning Resources
GrantCraft: Evaluation Technique Series To help grantmakers understand some newer evaluative approaches and weigh their advantages, GrantCraft has developed a collection of briefing notes. Each note explains the basics of one technique and answers some common questions about its use. A mini-case, based on one grantmaker’s experiences, is featured in each guide. Additional literature about the topic is also provided.
Participatory Action Research - Involving "All The Players" in Evaluation and Change
Author: GrantCraft Type: Workbooks & Guides Date: Jan 1, 2011 Web Link
How Can We Help Our Grantees Strengthen Their Capacity for Evaluation? There is widespread and growing recognition in the nonprofit sector of the importance of evaluation--not only for measuring impact, but also for improving programs and better serving communities. While grantmakers generally see evaluation as necessary, most are not yet investing enough resources in this area. In 2014, nearly three quarters of nonprofits reported that their funders "rarely or never" fund impact measurement costs.
Author: Grantmakers for Effective Organizations (GEO) Type: Research & Reports Date: Aug 1, 2015 Download (728.72 KB)
How to Climb the R Learning Curve Without Falling Off the Cliff: Advice from Novice, Intermediate, and Advanced R Users R is hotter than ever in the evaluation field as evaluators are looking for ways to improve their data management, analysis, and visualizations. First-time R users are asking themselves, Is R right for my evaluation work? Where do I start if I want to learn R? How long will it take to learn R? Evaluators without programming experience are often frustrated by R's steep learning curve. These novice R users are left wondering, How can I climb the R learning curve without falling off the cliff?
Author: Tony Fujs, Will Fenn, Ann Emery Type: Tipsheets & Paper Tools Date: Oct 19, 2013 Point K Pick Download (229.52 KB)
ILAC Brief 16: "Contribution analysis: An approach to exploring cause and effect" In this brief from Bioversity International's Institutional Learning and Change Initiative (ILAC), John Mayne discusses the steps involved in contribution analysis (including the development of a theory of change), an evaluation approach that may be useful when experimental designs are not practical. More specifically, Mayne provides an example of an evaluation capacity building project for agricultural research organizations. Author: Mayne, John Type: Newsletters & Periodicals Date: May 1, 2008 Download (129.56 KB)
Introduction to Before and After Action Reviews (BARs and AARs) The Before and After Action Review is a simple, straightforward set of questions to ask before and after an important piece of work, whether it is preparing for a meeting, engaging with board members, or launching into a new initiative. Author: Fourth Quadrant Partners, LLC Type: Workbooks & Guides Date: Sep 16, 2020
Download (251.85 KB)
Learning As We Go: Making Evaluation Work for Everyone This "briefing paper for funders and nonprofits" provides a detailed answer to the question, "Why evaluate?" It offers an overview of the importance of evaluative thinking, use of a logic model, and the range of perceptions about evaluation. Author: York, Peter J. Type: Workbooks & Guides Date: Jun 1, 2003 Web Link
Learning from Your Neighbor: Public Policy Dispute Resolution and Public Participation Maureen Berner and John Stephens from the University of North Carolina at Chapel Hill's School of Government presented the following slides at the 2008 American Evaluation Association Conference. This presentation, in conjunction with a paper of the same title, compares and contrasts the evaluation of public policy dispute resolution (PPDR) and public participation (PP) programs.
Author: Berner, Maureen; Stephens, John Type: Presentation Slides Date: Nov 1, 2008 Download (84 KB)
Learning from Your Neighbor: The Value of Public Participation Evaluation for Public Policy Dispute Resolution Maureen Berner and John Stephens from the University of North Carolina at Chapel Hill's School of Government presented the following draft paper at the 2008 American Evaluation Association Conference. (Note: this resource includes the introduction only.) This draft, in conjunction with a presentation of the same title, compares and contrasts the evaluation of public policy dispute resolution (PPDR) and public participation (PP) programs.
Author: Berner, Maureen; Stephens, John Type: Research & Reports Date: Oct 13, 2008 Download (47.5 KB)
Learning to Love Your Logic Model In this recorded webinar, Tom Chapel, Chief Evaluation Officer of the CDC, provides an overview of the purpose of logic models, how to use them, and common logic model components.
Summary from the CDC website:
It’s fun to make fun of logic models. While some of the criticism is justified, much is directed at a caricature of logic models that no model fan would recognize. In this webinar we’ll remind you:
Author: Thomas J. Chapel, Chief Evaluation Officer, CDC Type: Websites & Online Tools Date: Jan 1, 2017 Point K Pick Web Link
LearnPhilanthropy LearnPhilanthropy's Knowledge Library is a resource for people who are new to grantmaking or those seeking new ideas and tools to improve their grantmaking practice. Here you will find essential learning and new research on a range of common issues and key challenges in philanthropy. Working with leading organizations across the field, LearnPhilanthropy regularly updates this centralized library with reports, tools, and other resources. Author: LearnPhilanthropy Type: Websites & Online Tools Date: Sep 23, 2014 Point K Pick Web Link
Logic Model Workbook (.doc) Innovation Network's own workbook (revised in late 2010) offers an introduction to the processes and concepts of the logic model. This workbook can be used alone or in conjunction with the Logic Model Builder at the Point K Learning Center. Also available in .pdf format. Author: Innovation Network, Inc. Type: Workbooks & Guides Date: Dec 31, 2010 Point K Pick
Download (524.5 KB)
Logic Model Workbook (.pdf) Innovation Network's own workbook (revised in 2010), offering an introduction to the processes and concepts of the logic model. This workbook can be used alone or in conjunction with the Logic Model Builder at the Point K Learning Center. Author: Innovation Network, Inc. Type: Workbooks & Guides Date: Dec 31, 2010 Point K Pick
Download (473.71 KB)
Making Change Happen This conference report discusses overall themes and topics from a November 2001 meeting of the same name, attended by forty-nine people engaged in international advocacy and citizen participation efforts. In its final chapter, "How to Assess Success," the publication discusses many of the tensions and issues in evaluating advocacy efforts, as well as the need to perform evaluations in order to further learning. Author: Clark, Cindy Type: Research & Reports Date: Nov 1, 2001 Download (796.03 KB)
Monitoring and Evaluating Advocacy Advocacy and citizen participation are now key components of most development projects. In this paper the author outlines a framework for research into these issues that will be carried out in five countries around the world (the ActionAid action research project). This paper is based on Chapman and Wameyo’s 2001 "Scoping Study," q.v. Author: Chapman, Jennifer Type: Research & Reports Date: Feb 1, 2002 Download (28.92 KB)
Monitoring and Evaluating Advocacy Efforts: Learning from Successes and Challenges This chapter, part of a larger workbook on health advocacy issues, details three types of evaluation used to assess the achievements of advocacy efforts: process evaluation, outcome evaluation, and impact evaluation. Author: Advocates for Youth Type: Workbooks & Guides Date: Jan 1, 1998 Web Link
Monitoring, Evaluation and Learning (MEL) in NGO Advocacy: Findings from Comparative Policy Advocacy MEL Review Project “For organizations committed to social change, advocacy often figures as a crucial strategic element. How to assess effectiveness in advocacy is, therefore, important. The usefulness of Monitoring, Evaluation and Learning (MEL) in advocacy is subject to much current debate.”
Author: Oxfam America Type: Websites & Online Tools Date: Feb 1, 2013 Download (1.89 MB)
My M&E (Monitoring and Evaluation) My M&E is an interactive Web 2.0 platform for sharing knowledge on country-led Monitoring and Evaluation systems worldwide. In addition to being a learning resource, My M&E facilitates the strengthening of a global community, while identifying good practices and lessons learned about program monitoring and evaluation in general, and about country-led M&E systems in particular. Author: UNICEF and others Type: Websites & Online Tools Date: Jan 1, 2010 Web Link
Performance, Learning, Leadership and Knowledge Everything you wanted to know about Instructional Systems Design in one place.
Author: Professor Don Clark Type: Websites & Online Tools Date: Sep 2, 2009 Web Link
Philanthropy and Mistakes: An Untapped Resource This article discusses how foundations and their nonprofit partners might think about failure and share their hard-learned lessons. The authors first distinguish among different types of mistakes and how they relate to specific types of foundation investments. The authors then discuss three examples that represent different types of mistakes that foundations and their nonprofit partners make. Finally, the authors offer lessons to foundations about adapting, learning, and sharing in the face of failure.
Author: Robert Giloth and Susan Gewirtz Type: Research & Reports Date: Jan 1, 2009 Download (520.58 KB)
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers, technical, and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB)
Readiness for Evaluation and Learning: Assessing Grantmaker and Grantee Capacity When undertaking a new organizational or program approach to evaluation, begin with questions of readiness. What is the existing EVALUATION PRACTICE of my organization or program? What is the existing EVALUATION CAPACITY of my organization or program? Author: Johanna Morariu, Innovation Network, Inc. Type: Research & Reports Date: Apr 1, 2012 Point K Pick Download (375.25 KB)
Report from "Planning, Assessing and Learning from Advocacy Workshop" This report summarizes a 4-day workshop held in Accra, Ghana, in April 2006. The workshop was a collaboration between INTRAC and ActionAid. INTRAC was seeking to understand M&E as practiced on the ground as part of its preparation for an international conference, and ActionAid was motivated by a desire to present findings from three years of the Action Research Project and give participants a forum to express themselves. Author: INTRAC Type: Research & Reports Date: Apr 30, 2006 Download (47.11 KB)
Seeing the Forest (Beyond the Trees): Learning Across the Experiences of Seven Advocacy Evaluators [Slides] Advocacy and policy change evaluation continues to evolve and mature--from a fledgling field a few years ago to the flourishing field of today. Evaluators are advancing as well, developing an increasingly robust collective understanding of what works for advocacy evaluation. In this session a diverse group of seven advocacy evaluators explored and synthesized observations drawn from an array of real-world experiences. Panelists spoke to targeted questions, weaving in their wealth of experience and examples. Author: Johanna Morariu, Jara Dean-Coffey, Tom Kelly, Claire Hutchings, David Devlin Foltz, Robin Kane, Jared Raynor, Anne Gienapp Type: Presentation Slides Date: Oct 19, 2013 Web Link
State of Evaluation in the Social Sector Measurement, evaluation, and learning are hotter than ever in the social sector. Foundations and nonprofits are focused on answering the question, "What difference are we making?" And the field of evaluation has advanced in promising ways, developing meaningful evaluation approaches to better fit the latest philanthropic and nonprofit strategies. Author: Johanna Morariu and Will Fenn Type: Templates & Samples Date: Mar 21, 2013 Web Link
Strengthening Line of Sight Line of sight is about strengthening how we think about our ultimate goals and then maintaining an unobstructed vision from our current decisions and actions to those goals. It is about always asking: “What are we trying to accomplish and what would success look like?” Line of Sight is a process, not a product. Author: Fourth Quadrant Partners, LLC Type: Tipsheets & Paper Tools Date: Sep 16, 2020 Download (1015.3 KB)