Browse Conference Resources
-
Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results [Slides] At the American Evaluation Association 2012 Annual Conference, Veena Pankaj describes various ways to improve stakeholder engagement, as well as ways to increase stakeholder understanding of evaluation results. Author: Veena Pankaj Type: Presentation Slides Date: Oct 25, 2012 Be the first to review this resource! Web Link -
Data Visualization Approaches for Program Evaluation (and Beyond) Simone Parrish writes, "When you hear the phrase 'program evaluation findings,' are you bored already? Most people—even within the evaluation field—perceive evaluation as dry. The major output of an evaluation is often a weighty report that gets read once (if at all) before it begins its long-term dust-collecting destiny."
Author: Simone Parrish Type: Opinion (blog, editorial) Date: Apr 29, 2014 Point K Pick Be the first to review this resource! Web Link -
Dataviz! Or, How to Win at Communication and Influence People (Resources Handout) Are you intrigued by data and information visualization—dataviz—and how it could improve your communication strategy? Are you interested in the range of dataviz options, but unsure which is right for you? Or are you maybe even drowning in data and looking for someone to throw you a life-saving suggestion for tools to transform your data into a message?
Author: Johanna Morariu Type: Tipsheets & Paper Tools Date: May 17, 2013 Point K Pick Be the first to review this resource! Download (510.3 KB) -
Developmental Evaluation This presentation by Ricardo Wilson-Grau at the Michigan Association for Evaluation's annual conference is based on the concept of Developmental Evaluation (DE) elaborated by Michael Quinn Patton over the past 20 years and now crystallised in a book, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. The presentation gives an overview of DE, explains the basic theory behind it, discusses when DE can be used in evaluation, and outlines the differences between DE and traditional evaluation. Author: Ricardo Wilson-Grau Type: Presentation Slides Date: May 3, 2000 Be the first to review this resource! Download (5.05 MB) -
Dynamic Dozen: Delivery. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Delivery.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Be the first to review this resource! Download (489.15 KB) -
Dynamic Dozen: Design. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Design, that is, the intentional composition of slides. While the context for their talks spanned long and short presentations, and included different types of audiences and purposes, their insights can be used or modified by evaluators for their own presentations at the AEA annual conference and elsewhere.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Be the first to review this resource! Download (431.09 KB) -
Dynamic Dozen: Message. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Message, that is, the mindful planning of a structured presentation. While the context for their talks spanned long and short presentations, and included different types of audiences and purposes, their insights can be used or modified by evaluators for their own presentations at the AEA annual conference and elsewhere.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Be the first to review this resource! Download (555.08 KB) -
Evaluating Advocacy: A Model for Public Policy Initiatives This presentation, drawing on work done with the Coalition for Comprehensive Immigration Reform ("CCIR"), was given by Innovation Network staff at the 2006 American Evaluation Association Conference. The presentation discusses advocacy evaluation in general, some inherent challenges that apply more strongly to advocacy evaluation than to evaluation of traditional service programs, and some practical planning and evaluation structures developed as a result of the work with CCIR. Author: Innovation Network, Inc. Type: Presentation Slides Date: Nov 1, 2006 Be the first to review this resource! Download (1.42 MB) -
Evaluation as a Tool for Creating and Leading a Results-Based Learning Culture [Slides] Johanna Morariu and Will Fenn discussed effective evaluation initiatives, highlighting useful tools such as those available from Innovation Network’s Point K at the Emerging Practitioners in Philanthropy 2013 National Conference in Chicago, IL. Author: Johanna Morariu and Will Fenn Type: Presentation Slides Date: Apr 5, 2013 Be the first to review this resource! Web Link -
Evaluation Blogging: Improve Your Practice, Share Your Expertise, and Strengthen Your Network (Presentation slides) Want to start blogging about evaluation, but not sure where to start? Started, but want to know what to expect (or what to do next, or how to keep it going)? Ready to take your independent consulting practice to the next level? Author: Ann K. Emery, Susan Kistler, Chris Lysy, Sheila B. Robinson Type: Presentation Slides Date: Oct 19, 2013 Be the first to review this resource! Download (2.84 MB) -
Excel Elbow Grease: How to Fool Excel into Making (Pretty Much) Any Chart You Want In October 2013, Ann K. Emery presented How to fool Excel into making (pretty much) any chart you want at the American Evaluation Association's annual conference in Washington, D.C. She shared four strategies for communicating data more clearly in Excel: 1) Adjust default settings until charts pass the Squint Test; 2) Create two charts in one; 3) Create invisible sections of charts; and 4) Exploit the stock chart types, for example, by making timelines from stacked bar charts or by making dot plots from scatter plots. Author: Ann K. Emery Type: Presentation Slides Date: Oct 17, 2013 Point K Pick Be the first to review this resource! Web Link -
How to Climb the R Learning Curve Without Falling Off the Cliff: Advice from Novice, Intermediate, and Advanced R Users R is hotter than ever in the evaluation field as evaluators are looking for ways to improve their data management, analysis, and visualizations. First-time R users are asking themselves, Is R right for my evaluation work? Where do I start if I want to learn R? How long will it take to learn R? Evaluators without programming experience are often frustrated by R's steep learning curve. These novice R users are left wondering, How can I climb the R learning curve without falling off the cliff?
Author: Tony Fujs, Will Fenn, Ann Emery Type: Tipsheets & Paper Tools Date: Oct 19, 2013 Point K Pick Be the first to review this resource! Download (229.52 KB) -
Learning from Your Neighbor: Public Policy Dispute Resolution and Public Participation Maureen Berner and John Stephens from the University of North Carolina at Chapel Hill's School of Government presented the following slides at the 2008 American Evaluation Association Conference. This presentation, in conjunction with a paper of the same title, compares and contrasts the evaluation of public policy dispute resolution (PPDR) and public participation (PP) programs.
Author: Berner, Maureen; Stephens, John Type: Presentation Slides Date: Nov 1, 2008 Be the first to review this resource! Download (84 KB) -
Learning from Your Neighbor: The Value of Public Participation Evaluation for Public Policy Dispute Resolution Maureen Berner and John Stephens from the University of North Carolina at Chapel Hill's School of Government presented the following draft paper (NOTE: This resource includes the introduction only.) at the 2008 American Evaluation Association Conference. This draft, in conjunction with a presentation of the same title, compares and contrasts the evaluation of public policy dispute resolution (PPDR) and public participation (PP) programs.
Author: Berner, Maureen; Stephens, John Type: Research & Reports Date: Oct 13, 2008 Be the first to review this resource! Download (47.5 KB) -
Making Change Happen This conference report discusses overall themes and topics from a November 2001 meeting of the same name, attended by forty-nine people engaged in international advocacy and citizen participation efforts. In its final chapter, "How to Assess Success," the publication discusses many of the tensions and issues in evaluating advocacy efforts, as well as the need to perform evaluations in order to further learning. Author: Clark, Cindy Type: Research & Reports Date: Nov 1, 2001 Be the first to review this resource! Download (796.03 KB) -
Performance Management and Evaluation: Two Sides of the Same Coin (Presentation slides) Performance management and evaluation: what's the difference? With an increasing emphasis on measurement and impact, service providers and their funders are pushing for increasingly sophisticated evaluation approaches such as experimental and quasi-experimental designs. However, experimental methods are rarely appropriate, feasible, or cost-effective for the majority of organizations and service providers. Author: Isaac Castillo, Ann K. Emery Type: Presentation Slides Date: Oct 16, 2013 Point K Pick Be the first to review this resource! Download (1.42 MB) -
Picturing Your Data is Better Than 1000 Numbers: Data Visualization Techniques for Social Change Are you intrigued by infographics and how they could improve your communication strategy? Are you interested in what it takes for an organization to systematically use data? Or are you maybe even drowning in data and looking for someone to throw you a life-saving suggestion for software and other tools? Johanna Morariu, Beth Kanter, and Brian Kennedy presented a panel on data and information visualization at the 2012 Nonprofit Tech Conference. This video is a recording of the panel.
Author: Johanna Morariu Type: Presentation Slides Date: Be the first to review this resource! Web Link -
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. (1.15 MB .pdf file; may take some time to download.) Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Be the first to review this resource! Download (1.15 MB) -
Report from "Planning, Assessing and Learning from Advocacy Workshop" This report summarizes a 4-day workshop held in Accra, Ghana, in April 2006. The workshop was a collaboration between INTRAC and ActionAid. INTRAC was seeking to understand M&E as practiced on the ground as part of its preparation for an international conference, and ActionAid was motivated by a desire to present findings from three years of the Action Research Project and give participants a forum to express themselves. Author: INTRAC Type: Research & Reports Date: Apr 30, 2006 Be the first to review this resource! Download (47.11 KB) -
Report: "Ten Considerations for Advocacy Evaluation Planning: Lessons Learned from KIDS COUNT Grantee Experiences" The Annie E. Casey Foundation and Organizational Research Services, Inc. detail ten lessons learned from an evaluation of five KIDS COUNT grantees that began in 2007. The evaluation was designed to test some of the ideas presented in "A Guide to Measuring Advocacy and Policy", a report produced by AECF and ORS in 2006. Author: Organizational Research Services Type: Research & Reports Date: Jan 1, 2009 Be the first to review this resource! Download (289.93 KB) -
Seeing the Forest (Beyond the Trees): Learning Across the Experiences of Seven Advocacy Evaluators [Slides] Advocacy and policy change evaluation continues to evolve and mature, from a fledgling field a few years ago to the flourishing field of today. Evaluators are advancing as well, developing an increasingly robust collective understanding about what works for advocacy evaluation. In this session a diverse group of seven advocacy evaluators explored and synthesized observations drawn from an array of real-world experiences. Panelists spoke to targeted questions, weaving in their wealth of experience and examples. Author: Johanna Morariu, Jara Dean-Coffey, Tom Kelly, Claire Hutchings, David Devlin Foltz, Robin Kane, Jared Raynor, Anne Gienapp Type: Presentation Slides Date: Oct 19, 2013 Be the first to review this resource! Web Link -
State of the Field: Updated, Longitudinal Findings about Nonprofit and Philanthropic Evaluation Practices and Capacities [Slides] The State of Evaluation project provides valuable insight for all those who work in and with the nonprofit sector. The project is designed to collect longitudinal data to document evaluation trends in the U.S. nonprofit sector, including how nonprofits staff evaluation, how evaluation is funded, why evaluation is undertaken, how evaluation results are used, and much more. This year marks the beginning of longitudinal data and analysis, drawing from the first iteration of the project in 2010. Author: Johanna Morariu Type: Presentation Slides Date: Oct 27, 2012 Be the first to review this resource! Web Link -
Success and Failure in the Evaluation Process What do the terms "success" and "failure" really mean in the philanthropic world? Funders have taken different approaches to learning from initiatives that haven't gone quite as they had hoped. Some funders want to learn from their mistakes, some provide technical assistance to lagging grantees, and some want to focus their light on "bright spots" and grantee successes.
Author: Kat Athanasiades Type: Presentation Slides Date: Mar 1, 2015 Be the first to review this resource! Web Link -
System Mapping: A Case Example Innovation Network's Ehren Reed introduced this handout at a panel titled "Body of Evidence or Firsthand Experience? Evaluation of Two Concurrent and Overlapping Advocacy Initiatives" at the 2009 American Evaluation Association conference. The handout demonstrates the concept of system mapping using a simplified version of a map produced as part of an advocacy evaluation capacity building project for CARE USA.
Author: Innovation Network, Inc. Type: Templates & Samples Date: Nov 1, 2009 Be the first to review this resource! Download (143.92 KB) -
The Conference is Over, Now What? Professional Development for Novice Evaluators (Presentation handout) Are you a recent graduate or novice evaluator? If so, cheers to your first few years in the evaluation field! Your professional development is just beginning.
Author: Ann K. Emery Type: Tipsheets & Paper Tools Date: Oct 18, 2013 Be the first to review this resource! Download (547.8 KB)