USAID Performance Monitoring and Evaluation TIPS
USAID's Performance Monitoring and Evaluation TIPS provide practical advice and suggestions to USAID managers and partners on issues related to performance monitoring and evaluation. These publications are supplemental references to Automated Directives System (ADS) Chapter 203 (pdf, 264kb).
TIPS 1: Conducting a Participatory Evaluation: Participatory evaluation actively involves those with a stake in the program in the evaluation process: providers, partners, customers (beneficiaries), and any other interested parties. Participation typically takes place throughout all phases of the evaluation: planning and design; gathering and analyzing the data; identifying the evaluation findings, conclusions, and recommendations; disseminating results; and preparing an action plan to improve program performance. For complete TIP, click here (pdf, 113kb).
TIPS 2: Conducting Key Informant Interviews: Key informant interviews are qualitative, in-depth interviews of 15 to 35 people selected for their first-hand knowledge of a topic of interest. The interviews are loosely structured, relying on a list of issues to be discussed. Key informant interviews resemble a conversation among acquaintances, allowing a free flow of ideas and information. Interviewers frame questions spontaneously, probe for information, and take notes, which are elaborated on later. For complete TIP, click here (pdf, 264kb).
TIPS 3: Preparing an Evaluation Scope of Work: The scope of work (SOW) is viewed as the single most critical document in the development of a good evaluation. The SOW states (1) the purpose of the evaluation, (2) the questions that must be answered, (3) the expected quality of the evaluation results, (4) the expertise needed to do the job, and (5) the time frame and budget available to support the task. For complete TIP, click here (pdf, 281kb).

TIPS 4: Using Direct Observation Techniques: Most evaluation teams conduct some fieldwork, observing what is actually going on at assistance activity sites. Often this is done informally, without much thought to the quality of data collection. Direct observation techniques allow for a more systematic, structured process, using well-designed observation record forms. For complete TIP, click here (pdf, 129kb).
TIPS 5: Using Rapid Appraisal Methods: Rapid Appraisal (RA) is an approach that draws on multiple evaluation methods and techniques to quickly, yet systematically, collect data when time in the field is limited. RA practices are also useful when there are budget constraints or limited availability of reliable secondary data. For example, time and budget limitations may preclude the option of using representative sample surveys. For complete TIP, click here (pdf, 100kb).
TIPS 6: Selecting Performance Indicators: Performance indicators define a measure of change for the results identified in a Results Framework (RF). When well chosen, they convey whether key objectives are being achieved in a way that is meaningful for performance management. While a result (such as an Assistance Objective or an Intermediate Result) identifies what we hope to accomplish, indicators tell us by what standard that result will be measured. Targets specify the expected direction of change, increase or decrease, and its magnitude. For complete TIP, click here (pdf, 351kb).
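To make the indicator-and-target arithmetic concrete, the following minimal Python sketch shows one way to record an indicator's baseline and target and to report progress against the planned change. The indicator name and figures are hypothetical illustrations, not drawn from the TIP.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str        # what is measured
    baseline: float  # value before the program begins
    target: float    # planned value at the end of the reporting period

    def percent_of_target(self, actual: float) -> float:
        """Share of the planned baseline-to-target change achieved so far."""
        planned_change = self.target - self.baseline
        if planned_change == 0:
            raise ValueError("Target equals baseline; no change was planned.")
        return 100 * (actual - self.baseline) / planned_change

# Hypothetical example: literacy expected to rise from 60% to 75%; 70% observed.
literacy = Indicator("Adult literacy rate (%)", baseline=60.0, target=75.0)
print(f"{literacy.percent_of_target(70.0):.0f}% of planned change achieved")  # 67%
```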
TIPS 7: Preparing a PMP: This TIPS provides the reader with an overview of the purpose and content of a Performance Management Plan (PMP). It reviews key concepts for effective performance management and outlines practical steps for developing a PMP. For complete TIP, click here (pdf, 137kb).
TIPS 8: Baselines and Targets: The achievement of planned results is at the heart of USAID's performance management system. To understand where we, as project managers, are going, we need to understand where we have been. Establishing quality baselines and setting ambitious, yet achievable, targets are essential for the successful management of foreign assistance programs. For complete TIP, click here (pdf, 95kb).

TIPS 9: Conducting Customer Service Assessments: A customer service assessment is a management tool for understanding USAID's programs from the customer's perspective. Most often these assessments seek feedback from customers about a program's service delivery performance. The Agency seeks views from both ultimate customers (the end users, or beneficiaries, of USAID activities, usually disadvantaged groups) and intermediate customers (persons or organizations using USAID resources, services, or products to serve the needs of the ultimate customers). For complete TIP, click here (pdf, 125kb).
TIPS 10: Conducting Focus Group Interviews: A focus group interview is an inexpensive, rapid appraisal technique that can provide managers with a wealth of qualitative information on performance of development activities, services, and products, or other issues. A facilitator guides 7 to 11 people in a discussion of their experiences, feelings, and preferences about a topic. The facilitator raises issues identified in a discussion guide and uses probing techniques to solicit views, ideas, and other information. For complete TIP, click here (pdf, 120kb).
TIPS 11: Introduction to Evaluations at USAID: This TIPS provides a general introduction to the purpose, role, and function of evaluation in the USAID program and project design and implementation cycle. It explains why evaluation has become an important part of the effort to improve the effectiveness of foreign assistance programming, and it links to other TIPS with more detailed guidance on when and why to evaluate, how to evaluate, uses of evaluation data, how to address common problems, and how to structure an evaluation's findings, conclusions, and recommendations. For complete TIP, click here (pdf, 175kb).
TIPS 12: Data Quality Standards: Data quality is one element of a larger interrelated performance management system. Data quality flows from a well-designed and logical strategic plan where Assistance Objectives (AOs) and Intermediate Results (IRs) are clearly identified. If a result is poorly defined, it is difficult to identify quality indicators, and further, without quality indicators, the resulting data will often have data quality problems. For complete TIP, click here (pdf, 304kb).
TIPS 13: Building a Results Framework: The Results Framework (RF) is a graphic representation of a strategy to achieve a specific objective that is grounded in cause-and-effect logic. The RF includes the Assistance Objective (AO) and Intermediate Results (IRs), whether funded by USAID or partners, necessary to achieve the objective (see Figure 1 for an example). The RF also includes the critical assumptions that must hold true for the strategy to remain valid. For complete TIP, click here (pdf, 476kb).
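As an illustration of the cause-and-effect hierarchy a Results Framework expresses, here is a minimal Python sketch of an Assistance Objective supported by Intermediate Results. The objective and result statements are hypothetical examples, not taken from the TIP.

```python
from dataclasses import dataclass, field

@dataclass
class Result:
    statement: str
    sub_results: list = field(default_factory=list)  # IRs that support this result

# Hypothetical framework: an AO supported by IRs, including a partner-funded one.
rf = Result("AO: Increased use of quality primary health services", [
    Result("IR 1: Improved access to clinics"),
    Result("IR 2: Improved quality of care", [
        Result("IR 2.1: Health workers trained (partner-funded)"),
    ]),
])

def show(result: Result, depth: int = 0) -> None:
    """Print the framework with indentation reflecting the cause-effect chain."""
    print("  " * depth + result.statement)
    for sub in result.sub_results:
        show(sub, depth + 1)

show(rf)
```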
TIPS 14: Monitoring the Policy Reform Process: The discussion and examples in this paper are organized around the issues and challenges that USAID's development professionals and their clients/partners face when designing and implementing systems to monitor the policy reform process. For complete TIP, click here (pdf, 216kb).
TIPS 15: Measuring Institutional Capacity: This TIPS gives USAID managers information on measuring institutional capacity, including some tools that measure the capacity of an entire organization as well as others that look at individual components or functions of an organization. The discussion concentrates on the internal capacities of individual organizations, rather than on the entire institutional context in which organizations function. This TIPS is not about how to actually strengthen an institution, nor is it about how to assess the eventual impact of an organization's work. Rather, it is limited to a specific topic: how to measure an institution's capacities. For complete TIP, click here (pdf, 243kb).

TIPS 16: Mixed Methods: This TIPS provides guidance on using a mixed-methods approach for evaluation research. Frequently, evaluation statements of work specify that a mix of methods be used to answer evaluation questions. This TIPS includes the rationale for using a mixed-method evaluation design, guidance for selecting among methods (with an example from an evaluation of a training program), and examples of techniques for analyzing data collected with several different methods (including "parallel analysis"). For complete TIP, click here (pdf, 426kb).
TIPS 17: Creating an Evaluation Report: This TIPS has three purposes. First, it provides guidance for evaluators on the structure, content, and style of evaluation reports. Second, it offers the USAID officials who commission evaluations ideas on how to define the main deliverable. Third, it provides USAID officials with guidance on reviewing and approving evaluation reports. For complete TIP, click here (pdf, 129kb).
TIPS 18: Conducting Data Quality Assessments: Data quality assessments (DQAs) help managers to understand how confident they should be in the data used to manage a program and report on its success. USAID's ADS notes that the purpose of the Data Quality Assessment is to: "…ensure that the USAID Mission/Office and Assistance Objective (AO) Team are aware of the strengths and weaknesses of the data, as determined by applying the five data quality standards …and are aware of the extent to which the data integrity can be trusted to influence management decisions." (ADS 203.3.5.2) For complete TIP, click here (pdf, 112kb).
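As a concrete illustration, the short Python sketch below records DQA findings against the five ADS 203 data quality standards (validity, integrity, precision, reliability, and timeliness) and flags any standard not yet assessed. The statuses and wording are hypothetical, not prescribed by the TIP.

```python
# The five data quality standards named in ADS 203.
DQA_STANDARDS = ("validity", "integrity", "precision", "reliability", "timeliness")

def summarize_dqa(findings: dict) -> None:
    """Print one line per standard; flag any standard not yet assessed."""
    for standard in DQA_STANDARDS:
        print(f"{standard:>12}: {findings.get(standard, 'NOT ASSESSED')}")

# Hypothetical assessment results for one indicator's data source.
summarize_dqa({
    "validity": "ok - indicator measures the intended result",
    "precision": "weak - rounding masks year-to-year change",
    "timeliness": "ok - data arrive within the reporting cycle",
})
```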
TIPS 19: Impact Evaluations: Rigorous impact evaluations are useful for determining the effects of USAID programs on outcomes. This type of evaluation allows managers to test development hypotheses by comparing changes in one or more specific outcomes to changes that occur in the absence of the program. Evaluators term this the counterfactual. Rigorous impact evaluations typically use comparison groups, composed of individuals or communities that do not participate in the program. The comparison group is examined in relation to the treatment group to determine the effects of the USAID program or project. For complete TIP, click here (pdf, 400kb).
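One common way to operationalize this treatment-versus-comparison logic is a difference-in-differences calculation, sketched below in Python. This is an illustrative example of the general idea, not the TIP's prescribed method, and the outcome values are hypothetical.

```python
def difference_in_differences(treat_before: float, treat_after: float,
                              comp_before: float, comp_after: float) -> float:
    """Change in the treatment group minus change in the comparison group."""
    return (treat_after - treat_before) - (comp_after - comp_before)

# Hypothetical outcomes: average household income (USD/month) before and
# after the program, for participating and non-participating communities.
effect = difference_in_differences(treat_before=100, treat_after=130,
                                   comp_before=100, comp_after=110)
print(f"Estimated program effect: {effect:+.0f} USD/month")  # +20
```

Subtracting the comparison group's change removes trends that would have occurred anyway, leaving an estimate of the program's effect relative to the counterfactual.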
Bibliographic Details |
---|---
Author | USAID
Publisher | United States Agency for International Development
Publication Date | January 1, 2012
Publication City | Washington, DC