Road to Results : Designing and Conducting Effective Development Evaluations.
Title:
Road to Results : Designing and Conducting Effective Development Evaluations.
Author:
Morra Imas, Linda G.
ISBN:
9780821379110
Physical Description:
1 online resource (611 pages)
Contents:
CONTENTS -- Preface -- About the Authors -- Abbreviations -- Introduction -- References -- FOUNDATIONS -- Chapter 1. Introducing Development Evaluation -- Evaluation: What Is It? -- The Origins and History of the Evaluation Discipline -- The Development Evaluation Context -- Principles and Standards for Development Evaluation -- Examples of Development Evaluations -- Summary -- Chapter 1 Activity -- Notes -- References and Further Reading -- Web Sites -- Chapter 2. Understanding the Issues Driving Development Evaluation -- Overview of Evaluation in Developed and Developing Countries -- Implications of Emerging Development Issues -- Summary -- References and Further Reading -- Web Sites -- PREPARING AND CONDUCTING EFFECTIVE DEVELOPMENT EVALUATIONS -- Chapter 3. Building a Results-Based Monitoring and Evaluation System -- Importance of Results-Based Monitoring and Evaluation -- What Is Results-Based Monitoring and Evaluation? -- Traditional versus Results-Based Monitoring and Evaluation -- Ten Steps to Building a Results-Based Monitoring and Evaluation System -- Summary -- Chapter 3 Activities -- Notes -- References and Further Reading -- Web Sites -- Chapter 4. Understanding the Evaluation Context and the Program Theory of Change -- Front-End Analysis -- Identifying the Main Client and Key Stakeholders -- Understanding the Context -- Tapping Existing Knowledge -- Constructing, Using, and Assessing a Theory of Change -- Summary -- Chapter 4 Activities -- References and Further Reading -- Web Sites -- Chapter 5. Considering the Evaluation Approach -- General Approaches to Evaluation -- Summary -- Chapter 5 Activity -- Notes -- References and Further Reading -- Web Sites -- DESIGNING AND CONDUCTING -- Chapter 6. Developing Evaluation Questions and Starting the Design Matrix -- Sources of Questions -- Types of Questions.

Identifying and Selecting Questions -- Developing Good Questions -- Designing the Evaluation -- Summary -- Chapter 6 Activities -- References and Further Reading -- Web Sites -- Chapter 7. Selecting Designs for Cause-and-Effect, Descriptive, and Normative Evaluation Questions -- Connecting Questions to Design -- Designs for Cause-and-Effect Questions -- Designs for Descriptive Questions -- Designs for Normative Questions -- The Need for More Rigorous Evaluation Designs -- Chapter 7 Activities -- Annex 7.1 Summary of Key Terms and Attributes of Different Design Types -- References and Further Reading -- Web Sites -- Chapter 8. Selecting and Constructing Data Collection Instruments -- Data Collection Strategies -- Characteristics of Good Measures -- Quantitative and Qualitative Data -- Tools for Collecting Data -- Summary -- Chapter 8 Activities -- References and Further Reading -- Web Sites -- Chapter 9. Choosing the Sampling Strategy -- Introduction to Sampling -- Types of Samples: Random and Nonrandom -- Determining the Sample Size -- Summary -- Chapter 9 Activities -- References and Further Reading -- Web Sites -- Chapter 10. Planning for and Conducting Data Analysis -- Data Analysis Strategy -- Analyzing Qualitative Data -- Analyzing Quantitative Data -- Linking Qualitative Data and Quantitative Data -- Summary -- Chapter 10 Activities -- References and Further Reading -- Web Sites -- MEETING CHALLENGES -- Chapter 11. Evaluating Complex Interventions -- Big-Picture Views of Development Evaluation -- Joint Evaluations -- Country Program Evaluations -- Sector Program Evaluations -- Thematic Evaluations -- Evaluation of Global and Regional Partnership Programs -- Summary -- References and Further Reading -- Web Sites -- LEADING -- Chapter 12. Managing an Evaluation -- Managing the Design Matrix -- Contracting the Evaluation.

Roles and Responsibilities of Different Players -- Managing People, Tasks, and Budgets -- Summary -- Chapter 12 Activities -- References and Further Reading -- Web Sites -- Chapter 13. Presenting Results -- Crafting a Communication Strategy -- Writing an Evaluation Report -- Displaying Information Visually -- Making an Oral Presentation -- Summary -- Chapter 13 Activities -- References and Further Reading -- Web Sites -- Chapter 14. Guiding the Evaluator: Evaluation Ethics, Politics, Standards, and Guiding Principles -- Ethical Behavior -- Politics and Evaluation -- Evaluation Standards and Guiding Principles -- Summary -- Chapter 14 Activity -- References and Further Reading -- Web Sites -- Chapter 15. Looking to the Future -- Past to Present -- The Future -- Chapter 15 Activity -- References and Further Reading -- Web Sites -- Appendixes -- Index -- Boxes -- 1.1 Uses of Evaluation -- 1.2 The Institute of Internal Auditors and the International Organization of Supreme Audit Institutions -- 1.3 The 10 Parts of the DAC Evaluation Quality Standards -- 1.4 Relevance: The World Food Programme's Evaluation of Food Aid for Relief and Recovery in Somalia -- 1.5 Effectiveness: DfID's Evaluation of Support for the World Food Programme's Efforts in Bangladesh -- 1.6 Efficiency: Evaluation of the Disasters Emergency Committee's Mozambique Flood Appeal Funds -- 1.7 Impact: Joint Evaluation of Emergency Assistance to Rwanda -- 1.8 Sustainability: JICA's Evaluation of the Third Country Training Program on Information and Communication Technology -- 2.1 The Millennium Development Goals -- 3.1 The Power of Measuring Results -- 3.2 Difference between Results-Based Monitoring and Results-Based Evaluation -- 3.3 Ten Uses of Results Findings -- 4.1 How to Conduct a Stakeholder Analysis -- 4.2 Tapping the Knowledge Fund on Crime Prevention.

4.3 Reviewing the Literature on Programs to Improve Student Achievement in a Southern Africa Country -- 5.1 Social Assessment of the Azerbaijan Agricultural Development and Credit Project -- 5.2 Building Trust through Participatory Evaluation -- 5.3 Using an Evaluation Synthesis to Measure Environmental Benefits -- 6.1 Evaluating an Intervention That Has No Standards -- 6.2 Evaluating Policies and Interventions Using Question-and-Answer Questions -- 6.3 The Five Stages of the Evaluation Process -- 7.1 What Makes Elephants Go Away? -- 7.2 Do Community-Managed Schools Work? An Evaluation of El Salvador's EDUCO Program -- 7.3 Using a Cross-Sectional Design to Answer Descriptive Questions -- 7.4 Example of Case Study Design for Descriptive Questions -- 7.5 Using a Before-and-After Design to Answer Descriptive Questions -- 7.6 Example of Interrupted Time Series Design for Descriptive Questions -- 7.7 Impact of Job Training Programs for Laid-Off Workers -- 7.8 Spain's Support of Rigorous Evaluation Designs -- 8.1 Rules for Collecting Data -- 8.2 Taking a Structured Approach to Evaluating an Agricultural Intervention -- 8.3 Patton's 20-Question Qualitative Checklist -- 8.4 Measuring the Popularity of Art Exhibits Using Obtrusive and Unobtrusive Methods -- 8.5 Using Google Earth for Mapping -- 8.6 Structured and Semistructured Survey Questions -- 8.7 Tips for Wording Survey Questions -- 8.8 General Guidelines for Conducting Surveys -- 8.9 Tips for Writing Questionnaires -- 8.10 Tips for Conducting Interviews -- 8.11 Tips for Conducting Interviews across Cultures -- 8.12 Tips for Designing Focus Group Questions -- 8.13 Sample Focus Group Questions -- 9.1 Using Cluster Sampling to Identify People with AIDS to Interview -- 10.1 Example of Coding -- 10.2 Using Content Analysis to Identify the Perceived Benefits of Hand Pumps.

10.3 Strategies for Coding Data Electronically -- 10.4 Tips for Maintaining Quantitative Data -- 10.5 Calculating the Standard Deviation -- 10.6 Guidelines for Analyzing Quantitative Survey Data -- 11.1 Example of Country Program Evaluation Methodology -- 11.2 Joint External Evaluation of Tanzania's Health Sector, 1999–2006 -- 11.3 Example of Thematic Evaluation of Assessment of Child Scavenging in Africa, Asia, and Europe -- 12.1 Tips for Writing Terms of Reference for an Evaluation -- 12.2 Tips for Resolving Conflicts -- 12.3 Tips for Improving Teamwork -- 13.1 Using Innovative Communication Techniques to Increase Interest in Evaluation Results -- 13.2 Tips for Creating Effective Figures -- 13.3 Tips for Preparing PowerPoint Presentations and Handouts -- 13.4 Tips for Using Slides -- 15.1 Key Issues for Diagnosing a Government's M&E Systems -- 15.2 Essential Competencies for Program Evaluators -- Figures -- 3.1 Program Theory of Change (Logic Model) to Achieve Outcomes and Impacts -- 3.2 Sample Program Theory of Change (Logic Model) to Reduce Childhood Morbidity through Use of Oral Rehydration Therapy -- 3.3 Ten Steps to Designing, Building, and Sustaining a Results-Based Monitoring and Evaluation System -- 3.4 Spectrum of Data Collection Methods -- 3.5 Identifying Expected or Desired Level of Improvement Requires Selecting Performance Targets -- 3.6 Key Types of Monitoring -- 3.7 Links between Implementation Monitoring and Results Monitoring -- 3.8 Example of Linking Implementation Monitoring to Results Monitoring -- 3.9 Achieving Results through Partnership -- 4.1 Moving from Inputs to Results -- 4.2 Potential Environmental Influences on Program Results -- 4.3 Process for Constructing a Theory of Change -- 4.4 A Simple Theory of Change for Improving Decision Making by Training Evaluators.

4.5 Simple Theory of Change Diagram with Key Assumptions Identified.
Abstract:
This comprehensive text presents concepts and procedures for evaluation in a development context. It provides procedures and examples on how to set up a monitoring and evaluation system, how to conduct participatory evaluations and do social mapping, and how to construct a "rigorous" quasi-experimental design to answer an impact question. The book begins with a description of the context of development evaluation and how it arrived where it is today. It then discusses current issues driving development evaluation, such as the Millennium Development Goals and the move from simple project evaluations to the broader understandings of complex evaluations. The topics of implementing "Results-Based Measurement and Evaluation" and constructing a "Theory of Change" are emphasized throughout the text. Next, the authors take the reader down "the road to results," presenting procedures for evaluating projects, programs, and policies by using a "Design Matrix" to help map the process. This road includes determining the overall approach; formulating questions; selecting designs; developing data collection instruments; choosing a sampling strategy; and planning data analysis for qualitative, quantitative, and mixed-method evaluations. The book also includes discussions on conducting complex evaluations, managing evaluations, presenting results, and ethical behavior, including principles, standards, and guidelines. The final chapter discusses the future of development evaluation. This comprehensive text is an essential tool for those involved in development evaluation.
Local Note:
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2017. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.