10-Step Evaluation for Training and Performance Improvement
The item will be made available for download at the end of the ordering process.

Format: WEB PDF
Availability: immediately available (download)

Previous price: €74.99

Now: €74.98*

ISBN-13: 9781544323978
Binding: WEB PDF
Pages: 352
Author: Seung Youn (Yonnie) Chyung
eBook type: PDF
eBook format: WEB PDF
Copy protection: Adobe DRM [Hard-DRM]
Language: English

Description:


Written with a learning-by-doing approach in mind, 10-Step Evaluation for Training and Performance Improvement gives students actionable instruction for identifying, planning, and implementing a client-based program evaluation. The book introduces readers to multiple evaluation frameworks and uses problem-based learning to guide them through a 10-step evaluation process. As students read the chapters, they produce specific deliverables that culminate in a completed evaluation project.

List of Tables

List of Figures

List of Exhibits

Preface

About the Author

Introduction

Performance Improvement and Evaluation

What Is Evaluation?

What Is Not Evaluation?

How Does Evaluation Compare With Research?

Program Evaluation in the HPI Context

Evaluation Is Often Neglected

Different Evaluation Designs Used in Program Evaluation

Descriptive Case Study Type Evaluation Design

Frameworks for Conducting Evaluations in the HPI Context

The 10-Step Evaluation Procedure

Chapter Summary

Chapter Discussion

Chapter 1. Identify an Evaluand (Step 1) and Its Stakeholders (Step 2)

Identify a Performance Improvement Intervention as an Evaluand

Use the 5W1H Method to Understand the Intervention Program

Ask Why the Intervention Program Was Implemented

Check If Program Goals Are Based on Needs

Sell Evaluation to the Client

Identify Three Groups of Stakeholders

Chapter Summary

Chapter Discussion

Now, Your Turn—Identify an Evaluand and Its Stakeholders

Chapter 2. Identify the Purpose of Evaluation (Step 3)

Differentiate Evaluation From Needs Assessment

Gather Information About the Evaluation Purpose

Assess Stakeholders’ Needs for the Program and the Evaluation

Determine If the Evaluation Is a Formative or Summative Type

Determine If the Evaluation Is Goal Based or Goal Free

Determine If the Evaluation Is Merit Focused or Worth Focused

Keep in Mind Using a System-Focused Evaluation Approach

Write an Evaluation Purpose Statement

Chapter Summary

Chapter Discussion

Now, Your Turn—Identify the Purpose of Evaluation

Chapter 3. Assess Evaluation Feasibility and Risk Factors

Incorporate Macro-Level Tasks Into Micro-Level Steps

Assess Feasibility of the Evaluation Project

List Project Assumptions

Estimate Tasks and Time Involving Stakeholders

Assess Risk Factors for the Evaluation Project

Chapter Summary

Chapter Discussion

Now, Your Turn—Assess Feasibility and Risk Factors

Chapter 4. Write a Statement of Work

Prepare a Statement of Work for the Evaluation

Determine Sections to Be Included in a Statement of Work

Develop a Gantt Chart

Review a Sample Statement of Work

Now, Your Turn—Write a Statement of Work

Chapter 5. Develop a Program Logic Model (Step 4)

Apply a Theory-Based, If–Then Logic to Developing a Program

Review United Way’s Program Outcome Model

Review Kellogg Foundation’s Program Logic Model

Review Brinkerhoff’s Training Impact Model Compared to the Four-Level Training Evaluation Framework

Compare Elements Used in Different Frameworks

Develop a Program Logic Model

Develop a Training Impact Model

Chapter Summary

Chapter Discussion

Now, Your Turn—Develop a Program Logic Model or a Training Impact Model

Chapter 6. Determine Dimensions and Importance Weighting (Step 5)

Think About Dimensions of the Evaluand to Investigate

Start With the Stakeholders’ Needs

Relate the Purpose of Evaluation to the Program Logic Model Elements

Incorporate Relevant Theoretical Frameworks and Professional Standards

Write Dimensional Evaluation Questions

Determine Importance Weighting Based on Usage of Dimensional Findings

Recognize a Black Box, Gray Box, or Clear Box Evaluation

Finalize the Number of Dimensions

Chapter Summary

Chapter Discussion

Now, Your Turn—Determine Dimensions and Importance Weighting

Chapter 7. Determine Data Collection Methods (Step 6)

Determine Evaluation Designs for Dimensional Evaluations

Select Data Collection Methods That Allow Direct Measures of Dimensions

Apply Critical Multiplism

Triangulate Multiple Sets of Data

Select Appropriate Methods When Using the Four-Level Training Evaluation Model

Select Appropriate Methods When Using Brinkerhoff’s Success Case Method

Review an Example of Data Collection Methods

Use an Iterative Design Approach

Assess Feasibility and Risk Factors Again

Conduct Formative Meta-Evaluations

Chapter Summary

Chapter Discussion

Now, Your Turn—Determine Data Collection Methods

Chapter 8. Write an Evaluation Proposal and Get Approval

Determine Sections to Be Included in an Evaluation Proposal

Review a Sample Evaluation Proposal

Now, Your Turn—Write an Evaluation Proposal

Chapter 9. Develop Data Collection Instruments I—Self-Administered Surveys (Step 7)

Comply With IRB Requirements

Use Informed Consent Forms

Determine Materials to Be Developed for Different Data Collection Methods

Distinguish Anonymity From Confidentiality

Develop Materials for Conducting Self-Administered Surveys

Determine Whether to Use Closed-Ended Questions, Open-Ended Questions, or Both

Ask Specific Questions That Measure the Quality of a Dimension

Design Survey Items Using a Question or Statement Format

Recognize Nominal, Ordinal, Interval, and Ratio Scales

Decide Whether to Include or Omit a Midpoint in the Likert Scale

Decide Whether to Use Ascending or Descending Order of the Likert Scale Options

Follow Other Guidelines for Developing Survey Items

Develop Survey Items That Measure a Construct

Test Validity and Reliability of a Survey Instrument

Conduct Formative Meta-Evaluations

Chapter Summary

Chapter Discussion

Now, Your Turn—Develop Survey Instruments

Chapter 10. Develop Data Collection Instruments II—Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests (Step 7)

Determine Whether to Use a Structured, Unstructured, or Semi-Structured Interview

Develop Materials for Conducting Interviews or Focus Groups

Solicit Interview Volunteers at the End of a Self-Administered Web-Based Survey

Develop Materials for Conducting Observations

Develop Materials for Conducting Extant Data Reviews

Develop Materials for Administering Tests

Conduct Formative Meta-Evaluations

Chapter Summary

Chapter Discussion

Now, Your Turn—Develop Instruments for Conducting Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests

Chapter 11. Collect Data (Step 8)

Follow Professional and Ethical Guidelines

What Would You Do?

Use Strategies to Collect Data Successfully and Ethically

Use Strategies When Collecting Data From Self-Administered Surveys

Use Strategies When Collecting Data From Interviews and Focus Groups

Use Strategies When Collecting Data From Observations and Tests

Use Strategies to Ensure Anonymity or Confidentiality of Data

Conduct Formative Meta-Evaluations

Chapter Summary

Chapter Discussion

Now, Your Turn—Collect Data

Chapter 12. Analyze Data With Rubrics (Step 9)

Use Evidence-Based Practice

Keep in Mind: Evaluation = Measurement + Valuation With Rubrics

Apply the Same or Different Weighting to the Multiple Sets of Data

Analyze Structured Survey Data With Rubrics

Analyze Unstructured Survey or Interview Data With Rubrics

Analyze Semi-Structured Survey or Interview Data With Rubrics

Analyze Data Obtained From Observations, Extant Data Reviews, and Tests With Rubrics

Determine the Number of Levels and Labels for Rubrics

Triangulate Results Obtained From Multiple Sources for Each Dimension

Conduct Formative Meta-Evaluations

Chapter Summary

Chapter Discussion

Now, Your Turn—Analyze Data With Rubrics

Chapter 13. Draw Conclusions (Step 10)

Revisit Formative or Summative Use of Evaluation Findings

Develop a Synthesis Rubric

Draw Evidence-Based Conclusions and Recommendations

Conduct Formative Meta-Evaluations

Chapter Summary

Chapter Discussion

Now, Your Turn—Draw Conclusions and Make Recommendations

Chapter 14. Write a Final Report and Conduct a Summative Meta-Evaluation

Extend the Evaluation Proposal to a Final Report

Present Dimensional Results in the Evaluation Results Section

Present Supporting Information in Appendices

Present Conclusions

Report the Findings Ethically

Conduct a Summative Meta-Evaluation

Report Limitations

Write an Executive Summary

Present the Final Report to Stakeholders

Follow Up With Stakeholders

Present Complete Sections in a Final Report

Now, Your Turn—Write a Final Report

Appendix A. A Summary of the Frameworks Used

Appendix B. Evaluation Development Worksheets

Appendix C. Survey Questionnaire Make

Appendix D. A Sample Survey Questionnaire Measuring Multiple Dimensions, Sample Rubrics, and Reliability Testing With IBM® SPSS® Statistics

Appendix E. Experimental Studies and Data Analysis With t-Tests Using Excel

Glossary

References

Index