Effective program evaluation is a systematic way to improve and account for public health actions. Evaluation involves procedures that are useful, feasible, ethical, and accurate. The Centers for Disease Control and Prevention (CDC) has developed a Framework for Program Evaluation, a practical, non-prescriptive tool that summarizes and organizes the steps and standards for effective program evaluation.1
1. Increase Your Knowledge and Skills. This training spotlight uses the CDC framework as a conceptual model to organize learning opportunities. It presents introductions to the six steps of program evaluation in short video podcasts. You can also download materials from the CDC about each step. After reviewing the introductory material, you can access additional learning opportunities to gain knowledge and skills related to each step of the framework. Access Trainings from the MCH Navigator
2. Translate Knowledge to Practice. This spotlight also includes an Evaluation Toolkit developed by NCEMCH that includes an evaluation primer, a collection of key resources, and an interactive Choose-and-Use tool to assist users in finding instructions on how to conduct evaluations and examples of successful evaluations from the field. Access Resources from NCEMCH
Trainings from the MCH Navigator
Engaging stakeholders means involving those engaged in program operations, those served or affected by the program, and the primary users of the evaluation. For additional details, see the CDC factsheet on Engaging Stakeholders.
Start Here:
Learn More:
Engaging Partners for the Transformation of the MCH Block Grant. Date Developed: March 4, 2017. Source: AMCHP. Presenter(s): Nora Carswell, BBA; Michele Lawler, MS, RD; Michael Spencer, LGSW, MSW. Type: Video Conference Presentation. Level: Introductory. Length: 115 minutes.
Describing the program includes the need, expected effects, activities, resources, stage, context, and logic model. For additional details, see the CDC factsheet on Describing the Program.
Start Here:
Learn More:
Writing SMART Goals and Objectives. Date Developed: March 30, 2011. Source: Capacity for Health. Presenter(s): Sonya Dublin. Type: Webinar. Level: Introductory. Length: 60 minutes.
Logic Models for Evaluation Planning. Date Developed: September 25, 2012. Source: Capacity for Health. Presenter(s): Lyn Paleo, MPA, DrPH. Type: Webinar. Level: Introductory. Length: 58 minutes.
Program Evaluation, Improvement: Understanding Needs Assessments. Date Developed: December 17, 2015. Source: Defense Centers of Excellence. Presenter(s): Capt. Armen H Thoumaian, PhD, USPHS; Carter Frank, MA, MS; S. Hope Gilbert, PhD; Debra Stark, MBA; Susanne Meehan, BA. Type: Webinar. Level: Introductory. Length: 45 minutes.
Enhancing Program Performance with Logic Models. Date Developed: Unknown. Source: University of Wisconsin-Extension. Presenter(s): N/A. Type: Online Course. Level: Introductory. Length: Self-paced.
Focusing the evaluation design ensures that you assess the issues of greatest concern to stakeholders while using time and resources as efficiently as possible. Consider the purpose, users, uses, questions, methods, and agreements. For additional details, see the CDC factsheet on Focusing the Evaluation Design.
Start Here:
Learn More:
Developing Good Evaluation Questions. Date Developed: April 23, 2014. Source: USAID Learning Lab. Presenter(s): Unknown. Type: Webinar. Level: Introductory. Length: Unknown.
Program Evaluation: Basic Data Analysis. Date Developed: January 25, 2012. Source: Capacity for Health. Presenter(s): Jessica Manta-Meyer. Type: Webinar. Level: Introductory. Length: 64 minutes.
Intermediate Data Management and Analysis. Date Developed: May 2, 2012. Source: Capacity for Health. Presenter(s): Jessica Manta-Meyer. Type: Webinar. Level: Intermediate. Length: 55 minutes.
Focusing Evaluation Design. Date Developed: March 27, 2014. Source: Urban Indian Health Institute. Presenter(s): Julie Loughran, MPH. Type: Webinar. Level: Introductory. Length: 30 minutes.
Thinking About Design. Date Developed: Unknown. Source: CDC. Presenter(s): Thomas J Chapel, MA, MBA. Type: Webinar. Level: Introductory. Length: 15 minutes.
Gathering credible evidence strengthens evaluation judgments and the recommendations that follow. These aspects of evidence gathering typically affect perceptions of credibility: indicators, sources, quality, quantity, and logistics. For additional details, see the CDC factsheet on Gathering Credible Evidence.
Data Collection in Program Evaluation: How to Ensure Quality and Security. Date Developed: May 17, 2015. Source: Defense Centers of Excellence. Presenter(s): Capt. Armen H Thoumaian, PhD, USPHS; Jill Goodwin, PsyD; Toby Canning, PhD; Carter Frank, MA, MS; Susanne Meehan, BS. Type: Webinar. Level: Introductory. Length: 49 minutes.
Approaches to Collecting and Using Social Determinants of Health Data. Date Developed: June 23, 2016. Source: All In: Data for Community Health. Presenter(s): Peter Eckart, AM; Alison Rein, MS; Andrew Hamilton, RN, BSN, MS; Michelle Lyn, MBA, MHA. Type: Webinar. Level: Introductory. Length: 60 minutes.
Using Mixed Methods to Evaluate Interventions. Date Developed: June 14, 2016. Source: International Institute for Qualitative Methodology. Presenter(s): Souraya Sidani. Type: Webinar. Level: Introductory. Length: 55 minutes.
Data Collection Choices. Date Developed: Unknown. Source: CDC. Presenter(s): Tom Chapel. Type: Webinar. Level: Introductory. Length: 22 minutes.
Mixed Methods in Program Evaluation. Date Developed: Unknown. Source: CDC. Presenter(s): Tom Chapel. Type: Webinar. Level: Introductory. Length: 22 minutes.
Justifying conclusions entails linking those conclusions to the evidence gathered and judging them against agreed-upon values or standards set by the stakeholders. Justify conclusions on the basis of evidence using these five elements: standards, analysis/synthesis, interpretation, judgment, and recommendations. For additional details, see the CDC factsheet on Justifying Conclusions.
Start Here:
Learn More:
Introduction to Analyzing Evaluation Data. Date Developed: June 23, 2016. Source: Urban Indian Health Institute. Presenter(s): Jami Bartgis, PhD. Type: Webinar. Level: Introductory. Length: 66 minutes.
Basic Data Analysis. Date Developed: April 18, 2012. Source: Capacity for Health. Presenter(s): Jessica Manta-Meyer. Type: Webinar. Level: Introductory. Length: 64 minutes.
Fundamentals of Qualitative Research Methods. Date Developed: June 23, 2015. Source: Yale Global Health Leadership Institute. Presenter(s): Leslie Curry, PhD, MPH. Type: Webinar. Level: Introductory. Length: 17 minutes.
Ensuring use and sharing lessons learned involves five elements: design, preparation, feedback, follow-up, and dissemination. For additional details, see the CDC factsheet on Ensuring Use and Sharing Lessons Learned, as well as a checklist of items to consider when developing evaluation reports.
Use of Impact Evaluation Results. Date Developed: November 20, 2012. Source: InterAction. Presenter(s): David Bonbright. Type: Webinar. Level: Introductory. Length: 61 minutes.
This video presentation discusses various evaluation methodologies and how they have been applied in real-world settings.
Resources from NCEMCH
The Evaluation Toolkit is designed to help users locate resources to assist in documenting and achieving measurable health outcomes. It consists of three components:
An Evaluation Primer that explains the basics of public health evaluations.
A list of Key Resources for use in the development of evaluations.
A Choose and Use Guide for quickly accessing current evaluations and guides that can serve as promising practices.
1 Koplan JP, Milstein RL, Wetterhall S. "Framework for program evaluation in public health." Atlanta, GA: U.S. Department of Health and Human Services; 1999.
Evaluation: Training Spotlight
December 2017
Keisha Watson, PhD
John Richards, MA, AITP
This project is supported by the Health Resources and Services Administration (HRSA) of the U.S. Department of Health and Human Services (HHS) under grant number UE8MC25742, MCH Navigator, for $225,000/year. This information or content and conclusions are those of the authors and should not be construed as the official position or policy of, nor should any endorsement be inferred by, HRSA, HHS, or the U.S. Government.