Evaluation Tools

Mentor provides a guide to assist you with evaluating your prevention program. It includes several resources to help you introduce or improve your program evaluation. Evaluation is an essential part of drug prevention work, allowing you to assess your impact and improve your funding potential.

We all know the value of program evaluation. It involves more than collecting, analyzing, and providing data. It makes it possible for you to gather information and to use that information to learn continually about your program and improve it. Developing and using an evaluation plan is an important step in strengthening community capacity and promoting community involvement. Identifying and measuring outcomes gives program participants a clear view of the way forward and can help promote their active engagement in the program (W.K. Kellogg Foundation, 2004).

Provided below are several resources and starting points for building an evaluation plan or strengthening your current program evaluation. Each resource is summarized and followed by either its web-based source or a link to a supporting document.

1. Center for Substance Abuse Prevention (CSAP) Prevention Tool

CSAP’s online resource provides comprehensive coverage of how to evaluate a prevention program. It is aimed at helping novice and experienced evaluators alike. The resource includes the following tools:

  • Prevention Platform: A comprehensive tool for designing an outcome or process evaluation and identifying data collection strategies.

  • Prevention Pathways: Technical assistance to answer questions related to evaluation, assist with planning evaluation efforts, and identify data collection instruments.

  • Prevention Management Reporting and Training System: This website provides a single point of access to a variety of content and core services, and offers a single sign-on to many Prevention websites that previously required separate logins. It will provide all of CSAP’s education, data collection, and training systems through one web portal.

Source: prevention.samhsa.gov/evaluation

2. Logic Model Development Guide

A popular, scientifically tested, and effective approach to evaluating a program is the logic model. As a learning and management tool, the logic model provides a guide for effective program planning, implementation, and evaluation. Using evaluation and the logic model can strengthen your program by documenting your outcomes and allowing you to share knowledge about what works in your program and why. The W.K. Kellogg Foundation, a large NGO in the United States, published this handbook in 2004; it describes how to use a logic model to develop and implement an evaluation of your program.

Source: W.K. Kellogg Foundation (1-800-819-9997). Click here for a PDF copy of this guide.

3. Prevention Plus

This very detailed handbook provides a step-by-step approach to assessing drug prevention programs at the school and community level. Program evaluation is presented according to a four-step model:

  1. goal and desired outcome identification;
  2. process assessment;
  3. outcome assessment; and
  4. impact assessment.

Steps in preparing an evaluation report and sample assessment measures are also provided, along with specific evaluation worksheets for evaluating any of 50 types of activities commonly used in drug prevention programs.

Source: ERIC (Education Resources Information Center), or click here to read it online at Google Books.

4. Identifying and Selecting Evidence-Based Interventions: Revised Guidance Document for Strategic Prevention Framework State Incentive Grant Program

The purpose of this guidebook is to assist community planners in identifying and selecting evidence-based prevention programs that address local needs and reduce substance abuse problems. The resource has six sections:

  1. Section I sets the stage for selecting evidence-based interventions to include in a comprehensive strategic plan.
  2. Section II focuses on two analytic tasks: 1) assessing local needs, resources, and readiness to act; and 2) developing a community logic model.
  3. Section III details how prevention planners can apply the community logic model to determine the conceptual fit or relevance of prevention strategies that hold the greatest potential for affecting a particular substance abuse problem.
  4. Section IV discusses the importance of strength of evidence to inform and guide intervention selection decisions, and presents three definitions of “evidence-based” programs.
  5. Section V summarizes the process of working through three considerations that determine the best fit of interventions to include in comprehensive prevention plans.
  6. Section VI discusses expectations for selecting and implementing evidence-based, community prevention programs.

Source: Substance Abuse and Mental Health Services Administration. Click here for a PDF copy of the document.

5. European Monitoring Centre for Drugs and Drug Addiction (EMCDDA)

EMCDDA provides two major resources for identifying best practices and evaluating programs.

  • The Evaluation Instruments Bank (EIB). This is an online archive of freely available instruments for evaluating drug-related interventions. Details regarding copyright and/or possible use restrictions are specified for each instrument. Instruments are generally classed according to the intervention field they are designed for (treatment, prevention, or harm reduction), though some may be usable in more than one field. Also, there is an additional link here that provides prevention evaluation support and tools for a wide range of target groups and prevention-related issues, covering both process and outcome evaluations.

  • Prevention and Evaluation Resources Kit (PERK). This resource compiles basic but evidence-based prevention principles, planning rules, and evaluation tips. Additionally, it provides related documentation or references for download; it is hoped that this additional material will be particularly useful for readers who have difficulty accessing the scientific prevention literature. To illustrate the theoretical discussion, an intervention example, partly based on a real-life situation, gives a practical perspective. Finally, an additional aim of the PERK exercise is to develop a first common draft of minimum prevention principles and standards for the European Union, similar to NIDA’s ‘Red Book’.

6. UNODC

Two resources are provided by UNODC.

  • Monitoring and Evaluating Prevention Activities with the Active Involvement of Youth and the Community

    This resource contains tools to help you plan, implement, monitor, and evaluate prevention activities that are effective and that involve youth and the community at each stage of the project. Noting that “Ideally, you should think about monitoring and evaluation already while you are planning your prevention activities”, the page also provides general guidance on planning an effective prevention response to the substance abuse situation of a target group or community.

  • Youth Substance Abuse Prevention Programmes

    This handbook is for practitioners who want to improve the monitoring and evaluation of their programmes for the prevention of substance abuse among youth. It was prepared on the basis of the available literature and the experience of members of the Global Youth Network, and it discusses why monitoring and evaluation matter and how to go about them. The publication is currently available in Arabic, Chinese, English, French, Russian and Spanish.

    Chapter 1. Introduction
    Chapter 2. Why monitor and evaluate?
    Chapter 3. What are monitoring and evaluation?
    Chapter 4. What should be monitored and evaluated?
    Chapter 5. Who should be involved in the monitoring and evaluation?
    Chapter 6. A framework to plan monitoring and evaluation
    Chapter 7. Collecting the information
    Chapter 8. Analysing the data and using the information you have collected

7. Evaluation and Assessment of School Drug Education Programmes

This resource, also provided by UNODC, defines different kinds of evaluation methods, both formal and informal, that teachers or facilitators of drug education programmes can use to assess the quality of programme implementation and its effects on student knowledge, attitudes, and behaviour. It includes a checklist for evaluating skills-based drug education programmes.

8. How do we know we are making a difference? A community alcohol, tobacco, and drug indicators handbook

These resources from Join Together will help you develop an effective monitoring and evaluation plan for your drug-related programme. The material consists of a handbook and a website, designed to be both a standalone guide for users who do not have a print copy of the book and an up-to-date reference for those who do.

Both resources are divided into three sections:

  1. Building Your Program. This section covers everything from starting an indicator reporting program to creating an effective indicator report. We recommend that you begin here if you're in the early stages of planning an indicators program.
  2. Community Indicators. Here you will find a menu of alcohol, tobacco, and drug indicators with examples of what to measure, links to data sources, interpretation guidelines, and other resources.
  3. Community Stories. The third section features a growing list of examples of what other communities have accomplished using an indicator reporting program.

9. A Guide to Evaluating Prevention Programs

The International Center for Alcohol Policies (ICAP) has published an evaluation toolkit that includes a detailed evaluation guide and two case studies. A Guide to Evaluating Prevention Programs is intended for those who implement alcohol prevention programs, as well as for producers and trade associations in the beverage alcohol industry that have programs aimed at reducing alcohol-related harm. The guide provides a concise, introductory overview of the basic elements of an evaluation, including process-, outcome-, and impact-based evaluation designs. The two case studies describe how to apply the principles of evaluation to an awareness campaign around the use of designated drivers and to a school-based alcohol education program.

Click here for a copy of the guide and the two case studies.

10. Evaluation of the Drug Prevention Activities: Theory and Practice

This publication was developed by the Prevention Platform of the Pompidou Group (PG) of the Council of Europe. It is intended to assist "policymakers and their advisors in the decision-making process about the allocation of scarce resources for drug prevention." As the authors note, it is important for policy makers to understand the limitations of evaluating drug prevention interventions and to address the ways that evaluation can be made more effective. The work therefore goes to great lengths to educate its readership on the ways evaluation can be a cost-effective and useful tool. The manual consists of two main sections: Evaluation of the Drug Prevention Activities: Theory (by Alfred Uhl) and Evaluation of the Drug Prevention Activities: Practice (by Richard Ives).

Read the document online at the Pompidou website.

Additional Resources

Do you want more information? Here are additional resources about program evaluation.

A. EMCDDA

EMCDDA provides a number of helpful publications for those wanting to know more about evaluation.

Source: EMCDDA

B. Centers for Disease Control and Prevention (CDC)

The CDC also maintains an extensive online library that lists over 100 prevention program evaluation resources, with something for everyone and every need.

Source: Centers for Disease Control and Prevention

C. Council of Europe Pompidou Group

The Pompidou Group recently held a meeting on evaluation: “Evaluation of drug prevention: from dogma to useful tool”. Here are comments from its conclusions:

  1. Different approaches to prevention require different approaches to the evaluation of prevention. Drug prevention takes place in an environment of multiple and interlinked factors which influence outcomes in various ways. It takes place in many different settings and addresses a range of needs. The complexity and diversity of drug prevention defies simplistic summary. Different and varied prevention approaches are therefore appropriate.

  2. Drug prevention requires a comprehensive and long-term view. Evaluation of drug prevention therefore also needs to take this perspective. Drug prevention is a long-term investment. The ‘project-based’ approach to drug prevention has merit – for example, when novel approaches are being piloted, where transferability is being tested, or where resources are very limited or only available short-term. But the project-based approach has serious limitations.

  3. More synergy should be looked for in implementing and evaluating prevention – for example, with other social problems and risky behaviour. Drug prevention – and its evaluation – is best approached in partnership with practitioners, with ‘stakeholders’ and with communities.

  4. Communicating this complex and multifaceted picture to politicians, policy-makers and citizens is a necessary and urgent task.

  5. The international transferability of prevention activities is feasible and can be useful. However, there are issues in adoption and adaptation. Evaluation can assist in identifying the essential elements of prevention work that should be retained in any context, and those elements that can be adjusted to suit particular national, regional, local and sub-cultural experiences, traditions and contexts.

Source: Pompidou Group (Council of Europe)

D. Project Synthesis

A Mentor-funded project reviewed evidence-based prevention programs and identified elements that were characteristic of these blue-chip programs. Your program’s features can be matched against the features of known top-grade programs. Click here for a PDF copy of this article.

E. Mentor’s Scientific Advisory Network

Mentor’s own group of scientists can provide advice and support for designing, planning and implementing an evaluation of your program. Contact jeff [at] mentorfoundation [dot] org (Jeff Lee) for details.

Final Thoughts

This is not an exhaustive review of resources for evaluating your prevention work, but we hope it offers a good starting point for your evaluation work and needs. If you have other suggestions, experiences, references, or resources, please send a note to winte001 [at] umn [dot] edu (Ken Winters) or jeff [at] mentorfoundation [dot] org (Jeff Lee).

Prepared by Ken Winters, Jeff Lee, and Richard MacKenzie, Mentor Foundation, June 2010