Evaluation Workbook
Colorado Partners for Sustainable Change
Fifth in a six-part series

Part 1: Getting Started
Part 2: Needs Assessment
Part 3: Strategic Planning
Part 4: Implementation
Part 5: Evaluation
Part 6: Sustainability

Strategic Planning: Evaluation Workbook
Colorado Partners for Sustainable Change
*A 6-part series funded by CDHS, Division of Behavioral Health
http://www.cdhs.state.co.us/adad/prevention.htm
Evaluation
What Is Evaluation?
Evaluation is the systematic assessment of the design, implementation, and utility of a program,
practice, or policy. There are several types of evaluation, which occur at different stages of a project.
Three of the most common are process, fidelity, and outcome evaluation. Process evaluation
involves understanding how a program works by documenting program functioning. Fidelity
evaluation refers to assessing the extent to which a program is delivered as it was intended according to
research. Outcome evaluation is an assessment of the effectiveness of a program: does it work? This
workbook focuses primarily on outcome evaluation, but some resources related to process and
fidelity evaluation are included in Appendix C.
This workbook outlines the steps for developing an evaluation plan and includes templates, sample
plans, and helpful tips.
Communities engaging in evaluation should already have a coalition in place and have completed a
thorough needs assessment, created a strategic plan, and begun implementing programs and
strategies. See the previous four workbooks, ‘Getting Started,’ ‘Needs Assessment,’ ‘Strategic
Planning,’ and ‘Implementation,’ for guidance on these prior steps.
Why should you conduct an evaluation?
1. To monitor the effectiveness of your interventions, programs, and strategies.
2. To make data-driven decisions in choosing the strategies to continue implementing.
3. To inform and support fundraising efforts.
Using the Workbook
Each section of this Workbook is organized in the same format. You will see the following
headings in each section:
Actions: Provides an overview of work to consider, or goals to attain, when going through an
evaluation process.
Tools: These can be followed step by step or used as standalone resources for a specific topic
your existing coalition may need support with.
You might find it helpful to keep track of all your evaluation planning in one place. An Evaluation
Planning Table is included in Appendix A. It is suggested that you refer to this table throughout
the evaluation workbook.
Acknowledgements
"This document was prepared under Contract number IHM ADA0601085 with the Colorado Department of
Human Services (CDHS) Division of Behavioral Health. The publication was made possible by Grant
Number #6 U79 SP11181 from the HHS Center for Substance Abuse Prevention (CSAP), Substance Abuse
and Mental Health Services Administration (SAMHSA)'s Strategic Prevention Framework State Incentive
Cooperative Agreements Grant administered by the Community Prevention section of the CDHS Division of
Behavioral Health. Its contents are solely the responsibility of the authors and do not necessarily represent
the official views of the HHS Center for Substance Abuse Prevention (CSAP), Substance Abuse and Mental
Health Services Administration (SAMHSA) or the CDHS Division of Behavioral Health."
Workbook Sections
Strategy I: Understanding Evaluation
Actions:
1. Increase understanding about evaluation, including its benefits and value
2. Determine the purpose of your evaluation
3. Assess organizational readiness for evaluating outcomes
Tools:
 Assessing the Benefits of Evaluation
 Assessing Readiness for Evaluating Outcomes
Strategy II: Planning an Evaluation
Actions:
1. Determine the roles and responsibilities for your evaluation
2. Ensure that standards of utility, feasibility, propriety, and accuracy will inform the evaluation
planning process
3. Describe the strategy or strategies to be evaluated; participants, resources (inputs), activities
and desired outcomes (short term, mid-term and long term)
4. Identify key evaluation questions and indicators
5. Choose an evaluation design
6. Compile components into an evaluation plan
Tools:
 Evaluation Committee Action Plan
 Things to Consider When Planning an Evaluation
 Overview of Process, Fidelity, and Outcome Evaluation
 Key Evaluation Questions
 Ways to Make Your Evaluation More Culturally Sensitive
 Outcome Evaluation Designs
Strategy III: Collecting Data
Actions:
1. Revisit your research questions/evaluation questions/expected outcomes
2. Determine how you will know whether your expected outcomes have been achieved
3. Determine the type of data collection method(s)
4. Develop a data collection plan
Tools:
 Data Collection Methods
 Selecting a Data Collection Method
 Increasing Response Rate
 Culturally Sensitive Data Collection
Strategy IV: Analyzing and Interpreting Evaluation Data
Actions:
1. Review available data
2. Prepare and focus the data
3. Analyze the data
4. Interpret the data and identify limitations
Tools:
 Data Analysis Glossary
 Steps for Analyzing Quantitative Data
 Basic Quantitative Data Analysis Techniques
 Tips for Analyzing Qualitative Data
Strategy V: Presenting and Utilizing Evaluation Data
Actions:
1. Determine the audience(s)
2. Decide which information and findings to communicate
3. Choose a format or formats
4. Present the evaluation data
5. Reflect on the data to identify recommendations and next steps
Tools:
 Selecting a Data Presentation Format
 Characteristics of an Effective Graph
 Types of Communications and Reports
 Self-Assessment: How well are you using your evaluation findings?
 Questions to Ask Yourself When Interpreting and Utilizing Your Data
Strategy I:
Understanding Evaluation
Actions in this Section
1. Increase understanding about evaluation, including its benefits and value
2. Determine the purpose of your evaluation
3. Assess organizational readiness for evaluating outcomes
Tools in this Section
 Assessing the Benefits of Evaluation
 Assessing Readiness for Evaluating Outcomes
Tips for Success:
 Be ready to facilitate discussions involving different perspectives on the usefulness of
evaluation to help increase buy-in with evaluation.
 Remember that stakeholders and funders may have varying reasons or needs for conducting
an evaluation.
 Consider issues that may hinder or facilitate evaluation work.
 Look for local resources, such as college or graduate students, who could help with your
evaluation at low or no cost.
Assessing the Benefits of Evaluation
This tool can be useful for gaining a better understanding of your staff or coalition members’
attitudes towards evaluation. There is space to fill in your own items at the bottom.
What is the value of evaluation? For each item, circle Individual (I), Organization (O), or both;
then prioritize your top 10 (1 = highest priority).

1. Provides answers to your questions (I / O) ____
2. Helps you know what difference you are making (I / O) ____
3. Validates your successes (I / O) ____
4. Helps improve decision making (I / O) ____
5. Leads to improved programming (I / O) ____
6. Leads to better resource allocation (I / O) ____
7. Provides data to be able to communicate the value of your program (I / O) ____
8. Increases your understanding about the program and people being affected (I / O) ____
9. Fulfills accountability demands (I / O) ____
10. Satisfies supervisor, funder, organization (I / O) ____
11. Leads to continued support; attracts support (I / O) ____
12. Enhances the knowledge base (I / O) ____
13. Leads to publicity and recognition for you, the program, and your organization (I / O) ____
14. Contributes to staff improvement and development (I / O) ____
15. Facilitates communication with clientele (I / O) ____
16. Helps generate resources (I / O) ____
17. Facilitates creation of new partnerships (I / O) ____
18. Makes it possible to compare your work to others (I / O) ____
19. Is useful in performance appraisal (I / O) ____
20. Gives us reason to be satisfied, as appropriate (I / O) ____
21. Gives us reason and opportunity to celebrate (I / O) ____
22. Provides a way to demonstrate scholarship (I / O) ____
23. Provides opportunity to publish (I / O) ____
24. ______________________________________ (I / O) ____
25. ______________________________________ (I / O) ____
Assessing Your Organization’s Readiness for Evaluating Outcomes
This tool is designed to help organizations consider some of the factors involved in successfully
evaluating outcomes. It is not necessary that all factors be in place before starting to evaluate, but a
supportive and conducive environment is important. There is no right or wrong answer. Mark the
rating that best describes how you feel about each statement.
Rate each characteristic: Don’t Know / Strongly Disagree / Disagree / Neither Agree nor
Disagree / Agree / Strongly Agree

 We have a clearly defined and commonly understood mission statement, vision, and
values/beliefs.
 We have identified priorities that are reflected in our goals.
 The leaders in our organization are committed to results-based management and measuring
outcomes.
 There is general commitment to evaluation throughout the organization.
 Human, fiscal, and computer resources are available for planning and implementing
outcome measurement.
 There is a plan and timeline for our outcome evaluation process.
 We have a common evaluation language/framework in the organization.
 Stakeholders/funders are expecting our organization to report outcomes.
 Staff are interested in evaluating outcomes.
 Staff have skills to conduct evaluation.
 Our organization supports professional development.
 Evaluation is rewarded in our organization.
 Evaluation processes, data, and findings are valued.
 Evaluation data are used (will be used) within the organization to improve programs, guide
resource allocations, and support planning.
 Evaluation data are used (will be used) outside the organization to enhance public image,
increase funding, and share promising practices.
Strategy II:
Planning an Evaluation
Actions in this Section
1. Determine the roles and responsibilities for your evaluation
2. Ensure that standards of utility, feasibility, propriety, and accuracy will inform the evaluation
planning process
3. Describe the strategy or strategies to be evaluated; participants, resources (inputs), activities
and desired outcomes (short term, mid-term and long term)
4. Identify key evaluation questions and indicators
5. Choose an evaluation design
6. Compile components into an evaluation plan
Tools in this Section
 Evaluation Committee Action Plan
 Things to Consider When Planning an Evaluation
 Overview of Process, Fidelity, and Outcome Evaluation
 Key Evaluation Questions
 Ways to Make Your Evaluation More Culturally Sensitive
 Outcome Evaluation Designs
Tips for Success:
 Include strategies to protect human subjects and ensure cultural competency in your
evaluation plan.
 Creating an evaluation plan is often a group effort.
 Engaging stakeholders is one of the first steps to creating a successful evaluation.
 A logic model can be useful for describing the program to be evaluated.
 Multiple methods of data collection and having a control or comparison group strengthen an
evaluation design.
Evaluation Committee Action Plan
Use or adapt this chart to create an action plan for your evaluation committee.
Task | Who will do it | Timeline | Resources needed | Progress made
Things to Consider When Planning an Evaluation

Use this checklist to determine how your evaluation plan integrates key evaluation standards.
Rate each standard as Excellent, Okay, or Needs Improvement, and add comments.

UTILITY – How useful is your evaluation to you and your audiences?
 Purpose is clearly stated
 Users of the evaluation and their information needs are considered – the evaluation will
serve the information needs of intended users
 There is a plan for sharing the evaluation
 The evaluation provides useful information that will be valued

FEASIBILITY – How practical is your evaluation?
 The evaluation can be implemented given your resources, situation, and audience
 Nothing in the political climate is evident that will affect implementation or use of the
evaluation results
 The value of the evaluation outweighs the costs of conducting it

PROPRIETY – How appropriate is the evaluation for those who are involved?
 The evaluation demonstrates respect for people and their rights
 There is a plan to properly communicate findings to all involved or affected
 Both strengths and weaknesses are examined
 The evaluation is fair and ethical

ACCURACY – How accurate is the evaluation and the information it conveys?
 The program is clearly described (logic model)
 The evaluation “fits” the program; your evaluation procedures are clear so that anyone who
wants to copy your evaluation or determine its adequacy can do so
Overview of Process, Fidelity, and Outcome Evaluations
Process-based evaluations are designed to understand how a program, policy or practice is
implemented and how it produces results. These evaluations are useful in understanding
program change, addressing problems with the program, becoming more efficient in delivering
programs, and for accurately detailing how a program works so that it can be understood by
people outside the organization and replicated elsewhere.
Sample questions/outputs:
o How many meetings were held?
o How many people attended the training session?
o How many youth received the program?
Fidelity evaluation is an assessment of how well a program is being implemented in comparison
with the original design of the program. Evidence-based programs are developed and tested over
time based on scientific theory to build the program components. Properly implementing the
program components – that is, implementing them with fidelity - is expected to lead to program
outcomes. Deviations from, or dilution of, the program components as they were developed
could have unintended or even negative consequences for program outcomes.
Implementation fidelity questions:
o Did the program instructor have adequate training?
o Was the entire curriculum taught?
o Did each session of the program last as long as it was supposed to?
Outcomes result from program participation, and an outcome evaluation focuses on determining
whether the desired benefits have occurred. Outcomes often pertain to enhanced learning
(knowledge, attitudes, or skills) or improved conditions, such as increased literacy, self-reliance,
and so on. Outcomes are often confused with program outputs or units of services, such as the
number of clients who went through a program or the number of parents who viewed a media
campaign.
Outcome examples:
o Youth have a less favorable attitude toward substance use
o Fewer parents supply alcohol for their kids
o Youth get better math grades
o Fewer DUI arrests
Key Evaluation Questions
About outcomes/impacts
 What do people do differently as a result of the program?
 Who benefits from the program and how?
 Are participants satisfied with what they gain from the program?
 Are the program’s accomplishments worth the resources invested?
 What do people learn, gain, accomplish?
 What are the social, economic, environmental impacts (positive and negative) on people,
communities, and the environment?
 What are the strengths and weaknesses of the program?
 Which activities contribute most? Least?
 What, if any, are unintended secondary or negative effects?
 How well does the program respond to the initiating need?
 How efficiently are everyone’s resources being used?
About program implementation
 What does the program consist of – activities, events?
 What delivery methods are used?
 Who actually carries out the program and how well do they do so?
 Who participates in which activities? Does everyone have equal access?
 What resources and inputs are invested?
 How many volunteers are involved and what roles do they play?
 Are the financial and staff resources adequate?
About program context
 How well does the program fit in the local setting? With educational needs and learning styles of
target audiences?
 What in the program’s context inhibits or contributes to program success?
 What in the setting are givens and what can be changed?
 Who else works on similar concerns? Is there duplication?
About program need
 What are the characteristics of the target population?
 What assets in the local context and among target groups can be built upon?
 What are current practices?
 What changes do people see as possible or important?
Source: Taylor-Powell, E., Steele, S., & Douglah, M. (1996). Planning a program evaluation (G3658-1). Page 5.
http://learningstore.uwex.edu/pdf/G3658-1.PDF
Ways to Make Your Evaluation More Culturally Sensitive
1. Examine your own biases and attitudes and their probable origins.
2. During the early stages of planning the evaluation, take time to explore the cultures and any
cultural issues that might affect your evaluation.
3. Educate yourself about the cultural groups involved in your program and/or evaluation.
4. Listen to people tell their stories, ask questions, read, and learn. Consider an activity where
people bring in crafts/recipes/artifacts from their cultures to share, and share your own.
5. Engage members of the cultural groups to participate in the design and implementation of
the evaluation or in an evaluation advisory group.
6. Incorporate diverse perspectives and opinions.
7. Be flexible in your choice of evaluation design and data collection methods.
8. Use multiple sources of information and data collection methods.
9. Remember that evaluation may take more time if you need to build trusting relationships
with new groups.
10. Reject the myth of color blindness. Everyone is touched by race. It shapes how others see
you and how you see yourself.
11. Recognize that the culture you belong to – your own identity group – affects your
perspectives and behavior. Culture is who you are. This is true for everyone you meet,
including program participants and stakeholders.
12. Work with others who differ in race, ethnicity, orientation, abilities, etc. in order to broaden
and develop your own perspectives.
13. Engage in self-reflective thinking and writing to better understand your own culture in order
to better understand others.
14. Don’t assume that one way, or your way, is better.
15. Always be respectful.
16. Avoid jargon and exclusive language and behaviors.
17. Demystify evaluation.
18. Other strategies:
________________________________________________________________________
Adapted from Preskill, H., & Russ-Eft, D. (2005). Activity 12: Cultural sensitivity in evaluation. In Building evaluation
capacity: 72 activities for teaching and training (p. 66). Thousand Oaks, CA: Sage Publications, Inc. and Cowles, T.
(2005, Summer). Beyond basic training: 10 strategies for enhancing multicultural competency in evaluation [Electronic
version]. The Evaluation Exchange, Vol. XI, No. 2. Retrieved August 4, 2008 from
http://www.hfrp.org/evaluation/the-evaluation-exchange/issue-archive/evaluation-methodology/ten-strategies-for-enhancing-multicultural-competency-in-evaluation.
Outcome Evaluation Designs

The following is a list of the different types of outcome evaluation designs to choose from.

Post-test only
 Description: Evaluation is done after the strategy is already completed; for example, an
end-of-session questionnaire.
 Advantages: Useful when time is an issue or participants are not available before the
strategy begins.
 Challenges: Difficult to determine the magnitude of the outcome or whether the outcomes
are due to the strategy or to some other cause.

Retrospective pre- and post-test
 Description: Participants are asked to recall or reflect on their situation, knowledge,
attitude, behavior, etc. prior to the strategy.
 Advantages: Useful when you want comparison information but a true pre-test is
impossible to administer.
 Challenges: Participants may find it difficult to remember how they thought or behaved
prior to the strategy.

Pre- and post-test
 Description: Information is gathered both before the strategy takes place and once again
afterwards.
 Advantages: Relatively simple to implement; controls for participants' prior
knowledge/attitudes/skills/intentions; provides better evidence of the effectiveness of the
strategy than the prior designs.
 Challenges: Cannot account for non-strategy influences on outcomes; when self-reporting
is used rather than objective measures, post-test scores may be lower than pre-test scores if
participants overestimate their knowledge/attitudes/skills on the pre-test but assess them
accurately on the post-test.

Pre- and post-test with a comparison group
 Description: Data are collected before the strategy from two groups. One group
participates in the strategy and the other does not. Data are collected from both groups
once the strategy has ended.
 Advantages: Provides the most assurance that outcomes are actually the result of your
strategy; allows you to more accurately assess how much of an effect the strategy has.
 Challenges: Can demand more time and resources and requires access to at least two
similar groups.

Pre- and post-test with follow-up
 Description: Data are collected before the strategy begins, at the end of the strategy, and
again at some point in the future.
 Advantages: Allows you to see if the strategy has lasting effects; can provide valuable
information about long-term impacts.
 Challenges: Tracking and contacting participants demands time and resources; cannot
account for non-strategy influences unless a comparison group is also tracked.

Time series
 Description: Data are taken at intervals before the strategy begins and after it ends.
 Advantages: Allows you to track participants' progress as they move through the strategy.
 Challenges: Best suited for longer strategies; doesn't account for non-strategy influences.
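To make the comparison-group logic concrete, here is a minimal Python sketch (not from the
workbook; the group names and scores are hypothetical placeholders) of comparing each
group's pre-to-post change:

```python
# Minimal sketch: pre/post change in a program group vs. a comparison group.
# Group names and scores below are hypothetical placeholders.

def mean(scores):
    """Average of a list of numeric scores."""
    return sum(scores) / len(scores)

# Hypothetical attitude scores on a 1-5 scale
program_pre = [2.8, 3.0, 2.5, 3.2]
program_post = [3.9, 4.1, 3.6, 4.0]
comparison_pre = [2.9, 3.1, 2.6, 3.0]
comparison_post = [3.0, 3.2, 2.7, 3.1]

program_change = mean(program_post) - mean(program_pre)
comparison_change = mean(comparison_post) - mean(comparison_pre)

# The comparison group's change estimates non-strategy influences, so the
# difference between the two changes estimates the strategy's own effect.
print(f"Program group change:      {program_change:+.2f}")
print(f"Comparison group change:   {comparison_change:+.2f}")
print(f"Estimated strategy effect: {program_change - comparison_change:+.2f}")
```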
Strategy III:
Collecting Data
Actions in this Section
1. Revisit your research questions/evaluation questions/expected outcomes
2. Determine how you will know whether your expected outcomes have been achieved
3. Determine the type of data collection method(s) you will use
4. Develop a data collection plan
Tools in this Section
 Data Collection Methods
 Selecting a Data Collection Method
 Increasing Response Rate
 Culturally Sensitive Data Collection
Tips for Success:
 Using more than one type of data collection method can provide a more complete picture.
 Consider the overall advantages and disadvantages of each data collection method.
 Determine if you need to obtain permission to collect data (e.g., from the participant, school,
parent).
 Ensure individuals who are administering data collection tools receive training on how to do
so and periodically check to see that they continue to follow established procedures.
 When possible, use a measurement tool that has been developed and validated by previous
research rather than designing your own questions/surveys.
 Consider whether your data collection method is appropriate for what you are assessing in
terms of “level” (i.e., using community level data to assess program change is inappropriate;
data should be collected from program participants).
Data Collection Methods
Data collection methods are typically categorized as either quantitative or qualitative.
Quantitative methods have more structured responses that are typically easy to aggregate and
analyze using statistical techniques. Examples include surveys/questionnaires and existing data (e.g.,
Census data).
Qualitative methods allow for greater variability and detail in responses. Examples include
interviews, focus groups, and case studies.
Which data collection method to use for your evaluation depends on what you want to know, the
type of data you need, and your available resources. The table below can help you decide which data
collection method(s) to use.
Questionnaires, Surveys, Checklists
 Purpose: When you need to quickly and/or easily get lots of information from people in a
non-threatening way.
 Advantages: can be completed anonymously; inexpensive to administer; easy to compare
and analyze; can be administered to many people; can gather lots of data; many sample
questionnaires already exist.
 Challenges: might not get careful feedback; wording can bias the client's responses;
impersonal; in surveys, may need a sampling expert; doesn't get the full story.

Interviews
 Purpose: When you want to fully understand someone's impressions or experiences, or
learn more about their answers to questionnaires.
 Advantages: get full range and depth of information; develops a relationship with the
client; can be flexible with the client.
 Challenges: can take much time; can be hard to analyze and compare; can be costly; the
interviewer can bias the client's responses.

Documentation review
 Purpose: When you want an impression of how the program operates without
interrupting the program; comes from a review of applications, finances, memos, minutes, etc.
 Advantages: get comprehensive and historical information; doesn't interrupt the program
or the client's routine in the program; information already exists; few biases about information.
 Challenges: often takes much time; information may be incomplete; need to be quite clear
about what you are looking for; not a flexible means to get data; data restricted to what
already exists.

Observation
 Purpose: To gather accurate information about how a program actually operates,
particularly about processes.
 Advantages: view operations of a program as they are actually occurring; can adapt to
events as they occur.
 Challenges: can be difficult to interpret observed behaviors; can be complex to categorize
observations; can influence behaviors of program participants; can be expensive.

Focus groups
 Purpose: Explore a topic in depth through group discussion, e.g., about reactions to an
experience or suggestion, understanding common complaints, etc.; useful in evaluation and
marketing.
 Advantages: quickly and reliably get common impressions; can be an efficient way to get
much range and depth of information in a short time; can convey key information about
programs.
 Challenges: can be hard to analyze responses; need a good facilitator for safety and closure;
difficult to schedule 6-8 people together.

Case studies
 Purpose: To fully understand or depict a client's experiences in a program, and conduct a
comprehensive examination through cross-comparison of cases.
 Advantages: fully depicts the client's experience in program input, process, and results;
powerful means to portray the program to outsiders.
 Challenges: usually quite time consuming to collect, organize, and describe; represents
depth of information rather than breadth.
McNamara, C. (1997-2008). Overview of methods to collect information. In Basic guide to program evaluation. Minneapolis,
MN: Free Management Library. http://www.managementhelp.org/evaluatn/fnl_eval.htm#anchor1585345
Things to Consider When Selecting a Data Collection Method

Use this checklist to rate each data collection standard as Excellent, Okay, or Needs
Improvement, and add comments.

UTILITY – How useful is your data collection method?
 Will the data sources and collection methods serve the information needs of your primary
users?
 Are your sources of information clear?
 Are your sources of information appropriate?

FEASIBILITY – How practical is your data collection method?
 Are your sources and methods practical and efficient?
 Do you have the capacity, time, and resources?
 Are your methods non-intrusive and non-disruptive?

PROPRIETY – How appropriate is your data collection method for your participants?
 Are your methods respectful, legal, ethical, and appropriate?
 Does your approach protect and respect the welfare of all those involved or affected?

ACCURACY – Are your data collection methods technically adequate?
 Does your method adequately answer your questions?
 Does your method measure what you intend to measure?
 Does your method reveal credible and trustworthy information?
 Does your method convey important information?
 Are your data collected in a consistent and quality manner?
Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity in evaluating
outcomes: A teaching and facilitating resource for community-based programs and organizations. Madison, WI: UW
Extension, Program Development and Evaluation.
Ways to Increase Your Response Rate

The response rate is the proportion of people who participated. It is calculated by dividing the
number of returned surveys by the total number of surveys distributed. The higher your response
rate, the more likely your results are representative of the entire group you surveyed (i.e., the less
likely your results are biased by the way certain individuals responded). A response of 60% is
considered an acceptable return rate for survey research.
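As a quick illustration, here is a minimal Python sketch of the response-rate arithmetic described
above; the survey counts are hypothetical placeholders:

```python
# Minimal sketch of the response-rate arithmetic; counts are hypothetical.
surveys_distributed = 250
usable_returned = 162

response_rate = usable_returned / surveys_distributed
print(f"Response rate: {response_rate:.0%}")  # 65%

# 60% is the acceptable benchmark cited above for survey research.
if response_rate >= 0.60:
    print("Meets the 60% benchmark.")
```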
 Over-sample overall and from any specific target populations you may have to help ensure
you receive enough responses.
 Ensure that respondents see the value of participating.
 Use a combination of methods.
 Contact participants in advance.
 Make (multiple) follow-up contacts.
 Provide incentives.
 Provide 1st class postage/return postage.
 Set return deadlines.
 Make the survey easy to complete.
 Offer to send participants the results so they know their information will be used.
Offer to send participants the results so they know their information will be used.
*Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity in evaluating
outcomes: A teaching and facilitating resource for community-based programs and organizations. Madison, WI: UW
Extension, Program Development and Evaluation.
Things to consider to help ensure your data collection method is culturally sensitive
Existing data/records
 Need careful translation of documents in another language
 May have been written/compiled using unknown standards or levels of aggregation
 May be difficult to get authorization to use
 Difficult to correct document errors if low literacy level
Survey:
 Literacy level
 Tradition of reading, writing
 Setting
 Not best choice for people with oral tradition
 Translation (more than just literal translation)
 How cultural traits affect response – response sets
 How to sequence the questions
 Pretest questionnaire may be viewed as intrusive
Interview:
 Preferred by people with an oral culture
 Language level proficiency; verbal skill proficiency
 Politeness – responding to authority (thinking it’s unacceptable to say “no”), nodding,
smiling, agreeing
 Need to have someone present to conduct the interview
 Relationship/position of interviewer
 May be seen as interrogation
 Direct questioning may be seen as impolite, threatening, or confrontational
Focus Group:
 Issues of gender, age, class, clan differences
 Issues of pride, privacy, self-sufficiency, and traditions
 Relationship to facilitator as prerequisite to rapport
 Same considerations as for interview
Observation:
 Discomfort, threat of being observed
 Issue of being an “outsider”
 Observer effect
 Possibilities for misinterpretations
*Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity in evaluating
outcomes: A teaching and facilitating resource for community-based programs and organizations. Madison, WI: UW
Extension, Program Development and Evaluation.
Strategy IV:
Analyzing and Interpreting Evaluation Data
Actions in this Section
1. Review available data
2. Prepare and focus the data
3. Analyze the data
4. Interpret the data and identify limitations
Tools in this Section
 Data Analysis Glossary
 Steps for Analyzing Quantitative Data
 Basic Quantitative Data Analysis Techniques
 Tips for Analyzing Qualitative Data
Tips for Success:
 Determine your evaluation questions before analyzing any data so that you know what you
want to be able to say with the data.
 Get input from all stakeholders, especially in regard to determining evaluation questions and
interpreting results.
 Simple analyses may be the best analyses for meeting your evaluation objectives.
 Consider hiring a data analyst, but do not be afraid to conduct basic analyses.
 Use the data to offer solutions as well as highlight problems.
 Remember that results do not speak for themselves. For example, what does it mean that
45% of the respondents reported that underage drinking is not a problem in their
community? Is this percentage greater or less than last year? High or low for the county?
 Because of the reliance on interpretation, analyzing qualitative data is best done by more
than one person, with open discussion after each has read the same transcripts or notes.
Data Analysis Glossary

Analysis: The process of separating a whole into its parts to better understand it.
Cleaning data: The process of checking all data and excluding from analysis any forms or
individual responses that are incomplete or do not make sense.
Code: A number, symbol, or label given to a piece of data in order to abbreviate it.
Codebook: A record of terms and decisions that provides instruction for data entry.
Coding: The process of giving a “code” to each response or piece of information to enable
analysis.
Content analysis: A process for organizing and analyzing open-ended, unstructured
information (qualitative data).
Cross tabulation (cross tab): Shows the distribution of two or more variables simultaneously;
usually presented in a contingency table.
Data: Information that may be quantitative or qualitative in nature.
Database: A comprehensive collection of related data organized for convenient access,
generally in a computer program.
Dependent variable: Aspects, such as knowledge, attitude, and behavior, that are expected to
change as a result of the program (intervention).
Descriptive statistics: A branch of statistics in which the analyses “describe” the raw data, such
as counts, percentages, measures of central tendency, and measures of variability (e.g., range,
standard deviation, and variance).
Frequency: A count of how often a particular response occurs; the number of times something
occurs.
Independent variable: Aspects, such as the program or activities, that you deliver/control.
Instrument: The questionnaire, form, checklist, or device that captures or collects information.
Interpretation: The process of making sense of or bringing meaning to the analyzed data.
Mean: Average; obtained by adding all the answers or scores and dividing by the total number.
Measure: (verb) To ascertain the quantity or quality of something; (noun) the instrument used
to estimate or appraise something.
Measures of central tendency: Analyses that characterize what is typical for the group,
including means, modes, and medians.
Median: The middle value or midpoint of responses.
Mode: The most commonly occurring value or answer.
Percent distribution: The proportion of respondents selecting each response.
Percentage: A part of a whole expressed in hundredths; a commonly used statistic that
expresses information as a proportion of a whole.
Population: The total group of interest (people, businesses, locations, etc.), from which a
sample is drawn or about which a conclusion is stated. In survey research, the “population”
refers to any group of people or organizations you are studying. You can have a “population”
of worksites, restaurants, associations, churches, or schools. But you can also have a
“population” of older adults, teens, or pregnant women.
Pre-post tests: An instrument administered before and after the intervention as a means for
documenting change over that period of time.
Qualitative data: Data that consist of words and observations, not numbers.
Quantitative data: Data in the form of numbers, or numeric data.
Raw data: Data as collected, before they are processed and analyzed.
Response: The individual answer to a given question.
Response rate: The number of responses returned divided by the total number solicited.
Sample: The subgroup or subset of the larger group, or population.
Sampling: The process of selecting members from the larger population to meet the purpose
of the study.
Spreadsheet: Computer software that allows data to be arranged in a grid for easy entry and
basic analysis.
Statistics: A branch of mathematics that involves organizing and interpreting numerical
information.
Unique identifier: A number assigned to individuals or questionnaires to identify and track
each one.
Variable: A characteristic that is measurable.
*From University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity in
evaluating outcomes: A teaching and facilitating resource for community-based programs and
organizations. Madison, WI: UW Extension, Program Development and Evaluation.
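As a small illustration of several glossary terms (frequency, mean, median, mode), here is a
minimal Python sketch using only the standard library; the rating-scale responses are
hypothetical placeholders:

```python
# Minimal sketch illustrating glossary terms with hypothetical 1-5 ratings.
from collections import Counter
from statistics import mean, median, mode

responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

print("Frequency:", Counter(responses))  # count of each response value
print("Mean:", mean(responses))          # average
print("Median:", median(responses))      # middle value
print("Mode:", mode(responses))          # most commonly occurring value
```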
Steps for Analyzing Quantitative Data

Organize
 Assign unique identification numbers (IDs) to each questionnaire – use the same number
of digits for each. Check each questionnaire for completeness. Remove questionnaires
that are substantially incomplete or do not make sense.
 Calculate the response rate. How many questionnaires did you send out? How many were
returned? How many of the returned surveys are complete and usable?
Response rate = number of usable surveys returned / number of surveys mailed
 Assign codes as needed (e.g., No = 0, Yes = 1; 999 = missing, 888 = not applicable) and
create a codebook of those codes.
 Record the data from each question, keeping written track of any decisions you make
about the data. You will want a record so you can treat additional data, such as a post-test,
the same way and so that people new to the project can enter and analyze data the same
way as before.
 Keep the questionnaires organized (numbered) and enter the data systematically. You
may want to note on the questionnaire who entered it in case there are problems or
questions later.
 Check for data accuracy. Look for values that are higher than the options, for
inconsistent responses, and for data entry errors.

Analyze
 Determine frequency, percentages, and/or other analyses.
 Consider analyzing separately by characteristics of your sample population, such as age,
gender, or ethnicity.
 Create tables, charts, and/or other data displays to show the data.
 Delve deeper to see findings across participants and program characteristics.

Interpret
 Look at the data for patterns, high numbers, low numbers, and expected/unexpected
results.
 Bring meaning to the numbers, percentages, words, and comments. What information
needs attention?
 Highlight key points and lessons learned.
 Seek explanations for any external and internal factors that may have affected the results.

Identify limitations
 Identify any limitations, such as useful information that was not collected, biases in the
respondents’ answers, or low return rates. Think about how these limitations might affect
your results and what you may be able to do about them in the future.
*Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building
capacity in evaluating outcomes: A teaching and facilitating resource for community-based
programs and organizations. Madison, WI: UW Extension, Program Development and
Evaluation.
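As an illustration of the Organize and Analyze steps above, here is a minimal Python sketch;
the field names, codes, and forms are hypothetical placeholders. It cleans incomplete forms,
assigns IDs, codes responses against a codebook, and tabulates frequencies and percentages:

```python
# Minimal sketch of organizing and analyzing questionnaire data.
# Field names, codes, and forms below are hypothetical placeholders.
from collections import Counter

raw_forms = [
    {"q1": "Yes", "q2": "No"},
    {"q1": "No",  "q2": None},   # incomplete: excluded during cleaning
    {"q1": "Yes", "q2": "Yes"},
]
codebook = {"No": 0, "Yes": 1}   # record coding decisions in a codebook

# Clean: keep only complete forms; assign a unique identifier to each
clean = [dict(form, id=i + 1) for i, form in enumerate(raw_forms)
         if all(v is not None for v in form.values())]

# Code the q1 responses, then tabulate frequencies and percentages
q1_codes = [codebook[form["q1"]] for form in clean]
freq = Counter(q1_codes)
for code, count in sorted(freq.items()):
    print(f"q1 = {code}: n = {count} ({count / len(q1_codes):.0%})")
```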
Basic Quantitative Data Analysis Techniques
Quantitative analysis involves using numerical data to make calculations and draw conclusions in terms of
percentages, proportions, and other values.
Here are a few simple steps to begin your analysis of quantitative data.
1. Describe the responding persons, organizations, or communities using frequencies for each
demographic or other descriptive item of information.
 Give the total number (n=) for each descriptive item. This is important, especially if some
people didn’t answer all the questions and the numbers differ from question to question.
 Consider reporting the range for each descriptive item (e.g., the youngest participant was
“18 years old” and the oldest was “67 years old”)
 You may want to use percentages as well as counts.
 If applicable, describe the participants and the comparison group separately.
2. For questions whose answers are reported as rating scales or rankings (e.g., strongly disagree = 1,
strongly agree = 5) consider computing a mean, or average, for each question.
Example: Suppose 40 people responded to an Attitude item. Let’s say that one
person responded “1” (strongly disagree), three people responded “2” (disagree),
eight people responded “3” (neither agree nor disagree), 17 people responded “4”
(agree), and 11 people responded “5” (strongly agree). The mean score would be:
((1 × 1) + (3 × 2) + (8 × 3) + (17 × 4) + (11 × 5)) / 40 = 154 / 40 = 3.85
Thus, on average, people’s attitudes were closest to the response “agree”.
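Here is a minimal Python sketch of the same calculation, with the 40 responses tallied by value:

```python
# Minimal sketch of the mean calculation above: 40 responses on a 1-5 scale.
counts = {1: 1, 2: 3, 3: 8, 4: 17, 5: 11}  # response value -> number of people

n = sum(counts.values())                                       # 40
total = sum(value * count for value, count in counts.items())  # 154
print(f"Mean = {total}/{n} = {total / n:.2f}")                 # 3.85
```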
3. If a series of questions go together (e.g., a set of attitude questions), consider making a scale out
of these items and computing one score for the entire scale.
Example: Suppose there are five attitude questions, each with possible responses of
“1” (strongly disagree), “2” (disagree), “3” (neither agree nor disagree), “4” (agree),
and “5” (strongly agree).
If a person’s responses to the questions were: agree, strongly agree, disagree, agree,
and neither agree nor disagree, then the total score on the attitude scale would be:
4+5+2+4+3 = 18.
This could also be expressed as an average response: 18/5 questions = 3.6.
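A minimal Python sketch of the same scale-score arithmetic:

```python
# Minimal sketch of the scale-score example: one person's coded responses
# to five attitude items.
responses = [4, 5, 2, 4, 3]  # agree, strongly agree, disagree, agree, neither

total_score = sum(responses)            # 18
average = total_score / len(responses)  # 3.6
print(f"Scale score: {total_score}; average response: {average}")
```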
4. For outputs or outcomes expressed in categories (e.g., attended/did not attend; drink/used to
drink/ never drank), consider calculating proportions, rates, or ratios.
Proportion: A proportion is a part of the whole.
Example: Suppose 18 legislators of the 25 who were visited by members of your
coalition voted for the social hosting legislation. The proportion would be 18/25 = .72.
Expressed as a percentage, this would equal: .72 x 100 = 72%.
Rate: A rate is a special type of proportion. It has a specific time period associated with it,
and it is expressed in standard units (e.g., per 100, per 1,000, per 100,000).
Ratio: A ratio is a mathematical way to compare two numbers. If we want to compare a to b,
we calculate the ratio by dividing a by b.
Example: Continuing the example above, the ratio of legislators who supported the
legislation (18) to those who did not (25-18 = 7) is 18/7, or 2.6, usually reported 2.6 to 1.
This means that for every legislator who did not support the legislation, there were 2.6
who did.
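A minimal Python sketch of the proportion and ratio arithmetic from the legislator example:

```python
# Minimal sketch of the legislator example: 18 of 25 visited legislators
# voted for the legislation.
supported, visited = 18, 25
opposed = visited - supported                    # 7

proportion = supported / visited                 # 0.72
print(f"Proportion: {proportion:.2f} ({proportion:.0%})")
print(f"Ratio: {supported / opposed:.1f} to 1")  # 2.6 to 1
```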
5. If your evaluation questions ask about the change in an output or outcome, subtract the value at
the beginning of the program from the value at the end of the program. For example, if you
want to know how much the Attitude score increased, subtract the Attitude score at pretest from
the Attitude score at posttest.
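A minimal Python sketch of a change-score calculation; the pre- and post-test values are
hypothetical placeholders:

```python
# Minimal sketch of a pre-to-post change score; values are hypothetical.
attitude_pretest = 3.10
attitude_posttest = 3.85

change = attitude_posttest - attitude_pretest
print(f"Attitude score increased by {change:.2f} points")  # 0.75
```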
*Adapted from: http://www.ttac.org/power-of-proof/interp_data/analyze/quantitative.html
Tips for Analyzing Qualitative Data

Organize
 Organize all your narrative data in one place. Sometimes you may have narrative data
from different interviews done at different times, various observations, or different
open-ended questions on a survey questionnaire.

Familiarize
 Read through and get to know your data. Spend time reading through the data and
thinking about the data.

Categorize
 Decide whether you will use preset categories or emergent categories. There is no single
correct way to categorize qualitative data, but consider the alternative approaches and pick
the way that best suits your purpose and data.
 Make the analysis suit the use. Sometimes it is easy to become so immersed in your data
that it is hard to see the forest for the trees. There is so much that is interesting and
insightful. You may want to include everything. You may want to share it all. However,
remember the end user. Seldom do others want to read pages of “rich” description. What
will that user really want to learn from these data? Think about a 3-minute summary of
your 90-minute focus group interview.

Interpret
 Interpretation is more than description. Once you’ve categorized and summarized the
data, think about the meaning. Keep the interpretation rooted in the raw data, but move
beyond just presenting and summarizing the data. Think about the significance of the
findings. An analysis is meant to understand common themes across individuals and
groups, which is why you attempt to get as representative a sample as possible. However,
there are times when you will find individual comments that do not fit within your theme
important or useful to include in the write-up. If this happens, qualify that it was not a
common theme but an important comment: “although it was not a common theme across
participants, one individual did note…”.
 Allow adequate time. Thoughtful and useful analysis takes time. Allocate time for doing
the analysis. Often qualitative data collection and analysis occur simultaneously, so
consider the time that is needed during data collection for reading and thinking about your
data. Analysis doesn’t just happen at the end.
*Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity in evaluating outcomes: A
teaching and facilitating resource for community-based programs and organizations. Madison, WI: UW Extension, Program
Development and Evaluation.
Strategy V:
Presenting and Utilizing Evaluation Data
Actions in this Section
1. Determine the audience(s)
2. Decide which information and findings to communicate
3. Choose a format or formats
4. Present the evaluation data
5. Reflect on the data to identify recommendations and next steps
Tools in this Section
 Selecting a Data Presentation Format
 Characteristics of an Effective Graph
 Types of Communications and Reports
 Self-Assessment: How well are you using your evaluation findings?
 Questions to Ask Yourself When Interpreting and Utilizing Your Data
Tips for Success:
 Present evaluation findings in a way that will clearly communicate your results and
demonstrate how they answer your evaluation questions.
 Consider all the different audiences that may be interested in your results: staff, program
managers, stakeholders, community members, etc.
 Tailor the message to the audience in simple, concise, appropriate language.
 Match the presentation format to the audience, such as a summary report for a board of
directors or a press release for the local newspaper.
 Use quotations, pictures, and graphics to highlight key findings.
 Share evaluation findings with the “right” people in a timely manner so that the evaluation
isn’t ignored.
Selecting a Data Presentation Format

QUALITATIVE DATA

Summarizing Text
 Purpose/best used when: Good for describing a general theme.
 Considerations: Are there quotations available to support the theme?
 Sample: “The teens cited several challenges to their commitment to nonsmoking including
living in homes that were not smoke-free, patronizing restaurants that permit smoking,
socializing with friends who allow smoking in their homes and going away to college.”

Respondent Quote
 Purpose/best used when: Best used as an example of a summary theme.
 Considerations: Remove or disguise any names. Quote only the parts that are relevant to
the theme being discussed.
 Sample: “Students agreed that the behavior of parents and peers influence their own
smoking. For example, a student described how he began smoking: ‘I was bored and
stressed out--my Dad smokes and I tried it. Three years later, I’m still going strong.’”

QUANTITATIVE DATA

Text
 Purpose/best used when: Good for conveying counts or other simple information.
 Considerations: No more than 3-4 numbers should be presented in a sentence.
 Sample: “With regard to gender, 48% of the sample was female and 52% male.”

Tables
 Purpose/best used when: Good for frequency distributions and for tabulations where two
categorical variables are presented in relationship; helpful to add percentages (down
columns or across rows) or cumulative percentages; show the sample size (n = number of
people); can present more than one variable.
 Considerations: A small table is better than a large graph; the more variables and numbers
it includes, the more confusing a table is to the audience; should be self-explanatory, with
title, columns, and rows labeled; generally better than graphs when there are a lot of
categories or numbers being presented.
 Samples: Frequency table, cross tabulation table, stratified table.

Graphs
 Purpose/best used when: Better than tables for showing trends, making broad
comparisons, and showing relationships.
 Samples: Arithmetic line graphs, scatter diagrams.

Charts
 Purpose/best used when: Better than tables for showing trends, making broad
comparisons, and showing relationships.
 Considerations: Best for lay audiences; especially good for conveying simple concepts;
good for showing differences between groups; can be used to present more than one
variable.
 Samples: Pie chart, pictograph, bar chart (single, grouped, stacked), histogram.

Source: http://www.ttac.org/power-of-proof/interp_data/select/index.html
Characteristics of an Effective Graph
• Keep it simple – only essential information
• Title – clear and succinct
• Clear units of measure
• Shows data clearly without distortion
• Simple, straightforward design without “clutter”
• Legible – font size 10 point or larger
• Includes sample size
• Acknowledges sources, if applicable
[Sample grouped bar chart: “Percent of Anytown, CO MS students (N = 256) who have used
substances in the past 30 days,” with pre-test and post-test bars for alcohol, binge drinking,
tobacco, marijuana, and other drugs; y-axis runs 0% to 35%.]
*Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity
in evaluating outcomes: A teaching and facilitating resource for community-based programs and
organizations. Madison, WI: UW Extension, Program Development and Evaluation.
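For those producing graphs programmatically, here is a minimal sketch using Python's
matplotlib (assuming it is installed) that follows the checklist above; the percentages are
illustrative placeholders, not figures from the workbook:

```python
# Minimal matplotlib sketch of a grouped bar chart like the sample above.
# The percentages are illustrative placeholders.
import matplotlib.pyplot as plt
import numpy as np

substances = ["Alcohol", "Binge\ndrinking", "Tobacco", "Marijuana", "Other drugs"]
pre = [30, 18, 22, 15, 8]    # hypothetical pre-test percentages
post = [24, 13, 17, 11, 6]   # hypothetical post-test percentages

x = np.arange(len(substances))
width = 0.35

fig, ax = plt.subplots()
ax.bar(x - width / 2, pre, width, label="pre-test")
ax.bar(x + width / 2, post, width, label="post-test")

# Effective-graph checklist: clear title, units, sample size, no clutter
ax.set_title("Percent of Anytown, CO MS students (N = 256)\n"
             "who have used substances in the past 30 days")
ax.set_ylabel("Percent")
ax.set_xticks(x)
ax.set_xticklabels(substances)
ax.legend()
plt.tight_layout()
plt.show()
```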
Types of Communications and Reports

Technical report: A detailed report of an evaluation’s methods and findings.
Executive summary: A few pages, usually located at the beginning of a longer report, that
outline a study’s major findings and recommendations.
Annual report: A focused report that compiles major achievements for all programs, either
written or presented orally.
Success story: A one- or two-page description of a program’s successes (progress and/or
achievements) used to promote a program and tell its story.
Impact statement; paragraph spotlight: A short paragraph that highlights impact — the
difference the program has made.
Elevator story: A quick, concise message that clearly communicates your story and can be
relayed in the time the elevator travels between the first and sixth floors!
News release; press conference: A brief, newsworthy piece or interaction to release specific
information.
Media appearance: A release of newsworthy information that usually includes a staged event,
such as a local sports star leading a walk-a-thon to raise awareness about a study on exercise
and health.
Public meeting: A gathering that is open to the public where evaluation findings are presented
in a clear, simple manner, usually with time set aside for open discussion.
Action planning; working session; staff workshop: An interactive discussion of findings,
usually with the intention of in-depth learning and setting next steps.
Memo; email; fax; postcard: A short message circulated among staff or a group, usually
focused on one specific point.
Personal discussion: A face-to-face interaction to discuss evaluation findings with an individual
or small group.
Brochure; newsletter; bulletin: A brief, simply-worded publication that can be distributed or
mailed to various outlets.
Published article: An article written for a particular journal with a target audience in mind.
Display/exhibit (photographs, 3-ring binder, poster, chart): A visual presentation of specific
information, with minimal use of written words.
Audio/video presentation; slide presentation: An electronic visual presentation that may
include audio and personal comments.
Adapted from:
Work Group for Community Health and Development, University of Kansas. (2007). Communicating information to funders for support and accountability (Section 4). In The Community Tool Box: Using evaluation to understand and improve the initiative (Chapter 39). Retrieved July 15, 2008, from http://ctb.ku.edu/tools/sub_section_main_1376.htm
Centers for Disease Control and Prevention. (2007). Impact and value: Telling your program’s story. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Division of Oral Health. Retrieved June 23, 2008, from http://www.cdc.gov/oralhealth/publications/library/pdf/success_story_workbook.pdf
How Well Are You Using Your Evaluation Findings?
For each item below, rate your use of evaluation findings as Excellent, Okay, Needs Improvement, or N/A, and note any comments.

How well are you using your evaluation findings to…
• educate decision makers, funders, and/or key stakeholders
• demonstrate accountability; satisfy public inquiries
• guide budget and resource allocations; gain resources
• show progress when planned outcomes aren’t expected until a distant future
• make people aware of the program, achievements, and/or needs (program visibility)
• guide program improvements
• promote the program and maintain commitment
• enhance public image
• recruit volunteers, participants, and/or partners
• identify effective practices
• share lessons learned
• celebrate accomplishments
*Adapted from University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity
in evaluating outcomes: A teaching and facilitating resource for community-based programs and
organizations. Madison, WI: UW Extension, Program Development and Evaluation.
Questions to Ask Yourself When Interpreting and Utilizing Your Data
Your evaluation results are a useful source of information about progress toward your intended outcomes. This information can help you determine how to maximize the effectiveness of your program.
1. How large was your sample (i.e., what is the number of matched pre- and post-tests, or how much data did you collect)? _________ (One way to count matched pairs is sketched at the end of this section.)
   a. If the number is lower than anticipated:
      i. Does it reflect challenges in reaching the number of people you planned to reach? If yes, how might you change or expand your recruitment process?
      ii. Does it reflect challenges with data collection and management (e.g., surveys were lost)? If yes, how might you modify your data collection and/or data management process?
   b. Were there other challenges encountered that could have affected the sample you obtained? If yes, what steps will you take in the future to address these challenges?
2. What do the overall results for each of your measures tell you about the progress towards
the outcomes you hope to achieve?
3. Are the results what you expected?
   a. If not, what factors might have affected the results (e.g., the population served was different from the target population, participant attrition, participant motivation, lost data, issues with program fidelity or the adeptness of the program facilitator, staff turnover)?
4. If you used any other methods to obtain data/information (e.g., interviews, focus groups),
what methods did you use? What were some of the major findings from that data?
5. Is there a need for new and/or additional measurement tools and/or evaluation
components in order to help you better assess the outcomes you hope to achieve? If yes,
please explain.
6. How do you plan to share your findings, and with whom do you plan to share them?
Applying What You Learned
Use the questions below to guide your thinking about the steps you will take to build on successes and to reduce challenges in implementing and evaluating your program.
7. How will you apply what you have learned?
8. What kind of support (e.g., evaluation and/or other assistance) might be useful to achieve
your desired outcomes?
a. Do you have the internal capacity for this? If not, do you need to consider expanding
your staff and/or contracting services?
*Developed by OMNI Institute in Denver, Colorado.
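As a practical aid for Question 1, the following is a minimal sketch of how matched pre/post pairs could be counted with Python’s pandas library. It assumes pre- and post-surveys saved as CSV files sharing a participant ID column; the file names (pretest.csv, posttest.csv) and the column name (participant_id) are hypothetical placeholders, not part of this workbook’s required tools.

import pandas as pd

# Hypothetical file and column names -- adjust to your own data
pre = pd.read_csv("pretest.csv")    # one row per participant
post = pd.read_csv("posttest.csv")

# Keep only participants who completed both surveys (inner join on ID)
matched = pre.merge(post, on="participant_id", suffixes=("_pre", "_post"))

print(f"Pre-tests collected:  {len(pre)}")
print(f"Post-tests collected: {len(post)}")
print(f"Matched pairs (your analysis sample): {len(matched)}")

# A large gap between surveys collected and pairs matched may signal
# recruitment, attrition, or data-management challenges (see 1a and 1b).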
Appendix A: Evaluation Planning Table
This table is intended to serve as a road map for your agency’s evaluation efforts. It can be used to track strategies and evaluation/reporting requirements across different funders. Please complete Sections I and II, using one line for each evaluation tool that you are using.
Section I. Program Information
Complete one row for each strategy, with columns for:
• Strategy
• Funder
• Target population
• Outputs
• Short-term/intermediate outcomes
• Long-term outcomes
• Does a funder require this evaluation? (No / Yes)
• Date due to funder
Section II. Data Collection, Reporting, & Analysis Plan
For each row, record the output/outcome and measurement tool, then the person responsible and date for each stage:
• Output/Outcome
• Measurement tool
• Data collection: person responsible; administration date
• Data entry: person responsible; completion date
• Data analysis: person responsible; completion date
• Data reporting: person responsible; data report completion date
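If you prefer to keep this plan electronically, one option (a sketch, not a requirement of this workbook) is to build the Section II table in a script and save it as a spreadsheet-compatible file; every strategy, name, and date below is an invented example.

import pandas as pd

# One row per measurement tool, mirroring Section II's columns
plan = pd.DataFrame([
    {"output_outcome": "Increased refusal skills",
     "measurement_tool": "Pre-post survey",
     "collection_person": "Program facilitator",
     "collection_date": "2009-03-15",
     "entry_person": "Data assistant",
     "entry_date": "2009-03-31",
     "analysis_person": "Evaluator",
     "analysis_date": "2009-04-30",
     "reporting_person": "Project director",
     "report_date": "2009-05-15"},
])

# Save in a format the whole coalition can open and update
plan.to_csv("evaluation_plan.csv", index=False)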
Appendix B: Glossary

Accuracy: The extent to which an evaluation is truthful or valid in what it says about a program, project, or material.

Achievement: Performance as determined by some type of assessment or testing.

Activities/services: The specific activities and/or services offered as part of the program (e.g., eight group sessions) that are expected to lead to the short-term outcomes.

Assessment: The act of determining the standing of an individual, program, or community on some variable of interest.

Attitude: A person’s opinion about another person, thing, or state.

Audience: Consumers of the evaluation; those who will or should read or hear of the evaluation, either during or at the end of the evaluation process. Includes those persons who will be guided by the evaluation in making decisions and all others who have a stake in the evaluation (see Stakeholders).

Baseline: Facts about the condition or performance of subjects prior to treatment or intervention.

Context: The combination of factors accompanying the study that may have influenced its results, including geographic location, timing, political and social climate, economic conditions, and other relevant professional activities in progress at the same time.

Data: Information that may be quantitative or qualitative in nature.

Data Collection: The gathering of information (figures, words, or responses) that describes some situation from which conclusions can be drawn.

Effectiveness: The worth of a project in achieving formative or summative objectives.

Evaluation: An assessment of the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness.

Evaluation Design: A plan for conducting an evaluation (e.g., data collection schedule, report schedules, questions to be addressed, analysis plan, management plan).

Feasibility: The extent to which resources allow an evaluation to be conducted.

Fidelity: The extent to which program components were delivered consistently across participants (e.g., individuals or classrooms) and the implementation was true to the program model and theory.

Formative Evaluation: Evaluation designed and used to improve an object, especially when it is still being developed.

Indicator: The unit of measurement that is used to monitor or evaluate the achievement of project objectives over time.

Instrument: The questionnaire, form, checklist, or device that captures or collects information.

Long-term outcomes: Outcomes achieved a few years after receiving the program; they include changes in behavior, norms, or policies.

Measure: (verb) To ascertain the quantity or quality of something. (noun) The instrument used to estimate or appraise something.

Measurement tools: Documents used to track outputs and short-term outcomes (e.g., attendance sheet, pre-post survey).

Organizational Readiness: The degree to which an organization has the appropriate time and resources to perform an evaluation of its programs.

Outcome Evaluation: An assessment of the immediate or direct effects of the program on participants.

Outputs: The amount of product and/or service that the program intends to provide (e.g., the program will be offered 5 times and reach 15 youth each time).

Pre-post tests: An instrument administered before and after the intervention as a means of documenting change over that period of time.

Process Evaluation: An assessment of program materials and activities; an evaluation of the appropriateness of the approach and procedures that will be used in the program.

Qualitative data: Data that consist of words and observations, not numbers.

Quantitative data: Data in the form of numbers, or numeric data.

Sample: The subgroup or subset of a larger group, or population.

Short-term outcomes: The immediate effects that are expected to occur after receiving the program, such as changes in knowledge, attitudes, and skills.

Stakeholders: Individuals or groups who may affect or be affected by the program or program evaluation.

Strategy/Program: The overall strategy being implemented (e.g., social marketing) or the program being delivered (i.e., the program name).

Target population: Who the strategy/program is designed to reach (e.g., high school youth, all members of the community).
Appendix C: Additional Resources
Kellogg Foundation – includes an evaluation workbook
http://www.wkkf.org/knowledge-center/publications-and-resources.aspx
University of Wisconsin Cooperative Extension – Comprehensive evaluation manual
University of Wisconsin-Extension, Cooperative Extension (2008). Building capacity in evaluating
outcomes: A teaching and facilitating resource for community-based programs and organizations.
Madison, WI: UW Extension, Program Development and Evaluation.
http://learningstore.uwex.edu/Assets/pdfs/G3866.pdf
Tobacco Technical Assistance Consortium – Explanations and tools for evaluating an existing program
http://www.ttac.org/power-of-proof/setting_stage/index.html
Thompson-Robinson, M., Hopson, R., & SenGupta, S. (Eds.). (2004). In search of cultural competence in evaluation: Toward principles and practices (New Directions for Evaluation, No. 102). Wiley Periodicals, Inc.
Strengthening Evaluation Through Cultural Relevance and Cultural Competence –
Workshop materials from a session held during the American Evaluation Association’s 2010
Summer Evaluation Institute
http://www.eval.org/SummerInstitute08/08SIHandouts/Uploaded/aea08.si.kirkhart2.pdf
Meta-list of websites that have basic evaluation guides
http://gsociology.icaap.org/methods/basicguides.html
Guidebook on evaluation
http://www.ceni.org/publications/ProveandImprove.pdf
Website of links to free evaluation resources
http://gsociology.icaap.org/methods/
Implementation Fidelity
http://www.colorado.edu/cspv/blueprints/Fidelity.pdf
Process Evaluation
http://health.state.ga.us/pdfs/ppe/Workbook%20for%20Designing%20a%20Process%20Evaluation.pdf
Online evaluation trainings offered by the Center for Substance Abuse Prevention (CSAP) – Program Evaluation 101 and 102 are especially recommended
http://pathwayscourses.samhsa.gov/courses.htm