Program Evaluation

Krista S. Schumacher
Schumacher Research Group
Ph.D. Student
Educational Psychology:
Research, Evaluation, Measurements & Statistics (REMS)
Oklahoma State University
918-284-7276
[email protected]
[email protected]
Prepared for the Oklahoma State Regents for Higher Education
2012 Summer Grant Writing Institute
To Evaluate or to Assess?
 Technically speaking…
◦ Assessment
 Long-term outcomes, aggregated judgment
◦ Evaluation
 Short-term outcomes, “unique event” judgment
 In education terms…
◦ Assessment
 Student learning outcomes
◦ Evaluation
 Research on programs/curricula to increase student learning
[email protected]
918-284-7276
2
Why Evaluate?
 How will you know your project is progressing adequately to achieve its objectives?
 How will funders know your project was successful?
◦ Funders place increasing emphasis on evaluation, e.g.:
 U.S. Department of Education
 National Science Foundation
 Substance Abuse and Mental Health Services Administration (SAMHSA)
[email protected]
918-284-7276
3
Why Evaluate?
 Improve the program
◦ “Balancing the call to prove with the need to improve.” (W.K. Kellogg Foundation)
 Determine program effectiveness
◦ Evaluation supports “accountability and quality control” (Kellogg Foundation)
◦ Results significantly influence a program’s future
 Generate new knowledge
◦ Not just research knowledge
◦ Shows not just that a program works, but how and why it works
 With whom is the program most successful?
 Under what circumstances?
[email protected]
918-284-7276
4
Why Evaluate?
WHAT WILL BE DONE WITH THE RESULTS?
“Evaluation results will be reviewed (quarterly, semi-annually, annually) by the project advisory board and staff. Results will be used to make program adjustments as needed.”
[email protected]
918-284-7276
5
Federal Emphasis on Scientifically Based Research (SBR) in Evaluation
 Experimental research design
◦ Random assignment (see the sketch below)
 Quasi-experimental research design
◦ No random assignment
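Random assignment is what distinguishes the experimental design from the quasi-experimental one. A minimal sketch of one way to split a participant roster into treatment and control groups at random; the roster IDs and seed are hypothetical, not from the presentation.

```python
# Minimal sketch: random assignment to treatment and control groups.
# The roster of student IDs is hypothetical.
import random

roster = [f"student_{i:02d}" for i in range(1, 41)]  # hypothetical cohort of 40

random.seed(42)         # fixed seed so the assignment is reproducible
random.shuffle(roster)  # randomize the order

half = len(roster) // 2
treatment = roster[:half]  # receives the intervention
control = roster[half:]    # receives traditional instruction

print(len(treatment), len(control))  # 20 20
```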
 Program Evaluation Standards
◦ Joint Committee on Standards for Educational Evaluation
1. Utility
2. Feasibility
3. Propriety
4. Accuracy
5. Evaluation accountability
[email protected]
918-284-7276
6
Types of Evaluation
Process evaluation:
 What processes are used, and how well do they work?
Outcome evaluation:
 Did the project achieve its stated objectives?
[email protected]
918-284-7276
7
Process Evaluation
◦ What was provided and to whom?
 Services (modality, type, intensity, duration)
 Recipients (individual demographics and characteristics)
 Gender, age, race/ethnicity, income level, first-generation status
 Context (institution, community, classroom)
 Cost (did the project stay within budget?)
 Do processes match the proposed project plan?
◦ What types of deviation from the plan occurred?
◦ What led to the deviations?
◦ What effect did the deviations have on the project and the evaluation?
[email protected]
918-284-7276
8
Outcome Evaluation
◦ What effect did the program have on participants?
 Activities / Objectives
 Achievement / Attitudes and beliefs
◦ What program/contextual factors were associated with outcomes?
◦ What individual factors were associated with outcomes?
◦ How durable were the effects?
 What correlations can be drawn between outcomes and the program?
 How do you know the program caused the effect?
 You can’t, unless you use an experiment.
[email protected]
918-284-7276
9
Who will Evaluate?
 External evaluators are increasingly required or strongly recommended
◦ Partners for effective and efficient programs
◦ Methodological orientations
◦ Philosophical orientations
◦ Experience and qualifications
[email protected]
918-284-7276
10
How much will it cost?
External evaluations cost money…period.
 Standard recommendation:
◦ 5% to 10% of the total budget
◦ (Kellogg Foundation; U.S. Dept. of Ed.; NSF)
 Check funder limits on evaluation
◦ Ensure the cost is reasonable but sufficient
[email protected]
918-284-7276
11
Two Types of Data
 Quantitative
◦ Numbers based on objectives and activities
◦ Types of data needed:
 Number of participants (process)
 Grade point averages (outcome)
 Retention rates (outcome)
 Survey data (outcome and process)
 Qualitative
◦ Interviews
◦ Focus groups
◦ Observation
[email protected]
918-284-7276
12
Methods/Instruments
 How are you going to get your data?
◦ Establish baseline data
◦ Institutional Research (IR) Office
 GPA
 Retention
 Graduation
◦ Pre- and post-assessments (knowledge, skills)
◦ Pre- and post-surveys (attitudinal)
◦ Enrollment rosters (see the retention sketch after this list)
◦ Meeting minutes
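Enrollment rosters are the raw material for retention figures. A minimal sketch of how a fall-to-fall retention rate might be computed from two rosters, assuming hypothetical CSV extracts with a student_id column (the file and column names are illustrative, not from the presentation):

```python
# Minimal sketch: fall-to-fall retention from two enrollment rosters.
# File names and the student_id column are hypothetical.
import pandas as pd

fall_1 = pd.read_csv("fall_2008_roster.csv")  # first-fall cohort
fall_2 = pd.read_csv("fall_2009_roster.csv")  # next fall's enrollment

cohort = set(fall_1["student_id"])
returned = cohort & set(fall_2["student_id"])  # students who came back

retention_rate = len(returned) / len(cohort)
print(f"Fall-to-fall retention: {retention_rate:.1%}")
```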
[email protected]
918-284-7276
13
Data Analysis
 Qualitative Data
◦ Data analysis programs
 NVivo, ATLAS.ti, etc.
◦ More than pithy anecdotes
 “May explain – and provide evidence of – those hard-to-measure outcomes that cannot be defined quantitatively.” (W.K. Kellogg Foundation)
 Provides insight into how and why a program is successful
◦ Analyze for themes that support (or don’t) the quantitative data
[email protected]
918-284-7276
14
Data Analysis
 Quantitative data
◦ Data analysis programs:
 SPSS (Statistical Package for the Social Sciences), Stata, etc.
◦ Descriptive and inferential statistics (see the sketch after this list):
 Descriptive:
 Frequencies
 Means
 Standard deviations
 Inferential (parametric, nonparametric):
 t-tests, Mann-Whitney U test (difference of means, two groups)
 ANOVA, Kruskal-Wallis, Friedman (differences among more than two groups)
 Correlation (relationship between variables)
 Regression (explanation, prediction)
 Etc.
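To make the descriptive/inferential distinction concrete, here is a minimal sketch in Python with SciPy, comparing hypothetical participant and comparison-group GPAs; the numbers are invented for illustration, and SciPy stands in for SPSS or Stata:

```python
# Minimal sketch: descriptive statistics plus a two-group comparison.
# GPA values are invented for illustration.
import statistics
from scipy import stats

participants = [2.9, 3.1, 3.4, 3.0, 3.6, 3.2, 3.3]  # hypothetical GPAs
comparison   = [2.7, 2.9, 3.0, 2.8, 3.1, 2.6, 3.0]

# Descriptive: mean and standard deviation
print(statistics.mean(participants), statistics.stdev(participants))

# Inferential, parametric: independent-samples t-test
t, p = stats.ttest_ind(participants, comparison)
print(f"t = {t:.2f}, p = {p:.3f}")

# Inferential, nonparametric: Mann-Whitney U test
u, p = stats.mannwhitneyu(participants, comparison)
print(f"U = {u:.1f}, p = {p:.3f}")
```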
[email protected]
918-284-7276
15
A Detour: Educational Research
 For education grants, there are two approaches to evaluation:
◦ Program Evaluation
◦ Educational Research
 Include an Educational Researcher as co-PI
◦ Look in Educational Psychology programs
◦ OSU Center for Educational Research and Evaluation (a.k.a. REMS Center)
[email protected]
918-284-7276
16
Educational Research (cont.)
 Educational research is NOT laboratory science research
 Requires Institutional Review Board (IRB) approval
 The IRB is the oversight arm for Human Subjects Research
 May require informed consent
 How NOT to set up educational research
◦ One Biology section, split into two sections
 One receives the “treatment”; the other, traditional instruction
 Students blindly enroll in one section WITHOUT knowing it is an experiment
[email protected]
918-284-7276
17
Educational Research (cont.)
 Example: NSF TUES
◦ Incorporate interdisciplinary, semester-long research projects into Introductory Biology
◦ Hypothesis:
 “We hypothesize that semester-long, interdisciplinary, collaborative research projects will increase student learning and interest in science more than the standard three-hour laboratory activity.”
[email protected]
918-284-7276
18
Educational Research (cont.)
 Measurements follow from the hypothesis
◦ Increases in student learning:
 Establish baseline data
 E.g., performance on tests, course projects, course grades
 Pre/post tests (see the sketch after this list)
 Course projects compared to those from previous courses without the intervention
◦ Increases in student interest:
 Establish baseline data (if possible)
 Pre/post attitudinal surveys
 Focus groups
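Since the same students take the pre- and post-test, a paired comparison is the natural analysis. A minimal sketch with hypothetical scores (invented for illustration):

```python
# Minimal sketch: pre/post comparison with a paired-samples t-test.
# Scores are invented for illustration; same students in both lists.
from scipy import stats

pre  = [62, 70, 55, 68, 74, 60, 66, 71]  # pre-test scores
post = [71, 78, 63, 70, 85, 68, 72, 80]  # post-test scores, same order

mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
t, p = stats.ttest_rel(post, pre)  # paired t-test on the differences
print(f"mean gain = {mean_gain:.1f} points, t = {t:.2f}, p = {p:.3f}")
```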
[email protected]
918-284-7276
19
Data Collection & Reporting:
Two Types of Timeframes
 Formative
◦ Ongoing throughout the life of the grant
◦ Measures activities and objectives
 Summative
◦ At the conclusion of grant funding
 NEED BOTH!
[email protected]
918-284-7276
20
Timelines
 When will evaluation occur?
◦ Monthly?
◦ Quarterly?
◦ Semi-annually?
◦ Annually?
◦ At the end of each training session?
◦ At the end of each cycle?
[email protected]
918-284-7276
21
Origin of the Evaluation:
Need and Objectives
Need: For 2005-06, the fall-to-fall retention rate of first-time, degree-seeking students was 55% for the College’s full-time students, compared to a national average retention rate of 65% for full-time students at comparable institutions (IPEDS, 2006).

Objective: The fall-to-fall retention rate of full-time undergraduate students will increase by 3 percentage points each year, from a baseline of 55% to 61% by Sept. 30, 2010.
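As an arithmetic check, 3 percentage points per year from a 55% baseline reaches 61% after two increments (55 → 58 → 61); a minimal sketch, with the two-increment span inferred from the 55% and 61% figures:

```python
# Yearly retention targets implied by the objective:
# 55% baseline, +3 percentage points per year, ending at 61%.
baseline = 55
targets = [baseline + 3 * year for year in (1, 2)]
print(targets)  # [58, 61]
```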
[email protected]
918-284-7276
22
Evaluation Data Collection and Reporting Plan
Objectives: Increase fall-to-fall retention by 3% per year, to 61%

Data collected and timeline: Student enrollment in the first fall and the second fall, collected within one month of the start of the second fall

Methods for data collection and timeline: Enrollment entered by gender and race/ethnicity into a database within the first four weeks of each semester

Instruments to be developed and timeline: Enrollment rosters separated by gender and race/ethnicity, by Jan. 15, 2009

Reports/outcomes timeline: At the midpoint of each semester
[email protected]
918-284-7276
23
BEWARE THE LAYERED OBJECTIVE!
 By the end of year five, five (5) full-time developmental education instructors will conduct 10 workshops on student retention strategies for 200 adjunct instructors.
◦ This single objective layers three separate measurable targets (five instructors, ten workshops, 200 adjuncts reached); each must be tracked, and each can fall short, on its own.
[email protected]
918-284-7276
24
Logic Models
From: University of Wisconsin-Extension, Program Development and Evaluation, http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
 A Logic Model is…
◦ A depiction of a program showing what the program will do and what it is to accomplish
◦ A series of “if-then” relationships that, if implemented as intended, lead to the desired outcomes
◦ The core of program planning and evaluation
Example: Situation (Hungry) → Inputs (Get food) → Outputs (Eat food) → Outcomes (Feel better)
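To make the if-then chain concrete, a minimal sketch that stores the slide’s example as an ordered list of stages and prints each if-then link (the data structure is illustrative, not a formal logic-model tool):

```python
# Minimal sketch: a logic model as an ordered chain of stages,
# using the slide's hungry-person example.
logic_model = [
    ("Situation", "Hungry"),
    ("Inputs", "Get food"),
    ("Outputs", "Eat food"),
    ("Outcomes", "Feel better"),
]

# Each stage is the "if" for the next stage's "then".
for (_, cause), (_, effect) in zip(logic_model, logic_model[1:]):
    print(f'If "{cause}", then "{effect}"')
```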
[email protected]
918-284-7276
25
Evaluation Resources
 Evaluation Resource Center for Advanced Technological Education (NSF ATE)
◦ http://evalu-ate.org/ (directory of evaluators)
 Western Michigan University, The Evaluation Center
◦ http://ec.wmich.edu/evaldir/index.html (directory of evaluators)
 American Evaluation Association
◦ www.eval.org (directory of evaluators)
 Joint Committee on Standards for Educational Evaluation
◦ http://www.jcsee.org/program-evaluation-standards/program-evaluation-standards-statements
 W.K. Kellogg Foundation Evaluation Handbook
◦ http://www.wkkf.org/knowledge-center/resources/2010/W-K-Kellogg-Foundation-Evaluation-Handbook.aspx
[email protected]
918-284-7276
26
Evaluation & Statistics Resources
 Statistics and Research Methods Resources
◦ http://statsandmethods.weebly.com/
 The Research Methods Knowledge Base
◦ http://www.socialresearchmethods.net/
 “Discovering Statistics Using SPSS,” by Andy Field
◦ http://www.sagepub.com/field3e/
 Planning an Effective Program Evaluation short course
◦ http://www.the-aps.org/education/promote/pen.htm
 “Evaluation for the Unevaluated” course
◦ http://pathwayscourses.samhsa.gov/eval101/eval101_toc.htm
 OSRHE list of evaluators and other resources
◦ http://www.okhighered.org/grant%2Dopps/writing.shtml
[email protected]
[email protected]
918-284-7276
27