"Take a Stand" Symposium - February 19, 2014 ()

Transcription

"Take a Stand" Symposium - February 19, 2014 ()
Collaborative Evaluation
The Promise of Collaborative Evaluation within
Sistema-inspired Programs:
Evaluating for Outcomes
Take a Stand Conference, Los Angeles
February 19th, 2014
Dennie Palmer Wolf, Ed.D.
Principal
Steven J. Holochwost, Ph.D., M.P.A.
Senior Researcher
Overview of the Session:
Collaborative Evaluation for Outcomes
•  Three forms of collaboration that can strengthen programs
•  Collaboration between programs and evaluators
•  Collaboration among staff, children, families, and evaluators
•  Collaboration across programs
•  A short description of a multi-site evaluation of Sistema-inspired
programs, funded by the Buck Family Foundation
•  A forecast of Session 2: Implementing and Sustaining Evaluations
An Introduction to Us and Our Approach
•  Who we are
-  Dennie Palmer Wolf
-  Steven Holochwost
-  Judith Hill Bose
•  Values that inform our approach
•  Building on Sistema’s commitment to outcomes that combine
musical, personal, and social results
•  Building on what Sistema sites have already learned to do: A
desire not to re-invent the wheel
•  A collaborative approach: Engaging staff, youth, families, and
researchers
Learning, not just Listening
•  To make this a learning (not just a listening) experience, we
will ask you to analyze and reflect on your own program
•  Build your own model linking program inputs to results for
young people
•  Keep a running list of questions you have
•  Move quickly to ensure time for exchange
Collaboration between
Evaluators and Sistema-inspired Programs
Your Program: Outputs and Outcomes
Outputs: What does your program do?
Example: We offer high-quality, after-school musical instruction to 40 students for four hours/week.

Outcomes: What does your program hope to achieve?
Example: We want to increase rates of high school graduation among our participants.
Modest to Immodest Proposals
[Diagram: "Participation in Programs" linked to proposed outcomes ranging from the modest to the immodest: higher grades, higher test scores, better behavior, improved penmanship, increased agility, broader wingspan, telekinetic abilities]
The Black Box of Evaluation
[Diagram: Participation → (black box) → High school graduation]
Opening the Black Box
[Diagram: the box opened: Participation and Quality of Instruction, working through Engagement and Arts Learning, lead to high school graduation]
Opening the Black Box
Understanding a Range of Outcomes
[Logic-model diagram. Outputs: Participation, Arts Learning. Proximal, Intermediate, and Distal Outcomes, with examples such as Executive Functions, Interpersonal Skills, Perseverance, Agency, and Graduation. Program Practice: Quality of Instruction, Student Engagement. Areas of Emphasis: strengthening practices. A minimal coded sketch of such a model follows below.]
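To make the chain concrete, here is a minimal sketch in Python of how a program might write its logic model down as data, so that each measure it collects is tied to a stage in the chain. The placement of individual constructs within stages is an assumption for illustration, not the study's definitive model.

```python
# Illustrative only: the stage names follow the slide above, but the
# assignment of specific constructs to stages is an assumption for this sketch.
logic_model = {
    "program_practice": ["quality of instruction", "student engagement"],
    "outputs": ["participation", "arts learning"],
    "proximal_outcomes": ["executive functions"],
    "intermediate_outcomes": ["perseverance", "interpersonal skills", "agency"],
    "distal_outcomes": ["high school graduation"],
}

# Print the hypothesized chain from practice to distal outcomes.
for stage, constructs in logic_model.items():
    print(f"{stage}: {', '.join(constructs)}")
```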
Reflection 1: Your Program's Model
[Blank logic-model template for participants to complete. Outputs: Participation, Arts Learning. Proximal, Intermediate, and Distal Outcomes (examples: Perseverance, Graduation). Program Practice: Quality of Instruction, Student Engagement. Areas of Emphasis.]
Collaboration between Programs, Staff, and Children & Families
Documenting and Strengthening the Program
with the Help of Students and Families
•  Why and how should we engage families in the research?
•  What should we document?
•  What could we learn about broader Sistema effects?
•  Working together, could we develop tools that help us to understand:
-  What families contribute to the success of programs?
-  How families are affected by their participation in the program?
How to Collect: The Data Dinner
Youth interviewing their families about the heritage that supports music making
What to Collect and How?
One Family’s Weekly Contributions to the Program
What I do: Pick-ups at the end of the program on most days; get off early one day to listen to the rehearsal
What I have to say: Hard to do with my job, my boss not a music person

What I do: Remind him to practice; sit and listen when I have the time; 2 nights involved grandmother, one night a cousin who also plays, to keep him going, he's a little discouraged
What I have to say: Want to know more about practicing, not just nag him

What I do: Borrow my friend's van to take him and his friends and their family to the concert at the church

What I do: Listen to his pieces on the recordings with him while I am getting dinner on, to keep him going, he's a little discouraged
What I have to say: Sometimes it takes doing the music at once

What I do: Talk to his younger brother about if he wants to do it, try to get him involved
What I have to say: They would have each other to practice with
Youth as Researchers on their Own Musical Lives
Diary Day (Sunday): Phase II, Highly Engaged Student
[Diary table columns: Where? / What were you doing, and for how long? / Who was with you? / Rare or regular? / Related to music more generally]

Home: Breakfast, ate a bowl of cereal (about 30 min; family). Got myself and my brother ready for church while listening to music on my stereo (about 30 – 45 min; family; "Listen every day when I get ready").
Church (Bethel): Went to church; I played in the service (11:00 – 1:30; family and congregation; "Twice a month, young musicians rotate playing for the service").
Friend's house: Mother had a meeting; watched my brother play outside; listened to iPod; took my cello, she plays too, we played music, then watched TV with her little sister (1:30 – 2:30; friend; "Another cellist was going to come, but couldn't"; "Get together to play about once every two months").
Home: Got on Facebook, a friend from CMW got me on last summer (about 3:00, 10 – 15 min; "Every day"). Dinner, we talked about the concert on Saturday; I had a solo, so we were reflecting on how people thought I did (about 30 min; family; "Whenever I perform"). Got ready for school; watched MTV; listened to music on my stereo (30 – 45 min; "Everyday").
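Diary days like this can also feed simple quantitative summaries. A minimal sketch in Python of tallying music-related time across a diary day; the entries and minutes below are invented for illustration, not this student's actual data:

```python
# Hypothetical diary entries: minutes spent and whether music was involved.
diary_day = [
    {"activity": "breakfast, listening to stereo", "minutes": 30, "music": True},
    {"activity": "played in church service", "minutes": 150, "music": True},
    {"activity": "watched TV at friend's house", "minutes": 45, "music": False},
    {"activity": "played cello with friend", "minutes": 40, "music": True},
]

# Share of the recorded day that touched music in some way.
music_minutes = sum(e["minutes"] for e in diary_day if e["music"])
total_minutes = sum(e["minutes"] for e in diary_day)
print(f"Music-related: {music_minutes} of {total_minutes} recorded minutes")
```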
Why We Should Open the Black Box
(and enrich what we see)
[Diagram: Participation, Quality of Instruction, and Engagement lead to Arts Learning and to outcomes for both Youth and Families: high school graduation, positive social relations, strong negotiation and planning skills]
Why We Should Open the Black Box:
Understanding a Range of Outcomes for a Range of People
[Logic-model diagram extended to families. Outputs: Participation, Arts learning across the family, Added Time. Program Practice: Quality of Instruction, Student Engagement, Family Engagement. Proximal, Intermediate, and Distal Outcomes, with examples such as Perseverance, Interpersonal Skills, Agency, Executive Functions, strong social networks, increased experience seeking and choosing opportunities, trying for selective high school programs, and graduation from a program with higher expectations. Areas of Emphasis: strengthening practices.]
Reflection 2: Collaborating with Students and Families
•  Return to your program model
•  How have you engaged youth in thinking about the
outcomes for them and others?
•  How have you engaged families to learn more about the
sequence of impacts on them?
•  How have you strengthened your program to support the
impacts on students and/or families?
Collaboration Across the Field
A Collaborative Evaluation Study
Funded by the Buck Family Foundation
Specific Aims
•  Developing evidence about the range of effects, and the
conditions under which they occur
•  Positioning Sistema projects to continue to evaluate
their work
•  Creating a network of mutual support, which includes:
-  A set of strategies for embedding this kind of work in your projects
-  A set of common measures
Key Features of the Proposed Study
•  Look at program features (in/out of school, school-year/year round,
etc.)
•  Develop common measures applied in common ways (e.g.,
demographics, attendance, persistence, grades, musical progress,
etc.)
•  Incorporate site-specific measures (e.g., “Grit,” time journals, etc.)
•  Involve staff, youth, and families as research partners
Open Call for Multiple Modes of Participation
•  Core sites (approximately 6)
-  Shared training and discussion of common measures
-  Common measures and guidance in using them
-  Support to develop site-specific measures
-  Convenings
-  Shared resources (e.g., website, evaluation strategies)
•  Collaborating Sites
-  Common measures and guidance in using them
-  Convenings
-  Shared resources
•  Companion Sites
-  Access to the project’s shared resources
Working Timeline
•  February 2014:
-  “Take a Stand”: Workshops and Conversations
-  Expression of Interest Forms
•  April 2014: Selection of Core and Collaborating Sites
•  Summer 2014: Preparation, training, data set-up
•  September 2014: Begin Year 1 data collection
•  September 2015: Begin Year 2 data collection
•  October 2016: Final report and public tools
Session 2:
•  A Collaborative Evaluation across Sistema-inspired Programs:
Building the Field
•  Time: Mere moments
•  Location: Here
Additional Resources
WolfBrown Investigators
•  [email protected]
•  [email protected]
Longy-Bard Colleague
•  Judith Hill Bose ([email protected])
Websites
•  www.wolfbrown.com
•  http://bit.ly/SistemaStudy
A Collaborative Evaluation across
Sistema-inspired Programs:
Building the Field
February 19th, 2014
Dennie Palmer Wolf, Ed.D.
Principal
Steven J. Holochwost, Ph.D., M.P.A.
Senior Researcher
An Introduction to Us and Our Approach
•  Who we are
-  Dennie Palmer Wolf
-  Steven Holochwost
-  Judith Hill Bose
•  Values that inform our approach
•  Building on Sistema’s commitment to outcomes that combine
musical, personal, and social results
•  Building on what Sistema sites have already learned to do: A
desire not to re-invent the wheel
•  A collaborative approach: Engaging staff, youth, families, and
researchers
What we want to explore in this session
•  Aims of the study:
-  Developing evidence about the range of effects and the conditions under which they occur
-  Positioning Sistema projects to continue to evaluate their work
-  Creating a network of mutual support
•  Drawing on common program descriptors and measures
•  Developing site-specific measures
•  Involving youth and families as co-researchers
•  Offering multiple ways to participate
-  Core sites (~6)
-  Collaborating sites
-  Companion sites
Collaboration on Common Program Descriptors
•  Sistema sites typically share many features and values
•  But programs are implemented in a range of ways
-  In-school, out-of-school
-  School year vs. year-round
-  Purely music, or with added supports
-  Attendance, both unexcused and excused
-  Amount of time per week/year
•  If we develop common descriptors (see the sketch below) and are
willing to pool data, we can investigate the consequences of some
of those structural choices, and learn from the results
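As an illustration of what a shared descriptor might look like, here is a minimal sketch in Python; the field names and categories are assumptions for this sketch, not the study's agreed schema.

```python
from dataclasses import dataclass

# Hypothetical field names for illustration; the actual common descriptors
# would be agreed on collaboratively by the participating sites.
@dataclass
class ProgramDescriptor:
    site_name: str
    setting: str               # "in-school" or "out-of-school"
    calendar: str              # "school-year" or "year-round"
    hours_per_week: float
    weeks_per_year: int
    added_supports: bool       # purely music, or music plus other supports
    tracks_excused_absences: bool

example = ProgramDescriptor(
    site_name="Example Site",
    setting="out-of-school",
    calendar="school-year",
    hours_per_week=4.0,
    weeks_per_year=36,
    added_supports=True,
    tracks_excused_absences=True,
)
print(example)
```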
Collaboration on Common Measures
•  A core of measures that many Sistema projects already
collect
•  But these differ across projects
•  Could be collected in a more standardized way that
would give clearer findings
•  Extensions of current measures
Examples of Common Measures
•  Demographics
-  Date of birth
-  Race/Ethnicity categories
•  Program
-  Attendance and persistence
-  Musical progress
•  Family Context
-  Challenges
-  Assets
•  Academic Achievement
-  Percentile scores
-  Referenced scores
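A minimal sketch (Python; the measure names, types, and coding notes are hypothetical) of how common measures might be written up as a shared data dictionary, so that every site collects the same fields in the same way:

```python
# Hypothetical shared data dictionary: each common measure, its type, and the
# agreed coding. Actual fields and categories would be negotiated across sites.
COMMON_MEASURES = {
    "student_id":        {"type": "str",   "notes": "site-assigned, de-identified"},
    "date_of_birth":     {"type": "date",  "notes": "YYYY-MM-DD"},
    "race_ethnicity":    {"type": "str",   "notes": "shared category list"},
    "sessions_offered":  {"type": "int",   "notes": "per reporting period"},
    "sessions_attended": {"type": "int",   "notes": "per reporting period"},
    "still_enrolled":    {"type": "bool",  "notes": "persistence at period end"},
    "musical_progress":  {"type": "int",   "notes": "shared rubric level"},
    "achievement_pct":   {"type": "float", "notes": "percentile score, 0-100"},
}

for name, spec in COMMON_MEASURES.items():
    print(f"{name:18} {spec['type']:6} {spec['notes']}")
```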
Questions and Answers about Common Measures
•  Think about what you currently
collect.
•  What types of data are important
to collect in standard ways across
Sistema sites?
•  What questions or concerns do
you have about collecting data in a
shared and standardized way
across sites?
Site-specific Measures
[Logic-model diagram with an added column for other, site-specific outcome choices alongside Outputs (Participation, Arts Learning) and Proximal, Intermediate, and Distal Outcomes (e.g., Perseverance, Executive Functions, Graduation)]
•  NJSO Champs: Persistence (see the sketch below)
•  Play On Philly: Executive Functions
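As one example, a site that emphasizes persistence could derive it from the attendance data the common measures already cover. A minimal sketch in Python, with a hypothetical record format and an illustrative (not prescribed) definition of persistence:

```python
from datetime import date

# Hypothetical attendance log: one row per student per offered session.
attendance = [
    {"student_id": "S01", "session_date": date(2015, 5, 20), "attended": True},
    {"student_id": "S01", "session_date": date(2015, 6, 10), "attended": True},
    {"student_id": "S02", "session_date": date(2015, 5, 20), "attended": True},
    {"student_id": "S02", "session_date": date(2015, 6, 10), "attended": False},
]

def persisted(records, student_id, cutoff):
    """True if the student attended at least one session on or after the cutoff."""
    return any(
        r["attended"] and r["session_date"] >= cutoff
        for r in records
        if r["student_id"] == student_id
    )

# Illustrative definition: still attending in June counts as persisting.
for sid in ("S01", "S02"):
    print(sid, persisted(attendance, sid, date(2015, 6, 1)))  # S01 True, S02 False
```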
Questions and Answers about Site-specific Measures
•  Think about the site-specific outcomes you have designed
your program to achieve:
-  In musical terms
-  In terms of other outcomes
•  What, if any, struggles have you had in attempting to
measure these outcomes?
•  What kind of assistance do you want from outside
researchers? From your colleagues in the Sistema field?
Working Timeline
•  February 2014:
-  “Take a Stand”: Workshops and Conversations
-  Expression of Interest Forms
•  April 2014: Selection of Core and Collaborating Sites
•  Summer 2014: Preparation, training, data set-up
•  September 2014: Begin Year 1 data collection
•  September 2015: Begin Year 2 data collection
•  October 2016: Final report and public tools
Collaboration Across the Field
[Cycle diagram: common and site-specific measure development; measures go out to the sites; data collection; sites return data to a common database; data analysis; reporting of results, both to sites and to the field. Results of analyses are initially used to refine the measures; the measures are made available to the field; eventually, results are reported to the field.]
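A minimal sketch (Python) of the step in which sites return data to the common database; the file-naming convention and pooling-by-concatenation approach are assumptions for illustration, not the project's actual infrastructure:

```python
import csv
import glob

# Hypothetical: each site exports a CSV with the agreed common columns,
# named like "site_<name>.csv". Pooling is then a simple concatenation.
pooled = []
for path in glob.glob("site_*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            row["source_file"] = path  # keep track of the contributing site
            pooled.append(row)

print(f"Pooled {len(pooled)} student records from all sites")
```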
Open Call for Multiple Modes of Participation
•  Core sites (approximately 6)
-  Shared training and discussion of common measures
-  Common measures and guidance in using them
-  Support to develop site-specific measures
-  Convenings
-  Shared resources (e.g., website, evaluation strategies)
•  Collaborating Sites
-  Common measures and guidance in using them
-  Convenings
-  Shared resources
•  Companion Sites
-  Access to the project’s shared resources
Proposed Working Structure
•  WolfBrown: Research Design, Measurement, Data Analysis
•  Longy-Bard: Program Design, Pedagogy
•  Core Sites, Collaborating Sites, and Companion Sites
Each Core and Collaborating site contains:
•  Lead Liaison
•  Teaching Artist-Researchers
•  Youth and Family Researchers
Questions and Answers about the Overall Process
•  What questions do you have about this process and structure?
•  What suggestions do you have?
Additional Resources
WolfBrown Investigators
•  [email protected]
•  [email protected]
Longy-Bard Colleague
•  Judith Hill Bose ([email protected])
Websites
•  www.wolfbrown.com
•  http://bit.ly/SistemaStudy