Winter Forum 2014 Student Learning Assessment Report
Submitted by the Office of Assessment, Trinity College
Executive Summary
The Duke Quality Enhancement Plan describes the Winter Forum as a 2.5-day immersive on-campus experience emphasizing engagement in a compelling global issue through lectures,
workshops, group work, and service. It intends to educate through multiple lenses and multiple
points of view. The Winter Forum was designed as a response to the previous lack of large-scale
activities that engage students (undergraduate and graduate) in a collaborative and collective
intellectual experience. Of particular importance was the development of an immersive learning
experience for subsets of students who may have difficulty traveling or studying abroad.
The 2014 Winter Forum took place on Duke’s campus at the Fuqua School of Business.
Sponsored by the Office of Undergraduate Education, the Center for Child and Family Policy
and the Program in Education, the 2014 Winter Forum was titled Rethink Education: The
Innovation Challenge. Observing the structure of the 2014 Winter Forum, Assessment personnel
saw an emphasis on content delivery through presentations and lectures as well as interactive
dialogue and team-focused breakout sessions facilitated by both faculty and content specialists.
Teams were challenged to develop an initiative that addressed issues in education which affected
both the United States and India. Quality Enhancement Plan learning objectives for the Winter
Forum indicate that students will be able to:
a) evaluate a global issue (the topic of a given year’s Winter Forum) from perspectives of
multiple disciplines;
b) evaluate a global challenge from multiple cultural perspectives;
c) engage in collaborative group work; and
d) relate the Winter Forum experience to classroom coursework and co-curricular
experiences.
The assessment methods and tools were developed around these objectives. For the 2014 Winter
Forum, the pre- and post-program surveys and the pre- and post-program knowledge tests were
the main vehicles by which the Office of Assessment, Trinity College (OATC) collected
information about students’ movement toward these objectives. Assessment personnel also
gathered data with the aid of a program-specific observational rubric. The observational data
collected by OATC personnel provided meaningful insight into the collective movement, as well
as movement by team, of participants towards Quality Enhancement Plan (QEP) learning
outcomes. Observational data were collected with the highest frequency possible given program
design, group size and personnel constraints.
With these observations in mind, available data suggest that participants in the 2014 Winter
Forum collectively moved towards the QEP outcomes. Observational data suggest
a noticeable increase in students' ability to engage in collaborative group work centered on a global issue, as well as
in their ability to evaluate global issues from multiple cultural, geographical, and historical
perspectives. With respect to students' ability to evaluate global issues
from the perspective of multiple disciplines, students showed minimal movement. Students also
showed minimal movement in their self-reported ability to relate the Winter Forum experience to
classroom coursework and co-curricular experiences, though there may be a ceiling effect from
the high pre-program self-assessment scores.
The OATC observed an increase in the overall performance on the knowledge test, pre-program
to post-program. The OATC observed learning gains in nearly all of the knowledge areas
addressed on this test; however, there is some variability in results by test question. At the end of
the program, students' levels of knowledge rated at a slightly higher overall level. An index of
overall performance across the multiple-choice and free-response items shows a clear numerical
increase from pre- to post-test. Faculty coordinators' intentional mapping of the 2014 Winter
Forum curriculum to the pre- and post-program knowledge tests and student learning outcomes
was very beneficial in promoting this movement among participants and allowing for clear
assessment of student learning.
The pre-program and post-program surveys asked participants to evaluate expectations of the
learning experience with regard to current competencies central to the QEP learning outcomes.
Results from the pre-program survey show participants self-reporting moderate to high
capabilities with respect to the QEP-level competencies (e.g., ability to evaluate a complex
global issue from multiple cultural perspectives) and very high expectations for the 2014 Winter
Forum to promote further gains in these areas. By the end of the program, responses to the post-program survey suggest modest increases and, in some cases, the experience falling short of
participants' expectations to further develop these competencies. However, this is most likely the
result of unrealistic expectations in the pre-program self-assessment and a more grounded
response during the post-program self-assessment.
With respect to other areas of the post-program survey, participants self-reported the
interactions with content experts and faculty coaches as very influential to their development
during Winter Forum, as well as the formal presentations by Winter Forum guest speakers.
Participants indicated that the writing of blog posts was minimally helpful in their development –
though it does serve as an additional point at which faculty coordinators and assessment
personnel can collect artifacts on student learning. In responses to a question eliciting general
comments about the Winter Forum experience, the analysis reveals generally positive sentiments
and dispositions toward the experience rather than specific evidence of movement towards the
QEP and programmatic learning outcomes. Review of the open-ended comments participants
submitted on the post-program survey shows the program was perceived as a successful learning
experience, in which participants enjoyed immersion into a topic of substantial global, national,
and local importance without the regular distractions of the academic year. At the conclusion of
the event almost half of all respondents indicated the Winter Forum had prompted them to
pursue further involvement in the field of education, though this was a decrease from the almost
three-quarters of students who said they expected the Winter Forum to increase their future
involvement in the field.
There was some small but notable participant attrition immediately before the start of the
program – related to bad weather and travel delays/cancellations. The Winter Forum continues to
attract women, Asian and Pacific Islander students, and non-athletes in higher proportions than
other groups. Of the targeted student subgroups, only international students are overrepresented
in the Winter Forum applications. Student athletes are slightly underrepresented in comparison
to the overall student population. Students receiving financial aid are also underrepresented in
comparison to the overall student population. This is a trend that appears consistent with previous
Winter Forums. Participants indicated banners in the Bryan Center, word of mouth from faculty
and students, and email blasts were the main ways in which they learned about the 2014 Winter
Forum. Future program organizers may want to consider additional marketing focused on the
target subgroups for which Winter Forum was designed.
Students’ decisions whether or not to apply to future Winter Forums continues to be a reflection
of the alignment between their intellectual and professional interests and the forum topic. Almost
all (96.8%) of the participants who responded to the post program survey indicated they would
recommend the Winter Forum to other Duke students.
Participants were invited to suggest changes for future iterations of the Winter Forum, and these
comments generally fit into the following categories: communications; logistics and location;
and suggestions for maximizing the group work experience. OATC-generated recommendations
for future consideration include: ensuring a good mix of pedagogical approaches (lecture,
discussion, group work, etc.), the development of program-specific learning outcomes in addition
to the pre-existing QEP outcomes, increased time for teams to work together and process new
information, and a capstone experience in which participants are able to demonstrate their
understanding of the Winter Forum topic at the conclusion of the program. The OATC intends to
continue reviewing and enhancing the assessment tools and rubrics used in the evaluation of the
Winter Forum event, making improvements to the data collection process wherever possible.
Introduction
The following narrative details the assessment for the 2014 Winter Forum, Rethink Education:
The Innovation Challenge. Notable findings, conclusions, and suggestions for future
consideration also are documented. In addition to observational data collection, the pre- and
post-program surveys and the pre- and post-program knowledge tests were the main vehicles by
which the OATC collected information about students' movement toward the objectives.
The majority of the findings reported here, particularly those pertaining to student learning
outcomes, are drawn from pre- and post-program expectations and dispositions surveys and pre- and post-program knowledge tests. The Office of Assessment observed team activities and
collected those observations via rubrics; however, due to the variability in team activities the data
may be inconclusive. With these considerations in mind, the available data suggest that, overall,
participants in the 2014 Winter Forum moved towards the intended QEP outcomes.
Assessment Plan
The primary objective of the QEP, Global Duke: Enhancing Students’ Capacity for World
Citizenship, is to enable Duke students to learn and function most effectively in the world, to be
world citizens. The operationalization of this concept of global citizenship includes facilitating
students’ movement towards enhanced and nuanced knowledge, skills, and attitudes. Table 1
represents the learning themes associated with the QEP.
Table 1. Knowledge, Skills, and Attitudes and Values of a World Citizen
Element                  Details
Knowledge                Understanding of culture, diversity, globalization, interdependence, global irregularities, peace
                         and conflict, nature and environment, sustainable development, possible future scenarios, social justice
Skills                   Research and inquiry skills, theory testing, critical thinking, communication skills and political
                         skills essential for civic engagement in a global society, cooperation and conflict resolution,
                         ability to challenge injustice and inequalities
Attitudes and values     Appreciation of human dignity, respect for people and things, belief that people can make a
                         difference, empathy toward other cultures and viewpoints, respect for diversity, valuing justice
                         and fairness, commitment to social justice and equity, curiosity about global issues and global
                         conditions that shape one's life, concern for the environment, and commitment to sustainable development
With respect to building students’ base of knowledge, QEP-affiliated programs seek to develop
students’ awareness of significant contemporary issues and their global scope, including the
history, differences, and perspectives of and within regions and cultures. QEP programs also
encourage the development of skills, including the ability to engage positively with, and learn
from, people of different backgrounds and in different environments. Finally, these programs are
charged with cultivating students’ attitudes towards and self-awareness of their identities as both
national and global citizens. The activities associated with the QEP contribute to the
development of connections within the student body through shared experiences within the
learning community.
The QEP describes the Winter Forum as a 2.5-day immersive on-campus experience
emphasizing engagement in a compelling global issue through lectures, workshops, group work,
and service. It intends to educate through multiple lenses and multiple points of view. The
Winter Forum was designed as a response to the previous dearth of large-scale activities that
engage students (undergraduate and graduate) in a collaborative and collective intellectual
experience. Of particular importance was the development of an immersive learning experience
for subsets of students who may have difficulty traveling or studying abroad. Learning objectives
for the program indicate that students will be able to:
a) evaluate a global issue (the topic of a given year’s Winter Forum) from perspectives of
multiple disciplines;
b) evaluate a global challenge from multiple cultural perspectives;
c) engage in collaborative group work; and
d) relate the Winter Forum experience to classroom coursework and co-curricular
experiences.
For the 2014 Winter Forum, the pre- and post-program surveys, the pre- and post-program
knowledge tests, and observational data collected by assessment personnel were the main
vehicles by which the Office of Assessment collected information about students’ movement
toward the objectives. Students also submitted blog posts throughout the Winter Forum, based on
prompts written by the faculty coordinators. A mixture of quantitative and qualitative analysis
methods were utilized when appropriate and feasible.
Applicants and Attendees
As noted in the introduction, the Winter Forum was designed, in part, to attract undergraduates
who are less likely to travel abroad (e.g., athletes, engineers, and science majors), students who
seek to integrate a diversity of learning experiences abroad, and Duke’s international students
(undergraduate and graduate) who seek opportunities for intellectual engagement with U.S.
students.
The 2014 Winter Forum attracted 200 applicants, an increase from the 146 students who applied
for the 2013 Winter Forum. This increase may be the result of the 2014 Winter Forum returning
to Duke University's main campus after taking place at the Marine Lab in 2013. Tables 2 and 3
summarize the biodemographic breakdown of the 2014 applicant pool. Female students
are overrepresented in the cohort of applicants, as are Asian or Pacific Islander and African-American students, students who participated in the Focus Program, and students not receiving
financial aid. Conversely, male, Caucasian and Hispanic students, student athletes, non-focus
students, and financial aid recipients are underrepresented among applicants. A greater
proportion of Winter Forum applicants are international (13.5%) compared to that of the student
body overall.
Table 2. Summary of applicants to the 2014 Winter Forum and Trinity and Pratt Overall 2009-2013
                                        2014 Winter Forum applicants     Trinity and Pratt overall (09-13)¹
                                        N         %                      N         %
Overall                                 200       100.0%                 8800      100.0%
Female                                  128       64.0%                  4438      50.4%
Male                                    68        34.0%                  4362      49.6%
Unknown                                 4         2.0%                   0         0.0%
African-American                        32        16.0%                  946       10.8%
American Indian or Alaskan Native       1         0.5%                   71        0.8%
Asian or Pacific Islander               83        41.5%                  2248      25.5%
Caucasian                               57        28.5%                  4321      49.1%
Hispanic                                8         4.0%                   635       7.2%
Native Hawaiian                         1         0.5%                   9         0.1%
Other or not specified                  4         2.0%                   411       4.7%
Unknown                                 14        7.0%                   159       1.8%
Non-Athlete                             187       93.5%                  7923      90.0%
Tier 1 Scholarship                      1         0.5%                   156       1.8%
Tier 1 Non-Scholarship                  0         0.0%                   21        0.2%
Tier 2                                  3         1.5%                   441       5.0%
Tier 3                                  5         2.5%                   259       2.9%
Unknown                                 4         2.0%                   0         0.0%
First-year                              42        21.0%                  1752      19.9%
Sophomore                               56        28.0%                  1735      19.7%
Junior                                  57        28.5%                  1744      19.8%
Senior                                  38        19.0%                  1797      20.4%
Fifth-year (enrolled)                   1         0.5%                   1772      20.1%
Other/Unknown                           6         3.0%                   59        0.7%
Focus Student                           83        41.5%                  1749      19.9%
Non-Focus student                       113       56.5%                  7050      80.1%
Unknown                                 4         2.0%                   1         0.0%
Financial aid recipient                 85        42.5%                  6165      70.1%
Non-financial aid recipient             111       55.5%                  2635      29.9%
Unknown                                 4         2.0%                   0         0.0%
¹ Trinity and Pratt overall consists of Fall 2009 through Fall 2013 matriculates.
Table 3. Summary of applicants to the 2014 Winter Forum, by citizenship.
                     2014 Winter Forum applicants     Trinity and Pratt overall (09-13)¹
                     N         %                      N         %
Overall              200       100.0%                 8800      100.0%
All international    27        13.5%                  739       8.4%
Canada               1         0.5%                   72        0.8%
China                10        5.0%                   202       2.3%
India                2         1.0%                   34        0.4%
Nepal                1         0.5%                   4         0.0%
New Zealand          2         1.0%                   12        0.1%
Portugal             1         0.5%                   1         0.0%
Sierra Leone         1         0.5%                   1         0.0%
Singapore            3         1.5%                   31        0.4%
South Africa         1         0.5%                   5         0.1%
USA                  173       86.5%                  8061      91.6%
Vietnam              1         0.5%                   6         0.1%
Unknown (blank)      4         2.0%                   9         0.1%
¹ Trinity and Pratt overall consists of Fall 2009 through Fall 2013 matriculates.
The final selection criteria were not available to the OATC at the time of this report. However,
students who had not participated in a previous Winter Forum received priority at selection, as
did individuals who fit the profile of students who may not have access to or the ability to engage
in other types of international learning experiences. Initially, 108 students were accepted to the
program. A small amount of additional attrition occurred between October and early January,
resulting in a final participant cohort of 95. Tables 4 and 5 summarize the biodemographic
breakdown of these 95 participants. Women continue to be overrepresented in the Winter Forum
cohort, as are Asian students, students who participated in the Focus Program, and students not
receiving financial aid.
Table 4. Summary of 2014 Winter Forum participants.
                                        2014 Winter Forum participants   Trinity and Pratt overall (09-13)¹
                                        N         %                      N         %
Overall                                 95        100.0%                 8800      100.0%
Female                                  54        56.8%                  4438      50.4%
Male                                    39        41.1%                  4362      49.6%
Unknown                                 2         2.1%                   0         0.0%
African-American                        12        12.6%                  946       10.8%
American Indian or Alaskan Native       0         0.0%                   71        0.8%
Asian or Pacific Islander               32        33.7%                  2248      25.5%
Caucasian                               34        35.8%                  4321      49.1%
Hispanic                                6         6.3%                   635       7.2%
Native Hawaiian                         1         1.1%                   9         0.1%
Other or not specified                  3         3.2%                   411       4.7%
Unknown                                 7         7.4%                   159       1.8%
Non-athlete                             84        88.4%                  7923      90.0%
Tier 1 scholarship athlete              1         1.1%                   156       1.8%
Tier 1 Non-Scholarship athlete          0         0.0%                   21        0.2%
Tier 2 athlete                          4         4.2%                   441       5.0%
Tier 3 athlete                          4         4.2%                   259       2.9%
Unknown                                 2         2.1%                   0         0.0%
First-year                              15        15.8%                  1752      19.9%
Sophomore                               22        23.2%                  1735      19.7%
Junior                                  32        33.7%                  1744      19.8%
Senior                                  21        22.1%                  1797      20.4%
Fifth-year                              1         1.1%                   1772      20.1%
Other/Unknown                           4         4.2%                   59        0.7%
Focus Student                           33        34.7%                  1749      19.9%
Non-Focus student                       60        63.2%                  7050      80.1%
Unknown                                 2         2.1%                   1         0.0%
Financial aid recipient                 48        50.5%                  6165      70.1%
Non-financial aid recipient             45        47.4%                  2635      29.9%
Unknown                                 2         2.1%                   0         0.0%
¹ Trinity and Pratt overall consists of Fall 2009 through Fall 2013 matriculates.
Table 5. Summary of 2014 Winter Forum participants, by citizenship.
                     2014 Winter Forum participants   Trinity and Pratt overall (09-13)¹
                     N         %                      N         %
Overall              95        100.0%                 8800      100.0%
All international    9         9.5%                   739       8.4%
China                3         3.2%                   202       2.3%
India                2         2.1%                   34        0.4%
Nepal                1         1.1%                   4         0.0%
New Zealand          2         2.1%                   12        0.1%
USA                  84        88.4%                  8061      91.6%
Vietnam              1         1.1%                   6         0.1%
Unknown (blank)      2         2.1%                   9         0.1%
¹ Trinity and Pratt overall consists of Fall 2009 through Fall 2013 matriculates.
With respect to students’ first and second majors, majors are known only for participants who are
in their third or fourth years of study. The distribution of majors and minors is listed in Table 6.
Of these students, the most prevalent first majors are Public Policy Studies (33.9% of 59
participants with a declared major) and Biology (13.6%). Education was the most prevalent
declared minor (19.4%).
Table 6. Summary of 2014 Winter Forum participants, by major and minor.
           First major         Second major        First minor         Second minor
           N       %           N       %           N       %           N       %
AMES       0       0.0%        0       0.0%        2       5.6%        0       0.0%
ART        0       0.0%        0       0.0%        1       2.8%        0       0.0%
ARTV       0       0.0%        0       0.0%        0       0.0%        1       12.5%
BIO        8       13.6%       0       0.0%        0       0.0%        0       0.0%
BME        2       3.4%        0       0.0%        0       0.0%        0       0.0%
CA         1       1.7%        0       0.0%        2       5.6%        1       12.5%
CHEM       0       0.0%        0       0.0%        4       11.1%       1       12.5%
CLCZ       0       0.0%        0       0.0%        0       0.0%        1       12.5%
CLLA       1       1.7%        0       0.0%        0       0.0%        0       0.0%
CPS        1       1.7%        1       10.0%       0       0.0%        0       0.0%
ECE        2       3.4%        0       0.0%        0       0.0%        0       0.0%
ECON       2       3.4%        0       0.0%        3       8.3%        0       0.0%
EDUC       0       0.0%        0       0.0%        7       19.4%       0       0.0%
ENGL       1       1.7%        1       10.0%       1       2.8%        0       0.0%
ENVS       1       1.7%        0       0.0%        0       0.0%        0       0.0%
EOS        0       0.0%        1       10.0%       0       0.0%        0       0.0%
FREN       0       0.0%        1       10.0%       1       2.8%        0       0.0%
GER        0       0.0%        0       0.0%        1       2.8%        0       0.0%
GLHL       0       0.0%        2       20.0%       0       0.0%        0       0.0%
HIST       1       1.7%        1       10.0%       3       8.3%        0       0.0%
ICS        4       6.8%        1       10.0%       0       0.0%        1       12.5%
MATH       2       3.4%        1       10.0%       1       2.8%        0       0.0%
ME         2       3.4%        0       0.0%        0       0.0%        0       0.0%
MUS        0       0.0%        0       0.0%        1       2.8%        0       0.0%
NEUR       3       5.1%        0       0.0%        0       0.0%        0       0.0%
PHIL       0       0.0%        1       10.0%       2       5.6%        0       0.0%
POLI       3       5.1%        0       0.0%        2       5.6%        0       0.0%
PPS        20      33.9%       0       0.0%        0       0.0%        0       0.0%
PROG2      1       1.7%        0       0.0%        0       0.0%        0       0.0%
PSY        2       3.4%        0       0.0%        0       0.0%        3       37.5%
SOC        2       3.4%        0       0.0%        3       8.3%        0       0.0%
SPAN       0       0.0%        0       0.0%        2       5.6%        0       0.0%
Percentages are of Winter Forum participants with a declared major or minor of the corresponding type.
A subsequent section of this narrative details the results of the pre- and post-program knowledge
tests and surveys. One item on the pre-program survey provides useful information about
participants’ levels of engagement in issues related to education prior to the start of the program.
Item 8 asks, To what degree are you now actively involved in issues pertaining to education?
Of the 108 respondents to the pre-program survey, 2 (1.9%) indicated they are not involved, 29
(26.9%) indicated they are interested but not involved, 49 (45.5%) indicated they are moderately
involved, and 27 (25.0%) indicated they are highly involved. Movement towards QEP and
programmatic learning outcomes should be considered overall, as well as by students’ initial
levels of engagement in the forum topic as this may introduce a ceiling effect in some areas.
Pre- and post-program knowledge tests
To gauge one aspect of students’ learning gains following the Winter Forum program, the OATC
issued a 17-item knowledge test to participants prior to and immediately following the Winter
Forum. In general, the OATC observed an increase in the overall performance on the test,
though there is some variability in results by test question. The following summary explains the
methods by which data were collected and analyzed, interprets the results, and provides
recommendations for future iterations of the Winter Forum knowledge test.
Summary of the test
The knowledge test was designed by subject matter experts (i.e., faculty) on the Winter Forum
planning committee. The original 16 questions were made up of 13 multiple choice and 3 free
response/short answer questions. OATC staff split question 15 into two separate questions for the
purposes of scoring the knowledge test, as it was made up of two distinct questions. When scored,
these questions were referenced as 15A and 15B. The questions are detailed in Table 7 below
(Appendix 1, the scoring rubric for knowledge test open-ended items includes suggested
responses). The OATC added an Unsure option to minimize guessing and to distinguish true
unknown responses from incorrect responses. Correct responses to the multiple choice and
true/false items are bolded.
Table 7. Winter Forum 2014 Knowledge Test Questions
Item
1
Prompt
Students have a fundamental, constitutional right to education:
a) True in India but not in the US
b) True in India and true in the US
c) True in the US but not in India
d) Not true in India; not true in the US
2
The population of India is almost 1.3 billion (approximately 16% of the world’s population). What percentage of the world’s
scientific researchers are Indian citizens?
a) 2%
b) 10%
c) 45%
d) 56%
3
In 2010, roughly ______ % of India’s population lived on less than $1.25 (US) per day:
a) 10%
b) 22%
c) 33%
d) 45%
4
Which of the following is NOT considered an essential feature of a social innovation?
a) A social innovation must be novel.
b) A social innovation must be developed by a non-profit organization.
c) A social innovation must be an improvement over current approaches.
d) A social innovation must be sustainable.
5
According to the 2012 NSF Science and Engineering Indicators, which of the following statements is TRUE in regards to
student participation in advanced STEM courses?
a) More boys took advanced biology classes than girls
b) More girls took algebra II than boys
c) Boys are 10 times more likely to take engineering classes than girls
d) None of the above
6
Poor infrastructure at schools in India makes teaching even harder. According to the 2011 Annual Status of Education
Report:
a) 51% of schools did not have working toilets, while 17% of schools had no provision for drinking water
b) 24% of schools did not have working toilets, while 10% of schools had no provision for drinking water
c) 15% of schools did not have working toilets, while 5% of schools had no provision for drinking water
d) None of the above
7
Data from the 2012 US Census indicate that ______% of children under 18 are below poverty in the US:
a) 14.7%
b) 21.8%
c) 37.4%
d) 53.2%
8
According to the 2011 Census, the literacy rate for females in India is ______% in relation to 82% for males.
a) 40%
b) 55%
c) 65%
d) 75%
9
Which of the following is NOT true about an organization’s theory of change?
a) A theory of change evolves as the program develops.
b) A theory of change clearly communicates the link between the program and its potential impact.
c) The complexity of social problems limits the usefulness of a theory of change.
d) Some funders do not require their grantees to develop a theory of change.
10
Research indicates that culturally responsive teaching strategies to promote engagement and achievement among
underrepresented groups do which of the following:
a) connect academic content to students’ home and heritage
b) focus on students’ mastery of the basic academic skills
c) require instruction in English to promote access to the nation’s rich history and heritage
d) none of the above
11
Name four sociocultural factors that affect girls’ access to high-quality education in India.
a) ______________________________________________________________________________
b) ______________________________________________________________________________
c) ______________________________________________________________________________
d) ______________________________________________________________________________
12
Which of the following is true in regards to development of intercultural competence?
a) Cultural knowledge necessarily leads to cultural competence
b) Cultural contact necessarily leads to cultural competence
c) Cultural contact may lead to reduction of stereotypes
d) None of the above
13
In the US, at what level of governance are decisions made about what gets taught and tested in public schools, certification of
teachers and principals, and the minimum number of school days per year?
a) Individual school level
b) Local/district level
c) State level
d) Federal level
14
In the US, the Common Core State Standards (CCSS) have been adopted by 45 states and the District of Columbia as part of
“a state led effort [to establish] a single set of clear educational standards for kindergarten through 12th grade in English
language arts and mathematics.” In what way(s), if at all, do the CCSS incorporate expectations for students’ learning in
science?
a) Next Generation Science Standards have been created to align with CCSS English language arts standards
b) Next Generation Science Standards have been created to align with CCSS mathematics standards
c) Both a and b
d) No efforts have been made to align science standards with CCSS standards
15A
Name four characteristics of cultural competency and indicate why it is important when adopting and/or reforming
educational practices and pedagogies, particularly when supporting underrepresented students (200 words or less).
15B
Name four characteristics of cultural competency and indicate why [cultural competency] is important when adopting
and/or reforming educational practices and pedagogies, particularly when supporting underrepresented students
(200 words or less).
16
Discuss two similarities and two differences between education in the US and India. (200 words or less.)
In fall 2013, all students accepted to the program were emailed links to the knowledge test and
the student expectations survey via the program’s Sakai listserv. The OATC and personnel from
the Winter Forum planning committee monitored rates of completions and followed-up with
non-respondents at regular intervals. Submissions to the knowledge test were received at a
slower rate than those of the expectations survey. Overall, the OATC received pre-test
submissions from 101 students, of whom only 93 are among the final participant population of
95. Thus, the OATC calculates a pre-test return rate of 97.9%. The OATC received post-test
submissions from 58 students, or a return rate of 61.1%.
The results presented in this narrative are based on a sample from the final N of 95, the number
of students who completed the full Winter Forum program. Scoring the 13 multiple-choice and
true/false items was relatively straightforward, as correct answers were supplied by faculty on
the Winter Forum planning committee. Table 8 presents the distribution of responses, pre-test
and post-test.
Table 8. Distribution of responses (correct/incorrect) to knowledge test multiple-choice and true/false
items, pre-test and post-test.
                                                  Pre-test (N=45)                           Post-test (N=45)
Item (prompt abbreviated; see Table 7)            Correct     Incorrect    Unsure           Correct     Incorrect    Unsure
                                                  N     %     N     %      N     %          N     %     N     %      N     %
1   Constitutional right to education             6   13.3%   34   75.6%    5   11.1%       32   71.1%  13   28.9%    0    0.0%
2   Indian citizens among world's researchers    12   26.7%   16   35.6%   17   37.8%       32   71.1%  11   24.4%    2    4.4%
3   India's population living on <$1.25/day      11   24.4%   17   37.8%   17   37.8%       24   53.3%  17   37.8%    4    8.9%
4   Essential features of a social innovation    38   84.4%    6   13.3%    1    2.2%       40   88.9%   4    8.9%    1    2.2%
5   NSF indicators on advanced STEM courses       1    2.2%   33   73.3%   11   24.4%        0    0.0%  40   88.9%    5   11.1%
6   School infrastructure in India               12   26.7%   14   31.1%   19   42.2%       33   73.3%  11   24.4%    1    2.2%
7   US children under 18 below poverty           24   53.3%   13   28.9%    8   17.8%       28   62.2%  13   28.9%    4    8.9%
8   Female literacy rate in India                 8   17.8%   22   48.9%   15   33.3%       30   66.7%  13   28.9%    2    4.4%
9   Organization's theory of change              18   40.0%   13   28.9%   14   31.1%       24   53.3%  18   40.0%    3    6.7%
10  Culturally responsive teaching strategies    31   68.9%    5   11.1%    9   20.0%       36   80.0%   7   15.6%    2    4.4%
12  Development of intercultural competence      37   82.2%    4    8.9%    4    8.9%       41   91.1%   4    8.9%    0    0.0%
13  Governance of US public school decisions     35   77.8%    8   17.8%    2    4.4%       41   91.1%   4    8.9%    0    0.0%
14  CCSS and science standards                   13   28.9%    8   17.8%   24   53.3%       20   44.4%  18   40.0%    7   15.6%
The OATC observes learning gains in many of the knowledge areas addressed on this test.
Correct responses to item 1 jump from 13.3% of all responses at the pre-test to 71.1% of all
responses at the post-test. Similarly large increases occur for item 2 (26.7% to 71.1%), item 6
(26.7% to 73.3%), and item 8 (17.8% to 66.7%). Item 3 (24.4% to 53.3%), item 7 (53.3% to
62.2%), item 9 (40.0% to 53.3%), item 10 (68.9% to 80.0%), item 12 (82.2% to 91.1%), item 13
(77.8% to 91.1%), and item 14 (28.9% to 44.4%), all saw increases in correct responses as well.
Item 4 showed similar rates of correct responses at pre-test and post-test (84.4% to 88.9%). The
only item that saw a decrease in correct responses was item 5. Only 1 student responded
correctly to item 5 at pre-test, and no one answered correctly at post-test. The low scores for item
5 could indicate confusion over the topic or misunderstanding of the test item.
Four items on the knowledge test required short answer responses. The faculty members on the
Winter Forum planning committee supplied example responses for each item, from which the
OATC developed a 5-point scoring rubric, assessing students’ understanding of the material on a
continuum from Not able to be assessed to Exceeds satisfaction. The rubric is included in
Appendix 1.
The OATC randomly sampled 45 submissions to pre-test and 45 submissions to the post-test,
double-scoring just over one-third of those tests to establish rater concordance and reliability.
Table 9 presents the number of submissions scored out of the total number of tests received at
each administration. Also note that 4 tests (2 pre-tests and 2 post-tests) were used for rater
training and calibration. Those results were excluded from the final analysis.
Table 9. Summary of tests scored.
                   N submissions    Tests scored                N tests single-scored    N tests double-scored
Rater training     –                2 from pre, 2 from post     N/A                      N/A
Pre-test           101              45                          30                       15
Post-test          58               45                          30                       15
Total              159              90                          60                       30
Note: the four tests used for rater training and calibration were excluded from the final results.
Table 10 shows the degree to which the multiple raters agreed on the scores to be assigned to
individual items on the double-scored tests. Agreement was defined as no difference on scores
between raters. Aggregating both administrations (pre-test and post-test), the table demonstrates
a high degree of concordance. Of the 30 tests double-scored (15 pre-test and 15 post-test),
representing a total of 120 scores assigned (30 tests x 4 free-response items each), the raters were
in agreement on 78.3% and 85.0% of all responses, pre-test and post-test, respectively. This
rate varied slightly by item, as shown in Table 10.
Table 10. Summary of Rater Concordance (Pre-test and Post-test combined).
Rater Concordance²                              Overall    Q11      Q15A     Q15B     Q16
All knowledge test items double-scored, PRE     60         15       15       15       15
    No difference                               47         13       11       11       12
    Difference of 1¹                            10         1        4        2        3
    Difference of 2¹                            3          1        0        2        0
Pre-test percentage of agreement                78.3%      86.7%    73.3%    73.3%    80.0%

All knowledge test items double-scored, POST    60         15       15       15       15
    No difference                               51         13       13       13       12
    Difference of 1¹                            6          2        1        2        1
    Difference of 2¹                            3          0        1        0        2
Post-test percentage of agreement               85.0%      86.7%    86.7%    86.7%    80.0%

¹ In cases where double-scored tests had a difference of more than 1 on an item, the item was evaluated and scored by a third rater.
² The table presents the overall number of test items scored (90), not the number of student submissions (159).
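For readers who wish to reproduce this calculation, the short sketch below (illustrative only, not the OATC's actual analysis code) shows how exact-agreement percentages of the kind reported in Table 10 can be computed; the scores in the example are hypothetical.

```python
# Illustrative sketch: exact-agreement rates between two raters, where
# agreement is defined as no difference in assigned scores. Data are hypothetical.
from collections import defaultdict

def agreement_rates(double_scored):
    """double_scored: list of dicts like {"item": "Q11", "rater1": 2, "rater2": 2}."""
    agree = defaultdict(int)
    total = defaultdict(int)
    for record in double_scored:
        item = record["item"]
        total[item] += 1
        total["Overall"] += 1
        if record["rater1"] == record["rater2"]:
            agree[item] += 1
            agree["Overall"] += 1
    return {key: 100.0 * agree[key] / total[key] for key in total}

# Hypothetical example: two double-scored responses to Q11.
example = [
    {"item": "Q11", "rater1": 1, "rater2": 1},  # exact agreement
    {"item": "Q11", "rater1": 2, "rater2": 1},  # difference of 1 -> not counted as agreement
]
print(agreement_rates(example))  # {'Q11': 50.0, 'Overall': 50.0}
```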
Table 11 presents the distribution of scores, by test item, for the pre-test and post-test,
respectively. A score of zero was assigned to those responses that were designated Not able to be
assessed. Responses which were given a zero included those that were missing or
incomprehensible due to grammar, spelling, structure or language choice. This also included
responses such as “I don’t know” or “unsure”. Prior to the program, students’ overall level of
knowledge centered on Needs a lot of improvement, with some variation across test items. The
highest levels of knowledge, prior to the program, were observed among item 11 (Girls’ access
to quality education) and item 16 (Education in US/India) with 82.2% and 71.1% receiving final
ratings greater than Not able to be assessed, respectively. Students had a harder time responding
to items 15A (Cultural Competency) and 15B (Cultural Competency Importance). Only 53.3%
and 35.6% received ratings higher than Not able to be assessed, respectively. None of the
answers scored at pre-test received a rating of Exceeds expectations.
At the end of the program, students’ levels of knowledge continued to center around Needs a lot
of improvement, though at a slightly higher overall level. Compared to the 3 item ratings (1.7%)
receiving Satisfactory or Exceeds satisfaction at the pre-test, 21 item ratings (11.7%) received
Satisfactory or Exceeds satisfaction at the post-test. Again, there was some variation across test
items. Please see Table 11 for additional detail.
Table 11. Summary of results from Knowledge Pre-Test and Knowledge Post-Test
Final scores: PRE                               Overall    Q11      Q15A     Q15B     Q16
Not provided or not able to be assessed (0)¹    71         8        21       29       13
  Percent of total for question                 39.4%      17.8%    46.7%    64.4%    28.9%
Needs a lot of improvement (1)                  75         26       17       9        23
  Percent of total for question                 41.7%      57.8%    37.8%    20.0%    51.1%
Needs some improvement (2)                      31         11       5        7        8
  Percent of total for question                 17.2%      24.4%    11.1%    15.6%    17.8%
Satisfactory (3)                                3          0        2        0        1
  Percent of total for question                 1.7%       0.0%     4.4%     0.0%     2.2%
Exceeds expectations (4)                        0          0        0        0        0
  Percent of total for question                 0.0%       0.0%     0.0%     0.0%     0.0%
Total                                           180        45       45       45       45
Mean                                            0.81       1.07     0.73     0.51     0.93

Final scores: POST                              Overall    Q11      Q15A     Q15B     Q16
Not provided or not able to be assessed (0)¹    44         6        12       19       7
  Percent of total for question                 24.4%      13.3%    26.7%    42.2%    15.6%
Needs a lot of improvement (1)                  55         19       13       10       13
  Percent of total for question                 30.6%      42.2%    28.9%    22.2%    28.9%
Needs some improvement (2)                      60         16       16       12       16
  Percent of total for question                 33.3%      35.6%    35.6%    26.7%    35.6%
Satisfactory (3)                                20         4        3        4        9
  Percent of total for question                 11.1%      8.9%     6.7%     8.9%     20.0%
Exceeds expectations (4)                        1          0        1        0        0
  Percent of total for question                 0.6%       0.0%     2.2%     0.0%     0.0%
Total                                           180        45       45       45       45
Mean                                            1.33       1.40     1.29     1.02     1.60

¹ Responses which were given a zero included those that were missing or incomprehensible due to grammar, spelling, structure, or language choice. This also included responses such as "I don't know" or "unsure".
The OATC also created an index of overall performance across the multiple-choice, true/false,
and free-response items. The OATC assigned a maximum of 17 points (one point for each item
on the knowledge test, with question 15 split into two separate questions; 15A and 15B),
representing 4 open-ended items and 13 multiple choice items. Open-ended items which were
rated on the five-point scale were converted from a range of 0-4 to a range of 0-1. Those
responses earning a 3 or 4 on the rubric – equal to Satisfactory or Exceeds expectations – were
awarded a 1, and those responses that received a 0, 1 or 2 – equal to Missing or Unable to score,
Needs a lot of improvement, or Needs some improvement – were awarded a 0. Multiple-choice items
were issued 1 point for a correct answer and 0 points for any incorrect answer. The number of
tests in this analysis (N=45) is based on fully complete submissions. For such cases to be
included, open-ended items had to be fully complete so that scores could be assigned to all open-ended items. Table 12 presents the results of those composite global scores. Overall, we see a
clear numerical increase from pre- to post-test, in terms of mean scores.
Table 12. Composite global scores, comparing knowledge pre-test and post-test.
             N tests    Mean    SD      Median
Pre-test     45         5.53    1.82    6
Post-test    45         8.93    1.92    9
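The sketch below (illustrative only, not the OATC's scoring code) shows one way the composite index described above could be computed; the example submission is hypothetical.

```python
# Minimal sketch of the composite global score: multiple-choice items earn 1 point
# when correct; open-ended rubric ratings of 3-4 earn 1 point, ratings of 0-2 earn 0.
# Maximum possible score is 17. Item lists follow Table 7; the example data are hypothetical.
MC_ITEMS = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "10", "12", "13", "14"]
OPEN_ITEMS = ["11", "15A", "15B", "16"]

def composite_score(mc_correct, rubric_scores):
    """mc_correct: dict item -> bool (answered correctly).
       rubric_scores: dict item -> int rubric rating on the 0-4 scale."""
    score = sum(1 for item in MC_ITEMS if mc_correct.get(item, False))
    score += sum(1 for item in OPEN_ITEMS if rubric_scores.get(item, 0) >= 3)
    return score

# Hypothetical submission: 8 correct multiple-choice items and one open-ended
# item rated Satisfactory, giving a composite score of 9.
example_mc = {item: item in {"1", "2", "4", "7", "10", "12", "13", "14"} for item in MC_ITEMS}
example_rubric = {"11": 2, "15A": 1, "15B": 0, "16": 3}
print(composite_score(example_mc, example_rubric))  # 9
```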
Blog Posts Analysis
The OATC reviewed participant responses to the blog post prompts developed by the 2014
Winter Forum coordinators. Prompts are outlined in Table 13. Prompts were structured to
encourage and guide student thinking and reflection. The prompts build upon the themes
of the day as well as upon the previous day's prompts. This scaffolding of the blog post
prompts helps encourage and guide participants' reflection – and provides an
additional vehicle for capturing evidence of student learning.
Table 13. List of Blog Post Prompts
Blog Post
Prompts
Post 1
• What is the scope of the problem that you are addressing?
• What are the potential benefits of solving this problem?
• How does the problem differ between the U.S. and India?
• Which location (U.S. or India) are you focusing on and why?
• How is your innovation related to STEM education?
Post 2
• What are the key components of your approach?
• What is innovative about it? How is it a better approach than previous efforts?
• What outcomes do you expect if your approach is successful?
• What additional information do you need to further develop your ideas?
Post 3
• What are the important elements of the cultural context that you are considering as you
develop your approach?
• How does your issue differ between the U.S. and India? Will some aspects of your
approach work in both locations?
• What are the next steps in developing your innovation beyond Winter Forum?
To get a general sense of the prominent words and themes in response to each blog post prompt,
the OATC created a word cloud for each of the sets of responses to blog posts. Word clouds are
a clear visual representation of the selected source text, with more frequent words appearing
larger and bolder in the cloud. The OATC generated the following word clouds, presented in
Figures 1 through 3, using Wordle.net. By default, the word clouds present the 150 most
frequent words from the source text, excluding common conjunctions, prepositions, and
pronouns (e.g., and, of, and we). Figure 1 represents Blog Post 1 aggregate responses, Figure 2
represents Blog Post 2 aggregate responses, and Figure 3 represents Blog Post 3 aggregate
responses.
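The figures in this report were generated with Wordle.net; purely as an illustration, the sketch below shows how comparable clouds could be produced in Python using the third-party wordcloud package (an assumed substitute, not the tool used here), with a hypothetical input file.

```python
# Illustrative sketch only: the report's figures were generated with Wordle.net.
# This shows one way similar clouds could be produced with the third-party
# `wordcloud` package (pip install wordcloud). The input file name is hypothetical.
from wordcloud import WordCloud, STOPWORDS

with open("blog_post_1_responses.txt", encoding="utf-8") as f:
    text = f.read()

cloud = WordCloud(
    max_words=150,          # mirror the 150-word limit noted above
    stopwords=STOPWORDS,    # drop common conjunctions, prepositions, and pronouns
    width=800,
    height=400,
    background_color="white",
).generate(text)

cloud.to_file("blog_post_1_wordcloud.png")
```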
Figure 1. Word Cloud for Blog Post 1 Responses
The prompts in blog post 1 encouraged participants to think about issues in education and what
problem they hoped to address during the course of the Winter Forum. Beyond the word
students, which is used in abundance, respondents tended to mention words like STEM, India,
Summer, Community, Education and Gap as well as, to a lesser extent, words like
socioeconomic, bridging, curiosity, Indian and achievement. A general review of the full text
posts submitted by the different teams expands on what we see in the word clouds – participant
responses were generally related to reducing education inequality between socioeconomic groups
in both the US and India.
Figure 2. Word Cloud for Blog Post 2 Responses
Review of the word cloud for blog post 2 (Figure 2) reveals the word students to again be the
most commonly used word, while words like Community, School, STEM, Teachers and Program
were also frequently used. Other commonly used words included children, science, younger,
projects, education, girls, Durham, India and future. On the second day, when teams submitted
responses to the second set of blog post prompts, commonly used words begin to reflect
teams' attempts to distinguish and differentiate their projects from others. A closer look at
the full-text responses shows that many of the teams identified targeting young students,
female students, and students of low socioeconomic status as primary ways to make
their projects unique.
Figure 3. Word Cloud for Blog Post 3 Responses
The final set of blog post prompts encouraged participants to consider the cultural context for their
project as well as similarities and differences in implementing their initiative in the United States
versus India. The word cloud in Figure 3 shows that the commonly used words – Students, program,
India, School, Community, Education, STEM, and teachers – are generally reflective of these
topics. It is difficult to observe more specific trends in the third blog post due to the increasing
variability in participant responses – reflective of the unique projects and ideas that each team
had developed over the previous two days.
The blog posts served as an opportunity for faculty coordinators of the Winter Forum to
encourage and guide reflection among the participants. The careful creation of writing prompts
that pull from the lessons and experiences of the day – and that build upon one another over
the course of the program – helps students process new knowledge and develop a better
understanding of the material presented at Winter Forum. The analysis of blog posts in this study
is limited, though it does provide an opportunity to gather artifacts on the students’ learning over
the course of the program. Additional analysis could be possible in the future. In the case of the
2014 Winter Forum the blog posts served as one of many tools to encourage student
development throughout the program.
Pre- and post-program dispositions and expectations surveys
The OATC administered pre- and post-program surveys, assessing participants’ dispositions
towards - and expectations of - the program as well as their experiences overall. Both versions
of the instrument asked participants to evaluate expectations of the learning experience with
regard to current competencies central to the QEP learning outcomes, outcomes specific to the
2014 Winter Forum, and established Trinity College learning objectives. The pre-test also asked
about Winter Forum recruitment, dietary, and accessibility issues. The post-test included items
assessing various programmatic and logistical elements such as the physical space and
effectiveness of communication with students.
Appendices 2 and 3 list the survey results for the pre-test and post-test, respectively. In both
cases, only submissions from students who were among the final 108 selected participants were
included in the reports. The pre-program survey N of 108 represents 100% of program
participants, whereas the post-program survey N of 63 represents 58.3% of program participants.
As observed across past Winter Forum assessment results, motivation and incentive to complete
post-program tasks declined after the conclusion of the program.
Summary of the pre-program survey
On the pre-program survey, where 1 represents Very low ability and 5 represents Very high
ability, participants reported moderate to high abilities in terms of the QEP outcomes. These
include Evaluating global issues from a multi-disciplinary perspective (mean of 3.57),
Evaluating global issues from a multi-cultural perspective (mean of 3.59), Evaluating global
issues from a multi-geographical perspective (mean of 3.13), Evaluating global issues from a
historical perspective (mean of 3.09), Thinking critically about the relationship between science
and public policy (mean of 3.32), Relating what you know about the 2014 Winter Forum topic to
future coursework and co-curricular experiences (mean of 3.89), and Engaging in collaborative
group work centered on the 2014 Winter Forum topic (mean of 4.11). According to program
participants, making learning gains in each of these areas rated between Slightly important and
Very important. Learning to evaluate global issues from a multi-cultural perspective had the
highest average rating (mean of 4.41) followed by Learning to engage in collaborative group
work centered on the 2014 Winter Forum topic (mean of 4.39), where 1 represents Not at all
important and 5 represents Very important.
With respect to the Trinity College learning objectives, most participants rated their expectations
between Moderately High and Very High, where 1 represents Very low expectation and 5
represents Very high expectation. Based on mean ratings, the highest expectations were
attributed to Learning to apply knowledge, concepts, principles, or theories to a specific situation
or problem (mean of 4.38), Learning to integrate and synthesize knowledge (mean of 4.29),
Developing the ability to work as part of a team (mean of 4.23), and Learning to analyze ideas,
arguments and points of view (mean of 4.22). On average, participants had the lowest overall
expectation of Learning to work independently (mean of 3.09).
Participants overall have moderate to very high expectations of the opportunities afforded by the
Winter Forum. On a scale of 1 to 5, where 1 represents Very low expectation and 5 represents
Very high expectation, participants expected an environment that encouraged open and honest
exchange of ideas from multiple disciplinary, cultural and geographic perspectives (mean of
4.31) and exposure to cutting edge ideas and theory via scholars and experts in the field (mean
of 4.16). Respondents had moderate expectations of gaining information to clarify education
goals and objectives (mean of 3.91), clarify career goals and objectives (mean of 3.81), to
integrate into daily life (mean of 3.63), and to integrate in other courses and course work (3.94).
Also on the pre-program survey, participants indicated the degree to which they were actively
involved in issues pertaining to education. Most of the participants reported being moderately or
highly involved in issues related to education, while others were interested but not involved
(Appendix 2). Participants also indicated interest in various academic disciplines. Participants
rated their interest from 1 to 4, where 1 represents No interest and 4 represents Substantial
interest. The highest levels of interest reported on the pre-program survey were in Education
(mean of 3.67), Global Cultural Studies (mean of 3.14), Public Policy (mean of 3.14), and
Psychology (mean of 3.02). When asked on a scale of 1 to 5, where 1 is Very unlikely and 5 is
Very likely, how likely the experience in the 2014 Winter Forum would be to influence active
engagement in issues related to education, participants indicated a moderate to high degree of
expectation (mean of 4.04) that the event would encourage involvement.
Item 11 invited participants to share any comments about their expectations of the 2014 Winter
Forum. These comments generally discussed excitement for the upcoming event and an interest
in learning more about issues in education. All comments can be found in Appendix 2.
Summary of the post-program survey
To estimate changes in attitudes, dispositions, or competencies over time, similar questions were
asked on the post-program survey. For the post-program survey, participants again reported
moderate to high abilities in terms of the QEP outcomes, where 1 represents Very low ability
and 5 represents Very high ability. These include Evaluating global issues from a multi-disciplinary perspective (mean of 4.02), Evaluating global issues from a multi-cultural
perspective (mean of 3.97), Evaluating global issues from a multi-geographical perspective
(mean of 3.95), Evaluating global issues from a historical perspective (mean of 3.46), Thinking
critically about the relationship between science and public policy (mean of 4.06), Relating what
you know about the 2014 Winter Forum topic to future coursework and co-curricular
experiences (mean of 4.33), and Engaging in collaborative group work centered on the 2014
Winter Forum topic (mean of 4.44). These values reflect gains of between 0.33 (Ability to
engage in collaborative work) and 0.82 (Ability to evaluate global issues from a multi-geographical perspective), on a 5-point scale where 1 represents Very low ability and 5
represents Very high ability.
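As an illustration of how these gains are derived, the brief sketch below (not the OATC's analysis code) subtracts each item's pre-program mean from its post-program mean; only the collaborative group work item, whose means appear above, is shown.

```python
# Minimal sketch: pre-to-post gain for a survey item is the post-program mean
# minus the pre-program mean. The two means shown are taken from the report.
pre_means = {"Engaging in collaborative group work": 4.11}
post_means = {"Engaging in collaborative group work": 4.44}

gains = {item: round(post_means[item] - pre_means[item], 2) for item in pre_means}
print(gains)  # {'Engaging in collaborative group work': 0.33}
```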
Regarding participants’ appraisals of movement towards the Trinity College learning objectives,
results were more mixed. On average, modest gains were reported for some of the learning
objectives including: Gained factual knowledge (post-test mean of 3.81, gain of 0.07), Learn to
analyze ideas, arguments, and points of view (post-test mean of 4.27, gain of 0.05), Learn to
evaluate the merits of ideas and competing claims (post-test mean of 4.08, gain of 0.04), Develop
my ability to work as part of a team (post-test mean of 4.27, gain of 0.04), and Develop my
ability to take a stand even when others disagree (post-test mean of 4.00, gain of 0.14). For the
remaining items, post-test mean results were below participants’ expectations of the program.
Both sets, pre-test and post-test, were rated on a 5-point scale. Negative differences are observed
among Develop my ability to work through ethical problems in science and public policy (post-test mean of 3.94, difference of -0.25), Develop my ability to work independently (post-test mean
of 2.87, difference of -0.22), Enhance my writing skills (post-test mean of 3.29, difference of -0.17), Develop my ability to recognize ethical problems in science and public policy (post-test
mean of 3.98, difference of -0.15), Learn to apply knowledge, concepts, principles, or theories to
a specific situation (post-test mean of 4.25, difference of -0.13), Enhance my critical thinking
skills (post-test mean of 4.08, difference of -0.09), Learned to integrate and synthesize
knowledge (post-test mean of 4.21, difference of -0.08), Enhance my speaking skills (post-test
mean of 3.81, difference of -0.05), and Understood fundamental concepts and principles (post-test mean of 3.92, difference of -0.03). These declines may be related to higher than anticipated
student expectations observed in the pre-program survey.
With respect to the opportunities afforded by the Winter Forum, students’ experiences by the end
of the program compare favorably with their expectations prior to the program. Expectations
remained relatively constant with the greatest positive differences observed among: Information
that I will integrate in my daily life (post-test mean of 3.73, difference of 0.10), An environment
that encourages open and honest exchange of ideas from multiple disciplinary, cultural, and
geographic perspectives (post-test mean of 4.40, difference of 0.09), and Exposure to cutting
edge ideas and theory via scholars and experts in the field (post-test mean of 4.23, difference of
0.07). The greatest negative difference was observed for Information that will help clarify my
education goals and objectives (post-test mean of 3.79, difference of -0.12). This decline may
again be related to higher than anticipated student expectations for this opportunity.
Participants’ feedback on the following program components’ contributions to their learning
gains varied. On a scale of 1 to 5, where 1 represents Not at all and 5 represents Very highly,
participants on average rated the Interaction with experts in the field highest (mean of 4.54),
followed by Formal presentations by Winter Forum speakers (mean of 4.48), Interaction with
faculty coaches (mean of 4.40), and Intellectual debate (mean of 4.35). Rated lowest was the
development of group blog posts (mean of 2.44). Compared to the 75.9% of respondents who
thought it likely or very likely the Winter Forum would influence them to become more involved
in education, at the end of the program only 46.0% indicated that they agreed or strongly agreed
with the statement, My involvement in the Winter Forum has prompted me to pursue further
involvement in education.
Items 6, 12, and 13 invited respondents to comment on their intentions to apply or not apply for
future Winter Forum programs, the Winter Forum experience generally, and any changes they
would recommend for future Winter Forum events, respectively. Comments pertaining to
participants’ intentions to apply or not apply to future Winter Forum programs were largely
positive, indicating that those who are not graduating will likely apply for future programs. General
comments on Winter Forum can be characterized as positive and constructive. Students indicated
they would have enjoyed more time to work in their groups. Future recommendations were also
mostly constructive and positive. All participant responses to items 6, 12 and 13 can be found in
Appendix 3.
Analysis of Observational Data Gathered by Assessment Personnel
To enable additional assessment of students’ learning gains, personnel from the OATC and the
Provost’s Office observed participant engagement during presentations, team meetings and final
group presentations. Overall, the OATC observed movement toward higher-level thinking and
discussion through the course of the Winter Forum program. The following summary explains
the methods by which the observational data were collected and analyzed, interprets the results,
and provides recommendations for future observational assessment at the Winter Forum.
Method
Observational data were collected by OATC personnel over the course of the Winter Forum.
Movement towards QEP outcomes was measured using Bloom’s Taxonomy as a developmental
scale. Bloom’s taxonomy includes the following levels and definitions, beginning with
Knowledge and progressing through to Evaluation:
Table 14. Bloom’s Taxonomy and Definitions
Level             Bloom's Definition
Knowledge         Remember previously learned information.
Comprehension     Demonstrate an understanding of the facts.
Application       Apply knowledge to actual situations.
Analysis          Break down objects or ideas into simpler parts and find evidence to support generalizations.
Synthesis         Compile component ideas into a new whole or propose alternative solutions.
Evaluation        Make and defend judgments based on internal evidence or external criteria.
OATC team members were present during all whole-group activities and rotated between groups during breakout sessions and team meetings. During these breakout sessions, observer assignments were randomized across teams and across times within the session (early, middle, and end).
Analysis of Observational Data – QEP Outcomes
Observational data indicate movement towards QEP outcomes, though the degree of movement varies across these outcomes. The observational data showed movement from the knowledge level to, at its furthest, synthesis. Data from individual team observations are available in Appendix 5. Some of the most compelling evidence of positive movement can be seen in Figures 5, 6 and 7, which show agreement between OATC personnel as they submitted data throughout the course of the Winter Forum. Coding for the events in these figures can be found in Table 15. The gains observed here are notable, as OATC personnel observed increased comprehension even though the timeframe was only 2.5 days.
Table 15. Event Key for QEP Observational Agreement
Code  Event
1   Sun. 1:00 PM Welcome and Overview
2   Sun. 1:00 PM Introduction
3   Sun. 1:00 PM Innovation and Education: U.S. and Global Efforts - BEGINNING
4   Sun. 1:00 PM Innovation and Education: U.S. and Global Efforts - END
5   Sun. 2:00 PM Education in India - BEGINNING
6   Sun. 2:00 PM Education in India - END
7   Sun. 2:00 PM Education in the U.S. and North Carolina - BEGINNING
8   Sun. 2:00 PM Education in the U.S. and North Carolina - END
9   Sun. 3:30 PM Researching the Scope of the Problem - BEGINNING
10  Sun. 3:30 PM Researching the Scope of the Problem - MIDDLE
11  Sun. 3:30 PM Researching the Scope of the Problem - END
12  Sun. 5:00 PM Interactive workshop on cultural conditioning - BEGINNING
13  Sun. 5:00 PM Interactive workshop on cultural conditioning - MIDDLE
14  Sun. 5:00 PM Interactive workshop on cultural conditioning - END
15  Mon. 9:00 AM Schools in the United States; Benjy Downing
16  Mon. 10:30 AM Presenting your ideas
17  Mon. 11:30 AM Understanding the context, teams create plan for the day - BEGINNING
18  Mon. 11:30 AM Understanding the context, teams create plan for the day - MIDDLE
19  Mon. 11:30 AM Understanding the context, teams create plan for the day - END
20  Mon. 12:30 PM Education and Innovation
21  Mon. 2:00 PM Innovation and Education: Overview
22  Mon. 3:20 PM Team research and interviews with experts - BEGINNING
23  Mon. 3:20 PM Team research and interviews with experts - MIDDLE
24  Mon. 3:20 PM Team research and interviews with experts - END
25  Mon. 5:45 PM Team Research - BEGINNING
26  Mon. 5:45 PM Team Research - MIDDLE
27  Mon. 5:45 PM Team Research - END
28  Mon. 7:45 PM Teams work on presentations
29  Tues. 9:00 AM Team Presentations - BEGINNING
30  Tues. 9:00 AM Team Presentations - MIDDLE
31  Tues. 9:00 AM Team Presentations - END
Figure 5. Observational Agreement – Ability to evaluate global issue from perspective of multiple disciplines
Figure 5 shows agreement between OATC personnel as they observed participants’ ability to evaluate global issues from the perspective of multiple disciplines. There is a slight increase towards the end of the program, though participants’ observed abilities hovered around the comprehension level. The Winter Forum curriculum did focus on STEM education, but it was not apparent that students conceptualized the educational process from multiple disciplines. The limited movement with respect to this particular QEP outcome is not surprising given the nature of the 2014 Winter Forum and its limited focus on exploring education through different disciplines.
26
Figure 6. Observational Agreement – Ability to evaluate a global issue from multiple perspectives
OATC personnel observed significant increases in students’ abilities to evaluate a global issue from multiple cultural, geographical and historical perspectives. Participants’ increased ability in this area reflects the 2014 Winter Forum curriculum, which focused largely on education in India and on the similarities and differences between the cultures and histories of the United States’ and India’s education systems. The 2014 Winter Forum was designed in a way in which we would expect to see movement among participants for this particular QEP outcome, and the observational data collected do show that students increased their ability to evaluate these global issues through multiple perspectives.
The upward trend visible in Figure 6 is compelling evidence of participants’ overall movement towards this particular QEP outcome. Increases from the knowledge and comprehension stages to application, analysis and even synthesis tend to occur at approximately the midpoint of the event. Given the short time frame in which the Winter Forum operates, this sort of movement is strong evidence of the program’s successful design and execution. Raters discussed scoring both before and during the event to ensure general reliability. Scorers tended to be fairly consistent with their own scoring, and each observed gains over the course of the Winter Forum. Similar upward trends are also observable in Figure 7.
Figure 7. Observational Agreement – Ability to engage in collaborative group work centered around a global issue
Participants’ ability to engage in collaborative group work centered around a global issue increased over the course of the 2014 Winter Forum, and their ability to work together effectively was demonstrated early in the event. Students began working with their teams on the first day, and their cooperation and collaboration only improved as the program progressed. Faculty mentors and facilitators certainly played a role in helping to bring students together, but collaboration was also encouraged by the required presentation at the end of the program: participants needed to work together to have a viable product and presentation ready to show at the conclusion of the Winter Forum. This is corroborated in the observational data reported in Figure 7.
Summary and recommendations
Summary
Overall, the pedagogy of the 2014 Winter Forum was successful. The scaffolding of the Winter Forum curriculum and its alignment with QEP learning outcomes provided a meaningful and well-designed opportunity for students, faculty and other participants to share, discuss, and develop ideas and knowledge pertaining to the field of education. The Winter Forum was perceived by participants to be an engaging and beneficial experience that was unlike other opportunities available at Duke. The majority of participants felt that the program provided an excellent chance to engage with faculty and other students in an environment that would otherwise have been unavailable. Gains were generally observed across all participants with respect to the QEP learning outcomes, and the goals of the Winter Forum were generally met.
Clear and intentional mapping of the Winter Forum curriculum to the intended learning outcomes provided a supportive environment in which participants were able to grow, develop and demonstrate their learning gains, which also made the expected learning outcomes more apparent to OATC personnel observing on site. Positive movement among participants towards the intended learning outcomes was corroborated by the increased scores between the pre-program and post-program knowledge tests as well as by the results of the pre- and post-program surveys. The final presentations by each of the Winter Forum teams served as a good opportunity to capture structured, end-of-program observations of students’ mastery of the Winter Forum topic.
Recommendations
Participants’ suggestions for programmatic improvements are listed in Appendix 3 (Question 12). Students expressed overwhelming support for the 2014 Winter Forum, with minimal suggestions for changes and improvements. The most prominent student recommendations centered on the competitive nature of the team projects and the time allotted to develop these projects. Based on the feedback provided, students would be more comfortable knowing up front (pre-Winter Forum) that the team projects would be competitive in nature. Additionally, many students commented on the limited amount of scheduled time to work with their teammates to develop their projects. The competitive nature of the team projects in the 2014 Winter Forum did help to motivate students to push themselves during the program, though some students expressed concern about the number of late nights right before the start of the academic term. Given the current timeframe for the Winter Forum this may be an unavoidable concern, though it is something to consider in future years.
Several respondents indicated that back-to-back presentations could be taxing, though, given the amount of content, they may be a necessity. Including break-out sessions or additional team time between presentations may be a way to increase focus during presentations while also providing the additional team time that respondents desired. With respect to program logistics, students were generally split between a willingness to begin the Winter Forum earlier in order to fit in more material and a preference for maintaining the current pre-semester time frame.
With regard to the continuous enhancement of the Winter Forum’s assessment plan, the OATC recommends maintaining several key assessment tools for future Winter Forums: group observations, the pre- and post-program knowledge test, and the pre- and post-program survey, with additional direct measures (including student/group blogging) when possible. The final presentations required of each team served as a meaningful capstone experience that allowed participants to demonstrate their understanding of the subject matter at the conclusion of the program, and the need to produce a final proposal helped motivate students throughout the Winter Forum. While the competitive nature of the final presentations may be left to the discretion of the faculty coordinators, the OATC would strongly encourage future organizers to require a final presentation/proposal to serve as a capstone experience for Winter Forum participants. The need to develop a final presentation for their faculty and peers motivates participants in a manner that is difficult to replicate.
The development of program-level learning outcomes would provide additional insight into the intended movement of students participating in the Winter Forum, allowing assessment personnel to explore student movement beyond the QEP learning outcomes. The Winter Forum is a unique experience that changes and evolves from year to year based on the vision of that year’s faculty coordinators; there will inevitably be learning outcomes that fall outside the QEP goals, and it is useful to gather feedback on these additional outcomes for post-program reflection as well as future planning. OATC personnel can incorporate program-level learning outcomes into the observational rubric for future Winter Forums, and additional questions regarding program-level learning outcomes may be added to the knowledge tests and surveys as well. The intentional design and mapping of the Winter Forum knowledge test to the program’s curriculum allowed for meaningful analysis of student movement towards the intended learning outcomes. Clearly defined faculty answers help ensure that the information being shared with students is intentional in nature and that retention and mastery of relevant concepts can be measured in a meaningful and reliable way. Knowledge test questions should continue to reflect the program content and should be designed by faculty to capture the information they feel participants should acquire over the course of the event. Future Winter Forum coordinators are encouraged to replicate much of the curriculum design and formatting of the 2014 Winter Forum. Given the evidence collected and presented here, it is clear that the pedagogy of the 2014 Winter Forum was ultimately successful in moving students towards QEP outcomes.
Appendix 1: Scoring rubric for the knowledge test free-response items, pre- and post-tests
0
Unable to be Assessed
Response is missing or
incomprehensible due to
grammar, spelling,
structure, or language
choice. Includes "I
don't know" responses.
11
Name four sociocultural factors that
affect girls’ access to high-quality
education in India.
Faculty Response:
SOCIOCULTURAL FACTORS:
Poverty:
• When a family is in poverty, boys are
more likely to be educated than girls
• Girls provide free labor at home for the
family, so are not sent to school
Custom/culture:
• Girls do not receive primary education
because they are pulled out early/are kept
home to protect family honor
• Girls do not have high-achieving role
models
Facilities:
• Schools are unable to provide safe and
sanitary facilities for girls
Sexism:
• Beliefs that girls are not capable of
mastering content
• Beliefs that girls are not worthy of
acceleration and enrichment education
• Boys get more attention from teachers
(more positive and negative attention)
Names no more than
one correct factor.
Others are wrong or
missing.
1
"Needs a lot of
improvement"
Attempts the questions,
but fails due to one or
more factual
inaccuracies
Names no more than
two correct factors.
Others are wrong or
missing.
2
"Needs some
improvement"
3
"Satisfactory"
4
"Exceeds satisfaction”
Effort made at answering
the question, but the
response is insufficient in
terms of
comprehensiveness,
detail, and/or
understanding of key
principles. In some
cases, the student does
not provide the requested
number of examples, but
provides at least one
correct example.
Question is answered clearly.
All tasks are completed, and
requested number of
examples is complete. No
additional effort is made to
provide evidence or context
to persuade the audience.
(Sample response from
faculty are considered to be at
the satisfactory level unless
otherwise noted)
Answers the question and
elaborates, exceeding the
requirements of the
question in terms of detail,
context, and/or evidence.
Well written, fluid,
articulate.
Names no more than
three correct factors.
Others are wrong or
missing.
Names four correct
factors. Additional detail
provided is minimal.
Names four correct
factors and provides
meaningful detail or
information pertaining
to those factors.
15 (A)
(A) Name four characteristics of cultural
competency
Faculty Response:
Essential Attitudes:
• Respect: others are valued
• Openness: a willingness to risk and to
move beyond one’s comfort zone.
• Curiosity: a willingness to risk and to
move beyond one’s comfort zone.
• Discovery
Necessary Knowledge:
• The importance of understanding the
world from others’ perspectives
• Cultural self-awareness: the ways in
which one’s culture has influenced one’s
identity and worldview
• Culture-specific knowledge1
• Deep cultural knowledge including
understanding other world views
• Sociolinguistic awareness
Skills: Those that address the
acquisition and processing of
knowledge:
• Observation, Listening, Evaluating,
Analyzing, Interpreting, Relating
Internal Outcomes: Aspects that occur as
a result of acquired attitudes, knowledge
and skills necessary for intercultural
competence
• Flexibility
• Adaptability
• Ethnorelative perspective and empathy
External Outcomes: Behavior and
communication of the individual
“Effective and appropriate behavior and
communication in intercultural situations”
Intercultural competence is a lifelong
process; there is no one point at which an
individual becomes completely
interculturally competent.
Intercultural competence does not “just
happen;” instead, it must be intentionally
addressed.
Fails to describe
more than one
characteristic
correctly.
Student's response
slightly resembles
faculty response.
Two to three correct
examples are given
(according to
characteristics
outlined in faculty
response).
Student's response
includes all four
characteristics which
were outlined
specifically in faculty
response
Student's response
includes all four
characteristics outlined
in faculty response.
Student's response also
elaborates on 2 or
more of the factors
they listed (example of
this additional detail
may be seen in the
bulleted portion of the
faculty response).
15 (B)
(A) Name four characteristics of cultural
competency and (B) indicate why it is
important when adopting and/or
reforming educational practices and
pedagogies, particularly when
supporting underrepresented students
(200 words or less).
Faculty Response:
IMPORTANCE when adopting and/or
reforming educational practices and
pedagogies
• Danger of a single story/danger of
assumptions and stereotypes when working
with students whose experiences are likely
different from your own
• Importance of validating local knowledge
• Importance of perspective-taking
Source:
Darla Deardorff:
http://www.nafsa.org/_/file/_/theory_con
nections_intercultural_competence.pdf
16
Discuss two similarities and two
differences between education in the US
and India (200 words or less.)
Faculty Response:
Similarities:
• Focus on STEM education
• STEM success (rhetorically) tied to
national success
• Influence of standardized assessment on
content/what taught
• Teachers at the elementary level not well
trained in STEM content
• Education practices and policies highly
decentralized
• Schools struggle to value diverse cultural
experiences/practices of all students
• Poverty, race/ethnicity impact access to
resources and success in school
• Low income students score lower than
those not low income
• Lack of inquiry based instruction
• Inequitable access to technology
Student attempts to
respond but is not
able to adequately
articulate the
importance of
cultural
competency.
Student adequately
describes the
importance of
cultural competency
without discussing
any of the bulleted
points outlined in
faculty response
guide.
Student correctly
describes the
importance of
cultural competency
and mentions one of
the three bulleted
points outlined in the
faculty response
guide.
Student makes an
attempt at answering
the question but
describes only one
correct similarity or
one correct
difference as
outlined in the
faculty response
guide.
Accurately describes
two correct
similarities or two
correct differences,
but not both. (i.e.
Student response may
address ONLY
similarities or ONLY
differences as
outlined in the faculty
response guide.)
Student correctly
describes the importance
of cultural competency
and provides meaningful
detail about one or more
of the bulleted points
outlined in the faculty
response guide.
Student correctly
describes the
importance of cultural
competency and
provides meaningful
detail about two or
more of the bulleted
points outlined in the
faculty response guide.
Accurately describes two
similarities and two
differences as indicated
by faculty response
guide. Other information
given is minimal.
Accurately describes
two similarities and
two differences as
outlined in the faculty
response guide and
provides meaningful
details on those
similarities or
differences.
resources
• Rural and poor areas have fewer local
resources
Differences:
United States:
• Right to education not constitutionally
guaranteed
• Class size averages 25 students
• Compulsory education: age 6 through at
least 15 years; most up to 18 years
India:
• Right to education constitutionally
guaranteed
• Class size up to 60 students
• Compulsory education age 6 – 14
• Girls’ access to education more
compromised – first to be kept home if
children’s labor needed
• Highest rate of child malnourishment
Appendix 2: Winter Forum pre-program survey results
2014 Winter Forum Pre-program Survey
N = 108
Q1. How did you learn about the 2014 Winter Forum (please check all that apply)?
Ad in The Chronicle: 7 (6.5%)
Banner at the Bryan Center: 35 (32.4%)
Student activities fair promotion: 12 (11.1%)
Ad in a Duke bus: 11 (10.2%)
A Duke faculty member: 31 (28.7%)
A Duke student: 30 (27.8%)
An email from a distribution list (please specify below): 26 (24.1%)
Other (please specify below): 14 (13.0%)
Q1. How did you learn about the 2014 Winter Forum? Other:

Banner outside Perkins

spoke with Deb Johnson at a DSG event with Larry Moneta

Library banner

Banner outside Perkins

Saw a friend's facebook post about it

flier around campus

website

Ad next to perkins

banner next to perkins & a professor

I don't recall

Website

banner outside the library

Researching Education-focused programs at Duke

Researching Education-focused programs at Duke
Q2. If you checked a distribution list from the above responses, please indicate the
name of the distribution list.

Global Health or Pubpol mailing list

The Newsletter from Duke Parents and Family Programs

pre-health

weekly freshman email

Public policy majors

Duke partnership for service

Public policy majors listserv

ICS, The short list, Baldwin

Child Research Policy Certificate Mailing List

PPL

An ICS mailing forward

I believe it was because of my participation in Engage or my education minor (I
do not remember).

UCAE

Public Policy majors mailing list.

Pratt Undergraduates

PubPol list

Public policy newsletter

N/A

DukeEngage

academic advising center

Ad from public policy majors mailing list

public policy major mailing list

Public Policy Majors List
Q3. Please check any dietary restrictions you have. If you have no special dietary restrictions, you may skip this question.
Vegetarian: 19 (17.6%)
Vegan: 0 (0.0%)
Other (please specify below): 16 (14.8%)
Q3b. Please check any dietary restrictions you have. If you have no special dietary restrictions, you
may skip this question. Other:

Lactose Intolerant, can take a pill if necessary

No beef or pork

No red meat

lactose-intolerant

peanut and nut allergy

no pork

Allergy to all nuts (including peanuts) and all seafood

High cholesterol, fatty foods

lactose intolerant

Gluten-free

Chicken, Fish, and Turkey are the meats I eat.

I don't eat red meat.

celiac's (gluten free completely)

no beef or pork

Vegetarian but eat chicken only (chickenatarian)

lactose intolerant
Q4
For the items below, please indicate
your current ability to address the
overall topic of the 2014 Winter Forum.
Very low
ability
[1]
Low ability
[2]
Moderate
ability
[3]
High ability
[4]
Very high
ability
[5]
No Reply
Your ability to evaluate global issues from
a multi-disciplinary perspective.
0
0.0%
0
0.0%
4
3.7%
5
4.6%
5
4.6%
11
10.2%
22
20.4%
19
17.6%
51
47.2%
37
34.3%
43
39.8%
51
47.2%
37
34.3%
45
41.7%
34
31.5%
27
25.0%
15
13.9%
15
13.9%
5
4.6%
6
5.6%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
2
13
51
32
10
0
1.9%
12.0%
47.2%
29.6%
9.3%
0.0%
2
5
29
39
33
0
1.9%
4.6%
26.9%
36.1%
30.6%
0.0%
0
3
20
47
38
0
0.0%
2.8%
18.5%
43.5%
35.2%
0.0%
Not at all
important
[1]
Slightly
important
[2]
Moderately
important
[3]
Important
[4]
Very
important
[5]
No Reply
0
0.0%
0
0.0%
0
0.0%
0
0.0%
1
0.9%
1
0
0.0%
1
0.9%
7
6.5%
5
4.6%
4
3.7%
4
12
11.1%
10
9.3%
24
22.2%
21
19.4%
13
12.0%
10
47
43.5%
41
38.0%
40
37.0%
41
38.0%
41
38.0%
38
49
45.4%
56
51.9%
37
34.3%
41
38.0%
49
45.4%
55
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.9%
3.7%
9.3%
35.2%
50.9%
0.0%
0
0.0%
1
0.9%
11
10.2%
41
38.0%
55
50.9%
0
0.0%
Your ability to evaluate global issues from
a multi-cultural perspective.
Your ability to evaluate global issues from
a multi-geographical perspective.
Your ability to evaluate global issues from
a historical perspective.
Your ability to think critically about the
relationship between science and public
policy.
Your ability to relate what you know
about the 2014 Winter Forum topic to
future coursework and co-curricular
experiences.
Your ability to engage in collaborative
group work centered on the 2014 Winter
Forum topic.
Q5
Please indicate how important it is to
you that you have the opportunity to
make gains in each of the following
areas.
Evaluate global issues from a multidisciplinary perspective.
Evaluate global issues from a multicultural perspective.
Evaluate global issues from a multigeographical perspective.
Evaluate global issues from a historical
perspective.
Think critically about the relationship
between science and public policy.
Relate what you know about the 2014
Winter Forum topic to future coursework
and co-curricular experiences.
Engage in collaborative group work
centered on the 2014 Winter Forum topic.
Mean
SD
3.57
0.79
3.59
0.85
3.13
0.92
3.09
0.91
3.32
0.87
3.89
0.96
4.11
0.80
Mean
SD
4.34
0.67
4.41
0.70
3.99
0.91
4.09
0.87
4.23
0.87
4.31
0.86
4.39
0.71
Q6
With respect to the 2014 Winter Forum, to
what extent do you expect to make gains
in the following general abilities?
Gain factual knowledge.
Understand fundamental concepts and
principles.
Learn to apply knowledge, concepts,
principles, or theories to a specific situation
or problem.
Learn to analyze ideas, arguments and
points of view.
Learn to integrate and synthesize knowledge.
Learn to evaluate the merits of ideas and
competing claims.
Enhance my writing skills (describing,
communicating, presenting your opinions).
Enhance my speaking skills (communicating
with faculty and peers, asking for advice,
presenting your own ideas).
Enhance my critical thinking skills
(understanding concepts, problem solving,
and trouble shooting).
Develop my ability to work independently.
Develop my ability to work as part of a team.
Develop my ability to recognize ethical
problems in science and public policy.
Develop my ability to work through ethical
problems in science and public policy.
Develop my ability to take a stand even when
others disagree.
Very low
expectation
[1]
Low
expectation
[2]
Moderate
expectation
[3]
High
expectation
[4]
Very high
expectation
[5]
No Reply
1
0.9%
0
0.0%
0
8
7.4%
4
3.7%
2
29
26.9%
18
16.7%
10
50
46.3%
64
59.3%
41
20
18.5%
21
19.4%
55
0
0.0%
1
0.9%
0
0.0%
1.9%
9.3%
38.0%
50.9%
0.0%
0
0.0%
0
0.0%
0
0.0%
2
1.9%
2
1.9%
1
0.9%
2
1.9%
12
11.1%
20
18.5%
16
14.8%
27
25.0%
43
39.8%
38
35.2%
42
38.9%
44
40.7%
37
34.3%
48
44.4%
49
45.4%
35
32.4%
14
13.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
1
5
31
42
29
0
0.9%
1
4.6%
1
28.7%
19
38.9%
45
26.9%
42
0.0%
0
0.9%
0.9%
17.6%
41.7%
38.9%
0.0%
3
2.8%
1
0.9%
1
0.9%
1
0.9%
2
1.9%
28
25.9%
2
1.9%
1
0.9%
1
0.9%
6
5.6%
42
38.9%
14
13.0%
21
19.4%
19
17.6%
26
24.1%
26
24.1%
45
41.7%
45
41.7%
43
39.8%
45
41.7%
9
8.3%
46
42.6%
40
37.0%
44
40.7%
29
26.9%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
Mean
SD
3.74
0.88
3.95
0.72
4.38
0.73
4.22
0.81
4.29
0.75
4.04
0.81
3.45
0.92
3.86
0.90
4.17
0.81
3.09
0.97
4.23
0.82
4.13
0.82
4.19
0.82
3.86
0.94
Q7
Please indicate the degree to which you
expect that the 2014 Winter Forum will
provide the following opportunities.
Very low
expectation
[1]
Low
expectation
[2]
Moderate
expectation
[3]
High
expectation
[4]
Very high
expectation
[5]
No Reply
0
2
19
47
40
0
0.0%
1.9%
17.6%
43.5%
37.0%
0.0%
0
3
11
43
51
0
0.0%
2.8%
10.2%
39.8%
47.2%
0.0%
0
0.0%
1
0.9%
0
0.0%
1
0.9%
1
0.9%
4
3.7%
8
7.4%
17
15.7%
11
10.2%
5
4.6%
26
24.1%
27
25.0%
21
19.4%
38
35.2%
28
25.9%
47
43.5%
36
33.3%
36
33.3%
35
32.4%
39
36.1%
31
28.7%
36
33.3%
34
31.5%
23
21.3%
35
32.4%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
Mean
SD
2.94
0.77
Mean
SD
4.04
0.86
Exposure to cutting edge ideas and theory
via scholars and experts in the field.
An environment that encourages open and
honest exchange of ideas from multiple
disciplinary, cultural and geographic
perspectives.
An opportunity to develop academic and
professional networks.
Information that will help clarify my education
goals and objectives.
Information that will help clarify my career
goals and objectives.
Information that I will integrate in my daily
life.
Information that I will integrate in other
courses and course work.
Q8
Q9
To what degree are you now actively
involved in issues pertaining to
education?
Not involved [1]
Interested but
not involved
[2]
Moderately
involved [3]
Highly
involved [4]
No Reply
2
29
49
27
1
1.9%
26.9%
45.4%
25.0%
0.9%
How likely is it that your experience in the
2014 Winter Forum will influence you to
become actively engaged in issues related
education? (This may include a variety of
curricular, co-curricular, and professional
activities such as independent research,
connecting with a community groups,
reaching out to NGOs, writing an op-ed,
taking a specific class, etc.)
Very unlikely
[1]
Unlikely [2]
Unsure [3]
Likely [4]
Very
Likely
[5]
No Reply
2
1
22
48
34
1
1.9%
0.9%
20.4%
44.4%
31.5%
Mean
SD
4.16
0.78
4.31
0.77
3.97
0.83
3.91
0.98
3.81
1.05
3.63
0.96
3.94
0.93
0.9%
Q10
Please indicate the extent to which each of
the following disciplines interests you.
Arts (visual and performance)
Biology
Business, accounting, or finance
Computer science
Economics
Education
Engineering
Environmental science
Foreign languages
Global Cultural Studies
International Comparative Studies
Marine science
Math, statistics
Natural sciences
Neuroscience, neurobiology
Political Science
Pre-health (pre-med, pre-dental, pre-vet)
Psychology
Public policy
Sociology
Other
No interest
[1]
11
10.2%
25
23.1%
21
19.4%
40
37.0%
21
19.4%
1
0.9%
47
43.5%
32
29.6%
9
8.3%
7
6.5%
9
8.3%
46
42.6%
28
25.9%
24
22.2%
22
20.4%
15
13.9%
49
45.4%
8
7.4%
7
6.5%
11
10.2%
3
2.8%
Slight interest
[2]
Moderate
interest
[3]
Substantial
interest
[4]
No
Reply
37
34.3%
26
24.1%
38
35.2%
23
21.3%
34
31.5%
4
3.7%
26
24.1%
37
34.3%
24
22.2%
18
16.7%
25
23.1%
37
34.3%
39
36.1%
34
31.5%
31
28.7%
32
29.6%
26
24.1%
24
22.2%
18
16.7%
30
27.8%
3
2.8%
30
27.8%
29
26.9%
31
28.7%
31
28.7%
34
31.5%
24
22.2%
19
17.6%
26
24.1%
35
32.4%
36
33.3%
35
32.4%
12
11.1%
29
26.9%
30
27.8%
31
28.7%
37
34.3%
7
6.5%
34
31.5%
35
32.4%
33
30.6%
2
1.9%
30
27.8%
28
25.9%
18
16.7%
14
13.0%
19
17.6%
77
71.3%
16
14.8%
13
12.0%
40
37.0%
47
43.5%
39
36.1%
13
12.0%
11
10.2%
20
18.5%
24
22.2%
24
22.2%
26
24.1%
42
38.9%
47
43.5%
34
31.5%
8
7.4%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
2
1.9%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
1
0.9%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
1
0.9%
0
0.0%
92
85.2%
Mean
SD
2.73
0.98
2.56
1.11
2.43
0.99
2.18
1.08
2.47
1.00
3.67
0.60
2.04
1.10
2.19
1.00
2.98
0.97
3.14
0.92
2.96
0.97
1.93
1.01
2.21
0.95
2.43
1.03
2.43
1.03
2.65
0.98
2.09
1.22
3.02
0.96
3.14
0.93
2.83
0.99
2.94
1.24
Q10b. Please indicate the extent to which each of the following
disciplines interests you. Other:

Women's Studies

Classical Studies

Cultural Anthropology

Cultural Anthropology

Ethics

ethics

Global Health
Q11. Please share any comments you may have about your expectations of the 2014 Winter Forum.




















i really hope that this forum will better prepare me for active engagement in education discussion and policy.
Gain ideas to generate a solution for my home country South Africa.
I hope to meet professors and faculty members to form strong relations with them.
I'm not sure what to expect, but I know it will be a great experience!
I wish to challenge my opinions about education reform.
I had a fantastic 2013 Winter Forum experience and hope that this forum can help build upon my interest in
education and learn how to apply it to real world problems.
I expect it to be a great experience I can grow from as a person!
I hope to be able to learn more about education issues in both India and America, especially with regard to science,
as I have little knowledge about that area specifically.
I look forward to it.
I'm excited for engaging discussions and arguments. Interested to see what our challenge is going to be and how my
team and I will work together to solve it.
I am looking forward to learning new innovative ways to overcome obstacles to quality education, both at home and
in India.
expecting to learn a lot about innovations in education, particularly in STEM fields and in India.
I'm greatly looking forward to being challenged and inspired by Winter Forum!
I hope to gain a network of peers and professionals to share and learn about education issues and policies that I can
integrate into future work!
I want to sprint mentally!
I'm going in with an enthusiastic attitude and a open mind! I'm excited!
I expect to be confronted with realities that require immediate attention, and not in a sugar-coated way.
In all, I expect the Winter Forum to be an experience that I think about often whenever I come across a project
related to education, no matter what the context. I want to be able to apply what I do and what I learn during the
Winter Forum to the work that I do at Duke and beyond, both in reference to the actual material taught and to the
relationships I might build with my team and my mentors. But above all, my expectation for the Winter Forum is to
keep me captivated, and make me want to return to those long sessions time and time again.
I'm coming in with fairly low expectations just because I feel like there's little that can be done in three days.
I think this will be a very mentally challenging experience for me to participate in and that it will force me to not only
learn about other people's opinion, but also become more secure in the opinions that I might have.










I expect to be exposed to many different opinions. Through this forum, I hope to expand my perspective and the
perspective of others. In addition, I hope to learn a lot from my faculty mentor and fellow group members as well.
I am very excited to get started on our project! I hope to further narrow my goals in the education field.
I am expecting a vibrant community of people who share my interests in education.
I hope for this time to be primarily about brainstorming ideas, but I question the impact and the overall goal of the
forum. I think there's an expectation to come up with same grand idea on how to change education internationally on
a systemic basis, but I think this approach may fundamentally be flawed in that is just not feasible. One of my main
expectations is to certainly gain factual knowledge on the issues and to be incorporated into a highly diverse and
capable community that cares about education, but I do not expect anyone to come up with an idea that we can just
implement right after this three day period. To positively impact a community, I think there needs to first be a deep
sense of understanding of what the issues really are, which can only be achieved by knowing the experiences of
others. Factual knowledge and talks can be great for providing context to the issues, but until we spend time in the
communities that we are trying to serve, I fear that our ideas will simply not be equipped to provide the change we
are hoping to make. I think this is an important point for us to understand, not to dishearten our passion and work
ethic, but to provide a better understanding of what are real goals should be for the forum.
I expect to hear awesome speakers and learn how to come up with solutions to real-life problems as if I were working
in the field.
Past participants of the Winter Forum have praised this experience as one of intellectual stimulation and growth. I
hope to learn as much as I can from this experience!
I'm a little nervous about how the competitive environment is going to shape the way we all think about our
experience, but I'm excited to meet new people and learn about things that interest me. / Also, I was kind of
overwhelmed by the language of this survey - a lot of it struck me as jargon. I hope we actually talk about what it
means to "evaluate global issues from multicultural/multidisciplinary/multigeographical perspective" instead of just
throwing those words around. There's a huge tendency to do that in education, and after a while it becomes
incredibly redundant and frustrating.
I hope to be able to have hands on experience in improving my entrepreneurial abilities, learning how to implement
feasible solutions to big problems.
I am looking forward to getting to know my team members and working on a project that utilizes each individual's
skills and strengths.
I think the 2014 Winter Forum will be a most enriching experience, and I look forward to it very much!

I expect to be surrounded by highly motivated and ambitious students and faculty that are passionate about learning
about education.

I will be a member of Teach for America next year, and I look forward to using the lessons and ideas I've learned and
experienced in the Winter Forum and applying them to my work in TFA!

I expect it to be a great learning experience!

I expect to become acquainted with various experts in the field and develop critical thinking skills to tackle problems
such as the achievement gap and education in poverty.

I have been so excited about this opportunity ever since I read the information on the education innovation clusters.
Those groups were exactly what I had been forming in my mind and hoping to create for years now--I am so excited
that there are already people doing it that I can learn from.

I would like to work together with the team to develop a plan that has foresight and can be implemented.

I hope to have a great experience meeting awesome students and enhancing my knowledge about education in
developing countries.
Appendix 3: Winter Forum post-program survey results
2014 Winter Forum Post-program Survey
N = 63
Q1
For the items below, please indicate
your current ability, after the
conclusion of the program, to
address the overall topic of the 2014
Winter Forum.
Your ability to evaluate global issues
from a multi-disciplinary perspective.
Your ability to evaluate global issues
from a multi-cultural perspective.
Your ability to evaluate global issues
from a multi-geographical perspective.
Your ability to evaluate global issues
from a historical perspective.
Your ability to think critically about the
relationship between science and public
policy.
Your ability to relate what you know
about the 2014 Winter Forum topic to
future coursework and co-curricular
experiences.
Your ability to engage in collaborative
group work centered on the 2014 Winter
Forum topic.
Very low
ability
[1]
Low ability
[2]
Moderate
ability
[3]
High ability
[4]
Very high
ability
[5]
No Reply
0
1
11
37
14
0
0.0%
1.6%
17.5%
58.7%
22.2%
0.0%
0
2
14
31
16
0
0.0%
3.2%
22.2%
49.2%
25.4%
0.0%
0
2
11
38
12
0
0.0%
3.2%
17.5%
60.3%
19.0%
0.0%
0
10
22
23
8
0
0.0%
15.9%
34.9%
36.5%
12.7%
0.0%
0
3
12
25
22
1
0.0%
4.8%
19.0%
39.7%
34.9%
1.6%
0
0
5
32
26
0
0.0%
0.0%
7.9%
50.8%
41.3%
0.0%
0
0
4
27
32
0
0.0%
0.0%
6.3%
42.9%
50.8%
0.0%
Mean
SD
4.02
0.68
3.97
0.78
3.95
0.71
3.46
0.91
4.06
0.87
4.33
0.62
4.44
0.62
Q2
Please indicate the extent to which
you made learning gains in each of
the following areas.
Gained factual knowledge.
Understood fundamental concepts and
principles.
Learned to apply knowledge, concepts,
principles, or theories to a specific
situation or problem.
Learned to analyze ideas, arguments
and points of view.
Not at all
[1]
A little
[2]
Moderately
[3]
Highly
[4]
Very highly
[5]
No Reply
0
0.0%
1
1.6%
0
7
11.1%
4
6.3%
3
10
15.9%
11
17.5%
8
34
54.0%
30
47.6%
22
12
19.0%
17
27.0%
30
0
0.0%
0
0.0%
0
0.0%
4.8%
12.7%
34.9%
47.6%
0.0%
0
3
9
19
32
0
0.0%
4.8%
14.3%
30.2%
50.8%
0.0%
Learned to integrate and synthesize
knowledge.
0
3
10
21
29
0
0.0%
4.8%
15.9%
33.3%
46.0%
0.0%
Learned to evaluate the merits of ideas
and competing claims.
0
3
8
33
19
0
0.0%
4.8%
12.7%
52.4%
30.2%
0.0%
Enhanced my writing skills (describing,
communicating, presenting your
opinions).
7
5
23
19
9
0
11.1%
7.9%
36.5%
30.2%
14.3%
0.0%
1
6
15
23
18
0
1.6%
9.5%
23.8%
36.5%
28.6%
0.0%
0
3
12
25
23
0
0.0%
4.8%
19.0%
39.7%
36.5%
0.0%
9
14.3%
0
0.0%
1
16
25.4%
4
6.3%
7
19
30.2%
7
11.1%
7
12
19.0%
20
31.7%
25
7
11.1%
32
50.8%
23
0
0.0%
0
0.0%
0
1.6%
11.1%
11.1%
39.7%
36.5%
0.0%
1
7
9
24
22
0
1.6%
11.1%
14.3%
38.1%
34.9%
0.0%
0
0.0%
5
7.9%
9
14.3%
30
47.6%
19
30.2%
0
0.0%
Enhanced my speaking skills
(communicating with faculty and peers,
asking for advice, presenting your own
ideas).
Enhanced my critical thinking skills
(understanding concepts, problem
solving, and trouble shooting).
Developed my ability to work
independently.
Developed my ability to work as part of
a team.
Developed my ability to recognize
ethical problems in science and public
policy.
Developed my ability to work through
ethical problems in science and public
policy.
Developed my ability to take a stand
even when others disagree.
Mean
SD
3.81
0.88
3.92
0.92
4.25
0.86
4.27
0.88
4.21
0.88
4.08
0.79
3.29
1.16
3.81
1.01
4.08
0.87
2.87
1.21
4.27
0.90
3.98
1.04
3.94
1.05
4.00
0.88
Q3
Please indicate the degree to which
the 2014 Winter Forum provided the
following opportunities.
Exposure to cutting edge ideas and
theory via scholars and experts in the
field.
An environment that encourages open
and honest exchange of ideas from
multiple disciplinary, cultural and
geographic perspectives.
An opportunity to develop academic and
professional networks.
Information that will help clarify my
education goals and objectives.
Information that will help clarify my
career goals and objectives.
Information that I will integrate in my
daily life.
Information that I will integrate in other
courses and course work.
Q4
To what extent did the following
components of the Winter Forum
contribute to your learning gains?
Formal presentations by Winter Forum
speakers
Developing group blog posts
Interaction with experts in the field
Interaction with faculty coaches
Intellectual debate
Not at all
[1]
A little [2]
Moderately
[3]
Highly
[4]
Very highly
[5]
No Reply
0
2
7
28
25
1
0.0%
3.2%
11.1%
44.4%
39.7%
1.6%
0
2
3
25
32
1
0.0%
3.2%
4.8%
39.7%
50.8%
1.6%
0
0.0%
0
0.0%
0
0.0%
1
1.6%
0
0.0%
4
6.3%
7
11.1%
7
11.1%
10
15.9%
7
11.1%
10
15.9%
15
23.8%
14
22.2%
13
20.6%
12
19.0%
30
47.6%
24
38.1%
25
39.7%
19
30.2%
24
38.1%
18
28.6%
16
25.4%
16
25.4%
19
30.2%
19
30.2%
1
1.6%
1
1.6%
1
1.6%
1
1.6%
1
1.6%
Not at all
[1]
A little
[2]
Moderately
[3]
Highly
[4]
Very highly
[5]
No Reply
0
0.0%
16
25.4%
0
0.0%
0
0.0%
1
1.6%
2
3.2%
15
23.8%
1
1.6%
3
4.8%
2
3.2%
3
4.8%
23
36.5%
1
1.6%
5
7.9%
7
11.1%
21
33.3%
6
9.5%
24
38.1%
19
30.2%
17
27.0%
37
58.7%
3
4.8%
37
58.7%
36
57.1%
36
57.1%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
0
0.0%
Mean
SD
4.23
0.78
4.40
0.73
4.00
0.85
3.79
0.96
3.81
0.96
3.73
1.12
3.89
0.98
Mean
SD
4.48
0.74
2.44
1.12
4.54
0.62
4.40
0.83
4.35
0.92
Q4b. To what extent did the following components of the Winter Forum
contribute to your learning gains? Other:

Readings
Q5. Please indicate your opinion of the following aspects of the 2014 Winter Forum.
The sites at Fuqua were comfortable, convenient, and appropriate for the Winter Forum: Strongly disagree 0 (0.0%), Disagree 3 (4.8%), Neutral 4 (6.3%), Agree 29 (46.0%), Strongly agree 26 (41.3%), No Reply 1 (1.6%); Mean 4.26, SD 0.79
The locations assigned for the break-out groups effectively facilitated group work: Strongly disagree 0 (0.0%), Disagree 3 (4.8%), Neutral 4 (6.3%), Agree 28 (44.4%), Strongly agree 28 (44.4%), No Reply 0 (0.0%); Mean 4.29, SD 0.79
I will consider applying for next year’s Winter Forum: Strongly disagree 2 (3.2%), Disagree 3 (4.8%), Neutral 10 (15.9%), Agree 16 (25.4%), Strongly agree 32 (50.8%), No Reply 0 (0.0%); Mean 4.16, SD 1.07
Q6. Please comment on your intention to apply or not apply for future Winter Forum programs.

Winter forum is a great way to gain skills necessary in the future

I became really passionate about the content matter and I really enjoyed creating new educational innovations and creating relationships with my teammates, but the time
commitment was unreal to have to do all of it in 2 days.

I very much enjoyed the wonderful experience of the program and will be applying in the future.

I intend to apply to other winter forum programs in the future.

Probably apply.

I really enjoyed Winter Forum this year and I would definitely consider applying for a future program if it involved a topic that was as interesting to me as education is.

I certainly will apply because Winter Forum was worth so much to me. I like that this program is volunteer based and students don't receive anything like money or course
credit for it which means that they do it for self-satisfaction and sheer interest. I enjoyed getting to listen to speakers that are experts in the field and look forward to doing it
again next year.

I really enjoyed the program especially because I am very interested in Education and have done some work both in the US and in India. It was also a great way to get to know
other experts in the field and getting to know what works they are doing. If there is a topic that interests me, I would definitely apply for future Winter Forum programs. Great
job from everyone! :)

The intense group work and developing presentation skills are valuable life skills that I definitely improved on during the program. I really enjoyed the multidisciplinary
collaborative efforts to come up with a solution to a difficult problem.

I have no doubt winter forum will continue to provide excellent programming but whether I apply depends largely on the topic. /

If the topic is compelling for me and I am passionate about the issue, I will certainly consider applying.

I expected Winter Forum to simply be lectures and discussions on education--but this experience was like nothing I've ever had. It was such a high-intensity, focused
environment, and it fostered so many incredible ideas that I want to do it again.

I'm a senior and I absolutely loved Winter Forum! I will definitely encourage my underclassmen friends to apply!

If I wasn't graduating, I would definitely apply for Winter Forum next year. I loved the experience and found it very beneficial.

I would love to apply for winter forum next year. This was an extremely valuable experience and I would love to partake again next year, especially since I am a scientist and
the topic is psychology/neuroscience.

I enjoyed that I got to participate in my first Winter Forum in a field I am passionate about. Because of the wonderful experience I had, I am curious about how other versions
will compare and how they might stimulate me.

I really liked the program, but do feel that it is really intense and might make me really tired in my first week of classes.

undecided- depends on the topic

Depending on the topic at hand, I would very much like to apply for future Winter Forum programs. Winter Forum provides an atmosphere that epitomizes everything that's
great about Duke, and I love being in such an environment.

Interest was in topic, not in experience.

I would if I were not a Senior :(

I plan to apply for the next winter forum!

I am a senior, but if I weren't, I would definitely apply.

depends on the topic

Depending on the topic, I am certainly strongly considering applying for future winter forums.

would love to, but graduating :(

I enjoyed my Winter Forum experience and would definitely consider applying again. The only draw back would be the intense amount of work directly before school starts.

I am very interested in next year's Winter Forum topic, and so I really hope that I am accepted.

I am a senior and unable to apply for future Winter Forums.

I don't have any background in neuroscience, but after seeing how much winter forum takes an interdisciplinary approach, I'd love to take part again

If it is a competition, I will definitely apply! If it isn't, I will still be interested, but I will be more of the subject.

I will definitely apply to future Winter Forums now having done two in a row in my first two years at Duke. It has become a staple of my education and a bright way to begin the
semester. I am unsure about next year's topic, but I will consider next year's forum and definitely try to apply to one more before I leave Duke in 2016.

I am a senior and will not be able to participate in future forums.

The topic does not interest me as much

graduating

I am extremely interested in applying for future Winter Forum programs especially considering the fact that the topic involves neuroscience.

I really enjoyed this year's Winter Forum, particularly because of the theme. My decision to apply to a future Winter Forum would depend on the topic of conversation. Also, it
was an extremely work-intensive program right before the start of the semester.

I don't know very much about neuroscience, and probably won't apply. This year was excellent, though.

I will be graduating but would have loved to do it again next year! Best Winter Forum I've attended thus far!

I would apply if the topic was something I was interested in.

I definitely would if I wasn't graduating

I will be graduating in May 2014, so will not be able to apply in future Winter Forum programs.

I will most likely apply for next year's Winter Forum.

N/A - I'm a senior

This is the second Winter Forum I have been fortunate to attend. My decision will be heavily based on my interest in the topic, the schedule and location (as Winter Forum
consistently conflicts with recruitment), and the format - as this year's forum was a draining way to start the semester.

Graduating Senior

Since this year's Winter Forum fit my educational and career objectives so well, I might not consider applying to other Forums that probably will be less focused on what I hope
to do in life. I wouldn't want to take a spot away from someone for whom it would be most beneficial, particularly since I'll be a senior next year.
Q7. Will you recommend the Winter Forum to other Duke students?
Yes [1]: 61 (96.8%); No [2]: 1 (1.6%); Unsure [3]: 1 (1.6%); No Reply: 0 (0.0%)
Q8. My involvement in the Winter Forum has prompted me to pursue further involvement in education. (This may include a variety of curricular, co-curricular, and professional activities such as independent research, connecting with community groups, reaching out to NGOs, writing an op-ed, taking a specific class, etc.)
Very unlikely [1]: 6 (9.5%); Unlikely [2]: 18 (28.6%); Neutral [3]: 10 (15.9%); Likely [4]: 12 (19.0%); Very Likely [5]: 17 (27.0%); No Reply: 0 (0.0%); Mean 3.25, SD 1.38
Q9. Please evaluate the effectiveness of the pre-Winter Forum communication among/between the following groups with respect to preparing you for the Winter Forum activities.
Program coordinators to students: Very Ineffective 1 (1.6%), Ineffective 1 (1.6%), Neutral 9 (14.3%), Effective 34 (54.0%), Very Effective 18 (28.6%), No Reply 0 (0.0%); Mean 4.06, SD 0.80
Faculty to students: Very Ineffective 3 (4.8%), Ineffective 0 (0.0%), Neutral 11 (17.5%), Effective 33 (52.4%), Very Effective 16 (25.4%), No Reply 0 (0.0%); Mean 3.94, SD 0.93
Students to faculty: Very Ineffective 2 (3.2%), Ineffective 0 (0.0%), Neutral 17 (27.0%), Effective 28 (44.4%), Very Effective 16 (25.4%), No Reply 0 (0.0%); Mean 3.89, SD 0.90
Q10. Would you have been willing to start earlier in the weekend in order to have more time for the program content, curriculum, and project development?
Yes [1]: 21 (33.3%); No [2]: 19 (30.2%); Unsure [3]: 22 (34.9%); No Reply: 1 (1.6%)
Q11. Please indicate the extent to which each of the following disciplines interests you.
Arts (visual and performance)
Biology
Business, accounting, or finance
Computer science
Economics
Education
Engineering
Environmental science
Foreign languages
Global Cultural Studies
International Comparative Studies
Marine science
Math, statistics
Natural sciences
Neuroscience, neurobiology
No interest
[1]
Slight
interest
[2]
Moderate
interest
[3]
Substantial
interest
[4]
No Reply
9
14.3%
15
23.8%
14
22.2%
18
28.6%
16
25.4%
0
0.0%
20
31.7%
14
22.2%
8
12.7%
5
7.9%
9
14.3%
24
38.1%
10
15.9%
12
19.0%
14
22.2%
19
30.2%
14
22.2%
20
31.7%
21
33.3%
13
20.6%
4
6.3%
19
30.2%
23
36.5%
14
22.2%
14
22.2%
16
25.4%
20
31.7%
29
46.0%
20
31.7%
16
25.4%
21
33.3%
12
19.0%
19
30.2%
13
20.6%
20
31.7%
15
23.8%
11
17.5%
18
28.6%
18
28.6%
14
22.2%
15
23.8%
15
23.8%
16
25.4%
16
25.4%
18
28.6%
12
19.0%
19
30.2%
8
12.7%
8
12.7%
11
17.5%
42
66.7%
8
12.7%
5
7.9%
20
31.7%
27
42.9%
20
31.7%
1
1.6%
5
7.9%
12
19.0%
12
19.0%
2
3.2%
3
4.8%
2
3.2%
3
4.8%
3
4.8%
2
3.2%
5
7.9%
3
4.8%
3
4.8%
3
4.8%
3
4.8%
3
4.8%
3
4.8%
3
4.8%
3
4.8%
Mean
SD
2.59
0.97
2.58
1.18
2.34
0.98
2.18
1.02
2.43
1.08
3.62
0.61
2.12
1.04
2.23
0.91
2.83
1.04
3.05
1.02
2.77
1.08
1.88
0.85
2.27
0.84
2.47
1.03
2.47
1.07
Political Science
Pre-health (pre-med, pre-dental, pre-vet)
Psychology
Public policy
Sociology
Other
9
14.3%
26
41.3%
4
6.3%
4
6.3%
7
11.1%
3
4.8%
17
27.0%
12
19.0%
14
22.2%
10
15.9%
15
23.8%
0
0.0%
23
36.5%
6
9.5%
21
33.3%
19
30.2%
23
36.5%
0
0.0%
11
17.5%
17
27.0%
21
33.3%
27
42.9%
16
25.4%
5
7.9%
3
4.8%
2
3.2%
3
4.8%
3
4.8%
2
3.2%
55
87.3%
2.60
0.96
2.23
1.27
2.98
0.93
3.15
0.94
2.79
0.97
2.88
1.55
Q11b. Please indicate the extent to which each of the following disciplines interests
you. Other:

Human Development

history

Applied Cultural Anthropology

Classical Studies
Q12. What one change would you make for future Winter Forums that you think would enhance the experience for other Duke students?

Better transportation

Find a way to implement a similar structure to the forum without having a project that requires such a time commitment outside the designated time of the forum

I can not think of any changes.

More days, less hours per day (it gets very tiring and stressful).

more time to work as groups or individually interact with the amazing speakers that are brought in

Allow a few more points during the day between sessions to meet in groups and add to your idea based on those sessions. Eg. the coaching sessions were really
useful, but after each one, we only had the 2-minute walk to the next session in which adjust our idea to be able to make the most of the next coach's advice.

I think that we had very little group time especially on the first day to really hash out our ideas. I would have loved to have more time with the group to work on the
project.

Longer time with the lecturers so that they can cover more in depth material

Remove the competition aspect of the forum because it detracted from my experience and just have the groups present at the end and that's the end of it.

A few less speakers so we can have more time to develop our programs.

More team working time- there were too many speeches and not enough time for us to work on projects. There was so much pressure that even though it was an
amazing experience at times I questioned my choice coming. I didn't want to be burned out for when classes start.

Getting to meet more people, most interactions only happened in group. Few networking opportunities

If a finished product, presentation etc. is expected at the end of the program, the participants should have more time to let the information and opinions they were
exposed to to sink in.

Pre-WF communication should be improved, especially for those who were abroad and could not attend the fall semester meeting. For example, I didn't know that WF
was business casual, or that it would be STEM-related. Additionally, this program was exhausting, and, while I think it was worth it, there is room for improvement in
such an intense schedule.

Nothing!

The entire 2 days were jam packed and at time it was a bit overwhelming. Having more breaks during the forum would have allowed students the ability to sit back and
reflect on the material being presented. In addition, a few more gatherings with students before the forum would have also been useful.

This is pretty small, but we would have greatly benefitted to have time inbetween meeting with the experts on Monday. The experts were extremely helpful, but we did
not have time to gather our thoughts and change aspects of our project before meeting with the next expert.

more time to do group work and more specific problem, or more direction

More planned time to work on projects. Fuqua is quite far away.

Allow teams to choose their own presentation slots. Do not randomly assign them. If 5 teams want to go first, then have a lottery for just those 5.

Not make it a competition- takes away from the speakers and creates a luring finality distraction to content

Since this was a competition format, I would've wished to have more time (either expanding beyond the 48 hours or having more group work sessions during the 48
hours) to actually formulate concretely our proposals.

Narrow the focus and scope of Winter Forum. We worked with a lot of generalizations and were unable to really think critically about the problem, even, never mind
the solution. / / Add time to confer after each coaching session (or each two) so that as a team, we can briefly talk through and make some changes to our ideas,
perhaps.

A little less speakers and more time to work in groups

More time to work in teams.

Every year for winter forum, the topic should focus on innovation and groups should work to create a project. This was definitely the best part about winter forum:
having the opportunity to really engage in social entrepreneurship and develop business skills.

would have picked teams slightly differently - should also incorporate whether people consider themselves strong public speakers, good researchers, creative thinkers, etc. to create more balanced teams

Make the location more convenient to access from the West Bus Stop.

More processing time between the coaching sessions.

A less intense work schedule. While the forum was a wonderful experience, many groups worked 16-18 hours on the Monday, and slept only 4-6 before the session
started again. I don't think this is particularly conducive directly before the semester commences. I do not know if this was a result of the competition element?
Perhaps a future idea could be to spread the forum over an extra day or two, and reduce the length of those days.

If the program started later, ended earlier, or otherwise allowed more time for sleep, I think that the experience would be greatly enhanced. I, and many other students
who I spoke to, was nearly unconscious during many of the speakers. I would have really liked to listen to them speak, but that was my only chance to get rest
because if I wasn't on top of my game the rest of the time I would have let down my teammates.

I appreciated having a wide variety of experts from the field come speak and wish more time had been allocated for having speakers.

While I loved the project aspect, I thought that with our limited time constraints, it made it rather stressful, difficult, and time-consuming to get it done in time. I learned
a lot and don't regret the experience at all (and in fact, encourage more projects like that) but the schedule would need to be reworked

I would consider making the teams smaller.

Move location to West Campus.

The most impactful time is that spent with other students working through problems and trying to devise solutions. Having more time to do that, in a setting like this forum and I am sure others, should be a greater priority.

None that I can think of.

I would make the teams smaller; I think this would involve more active participation by students

Don't know why, but a lot of projects were relatively similar, maybe due to the ordering of the program/the readings, but feel like groups' ideas might have been biased early on. Additionally, more time to collaborate with the group the first day- maybe fewer speakers

Allowing more time for working on the final presentation.

More time meeting with your group before the start of Winter Forum. Interacting with other students to share ideas, not just the students in our group (getting to
interact with other Duke students).

Perhaps stretch it one day longer and subsequently make the days themselves shorter. There was definitely some burnout by the end of the long days.

Not applicable!

It doesn't have to be a competition. I honestly did not even know it was going to be a competition until I got there and was rather disappointed that what I thought was going to be a weekend of intellectual debate turned into an all-out drive to deliver anything, literally any decent-looking STEM proposal, to the table.

I think that trying to make it in a more central location would be easier, because it's a trek from every spot on campus

Perhaps starting to plan the problem process earlier than when we started

Possibly include excursions off campus in order to apply the topic in the real world.

More time for group work

Less competitive

More built-in time to network and interact with the visiting speakers.

Less group work

I would suggest having two facilitators per group, in order to give more than one main perspective as groups decide what project to pursue.
Q13. Please share any other comments you may have about the 2014 Winter Forum.

I had a wonderful time. The magnificent food was appreciated. The guest speakers were excellent.

I loved it! It was perfect! Working with my teammates was amazing.

It was very well organized and done!

I really enjoyed the presentations and working with students and faculty, and I learned a lot. It was a great experience. Thanks!

I personally think that the program would have been equally exciting if we didn't have the competition part. Sometimes competition can take away the message
that you are trying to convey. I felt like we were so focussed on getting our project done that we used all our extra time on it and didn't interact enough with the
amazing guests we had from all over. However, I liked the intensity of the program and having to come up with an innovative project within 48 hours. I think this
will help us to handle intense situations that we are going to continuously face in the future.

I love my team and faculty mentor and creating those relationships was one of the best aspects of my Winter Forum experience.

I would definitely recommend it because it is an intense, challenging problem that is exhausting but being with a group of highly motivated Duke students
discussing an intellectual topic is a blast!

The Winter Forum has been a wonderful experience, and unexpectedly I had a lot of fun, though I had not envisioned the experience to be necessarily fun, though most certainly interesting. I think this had a lot to do with the great group dynamic that my team had. I would have appreciated it if more time was allocated for questions after all of the presentations, because the speakers were phenomenal and one does not often get the chance to ask them questions.


Incredible. Absolutely incredible.

Absolutely loved the experience! It was extremely well organized. The sessions were just the right length and I loved that we heard from such a wide variety of speakers! Congrats to the Winter Forum team!

This was my first Winter Forum, and I loved every moment of it. I really enjoyed getting to know new students I otherwise wouldn't have met. In addition, the teams were set up perfectly and everyone's experiences nicely complemented each other. Also, thank you to all those who helped out with Winter Forum and
making it possible!

Thank you so much for planning!

I definitely learned a lot more about teamwork than I did about education. A lot of the understanding I feel was superficial just because we tried to cover too
much in too little time-- math education in India in five minutes? I came out of those sessions feeling really frustrated with the lack of complexity in how we
explored those issues. / / I was also frustrated by the composite Bull City Elementary in Durham and how we conceptualized the two different schools. We
succumbed to the issue of the single story. The focus in India was on impoverishment; in the US, the focus was on alienation of interest and technology. BCE
in Durham had pretty bad passing rates-- which means that teachers have to be struggling with teaching basic reading and math-- and yet an elementary
school is effectively able to teach Project-Based Learning that blends into all disciplines? There was a serious tone shift that I was concerned by, as well as
issues of "saving the world." It would have also been beneficial to talk about the Common Core standards-- what does that implementation look like? What does it mean for the solutions we have to have-- to what extent can they be in the classroom? What issues with project-based learning exist-- what challenges would our innovations have to consider? / / What challenges with technology do we have to consider when working with India? English speaking skills? / / What do both groups do right, and why? A common theme in a lot of the successful initiatives that Maya talked about in India was that they weren't technological platforms but simple kits-- why is that? Have technological solutions and platforms been tried? (Of course!) What was difficult? What might we have to address? What are
issues with retaining volunteers in India (which were necessary for quite a few innovations)? What types of programs do this successfully? / / I also think many
of our innovations did the incredibly presumptuous thing of assuming that we were 'innovations.' I would have appreciated looking at the kinds of small-scale
innovations that Maya talked about (and things like the students-teaching-students model and the Breakthrough Collaborative) and discussing what works and
what doesn't. What assumptions should we not make? / / Overall, I felt frustrated by the brevity of Winter Forum and the scope of its goals. I think that in trying
to discuss STEM education, education in India, education in the US, social entrepreneurship, and presentation skills all in less than a day's worth of actual
programming and sessions, I came out with no real, deeper understanding of anything-- and then we're expected to solve a problem?

It was fun!

I thought it was very unfortunate that we didn't talk at all about why we chose to pursue STEM. There was virtually no content at all about *why* it's important to
increase interest in STEM, which I think would have been valuable. I talked about this briefly with some of the people running the Winter Forum, and they
mentioned that they received many questions about why STEM and why India. Though I think the why India question was valuable as well, I think it was
somewhat addressed in the sense that we saw the problem - India's education certainly has a lot to improve on. But during the forum we didn't talk about STEM
at all (and not during the required readings either), and what the *problem* with having low interest in STEM is. What does increasing interest in STEM do for
the kids' job prospects, for the economy, for national infrastructure and science research? Increasing interest in STEM almost certainly decreases interest in the
humanities - how does that balance play out, why is it worth it or not worth it? In what situations is it which one? These are all valuable questions that bothered
me significantly as we went through Winter Forum. / / I also think that it would have been good to discuss the importance of researching whatever's out there,
and what makes something actually innovative or different. I think that the vast majority of solutions and projects were simply repeats of things that exist
already. There's virtually no difference between many of the projects and real-life endeavors. At what point does it become useful to try to "innovate" your own new
idea versus improving on something that exists that's already high quality? How do you research effectively to understand what ideas have already been
implemented? Too many of the projects that I saw I thought would simply be not feasible because they required too much work and were repeats of what exists
already.

I had a really great time. As difficult as it was to come up with a project idea in a short period of time, that same aspect made the Winter Forum an enjoyable
learning experience.

I was accepted the day before the program began. I got no information about the program's structure, schedule, or dress code until I arrived and started asking
questions from the other students. That is why I rate the pre-program communication as ineffective. I rated the extent to which the program improved my
abilities to assess ideas and knowledge of educational issues, etc., relatively low not because the program did not focus on those things but because they were
very high to begin with and the program did not add anything beyond my current level.

It was hard for me to come up with one change I would make to the forum.

I LOVED my Winter Forum experience. The topics explored have better defined my academic and career interests.

I had an amazing experience and team. Thank you so much for doing this! It has definitely made a difference in my education and will likely continue to do so into the future, I
have no doubt. The intense environment, novelty of the concepts, and real world issues made this a fantastic program for students.

Great experience! I learned a lot of interesting information from experts in the field (coaches and speakers), enjoyed the help of our faculty mentor, and liked
collaborating with Duke students to formulate a strong presentation.


Awesome experience, couldn't be happier that I was able to be a part of it.

It was the best Winter Forum I have attended so far! It was well organized, the people in my group were awesome (and all of the groups seemed to be amazingly diverse), my faculty advisor was excellent, and all of the sessions were extremely useful and interesting. I wasn't bored for a single second of Winter Forum, and it made me excited to be back at Duke a little earlier for Winter Forum!

n/a

It was great, but pretty intense. I also wish we had gained more content knowledge and learned more from speakers. It was way more focused on the proposal than I anticipated, which was cool, but adding like another day of background first and then making the days less intense would be better.

I loved having the opportunity to interact with faculty, experts in the field (both social entrepreneurship and education), and a student body with immense diversity. It was great to devise real, applicable solutions.





Winter Forum was a wonderful and highly enriching experience and I hope I get the chance to participate again.

The group work experience ruined my Winter Forum. I loved the speakers, but I didn't gain anything from my group experience. It's the traditional Duke assignment of trying to figure out a solution as quickly as possible. However, I loved the speakers, experts, and faculty.

I'm grateful for all the Education department's hard work and creativity, and I'd definitely recommend the Winter Forum to other students.
Appendix 4. Observational Data – Agreement Graphs
Appendix 5. Observational Data – Team QEP Outcomes
Appendix 6. Observational Data – Event Key
Code  Event
1     Sun. 1:00 PM   Welcome and Overview
2     Sun. 1:00 PM   Introduction
3     Sun. 1:00 PM   Innovation and Education: U.S. and Global Efforts - BEGINNING
4     Sun. 1:00 PM   Innovation and Education: U.S. and Global Efforts - END
5     Sun. 2:00 PM   Education in India - BEGINNING
6     Sun. 2:00 PM   Education in India - END
7     Sun. 2:00 PM   Education in the U.S. and North Carolina - BEGINNING
8     Sun. 2:00 PM   Education in the U.S. and North Carolina - END
9     Sun. 3:30 PM   Researching the Scope of the Problem - BEGINNING
10    Sun. 3:30 PM   Researching the Scope of the Problem - MIDDLE
11    Sun. 3:30 PM   Researching the Scope of the Problem - END
12    Sun. 5:00 PM   Interactive workshop on cultural conditioning - BEGINNING
13    Sun. 5:00 PM   Interactive workshop on cultural conditioning - MIDDLE
14    Sun. 5:00 PM   Interactive workshop on cultural conditioning - END
15    Mon. 9:00 AM   Schools in the United States; Benjy Downing
16    Mon. 10:30 AM  Presenting your ideas
17    Mon. 11:30 AM  Understanding the context, teams create plan for the day - BEGINNING
18    Mon. 11:30 AM  Understanding the context, teams create plan for the day - MIDDLE
19    Mon. 11:30 AM  Understanding the context, teams create plan for the day - END
20    Mon. 12:30 PM  Education and Innovation
21    Mon. 2:00 PM   Innovation and Education: Overview
22    Mon. 3:20 PM   Team research and interviews with experts - BEGINNING
23    Mon. 3:20 PM   Team research and interviews with experts - MIDDLE
24    Mon. 3:20 PM   Team research and interviews with experts - END
25    Mon. 5:45 PM   Team Research - BEGINNING
26    Mon. 5:45 PM   Team Research - MIDDLE
27    Mon. 5:45 PM   Team Research - END
28    Mon. 7:45 PM   Teams work on presentations
29    Tues. 9:00 AM  Team Presentations - BEGINNING
30    Tues. 9:00 AM  Team Presentations - MIDDLE
31    Tues. 9:00 AM  Team Presentations - END