A Statistical Analysis Of:
ZBTHS College Preparatory Mathematics (CPM) Curriculum
Class of 2014 Student Outcome Data
Using EPAS System Test Scores
Prepared By: James P. Tenbusch, Ph.D.
A statistical analysis was performed on all Zion-Benton High School student EPAS System test scores
(EXPLORE, PLAN, ACT-P) collected over the past three years for students exposed to the College
Preparatory Mathematics (CPM) Curriculum. This analysis included an examination of the frequency
distribution by test type to determine the presence of outlier scores, along with three different methods of
interpreting mean student performance against college readiness benchmarks via the use of a standard
normal curve. Results showed that the gap between mean student performance on EPAS System
mathematics aptitude tests and the associated college readiness cut scores doubled during the first year of
exposure to the CPM curriculum and quadrupled during the second year. Aggregate CPM student
readiness for a college mathematics curriculum was observed to decrease by approximately ten percentage
points per year (Baseline: 39.3%; Year One: 28.4%; Year Two: 18.7%).
INTRODUCTION
As a means to address concerns from parents, teachers, and students regarding the full implementation of the
College Preparatory Mathematics (CPM) Curriculum at ZBTHS, a Freedom of Information Act (FOIA) request
was made under Section 2:250 “Access to District’s Public Records” of the Zion Benton Township School
District 126’s Board Policy Manual to obtain EPAS System student test score data. Specifically, this request
asked for all “EPAS System (EXPLORE, PLAN, ACT PRACTICE) composite test scores in Mathematics for all
students continuously enrolled in CPM-based coursework for the current and previous two academic terms:
2010-11; 2011-12; 2012-13.” The data set received was provided without student names or coded identifiers in
order to preserve the confidentiality of student records.
PURPOSE
The purpose of this investigation is to provide ZBTHS learning community stakeholders with a descriptive
statistics report regarding student outcomes associated with the school-wide implementation of the CPM
curriculum. The data analysis presented in this report used widely accepted statistical methods to determine
CPM curriculum effectiveness in terms of college readiness standards in mathematics. EPAS System student
population test scores on three measures of mathematical competency were used to answer the two essential
research questions listed below. The measurements used included: EXPLORE Test (administered at the
beginning of the freshman year), PLAN Test (administered at the beginning of the sophomore year), and
ACT-P (administered at the beginning of the junior year).
Research Questions:
1. Does a negative performance gap exist between ZBTHS beginning freshman student test scores and
EPAS System College Readiness Benchmarks (as measured by the EXPLORE Test)?
2. If a negative performance gap does exist: Do ZBTHS students exposed to the CPM curriculum present
test scores that indicate a closing of this gap during their freshman and sophomore year?
METHODS
To determine the most statistically valid analysis of the CPM EPAS System student population test score data
provided, a frequency distribution from the lowest score to the highest score was constructed. The purpose of
this preliminary review of the data set was to determine the presence of any outlier values within the
distribution of scores. Should outlier scores exist within a distribution, some method must be selected to
eliminate them in order to derive the best estimate of the distribution's true population mean. The
population mean, or “average,” can be distorted away from its true value when atypical extreme scores are
included in its calculation.
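To illustrate this distortion, the short snippet below compares the mean of a small set of scaled scores with and without two extreme low values. The scores used are hypothetical and are not drawn from the ZBTHS data set.

```python
# Illustrative only: the score values below are hypothetical and are not drawn
# from the ZBTHS data set.
from statistics import mean

typical_scores = [14, 15, 15, 16, 16, 17, 17, 18]   # scores clustered near the benchmark
outlier_scores = [5, 6]                              # two atypically low scores

print(mean(typical_scores))                   # 16   -> central tendency of the typical scores
print(mean(typical_scores + outlier_scores))  # 13.9 -> mean pulled downward by the two outliers
```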
An analysis of the test scores presented in Table 1 indicates that outliers are present within the CPM data set.
This means that the use of a canvas method, in which 100% of student test scores are used to calculate the mean
and standard deviation, would be suspect.
Table 1: Frequency Distribution of EPAS Test Scores By Test Type
[The full table lists, for each scaled score, the number of students obtaining that score (FREQ), the score's variance from the population mean (MEAN VAR), and its variance from the college readiness benchmark (BENCH VAR), for each test type. Scale ranges: EXPLORE = 1-25; PLAN = 1-32; ACT-P = 1-36.]

TEST      NO. OF STUDENTS   MEAN    MEAN CRB VAR
EXPLORE   367               16.2    -0.8
PLAN      367               17.2    -1.8
ACT-P     367               18.4    -3.6
Table 1 indicates that the presence of outliers will skew the population mean away from a valid measure of
central tendency of EPAS System test scores. This prompted the researcher to examine the data set
using a variety of sampling methods. Accepted statistical practice calls for a researcher to use a standard
sample method to eliminate outliers by factoring out all scores that fall more than two standard
deviations (SD) from the mean. The SD shows how much variation, or "dispersion," exists around the
mean: a low standard deviation indicates that the data points tend to be very close to the mean, while a high
standard deviation indicates that the data points are spread out over a large range of values.
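As a sketch of the standard sample method just described (the report does not specify an implementation, so the function name and details below are assumptions), the following snippet drops every score more than two standard deviations from the mean and recomputes the statistics on the retained scores.

```python
# Sketch of the "standard sample" method described above: drop every score that
# falls more than two standard deviations from the mean, then recompute the
# statistics on the retained scores. The function name is a hypothetical helper.
from statistics import mean, pstdev

def standard_sample(scores, k=2.0):
    m, sd = mean(scores), pstdev(scores)
    kept = [s for s in scores if abs(s - m) <= k * sd]
    return kept, mean(kept), pstdev(kept)
```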
Unfortunately, the use of the standard sample method eliminated an unacceptably high number of student
scores from the data set, making the mean derived from the new distribution of scores as suspect as the one
produced by the canvas method of analysis. This prompted the development of a hybrid sample method,
in which a more balanced approach to the elimination of outliers was used to arrive at a more valid estimate of
the true population mean for each test type.
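The hybrid sample method described here, which eliminates only the four lowest and four highest scores, might be sketched as follows; again, the function name and default value are illustrative assumptions rather than the report's actual procedure.

```python
# Sketch of the "hybrid sample" method described above: eliminate only the four
# lowest and four highest scores before recomputing the mean and SD. The function
# name and default are illustrative assumptions.
from statistics import mean, pstdev

def hybrid_sample(scores, trim=4):
    kept = sorted(scores)[trim:-trim]   # drop the `trim` lowest and `trim` highest scores
    return kept, mean(kept), pstdev(kept)

# With 367 scores and trim=4, 359 scores remain: a capture rate of 359/367, about 97.8%.
```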
Table 2 displays information regarding the use of all three methods of analysis, and the loss of data associated
with each of the two sampling methods. The hybrid sampling method offers the most stability in the calculation
of a mean and SD because only eight (8) outlier scores are eliminated from the analysis out of a total population
of 367 students (lowest 4 scores; highest 4 scores).
Table 2: EPAS Test Score Statistics By Analysis Method

ANALYSIS METHOD            EXPLORE %POP   EXPLORE MEAN   EXPLORE STDEV   BENCH VARIANCE
Canvas Analysis            100.0%         16.15          3.30            -0.85
Standard Sample Analysis   94.3%          15.87          2.68            -1.13
Hybrid Sample Analysis     97.8%          16.16          3.04            -0.84

ANALYSIS METHOD            PLAN %POP      PLAN MEAN      PLAN STDEV      BENCH VARIANCE
Canvas Analysis            100.0%         17.22          3.92            -1.78
Standard Sample Analysis   97.0%          17.14          3.46            -1.86
Hybrid Sample Analysis     97.8%          17.19          3.61            -1.81

ANALYSIS METHOD            ACT-P %POP     ACT-P MEAN     ACT-P STDEV     BENCH VARIANCE
Canvas Analysis            100.0%         18.42          4.29            -3.58
Standard Sample Analysis   96.2%          17.97          3.69            -4.11
Hybrid Sample Analysis     97.8%          18.33          3.99            -3.67

Note: BENCH VARIANCE displays the difference between the population mean and the college readiness benchmark (cut score).
Table 2 shows that the use of the standard sample method results in a variable population capture rate of
94% to 97%. This is problematic because a significant loss of student test scores adds a source of
error to the population mean. Like the canvas method of analysis, the hybrid sample method results in a stable
population capture rate: only eight (8) test scores were eliminated from the 367-score data set for each
EPAS System test type, leaving 359 scores and a consistent capture rate of 359/367 ≈ 97.8%.
Figure 1 on the following page provides a visual display of the total population frequency distribution by EPAS
System test type. The graph shows the full range of observed test scores, from 5 to 36, along the vertical axis,
and score frequency along the horizontal axis (in increments of ten (10) students). An interpretation example is
provided, showing that 69 students achieved an EXPLORE test score of 15, two points below the college
readiness cut score (benchmark). This graph also identifies the eight (8) outlier scores eliminated in the hybrid
sample method. The elimination of these scores was found to stabilize the distribution of test scores, capture the
maximum number of student data points, and produce the most representative mean and standard deviation for
the population. This allows a standard normal curve to be used to describe EPAS System student outcomes by
test type.
Figure 1: Frequency Distribution of Test Scores By Test Type
RESULTS
The hybrid sample method will be used to present student population test score results as a function of the
standard normal curve; however, both the canvas method and the standard sample will also be displayed.
This will allow other examiners to choose among these alternative methods to conduct their own analysis.
Said examiners are advised to use only one method of analysis because the blend of methods would be
statistically invalid. Each distribution presented is displayed using scores found within the observable range of
test scores for the method type.
Figure 2 on the following page shows that freshman students began their exposure to the CPM curriculum with
a mean test score of 16.2, which is 0.8 points below the College Readiness Benchmark (CRB) of 17 for this
test. Thirty-nine point three percent (39.3%) of students scored at or above the CRB. An examination of the
data for the canvas method (Figure 3) and the standard sample method (Figure 4) shows results similar to the
hybrid method; all three methods of analysis (canvas, standard, hybrid) agree closely on the EXPLORE test.
These results provide a baseline for further comparison of EPAS System test scores.
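The report does not state exactly how the "percent at or above the CRB" figures were derived; one plausible reading, given the stated use of a standard normal curve, is the area at or above the benchmark under a normal curve with the hybrid-sample mean and SD. A minimal sketch under that assumption follows; the estimates will not match the report's published percentages exactly.

```python
# One plausible reading of the "percent at or above the CRB" figures: the area at
# or above the benchmark under a normal curve with the hybrid-sample mean and SD.
# The report may instead use empirical counts, so these estimates need not match
# the published percentages exactly.
from statistics import NormalDist

def pct_at_or_above(mean_score, sd, benchmark):
    return 1.0 - NormalDist(mean_score, sd).cdf(benchmark)

print(round(pct_at_or_above(16.16, 3.04, 17) * 100, 1))  # EXPLORE, CRB 17 -> ~39
print(round(pct_at_or_above(17.19, 3.61, 19) * 100, 1))  # PLAN,    CRB 19 -> ~31
print(round(pct_at_or_above(18.33, 3.99, 22) * 100, 1))  # ACT-P,   CRB 22 -> ~18
```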
Figure 2: EXPLORE Test Scores: HYBRID Method
Figure 3: EXPLORE Test Scores: CANVAS Method
Figure 4: EXPLORE Test Scores: STANDARD Method
Figure 5 shows that the mean PLAN Test score for beginning sophomores was 17.2, which is 1.8 points below
the CRB of 19 for this test, after one year of exposure to the CPM curriculum. Twenty-eight point four percent
(28.4%) of this population scored at or above the CRB. A significant increase in the SD value indicates greater
variability in test scores among the student population than one year prior.
Figure 5: PLAN Test Scores: HYBRID Method
Figure 6: PLAN Test Scores: CANVAS Method
Figure 7: PLAN Test Scores: STANDARD Method
Figure 8 shows that the mean ACT-P Test score for beginning juniors was 18.3, which is 3.7 points below the
CRB of 22 for this test, after two years of exposure to the CPM curriculum. Eighteen point seven percent (18.7%) of
this population scored at or above the CRB. A significant increase in the SD value indicates greater variability
in test scores among the student population than one year prior.
Figure 8: ACT-P Test Scores: HYBRID Method
Figure 9: ACT-P Test Scores: CANVAS Method
Figure 10: ACT-P Test Scores: STANDARD Method
Summary of Results
Figures 11 and 12 display summary statistics across all three test forms of the EPAS System. Figure 11 shows
that the performance gap between the observed mean scores and the CRB for students exposed to the CPM
curriculum increased significantly over a two-year period: in year one (freshman to sophomore) the gap roughly
doubled, and by year two (sophomore to junior) it had roughly quadrupled relative to the baseline. Figure 12
shows that the percentage of students exposed to the CPM curriculum considered “college ready” decreased by
approximately ten percentage points per year.
Figure 11: EPAS System CRB Gap Analysis
Figure 12: EPAS System CRB Analysis by Proficiency Level
FINDINGS
Returning to the original research questions, the following findings are offered for consideration by all members
of the ZBTHS learning community.
1. Does a negative performance gap exist between ZBTHS beginning freshman student test scores and
EPAS System College Readiness Benchmarks (as measured by the EXPLORE Test)?
Results show that a significant performance gap exists between student population mean scores and CRB
standards. Research study students began their freshman year with an EXPLORE test mean scaled score
of 16.2, which is 0.8 points below the CRB for this baseline measurement test.
2. If a negative performance gap does exist: Do ZBTHS students exposed to the CPM curriculum present test
scores that indicate a closing of this gap during their freshman and sophomore year?
Results show that, given the presence of a performance gap between student population mean scores and
CRB standards, the gap did not stabilize or close with exposure to the CPM curriculum; rather, it increased
significantly from year to year. Standard deviation values indicate an increase in the variability of test scores
over time, indicating that the gap between low- and high-performing students has widened.
DISCUSSION
Based on the analysis conducted by this researcher, the following discussion questions are posed for
consideration by all members of the ZBTHS learning community in relation to school-wide implementation of
the CPM curriculum.
1. Should a constructivist curriculum be implemented for a student population characterized by high
variability in ability levels and low mean performance on college readiness benchmarks?
2. Given the presence of high variability and low mean test scores, should a Multi-Tiered System of
Support (MTSS) be implemented to address the needs of specific cohorts of math students as a
function of EPAS System test scores?
3. When considering the implementation of an MTSS, should a blend of direct instruction methods and
constructivist methods of instruction be piloted?
4. Does the school-wide implementation of CPM meet all of its program delivery requirements? Is the
curriculum delivered as designed and with fidelity?
5. Should further staff development be considered to ensure the CPM curriculum meets the needs of a
diverse student population?
6. Should an examination of incoming freshman EXPLORE test scores as a function of feeder district be
conducted to account for and adjust to variations in student readiness for exposure to the CPM
curriculum?
Recommendations For Additional Research
1. Collect anecdotal data through survey methods to determine teacher, student, and parent perception
regarding the effectiveness of the school-wide implementation of the CPM curriculum.
2. Collect beginning freshman EXPLORE data by feeder district and student performance quartiles.
3. Examine five-year historical data in relation to EPAS System CRBs and mean student population
performance. This would allow comparison of CPM curriculum effectiveness with previous math
curriculum initiatives.
4. Engage the services of a university study group or program-effectiveness consulting firm to conduct
further research into the questions listed above and to replicate the results presented in this report.
5. Repeat this analysis after ACT Test scores are obtained in the Spring of 2013.