HSPT® Interpretive Manual - Scholastic Testing Service, Inc.

High School Placement Test
Interpretive Manual
Administrative and Editorial
480 Meyer Road
Bensenville, Illinois 60106–1617
Tel: 1.800.642.6787
Fax: 1.866.766.8054
email: [email protected]
Research and Development
4320 Green Ash Drive
Earth City, Missouri 63045
Tel: 1.855.532.0787
Fax: 1.314.739.3857
email: [email protected]
www.ststesting.com
CONTENTS
Alphabetical List Report and Rank-Order List Report.......................................... 2–4
Normative Scores............................................................................. 4–6
    National Percentile (NP) Rank............................................................ 4
    Local Percentile (LP) Rank............................................................... 4
    Grade Equivalents (GE)................................................................... 5
    Cognitive Skills Quotient (CSQ).......................................................... 5
    Standard Scores (SS)..................................................................... 6
Using the Individual Results................................................................. 6–7
General Considerations....................................................................... 7–8
    Local and National Norms................................................................. 7–8
Questionable HSPT® Scores.................................................................... 8–9
Coded Student Information.................................................................... 9–10
Group Summary Statistical Report............................................................. 11–15
    Frequency Distribution................................................................... 12–13
    N-counts, Standard Score Means, and Standard Deviations.................................. 13–14
    National Percentiles for Selected Group Percentiles...................................... 15
National Percentile Group Summary............................................................ 16
Performance Profile.......................................................................... 17–19
The Performance Profile Summary.............................................................. 19
Item Analyses—Individual and Group........................................................... 20–23
    Individual Item Analysis Report.......................................................... 20–21
    Group Item Analysis Report............................................................... 22–23
Student Score Report......................................................................... 24–25
This booklet is a guide for interpreting results of STS’ High School Placement Test. It contains samples and
discussions of the following reports:
• HSPT® Alphabetical List Report
• HSPT® Rank-Order List Report
• HSPT® Group Summary Statistical Report
• HSPT® National Percentile Group Summary
• HSPT® Performance Profile
• HSPT® Individual Item Analysis Report
• HSPT® Group Item Analysis Report
• HSPT® Student Score Report
Detailed technical information about the reliability and validity of the test, along with correlations with various other standardized tests, is given in STS' High School Placement Test Technical Supplement.
Copyright © 2012, 2008, Scholastic Testing Service, Inc. All rights reserved. No part of this work may be reproduced or transmitted in any form or
by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without prior permission in writing
from the publisher.
Published by Scholastic Testing Service, Inc., Bensenville, Illinois 60106–1617.
Printed in the United States of America.
[Sample HSPT® Alphabetical List Report for SAMPLE SCHOOL (Group I.D. 00001), Grade 08, By Total Group, test date 11/22/08. For each student the report lists the name, age, and birth date; the coded information (test center, elementary school, other choices, and optional codes); the Cognitive Skills scores (Verbal, Quantitative, and Total, with standard score, CSQ, national percentile, and local percentile); the Basic Skills scores (Reading, Math, Language, and Total, with standard score, grade equivalent, national percentile, and local percentile); the optional test scores; and the Composite (without option). See explanation on page 4.]
[Sample HSPT® Rank-Order List Report ("RANK LIST ON COMP") for the same group, presenting the same columns of scores and coded information as the Alphabetical List Report, with students ordered by their Composite scores. See explanation on page 4.]
ALPHABETICAL LIST REPORT AND RANK-ORDER LIST REPORT
The test scores for a given student appear on two separate list reports: the Alphabetical List Report and the Rank-Order List Report, which ranks students from highest to lowest composite score. Unless special arrangements were made, two
copies of the Alphabetical List Report and two copies of the Rank-Order List Report are provided for your use.
Each list report is suitably labeled for convenient identification, and each copy is distinctively colored for ease in
use and distribution within the school. The basic format of both lists is identical. The reports are illustrated on pages
2 and 3.
Both reports are divided into six major columns, each of which provides a rich assortment of information about the
individual student. At the far left you will find the “STUDENT’S NAME” column, which shows each name as it was gridded on the answer sheet at the time of testing.
The second column, “CODES,” accommodates two lines of coded information. The specific codes may be located
and identified by referring to the descriptions shown at the top of this column. The value of these codes and their uses
are discussed on pages 9 and 10.
“COGNITIVE SKILLS” is the third major column, which presents the scores the student earned on the Verbal and
Quantitative subtests as well as his or her total score for these two subtests combined. The computed cognitive skills
quotient (CSQ), which replaces the traditional IQ, will be found in this column as well.
The next major column is “BASIC SKILLS,” which displays the scores attained on the Reading, Mathematics, and
Language subtests. The scores for these three subtests are combined and reported as a total basic skills score.
The fifth major column is designated “OPTION” and contains the scores for any of the optional tests—Science,
Catholic Religion, and Mechanical Aptitude—which may have been administered in conjunction with the
HSPT®. For your convenience, the optional test used is identified by a two-letter abbreviation (SC, RL, or MC)
beneath the scores. An optional local test can be used to supplement the HSPT®. STS will score a school’s local
assessment and generate raw scores and local percentiles, provided the assessment is in a multiple-choice format with
at least four foils. An optional local test must not exceed 40 items, and a school must provide an answer key to the
STS Scoring Center prior to testing.
The composite scores are provided in the sixth major column, “COMPOSITE.” The composite score indicates a
student’s total performance on the five subtests that comprise the HSPT® battery. Like any total score, it cannot be
reported when one or more of the component subtests have been omitted from the testing.
NORMATIVE SCORES
As indicated by the score legend at the bottom of the Alphabetical List Report, five different types of scores are
incorporated into a student’s test results. As may be noted in the illustration, three or four of these measures are used
in connection with each subtest or total score. The specific scores included for a given part may be identified by
abbreviations which appear at the top of the appropriate column. The five types of scores are explained below.
National Percentile (NP) Rank
The percentile-rank scale ranges from 1 to 99 and compares the performance of an individual student with that of
other students within the same grade level. More specifically, a national percentile rank indicates the percentage of
raw scores (i.e., the total number of correct responses) in the representative national norm sample that are lower than
the raw score attained by a given student. Therefore, if an individual’s raw score on the Math subtest is equal to the
64th percentile, this means the raw score was higher than 64 percent of those in the national norm sample.
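Conceptually, a percentile rank can be found by counting how many raw scores in the norm sample fall below the student's raw score. The short sketch below simply illustrates this definition; the norm sample shown is hypothetical, and actual HSPT® national percentiles come from STS' norms tables rather than any calculation of this kind.

```python
# Minimal sketch of the percentile-rank definition above.
# The norm_sample values are hypothetical, not actual HSPT norming data.

def percentile_rank(raw_score, norm_sample):
    """Percent of norm-sample raw scores that fall below raw_score, bounded 1-99."""
    below = sum(1 for s in norm_sample if s < raw_score)
    pct = round(100 * below / len(norm_sample))
    return max(1, min(99, pct))

# If 64 percent of the (hypothetical) norm sample scored below a raw score of 38,
# the student earns a national percentile rank of 64.
norm_sample = [30] * 64 + [45] * 36
print(percentile_rank(38, norm_sample))  # -> 64
```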
Local Percentile (LP) Rank
Local percentile ranks provide the same basic comparison as national percentile ranks except that the comparison
group is composed of local students rather than a national sample. In the case of your test results, the local group
consists of all of the students who were tested either at your school (if your testing was an independent effort) or in
your school system/district (if your testing was part of a coordinated, multi-school program). If a student earns a
local percentile of 71 on the Language subtest, this means the raw score was higher than 71 percent of those in
your group and/or school system/district.
Grade Equivalents (GE)
Percentile ranks compare the performance of an individual student with other students at the same grade level. Grade
equivalents compare the performance of an individual with the average performance of students at other grade levels.
Consequently, the grade equivalent scale extends across grade levels. As a normative measure, grade equivalent
scores are subject to several limitations and certain precautions must be observed:
1) Unfortunately, grade equivalents lend themselves to misinterpretation. If an eighth-grade student earns a GE
of 10.4 on the Math subtest, this does not mean that the student is capable of doing tenth-grade math. It simply means that the student can do eighth-grade math as well as an average high school sophomore can do
eighth-grade math.
2) Grade equivalents are meaningful only within the range of skills measured by the test administered. In the
case of the eighth-grade student who earns a GE of 10.4 on the Math subtest, it is clear that this individual
is doing considerably better than most eighth graders. It must be remembered, however, that such a test was
designed primarily to assess those math skills and concepts that should have been learned through the eighth
grade. If this student were given a math test designed for use at the tenth-grade level, it is very unlikely that
he or she would attain a GE of 10.4.
3) Grade equivalents should not be used as the basis for placing students at grade levels that correspond to
attained GE scores.
Cognitive Skills Quotient (CSQ)
This measure replaces the traditional IQ score, but its purpose within the school setting remains the same—to function as a predictive index of a student’s future academic performance in order to assess learning potential. Like the
IQ, the CSQ is based upon the student’s scores on both the Verbal and Quantitative subtests as well as his or her age
at the time of testing. Unlike pure intelligence tests, however, these subtests do not restrict themselves to measuring only innate abilities. Instead, the test items were carefully designed to provide various measures of cognitive skills (i.e., skills related to learning), whether such skills are innate or acquired. Consequently, the CSQ is a richer, broader measure, since the test items upon which it is based have a wider, more extensive scope than those ordinarily used in
intelligence tests.
For convenience, the CSQ was designed statistically to be interpreted in the same manner as the traditional IQ.
Thus, the following guide may be used in evaluating the CSQ:
    above 130      represents academic potential that is found in approximately the upper 3% of the school population;
    110 & above    represents academic potential that is found in the upper 25% of the school population;
    100–109        represents academic potential that is found in the second quarter of the school population—50th to 75th percentiles;
    90–99          represents academic potential that is found in the third quarter of the school population—25th to 49th percentiles;
    89 & below     represents academic potential that is found in the lower 25% of the school population;
    below 70       represents academic potential that is found in approximately the lower 3% of the school population.
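For convenience, the guide above can be expressed as a simple lookup. The sketch below is only an illustration of the table: the band descriptions are paraphrased, and the overlapping ranges ("above 130" within "110 & above," "below 70" within "89 & below") are resolved from the most extreme band inward.

```python
# Illustrative lookup for the CSQ guide above; descriptions are paraphrased.

def csq_band(csq):
    if csq > 130:
        return "approximately the upper 3% of the school population"
    if csq >= 110:
        return "upper 25% of the school population"
    if csq >= 100:
        return "second quarter of the school population (50th to 75th percentiles)"
    if csq >= 90:
        return "third quarter of the school population (25th to 49th percentiles)"
    if csq >= 70:
        return "lower 25% of the school population"
    return "approximately the lower 3% of the school population"

print(csq_band(116))  # -> upper 25% of the school population
```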
Standard Scores (SS)
A new edition of the HSPT® is published annually, and the national normative scores described thus far are developed
each year for the newest form of the test series. As a result, these normative measures are current and ensure that
students seeking admission or entering a high school can be compared with an up-to-date representative national
sample of their peers.
Establishing a new normative scale each year offers distinct advantages, but also introduces a potential problem. The
annual scale is affected by any shift in performance that might occur within the normative sample groups from
one year to another. (Such shifts have been amply demonstrated among the national samples of entering college
students.) As a consequence, performance at the 65th national percentile on the current scale may not have the same
meaning as performance at the 65th national percentile on an earlier scale. This variability—when it occurs—can be
troublesome for administrators and admission personnel who wish to compare the data for an entering group with
that obtained from groups in the past.
Suppose, for example, that the math skills of those in the national norm samples slowly declined from one year to the
next. If the math skills of your entering groups remained essentially unchanged during the same period, the normative
scores of your groups would slowly increase across the years. Such “improvement” is largely theoretical, of course,
and is merely a reflection of the declining performance of their national counterparts. In a more absolute sense or
from the standpoint of curriculum and teaching techniques, the level of your students’ math skills is unchanged. If
the math skills of your groups were eroding at the same pace as those in the national samples, however, it is likely
that their normative scores would remain essentially the same from one year to another.
The key point to be noted is that any performance shift within the national sample will be reflected—in some fashion
and to some degree—in the data for your groups and could lead to misinterpretations when year-by-year comparisons
are attempted. Since such comparisons can be extremely valuable when suitable confidence may be placed in the
conclusions, some solution to this difficulty was needed. It came in 1980 when Scholastic Testing Service, Inc. introduced the use of standard scores into the HSPT® reports.
At that time, a normalized standard score scale was developed for all subtest and total scores of the 1980 edition,
Series EE. These three-digit scores—with a mean of 500 and standard deviation of 100—are invariant from year to
year and edition to edition. Patterned after the College Entrance Examination Board procedures, all subsequent editions of the HSPT® are equated to the Series EE, and this inter-relationship is expressed in the form of standard scores
that are included in the various reports. Consequently, the standard score scale is an absolute, unchanging frame of
reference which permits group comparisons to be made year after year with precision and confidence. In most
instances a given standard score scale ranges from 250 to 750. (For those interested in specifics, the equating procedures are based upon the Rasch latent trait model. A complete explanation is contained in STS' HSPT® Technical
Supplement and HSPT® Validity Studies.)
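To make the scale concrete, the sketch below shows what a normalized scale with a mean of 500 and a standard deviation of 100 implies about base-year percentiles. It is a toy conversion for illustration only; the actual HSPT® standard scores are produced through the Rasch equating procedures referenced above, not from this formula.

```python
# Toy illustration of a normalized standard score scale (mean 500, SD 100).
# Not the actual HSPT equating procedure, which uses the Rasch latent trait model.

from statistics import NormalDist

BASE = NormalDist(mu=500, sigma=100)

def standard_score_from_percentile(pct):
    """Standard score at a given base-year percentile (1-99)."""
    return round(BASE.inv_cdf(pct / 100))

def percentile_from_standard_score(ss):
    """Approximate base-year percentile for a given standard score."""
    return round(100 * BASE.cdf(ss))

print(standard_score_from_percentile(84))   # -> about 599
print(percentile_from_standard_score(700))  # -> about 98
```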
USING THE INDIVIDUAL RESULTS
The national percentiles, local percentiles, grade equivalents, and standard scores offer each test user a variety of
perspectives within which the performance of a student may be viewed. It should be apparent that the choice of which
normative score(s) to use will vary according to the experience of the test user, his or her professional preferences,
and the particular task to accomplish. We offer the following general comments for your consideration.
STS’ HSPT® has been in continuous use since 1958. During its long history, the various editions have been administered to several million students, and an extensive number of research projects have been conducted. These have
demonstrated repeatedly that the composite score is the best single measure for predicting subsequent academic
performance. Consequently, we can recommend the use of this score in such applications as admission, scholarship
awards, general placement, and so forth. (For those interested in specifics, predictive validity studies are reported in
the HSPT® Validity Studies manual and the individual technical supplements which are published for each edition.)
Individual subtest scores should be carefully evaluated when placing students in specific courses. Based upon a survey of HSPT® users, it is evident that most schools utilize two or more subtest scores for this purpose. Thus, both the
Quantitative and Mathematics scores are frequently considered for placement in math courses, while the Verbal, Reading, and Language scores are considered for English courses, and so forth. In addition, many schools reported the use
of other criteria as well, such as elementary school grades and teacher recommendations.
Do not overlook the advantages offered by the local percentile scores. If your school tested independently—rather than
participating in a coordinated, multi-school program—your local percentiles are based solely upon the performance
of your group of students. Consequently, a student’s local percentile on a given subtest directly indicates how well or
poorly that performance compares with others in your group, regardless of how well or poorly that performance
compares with the national sample. Thus, you can easily identify the high-, average-, or low-performing students with
respect to the group itself. Such scores can be very helpful in placing students in classes formed upon similar levels of
a given skill.
As was noted earlier, standard scores function as a fixed common denominator among the various HSPT® editions.
As a result, the primary value of this scale lies in the area of evaluating group results from one year to the next.
Nevertheless, the standard scores have some applications in the area of individual test results. For example, you may
find a small number of students, all of whom attained the 99th percentile on a given subtest or the battery composite.
It is quite likely, however, that the standard scores they earned will not be identical, which allows further differentiation among them. This can be very useful in settings where a single scholarship is to be awarded.
If your school has established a cut-off score for admission, placement into an advanced math course, and so on, you
may wish to consider using a standard score cut-off rather than one of the other normative scores. Since the standard scores are an invariant measure, such a cut-off may be used year after year with the assurance that it is identifying students who have met or surpassed a consistent level of performance in a particular area. Since other national
normative measures are subject to some variability, their use as a cut-off may be less precise over a period of time.
(See discussion of standard scores on page 6.) Regardless of which measure is used as a cut-off, it is always desirable
to conduct appropriate research studies within the school to determine its effectiveness as a selection device.
GENERAL CONSIDERATIONS
Local and National Norms
The distinction between local norms and national norms is confusing for many students and parents. In non-technical
terms, each simply represents an established scale or standard of performance—a type of yardstick, so to speak—
by means of which a student’s performance can be measured and compared. In theory, the national scale and the local
scale could be very similar if not identical, but in practice rarely are. Since the two scales commonly differ (to a
greater or lesser extent), it follows that they commonly give different comparative measures (also to a greater or
lesser extent) of student performance. Such differences, particularly when they are large, can be confusing.
Of the two, the national norm scale is undoubtedly the more familiar. This scale is established on the basis of a
nationwide testing program that is conducted at the time a test battery is standardized. Thus, the national norm scale
offers the means to compare an individual’s performance (raw score) against that of a representative sample of students throughout the nation. Regardless of the type of normative score—percentile ranks, grade equivalents, standard
scores—all national norm scales are established in this manner.
The phrase “local norms” refers to the scale that is based solely upon the performance (raw score) of a given group
of students—most commonly, all those who were tested in a given school. In this context the phrase “school
norms” could be interchanged with local norms. This scale is established by ranking student raw scores on a given
subtest from highest to lowest. Whether the group’s collective performance is very high or very low with respect to the
national scale, it should be apparent that some students must be at or near the top of this ranking (these would comprise the upper 5% of the group) while others must be at or near the bottom (the lower 5% of the group). Those in
the upper 5% achieved the highest performance within this group and, regardless of their performance in terms of the
national scale, will earn local percentile ranks of 95 or above to indicate their high position in their local group. As
one might expect, those in the lower 5% will earn local percentile ranks of 5 or below to signify their bottom position
within the local group.
In effect, national and local normative scales provide two different frames of reference in which to view an individual’s performance. Therefore, it is possible for a student to obtain high national norm scores (because his or her
performance compared favorably with the national sample) and low local norm scores (because that performance fell
in the lower ranges of the local group ranking). Conversely, a student could earn high local norm scores (because his
or her performance fell in the upper ranges of the local group) and low national norm scores (because that performance was below par with respect to the national scale).
QUESTIONABLE HSPT® SCORES
It can be very disconcerting for all concerned when the reported test scores sharply disagree with our expectations
and/or other available data. Fortunately, this is not a common problem, but it merits some attention.
Group Performance. If your Group Summary Statistical Report (discussed on page 11) indicates that the students
performed unexpectedly high or low on a particular subtest, the most likely explanation is that some irregularity
occurred during the administration of the subtest. For example, reducing the specified time limits tends to lower
performance; extending them serves to raise performance. Since the unexpected results may have been caused by an
error that went unnoticed, it is often difficult for the test administrator to recognize that an irregularity occurred at
all. Nevertheless, mistakes are made—recognized or not—and the only indication may be an unexpected and unexplainable shift in the group’s performance on a single subtest. Other factors which typically lower group performance
include departing from the test directions as given in the Manual of Directions, any disruptive or distracting activity during the testing session, poor physical conditions in the room used for testing (temperature, lighting, ventilation), and so forth.
Individual Performance. Virtually all inquiries related to student performance are concerned with individuals whose
test results are lower than expected. A typical example: Lisa Smith is an excellent math student, but her math scores
are substantially below average. A typical reaction: the scores don’t make any sense.
When the unexpected test results are confined to a single individual, it is highly unlikely that administrative irregularities are responsible. Instead, one must be alert for factors that would have an impact only upon the student
involved. For those who encounter a “Lisa Smith” among their students, we offer the following suggestions:
1) Discuss the matter with Lisa at the earliest opportunity. Such a discussion may be unproductive since
the testing probably occurred several weeks earlier, making recall difficult. Nevertheless, she may remember
that the math subtest seemed especially difficult, or she found the directions confusing, or she may have
skipped one or more items (which might have led her to mark her subsequent responses in the wrong locations). You may discover that she did not feel well that day or was extremely anxious about taking the test.
Lisa may realize that you share her concern about the math scores, have reservations about their validity, and
are prepared to pursue the matter further if necessary. Most students (and parents) find such an attitude supportive and reassuring.
2) Contact the STS Scoring Center, request a verification of the math scores, and include any information that
may have a bearing on the matter. In this age of optical mark readers (electronic scoring devices), high-speed computers, and sophisticated computer programming, it is extremely unlikely that Lisa’s math
responses were erroneously scored and reported. Nevertheless, it is a legitimate question which needs and
deserves a definitive answer.
3) Inspect Lisa’s answer sheet, particularly her math responses. (Answer sheets are generally returned with
verification replies.) Excessive erasures frequently indicate uncertainty or confusion on the part of the
student. Determine whether she responded to every item on the subtest or omitted a substantial number (25%
or more). Time limits for the HSPT® are generous. Consequently, excessive omissions usually indicate that
the student found the subtest quite difficult or was overly cautious in responding, perhaps only marking
those items about which he or she felt very confident. Finally, examine Lisa’s responses to the math items,
noting each item in the test booklet used for the testing. (If one is not available, request a copy from STS.)
If possible, examine the responses with Lisa and discuss those that are incorrect. Such a session can be
very enlightening for both you and the student.
Needless to say, these suggestions require additional time and effort, but they will yield the maximum amount of
information about the subtest in question. In the vast majority of instances it is possible to arrive at a definitive conclusion regarding the validity of questionable test scores.
CODED STUDENT INFORMATION
[Sample special coding grid illustration]
The HSPT® answer sheets, both two- and four-page
sheets, contain a special code grid immediately below
the student name grid. This grid contains space for marking an individual’s: a) elementary school code, b) first,
second and third choice high school codes, c) other
codes, and d) optional codes that may be needed.
These sections (elementary school, high school choice(s),
other codes, and optional codes) offer a total of twenty-three columns, each of which contains response positions numbered zero through nine. Any marks entered in
these columns automatically appear in appropriately designated locations on the HSPT® report materials. (Please
note that this information is specific only to the use of the
general HSPT® Manual of Directions.)
For those who wish to use the special code grid in their HSPT® programs, it should be understood that certain
preparations must be made prior to the test date. For example, if each student is to identify his or her elementary
school, it is necessary to develop a list of all elementary schools represented in the group (or in the area served by
the high school), so that each school may be assigned a unique 3-digit code. In most settings such a “code list” is
reproduced in sufficient quantity to provide each student with a copy on the day of testing. Experience has clearly
established two basic rules for code lists. First, assigned codes should never include leading zeros (e.g., 001) since
these tend to be ignored by students; second, a general code (e.g., 999) should be included to be used, for example,
when a student cannot find his or her elementary school among those shown on the code list.
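As a sketch of those two rules, the hypothetical helper below assigns unique three-digit codes without leading zeros and reserves 999 as the general code. The school names, function name, and starting value are illustrative only, not part of any STS procedure.

```python
# Hypothetical code-list builder following the two rules above:
# no leading zeros in assigned codes, and a general code (999) for "not listed."

def build_code_list(school_names, start=100, general_code=999):
    """Assign unique 3-digit codes and reserve a general catch-all code."""
    if start < 100:
        raise ValueError("three-digit codes without leading zeros start at 100")
    if len(school_names) > general_code - start:
        raise ValueError("too many schools for the available code range")
    codes = {name: start + i for i, name in enumerate(sorted(school_names))}
    codes["Other / not listed"] = general_code
    return codes

for name, code in build_code_list(["St. Anne School", "Lincoln Elementary"]).items():
    print(code, name)
```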
Some schools may wish to include other codes specific to their HSPT® programs. For example, if an individual
school wishes to know what foreign language each incoming student hopes to study, this information may be coded in the
“OTHER CODES” column. The school might offer five foreign language courses, and they will be arbitrarily coded
“10” to “50.” A code of “60” is assigned to an “Undecided or None” category. In column A under “OTHER CODES,”
the students will write the appropriate code to show their language preference. Students will use column B under
“OTHER CODES” to show previous study of the language. A “10” in column B means “yes, previous study”; a “20”
in column B means “no previous study.” Students who marked “60” in column A (undecided or no interest) are directed to leave column B blank. In other situations, more specific information may be requested from
the schools and/or students. Under “OPTIONAL CODES,” a school system may have school identification numbers,
Social Security numbers, or additional special coding included in this section.
When developing response possibilities within a given area, care must be exercised to ensure that only one response
can be selected from the list since the related column (or columns) can accept only a single coded response. Multiple
responses within the same column generate a “multi-mark condition” which electronic scanning devices are programmed to disregard as ambiguities. Whether the special code grids are used for their designated purposes or in
connection with a questionnaire, the appearance of the coded responses on the HSPT® reports often eliminates the
need to search for such information in other files or lists, which simplifies the use of the results.
It should also be noted that STS can produce any of the reports discussed in this manual based upon student responses in the special code grids. Thus, separate alphabetical listings could be developed for each of the elementary school
codes that are represented in a given code list. Similarly, listings could be produced for students who are planning to
attend college, junior college, trade school, and any other category that might be included in an educational goals
category. Of course, such reports are provided only upon request and increase the cost of HSPT® programs.
Nevertheless, a growing number of schools have discovered that the nominal cost is more than offset by such advantages as convenience, immediate availability of the data, and more effective use of personnel time.
[Sample HSPT® Group Summary Statistical Report for SAMPLE SCHOOL (Group I.D. 00001), Grade 08, By Total Group, test date 11/22/08, group size 16. For each subtest, total, optional test, and the Composite, the report shows: a frequency distribution of students by national stanine band and national percentile interval, grouped into High, Average, and Low ranges; N-counts, standard score means, and standard score standard deviations; and the national percentiles corresponding to the group's 75th, 50th, and 25th percentiles. See explanation on pages 12–15.]
THE GROUP SUMMARY STATISTICAL REPORT
As its name suggests, this report summarizes the performance of each distinct group of students that participated in
a given HSPT® testing session. In most instances the students constitute an integrated whole; consequently, most
schools receive two copies of a single Group Summary Statistical Report. Quite simply, its purpose is to provide an
overall picture of the collective performance of the individuals who were tested. More specifically, it presents a distribution of their scores in terms of two different national norm scales, reports the means and standard deviations of the
standard scores, and relates selected levels within the group to the national percentile-rank scale. A sample of the
Group Summary Statistical Report is given on page 11.
Frequency Distribution
The frequency distribution occupies the upper portion of the Group Summary Statistical Report. The column at the
far left contains a listing of the national stanine and national percentile-rank scales while the adjacent column divides
these scales into their high (76th to 99th percentiles), average (24th to 75th percentiles), and low (1st to 23rd percentiles) components. (A table showing the fixed relationship between stanines and percentiles is shown below.) To the
right of this display, under “FREQUENCY DISTRIBUTION,” you will find the number (frequency) of students and
the percentage within the group who earned a given national norm score on each subtest, the totals for the cognitive
and basic skills tests, the optional test if administered, and the composite score.
Thus, a look at the sample reveals that on the Reading subtest, 2 students (13%) earned a national percentile of
81–84, which in turn is equivalent to a national stanine of 7. By combining selected data points, it can be determined
that the Reading performance of 2 students (12%) fell within the upper 12% of the national normative sample (88th
percentile), which corresponds to the 8th national stanine band, and that a total of 7 students (44%) earned Reading
scores which fell in the high range of the national scales. From the data points shown for Math, it may be determined
that the performance of 2 students (13%) fell within the 4th stanine band. For Language, a total of 3 students (19%)
fell at or below the national average (50th percentile), and 1 student (6%) fell in the low range of the national
scales.
Needless to say, the focus of this report is based upon the group rather than individuals. Hence, if one wishes to
identify the students who attained a national percentile of 99 for the composite score, it would be necessary to search
the list of individual student results to discover their names.
Fixed Relationship Between Stanines and Percentiles
    Stanine      Percentile     Rating
    9, 8, 7      76–99          High
    6, 5, 4      24–75          Average
    3, 2, 1      01–23          Low
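For illustration, the sketch below tallies a list of national percentile ranks into the High, Average, and Low ranges defined in the table, producing frequency and percent figures of the kind that appear in the report. The percentile values used here are invented, not the sample group's actual data.

```python
# Sketch: tallying national percentile ranks into the High/Average/Low ranges above.
# The percentile list is invented for illustration.

def rating(national_percentile):
    if national_percentile >= 76:
        return "High"
    if national_percentile >= 24:
        return "Average"
    return "Low"

def tally(percentiles):
    counts = {"High": 0, "Average": 0, "Low": 0}
    for p in percentiles:
        counts[rating(p)] += 1
    n = len(percentiles)
    return {band: (freq, round(100 * freq / n)) for band, freq in counts.items()}

reading_np = [88, 83, 78, 71, 64, 58, 54, 51, 47, 47, 35, 33, 26, 23, 21, 10]
print(tally(reading_np))
# -> {'High': (3, 19), 'Average': (10, 62), 'Low': (3, 19)}
```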
In essence, the frequency distributions are a graphic display of your students’ scores on each component of the HSPT®,
and an analysis of these data can provide useful insights into their performance characteristics. You may wish to begin
simply by noting the highest and lowest points represented in a given distribution. These points define not only the range
of skills possessed by your group in terms of the national scales, but also the scope of the local percentile scale developed
for a given subtest. Consequently, it is possible to obtain a general impression of the relationship between the local and
national scales. For example, the range of “TOTAL BASIC SKILLS” scores of the sample group, shown on page 11,
extends from the 10th–11th interval of the national percentile scale to the 92nd–93rd interval. Since this range also
marks the limits of the local percentile scale, the students with the lowest local percentiles (the bottom 6% of the group) fall in the 10th–11th national percentile interval, while those with the highest local percentiles (the top 6%) fall in the 92nd–93rd interval.
The distributions may also be used to determine the number and/or percentage of students represented in the
high, average, and low categories or in some other categorical scheme of your own devising. Such information can
be useful in establishing the number or percentage of students who are likely candidates for admission or placement
in your setting and the relative range of skills represented in the defined categories.
In some settings the skills of the group will range from the lowest end of the national scales to the highest, and the
majority of scores will occur in the average category, with the balance divided more or less evenly between the two.
Such groups may be described as typical or normal: if their scores were plotted in a conventional graph, the
resulting curve would approximate the familiar bell-shaped or normal curve. In other instances, however, the majority of scores will occur in either the high or low categories or exhibit marked tendencies in one of these directions.
Such groups may be described as atypical and often are the result of a school’s location, general reputation, or
other factors which attract a much more homogeneous group of prospective students.
Whether your group is typical or highly unusual, the frequency distributions can assist you in recognizing both
the specific and general performance characteristics of your applicants, and in forming preliminary judgments related
to admission or placement factors.
N-counts, Standard Score Means, and Standard Deviations
As noted earlier, the standard score scale is an invariant measure based upon the 1980 national normative sample.
This scale has the following characteristics:
• the average standard score is 500
• 700 corresponds approximately to the 98th percentile
• 600 corresponds approximately to the 84th percentile
• 400 corresponds approximately to the 16th percentile
• 300 corresponds approximately to the 2nd percentile
The standard score means or averages shown for your students are based upon this scale, and in effect compare their
performance with that of the 1980 national sample. Such a comparison may be of interest in itself, but the greatest value
of the scale lies in its ability to function as a common denominator between various editions of the HSPT®. Thus, it
forms a bridge between your current group and previous groups, and allows you to make direct comparisons of their
respective performance levels.
When comparing two groups of students, each consisting of 100 or more individuals, differences as small as 4 or 5
points between standard score means are statistically significant; that is, one can conclude with reasonable confidence that the observed difference stems from a true difference in test performance rather than the occurrence of
chance variations. As either the size of the groups or the magnitude of the difference increases, the same conclusion
may be drawn with even greater confidence. One must also recognize, however, that a difference which is statistically significant does not always possess practical significance. While differences in the range of 5 to 40 standard
score points are statistically significant for groups of 100 or more, such differences are not large enough to warrant
any special concern other than noting their occurrence and the direction of the shift. In other words, the skill level of
the two groups—while measurably different—is sufficiently similar to be considered equivalent for all practical purposes. Consequently, differences in this range lack practical significance.
As one might expect, observed differences in excess of 40 standard score points require more than a passing comment
on your part. Values in this range are indicative of substantial differences in test performance between groups, and
thus, signify major differences in their respective skill levels. When confronted by differences of this magnitude,
attention should be focused upon the curriculum related to the area in which the excessive difference was observed.
For example, if the standard score mean for Math of the current group were 45 to 50 points lower than that earned
by an earlier group, one would be well-advised to re-evaluate the math curriculum with respect to its suitability for
a group whose math skills are substantially weaker than those of previous students. A separate remedial program
might also be considered for those whose individual standard scores in Math are well below the mean of the current
group. Conversely, if the math performance of the current group were 45 or 50 points higher than earlier students, it
might be appropriate to increase the scope, pace, or depth of the curriculum to accommodate or even challenge their
higher level of math skills.
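A simple year-over-year check along the lines described above might look like the following sketch, which flags only differences larger than roughly 40 standard score points for curriculum review. The 40-point threshold reflects the guidance in this section; the means shown are invented examples, not actual group results.

```python
# Sketch of the year-to-year comparison described above. Differences beyond roughly
# 40 standard score points are flagged for curriculum review; smaller shifts are
# simply noted. The example means are invented.

PRACTICAL_THRESHOLD = 40  # standard score points

def compare_means(current_mean, prior_mean, area):
    diff = current_mean - prior_mean
    if abs(diff) > PRACTICAL_THRESHOLD:
        direction = "higher" if diff > 0 else "lower"
        return f"{area}: current group is {abs(diff)} points {direction}; review the curriculum"
    return f"{area}: difference of {diff:+d} points; note it and monitor the trend"

print(compare_means(505, 552, "Math"))      # large drop -> flag for review
print(compare_means(541, 536, "Language"))  # small shift -> monitor only
```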
It should be noted that differences in excess of 40 points usually are not observed between groups whose testings are
separated only by a year or two. Typically, year-by-year comparisons yield differences well within the 5–40 range noted
earlier. However, if a given trend continues over an extended period, the accumulated differences (or the difference
between the initial and current groups) can reach proportions that merit serious attention. In other words, substantial
changes in performance are more likely to creep into view than burst dramatically upon the scene. Consequently, for
those who wish to monitor this aspect of the HSPT®, it is vital to retain the data obtained from each testing for use
in subsequent analyses.
Finally, one should not lose sight of the fact that a standard score mean reflects the general performance level of the group
in a given area, but it offers no insights regarding the specific skills which underlie that performance. It may be clear,
for example, that the language performance of your applicants is declining, but this fact sheds no light upon
which specific skills have deteriorated and thus contributed to the decline. In settings where curricular modifications or remediation programs are under consideration, information concerning the relative strengths and weaknesses of specific skills can be especially useful. Such information can be provided in the form of two different
reports—the Performance Profile and the Individual or Group Item Analyses—which are discussed later in this manual.
National Percentiles for Selected Group Percentiles
As was noted earlier, the frequency distributions present a very detailed picture of your students' performance by
reporting the exact number of individuals occurring in each national percentile interval and/or stanine band. The
purpose of this section of the Group Summary Statistical Report is to provide an abbreviated description of your
group's performance, and in doing so, to refocus attention upon their performance as compared with their peers in
the current national normative sample. At the far left of this section are the selected rank positions within the group
(i.e., the group or local percentiles). The 75th %-ile represents the typical performance of those in the upper half of
your group, the 50th %-ile indicates the typical performance of the group as a whole, and the 25th %-ile reflects the
typical performance of those in the lower half of the group. Immediately to the right, in each of the columns related
to test performance, are the national percentile ranks attained by your group as a whole as well as those in the upper
and lower segments.
If you wish to evaluate the typical or average performance of your group (i.e., the 50th percentile or the median),
your attention would be directed to the national percentiles that appear in each of the test-related columns on the same
line as the phrase “50th %-ile.” Any national percentile of 50 indicates that the average performance of your group
is the same as the average performance within the national sample; that is, the 50th percentile of your group corresponds to the 50th percentile for the national sample. Any national percentile greater than 50 indicates that the
typical performance of your students was higher than that of the national sample; any below 50 indicates lower performance. As may be seen in the sample Group Summary Statistical Report on page 11, the average performance of
that group was above the national average on every component of the HSPT® ranging from 53 for Math to 74 for
Language.
If the average performance of the upper half of your group is under consideration (i.e., the 75th percentile, or the
upper 25% of your group), you would note the national percentiles that appear in each test-related column on the
same line as the phrase “75th %-ile.” If, for example, the performance on the Language subtest from this segment
of your group were equal to their counterparts in the national sample, you would find a national percentile of 75 in
the Language column. Any value higher than 75 would indicate that the performance of this segment (the upper 25%)
was higher than that of the upper 25% of the national sample; any below 75 would signify lower performance. As
may be noted, this segment of the sample group outperformed those in the national sample by earning national percentiles above 75 on every component except Quantitative Skills and Math. To be more specific, the average performance of those in the upper half of that group (75th %-ile) was equivalent to a national percentile of 81 for Reading;
hence, their performance exceeded 75% of those in the local group and 81% of those in the national sample. In other
words, those in the upper 25% of the illustrated group are in the upper 19% of the national sample in this subject
area.
The data given for the average performance of those in the lower half of your group (25th %-ile) may be analyzed in
a similar manner. One must remember, of course, to adjust the level of comparison to correspond to the level of the
segment being evaluated. As may be noted in the sample Group Summary Statistical Report, this segment outperformed their counterparts in the national sample in every area, ranging from 27 for Math to 62 for Verbal Skills.
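The sketch below shows one way to derive these three reference points: compute the group's 25th, 50th, and 75th percentile standard scores and express each as a national percentile. Both the group scores and the norm lookup are hypothetical stand-ins; the report itself uses STS' actual norms tables.

```python
# Illustrative sketch: the group's quartile standard scores expressed as national
# percentiles. Group scores are invented; the norm lookup treats the national
# scale as normal (mean 500, SD 100) purely for illustration.

from statistics import NormalDist, quantiles

def national_percentile(standard_score):
    return round(100 * NormalDist(500, 100).cdf(standard_score))

group_ss = [641, 617, 608, 596, 594, 581, 579, 563,
            541, 536, 528, 520, 511, 494, 491, 450]
q1, median, q3 = quantiles(group_ss, n=4)  # 25th, 50th, and 75th group percentiles

for label, value in [("25th %-ile", q1), ("50th %-ile", median), ("75th %-ile", q3)]:
    print(f"{label}: SS {value:.0f} -> national percentile {national_percentile(value)}")
```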
HSPT® National Percentile Group Summary
[Sample High School Placement Test National Percentile Group Summary Results for SAMPLE SCHOOL (Group I.D. 00001), Grade 08, Group Size 16, By Total Group, Date 11/22/08, Form S.

The sample report states: “The following charts allow you to look at three different sub-populations in your testing group, and to compare them to the national average. Note that the option test is not included in the computation of the Battery Composite score. A line has been drawn where your group would match the national average.” The first chart compares the total group’s median or average performance to the national average; the second chart compares the top 25% of the group to the top 25% of the nation; and the third chart compares the bottom 25% of the group to the bottom 25% of the nation. Each chart plots national percentiles for Verbal, Quantitative, Cognitive Skills, Reading, Math, Language, Basic Skills, the optional test, and the Composite.]
NATIONAL PERCENTILE GROUP SUMMARY (See above report).
The National Percentile Group Summary is a report that displays the results of the HSPT® testing program in three
different ways. The first bar graph compares your total group’s median or average performance to the national average. The second bar graph compares the students in the top 25% of your group to the top 25% of the nation. Finally, the third bar graph compares the bottom 25% of the group to those in the bottom 25% of the nation. While the bars represent the local group of students, a horizontal line
has been drawn to show where your group compares to the national average. Please note that any optional test is not
included in the computation of the Battery Composite score.
[Sample HSPT® Performance Profile for a sample student (James Aragonman) at SAMPLE SCHOOL (Group I.D. 00001), Grade 8, Form S, test date 11/22/08, with the Science option. See explanation on pages 18–19. The profile contains three sections: Performance Scores, which list the standard score (SS), national and local percentile-stanine (PCT-ST), grade equivalent (GE), and cognitive skills quotient (CSQ) for each major test area; Performance Ratings, which display each national percentile as a band on a graph divided into Low, Below Average, Average, Above Average, and High columns; and Specific Skills, which report the number of items and the number answered correctly for each specific skill within Verbal, Quantitative, Reading, Mathematics, and Language, each rated Low, -AVG, AVG, +AVG, or High.]
THE PERFORMANCE PROFILE
Upon request this diagnostic report is provided for each student within a group. Generally speaking, it offers a unique
blend of information about student performance in that it not only provides the general scores attained by an individual, but also indications of his or her performance on the specific skills assessed by the HSPT® battery. School
personnel will find the convenient size and wealth of data quite useful for a wide range of purposes. The individualized character of the report, its graphic displays, and self-contained explanations make it an ideal report for distribution to the students and parents. A sample of the Performance Profile is shown on page 17.
The upper portion of this report focuses upon the student’s performance on the various subtests of the HSPT®. The
subtests are identified in the “MAJOR TEST AREAS” section, and the various scores are displayed in the
“PERFORMANCE SCORES” section. In addition to the scores provided in the other report materials, the Performance
Profile includes both local and national stanine scores. Stanines utilize a 9-point scale on which 9 represents the
highest performance, 5 the average, and 1 the lowest. One important advantage of stanines is their basic simplicity,
which some students and parents find less confusing than other types of scores.
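As an illustration, the sketch below converts a national percentile rank to a stanine using the conventional normalized-stanine cut points (approximately the 4th, 11th, 23rd, 40th, 60th, 77th, 89th, and 96th percentiles). These cut points are a common convention rather than the exact HSPT® conversion table, although they do reproduce the percentile-stanine pairs shown on the sample profile.

    # Sketch: percentile-rank-to-stanine conversion using conventional
    # normalized-stanine cut points; the actual HSPT conversion tables
    # may differ slightly at the boundaries.

    STANINE_CUTS = [4, 11, 23, 40, 60, 77, 89, 96]  # lowest percentile in stanines 2-9

    def stanine(percentile_rank):
        """Map a percentile rank (1-99) to a stanine (1-9)."""
        score = 1
        for cut in STANINE_CUTS:
            if percentile_rank >= cut:
                score += 1
        return score

    for np_rank in (73, 84, 83, 92, 88):  # national percentiles from the sample profile
        print(f"NP {np_rank} -> stanine {stanine(np_rank)}")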
At the far right is the “PERFORMANCE RATINGS” section. Here the student’s performance is presented in a
graphic display of X-bands. The national percentile rank earned by the individual lies near the center of a given
X-band, and its width reflects any variation in measurement that might be likely to occur. The shaded and unshaded
areas of the graph depict the various levels of performance, and the national percentile rank and stanine scales are
shown at the bottom for reference.
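The two interpretive rules described here, judging a score by the rating column in which its band falls and treating two scores as genuinely different only when their bands do not overlap, can be expressed in a few lines. In the sketch below the band half-width of 8 percentile points and the exact column cut points are assumptions chosen only for illustration; the printed report derives band width from the measurement error of each subtest.

    # Sketch of the two Performance Ratings rules.  The half-width and the
    # thirds boundaries (33/66) are illustrative assumptions.

    HALF_WIDTH = 8  # hypothetical X-band half-width, in percentile points

    def rating(np_rank):
        """Rating column per the report's definitions: Low = lowest 10%,
        Below Average = lower third excluding lowest 10%, Average = middle
        third, Above Average = upper third excluding highest 10%,
        High = highest 10%."""
        if np_rank <= 10:
            return "LOW"
        if np_rank <= 33:
            return "-AVG"
        if np_rank <= 66:
            return "AVG"
        if np_rank <= 90:
            return "+AVG"
        return "HIGH"

    def likely_different(np_a, np_b, half_width=HALF_WIDTH):
        """True only when the two bands (score +/- half_width) do not overlap."""
        return abs(np_a - np_b) > 2 * half_width

    print(rating(83), rating(92))    # +AVG HIGH
    print(likely_different(83, 92))  # bands overlap -> False
    print(likely_different(73, 92))  # bands do not overlap -> True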
The mid-portion of the Performance Profile offers brief explanations of the “PERFORMANCE SCORES” and
“PERFORMANCE RATINGS” shown in the upper portion, as well as the “SPECIFIC SKILLS” data shown in the
lower portion.
The lower portion of this report presents a listing and graphic display of the “SPECIFIC SKILLS” assessed by the
five subtests of the HSPT®. Performance is indicated by the number of items answered correctly—“NO. RIGHT”
column—and the total number of items related to the skill is given as a frame of reference—“# OF ITEMS” column.
As a further aid to interpretation, the student’s performance is indicated by a single “X” in one of five columns which
have the same meaning as the shaded and unshaded columns in the “PERFORMANCE RATINGS” section in the
upper portion. A more complete description of the specific skills appears on the back of the report.
As may be noted in the sample report for James Aragonman on page 17, the Reading subtest is divided into two major
categories: Comprehension (40 items) and Vocabulary (22 items). James correctly answered 32 of the 40
Comprehension items. The location of the single “X” indicates that his performance was in the above-average
(“+AVG”) range when compared with the national sample.
The category of Comprehension is further divided into Vocabulary in Context (7 items), Literal Comprehension (7
items), Inferential Comprehension (17 items), and Critical Comprehension (9 items). The report shows the number of correct responses for each of these more specific areas as well as the ratings those responses earned. Thus, this
student correctly answered 6 of the 7 Vocabulary in Context items, which yielded an above-average rating. Some areas
are divided even further. For example, the seventeen items related to Inferential Comprehension consist of two items dealing with main ideas, seven items dealing with drawing conclusions, three items dealing with reasoning, and five items
dealing with implied characteristics.
­18
Occasionally a skill of relatively minor importance is represented in a subtest by only two or three items so that areas
of greater importance may be assessed more fully or in greater depth. Whenever such a skill is measured by three or
fewer test items, the student's performance may be reported only in terms of the number of test items involved and the
number correctly answered. In these cases an "X" is not displayed on the graph, since it is difficult to provide a statistically reliable rating based upon such a small item base.
The primary advantage of the Performance Profile lies in its ability to communicate both the general performance
levels of the student as well as a more detailed picture of his or her specific skills. This approach can provide useful
insights for both school personnel and the student. Depending upon the specific factors involved in an individual
case, low or below-average ratings on a specific skill may be acceptable or even expected. If this is not the case,
however, attention is focused upon achievement weaknesses that might otherwise go unnoticed.
THE PERFORMANCE PROFILE SUMMARY
A Performance Profile Summary is developed for each group of students for whom this report is requested. Its purpose is identical to that of the Group Summary Statistical Report provided in connection with the Alphabetical
List Report and Rank-Order List Report—to present an overall picture of the collective performance of a group of
individuals.
In appearance, the Performance Profile Summary is virtually identical to those provided for the individual students. It differs, of course, in that its contents reflect group rather than individual performance. This is accomplished by computing averages for the group with respect to both the general test scores in the upper portion and
the number of items correctly answered for the specific skills in the lower portion. These average values are presented in the appropriate locations and are displayed graphically as well. Note that local percentile and stanine averages are not shown since such values would invariably be 50 and 5 respectively for any of the general scores.
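The averaging involved is straightforward, as the sketch below illustrates with a few hypothetical "number right" records; the actual summary applies the same computation to every general score and every specific skill.

    # Sketch: group averages for a Performance Profile Summary.
    # The records below are hypothetical illustration data.

    from statistics import mean

    number_right = {
        "Comprehension (40 items)": [32, 28, 21, 35],
        "Vocabulary (22 items)":    [15, 18, 11, 20],
    }

    for skill, counts in number_right.items():
        print(f"{skill:26s} average number right: {mean(counts):.1f}")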
­19
HSPT® Individual Item Analysis Report (sample report; see explanation on page 21)

[Sample Individual Item Analysis Report for Reading: SAMPLE SCHOOL, Total Group, 00001, Grade 08, Form K, test date 11/22/08, run date 11/12/08. The 16 students in the group are listed alphabetically by last name (Aragonman through Vugorska), and for each student the report prints a coded response to every item: a "+" for a correct response, a letter for the incorrect choice selected, or a blank for no response. Item numbers are printed vertically above the responses and are grouped under the objectives of the OBJECTIVE/SKILLS OUTLINE shown at the right of the report, which is numbered 1 through 14:

COMPREHENSION
  VOCABULARY IN CONTEXT: 1 Meaning from Context, 2 Multi-Meaning Words
  LITERAL COMPREHENSION: 3 Details, 4 Cause and Effect
  INFERENTIAL COMPREHENSION: Main Idea or Title, Drawing Conclusions, Reasoning, Implied Characteristics
  CRITICAL COMPREHENSION: Author's Purpose/Theme, Comparison and Contrast, Author's Qualifications, Predictions, Fact vs. Fiction
VOCABULARY]

­20
ITEM ANALYSES—INDIVIDUAL AND GROUP
The test results provided on such reports as the Alphabetical List Report and the Group Summary Statistical Report
allow a test user to determine achievement levels for any individual or the group as a whole. In some settings it may
be sufficient simply to know, for example, that the math skills of a student are average in terms of the national normative sample or that those of the group are at essentially the same level as earlier groups. In other settings, however,
where the focus of attention is upon the specific skills or objectives which underlie general performance, there is a
legitimate need for test data reflecting such skills.
The Performance Profile, discussed earlier, allows a test user to gain some insight into these specific skills. However,
the Individual Item Analysis Report and Group Item Analysis Report extend this insight to its fullest by providing
performance information on an item-by-item basis and relating it to a comprehensive outline of specific skills or
objectives. In short, item analysis reports equip the test user to make as penetrating an evaluation of specific performance as his or her purpose may require.
Individual Item Analysis Report
A sample Individual Item Analysis Report is shown on page 20. As may be noted, the test results are presented in
alphabetical order and restricted to a single subject area–Reading in this instance. At the far right is the “OBJECTIVE/
SKILLS OUTLINE” column which identifies the specific skills or objectives measured by the items in this subtest. Major categories within the outline (e.g., “COMPREHENSION”) reappear as the first line of information in
the body of the report as a general reference for the data given below. Each skill or objective within a major category
carries an identifying number, and these are presented as the second line of information in the body of the report. The
item numbers related to a given objective (e.g., 1–Meaning from Context) appear beneath its identifying number and
constitute the third, fourth, and fifth lines of information (item numbers must be read vertically). Thus, as may be
seen in the sample, items 116, 129, and 134 deal with skill 1–Meaning from Context within the major category of
“VOCABULARY IN CONTEXT.”
Student results are reported in terms of the individual’s response to each test item: a “+” indicates a correct response,
a letter indicates the incorrect response that was made, and a blank signifies that the student made no response to
the test item. As may be seen in the sample, James Aragonman correctly answered two of the three items related to
objective 1–Meaning from Context. He elected answer choice D (an incorrect answer) for item 134.
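Because the coding is so regular, tallying a student's results for an objective is a simple mechanical step. The sketch below applies the convention just described to the cluster for objective 1–Meaning from Context; the coded responses mirror the example above, and the mapping of items to the objective is taken from the sample report.

    # Sketch: tallying correct responses per objective from the report's
    # coding, where "+" = correct, a letter = the incorrect choice made,
    # and a blank = no response.

    objective_items = {
        "1 - Meaning from Context": [116, 129, 134],
    }

    # coded response for each item number (mirrors the example above)
    coded = {116: "+", 129: "+", 134: "D"}

    for objective, items in objective_items.items():
        right = sum(1 for i in items if coded.get(i) == "+")
        omitted = sum(1 for i in items if coded.get(i, " ").strip() == "")
        print(f"{objective}: {right} of {len(items)} correct, {omitted} omitted")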
When using the Individual Item Analysis Report, one must not lose sight of its purpose, which is essentially diagnostic. Accordingly, it directs attention to student performance on individual items related to specific skills, rather than
focusing on a set of normative scores. In this context, evaluation of a student’s performance must be based upon your
knowledge of the subject area and the available information concerning the student, his or her educational background,
and so forth. If a given objective/skill was included in a school's curriculum, perhaps even emphasized, your expectations would be vastly different than if the objective/skill were commonly excluded or treated lightly. Incorrect responses
should be examined by referring to a test booklet. (If one is not available, request a copy from STS.) It is often possible to discover a pattern to the errors on an objective/skill that could provide the basis for remedial instruction.
It should be apparent that this evaluative procedure is virtually identical to that applied to criterion-referenced test
results. Needless to say, it is an intensely individualized process, but for this very reason can produce the most useful
and meaningful assessments of the specific strengths and weaknesses of the students.
­21
­22
HSPT® Group Item Analysis Report (sample report; see explanation on page 23)

[Sample Group Item Analysis Report for Reading: SAMPLE SCHOOL, Total Group, 00001, Grade 08, Group Size 16, Form K, test date 11/22/08, run date 11/12/08. For each objective in the CONTENT OUTLINE, the report lists the objective number, the group's average p-value (AVG-P), and each item in the cluster with its national p-value and group p-value (NT-P = National P-Value, GP-P = Group P-Value). The CONTENT OUTLINE on the right side of the report matches that of the Individual Item Analysis Report: Comprehension (Vocabulary in Context: Meaning from Context, Multi-Meaning Words; Literal Comprehension: Details, Cause and Effect; Inferential Comprehension: Main Idea or Title, Drawing Conclusions, Reasoning, Implied Characteristics; Critical Comprehension: Author's Purpose/Theme, Comparison and Contrast, Author's Qualifications, Predictions, Fact vs. Fiction) and Vocabulary.]
Group Item Analysis Report
A Group Item Analysis Report is provided routinely when an Individual Item Analysis Report is requested, but it may
be ordered without the individual student data if so desired. In either case its purpose is the same—to provide an
overall perspective of the collective performance of the group on a single subtest. A sample of this report is shown on
page 22.
As in the case of the Individual Item Analysis Report, the specific objectives/skills measured by a given subtest are displayed in the “CONTENT OUTLINE” column on the right side of the form. At the far left in any given line of
information, you will find an objective number, the average percentage of students in the group who correctly
answered the cluster of items, and the individual item numbers themselves. In addition, each item number is shown
with the percentage of students in the national sample—“NT-P”—and in your group—“GP-P”—who correctly
answered it. Such percentages conventionally are termed p-values.
When examining this report, a useful entry point is the average p-values—“AVG-P”—shown for your group. Each
average p-value indicates the average percentage of students who correctly answered the cluster of items related to a
given objective or skill. In effect, the average p-values present a concise summary of the group’s performance with
respect to the assessed objectives/skills. Generally speaking, those in the lower range of the reported values represent
weaker group performance while those in the upper range reflect stronger group performance. As you might expect,
in this context terms such as “weaker” and “stronger” necessarily are relative terms whose significance will vary from
one group to another.
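Expressed computationally, an item's p-value is simply the percentage of examinees who answered it correctly, and an AVG-P is the mean of the group p-values for the items in a cluster. The sketch below uses hypothetical group p-values and a hypothetical two-objective outline to show the calculation.

    # Sketch: item p-values and objective-level average p-values (AVG-P).
    # The item-level group p-values below are hypothetical.

    from statistics import mean

    group_p = {                     # item number -> % of the group answering correctly
        116: 81, 129: 69, 134: 75,  # objective 1 - Meaning from Context
        115: 88, 123: 94,           # objective 2 - Multi-Meaning Words
    }

    objectives = {
        "1 - Meaning from Context": [116, 129, 134],
        "2 - Multi-Meaning Words":  [115, 123],
    }

    for name, items in objectives.items():
        avg_p = mean(group_p[i] for i in items)
        print(f"{name:26s} AVG-P = {avg_p:.0f}")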
In most settings the test user will find it necessary to turn to the individual item numbers and determine how the group’s
p-values compare with the national p-values. Needless to say, such a procedure gives rise to a more comprehensive
view of the group’s performance, which in turn allows one to develop a fuller appreciation of the average p-values. For
example, in the sample it may be seen that objective 1 has an average p-value of 75. This value falls in the middle range
of those reported for this group. Upon examining the individual data, however, it is clear that the group excelled on
one of the items in the cluster, but trailed the national normative sample on the remaining three. It would be very
worthwhile to inspect the latter test items in the test booklet and determine the specific content which posed such
difficulty for most of the students in this group.
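Continuing the previous sketch, the few lines below show one way to flag, within a cluster, the items on which the group trails the national sample; all item numbers and p-values here are hypothetical.

    # Sketch: flagging items whose group p-value (GP-P) trails the national
    # p-value (NT-P).  All values below are hypothetical.

    cluster = {            # item number -> (NT-P, GP-P)
        141: (72, 56),
        148: (65, 60),
        155: (58, 75),
        162: (80, 88),
    }

    for item, (nt_p, gp_p) in sorted(cluster.items()):
        flag = "trails national" if gp_p < nt_p else "meets or exceeds national"
        print(f"Item {item}: NT-P {nt_p}, GP-P {gp_p}  ({flag})")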
As should be apparent, one approaches the Group Item Analysis Report in much the same fashion as the Individual
Item Analysis Report—that is, the various data must be analyzed using your knowledge of the pertinent factors as the
primary frame of reference.
­23
HSPT® Student Score Report
HIGH SCHOOL PLACEMENT TEST
Scholastic Testing Service, Inc.
Student Score Report
School: SAMPLE SCHOOL   Grade: 08   Form: S   (00001)   Elem: 175   Test Date: 11/22/08
Report by: Total Grp   Codes - Choices (1st-5th): 34 15   Other Codes:
To the parents or guardian of:
James Aragonman
00001 Park
St.Louis MO 63045

[The top of the report charts James's national percentile rank for each test on a scale of 1 to 99, with shaded regions marking the Low, Below Average, Average, Above Average, and High ranges.]

National Percentiles
Cognitive Skills: VB 73, QT 84, TCS 83
Basic Skills: RD 83, MT 83, LN 92, TBS 88
OP 96, CMP 87

Dear James:
STS' High School Placement Test is a measure of your basic skills and your educational achievement. It was given so that you, your
parents, and your teachers can learn more about your preparation for high school.
WHAT THE TEST MEASURES
Cognitive Skills
Verbal Skills (VB) This test measures how well you perform reasoning tasks involving the use of words. Your ability in this area is
related to your performance in language, reading and various areas within social studies.
Quantitative Skills (QT) This test measures your ability to do reasoning problems involving numbers and quantities. This ability is
related to performance in mathematics, sciences and other areas that deal with numbers and things.
Total Cognitive Skills (TCS) This is a total of the Verbal Skills and Quantitative Skills subtests.
Basic Skills
Reading (RD) This test measures your ability to remember important ideas and significant details, recognize central thought or purpose,
make logical inferences and understand vocabulary in context. Since good reading habits and skills are essential to learning, thinking
and problem solving, this score is usually related to your overall success in school.
Mathematics (MT) This test not only measures your ability to perform arithmetic operations and apply math concepts to solve
problems, but also your knowledge of important concepts and ability to reason. Your score on this test tells you how well you are
prepared for high school mathematics.
Language (LN) This test measures your knowledge of capitalization, punctuation, grammar, spelling, usage and composition.
Total Basic Skills (TBS) This is a total of the Reading, Mathematics and Language subtests.
Optional Test (OP) The option test is a 40-item test in either Science, Mechanical Aptitude, or Religion.
Battery Composite (CMP) This score is a total of the Verbal, Quantitative, Reading, Language and Mathematics
sections of the battery.
WHAT THE SCORE FOR EACH TEST MEANS
The scores reported above are "National Percentile Ranks." They tell what percentage of students had scores below yours in a national
sample. If your Verbal Skills score is 55, for example, this means you did better than 55 percent of the students in the national sample.
A percentile rank of 50 is exactly average.
Cognitive Skills Quotient (CSQ) This score is a measure of a student's learning potential. It is an age-based norm rather than grade-based. The scale has a mean of 100 and an operational range of 55-145. Your CSQ score is 116.
Grade Equivalents (GE) These scores in the basic skills areas compare the student's performance with those of other grades. If one
were to test in January of grade 8, for example, and attain a Reading GE of 10.5, this means the student scored as well on the grade-eight
material as a mid-year grade-ten student would have. Your GE scores are:
Reading GE: 10.2
Mathematics GE: 10.0
Language GE: 10.4
Now is a good time, as you enter high school, to make the most of your special talents and to begin serious planning for your future
education and career.
See explanation on page 25.
­24
Student Score Report
The Student Score Report is a one-page report for an individual student and his/her parents or guardian. The top part
of the report provides a graphic representation of the student's "National Percentile Rank" for each subtest taken,
total basic skills, and the battery composite score.
Under “WHAT THE TEST MEASURES” you will find an explanation of what each subtest measures. The ‘Total
Cognitive Skills,’ ‘Total Basic Skills,’ and ‘Battery Composite’ are also defined.
In the last section labeled “WHAT THE SCORE FOR EACH TEST MEANS” you will find an explanation of the
student's cognitive skills quotient (CSQ) and grade equivalents (GE). A breakdown of the student's grade equivalent scores is also given.
SUGGESTIONS
Scholastic Testing Service, Inc. welcomes any suggestions for improving this testing program. Many times we
find that our best suggestions come from school personnel
who have administered the tests and used them in parent
conferences and student counseling. If you have suggestions, criticisms, or questions, please feel free to send them
to us:
SCHOLASTIC TESTING SERVICE, INC.
480 Meyer Road
Bensenville, Illinois 60106–1617
­25
CAT # HP 120012R1-013112