Guide to the Use and Interpretation of Scores for the EXADEPTM Test

Transcription

Guide to the Use and Interpretation of Scores for the EXADEPTM Test
Guide to the Use and
Interpretation of Scores
for the EXADEP Test
TM
www.ets.org/exadep
The Examen de Admisión a Estudios de PosgradoTM (EXADEPTM) is administered by Educational Testing Service.
Please send all your questions to the ETS office in Puerto Rico.
ETS Puerto Rico Office
Global Division
American International Plaza
250 Muñoz Rivera Ave.
3rd Floor, Suite 315
Hato Rey, PR 00918
Phone:
TDD:
Fax:
E-mail: Web site:
787-753-6363
787-758-4598
787-250-7426
[email protected]
www.ets.org/exadep
8th Edition, 2011.
Copyright © 2011 by Educational Testing Service. All rights reserved. ETS, the ETS logo and LISTENING. LEARNING. LEADING.
are registered trademarks of Educational Testing Service (ETS) in the United States of America and other countries. EXADEP and
EXAMEN DE ADMISIÓN A ESTUDIOS DE POSGRADO are trademarks of ETS.
2
TABLE OF CONTENTS
Page
PREFACE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
DESCRIPTION OF THE TEST . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
EXADEP Measures Specific Reasoning Skills That Are Developed Over a Long Period of Time . . . . 7
GUIDELINES FOR THE USE OF EXADEP SCORES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Policies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Guidelines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Normally Appropriate and Inappropriate Uses of EXADEP Scores . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Appropriate Uses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Inappropriate Uses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
CONSIDERATIONS IN SCORE INTERPRETATION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Repeat Test Takers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Examinees with EXADEP Scores More Than Five Years Old . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Examinees with Disabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
POLICY AND USE OF EXADEP SCORES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Score Reporting Policies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Reporting Revised Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Confidentiality and Authenticity of EXADEP Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
EXADEP Scores and Graduate Admissions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Validity Studies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
SCORE INTERPRETATION AND STATISTICAL INFORMATION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Verbal, Mathematics and Analytical Reasoning, Written Expression,
and English Sections of the Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Standard Error of Measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Reliability Coefficient . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
STATISTICAL TABLES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
How to Compare the Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Mean . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Percentile Ranks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Table 1: Business . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Table 2: Science . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
Table 3: Education . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Table 4: Law . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Table 5: All Other Programs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Table 6: All Candidates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
STEPS THAT ETS TAKES TO ENSURE FAIRNESS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Involving External Faculty Members in the Development of the Test. . . . . . . . . . . . . . . . . . . . . . . . . 23
Differential Item Functioning Analysis. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3
PREFACE
Our mission at Educational Testing Service® (ETS®) is to advance quality and equity in education by providing
fair and valid assessments, research, and related services. The Guide to the Use and Interpretation of Scores
for the EXADEP™ supports this mission by helping to ensure that the test is created and scores are used
under fair and appropriate ETS guidelines. The purpose of this guide is to help institutions or university
members responsible for graduate admissions and fellowship awards to use EXADEP scores as an additional
aid in their decision making. Understanding the contents of this guide will help prevent misuse and
misinterpretation of the test scores and potential unfairness to applicants.
An absolute commitment to excellence, integrity, and fairness is at the core of everything we do. Our
products and services are meant to help educators, educational institutions, businesses, and governments
further education for all people worldwide. Additional copies or information about the test or services may
be obtained from the ETS Puerto Rico Office by e-mail or visiting our Web site at: www.ets.org/exadep.
4
INTRODUCTION
EXADEP scores can be used by admissions or fellowship panels to supplement undergraduate records
and other qualifications for graduate study. The scores provide common measures for comparing the
qualifications of applicants and aid in the evaluation of grades and recommendations. Any accredited
graduate or professional school or any department or division within a school may require or recommend
that its applicants take the test. An institution not accredited by a recognized agency can become a score
user if approved by the EXADEP Program.
The weight to be given to EXADEP scores can generally be established by relating what the tests measure
to the orientation, curriculum, and aims of a department. Specifically, the content validity of the tests for a
graduate department should be determined by reviewing each test carefully and then making subjective
decisions as to the weight, if any, the scores on EXADEP tests should receive in relation to other admission
factors.
Before taking the EXADEP, students should have completed at least two years of undergraduate study in an
institution where the language of instruction is Spanish. Students should indicate on their registration form
the institutions to which they want their test scores sent.
The institutions or the faculties within an institution that require or recommend that the candidates take the
EXADEP should announce it in their publications and on their Web site, and they should notify the candidates.
It is also important to let the candidates know the deadlines for receiving scores. For institutions that are
considering using the EXADEP test or for those planning to require EXADEP as a requisite of admission,
additional information and convenient forms can be found on the EXADEP Web site (www.ets.org/exadep).
5
DESCRIPTION OF THE TEST
The purpose of the EXADEP test is to measure an examinee’s quantitative, analytical reasoning and verbal
abilities in Spanish and in English as a foreign language in order to contribute to the prediction of the
examinee’s performance in graduate or professional schools and for fellowship selection.
The test comprises five sections. The first four are in Spanish and the fifth is in English. All questions in the
test are multiple-choice questions.
Section 1: Verbal Aptitude (90 minutes)
Part A: Antonyms
Analogies
Sentence Completion
Part B: Reading Comprehension
Section 2: Mathematics (40 minutes)
Arithmetic
Algebra
Geometry
Section 3: Analytical Reasoning (40 minutes)
Section 4: Written Expression (30 minutes)
Part A: Language Usage
Part B: Sentence Correction
Section 5: English (45 minutes)
Part A: Sentence Completion
Antonyms
Part B: Reading Comprehension
The total testing time is four hours and five minutes. There is a ten-minute break between Sections 2 and 3.
• The Verbal section measures the ability to analyze and evaluate written material and synthesize
information obtained from it, to analyze relationships among component parts of sentences, and to
recognize relationships between words and concepts. In each test edition, the passages are a balance
among three different subject-matter areas: humanities, social sciences, and natural sciences.
• The Mathematics section measures basic mathematical skills and understanding of elementary
mathematical concepts, as well as the ability to reason quantitatively and to solve problems in a
quantitative setting. There is a balance of questions requiring arithmetic, algebra, geometry, and data
analysis.
• The Analytical Reasoning section measures the ability to think analytically. It tests the ability to draw
inferences and think deductively based on a partially defined scenario and a set of conditions that
must hold in any fully developed version of the scenario. It does not test knowledge of any particular
subject nor does it require training in formal logic.
• The Written Expression section measures the ability to recognize the use of language essential to a
finished piece of writing that would be considered acceptable by most educated readers and writers
of Spanish.
• The English section measures certain elements in the spectrum of abilities required to reason
effectively in a verbal medium.
6
EXADEP Measures Specific Reasoning Skills That Are Developed Over a Long Period of Time.
EXADEP is designed to measure verbal, quantitative, and analytical reasoning abilities considered important
for successful performance in graduate school. The test is designed to be independent of particular courses
of study. Instead, it represents knowledge, skills, and abilities that are developed across many courses and
that reflect exposure to demanding courses of study over a long period of time. Research on other ETS tests
used for similar purposes has shown that short-term study will not alter scores greatly for most people.
It is important that test takers be thoroughly familiar with the test—its contents and procedures—before the
actual testing day in order to avoid receiving lower scores than they might otherwise obtain.
The long-term reasoning component measured by EXADEP makes it a test of developed skills. Both words
are significant here. The skills are reasoning skills of the type most important to success in graduate study.
The developed nature of these skills comes from a lifetime as a reader, as one who thinks through
quantitative problems, and as one who is accustomed to communicating and critiquing arguments in writing.
Unlike high school or undergraduate grade reports, EXADEP scores reflect performance on tasks that are
common to ALL applicants. The complementarity between EXADEP scores and other elements of the
application means that EXADEP scores should always be used in conjunction with those other elements.
Standardized test scores provide information that is highly comparable across examinees and that references
a well-documented set of knowledge and skills. Such scores do not, however, provide information about
other important characteristics, such as motivation and persistence. Furthermore, they do not typically
provide a good indicator for understanding such complexities as changes in level of attainment over time or
obstacles overcome to reach a particular level of attainment. Grades and letters of recommendation, in
contrast, are less comparable across examinees because they are based on different requirements for
different individuals. Grades represent different combinations of courses and are based on grading standards
unique to each institution, department, and professor. Letters of recommendation may provide valuable
information about an individual, but typically offer little systematic basis for comparing one applicant to
another. Despite these weaknesses, grades and letters of recommendation probably provide better
information than standardized tests about such traits as persistence, motivation, and ability to overcome
obstacles, all of which may be of considerable importance to success in graduate study. This is why it is
important to combine grades, letters of recommendation, and test scores to develop a holistic understanding
of an applicant’s abilities and of achievement in relation to abilities.
The complementarity of test scores and grades is made clear by the results of validity studies conducted by
the EXADEP Program in collaboration with various universities. These studies show that the combination of
EXADEP scores and undergraduate GPA predicts first-year grades more effectively than any single piece of
information. These various sources of information represent a system of checks and balances in decision
making. Fairness is enhanced by using multiple measures because systematic over- or under-prediction will
be decreased as additional measures are used. For example, students coming from schools or fields that
employ strict grading practices will be less disadvantaged when tests are used along with grades. Conversely,
test scores may represent peak performance capacity, while grades and letters of recommendation can offer
testimony to an individual’s ability to sustain performance over longer periods of time.
7
GUIDELINES FOR THE USE OF EXADEP SCORES
Introduction
These guidelines have been adopted by the EXADEP Program to provide information about the appropriate
use of the EXADEP scores for those who use the scores in graduate admissions and fellowship selection
processes and for other approved purposes. They are also intended to protect students from unfair decisions
that may result from inappropriate uses of scores. Adherence to the guidelines is important.
EXADEP is designed to assess academic knowledge and skills relevant to graduate study. As measures with
known statistical properties and high-quality technical characteristics, the scores from these tests, when
used properly, can improve graduate admissions and fellowship selection processes. Any EXADEP test,
however, has two primary limitations: (1) it does not and cannot measure all the qualities that are important
in predicting success in graduate study or in confirming undergraduate achievement, and (2) it is an inexact
measure—that is, only score differences that exceed the standard error of measurement of a given score
can serve as reliable indications of real differences in academic knowledge and developed abilities. Although
limitations and cautions apply to all admissions measures, the EXADEP Program is obligated to inform users
of the appropriate uses of EXADEP scores and to identify and try to rectify instances of misuse. To this end,
the following policies and guidelines are available to all EXADEP test takers, institutions, and organizations
that receive EXADEP scores.
Policies
In recognition of its obligation to ensure the appropriate use of EXADEP scores, the EXADEP Program
has developed policies designed to make score reports available only to approved recipients, to encourage
these score users to become knowledgeable about the validity of the tests, to protect the confidentiality of
examinees’ scores, and to follow up on cases of possible misuse of scores. The policies are discussed below.
Score Recipients
Accredited undergraduate and graduate institutions and non-degree-granting organizations that award
graduate fellowships are eligible for consideration as score recipients. Institutions and organizations that do
not meet either one of these requirements are, in general, not eligible to be score recipients. The EXADEP
Program retains the right to make exceptions to this policy in special circumstances.
Validity
The general appropriateness of using EXADEP scores for graduate admissions, fellowship selection, and
other approved purposes has been established by research studies carried out by ETS. EXADEP scores
may be appropriate for some other purposes, but it is important for the user to validate their use for those
purposes. To assist institutions in evaluating proposed uses, these guidelines include information about
appropriate and inappropriate uses.
Confidentiality
EXADEP scores, whether for an individual or aggregated for an institution, are confidential and can be
released only by authorization of the individual or institution or by compulsion of legal process.
Use of Scores in Aggregated Form
Information based on EXADEP scores may be useful to prospective students, but use of a precise mean
or median should be avoided. Graduate programs and institutions are urged to report EXADEP scores in
ranges such as the highest and lowest scores of the middle 50 percent of the admitted students. Presenting
information by score ranges emphasizes the diversity of individual scores for any one graduate program or
8
institution and also makes clear the overlap of scores among graduate programs and institutions. Use of
EXADEP scores in aggregated form as a measure for ranking or rating graduate programs, institutions,
university systems, or states is strongly discouraged except when the scores are used as one indicator
among several appropriate indicators of educational quality.
Encouragement of appropriate use and investigation of inappropriate use
All users of EXADEP scores have an obligation to use the scores in accordance with published EXADEP
Program policies and guidelines. Institutions have a responsibility to ensure that all individuals using
EXADEP scores are aware of the EXADEP Program score-use policies and guidelines and to monitor the use
of the scores, correcting instances of inappropriate use when they are identified. The EXADEP Program staff
is available to assist institutions in resolving issues of inappropriate score use.
Guidelines
Use Multiple Criteria
Regardless of the decision to be made, multiple sources of information should be used to ensure fairness
and balance the limitations of any single measure of knowledge, skills, or abilities. These sources may
include undergraduate grade point average, letters of recommendation, personal statement, samples of
academic work, and professional experience related to proposed graduate study. EXADEP scores should not
be used exclusively. The use of multiple criteria is particularly important when using EXADEP scores to
assess the abilities of educationally disadvantaged students, as well as those who are returning to school
after an extended absence. Score users are urged to become familiar with factors affecting score
interpretation for these groups, as discussed in this publication.
Accept Only Official EXADEP Score Reports
The only official reports of EXADEP scores are those issued by ETS and sent directly to approved institutions
and organizations designated by the test takers. Scores obtained from other sources should not be accepted.
If there is a question about the authenticity of a score report, it should be referred to ETS. ETS will verify the
accuracy of the scores and whether an official report was issued.
Conduct Validity Studies
Departments using EXADEP scores for graduate admissions, fellowship awards, and other approved
purposes are encouraged to collect validity information by conducting their own studies. The EXADEP
Program staff will provide advice on the design of appropriate validation studies without charge.
Maintain Confidentiality of EXADEP Scores
All individuals who have access to EXADEP scores should be aware of the confidential nature of the
scores and agree to maintain their confidentiality. Institutional policies should be developed to ensure that
confidentiality is maintained. For example, EXADEP scores should not be placed on documents sent outside
the institution.
Consider Verbal Aptitude, Quantitative and Analytical Reasoning, Written Expression, and English
Subscores
Since the level of skills required for success in graduate school varies by field or department, the subscores
provided for each section of the test should be taken into consideration.
Avoid Decisions Based on Small Score Differences
Small differences in EXADEP scores (as defined by the standard error of measurement) should not be used
to make distinctions among examinees. Standard errors of measurement (SEMs) vary by test and are
available in this publication.
9
Do Not Compare Scores from Different Tests
EXADEP scores are not directly comparable with scores on other graduate or undergraduate admission tests.
Percentile ranks should be compared only if they are based on the same reference population.
Recognize Limitations of Scores Earned on Tests Taken Under Special Conditions
EXADEP tests are offered with special arrangements and test materials to accommodate the needs of
students with visual, physical, hearing, and learning disabilities. Depending upon the nature and extent of the
disability, the scores may not accurately reflect a student’s educational achievement. For some students, the
nature of their disabilities may make it advisable to waive EXADEP score requirements.
Normally Appropriate and Inappropriate Uses of EXADEP Scores
The suitability of EXADEP for a particular use should be explicitly examined before using test scores for that
purpose. The following lists of appropriate and inappropriate uses of EXADEP scores are based on the
policies and guidelines previously outlined. The lists are meant to be illustrative, not exhaustive, in nature.
There may be other appropriate uses of EXADEP scores, particularly for non-admissions purposes, but any
use other than those listed below must be discussed in advance with EXADEP Program staff to determine
their appropriateness. If a use other than those appropriate uses listed below is contemplated, it will be
important for the user to validate the use of scores for that purpose. The EXADEP Program staff will provide
advice on the design of such validity studies without charge.
Appropriate Uses
Provided all applicable guidelines are adhered to, EXADEP scores are suitable for the following uses:
• selection of applicants for admission to graduate school
• selection of graduate fellowship applicants for awards
• selection of graduate teaching or research assistants
• guidance and counseling for graduate study
Inappropriate Uses
The EXADEP scores are not suitable for the following uses:
• Requirement of a minimum score on the EXADEP for conferral of a degree, credit-by-examination,
advancement to candidacy, or any noneducational purpose
• Requirement of scores on the EXADEP for employment decisions, including hiring, salary, promotion,
tenure, or retention (except for the awarding of assistantships to graduate students)
Comments, complaints, inquiries, and suggestions about the use of EXADEP scores are welcome. To contact
the EXADEP Program office, see the inside front cover.
10
CONSIDERATIONS IN SCORE INTERPRETATION
EXADEP scores should be used to supplement the information provided in a person’s application, such as
undergraduate record and letters of recommendation. Officials responsible for admission at each institution
must determine the significance of EXADEP scores for each applicant. Particular attention must be paid to
the use of EXADEP scores for individuals described below. The experience of institutions or departments
should continue to be the best guide to interpretation of EXADEP scores in these instances.
Repeat Test Takers
Individuals are permitted to take EXADEP tests more than once. Score recipients are cautioned not to view an
increase in scores necessarily as a reflection of academic gain, especially over a short time period. There are
several ways in which graduate departments can judge multiple scores for an individual (e.g., use average
of all scores, use most recent score, use highest score). Using the mean score may be the best technique
because it is the most objective. Whatever approach is adopted, it should be used consistently with all
applicants.
Examinees with EXADEP Scores More Than Five Years Old
The EXADEP Program established a policy of retaining and reporting EXADEP scores earned during the fiveyear period after the test date. Scores more than five years old are not reported. Candidates who took the test
more than five years ago must take it again if they want scores to be sent to institutions.
When institutions decide to consider older scores, they should be aware that the applicant’s competence may
have changed in the intervening time. The problem lies in determining how much the applicant’s competence
has changed in either direction in the intervening years and how the change affects present potential for
graduate work. Institutions may prefer to request that the applicant retake the test.
Examinees with Disabilities
ETS makes special arrangements for individuals who have recently documented visual, physical, hearing,
or learning disabilities and are unable to take the tests under standard conditions. The tests are administered
in a nonstandard manner chosen to minimize any adverse effect of the examinee’s disability on test
performance and to help ensure that, insofar as possible, the final scores should closely represent the
examinee’s educational achievement. Depending on the nature and extent of the disability, an examinee’s
scores may not fully reflect his or her educational achievement and, because there are so few persons with
disabilities taking EXADEP tests and their circumstances vary so widely, it has not been possible to provide
special interpretive data for these examinees. Therefore, graduate schools should seriously consider waiving
EXADEP requirements for applicants with certain disabilities.
11
POLICY AND USE OF EXADEP SCORES
Score Reporting Policies
EXADEP score reporting policies have been adopted by the EXADEP Program to encourage the appropriate
use of EXADEP scores and to protect the right of individuals to control the distribution of their own score
reports. Current EXADEP Program policy states that scores are reportable for five years. Score reports for
the test are released approximately three to four weeks after the test date to the examinees and to accredited
institutions of higher education granting the baccalaureate or higher and approved graduate fellowshipgranting sponsors designated by the examinees.
Absentees are reported to institutions. Their names will appear on the institution roster with no scores.
Percentile ranks shown on the tables included in this guide and on the score reports are based on the
performance of the current reference group. The percentile rank for any score may vary over the years
depending on the scores of the group with which the score is compared. Thus, when two or more applicants
are being compared, the comparison should be made on the basis of their respective scores; if percentile
ranks are considered, they should all be based on the percentile rank tables in the most recent edition of this
guide.
Reporting Revised Scores
ETS routinely follows extensive review and quality control procedures to detect and avoid flawed questions
and consequent errors in scoring. Nonetheless, if an error is discovered after scores have been reported,
the specific circumstances are reviewed carefully, and a decision is made about how best to take corrective
action that is fairest to all concerned. Revised scores are sent to the affected students, who may request
that ETS send the revised scores to any graduate schools or fellowship sponsors still considering their
applications.
Confidentiality and Authenticity of EXADEP Scores
EXADEP scores are confidential and must not be released by an institutional recipient without the explicit
permission of the examinee. EXADEP scores are not to be included in academic transcripts. Dissemination
of score records should be kept to a minimum, and all staff who have access to them should be explicitly
advised of their confidential nature. To ensure the authenticity of scores, the EXADEP Program urges that
institutions accept only official reports of EXADEP scores received directly from ETS.
The EXADEP Program recognizes the privacy rights of both institutions and individuals with regard to
information supplied by and about them. ETS therefore safeguards from unauthorized disclosure all
information stored in its data or research files. Information about an institution (identified by name) will be
released only in a manner consistent with a prior EXADEP agreement or with the consent of the institution.
EXADEP Scores and Graduate Admissions
Many factors play a role in an applicant’s admissibility and expectation of success as a graduate student.
EXADEP scores are only one element in this total picture and should be considered along with other data.
The EXADEP Program believes that EXADEP scores should never be the sole basis for an admissions
decision and that it is inadvisable to reject an applicant solely on the basis of EXADEP scores. A cutoff
score should not be used without consideration of other admission factors.
12
EXADEP scores permit comparison of one applicant to a graduate school with other applicants for the same
program at that institution as well as with everyone else who took the test. Subscores provide further
information for consideration. These subscores, which reflect a student’s general strengths and weaknesses
in the major areas on which the total score is based, aid in the interpretation of the total score. Often the
subscores can suggest areas in which the student may require extra work. A low subscore, however, may
be the result of lack of exposure to a particular field. As a result, subscores should always be reviewed in
relation to the applicant’s undergraduate history.
For admissions decisions, the weight to be given to EXADEP scores can generally be established by relating
what the tests measure to the orientation, curriculum, and aims of the department. Specifically, the content
validity of the tests for a graduate department should be determined by reviewing the test carefully and then
making subjective decisions as to the weight, if any, the scores on EXADEP should receive in relation to other
admission factors.
Validity Studies
One way to determine the weight to give to test scores is to conduct validity studies. Validity is an ongoing
process of assembling knowledge supporting interpretations that are made using test scores. A primary way
of determining the validity of a test is to examine the correlation between test scores (and perhaps other
predictors, such as undergraduate grade point average) and one or more criteria of success in graduate
study. It should be noted, however, that where there are small numbers of students, major problems can
occur in attempting to carry out adequate validity studies.
In addition to correlation studies, institutions might consider other approaches. One would be to prepare
a table of EXADEP scores for those students who do poorly and/or drop out of graduate school. Another
approach would be to independently assess the general skills or abilities needed for success in the particular
graduate school and compare them to those assessed by the EXADEP.
The EXADEP Program strongly recommends that institutions using EXADEP scores conduct such validity
studies, and it will assist institutions that wish to do so. Institutions interested in such assistance should
contact the ETS office in Puerto Rico.
13
SCORE INTERPRETATION AND STATISTICAL INFORMATION
Verbal, Mathematics and Analytical Reasoning, Written Expression, and English Sections of the Test
The range of EXADEP scores for each section is from 20 to 80, in one-point increments. If no answers are
given for a section, a score of 20 is reported for that section. The total scale score ranges from 200 to 800,
and it is the sum of the results of the different sections multiplied by their individual weights. The weight of
each section is proportional to the number of questions in the section. The distribution of weights is given in
the following table:
Section
1 Verbal Aptitude
2 Mathematics and
3 Analytical Reasoning
4 Written Expression
5 English
TOTAL
Section
Score
A
Weight
3.5
B
2.5
C
D
2.0
2.0
3.5A + 2.5B + 2.0C + 2.0D
Note: One score is reported for sections 2 and 3 combined.
There may be candidates who do not answer all questions in a given section of the test. Because the number
of answers is incorporated into the calculation of the scores, it is important that test takers answer every
question.
The following table indicates the mean, standard deviation, SEMs, and reliability coefficient based on the
results of all the candidates who took the test between February 2008 and December 2010.
EXADEP SUMMARY TABLE
(Based on the scale scores of 24,969 candidates who took the test between
February 2008 and December 2010)
Mean
Standard
Deviation
SEM
Reliability
Coefficient
45.1
11
4
0.868
49.5
10
4
0.832
4 Writing Expression
49.2
12
5
0.798
5 English
47.5
12
5
0.836
475.7
91
22
0.938
Section
1 Verbal Aptitude
2 Mathematics and
3 Analytical Reasoning
TOTAL
SEM = Standard Error of Measurement
14
The EXADEP Summary Table (on page 14) provides data on the standard error of measurement and
reliability.
Standard Error of Measurement
As with all educational measurements, the scores obtained by an individual could easily vary from one
administration to another, although there may have not been any changes in the candidate’s true skills. The
individual score is considered an estimate of the knowledge or skills of the person in the area examined.
If a person could take different editions of the test without changes in his or her knowledge and skills, the
average of all the scores obtained would be a precise measurement of the knowledge and the skills of the
person in the area examined. This hypothetical average is known as the “true score”. The difference between
the “true score” and the score obtained at a test administration is called “error of measurement.” The
term tries to describe the imprecise measurement of a unique score obtained by the candidate as the
measurement of knowledge or skills of the person. It does not mean that an error was committed when
developing or evaluating the test.
It is statistically possible to estimate the average of the error of measurement for a large group of candidates.
The statistic outcome is expressed in a score that is known as “standard error of measurement” (SEM). If the
“true scores” for a group of candidates are distributed according to a normal distribution, the probability is
that nearly 68 percent of the candidates will obtain scores within a standard error of measurement above or
below their true scores.
The standard error of measurement for the total score of the EXADEP, expressed in scale scores, is 22.
Therefore, for the 68 percent of the EXADEP candidates, it can be assumed that their true scores would fall
within a range of 22 points above and 22 points below the obtained scores.
The importance of the standard error of measurement is in understanding that, when comparing the
performance in the examination of two candidates, the variations in the scores perhaps do not represent a
real difference as far as the skills of the respective candidates. The value of the standard error of
measurement must be taken into consideration when interpreting the individual results.
Reliability coefficient
The reliability of a test indicates the degree to which individual candidates scores would change the relative
position of the scores if the examination had been administered under conditions to a certain extent different
(for example, if the candidates were tested by means of different test edition). The reliability is represented by
means of a statistical coefficient that is affected by measurement errors. Generally, while smaller the
measurement errors in a test, the greater the reliability. The reliability coefficient ranges from 0 to 1. The
reliability index of 1 indicates a perfectly reliable test (that is, that there are no measurement errors in the
test). The 0 indicates that the test obtains completely inconsistent results.
15
STATISTICAL TABLES
To aid in the interpretation of scale scores, the EXADEP Program describes scores in terms of their standing
in appropriate reference groups. Tables 1–6 provide percentile ranks (i.e., the percentage of candidates
in a group who obtained scores lower than each possible scale score) for the test. All candidates who took
the test between February 2008 and December 2010 were divided into five reference groups according to the
graduate program to which they requested the scores to be sent. Tables 1–5 show percentile ranks according
to those five groups. Table 6 shows percentile ranks for all candidates. The tables also indicate the means
and standard deviations for each group.
•
•
•
•
•
•
Table 1: Business
Table 2: Science
Table 3: Education
Table 4: Law
Table 5: All Other Programs*
Table 6: All Candidates
How to Compare the Scores
Mean
A way to interpret the scores is to compare the results of a candidate with the mean of scores obtained by
all candidates that took the test. It can be done for an EXADEP candidate by comparing the results of the
candidate with the mean of results in the Table of Summary of the EXADEP that appears on page 14.
However, it is even better to compare the scores of the candidate with the reference group in the tables
on pages 17–22.
Percentile Ranks
Another method to interpreting the EXADEP scores is to compare the score of one candidate with the score
of another candidate, as it indicates in percentile rank tables, Table 1 – Table 6.
The candidate’s score can be compared in relation to other candidates by means of the use of the tables. For
example, in Table 1, if the scaled score of a candidate is 56 in Section 1 (Verbal Aptitude), by using the Table,
then 86 percent of the other candidates obtained lower scores. Also, if a candidate obtained a scaled score
of 60 in Sections 2 and 3 (Mathematical and Analytical Reasoning), according to the table 83 percent
of the candidates of the group obtained a lower score. If a candidate obtained a scaled score of 68 in
Section 4 (Written Expression), then according to the table, 97 percent of the candidates obtained a lower
score. If a candidate obtained a scaled score of 61 on Section 5 (English), then the table shows that between
86 and 89 percent of the candidates obtained a lower score.
Also note that not all the total scores appear in the tables, but as multiples of 10. For example, in Table 1, if
the total scaled score of a candidate is 563, then (according to the table, 560 is the closest number) we can
assume that 84 percent of the candidates obtained a lower score.
*The fifth reference group includes all graduate programs that, from a statistical point of view, did not have enough candidates.
16
Table 1
PERCENTILE RANKS FOR BUSINESS PROGRAM CANDIDATES
Percentage of candidates who obtained scale scores lower than each possible scale score
ENGLISH
TOTAL
SCALE SCORE
PERCENTILE RANK
OF THE TOTAL
SCALE SCORE
99
98
98
97
97
95
94
93
91
89
86
83
80
76
73
68
62
57
48
40
31
23
14
8
5
2
1
1
1
1
1
800
780
760
740
720
700
680
660
640
620
600
580
560
540
520
500
480
460
440
420
400
380
360
340
320
300
280
260
240
220
200
99
99
99
99
98
97
96
94
91
88
84
79
73
65
57
49
41
32
24
17
11
7
4
2
1
1
1
4,316
4,316
4,316
4,316
n
4,316
Mean
43.7
50
47.9
46.8
Mean
467.6
Standard
Deviation
11
10
12
11
Standard
Deviation
91
80
78
76
74
72
70
68
66
64
62
60
58
56
54
52
50
48
46
44
42
40
38
36
34
32
30
28
26
24
22
20
99
99
99
99
99
99
98
96
95
92
89
86
82
77
71
66
57
51
45
38
31
25
20
15
12
8
6
5
3
1
n
MATHEMATICS/
ANALYTICAL
REASONING
99
99
99
99
99
98
97
95
92
88
84
79
73
66
61
54
49
42
36
30
26
21
17
14
11
8
6
4
3
2
1
VERBAL APTITUDE
99
99
99
98
97
95
94
92
89
86
83
78
74
68
62
54
47
40
32
23
17
12
7
4
2
1
1
1
1
SECTION
SCALE SCORE
WRITING
EXPRESSION
(Based on the results of 24,969 candidates who took the test between February 2008 and December 2010)
17
Table 2
PERCENTILE RANKS FOR SCIENCE PROGRAM CANDIDATES
Percentage of candidates who obtained scale scores lower than each possible scale score
TOTAL
SCALE SCORE
PERCENTILE RANK
OF THE TOTAL
SCALE SCORE
99
99
99
99
98
97
95
91
87
82
76
71
65
58
52
47
41
35
29
24
19
15
12
9
7
5
4
2
2
1
99
99
98
97
96
96
95
93
91
90
88
85
82
78
74
69
63
57
48
41
32
24
16
9
5
2
1
1
1
1
1
800
780
760
740
720
700
680
660
640
620
600
580
560
540
520
500
480
460
440
420
400
380
360
340
320
300
280
260
240
220
200
99
99
99
99
99
98
97
95
92
88
84
79
72
65
56
48
39
31
24
17
12
7
4
2
1
1
1
n
5,981
5,981
5,981
5,981
n
5,981
Mean
44.7
48.5
48.5
46.4
Mean
468.2
Standard
Deviation
11
10
12
11
Standard
Deviation
90
WRITING
EXPRESSION
99
99
99
99
98
98
97
95
93
90
87
83
78
72
67
60
52
44
36
27
21
14
8
4
2
1
1
1
1
1
1
MATHEMATICS/
ANALYTICAL
REASONING
99
99
99
99
99
99
97
96
93
90
87
83
79
74
68
62
54
47
41
35
28
22
18
14
11
7
5
3
2
1
VERBAL APTITUDE
80
78
76
74
72
70
68
66
64
62
60
58
56
54
52
50
48
46
44
42
40
38
36
34
32
30
28
26
24
22
20
SECTION
SCALE SCORE
ENGLISH
(Based on the results of 24,969 candidates who took the test between February 2008 and December 2010)
18
Table 3
PERCENTILE RANKS FOR EDUCATION PROGRAM CANDIDATES
Percentage of candidates who obtained scale scores lower than each possible scale score
PERCENTILE RANK
OF THE TOTAL
SCALE SCORE
TOTAL SCALE
SCORE
99
99
99
99
97
95
93
90
87
82
78
72
66
61
54
47
41
35
29
24
19
15
12
9
7
5
3
2
1
99
99
99
98
98
97
97
96
95
94
92
90
88
85
83
79
74
69
61
53
44
34
24
15
9
4
2
1
1
1
1
800
780
760
740
720
700
680
660
640
620
600
580
560
540
520
500
480
460
440
420
400
380
360
340
320
300
280
260
240
220
200
3,482
3,482
3,482
n
3,482
42.3
45.2
46.5
43.4
Mean
441.6
11
9
12
10
Standard
Deviation
86
80
78
76
74
72
70
68
66
64
62
60
58
56
54
52
50
48
46
44
42
40
38
36
34
32
30
28
26
24
22
20
99
99
99
99
99
98
97
95
93
90
87
84
79
74
69
63
57
50
44
37
31
25
20
15
11
8
5
3
1
n
3,482
Mean
Standard
Deviation
99
99
99
99
99
98
98
97
96
95
93
91
88
84
80
75
68
60
50
39
31
22
13
7
3
2
1
1
1
WRITING
EXPRESSION
ENGLISH
MATHEMATICS/
ANALYTICAL
REASONING
VERBAL APTITUDE
SECTION SCALE
SCORE
(Based on the results of 24,969 candidates who took the test between February 2008 and December 2010)
19
99
99
99
99
99
98
97
95
93
90
87
82
76
69
61
53
43
34
25
18
11
6
3
1
1
1
Table 4
PERCENTILE RANKS FOR LAW PROGRAM CANDIDATES
Percentage of candidates who obtained scale scores lower than each possible scale score
ENGLISH
TOTAL SCALE
SCORE
PERCENTILE RANK
OF THE TOTAL
SCALE SCORE
WRITING
EXPRESSION
MATHEMATICS/
ANALYTICAL
REASONING
VERBAL APTITUDE
SECTION SCALE
SCORE
(Based on the results of 24,969 candidates who took the test between February 2008 and December 2010)
99
99
98
97
96
94
92
88
85
80
75
69
63
56
49
41
33
26
19
13
9
6
3
1
1
1
1
1
1
99
99
99
99
99
97
95
92
88
82
76
68
61
53
45
38
32
26
21
17
13
10
8
6
4
3
3
2
1
1
1
99
96
95
94
92
91
88
86
83
81
77
73
69
64
59
53
46
40
32
26
18
13
8
4
2
1
1
1
1
1
1
800
780
760
740
720
700
680
660
640
620
600
580
560
540
520
500
480
460
440
420
400
380
360
340
320
300
280
260
240
220
200
99
99
99
99
99
99
97
96
93
90
86
80
74
65
56
46
36
28
20
14
9
6
4
2
1
1
1
1
7,844
7,844
7,844
7,844
n
7,844
Mean
48.2
53.2
52.2
51.2
Mean
509
Standard
Deviation
10
10
11
12
Standard
Deviation
84
80
78
76
74
72
70
68
66
64
62
60
58
56
54
52
50
48
46
44
42
40
38
36
34
32
30
28
26
24
22
20
99
99
99
99
99
98
97
95
92
88
83
78
72
65
56
48
40
33
26
20
15
11
8
6
4
3
2
1
1
1
n
20
Table 5
PERCENTILE RANKS FOR ALL OTHER PROGRAMS
Percentage of candidates who obtained scale scores lower than each possible scale score
WRITING
EXPRESSION
ENGLISH
TOTAL SCALE
SCORE
PERCENTILE RANK
OF THE TOTAL
SCALE SCORE
MATHEMATICS/
ANALYTICAL
REASONING
VERBAL APTITUDE
SECTION SCALE
SCORE
(Based on the results of 24,969 candidates who took the test between February 2008 and December 2010)
80
78
76
74
72
70
68
66
64
62
60
58
56
54
52
50
48
46
44
42
40
38
36
34
32
30
28
26
24
22
20
99
99
99
99
99
98
98
96
94
91
88
84
80
75
70
65
58
52
46
40
33
26
21
17
13
9
6
4
3
1
99
99
99
99
99
98
97
96
95
93
90
87
84
79
74
67
60
52
43
33
26
18
11
6
3
1
1
1
1
1
1
99
99
99
99
99
98
96
94
91
87
83
78
73
66
60
55
49
44
38
32
27
22
18
14
12
10
7
5
4
3
1
99
98
98
97
96
95
94
92
91
89
87
85
82
79
75
71
64
59
52
44
36
27
18
11
6
3
1
1
1
1
1
800
780
760
740
720
700
680
660
640
620
600
580
560
540
520
500
480
460
440
420
400
380
360
340
320
300
280
260
240
220
200
n
3,346
3,346
3,346
3,346
n
3,346
Mean
43.6
46.7
47.7
46
Mean
457.4
Standard
Deviation
11
10
12
12
Standard
Deviation
92
21
99
99
99
99
99
99
98
97
95
93
90
86
81
75
69
61
53
45
37
28
21
15
10
6
3
1
1
1
Table 6
PERCENTILE RANKS FOR ALL CANDIDATES
Percentage of candidates who obtained scale scores lower than each possible scale score
WRITING
EXPRESSION
ENGLISH
TOTAL SCALE
SCORE
PERCENTILE RANK
OF THE TOTAL
SCALE SCORE
MATHEMATICS/
ANALYTICAL
REASONING
VERBAL APTITUDE
SECTION SCALE
SCORE
(Based on the results of 24,969 candidates who took the test between February 2008 and December 2010)
80
78
76
74
72
70
68
66
64
62
60
58
56
54
52
50
48
46
44
42
40
38
36
34
32
30
28
26
24
22
20
99
99
99
99
99
98
97
96
94
90
87
83
78
73
66
60
52
45
39
33
26
21
17
13
10
7
5
3
2
1
99
99
99
98
97
96
95
93
90
87
84
79
75
69
63
56
49
41
33
24
19
13
7
4
2
1
1
1
1
1
1
99
99
99
99
99
98
96
94
91
86
81
75
70
62
56
50
44
38
32
27
22
17
14
11
9
7
5
3
2
2
1
99
98
97
96
95
94
93
91
89
87
85
81
78
74
70
66
59
54
45
38
30
22
15
9
5
2
1
1
1
1
1
800
780
760
740
720
700
680
660
640
620
600
580
560
540
520
500
480
460
440
420
400
380
360
340
320
300
280
260
240
220
200
99
99
99
99
99
99
98
97
96
93
90
87
82
76
69
61
53
44
36
28
21
15
10
6
4
2
1
1
1
n
24,969
24,969
24,969
24,969
n
24,969
Mean
45.1
49.5
49.2
47.5
Mean
475.7
Standard
Deviation
11
10
12
12
Standard
Deviation
91
22
STEPS THAT ETS TAKES TO ENSURE FAIRNESS
ETS has designed several procedures intended to build fairness into its tests: involving external faculty
members in the design and oversight of the tests, the fairness review process, and the differential item
functioning (DIF) analysis. The purpose of involving faculty members in the design and oversight of the tests
is to make sure that the perspectives of a diverse group of people are considered in planning and ongoing
operational activities. The purpose of the fairness review process is to ensure that tests reflect the
multicultural nature of society and to screen out any material that might be offensive or less accessible to
certain groups of test takers, such as those based on age, disability, ethnic group, race, or gender. The
purpose of the DIF analysis is to identify any test questions on which members of a particular group of test
takers perform differently than would be expected on the basis of their overall ability in the areas covered by
the test.
Involving External Faculty Members in the Development of the Test
The EXADEP Program involves undergraduate and graduate faculty members in the design and oversight
of the test. The EXADEP Advisory Committee is made up of men and women from different academic
disciplines and different Spanish-speaking countries representing a variety of institutions. Members are
drawn from different ethnic groups. Drawing on a diverse group of educators who are not ETS employees
is one way ETS seeks to ensure the fairness of the test. Every question in an ETS test (and all materials
published by ETS) must pass a fairness review. This review is based on a set of written guidelines; each
review is conducted by an ETS staff member specifically trained in the application of these guidelines. Any
test question that does not pass the fairness review must be revised to comply with the guidelines or be
removed from the test. The fairness review does not guarantee that women, minority group members,
or individuals with disabilities will perform well on the test, but it does guard against the possibility of
distraction caused by language or content that might be found offensive or inaccessible.
Differential Item Functioning Analysis
Differential item functioning occurs when people from different groups and of approximately equal
knowledge and skill perform in substantially different ways on a particular test question. Differential item
functioning analysis is a statistical technique used as part of the testing process that is designed to identify
test questions that are more difficult for members of one group than for members of some other group,
controlling for overall ability. It is important to realize that DIF is not synonymous with bias. DIF may occur
if a perfectly fair question happens to be measuring a skill that is not well represented in the test as a whole.
Each DIF analysis involves a set of comparisons between a group of examinees that is the focus of the study
(focal group) and the group with which it is compared (reference group). For example, if the focal group is
women, the reference group is men. The DIF analysis is based on a comparison between groups of test
takers of the same overall ability as determined by their performance on the test as a whole. A DIF statistic
is computed for each test question, indicating the extent to which members of the focal group perform
differently from members of the reference group who have similar ability levels. DIF analyses are run before
scoring is performed. A test question that appears, on the basis of the DIF analysis, to be functioning in a
substantially different way for the focal and reference groups, is reviewed and subject to removal from
scoring. The EXADEP Program encourages test takers to report concerns about specific test questions
directly to the test center administrator or to the EXADEP Program immediately following the test
administration. Subject-matter specialists will review these questions and eliminate them from scoring if
potential bias is determined. The test specialists will also respond in writing to the examinees. If a response
does not resolve an examinee’s concern, the examinee may pursue the matter further with ETS.
23
Guide to the Use and
Interpretation of Scores
for the EXADEP Test
TM
Copyright © 2011 by Educational Testing Service. All rights reserved. ETS, the ETS logo and LISTENING. LEARNING. LEADING. are registered trademarks of Educational Testing Service (ETS)
in the United States and other countries. EXADEP, EXAMEN DE ADMISIÓN A ESTUDIOS DE POSGRADO are trademarks of ETS. 16463