The Bar Examiner Volume 84, No. 1, March 2015

Transcription

THE BAR EXAMINER
Volume 84 | Number 1 | March 2015
A publication of the National Conference of Bar Examiners
Articles
8   2014 Statistics
44  The Revised ABA Standards for Approval of Law Schools: An Overview of the Major Changes, by Jeffrey E. Lewis

Departments
2   Letter from the Chair, by Bryan R. Williams
4   President’s Page, by Erica Moeser
54  The Testing Column: Essay Grading Fundamentals, by Judith A. Gundersen
57  News and Events
60  Litigation Update, by Fred P. Parker III and Jessica Glad
THE BAR EXAMINER
Editor
Claire Huismann
Editorial Advisory Committee
Beverly Tarpley, Chair
Steven M. Barkan
Bedford T. Bentley, Jr.
Victoria A. Graffeo
Paul H. Mills
Madeleine J. Nibert
Fred P. Parker III
Hon. Phyllis D. Thompson
Publication Production and Design
Melanie Hoffman
Editorial Assistant
Lisa Palzkill
Publisher
National Conference of Bar Examiners
NCBE Officers
Chair
Bryan R. Williams
President
Erica Moeser
Immediate Past Chair
Margaret Fuller Corneille
Chair-Elect
Hon. Thomas J. Bice
Secretary
Robert A. Chong
NCBE Board of Trustees
Hulett H. Askew
Hon. Rebecca White Berch
Patrick R. Dixon
Michele A. Gavagni
Gordon J. MacDonald
Hon. Cynthia L. Martin
Suzanne K. Richards
Hon. Phyllis D. Thompson
Letter from the Chair

The mission of the National Conference of Bar Examiners is twofold. It
is to work with other institutions to develop, maintain, and apply reasonable and uniform standards of education and character for eligibility
for admission to the practice of law. It is also to assist bar admission
authorities by, among other things, providing high-quality examinations for the
testing of applicants for admission to the practice of law, conducting educational
programs for bar admission authority members and staff, and providing other
services such as character and fitness investigations and research.
In fulfilling its mission, NCBE sponsors a number of training and educational
activities for the bar admissions community, provides assistance to jurisdictions
on many fronts, and offers opportunities for involvement in its activities. I
encourage bar examiners, administrators, judges, educators, and anyone affiliated with bar admissions to become familiar with the wide range of activities that
NCBE provides and to actively participate in those activities.
For those involved in the drafting and grading of bar exam questions, NCBE
sponsors a biennial mini-conference for bar examiners that addresses best practices in testing. Last fall, this two-day mini-conference, “Best Practices in Testing:
A Mini-Conference for Bar Examiners,” held at NCBE’s headquarters in Madison,
Wisconsin, focused on how bar examiners, especially new bar examiners, could
better learn how to grade essays and how to stay calibrated in their grading over
a period of time. For those who read and grade essays, this type of training is
invaluable. The mini-conference also addressed the fundamental principles applicable to drafting high-quality questions, as well as desirable scoring methods to
achieve the reliability required in high-stakes testing. Such best practices are also
frequent topics at other NCBE educational events.
For those bar examiners from jurisdictions that use the Multistate Essay
Examination (MEE) and/or the Multistate Performance Test (MPT), NCBE conducts grading workshops after the administration of each exam. These workshops, led by workshop facilitators familiar with the questions and grading
materials and by NCBE testing staff, provide excellent analysis of the questions
and expose attendees to the best practices for grading the questions to achieve
the goal of spreading the scores. The workshops can be attended in person or via
conference call; the sessions are also videotaped and edited and made available
on demand for graders to stream at their convenience beginning the week after
the exam.
In addition to providing training, NCBE devotes considerable resources to
providing educational opportunities for the bar examining community and state
courts, such as its Annual Bar Admissions Conference. It is always important to
know the trends in bar admissions and the latest hot-button issues experienced
by bar examiners throughout the country. This
year’s Conference, held in Chicago from April
30 to May 3, will focus on, among other topics,
a profile of the legal profession with particular
emphasis on law school enrollment, law school
debt, bar admission trends, and employment. Also
sure to be of great interest are several sessions
addressing specific character and fitness and ADA
issues, as well as sessions discussing the admission of foreign-trained lawyers and exploring the
future of tablet technology in testing. Other educational events sponsored by NCBE include biennial
academic support conferences directed to law
school faculty and administrators to help law
schools maximize their students’ preparation for
the bar exam, and mini-seminars educating select small
audiences on a variety of topics.
Much of NCBE’s important work is done through its
committees. One of my goals this year was to broaden the
scope of committee membership by inviting a wider array
of people involved in bar admissions from various jurisdictions. Standing committees on which people can serve
include the following: Character and Fitness, Diversity
Issues, Editorial Advisory, Education, Long Range Planning,
Multistate Bar Examination, Multistate Essay Examination/
Multistate Performance Test, Multistate Professional
Responsibility Examination, Technology, and the Special
Committee on the Uniform Bar Examination. Serving on a
committee affords members the opportunity to learn about
and participate in any of several aspects of bar admissions—
whether exchanging ideas about the content of the questions
that appear on NCBE exams and the methods for grading
the questions, exploring technological advances in testing
and grading, looking at issues that affect the diversity of the
testing population, identifying and planning educational
opportunities for the bar examining community, addressing
issues relating to character and fitness, or participating in
decisions concerning the content of this very magazine.
NCBE also assists jurisdictions with the more technical aspects of testing and grading—an invaluable service
provided by NCBE’s staff of experienced testing professionals. Bar examiners in all jurisdictions face issues that
go far beyond simply writing and grading the bar exam.
Standard setting, reliability and validity of jurisdiction-drafted tests, and new and different testing methods are just
a few of the issues faced by bar examiners for which NCBE
provides guidance and expertise. For example, a number
of years ago, the New York State Board
of Law Examiners, on which I serve as a
bar examiner, considered a controversial
policy issue that had been visited by other
jurisdictions. New York is the largest
jurisdiction in terms of the numbers of
candidates tested (over 15,000 in 2014),
and much of the content of its exam is
actually drafted by the Board. For these
reasons, the issues central to the administration of the New York Bar Exam, and
the policy concerns in New York, tend
to be somewhat different from those in
jurisdictions that test fewer applicants
and that may rely on NCBE to draft their test questions.
NCBE’s assistance in performing demographic studies, as
well as lending its expertise in analyzing the data received,
was invaluable to the New York Board in considering the
controversial policy issue.
A number of state boards, in addition to their policy
functions, are responsible for evaluating the character and
fitness of applicants to the bar. NCBE offers investigation
services to state bar admission authorities to verify information presented by applicants on their applications for admission to the bar—not only for U.S.-educated applicants, but
also for foreign-educated applicants. Given the complexity
of the verification process, this service saves jurisdictions
time and resources.
I encourage members of the bar admissions community
to become knowledgeable about and involved in NCBE
activities. One of the best ways to do this is to attend NCBE’s
Annual Bar Admissions Conference and to consider serving
on one of the NCBE committees. I believe that committee
work is one of the best ways to get to know the organization and that contributing time and talent to a committee,
with the knowledge that it furthers NCBE’s mission and
ultimately benefits the profession, can be very rewarding.
Best regards to all.
Sincerely,
Bryan R. Williams
President’s Page
by Erica Moeser

Several years ago I was elected to serve on the Board of Trustees of the small municipality in which I live. I found it striking that often as soon as new residents moved onto one of our Village streets, they began clamoring to have the street turned into a cul-de-sac, or at the very least to have speed bumps the size of small Alps installed to calm the traffic that moved by at the very same pace as when they had purchased their home.

I have recalled that phenomenon over the past few months when thinking of how some legal educators have reacted to the drop in MBE scores earned during the July 2014 test administration. The neighborhood street is lawyer licensing, essentially unchanged, and the cul-de-sac is the wish for the process to change. Some legal educators have expressed hope that NCBE would confess error in the equating of that particular July test. Others have gone further, calling for an overhaul of the test. A few educators (and I treasure them) have pointed out that the test results simply reflect the circumstances in which legal education finds itself these days. It is the very stability of the MBE that has revealed the beginning of what may be a continuing and troubling slide in bar passage in many states.

As I have written previously, my first reaction at seeing the results of the July 2014 MBE was one of concern, and I acted on that concern long before the test results were released. Having nailed down that the results of the MBE equating process were correct, we sent the scores out, and as one jurisdiction after another completed its grading, the impact of the decline in scores became apparent. Not content to rest on the pre-release replications of the results, I continued—and continue—to have the results studied and reproduced at the behest of the NCBE Board, which is itself composed of current and former bar examiners as well as bar admission administrators who are dedicated to doing things correctly.

Legal education commentators have raised questions about the reliability, validity, integrity, and fairness of the test and the processes by which it is created and scored. One dean has announced publicly that he smelled a rat. As perhaps the rat in question, I find it difficult to offer an effective rejoinder. Somehow “I am not a rat” does not get the job done.

The pages of this magazine are replete with explanations of how NCBE examinations are crafted and equated. The purpose of these articles has been to demystify the process and foster transparency. The material we publish, including the material appearing in this column, details the steps that are taken for quality control, and the qualifications of those who draft questions and otherwise contribute to the process. The words in the magazine—and much more—are available on the NCBE website for all those who are willing to take time to look.
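The column refers to equating several times without spelling out the mechanics. As a rough sketch of the general idea (placing scores from a new test form onto the scale of a reference form so that results remain comparable across administrations), a textbook mean-sigma linear equating is shown below. The method, the score lists, and the function name are illustrative assumptions only; this is not NCBE’s operational procedure, and none of the numbers are actual MBE data.

```python
# Illustrative mean-sigma linear equating: place scores from a new form onto the
# scale of a reference form by matching the mean and spread of a common group.
# This is a generic textbook technique, not NCBE's operational equating, and all
# numbers here are made-up illustrative values rather than actual MBE data.
from statistics import mean, stdev

reference_form_scores = [128, 135, 141, 144, 150, 156, 163]  # hypothetical common-group scores, old form
new_form_scores = [121, 130, 137, 139, 146, 151, 158]        # hypothetical common-group scores, new form

def equate_to_reference(raw_score: float) -> float:
    """Linear (mean-sigma) transformation of a new-form score to the reference scale."""
    slope = stdev(reference_form_scores) / stdev(new_form_scores)
    return mean(reference_form_scores) + slope * (raw_score - mean(new_form_scores))

print(round(equate_to_reference(140.0), 1))  # a raw 140 on the new form lands a bit higher on the reference scale
```

Operational equating is considerably more involved than this; the sketch is only meant to picture why scores from different administrations can be placed on a common scale.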
The MBE is a good test, but we never rest on our laurels. We are constantly working to make it better. This is evidenced by the fact that the July 2014 MBE had the highest reliability ever of .92 (“reliability” being a term of art in measurement meant to communicate the degree to which an examinee’s score would be likely to remain the same if the examinee were to be tested again with a comparable but different set of questions). The MBE is also a valid test (another measurement term meant to communicate that the content of the test lines up with the purpose for which the test is administered—here the qualification of an entry-level practitioner for a general license to practice in any field of law).
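One concrete way to read a reliability coefficient of .92 is through the classical-test-theory standard error of measurement, SEM = SD * sqrt(1 - reliability). The short sketch below assumes a round scaled-score standard deviation of 15 purely for illustration; that value is an assumption, not an official MBE statistic.

```python
# Translate a reliability coefficient into a classical-test-theory standard error
# of measurement: SEM = SD * sqrt(1 - reliability). The 0.92 reliability is the
# figure cited in the column; the standard deviation of 15 is an assumed round
# value used only for illustration, not an official MBE statistic.
import math

reliability = 0.92
assumed_scaled_score_sd = 15.0

sem = assumed_scaled_score_sd * math.sqrt(1.0 - reliability)
print(f"Illustrative standard error of measurement: about {sem:.1f} scaled-score points")
```

Under those assumed figures, an examinee’s observed score would typically vary by roughly four scaled-score points from one comparable form to another.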
As to the validity of the MBE, and of the palette of four tests that NCBE produces, the selection of test material must be relevant to the purpose to be served, here the testing of new entrants into a licensed profession. The collective judgments of legal educators, judges, and practicing lawyers all contribute to the construction of the test specifications that define possible test coverage. Ultimately a valid and reliable test that is carefully constructed and carefully scored by experts in the field is a fair test.

Those unfamiliar with the job analysis NCBE conducted about three years ago may find it useful to explore by visiting the Publications tab on the NCBE website. At the time of its release, I commended the report to both bar examiners and legal educators because the survey results spoke to the knowledge and skills new lawyers felt to be of value to them as they completed law school and struck out in various practice settings. The job analysis effort was consistent with our emphasis on testing what we believe the new lawyer should know (as opposed to the arcane or the highly complex). Of course, any bar examination lasting a mere two days does no more than sample what an examinee knows. We try to use testing time as efficiently as possible to achieve the broadest possible sampling.

With regard to integrity, either one has earned the trust that comes with years of performance or one hasn’t. Courts and bar examiners have developed trust in the MBE over the 40-plus years it has been administered. Some legal educators have not. Frankly, many legal educators paid little or no attention to the content and scoring of bar examinations until the current exigencies in legal education brought testing for licensure into sharp focus for them.

Of immediate interest to us at NCBE is the result of the introduction of Civil Procedure to the roster of MBE topics that occurred this February. Civil Procedure found its way onto the MBE by the same process of broad consultation that has marked all other changes to our tests as they have evolved. The drafting committee responsible for this test content is fairly characterized as blue-ribbon.

I recognize and appreciate the staggering challenges facing legal education today. I recently heard an estimate that projected Fall 2015 first-year enrollment at 35,000, down from 52,000 only a few years ago. This falloff comes as more law schools are appearing—204 law schools are currently accredited by the American Bar Association, with several more in the pipeline. Managing law school enrollment in this environment is an uphill battle. Couple that with regulatory requirements and the impact of law school rankings, and one wonders why anyone without a streak of masochism would become a law school dean these days. The demands placed on today’s law school deans are enormous.

NCBE welcomes the opportunity to increase communication with legal educators, both to reveal what we know and do, and to better understand the issues and pressures with which they are contending. We want to be as helpful as possible, given our respective roles in the process, in developing the strategies that will equip current and future learners for swift entry into the profession.
Over the past two years, jurisdictions have heeded the call to disclose name-specific information about who passes and who fails the bar examination to the law schools from which examinees graduate. At this writing all but a few states make this information available to law schools. NCBE has offered to transmit the information for jurisdictions that lack the personnel resources to do so. A chart reflecting the current state of disclosures appears below. It updates a chart that appeared in the December 2013 issue.
Summary: Pass/Fail Disclosure of Bar Exam Results
(Updated chart from December 2013 Bar Examiner)
These jurisdictions
automatically disclose
name-specific pass/fail
information to in-state law
schools. Out-of-state law
schools must request the
information.
These jurisdictions
automatically disclose
name-specific pass/fail
information to the law schools
from which test-takers
graduate.
These jurisdictions require
all law schools to request
name-specific pass/fail
information.
These jurisdictions
provide limited
or no disclosure.
California
Arkansas
Alaska
Disclosure with limitations
Connecticut
Florida
Arizona
Hawaii*
Georgia
Indiana
Colorado
Illinois
Kentucky
Delaware
New Jersey†
Texas‡
Iowa
Louisiana
District of Columbia
No disclosure
Kansas
Minnesota
Idaho
Alabama
Maine
Mississippi
Michigan
Guam
Maryland
North Carolina
Nevada
Northern Mariana Islands
Massachusetts
North Dakota
Pennsylvania
Puerto Rico
Missouri
Ohio
South Carolina
Republic of Palau
Montana
Rhode Island
South Dakota
Nebraska
Tennessee
Virgin Islands
New Hampshire
West Virginia
New Mexico
Wyoming
New York
Oklahoma
Oregon
Utah
Vermont
Virginia
Washington
Wisconsin
*Hawaii releases passing information only.
†If the applicant executes a waiver, New Jersey will release information to law schools on request.
‡Texas will disclose name-specific pass/fail information on request of the law school unless the applicant has requested that the information not be released.
NCBE is currently disseminating name-specific pass/fail information on behalf of Georgia, Iowa, Maine, Missouri, Montana, New Mexico, Oregon, and Vermont.
I am delighted to announce that Kansas has become the 15th state to adopt the Uniform Bar Examination. The Kansas Supreme Court acted this January on a recommendation of the Kansas Board of Law Examiners. The first UBE administration in Kansas will occur in February 2016, and Kansas will begin accepting UBE score transfers in April of this year.

In closing, I would like to reflect on the loss of Scott Street to the bar admissions community. Scott, who served for over 40 years as the Secretary-Treasurer of the Virginia Board of Bar Examiners, was both a founder of, and a mainstay in, the Committee of Bar Admission Administrators, now titled the Council of Bar Admission Administrators. His was an important voice as the job of admissions administrator became professionalized. Scott was selected as a member of the NCBE Board of Trustees and served the organization well in that capacity. He was a leader and a gentleman, and it saddens those of us who knew him to say good-bye.
The December 2014 Bar Examiner included a chart showing the change in total first-year enrollment from 2010 to 2013 as provided by the
ABA Section of Legal Education and Admissions to the Bar and the LSAT 25th percentile for each year. One law school informed us of a discrepancy in need of correction; corrected information appears in the chart below.
Change in First-Year Enrollment from 2010 to 2013
and Reported Changes to the LSAT Score at the 25th Percentile
(Corrections to first-year enrollment data in the December 2014 Bar Examiner)
LAW SCHOOL                 Total First-Year Enrollment               % Change,       25% LSAT Score
                           2010      2011      2012      2013        2010 to 2013    2010   2011   2012   2013
DAYTON, UNIVERSITY OF       207       177       133       100        -52%             150    148    146    145
TOTALS                   52,106    47,276    43,155    39,674        -23.86%
2014 Statistics
This section includes data, by jurisdiction, on the following categories for 2014:
• the number of persons taking and passing bar examinations;
• the number taking and passing bar examinations categorized by source of legal education;
• the number of and passage rates for first-time exam takers and repeaters, both overall and for graduates of ABA-approved law schools;
• the number of and passage rates for graduates of non-ABA-approved law schools by type of school;
• the number of attorney candidates taking and passing special Attorneys’ Examinations; and
• the number of disbarred or suspended attorneys taking and passing examinations as a condition of reinstatement.

Also included are the following:
• a chart showing a longitudinal view of bar passage rates, both overall and for first-time takers, over a 10-year period;
• a five-year snapshot, by jurisdiction, of the number of persons admitted to the bar by examination, on motion, by transferred Uniform Bar Examination (UBE) score (data collection started by NCBE in 2013), and by diploma privilege, as well as the number of individuals licensed as foreign legal consultants; and
• a chart displaying relative admissions to the bar in 2014 by examination, on motion, and by diploma privilege.

Data for the first 10 charts were supplied by the jurisdictions. In reviewing the data, the reader should keep in mind that some individuals seek admission in more than one jurisdiction in a given year. The charts represent the data as of the date they were received from jurisdictions and may not reflect possible subsequent appeals or pending issues that might affect the overall passing statistics for a given jurisdiction. Statistics are updated to reflect any later changes received from jurisdictions and can be found on the NCBE website, www.ncbex.org.

The following national data are shown for the administrations of the Multistate Bar Examination (MBE) and the Multistate Professional Responsibility Examination (MPRE):
• summary statistics,
• score distributions,
• examinee counts over a 10-year period, and
• mean scaled scores over a 10-year period.

The use, by jurisdiction, is illustrated for the MBE, the MPRE, the Multistate Essay Examination (MEE), and the Multistate Performance Test (MPT).
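The percentage columns in the charts that follow are simple count ratios, and each annual total is computed from the combined counts rather than by averaging the February and July rates. A minimal sketch of that arithmetic, using Alabama’s 2014 figures from the first chart:

```python
# How the "% Passing" columns relate across a year: the annual figure is the
# ratio of combined counts, not the average of the two administrations' rates.
# Counts below are Alabama's 2014 figures from the first chart in this section.
feb_taking, feb_passing = 230, 127
jul_taking, jul_passing = 522, 337

total_taking = feb_taking + jul_taking     # 752
total_passing = feb_passing + jul_passing  # 464

print(f"February: {feb_passing / feb_taking:.0%}")       # 55%
print(f"July: {jul_passing / jul_taking:.0%}")           # 65%
print(f"Total: {total_passing / total_taking:.0%}")      # 62%, not the 60% a simple average would give
```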
2014 Statistics Contents
Persons Taking and Passing the 2014 Bar Examination........................................................................ 10
Persons Taking and Passing the 2014 Bar Examination by Source of Legal Education................... 12
First-Time Exam Takers and Repeaters in 2014...................................................................................... 14
2014 First-Time Exam Takers and Repeaters from ABA-Approved Law Schools............................ 18
2014 Exam Takers and Passers from Non-ABA-Approved Law Schools by Type of School.......... 22
Attorneys’ Examinations in 2014............................................................................................................... 23
Examinations Administered to Disbarred or Suspended Attorneys as a Condition of
Reinstatement in 2014................................................................................................................................. 23
Ten-Year Summary of Bar Passage Rates, 2005–2014............................................................................ 24
Admissions to the Bar by Type, 2010–2014............................................................................................. 28
2014 Admissions to the Bar by Examination, on Motion, and by Diploma Privilege...................... 31
Multistate Bar Examination...................................................................................................................... 32
Jurisdictions Using the MBE in 2014................................................................................................... 33
2014 MBE National Summary Statistics (Based on Scaled Scores)................................................. 34
2014 MBE National Score Distributions.............................................................................................. 34
MBE National Examinee Counts, 2005–2014..................................................................................... 35
MBE National Mean Scaled Scores, 2005–2014.................................................................................. 35
Multistate Professional Responsibility Examination.......................................................................... 36
Jurisdictions Using the MPRE in 2014 (with Pass/Fail Standards Indicated)............................ 37
2014 MPRE National Summary Statistics (Based on Scaled Scores)............................................. 38
2014 MPRE National Score Distributions......................................................................................... 38
MPRE National Examinee Counts, 2005–2014................................................................................. 39
MPRE National Mean Scaled Scores, 2005–2014............................................................................. 39
Multistate Essay Examination.................................................................................................................. 40
Jurisdictions Using the MEE in 2014................................................................................................. 41
Multistate Performance Test..................................................................................................................... 42
Jurisdictions Using the MPT in 2014................................................................................................. 43
Persons Taking and Passing the 2014 Bar Examination
February
Jurisdiction
July
Total
Taking
Passing
% Passing
Taking
Passing
% Passing
Taking
Passing
% Passing
Alabama
230
127
55%
522
337
65%
752
464
62%
Alaska
45
31
69%
74
48
65%
119
79
66%
Arizona
397
253
64%
667
456
68%
1,064
709
67%
Arkansas
139
88
63%
216
135
63%
355
223
63%
California
4,578
2,073
45%
8,504
4,135
49%
13,082
6,208
47%
Colorado
391
280
72%
847
631
74%
1,238
911
74%
Connecticut
278
199
72%
457
353
77%
735
552
75%
192
121
63%
192
121
63%
Delaware
District of Columbia
No February examination
297
136
46%
264
87
33%
561
223
40%
Florida
1,315
820
62%
3,214
2,122
66%
4,529
2,942
65%
Georgia
574
364
63%
1,311
967
74%
1,885
1,331
71%
Hawaii
117
75
64%
169
116
69%
286
191
67%
Idaho
52
36
69%
113
76
67%
165
112
68%
Illinois
984
740
75%
2,398
1,940
81%
3,382
2,680
79%
Indiana
266
162
61%
552
400
72%
818
562
69%
Iowa
97
83
86%
253
206
81%
350
289
83%
Kansas
157
135
86%
188
148
79%
345
283
82%
Kentucky
198
152
77%
388
295
76%
586
447
76%
Louisiana
398
190
48%
762
532
70%
1,160
722
62%
Maine
61
41
67%
119
87
73%
180
128
71%
Maryland
567
342
60%
1,537
1,102
72%
2,104
1,444
69%
Massachusetts
679
414
61%
2,096
1,598
76%
2,775
2,012
73%
Michigan
681
444
65%
953
604
63%
1,634
1,048
64%
Minnesota
225
175
78%
747
593
79%
972
768
79%
Mississippi
111
90
81%
183
143
78%
294
233
79%
Missouri
262
211
81%
792
676
85%
1,054
887
84%
Montana
54
36
67%
126
81
64%
180
117
65%
Nebraska
42
18
43%
171
131
77%
213
149
70%
Nevada
224
128
57%
332
191
58%
556
319
57%
New Hampshire
61
46
75%
161
134
83%
222
180
81%
New Jersey
1,015
613
60%
3,297
2,445
74%
4,312
3,058
71%
New Mexico
137
111
81%
203
171
84%
340
282
83%
4,032
1,902
47%
11,195
7,265
65%
15,227
9,167
60%
New York
a
Examinations in Puerto Rico are administered in March and September.
Persons Taking and Passing the 2014 Bar Examination (continued)
February
Jurisdiction
July
Total
Taking
Passing
% Passing
Taking
Passing
% Passing
Taking
Passing
% Passing
North Carolina
632
356
56%
1,207
746
62%
1,839
1,102
60%
North Dakota
42
26
62%
78
49
63%
120
75
63%
Ohio
440
283
64%
1,173
902
77%
1,613
1,185
73%
Oklahoma
121
85
70%
307
242
79%
428
327
76%
Oregon
213
140
66%
476
311
65%
689
451
65%
Pennsylvania
720
413
57%
1,981
1,496
76%
2,701
1,909
71%
Rhode Island
48
35
73%
176
128
73%
224
163
73%
South Carolina
252
158
63%
482
342
71%
734
500
68%
South Dakota
26
18
69%
84
61
73%
110
79
72%
Tennessee
304
194
64%
810
537
66%
1,114
731
66%
Texas
1,152
781
68%
2,929
2,091
71%
4,081
2,872
70%
Utah
147
113
77%
290
236
81%
437
349
80%
Vermont
47
32
68%
61
40
66%
108
72
67%
Virginia
547
325
59%
1,377
936
68%
1,924
1,261
66%
Washington
334
237
71%
886
685
77%
1,220
922
76%
West Virginia
81
57
70%
186
137
74%
267
194
73%
Wisconsin
95
68
72%
175
131
75%
270
199
74%
Wyoming
23
15
65%
60
45
75%
83
60
72%
Guam
9
7
78%
13
8
62%
22
15
68%
N. Mariana Islands
3
2
67%
5
5
100%
8
7
88%
Palau
No February examination
17
3
18%
17
3
18%
Puerto Ricoa
523
178
34%
698
296
42%
1,221
474
39%
Virgin Islands
11
6
55%
19
16
84%
30
22
73%
24,434
14,044
57%
56,493
37,769
67%
80,927
51,813
64%
TOTALS
a
Examinations in Puerto Rico are administered in March and September.
Persons Taking and Passing the 2014 Bar Examination
by Source of Legal Education
ABA-Approved
Law School
Jurisdiction
Non-ABA-Approved
Law Schoola
Law School
Outside the USA
Law Office Study
Taking
Passing
% Passing
Taking
Passing
% Passing
Taking
Passing
% Passing
Taking
Passing
% Passing
Alabama
469
401
86%
278
59
21%
5
4
80%
—
—
—
Alaska
115
78
68%
2
1
50%
2
0
0%
—
—
—
Arizona
1,057
705
67%
4
3
75%
3
1
33%
—
—
—
Arkansas
355
223
63%
—
—
—
—
—
—
—
—
—
California
8,786b,c
5,010b,c
57%
2,124b,c
419b,c
20%
1,031
148
14%
10
3
30%
Colorado
1,231
908
74%
4
1
25%
3
2
67%
—
—
—
Connecticut
696
550
79%
39
2
5%
—
—
—
—
—
—
Delaware
192
121
63%
—
—
—
—
—
—
—
—
—
District of Columbia
303
144
48%
14
1
7%
244
78
32%
—
—
—
Florida
4,524
2,941
65%
5
1
20%
—
—
—
—
—
—
Georgia
1,858
1,327
71%
25
2
8%
2
2
100%
—
—
—
Hawaii
286
191
67%
—
—
—
—
—
—
—
—
—
Idaho
165
112
68%
—
—
—
—
—
—
—
—
—
Illinois
3,318
2,656
80%
1
1
100%
63
23
37%
—
—
—
Indiana
818
562
69%
—
—
—
—
—
—
—
—
—
Iowa
348
289
83%
—
—
—
2
0
0%
—
—
—
Kansas
345
283
82%
—
—
—
—
—
—
—
—
—
Kentucky
586
447
76%
—
—
—
—
—
—
—
—
—
Louisiana
1,143
718
63%
—
—
—
17
4
24%
—
—
—
173
124
72%
6
4
67%
1
0
0%
—
—
—
Maryland
2,086
1,436
69%
2
2
100%
16
6
38%
—
—
—
Massachusetts
2,443
1,902
78%
291
95
33%
41
15
37%
—
—
—
Michigan
1,630
1,048
64%
—
—
—
4
0
0%
—
—
—
Minnesota
972
768
79%
—
—
—
—
—
—
—
—
—
Mississippi
294
233
79%
—
—
—
—
—
—
—
—
—
1,045
883
84%
2
2
100%
7
2
29%
—
—
—
Maine
Missouri
a
See page 22 for a breakdown of exam takers and passers from non-ABA-approved law schools by type of school.
b
California does not recognize U.S. attorneys taking the General Bar Examination as being from either ABA-approved or non-ABA-approved law
schools. This number of applicants (1,078 taking, 624 passing) is therefore omitted from either category. California’s “U.S. Attorneys Taking the General
Bar Exam” category is composed of attorneys admitted in other jurisdictions less than four years who must take, and those admitted four or more years
who have elected to take, the General Bar Examination.
c
Applicants under California’s four-year qualification rule who did not earn J.D. degrees (53 taking, 4 passing) are not included in either the ABA-approved or non-ABA-approved category. California’s four-year qualification rule allows applicants to take the General Bar Examination through a
combination of four years of law study without graduating from a law school.
Persons Taking and Passing the 2014 Bar Examination
by Source of Legal Education (continued)
ABA-Approved
Law School
Jurisdiction
Non-ABA-Approved
Law Schoola
Law School
Outside the USA
Law Office Study
Taking
Passing
% Passing
Taking
Passing
% Passing
Taking
Passing
% Passing
Taking
Passing
% Passing
Montana
180
117
65%
—
—
—
—
—
—
—
—
—
Nebraska
213
149
70%
—
—
—
—
—
—
—
—
—
Nevada
549
316
58%
3
1
33%
4
2
50%
—
—
—
New Hampshire
204
168
82%
18
12
67%
—
—
—
—
—
—
New Jersey
4,312
3,058
71%
—
—
—
—
—
—
—
—
—
New Mexico
338
282
83%
2
0
0%
—
—
—
—
—
—
New York
10,392
7,596
73%
6
1
17%
4,813
1,565
33%
16
5
31%
North Carolina
1,839
1,102
60%
—
—
—
—
—
—
—
—
—
120
75
63%
—
—
—
—
—
—
—
—
—
1,593
1,181
74%
—
—
—
20
4
20%
—
—
—
Oklahoma
428
327
76%
—
—
—
—
—
—
—
—
—
Oregon
682
450
66%
1
1
100%
6
0
0%
—
—
—
Pennsylvania
2,697
1,909
71%
1
0
0%
3
0
0%
—
—
—
Rhode Island
224
163
73%
—
—
—
—
—
—
—
—
—
South Carolina
734
500
68%
—
—
—
—
—
—
—
—
—
South Dakota
110
79
72%
—
—
—
—
—
—
—
—
—
Tennessee
841
593
71%
265
138
52%
8
0
0%
—
—
—
Texas
4,037
2,860
71%
14
6
43%
30
6
20%
—
—
—
Utah
437
349
80%
—
—
—
—
—
—
—
—
—
Vermont
99
67
68%
—
—
—
1
1
100%
8
4
50%
Virginia
1,903
1,259
66%
—
—
—
10
0
0%
11
2
18%
Washington
1,187
907
76%
—
—
—
17
6
35%
16
9
56%
West Virginia
267
194
73%
—
—
—
—
—
—
—
—
—
Wisconsin
260
197
76%
1
1
100%
9
1
11%
—
—
—
Wyoming
83
60
72%
—
—
—
—
—
—
—
—
—
Guam
22
15
68%
—
—
—
—
—
—
—
—
—
N. Mariana Islands
8
7
88%
—
—
—
—
—
—
—
—
—
Palau
6
1
17%
1
0
0%
10
2
20%
—
—
—
1,192
466
39%
29
8
28%
—
—
—
—
—
—
30
22
73%
—
—
—
—
—
—
—
—
—
70,225
48,529
69%
3,138
761
24%
6,372
1,872
29%
61
23
38%
North Dakota
Ohio
Puerto Rico
Virgin Islands
TOTALS
a
See page 22 for a breakdown of exam takers and passers from non-ABA-approved law schools by type of school.
First-Time Exam Takers and Repeaters in 2014a
First-Timers
Repeaters
Jurisdiction
2014 Administration
Taking
Passing
% Passing
Taking
Passing
% Passing
Alabama
February
128
99
77%
102
28
27%
July
418
331
79%
104
6
6%
Total
546
430
79%
206
34
17%
February
33
26
79%
12
5
42%
July
61
47
77%
13
1
8%
Alaska
Arizona
Arkansas
California
Colorado
Connecticut
Delaware
Dist. of Columbia
Florida
Georgia
Hawaii
Idaho
Illinois
Indiana
Total
94
73
78%
25
6
24%
February
280
199
71%
117
54
46%
July
564
421
75%
103
35
34%
Total
844
620
73%
220
89
40%
February
81
66
81%
58
22
38%
July
177
129
73%
39
6
15%
Total
258
195
76%
97
28
29%
February
1,492
822
55%
3,086
1,251
41%
July
6,220
3,818
61%
2,284
317
14%
Total
7,712
4,640
60%
5,370
1,568
29%
February
281
220
78%
110
60
55%
July
787
616
78%
60
15
25%
Total
1,068
836
78%
170
75
44%
February
192
167
87%
86
32
37%
July
408
346
85%
49
7
14%
Total
600
513
86%
135
39
29%
36
14
39%
February
No February examination
July
156
107
69%
Total
156
107
69%
36
14
39%
February
179
110
61%
118
26
22%
July
140
73
52%
124
14
11%
Total
319
183
57%
242
40
17%
February
805
587
73%
510
233
46%
July
2,864
2,057
72%
350
65
19%
Total
3,669
2,644
72%
860
298
35%
339
272
80%
235
92
39%
February
July
1,133
909
80%
178
58
33%
Total
1,472
1,181
80%
413
150
36%
February
83
60
72%
34
15
44%
July
145
109
75%
24
7
29%
Total
228
169
74%
58
22
38%
February
41
29
71%
11
7
64%
July
101
75
74%
12
1
8%
Total
142
104
73%
23
8
35%
February
661
552
84%
323
188
58%
July
2,203
1,881
85%
195
59
30%
Total
2,864
2,433
85%
518
247
48%
February
152
119
78%
114
43
38%
July
474
378
80%
78
22
28%
Total
626
497
79%
192
65
34%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
First-Time Exam Takers and Repeaters in 2014a (continued)
First-Timers
Jurisdiction
2014 Administration
Iowa
February
July
Total
Kansas
Kentucky
Louisiana
Maine
Maryland
Massachusetts
Michigan
Minnesota
Mississippi
Missouri
Montana
Nebraska
Nevada
Repeaters
Passing
% Passing
Taking
Passing
% Passing
81
73
90%
16
10
63%
245
202
82%
8
4
50%
326
275
84%
24
14
58%
February
132
122
92%
25
13
52%
July
176
144
82%
12
4
33%
Total
308
266
86%
37
17
46%
February
122
98
80%
76
54
71%
July
355
286
81%
33
9
27%
Total
477
384
81%
109
63
58%
February
150
71
47%
248
119
48%
July
572
429
75%
190
103
54%
Total
722
500
69%
438
222
51%
February
33
26
79%
28
15
54%
July
107
81
76%
12
6
50%
Total
140
107
76%
40
21
53%
February
267
190
71%
300
152
51%
July
1,359
1,049
77%
178
53
30%
Total
1,626
1,239
76%
478
205
43%
388
283
73%
291
131
45%
July
1,877
1,545
82%
219
53
24%
Total
2,265
1,828
81%
510
184
36%
February
382
271
71%
299
173
58%
July
769
563
73%
184
41
22%
Total
February
Taking
1,151
834
72%
483
214
44%
February
149
132
89%
76
43
57%
July
703
581
83%
44
12
27%
Total
852
713
84%
120
55
46%
February
77
69
90%
34
21
62%
July
156
133
85%
27
10
37%
Total
233
202
87%
61
31
51%
February
205
175
85%
57
36
63%
July
753
662
88%
39
14
36%
Total
958
837
87%
96
50
52%
February
41
31
76%
13
5
38%
July
114
77
68%
12
4
33%
Total
155
108
70%
25
9
36%
February
19
11
58%
23
7
30%
July
157
124
79%
14
7
50%
Total
176
135
77%
37
14
38%
February
143
96
67%
81
32
40%
July
261
179
69%
71
12
17%
Total
404
275
68%
152
44
29%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
First-Time Exam Takers and Repeaters in 2014a (continued)
First-Timers
Jurisdiction
2014 Administration
New Hampshire
February
New Jersey
New Mexico
North Carolina
North Dakota
Ohio
Oklahoma
Oregon
Pennsylvania
Rhode Island
South Carolina
South Dakota
Tennessee
Taking
Passing
% Passing
Taking
Passing
% Passing
46
39
85%
15
7
47%
July
151
130
86%
10
4
40%
Total
197
169
86%
25
11
44%
February
591
393
66%
424
220
52%
July
3,041
2,360
78%
256
85
33%
Total
3,632
2,753
76%
680
305
45%
February
116
102
88%
21
9
43%
July
180
158
88%
23
13
57%
Total
New York
Repeaters
296
260
88%
44
22
50%
1,490
918
62%
2,542
984
39%
July
9,231
6,872
74%
1,964
393
20%
Total
10,721
7,790
73%
4,506
1,377
31%
267
171
64%
365
185
51%
February
February
July
821
698
85%
386
48
12%
Total
1,088
869
80%
751
233
31%
February
31
21
68%
11
5
45%
July
66
42
64%
12
7
58%
Total
97
63
65%
23
12
52%
February
247
194
79%
193
89
46%
July
1,055
858
81%
118
44
37%
Total
1,302
1,052
81%
311
133
43%
February
66
56
85%
55
29
53%
July
285
239
84%
22
3
14%
Total
351
295
84%
77
32
42%
February
134
107
80%
79
33
42%
July
419
298
71%
57
13
23%
Total
553
405
73%
136
46
34%
February
344
249
72%
376
164
44%
July
1,747
1,440
82%
234
56
24%
Total
2,091
1,689
81%
610
220
36%
February
25
19
76%
23
16
70%
July
164
127
77%
12
1
8%
Total
189
146
77%
35
17
49%
February
170
120
71%
82
38
46%
July
413
308
75%
69
34
49%
Total
583
428
73%
151
72
48%
February
17
13
76%
9
5
56%
July
80
60
75%
4
1
25%
Total
97
73
75%
13
6
46%
February
185
134
72%
119
60
50%
July
712
514
72%
98
23
23%
Total
897
648
72%
217
83
38%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
First-Time Exam Takers and Repeaters in 2014a (continued)
First-Timers
Jurisdiction
2014 Administration
Texas
February
Utah
Vermont
Virginia
Washington
West Virginia
Wisconsin
Wyoming
Guam
N. Mariana
Islands
Palau
Virgin Islands
% Passing
Taking
Passing
% Passing
742
570
77%
410
211
51%
2,548
1,965
77%
381
126
33%
Total
3,290
2,535
77%
791
337
43%
February
111
95
86%
36
18
50%
July
261
228
87%
29
8
28%
Total
372
323
87%
65
26
40%
February
32
27
84%
15
5
33%
July
47
32
68%
14
8
57%
Total
79
59
75%
29
13
45%
February
263
184
70%
284
141
50%
July
1,216
886
73%
161
50
31%
Total
1,479
1,070
72%
445
191
43%
February
215
170
79%
119
67
56%
July
815
653
80%
71
32
45%
Total
1,030
823
80%
190
99
52%
February
43
39
91%
38
18
47%
July
166
132
80%
20
5
25%
Total
209
171
82%
58
23
40%
February
78
61
78%
17
7
41%
July
155
127
82%
20
4
20%
Total
233
188
81%
37
11
30%
February
15
11
73%
8
4
50%
July
54
43
80%
6
2
33%
Total
69
54
78%
14
6
43%
February
4
3
75%
5
4
80%
July
9
7
78%
4
1
25%
Total
13
10
77%
9
5
56%
February
3
2
67%
—
—
—
July
5
5
100%
—
—
—
Total
8
7
88%
—
—
—
1
25%
February
No February examination
13
2
15%
4
Total
13
2
15%
4
1
25%
February
152
56
37%
371
122
33%
July
451
213
47%
247
83
34%
Total
603
269
45%
618
205
33%
February
7
4
57%
4
2
50%
July
15
13
87%
4
3
75%
Total
TOTALS
Repeaters
Passing
July
July
Puerto Ricob
Taking
22
17
77%
8
5
63%
February
12,330
8,734
71%
12,104
5,310
44%
July
47,575
35,762
75%
8,918
2,007
23%
Total
59,905
44,496
74%
21,022
7,317
35%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
b
Examinations in Puerto Rico are administered in March and September.
2014 First-Time Exam Takers and Repeaters
from ABA-Approved Law Schoolsa
ABA First-Timers
Jurisdiction
Alabama
Alaska
Arizona
Arkansas
California
Colorado
Connecticut
Delaware
2014 Administration
Taking
Passing
% Passing
Taking
Passing
% Passing
February
85
78
92%
25
14
56%
July
348
305
88%
11
4
36%
Total
433
383
88%
36
18
50%
February
33
26
79%
11
5
45%
July
59
46
78%
12
1
8%
Total
92
72
78%
23
6
26%
February
276
198
72%
116
53
46%
July
563
420
75%
102
34
33%
Total
839
618
74%
218
87
40%
February
81
66
81%
58
22
38%
July
177
129
73%
39
6
15%
Total
258
195
76%
97
28
29%
February
736
441
60%
1,849
935
51%
July
5,102
3,415
67%
1,099
219
20%
Total
5,838
3,856
66%
2,948
1,154
39%
February
280
219
78%
109
60
55%
July
784
615
78%
58
14
24%
Total
1,064
834
78%
167
74
44%
February
183
165
90%
74
32
43%
July
400
346
87%
39
7
18%
Total
583
511
88%
113
39
35%
February
Georgia
Hawaii
Idaho
Illinois
Indiana
No February examination
July
156
107
69%
36
14
39%
Total
156
107
69%
36
14
39%
107
74
69%
45
8
18%
Dist. of Columbia February
Florida
ABA Repeaters
July
94
60
64%
57
2
4%
Total
201
134
67%
102
10
10%
February
804
586
73%
509
233
46%
July
2,862
2,057
72%
349
65
19%
Total
3,666
2,643
72%
858
298
35%
339
272
80%
219
90
41%
February
July
1,133
909
80%
167
56
34%
Total
1,472
1,181
80%
386
146
38%
February
83
60
72%
34
15
44%
July
145
109
75%
24
7
29%
Total
228
169
74%
58
22
38%
February
41
29
71%
11
7
64%
July
101
75
74%
12
1
8%
Total
142
104
73%
23
8
35%
February
644
544
84%
313
187
60%
July
2,179
1,867
86%
182
58
32%
Total
2,823
2,411
85%
495
245
49%
February
152
119
78%
114
43
38%
July
474
378
80%
78
22
28%
Total
626
497
79%
192
65
34%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
2014 First-Time Exam Takers and Repeaters
from ABA-Approved Law Schoolsa (continued)
ABA First-Timers
Jurisdiction
Iowa
Kansas
Kentucky
Louisiana
Maine
Maryland
Massachusetts
Michigan
Minnesota
Mississippi
Missouri
Montana
Nebraska
Nevada
2014 Administration
ABA Repeaters
Taking
Passing
% Passing
Taking
Passing
% Passing
February
81
73
90%
15
10
67%
July
245
202
82%
7
4
57%
Total
326
275
84%
22
14
64%
February
132
122
92%
25
13
52%
July
176
144
82%
12
4
33%
Total
308
266
86%
37
17
46%
February
122
98
80%
76
54
71%
July
355
286
81%
33
9
27%
Total
477
384
81%
109
63
58%
February
145
69
48%
244
118
48%
July
570
429
75%
184
102
55%
Total
715
498
70%
428
220
51%
February
30
24
80%
27
14
52%
July
105
81
77%
11
5
45%
Total
135
105
78%
38
19
50%
February
264
190
72%
295
149
51%
July
1,351
1,045
77%
176
52
30%
Total
1,615
1,235
76%
471
201
43%
323
256
79%
197
103
52%
July
1,796
1,506
84%
127
37
29%
Total
February
2,119
1,762
83%
324
140
43%
February
443
271
61%
235
173
74%
July
769
563
73%
183
41
22%
Total
1,212
834
69%
418
214
51%
February
149
132
89%
76
43
57%
July
703
581
83%
44
12
27%
Total
852
713
84%
120
55
46%
February
77
69
90%
34
21
62%
July
156
133
85%
27
10
37%
Total
233
202
87%
61
31
51%
February
202
172
85%
56
36
64%
July
749
661
88%
38
14
37%
Total
951
833
88%
94
50
53%
February
41
31
76%
13
5
38%
July
114
77
68%
12
4
33%
Total
155
108
70%
25
9
36%
February
19
11
58%
23
7
30%
July
157
124
79%
14
7
50%
Total
176
135
77%
37
14
38%
February
143
96
67%
77
31
40%
July
259
177
68%
70
12
17%
Total
402
273
68%
147
43
29%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
2014 First-Time Exam Takers and Repeaters
from ABA-Approved Law Schoolsa (continued)
ABA First-Timers
Jurisdiction
New Hampshire
New Jersey
New Mexico
New York
North Carolina
North Dakota
Ohio
Oklahoma
Oregon
Pennsylvania
Rhode Island
South Carolina
South Dakota
Tennessee
2014 Administration
ABA Repeaters
Taking
Passing
% Passing
Taking
Passing
% Passing
February
42
32
76%
9
7
78%
July
145
126
87%
8
3
38%
Total
187
158
84%
17
10
59%
February
591
393
66%
424
220
52%
July
3,041
2,360
78%
256
85
33%
Total
3,632
2,753
76%
680
305
45%
February
116
102
88%
20
9
45%
July
180
158
88%
22
13
59%
Total
296
260
88%
42
22
52%
February
1,284
718
56%
975
641
66%
July
7,302
6,031
83%
831
206
25%
Total
8,586
6,749
79%
1,806
847
47%
February
267
171
64%
365
185
51%
July
821
698
85%
386
48
12%
Total
1,088
869
80%
751
233
31%
February
31
21
68%
11
5
45%
July
66
42
64%
12
7
58%
Total
97
63
65%
23
12
52%
February
243
192
79%
189
89
47%
July
1,050
857
82%
111
43
39%
Total
1,293
1,049
81%
300
132
44%
February
66
56
85%
55
29
53%
July
285
239
84%
22
3
14%
Total
351
295
84%
77
32
42%
February
133
107
80%
76
32
42%
July
417
298
71%
56
13
23%
Total
550
405
74%
132
45
34%
February
344
249
72%
375
164
44%
July
1,745
1,440
83%
233
56
24%
Total
2,089
1,689
81%
608
220
36%
70%
February
25
19
76%
23
16
July
164
127
77%
12
1
8%
Total
189
146
77%
35
17
49%
February
170
120
71%
82
38
46%
July
413
308
75%
69
34
49%
Total
583
428
73%
151
72
48%
February
17
13
76%
9
5
56%
July
80
60
75%
4
1
25%
Total
97
73
75%
13
6
46%
February
119
88
74%
54
31
57%
July
615
459
75%
53
15
28%
Total
734
547
75%
107
46
43%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
2014 First-Time Exam Takers and Repeaters
from ABA-Approved Law Schoolsa (continued)
ABA First-Timers
Jurisdiction
Texas
Utah
Vermont
Virginia
Washington
West Virginia
Wisconsin
Wyoming
Guam
N. Mariana
Islands
Palau
2014 Administration
Taking
Passing
% Passing
Taking
Passing
% Passing
726
565
78%
406
210
52%
July
2,534
1,959
77%
371
126
34%
Total
February
3,260
2,524
77%
777
336
43%
February
111
95
86%
36
18
50%
July
261
228
87%
29
8
28%
Total
372
323
87%
65
26
40%
February
28
24
86%
10
3
30%
July
54
33
61%
7
7
100%
Total
82
57
70%
17
10
59%
February
260
182
70%
275
141
51%
July
1,215
886
73%
153
50
33%
Total
1,475
1,068
72%
428
191
45%
February
209
165
79%
116
66
57%
July
793
644
81%
69
32
46%
Total
1,002
809
81%
185
98
53%
February
43
39
91%
38
18
47%
July
166
132
80%
20
5
25%
Total
209
171
82%
58
23
40%
February
76
61
80%
14
7
50%
July
153
125
82%
17
4
24%
Total
229
186
81%
31
11
35%
February
15
11
73%
8
4
50%
July
54
43
80%
6
2
33%
Total
69
54
78%
14
6
43%
February
4
3
75%
5
4
80%
July
9
7
78%
4
1
25%
Total
13
10
77%
9
5
56%
February
3
2
67%
—
—
—
July
5
5
100%
—
—
—
Total
8
7
88%
—
—
—
1
50%
February
July
Total
Puerto Ricob
Virgin Islands
No February examination
4
0
0%
2
4
0
0%
2
1
50%
February
152
56
37%
353
116
33%
July
451
213
47%
236
81
34%
Total
603
269
45%
589
197
33%
February
7
4
57%
4
2
50%
July
15
13
87%
4
3
75%
Total
TOTALS
ABA Repeaters
22
17
77%
8
5
63%
February
11,097
7,979
72%
8,812
4,541
52%
July
44,120
34,338
78%
6,196
1,671
27%
Total
55,217
42,317
77%
15,008
6,212
41%
a
First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as
examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
b
Examinations in Puerto Rico are administered in March and September.
2014 Exam Takers and Passers from Non-ABA-Approved Law Schools by Type of School

Jurisdiction              Conventional Law School (a)      Correspondence Law School (b)    Online Law School (c)
                          Taking   Passing   % Passing     Taking   Passing   % Passing     Taking   Passing   % Passing
Alabama                      278        59       21%           —         —         —            —         —         —
Alaska                         2         1       50%           —         —         —            —         —         —
Arizona                        4         3       75%           —         —         —            —         —         —
California (d)             1,540       335       22%         158        33       21%          317        48       15%
Colorado                       4         1       25%           —         —         —            —         —         —
Connecticut                   39         2        5%           —         —         —            —         —         —
District of Columbia           8         1       13%           —         —         —            6         0        0%
Florida                        5         1       20%           —         —         —            —         —         —
Georgia                       25         2        8%           —         —         —            —         —         —
Illinois                       1         1      100%           —         —         —            —         —         —
Maine                          6         4       67%           —         —         —            —         —         —
Maryland                       —         —         —           —         —         —            2         2      100%
Massachusetts                291        95       33%           —         —         —            —         —         —
Missouri                       2         2      100%           —         —         —            —         —         —
Nevada                         3         1       33%           —         —         —            —         —         —
New Hampshire                 18        12       67%           —         —         —            —         —         —
New Mexico                     2         0        0%           —         —         —            —         —         —
New York                       6         1       17%           —         —         —            —         —         —
Oregon                         1         1      100%           —         —         —            —         —         —
Pennsylvania                   1         0        0%           —         —         —            —         —         —
Tennessee                    265       138       52%           —         —         —            —         —         —
Texas                         14         6       43%           —         —         —            —         —         —
Wisconsin                      —         —         —           —         —         —            1         1      100%
Palau                          1         0        0%           —         —         —            —         —         —
Puerto Rico                   29         8       28%           —         —         —            —         —         —
TOTALS                     2,545       674       26%         158        33       21%          326        51       16%

a. Conventional law schools are fixed-facility schools that conduct instruction principally in physical classroom facilities.
b. Correspondence law schools are schools that conduct instruction principally by correspondence.
c. Online law schools are schools that conduct instruction and provide interactive classes principally by technological transmission, including Internet transmission and electronic conferencing.
d. California applicants from non-ABA-approved law schools also include those who attended schools no longer in operation, composed of an unverifiable mixture of conventional, correspondence, and online schools. This number of applicants (109 taking, 3 passing) is therefore omitted from this chart.
Attorneys’ Examinations (a) in 2014

Jurisdiction            February                          July                              Total
                        Taking   Passing   % Passing      Taking   Passing   % Passing      Taking   Passing   % Passing
California                 510       275       54%           417       131       31%           927       406       44%
Georgia                    136       125       92%           114        89       78%           250       214       86%
Idaho                       12        11       92%            13         6       46%            25        17       68%
Maine                       22        20       91%            14        10       71%            36        30       83%
Maryland                    87        67       77%           104        99       95%           191       166       87%
Rhode Island                22        19       86%            12         7       58%            34        26       76%
Vermont                      —         —         —            61        40       66%            61        40       66%
Guam                         —         —         —             2         0        0%             2         0        0%
N. Mariana Islands           2         1       50%             1         1      100%             3         2       67%
Virgin Islands               4         2       50%             —         —         —             4         2       50%
TOTALS                     795       520       65%           738       383       52%         1,533       903       59%

a. Attorneys’ Examination refers to a short form or other form of bar examination administered to attorneys admitted in other jurisdictions.
Examinations Administered to Disbarred or Suspended Attorneys as a Condition of Reinstatement in 2014 (a)

Jurisdiction        Taking   Passing   % Passing
Arizona                  3         1       33%
Arkansas                 1         1      100%
California              33         2        6%
Colorado                 2         2      100%
Florida (b)              8         2       25%
South Carolina           5         3       60%
Texas                    6         0        0%
TOTALS                  58        11       19%

a. The form of examination administered to disbarred or suspended attorneys varied among jurisdictions as follows: regular bar examination (5 jurisdictions), local component only (1 jurisdiction), Attorneys’ Examination (1 jurisdiction).
b. Florida reports only a subset of suspended attorneys who are required to take the Florida portion of the examination only. Disbarred and other suspended attorneys who are required to take the regular bar examination are reported with other test takers.
Ten-Year Summary of Bar Passage Rates, 2005–2014
Jurisdiction
Alabama
Alaska
Arizona
Arkansas
California
Colorado
Connecticut
Delaware
District of Columbia
Florida
Georgia
Hawaii
Idaho
Illinois
Indiana
2005
2006
2007
2008
2009
2010
2011
2012
2013
2014
Overall
64%
65%
64%
67%
65%
67%
65%
64%
64%
62%
First-Time
80%
80%
78%
79%
77%
78%
77%
76%
78%
79%
Overall
63%
62%
60%
70%
58%
71%
59%
67%
66%
66%
First-Time
75%
75%
82%
80%
72%
81%
71%
78%
80%
78%
Overall
67%
68%
70%
76%
73%
73%
70%
75%
73%
67%
First-Time
72%
75%
78%
84%
80%
81%
76%
80%
78%
73%
Overall
70%
69%
70%
72%
67%
65%
71%
68%
65%
63%
First-Time
78%
80%
80%
83%
74%
72%
84%
76%
76%
76%
Overall
46%
47%
49%
54%
49%
49%
51%
51%
51%
47%
First-Time
62%
65%
66%
71%
66%
65%
67%
65%
65%
60%
Overall
68%
68%
69%
73%
74%
74%
79%
77%
76%
74%
First-Time
78%
76%
78%
83%
85%
83%
86%
84%
82%
78%
Overall
74%
75%
77%
78%
75%
71%
71%
73%
73%
75%
First-Time
81%
83%
86%
87%
83%
81%
82%
82%
81%
86%
Overall
57%
59%
62%
73%
63%
66%
67%
63%
72%
63%
First-Time
63%
67%
71%
80%
71%
72%
73%
69%
78%
69%
Overall
51%
51%
54%
56%
49%
41%
48%
51%
47%
40%
First-Time
69%
72%
76%
70%
65%
60%
69%
68%
61%
57%
Overall
60%
64%
66%
71%
68%
69%
72%
71%
70%
65%
First-Time
71%
75%
78%
81%
78%
78%
80%
79%
78%
72%
Overall
73%
76%
75%
79%
76%
75%
76%
75%
76%
71%
First-Time
84%
86%
85%
89%
86%
84%
85%
84%
85%
80%
Overall
71%
71%
70%
76%
76%
68%
75%
68%
73%
67%
First-Time
81%
77%
82%
88%
86%
77%
83%
75%
81%
74%
Overall
74%
79%
76%
72%
81%
78%
79%
80%
79%
68%
First-Time
80%
85%
81%
80%
86%
83%
85%
86%
83%
73%
Overall
78%
79%
82%
85%
84%
84%
83%
81%
82%
79%
First-Time
85%
87%
89%
91%
91%
89%
89%
87%
88%
85%
Overall
75%
76%
76%
78%
75%
75%
74%
72%
74%
69%
First-Time
84%
84%
84%
84%
83%
81%
83%
79%
83%
79%
Ten-Year Summary of Bar Passage Rates, 2005–2014 (continued)
Jurisdiction
Iowa
Kansas
Kentucky
Louisiana
Maine
Maryland
Massachusetts
Michigan
Minnesota
Mississippi
Missouri
Montana
Nebraska
Nevada
New Hampshire
2005
2006
2007
2008
2009
2010
2011
2012
2013
2014
Overall
80%
81%
83%
85%
88%
87%
84%
88%
88%
83%
First-Time
86%
88%
89%
90%
93%
91%
90%
92%
93%
84%
Overall
76%
82%
87%
86%
82%
84%
86%
84%
85%
82%
First-Time
81%
90%
91%
89%
86%
90%
89%
89%
89%
86%
Overall
72%
73%
77%
77%
77%
77%
80%
76%
75%
76%
First-Time
80%
82%
87%
83%
86%
82%
86%
82%
81%
81%
Overall
69%
70%
61%
62%
69%
61%
66%
59%
50%
62%
First-Time
72%
76%
63%
66%
72%
65%
70%
63%
58%
69%
Overall
70%
73%
80%
86%
77%
88%
68%
68%
76%
71%
First-Time
81%
81%
84%
91%
82%
89%
73%
73%
81%
76%
Overall
65%
66%
67%
75%
69%
71%
74%
71%
73%
69%
First-Time
74%
78%
76%
85%
78%
80%
81%
78%
80%
76%
Overall
72%
77%
77%
80%
79%
81%
80%
77%
78%
73%
First-Time
82%
87%
86%
89%
87%
88%
87%
83%
85%
81%
Overall
64%
78%
76%
72%
81%
80%
76%
58%
62%
64%
First-Time
75%
87%
86%
82%
89%
85%
82%
64%
69%
72%
Overall
81%
86%
88%
87%
85%
86%
88%
85%
85%
79%
First-Time
88%
91%
93%
91%
90%
92%
93%
91%
90%
84%
Overall
85%
80%
81%
82%
78%
76%
73%
73%
77%
79%
First-Time
88%
86%
88%
88%
85%
80%
81%
81%
85%
87%
Overall
81%
82%
84%
87%
87%
86%
89%
89%
87%
84%
First-Time
88%
88%
90%
91%
91%
90%
93%
92%
90%
87%
Overall
84%
91%
89%
91%
87%
89%
90%
91%
85%
65%
First-Time
89%
92%
88%
92%
89%
93%
91%
93%
89%
70%
Overall
73%
80%
83%
84%
78%
81%
78%
73%
74%
70%
First-Time
85%
83%
89%
89%
88%
90%
83%
83%
77%
77%
Overall
59%
61%
60%
64%
60%
59%
65%
64%
61%
57%
First-Time
68%
72%
74%
77%
73%
73%
76%
73%
73%
68%
Overall
54%
77%
77%
88%
84%
80%
78%
82%
71%
81%
First-Time
61%
82%
84%
88%
85%
82%
81%
84%
75%
86%
Ten-Year Summary of Bar Passage Rates, 2005–2014 (continued)
Jurisdiction
New Jersey
New Mexico
New York
North Carolina
North Dakota
Ohio
Oklahoma
Oregon
Pennsylvania
Rhode Island
South Carolina
South Dakota
Tennessee
Texas
Utah
2005
2006
2007
2008
2009
2010
2011
2012
2013
2014
Overall
70%
73%
73%
77%
77%
76%
77%
71%
75%
71%
First-Time
77%
81%
82%
85%
84%
82%
84%
78%
79%
76%
Overall
81%
86%
78%
85%
84%
81%
82%
84%
83%
83%
First-Time
85%
91%
83%
92%
91%
88%
88%
89%
91%
88%
Overall
62%
63%
64%
69%
65%
65%
64%
61%
64%
60%
First-Time
74%
77%
77%
81%
77%
76%
76%
74%
76%
73%
Overall
64%
64%
65%
71%
67%
68%
70%
65%
59%
60%
First-Time
71%
75%
76%
83%
77%
78%
80%
79%
69%
80%
Overall
83%
72%
69%
77%
80%
78%
83%
78%
72%
63%
First-Time
90%
83%
79%
85%
87%
84%
85%
81%
80%
65%
Overall
71%
74%
76%
79%
76%
78%
79%
76%
79%
73%
First-Time
80%
83%
86%
88%
86%
86%
86%
84%
86%
81%
Overall
82%
83%
85%
89%
80%
82%
83%
80%
81%
76%
First-Time
89%
91%
91%
93%
87%
89%
88%
84%
86%
84%
Overall
67%
72%
74%
71%
69%
68%
68%
72%
73%
65%
First-Time
74%
80%
81%
78%
77%
75%
78%
81%
80%
73%
Overall
70%
71%
72%
77%
76%
74%
77%
73%
73%
71%
First-Time
80%
83%
83%
87%
86%
83%
85%
82%
81%
81%
Overall
65%
71%
75%
75%
74%
74%
69%
78%
71%
73%
First-Time
71%
77%
79%
79%
78%
79%
74%
83%
76%
77%
Overall
80%
77%
79%
75%
72%
73%
73%
67%
75%
68%
First-Time
85%
78%
82%
82%
78%
80%
77%
73%
79%
73%
Overall
72%
77%
85%
88%
83%
94%
94%
83%
87%
72%
First-Time
83%
85%
89%
95%
90%
99%
94%
86%
91%
75%
Overall
74%
75%
71%
76%
68%
70%
69%
68%
73%
66%
First-Time
80%
79%
80%
83%
77%
79%
77%
73%
82%
72%
Overall
71%
74%
76%
78%
78%
76%
80%
75%
80%
70%
First-Time
80%
82%
84%
84%
85%
83%
86%
82%
85%
77%
Overall
86%
83%
81%
83%
83%
82%
84%
77%
82%
80%
First-Time
90%
89%
85%
87%
89%
89%
88%
82%
87%
87%
Ten-Year Summary of Bar Passage Rates, 2005–2014 (continued)

Jurisdiction                       2005   2006   2007   2008   2009   2010   2011   2012   2013   2014
Vermont             Overall        73%    68%    66%    65%    61%    76%    68%    65%    76%    67%
                    First-Time     80%    78%    70%    79%    68%    87%    71%    69%    83%    75%
Virginia            Overall        68%    68%    67%    73%    69%    70%    72%    69%    71%    66%
                    First-Time     76%    74%    76%    82%    76%    77%    79%    77%    77%    72%
Washington          Overall        71%    78%    77%    73%    67%    71%    66%    64%    76%    76%
                    First-Time     77%    80%    78%    74%    69%    70%    67%    66%    82%    80%
West Virginia       Overall        64%    60%    63%    67%    73%    65%    74%    72%    68%    73%
                    First-Time     71%    64%    74%    79%    81%    75%    83%    82%    76%    82%
Wisconsin           Overall        77%    78%    89%    89%    89%    90%    84%    83%    83%    74%
                    First-Time     80%    82%    92%    92%    93%    92%    88%    86%    88%    81%
Wyoming             Overall        72%    72%    62%    64%    75%    71%    62%    53%    81%    72%
                    First-Time     80%    74%    70%    67%    79%    75%    62%    60%    84%    78%
Guam                Overall        77%    75%    76%    75%    52%    80%    67%    57%    63%    68%
                    First-Time     100%   70%    79%    73%    60%    90%    81%    60%    64%    77%
N. Mariana Islands  Overall        100%   88%    88%    83%    100%   63%    83%    100%   92%    88%
                    First-Time     100%   88%    86%    83%    100%   57%    100%   100%   92%    88%
Palau               Overall        71%    27%    —      67%    17%    57%    25%    30%    63%    18%
                    First-Time     71%    27%    —      50%    17%    67%    0%     38%    67%    15%
Puerto Rico         Overall        38%    46%    42%    44%    41%    42%    44%    36%    40%    39%
                    First-Time     46%    57%    52%    52%    48%    50%    50%    45%    45%    45%
Virgin Islands      Overall        69%    73%    56%    76%    65%    71%    49%    64%    61%    73%
                    First-Time     70%    70%    65%    84%    70%    77%    52%    70%    70%    77%
AVERAGES            Overall        64%    67%    67%    71%    68%    68%    69%    67%    68%    64%
                    First-Time     76%    78%    79%    82%    79%    79%    79%    77%    78%    74%
2014 Statistics 27
2014 Statistics
Admissions to the Bar by Type, 2010–2014

                        Admission by Examination                   Admission on Motion/by Transferred UBE Score(a)
Jurisdiction            2010    2011    2012    2013    2014       2010    2011    2012    2013      2014
Alabama                 492     516     533     465     461        19      32      —       38/—      30/10
Alaska                  106     70      106     103     79         19      36      44      27        37/8
Arizona                 543     506     629     722     683        234     183     145     176/8     171/38
Arkansas                236     260     253     242     219        49      47      55      60        47
California              6,423   6,627   6,846   7,008   6,726      —       —       —       —         —
Colorado                1,005   1,101   1,080   1,019   914        130     155     157     185/13    245/45
Connecticut             635     531     585     564     516        15      28      83      116       81
Delaware                142     122     147     148     122        —       —       —       —         —
District of Columbia    191     194     204     92      253        2,875   2,970   2,932   3,028     2,670
Florida                 3,190   3,646   3,342   3,476   3,137      —       —       —       —         —
Georgia                 1,174   1,165   1,144   1,245   1,297      90      123     124     132       178
Hawaii                  160     208     219     206     203        —       —       —       —         —
Idaho                   149     137     183     158     132        91      73      92      63/10     71/34
Illinois                2,943   2,793   2,786   2,944   2,676      93      135     191     240       293
Indiana                 618     578     625     609     565        42      65      52      66        58
Iowa                    329     335     364     328     294        73      96      79      88        97
Kansas                  370     356     322     316     277        47      39      116     77        94
Kentucky                486     554     476     581     475        62      91      83      87        91
Louisiana               671     744     664     533     722        —       —       —       —         —
Maine                   168     157     145     152     128        4       6       20      31        48
Maryland                1,365   1,653   1,685   1,742   1,637      —       —       —       —         —
Massachusetts           2,216   2,278   2,289   2,233   1,998      162     138     174     178       194
Michigan                986     979     878     1,061   1,011      100     120     138     187       192
Minnesota               824     732     825     796     752        215     191     233     215/17    200/48
Mississippi             260     252     248     265     233        29      34      33      40        35
Missouri                861     877     922     911     899        72      88      111     115/8     138/29
Montana                 150     192     200     170     112        —       —       —       —/34      —/72
Nebraska                117     104     80      142     147        146     141     198     173/1     119/3
Nevada                  373     542     550     343     319        —       —       —       —         —

(a) NCBE began collecting data for admission by transferred UBE score in 2013. Any persons admitted by transferred UBE score in 2011 (the first administration of the UBE, in which three jurisdictions administered the UBE) and 2012 (in which six jurisdictions administered the UBE) are included in those jurisdictions’ admission on motion numbers.
28 The Bar Examiner, March 2015
Admissions to the Bar by Type, 2010–2014 (continued)

                        Admission by Examination                     Admission on Motion/by Transferred UBE Score(a)
Jurisdiction            2010    2011    2012    2013    2014         2010    2011    2012    2013       2014
New Hampshire           149     159     164     128     168          86      118     91      99/1       74/6
New Jersey              3,133   2,844   3,175   3,386   3,635        —       —       —       —          —
New Mexico              268     287     298     287     324          —       —       —       —          —
New York                9,649   9,309   9,046   9,698   10,273       483     546     613     553        476
North Carolina          998     1,032   1,094   997     1,102        107     69      76      94         107
North Dakota            69      67      102     85      76           70      128     185     174/8      132/28
Ohio                    1,263   1,234   1,235   1,309   1,179        65      90      118     135        143
Oklahoma                380     411     510     392     328          61      54      73      71         69
Oregon                  537     616     496     488     471          172     179     138     171        160
Pennsylvania            2,220   2,099   1,886   1,995   1,883        331     305     285     246        236
Rhode Island            202     185     204     201     158          —       —       —       —          —
South Carolina          466     508     526     598     469          —       —       —       —          —
South Dakota            74      74      87      91      52           18      22      23      30         22
Tennessee               700     681     668     858     709          150     140     124     153        135
Texas                   2,929   3,097   2,988   3,356   2,892        328     379     408     480        533
Utah                    385     545     390     424     441          67      61      53      53/22      61/43
Vermont                 67      82      73      95      104          37      27      35      56         326
Virginia                1,645   1,411   1,577   1,528   1,224        60      41      43      62         98
Washington              950     923     935     1,006   910          231     225     232     318/29     484/69
West Virginia           193     224     221     208     185          66      83      73      66         53
Wisconsin               269     256     241     215     204          141     202     174     167        154
Wyoming                 103     96      91      96      61           16      16      27      41/20      64/78
Guam                    11      12      6       11      10           —       —       —       —          —
N. Mariana Islands      5       5       8       13      8            —       11      9       4          7
Palau                   4       0       4       5       4            —       —       —       —          7
Puerto Rico             465     557     466     491     495          —       —       —       —          —
Virgin Islands          37      23      25      23      29           —       2       —       —          6
TOTALS                  54,354  54,946  54,846  56,558  54,381       7,056   7,489   7,840   8,295/171  8,436/511

(a) NCBE began collecting data for admission by transferred UBE score in 2013. Any persons admitted by transferred UBE score in 2011 (the first administration of the UBE, in which three jurisdictions administered the UBE) and 2012 (in which six jurisdictions administered the UBE) are included in those jurisdictions’ admission on motion numbers.
2014 Statistics 29
2014 Statistics
Admissions to the Bar by Type, 2010–2014 (continued)

Foreign Legal Consultants

Jurisdiction            2010   2011   2012   2013   2014
Arizona                 1      —      1      1      —
California              5      3      4      13     17
Colorado                —      —      —      —      1
Delaware                —      1      —      —      —
District of Columbia    6      8      11     13     6
Florida                 32     47     52     60     9
Georgia                 1      —      1      2      1
Hawaii                  —      —      —      —      1
Illinois                2      —      —      1      —
Iowa                    1      —      —      —      —
Massachusetts           —      1      —      1      1
Michigan                —      —      —      —      1
Minnesota               —      1      1      —      2
New Jersey              1      —      —      —      —
New Mexico              —      —      1      —      —
New York                13     23     36     26     36
North Carolina          —      —      —      —      1
Ohio                    —      —      —      —      2
Pennsylvania            —      1      —      —      1
South Carolina          —      2      1      —      —
Texas                   2      4      6      8      3
Virginia                —      —      —      1      —
Washington              —      —      1      2      3
TOTALS                  64     91     115    128    85
Admission by Diploma Privilege(a)

Jurisdiction        2010   2011   2012   2013   2014
New Hampshire(b)    14     19     20     22     22
Wisconsin           466    462    463    461    417
TOTALS              480    481    483    483    439

(a) Diploma privilege is defined as an admissions method that excuses students from a traditional bar examination.
(b) Individuals are graduates of New Hampshire’s Daniel Webster Scholar Honors Program, which is a two-year, performance-based program that includes clinical experience, portfolio review, and meetings with bar examiners.
30 The Bar Examiner, March 2015
2014 Admissions to the Bar by Examination,
on Motion, and by Diploma Privilege
(Note: Some jurisdictions have relatively low percentages of on-motion admissions, which may not be easily visible in this chart.
Please refer to the accompanying chart on pages 28–30 for precise numbers.)
[Horizontal bar chart: 2014 admissions for each jurisdiction, Alabama through the Virgin Islands, plotted on a scale of 0 to 11,000. Legend: By Examination, On Motion, By Diploma Privilege.]
2014 Statistics 31
2014 Statistics
The National Conference of Bar Examiners has produced the Multistate Bar Examination (MBE)
since 1972. In 2014, the MBE was part of the bar examination in 54 jurisdictions.
The MBE consists of 200 multiple-choice questions in the following areas: Civil Procedure,
Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Real Property, and Torts.
The purpose of the MBE is to assess the extent to which an examinee can apply fundamental legal
principles and legal reasoning to analyze given fact patterns.
Both a raw score and a scaled score are computed for each examinee. A raw score is the number
of questions answered correctly. Raw scores from different administrations of the MBE are not
comparable, primarily due to differences in the difficulty of the questions from one administration to the next. The statistical process of equating adjusts for variations in the difficulty of the
questions, producing scaled scores that represent the same level of performance across all MBE
administrations. For instance, if the questions appearing on the July MBE were more difficult than
those appearing on the February MBE, then the scaled scores for the July MBE would be adjusted
upward to account for this difference. These adjustments ensure that no examinee is unfairly
penalized or rewarded for taking a more or less difficult exam. Each jurisdiction determines its
own policy with regard to the relative weight given to the MBE and other scores. (Jurisdictions
that administer the Uniform Bar Examination [UBE] weight the MBE component 50%.)
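The sketch below is only an illustration of the general idea described above, not NCBE’s actual equating procedure (which is a more sophisticated statistical process than a simple linear rescaling). All of the numbers, names, and anchor values in it are hypothetical; the point is just that two examinees who perform equally well relative to forms of different difficulty end up with the same scaled score.

    # Hedged illustration only: a toy linear transformation standing in for the
    # real equating process, which this article does not specify in detail.
    def to_scaled(raw, form_raw_mean, form_raw_sd, scale_mean=140.0, scale_sd=15.0):
        """Map a raw score onto a common reporting scale (hypothetical anchors)."""
        return scale_mean + scale_sd * (raw - form_raw_mean) / form_raw_sd

    # A harder form (lower raw mean) and an easier form (higher raw mean):
    feb = to_scaled(raw=128, form_raw_mean=125, form_raw_sd=18)
    jul = to_scaled(raw=131, form_raw_mean=128, form_raw_sd=18)

    # Both examinees sit the same distance above their form's mean, so both
    # receive the same scaled score even though their raw scores differ.
    print(feb, jul)  # 142.5 142.5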
32 The Bar Examiner, March 2015
Jurisdictions Using the MBE in 2014
Key for Jurisdictions Using the MBE in 2014
Gray shading indicates jurisdictions using the MBE. Jurisdictions not shown on the map that are included in this category:
the District of Columbia, Guam, Northern Mariana Islands, Palau, and Virgin Islands.
No shading indicates jurisdictions not using the MBE. Jurisdiction not shown on the map that is included in this category:
Puerto Rico.
2014 Statistics 33
2014 Statistics
2014 MBE National Score Distributions(a)
(Percentage of Examinees)

MBE Scaled Score(b)    February (Mean = 138.0)    July (Mean = 141.5)
85                     0.0                        0.1
90                     0.1                        0.1
95                     0.3                        0.2
100                    0.6                        0.4
105                    1.2                        1.0
110                    1.9                        1.5
115                    3.4                        2.6
120                    5.4                        4.1
125                    7.3                        6.1
130                    9.9                        8.5
135                    12.3                       10.6
140                    11.6                       10.5
145                    13.5                       12.9
150                    11.7                       11.2
155                    8.1                        10.3
160                    6.3                        8.4
165                    3.4                        5.3
170                    1.7                        4.0
175                    0.8                        1.6
180                    0.2                        0.7
185                    0.1                        0.1
190                    0.0                        0.0

2014 MBE National Summary Statistics (Based on Scaled Scores)(a)

                        February   July     2014 Total
Number of Examinees     22,083     51,005   73,088
Mean Scaled Score       138.0      141.5    140.4
Standard Deviation      15.3       16.0     15.9
Maximum                 187.1      187.5    187.5
Minimum                 70.7       44.4     44.4
Median                  138.3      142.2    141.2

[Chart: 2014 MBE National Score Distributions — percentage of examinees by MBE scaled score, February Exam (Mean = 138.0) and July Exam (Mean = 141.5).]

(a) The values reflect valid scores available electronically as of 1/21/2015.
(b) These data represent scaled scores in increments of 5. For example, the percentage reported for 135 includes examinees whose MBE scaled scores were between 130.5 and 135.4.
34 The Bar Examiner, March 2015
MBE National Examinee Counts, 2005–2014(a)

Year   February   July     Total
2005   21,265     49,998   71,263
2006   22,824     51,176   74,000
2007   22,250     50,181   72,431
2008   20,822     50,011   70,833
2009   18,868     50,385   69,253
2010   19,504     50,114   69,618
2011   20,369     49,933   70,302
2012   20,695     52,337   73,032
2013   21,578     53,706   75,284
2014   22,083     51,005   73,088

[Chart: MBE Examinee Count by year, 2005–2014, February Exam and July Exam.]
MBE National Mean Scaled Scores, 2005–2014(a)

Year   February   July    Total
2005   137.7      141.6   140.4
2006   137.5      143.3   141.5
2007   136.9      143.7   141.6
2008   137.7      145.6   143.3
2009   135.7      144.5   142.1
2010   136.6      143.6   141.7
2011   138.6      143.8   142.3
2012   137.0      143.4   141.6
2013   138.0      144.3   142.5
2014   138.0      141.5   140.4

[Chart: MBE Mean Scaled Score by year, 2005–2014, February Exam and July Exam.]

(a) The values reflect valid scores available electronically as of 1/21/2015.
2014 Statistics 35
2014 Statistics
The National Conference of Bar Examiners has produced the Multistate Professional Responsibility
Examination (MPRE) since 1980. In 2014, the MPRE was required in 53 jurisdictions.
The MPRE consists of 60 multiple-choice questions whose scope of coverage includes the following: regulation of the legal profession; the client-lawyer relationship; client confidentiality;
conflicts of interest; competence, legal malpractice, and other civil liability; litigation and other
forms of advocacy; transactions and communications with persons other than clients; different
roles of the lawyer; safekeeping funds and other property; communications about legal services;
lawyers’ duties to the public and the legal system; and judicial conduct. The purpose of the MPRE
is to measure the examinee’s knowledge and understanding of established standards related to a
lawyer’s professional conduct.
The MPRE scaled score is a standard score. Standard scaled scores range from 50 (low) to 150
(high). The mean (average) scaled score was established at 100, based upon the performance of
the examinees who took the MPRE in March 1999. The conversion of raw scores to scaled scores
involves a statistical process that adjusts for variations in the difficulty of different forms of the
examination so that any particular scaled score will represent the same level of knowledge from
test to test. For instance, if a test is more difficult than previous tests, then the scaled scores on
that test will be adjusted upward to account for this difference. If a test is easier than previous
tests, then the scaled scores on the test will be adjusted downward to account for this difference.
The purpose of these adjustments is to help ensure that no examinee is unfairly penalized or
rewarded for taking a more or less difficult form of the test. Passing scores are established by each
jurisdiction.
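Because passing scores are jurisdiction-specific, the same MPRE scaled score can satisfy one jurisdiction’s requirement and fall short of another’s. The following is a minimal, purely illustrative sketch of that comparison; the two passing scores used are taken from the map key on the next page (75 for the District of Columbia, 80 for Guam), and the function itself is invented for illustration rather than anything prescribed by NCBE.

    # Illustrative only: jurisdictions set their own MPRE passing scores.
    # The two entries below come from the map key that follows (DC 75, Guam 80).
    PASSING_SCORES = {"District of Columbia": 75, "Guam": 80}

    def meets_mpre_requirement(scaled_score, jurisdiction):
        """Return True if the scaled score meets the jurisdiction's passing score."""
        return scaled_score >= PASSING_SCORES[jurisdiction]

    print(meets_mpre_requirement(77, "District of Columbia"))  # True
    print(meets_mpre_requirement(77, "Guam"))                  # False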
36 The Bar Examiner, March 2015
Jurisdictions Using the MPRE in 2014
(with Pass/Fail Standards Indicated)
Key for Jurisdictions Using the MPRE in 2014
Gray shading indicates jurisdictions using the MPRE. Jurisdictions not shown on the map that are included in this category:
the District of Columbia (75), Guam (80), Northern Mariana Islands (80), Palau (75), and Virgin Islands (75).
No shading indicates jurisdictions not using the MPRE. Jurisdiction not shown on the map that is included in this category:
Puerto Rico.
2014 Statistics 37
2014 Statistics
2014 MPRE National Summary Statistics (Based on Scaled Scores)(a)

                        March     August    November   2014 Total
Number of Examinees     22,957    17,699    19,888     60,544
Mean Scaled Score       93.1      93.1      94.5       93.6
Standard Deviation      16.4      17.0      16.4       16.6
Maximum                 149       145       150        150
Minimum                 50        50        50         50
Median                  94        94        94         94

2014 MPRE National Score Distributions(a)
(Percentage of Examinees)

MPRE Scaled Score(b)   March (Mean = 93.1)   August (Mean = 93.1)   November (Mean = 94.5)
50                     1.8                   2.5                    1.4
60                     6.6                   5.7                    6.0
70                     13.2                  12.8                   12.4
80                     20.8                  21.2                   20.6
90                     24.1                  25.2                   23.9
100                    16.0                  15.7                   16.6
110                    10.9                  10.5                   13.5
120                    5.2                   5.0                    3.6
130                    1.3                   1.1                    1.9
140                    0.1                   0.3                    0.2
150                    0.0                   0.0                    0.0

[Chart: 2014 MPRE National Score Distributions — percentage of examinees by MPRE scaled score, March (Mean = 93.1), August (Mean = 93.1), and November (Mean = 94.5).]

(a) The values reflect valid scores available electronically as of 1/23/2015 on both standard and alternative forms of the MPRE.
(b) These data represent scaled scores in increments of 10. For example, the percentage reported for 70 includes examinees whose MPRE scaled scores were between 70 and 79.
38 The Bar Examiner, March 2015
MPRE National Examinee Counts, 2005–2014(a)

Year   Mar./Apr.   Aug.     Nov.     Total
2005   19,869      15,703   21,716   57,288
2006   21,684      15,986   23,308   60,978
2007   21,724      17,107   23,404   62,235
2008   20,288      16,536   23,568   60,392
2009   21,755      18,085   22,483   62,323
2010   22,478      18,641   23,345   64,464
2011   22,136      19,773   24,731   66,640
2012   24,280      19,028   23,191   66,499
2013   22,320      19,895   20,459   62,674
2014   22,957      17,699   19,888   60,544

[Chart: MPRE Examinee Count by year, 2005–2014, Mar./Apr., August, and November administrations.]
MPRE National Mean Scaled Scores, 2005–2014(a)

Year   Mar./Apr.   Aug.    Nov.    Total
2005   98.3        98.0    99.6    98.7
2006   98.6        96.9    98.1    98.0
2007   98.5        98.0    99.2    98.6
2008   98.9        95.6    97.9    97.6
2009   98.8        95.8    97.3    97.4
2010   97.4        95.7    97.2    96.8
2011   97.1        93.4    96.3    95.7
2012   99.3        95.8    97.2    97.6
2013   94.6        94.3    98.1    95.6
2014   93.1        93.1    94.5    93.6

[Chart: MPRE Mean Scaled Score by year, 2005–2014, Mar./Apr., August, and November administrations.]

(a) The values reflect valid scores available electronically as of 1/23/2015 on both standard and alternative forms of the MPRE.
2014 Statistics 39
2014 Statistics
The National Conference of Bar Examiners has produced the Multistate Essay Examination (MEE)
since 1988. In 2014, the MEE was used in 31 jurisdictions.
NCBE offers six 30-minute questions per administration.
The purpose of the MEE is to test the examinee’s ability to (1) identify legal issues raised by a
hypothetical factual situation; (2) separate material which is relevant from that which is not; (3)
present a reasoned analysis of the relevant issues in a clear, concise, and well-organized composition; and (4) demonstrate an understanding of the fundamental legal principles relevant to the
probable solution of the issues raised by the factual situation. The primary distinction between the
MEE and the Multistate Bar Examination (MBE) is that the MEE requires the examinee to demonstrate an ability to communicate effectively in writing.
Areas of law that may be covered on the MEE include the following: Business Associations (Agency
and Partnership; Corporations and Limited Liability Companies), Civil Procedure, Conflict of
Laws, Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Family Law, Real
Property, Torts, Trusts and Estates (Decedents’ Estates; Trusts and Future Interests), and Uniform
Commercial Code (Secured Transactions). Some questions may include issues in more than one
area of law. The particular areas covered vary from exam to exam. Each jurisdiction determines
its own policy with regard to the relative weight given to the MEE and other scores. (Jurisdictions
that administer the Uniform Bar Examination [UBE] weight the MEE component 30%.)
40 The Bar Examiner, March 2015
Jurisdictions Using the MEE in 2014
Key for Jurisdictions Using the MEE in 2014
Gray shading indicates jurisdictions using the MEE. Jurisdictions not shown on the map that are included in this category:
the District of Columbia, Guam, Northern Mariana Islands, and Palau.
No shading indicates jurisdictions not using the MEE. Jurisdictions not shown on the map that are included in this category:
Puerto Rico and Virgin Islands.
*Alaska began administering the MEE in July 2014.
2014 Statistics 41
2014 Statistics
The National Conference of Bar Examiners has produced the Multistate Performance Test (MPT)
since 1997. In 2014, the MPT was used in 41 jurisdictions.
NCBE offers two 90-minute MPT items per administration. A jurisdiction may select one or both
items to include as part of its bar examination. (Jurisdictions that administer the Uniform Bar
Examination [UBE] use two MPTs as part of their bar examinations.)
The MPT is designed to test an examinee’s ability to use fundamental lawyering skills in a realistic situation. Each test evaluates an examinee’s ability to complete a task that a beginning lawyer
should be able to accomplish. The MPT requires examinees to (1) sort detailed factual materials
and separate relevant from irrelevant facts; (2) analyze statutory, case, and administrative materials for applicable principles of law; (3) apply the relevant law to the relevant facts in a manner
likely to resolve a client’s problem; (4) identify and resolve ethical dilemmas, when present; (5)
communicate effectively in writing; and (6) complete a lawyering task within time constraints.
Each jurisdiction determines its own policy with regard to the relative weight given to the MPT
and other scores. (Jurisdictions that administer the UBE weight the MPT component 20%.)
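Taken together, the component weights noted in this issue (MBE 50%, MEE 30%, MPT 20%) determine how a UBE jurisdiction combines its scores. The sketch below is only a weighted-average illustration with invented component scores; it does not reproduce the actual UBE scaling and score-reporting process, in which the written components are scaled before weighting.

    # Illustrative weighted combination of UBE components (MBE 50%, MEE 30%,
    # MPT 20%, as noted above). The component scores below are hypothetical.
    WEIGHTS = {"mbe": 0.50, "mee": 0.30, "mpt": 0.20}

    def weighted_total(mbe, mee, mpt):
        """Combine component scores using the UBE weights."""
        return WEIGHTS["mbe"] * mbe + WEIGHTS["mee"] * mee + WEIGHTS["mpt"] * mpt

    print(round(weighted_total(mbe=142.0, mee=138.0, mpt=145.0), 1))  # 141.4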
42 The Bar Examiner, March 2015
Jurisdictions Using the MPT in 2014
Key for Jurisdictions Using the MPT in 2014
Gray shading indicates jurisdictions using the MPT. Jurisdictions not shown on the map that are included in this category:
the District of Columbia, Guam, Northern Mariana Islands, and Palau.
No shading indicates jurisdictions not using the MPT. Jurisdictions not shown on the map that are included in this category:
Puerto Rico and Virgin Islands.
2014 Statistics 43
The Revised ABA Standards for Approval of Law Schools: An Overview of the Major Changes
by Jeffrey E. Lewis

The revised ABA Standards and Rules of Procedure for Approval of Law Schools (Standards), which became effective on August 12, 2014, are the culmination of a comprehensive review of the Standards begun in 2008. The Comprehensive Review was undertaken by the Standards Review Committee (SRC) of the ABA Section of Legal Education and Admissions to the Bar (Section) under direction of the Section Council (Council), which is recognized by the U.S. Department of Education as the national accrediting agency for programs leading to the J.D. degree in American law schools.1

The SRC is charged with reviewing proposed changes or additions to the Standards and may also initiate such changes. (The Department of Education requires that all accrediting agencies periodically review and update their standards and policies pertaining to approval of schools and programs.) This Comprehensive Review was undertaken in light of recommendations from the Accreditation Policy Task Force2—assembled in 2007 to review accreditation policies and practices by taking a fresh look from a policy perspective—and from three special committees appointed to consider the recommendations of the Accreditation Policy Task Force and to report their suggestions to the Council: the Special Committees on Transparency, Security of Position, and Outcome Measures.3 Thus, the review began with a considerable prelude of careful thought about needed change.

During the Comprehensive Review the SRC held 23 public meetings, met with various interest groups, and participated in conferences, including those of the ABA, the Association of American Law Schools, and the National Conference of Bar Examiners (NCBE). The SRC agenda and draft proposals and recommendations to the Council were widely published. Hundreds of comments were received and considered by the SRC and the Council. Six hearings were held during the various notice and comment periods following Council preliminary approval. The Committee also held two open forum meetings for interested parties to provide comments to the Committee before making its recommendations to the Council.

The Comprehensive Review was completed with Council approval of the SRC recommendations at its March and June meetings in 2014, and with the concurrence of the ABA House of Delegates at the ABA Annual Meeting in Boston on August 11, 2014. While the new Standards became effective after that meeting, the Council and the Section, cognizant that law schools will need time to do the work that some of the revised and new Standards will require, established a transition and implementation plan. Under this plan, site visits in 2014–2015 will substantially rely on the 2013–2014 Standards. The new Standards will be applied to site visits beginning in 2015–2016, with the exception of certain new Standards pertaining to learning outcomes, curriculum changes, and assessment methods, which will be applied beginning in 2016–2017 and applied as appropriate to students who become 1L students that year.4 The revised Rules of Procedure, on the other hand, do not require a delay for implementation.

This article highlights some of the major changes to the Standards and offers examples of those changes. Most categories of change include too many examples to cover; in these cases, the most important changes are highlighted. Selected pertinent excerpts from the new Standards are on pages 50–52. While the Rules of Procedure were also substantially modified, those changes are not discussed in this article. Note that many Standards and Interpretations were renumbered as a result of the significant revisions (references in this article are to the revised Standards unless otherwise noted); the material found on the Standards web page of the ABA Section is helpful in that regard.5

Categories of Major Changes to the Standards

Learning Outcomes: The Standards were revised to incorporate student learning outcomes. This is one of the most significant developments in the new Standards.

New Standards 301(b) and 302, Learning Outcomes, introduce the requirement that law schools establish and publish learning outcomes designed to achieve the objectives of the program of legal education. This development is in line with the recommendation of the Outcome Measures Committee to increase reliance on output measures. It is consistent with the best practices in higher education.

Certain minimum learning outcomes requiring competency in four key areas are outlined, though broadly stated to give law schools maximum flexibility. (See the sidebar on pages 50–52 for the learning outcomes specified in Standard 302.) Interpretation 302-1 states that other professional skills in which competency is required are to be determined by the law school and provides a nonexclusive list of skills that may be included, while Interpretation 302-2 allows a law school to identify any additional learning outcomes pertinent to its program.

Assessment of Student Learning: In a related development, new Standard 314, Assessment of Student Learning, requires that law schools use formative and summative assessment methods to measure student learning and to provide meaningful feedback to students.

Interpretation 314-1 provides definitions of formative and summative assessment methods (see the sidebar on pages 50–52), while Interpretation 314-2 provides for flexibility in implementing the assessment requirement.

Evaluation of Program of Legal Education, Learning Outcomes, and Assessment Methods: A further related development is the adoption of new Standard 315, Evaluation of Program of Legal Education, Learning Outcomes, and Assessment Methods, which requires that law schools evaluate their programs of legal education, adopted learning outcomes, and assessment methods on a regular basis.

Law schools are expected to use the results of their evaluations to assess the degree to which students have achieved competency in the learning outcomes and to improve their programs accordingly. Interpretation 315-1 gives law schools flexibility in determining what assessment methods to use across the curriculum. The transition and implementation plan for the Standards provides a phase-in period so that law schools may develop their learning outcomes and assessment methods to be in compliance with the new Standard.

Curriculum—Professional and Practical Training: The revised Standards emphasize professional and practical training through new curricular requirements.

Although the Curriculum Standard (now Standard 303) already mandated that law schools require each student to receive substantial instruction in the areas of professional responsibility and professional skills, the Standard now contains a specific graduation requirement of credit hours in these areas. Under the revised Standard, law schools must require each student to satisfactorily complete one course of at least two credit hours in professional responsibility and at least six credit hours of a course or courses in experiential learning. Simulation courses, law clinics, and field placements qualify as experiential learning courses as long as they involve professional skills with multiple opportunities for performance and self-evaluation. New Standard 304, Simulation Courses and Law Clinics, defines and sets out the qualification requirements for simulation courses and law clinics; field placements are addressed under modified Standard 305.

The types of “substantial opportunities” for additional professional and practical training required to be offered by law schools in former Standard 302(b) (for live-client or other real-life practice experiences, student participation in pro bono activities, and small group work) have been reworded in what is now Standard 303(b); those substantial opportunities are now listed as law clinics or field placements and pro bono legal services, including law-related public service (small group work was eliminated).

Revised Interpretation 303-3 (former Interpretation 302-10) encouraging law schools to promote pro bono opportunities for law students now references pro bono legal services in the context of the ABA Model Rules of Professional Conduct and recommends that each student provide at least 50 hours of pro bono service during law school.

Increased Flexibility: One of the primary goals of the Comprehensive Review of the Standards was to provide law schools increased flexibility.

For example, one of the most significant changes in the Standards is the elimination of the calculation of the student-faculty ratio (former Interpretations 402-1 and 402-2). The SRC was of the view that the ratio did not properly account for all students enrolled in the law school and did not properly account for the size of the faculty, given the important changes in law school curriculum, teaching methodologies, and administrative structures since these two Interpretations were adopted. There are a number of factors that can be considered in making a functional judgment about the adequacy of the teaching faculty without resorting to a student-faculty ratio calculation. The elimination of the student-faculty ratio removes from the Standards what was widely viewed as an artificial and misleading calculation that did not accurately reflect the quality of the legal education being delivered in any particular law school.

Experimentation and Innovation: Experimentation and innovation are also encouraged with the new variance process found in Standard 107.

Standard 107 distinguishes between variances in an emergency and those sought to experiment with a new or innovative program or concept. In making this distinction, experimentation is encouraged. The Council may grant a variance if it is consistent with the purpose and objectives of the Standards overall. The SRC was of the view that the new experimental variance rule has the potential to improve the delivery of legal education over time, and that the potential benefits of Council-authorized variances outweigh any potential harm.

Admission Test: Interpretation 503-3 provides a limited variance to the use of the LSAT in admissions.

There has long been a presumption that the LSAT satisfied the Standard 503 requirement of a “valid and reliable admission test.” The new Interpretation provides a safe harbor variance from the use of the LSAT in very limited circumstances and for no more than 10% of any entering class. Outside of the safe harbor, the variance process is available to clarify when any other alternative test admissions programs may be employed on an experimental basis. (See the sidebar on pages 50–52 for the circumstances of the variance.)
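The safe harbor’s conditions (set out in Interpretation 503-3 and excerpted in the sidebar on pages 50–52) lend themselves to a simple checklist. The sketch below is a hypothetical illustration of the conditions for the undergraduate-program path under subsection (a)(1); it is not an ABA tool, and the function and parameter names are invented.

    # Hypothetical checklist for Interpretation 503-3's safe harbor, subsection
    # (a)(1): the applicant must have scored at or above the 85th percentile on
    # the ACT or SAT and either ranked in the top 10% of the undergraduate class
    # or earned a cumulative GPA of 3.5 or above through six semesters.
    # Separately, no more than 10% of an entering class may be admitted this way.
    def safe_harbor_eligible(act_sat_percentile, class_rank_percentile, gpa):
        test_ok = act_sat_percentile >= 85
        record_ok = class_rank_percentile <= 10 or gpa >= 3.5
        return test_ok and record_ok

    print(safe_harbor_eligible(act_sat_percentile=90, class_rank_percentile=12, gpa=3.6))  # True
    print(safe_harbor_eligible(act_sat_percentile=80, class_rank_percentile=5, gpa=3.9))   # False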
Technological Developments: A number of changes were made in response to developments in the technological environment, two examples of which are listed below. These changes also provide greater flexibility to law schools.

Distance Education: Standard 306 now provides that a law school may grant up to a total of 15 credit hours toward the J.D. degree for distance education courses that otherwise qualify under the Standards. This is one-sixth of the typical law student’s law study and an increase from 12 credit hours in the former Standard. Importantly, while the opportunity for distance education has been increased, the rules governing creditable distance education are still properly designed to ensure that the law student’s academic experience is comparable to the traditional classroom experience. Standard 306 was generally updated for more clarity and now includes a clearer definition of a distance education course.

Library Collection: Options for the format of the law library collection have also changed considerably, with a movement from the traditional book collection to databases that are electronically available. In recognition of this development, Standard 606, which referenced a “core collection of essential materials accessible in the law library,” has been amended to require “a core collection of essential materials through ownership or reliable access.” Interpretation 606-2 further defines “reliable access” by providing guidance on ways in which to fulfill this requirement.

Additional Substantive Changes: A number of other substantive changes were made to the Standards, a few examples of which are listed below.

Granting of J.D. Degree Credit for Prior Law Study: At least two-thirds of the credits needed for the J.D. degree must now be obtained in an ABA-accredited J.D. curriculum. Specifically, new Standard 505 provides that the total credits permitted for prior law study (abroad, or non-ABA-accredited school, or LL.M. program) are limited to one-third of the credits required for the J.D. degree. This ends the practice of some schools to grant one year of credit for a law school education abroad and one year of J.D. credit for an LL.M. at their law school toward the equivalent of three years of the required J.D. credits. One or the other, but not both, may be credited toward the J.D. degree.

Student Support Services: Another substantive change was the modification of Standard 508 to provide that debt counseling is a mandatory function for student support services. While most law schools have long provided this counseling, it is now no longer a matter of discretion to do so.

Qualifications for Admission to the Bar: The substance of Standard 504 was changed to specify that the requirement to advise students of the “character, fitness, and other qualifications for admission to the bar in every U.S. jurisdiction” must be fulfilled by including a statement in its application for admission and on its website. The revised Standard provides explicit language to be used for the statement (see the sidebar on pages 50–52).

Increased Objectivity: The Standards were modified in a number of instances to be more objective.

An important example is found in Standard 101, Basic Requirements for Approval. Specifically, Standard 101(a) previously required that an approved law school “shall demonstrate that its program is consistent with sound legal education principles. It does so by establishing that it is being operated in compliance with the Standards.” That language was changed to make the Standard more objective: an approved law school “shall demonstrate that it is being operated in compliance with the Standards.” Changes such as this were made throughout the Standards.

Requirements Regarding Policies: Where law schools are required to have policies, there is a new requirement that they must adopt, publish, and adhere to those policies.

For example, Standard 303 previously required a law school to “have and adhere to sound academic standards.” The revised Standard, now Standard 308, requires law schools to “adopt, publish, and adhere to sound academic standards.” This “adopt, publish, and adhere to” language is used in several places throughout the revised Standards and was intended to achieve more objectivity as well as to increase transparency.

Reporting Requirements: The Standards were revised to highlight reporting requirements.

For example, old Interpretation 101-1 covering information that must be furnished by law schools to the Accreditation Committee and the Council was upgraded in importance by moving it into new Standard 104, which provides that the information provided by a law school must be “complete, accurate, and not misleading and must be submitted in the form, manner, and time frame specified by the Council.” Moving this requirement into the Standard highlights the importance of providing accurate information to the Council.

Conforming to Department of Education Requirements: Some changes in the Standards were required by U.S. Department of Education regulations.

For example, new Standard 310 uses the U.S. Department of Education definition of a credit hour: 50 minutes of classroom or direct faculty instruction plus 120 minutes of out-of-class work per week for 15 weeks (including one week for a final exam). In other words, a total of 170 minutes per week for 15 weeks of instruction (including one week of exams) qualify for one academic credit. The Standard also provides some alternate ways of determining the time; it refers to an “equivalent amount of work over a different amount of time.” This represents a shift from the use of minutes to the use of the concept of a credit hour to describe the various curricular requirements of the Standards.
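As a quick check of that arithmetic, the sketch below simply totals the weekly minutes in the Department of Education definition over a 15-week term. The figures come directly from the text above; the calculation is offered only as an illustration.

    # 50 minutes of classroom or direct faculty instruction plus 120 minutes of
    # out-of-class work per week, over 15 weeks (including one exam week),
    # equals one academic credit.
    CLASSROOM_MIN_PER_WEEK = 50
    OUT_OF_CLASS_MIN_PER_WEEK = 120
    WEEKS = 15

    minutes_per_week = CLASSROOM_MIN_PER_WEEK + OUT_OF_CLASS_MIN_PER_WEEK  # 170
    total_minutes_per_credit = minutes_per_week * WEEKS                    # 2,550

    print(minutes_per_week, total_minutes_per_credit)  # 170 2550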
Required Disclosures: Revised Standard 509 builds on recent Council action strengthening the reporting requirements for consumer information.6

Law school reporting of information such as employment and conditional scholarships must now be accomplished through prescribed charts. This makes it possible for the prospective law student to achieve a reliable and accurate comparison between law schools on these important considerations. Law school reporting of other required information (such as academic requirements and transfer of credit) is not susceptible to a specific format but must be disclosed in a “readable and comprehensive manner.”

Elimination of Certain Standards and Interpretations: Several Standards and Interpretations were eliminated because they were seen as being unenforceable, unnecessary, unclear, or repetitive.

Most notable was the elimination of the 20-hour limitation on employment for full-time students (former Standard 304(f)); this was viewed as fundamentally unenforceable.

Redesignation of Certain Interpretations as Standards: The SRC identified a number of Interpretations that were, in their substance, Standards and not mere interpretations of Standards. They were, therefore, redesignated as Standards. Such redesignation was one of the most significant changes to the Standards.

For example, former Interpretation 305-4(a) provided that “[a] law school that has a field placement program shall develop, publish and communicate to students and field instructors a statement that describes the educational objectives of the program.” This is now Standard 305(f) (with minor modifications). (Other examples are listed under other categories of change in this article.)

Standards Subject to Future Evaluation

Several issues addressed by the SRC and the Council were highly controversial, and in the end no changes were made:

• Bar Passage: The SRC proposed changes to the bar passage provision (former Interpretation 301-6, regarding determination of the sufficiency of a law school’s bar passage rate in meeting the objectives of its program of legal education to prepare its students for admission to the bar). Most significant was the proposal to require a law school to report bar examination results for all its graduates known to have taken a bar examination within two calendar years from graduation. The proposals were stalled in part due to complaints from law schools that obtaining information regarding bar passage for all students was a difficult task for law schools. Former Interpretation 301-6 was moved, unchanged, to new Standard 316.

• Professional Environment: The attempt to clarify the requirements regarding tenure also failed to pass. Standard 405, Professional Environment, governs the status and security of position for law faculty. Alternative proposals to modify this Standard generated substantial criticism from law school faculty during the comment period.7 Since no proposal for change garnered the approval of a majority of the Council, current Standard 405 remains in place.

• Credit for Compensated Field Placement Programs: Another controversial issue was the prohibiting of law schools from granting credit for field placement programs for which the student receives compensation (former Interpretation 305-3, now Interpretation 305-2). Retention of this Interpretation was recommended by the Council, but it was referred back to the Council after the House of Delegates heard strong testimony for and against the provision. The House concurred in all of the proposed new Standards and Rules of Procedure with the exception of Interpretation 305-2. This Interpretation remains in place pending further review by the Council.

Several other matters raised during the Comprehensive Review will continue to be studied. For example, one remaining issue is whether certain groups currently covered by the non-discrimination Standard, such as those with disabilities or certain sexual orientation characteristics, should also be included in the Standard that requires law schools to demonstrate by concrete action a commitment to diversity and inclusion.

Conclusion

Overall, as a result of the changes to the Standards, programs of legal education in American law schools will remain rigorous, while at the same time becoming more practical and skills-focused. There will be a greater focus on outcomes (such as learning outcomes, bar exam results, and employment rates). The revised Standards also require increased reporting of consumer information for greater transparency. In

(text continues on page 53)
The Revised ABA Standards for Approval of Law Schools: An Overview of the Major Changes 49
Excerpts from the 2014–2015 Standards and Rules of Procedure for Approval of Law Schools

CHAPTER 3: PROGRAM OF LEGAL EDUCATION

Standard 301. OBJECTIVES OF PROGRAM OF LEGAL EDUCATION
(a) A law school shall maintain a rigorous program of legal education that prepares its students, upon graduation, for admission to the bar and for effective, ethical, and responsible participation as members of the legal profession.
(b) A law school shall establish and publish learning outcomes designed to achieve these objectives.

Standard 302. LEARNING OUTCOMES
A law school shall establish learning outcomes that shall, at a minimum, include competency in the following:
(a) Knowledge and understanding of substantive and procedural law;
(b) Legal analysis and reasoning, legal research, problem-solving, and written and oral communication in the legal context;
(c) Exercise of proper professional and ethical responsibilities to clients and the legal system; and
(d) Other professional skills needed for competent and ethical participation as a member of the legal profession.

Interpretation 302-1
For the purposes of Standard 302(d), other professional skills are determined by the law school and may include skills such as, interviewing, counseling, negotiation, fact development and analysis, trial practice, document drafting, conflict resolution, organization and management of legal work, collaboration, cultural competency, and self-evaluation.

Interpretation 302-2
A law school may also identify any additional learning outcomes pertinent to its program of legal education.

Standard 303. CURRICULUM
(a) A law school shall offer a curriculum that requires each student to satisfactorily complete at least the following:
(1) one course of at least two credit hours in professional responsibility that includes substantial instruction in the history, goals, structure, values, and responsibilities of the legal profession and its members;
(2) one writing experience in the first year and at least one additional writing experience after the first year, both of which are faculty supervised; and
(3) one or more experiential course(s) totaling at least six credit hours. An experiential course must be a simulation course, a law clinic, or a field placement. To satisfy this requirement, a course must be primarily experiential in nature and must:
(i) integrate doctrine, theory, skills, and legal ethics, and engage students in performance of one or more of the professional skills identified in Standard 302;
(ii) develop the concepts underlying the professional skills being taught;
(iii) provide multiple opportunities for performance; and
(iv) provide opportunities for self-evaluation.
(b) A law school shall provide substantial opportunities to students for:
(1) law clinics or field placement(s); and
(2) student participation in pro bono legal services, including law-related public service activities.
…

Interpretation 303-3
Rule 6.1 of the ABA Model Rules of Professional Conduct encourages lawyers to provide pro bono legal services primarily to persons of limited means or to organizations that serve such persons. In addition, lawyers are encouraged to provide pro bono law-related public service. In meeting the requirement of Standard 303(b)(2), law schools are encouraged to promote opportunities for law student pro bono service that incorporate the priorities established in Model Rule 6.1. In addition, law schools are encouraged to promote opportunities for law students to provide over their law school career at least 50 hours of pro bono service that complies with Standard 303(b)(2). Pro bono and public service opportunities need not be structured to accomplish any of the outcomes required by Standard 302. Standard 303(b)(2) does not preclude the inclusion of credit-granting activities within a law school’s overall program of law-related pro bono opportunities so long as law-related non-credit bearing initiatives are also part of that program.
…

Standard 304. SIMULATION COURSES AND LAW CLINICS
(a) A simulation course provides substantial experience not involving an actual client, that (1) is reasonably similar to the experience of a lawyer advising or representing a client or engaging in other lawyering tasks in a set of facts and circumstances devised or adopted by a faculty member, and (2) includes the following:
(i) direct supervision of the student’s performance by the faculty member;
(ii) opportunities for performance, feedback from a faculty member, and self-evaluation; and
(iii) a classroom instructional component.
(b) A law clinic provides substantial lawyering experience that (1) involves one or more actual clients, and (2) includes the following:
(i) advising or representing a client;
(ii) direct supervision of the student’s performance by a faculty member;
(iii) opportunities for performance, feedback from a faculty member, and self-evaluation; and
(iv) a classroom instructional component.
…

Standard 306. DISTANCE EDUCATION
(a) A distance education course is one in which students are separated from the faculty member or each other for more than one-third of the instruction and the instruction involves the use of technology to support regular and substantive interaction among students and between the students and the faculty member, either synchronously or asynchronously.
(b) Credit for a distance education course shall be awarded only if the academic content, the method of course delivery, and the method of evaluating student performance are approved as part of the school’s regular curriculum approval process.
(c) A law school shall have the technological capacity, staff, information resources, and facilities necessary to assure the educational quality of distance education.
(d) A law school may award credit for distance education and may count that credit toward the 64 credit hours of regularly scheduled classroom sessions or direct faculty instruction required by Standard 310(b) if:
(1) there is opportunity for regular and substantive interaction between faculty member and student and among students;
(2) there is regular monitoring of student effort by the faculty member and opportunity for communication about that effort; and
(3) the learning outcomes for the course are consistent with Standard 302.
(e) A law school shall not grant a student more than a total of 15 credit hours toward the J.D. degree for courses qualifying under this Standard.
(f) A law school shall not enroll a student in courses qualifying for credit under this Standard until that student has completed instruction equivalent to 28 credit hours toward the J.D. degree.
(g) A law school shall establish an effective process for verifying the identity of students taking distance education courses and that also protects student privacy. If any additional student charges are associated with verification of student identity, students must be notified at the time of registration or enrollment.

Interpretation 306-1
Technology used to support a distance education course may include, for example:
(a) The Internet;
(b) One-way and two-way transmissions through open broadcast, closed circuit, cable, microwave, broadband lines, fiber optics, satellite, or wireless communications devices;
(c) Audio and video conferencing; or
(d) Video cassettes, DVDs, and CD-ROMs, if the cassettes, DVDs, or CD-ROMs are used in a course in conjunction with any of the technologies listed in paragraphs (a) through (c).

Interpretation 306-2
Methods to verify student identity as required in Standard 306(g) include, but are not limited to (i) a secure login and pass code, (ii) proctored examinations, and (iii) other technologies and practices that are effective in verifying student identity. As part of the verification process, a law school shall verify that the student who registers for a class is the same student that participates and takes any examinations for the class.
…

Standard 314. ASSESSMENT OF STUDENT LEARNING
A law school shall utilize both formative and summative assessment methods in its curriculum to measure and improve student learning and provide meaningful feedback to students.

Interpretation 314-1
Formative assessment methods are measurements at different points during a particular course or at different points over the span of a student’s education that provide meaningful feedback to improve student learning. Summative assessment methods are measurements at the culmination of a particular course or at the culmination of any part of a student’s legal education that measure the degree of student learning.

Interpretation 314-2
A law school need not apply multiple assessment methods in any particular course. Assessment methods are likely to be different from school to school. Law schools are not required by Standard 314 to use any particular assessment method.

Standard 315. EVALUATION OF PROGRAM OF LEGAL EDUCATION, LEARNING OUTCOMES, AND ASSESSMENT METHODS
The dean and the faculty of a law school shall conduct ongoing evaluation of the law school’s program of legal education, learning outcomes, and assessment methods; and shall use the results of this evaluation to determine the degree of student attainment of competency in the learning outcomes and to make appropriate changes to improve the curriculum.

Interpretation 315-1
Examples of methods that may be used to measure the degree to which students have attained competency in the school’s student learning outcomes include review of the records the law school maintains to measure individual student achievement pursuant to Standard 314; evaluation of student learning portfolios; student evaluation of the sufficiency of their education; student performance in capstone courses or other courses that appropriately assess a variety of skills and knowledge; bar exam passage rates; placement rates; surveys of attorneys, judges, and alumni; and assessment of student performance by judges, attorneys, or law professors from other schools. The methods used to measure the degree of student achievement of learning outcomes are likely to differ from school to school and law schools are not required by this standard to use any particular methods.
…

CHAPTER 5: ADMISSIONS AND STUDENT SERVICES
…

Standard 503. ADMISSION TEST
…

Interpretation 503-3
(a) It is not a violation of this Standard for a law school to admit no more than 10% of an entering class without requiring the LSAT from:
(1) Students in an undergraduate program of the same institution as the J.D. program; and/or
(2) Students seeking the J.D. degree in combination with a degree in a different discipline.
(b) Applicants admitted under subsection (a) must meet the following conditions:
(1) Scored at or above the 85th percentile on the ACT or SAT for purposes of subsection (a)(1), or for purposes of subsection (a)(2), scored at or above the 85th percentile on the GRE or GMAT; and
(2) Ranked in the top 10% of their undergraduate class through six semesters of academic work, or achieved a cumulative GPA of 3.5 or above through six semesters of academic work.

Standard 504. QUALIFICATIONS FOR ADMISSION TO THE BAR
(a) A law school shall include the following statement in its application for admission and on its website:
In addition to a bar examination, there are character, fitness, and other qualifications for admission to the bar in every U.S. jurisdiction. Applicants are encouraged to determine the requirements for any jurisdiction in which they intend to seek admission by contacting the jurisdiction. Addresses for all relevant agencies are available through the National Conference of Bar Examiners.
(b) The law school shall, as soon after matriculation as is practicable, take additional steps to apprise entering students of the importance of determining the applicable character, fitness, and other requirements for admission to the bar in each jurisdiction in which they intend to seek admission to the bar.

Standard 505. GRANTING OF J.D. DEGREE CREDIT FOR PRIOR LAW STUDY
(a) A law school may admit a student and grant credit for courses completed at another law school approved by the Council if the courses were undertaken as a J.D. degree student.
(b) A law school may admit a student and grant credit for courses completed at a law school in the United States that is not approved by the Council if graduates of the law school are permitted to sit for the bar examination in the jurisdiction in which the school is located, provided that:
(1) the courses were undertaken as a J.D. degree student; and
(2) the law school would have granted credit toward satisfaction of J.D. degree requirements if earned at the admitting school.
(c) A law school may admit a student and grant credit for courses completed at a law school outside the United States if the admitting law school would have granted credit towards satisfaction of J.D. degree requirements if earned at the admitting school.
(d) A law school may grant credit toward a J.D. degree to a graduate of a law school in a country outside the United States for credit hours earned in an LL.M. or other post-J.D. program it offers if:
(1) that study led to successful completion of a J.D. degree course or courses while the student was enrolled in a post-J.D. degree law program; and
(2) the law school has a grading system for LL.M. students in J.D. courses that is comparable to the grading system for J.D. degree students in the course.
(e) A law school that grants credit as provided in Standard 505(a) through (d) may award a J.D. degree to a student who successfully completes a course of study that satisfies the requirements of Standard 311 and that meets all of the school’s requirements for the awarding of the J.D. degree.
(f) Credit hours granted pursuant to subsection (b) through (d) shall not, individually or in combination, exceed one-third of the total required by the admitting school for its J.D. degree.
…

Standard 509. REQUIRED DISCLOSURES
(a) All information that a law school reports, publicizes, or distributes shall be complete, accurate and not misleading to a reasonable law school student or applicant. A law school shall use due diligence in obtaining and verifying such information. Violations of these obligations may result in sanctions under Rule 16 of the Rules of Procedure for Approval of Law Schools.
(b) A law school shall publicly disclose on its website, in the form and manner and for the time frame designated by the Council, the following information:
(1) admissions data;
(2) tuition and fees, living costs, and financial aid;
(3) conditional scholarships;
(4) enrollment data, including academic, transfer, and other attrition;
(5) numbers of full-time and part-time faculty, professional librarians, and administrators;
(6) class sizes for first-year and upper-class courses; number of seminar, clinical and co-curricular offerings;
(7) employment outcomes; and
(8) bar passage data.
(c) A law school shall publicly disclose on its website, in a readable and comprehensive manner, the following information on a current basis:
(1) refund policies;
(2) curricular offerings, academic calendar, and academic requirements; and
(3) policies regarding the transfer of credit earned at another institution of higher education. The law school’s transfer of credit policies must include, at a minimum:
(i) A statement of the criteria established by the law school regarding the transfer of credit earned at another institution; and
(ii) A list of institutions, if any, with which the law school has established an articulation agreement.
(d) A law school shall distribute the data required under Standard 509(b)(3) to all applicants being offered conditional scholarships at the time the scholarship offer is extended.
(e) If a law school makes a public disclosure of its status as a law school approved by the Council, it shall do so accurately and shall include the name and contact information of the Council.
…

Source: American Bar Association Section of Legal Education and Admissions to the Bar, Standards and Rules of Procedure for Approval of Law Schools (2014–2015), available at http://www.americanbar.org/groups/legal_education/resources/standards.html.
addition, the revisions address many of the changes that have occurred in legal education since the last Comprehensive Review. Finally, the revisions respond to changes and requirements in the U.S. Department of Education regulations, streamline the sabbatical review process, strengthen curricular requirements, and strengthen the reporting requirements for consumer information.

It should be noted that the Comprehensive Review occurred during a period of dramatic change in the legal profession and legal education—a transition from high enrollments and bountiful employment opportunities to reduced enrollments and a contraction of the job market. The Standards, substantially improved against the backdrop of these stresses and strains on the legal profession and legal education, are destined to strengthen the quality of American legal education as we go forward.

Notes

1. Editor’s Note: For a summary of the Standards review process, the goals of accreditation, and critical issues encompassed in the current Comprehensive Review, see Donald J. Polden, Comprehensive Review of American Bar Association Law School Accreditation Policies and Procedures: A Summary, 79(1) The Bar Examiner 42–49 (February 2010).

2. The Report of the Accreditation Policy Task Force is available at http://www.americanbar.org/content/dam/aba/migrated/legaled/actaskforce/2007_05_29_report_accreditation_task_force.authcheckdam.pdf.

3. Information about the charges of these three committees, as well as their final reports, is available on the Special Committees Report web page of the ABA Section of Legal Education and Admissions to the Bar, http://www.americanbar.org/groups/legal_education/committees/standards_review/comp_review_archive/special_committee_reports.html (last visited Feb. 13, 2014).

4. For details of the transition and implementation plan, see Transition to and Implementation of the New Standards and Rules of Procedure for Approval of Law Schools, August 13, 2014, available at http://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/governancedocuments/2014_august_transition_and_implementation_of_new_aba_standards_and_rules.authcheckdam.pdf.

5. The 2014–2015 Standards and Rules of Procedure for Approval of Law Schools are available on the Standards web page of the ABA Section of Legal Education and Admissions to the Bar, http://www.americanbar.org/groups/legal_education/resources/standards.html. This web page also includes an overview and detailed explanation of the changes to the Standards, as well as a redline version of the revised Standards.

6. The major substantive revisions to Standard 509 went into effect in August 2013.

7. For a summary of the proposals to modify Standard 405, see American Bar Association Section of Legal Education and Admissions to the Bar, Explanation of Changes, available at http://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/council_reports_and_resolutions/201408_explanation_changes.authcheckdam.pdf.

Jeffrey E. Lewis is Dean Emeritus and Professor of Law at the Saint Louis University School of Law, where he served as dean from 1999 to 2010. He served on the law faculty at the University of Florida College of Law from 1972 to 1999, and during his tenure at the University of Florida he served as associate dean for seven years and dean for eight years. Dean Lewis served as chair of the American Bar Association Standards Review Committee from August 2011 to August 2014. He has served on the Council of the ABA Section of Legal Education and Admissions to the Bar; he has also chaired the ABA Accreditation Committee, served on the Accreditation Committee of the Association of American Law Schools, and chaired or served as a member of over 20 ABA/AALS site evaluation teams.
The Testing Column
Essay Grading Fundamentals
by Judith A. Gundersen
As I write this column, bar exam graders across the country are in some stage of grading essays and performance tests. Every U.S. jurisdiction is responsible for grading the written component of its bar examination—whether the written component consists of the Multistate Essay Examination (MEE), the Multistate Performance Test (MPT), jurisdiction-drafted questions, or some combination of two or all three. Grading the written portion of the bar examination is a painstaking process that accounts for at least half of an examinee’s grade—thus a significant component of the overall bar exam score. This column focuses on some essay (and performance test) grading fundamentals: rank-ordering, calibration, and taking into account an examinee’s ability to communicate in writing. Adhering to these fundamentals helps ensure fair and reliable essay grading procedures and score results.

First, a few words are in order about the role that equating plays in the overall context of grading. As stated many times in this column and elsewhere in the Bar Examiner, the purpose of the bar examination is to determine minimal competence to be licensed as an attorney. Both fairness to examinees and protection of the public dictate that the bar exam be reliable and valid across test forms and administrations. The Multistate Bar Examination (MBE) is the only part of the bar exam that is equated across all administrations. This is done by embedding a mini test form within the MBE with known statistical properties that is then compared between the control group and current test takers. This equating process ensures comparable score meaning across MBE administrations.

But what about equating the MEE and the MPT? These tests cannot be equated in the same sense that the MBE is equated because their questions are too memorable to be reused or embedded in an exam—examinees spend 30 minutes on a given MEE question and 90 minutes on a given MPT question, as opposed to just a few minutes on an MBE question. Any examinee who had seen an MEE or MPT question before would remember it and have an advantage over an examinee who had never seen the question. (Once an MEE or MPT is administered, none of its questions is ever used again on another test form. Retired questions are made available for purchase or free of charge on our website as study aids or for use in law schools.)

Because MEEs and MPTs cannot be equated in the same way as the MBE, but are a critical piece of the bar exam score, NCBE recommends the best practice of scaling the written scores to the MBE: raw scores earned on each MEE and MPT question are added up and then scaled to the MBE. This in effect puts the overall score earned on the written portion of the exam on the MBE scaled score distribution, thereby using the equating power of the MBE to give comparability to the written portion. Scaling preserves the important rank-ordering judgments that graders have made on answers.1
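To make the arithmetic of that step concrete, the sketch below shows one common form of linear scaling, in which the mean and standard deviation of examinees’ total written raw scores are matched to those of the same examinees’ MBE scaled scores, so that rank order is preserved while the scores are expressed in MBE units. It is an illustration only: the function name and the sample score lists are hypothetical, and it is not a description of NCBE’s operational procedure.

```python
# A minimal, hypothetical sketch of linear (mean/standard-deviation) scaling
# of written raw scores onto the MBE scale for one administration.
# Illustrative only; not NCBE's operational procedure.

def scale_written_to_mbe(written_raw, mbe_scaled):
    """Express each examinee's total written raw score in MBE scaled-score
    units by matching the group's mean and standard deviation."""
    n = len(written_raw)
    w_mean = sum(written_raw) / n
    m_mean = sum(mbe_scaled) / n
    w_sd = (sum((w - w_mean) ** 2 for w in written_raw) / n) ** 0.5
    m_sd = (sum((m - m_mean) ** 2 for m in mbe_scaled) / n) ** 0.5
    # A linear, increasing transformation: relative standing (rank order)
    # among examinees is unchanged.
    return [m_mean + m_sd * (w - w_mean) / w_sd for w in written_raw]

# Hypothetical data: summed MEE/MPT raw scores and the same examinees' MBE
# scaled scores for one administration.
written_raw = [38, 42, 45, 50, 55, 61]
mbe_scaled = [128.1, 131.4, 135.0, 139.7, 144.2, 151.6]
print([round(s, 1) for s in scale_written_to_mbe(written_raw, mbe_scaled)])
```

Because the transformation is linear and increasing, an examinee who outranks another on the written raw score still outranks that examinee after scaling, which is the point made above about preserving graders’ rank-ordering judgments.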
Rank-Ordering Papers

MEE and MPT questions are developed to test content and skills set forth in the MEE subject matter outline and the MPT list of skills tested. Within each MEE and MPT, multiple issues are raised that might be addressed by examinees—some issues are easier to identify and some are subtler. Multiple issues help graders make meaningful grading distinctions among papers. Some papers should get high scores, some average scores, and some lower scores, regardless of what score scale a jurisdiction uses (1–5, 1–6, 1–10, etc.), and regardless of whether, taken as a whole, all papers are strong or weak. What matters is rank-ordering among papers—relative grading.

Rank-ordering works best if distinctions are made between papers and scores are spread out over the whole score scale (whatever that may be). For example, if a jurisdiction uses a 1–6 scale (a “1” paper being a very poor answer relative to the other answers in the jurisdiction, and a “6” paper being an excellent answer relative to the other answers in the jurisdiction), it is important that graders assign 1’s, 2’s, 3’s, 4’s, 5’s, and 6’s, not just compress all of their grades between 3’s and 4’s. Were a grader to give every answer in her group of papers a “3,” for example, the question would, in effect, be thrown out—it would have no impact on examinees’ scores. It would be like keying all answers correct in a multiple-choice question. Similarly, but to a lesser degree, bunching all grades between just two of the points on a 6-point scale would diminish the relative value that this particular question would have on an examinee’s overall written score.

To prepare graders, NCBE provides detailed grading materials, which are subjected to review by outside content experts, editing by drafting committees, and proofing and cite-checking by NCBE lawyer-editors. User jurisdictions also have the option of reviewing the questions and grading materials before administration. NCBE hosts an MEE/MPT grading workshop after each administration, with three participation options for graders: in person, by conference call, or via on-demand streaming. Finally, the grading materials are included in MEE and MPT study aids, so prospective examinees can become familiar with the questions and what graders are looking for in examinee answers.

Rank-ordering papers is harder when a grader perceives that the answers are all very good or all very poor. But meaningful distinctions between papers can and should be made no matter whether a paper evidences a weak or strong performance. That is, a grader should take into account an examinee’s use of the facts, the quality and depth of the examinee’s legal analysis, the examinee’s issue-spotting ability, and the quality of the examinee’s writing (more on this later). Considering each paper as a whole, informed by the grading materials, rank-ordering papers using the entire score scale will best ensure that examinees’ written scores reflect their performance on this portion of the exam.

Achieving and Maintaining Grading Consistency: Calibration

Whether a grader grades all the answers to a certain question himself or with other graders, getting and staying calibrated is critical. Calibration is the process by which a grader or group of graders develops coherent and identifiable grading judgments so that the rank-ordering is consistent throughout the grading process and across multiple graders. It shouldn’t matter to an examinee if her answer is paper number 1 for grader A or paper number 233 for grader B.

To calibrate, graders begin by reading a set of 10 or more common papers and assigning tentative grades. Multiple graders compare their grades on the sample group and see where they need to resolve grading judgments. Once any differences between grading judgments are worked out, then another sample group of 10 papers should be read to see if the graders are in alignment. Again, grading differences on this second set of sample papers must be resolved. Finally, a third set of 10 common papers might be necessary to ensure that graders are grading consistently. If the total number of examinees or papers to be graded in an administration reaches the hundreds or thousands, it might be a good idea to embed a few common papers among multiple graders, those papers then being checked to ensure that consistency is maintained over the course of the grading process.
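A rough way to picture what staying calibrated means is to tabulate how closely graders agree on a shared calibration set before they discuss differences. The sketch below is purely illustrative; the grader names, grades, and agreement thresholds are invented, and the column describes calibration as a discussion-driven human process rather than an automated check.

```python
# Hypothetical agreement check on a shared calibration set of 10 papers
# graded on a 1-6 scale. All names, grades, and thresholds are invented.
from itertools import combinations

calibration_set = {
    "grader_A": [3, 4, 2, 5, 3, 1, 6, 4, 2, 3],
    "grader_B": [3, 4, 3, 5, 3, 2, 6, 4, 2, 4],
    "grader_C": [2, 3, 2, 4, 2, 1, 5, 3, 1, 3],  # consistently about a point low
}

def agreement(x, y):
    """Return (exact agreement rate, within-one-point agreement rate)."""
    n = len(x)
    exact = sum(a == b for a, b in zip(x, y)) / n
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(x, y)) / n
    return exact, adjacent

for (name_a, grades_a), (name_b, grades_b) in combinations(calibration_set.items(), 2):
    exact, adjacent = agreement(grades_a, grades_b)
    # Flag pairs whose tentative grades diverge enough to warrant discussion
    # before the next set of common papers is read.
    flag = "  <-- resolve before the next sample set" if exact < 0.5 or adjacent < 0.9 else ""
    print(f"{name_a} vs {name_b}: exact {exact:.0%}, within one point {adjacent:.0%}{flag}")
```

In this invented example, grader_C rank-orders the papers much like the other two but sits about a point lower on the scale, which is exactly the kind of systematic difference the first calibration discussion is meant to surface and resolve.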
Single graders should also start with a defined set of papers to gauge what the pool of answers will look like and assign tentative grades until they’ve seen more papers. Because grading is relative and papers are to be rank-ordered, context is everything. Early grades will probably need rechecking as more answers are read. Some graders find it helpful to keep benchmark papers—representative papers for each point on the score scale—to help re-orient themselves after a grading break. It may also be helpful for a grader or graders to try to put papers in buckets or piles representing each point on the score scale to ensure that they are, in fact, using the whole score scale and not bunching all answers between two points on their score scale.
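The idea of buckets or piles translates naturally into a simple tally of how often each point on the scale has been used. The sketch below is hypothetical; the grades are invented and the check is only the sort of aid a grader might improvise, but it shows how quickly bunching becomes visible.

```python
# Hypothetical "bucket" tally for one grader's assigned grades on a 1-6 scale.
# The grades below are invented for illustration.
from collections import Counter

SCALE = range(1, 7)  # the 1-6 example scale used above

def scale_usage(grades):
    """Count papers at each score point, including points never used."""
    counts = Counter(grades)
    return {point: counts.get(point, 0) for point in SCALE}

grades_assigned = [3, 4, 3, 3, 4, 3, 4, 4, 3, 4, 3, 4, 5, 3, 4]
usage = scale_usage(grades_assigned)
for point, count in usage.items():
    print(f"score {point}: {'#' * count} ({count})")

unused = [point for point, count in usage.items() if count == 0]
if unused:
    print(f"Unused score points {unused}: grades may be bunched in the middle of the scale.")
```

Here the grader has effectively compressed a 6-point scale into 3’s, 4’s, and a single 5, the kind of compression the column warns will diminish a question’s contribution to examinees’ relative standing.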
Taking into Account Examinees’ Ability to Communicate in Writing

One way for graders to make distinctions between papers is to take into consideration examinees’ ability to communicate in writing—this is a construct of the MEE and MPT and is set forth in the purpose statement of the MEE and the skills tested in the MPT. A lawyer’s ability to communicate in writing is a critical lawyering skill. NCBE’s 2012 job analysis confirmed this—100% of all respondents to the survey we distributed to new lawyers stated that the ability to communicate in writing was “extremely significant” to their jobs as lawyers.2 If writing didn’t matter, then the bar exam could consist solely of multiple-choice questions—which would save a lot of time and effort. But it does matter.

Demonstrating the ability to communicate in writing does not mean using legalese or jargon. Rather, it means writing a well-organized paper that demonstrates an understanding of the law and how to apply it to the facts in the problem. It means, as stated in the MEE instructions, “show[ing] . . . the reasoning by which you arrive at your conclusions.”

The MPT has more specific criteria for assessing the quality of an examinee’s writing than the MEE, as MPT examinees are instructed on the proper tone for the assignment (e.g., persuasive, objective), the proper audience (e.g., court, client, opposing counsel), and sometimes the desired formatting (e.g., the use of headings, statement of facts, case citations). Thus, in general, it can be easier for graders to make distinctions on the quality of writing when grading MPTs. However, graders can make a meaningful assessment of writing ability on both the MPT and the MEE.

Conclusion

Graders have an important job, and they know it. I’ve met hundreds of graders over the years, and they all strive to make consistent and fair decisions, and take their jobs very seriously. Employing the practices and principles of rank-ordering, achieving and maintaining calibration, and assessing written communication ensures a fair and reliable process for grading the all-important written portion of the bar examination.

Notes

1. For a detailed explanation about scaling, see the December 2014 Testing Column: Mark A. Albanese, Ph.D., The Testing Column: Scaling: It’s Not Just for Fish or Mountains, 83(4) The Bar Examiner 50–56 (December 2014).

2. The NCBE job analysis is part of a content validity study conducted by NCBE in conjunction with its testing program. The job analysis was carried out through a survey distributed to a diverse group of lawyers from across the country who had been in practice from one to three years. Its goal was to determine what new lawyers do, and what knowledge, skills, and abilities newly licensed lawyers believe that they need to carry out their work. The job analysis, entitled A Study of the Newly Licensed Lawyer, is available on the NCBE website at http://www.ncbex.org/publications/ncbe-job-analysis/.
Judith A. Gundersen is the Director of Test Operations for the
National Conference of Bar Examiners.
Litigation Update
by Fred P. Parker III and Jessica Glad
Cases Reported
ABA-Approved Law Schools
Attorney from non-ABA-approved law school
In the Matter of Odegua J. Irivbogbe, Applicant to the West Virginia Board of
Law Examiners, 2014 WL 2404312 (WV)
Attorney Discipline
Unauthorized practice of law
In re Seth Cortigene and Newton B. Schwartz, Sr., 144 So. 3d 915 (LA 2014)
Character and Fitness
Financial irresponsibility
In the Matter of the Application of T. Z.-A. O. for Admission to the Bar of
Maryland, 441 Md. 65, 105 A.3d 492 (MD 2014)
ABA-Approved Law Schools
Attorney from non-ABA-approved law school
In the Matter of Odegua J. Irivbogbe, Applicant to the West Virginia Board
of Law Examiners, 2014 WL 2404312 (WV)
Odegua Irivbogbe graduated from the University of Lagos in Nigeria, passed the New York bar exam in 2007, and was admitted to the New York Bar in 2008. She never practiced in New York. She moved to West Virginia and filed an application in July 2012 with the West Virginia Board seeking admission by examination under the Board’s Rule 3.0. The Board denied the application based on Irivbogbe’s failure to meet the educational requirements of Rules 2.0 and 3.0. The Board found that Rule 3.0(b)(4) requires that a graduate of a foreign law school where the common law of England exists as the basis for its jurisprudence must successfully complete 30 basic credit hours at an ABA-approved law school in order to sit for the West Virginia bar exam, and Irivbogbe had not completed these credits.
Irivbogbe then requested an administrative hearing, which was held in February 2013. The hearing examiner concluded that the Board’s decision must be affirmed. The Board reviewed this report in June 2013 and voted to deny Irivbogbe’s application based on her failure to meet the educational requirements in Rules 2.0 and 3.0 because the Rules do not allow the Board any discretion to waive or modify these requirements.

The matter was appealed to the West Virginia Supreme Court. The Court reviewed the record de novo with regard to questions of law, the application of the law to the facts, and whether the applicant should or should not be admitted to the practice of law in West Virginia.

Irivbogbe argued that her legal education and the study materials she had used in preparing for the New York bar exam were equivalent to an education received at an ABA-approved law school and that she should be allowed to sit for the West Virginia bar exam. She further asserted that the Board had misapplied Rule 3.0(b)(4) and that she should be allowed to sit pursuant to Rule 3.0(b)(1), which applies to graduates of non-ABA-approved law schools who have passed the bar exam in another state and have been admitted in that state. This argument had been rejected by the Board because that rule applies only to graduates of U.S. law schools.

The Court stated that the main issue on appeal was whether the Board had correctly concluded that Rule 3.0(b)(4) applies to all foreign law school graduates. The Court agreed with the Board that all foreign-educated applicants must successfully complete a minimum of 30 credit hours of basic courses selected from certain listed areas of law. Since Irivbogbe had not done this, she was not currently eligible for admission to practice in West Virginia by examination.

Irivbogbe also argued that a denial would violate her right to equal protection. The Board had found that Irivbogbe was not similarly situated to applicants who were educated at ABA-approved law schools. The Court agreed and affirmed the Board’s decision.
Attorney Discipline
Unauthorized practice of law
In re Seth Cortigene and Newton B. Schwartz, Sr., 144 So. 3d 915 (LA 2014)
In a matter of first impression, the Supreme Court of Louisiana considered whether it has the authority to impose discipline on a lawyer not admitted to the bar of Louisiana.

This case arose from consolidated disciplinary proceedings resulting from formal charges filed by the Office of Disciplinary Counsel (“ODC”) against attorneys Seth Cortigene and Newton B. Schwartz, Sr. Cortigene was licensed to practice law in Texas and Louisiana but was at the time ineligible to practice due to his failure to comply with his professional obligations. Schwartz was licensed to practice law in Texas and Pennsylvania, but was not licensed in Louisiana. The ODC charged Schwartz with engaging in the unauthorized practice of law by appearing at and participating in a Louisiana deposition. The ODC charged Cortigene with facilitating Schwartz’s misconduct and failing to report it to disciplinary authorities. The matter proceeded to a hearing on the formal charges filed against both attorneys.
The hearing committee concluded that both attorneys had violated the Rules of Professional Conduct as charged. It recommended that Cortigene be disbarred. However, because Schwartz was not admitted to the Louisiana bar, the hearing committee declined to recommend disbarment for him, and instead recommended that Schwartz be publicly reprimanded and permanently enjoined from the practice of law in the state.

The disciplinary board adopted the hearing committee’s factual findings, its legal conclusions, and most of its sanctions recommendations. It agreed that disbarment would be the appropriate sanction for Schwartz’s misconduct, but reasoned that disbarment was inapplicable to an attorney not admitted to the Louisiana bar. Instead, the board concluded that the only sanction applicable to a non-Louisiana attorney would be a public reprimand. The ODC appealed to the Supreme Court of Louisiana, asserting that the board had erred in concluding that Schwartz could not be disbarred in Louisiana.

The Court conducted an independent review of the record to determine whether the alleged misconduct was proven by clear and convincing evidence. Based on its review, the Court concluded that Schwartz had engaged in the unauthorized practice of law. It also found that Cortigene had facilitated Schwartz’s unauthorized practice and failed to report it to disciplinary authorities. The Court disbarred Cortigene, but concluded that the appropriate sanction for Schwartz’s conduct if he were a member of the state bar would be a three-year suspension.

The Court then answered in the affirmative what it considered to be a res nova issue, that is, whether it has the authority to impose discipline on a lawyer not admitted to the bar of Louisiana. Pursuant to its plenary power to define and regulate all facets of the practice of law, the Court explained, “we have the right to fashion and impose any sanction which we find is necessary and appropriate to regulate the practice of law and protect the citizens of this state. This power is broad enough to encompass persons not admitted to the bar who attempt to practice law in this state.” Therefore, the Court concluded, “in the exercise of our plenary authority, we may enjoin a non-Louisiana lawyer from seeking the benefits of a full or limited admission to practice in this state.”

Accordingly, the Court adjudged Schwartz guilty of conduct that would warrant a three-year suspension if he were a member of the Louisiana bar. However, recognizing that he was not a member of the state bar, the Court enjoined Schwartz for a period of three years from seeking admission to practice law in Louisiana on either a permanent or temporary basis.
Character and Fitness
Financial irresponsibility
In the Matter of the Application of T. Z.-A. O. for Admission to the Bar of Maryland,
441 Md. 65, 105 A.3d 492 (MD 2014)
In May 2012, T. Z.-A. O. (“Applicant”) filed an application for admission to the Maryland Bar with the State Board of Law Examiners, and in June 2012 the Board sent the application to the Character Committee for the Fifth Appellate Circuit (“the Committee”). Applicant passed the July 2012 Maryland bar exam; however, based on the results of the Committee’s investigation, in June 2013 a
three-member panel of the Committee held a hearing to determine whether Applicant possessed the good moral character and fitness necessary for admission to the Maryland Bar. The panel found the following facts:

• In May 1996, Applicant was arrested for public indecency in Columbus, Ohio. He pled guilty in July 1996 and was sentenced to 30 days in jail with 1 day credited toward the sentence and the other 29 days suspended.

• In 2004 Applicant applied to and was accepted at Tulane Law School. Question 28 on the law school application asked about criminal charges and convictions. Applicant answered in the negative and certified that his answers were true.

• Applicant disclosed the 1996 arrest and conviction when he applied to the Florida Bar, and the Florida Board discovered that Applicant had failed to disclose the 1996 matters to Tulane and contacted Applicant, who then notified Tulane about his failure to disclose.

• In May 2004, shortly after applying to law school, Applicant filed a petition for Chapter 7 bankruptcy and at the hearing admitted that his financial activities had been irresponsible and included the use of multiple credit cards when he was unemployed with no means to pay the balances. In August 2004 $58,000 in debt was discharged.

• In August 2006, Applicant purchased a new Honda vehicle. Applicant stated that he had been planning to purchase a used car, but that the sales personnel persuaded him to test-drive new vehicles and prepared the car loan application as well as additional contracts and agreements related to the car purchase. The loan application did not mention the 2004 bankruptcy and falsely stated that Applicant owned a home, made no rental or mortgage payments, and earned $3,500 a month. Applicant certified that all the information on the application was true, correct, and complete. Applicant later claimed that the sales representative must have inserted the false information about home ownership and income into the car loan application. Applicant knew that the interest rate was 14.95% and that his monthly payments would be $674.70 when he took possession of the vehicle.

• In fall 2007, Applicant stopped making monthly car payments because he claimed that the contract contained irregularities; however, he continued using the car and did not turn it in until February 2008. At that time there was a $19,000 arrearage on the car loan. Applicant testified that he litigated against the finance company and the company forgave the outstanding arrearage.

• At the time of the hearing, Applicant was self-employed and performed research and writing for a law firm in Florida. In 2012 Applicant had earned $24,000, and from January to June 2013 he had earned between $18,000 and $19,000. Applicant admitted that he had $220,000 in private and federal student loan debt, that he was making only minimum payments on the private loans, and that the federal loans were in forbearance or deferred.

The Committee concluded that Applicant had not shown financial responsibility. He had continued to accumulate debt with no plans to repay it. Since
his bankruptcy, in which he had discharged $58,000 in debt, he had accumulated nearly $200,000 in student and consumer loans and had discharged the $19,000 vehicle loan. He also had not acknowledged the untruthful information on his loan application, and he had stated that his application to the Florida Bar was denied because of illegal behavior and financial irresponsibility. The Committee recommended that Applicant’s application be denied.

In December 2013, the Board conducted a hearing and at its conclusion recommended that Applicant be denied admission to the Maryland Bar, finding that after his bankruptcy he had established numerous consumer credit accounts and that he had not taken his credit obligations seriously until it appeared that they might keep him from being admitted in Maryland. Applicant was 31 years old and a college graduate when he entered law school, was 32 and a law student when he signed the car loan papers, was 33 when he applied to be admitted in Florida, and was 38 when his Maryland application was accepted. He seemed to treat these incidents of financial irresponsibility as youthful indiscretions. The Board found that “[h]e [had] shown no commitment to honesty and financial responsibility.”

The matter was then reconsidered by the Court of Appeals of Maryland, which under the Court’s rules is “charged with the responsibility to conduct an independent evaluation of the applicant’s moral character based upon testimony and evidence submitted before the Committee and the Board.” The Court noted that in bar admission cases failure to honor financial obligations and lack of candor are serious matters that do not reflect well upon an applicant’s fitness to practice law, adding that absolute candor is a requisite of admission in Maryland.

The Court again concluded that Applicant’s “inability to honor financial obligations and to be financially responsible, as well as (his) lack of candor, reflect that he does not presently possess the moral character and fitness necessary to practice law” in Maryland. The Court stated that Applicant’s character and fitness to practice were shown to be questionable by his lack of candor in connection with the 2006 car loan application containing false financial information. The Court added that rather than accepting full responsibility for this, he blamed the car sales representative. Stating that both the Committee and the Board had recommended denial of admission, the Court again agreed, concluded that Applicant had failed to meet the burden of proof, and denied his application for admission to the Maryland Bar. The Court also denied Applicant’s Motion for Reconsideration of the Court’s April 2014 denial of Applicant’s application.
Fred P. Parker III is Executive Director Emeritus of the Board of
Law Examiners of the State of North Carolina.
Jessica Glad is Staff Attorney for the National Conference of Bar
Examiners.