A literature review of some studies about the bar exam

In recent weeks, there’s been a surge in assertions that the bar exam does “nothing,” is “pointless,” is “worthless,” and so on. Take, for instance, this op-ed from an Oregon appellate judge, which calls the claim that the bar exam protects the public “completely unfounded,” asserts that the bar “does not function to protect the public,” and adds, “I have never heard anyone make a cogent connection between the types of lawyer conduct that harms the public and the screening that occurs via the bar examination.”

These assertions are wrong. I thought I’d start to compile a literature review of studies about the bar exam (one that assuredly can and will be supplemented!).

Here, I seek to compile two sets of studies that may be relevant. (I only include a brief excerpt from each; please do read the entire study for more about the size of the sample, the strength of the inferences, the location where the study took place, and so on!) The first are studies that examine the relationship between bar exam performance and attorney discipline. The second are studies that examine the relationship between bar exam performance and law school grades.

One can challenge these studies, of course (e.g., the strength of the evidence, how significant the effect in any given study is, etc.). And one can still conclude that the bar exam is too costly, too high a barrier to the practice of law, and so on. And one could accept these relationships yet raise separate concerns, like the logistical problems of administering a bar exam during the coronavirus pandemic. But these are, of course, distinct arguments, and I’ve found that arguments about the bar exam are too quickly and easily conflated.

For instance, one could accept that there exists a relationship between bar exam performance and attorney discipline—lower bar exam scores correlate with higher ultimate attorney discipline rates, and failing the bar exam at least once correlates with higher ultimate attorney discipline rates. But one could still argue that the risk is too low, or too distantly removed; or, that the increased risk should lead bar licensing authorities to use more powerful tools to prevent that discipline later through better mentoring and oversight. Indeed, in the article I co-authored with Professor Rob Anderson, we identified several of these points—do read the piece!

Additionally, the relationship (at least, a moderate relationship) between law school grades and bar exam performance, I think, tends to undermine the claim that the bar exam is “meaningless”—unless law schools are willing to concede that their own grading is meaningless and employers are willing to concede that reliance on grading is meaningless. (Maybe some will!) While the bar exam is a different test than law school exams (and both are different from the actual practice of law), they do, I think, all tend to test legal analysis, albeit in varied ways. One could still, of course, critique the bar exam as excessively reliant on rote memorization, too costly an investment for law school graduates, and so on. (And while one might ask why the bar exam continues to require graduates of ABA-accredited schools to pass the exam, it’s principally because licensing authorities distrust law schools to maintain adequate admissions, retention, and graduation standards.)

Two more things to consider. The first is how to examine the bar exam: one could look at the binary pass-fail outcome, or one could examine a given bar exam score (e.g., distinguishing a score of 145 from a score of 130). The second is Type I and Type II errors: for instance, the bar admits some people who may end up facing discipline one day, and it declines to admit some people who might have gone their entire careers without facing discipline.
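To make those two error types concrete, here is a minimal sketch with purely hypothetical, invented counts (no real data; the numbers exist only for illustration):

```python
# Purely hypothetical counts, to illustrate the two error types.
# Treat "fail" as the screen's negative decision and "disciplined"
# as the outcome the screen is supposed to predict.
cohort = {
    ("pass", "disciplined"): 30,   # admitted, later disciplined (screen miss)
    ("pass", "clean"): 900,        # admitted, never disciplined
    ("fail", "disciplined"): 10,   # kept out, would have been disciplined
    ("fail", "clean"): 60,         # kept out, would have stayed clean (over-block)
}

total = sum(cohort.values())
misses = cohort[("pass", "disciplined")]
over_blocks = cohort[("fail", "clean")]

print(f"Admitted but later disciplined: {misses / total:.1%} of the cohort")
print(f"Kept out but never would have been disciplined: {over_blocks / total:.1%}")
```

Both error rates can be nonzero at once, which is why “the exam admits some future-disciplined lawyers” and “the exam excludes some who would never be disciplined” are not contradictory claims.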

There is much to chew on when examining the relationship between the bar exam and other items, but these, I hope, provide a fruitful starting point for discussions about the bar exam—without the heated (and erroneous) rhetoric that’s too common in these debates.

Please, of course, feel free to contact me if you find a useful study that merits inclusion!

Relationship between bar exam performance and attorney discipline

Leslie C. Levin, Christine Zozula, & Peter Siegelman, A Study of the Relationship Between Bar Admissions Data and Subsequent Lawyer Discipline, LSAC (2013) [archive]

[Ed.: In a Connecticut study, Model 4 drops law school rank and grades as variables, revealing that failing the bar exam is a predictor of future discipline, even when many “character and fitness” variables are controlled for.]

Jeffrey S. Kinsler, Is Bar Exam Failure a Harbinger of Professional Discipline?, 91 St. John’s Law Review 883 (2018)

Using bar exam and disciplinary data from Tennessee, this Article substantiates the following theses: (1) The more times it takes a lawyer to pass the bar exam the more likely that lawyer will be disciplined for ethical violations, particularly early in the lawyer’s career; and (2) The more times it takes a lawyer to pass the bar exam the more likely that lawyer will be disciplined for lack of diligence—including non-communication—and/or incompetence.

Robert Anderson IV & Derek T. Muller, The High Cost of Lowering the Bar, 32 Georgetown Journal of Legal Ethics 307 (2019)

Using a large dataset drawn from publicly available California State Bar records, our analysis shows that bar exam score is significantly related to likelihood of State Bar discipline throughout a lawyer’s career. We investigate these claims by collecting data on disciplinary actions and disbarments among California-licensed attorneys. We find support for the assertion that attorneys with lower bar examination performance are more likely to be disciplined and disbarred than those with higher performance.

Kyle Rozema, Does the Bar Exam Protect the Public?, draft 2020

I study the effects of requiring lawyers to pass the bar exam on whether they are later publicly disciplined for misconduct. In the 1980s, four states began to require graduates from all law schools to pass the bar exam by abolishing what is known as a diploma privilege. My research design exploits these events to estimate the effect of the diploma privilege on the share of lawyers who receive public sanctions by state discipline bodies. Lawyers admitted on diploma privilege receive public sanctions at similar rates to lawyers admitted after passing a bar exam for the first decade of their careers, but small differences begin to emerge after a decade, and larger differences emerge after two decades. The estimates suggest that the diploma privilege increased the share of lawyers who received a public sanction within 25 years after bar admission from 4.5 percent to between 4.6 and 6.5 percent.

Relationship between bar exam performance and overall law school grades

Douglass Boshkoff, Phillips Cutright, & Karen Cutright, Course Selection, Student Characteristics and Bar Examination Performance: The Indiana University Law School Experience, 27 Journal of Legal Education 127 (1975)

For example, [the table] shows that 90.9% of students with a cumulative grade point average of 2.8 or higher passed the examination, while only 38.4% of those with an average of 2.0 to 2.3 were successful. The differential indicates a powerful effect of academic performance in law school. Furthermore, this differential is affected little when the other characteristics of graduates with high or low grades are considered . . .

Kristine S. Knaplund & Richard H. Sander, The Art and Science of Academic Support, 45 Journal of Legal Education 157 (1995)

UCLA students with a B+ (83) average in law school are one-tenth as likely to fail the bar exam as students with a C+ (73) average.

Linda F. Wightman, LSAC National Longitudinal Bar Passage Study (1998)

Using data from all jurisdictions combined, the logistic regression analyses showed that both adjusted LGPA and LSAT score were statistically significant factors in explaining bar examination outcomes. Another way to evaluate the utility of this model for explaining bar examination outcomes was to examine the correlation between predicted and actual outcomes. For these data, the correlation between predicted and actual pass or fail was .52. (By comparison, the mean correlation between LSAT score and first-year law school average [FYA] was .41 for law schools participating in the 1990-92 LSAC correlation studies. The multiple correlation of LSAT score and UGPA with FYA was .49 for those same schools.)

Linda Jellum & Emmeline Paulette Reeves, Cool Data on a Hot Issue: Empirical Evidence that a Law School Bar Support Program Enhances Bar Performance, 5 Nevada Law Journal 646 (2005)

From the July 1997 examination [of the Virginia bar exam] through the February 2001 examination, the passage rate for students [from Richmond Law] in the bottom half of the class was 51.3%. The fourth-quartile passage rate was 26.0%. In comparison, the passage rate for the top half of the class was 93.9%.

Michael Kane, Andrew Mroch, Douglas Ripkey, & Susan Case, Impact of the Increase in the Passing Score on the New York Bar Examination, National Conference of Bar Examiners (2006)

The high correlations between the two versions of the L-GPA and bar examination scores indicate that there is substantial overlap in what is being evaluated on the bar examination and what is being evaluated in law schools. The strong positive correlation (.63) between the 4-pt L-GPA and bar examination scores indicate that relative performance in law school (independent of the selectivity of the law school) is an important determiner of performance on the bar exam; the 4-pt L-GPA accounts for almost 40% of the variance in bar examination scores. The Index-Based L-GPA has a somewhat higher correlation with bar examination scores (.68) indicating that the strength of the relationship between grades in law school and performance on the bar examination can be enhanced by taking the selectivity of the law school into account; the Index-Based L-GPA accounts for about 47% of the variance in bar examination scores.

The bar examination scores have their highest correlation with the Index-Based L-GPA and their second-highest correlation with the 4-pt L-GPA. So it is clear that performance on the bar examination is strongly related to performance in law school. The correlation of bar examination scores with LSAT scores is fairly high, and the correlation with U-GPA, which has the lowest value of the four correlations, is also reasonably high. Note that U-GPA has a higher correlation with bar examination scores than it has with the LSAT scores. This is somewhat surprising, because the bar examination is taken three or more years after graduation from college, while the LSAT is generally taken closer to the completion of undergraduate education.
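[Ed.: A quick aside on the arithmetic in excerpts like this one: the share of variance a single predictor explains is the square of its correlation coefficient. The same arithmetic underlies the “variance shared” figures in several studies below. A minimal check:]

```python
# Variance explained by a single predictor is the squared correlation (r^2).
for label, r in [("4-pt L-GPA", 0.63), ("Index-Based L-GPA", 0.68)]:
    print(f"{label}: r = {r:.2f}, variance explained = {r * r:.0%}")

# 0.63^2 is about 40% and 0.68^2 is about 46%, in line with the "almost 40%"
# and "about 47%" figures above (the report presumably squares unrounded
# correlations before rounding).
```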

Lorenzo A. Trujillo, The Relationship between Law School and the Bar Exam: A Look at Assessment and Student Success, 78 University of Colorado Law Review 69 (2007)

Through research, surveys, and compilation of the resulting data, it became apparent that the single most important predictor of bar passage rate was a student's relative law school class rank.
. . .
This research indicates that neither the LSAT nor undergraduate GPA are as meaningful indicators of success on the bar exam as class rank, which remains the best predictor for success on the bar exam.

Douglas K. Rush & Hisako Matsuo, Does Law School Curriculum Affect Bar Examination Passage? An Empirical Analysis of Factors Related to Bar Examination Passage during the Years 2001 through 2006 at a Midwestern Law School, 57 Journal of Legal Education 224 (2007)

Table 2 demonstrates a strong association between a graduate's final class rank, by quartiles based on final LGPA, and bar examination passage. Graduates of the School of Law who ranked in the first quartile of their law school graduating class passed the bar examination at a 100 percent rate over the five year period of the study. Graduates who ranked in the second quartile of their law school graduating class passed the bar examination at a 95.6 percent rate during the same period. Graduates who ranked in the third quartile of their graduating class passed the bar examination at an 82.6 percent rate and the bar examination passage rate dropped to 49.5 percent for those graduates who ranked in the fourth quartile of their graduating class.

The association becomes even more apparent for those graduates who ranked in the bottom 10 percent of their graduating class. Those graduates passed the bar examination at a 27.6 percent rate during the five year period of the study.

Donald H. Zeigler, Joanne Ingham, & David Chang, Curriculum Design and Bar Passage: New York Law School's Experience, 59 Journal of Legal Education 393 (2010)

The bar pass rate of the bottom 10 percent of the graduating class [of New York Law School] was truly abysmal, as is the case at many law schools. The pass rate was sometimes in the single digits and never more than 20 percent.

Derek Alphran, Tanya Washington, & Vincent Eagan, Yes We Can, Pass the Bar: University of the District of Columbia, David A. Clarke School of Law Bar Passage Initiatives and Bar Pass Rates, From the Titanic to the Queen Mary!, 14 UDC/DCSL Law Review 9 (2011)

As shown in Table 1 there was a difference in bar passage rates on first attempt of 36.8% between students in the upper half of the law school GPA distribution and the bottom half of the GPA distribution. The bottom half of the class was students with a GPA of 2.91 and below. There was a bar passage rate of 92.7% for first and multiple attempts for the top half of the class and 66.4% for the bottom half of the class. This difference in bar passage rates is more pronounced when GPAs are broken out by quartile (Table 2). The bar passage rate on first attempt of the top quartile was 86.8% and of the bottom quartile was 25.0%. Over 94.2% of students in the top quartile had passed the bar after their second attempt as compared to 46.0% of students in the bottom quartile.

Nicholas L. Georgakopoulos, Bar Passage: GPA and LSAT, not Bar Reviews, draft 2013

The most striking result of the analysis is the accuracy with which the law school GPA predicts bar passage on the first try. This is visible in the probit models of the first and fourth columns but also in the simple frequency table 5. Graduates with a GPA below 2.6 pass the bar at a less than 10% rate, with two out of 21 students passing. Students with a GPA over 3.2 pass the bar at a well over 95% rate, with three graduates out of 110 failing.

The extraordinary power of GPA to predict bar passage diminishes dramatically for graduates taking the bar for a second time. The second time takers are significantly fewer in number. As a corollary of the high success rate of graduates with GPAs above 3.2 on their first try, very few such graduates appear in this subsample. The success rates, however, do not change nearly as fast as in the subsample of first-time takers. From GPAs of 2.5 to GPAs of 3.1, success rates hover about 50%.

Leslie C. Levin, Christine Zozula, & Peter Siegelman, A Study of the Relationship Between Bar Admissions Data and Subsequent Lawyer Discipline, LSAC (2013) [archive]

Higher law school grades and law school class rank are both negatively associated with discipline risk, but the effect is only statistically significant for grades.

Scott Johns, Empirical Reflections: A Statistical Evaluation of Bar Program Interventions, 54 University of Louisville Law Review 35 (2016)

We found that LSAT and bar exam scores share about 20% of variance . . . 1LGPA and bar exam scores share about 40% of variance . . . and GLGPA and bar exam scores share about 50% of variance . . . . In sum, traditional law school variables share a moderate to strong relationship with bar exam scores but still leave nearly 50% or more of bar exam scores explained by other variables.

Katherine A. Austin, Catherine Martin Christopher, & Darby Dickerson, Will I Pass the Bar Exam?: Predicting Student Success Using LSAT Scores and Law School Performance, 45 Hofstra Law Review 753 (2017)

For first-time bar exam takers, linear regression was conducted to determine whether Texas Tech Law final GPA predicted an individual’s bar exam score. Final law school GPA significantly predicted bar exam performance . . . .

Roger Bolus, Performance Changes on the California Bar Examination: Part 2 (2018)

Based on the results of over 7,500 examinees sitting for the CBX in 2013, 2016 and 2017, the single best indicator for predicting success on the CBX was the final law school GPAs of candidates. This result, while important, is not surprising: students who excel on law school exams would be expected to perform well on the bar as well. Overall, the statistical models developed below which include examinees demographic characteristics, pre-admission credentials and law school performance predicts more than 54 percent of the variability in CBX Total Scale scores. By social science standards, this degree of predictive power is reasonably strong, and well in-line with findings of past efforts in this area.

Amy N. Farley, Christopher M. Swoboda, Joel Chanvisanuruk, Keanen M. McKinley, & Alicia Boards, Law Student Success and Supports: Examining Bar Passage and Factors that Contribute to Student Performance, AccessLex (2018)

More specifically, the post-1L [GPA] model accurately identified 58% of failers, and the most comprehensive post-3L [GPA] model accurately detected nearly 4 out of 5 students who would fail the bar.

Robert R. Kuehn & David R. Moss, A Study of the Relationship Between Law School Coursework and Bar Exam Outcomes, 68 Journal of Legal Education 623 (2019)

Similar to other studies, performance in law school, measured by LGPA, bears the strongest relationship to bar exam outcomes at [Washington University in St. Louis and Wayne State]. Yet LGPA explains only approximately twenty percent of the variability in bar passage rates among graduates. One notable finding at both schools was the correlation coefficient between first-year and final law school grades--above 0.92. This high correlation strongly signals at the end of the first year which group of students is most likely to fail the bar exam and therefore might merit additional assistance over the next two years.

Three curiosities of Oregon's diploma privilege rule for the 2020 bar exam

On the heels of Utah and Washington, Oregon has announced it will enact a form of diploma privilege. Oregon’s rule is closer to Utah’s than Washington’s: like Utah’s, it extends only to first-time test-takers who recently graduated from ABA-accredited schools, with some caveats. I offered my analysis of Utah’s proposal at the time, along with a follow-up on some of the “heat and light” reactions to it. But I wanted to highlight three curiosities.

First, it expressly treats Oregon schools differently from out-of-state schools. From the rule:

Granting a one-time "diploma privilege" to persons who timely submitted complete applications for the July 2020 Oregon bar examination and who either (1) graduated in 2020 from one of the three Oregon law schools; or (2) graduated in 2020 from any other law school accredited by the American Bar Association that had a minimum of 86 percent of graduates pass a 2019 Bar exam on their first attempt. All character and fitness requirements continue to apply.

Among Oregon’s three schools, only the University of Oregon (86%) had a first-time passing rate that met or exceeded the 86% threshold. Both Lewis and Clark (81%) and Willamette (82%) would have failed the standard. Utah’s rule was an 86% threshold that applied to everyone, but one that both Utah schools met. It’s not clear how such in-state favoritism will be received. But, as I noted with the Utah proposal, recognizing diploma privilege for about a third of all law schools is generous compared to the traditional in-state-only diploma privilege.

Second, it uses the 86% standard, and I don’t know where it comes from or why. The first-time pass rate for the July 2019 exam was 84%, and it’s lower still if you count the February 2019 exam results. Did Oregon just borrow the 86% figure Utah used? Maybe? Given that Utah and Oregon use different cut scores, that seems extra strange. (If someone has more information, please share! I’m dealing with a second-hand report here!)

Third, the 86% standard seems even less justifiable given that Oregon is announcing a temporary reduction in its cut score from 274 to 266. That moves Oregon from one of the highest cut scores in the country to the bottom end of average. Schools that historically have a sub-86% pass rate would assuredly do better on this exam than their overall pass rate would otherwise suggest. (Conversely, however, and undermining my own point, the pass rate should also rise in this administration, so perhaps the threshold accounts for the fact that the test will be easier and that a school’s cumulative pass rate should accordingly be higher.)

In any case, I think both Utah and Oregon recognize that first-time test-takers pass at overwhelmingly high rates in their jurisdictions, and the cost of a handful of additional admitted attorneys outweighs the consumer protection concerns. Washington’s rule, in contrast, would admit far more who failed the bar once, or even multiple times. It remains to be seen what long-term effect this has, or whether other jurisdictions adopt similar proposals.

But both Utah and Oregon emphasize that overall pass rates matter a lot for out-of-state benefits. Schools with relatively high pass rates in tougher jurisdictions like California and Virginia won’t reap the benefit unless they can secure more success for their students in absolute terms.

UPDATE: I’m usually pretty generous with the comments for posts, but this one has prompted some bickering and I’ve trimmed back on them.

The Washington State bar exam experiment of 2020 will be one to watch

The bar exam continues to confound licensing authorities in light of the coronavirus pandemic. There are many cohorts to consider, and there are many questions I still have.

Utah’s proposal is to offer diploma privilege to (1) first-time test-takers from (2) ABA-accredited schools whose overall first-time pass rate exceeded the Utah state average. I noted this was a fairly generous proposal considering how stingy “diploma privilege” has historically been, though it did cut some graduates out. It certainly has not been without critique as both too generous and too stingy. Nevada’s proposal updates the “performance test” for online administration. Time will tell what these or other approaches may yield.

Washington, however, is doing something far more notable. To begin, it lowered its cut score from 135 to 133. This is a fairly modest change—most jurisdictions are in the 133 to 135 range, but it will certainly make the exam easier.

But Washington has gone a step farther. It announced that all graduates of ABA law schools may claim “diploma privilege” and earn admission. As the announcement puts it: “The diploma privilege option will be available to applicants currently registered to take the examinations who are taking the tests for the first time and those who are repeating the tests.”

Now, this is remarkable in going beyond Utah for a couple of reasons. First, it applies to all ABA law grads, not simply those whose schools met the Utah threshold (although, as I noted, most Utah applicants would meet this test). Second, it includes repeaters, a cohort I’ve found mostly neglected in scrutinizing how to handle the bar exam.

I’m a little surprised on the repeater front, and it’s a reason to watch Washington in the decade (!) or so to come. Here’s why.

We know that lower bar exam scores are associated with higher ultimate attorney discipline rates, as Professor Rob Anderson and I have chronicled in California and as studies elsewhere have found. Professor Kyle Rozema finds a similar effect.

It’s worth looking at the July 2019 bar exam cohort of ABA law school graduates to see what happened in Washington and what we can roughly expect from this July 2020 decision.

536 graduates took the July 2019 bar exam in Washington. 465 were first-time test-takers, and 71 were repeaters.

Among the 465 first-time test-takers in July 2019, 366 passed, a 78.7% pass rate; nearly 4 in 5 passed on the first attempt, and 99 failed. One could imagine, then, a bar exam that looked exclusively to this cohort. Indeed, Washington has lowered its cut score, so the pass rate would likely exceed 80%, perhaps even 85%. The vast majority of first-time test-takers pass.

Among those who fail the first time, a number likely would pass on a second attempt. On the February 2020 exam, 96 repeated: 44 passed and 52 failed. We don’t know how many of those were on their second attempt versus a subsequent one, but it’s likely that a good number passed on the second try. (It’s also worth noting that some of the 99 who failed presumably didn’t try again on the February 2020 exam.)

Back to the July 2019 results. 71 repeated. Of those, 27 passed and 44 failed, a 38% pass rate. Of note, 44 of the 536 test-takers (about 8%) failed the bar exam at least twice, and some of those more than twice.
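The arithmetic above is simple enough to check. Here is a minimal sketch using only the July 2019 figures reported in this post:

```python
# July 2019 Washington bar exam, ABA law school graduates (figures above).
total_takers = 536
first_timers, repeaters = 465, 71
first_time_passers, repeat_passers = 366, 27

assert first_timers + repeaters == total_takers

print(f"First-time pass rate: {first_time_passers / first_timers:.1%}")  # 78.7%
print(f"Repeater pass rate: {repeat_passers / repeaters:.1%}")           # 38.0%

# Repeaters who failed again have now failed at least twice.
failed_twice_plus = repeaters - repeat_passers
print(f"Failed at least twice: {failed_twice_plus / total_takers:.1%}")  # ~8.2%
```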

So that’s, I think, where the interesting part of this experiment lies. One can question the efficacy of the bar exam and the like, but its results track law school grades fairly closely. Furthermore, we know that those with lower scores, or those who’ve failed at least once, tend to face higher discipline rates. It could be a fairly notable shock to the system to admit, all at once, so many who’ve failed multiple times and might never have been admitted to the bar otherwise. Those whose entry would have been delayed, those who’d have dropped out of taking the bar: they’re now all admitted, and all at once.

One could raise “access to justice” concerns for underserved legal populations, which is a reason to admit more lawyers to the bar, but I wonder about this at a couple of levels. Are the test-takers who’d otherwise fail the ones serving those “underserved” populations? And if so, will they serve them well?

Another question: Professor Anderson and I, along with Professor Rozema’s study, found that attorney discipline tends to manifest later in careers. Will we see more discipline earlier?

I don’t want to portend too much doom and gloom. Optimistically, perhaps the Washington State Bar has some mechanism for supervising attorneys and identifying those who are at risk of discipline earlier in their careers, to prevent it from happening; indeed, Professor Anderson and I suggest that might be one such solution in the event state bars lower their cut scores. And, to be fair, character and fitness review and other related scrutiny will still apply, which means that the bar exam is not the only thing standing between law school graduates and the practice of law.

Still, this is a fairly remarkable one-time event in Washington that should be worth watching. As states continue to grapple with the appropriate bar licensing regime, the results of this experiment will be helpful in assessing the costs and benefits of a bar exam.

The 2020 Nevada bar exam looks to an old exam format with a new twist

Nevada has announced it will offer an online bar exam this year. Like other states grappling with bar exam changes, it leaves me with questions: security, a new format for test-takers, and setting appropriate pass rates. (For what it’s worth, Nevada has consulted with Roger Bolus, who regularly helps state bars think about licensing and cut scores, so that offers some promise.)

Karen Sloan reports it as the “first-ever” open-book bar exam. An open-book format addresses some (but not all!) of the security concerns. And the state board of bar examiners praised the open-book format as consistent with what “lawyers do”: looking up applicable law and answering problems.

Really, it’s a new twist on the “Performance Test” component of the bar exam. Such a test gives students a “closed universe” set of facts and law. Examinees answer questions based on these facts. Instead of rote memorization, the test more closely imitates what lawyers do. The new twist? Treat it like an open-book exam.

Most states have a part of their bar exam that looks like the Performance Test; indeed, the Multistate Performance Test is one that the National Conference of Bar Examiners offers. But many states have trimmed back its use, and it has certainly never replaced the rest of the bar exam. Why?

A nice piece by Stephen Klein in the Bar Examiner in 1996 summarizes some of the discussion around the Performance Test. Applicants like the test more. Practicing lawyers who take the test perform well on it, unlike on other components of the bar exam (where their rote memorization has faded).

But the correlations among the Performance Test, the other essays, and the multiple-choice MBE are high. True, the Performance Test is not extremely highly correlated with those other components; it correlates more strongly with the essays than with the MBE. But the correlation between the MBE and the essays is not perfect either. In short, all three are related, but they do appear to test different kinds of skills. For all three types of exams, “cross-cutting abilities” like “legal reasoning” appear to drive overall performance.

The Performance Test costs more to administer (like essays), because it requires more labor-intensive grading. It’s tougher to scale without the MBE component as a reference point for more consistent results.

Women tend to perform better on the essay and Performance Test components, so we might expect an essay-only exam to produce different outcomes between men and women than a test that includes both a multiple-choice component and an essay component. (There’s no material difference by race or ethnicity.)

So, why haven’t state bars moved toward the Performance Test? Obviously, cost, inertia, and the NCBE’s UBE format are some reasons today, but those can’t entirely explain it.

Instead, I suspect there are two major reasons. First, state bar licensing authorities remain wedded to the notion that new lawyers need to “know” a number of content areas of the law. The New York State Bar Association, for instance, has pushed this point significantly in what would be a major rollback of recent gains in bar licensing portability. There’s an expectation that all lawyers simply need to know certain stuff. Memorization is a way to ensure they know it, even if fleetingly. And testing specific content areas forces applicants to learn that stuff.

I don’t know that this really works the way licensing authorities intend. There’s little evidence that specific substantive law school courses translate into particular bar exam success. Applicants forget much of what they memorized shortly after the exam. Many will never practice in most of the areas tested.

Second, if the Performance Test is already highly correlated with the memorization-intensive components of the bar exam, then what value is it really adding? We know that there’s a relationship between low bar exam scores and elevated ultimate attorney discipline rates. Whether that score comes through multiple choice, essays, a two-day exam, a three-day exam, and so on seems marginal. A lower-cost exam that’s more reliable, with easier scaling of scores, has been attractive to bar licensing authorities. (And, of course, there’s the inertia of the NCBE.)

All this is to say, I’m a fan of the Performance Test on the whole. Given a choice between the present system and a test that costs slightly more but hews far more closely to what lawyers do and requires dramatically less memorization, the reform seems like the much better bet.

Nevada’s experiment, I hope, will give some reassurance to other states that they, too, could look to alternative bar exam and bar licensing models. We’ll see how it all shakes out in the months ahead.

Eight questions I have about the summer/fall 2020 bar exam

In the midst of the many changes to the bar exam this summer or fall, here are eight things I’m watching with some interest and some open questions.

1. Online security and reliability. Some states have announced the move to an online test. I’m pretty skeptical that such plans can be built in such a short time (so is Professor Josh Blackman, and the authors of a white paper on the bar exam describe an online exam as “very risky”). The botched AP test administration this spring is just the latest confirmation of my skepticism. I hope states will be able to build something secure and reliable, but we’ll have to wait and see.

2. A new format for test-takers to learn. I think it’s fair to say that bar exam test preparation providers have crafted study programs designed to assist learning in a particular format: in virtually all states, essays and multiple choice covering particular topics. Now, some state bars are literally creating a new exam they’ve never administered before out of thin air, whole cloth, or whatever metaphor comes to mind: short essay, short answer, no multiple choice, etc. Indiana, for instance, will now create “short answer questions” on MBE topics. It remains to be seen how students will study for this new test, what materials will be provided ahead of time to explain the format of the test, and whether test-prep companies can adapt in a timely and effective fashion.

3. Scaling, equating, and pass rates. In most states that administer the Multistate Bar Exam, licensing authorities scale essay exam answers to the equated MBE results. To greatly simplify, this ensures that the test is measuring the same things year after year. But some states are creating their own tests this year, which makes that impossible, and I haven’t seen any good statements about what states plan to do beyond generic claims that they’ll ensure comparable scoring. Michigan, for instance, announced, “Experts will work with the BLE to determine an appropriate passing score based on results from previous July exams.” Nevada “will take all reasonable measures to address the reliability of an essay only exam.” It might be something as crude as making sure the pass rate this year looks similar to past years’. Or it might result in a dramatic fluctuation in pass rates. It’s hard to say without seeing more.
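For a sense of what that scaling involves, here is a minimal, simplified sketch: raw essay scores are linearly rescaled so that their mean and standard deviation match the cohort’s equated MBE distribution. This illustrates the general technique only, not any state’s actual procedure, and all the numbers are hypothetical:

```python
import statistics

def scale_essays_to_mbe(essay_raw, mbe_scaled):
    """Linearly map raw essay scores onto the scale of the (already
    equated) MBE scores for the same cohort, matching mean and standard
    deviation. A simplified illustration, not any state's actual method."""
    e_mean, e_sd = statistics.mean(essay_raw), statistics.stdev(essay_raw)
    m_mean, m_sd = statistics.mean(mbe_scaled), statistics.stdev(mbe_scaled)
    return [(x - e_mean) / e_sd * m_sd + m_mean for x in essay_raw]

# Hypothetical cohort: raw essay scores and equated MBE scores.
essay_raw = [58, 62, 66, 71, 75, 60]
mbe_scaled = [131, 138, 140, 145, 150, 134]
print([round(s, 1) for s in scale_essays_to_mbe(essay_raw, mbe_scaled)])
```

The point of the sketch is what it depends on: without a common, pre-equated anchor like the MBE, there is nothing to map the essay scores onto, which is exactly the problem states writing their own exams face this year.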

4. Out-of-state bias. New York and Massachusetts have announced rules prioritizing in-state ABA law school graduates over others. (Utah accommodated a number of out-of-state law schools, but not all, in its “diploma privilege-plus” model.) It’s not clear whether litigation will follow, whether alternative accommodations will be provided, or whether other states will follow suit.

5. The limitations of the UBE and questions about reciprocity. As UBE states move away from administering a UBE exam this term, students who have increasingly relied on the UBE as a way of improving portability of their license or ensuring licensure in multiple jurisdictions will face a setback. It might be that some states will soften reciprocity requirements (which, I think, seems unlikely)—but, more likely, a cohort of graduates will simply be out of luck.

6. What will “supervised practice” really look like? Utah has the most at stake with supervised practice as it transitions to a type of “diploma privilege-plus” model, in which recent graduates of a number of law schools may be licensed after a period of supervised practice. But many other jurisdictions, like Ohio and New York, are also allowing graduates to engage in supervised practice. Experience in Canada and elsewhere suggests that such proposals tend to disadvantage first-generation attorneys (i.e., those without a family support structure to help arrange supervised practice) and those with limited socioeconomic means. It remains to be seen how this plays out in practice. It’s also not clear how many graduates will take advantage of these options, or simply extend their bar exam study periods as the bar exam is pushed later.

7. Discipline rates. We know that attorneys tend to be disciplined later in their careers. We also have some evidence that lower bar exam scores correlate with higher ultimate discipline rates. To the extent that alternative exams or admissions practices are adopted, we wouldn’t expect to see much for a decade at least in terms of attorney discipline rates. So we can revisit this in, maybe, 2035 (!) to see if the standards this year changed anything.

8. USNWR fallout. While some have worried about whether law schools will meet their ultimate 75% pass rate for graduates within two years, I think that’s less of a concern. Failing to meet the 75% ultimate pass rate means schools must justify to the ABA why they’ve fallen below it and what steps they’re taking; if in 2022 some schools fall below and can point to the circumstances of the summer/fall 2020 bar exam, I think it would be a small problem for schools to explain non-compliance.

A bigger problem, I think, is the USNWR rankings. USNWR includes as one of its twelve components how a school’s pass rate compares to the overall pass rate in the modal jurisdiction where its graduates take the bar exam. It remains unclear how Utah’s new proposal, novel scoring systems, pass-fail policies in the last semester, or disruptions that may particularly affect law school graduates in, say, New York City will affect pass rates.

To be fair, bar passage rates are a very small portion of the overall formula. But another and larger component is employment statistics. As bar exam grading gets pushed back several weeks (in California, the goal will be to complete it by December 31!), it remains to be seen if a softer employment market coupled with delayed bar exam grading leads to weaker job figures for some schools as of March 15, 2021, the 10-month employment figure date.

Blockchain and the bar exam

Over the last few years, the word “blockchain” has been sprinkled around everything as one of the hottest buzzwords in technology. I confess, I use the word tongue in cheek. I think most references to “blockchain” are hype, and many are duped into believing that the word makes the product to which it’s attached somehow more valuable, more efficient, or more likely to succeed. It’s a Theranos or WeWork level of hype.

So you can imagine my skepticism when I saw the Massachusetts and California announcements that their bar exams would be administered “online” September 9 and 10.

I’m still skeptical.

The NCBE, in something of an understatement, said there are “significant issues,” including security, in providing an online exam.

But I want to put this in a bit of perspective. These bar exams are scheduled to take place in four and a half months. These state bar licensing authorities believe they can create a secure remotely administered test by then. Because, recall, no one does this now.

Let’s put aside the security issue for a moment and simply focus on the reliability of the software. Six years ago, ExamSoft had an issue during the July 2014 bar exam in which thousands of test-takers were unable to upload their answers for hours. Some (I think wrongly) even blamed the decline in bar passage rates that cycle on this debacle. Exam software is not sufficiently reliable even in the best of times. Add to that the remote (and secure) delivery of materials that have previously been printed, and the collection of those materials after the exam.

In-room security is a huge problem, too. Bar exams are notorious for picayune requirements, like a small clear plastic bag containing limited personal effects, sign-in sheets to use the restroom during the exam, and so on. Remote proctoring software purports to watch the eye movements of test-takers during the exam, to scan the room before and after to make sure no one else is present, and makes other rather theatrical promises. Let’s face it: those probably work only in much lower-stakes tests.

Now, as a small pushback, perhaps cheating on the bar exam doesn’t yield much. The MBE is difficult to cheat on, given the short time allotted and its intensive fact-application component (unless, I suppose, someone else is literally taking the test for you). MPTs turn on a closed universe of facts, so, again, unless someone else is writing the exam for you, looking at an outline or something won’t help much. But the ability to dump a pre-written outline of correctly stated black-letter law is probably a huge temptation on the essay components, and probably the easiest avenue for cheating.

And, I think, those most inclined to cheat on the bar exam—and be advantaged by cheating—are probably the ones most at risk of failing and most likely to commit malpractice later in their careers. Maybe we’re not really worried because most who take the bar exam pass anyway, and these are, after all, extraordinary times. But when I consider the repeaters—those who’ve failed before—and wonder about the pressures (and incentives) to cheat, it gives one some pause.

Really, these ideas strike me as the kinds of things pitched to bar licensing authorities with some hype: “Oh, we totally can do this online!” Perhaps a string of buzzwords about security, AI, blockchain, and so on was persuasive. But building something like this out in four months (not just building, but testing, fixing, and feeling comfortable using it) strikes me as unrealistic. Even the years-long preparation for the digital LSAT led to some small problems in its first widespread use (even if it was mostly seamless).

A proposal from Dean Jennifer Mnookin at UCLA and Dean Erwin Chemerinsky at Berkeley was far better. They proposed canceling the July 2020 test, to reduce the uncertainty of later rescheduling and postponement, and allowing recent graduates “to practice law for a defined—and relatively limited—period, such as until the July 2022 bar exam releases its results.” Granted, this would help only some cohorts, and it would exclude, say, repeaters, but in trying times there are going to be tradeoffs in all decisions.

Finally, I’m not a Luddite! I think if we can develop a secure, remotely administered bar exam (perhaps one that looks different than the one we have today), we should go for it. Remote administration, of course, is the great challenge. Moving to a digital exam, or a year-round test that one can self-schedule at a secure location, seems more promising. But this is a years-long project, and one that probably must start with volunteers on a small scale before ramping up.

I doubt these bar licensing authorities will actually move forward with a remotely administered bar exam this September. These licensing authorities can change their minds later, of course. Time will tell.

UPDATE: Professor Josh Blackman has more here, with comparisons to the Iowa caucuses and healthcare.gov.

UPDATE: It should be noted that Massachusetts intends to administer an alternative examination in the event the Uniform Bar Exam cannot be administered. It would be interesting to see how much notice it gives prospective test-takers about its form and contents; whether that meets the other security and practical concerns I raised; and whether it would be as reliable an exam. Of course, we’ll see plenty of experimentation, as Utah is doing!

California State Bar working group recommends cutting bar exam topics from 13 to 8

There’s an interesting draft report out from the California Attorney Practice Analysis Working Group, appointed to examine recommendations about the content of the bar exam. Two recommendations (about the scope of “entry-level practice,” and about relevant competencies, some of which might need to be reassessed in terms of the existing bar exam testing format) are worth a read. But more interesting to me was the call to reduce the number of legal topics tested on the bar exam. The goals include de-emphasizing memorization and offering a core set of minimum competencies.

Seven existing topics, all of which are tested on the MBE, are recommended to remain: Civil Procedure, Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Real Property, and Torts. The eighth is a new topic, Administrative Law and Procedure. Topics to be removed are Business Associations, Community Property, Remedies, Trusts, and Wills and Succession. Professional Responsibility would also be removed, as it’s duplicative of the MPRE, of coursework mandated in law school, and of a new mandatory training program for entry-level attorneys.

It’s unclear where a working group proposal like this will lead, but other reforms of longstanding practices, like cutting the bar exam from three days to two, have occurred in California recently. We’ll see what comes of this proposal.

Other agenda items of note include expediting the scoring process so results come out earlier; addressing grading and cheating concerns; and long-term considerations about the UBE and the cut score.

February 2020 MBE bar scores fall to all-time record low in test history

What had been a record low in February 2018, itself following a record low in February 2017, has given way to a new record low in February 2020. The mean score was 132.6, down from 134.0 last year and below the February 2018 low of 132.8. (That’s off from the recent 2011 high of 138.6.) We would expect bar exam passing rates to drop in most jurisdictions.

For perspective, California's overall "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133.

Given how small the February pool is in relation to the July pool, it’s hard to draw too many conclusions from it. The February cohort is historically much weaker than the July cohort, in part because it includes so many who failed in July and retook the exam in February. The NCBE reports that “more than two-thirds” of test-takers were repeaters.

Schools must ask themselves why bar passage rates and bar exam scores remain persistently low. Declining entering-class quality and ineffective bar preparation programs may be among the challenges.

The decline in scores comes at a particularly poor time. Some are advocating for “diploma privilege” for the Class of 2020 in light of bar exam postponements given the coronavirus pandemic. Bar licensing authorities will assuredly be skeptical of such proposals as they look at all-time low scores like these.

Heat and light over the Utah bar diploma privilege proposal

I wrote about Utah’s proposal to allow a modified version of “diploma privilege” for graduates of a chunk of law schools. I don’t particularly support or oppose the proposal, but I pointed out some places where it makes sense and others where it doesn’t. The reaction to it, though, has been quite strong, oddly opposed along a couple of dimensions. And the opposition, I think, is too often more heat than light.

Some critique the plan, calling the exception too narrow. Take this statement from commentary in California:

“We vehemently disagree with the Utah proposal as it only benefits a small percentage” of graduating third-year law students, said Escontrias.

The Utah rule would only allow the diploma privilege to those who graduated between May 1, 2019, and June 30, 2020, and only from American Bar Association-accredited law schools “that had a first-time taker bar examination passage rate in 2019 of 86%" or higher.

As my post mentioned, about 1/3 of all law schools would qualify their graduates for diploma privilege in Utah in 2020 (if the proposal is enacted). Additionally, by my calculations, that’s about 42% of graduating law students. One is hard-pressed to call that “a small percentage.” Finally, by my (rough!) reckoning, it would extend to at least 90% of those who were registered for the Utah bar exam this July, if not more, hardly a “small percentage” (although, of course, it’s not extended to any students who registered for bar exams in the other 49 states or Puerto Rico). So although the rule “only” extends to certain populations, it benefits the vast majority of prospective Utah attorneys graduating from law school this year.

Then again, some call the exception too broad:

“I think this is the [Utah] Supreme Court’s way of making it way to[o] easy to become a licensed attorney in Utah, which goes against everything the Utah State Bar has stood for,” said Emy Cordano of COR LAW.

“I see no reason why they should get a free pass,” Cordano added. “The bar exam is the supreme test of whether or not you are going to make it as a lawyer in the courtroom.”

Remarks like these (and there are others) sound much more like hazing. For instance, in an extraordinary circumstance where the bar exam will be postponed, it’s hard to characterize accommodations made for that circumstance as making it “too easy”; they are, after all, admittedly, accommodations, not standard rules. Additionally, it’s not “easy” or a “free pass”: students still have to graduate from law school, still must complete supervised practice, and still must pass the MPRE and the character and fitness review. It’s a pass on one component of access to the practice of law.

And it’s hardly the “supreme test” of whether you “are going to make it” “in the courtroom.” (Note the litigation bias: lots of attorneys don’t spend their time in the courtroom.) That determination, I think, is left to clients after one has practiced. The bar exam likely keeps out some attorneys who are at a higher likelihood of engaging in misconduct. But it’s not some guarantee of quality.

All in all, then, there are some increasingly heated disputes. I do hope, however, that bar licensing authorities, including Utah, look closely at the present circumstances, tailor solutions for those present circumstances, and consider the more long-term solutions appropriately in the years to come.