Multistate Bar Exam scores for July 2024 rebound to highest level since 2013

The Multistate Bar Exam scores for July 2024 have been released, and they show promising signs. The mean scaled score is 141.8, the highest since 2013’s 144.3 (before the grim numbers that started in 2014). That score is slightly higher than in some other recent years (2014’s 141.5 and 2017’s 141.7), and a good bit higher than the 139- and 140-range scores in others.

We are now through the COVID-19 pandemic cohort, so the vast majority of Class of 2024 students taking the July 2024 exam had in-person instruction and a somewhat “normal” legal education experience. But it’s hard to draw many conclusions from small cohorts like this.

The figures suggest overall bar passage rates will increase in most jurisdictions. That may also portend slightly stronger employment outcomes for some students. Time will tell.

Analysis of first-time bar passage data for Class of 2023 and ultimate bar passage data for Class of 2021

The ABA has released its new batch of data on bar passage. The data includes the first-time passage data for the Class of 2023 and the “ultimate” passage data for the Class of 2021. As I noted earlier, USNWR has increased the weight on bar passage (18% of the methodology is for first-time passage, 7% for ultimate), making it one of the biggest factors in the rankings. It is also one of the most volatile.

To offer a snapshot of what the data means, I looked at both the first-time and ultimate passage data. I compared schools’ performance against their Class of 2022 and 2020 metrics. I weighted the data the way USNWR does for a point of comparison.
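For the curious, here is a minimal sketch in Python of that comparison. The CSV file and its column names are hypothetical stand-ins for the ABA spreadsheets, and the z-score standardization is my approximation of how USNWR normalizes indicators before weighting; the 18%/7% weights come from the methodology described above.

```python
import pandas as pd

# Hypothetical input: one row per school, with first-time and ultimate passage
# figures for the old cycle (Classes of 2022/2020) and the new cycle (Classes
# of 2023/2021). Column names here are illustrative, not the ABA's.
df = pd.read_csv("bar_passage.csv")

def zscore(s: pd.Series) -> pd.Series:
    """Standardize an indicator before weighting, as rankings formulas do."""
    return (s - s.mean()) / s.std()

# USNWR weights: 18% first-time, 7% ultimate (25% of the methodology in total).
for cycle in ("old", "new"):
    df[f"bar_component_{cycle}"] = (
        0.18 * zscore(df[f"first_time_{cycle}"])
        + 0.07 * zscore(df[f"ultimate_{cycle}"])
    )

# Projected change in the weighted, standardized bar passage component.
df["change"] = df["bar_component_new"] - df["bar_component_old"]
print(df.sort_values("change", ascending=False)[["school", "change"]].head(20))
```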

Note that USNWR has not yet released its latest rankings for Spring 2024. Those rankings will include the Class of 2022 and 2020 metrics. This new batch of data will appear in rankings released in 2025.

Here are the schools projected to improve in this metric (which, again, under the current methodology, is 25% of the rankings) over the Classes of 2022 and 2020. The numbers below show the change in score; that is, they show how much a school is projected to improve or decline in the scoring. These are not the raw bar passage figures, which are comparative metrics that can be harder to make meaningful if viewed simply in raw terms. That said, these numbers are, in their own way, meaningless on their own, as they are just one factor among several.

Pontifical Catholic University of P.R. 0.316441

Appalachian School of Law 0.2564

Texas Southern University Thurgood Marshall School of Law 0.230382

Widener University-Delaware 0.225445

Northern Kentucky University 0.201138

Stetson University College of Law 0.188107

Villanova University 0.179214

Miami, University of 0.158228

Kansas, University of 0.152897

Albany Law School 0.141105

Baltimore, University of 0.129179

Texas Tech University 0.127884

Southern Illinois University 0.127633

Cincinnati, University of 0.127493

Saint Louis University 0.122641

North Carolina Central University 0.118442

Pittsburgh, University of 0.116755

Memphis, University of 0.106309

Vanderbilt University 0.104692

Boston College 0.102051

Here are the schools projected to decline in this metric over the Classes of 2022 and 2020.

Willamette University -0.41276

New Hampshire, University of -0.3903

Illinois, University of -0.32793

Case Western Reserve University -0.32763

Florida A&M University -0.31228

Ohio Northern University -0.30423

City University of New York -0.25318

Kentucky, University of -0.20322

Southern University -0.18699

Missouri, University of -0.17881

Puerto Rico, University of -0.166

Seattle University -0.16593

Pennsylvania State-Dickinson Law -0.15274

Regent University Law School -0.14854

Tulsa, University of -0.14119

Colorado, University of -0.1361

Gonzaga University -0.1291

Cleveland State University College of Law -0.12842

California Western School of Law -0.12533

St. Thomas University (Florida) -0.12298

Does a school's "ultimate bar passage" rate relate to that school's quality?

With the loss of data that USNWR previously used to assess the quality of law schools, USNWR had to rely on ABA data. And it was already assessing one kind of outcome: first-time bar passage rate.

It introduced “ultimate bar passage” rate as a factor in this year’s methodology, with a whopping 7% of the total score. That’s more weight than the median LSAT score now receives. It’s also much more than the at-graduation employment rate received in previous methodologies (4%).

Here’s what USNWR had to say about this metric:

While passing the bar on the first try is optimal, passing eventually is critical. Underscoring this, the ABA has an accreditation standard that at least 75% of a law school’s test-taking graduates must pass a bar exam within two years of earning a diploma.

With that in mind, the ultimate bar passage ranking factor measures the percentage of each law school's 2019 graduates who sat for a bar exam and passed it within two years of graduation, including diploma privilege graduates.

Both the first-time bar passage and ultimate bar passage indicators were used to determine if a particular law school is offering a rigorous program of legal education to students. The first-time bar passage indicator was assigned greater weight because of the greater granularity of its data and its wider variance of outcomes.

There are some significant problems with this explanation.

Let’s start at the bottom. Why did first-time bar passage get greater weight? (1) “greater granularity of its data” and (2) “its wider variance of outcomes.”

Those are bizarre reasons to give first-time bar passage greater weight. One might have expected an explanation (rightly, I think) that first-time bar passage is more “critical” (more than “optimal”) for employment success, career earnings, efficiency, and a host of other reasons beneficial to students.

But, it gets greater weight because there’s more information about it?

Even worse, because of wider variance in outcomes? The fact that there’s a bigger spread in the Z-score is a reason to give it more weight?

Frankly, these reasons are baffling. But maybe no more baffling than the opening justification. “Passing eventually is critical.” True. But following that, “Underscoring this, the ABA has an accreditation standard that at least 75% of a law school’s test-taking graduates must pass a bar exam within two years of earning a diploma.”

That doesn’t underscore it. If eventually passing is “critical,” then one would expect the ABA to require a 100% pass rate. Otherwise, schools seem to slide by with 25% flunking a “critical” outcome.

The ABA’s “ultimate” standard is simply a floor for accreditation purposes. Very few schools fail this standard. The statistic, and the cutoff, are designed as a minimal test of whether the law school is functioning appropriately, at a very basic level. (It’s also a bit circular, as I’ve written about—why does the ABA need to accredit schools separate and apart from the bar exam if its accreditation standards just point back to passing the bar exam?)

And why is it “critical”?

USNWR gives “full credit” to J.D.-advantage jobs, not simply bar passage-required jobs. That is, its own methodology internally contradicts this conclusion. If ultimately passing the bar is “critical,” then one would expect USNWR to diminish the value of employment outcomes that do not require passing the bar.

Let’s look at some figures, starting with an anecdotal example.

The Class of 2020 at Columbia had a 96.2% ultimate bar passage rate. Pretty good—but good for 53d nationwide. The gap between 100% and 96.2% is roughly the gap between a 172 median LSAT and a 163 median LSAT. You are reading that correctly—this roughly 4-point gap in ultimate bar passage is the same as a 9-point gap at the upper end of the LSAT score range. Or, the 4-point gap is equivalent to the difference between a peer score of 3.3 and a peer score of 3.0. In other words, it’s a lot.
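To make the mechanics concrete, here is a toy z-score computation. The means and standard deviations below are invented purely for illustration (chosen so the spreads roughly track the comparison above); the point is that a small raw gap in a tightly bunched metric can be as large, in standardized terms, as a big gap in a widely spread one.

```python
def z(x: float, mean: float, sd: float) -> float:
    """Standardized (z) score, used to compare values across different scales."""
    return (x - mean) / sd

# Invented distribution parameters, for illustration only. Ultimate bar
# passage is tightly bunched near the top; median LSATs are spread out.
ULT_MEAN, ULT_SD = 92.0, 2.5     # percentage points
LSAT_MEAN, LSAT_SD = 157.0, 6.0  # LSAT points

# A roughly 4-point gap in ultimate passage (100% vs. 96.2%)...
gap_ultimate = z(100.0, ULT_MEAN, ULT_SD) - z(96.2, ULT_MEAN, ULT_SD)

# ...versus a 9-point gap in median LSAT (172 vs. 163).
gap_lsat = z(172.0, LSAT_MEAN, LSAT_SD) - z(163.0, LSAT_MEAN, LSAT_SD)

print(f"ultimate gap: {gap_ultimate:.2f} SDs; LSAT gap: {gap_lsat:.2f} SDs")
# With these made-up inputs, both gaps come out to about 1.5 standard
# deviations, which is the flavor of the equivalence described above.
```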

Now, the 16 students at Columbia (among 423!) who attempted the bar exam at least once but did not pass it within two years may say something. It may be that they failed four times, but that seems unlikely. It may be they gave up—possible, but why give up? It could be that they found success in careers that did not require bar passage (such as business or finance) and, having failed the bar exam once, chose not to retake it.

It’s hard to say what happened, and, admittedly, we don’t have the data. If students never take the bar, they are not included in this count. And so maybe there’s some consistency in the “J.D. advantage” category (i.e., passing the bar exam is not required) as a “full credit” position. But for those who opt for such a job, half-heartedly try the bar, fail, and give up—well, they count against the “ultimate bar passage” rate.

Another oddity is that the correlation between first-time passage rate (that is, over- and under-performance relative to the jurisdiction) and ultimate bar passage rate is good, but at 0.68 one might expect two different bar passage measures to be more closely correlated. And maybe it’s good not to have measures too closely bound to one another. But these are literally both bar passage measures. And they seem to be measuring quite different things.

(Note that including the three schools from Puerto Rico, which USNWR did for the first time this year, distorts this chart.)
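If one wanted to check that relationship, here is a sketch, with hypothetical column names (again, mine, not the ABA’s) and the Puerto Rico exclusion applied:

```python
import pandas as pd

# Hypothetical columns: first-time over/underperformance relative to the
# jurisdiction (Class of 2022) and ultimate passage (Class of 2020). The
# cohort mismatch is inherent to the data, as the caveat below notes.
df = pd.read_csv("bar_passage.csv")

print(df["first_time_delta"].corr(df["ultimate"]))  # all schools; roughly 0.68, as noted above

# Excluding the three Puerto Rico schools, which distort the relationship.
mainland = df[df["state"] != "PR"]
print(mainland["first_time_delta"].corr(mainland["ultimate"]))
```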

You’ll see there’s some correlation, and it may tell some stories about some outliers. (There’s a caveat in comparing cohorts, of course—this is the ultimate pass rate for the Class of 2020, but the first-time rate for the Class of 2022.) Take NCCU. It is in a state with a lot of law schools whose students have high incoming predictors and whose graduates pass the bar at high rates. NCCU appears to underperform relative to them on the first-time metric. But its graduates have a high degree of success on the ultimate pass rate.

So maybe there’s some value in offsetting some of the distortions for some schools that have good bar passage metrics but are in more competitive states. If that’s the case, however, I’d think that absolute first-time passage, rather than cumulative passage, would be the better metric.

Regardless, I think there’s another unstated reason for using this metric: it’s publicly available. Now that a number of law schools have “boycotted” the rankings, USNWR has had to rely on publicly available data. They took out some factors and they devalued others. But here’s some publicly available data from the ABA. It’s an “output,” something USNWR values more now. It’s about bar passage, which is something it’s already looking at. It’s there. So, it’s being used. It makes more sense than the purported justifications that USNWR gives.

And it’s given 7% in the new rankings. That’s a shocking amount of weight for another reason: do any students actually rely on this figure?

When I speak to prospective law students (whether or not they’re planning to attend a school I’m teaching at), I have conversations about employment outcomes, yes. About prestige and reputation. About cost and about debt. About alumni networks. About geography. About faculty and class size.

In thirteen years of legal education, I’m not sure I’ve ever thought to mention to a student, “And by the way, check out their ultimate bar passage rate.” First time? Sure, it’s happened. Ultimate? Can’t say I’ve ever done it. Maybe that’s just reflecting my own bias. But I certainly don’t intend to start now. If I were making a list of factors I’d want prospective students to consider, I’m not sure “ultimate bar passage rate” would be anywhere on the list.

In any event, this is one of the more bizarre additions to the rankings, and I’m still wrapping my head around it.

Multistate Bar Exam scores hold steady, remain consistent with recent low scores

It has been difficult to project much about the bar exam given changes in administration and the pandemic. The July 2022 bar exam would reflect three potentially significant things: the decision of law schools to move to pass-fail grading in their courses (particularly affecting 1L courses) in Spring 2020; the decision of law schools to significantly reduce academic attrition for 1Ls in the summer of 2020; and the decision of law schools to offer a number of remote learning options to the bulk of law students taking the bar in July 2022.

Now the MBE scores have been released, and the scores are a slight drop from July 2021—but still consistent with scores between 2014 and 2019, and certainly not an all-time low.

The score is comparable to last summer’s scores, but it remains near recent lows. It appears that these disruptions did not materially affect bar passage rates (of course, it’s impossible to know how rates may have differed without these variables—perhaps they would have improved markedly, or remained just the same!). Of some interest: test-takers declined somewhat notably, from 45,872 to 44,705.

Puerto Rico lowers its bar exam cut score in response to threats that its law schools may lose accreditation

Back in 2019, I assessed the potential effect of the American Bar Association’s revised Standard 316, which requires an “ultimate” bar passage rate of 75% within two years for a graduating class. There, I noted:

Let’s start with the schools likely in the most dire shape: 7 of them. While the proposal undoubtedly may impact far more, I decided to look at schools that failed to meet the standard in both 2015 and 2016; and I pulled out schools that were already closing, schools in Puerto Rico (we could see Puerto Rico move from 3 schools to 1 school, or perhaps 0 schools, in short order), and schools that appeared on a list due to data reporting errors.

Will state bars lower their cut scores in response?

It’s possible. Several state bars (like South Dakota as mentioned above) have lowered their cut scores in recent years when bar passage rates dropped. If states like California and Florida look at the risk of losing accredited law schools under the new proposal, they may lower their cut scores, as I suggested back in 2016. If the state bar views it as important to protect their in-state law schools, they may choose the tradeoff of lowering cut scores (or they may add it to their calculus about what the score should be).

The ABA Journal recently reported the plight of two of Puerto Rico’s law schools that have failed to meet that standard for several years. Indeed, Pontifical’s pass rates have worsened fairly dramatically in recent years: 71% for 2017, 52% for 2018, and 46% for 2019.

That article tipped me off to changes in Puerto Rico’s bar exam cut score. Puerto Rico does not use the UBE or a standardized bar exam score, so their passing score of “596 out of 1000 points” doesn’t offer a whole lot of information. But the Supreme Court of Puerto Rico did choose to lower the cut score to 569.

A 2021 report offers some reasons to be skeptical of this change, after studying predictors and exam performance:

For both set of analyses completed, the results did support the hypothesis that the applicants in the more recent years were not as well prepared than the applicants in previous years. Average P-values for a common set of items declined over time, and when comparing specific test administration pairs, the pattern consistently saw applicants from earlier test administrations performing better.

. . .

The hypothesis that the steady decline in overall pass rate on the Puerto Rico Bar Examination is a result of applicants being less prepared for the examination is supported by the decline in performance on the 14 anchor items administered on every test administration.

The Supreme Court of Puerto Rico expressly considered the effect of the new ABA Standard 316 on Puerto Rico’s law schools as an impetus for change.

Given the need to determine whether, in addition to the measures already taken by the Judiciary to address the effects of the application of ABA Accreditation Standard 316 in our jurisdiction, it was necessary to lower or modify the passing score for the bar admission examinations, in 2020 the Office of Court Administration (Oficina de Administración de los Tribunales, OAT) commissioned from the company ACS Ventures an analysis of this question.

A standard-setting study for the cut score had two rounds of standard-setting. One round recommended a score of 584 (with a range of 574 to 594), and the other 575 (with a range of 569 to 581). The Supreme Court took the bottom of the lowest of these ranges, 569. That said, the pass rate would still be just 46.4% even at that score, better than the roughly 33% rate under the old 596 cut score:

We recommend that the program consider a final passing score for the Bar Examination somewhere in the range of the recommended passing score (575) and a score that is two standard errors of the mean below this score (569). The rationale for this recommendation is that the reference point for the panelists during the study was the Minimally Competent Candidate and panelists made judgments to predict how these candidates would perform on the multiple-choice questions and essay questions for the examination. This means that the distribution of reference candidates was all intended to be minimally competent. In creating that distribution, the lower bound would likely best represent the threshold of minimum competency suggested by the panelists. Setting the passing score at 569 would mean that approximately 46.4% of candidates would pass the examination while setting the passing score at 575 would mean that approximately 41.5% of candidates would pass. This range is consistent with the recommendations of the panelists as characterizing the performance of the minimally competent candidate.
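The arithmetic in that recommendation is easy to back out. A small sketch (the three-point standard error is implied by the report’s numbers, not stated directly, so treat it as my inference):

```python
# The report recommends a passing score of 575 and a floor "two standard
# errors of the mean below this score," i.e., 569.
recommended = 575
floor = 569
implied_sem = (recommended - floor) / 2
print(f"implied standard error: {implied_sem:.0f} points")  # 3 points

# Approximate pass rates at each cut score, per the report and the post.
pass_rates = {596: "about 33%", 575: "41.5%", 569: "46.4%"}
for cut, rate in sorted(pass_rates.items()):
    print(f"cut score {cut}: {rate} of candidates pass")
```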

The ABA has given Puerto Rican law schools an extra three years to try to comply. The lower cut score will make it easier to do so, although it remains unclear whether, even with this cut score, all schools will be able to meet the standard.

But it also shows how rarely the ABA actually enforces this standard, apart from continuing to give schools more time to demonstrate compliance. We’ll see what happens in the next three years.

Comment on the ABA's proposal to end admissions tests as a requirement for law school admission

Earlier, I blogged about the ABA’s proposal to end the admissions test (typically, the LSAT) as a requirement for law school admissions. I’ve submitted a comment on the proposal, which you can read in its entirety here. The comment recommends disclosure of four pieces of information if the ABA accepts the proposal: the number of matriculants who do not have a standardized test score; the percentage of students receiving—and the 75th, 50th, and 25th percentile amounts of—grants among students admitted without a standardized test score; total academic attrition among students who lack a standardized test score; and the first-time and ultimate bar exam passage rates for students without a standardized test score. The comment explains why each item would be a useful disclosure.

You can view other comments here.

California audit reveals significant underreporting and underenforcement of attorney discipline

The full report is here. The National Law Journal highlights a few things:

In a review of the agency’s disciplinary files, acting state auditor Michael Tilden’s office found one lawyer who was the subject of 165 complaints over seven years.

“Although the volume of complaints against the attorney has increased over time, the State Bar has imposed no discipline, and the attorney maintains an active license,” the report said.

In another instance, the bar closed 87 complaints against a lawyer over 20 years before finally recommending disbarment after the attorney was convicted of money laundering.

It’s a pretty remarkable story that highlights two things worth considering for future investigation.

First, when Professor Rob Anderson and I highlighted the relationship between bar exam scores and ultimate attorney discipline rates, we could only draw on publicly available discipline records. In a sense, what we observed was a “tip of the iceberg.” Now, this could cut a couple of different ways. On the one hand, the relationship might look even stronger, with attorney misconduct manifesting earlier, if we had complete access to the kind of complaints that the California bar has. On the other hand, it might also be the case (as we point out in the paper) that some attorneys are better at concealing (or defending) their misconduct than others, and that might be hidden in the data we have. It would be a separate, interesting question to investigate.

Second, it highlights the inherent error in comparing attorney discipline rates across states. California’s process is susceptible to unique pressures and complications, as all states’ systems are. You cannot infer much from one state to another (unless you are looking at relative changes in states over time as a comparative benchmark), an inference some have (wrongly) attempted.

It will be interesting to see what comes out of the reforms proposed in California and if the effort improves public protection.

What happens if the ABA ends the requirement that law schools have an admissions test? Maybe less than you think

In 2018, the American Bar Association’s Council on the Section of Legal Education and Admissions to the Bar considered a proposal to drop the requirement of an admissions test for law schools. I wrote about it at the time over at PrawfsBlawg (worth a read!). The proposal did not advance. Many of those points hold true, but I’ll look at how the new proposal differs and what might come of it. The proposal is still in its early stages. It’s possible, of course, that the proposal changes, or that it is never adopted (as the 2018 proposal wasn’t).

To start, many law schools currently admit a non-trivial number of students without the LSAT. Some of those are admitted with the GRE. A few are admitted with the GMAT. Several schools admit students directly from undergraduate programs with a requisite ACT or SAT score. The GRE has gained more acceptance as a valid and reliable test for law school admissions, although how USNWR uses it in calculating its rankings is not how ETS recommends using the GRE.

The 2018 proposal concluded, “Failure to include a valid and reliable admission test as a part of the admissions process creates a rebuttable presumption that a law school is not in compliance with Standard 501.” The 2022 proposal is even more generous: “A law school may use admission tests as part of sound admission practices and policies.” No rebuttable presumption against.

There are varying levels of concern that might arise, so I’ll start with the point I think matters most: inertia will keep many law schools using not just standardized tests but the LSAT in particular.

First, the most significant barrier preventing a “race to the bottom” in law school admissions: the bar exam. As it is, schools must demonstrate an ultimate bar passage rate of 75% within two years of graduation. That itself is a major check against dropping standards too low. Even there, many schools do not like an overly low first-time passage rate, and students take note of first-time bar passage rates, which have taken on increased importance in the USNWR rankings.

Now, some states have been actively considering alternative paths to attorney licensing. My hunch—and it’s only a hunch—is that this move by the ABA may actually reduce the likelihood that state bars will consider alternative pathways to attorney licensing beyond the bar exam, such as versions of “diploma privilege.” If state bars are concerned that law schools are increasingly likely to admit students without regard to ability, state bars may decide that the bar exam becomes more important as a point of entry into the profession.

Of course, this isn’t necessarily true. If schools can demonstrate to the ABA, and perhaps to the state bars, that they are admitting (and graduating) students with the ability to practice law, then that could elevate trust. But state bar licensing authorities appear to have long distrusted law schools. We’ll see if these efforts complicate proposals for bar exam reform, or simply encourage closer working relationships between (in-state) law schools and bar licensing authorities.

In short, unless schools come up with adequate alternatives on the admissions front to address bar passage at the back end, we’re unlikely to see a drastic change. And it might be that efforts in places like Oregon, which focus both on the law school side and on the consumer-facing public, will assuage any such concerns.

Second, a less obvious barrier is legal employment. That’s a tail-end problem for inability to pass the bar exam. But it’s also an independent concern among, say, large law firms or federal judges who want to choose graduates with the highest legal ability. There are proxies for that, law school GPA or journal service among them. But the “prestige” of an institution also turns in part on its selectivity, measured in part by credentials like high LSAT scores. If firms or judges are less confident that schools are admitting the highest caliber law students, they may begin to look elsewhere. This is a complicated and messy question (alumni loyalty, for instance, runs deep, and memories of institutional quality run long), but it may exert some pressure on law schools to preserve something mostly like the status quo.

Third, there’s a risk in how schools evaluate prospective students’ GPAs. For instance, it’s well known that many humanities majors applying to law school have disproportionately higher GPAs than their LSAT scores would suggest, and that hard sciences majors have disproportionately lower GPAs than their LSAT scores would suggest. The LSAT helps ferret out grade inflation and avoids grading biases across collegiate majors. It is not immediately clear that all admissions offices will grasp this point if the focus shifts more substantially to UGPA as the metric for admissions (which is a less accurate predictor of law school success than the LSAT, and less accurate still than the LSAT and UGPA combined).

Fourth, who benefits? At the outset, it’s worth noting that all schools will still indicate a willingness to accept the LSAT, and applicants interested in the broadest swath of schools are still going to take it. Additionally, it’s likely that schools will continue to seek to attract high-quality applicants with merit-based scholarships, and LSAT (or GRE) scores can demonstrate that quality.

One group of beneficiaries is, for lack of a better word, “special admittees.” Many law schools admit a select handful of students for, shall we say, political or donor reasons. These students likely do not come close to the schools’ usual LSAT standards and would have the benefit of avoiding the test altogether. (Think of the Varsity Blues scandal.)

A second group of beneficiaries is law schools with a large cohort of undergraduates at a parent university that allows for the channeling of students into the law school. Right now, schools are capped in how many students they can admit under such programs without an LSAT, relying instead on a UGPA and some ACT or SAT requirement. That cap would be lifted.

Relatedly, pipeline programs become all the more significant. If law schools can develop relationships with undergraduate institutions or programs that can identify students who will be successful in law school upon completion of the program, it might be that the law school will seek to “lock” these students into the law school admissions pool.

In other words, it could most redound to the benefit of law schools with good relationships with undergraduate institutions, both as a channeling mechanism and as a way of keeping those students from applying to other schools (which would require a standardized test). We may see a significant shift in programming efforts.

There are some who may contend that racial minorities and those from socio-economically disadvantaged backgrounds will benefit, as they tend to score lower on standardized tests and bear the brunt of the cost of law schools adhering to standardized testing. That may happen, but I’m somewhat skeptical, with a caveat of some optimism. The LSAT is a good predictor of bar exam success (and of course, a great predictor of law school grades, which are a great predictor of bar exam success), so absent significant bar exam changes, there will remain problems if schools drop standardized testing in favor of metrics less likely to predict success. That said, if schools look for better measures in pipeline programs, things that prospective students from underrepresented communities can do that will improve their law school success, then it very well could redound to the benefit of these applicant pools and potentially improve diversification of the legal profession. But that will occur through alternative efforts that are more likely to predict success, efforts which we’re beginning to see but are hardly widespread.

Finally, what about USNWR? Unless many schools change their practices, it seems unlikely that USNWR would drop the LSAT and GRE as a metric. Many schools, as noted, already enroll a cohort that enters without any of the standardized test scores measured in the rankings.

But we can see how the rankings have been adjusted for undergraduate schools:

A change for the 2022 edition -- if the combined percentage of the fall 2020 entering class submitting test scores was less than 50 percent of all new entrants, its combined SAT/ACT percentile distribution value used in the rankings was discounted by 15 percent. In previous editions, the threshold was 75 percent of new entrants. The change was made to reflect the growth of test-optional policies through the 2019 calendar year and the fact that the coronavirus impacted the fall 2020 admission process at many schools.

. . .

. . . U.S. News again ranks 'test blind' schools, for which data on SAT and ACT scores were not available, by assigning them a rankings value equal to the lowest test score in their rankings. These schools differ from ones with test-optional or test-flexible admissions for which SAT and ACT scores were available and were always rank eligible.

It’s possible, then, that alternative rankings weights would be added to account for schools with increasing cohorts lacking standardized test scores. But as long as it remains a factor, and the incentives remain, I imagine most law schools will continue to do everything in their power to maximize their medians for USNWR purposes.

*

In short, it’s quite possible that we’ll see a number of innovative developments from law schools on the horizon if the proposal goes through. That said, I think there are major barriers to dramatic change in the short term, with a concession that changes in other circumstances (including the bar exam, improved undergraduate or pipeline programs, and USNWR) could make this more significant in the future.

But I’d like to suggest two points of data collection that may be useful for examining the change. First, it would be useful if law schools, perhaps only those with more than 10% of their incoming class entering without standardized test scores, disclosed the attrition rates of those who had a standardized test score and those who did not. Second, it would be useful if they disclosed the first-time and ultimate bar passage rates of each cohort. I think this information would help demonstrate whether schools are maintaining high standards, both in admission and in graduation, regardless of the source of admission. But law schools already disclose an extraordinary amount of information, and perhaps these figures would just be quietly disclosed to the ABA during reaccreditation rather than in some public-facing capacity.

USNWR has erratically chosen whether "statewide bar passage" rate includes only ABA-approved law schools over the years

I was directed to the fact that the new USNWR bar exam metric includes “the weighted state average among ABA accredited schools' first-time test takers in the corresponding jurisdictions in 2020.” “ABA accredited” was added. Didn’t the first-time bar exam passage rate only include ABA accredited schools in the past?

Previous methodology looked at the modal state where a law school’s graduates took the bar exam, and the “jurisdiction's overall state bar passage rate for first-time test-takers in winter and summer” of that year.

I looked at the 2022 rankings (released in 2021, using the 2019 bar exam data). I picked California, known for its significant cohort of non-ABA test-takers. The overall first-time pass rate was 59%, but the first-time pass rate among ABA accredited schools was 69%. (Historical stats are here.) USNWR used the 59% rate.

That surprised me at first. I had assumed USNWR used only ABA accredited data. It also made me think that California schools would be harmed the most by this shift in metrics (even if I think it’s more accurate). That’s because California schools are less likely to “overperform” when the baseline pass rate is higher (e.g., a baseline computed from only ABA accredited test-takers instead of all test-takers).

But then I dug further.

The 2021 rankings (released in 2020, using 2018 bar exam data) reported California’s first-time bar pass rate as 60%. The ABA first-time rate was 60%. But the overall rate was 52%. So in this year, USNWR used only ABA accredited schools.

The 2020 rankings (released in 2019, using 2017 bar exam data) reported a first-time pass rate of 58%. That’s the same as the overall first-time pass rate of 58%, not the 66% from ABA accredited law schools. So in this year, USNWR used overall first-time pass rates. And it appears USNWR did the same in 2019 (released in 2018, using 2016 bar exam data).
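Laying the three editions side by side makes the pattern easy to verify. A small sketch using the California figures above:

```python
# California first-time pass rates by rankings edition, from the figures above.
# Tuples: (rate USNWR used, overall rate incl. non-ABA, ABA-accredited rate).
rates = {
    "2022 rankings (2019 exam)": (59, 59, 69),
    "2021 rankings (2018 exam)": (60, 52, 60),
    "2020 rankings (2017 exam)": (58, 58, 66),
}

for edition, (used, overall, aba) in rates.items():
    source = "overall" if used == overall else "ABA-only" if used == aba else "?"
    print(f"{edition}: USNWR used {used}%, matching the {source} rate")
```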

In short, there does not appear to be any consistent rationale for why USNWR has used one method or the other over the years. Certainly, this year it is expressly using only ABA data, and maybe it intends to stick with that going forward. But it’s another subtle change that could adversely affect schools (e.g., in California) with a significant cohort of non-ABA test-takers. It’s probably the right call. But it also highlights the inconsistency of USNWR’s methodology over the years.