MBE scores improve for July 2019 exam

The National Conference of Bar Examiners recently shared that mean Multistate Bar Exam scores improved for the July 2019 test—the biggest improvement in over a decade.

Coming off a 34-year low in MBE scores, this is welcome news. The number of first-time test-takers was roughly the same for the July 2018 and July 2019 exams, which is also good news—it likely means first-time pass rates will improve, at least modestly, in most jurisdictions. Still, the mean score of 141.5 is a far cry from the July 2008 administration’s 145.6. Schools continue to see higher-than-hoped failure rates because admissions and retention practices still haven’t kept pace with changes in student quality. Whether this July is a turning point for future years remains to be seen—pessimistically, the turn in July 2017 scores didn’t portend much good news for July 2018, but time will tell.

California's leak of bar exam topics should have little if any impact on likelihood of passing

The California State Bar made a big mistake with a lot of questions that remain to be answered. Fortunately, the mistake should have little, if any, impact on anyone’s likelihood of passing the July 2019 bar exam.

From the scant details we know so far, the Bar disclosed to a “number of deans of law schools” the topics that will be tested in the essay portion of the bar exam. That leak took place Thursday, July 25. Late Saturday, July 27, the Bar said it had only recently discovered this and emailed all test-takers the same information.

It’s not clear how the leak occurred. It’s also not clear who received the information or what they did with it. Law deans who received this information could have (1) not read the email; (2) read it, but kept its contents confidential; or (3) read it, and shared that information with some test-takers. (3) is the obvious problem. It’s also possible they forwarded the email along to other law school administrators with (1) or (2) in mind, but someone else did (3). (It’s also unclear how the error came to the attention of the Bar or whether any deans revealed this information to the Bar.)

But the obvious concern among test-takers is how this affects their chances of passing the bar. Fortunately, it won’t affect just about anyone’s odds, with a possible exception I’ll get to near the end.

The bar exam is equated and scaled. The essays are scaled to the multistate bar exam (“MBE”), the multiple choice component. I’ve written about what equating and scaling looks like. What it means very roughly is this: there is an opportunity to account for the difficulty or ease of the test itself, and for the higher or lower quality of test-takers, by looking at how those test-takers have done compared to other test-takers on previous administrations of the exam. It’s all anchored to the MBE.

Let me make up a few numbers for the hypotheticals. Suppose there are three bar exam test-takers who get a 150, 140, and 130 on the MBE. Now suppose on the essays they get a 151, 141, and 131. The bar associates the 151 with the 150, the 141 with the 140, and the 131 with the 130, to help scale the scores in recognition that maybe these essays were a bit easier than usual. If the essay scores were 149, 139, and 129, they’d scale them to account for the fact that the essays were a bit tougher than usual.

Suppose that the essays instead are 160, 158, and 156. High compression now—but no difference. They’re still scaled back to the MBE and turn into the equivalent of 150, 140, and 130. These numbers could be 8,000,000,000, 17, and -216: what matters is the relative rank of test-takers against one another, which is then scaled back to the MBE.
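To make that intuition concrete, here is a toy sketch in Python using the made-up numbers above. It simply matches essay scores to MBE scores by rank; the Bar’s actual equating and scaling works with full score distributions and is far more sophisticated, so treat this only as an illustration that relative rank is what survives the transformation.

```python
def scale_to_mbe_by_rank(essay_scores, mbe_scores):
    """Toy rank-based rescaling: the highest essay score maps to the
    highest MBE score, the second-highest to the second-highest, etc.
    A simplified stand-in for the Bar's equating/scaling procedure."""
    ranked_mbe = sorted(mbe_scores, reverse=True)
    # Indices of essay scores, ordered from highest to lowest score.
    order = sorted(range(len(essay_scores)),
                   key=lambda i: essay_scores[i], reverse=True)
    scaled = [None] * len(essay_scores)
    for rank, i in enumerate(order):
        scaled[i] = ranked_mbe[rank]
    return scaled

mbe = [150, 140, 130]
print(scale_to_mbe_by_rank([151, 141, 131], mbe))            # [150, 140, 130]
print(scale_to_mbe_by_rank([160, 158, 156], mbe))            # [150, 140, 130]
print(scale_to_mbe_by_rank([8_000_000_000, 17, -216], mbe))  # [150, 140, 130]
```

Ordinary, compressed, or absurd essay numbers all come back as 150, 140, and 130—only the ordering of test-takers matters.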

Revealing the topics of the essays gives everyone two advantages. The first is common (assuming everyone opens the email more than a few hours before the essays on Tuesday, July 30). You know that Question 1 is about Civil Procedure. Everyone knows that and can start from the word go addressing Civil Procedure and only Civil Procedure on Question 1. So everyone’s scores may go up a bit.

I say “may,” because what this really does is help the students with the very lowest scores who may have missed the fact that it was a Civil Procedure question in the first place. I think most test-takers know what the call of the question is about. The very lowest scoring test-takers may not. So it may help their scores. But, again, if the scores shift from, say, 151, 141, and 91 to 152, 142, and 97 (disproportionately helping the very lowest scores), it won’t really matter—they’ll still be scaled back to the MBE.

The second advantage is that students can stop studying other, irrelevant topics like Wills & Trusts and focus exclusively on these areas. Again, it should have the same effect: everyone studies the same set of topics, everyone knows them a bit better, everyone scores a bit better. It probably helps the more marginal test-takers over the highest scoring test-takers, but, again, scaling accounts for that if all the boats are rising.

The problem may arise in a narrow set of circumstances. First, suppose the leak did make its way to some (likely very small set of) test-takers, and say it did so very promptly—perhaps as early as 120 hours before the bar exam. Those test-takers immediately stop preparing for irrelevant essays and focus only on relevant essay topics (and the MBE).

Other students only get the information when the Bar emails them, about 58 hours before the essays.

So we have to assume that the three years of legal education, the two months of bar exam preparation, and so on fall by the wayside, and the definitive change in knowledge and understanding occurs in a 62-hour head start on eliminating some irrelevant subjects of study. And in those 62 hours, the students effectively gain additional knowledge to help them on the essays that their peers did not have.

All possible (but, I think, pretty marginal). But it also has to have another effect. It doesn’t really matter if you were a test-taker who was going to get a 157 but for the head start and now gets a 158; it doesn’t really matter if you were a test-taker who was going to get a 92 but for the head start and now gets a 93. In both cases, you’d pass or fail regardless, and it’d have no impact on anyone around you.

Instead, assuming this did confer an advantage (again, possible, but pretty marginal), let’s look at the test-taker on the outside. That test-taker got a 143.9, the closest score to passing, but failed. She then looks at the test-taker who got a 144.0 and passed, or maybe a couple who did so (or even those who got much higher or lower scores)—and these are test-takers who got a 62-hour head start and eked out a higher score because of that advantage. And they specifically leap-frogged the 143.9—that is, but for the 62-hour head start, their essay scores would have been lower than the 143.9 test-taker’s, but they received higher scores, which were then scaled higher. Only those students whose essay scores improved at a higher rate than others’, in a way that affected those at or near the cut score, would experience any material change. (Of course, part of this is context: rather than thinking of a 62-hour head start, it might be useful to say that the person spent only ~1300 hours instead of ~1362 hours thinking about irrelevant topics, assuming two months’ preparation for the bar exam….)

Realize that a lot of things have to happen for this scenario to occur. It’s narrow, sure. But it’s certainly possible. It’s a reason I say “little, if any” impact—this is the scenario where it could change, but it requires a lot to align.

Test-takers are understandably worried and anxious about any change in information about the bar exam, or even any potential inequities. Law deans will no doubt use this to highlight the incompetence of the Bar and call for an overhaul, perhaps a temporary reduction of the cut score to account for these inadequacies. (What the right solution is, I don’t know. Whether scoring changes come remains to be seen.)

But for the vast majority of law students, the leak should not affect them at all. I hope this gives a small comfort to the thousands of test-takers ahead of Tuesday’s test—you have worked very hard for many long hours, and it is that that will overwhelmingly determine your performance. Hang in there.

I have been lightly updating this post for clarity.

UPDATE, 7-28: Joan Howarth and Rob Anderson have helpfully pointed out that scores might actually increase. By eliminating some subjects, test-takers can now focus on the MBE all the more, too. If that increases MBE scores on Wednesday, then it lifts essay scores that are scaled to those MBE scores, and the overall pass rate could rise. Again, it may be marginal in these last few hours, but it remains possible.

UPDATE, 7-29: I’ve received some thoughtful emails about impacts that the disclosure may have that I didn’t consider (or didn’t adequately consider) in this post—the mental impact on subgroups, the effect on accommodated test-takers or non-native English speakers, the number of test-takers who have digitally “shut down” and may not learn of this information, and so on. It’s true that all of these could have an impact in a way that wouldn’t be reflected in equating and scaling. And these are things very difficult to identify after the fact. Of course, a death in the family or a virus contracted shortly before the exam could have a similar impact—except, of course, (1) this affects all test-takers, and (2) it’s caused by an error from the Bar. I appreciate people carefully thinking through the potential effects.

Do state bar licensing authorities distrust law schools?

It’s late July, so it’s time for another round of op-eds and blog posts about the bar exam—it doesn’t test the things that are required of legal practice, the cut score is unjustifiably high, it’s a costly and burdensome process for law students, etc.

Granted, these arguments may have varying degrees of truth, but, as any reader of this blog no doubt knows, I am pretty skeptical of these claims—and I say that as one who, as a law professor, in my own self-interest, would subjectively like to see an easier bar exam for law school graduates. But graduates have had persistently low scores for coming up on half a decade, mostly attributable to the decline in admissions practices at many law schools. And I think we too quickly conflate a lot of arguments about the bar exam.

But I’ve long had an uncomfortable thought about the bar exam as I’ve read the claims of legal educators (often law school deans) over the last several years. Law schools complain that their students have invested three years of their lives, plus tuition, plus the effort to pass the bar exam, and many fail—only, of course, to retake at still more invested time and cost before ultimately passing (or maybe never passing). Isn’t it unfair to these graduates?

Maybe, of course, depending on the “right” cut score in a jurisdiction. But… what about the opposite perspective? That is, are law schools graduating students who are not qualified to engage in the practice of law?

That’s a very cold question to ask. The ABA’s (new, slightly higher) standard for accrediting law schools is that at least 75% of its graduates should pass the bar exam within two years—it’s long had an outcome-oriented element to accrediting law schools. So the ABA admits that law schools can graduate a significant cohort who are never able to pass the bar.

Now, getting a 100% first-time bar passage rate is pretty challenging—there are usually at least a couple of students at even the most elite law schools, in even the biggest boom times of legal education, who’d fail the bar exam on the first attempt, for lack of effort or personal circumstances even if not for lack of ability.

But nevertheless, why do state bar licensing authorities—which also have a role in the accreditation of schools in the state (even if they mostly outsource it to the ABA)—require graduates of in-state law schools to take the bar exam? Does it reflect a distrust of those in-state law schools?

There’s only one state now with “diploma privilege,” Wisconsin. That is, graduates of law schools at the University of Wisconsin or Marquette University are automatically admitted to the bar. Many more states had diploma privilege several decades ago, but those have gradually been replaced until just Wisconsin remains.

Some complain about Wisconsin’s diploma privilege in the vein of, “Does it seem like Wisconsin’s law schools are really teaching sufficiently Wisconsin-centric law to preclude the need to take the bar exam?” But I think that mistakes what may be a driving force in these discussions (and the barrier that’s arisen in jurisdictions considering reinstating diploma privilege).

In short, the bar exam is essentially a licensing authority’s way of verifying that the law schools are graduating qualified practitioners of law. Yes, the bar exam may be an imperfect way of doing it. But given that the bar exam highly correlates with law school grade point average, one can’t say it’s particularly irrelevant (unless law professors make the same claim about law school grades!).

Now imagine you’re the bar licensing authority in Wisconsin. You look at what’s happening at Wisconsin and at Marquette. And you’re satisfied—these two schools admit a good batch of students each year; their academic dismissal and transfer acceptance rules are sound; they graduate qualified students each year. Yes, maybe a few would fail the bar exam in Wisconsin each year—but we know there can be some randomness, or some cost of retaking for candidates who’ll ultimately pass, and the like. But the licensing authority trusts the law schools in the state. The law schools are consistently graduating students who, on the whole, are capable of practicing law in the state.

That’s a really good relationship between the state bar licensing authority and the law schools in the state, no?

So… what does that tell us about the other 49 states and the District of Columbia? (Although Alaska doesn’t have a law school….)

It may tell us that state bar licensing authorities do not have the same faith in these in-state law schools. That is, they believe law schools are not consistently graduating students capable of practicing law in the state. And that’s a cold truth for law schools to consider.

Of course, state bar licensing authorities may also have idiosyncratic reasons for preserving the bar exam (e.g., “We took the bar, so kids these days have to take the bar!”). And it might also be the case that many law schools or bar licensing authorities haven’t seriously considered trying to reinstate diploma privilege.

But I wonder about three persuasive reasons—which should cover the ideological spectrum!—for law schools in a few jurisdictions to consider pressing for diploma privilege. I look at the upper Midwest, the Great Plains, and northern New England in particular.

First, it encourages greater diversity in the legal profession. These arguments are consistently raised in California among other places—law schools are simply more diverse than the legal profession as a whole (due largely in recent years to changes in demographics), and reducing a barrier to the bar would immediately lift the diversity of the legal profession. (It would also encourage increased residence in state of those graduates, as the third point below indicates.)

Second, it reduces occupational licensing burdens. We’ve seen a small revolution in states from Arizona to Pennsylvania to reduce occupational licensing burdens, from shrinking the kinds of positions that need licensing to allowing interstate recognition of occupational licenses. Reducing the burdens of occupational licensing would be consistent with that trend—even if it’s for a long-regulated profession like law.

Third, in these jurisdictions I named, states can offer a competitive advantage against other states where demographics favor more rapid population growth. Declining birth rates, aging populations, migration patterns, whatever it may be—there is simply less growth in the upper Midwest, Great Plains, and northern New England than other areas of the country. By offering in-state graduates the guarantee of bar admission, there is a greater incentive for these younger attorneys to stay in the state and practice locally rather than migrate elsewhere.

I also mention these jurisdictions because many have just one or two law schools, similar to Wisconsin, making it relatively easy for the schools to act together (or as one institution!) to meet the standards that would satisfy the state bar licensing authority.

The tradeoff for law schools? All the law schools in the state have to admit and graduate students who consistently appear able to pass the bar exam and practice law—a particularly high first-time pass rate and a near-100% ultimate pass rate.

As law schools for a few years have reduced admissions standards to preserve revenue, this is a particularly challenging prospect. State bar licensing authorities often appear increasingly distrustful of law school behavior, just as law schools often appear increasingly distrustful of state bar licensing authority behavior.

But developing a local community of trust between the state bar and in-state law schools could redound to significant benefits for all parties in short order. Whether that claim can be made persuasively, and whether law schools could alter their behavior in the short term for a potential long-term improvement of both their graduates’ positions and their state bar’s position, remains to be seen.

Assessing the effect of the ABA's new ultimate bar passage requirement

The ABA, after years of wrestling with the idea, finally approved a requirement that “at least 75% of a law school’s graduates who sat for a bar exam must pass within two years of graduation.” Here’s a Q&A on some of the likely effects—at least, answering questions I’ve thought about for the last few years!

How many law schools could face accreditation risks?

There are several ways of looking at this question. You can look at all of the law schools’ ultimate bar passage rates for 2015 and 2016, but the rule only formally takes effect for the Class of 2017 (that is, bar passage attempts through 2019). We can look to past law school activity, which gives us a good starting place. But we can also be skeptical of these lists for several reasons—we should anticipate law school behavior will change, and so on.
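Before turning to particular schools, it may help to pin down the arithmetic of the standard itself. Here is a minimal sketch in Python—the function name and the cohort numbers are hypothetical—showing how an ultimate bar passage rate would be computed for a graduating class and checked against the 75% threshold.

```python
def ultimate_pass_rate(graduates_who_sat, passed_within_two_years):
    """Share of a class's graduates who sat for a bar exam and who
    passed within two years of graduation (graduates who never sat
    for any bar exam are excluded from the denominator)."""
    return passed_within_two_years / graduates_who_sat

# Hypothetical Class of 2017 cohort: 200 graduates sat, 158 passed by 2019.
rate = ultimate_pass_rate(200, 158)
print(f"{rate:.1%}")                         # 79.0%
print("Meets ABA standard:", rate >= 0.75)   # True
```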

Let’s start with the schools likely in the most dire shape: 7 of them. While the proposal may well impact far more, I decided to look at schools that failed to meet the standard in both 2015 and 2016; and I excluded schools that were already closing, schools in Puerto Rico (we could see Puerto Rico move from 3 schools to 1 school, or perhaps 0 schools, in short order), and schools that appeared on a list due to data reporting errors. Finally, I removed South Dakota, which saw its bar passage rate drop when the bar exam cut score was raised, but that cut score has since been lowered and the school appears to be in good shape.

| School | 2012 1L Class | 2018 1L Class | Delta | 2018 Attrition | Bar | Cut Score |
| --- | --- | --- | --- | --- | --- | --- |
| Atlanta's John Marshall | 181 | 108 | -40.3% | 9.0% | GA | 135 |
| Barry | 293 | 255 | -13.0% | 3.0% | FL | 136 |
| UDC | 125 | 64 | -48.8% | 2.8% | DC | 133 |
| Florida Coastal | 580 | 60 | -89.7% | 3.3% | FL | 136 |
| Golden Gate | 227 | 237 | +4.4% | 3.1% | CA | 144 |
| New England | 450 | 185 | -58.9% | 0.8% | MA | 135 |
| Cooley | 897 | 541 | -39.7% | 2.3% | MI/FL | 135/136 |

These schools represent just about 3% of law schools and just over 3% of 1Ls in 2018.

Undoubtedly, other law schools at or near the cutoff are probably going to be watching their admissions, retention, and bar preparation more closely, but these are, I think, the ones most likely to face a direct effect.

Will law schools institute more selective admissions procedures?

Perhaps. For the most at-risk law schools, however, it’s not clear they can be much more selective absent significant financial investment (which they may lack). The alternative is for the most at-risk schools to shrink their class sizes. But some (not all) have had dramatic cuts already, as seen above. If schools can sustain bigger cuts, they may do so—but it’s not clear how sustainable that is.

For schools not directly affected but facing the heat of the new standard, they may have to begin reconsidering admissions strategies that value chasing USNWR rankings over selecting a higher quality incoming class.

Will law schools increase the number of academic dismissals?

It’s possible. From the chart above, most of these schools have fairly low dismissal rates. There’s room for higher non-transfer (academic + “other”) attrition. But ABA Standard 501(b) requires that a “law school shall only admit applicants who appear capable of satisfactorily completing its program of legal education and being admitted to the bar,” and Interpretation 501-3 provides, “A law school having a cumulative non-transfer attrition rate above 20 percent for a class creates a rebuttable presumption that the law school is not in compliance with the Standard.” So schools can increase dismissals, but not too much.

Will this proposal disproportionately affect schools in California, HBCUs, or for-profit schools?

Despite the fact that California has one of the highest cut scores at 144, only one school failed to meet the standard in both 2015 and 2016 (while another, not listed, is closing). California law school graduates typically score much higher on the bar exam than test-takers nationwide. A 75% pass rate within two years of graduation is therefore fairly attainable, even as first-time bar pass rates remain low. But even in California, the overall first-time pass rate among graduates of California’s ABA-accredited law schools in July 2018 was 64%, meaning many schools exceed 75% on the first attempt, and many more quickly cross 75% on students’ second attempt. That said, several California law schools failed to meet the standard in at least one of 2015 or 2016.

Only one HBCU law school is on the list. (Another missed the cutoff in 1 of 2 years.) Two for-profit law schools are on the list (others have closed recently as their numbers dwindle).

Perhaps unsurprisingly, most of the at-risk schools are in jurisdictions with relatively higher cut scores (135 and up). (The median bar exam cut score across jurisdictions is around 133-135.)

Will state bars lower their cut scores in response?

It’s possible. Several state bars (like South Dakota as mentioned above) have lowered their cut scores in recent years when bar passage rates dropped. If states like California and Florida look at the risk of losing accredited law schools under the new proposal, they may lower their cut scores, as I suggested back in 2016. If the state bar views it as important to protect their in-state law schools, they may choose the tradeoff of lowering cut scores (or they may add it to their calculus about what the score should be). Of course, lowering cut scores may have downsides, too, but that’s another matter….

Could schools encourage their graduates to take an “easier” bar or skip the bar exam altogether?

It’s possible. But discouraging students from taking the bar exam strikes me as an unrealistic proposition—there’s little incentive for a JD not to at least try, and the law school has few mechanisms except maybe pleading with students not to take the bar.

Taking an “easier” bar is a likelier proposition, but, again, if students are dead set on taking a “hard” bar, there is little a school can do—a student who wants to practice in California, not Alabama, may simply be unpersuadable. The rise of the Uniform Bar Exam, however, makes this a much more promising possibility for some. A school worried about a graduate passing the Oregon bar (cut score 137) or Colorado bar (138) could encourage the graduate to sit for the North Dakota bar (130)—all are UBE jurisdictions, after all. If the student passes the ND bar, great! If they pass and get a high enough score to waive into OR or CO, all the better! The only downside is convincing the student to go sit in ND for the bar exam if they don’t want to, and potentially to pay for two state bar admissions if they pass, but schools might find modest funds to offset those costs.
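To illustrate the portability logic, here is a small hypothetical sketch using the cut scores mentioned above (expressed, as in this post, on the 200-point MBE-style scale; the dictionary and function are mine, for illustration only): a single UBE score can simply be compared against each jurisdiction’s cut.

```python
# Cut scores from the discussion above, on the 200-point scale used in this post.
cut_scores = {"ND": 130, "OR": 137, "CO": 138}

def jurisdictions_satisfied(ube_score):
    """List the jurisdictions whose cut score this one UBE score meets."""
    return [state for state, cut in cut_scores.items() if ube_score >= cut]

print(jurisdictions_satisfied(133))  # ['ND'] -- passes ND, but can't waive into OR or CO
print(jurisdictions_satisfied(139))  # ['ND', 'OR', 'CO'] -- one sitting, three doors open
```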

Additionally, schools might find additional resources to subsidize students who fail the bar to retake it. Taking the bar is an expensive proposition, and students may be discouraged after a failure (or two, or three) from retaking it. To prevent those students from dropping off, schools might increasingly subsidize repeat efforts. That’s good for graduates, if it happens.

Will law schools invest in bar prep courses or change their curriculum?

Assuredly yes. But that’s not the right question [ed.: who’s writing these questions!]. Instead, will those actually help any students? The answer, in all likelihood, is no.

First, schools likely have been implementing bar passage improvement programs for several years, given that bar passage rates have been in decline for several years. But the sad evidence is that, so far, they don’t appear to be improving bar passage results. Worse, a recent California bar study specifically examining programs at several law schools found no relationship between bar prep programs at law schools and bar passage results.

Schools might be tempted to tweak their curriculum—require more bar-related courses or expand coverage of content in the first year—but that, too, seems unhelpful. There’s no evidence that performance in a given substantive law school course relates to performance on that topic on the bar exam.

Undoubtedly, the response for many law schools will be, “Don’t just stand there, do something!” But it remains highly contested, in my view, about whether the “do something” will lead to improvement.

All in all, is the new standard a good thing?

Well, maybe? (A great answer of an academic, I know.) Tightening admissions and increasing academic dismissals certainly improve the likelihood that graduates ultimately pass the bar exam, which puts them in compliance with the standard. But it is only a likelihood—schools may not take risks on certain bands of students who might ultimately succeed even if their predictors don’t show it. Then again, if massive debt loads, an uncertain job market for marginal law school graduates, and still a high risk of failure are put into the equation, maybe we want more risk-averse decisionmaking at law schools.

That said, I continue to wonder why the ABA is accrediting law schools as it increasingly obsesses over bar passage rates. Barry Currier has written in defense of requiring a bar exam and of including a bar passage standard in the ABA’s law school accreditation standards. But it’s not clear to me why bar passage is tied in most jurisdictions to attending an ABA-accredited school. And it strikes me that if the ABA is insisting that good law schools are (among other things) the ones where most of the graduates pass the bar exam, it’s not clear that ABA accreditation adds much value beyond telling us what the bar exam is already telling us.

What’s the bottom line here?

Oh, I digress. In short, I think a few law schools will face intense pressure in the short term, and a few may close. Many others will consider structural changes in admissions and retention practices (which should improve rates), and curricular and bar prep changes (which likely won’t), to the extent those schools can afford to do so. But I don’t expect anything too dire. While it’s safe to say that 30 or so law schools have something to worry about, a much smaller number face existential threats.

February 2019 MBE bar scores bounce back from all-time lows

After cratering to all-time record lows last year, scores on the February administration of the Multistate Bar Exam have bounced back. It’s good news, but modest—the rise returns scores to their February 2017 level, which was, at the time, the lowest in history. Scores have now bounced back to match the second-lowest total in history… which is slightly better.

To be fair (which is not to say I’ve been unfair!), part of this overall score is likely driven by the Uniform Bar Exam. It used to be that there were more test-takers who’d passed a previous bar exam and would have to take another test in another jurisdiction. Those who’d already passed were likely to score quite well on a second attempt at a new bar. But the National Conference of Bar Examiners has indicated that the rise of the UBE has dropped the number of people taking a second bar, which in turn drops the number of high scorers, which in turn drops MBE scores. So the drop in the MBE scores itself isn’t entirely a cause for alarm. It’s a reflection that the UBE is reducing the number of (typically high-scoring) repeat test-takers by some small figure each year.

We now know the mean scaled national February MBE score was 134.0, up 1.2 points from last year's 132.8. We would expect bar exam passing rates to rise in most jurisdictions. Just as repeaters caused most of the drop last time, they are causing most of the rise this time. Repeaters’ scores simply appear to be more volatile as a cohort of test-takers.

A couple of visualizations are below, long-term and short-term trends.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133. The trend is more pronounced when looking at a more recent window of scores.

The first major drop in bar exam scores was revealed to law schools in late fall 2014. That means the 2014-2015 applicant cycle, to the extent schools took heed of the warning, was a time for them to improve the quality of their incoming classes, leading to some expected improvement for the class graduating in May of 2018. But bar pass rates were historically low in July 2018. It’s not clear that law schools have properly adapted even after five years.

Until then, we wait and see for the July 2019 exam. For more, see Karen Sloan over at NLJ.

Law school ruin porn hits USA Today

I actually laughed out loud when I started reading this “yearlong investigation” by four USA Today journalists on the state of legal education. I call the genre, “law school ruin porn.”

“Ruin porn” has long been a genre of photojournalism to display the decay of urban centers or Olympic sites. And I think the genre works for “law school ruins,” or exploiting details about the most marginal law schools and the most at-risk students, then treating them as typical of the profession.

Here’s how the piece opens:

Sam Goldstein graduated from law school in 2013, eager to embark on a legal career.

Five years later, he is still waiting. After eight attempts, Goldstein has not passed the bar exam, a requirement to become a practicing attorney in most states.

"I did not feel I was really prepared at all" to pass the bar, Goldstein  said of his three years in law school. "Even the best of test preps can't really help you unless you've had that solid foundation in law school."

In the meantime, many take lower-paying jobs, as Goldstein did, working  as a law clerk. What he earned didn't put a dent in his $285,000  in student-loan debt, most of which was accrued in law school.   

The piece is reminiscent of a genre of journalism that peaked in 2011 in a series of pieces by David Segal in the New York Times. Here’s how one of them opened:

If there is ever a class in how to remain calm while trapped beneath $250,000 in loans, Michael Wallerstein ought to teach it.

Here he is, sitting one afternoon at a restaurant on the Upper East Side of Manhattan, a tall, sandy-haired, 27-year-old radiating a kind of surfer-dude serenity. His secret, if that’s the right word, is to pretty much ignore all the calls and letters that he receives every day from the dozen or so creditors now hounding him for cash.

“And I don’t open the e-mail alerts with my credit score,” he adds. “I can’t look at my credit score any more.”

Mr. Wallerstein, who can’t afford to pay down interest and thus watches the outstanding loan balance grow, is in roughly the same financial hell as people who bought more home than they could afford during the real estate boom. But creditors can’t foreclose on him because he didn’t spend the money on a house.

He spent it on a law degree. And from every angle, this now looks like a catastrophic investment.

Well, every angle except one: the view from law schools.

The fundamental problem with a piece like this one in USA Today is how it treats the outlier as the norm. The vast majority of law students do pass the bar exam on the first attempt. The vast majority of law schools are at no risk of failing to meet the ABA’s standards. But the piece is framed in quite a different fashion.

A student like the one USA Today found is nearly impossible to find. For instance, I blogged earlier about a look at how 2293 first-time test-takers did on the Texas bar exam. Only 10 failed the bar exam as many as four times. Granted, that excludes about another 150 who failed one, two, or three attempts and stopped attempting (at least, stopped attempting in Texas). But it’s nearly impossible to find graduates who have had such poor performance, bad luck, or some combination for such an extended period of time.

USA Today also profiled a graduate of Arizona Summit Law School, the outlier for-profit law school—I’ve blogged about how, before 1995, the ABA would never accredit for-profit law schools, until the Department of Justice compelled it to do so. (More on Arizona Summit in a bit.)

The ostensible focus of the piece is the ABA’s renewed proposal to require law schools to demonstrate an “ultimate” bar passage rate of 75% within two years of graduation. The result appears dire: “At 18 U.S. law schools, more than a quarter of students did not pass the bar exam within two years,” according to Class of 2015 data.

Of course, George W. Bush would have lost the 2000 presidential election if the National Popular Vote plan were in place. Or, less snarkily, if the rules change, we should expect schools—and perhaps state bars—to change how they behave. If 75% were the cut off, we would expect not just changes in admissions standards, but changes in bar exam cut scores, changes in where students are encouraged to take the bar exam, increased academic dismissal rates, and so on—in short, the 18 from the Class of 2015 doesn’t tell us much.

That said, there are two other reasons the 18 figure doesn’t tell us much. First, and this makes me more “doom and gloom,” it’s too conservative a figure to show the schools that may face a problem in the near future. Any school near an 80% ultimate pass rate, I think, would feel the heat of this proposal—a bad year, a few frustrated students who stop repeating, a weak incoming class, and so on could move a school’s figures a few percentage points and put them in danger. Another 12-15 law schools are within a zone of danger of the new ABA proposal.

Second, the 18 is not nearly as dire as the USA Today piece makes it seem. Two of them are schools in Puerto Rico, which are so different in kind from the rest of the ABA-accredited law schools in the United States that they are essentially two entirely different markets.

At the very end of the piece, it finally conceded something about Arizona Summit: “Arizona Summit Law School in Phoenix, Whittier Law School in Southern California and Valparaiso Law School in northern Indiana are not  accepting new students and will shut once students finish their degrees.” Even without the ABA proposal, 3 of the 18 schools are shutting down—including Arizona Summit, the foil of the opening of the piece. So now the student is not simply an outlier, an 8-time bar test taker from a for-profit school, but from a for-profit school that is no longer in operation. An outlier of an outlier of an outlier—given treatment as something typical. Talk about burying the lede.

And while the data comes from the ABA, I have to wonder whether, because this is the first data disclosure from law schools, some of it is not entirely helpful. (Again, one would think a yearlong investigation would clear up these points.) Take Syracuse, listed with an ultimate pass rate of 71%. Its first-time July 2015 bar pass rate was 79%. (Its July 2016 rate rose to 89%.) Its combined February & July 2015 pass rates were 86%, along with 75% in New Jersey. (Its California rate for those two tests was 1-for-13.) Now, perhaps it has an unusually high number of individuals failing out of state, or who didn’t take the July 2015 bar the first time and ultimately failed—I have no idea. But it’s the kind of outlier statistic that, to me, merits an inquiry rather than simply a report of figures. (UPDATE: Syracuse has indicated that the figures were, in fact, inaccurate, and that its ultimate bar passage rate was 82.6%.)

The piece also unhelpfully quotes, without critique, some conclusions from “Law School Transparency.” (You may recall that several years ago LST tried to shake down law schools by charging them at least $2750 a year to “certify” that those schools met LST’s disclosure standards.) For instance, “The number of law schools admitting at least 25% of students considered ‘at risk’ of failing the bar jumped from 30 schools to 74 schools from 2010 to 2014, according to a report in 2015 by Law School Transparency.” Of course, if one cares about ultimate pass rates, which this article purports to care about, then how is it that just 18 schools missed the “ultimate” pass rate compared to LST’s projected 74 (for 2014, though things weren’t exactly better by 2015)? In part, it’s because LST’s “at risk” definition is overly broad—because it doesn’t include academic dismissals (despite mentioning them in the report), because it doesn’t account for variances in state bars (mentioned in the report, but not included in identifying “at risk”), because it’s not clear whether LST is primarily concerned with first-time or ultimate passage (the report jumps around), because LST adds a level of risk (which USA Today mistakenly reports) for being “at risk” of not graduating in addition to “at risk” of not passing the bar (which, I think, is an entirely valid thing to include), and so on.

A lengthy investigative piece should, in theory, provide greater opportunity for subtlety and fine-tuning points, rather than list a bunch of at-risk schools and serially identify problems with as many of them as possible. That isn’t to say that there aren’t some existential problems at a handful of law schools in the United States, or that the ABA’s proposal isn’t worthy of some serious consideration. It’s simply that this form of journalism is a relic of 2011, and I hope we see the return of more nuanced and complicated analyses to come.

Do specific substantive courses prepare students for those topics on the bar exam? Probably not

Earlier, I blogged about the disconcerting conclusion from recent bar performance and the results of a California State Bar study: law school “bar prep programs” appear to have no impact on students’ ability to pass the bar exam.

But what about specific substantive course areas? Does a student’s performance in, say, Torts translate into a stronger bar exam score?

The answer? Probably not.

First, let me clear a little underbrush about what claim I’d like to examine. We all know that students take some subjects that appear on the bar, but most don’t take all of them. Virtually all law school graduates take a specific bar preparation course offered by a for-profit company to help train them for the bar exam.

But law schools might think that they could improve bar passage rates by focusing not simply on “bar prep,” but on the substantive courses that will be tested on the bar exam. If bar passage rates are dropping, then curricular reform that requires students to take more Evidence, Torts, or Property might be a perceived solution.

So what exactly is the relationship between substantive course area performance and the bar exam? Not much.

Back in the 1970s, LSAC commissioned a study looking at law schools in several states and their performance on the bar exam. The then-new Multistate Bar Exam had five subjects. Researchers looked at how law students performed in each of those substantive subject areas in law school: Contracts, Criminal Law, Evidence, Property, Torts. (The results of the study are found at Alfred B. Carlson & Charles E. Werts, Relationships Among Law School Predictors, Law School Performance, and Bar Examination Results, Sep. 1976, LSAC-76-1.)

The LSAC study examined first-year subject-area grades; first-, second-, and third-year grades; and overall law school GPA, and their correlations with MBE subject areas. The higher the number, the closer the relationship.

Torts is an illustrative example. The relationship between TORT/L (grades in Torts) and students’ performance on the Torts portion of the MBE is 0.19, a relatively weak correlation. But grades in Torts were more predictive of performance in Real Property, Evidence, Criminal Law, and Contracts—perhaps a counterintuitive finding. That is, your Torts grade told you more about your performance on the Property portion of the bar exam than on the Torts portion.

Again, these correlations are relatively weak, so one shouldn’t draw much from noise in a range like 0.19 to 0.26.

In contrast, LGPA/L (overall law school GPA) was more highly correlated with each MBE subject area than any individual course grade was, and it was highly correlated (0.55) with total MBE performance. Recall that overall law school GPA includes a number of courses—bar-related and not—and yet it’s more predictive than any particular substantive course area.

The LSAC study dug into further findings to conclude that the bar exam is testing “general legal knowledge,” and that performance in any particular subject area is not particularly indicative of strength of performance on that subject area on the bar exam.
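For readers curious what this kind of analysis looks like mechanically, here is a minimal sketch with entirely invented grades and MBE subscores (none of these numbers come from the LSAC study); it simply computes the Pearson correlations the study reports, for a single course grade and for overall GPA against an MBE subject score.

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-student records (all numbers invented for illustration).
torts_grade = [3.1, 2.8, 3.6, 2.5, 3.3, 2.9, 3.8, 2.7]   # grade in Torts
law_gpa     = [3.2, 2.9, 3.5, 2.6, 3.4, 3.0, 3.7, 2.8]   # overall law school GPA
mbe_torts   = [141, 133, 150, 128, 146, 135, 155, 131]   # scaled MBE Torts subscore

print(pearson(torts_grade, mbe_torts))  # course grade vs. MBE subject
print(pearson(law_gpa, mbe_torts))      # overall GPA vs. MBE subject
```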

The short of it is, this is good evidence that the important thing coming out of three years of law school is not the substantive transmission of knowledge, but the, for lack of a better phrase, ability to “think like a lawyer” (or simply engage in critical legal analysis). Bar prep courses the summer before the bar exam are likely the better place to cram the substantive knowledge for the bar; but the broad base of legal education is what’s being tested (perhaps imperfectly!) on the bar exam.

We also have the results of a recent study by the California State Bar. The study looked at student performance in particular course areas and the relationship with bar exam scores. After examining the results of thousands of students and bar results from 2013, 2016, and 2017, the findings are almost identical.

The correlations between performance in any one subject and that subject on the bar exam are modest, and sometimes grades are (slightly) more highly correlated with different subject areas—the same findings as LSAC’s 1976 study. But none of them is nearly as strong as overall law school GPA, which the study finds correlates between .6 and .7 with the overall MBE and written components. (Unfortunately, this study didn’t break out the relationship between law school GPA and particular MBE topic areas.)

The study did, however, make an interesting finding and reached what I think is an incorrect possible conclusion.

The study discovered that cumulative GPA in California bar exam-related subject areas (listed above) was significantly correlated with cumulative GPA in non-California bar exam-related subject areas.

It went on to find no relationship (in some smaller sets of data) between bar passage rates and participation in clinical programs; externships; internships; bar preparation courses; and “Non-Bar Related Specialty Course Units” (e.g., Intellectual Property).

Here’s the finding I’d take issue with: “However, overall CBX [California bar exam] performance correlated more strongly statistically with aggregate performance in all of the bar-related courses than with aggregate performance in all non-bar-related courses, suggesting that there may be some type of cumulative effect operating.”

I’m not sure that’s the right conclusion to draw. I think the report understates the likelihood that grade inflation in seminar courses; higher inconsistency in grading in courses taught by adjuncts; or grades in courses that don’t measure the kinds of skills evaluated on the bar exam (e.g., oral advocacy in graded trial advocacy courses) all affect non-bar-related course GPA. That is, my suspicion is that if one were to measure the GPA in other substantively similar non-bar-related courses (e.g., Federal Courts, Antitrust, Secured Transactions, Administrative Law, Mergers & Acquisitions, Intellectual Property, etc.), one would likely find a similar relationship as with performance in bar-related course GPA. That’s just a hunch. That’s what I’d love to see future reports examine.

That said, both in 1976 and in 2017, the evidence suggests that performance in a specific substantive course has little to say about how the student will do on the bar—at least, little unique to that course. Students who do well in law school as a whole do well on each particular subject of the bar exam.

When law schools consider how to best help prepare their students for the bar, then, simply channeling students into bar-related subjects is likely ineffective. (And that’s not to say that law schools shouldn’t offer these courses!) Alternative measures should be considered. And I look forward to more substantive course studies like the California study in the future.

Why are law school graduates still failing the bar exam at a high rate?

The first decline took place in the July 2014 bar exam, which some believed might be blamed on an ExamSoft software glitch. Then came continued declines in the July 2015 exam, which some blamed on the addition of Civil Procedure to the Multistate Bar Exam. The declines persisted and even worsened.

Five straight July bar exam cycles with persistent low pass rates across the country. But the bar exam has not become more difficult. Why?

One reason rates remain low is that predictors for incoming classes remain low. LSAT scores actually declined among the most at-risk students between the incoming classes admitted in the 2011-2012 cycle (graduating in 2015) and the 2014-2015 cycle (graduating in 2018). The 25th percentile LSAT among full-time entrants dropped 2 points between the Class of 2015 and the Class of 2018. Indeed, 11 schools saw a drop of at least 5 LSAT points in their 25th percentile incoming classes—almost as many as those that saw any improvement whatsoever (just 12 schools, including Yale and Stanford).

Not all LSAT declines are created equal: a drop from 170 to 168 is much more marginal than a drop from 152 to 150; and a drop can have a bigger impact depending on the cut score of the bar exam in each jurisdiction. But it’s no surprise, then, to see the persistently low, and even declining, bar passage rates around the country with this quick aggregate analysis.

Nevertheless, since around September 2014, law schools have been acutely aware of the problem of declining bar passage rates. Perhaps it was too late to course-correct on admissions cycles through at least the Class of 2017.

But what about academic advising? What about providing bar preparation services for at-risk students? Given that law schools have been on notice for nearly five years, why haven’t bar passage rates improved?

I confess, I don’t know what’s happened. But I have a few ideas that I think are worth exploring.

First, it seems increasingly likely that academic dismissal rates, while rising slightly over several years, have not kept pace to account for the significant decline in quality of entering students. Of course, academic dismissals are only one part of the picture, and a controversial topic at that, particularly if tethered to projections about future likelihood to pass the bar exam on the first attempt. I won’t delve into those challenging discussions; I simply note them here.

Another possibility is that law schools haven’t provided academic advising or bar preparation services to students—but that seems unlikely.

Still another, and perhaps much more alarming, concern is that those bar services have been ineffective (or not as effective as one might hope). And this is a moment of reckoning for law schools.

Assuredly, when the first downturns in scores came, law schools felt they had to do something, anything, to right the ship. That meant taking steps that would calm the fears of law students and appease universities. Creating or expanding bar preparation courses, or hiring individuals dedicated to bar preparation, would be easy solutions—law students could participate in direct and tangible courses specifically designed to help them achieve bar exam success; law faculty could feel relieved that steps were being taken to help students; university administrators could feel confident that something was being done. Whether these bolstered existing courses or added to them, assuredly schools provided opportunities to their students.

But… to what end? Something was done at many institutions. Has it been effective?

Apparently not. The lagging (and falling) bar passage rates are a sign of that. Granted, perhaps the slide would be worse without such courses, but that seems like cold comfort to schools that have been trying to affirmatively improve rates.

We now have the first evidence to that effect. A report commissioned by the California State Bar recently studied several California law schools that disclosed student-specific data on a wide range of fronts—not just LSAT and UGPA in relation to bar exam scores, but law school GPA, courses taken, even participation in externships and clinics.

One variable to consider was involvement in a bar preparation course. Did participation in a bar preparation course help students pass the bar? I excerpt the unsettling finding here:

Five law schools provided data for this variable. Students averaged about 1.5 units (range 0 to 6). For all those students, there was a -.20 (p<.0001) correlation between the number of units taken and CBX TOTSCL [California Bar Exam Total Scale Scores]. The source of this negative relationship appears to be the fact that in five out of six [sic] of the schools, it was students with lower GPAs who took these classes. After controlling for GPA, the number of bar preparation course units a student takes had no relationship to their performance on the CBX. A follow up analysis, examining just the students in the lower half of GPA distribution, showed that there was no statistically significant difference in CBX TOTSCL for those who took a bar preparation course versus those who did not (p=.24). Analyses conducted within each of the five schools yielded similar findings.

This should be a red flag for law schools seeking to provide bar preparation services to their students. In this study, whatever law schools are doing to help their students pass the bar has no discernible impact on students’ actual bar exam scores.
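The “controlling for GPA” step in the quoted finding is worth unpacking. Here is a minimal sketch with invented data showing the usual approach: the raw correlation between bar-prep units and bar scores comes out negative because weaker students are the ones channeled into prep courses, but regressing scores on GPA and units together isolates whatever the units add once GPA is accounted for.

```python
import numpy as np

# Invented data: law school GPA, bar-prep units taken, and a bar exam scale score.
gpa   = np.array([2.4, 2.6, 2.7, 2.9, 3.0, 3.2, 3.4, 3.6, 3.8])
units = np.array([6,   4,   5,   3,   2,   1,   0,   0,   0  ])
score = np.array([128, 132, 134, 139, 141, 146, 151, 156, 160])

# Raw correlation is negative: lower-GPA students take more prep units.
print(np.corrcoef(units, score)[0, 1])

# "Controlling for GPA": regress score on an intercept, GPA, and units jointly,
# and look at the units coefficient rather than the raw correlation.
X = np.column_stack([np.ones_like(gpa), gpa, units])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
print(dict(zip(["intercept", "gpa", "units"], coef.round(3))))
```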

Granted, these are just five California law schools and the California bar. And there have been other school-specific programs at some institutions that may provide a better model.

But it’s worth law schools considering whether students are on a path toward improving bar passage success or simply on a hamster wheel of doing more work without any discernible positive impact. More studies and evidence are of course in order. But the results from the last several years, confirmed by the study of five California law schools, suggest that revisiting the existing model is a matter of some urgency.

MBE scores drop to 34-year low as bar pass rates decline again

On the heels of some good news in recent administrations of the July bar exam comes tough news from the National Conference of Bar Examiners: the Multistate Bar Exam (MBE) scores have dropped to a 34-year low, their lowest point since 1984.

For perspective, California's "cut score" is 144, Virginia's 140, Texas's 135, and New York's 133. A bar score of 139.5 is comparable to 2015 (139.9) in recent years. One would have to go back to the '80s to see comparable scores: 1982 (139.7), 1984 (139.2), and 1988 (139.8).

I’d hoped that perhaps the qualifications of students had rebounded a bit as schools improved their incoming classes a few years ago; that perhaps students were putting more effort into the bar than in previous years; or that other factors were at work. That appears not to be the case this year.

That said, MBE scores may be slightly less predictive of what will happen with actual bar pass rates. The NCBE has pointed out that the rise of the Uniform Bar Exam has led to a number of test-takers transferring scores to new jurisdictions rather than taking a second jurisdiction’s bar—and, presumably, those who pass in one jurisdiction are much more likely to pass in another jurisdiction (accepting that cut scores can vary in some jurisdictions). The NCBE points to a few thousand such transfers last year, at least some of whom would otherwise have taken the bar exam again. But put against more than 40,000 MBE test-takers, the effect, while real, may be small.

Instead, we’re left to watch as results come in state by state. Tracking first-time pass rates (from jurisdictions that share them so far—ideally, ABA graduates would be a better measure, but this works reasonably well for now), declines have been pretty consistent: New Mexico (-14 points), Indiana (-3), North Carolina (+1), Oklahoma (-8), Missouri (-7), Iowa (-3), Washington (-3), and Florida (-4). But in many of these jurisdictions, pass rates were worse in, say, 2015 or 2016.

We’ll know more in the months to come, but it looks like another year of decline will cause some continued anguish in legal education. The increased quality of law school applicants this year will help the July 2021 bar exam look much better.

Note: I chose a non-zero Y-axis to show relative performance.