Some thoughts about "bar exam federalism"

Professor Dan Rodriguez has some important thoughts over at PrawfsBlawg about the “high costs” of “bar federalism.” I had a few thoughts I wanted to add to his helpful perspective.

While most states have moved toward a delayed bar exam with an expanded “limited practice” model for would-be bar exam test-takers, Utah recently announced a “diploma privilege” model, where graduates of certain ABA-accredited law schools who have 360 hours (or about nine 40-hour weeks) of supervised practice by the end of the year will be eligible for admission without a bar exam. (This model, of course, is subject to public comment and review.)

Utah chooses not to limit the diploma privilege to its two in-state law schools, Utah and BYU, which was a concern I raised earlier in considering the cohorts to be affected by delays to the bar exam. Instead, it extends to graduates of any ABA-accredited school with a 2019 cumulative bar passage rate of 86%, which was the State of Utah’s bar pass rate.

Professor Rodriguez laments that this standard might disproportionately adversely affect California’s law schools, because California has an unusually high cut score for its bar examination. That is, an 86% threshold doesn’t make a lot of sense when we consider the varying bar exams around the country—and that Wisconsin has two law schools with diploma privilege that would essentially automatically qualify. While I think his concerns are legitimate, I look at it from the opposite perspective (in a way that negates any concerns), and I think the impact is descriptively overstated.

That is, while Professor Rodriguez laments that many schools are left out of Utah’s proposal, I see Utah’s proposal as exceedingly generous, increasing the diploma privilege opportunity from two in-state schools to around 65 schools, about 1/3 of all ABA-accredited schools! I suppose it all depends on one’s perspective.

Now, a bit of a slightly unfair narrative here, so please bear with me—even indulge me. Utah’s bar exam statistics disclose very little. But Utah knows how many out-of-state law school grads take its bar exam each year. My back-of-the-envelope calculations suggest at least 2/3 of test-takers are in-state. For a bar that has around 225 first-time test-takers in July from ABA-accredited schools, we are dealing with a very small pool of out-of-state test-takers in the first place—say, around 75, if not fewer.

Furthermore, we know that “national” schools, typically “selective” or “elite” schools, place graduates nationally. So if the Utah bar is concerned about a rule that would keep graduates of, say, schools from New Haven or Cambridge from returning to Utah, it needs a rule that would allow them, too. But not a rule based on something crass like USNWR rankings.

I don’t think an 86% bar passage rate is a great way of measuring schools with a sufficient quantity of “good” graduates such that the Utah state bar feels comfortable admitting them without an exam, but it has its virtues. For instance, every school in the Top 20 of the USNWR rankings makes the cut. Outside the top 20, only a few in, say, the top 45 miss the cut—Emory, UC-Irvine, UC-Davis, and the University of Washington, to name most if not all. And this is also a notable cut line given that both BYU and Utah are in the top 45 of the latest USNWR rankings. Again, crass, but roughly accurate.

If one considers the selectivity of the law school as both a proxy of the number of out-of-state bar exam test-takers, and the quality of the graduates, then the standard gives the benefit to a batch of other schools that fall outside the top 45 but, on the whole, gives an advantage to the very schools most likely to send grads to Utah and to pass the bar at the highest rates.

But, Professor Rodriguez wonders about a disproportionate impact on California schools. Dean Paul Caron, for instance, emphasizes that just four of California’s 21 ABA-accredited law schools would qualify.

Focusing on USNWR top-45 schools, UC-Davis saw just 12 of its 148 test-takers take a non-California bar in 2019—scattered across 5 jurisdictions, with an out-of-state first-time pass rate of 75% (and not reflective of California’s high cut score, it should be noted). UC-Irvine had just 11 of its 135 test-takers take a non-California bar in 2019—scattered across 7 jurisdictions, with an out-of-state first-time pass rate of 91%. (These are much lower out-of-state figures than either Emory or Washington.)

This is to emphasize an earlier point—the most “elite” or “selective” schools disproportionately place students in out-of-state bar exams. And California’s schools—even California’s very good schools like Davis and Irvine—place very few out of state. (It’s also, I think, a testament to law student choices of California schools as a greater commitment to remaining in California.) And, I think it’s fair to assume, very very few into Utah.

Indeed, the Utah state bar knows as of April 1, its cutoff for this rule, which test-takers from which schools enrolled for the state’s bar. I would guess that it’s a fraction of its prospective test-takers who don’t make the cut.

That’s disappointing for them, to be sure. But, again, I look at it from the perspective of allowing about 63 law schools to secure diploma privilege where I’d expect two. And while some out-of-state would-be test-takers are out of luck, so are repeaters for this administration of the exam—some of whom assuredly would have passed.

There are probably better rules to come up with, as Professor Rodriguez emphasizes. But they would be more complicated and be targeted at an increasingly small cohort of students. That isn’t to diminish the deep disappointment recent graduates of those excluded law schools must feel as they face a delay to their practice of law, and the need to take a bar exam when others don’t. But it’s to say that I think the rule is not only a generous one to the vast majority of law school graduates who’d take the Utah bar, but also adversely affects very few. Perhaps the Utah bar will disclose those figures in the weeks ahead.

State bar licensing authorities converging on coronavirus trend: postpone bar, allow grads limited ability to practice in interim

The last few weeks of disruption arising from the coronavirus pandemic have yielded calls for state bar licensing authorities to consider what accommodations they should offer ahead of the scheduled July 2020 administration of the bar exam. I’ve looked at some of these proposals here and here.

One of the more popular points of advocacy—and, it’s worth emphasizing, driven by law school faculty of ABA-accredited law schools, law school deans of ABA-accredited law schools and law students graduating from those institutions—is for “emergency diploma privilege.” (I’ve pointed out how this addresses only a slice of the bar exam-taking population here.) Sadly, this point of view has spiraled away from the emergency-oriented concerns into more broad-based (and, in some ways, timeless) critiques of the bar exam generally.

Diploma privilege would be a dramatic change, even if on an “emergency” basis. And, again, it only addresses a subset of the exam-taking population. So perhaps it’s no surprise to see state bar licensing authorities offering a two-step approach, an approach, I think, that will become the norm. It’s a model raised among New York law school deans (as a more modest alternative to their larger proposal for diploma privilege!) and the American Bar Association, in addition to something states like Tennessee and Arizona are implementing.

First, the bar exam may be delayed—perhaps into September or October, perhaps into February of 2021.

Second, the state bar licensing authority offers more generous opportunities to engage in the limited supervised practice of law, until those would-be bar test-takers are able to take the first bar exam available.

In some ways, it simply extends the limited practice of law opportunities that already exist for recent law school graduates or those awaiting bar results. It allows the accommodation of in-state and out-of-state law school graduates; it allows it for JD and non-JD graduates; it even allows some accommodation for those from other jurisdictions who want to take that state’s bar and practice there (as the Tennessee order expressly contemplates). It does not, however, accommodate those who have previously failed the bar exam (e.g., deemed lacking minimum competence to practice in a previous administration of the bar exam).

The two-step proposal helps address the many cohorts who have an interest in the July 2020 bar exam. It does increase the inconvenience for recent graduates—they’ll likely take a bar exam while working or need to take time off from working to take the bar exam. It would be especially frustrating if they study over the summer for a September test, only to find the September test further delayed and need to take it in February.

That said, the two-step proposal is minimally disruptive to the status quo and allows recent graduates to quickly enter the working legal profession. And it’s minimally disruptive from the state bar perspective in that the bar exam—how every jurisdiction measures minimum competence (with all of the controversy that surrounds it, to be certain!)—will remain in place, simply at a later date. States have had little difficulty with limited practice status granted to recent graduates—extending that slightly is a natural solution.

I anticipate many more states will move in this direction, but time will tell if that changes—some jurisdictions might get more creative, the coronavirus might worsen, or other intervening events might change things.

Did the California bar enact "diploma privilege" after the 1906 California earthquake and during WWII?

This claim, versions of which I’ve seen elsewhere, appears in a letter signed by hundreds of law students urging the California Supreme Court to enact an “emergency diploma privilege” as an alternative to the traditional bar exam. (Some of my thoughts on that are here and here, and some accommodations state bars should assuredly now take are here.)

Here’s the claim:

Diploma privilege is feasible. After all, California enacted diploma privilege after the 1906 California earthquake and during WWII.[footnote]

The claims are not entirely accurate and are much narrower than presented (both here and elsewhere).

The footnote of the letter cites this article, which actually confirms my instinct about the California bar in 1906—the California State Board of Bar Examiners was formed in 1919, which then created “the first written exams” in 1920. (Independently, the article does not claim that “diploma privilege” was “enacted” after the 1906 earthquake.)

Prior to 1906, then, admission to the bar was on motion. There was no written bar exam (as we know it) in the first place.

And the 1906 episode is a specific incident at the University of California - Hastings that prompted the claim. Here’s a story from the Hastings Law Journal in 1953:

The fire and earthquake of 1906 worked its havoc on Hastings. The City Hall was completely destroyed. Although former students allege that they started out for the College on April 18, no trace of Hastings could be found. The College year ended abruptly. So great was the catastrophe that it was deemed permissible to waive final examinations, which, through coincidence, ha[d] been scheduled for that day. The class of 1906 had an unique distinction. Whereas earlier classes had been admitted to the bar on the motion of the faculty (college examinations being passed to the satisfaction of the faculty) the class of 1906 was admitted “because of the motion of the earth!”

It’s worth emphasizing, then, that diploma privilege was actually the norm in 1906, not the exception. The exception arose in whether graduating law students needed to complete their final examinations. And it only extended, apparently, to the University of California - Hastings, given that the law school was destroyed the day of the exam. (Apparently, the California bar accepted the motion of the faculty even in the absence of a final law school examination.) And there was no traditional written “bar exam” at the time.

Additionally, the World War II exception is somewhat narrower than identified, but relevant as I detail below. Here’s how it was summarized in that footnote citation, Admissions standards evolve across decades, by Kathleen Beitiks in the California Bar Journal:

For many years the California examination consisted of a written test exclusively. And under some circumstances during World War II, bar examiners waived the exam requirement for returning soldiers whose legal careers were interrupted.

This exception was actually a statutory exemption enacted by the California legislature. Here’s Chapter 65 from the First Extraordinary Session of the 56th California Legislature, enacted in 1946:

The provisions of subdivisions (d) and (h) of Section 6060 do not apply to any person who, after September 16, 1940, and prior to the termination of hostilities between the United States and the nations with which the United States is now at war as determined by Act of Congress or Proclamation of the President, has graduated from a law school accredited by the examining board and who after such graduation served in the armed forces of the United States before taking an examination for admission to the bar, nor to any person, who, after September 16, 1940 satisfactorily completed at least two years of study at a law school then accredited by the examining board and whose legal education was thereafter interrupted by his service in the armed forces of the United States, and who subsequently graduates from a law school accredited by the examining board. The provisions of this section shall not apply to any person who enters the armed forces of the United States after the effective date of this section, nor to any person who at the time of entering the armed forces was not a bona fide resident of this State.

This section shall remain in effect until the ninety-first day after final adjournment of the Fifty-eighth Regular Session of the Legislature. While this section is in effect it shall supersede any existing provisions of law which are in conflict with this section; but such provisions are not repealed by this section and after this section is no longer effective shall have the same force as though this section had not been enacted.

(Then, subsection (d) required one to be a bona fide resident of the state for three months before the bar exam, and subsection (h) required someone to “have passed a final bar examination given by the examining committee.”)

It’s not that diploma privilege was “enacted” “during World War II.” It’s that veteran residents of California who graduated from a California-accredited law school but had their education “interrupted” by military service had the bar requirement waived after the war ended.

That’s not to say that those who signed this letter don’t have a point in drawing an analogy here. There was an emergency interruption of law students’ ordinary careers as they went to serve in the military. Upon their return, the legislature waived the requirement that they take the bar exam. That seems particularly sensible, given that their study of law was interrupted—much easier to take the bar right after graduating rather than after years of military service in between. But, ease and convenience of the test-taker—more so than the pre-existing concerns of “protecting the public”—appear (!) to have motivated the legislature in this case.

So, too, should state bar licensing authorities (or state legislatures) consider what kind of interruption, and what kind of previous educational experience, might qualify for admission to the bar—a matter of robust debate in the weeks ahead in states around the country.

When the Task Force on the New York Bar Examination plagiarizes your work without attribution

UPDATE: The chair of the Task Force reached out to me with apologies and intends to update the report with attribution. I’ll link to that updated report when it’s available.

My blog isn’t much. It makes no money. It garners little attention. I don’t earn money consulting from it. It contains my half-baked musings, the best of which might become an article, the worst of which I strike through and hope people forget.

But at the very least, it would be nice to see my work acknowledged if it’s useful.

Sadly, the Task Force on the New York Bar Examination found my work useful, but chose to copy without attribution.

Its recent report on the state of the bar exam takes large chunks of my blog and treats it as its own work product. Several paragraphs are lifted from my 2015 post, “No, the MBE was not ‘harder’ than usual.”

Here’s a part of my post:

The MBE uses a process known as "equating," then "scales" the test. These are technical statistical measures, but here's what it's designed to do. (Let me introduce an important caveat here: the explanations are grossly oversimplified but contain the most basic explanations of measurement!)


Standardized testing needs a way of accounting for this. So it does something called equating. It uses versions of questions from previous administrations of the exam, known as "anchor" questions or "equators." It then uses these anchor questions to compare the two different groups. One can tell if the second group performed better, worse, or similarly on the anchor questions, which allows you to compare groups over time. It then examines how the second group did on the new questions. It can then better evaluate performance on those new questions by scaling the score based on the performance on the anchor questions.

This is from Page 46 of the Task Force report:

The MBE also uses a process known as “equating,” which “scales” the test to adjust for differences between exams and by different test takers over time. Equating uses versions of questions from previous administrations of the exam, known as “anchor” questions or “equators” to compare two different groups. This way, in theory, one can tell if the second group performed better, worse, or similarly on the anchor questions, which allows groups of test takers to be compared across test administrations. Then, how the second group did on the new questions is examined so that performance on the new questions can be evaluated based on performance on the anchor questions.

Here’s another part of my post:

Consider two groups of similarly-situated test-takers, Group A and Group B. They each achieve the same score, 15 correct, on a batch of "equators." But Group A scores 21 correct on the unique questions, while Group B scores just 17 right.

We can feel fairly confident that Groups A and B are of similar ability. That's because they achieved the same score on the anchor questions, the equators that help us compare groups across test administrations.

And we can also feel fairly confident that Group B had a harder test than Group A. (Subject to a caveat discussed later in this part.) That's because we would expect Group B's scores to look like Group A's scores because they are of a similar capability. Because Group B performed worse on unique questions, it looks like they received a harder batch of questions.

The solution? We scale the answers so that Group B's 17 correct answers look like Group A's 21 correct answers. That accounts for the harder questions. Bar pass rates between Group A and Group B should look the same.

In short, then, it's irrelevant if Group B's test is harder. We'll adjust the results because we have a mechanism designed to account for variances in the difficulty of the test. Group B's pass rate will match Group A's pass rate because the equators establish that they are of similar ability.

When someone criticizes the MBE as being "harder," in order for that statement to have any relevance, that person must mean that it is "harder" in a way that caused lower scores; that is not the case in typical equating and scaling, as demonstrated in this example.

Let's instead look at a new group, Group C.

On the unique questions, Group C did worse than Group A (16 right as opposed to 21 right), much like Group B (17 to 21). But on the equators, the measure for comparing performance across tests, Group C also performed worse, 13 right instead of Group A's 15.

We can feel fairly confident, then, that Group C is of lesser ability than Group A. Their performance on the equators shows as much.

That also suggests that when Group C performed worse on unique questions than Group A, it was not because the questions were harder; it was because they were of lesser ability.

This is from pages 46-47 of the report:

Consider two groups of similarly-situated test-takers, Group A and Group B. They each achieve the same score, 15 correct, on a set of the “equator” questions. But Group A scores 21 correct on the unique questions, while Group B scores just 17 of these questions right. Based on Groups A and B’s same score on the equator questions, we can feel fairly certain that Groups A and B are of similar ability. We can also feel fairly certain that Group B had a harder test than Group A. This is because we would expect Group B’s scores to look like Group A’s scores because they are of a similar capability. Because Group B performed worse on unique questions, it looks like they received a harder group of questions. Now we scale the answers so that Group B’s 17 correct answers look like Group A’s 21 correct answers, thus accounting for the harder questions. Bar pass rates between Group A and Group B should then look the same. In short, it is irrelevant if Group B’s test is harder because the results will be adjusted to account for variances in test difficulty. Group B’s pass rate will match Group A’s pass rate because the equators establish that they are of similar ability.

Now consider Group C. In the unique questions, Group C did worse than Group A (16 right as opposed to 21 right), much like Group B (17 to 21). But on the equators, the measure for comparing performance across tests, Group C also performed worse, 13 right instead of Group A’s 15. We can feel fairly certain, then, that Group C is of lesser ability than Group A. Their performance on the equators shows as much. That also suggests that when Group C performed worse on unique questions than Group A, it was not because the questions were harder; it was because they were of lesser ability.
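For readers unfamiliar with equating, the arithmetic in the Group A/B/C example above can be sketched in a few lines of Python. This is a deliberately simplified illustration of the logic described in the quoted passages, not the NCBE’s actual scaling procedure (which relies on more sophisticated statistical models):

```python
# Simplified sketch of the equating logic from the Group A/B/C example.
# Not the actual MBE scaling procedure -- just the intuition.

def scale_unique_score(anchor_ref, anchor_new, unique_ref, unique_new):
    """Adjust a new group's unique-question score using anchor questions.

    If the new group matches the reference group on the anchor ("equator")
    questions, the groups are of similar ability, so any gap on the unique
    questions is attributed to question difficulty and scaled away. If the
    anchor scores differ, the gap reflects ability, and no adjustment is made.
    """
    if anchor_new == anchor_ref:
        # Similar ability: scale the new score to the reference score.
        return unique_ref
    # Different anchor performance: lower unique score reflects lesser
    # ability, not a harder test, so the raw score stands.
    return unique_new

# Group B: same anchor score as Group A (15), lower unique score (17 vs 21).
# Treated as a harder test; scaled up to match Group A.
print(scale_unique_score(15, 15, 21, 17))  # 21

# Group C: lower anchor score (13 vs 15). Lesser ability; no adjustment.
print(scale_unique_score(15, 13, 21, 16))  # 16
```

The function name and the all-or-nothing adjustment are my own simplifications; real equating produces graded adjustments rather than a binary choice.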

I don’t have particular comments on the rest of the report. I just highlight that my work was copied but never cited. I’m glad someone found it a little helpful. I’d be more glad if there was attribution.

Small accommodations that state bar licensing authorities should start to consider now

On the heels of thinking about how to handle the July 2020 bar exam and alternative solutions, some states (led by New York) are moving to postpone the bar exam in light of the coronavirus pandemic. I imagine many states will follow this proposal (even if it’s of limited effectiveness).

But state bar licensing authorities should also be thinking of small accommodations that can help ease test-takers’ concerns in times of heightened uncertainty—if the state bar chooses to administer a traditional exam.

First, state bar licensing authorities should offer prospective test-takers the ability to seek a refund up until the day of the exam itself if they choose not to take the bar. Just as California recently allowed for refunds after it accidentally disclosed exam topics ahead of the exam, states can offer a generous refund policy like, ideally, hotels and airlines are doing. That would ease concerns for test-takers who are uncertain about how their study habits, family responsibilities, or health will fare in the months ahead. (Admittedly, I understand this is a cost to licensing authorities that are renting out space for test-takers. But it’s one, I think, they should bear here.)

Second, state bar licensing authorities should allow for a later registration deadline for test-takers without a late fee attaching. For example, if registration is usually twelve weeks before the exam, make it ten or eight weeks. Understandably again, state bars look to reserve space based on registered test-takers. But there might be higher levels of uncertainty looking at the fall, so test-takers might hesitate—particularly those who have jobs in another state that may not pan out in an economic downturn, or practicing attorneys in another jurisdiction weighing whether to take an additional bar exam this fall or postpone until a later date.

Third, state bar licensing authorities should offer reduced prices for repeaters in the February 2021 exam. This is a concession to some applicants that they may not be as prepared as they would otherwise be for the fall 2020 test. This, I think, would be a modest gesture to help on a global level for all those who may fail the fall 2020 exam.

Fourth, state bar licensing authorities should expedite grading. To be honest, this is something that state bar licensing authorities should be working on anyway (so I shouldn’t treat this as a coronavirus pandemic-related concern). That’s especially true in California, which recently reduced its exam from three days (two days of essays) to two days (one day of essays), but still takes until late November to release results. States like Maryland and Rhode Island, or others with release dates in November or even late October, should consider adding graders or thinking about how to expedite the grading process to push out results earlier to test-takers. That would minimize the job-related effect on test-takers in the event the exam is pushed to the fall.

These are modest decisions, but they all cost the state bar licensing authority money. If budget concessions can be made, and if reduced costs with vendors (say, sites hosting the bar exam) can be negotiated, these would help all test-takers in the fall 2020 administration of the bar exam.

Law school work product as a substitute for bar exams

Yesterday, I offered some thoughts about the bar exam and the potential solutions to delays caused by Covid-19. Some solutions, like “emergency diploma privilege,” primarily focus on in-state graduates of the Class of 2020 of ABA-accredited law schools taking the bar for the first time. I mused that there are several cohorts of bar exam test-takers that state bar licensing authorities must consider when adjusting their bar exam policies.

One idea I’ve mulled over is, admittedly, an imperfect and half-baked idea, and one that’s labor intensive—but one that opens up opportunities for essentially all comers (with some caveats) to get admitted to a jurisdiction.

In short: state bars could review portfolios of prospective attorneys’ work product (from law school, supervised practice, or actual practice) and determine whether the candidate meets the “minimum standards” to practice in the jurisdiction.

Okay, in length: this isn’t easy. But graduates of all ABA-accredited law schools are required to take a legal writing course and an upper-division writing course. Written work product is assuredly something that comes out of every law school.

Students also commonly take a core of courses that are on the bar exam and take written exams in those courses. They write exams examining issues of Civil Procedure, Evidence, Contracts, and Property.

Law school graduates also spend their summers—or some semesters—working with legal employers and developing written work product. They work in clinics or externships during the semester, and they write memos or other work product.

State licensing authorities, then, could require prospective applicants to develop portfolios of written work product. State bar graders could then go through the standard “calibration” exercises they usually do to “grade” written work product. Multiple “graders” would look at each component of the work product.

Now, there are huge problems with this, I know, and I’ll start with a few. First and foremost is the lack of standardization. Not everyone does the same writing assignments in law school, much less the same types of writing assignments. Exam essays typically lack context (e.g., they don’t have a “statement of facts” like a typical legal writing assignment would). Not everyone spends a summer in the practice of law (e.g., in finance) or has written work product (e.g., an externship with a judge that doesn’t want disclosure of work product). There’s no scaling and equating like one has with the bar exam to improve reliability. Grading would take quite some time.

In the alternative, state licensing authorities could authorize “supervised practice” (Proposal 6 in the working paper on bar exam alternatives) and use the work product from that supervised practice later to submit to the licensing authority to supplement the law school work product.

But an advantage of this proposal, I think, is that the written product is what we’d expect of attorneys and a good measure of their ability. Law school grades (i.e., mostly the assessment of written work product) strongly correlate with bar exam scores and bar exam success. It would extend to in-state or out-of-state, to ABA-accredited graduates or others, to foreign-trained attorneys, or to licensed practitioners in other states. It could even apply to those who’ve failed the bar exam before—if they’re judged on their work to be “minimally qualified,” all the better.

I toss it out as one possible solution that requires little additional or new work on the part of prospective applicants to the bar, that judges them on something relevant to their ability to engage in legal analysis, and that mitigates concerns around different cohorts of applicants to the bar.

Maybe it’s too much work, the disparities in the types of work product too vast, for us to consider. At the same time, federal judges commonly review clerkship applicants on an open-ended consideration of written work product. Perhaps there’s something to be said for looking at past written work product.

Some thoughts on the bar exam and Covid-19

A helpful and timely working paper from several law professors—including authors whose work I’ve admired in the past like Professor Deborah Jones Merritt, Professor Joan Howarth, and Professor Marsha Griggs—offers much to consider about the bar exam in light of the coronavirus pandemic and the spread of the illness Covid-19. That is, the coronavirus outbreak may persist into July and call for rethinking how to address the bar exam. They’ve done a tremendous job in a short period of time thinking about it and writing about it.

I wanted to address the question slightly differently from the framing in their paper, however. The framing in their paper is as follows:

At the same time, it is essential to continue licensing new lawyers. Each year, more than 24,000 graduates of ABA-accredited law schools begin jobs that require bar admission. The legal system depends on this yearly influx to maintain client service.

These solutions, then, are oriented toward looking at the Class of 2020 and how this cohort of attorneys can be licensed. To be sure, this is how the bulk of new lawyers are added to the legal profession each year; this is a pressing concern for law schools, whose graduates are placed into a precarious position; and this is assuredly the focus of state licensing boards.

But looking at the position slightly differently can present a very different picture: instead of looking at the Class of 2020 graduates of ABA-accredited schools taking the bar exam, one might look at the administration of the bar exam. I think this yields some contrasts in the scope of their proposals.

The authors of the paper nicely identify six alternatives, the first three “likely to fail,” the last three with “considerable promise”; jurisdiction-specific circumstances may mean some apply in some places and others elsewhere:

  1. Postponement

  2. Online exam

  3. Exams administered in small groups

  4. Emergency diploma privilege

  5. Emergency diploma privilege-plus

  6. Supervised practice

I’ll come to some details of these proposals in a moment, but the bulk of them are in the paper. At the same time, I want to focus on the several populations who take the bar exam in a given year:

Cohort A. JD graduates of an ABA-accredited law school from that state: This is probably the largest contingent of bar test-takers, although many take the test out of state.

Cohort B. JD graduates of an ABA-accredited law school from out of state. Some law schools like Yale predominantly place graduates out of state. Virtually every law school sends at least some students to take another state’s bar exam.

Cohort C. LLM graduates of an ABA-accredited law school from that state. While JD graduates are the vast majority of graduates each year, foreign-trained lawyers commonly earn an LLM in the United States to enable them to take the bar exam and practice in the United States.

Cohort D. LLM graduates of an ABA-accredited law school from out of state. Given that New York and California are the most popular destinations for foreign-trained attorneys, LLM graduates in other states often head to those states to take the bar.

Cohort E. Graduates of non-ABA-accredited law schools. While these are far rarer, in states like California graduates of state-accredited schools can take that state’s bar exam.

Cohort F. Test-takers who failed a bar exam previously. A significant number of retakers make up the bar exam test-taking cohort each year.

Cohort G. Attorneys admitted in other jurisdictions taking the bar. While reciprocity exists in some states, it doesn’t in others, and attorneys sometimes have to take a bar exam to get admitted to that jurisdiction.

(Maybe you can think of other groups. Let me know!)

So, the bar exam is being administered to these sets of test-takers.

Cohorts A through F are all “new” lawyers in the United States; Cohort G includes those who are already practitioners elsewhere (or perhaps let their license expire elsewhere).

Proposals 1 (Postponement), 2 (Online exam), and 3 (Exams administered in small groups) would apply to all seven of these cohorts. But, I think, as the authors of the paper note, these seem less likely options. Proposal 1 in particular—it’s not clear when this pandemic will end, and states would have to act uniformly to take advantage of the uniform bar exam or the MBE. And Proposals 2 and 3 might be feasible only if aggressive measures were pursued.

Proposal 4 (Emergency Diploma Privilege) offers strong benefits for Cohort A. Undoubtedly, recent law school graduates would not have to study for the summer bar exam; they would not need to spend money on bar prep courses; they would be guaranteed to be admitted to practice (subject, of course, to character & fitness reviews, and passing the MPRE).

That said, I would take some issue with the comparison to Wisconsin—yes, Wisconsin has had diploma privilege. But (1) the diploma privilege mandates an extensive required curriculum; (2) Wisconsin’s cut score for the bar exam is the lowest in the United States; and (3) the state has just two law schools, Wisconsin and Marquette—about 75 law schools have worse median LSAT profiles, and about 60 law schools have worse 25th-percentile LSAT profiles, among their incoming classes than Marquette. In other words, all other states have higher bar exam standards, many have graduating students with materially lower predictors of bar passage, and no other state requires the kind of core curriculum Wisconsin does.

But setting all those aside, it is an emergency situation (and perhaps Proposal 5 can help take care of some of this), and we shouldn’t expect outcomes like those in careful set-ups like Wisconsin’s. But note that this only benefits Cohort A. Cohort B (out-of-staters) would not benefit unless states began instituting some reciprocity of diploma privilege, as the paper suggests as a possibility. It’s not clear that LLM graduates would benefit (in Wisconsin, for instance, they can’t—the privilege applies only to “84 semester credit” degrees, i.e., the JD). The paper’s proposal extends only to ABA-accredited schools and first-time test-takers: “solely to graduates of the class of 2020 (including those who graduated in December 2019) from accredited law schools. Individuals who had previously taken and failed a bar examination in any state could be excluded.” (Emphasis added.) And it doesn’t help Cohort G, attorneys admitted elsewhere who are trying to gain admission to the bar.

Now, it might be that Proposal 4 is still a good proposal and needs to be supplemented with other proposals (say, Proposal 3, now that the test-taking cohort is much smaller). But the point is to emphasize that bar exam solutions focusing on recent graduates may miss significant other cohorts seeking admission to the bar.

Proposal 5 adds to Proposal 4—requiring some “bridge the gap” programs, CLE requirements, CALI lessons, or the like. It would add complexity and help overcome some of the concerns of Proposal 4—that is, given that Wisconsin has a bar that requires greater supervision on the law school end, maybe other states could require greater supervision on the back end.

Proposal 6 would allow supervised practice, with a supervisor who would advise them and, upon completion of 240 hours of work (i.e., six 40-hour weeks), graduates could be admitted to that bar. This helps extend to Cohort B: “Notably, this option would allow jurisdictions to license lawyers graduating from law schools in any state.” Again, however, the proposal has some limitations, extending to “2020 graduates of accredited law schools.”

These last three proposals can help Cohort A. They could, in some circumstances, help Cohort B.

But it’s not clear that they would necessarily help others. It could be, I suppose, that a bar might loosen its reciprocity rules for Cohort G for those who registered to take the bar. Or it might extend some of these proposals to graduates of non-ABA-accredited schools.

It’s particularly worth considering, however, what to do with everyone else. These programs might help recent graduates, but some people will still want to take the bar! Should states simply cancel the exam? Should those who failed before be barred from retaking it? Should states try one of the first three proposals for the other cohorts?

It’s not clear to me what the best approach is. The bar exam affects far more than recent law school graduates, although law school educators (including me!) are particularly concerned with this cohort. The state bar is going to have to determine how to handle all of these cohorts who might be affected if Covid-19 restrictions extend into July.

There are no easy answers. I appreciate the authors of this study for putting such clear and helpful options on the table. I imagine state bars around the country are considering the appropriate paths to take. I look forward to seeing more such discussions play out in the weeks ahead, and I hope state bars can come up with solutions that best help the legal system and all prospective test-takers.

Thoughts on a "better bar exam"

The ABA Journal has a long piece by Stephanie Francis Ward on the bar exam. It includes a few quotations from me that I thought I’d dive into.

“I think people really want to solve this one major problem, and that’s, ‘Is there a body of students out there who would be good lawyers, but are failing the bar? Is there some way of getting them through the bar?’ ” asks Derek Muller, a law professor at Pepperdine University, who writes at the website Excess of Democracy.

He also wonders if changing the bar exam would solve any problems, including a decreasing national pass rate. The overall pass rate was 54% in 2018; and in 2008 it was 71%, according to NCBE data.

“If lawyers are saying, ‘I did it this way, kids need to do it this way, too,’ that’s not productive. At the same time, if schools are saying, ‘We need to change the bar in whatever way we can to get kids to pass,’ that’s not productive either,” Muller says.

He’s not sure that changing the format would make much of a pass rate difference, and he wonders how some ideas, like giving a partial bar exam after students finish their first year of law school, would actually play out.

“I think that would put extraordinary pressure on the first year of law school in a different way. For a lot of students, it’s a steep learning curve the first year, and to add another exam, I think that would be a step backward,” Muller says.

There are really three major concerns that Ms. Ward helpfully drew from our conversation.

First, what exactly is the problem we’re trying to solve? There are so many competing debates, in my view, that it helps to parse them out. The most material concern, in my view, is a Type I/II error problem—is the bar letting the wrong people in to practice law, or is it preventing the wrong people from practicing law? Few, I think, believe the bar admits too many. So if it’s admitting too few, what system do we want to help identify the “good” attorneys out there who are failing the present bar exam? A lot of reform efforts, in my view, don’t adequately start with this precise formulation of the problem.

Second, the “kids” (a pejorative I use only in scare quotes!) often face two competing arguments. Either new bar admittees need to take the bar exam “the old-fashioned way,” which is essentially a traditionalist argument that holds little weight in the face of material criticism. Or new bar admittees should face as few barriers to practice as possible, essentially an argument raised primarily by law school deans who have seen demand drop in recent years, admissions standards decline, and bar passage rates decline along with them. There’s assuredly merit in both—but both are also too easily wrapped up in self-interest and merit deeper reflection.

Third, some reform proposals, in my view, are worse than the existing problem. Take the one suggested, a “baby bar” after the first year. That’s a proposal that exists in California among non-ABA-accredited schools to ensure that admissions standards are acceptable and that retaining the students in their legal education is worthwhile. For the vast majority of law students at ABA law schools, this is not a problem. It would place extraordinary pressure on preparing for this exam in the summer, curtailing summer working opportunities and changing the first-year curricular emphasis. And it would effectively create a second bar exam when people are already complaining about the first!

All this is to say, reform efforts (and there are thoughtful ones out there!) must make careful evaluations—evidence-based, out of public interest rather than law school interests or anticompetitive guild interests. We’ve seen fairly little change in the last decade, I think, in part because the interests have not often been public oriented, and, when they are, the proposed changes do little to address the primary concerns.

Professor Deborah Jones Merritt’s proposals are the right kind—thinking about breaking up the bar exam into components to make testing easier (think how the MPRE is already a separate component), or providing more time flexibility (to address accommodation concerns and to ensure deeper thinking on legal issues). We’ll see if these, or others, make their way into the bar in the decade to come.

Patent bar exam results have been declining alongside state bar exam pass rates

Bar exam pass rates have fallen and remained relatively low for several years. The National Conference of Bar Examiners noted long ago that Multistate Professional Responsibility Exam scores had been declining, which hinted at a coming slide in state bar exam pass rates—even though the MPRE is independent of the state bar exam.

I recently discovered that the United States Patent and Trademark Office publishes patent bar pass rates. From my understanding, this bar is most commonly taken by law students or recent graduates. Test-takers are those who intend to do certain patent legal practice before the USPTO. Test-takers have declined in recent years, consistent with the decline in overall law school enrollment.

I noted that patent bar pass rates have declined in recent years alongside state bar pass rates. I compared the first-time overall state bar pass rates against the patent bar pass rates.

It’s another indication of an overall concern about the ability of law school graduates, regardless of the form of the test or the area of examination, to pass—and an indication that law schools need to consider solutions apart from content-specific or state bar exam-specific concerns.