Justice Ginsburg turns the "Purcell Principle" upside down in Wisconsin primary case

The coronavirus pandemic has led to a number of late-breaking election law challenges since mid-March in states holding primaries. The Wisconsin primary is the most recent saga. Amid growing pandemic concerns, including a “stay at home” order issued March 24, a federal court on April 2 issued a preliminary injunction changing some election procedures ahead of the April 7 election. The Supreme Court on April 6 effectively put some of those changes on hold in its decision in Republican National Committee v. Democratic National Committee.

That’s all the summary or commentary I’ll provide on the merits—changes to the primary could have occurred in the state legislature or, in some jurisdictions, by executive order, and there are challenging questions to consider as a primary election approaches in the midst of a pandemic.

Instead, one item struck me in the dispute between the five-justice per curiam majority opinion and the opinion Justice Ruth Bader Ginsburg wrote on behalf of herself and Justices Stephen Breyer, Sonia Sotomayor, and Elena Kagan. It concerns an interpretation of Purcell v. Gonzalez (2006).

There, the Court reversed a lower court decision that altered an election rule close in time to the election. The Purcell Court emphasized:

Faced with an application to enjoin operation of voter identification procedures just weeks before an election, the Court of Appeals was required to weigh, in addition to the harms attendant upon issuance or nonissuance of an injunction, considerations specific to election cases and its own institutional procedures. Court orders affecting elections, especially conflicting orders, can themselves result in voter confusion and consequent incentive to remain away from the polls. As an election draws closer, that risk will increase.

Professor Rick Hasen has referred to this as the “Purcell Principle,” and the Supreme Court has, for the most part, adhered to it—it has reversed lower court injunctions that have altered election rules close in time to the election (the Court’s per curiam opinion cites a couple in RNC v. DNC). It would be nice if the Court articulated better standards than Purcell provides, but the point remains.

But note what these decisions do—the Supreme Court changes a lower court ruling, which had changed an election rule. Purcell is designed to rein in lower courts that change rules. It’s not designed to rein in the Supreme Court and prevent the Court from restoring the original rule.

Consistent with Purcell, the majority opinion tells a lower court not to change the election rules too close in time to the election (here, five days before the election):

This Court has repeatedly emphasized that lower federal courts should ordinarily not alter the election rules on the eve of an election. See Purcell v. Gonzalez, 549 U. S. 1 (2006) (per curiam); Frank v. Walker, 574 U. S. 929 (2014); Veasey v. Perry, 574 U. S. __ (2014).

But that’s where I noticed a couple of passages from Justice Ginsburg’s dissenting opinion interpreting Purcell differently:

This Court’s intervention is thus ill advised, especially so at this late hour. See Purcell v. Gonzalez, 549 U. S. 1, 4–5 (2006) (per curiam). Election officials have spent the past few days establishing procedures and informing voters in accordance with the District Court’s deadline. For this Court to upend the process—a day before the April 7 post-mark deadline—is sure to confound election officials and voters.

Second, the Court’s order cites Purcell, apparently skeptical of the District Court’s intervention shortly before an election. Nevermind that the District Court was reacting to a grave, rapidly developing public health crisis. If proximity to the election counseled hesitation when the District Court acted several days ago, this Court’s intervention today—even closer to the election—is all the more inappropriate.

That can’t be what Purcell dictates. Purcell’s entire point is that lower courts can’t change the rules of elections close in time to the election—not that once they do so, the Supreme Court (which always hears the case even closer to the election) can’t restore the original rule.

Now, had Justice Ginsburg offered an alternative interpretation of Purcell—say, a Purcell exception—it might have been more persuasive. For instance, she notes that “the District Court was reacting to a grave, rapidly developing public health crisis.” If Purcell dictates that election rules shouldn’t be changed close in time to elections, perhaps there are times when facts are so late-breaking and so dire that Purcell should give way.

But that would require the Court to articulate a deeper understanding of Purcell, including its contours and its potential exceptions. Justice Ginsburg’s dissenting opinion doesn’t do so, turning Purcell on its head and rendering it largely meaningless.

While one can have varying views on the majority and dissenting opinions, their rightness or wrongness, and the challenges in Wisconsin, I do think this misunderstanding of Purcell was significant enough to draw a little attention to it.

This post has been updated.

Did the California bar enact “diploma privilege after the 1906 California earthquake and during WWII”?

This claim, versions of which I’ve seen elsewhere, appears in a letter signed by hundreds of law students urging the California Supreme Court to enact an “emergency diploma privilege” as an alternative to the traditional bar exam. (Some of my thoughts on that are here and here, and some accommodations state bars should assuredly now take are here.)

Here’s the claim:

Diploma privilege is feasible. After all, California enacted diploma privilege after the 1906 California earthquake and during WWII.[footnote]

The claims are not entirely accurate, and they are much narrower than presented (both here and elsewhere).

The footnote of the letter cites this article, which actually confirms my instinct about the California bar in 1906—the California State Board of Bar Examiners was formed in 1919 and created “the first written exams” in 1920. (Independently, the article does not claim that “diploma privilege” was “enacted” after the 1906 earthquake.)

Prior to 1906, then, admission to the bar was on motion. There was no written bar exam (as we know it) in the first place.

And the 1906 episode was a specific incident at the University of California - Hastings that prompted the story. Here’s an account from the Hastings Law Journal in 1953:

The fire and earthquake of 1906 worked its havoc on Hastings. The City Hall was completely destroyed. Although former students allege that they started out for the College on April 18, no trace of Hastings could be found. The College year ended abruptly. So great was the catastrophe that it was deemed permissible to waive final examinations, which, through coincidence, ha[d] been scheduled for that day. The class of 1906 had an unique distinction. Whereas earlier classes had been admitted to the bar on the motion of the faculty (college examinations being passed to the satisfaction of the faculty) the class of 1906 was admitted “because of the motion of the earth!”

It’s worth emphasizing, then, that diploma privilege was actually the norm in 1906, not the exception. The exception concerned whether graduating law students needed to complete their final examinations. And it apparently extended only to the University of California - Hastings, given that the law school was destroyed the day of the exam. (The California bar, it seems, accepted the motion of the faculty even in the absence of a final law school examination.) And there was no traditional written “bar exam” at the time.

Additionally, the World War II exception is somewhat narrower than the letter suggests, but relevant, as I detail below. Here’s how it was summarized in that footnote citation, Admissions standards evolve across decades, by Kathleen Beitiks in the California Bar Journal:

For many years the California examination consisted of a written test exclusively. And under some circumstances during World War II, bar examiners waived the exam requirement for returning soldiers whose legal careers were interrupted.

This exception was actually a statutory exemption enacted by the California legislature. Here’s Chapter 65 from the First Extraordinary Session of the 56th California Legislature, enacted in 1946:

The provisions of subdivisions (d) and (h) of Section 6060 do not apply to any person who, after September 16, 1940, and prior to the termination of hostilities between the United States and the nations with which the United States is now at war as determined by Act of Congress or Proclamation of the President, has graduated from a law school accredited by the examining board and who after such graduation served in the armed forces of the United States before taking an examination for admission to the bar, nor to any person, who, after September 16, 1940 satisfactorily completed at least two years of study at a law school then accredited by the examining board and whose legal education was thereafter interrupted by his service in the armed forces of the United States, and who subsequently graduates from a law school accredited by the examining board. The provisions of this section shall not apply to any person who enters the armed forces of the United States after the effective date of this section, nor to any person who at the time of entering the armed forces was not a bona fide resident of this State.

This section shall remain in effect until the ninety-first day after final adjournment of the Fifty-eighth Regular Session of the Legislature. While this section is in effect it shall supersede any existing provisions of law which are in conflict with this section; but such provisions are not repealed by this section and after this section is no longer effective shall have the same force as though this section had not been enacted.

(At the time, subdivision (d) of Section 6060 required one to be a bona fide resident of the state for three months before the bar exam, and subdivision (h) required one to “have passed a final bar examination given by the examining committee.”)

It’s not that diploma privilege was “enacted” “during World War II.” It’s that veteran residents of California who graduated from a California-accredited law school but had their legal education “interrupted” by military service had the bar exam requirement waived after the war ended.

That’s not to say that those who signed this letter don’t have a point in drawing an analogy here. There was an emergency interruption of law students’ ordinary careers as they went to serve in the military. Upon their return, the legislature waived the requirement that they take the bar exam. That seems particularly sensible, given that their study of law was interrupted—much easier to take the bar right after graduating than after years of intervening military service. But ease and convenience for the test-taker—more so than the pre-existing concern of “protecting the public”—appear (!) to have motivated the legislature in this case.

So, too, should state bar licensing authorities (or state legislatures) consider what kind of interruption, and what kind of previous educational experience, might qualify for admission to the bar—a matter of robust debate in the weeks ahead in states around the country.

When the Task Force on the New York Bar Examination plagiarizes your work without attribution

UPDATE: The chair of the Task Force reached out to me with apologies and intends to update the report with attribution. I’ll link to that updated report when it’s available.

My blog isn’t much. It makes no money. It garners little attention. I don’t earn money consulting from it. It contains my half-baked musings, the best of which might become an article, the worst of which I strike through and hope people forget.

But at the very least, it would be nice to see my work acknowledged if it’s useful.

Sadly, the Task Force on the New York Bar Examination found my work useful, but chose to copy it without attribution.

Its recent report on the state of the bar exam takes large chunks of my blog and treats it as its own work product. Several paragraphs are lifted from my 2015 post, “No, the MBE was not ‘harder’ than usual.”

Here’s a part of my post:

The MBE uses a process known as "equating," then "scales" the test. These are technical statistical measures, but here's what it's designed to do. (Let me introduce an important caveat here: the explanations are grossly oversimplified but contain the most basic explanations of measurement!)


Standardized testing needs a way of accounting for this. So it does something called equating. It uses versions of questions from previous administrations of the exam, known as "anchor" questions or "equators." It then uses these anchor questions to compare the two different groups. One can tell if the second group performed better, worse, or similarly on the anchor questions, which allows you to compare groups over time. It then examines how the second group did on the new questions. It can then better evaluate performance on those new questions by scaling the score based on the performance on the anchor questions.

This is from Page 46 of the Task Force report:

The MBE also uses a process known as “equating,” which “scales” the test to adjust for differences between exams and by different test takers over time. Equating uses versions of questions from previous administrations of the exam, known as “anchor” questions or “equators” to compare two different groups. This way, in theory, one can tell if the second group performed better, worse, or similarly on the anchor questions, which allows groups of test takers to be compared across test administrations. Then, how the second group did on the new questions is examined so that performance on the new questions can be evaluated based on performance on the anchor questions.

Here’s another part of my post:

Consider two groups of similarly-situated test-takers, Group A and Group B. They each achieve the same score, 15 correct, on a batch of "equators." But Group A scores 21 correct on the unique questions, while Group B scores just 17 right.

We can feel fairly confident that Groups A and B are of similar ability. That's because they achieved the same score on the anchor questions, the equators that help us compare groups across test administrations.

And we can also feel fairly confident that Group B had a harder test than Group A. (Subject to a caveat discussed later in this part.) That's because we would expect Group B's scores to look like Group A's scores because they are of a similar capability. Because Group B performed worse on unique questions, it looks like they received a harder batch of questions.

The solution? We scale the answers so that Group B's 17 correct answers look like Group A's 21 correct answers. That accounts for the harder questions. Bar pass rates between Group A and Group B should look the same.

In short, then, it's irrelevant if Group B's test is harder. We'll adjust the results because we have a mechanism designed to account for variances in the difficulty of the test. Group B's pass rate will match Group A's pass rate because the equators establish that they are of similar ability.

When someone criticizes the MBE as being "harder," in order for that statement to have any relevance, that person must mean that it is "harder" in a way that caused lower scores; that is not the case in typical equating and scaling, as demonstrated in this example.

Let's instead look at a new group, Group C.

On the unique questions, Group C did worse than Group A (16 right as opposed to 21 right), much like Group B (17 to 21). But on the equators, the measure for comparing performance across tests, Group C also performed worse, 13 right instead of Group A's 15.

We can feel fairly confident, then, that Group C is of lesser ability than Group A. Their performance on the equators shows as much.

That also suggests that when Group C performed worse on unique questions than Group A, it was not because the questions were harder; it was because they were of lesser ability.

This is from pages 46-47 of the report:

Consider two groups of similarly-situated test-takers, Group A and Group B. They each achieve the same score, 15 correct, on a set of the “equator” questions. But Group A scores 21 correct on the unique questions, while Group B scores just 17 of these questions right. Based on Groups A and B’s same score on the equator questions, we can feel fairly certain that Groups A and B are of similar ability. We can also feel fairly certain that Group B had a harder test than Group A. This is because we would expect Group B’s scores to look like Group A’s scores because they are of a similar capability. Because Group B performed worse on unique questions, it looks like they received a harder group of questions. Now we scale the answers so that Group B’s 17 correct answers look like Group A’s 21 correct answers, thus accounting for the harder questions. Bar pass rates between Group A and Group B should then look the same. In short, it is irrelevant if Group B’s test is harder because the results will be adjusted to account for variances in test difficulty. Group B’s pass rate will match Group A’s pass rate because the equators establish that they are of similar ability.

Now consider Group C. In the unique questions, Group C did worse than Group A (16 right as opposed to 21 right), much like Group B (17 to 21). But on the equators, the measure for comparing performance across tests, Group C also performed worse, 13 right instead of Group A’s 15. We can feel fairly certain, then, that Group C is of lesser ability than Group A. Their performance on the equators shows as much. That also suggests that when Group C performed worse on unique questions than Group A, it was not because the questions were harder; it was because they were of lesser ability.
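(As an aside, for readers who want the arithmetic in one place: here is a minimal sketch of the oversimplified equating logic described above, using the same hypothetical Group A, B, and C numbers. It is an illustration of the concept only—not how the NCBE actually computes scaled scores.)

```python
# Oversimplified illustration of the equating logic described above.
# The numbers match the hypothetical Groups A, B, and C; this is NOT
# how the NCBE actually equates or scales the MBE.

def anchors_match(anchor_a, anchor_b):
    """Similar anchor scores suggest the two groups are of similar ability."""
    return anchor_a == anchor_b

def form_looks_harder(anchor_a, unique_a, anchor_b, unique_b):
    """If the groups match on anchors but the second group does worse on the
    unique questions, the second group's form looks harder, so its scores
    would be scaled up."""
    return anchors_match(anchor_a, anchor_b) and unique_b < unique_a

# Group A vs. Group B: same anchors (15 vs. 15), B worse on unique (17 vs. 21).
print(form_looks_harder(15, 21, 15, 17))  # True -> scale B's 17 to look like A's 21

# Group A vs. Group C: C worse on anchors too (13 vs. 15), so its lower
# unique-question score (16 vs. 21) reflects lesser ability, not a harder form.
print(form_looks_harder(15, 21, 13, 16))  # False -> no difficulty adjustment implied
```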

I don’t have particular comments on the rest of the report. I just highlight that my work was copied but never cited. I’m glad someone found it a little helpful. I’d be more glad if there were attribution.

What law schools can learn from a disrupted Spring 2020 semester

In the middle of a disrupted semester, law schools (and higher education in general) are making significant accommodations for faculty and students. Tenure clocks are delayed for a year, student evaluations won’t be used to evaluate faculty, grading policies have been altered, exam formats may change, and the list goes on.

But it would be a mistake for schools to throw up their hands at the end of the semester and say that nothing can be learned from it! Indeed, a great deal can be learned from this sudden experiment—yes, with some limited value and all appropriate caveats, but there’s much to consider.

Here are a couple of things that law schools should look at closely after the end of the semester—and, in some cases, schools might want to start thinking now about how to evaluate them.

First, classroom experiences. While student evaluations shouldn’t be used to evaluate teaching performance (indeed, perhaps they should be of limited value in all circumstances!), they can tell us a lot about the classroom experience of students as professors abruptly switched to online formats under a variety of approaches. Schools can look and see what might have worked or didn’t work. Synchronous or asynchronous? Traditional material or guest speakers?

Schools should comb through the open-response components of student evaluations to figure out whether any online practices were particularly successful—or particularly unsuccessful. Evaluations might include specific questions about the online components, or there could be an entirely separate questionnaire submitted to students. Institutions should also ask faculty to describe their online pedagogy and see what can be gleaned from those practices and student reactions. This is particularly true given the Socratic, casebook-based methods used in most law schools, particularly in the first-year and core curriculum.

Second, grading policies. Law schools around the country have implemented different grading policies in light of coronavirus-related disruption. Much is in dispute. Much, I think, is speculative. But whatever policy a school adopts, a school ought to examine whether the benefits or costs of the policy came to pass.

For schools that kept their grading as is: were second-semester 1L grades as highly correlated with first-semester grades as in previous 1L classes? If they were less correlated, did the change affect certain racial, gender, or socioeconomic groups more than others?

For schools that switched to optional pass-fail: did some students disproportionately take advantage of the pass-fail option—based on GPA quartile, demographics, etc.? Did students who remained on the graded track benefit from higher grades because pass-fail students worked "less," if all were curved together?

For schools that went to mandatory pass-fail: did students with better "resume bias" (e.g., elite undergraduate institutions) have a disproportionately better OCI experience than in previous years? Did bar passage rates worsen compared to graduates of other schools? [Of course, the economic downturn and alterations to the bar exam could affect this.] Did academic dismissal rates change?

These are just a few ideas to try to measure the effects of changes to policies and look for good or bad signs.
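By way of illustration only, here is a minimal sketch of the first kind of check, assuming a school keeps a simple table of 1L fall and spring GPAs by entering class; the column names and the pandas approach are my assumptions, not anything a school is required to use.

```python
import pandas as pd

# Hypothetical records: one row per 1L student, with fall and spring GPAs
# and an entering-class label. Column names are illustrative only.
df = pd.DataFrame({
    "cohort":     ["2018-19", "2018-19", "2018-19", "2019-20", "2019-20", "2019-20"],
    "fall_gpa":   [3.2, 2.8, 3.6, 3.1, 2.9, 3.5],
    "spring_gpa": [3.3, 2.7, 3.5, 3.4, 3.3, 3.2],
})

# Did spring grades track fall grades as closely in the disrupted year
# as in prior years?
for cohort, grades in df.groupby("cohort"):
    r = grades["fall_gpa"].corr(grades["spring_gpa"])
    print(f"1L class {cohort}: fall/spring GPA correlation = {r:.2f}")
```

A similar slice-and-compare approach could be repeated by demographic or GPA-quartile groups to see whether any change in correlation was concentrated in particular groups.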

*

I’m sure others will think of important things for law schools to reconsider—exam policies, attendance rules, faculty committees and meeting rules, “work from home” alternatives for faculty and staff, wet ink signature requirements, and so on. But I want to emphasize that schools should look to learn the right kinds of lessons from this disrupted semester.

Small accommodations that state bar licensing authorities should start to consider now

On the heels of thinking about how to handle the July 2020 bar exam and alternative solutions, some states (led by New York) are moving to postpone the bar exam in light of the coronavirus pandemic. I imagine many states will follow this proposal (even if it’s of limited effectiveness).

But state bar licensing authorities should also be thinking of small accommodations that can help ease test-takers’ concerns in times of heightened uncertainty—if the state bar chooses to administer a traditional exam.

First, state bar licensing authorities should allow prospective test-takers to seek a refund up until the day of the exam itself if they choose not to take the bar. Just as California recently allowed for refunds after it accidentally disclosed exam topics ahead of the exam, states can offer a generous refund policy like, ideally, hotels and airlines are doing. That would ease concerns for test-takers who are uncertain about how their study habits, family responsibilities, or health will fare in the months ahead. (Admittedly, I understand this is a cost to licensing authorities that are renting out space for test-takers. But it’s one, I think, they should bear here.)

Second, state bar licensing authorities should allow for a later registration deadline for test-takers without a late fee attaching. For example, if registration is usually twelve weeks before the exam, make it ten or eight weeks. Understandably again, state bars look to reserve space based on registered test-takers. But there might be higher levels of uncertainty looking at the fall, so test-takers might hesitate—particularly those who have jobs in another state that may not pan out in an economic downturn, or practicing attorneys in another jurisdiction weighing whether to take an additional bar exam this fall or postpone until a later date.

Third, state bar licensing authorities should offer reduced prices for repeaters on the February 2021 exam. This is a concession that some applicants may not be as prepared for the fall 2020 test as they otherwise would be. It would be a modest gesture to help, on a global level, all those who may fail the fall 2020 exam.

Fourth, state bar licensing authorities should expedite grading. To be honest, this is something that state bar licensing authorities should be working on anyway (so I shouldn’t treat this as a coronavirus pandemic-related concern). That’s especially true in California, which recently reduced its exam from three days (two days of essays) to two days (one day of essays), but still takes until late November to release results. States like Maryland and Rhode Island, or others with late October or November release dates, should consider adding graders or thinking about how to expedite the grading process to push out results to test-takers earlier. That would minimize the job-related effect on test-takers in the event the exam is pushed to the fall.

These are modest decisions, but they all cost the state bar licensing authority money. If budget concessions can be made, and if reduced costs with vendors (say, sites hosting the bar exam) can be negotiated, these changes would help all test-takers in the fall 2020 administration of the bar exam.

My best advice for teaching in an online classroom? Ask for student feedback

I’ve seen a lot of law professors engage with one another on blogs or Twitter, swapping advice about tips and tricks for teaching with Zoom, teaching in a remote environment, and so on. I wanted to contribute my best, small piece of advice to that discussion: ask for student feedback.

The student’s (or “user’s”) experience may look different from the professor’s experience. Small details about sound settings, lighting, or screen sharing may look quite different to the students. Students may have anxiety about details like class discussion format or technology that professors may either not be aware of or simply fail to raise with students.

The best thing I’ve done so far is to solicit feedback from students, both before I started teaching in a virtual classroom and after. Beforehand, I learned their concerns and anxieties and could try to address them as best I could before our first class—by providing transparency and guidance, and by being aware of them as we started our class together. After the first week of remote teaching, I did the same so I could tweak details like how my PowerPoint slides display on the shared screen.

It wasn’t much to solicit feedback—an anonymous Google form with a single open-ended prompt in both cases.

But I think it’s probably one of the most effective ways to figure out what students’ concerns are and to address them in the specific class environment, adapted to the professor’s teaching style.

We’ll see how the remainder of the semester goes, but it’s brought my attention to matters I wouldn’t otherwise have considered, and I hope it improves the experience for students in the weeks ahead.

Law school work product as a substitute for bar exams

Yesterday, I offered some thoughts about the bar exam and the potential solutions to delays caused by Covid-19. Some solutions, like “emergency diploma privilege,” primarily focus on in-state graduates of the Class of 2020 of ABA-accredited law schools taking the bar for the first time. I mused that there are several cohorts of bar exam test-takers that state bar licensing authorities must consider when adjusting their bar exam policies.

One idea I’ve mulled over is, admittedly, imperfect, half-baked, and labor-intensive—but it opens up opportunities for essentially all comers (with some caveats) to get admitted to a jurisdiction.

In short: state bars could review portfolios of prospective attorneys’ work product (from law school, supervised practice, or actual practice) and determine whether the candidate meets the “minimum standards” to practice in the jurisdiction.

Okay, in length: this isn’t easy. But graduates of all ABA-accredited law schools are required to take a legal writing course and an upper-division writing course. Written work product is assuredly something that comes out of every law school.

Students also commonly take a core of courses that are on the bar exam and take written exams in those courses. They write exams examining issues of Civil Procedure, Evidence, Contracts, and Property.

Law school graduates also spend their summers—or some semesters—working with legal employers and developing written work product. They work in clinics or externships during the semester, and they write memos or other work product.

State licensing authorities, then, could require prospective applicants to develop portfolios of written work product. State bar graders could then go through the standard “calibration” exercises they usually do to “grade” written work product. Multiple “graders” would look at each component of the work product.

Now, there are huge problems with this, I know, and I’ll start with a few. First and foremost is the lack of standardization. Not everyone does the same writing assignments in law school, or even the same types of writing assignments. Exam essays typically lack context (e.g., they don’t have a “statement of facts” like a typical legal writing assignment would). Not everyone spends a summer in the practice of law (some work in other fields, like finance), and not everyone has usable written work product (e.g., from an externship with a judge who doesn’t want work product disclosed). There’s no scaling and equating like one has with the bar exam to improve reliability. Grading would take quite some time.

In the alternative, state licensing authorities could authorize “supervised practice” (Proposal 6 in the working paper on bar exam alternatives) and use the work product from that supervised practice later to submit to the licensing authority to supplement the law school work product.

But an advantage of this proposal, I think, is that the written product is what we’d expect of attorneys and a good measure of their ability. Law school grades (i.e., mostly the assessment of written work product) strongly correlate with bar exam scores and bar exam success. It would extend to in-state or out-of-state graduates, to ABA-accredited graduates or others, to foreign-trained attorneys, or to licensed practitioners in other states. It could even apply to those who’ve failed the bar exam before—if they’re judged on their work to be “minimally qualified,” all the better.

I toss it out as one possible solution that requires little additional or new work on the part of prospective applicants to the bar, that judges them on something relevant to their ability to engage in legal analysis, and that mitigates concerns around different cohorts of applicants to the bar.

Maybe it’s too much work, the disparities in the types of work product too vast, for us to consider. At the same time, federal judges commonly review clerkship applicants on an open-ended consideration of written work product. Perhaps there’s something to be said for looking at past written work product.

Some thoughts on the bar exam and Covid-19

A helpful and timely working paper from several law professors—including authors whose work I’ve admired in the past like Professor Deborah Jones Merritt, Professor Joan Howarth, and Professor Marsha Griggs—offers much to consider about the bar exam in light of the coronavirus pandemic and the spread of the illness Covid-19. That is, the coronavirus outbreak may persist into July and call for rethinking how to address the bar exam. They’ve done a tremendous job in a short period of time thinking about it and writing about it.

I wanted to address the question slightly differently from the framing in their paper, however. The framing in their paper is as follows:

At the same time, it is essential to continue licensing new lawyers. Each year, more than 24,000 graduates of ABA-accredited law schools begin jobs that require bar admission. The legal system depends on this yearly influx to maintain client service.

These solutions, then, are oriented toward looking at the Class of 2020 and how this cohort of attorneys can be licensed. To be sure, this is how the bulk of new lawyers are added to the legal profession each year; this is a pressing concern for law schools, whose graduates are placed into a precarious position; and this is assuredly the focus of state licensing boards.

But looking at the position slightly differently can present a very different picture: instead of looking at the Class of 2020 graduates of ABA-accredited schools taking the bar exam, one might look at the administration of the bar exam. I think this yields some contrasts in the scope of their proposals.

The authors of the paper nicely identify six alternatives—the first three “likely to fail,” the last three holding “considerable promise”—and perhaps jurisdiction-specific circumstances mean some will work in some places and others elsewhere:

  1. Postponement

  2. Online exam

  3. Exams administered in small groups

  4. Emergency diploma privilege

  5. Emergency diploma privilege-plus

  6. Supervised practice

I’ll come to some details of these proposals in a moment, but the bulk of them are in the paper. At the same time, I want to focus on several populations who take the bar exam in a given year:

Cohort A. JD graduates of an ABA-accredited law school from that state. This is probably the largest contingent of bar test-takers, although many take the test out of state.

Cohort B. JD graduates of an ABA-accredited law school from out of state. Some law schools like Yale predominantly place graduates out of state. Virtually every law school sends at least some students to take another state’s bar exam.

Cohort C. LLM graduates of an ABA-accredited law school from that state. While JD graduates are the vast majority of graduates each year, foreign-trained lawyers commonly earn an LLM in the United States to enable them to take the bar exam and practice in the United States.

Cohort D. LLM graduates of an ABA-accredited law school from out of state. Given that New York and California are popular destinations for most foreign-trained attorneys, LLM graduates in other states often head to those states to take the bar.

Cohort E. Graduates of non-ABA-accredited law schools. While these are far rarer, in states like California graduates of state-accredited schools can take that state’s bar exam.

Cohort F. Test-takers who failed a bar exam previously. A significant number of retakers make up the bar exam test-taking cohort each year.

Cohort G. Attorneys admitted in other jurisdictions taking the bar. While reciprocity exists in some states, it doesn’t in others, and attorneys sometimes have to take a bar exam to get admitted to that jurisdiction.

(Maybe you can think of other groups. Let me know!)

So, the bar exam is being administered to these sets of test-takers.

Cohorts A through F are all “new” lawyers in the United States; Cohort G includes those who are already practitioners elsewhere (or perhaps let their license expire elsewhere).

Proposals 1 (Postponement), 2 (Online exam), and 3 (Exams administered in small groups) would apply to all seven of these cohorts. But, I think, as the authors of the paper note, these seem less likely options. Particularly Proposal 1—it’s not clear when this pandemic will end, and states have to act uniformly to take advantage of the uniform bar exam or the MBE. And Proposals 2 and 3 might be feasible only if aggressive measures were pursued.

Proposal 4 (Emergency Diploma Privilege) offers strong benefits for Cohort A. Undoubtedly, recent law school graduates would not have to study for the summer bar exam; they would not need to spend money on bar prep courses; they would be guaranteed to be admitted to practice (subject, of course, to character & fitness reviews, and passing the MPRE).

That said, I would take some issue with the comparison to Wisconsin—yes, Wisconsin has had diploma privilege. But (1) the diploma privilege mandates an extensive required curriculum, (2) Wisconsin’s cut score for the bar exam is the lowest in the United States, and (3) the state has just two law schools, Wisconsin and Marquette—and about 75 law schools have worse median LSAT profiles, and about 60 have worse 25th-percentile LSAT profiles, among their incoming classes than Marquette. In other words, all other states have higher bar exam standards, many have graduating students with materially lower predictors of bar passage, and no other state requires the kind of core curriculum Wisconsin does.

But setting all those aside, it is an emergency situation (and perhaps Proposal 5 can help take care of some of this), and we shouldn’t expect outcomes like those in careful set-ups like Wisconsin. But note that this only benefits Cohort A. Cohort B (out-of-staters) would not benefit, unless states began instituting some reciprocity of diploma privilege, as the paper suggests is possible. It’s not clear that LLM graduates would benefit (in Wisconsin, for instance, they can’t—it applies only to “84 semester credit” degrees, i.e., the JD). The paper’s proposal extends only to ABA-accredited schools and first-time test-takers: “solely to graduates of the class of 2020 (including those who graduated in December 2019) from accredited law schools. Individuals who had previously taken and failed a bar examination in any state could be excluded.” (Emphasis added.) And it doesn’t help Cohort G, attorneys admitted elsewhere seeking admission to a new jurisdiction’s bar.

Now, it might be that Proposal 4 is still a good proposal and needs to be supplemented with other proposals (say, Proposal 3 now that the test-taking cohort is much smaller). But the point is that bar exam solutions focusing on recent graduates may miss significant other cohorts seeking admission to the bar.

Proposal 5 adds to Proposal 4—requiring some “bridge the gap” programs, CLE requirements, CALI lessons, or the like. It would add complexity and help overcome some of the concerns of Proposal 4—that is, given that Wisconsin has a bar that requires greater supervision on the law school end, maybe other states could require greater supervision on the back end.

Proposal 6 would allow supervised practice: a supervisor would advise the graduate and, upon completion of 240 hours of work (e.g., six 40-hour weeks), the graduate could be admitted to that bar. This helps extend to Cohort B: “Notably, this option would allow jurisdictions to license lawyers graduating from law schools in any state.” Again, however, the proposal has some limitations, extending to “2020 graduates of accredited law schools.”

These last three proposals can help Cohort A. They could, in some circumstances, help Cohort B.

But it’s not clear that they would necessarily help others. It could be, I suppose, that a bar might loosen its reciprocity rules under Cohort G for those who registered to take the bar. Or it might extend some of them to non-ABA-accredited graduates.

It’s particularly worth considering, however, what to do with everyone else. That is, these programs might help recent graduates. But some people will still want to take the bar! Should states just cancel the bar? Those who failed before can’t take it? Should they try one of the first three proposals for other cohorts?

It’s not clear to me what the best approach is. The bar exam affects far more than recent law school graduates, although law school educators (including me!) are particularly concerned with this cohort. The state bar is going to have to determine how to handle all of these cohorts who might be affected if Covid-19 restrictions extend into July.

There are no easy answers. I appreciate the authors of this study for putting such clear and helpful options on the table. I imagine state bars around the country are considering the appropriate paths to take. I look forward to seeing more such discussions play out in the weeks ahead, and I hope state bars can come up with solutions that best help the legal system and all prospective test-takers.

Most law schools have become more affordable in the last three years, 2019 edition

Six years ago, I noted that around 30 law schools had become “more affordable” over a three-year period. Three years ago, I noted that most law schools had become more affordable. In the last three years, law schools have continued to become more affordable—at least, in the measure of student debt.

USNWR reports average indebtedness at graduation among law school graduates and the percentage who took out loans. (Go there to see the highs and the lows.)

I removed all schools that failed to disclose debt figures for either 2016 graduates or 2019 graduates. (I had a partial data set from 2016, apologies!) That brought us to 150 schools.

Some schools are unable to read the USNWR forms correctly and report only part of the debt one year and the cumulative debt another year; I don't attempt to determine which schools made that error, but schools appear to have gotten better at reporting data over the years.

I calculated 6.5% inflation between the Class of 2016 and the Class of 2019, and adjusted the 2016 figures accordingly. (Inflation adjustment comes with its own controversial choices, to be sure!) The debt figures listed on USNWR are an average for those who incurred debt; to arrive at a more accurate picture of the debt load of the class as a whole, I then factored in the percentage of students who graduated without any debt to reach an overall average.
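For readers who want to reproduce the arithmetic, here is a minimal sketch of the adjustment described above, with hypothetical numbers; the 6.5% figure is the inflation adjustment stated in the post, and everything else (the example school’s figures, the function names) is illustrative only.

```python
# Sketch of the debt comparison described above, with hypothetical numbers.
# Inputs mirror what USNWR reports: average debt among graduates who borrowed,
# and the percentage of graduates who incurred any debt.

INFLATION_2016_TO_2019 = 0.065  # the 6.5% adjustment stated in the post

def overall_average_debt(avg_debt_among_borrowers, share_with_debt):
    """Spread borrowers' average debt across the whole graduating class."""
    return avg_debt_among_borrowers * share_with_debt

def compare_classes(avg_2016, share_2016, avg_2019, share_2019):
    # Adjust the 2016 average for inflation, then compute class-wide averages.
    adj_avg_2016 = avg_2016 * (1 + INFLATION_2016_TO_2019)
    overall_2016 = overall_average_debt(adj_avg_2016, share_2016)
    overall_2019 = overall_average_debt(avg_2019, share_2019)
    dollar_diff = overall_2019 - overall_2016
    pct_diff = dollar_diff / overall_2016
    return overall_2016, overall_2019, dollar_diff, pct_diff

# Hypothetical school: $120,000 average debt among the 80% who borrowed in 2016;
# $110,000 among the 75% who borrowed in 2019.
o16, o19, dd, pdiff = compare_classes(120_000, 0.80, 110_000, 0.75)
print(f"2016 (adj.): ${o16:,.0f}  2019: ${o19:,.0f}  diff: ${dd:,.0f} ({pdiff:.1%})")
```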

Among the 150 schools, 120 saw a decline in overall debt loads; just 30 saw an inflation-adjusted increase.

Many possible reasons for the changes are possible. As I explained in 2016, students may graduate without debt for many reasons: "That could be because they are independently wealthy or come from a wealthy family willing to finance the education; they could have substantial scholarship assistance; they could earn income during school or during the summers; they could live in a low cost-of-living area, or live frugally; or some combination of these and other factors. It's worth noting that several thousand students graduate each year without any debt."

Scholarship awards appear to be outpacing tuition hikes—which has been a several-year trend and places schools in increasingly precarious financial positions. Students are no longer purchasing health care, given their ability to remain on their parents' health insurance under federal law—a significant cost for students a few years ago. Schools have increasingly eased, or abolished, stipulations on scholarships, which means students graduate with less debt. Some schools have slashed tuition prices. We might simply be seeing fewer economically poorer law students, resulting in more students who need smaller student loans—or none at all. Students may be taking advantage of accelerated programs that allow them to graduate faster with less debt (but there are few such programs). Finally, as JD class sizes shrink, it appears that students who would have paid the "sticker" price are increasingly pursuing options at institutions that offer them tuition discounts.

Additionally, as I've noted before, the "percentage may be somewhat deceptive, because at a very low-cost school, a modest increase in debt load may appear, on a percentage basis, much higher than comparable increase at a high-cost school. A $10,000 increase in debt at a school that previously had just $20,000 in debt looks like 50%; at a school with $100,000 in debt, just 10%. But I thought percentage would still be the most useful."

And of course, these debt figures are only an average; they do not include undergraduate debt, credit card debt, or interest accrued on law school loans while in school. And, as I've written, "The averages are not precise, either, for individuals. The average may be artificially high if a few students took out extremely high debt loads that distorted the average, or artificially low if a few students took out nominal debt loads that distorted the average."

It's worth noting that some of these changes are hardly random.

Announcements from institutions like Iowa, Arizona, and Chicago back in 2013 signaled major changes in tuition or scholarship structures in 2016; those schools led the reduction in debt for the Class of 2016. Those reductions remained largely steady for the Class of 2019 for Iowa and Arizona, but Chicago saw a fairly sizeable increase in debt loads.

Similarly, announcements from Tulsa, George Mason, Texas A&M, and Wayne State about slashed tuition or major scholarship programs translated into significant reductions in student debt loads.

Finally—and while it should go without saying, I fear I need to say it anyway—this is hardly a statement about whether any particular law school is a "good" value or whether the debt loads are appropriate. It's simply a relative comparison of debt loads over three years.

Inflation-Adjusted Average Law School Debt Incurred by All Law Students Between 2016 & 2019
School 2016 2019 Dollar diff Pct diff
University of Tulsa $94,834 $40,340 -$54,494 -57.5%
Northeastern University $92,739 $45,714 -$47,025 -50.7%
Ohio Northern University (Pettit) $99,056 $52,743 -$46,313 -46.8%
University of Detroit Mercy $90,919 $50,769 -$40,149 -44.2%
George Mason University $79,264 $45,946 -$33,319 -42.0%
Texas A&M University $99,638 $58,396 -$41,242 -41.4%
University of Missouri $68,569 $43,423 -$25,146 -36.7%
Elon University $143,573 $91,630 -$51,943 -36.2%
Wayne State University $64,458 $41,659 -$22,799 -35.4%
University of Kansas $72,021 $48,728 -$23,293 -32.3%
University of Wyoming $77,451 $52,565 -$24,886 -32.1%
University of Arkansas--Fayetteville $58,285 $40,030 -$18,255 -31.3%
Indiana University--Indianapolis (McKinney) $102,264 $70,370 -$31,894 -31.2%
University of Cincinnati $67,928 $46,985 -$20,943 -30.8%
Texas Tech University $72,171 $50,692 -$21,478 -29.8%
University of New Hampshire $79,842 $56,719 -$23,123 -29.0%
Florida State University $72,692 $52,888 -$19,804 -27.2%
New York Law School $136,346 $100,312 -$36,033 -26.4%
Western State College of Law at Westcliff University $101,993 $75,454 -$26,539 -26.0%
University of Richmond $93,356 $69,776 -$23,580 -25.3%
Drexel University (Kline) $86,604 $64,862 -$21,742 -25.1%
University of Louisville (Brandeis) $84,498 $63,427 -$21,071 -24.9%
Regent University $112,752 $85,343 -$27,409 -24.3%
Pace University (Haub) $106,847 $81,061 -$25,786 -24.1%
University of Toledo $76,705 $58,258 -$18,447 -24.0%
Illinois Institute of Technology (Chicago-Kent) $89,096 $68,387 -$20,709 -23.2%
University of Minnesota $78,017 $59,947 -$18,070 -23.2%
University of Tennessee--Knoxville $68,864 $53,255 -$15,610 -22.7%
University of Alabama $55,130 $43,057 -$12,073 -21.9%
Emory University $94,348 $73,766 -$20,582 -21.8%
Southern Illinois University--Carbondale $78,174 $61,142 -$17,032 -21.8%
University of North Carolina--Chapel Hill $75,952 $59,444 -$16,508 -21.7%
University of Dayton $98,846 $77,929 -$20,917 -21.2%
University of California (Hastings) $121,322 $96,303 -$25,019 -20.6%
University of Akron $75,190 $59,816 -$15,374 -20.4%
University of Nevada--Las Vegas $75,979 $60,637 -$15,342 -20.2%
Marquette University $130,402 $104,256 -$26,146 -20.1%
University of Georgia $69,415 $55,724 -$13,690 -19.7%
University of California--Davis $77,712 $62,486 -$15,226 -19.6%
University of Michigan--Ann Arbor $113,064 $91,026 -$22,038 -19.5%
Washburn University $66,857 $53,847 -$13,010 -19.5%
St. John's University $95,450 $76,945 -$18,505 -19.4%
Liberty University $59,671 $48,107 -$11,565 -19.4%
University of Pittsburgh $89,232 $72,596 -$16,636 -18.6%
Samford University (Cumberland) $108,188 $88,037 -$20,151 -18.6%
Suffolk University $106,737 $87,090 -$19,648 -18.4%
Northwestern University (Pritzker) $107,778 $88,138 -$19,640 -18.2%
Albany Law School $89,205 $72,957 -$16,248 -18.2%
University of Colorado--Boulder $81,575 $66,766 -$14,809 -18.2%
University of Oklahoma $67,108 $54,938 -$12,171 -18.1%
University of Miami $108,997 $89,275 -$19,722 -18.1%
Villanova University $75,422 $61,872 -$13,549 -18.0%
University of Illinois--Urbana-Champaign $76,180 $62,659 -$13,521 -17.7%
DePaul University $111,743 $91,990 -$19,754 -17.7%
Willamette University College of Law $131,498 $108,357 -$23,141 -17.6%
University of St. Thomas $78,999 $65,504 -$13,495 -17.1%
Baylor University $113,629 $94,415 -$19,214 -16.9%
Brooklyn Law School $84,611 $70,414 -$14,196 -16.8%
Stanford University $109,728 $91,379 -$18,349 -16.7%
Georgetown University $131,170 $109,668 -$21,502 -16.4%
Louisiana State University--Baton Rouge (Hebert) $68,023 $56,878 -$11,145 -16.4%
University of Montana $71,101 $59,526 -$11,576 -16.3%
Pepperdine University (Caruso) $126,341 $106,229 -$20,112 -15.9%
Golden Gate University $150,786 $126,974 -$23,812 -15.8%
Boston College $80,194 $68,029 -$12,166 -15.2%
Roger Williams University $116,497 $99,060 -$17,437 -15.0%
University of Maryland (Carey) $88,863 $75,764 -$13,099 -14.7%
Florida International University $87,527 $74,927 -$12,600 -14.4%
Fordham University $94,529 $81,126 -$13,403 -14.2%
Washington and Lee University $90,547 $78,408 -$12,139 -13.4%
University of Missouri--Kansas City $91,276 $79,055 -$12,221 -13.4%
University of Arkansas--Little Rock (Bowen) $55,520 $48,139 -$7,381 -13.3%
Quinnipiac University $92,334 $80,074 -$12,261 -13.3%
University of Utah (Quinney) $81,371 $70,607 -$10,763 -13.2%
University of Southern California (Gould) $103,426 $91,037 -$12,389 -12.0%
University of South Dakota $58,014 $51,107 -$6,906 -11.9%
University of Maine $74,405 $65,690 -$8,715 -11.7%
Wake Forest University $84,549 $74,714 -$9,835 -11.6%
Indiana University--Bloomington (Maurer) $78,538 $69,410 -$9,128 -11.6%
University of the Pacific (McGeorge) $135,007 $119,595 -$15,412 -11.4%
CUNY $64,328 $56,992 -$7,337 -11.4%
Duquesne University $98,700 $87,572 -$11,129 -11.3%
Seattle University $124,338 $110,377 -$13,961 -11.2%
Northern Illinois University $77,824 $69,191 -$8,633 -11.1%
University of Wisconsin--Madison $58,933 $52,502 -$6,432 -10.9%
University of Texas--Austin $73,528 $65,513 -$8,014 -10.9%
Mississippi College $103,608 $92,948 -$10,660 -10.3%
Drake University $105,759 $95,067 -$10,692 -10.1%
Yeshiva University (Cardozo) $81,204 $73,348 -$7,856 -9.7%
University of Mississippi $53,796 $48,644 -$5,151 -9.6%
University of Baltimore $94,694 $85,627 -$9,067 -9.6%
University of Pennsylvania (Carey) $118,391 $107,516 -$10,876 -9.2%
University of San Francisco $145,407 $132,305 -$13,101 -9.0%
University of North Dakota $48,761 $44,488 -$4,274 -8.8%
University of Nebraska--Lincoln $48,245 $44,066 -$4,180 -8.7%
University of Virginia $111,177 $102,309 -$8,868 -8.0%
University of Denver (Sturm) $129,882 $119,929 -$9,953 -7.7%
University at Buffalo--SUNY $79,323 $73,514 -$5,808 -7.3%
University of California--Berkeley $111,367 $103,316 -$8,051 -7.2%
Cornell University $109,464 $101,769 -$7,695 -7.0%
Duke University $105,132 $97,766 -$7,366 -7.0%
University of Notre Dame $99,175 $92,261 -$6,914 -7.0%
Harvard University $125,210 $117,278 -$7,933 -6.3%
Creighton University $111,743 $104,672 -$7,071 -6.3%
University of California--Los Angeles $92,890 $87,053 -$5,837 -6.3%
Lewis & Clark College (Northwestern) $115,655 $108,659 -$6,996 -6.0%
New York University $113,752 $107,997 -$5,755 -5.1%
Nova Southeastern University (Broad) $136,978 $130,501 -$6,477 -4.7%
Brigham Young University (Clark) $42,862 $41,168 -$1,694 -4.0%
University of Arizona (Rogers) $55,949 $53,878 -$2,071 -3.7%
Santa Clara University $113,377 $109,363 -$4,014 -3.5%
Ohio State University (Moritz) $72,098 $69,938 -$2,160 -3.0%
Temple University (Beasley) $69,212 $67,282 -$1,930 -2.8%
University of Illinois--Chicago (John Marshall) $141,204 $137,338 -$3,866 -2.7%
Vanderbilt University $87,247 $85,179 -$2,068 -2.4%
West Virginia University $68,747 $67,226 -$1,520 -2.2%
Tulane University $106,019 $103,693 -$2,326 -2.2%
American University (Washington) $127,674 $125,256 -$2,418 -1.9%
Loyola Marymount University $113,944 $113,036 -$909 -0.8%
University of South Carolina $76,948 $76,898 -$50 -0.1%
University of Houston $67,319 $67,467 $149 0.2%
Hofstra University (Deane) $117,074 $117,624 $549 0.5%
California Western School of Law $139,637 $140,401 $764 0.5%
University of Florida (Levin) $62,549 $63,195 $645 1.0%
University of Iowa $55,262 $56,082 $819 1.5%
St. Mary's University $115,854 $117,939 $2,085 1.8%
Boston University $74,210 $76,152 $1,942 2.6%
Stetson University $109,562 $112,575 $3,013 2.8%
Columbia University $108,041 $112,226 $4,185 3.9%
Seton Hall University $76,352 $80,080 $3,728 4.9%
University of California--Irvine $83,373 $87,917 $4,545 5.5%
Gonzaga University $92,832 $98,372 $5,541 6.0%
Southern Methodist University (Dedman) $90,731 $96,344 $5,614 6.2%
Yale University $88,832 $94,334 $5,502 6.2%
The Catholic University of America $102,315 $109,379 $7,064 6.9%
University of Massachusetts--Dartmouth $87,623 $95,554 $7,931 9.1%
Charleston School of Law $117,018 $128,379 $11,361 9.7%
George Washington University $93,366 $104,642 $11,276 12.1%
Washington University in St. Louis $57,885 $65,834 $7,949 13.7%
University of Memphis (Humphreys) $62,321 $71,139 $8,818 14.1%
University of Connecticut $52,844 $62,206 $9,362 17.7%
Oklahoma City University $88,184 $103,827 $15,643 17.7%
University of San Diego $91,396 $108,298 $16,902 18.5%
Ave Maria School of Law $114,409 $136,034 $21,625 18.9%
University of Chicago $89,043 $107,795 $18,751 21.1%
Florida Coastal School of Law $118,266 $147,238 $28,971 24.5%
Campbell University $115,833 $162,478 $46,645 40.3%
University of Idaho $62,984 $91,180 $28,196 44.8%
University of Kentucky $44,578 $65,102 $20,525 46.0%
North Carolina Central University $58,588 $100,022 $41,435 70.7%