Rethinking the best debt metrics for evaluating law schools

Last spring, I critiqued the methodology used in the latest USNWR rankings, which added a new indebtedness metric. USNWR chose two categories, giving one slightly greater weight than the other: the average loan amount among law students who incurred law school debt (3%); and the percentage of law students who incurred debt (2%).

In my judgment, it is nonsensical to separate those categories, because doing so disproportionately favors $0 in loans over $1 in loans: under the percentage metric, a graduate who borrows a single dollar counts against a school just as much as one who borrows six figures.

But an additional category to consider is debt-to-income ratio. This is admittedly an imperfect measure, like all measures. But I make the case for it here.

This post, then, is an effort to look at alternative measures of indebtedness and what might have been.

(This post may be best viewed on a desktop or on a phone sideways.)

First, I’ll offer a few snapshots of these alternative rankings. One is to compare the way USNWR does the ranking (separating indebtedness from the percentage incurring debt) with a single metric that combines them: the overall average debt per graduate, that is, the average debt among borrowers multiplied by the percentage who borrow. Below are the ten schools that would perform best under this combined metric (based on scaled score), meaning the schools that do better with my averaging of everything rather than with two distinct categories. (By “rank,” I mean the rank among the indebtedness score.)

School | Average debt among incurring | Pct debt | USNWR score | USNWR rank | Overall average debt | Overall average score | Overall average rank | Score delta
South Dakota | $53,253 | 80% | -0.03958 | 26 | $42,602 | -0.07447 | 8 | -0.03489
Cleveland State | $69,727 | 90% | -0.00170 | 89 | $62,754 | -0.03420 | 52 | -0.03250
Florida A&M | $61,500 | 81% | -0.02950 | 42 | $49,815 | -0.06006 | 17 | -0.03056
Ohio Northern | $71,134 | 88% | -0.00479 | 83 | $62,598 | -0.03451 | 49 | -0.02972
Nebraska | $63,027 | 78% | -0.03470 | 31 | $49,161 | -0.06136 | 14 | -0.02667
Rutgers | $62,210 | 75% | -0.04213 | 22 | $46,658 | -0.06637 | 10 | -0.02424
North Dakota | $67,281 | 78% | -0.03064 | 40 | $52,479 | -0.05473 | 26 | -0.02409
Arkansas | $68,877 | 79% | -0.02690 | 48 | $54,413 | -0.05087 | 28 | -0.02397
Texas Tech | $56,898 | 72% | -0.05385 | 7 | $40,967 | -0.07774 | 5 | -0.02389
Utah | $76,344 | 85% | -0.00648 | 82 | $64,892 | -0.02993 | 57 | -0.02345

And among the ten schools that would perform the worst - that is, those that do better under the USNWR weighting than under an overall average:

School | Average debt among incurring | Pct debt | USNWR score | USNWR rank | Overall average debt | Overall average score | Overall average rank | Score delta
Southwestern | $190,184 | 82% | 0.095404 | 183 | $155,951 | 0.152034 | 183 | 0.05663
California Western | $164,918 | 88% | 0.084622 | 182 | $145,128 | 0.130406 | 182 | 0.045785
St Thomas University | $161,701 | 86% | 0.077119 | 181 | $139,063 | 0.118287 | 181 | 0.041167
Nova Southeastern | $155,193 | 87% | 0.073132 | 180 | $135,018 | 0.110204 | 180 | 0.037071
Columbia | $190,141 | 66% | 0.05988 | 175 | $125,493 | 0.09117 | 178 | 0.03129
Golden Gate | $151,854 | 83% | 0.061078 | 176 | $126,039 | 0.092261 | 179 | 0.031183
American | $159,723 | 76% | 0.053056 | 172 | $121,389 | 0.08297 | 173 | 0.029914
Harvard | $170,866 | 70% | 0.050374 | 168 | $119,606 | 0.079407 | 171 | 0.029032
San Francisco | $156,460 | 77% | 0.052163 | 170 | $120,474 | 0.081141 | 172 | 0.028978
Florida Coastal | $145,245 | 86% | 0.06143 | 177 | $124,911 | 0.090006 | 176 | 0.028576

Now, these “score deltas” really mean little unless one sees how they fit in with the remainder of the USNWR scoring. Most schools would be affected by no more than 0.02. But for the schools at the outer ranges, it can have a material effect.

Note, too, that the worst-performing schools all get a little worse—these schools are ranked between 168 and 183, near the bottom of the metric in the first place. The top schools, however, range from 7th to 89th in the original metric, suggesting that the USNWR approach imposes a greater penalty, across the range, on affordable public schools that do not graduate many students with zero debt.
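To make the combined metric concrete, here is a minimal sketch of the computation, assuming (as the tables above show) that the overall average debt is simply the average debt among borrowers multiplied by the share who borrow, and that scores are then standardized across schools. USNWR does not publish its exact scaling, so treat the z-score step as my assumption.

```python
# Minimal sketch of the combined metric, assuming (per the tables above) that
# "overall average debt" = average debt among borrowers x share who borrow.
# The z-score step mimics a standardization across schools; USNWR's exact
# scaling is not public, so that part is an assumption.
import statistics

def overall_average_debt(avg_debt_among_borrowers, pct_with_debt):
    """Blend USNWR's two inputs into a single per-graduate debt figure."""
    return avg_debt_among_borrowers * pct_with_debt

# A few rows drawn from the first table above.
schools = {
    "South Dakota": (53_253, 0.80),
    "Cleveland State": (69_727, 0.90),
    "Texas Tech": (56_898, 0.72),
}

blended = {name: overall_average_debt(debt, pct) for name, (debt, pct) in schools.items()}
print(blended)  # South Dakota -> 42602.4, matching the table's $42,602

# Standardize (lower debt is better, so a lower z-score is a better score).
values = list(blended.values())
mu, sigma = statistics.mean(values), statistics.stdev(values)
z_scores = {name: (value - mu) / sigma for name, value in blended.items()}
print(z_scores)
```

With only three schools the z-scores are illustrative only; the tables above standardize against the full field of schools.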

One more thought. What about comparing the schools’ rankings on a third metric, the debt-to-income ratio I’ve used before? To do that, I’ll include both the USNWR debt rank and the average debt rank, so you get a sense of ordinal place. I’ll then sort by debt-to-income ratio, with another rank so you can see how schools stack up against one another. Absolute debt matters, true. But debt is also relative, and what comes after the J.D. matters a lot to students, too. A rough sketch of the computation follows, and the entire list is below it. (The income data also lag by a year, but they should still be a useful point of comparison.)
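The sketch below shows only the shape of the computation; the income figure is purely a placeholder, and the actual income series I used is described in my earlier debt-to-income posts.

```python
# Rough sketch of a debt-to-income ratio. The $75,000 income figure below is
# purely hypothetical; the actual income series I used is described in my
# earlier debt-to-income posts.

def debt_to_income(average_debt, median_income):
    """Debt expressed as a multiple of early-career income."""
    return average_debt / median_income

# Hypothetical illustration: $120,000 of average debt against a $75,000
# median income yields a ratio of 1.6.
print(round(debt_to_income(120_000, 75_000), 2))  # 1.6
```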

School | USNWR debt rank | Average debt rank | 2019 debt-to-income ratio | Debt-to-income rank
BYU | 4 | 2 | 0.76 | 1
Cornell | 104 | 117 | 0.78 | 2
Northwestern | 137 | 146 | 0.78 | 3
Stanford | 100 | 113 | 0.80 | 4
Harvard | 168 | 171 | 0.82 | 5
Duke | 126 | 135 | 0.83 | 6
Penn | 131 | 141 | 0.84 | 7
Virginia | 133 | 139 | 0.90 | 8
Chicago | 166 | 164 | 0.90 | 9
Hastings | 122 | 122 | 0.94 | 10
Iowa | 70 | 75 | 0.94 | 11
Washington University | 25 | 35 | 0.94 | 12
Columbia | 175 | 178 | 0.95 | 13
Wayne State | 8 | 9 | 0.95 | 14
Illinois | 13 | 25 | 0.96 | 15
Yale | 102 | 115 | 0.97 | 16
Temple | 53 | 53 | 0.97 | 17
Michigan | 121 | 123 | 0.97 | 18
Georgia State | 37 | 31 | 0.98 | 19
Vanderbilt | 99 | 110 | 0.98 | 20
Texas | 43 | 64 | 0.99 | 21
Alabama | 5 | 3 | 1.00 | 22
Wisconsin | 21 | 21 | 1.04 | 23
NYU | 141 | 142 | 1.04 | 24
Kansas | 55 | 51 | 1.04 | 25
Nebraska | 31 | 14 | 1.04 | 26
Texas Tech | 7 | 5 | 1.05 | 27
Tennessee | 20 | 23 | 1.05 | 28
Arkansas | 48 | 28 | 1.08 | 29
Boston University | 44 | 61 | 1.09 | 30
Houston | 38 | 44 | 1.10 | 31
Utah | 82 | 57 | 1.12 | 32
Cincinnati | 12 | 11 | 1.12 | 33
Missouri | 18 | 16 | 1.13 | 34
Georgetown | 135 | 144 | 1.13 | 35
UCLA | 94 | 103 | 1.13 | 36
Oklahoma | 10 | 22 | 1.15 | 37
Connecticut | 32 | 40 | 1.15 | 38
North Dakota | 40 | 26 | 1.16 | 39
Baylor | 152 | 156 | 1.16 | 40
UNLV | 36 | 36 | 1.17 | 41
CUNY | 69 | 67 | 1.19 | 42
Boston College | 60 | 78 | 1.19 | 43
Berkeley | 124 | 130 | 1.20 | 44
Georgia | 11 | 18 | 1.23 | 45
Rutgers | 22 | 10 | 1.24 | 46
Villanova | 28 | 37 | 1.25 | 47
Indiana-Bloomington | 54 | 63 | 1.25 | 48
University of St Thomas | 30 | 27 | 1.25 | 49
Mississippi | 14 | 15 | 1.25 | 50
Drexel | 19 | 20 | 1.26 | 51
Hawaii | 34 | 42 | 1.27 | 52
Fordham | 91 | 102 | 1.27 | 53
Penn State-Dickinson | 1 | 1 | 1.29 | 54
Penn State-University Park | 56 | 50 | 1.29 | 55
USC | 110 | 120 | 1.30 | 56
Arizona | 3 | 4 | 1.30 | 57
Tulsa | 154 | 116 | 1.30 | 58
New Hampshire | 62 | 43 | 1.33 | 59
Notre Dame | 127 | 131 | 1.33 | 60
Florida State | 23 | 12 | 1.34 | 61
George Mason | 9 | 19 | 1.34 | 62
Mitchell Hamline | 125 | 126 | 1.36 | 63
Washington | 66 | 80 | 1.37 | 64
North Carolina | 29 | 39 | 1.39 | 65
Case Western | 57 | 69 | 1.41 | 66
Arizona State | 45 | 54 | 1.43 | 67
LSU | 63 | 72 | 1.43 | 68
Washburn | 78 | 70 | 1.44 | 69
Richmond | 90 | 94 | 1.44 | 70
Oregon | 96 | 88 | 1.44 | 71
Duquesne | 109 | 98 | 1.44 | 72
Florida | 6 | 7 | 1.44 | 73
Northern Illinois | 72 | 58 | 1.45 | 74
West Virginia | 58 | 62 | 1.45 | 75
Akron | 49 | 33 | 1.46 | 76
Washington & Lee | 65 | 76 | 1.47 | 77
Indiana-Indianapolis | 85 | 83 | 1.48 | 78
Pace | 105 | 105 | 1.49 | 79
William & Mary | 24 | 34 | 1.50 | 80
Ohio State | 52 | 55 | 1.51 | 81
Montana | 84 | 68 | 1.51 | 82
Quinnipiac | 107 | 93 | 1.51 | 83
St Louis | 101 | 97 | 1.51 | 84
St John's | 86 | 95 | 1.52 | 85
Cardozo | 33 | 56 | 1.53 | 86
Michigan State | 92 | 86 | 1.54 | 87
Irvine | 111 | 119 | 1.56 | 88
Texas A&M | 51 | 46 | 1.56 | 89
Colorado | 87 | 87 | 1.57 | 90
Northeastern | 59 | 48 | 1.58 | 91
Arkansas | 15 | 13 | 1.58 | 92
San Diego | 132 | 137 | 1.59 | 93
New Mexico | 68 | 66 | 1.60 | 94
Chicago-Kent | 73 | 79 | 1.61 | 95
SMU | 77 | 89 | 1.61 | 96
Kentucky | 75 | 59 | 1.61 | 97
South Dakota | 26 | 8 | 1.63 | 98
Buffalo | 79 | 74 | 1.63 | 99
Memphis | 80 | 71 | 1.65 | 100
Toledo | 46 | 38 | 1.66 | 101
Wake Forest | 61 | 73 | 1.66 | 102
Louisville | 88 | 85 | 1.66 | 103
Minnesota | 35 | 47 | 1.66 | 104
Cleveland State | 89 | 52 | 1.68 | 105
Albany | 119 | 92 | 1.68 | 106
Seton Hall | 103 | 111 | 1.71 | 107
Wyoming | 16 | 29 | 1.72 | 108
Syracuse | 130 | 121 | 1.72 | 109
Brooklyn | 76 | 84 | 1.74 | 110
Emory | 81 | 91 | 1.76 | 111
Maryland | 67 | 82 | 1.77 | 112
Tulane | 129 | 136 | 1.79 | 113
Baltimore | 138 | 127 | 1.79 | 114
Davis | 39 | 60 | 1.81 | 115
Pittsburgh | 106 | 101 | 1.81 | 116
Drake | 98 | 99 | 1.83 | 117
Idaho | 114 | 106 | 1.84 | 118
District of Columbia | 142 | 124 | 1.85 | 119
Missouri-Kansas City | 74 | 77 | 1.86 | 120
FIU | 64 | 65 | 1.89 | 121
Santa Clara | 165 | 167 | 1.90 | 122
Loyola-Chicago | 71 | 81 | 1.97 | 123
George Washington | 143 | 151 | 1.98 | 124
Loyola-Los Angeles | 157 | 160 | 1.98 | 125
Lincoln Memorial | 120 | 107 | 1.98 | 126
Northern Kentucky | 93 | 90 | 1.99 | 127
Suffolk | 118 | 114 | 2.03 | 128
Ohio Northern | 83 | 49 | 2.04 | 129
South Texas | 149 | 152 | 2.05 | 130
Western State | 115 | 112 | 2.06 | 131
South Carolina | 97 | 96 | 2.06 | 132
St Mary's | 171 | 169 | 2.07 | 133
Catholic | 159 | 161 | 2.07 | 134
Gonzaga | 151 | 147 | 2.08 | 135
Southern Illinois | 41 | 30 | 2.12 | 136
New York Law School | 128 | 134 | 2.13 | 137
DePaul | 134 | 133 | 2.13 | 138
Loyola-New Orleans | 162 | 158 | 2.15 | 139
Chapman | 167 | 170 | 2.20 | 140
Lewis & Clark | 148 | 155 | 2.24 | 141
Regent | 27 | 32 | 2.25 | 142
Howard | 145 | 128 | 2.26 | 143
Florida A&M | 42 | 17 | 2.27 | 144
Denver | 164 | 168 | 2.28 | 145
Creighton | 113 | 109 | 2.29 | 146
Pepperdine | 140 | 145 | 2.30 | 147
Hofstra | 123 | 129 | 2.30 | 148
California Western | 182 | 182 | 2.33 | 149
Capital | 116 | 118 | 2.35 | 150
Mercer | 146 | 148 | 2.35 | 151
Detroit Mercy | 2 | 6 | 2.35 | 152
Massachusetts-Dartmouth | 156 | 149 | 2.36 | 153
Widener Commonwealth | 173 | 165 | 2.37 | 154
Seattle | 158 | 157 | 2.45 | 155
Miami | 117 | 125 | 2.46 | 156
Roger Williams | 147 | 138 | 2.49 | 157
Belmont | 95 | 104 | 2.50 | 158
Samford | 112 | 108 | 2.50 | 159
McGeorge | 178 | 177 | 2.55 | 160
Willamette | 160 | 162 | 2.58 | 161
Stetson | 139 | 143 | 2.59 | 162
American | 172 | 173 | 2.70 | 163
Dayton | 47 | 45 | 2.76 | 164
Golden Gate | 176 | 179 | 2.84 | 165
Campbell | 153 | 154 | 2.85 | 166
Illinois-Chicago | 163 | 163 | 2.87 | 167
Marquette | 169 | 166 | 2.87 | 168
Mississippi College | 136 | 132 | 2.95 | 169
San Francisco | 170 | 172 | 2.96 | 170
Nova Southeastern | 180 | 180 | 3.14 | 171
Charleston | 174 | 174 | 3.18 | 172
St Thomas University | 181 | 181 | 3.23 | 173
Southwestern | 183 | 183 | 3.40 | 174
Ave Maria | 179 | 175 | 3.54 | 175
Florida Coastal | 177 | 176 | 4.95 | 176
Maine | 17 | 24 | n/a | n/a
Liberty | 50 | 41 | n/a | n/a
Western New England | 108 | 100 | n/a | n/a
Faulkner | 144 | 140 | n/a | n/a
Vermont | 150 | 153 | n/a | n/a
Elon | 155 | 150 | n/a | n/a
Oklahoma City | 161 | 159 | n/a | n/a

Multistate Bar Exam scores drop but remain consistent with scores since 2014

It was difficult to project much about the bar exam last fall given the pandemic. Jurisdictions made many changes to how they administered the exam. When the MBE scores (usually a harbinger of overall pass rates) were released last year, we saw just 5,700 July 2020 test-takers, down from the 45,000 or so in a typical July. Many states developed novel exams; some changed cut scores or offered versions of “diploma privilege.” Early signs in some jurisdictions, however, pointed to dropping pass rates.

Now the MBE scores have been released, and the scores are a drop from July 2019—but still consistent with scores between 2014 and 2019.

I opted to leave the July 2020 MBE information blank in the chart, as it offers little for historical comparison (although the mean was much higher last year). You can see that scores bottomed out at 139.5 in 2018, so this year's 140.4 is a bit above that. Nevertheless, the decline from July 2019 suggests that bar passage rates continue to be a challenge for law schools and graduating law students. (Of course, a lower MBE mean does not automatically translate to lower bar pass rates, but it does portend that result.) We’ll see what individual jurisdictions continue to reveal in the weeks ahead.

Early signs point to dropping July 2021 bar exam results

Back in 2014, I noted early warning signs of a precipitous drop in bar exam pass rates around the country. The MBE scores were much lower. I wondered at first whether an ExamSoft error was partly to blame, but I ultimately concluded that was unlikely; I also tentatively pointed to the NCBE’s administration of the exam before backing off that claim, attributing the bulk of the decline instead to a drop in overall student quality.

What’s old is new again. July 2021 saw a major ExamSoft error, and now we’re beginning to see a downtick in bar passage rates. From seven jurisdictions so far, reporting overall results (not first-time results); a short script tallying these swings follows the list:

Iowa, -12 points: July 2020, 83%; July 2021, 71%

Nebraska, -17 points: July 2020, 89%; July 2021, 72% [Nebraska offered a much smaller second exam in 2020, too]

New Mexico, -18 points: September 2020, 89%; July 2021, 71%

North Dakota, -11 points: July 2020, 76%; July 2021, 65%

South Dakota, +3 points: July 2020, 70%; July 2021, 73%

West Virginia, -19 points: July 2020, 77%; July 2021, 58%

Wyoming, -13 points: July 2020, 85%; July 2021, 72%
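Tallying those swings in one place, using the overall pass rates reported above (note that New Mexico's 2020 figure comes from its September administration):

```python
# Year-over-year percentage-point swings, using the overall pass rates above.
# New Mexico's 2020 figure is from its September administration.
rates = {
    "Iowa": (83, 71),
    "Nebraska": (89, 72),
    "New Mexico": (89, 71),
    "North Dakota": (76, 65),
    "South Dakota": (70, 73),
    "West Virginia": (77, 58),
    "Wyoming": (85, 72),
}

for state, (rate_2020, rate_2021) in rates.items():
    print(f"{state}: {rate_2021 - rate_2020:+d} points")
```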

It becomes increasingly difficult to compare year-over-year exam results as the format and timing change, and as new variables enter the mix. Some possible variables to consider:

ExamSoft. Here we begin with problems attributable to a remote exam administered through ExamSoft. Not all of these jurisdictions were remote (e.g., Nebraska was in person). But certainly same-day exam problems are a significant issue. Stress and sleeplessness from the first day could trickle over to a second day, albeit with a more indirect effect. North Carolina, for instance, has already announced it would lower its cut score by 2 points specifically because of software issues. This is not unprecedented—California, for instance, made modest adjustments to the cut score for some test-takers after an earthquake hit testing sites in 2008.

Online learning. Many law students taking the bar exam had nearly half of their law school educational experience shifted largely, if not exclusively, online. It’s not clear what pedagogical effect that had on students in the long term. It would not surprise me that graduating 3Ls “lost” some amount of learning in the pandemic, which later translated to lower bar exam performance.

Pandemic fatigue. Relatedly, one could easily carry the concerns from the July 2020 administration of the bar exam over into the prolonged difficulties arising from the coronavirus pandemic. Early summer 2021 may have offered some of the better pandemic conditions for most test-takers, so it’s difficult to know how conditions later in the summer may have affected them.

Credentials/academic dismissal. These tend to be more individualized assessments at institutions. The incoming class of 2017 was not materially better than the incoming class of 2018—they were, from all I can see, largely comparable. Whether individualized decisions at schools, including academic dismissal rates, affected scores remains to be seen.

In short, I’m cautious about assigning responsibility for the decline in scores so far to any, all, or none of these. We may see scores in other states increase, negating the premise of this post. But I’d say these results are more likely canaries in the coal mine. I’d anticipate lower scores in many states, and a lot of questions about the cause. I wouldn’t rule out any of these possibilities so far, and I hope to see clear-eyed analysis of the possible sources of declining rates.

This post has been updated as results come in or are corrected.

With a sharp rise in LSAT scores, it's worth keeping an eye on the law school Class of 2024

The entering law school class, the Class of 2024, is a booming and highly credentialed class. Applicants rose significantly (more than 12%), and while we’ll have to wait until next month for the final matriculant statistics, that should translate into much larger incoming classes. One can attribute the increased demand to all sorts of causes, from the pandemic to politics, though these are mostly non-falsifiable narratives at this point.

But most surprising, in my view, is the sharp rise in LSAT scores. Those scoring 175 or higher were always a small band of applicants, but that pool more than doubled year over year. Those in the 170-174 LSAT band rose 56.5% year over year. And even the 160-169 LSAT band, which had over 13,000 applicants in 2020, rose more than 25%.

It’s possible, of course, that those more interested in law school were simply more highly credentialed this year and more likely to succeed on the exam. But I’d watch another factor that arose last year: the introduction of LSAT-Flex. That exam is administered online, unlike the traditional paper-and-pencil test offered at fixed times during the year in limited proctored locations. It’s also a shorter exam: LSAC reduced the number of sections from 5 (including one “experimental” section) to 3, a significant reduction in the “stamina” required to endure the exam.

I would be very interested to see future psychometric studies from LSAC. I’d also be interested to see schools' 1LGPA correlations next year, and later whether the LSAT remains as predictive of bar exam performance (standing alone, it is somewhat predictive across law schools, more predictive when considered as part of an index score, but much less predictive than 1LGPA or LGPA).

It could be that the test is now less predictive of 1LGPA performance given the changes in the exam. Or, perhaps, it’s actually more predictive, as the stamina components of the traditional format of the exam were less predictive. Or perhaps it has not changed at all, and the exogenous explanations are the right ones.
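For readers curious what such a validity check might look like, here is a hypothetical sketch. The simulated data, the 60/40 index weighting, and the variable names are all placeholders, not LSAC's actual formula or data; the point is only how one would compare the LSAT alone against an index score as predictors of 1L GPA.

```python
# Hypothetical sketch of the validity check described above. The simulated
# data and the 60/40 index weighting are placeholders, not LSAC's formula.
import numpy as np

rng = np.random.default_rng(0)
n = 500
lsat = rng.normal(158, 6, n)    # stand-in LSAT scores for an admitted class
ugpa = rng.normal(3.4, 0.3, n)  # stand-in undergraduate GPAs

# Simulate 1L GPA as loosely related to both inputs, plus noise.
gpa_1l = 0.02 * lsat + 0.5 * ugpa + rng.normal(0, 0.35, n)

def zscore(x):
    """Standardize a vector to mean 0, standard deviation 1."""
    return (x - x.mean()) / x.std()

# A simple index score: standardized LSAT and UGPA, weighted 60/40.
index = 0.6 * zscore(lsat) + 0.4 * zscore(ugpa)

print("LSAT alone vs. 1L GPA: ", round(np.corrcoef(lsat, gpa_1l)[0, 1], 3))
print("Index score vs. 1L GPA:", round(np.corrcoef(index, gpa_1l)[0, 1], 3))
```

On simulated data like this, the index score typically correlates more strongly with 1L GPA than the LSAT alone, which is the pattern the validity studies would test against real admissions data.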

Whatever the case, it’s a significant shift worth watching in the years ahead.

USNWR law school ranking fiasco ends with a whimper (until next rankings cycle)

In March, I chronicled the repeated problems that cropped up with the latest USNWR law school rankings. I titled it, “The USNWR law school rankings are deeply wounded--will law schools have the coordination to finish them off?” I identified four specific problems that plagued this year’s rankings and offered a few ways forward. I concluded:

Of course, I imagine these, like most such projects, would fall to infighting. It’s one thing for law schools to write a strongly-worded letter decrying what USNWR is doing. It’s another thing to, well, do something about it. I confess my solutions are half-baked and incomplete means of doing so.

But if there’s a moment to topple USNWR law school rankings, it is now. We’ll see if law schools do so.

Well, the answer is, perhaps unsurprisingly, no.

There was no effort, no formality, no movement forward. Law schools complained very loudly, and the moment the rankings were released, they touted them to prospective students and largely forgot about these issues. Indeed, USNWR delayed releasing its new (controversial) “diversity” rankings of law schools, and still has not done so.

The moment has passed… until next year, when, I’m sure, we’ll see similar cries from law schools and complaints and letters, and little action.

"Supreme Court Raised the Bar for Challenge to GA Election Law"

I have this piece over at RealClearPolitics, “Supreme Court Raised the Bar for Challenge to GA Election Law.” It begins:

The Supreme Court’s recent decision in Brnovich v. Democratic National Committee has prompted extensive commentary about the implications for future challenges to election laws under Section 2 of the Voting Rights Act. Litigants arguing that some laws, such as Georgia’s newly enacted SB 202, disproportionately affect racial minorities may have a greater challenge meeting the standard set forth by the court than the standard that some lower courts had been using in recent years.

But while the justices split on a 6-3 vote on whether a pair of Arizona statutes ran afoul of the Act, the court voted 6-0 (with three justices not addressing the question) in concluding that Arizona did not act with discriminatory intent. This holding sets the stage for the Justice Department’s recent lawsuit against Georgia, and it offers hints at how district courts and reviewing courts should behave. In short, the Justice Department has an uphill battle.

"Electoral Votes Regularly Given"

I have this (late stage!) draft at SSRN on a piece forthcoming in the Georgia Law Review, entitled “Electoral Votes Regularly Given.” Here’s the abstract:

Every four years, Congress convenes to count presidential electoral votes. In recent years, members of Congress have objected or attempted to object to the counting of electoral votes on the ground that those votes were not "regularly given." That language comes from the Electoral Count Act of 1887. But the phrase "regularly given" is a term of art, best understood as "cast pursuant to law." It refers to controversies that arise after the appointment of presidential electors, when electors cast their votes and send them to Congress. Yet members of Congress have incorrectly used the objection to challenge an assortment of pre-appointment controversies that concern the underlying election itself. This Essay identifies the proper meaning of the phrase "regularly given," articulates the narrow universe of appropriate objections within that phrase, and highlights why the failure to object with precision ignores constraints on congressional power.

"Brnovich, election-law tradeoffs, and the limited role of the courts"

I have this essay at SCOTUSblog, “Brnovich, election-law tradeoffs, and the limited role of the courts.” It begins:

Arizona “generally makes it quite easy for residents to vote.” This framing from Justice Samuel Alito in Brnovich v. Democratic National Committee set the path for the six-justice majority of the Supreme Court to reject challenges to two Arizona laws.

It marks a major victory for states that seek to innovate or tinker with their election laws — to expand them or to contract them. And it is the latest in a string of cases pushing the federal courts out of second-guessing state election laws.

And from near the end:

Brnovich is the latest in a line of cases suggesting that the federal courts should play a smaller role in the patrolling of how states administer elections. Crawford approved Indiana’s voter-identification law. The court’s 2019 decision in Rucho v. Common Cause said that federal courts should not entertain challenges to partisan gerrymandering under the Constitution. In 2020, it decided a series of cases, including Republican National Committee v. Democratic National Committee, which mostly instructed federal courts not to make late-breaking changes to how states administer elections, even in the middle of a pandemic. And it rejected a challenge to the presidential election in Texas v. Pennsylvania, letting state election officials’ decisions stand.

Without ABA, Biden judicial nominations rolling along

According to the Heritage Foundation’s “judicial appointments tracker,” President Joe Biden has had more judicial appointments confirmed through July 7 of his first term (7) than the last six presidents combined (6). Granted, the Senate confirmed President Donald Trump’s Supreme Court nominee, Justice Neil Gorsuch, in that window. But despite the Trump administration leaving relatively few vacancies, federal judges began retiring at an extraordinary clip at the beginning of the Biden administration. And the Senate, despite a 50-50 partisan divide, has moved expeditiously on nominations, aided by the decline of the filibuster for judicial nominations.

But it’s probably the Biden administration’s decision to dispense with the American Bar Association’s approval process that has expedited the process most of all in these early days. Since the George W. Bush and Trump administrations also dispensed with the ABA, it’s not clear that any future administration will pre-clear nominations with it. Indeed, it is likely the Obama administration’s experience and frustration with the ABA’s process that made the decision an easy one for Obama-Biden alumni. Without the ABA’s pre-approval process, the Biden administration has been able to move much more quickly and much earlier on any given vacancy.

And the ABA has given glowing recommendations to every nominee thus far. We’ll see if that shine fades in the future, but it’s worth emphasizing that a separate frustration for the Obama administration was the ABA’s decision to rate a number of its prospective nominees “not qualified”; the Obama administration dutifully scuttled those nominations. The Biden administration, it appears, has no plans to do the same, so it’s a wait-and-see approach to whether the ABA ever deems one of Mr. Biden’s nominees “not qualified.”