Visualizing the 2023 U.S. News law school rankings--the way they should be presented

Five years ago, I pointed out that the ordinal ranking at the heart of the USNWR rankings is perhaps one of its greatest deceptions. USNWR crunches its formula and spits out a score. That score is normalized to give the top-scoring school (Yale) a score of 100, and the rest of the scores are scaled off that.

But the magazine then chooses to display the rank order of each school--even when there are significant gaps between the scores. To highlight one such example this year, Vanderbilt has a score of 80, USC has a score of 79, and Florida has a score of 73, which suggests that Vanderbilt and USC are quite close and that Florida is somewhat farther behind those two (even if in overall elite company!). But the magazine displays this as Vanderbilt tied for 17th, USC 20th, and Florida 21st--exaggerating the narrow gap between Vanderbilt and USC, and collapsing the much wider gap between them and Florida. And even though the magazine displays the overall score, the ordinal ranking drowns out these scores. Indeed, because the rankings are ordinal, there is no space from one school to the next, suggesting that the schools are spaced along equal intervals.

This plays out elsewhere in the rankings, as law students agonize over small differences in ordinal rank that mask fairly distinct clumpings of schools with little difference among them--indeed, in many cases, differences likely attributable only to rounding the raw score up or down to the nearest whole number.

Assuming one takes the USNWR formula seriously--which even USNWR doesn't appear to do, given its choice to rank--a better approach would be to visualize the relative performance of each school based on its score rather than assigning each school an ordinal rank. That provides better context about the relative position of schools to one another. And it can help illustrate sharp differences in the overall score, or groupings that reveal a high degree of similarity among a number of schools.

Below is my attempt to visualize the rankings in that fashion. (Please note that this may look best on a desktop browser due to the size of the chart.)

USNWR 2023 Rankings, Visualized by Overall Score

Score
100 Yale
99  
98 Stanford
97  
96 Chicago
95 Columbia | Harvard
94  
93 Penn
92 NYU
91 Virginia
90  
89 Berkeley
88 Michigan
87 Duke
86 Cornell
85 Northwestern
84 Georgetown
83 UCLA
82 Washington Univ.
81  
80 Boston Univ. | Texas | Vanderbilt
79 USC
78  
77  
76  
75  
74  
73 Florida | Minnesota
72 BYU | North Carolina
71 George Washington | Alabama | Notre Dame
70 Iowa
69 Georgia
68 Arizona State | Emory | George Mason | Ohio State | William & Mary
67 Illinois | Washington & Lee
66 Fordham | Davis | Irvine | Utah | Wake Forest
65  
64 Indiana | Wisconsin
63 Arizona
62 Texas A&M
61 Florida State | Maryland
60 Colorado | Washington
59 Hastings
58 Pepperdine | Richmond | Cardozo
57 Tulane
56 Tennessee | Villanova
55 Baylor | Penn State Dickinson | SMU | Houston | Wayne State
54 Temple
53 Penn State University Park | Connecticut | San Diego
52 Loyola (CA) | Kansas | Kentucky | Missouri | UNLV | Oregon
51 American | Loyola Chicago | Northeastern | Seton Hall | Miami
50 Case Western | Drexel | Georgia State | Denver | Nebraska | Pitt
49 St. John's | South Carolina
48 Rutgers | Arkansas
47  
46 Lewis & Clark | Cincinnati | Oklahoma
45 Michigan State | Hawaii | New Mexico
44 Chicago-Kent | Catholic | Buffalo | Louisville
43 Brooklyn | Florida International | Howard | Indianapolis | St. Louis
42 Syracuse | Montana
41 DePaul | Louisiana State | Marquette | Texas Tech | New Hampshire | Washburn
40 Drake | Stetson | Mississippi
39 Maine | Missouri-Kansas City
38 Gonzaga | Seattle
37 Chapman | Hofstra | Tulsa | West Virginia
36 Albany | Mercer | Suffolk | Baltimore | Dayton
35 Cleveland State | St. Thomas (MN)
34 Duquesne | NYLS | Wyoming | Willamette
33 Belmont | CUNY | Loyola New Orleans | Santa Clara | South Dakota | McGeorge
32  
31 Creighton | Samford | Detroit Mercy
30 Pace | Regent | Idaho | Memphis | Vermont
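
For those curious how such a chart can be assembled, below is a minimal sketch in Python. The school/score pairs are just a sample drawn from the chart above; a full version would load every school's overall score.

```python
# Sketch: render a score-anchored chart, one row per score point, so that
# gaps between scores stay visible (unlike an ordinal list). The dictionary
# holds only a sample of the scores from the chart above.
scores = {
    "Yale": 100, "Stanford": 98, "Chicago": 96,
    "Columbia": 95, "Harvard": 95,
    "Vanderbilt": 80, "USC": 79, "Florida": 73,
}

# Group schools by score so that ties share a row.
by_score = {}
for school, score in scores.items():
    by_score.setdefault(score, []).append(school)

# Print every score from the max down to the min, including empty rows --
# the empty rows are what make the gaps between clusters visible.
for s in range(max(by_score), min(by_score) - 1, -1):
    print(f"{s:>3} {' | '.join(sorted(by_score.get(s, [])))}")
```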

Breaking down my many posts on USNWR metrics

I’ve blogged plenty about various facets of the USNWR rankings over the years. Here’s an aggregation of the most significant posts.

Peer score

Do law professors generally think most other law schools are pretty awful? (2017)

Will Goodhart's Law come to USNWR's Hein-based citation metrics? (2019)

Gaming out Hein citation metrics in a USNWR rankings system (2019)

Significant one-year peer USNWR survey score drops, their apparent causes, and their longevity (2019)

Congrats to the University of Illinois-Chicago John Marshall Law School on an unprecedented USNWR peer score improvement (2020)

USNWR law school voters sank Yale Law and Harvard Law for the first time in rankings history. (2022)

Admissions (LSAT, GPA, acceptance rate)

Solving law school admissions; or, how U.S. News distorts student quality (2013)

Some more evidence of the scope of GRE admissions in legal education (2020)

For the second year in a row, Alabama's admissions standards (partially) trump Yale's (2021)

Non-LSAT standardized test scores in admissions remain concentrated at a handful of schools (2022)

Indebtedness

Indebtedness metrics and USNWR rankings (2021)

New USNWR metric favors $0 loans over $1 loans for graduating law students (2021)

Rethinking the best debt metrics for evaluating law schools (2021)

Employment

How state court clerkship opportunities affect legal employment (2014)

Law school-funded positions dry up with U.S. News methodology change (2016)

How should we think about law school-funded jobs? (2017)

At-graduation employment figures for law school graduates in 2018 (2021)

Bar exam

USNWR has erratically chosen whether "statewide bar passage" rate includes only ABA-approved law schools over the years (2022)

Some dramatic swings as USNWR introduces new bar exam metric (2022)

Expenditures

Trying (unsuccessfully) to account for law school expenditures under the USNWR rankings formula (2022)

Overall

If U.S. News rankings were a cake, you wouldn't want to follow the recipe (2014)

When U.S. News rankings aren't news, but just 15 months late (2014)

Visualizing the 2018 U.S. News law school rankings--the way they should be presented (2017)

The new arms race for USNWR law specialty rankings (2019)

The absurd volatility of USNWR specialty law rankings (2020)

The tension in measuring law school quality and graduating first generation law students (2020)

Law school inputs, outputs, and rankings: a guide for prospective law students (2021)

The hollowness of law school rankings (2021)

The USNWR law school rankings are deeply wounded--will law schools have the coordination to finish them off? (2021)

Overall legal employment for the Class of 2021 improves significantly, with large law firm and public interest placement growing

Despite an ongoing pandemic, disrupted legal education, challenging bar exams, remote interviews, and the like, the red-hot legal market benefited the Class of 2021. The trends were quite positive. Below are figures for the ABA-disclosed data (excluding Puerto Rico’s three law schools). These are ten-month figures from March 15, 2022 for the Class of 2021. (FTLT BPR refers to full-time, long-term bar passage-required jobs; FTLT JDA refers to full-time, long-term J.D. advantage jobs.)

Class Graduates FTLT BPR Placement FTLT JDA
Class of 2012 45,751 25,503 55.7% 4,218
Class of 2013 46,112 25,787 55.9% 4,550
Class of 2014 43,195 25,348 58.7% 4,774
Class of 2015 40,205 23,895 59.4% 4,416
Class of 2016 36,654 22,874 62.4% 3,948
Class of 2017 34,428 23,078 67.0% 3,121
Class of 2018 33,633 23,314 69.3% 3,123
Class of 2019 33,462 24,409 72.9% 2,799
Class of 2020 33,926 24,006 70.8% 2,514
Class of 2021 35,310 26,423 74.8% 3,056

The placement is still quite good. There was an increase of nearly 2,500 full-time, long-term bar passage-required jobs year-over-year, and the graduating class size was the largest since 2016. That yielded a placement rate of 74.8%. J.D. advantage jobs increased somewhat, too, perhaps consistent with an overall hot market.

It’s astonishing to compare the placement rates from the Class of 2012 to the present, from 56% to 75%. And the improvement is almost entirely attributable to the decline in class size.
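
For anyone who wants to reproduce the placement rates, here is a quick sketch of the arithmetic behind the table (figures are the ones in the table above):

```python
# Placement rate = FTLT bar passage-required jobs / total graduates.
classes = {
    "Class of 2012": (45_751, 25_503),
    "Class of 2020": (33_926, 24_006),
    "Class of 2021": (35_310, 26_423),
}

for label, (graduates, bpr_jobs) in classes.items():
    print(f"{label}: {bpr_jobs / graduates:.1%}")  # 55.7%, 70.8%, 74.8%

# Year-over-year change in FTLT BPR jobs, Class of 2020 to Class of 2021.
print(f"{26_423 - 24_006:+,}")  # +2,417, i.e., nearly 2,500
```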

We can see some of the year-over-year categories, too.

FTLT Class of 2020 Class of 2021 Net Delta
Solo 260 234 -26 -10.0%
2-10 4,948 5,205 257 5.2%
11-25 1,755 2,004 249 14.2%
26-50 1,010 1,218 208 20.6%
51-100 856 1,003 147 17.2%
101-250 1,001 1,143 142 14.2%
251-500 1,030 1,108 78 7.6%
501+ 5,073 5,740 667 13.1%
Business/Industry 2,546 3,070 524 20.6%
Government 3,189 3,492 303 9.5%
Public Interest 2,284 2,573 289 12.7%
Federal Clerk 1,226 1,189 -37 -3.0%
State Clerk 1,938 2,094 156 8.0%
Academia/Education 269 328 59 21.9%

The trend continues last year’s uptick in public interest placement, which is not an outlier. Public interest job placement is up over 80% since the Class of 2017. These eye-popping numbers continue to rise. It is likely no overstatement to say that law students are increasingly oriented toward public interest, and that there are ample funding opportunities in public interest work to sustain these graduates. Sole practitioners continue to slide (they were in the low 300s in raw placement not long ago).

Additionally, extremely large law firm placement continues to boom. Placement is up more than 2,000 graduates over the last several years, approaching 6,000.

Last year, I wondered whether the declines in government and clerkship placement were attributable to the pandemic; it appears that employment in these categories has rebounded.

Some figures have been updated to correct errors.

February 2022 MBE bar scores fall, match record lows

February 2022 matched a record low. After a record low in February 2017, another in February 2018, and another in February 2020, the February 2022 mean MBE score matched that 2020 low. The mean score was 132.6, down from 134.0 last year. (That’s off from the recent 2011 high of 138.6.) We would expect bar exam passing rates to drop in most jurisdictions (although results have been decidedly mixed, with wild swings depending on the jurisdiction—big improvements in Iowa and North Dakota, for instance, and a big decline in Oklahoma).

Given how small the February pool is in relation to the July pool, it's hard to draw too many conclusions from the February test-taker pool. The February cohort is historically much weaker than the July cohort, in part because it includes so many who failed in July and retook in February. The NCBE reports that more than two-thirds of test-takers were repeaters. There were significantly more repeaters than in February 2021 (unsurprising, given that 2020 pushed many first-timers to February). Comparisons to February 2020 seem better, then.

As interest in law schools appears to be waning, law schools will need to ensure that class quality remains strong and that they find adequate interventions to assist at-risk student populations.

Trying (unsuccessfully) to account for law school expenditures under the USNWR rankings formula

The most opaque elements of the USNWR law school rankings relate to expenditures. They make up 10% of the rankings. USNWR does not disclose them. The ABA does not disclose comparable figures. You can access them if you purchase access to the “Academic Insights” platform, but they are not otherwise publicly available (or available for any public distribution).

Nevertheless, expenditures are probably the most significant factor in driving movement in the rankings. The gap between the top end and the bottom end is significant, which gives schools the most opportunity to outperform others and climb in the weighted, scaled scores. Expenditures are also the least sticky metric. While peer reputation is notoriously sticky, and a whopping 25% of the ranking, it contributes very little volatility.

(This is not a new concern. Professor Brian Leiter and Professor Corey Rayburn Yung are just two recent examples of law faculty pointing to these problems.)

USNWR collects information from law schools on “expenditures.” (More on this undefined word in a moment.) It asks for “Instructional salaries,” “Administrative and student services salaries,” “Library salaries,” “Other salaries not included elsewhere,” “Fringe benefits,” “Law school expenses (exclude library),” “Financial aid,” and “Library operations.” There is a separate line for “indirect expenditures and overhead.” Law schools are instructed to “exclude expenditures for the LLM program.” That’s about all the instruction law schools are given.

The methodology reports that they calculate per-student metrics: “The average spending on instruction, library and supporting services (0.09) and the average spending on all other items, including financial aid (0.01): The faculty resources calculation for instruction, library and supporting services is adjusted for cost of living variations in law school salaries between school geographic locations by using publicly available Bureau of Economic Analysis Regional Price Parities index data.”

Some categories seem fairly straightforward: “Instructional salaries,” for instance.

But “Law school expenses,” an open-ended catch-all, remains a decidedly underdefined category, and a crucial (often large) component of the expenditure metrics, which are worth 9% of the rankings. Other “overhead” and adjacent expenses could fall into the “indirect” expenditures, which are weighted at just 1% of the rankings. The scandal exposed in 2005 revealed that everything from water bills to the “market value” of LexisNexis accounts had been stuffed into these metrics. (And there’s a whole separate and complicated issue about how law schools “count” their physical buildings in these “expenditure” metrics.)

One might think that the word “expenditures” means dollars out the door, but it’s been obvious for years that accounting methods (including depreciation or other cost-basis allocation) have been a part of how schools calculate “expenditures.”

But many law schools report extraordinary expenditures. How extraordinary?

Imagine we say that a law school could calculate X dollars in revenue from “gross tuition.” That is, assuming no scholarships and no “discount rate” (a totally implausible assumption, I note), let’s figure out how much money a law school might collect in total tuition from its students.

Many schools report “expenditures” of 2X that number. And some even higher.

How is that even possible? Let’s try another small experiment, putting this in real dollar terms. If a school has, say, 600 students and $50,000/year tuition, we can say that “gross tuition” revenue is $30 million. (Again, an implausible assumption, but work with me.) A school reporting expenditures of $60 million, or $75 million, would be burning an extraordinary amount of additional, outside capital each year if those figures were truly “expenses.”

Law schools are usually cagey about revealing their endowments. We have some figures at the high end: Harvard Law in 2008 reported an endowment of $1.7 billion, which at 5% would spin off $85 million a year. (Of course, Harvard has around 1700 JD students, not 600.) Yale Law in 2009 reported an endowment of $1.2 billion, which would spin off around $60 million.

But most endowments are much more modest. Emory Law in 2016 reported an endowment of $43 million that would spin off $1.7 million a year—and most of that went toward scholarships (i.e., the 1% category, not the 9% category). The University of Texas Law Foundation presently reports an endowment of more than $150 million, which would spin off around $7.5 million. (At any of these institutions, parts of the endowment may be held by the parent university for use at the law school, so the figures could be much larger.)
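
To make the comparison concrete, here’s a minimal sketch of that back-of-the-envelope math, assuming the 5% annual endowment payout rate used above:

```python
# Endowment payouts at an assumed 5% annual spin-off rate, using the figures
# quoted above, set against a hypothetical school's gross tuition.
def annual_payout(endowment: float, rate: float = 0.05) -> float:
    return endowment * rate

endowments = {
    "Harvard Law (2008)": 1.7e9,              # ~$85M per year
    "Yale Law (2009)": 1.2e9,                 # ~$60M per year
    "Texas Law Foundation (present)": 150e6,  # ~$7.5M per year
}
for school, endowment in endowments.items():
    print(f"{school}: ${annual_payout(endowment) / 1e6:,.1f}M per year")

# The hypothetical school from above: 600 students at $50,000/year tuition.
print(f"Gross tuition baseline: ${600 * 50_000 / 1e6:,.0f}M")  # $30M
```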

It’s possible, of course, that many law schools earn a tremendous amount in one-time or annual gifts, among other sources. And it’s also possible that parent universities are much more heavily subsidizing their law schools than they’ve otherwise let on. Finally, it’s also possible that law schools with growing non-JD programs are funneling the profits from those programs into the JD program.

These possibilities seem limited. I don’t know of many schools with small endowments that are consistently pulling in extremely large gifts each year. Parent universities can help subsidize in some cases and in limited circumstances, but even these subsidies rarely go beyond a few million dollars a year in the most extreme cases. And it’s not clear how expansive these non-JD programs are: certainly not so expansive at most universities as to subsidize a JD program so heavily.

And, as I mentioned, this is an implausible baseline. Most schools have a fairly significant “discount rate” and offer a significant number of scholarships to students, offsetting tuition (and counting as “indirect” expenditures). So schools not only need revenue for these “indirect” expenditures, but they’re also spending far more than whatever the X, 2X, etc. “gross tuition” baseline is.

In short, then, almost however you look at it, it’s impossible to account for law school expenditures under the USNWR rankings formula, except to conclude that there are accounting assumptions being made that bear no relation to the reality of the quality of the education students receive.

This grows more obvious and pronounced each year. But, like many critiques of the rankings, it seems unlikely to go anywhere anytime soon.

USNWR has erratically chosen whether "statewide bar passage" rate includes only ABA-approved law schools over the years

I was directed to the fact that the new USNWR bar exam metric includes “the weighted state average among ABA accredited schools' first-time test takers in the corresponding jurisdictions in 2020.” “ABA accredited” was added. Didn’t the first-time bar exam passage rate only include ABA accredited schools in the past?

The previous methodology looked at the modal state where a law school’s graduates took the bar exam, and the “jurisdiction's overall state bar passage rate for first-time test-takers in winter and summer” of that year.

I looked at the 2022 rankings (released in 2021, using the 2019 bar exam data). I picked California, known for its significant cohort of non-ABA test-takers. The overall first-time pass rate was 59%, but the first-time pass rate among ABA accredited schools was 69%. (Historical stats are here.) USNWR used the 59% rate.

At first, that surprised me. I had assumed USNWR only used ABA accredited data. It also made me think that California schools would be harmed the most by this shift in metrics (even if I think it’s more accurate). That’s because California schools are less likely to “overperform” when the comparison pass rate is higher (i.e., one computed using only ABA accredited test-takers instead of all test-takers).

But then I dug further.

The 2021 rankings (released in 2020, using 2018 bar exam data) reported California’s first-time bar pass rate as 60%. The ABA first-time rate was 60%. But the overall rate was 52%. So in this year, USNWR used only ABA accredited schools.

The 2020 rankings (released in 2019, using 2017 bar exam data) reported a first-time pass rate of 58%. That’s the same as the overall first-time pass rate of 58%, not the 66% from ABA accredited law schools. So in this year, USNWR used overall first-time pass rates. And it appears USNWR did the same in 2019 (released in 2018, using 2016 bar exam data).

In short, there does not appear to be any reason why USNWR has used one method or another over the years. Certainly, this year it is expressly using only ABA data, and maybe it intends to stick with that going forward. But it’s another, subtle change that could adversely affect those schools (e.g., California) with a significant cohort of non-ABA test-takers. It’s probably the right call. But it also highlights the inconsistency of USNWR in its methodology over the years.

USNWR law school voters sank Yale Law and Harvard Law for the first time in rankings history.

The USNWR “peer score” is the single most heavily-weighted component of the law school rankings. USNWR surveys about 800 law faculty (the law dean, the associate dean for academics, the chair of faculty appointments and the most recently-tenured faculty member at each law school). Respondents are asked to evaluate schools on a scale from marginal (1) to outstanding (5). There’s usually a pretty high response rate—this year, it was 69%.

Until this year, Yale & Harvard had always been either a 4.8 or 4.9 on a 5-point scale in every survey since 1998.

But this year, Harvard's peer score was a 4.7. And Yale's was a 4.6.

What precipitated the drop is anyone’s guess (Harvard’s, for instance, could be close to a rounding error: it may have been a 4.76 in the past and a 4.74 now). But respondents do seem to react to certain influences, and one can only speculate about what might have prompted such responses in the fall of 2021 or early 2022, when this cohort was surveyed.

Some dramatic swings as USNWR introduces new bar exam metric

The latest USNWR law school ranking has some significant swings in the bar exam component. It made three significant changes: increasing the weight from 2.25% to 3%; measuring “all graduates who took the bar for the first time”; and including graduates who were admitted via diploma privilege in both a school’s passers and the overall passers. From the methodology:

Specifically, the bar passage rate indicator scored schools on their 2020 first-time test takers' weighted bar passage rates among all jurisdictions (states), then added or subtracted the percentage point difference between those rates and the weighted state average among ABA accredited schools' first-time test takers in the corresponding jurisdictions in 2020. This meant schools that performed best on this ranking factor graduated students whose bar passage rates were both higher than most schools overall, and higher compared with what was typical among graduates who took the bar in corresponding jurisdictions.

For example, if a law school graduated 100 students who first took the bar exam – and 88 took the Florida exam, 10 the Georgia exam and two the South Carolina exam – the school's weighted average rate would use pass rate results that were weighted 88% Florida, 10% Georgia and 2% South Carolina. This computation would then be compared with an index of these jurisdictions' average pass rates – also weighted 88-10-2. (For privacy, school profiles on usnews.com only display bar passage data for jurisdictions with at least five test-takers.) Both weighted averages included any graduates who passed the bar with diploma privilege. Diploma privilege is a method for J.D. graduates to be admitted to a state bar and allowed to practice law in that state without taking that state's actual bar examination. Diploma privilege is generally based on attending and graduating from a law school in that state with the diploma privilege.

In previous editions, U.S. News divided each school's first-time bar passage rate in its single jurisdiction with the most test-takers by the average for that lone jurisdiction. This approach effectively excluded many law schools' graduates who took the bar. Dividing by the state average also meant the location of a law school impacted its quotient as much as its graduates' bar passage rate itself. The new arithmetic accounts for average passage rates across all applicable jurisdictions as proxy for each exam's difficulty and reflects that passing the bar is a critical outcome measure in itself.

The new methodology really changes the results for two kinds of schools. (The increase in the weight from 2.25% to 3% obviously also magnifies the benefit for schools that do well and the harm for schools that do poorly.)

First, it benefits good schools in jurisdictions with tougher bars and strong out-of-state placement.

Second, it harms Wisconsin’s two law schools.

Let’s start with the first. Which schools benefited most from 2022 (measuring the 2019 bar) to 2023 (measuring the 2020 bar)? (These charts exclude a handful of schools that did not include their bar passage statistics this time around.)

School Pass rate 2019 Jurisdiction Jurisdiction rate Cumulative pass rate 2020 Cumulative jurisdiction rate USNWR score delta
San Francisco 38.7% CA 59% 78.4% 78% 0.0497
William & Mary 86.7% VA 78% 96.9% 81% 0.0323
Washington & Lee 80.0% VA 78% 92.6% 81% 0.0317
Emory 84.5% GA 77% 91.7% 78% 0.0295
Minnesota 94.0% MN 81% 98.9% 82% 0.0279
Georgia 94.5% GA 77% 94.4% 76% 0.0272
Kentucky 78.4% KY 75% 90.6% 80% 0.0266
Montana 88.9% MT 85% 92.5% 82% 0.0255
Penn State-Dickinson 88.5% PA 80% 91.7% 79% 0.0249
Drexel 77.1% PA 80% 84.0% 78% 0.0249

On the left are the school’s pass rate in 2019, its modal jurisdiction, and that jurisdiction’s pass rate. Next are the cumulative pass rate in 2020 and the cumulative jurisdiction rate. Finally, the USNWR score delta shows how much better the school did this year compared to last year in the weighted Z-score.

(I noted last year that we saw major swings at some schools in 2020. We see how those are playing out here.)

The University of San Francisco saw a tremendous improvement in California of almost 40 points (aided in part by a lower cut score in California in 2020). But the next three schools are telling. William & Mary and Washington & Lee are strong schools in a very tough bar exam market (Virginia is one of the toughest bars in the country), and Emory sits in Georgia, a bar of above-average difficulty. Each did reasonably well in 2019. But when adding in performances in other jurisdictions, their scores climbed. ABA data shows W&M went 15-for-15 in DC, 15-for-15 in Maryland, and 10-for-10 in New York. All were excluded under the old metric; all are easier bars than Virginia’s. W&L grads went 13-for-14 in DC, 13-for-13 in North Carolina, and 9-for-10 in New York. Emory went 21-for-21 in New York and 11-for-11 in Florida.

In other words, a diffuse and successful bar exam test-taking base redounds to the benefit of these schools.

Let me add one more detail. The new methodology puts law schools closer to parity with one another when comparing bar passage rates, especially those outside the “outliers.” The more graduates a school has taking the bar across jurisdictions, the less the difficulty of any one bar matters in the end; and the inclusion of “diploma privilege” (or adjacent) admissions lifts the results. The 2019 “denominator” of the bar exam ranged from 55% at the low end among law schools (i.e., Maine) to 87% at the top end (i.e., Kansas), a gap of 32 points. That shrunk a bit in 2020 with the new methodology, from 70% to 99% (29 points). But the difference between the 10th and 90th percentiles shrunk significantly, from 2019 (61% and 81%, 20 points) to 2020 (75% and 86%, 11 points). In other words, the spread between roughly the 19th and 168th law schools in terms of their “jurisdiction pass rate” was about half as large under this year’s “overall pass rate” as it was last year.
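
The spread computation is simple enough to sketch (the per-school jurisdiction rates below are hypothetical; the figures in the text come from the real data):

```python
import numpy as np

# Compare the full min-to-max range of per-school jurisdiction pass rates
# against the 10th-90th percentile spread, which better reflects the
# non-outlier schools.
rates = np.array([0.55, 0.61, 0.68, 0.72, 0.75, 0.78, 0.81, 0.87])  # hypothetical

full_range = rates.max() - rates.min()
p10, p90 = np.percentile(rates, [10, 90])
print(f"full range: {full_range:.0%}, 10th-90th spread: {p90 - p10:.0%}")
```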

Let’s look at the worst-performing schools.

School Pass rate 2019 Jurisdiction Jurisdiction rate Cumulative pass rate 2020 Cumulative jurisdiction rate USNWR score delta
Western State 56.7% CA 59% 51.7% 78% -0.0688
Ohio Northern 95.7% OH 79% 66.7% 81% -0.0658
Golden Gate 43.9% CA 59% 44.1% 78% -0.0620
Faulkner 81.8% AL 77% 60.7% 79% -0.0584
Marquette 100.0% WI 71% 98.2% 99% -0.0538
Southern Illinois 59.4% IL 79% 50.6% 82% -0.0513
Wisconsin 100.0% WI 71% 100.0% 99% -0.0497
CUNY 74.5% NY 74% 66.7% 86% -0.0493
Pepperdine 81.0% CA 59% 78.6% 78% -0.0455
Pace 76.0% NY 74% 69.6% 85% -0.0422

You can see that several schools performed worse, or relatively worse, compared to their 2019 figures (again, consistent with what I noted earlier about major swings at some schools in 2020). But note the outliers. Marquette (98.2%) and Wisconsin (100%) both have extraordinarily high bar passage rates, due principally to in-state diploma privilege.

In the past, this redounded to their benefit, as ordinary test-takers who took the bar exam passed at rates substantially lower than 100% (see the 71% jurisdiction rate in 2019), giving them a huge advantage. The new USNWR methodology, however, includes all of those diploma privilege admittees as “passers” in the cumulative jurisdiction pass rate, too. Wisconsin and Marquette used to perform 30 points above the average; they’re now basically at the average.

In one sense, there’s a greater honesty to the metric in comparing similarly-situated graduates to one another. But it comes at the cost of punishing two schools whose graduates are all (or nearly all) immediately able to practice law. That’s a tremendously valuable outcome for law students.

It might be beneficial for USNWR to instead include two factors: absolute passers and relative passers (like this one). Some (especially California deans!) critique an “absolute” pass rate that fails to account for the difficulty of the bar. But if we care about law students’ ability to practice law, it seems to me that it’s important to capture whether schools are successfully getting their graduates there, regardless of how hard or easy the bar exam is. (Of course, relative performance also should matter, I think, at least to some degree, as it suggests that some schools are improving opportunities for their graduates.) I concede others would disagree.

How did other schools, like those in Utah, Washington, or Oregon, not perform much better or worse despite emergency “diploma privilege” being introduced? Recall it’s a mixed bag, depending on the school and the state, on history, and on out-of-state test-takers. February 2020 did not have such exemptions, so they are only partially reflected in the figures above. Utah and Oregon still had a decent set of in-state test-takers, as diploma privilege did not extend to everyone, but schools in those states didn’t see dramatic changes in overall passing rates: in both states, diploma privilege was keyed to pre-set levels of test-taker success (86%, with an exception in Oregon for in-state schools), which meant most people taking the test would have passed anyway. Washington, in contrast, opened up diploma privilege to essentially all test-takers, and the corresponding increase in passers put the University of Washington near the bottom of changes from 2019 to 2020 (suffering something like what Wisconsin and Marquette experienced this year).

It’s a seemingly small change in methodology, and it’s hard to know what a number like “0.0497” means to an overall score. But it’s worth identifying that the changes are not value-neutral and can affect similarly-situated schools quite differently.

Federal judges have already begun to drift away from hiring Yale Law clerks

On the heels of the latest controversy at Yale Law School, which David Lat ably describes over at Original Jurisdiction, a federal judge penned an email to fellow judges: “The latest events at Yale Law School, in which students attempted to shout down speakers participating in a panel discussion on free speech, prompt me to suggest that students who are identified as those willing to disrupt any such panel discussion should be noted. All federal judges—and all federal judges are presumably committed to free speech—should carefully consider whether any student so identified should be disqualified from potential clerkships.”

The truth is, Yale Law has already seen falling clerkship placement numbers in recent years. Incidents like this may harden some judges’ opposition. (There are caveats, of course, about what factors affect a judge’s hiring practices, the political salience of the issues here, and so on.)

I closely track federal judicial clerkship placement, and I have in recent years included a three-year average of clerkship placement in a report I release every two years. The latest version of that report is here. But we can look at some trends among a handful of schools. I select eight of the (historically) highest-performing: Yale, Stanford, Chicago, Harvard, Duke, Virginia, Michigan, and UC-Irvine. I’ll look at the last eight years’ placement. (Any choice of schools and window of time is a bit arbitrary, and I could go back for more data or more schools if I wanted. I didn’t look at 2012 or earlier data, so I don’t know what I’m missing with this cutoff.)

Let me start by pointing out that the total placement among recent graduates has been fairly steady (see the chart). Schools report between 1,150 and 1,250 placements per year.

Some declines may well be attributable to vacancies in the federal judiciary that were unfilled. It does not appear that there is a “trend” of hiring materially fewer recent law school graduates in favor of clerks with work experience.

But this means that there’s a roughly fixed set of possible clerkship positions each year. If some schools are declining in placement, we would expect to see other schools improving in placement. We can’t necessarily treat those as one-to-one tradeoffs (e.g., a judge “stops” hiring from Yale and “starts” hiring from Chicago), but we can watch some aggregate trends.

I’ll start with the percentage of graduates placed into full-time, long-term federal clerkships. Admittedly, this doesn’t capture those who work first and clerk later. But there is some consistency in the reporting of data over the years. It makes no distinction among the competitiveness of clerkships or the types of judges (e.g., appellate or district court). Percentages can also fluctuate with class size or be deceptive because of class size; I’ll dig into the raw figures in a moment.

A few items stand out. Yale would typically place between 25% and 35% of its class into federal clerkships. Its number is low in 2020, but not the lowest in this time period. A couple of times, Stanford has placed a higher percentage of clerks than Yale.

But noteworthy is Chicago’s climb, from 10% of the class in 2013 to a whopping 27.6% in 2020, for the first time in recent memory besting Yale.

A few other trends are noteworthy. Apart from Irvine’s decline (which may coincide with the departure of founding Dean Erwin Chemerinsky), we see the University of Virginia placing fourth with 17.5% placement. It’s done well in recent years, including occasionally edging out Harvard, and (apart from a 2017 dip) shows a trendline of consistent and perhaps improving placement.

Let’s now look at the raw totals of placement. Recall that these figures help assess placement into a market of roughly 1,150 to 1,250 total new clerks a year.

Harvard tops the list, as its 15-20% placement into clerkships still means a whopping 80 to 120 clerks a year, given its tremendous class size. But it is notable to see it at an eight-year low in placement. Yale, which had consistently been second in raw placement for the previous seven years, slipped to fourth in 2020, as both Chicago (56) and Virginia (55) placed more federal clerks than Yale (52).

Now, it’s perhaps no coincidence that Yale graduated just 197 students in the Class of 2020, its smallest class in this eight-year period, and perhaps correspondingly saw a decline in overall placement in different ways. Still, federal judges needed clerks in 2020. They simply looked elsewhere at slightly higher rates.
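
To illustrate the interplay of rates and raw counts, a small sketch (Yale’s figures are from the text; the class sizes for Chicago and Virginia are assumptions back-solved from the percentages quoted above):

```python
# Clerkship placement rate = FTLT federal clerks / graduating class size.
placements = {
    "Yale": (52, 197),      # clerks and class size from the text
    "Chicago": (56, 203),   # class size assumed to match the quoted 27.6%
    "Virginia": (55, 314),  # class size assumed to match the quoted 17.5%
}

for school, (clerks, class_size) in placements.items():
    print(f"{school}: {clerks} clerks, {clerks / class_size:.1%} of class")
```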

But at a larger level, it’s worth noting that federal judges do change their hiring preferences, and we may be witnessing some of that right now, regardless of whether some judges are “investigating” whether some graduates of some law schools have acted in a disruptive manner at a public event. There are, of course, any number of reasons why federal judges looked elsewhere, returning to a point at the top of this post. It could be that law students at some law schools, more than others, are self-selecting out of applying to federal judges (opting for lucrative large law firm placement, competitive government positions, or the booming public interest sector).

And finally, it could also be that this blip is hardly a “trend,” and we’ll wait a month to see what the Class of 2021 figures show.