What I got right (and wrong) in projecting the USNWR law rankings

In January, when I projected the new USNWR law rankings, I wrote, “I wanted to give something public facing, and to plant a marker to see how right—or more likely, how wrong!—I am come spring. (And believe me, if I’m wrong, I’ll write about it!)”

With the rankings out, we can compare them to my projections.

A couple of assumptions were pretty good. Ultimate bar passage and student-librarian ratio were added or re-imagined factors. More weight was put on outputs. Less weight was put on the peer score.

But I thought USNWR would need to add some weight to admissions statistics to make up for the loss of other categories. I was wrong. They diminished those categories and added a lot—a lot—of weight to outputs. Employed at 10 months rose from 14% to 33%. First-time bar passage rose from 3% to 18%. Those are massive changes. For reference, I thought a reasonable upper-bound for employment could be 30% and first-time bar passage 12%.

The model was still pretty good.

I got 13 of 100 schools exactly right—not great.

63 schools hit the range I projected them in—pretty good, but not great.

But 81/100 schools placed in the general trend I had for them—whether they would rise, fall, or stay in the same spot. Even where I missed the exact position, for almost all schools I hit the right direction.

Again, part of this comes from the fact that so many factors correlate with one another that it’s relatively easy to spot trends, even if my models missed some of the biggest swings. But I also included some bigger swings in my factors, which I think also helped put the trajectory in the right place.

Barring changes to the methodology next year (sigh)….

Did "boycotting" the USNWR law rankings affect those schools' peer scores?

On the heels of examining the peer score declines of Yale and Harvard in this year’s USNWR rankings, I wanted to look at the peer scores more generally.

Earlier this year, when I was modeling USNWR law rankings, I offered this thought:

I used last year’s peer and lawyer/judge scores, given how similar they tend to be over the years, but with one wrinkle. On the peer scores, I reduced any publicly “boycotting” schools’ peer score by 0.1. I assume that the refusal to submit peer reputational surveys from the home institution (or, perhaps, the refusal of USNWR to count those surveys) puts the school at a mild disadvantage on this metric. I do not know that it means 0.1 less for every school (and there are other variables every year, of course). I just made it an assumption for the models (which of course may well be wrong!). Last year, 69% of survey recipients responded, so among ~500 respondents, the loss of around 1% of respondents, even if quite favorable to the responding institution, would typically not alter the survey average. But as more respondents remove themselves (at least 14% have suggested publicly they will, with others perhaps privately doing so), each respondent’s importance increases. It’s not clear how USNWR will handle the reduced response rate. This adds just enough volatility, in my judgment, to justify the small downgrade.

Was that true?

USNWR’s methodology provides that it withdrew the survey responses of “boycotting” schools: “Peer assessment ratings were only used when submitted by law schools that also submitted their statistical surveys. This means the schools that declined to provide statistical information to U.S. News and its readers had their academic peer ratings programmatically discarded before any computations were made.” So my first assumption was right.

But did it affect those schools adversely?

Among these ~60 schools, 7 of them saw an increase in their peer score (12%). Another 28, nearly half, saw a decline. The average effect on their peer score was a decline of 0.043, slightly less than the 0.1 I projected, but still an average decline.

Another 130 or so did not boycott. 29 of them (22%) saw an increase in their peer score, and 36 (27%) saw a decline—a mixed bag, with declining schools slightly outpacing increasing schools. The average effect on their peer score was a marginal decrease of less than 0.01—in other words, a decline, but somewhat smaller than that of the “boycotting” schools. (Peer scores have long been declining.)

So, my assumption was somewhat right—it slightly overstated the size of the effect, but it correctly identified the direction.

Schools that saw a 0.2-point decline: Emory,* Chicago-Kent, UC-Irvine,* UCLA,* Illinois, New Hampshire,* Toledo, Yale*

Schools that saw a 0.2-point increase: Northern Kentucky

*denotes schools that “boycotted” the rankings

Yale, Harvard Law peer scores in USNWR law rankings plunge to lowest scores on record

Last year, I noted a change in the USNWR “peer score” for Yale and Harvard. Until this year, the “peer score” was the single most heavily-weighted component of the law school rankings. USNWR surveys about 800 law faculty (the law dean, the associate dean for academics, the chair of faculty appointments and the most recently-tenured faculty member at each law school). Respondents are asked to evaluate schools on a scale from marginal (1) to outstanding (5). There’s usually a pretty high response rate—last year, it was 69%; this year, it was 64.5%.

Until last year, Yale & Harvard had always been either a 4.8 or 4.9 on a 5-point scale in every survey since 1998.

Last year, Harvard’s peer score dropped to 4.7; Yale’s dropped to 4.6.

USNWR changed its methodology and reduced the peer score from 25% of the rankings to 12.5% of the rankings. It also explained, “Peer assessment ratings were only used when submitted by law schools that also submitted their statistical surveys. This means the schools that declined to provide statistical information to U.S. News and its readers had their academic peer ratings programmatically discarded before any computations were made.”

Harvard’s score dropped again to 4.6. Yale’s score plunged again this year, to 4.4.

Because peer score was so significantly devalued in this year’s methodology, it affected them less than it might have in previous years.

Some perspective on Yale’s decline, and a caveat.

On perspective: very few schools have experienced a peer score decline of 0.4 over two years. Since USNWR began the survey in its 1998 rankings, it has happened four times. (Below are the years of the survey administration, not the release in USNWR.) One was New York Law School, surveyed in 2010-2012. The other three schools were one-year drops of 0.4, St. Louis University in 2012, Illinois in 2011, and Villanova in 2011, all of which arose from admissions or prominent university scandals, as I chronicled here in 2019. Yale is just the fifth school to experience such a drop.

The caveat is, a very high peer score is harder to maintain if anything goes wrong. A few vindictive 1s from survey respondents can drop a 4.9 much more quickly than they can drop a 3.5 or a 2.0. Of course, we have no idea whether there’s simply a larger number of faculty rating Yale a 4 instead of a 5, or if other survey responses are changing the results.

It’s entirely possible that voters are “reacting” against Yale (or Harvard) for actions over the last couple of years, whether public disputes arising from politically-charged episodes on campus or negative reaction to initiating the “boycott” that prompted a methodology change some schools did not want. The ~63 “boycotting” schools are clustered closer to the top of the overall rankings, and it’s possible those schools (staffed with more graduates of these schools) thought more highly of them; with their responses out of the rankings, the peer score fell. It’s also possible that the composition of administrators and recently-tenured faculty across the country includes fewer graduates of these schools than in years past, a slow shift away from institutional loyalty to their “status.” On it goes.

Nevertheless, the peer survey is the one national barometer, if you will, of sentiment among law school deans and faculty. We shall see how USNWR proceeds next year—particularly as law schools may not be inclined to share the names of potential survey respondents next fall, which means another methodological change may be coming.

New USNWR methodology will yield dramatically more compression and volatility in law school rankings

Back in December, I wondered about the “endgame” for law schools purporting to “boycott” the USNWR law school rankings. I noted one effort may be to delegitimize the rankings. Another was to change the methodology, but that seemed questionable:

So, second, persuade USNWR to change its formula. As I mentioned in the original Yale and Harvard post, their three concerns were employment (publicly available), admissions (publicly available), and debt data. So the only one with any real leverage is debt data. But the Department of Education does disclose a version of debt metrics of recent graduates, albeit a little messier to compile and calculate. It’s possible, then, that none of these three demanded areas would be subject to any material change if law schools simply stopped reporting it.

Instead, it’s the expenditure data. That is, as my original post noted, the most opaque measure is the one that may ultimately get dropped: if USNWR chooses to go forward with rankings, it would need to use only publicly-available data, which may mean expenditure data drops out.

Ironically, that’s precisely where Yale and Harvard (and many other early boycotters) excel the most. They have the costliest law schools and are buoyed in the rankings by those high costs.

So, will the “boycott” redound to the boycotters’ benefit? Perhaps not, if the direction is toward more transparent data.

USNWR has announced dramatic changes to its methodology. The weight of inputs (including admissions figures) will be decreased significantly. The weight of outputs will be increased significantly. And reliance on non-public data (like expenditures) will disappear.

As I noted last December, it was not likely that the changes in methodology to more transparent data or away from certain categories would redound to the benefit of some of these boycotting law schools. Harvard, which has dropped to 5 this year, has experienced it. Yale, which I anticipate will drop to 4 next year (more on that in a future post), is another.

In other words, it’s not clear that the boycott benefited the initiating schools and appears to have harmed their competitive advantage.

But I get ahead of myself.

The changes are significant. You can read about them here. There are reasons to think the changes are good, bad, or mixed, but that’s not what I want to focus on for the moment. I want to focus on the effect of the changes.

The changes move away from the least volatile metrics toward the most volatile metrics. Some volatility is because of compression—some metrics are very compressed, and even slight changes year to year can affect the relative positioning of schools. Some of the volatility is because of the nature of the factors themselves—a handful of graduates who fail or pass the bar, or who secure good jobs or end up unemployed, produce much more volatile outcomes on a year-to-year basis than changes in admissions statistics.

Expenditures per student, now eliminated from the rankings, were not very volatile because there was a huge spread between schools and the schools remained roughly in place each year—it’s very hard to start miraculously spending an extra $20,000 per student per year, and any marginal increase was often offset by everyone else’s marginal increases. Additionally, the vast spread in Z-scores allowed law schools to spread themselves out (especially at the top), which could drown out much of the movement in other factors in the rankings.
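The mechanics can be sketched with a toy example (all figures below are hypothetical, not actual USNWR data): when one factor is heavily skewed at the top, as expenditures were, standardizing to z-scores leaves the top school far ahead of the pack, while a compressed factor like employment yields a much smaller lead.

```python
# Toy illustration of z-score spread (hypothetical figures, not USNWR data).
# A skewed factor like expenditures separates the top school from the pack
# far more than a compressed factor like employment.
from statistics import mean, pstdev

def z_scores(values):
    """Standardize values to mean 0, standard deviation 1."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

# Hypothetical per-student expenditures: one school far above the rest.
expenditures = [200_000, 70_000, 55_000, 45_000, 40_000, 38_000]
# Hypothetical employment rates: tightly clustered near the top.
employment = [0.99, 0.97, 0.96, 0.95, 0.94, 0.93]

z_exp, z_emp = z_scores(expenditures), z_scores(employment)
gap_exp = z_exp[0] - z_exp[1]  # top school's lead, in standard deviations
gap_emp = z_emp[0] - z_emp[1]
print(round(gap_exp, 2), round(gap_emp, 2))
```

In this sketch, the top school leads by more than two standard deviations on the skewed factor but only about one on the compressed one—with expenditures gone, no factor gives top schools that kind of separation, so scores bunch together.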

In employment, there is tremendous compression near the top of the rankings in particular. The difference between 99% employed and 97% employed can mean a rankings change of 15 spots; from 90% to 87% can be 50 spots for the Class of 2021. (These percentages are also influenced by how USNWR weighs various sub-categories of employment.) (Additionally, I was right last month that law schools misunderstood the employment metrics—essentially nothing changed for most schools, although a couple of outliers lower in the rankings saw their position improve markedly, likely due to USNWR data entry errors—the rankings were closer to my original projections derived from ABA data.)

And, some metrics simply don’t change much year to year. Let’s just briefly look at the correlation between some figures from the 2023 and 2024 data that have decreased in weight.

Peer assessment score (weight decreased from 25% to 12.5%): 0.996

Lawyer/judge assessment score (weight decreased from 15% to 12.5%): 0.997

Median LSAT score (weight decreased from 11.25% to 5%): 0.993 (Strictly speaking, law schools are ranked on a composite of their GRE and LSAT medians, but this captures the vast majority of the rankings factor.)

Median UGPA (weight decreased from 8.75% to 4%): 0.965

These four categories have dropped from a combined 60% of the rankings to 34%—and these four were remarkably stable year to year.

And now to the correlation between some figures from the 2023 and 2024 data that have increased in weight:

First-time bar passage (weight increased from 3% to 18%): 0.861

10-month employment outcomes (weight increased from 14% to 33%): 0.809

These two categories alone have gone from 17% of the rankings to 51% of the rankings—and these are much more volatile.
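The stability contrast above can be reproduced with a quick Pearson correlation on made-up two-year data (the numbers below are illustrative stand-ins, not the actual school figures):

```python
# Illustrative year-over-year correlations (hypothetical data, not the
# actual school figures). Peer scores barely move, so r is near 1;
# employment outcomes reshuffle, so r is noticeably lower.
from math import sqrt
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

peer_2023 = [4.8, 4.5, 4.1, 3.6, 3.0, 2.5, 2.1]
peer_2024 = [4.7, 4.5, 4.0, 3.6, 2.9, 2.5, 2.0]        # tiny shifts
emp_2023 = [0.95, 0.90, 0.88, 0.80, 0.75, 0.70, 0.60]
emp_2024 = [0.90, 0.93, 0.80, 0.85, 0.70, 0.74, 0.66]  # real reshuffling

r_peer = pearson(peer_2023, peer_2024)
r_emp = pearson(emp_2023, emp_2024)
print(round(r_peer, 3), round(r_emp, 3))
```

Even these toy figures show why shifting weight from the stable factors to the volatile ones injects movement into the overall ranking.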

We can compare the spread of schools from top to bottom, visualizing them by their overall “score.” (I’ve done this before, including last year.) We can see a dramatic compression in the scores these schools received that form the basis of their ultimate ranking. This compression is the result of two decisions by USNWR.

First, USNWR has increased the weight of categories that are much more compressed (e.g., employment) rather than ones that were much more spread out (e.g., expenditures). (Of course, USNWR eliminated non-public data like expenditures as a result of the law schools’ “boycott,” which deprived it of that data.)

Second, it has added Puerto Rico’s three schools to the rankings. While this may not seem like a big deal, it is a huge deal if you are putting all the schools on a 0-100 scoring spectrum.

One school, Pontifical Catholic University of Puerto Rico, is so far below every other school that it singularly distorts the scale for everyone. Its median LSAT score is a 135, which is roughly in the 4th percentile of LSAT test-takers (the next lowest is 143 at Inter-American, another Puerto Rican law school, which is roughly the 21st percentile). Only 19% of its graduates secured full-time, long-term, bar passage-required or JD-advantage jobs (the next lowest is Inter-American at 37%). And these law schools have long had challenges retaining ABA accreditation with their bar exam pass rates. This is not to pick on one law school—it is simply to note its numbers compared to the rest.

In past years, the gap between the “worst” and “second-worst” school in these metrics was relatively close, so the 0-100 scale showed some appropriate breadth—the bottom school might be a 0, but the next closest school would be a 3, and the next a 5, and so on upward. (While USNWR does not publicly disclose the scores of schools near the bottom of the rankings, it does not take much effort to reverse engineer the rankings, as I was doing earlier this year, to see where schools fall.) This year, however, the gap between the “worst” and “second-worst” school is significantly larger, which means no one is close to the school that scores 0. And again, standardizing on a 0-100 scale means more compression for 195 schools, followed by a gap ahead of the 196th school—it effectively converts a 0-100 scale into something like a 15-100 or 20-100 scale. This compression creates more ties in the ordinal rankings.
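The scaling effect is easy to see in a minimal sketch (hypothetical composite scores, not actual USNWR figures): the same pack of schools gets squeezed toward the top of the 0-100 scale once one far outlier anchors the bottom.

```python
# Sketch of a USNWR-style 0-100 rescaling (hypothetical raw scores, not
# actual USNWR data): one extreme outlier at the bottom compresses
# everyone else into the upper portion of the scale.
def rescale_0_100(scores):
    lo, hi = min(scores), max(scores)
    return [100 * (s - lo) / (hi - lo) for s in scores]

# Raw composite scores with the bottom school close to the pack:
close_bottom = [90, 80, 70, 60, 50, 40, 30]
# Same pack, but the bottom school is a far outlier:
far_outlier = [90, 80, 70, 60, 50, 40, -60]

print([round(x) for x in rescale_0_100(close_bottom)])  # → [100, 83, 67, 50, 33, 17, 0]
print([round(x) for x in rescale_0_100(far_outlier)])   # → [100, 93, 87, 80, 73, 67, 0]
```

With the outlier present, the second-worst school lands at 67 rather than 17: in this toy version, a 0-100 scale has effectively become a 67-100 scale for everyone but the bottom school, which creates many more ties.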

Let me offer a brief visualization of, say, the top ~60 schools, this year and last year. This is not an optical illusion. There are about 60 schools on each side of the visualization.

The compression in the scores is dramatic. It is possible that this will spread out in future years if Puerto Rico’s three law schools are removed from the rankings again. But that will not account for all of it.

We already know most of the factors that will go into next year’s ranking (Class of 2020 ultimate bar passage, Class of 2022 first-time bar passage, Class of 2022 employment scores)—at 58%, these three categories are the bulk of the rankings, and the most volatile. The remaining 42% have nearly no volatility from year to year. So we can quite easily see the changes that will come. That will be for another post soon.

This isn’t to deny, of course, that law schools had a variety of motives, express and implied, for the common good and for their self-interest, in refusing to share information with USNWR. But it is to point out that the end has moved in a very different direction than those schools may have anticipated. We should expect compression and volatility in the rankings in the years to come.

Which law schools have the best and worst debt-to-income ratios among recent law school graduates? 2023 update

In late 2020, I last blogged about the “debt-to-income” ratio of recent law school graduates.

The Department of Education offers data with incredible insights into debt and earnings of university graduates. Recent updates are available, and we can look at the data again. Here are the data fields from the Department of Education:

Institution-level data files for 1996-97 through 2020-21 containing aggregate data for each institution. Includes information on institutional characteristics, enrollment, student aid, costs, and student outcomes.

Field of study-level data files for the pooled 2014-15, 2015-16 award years through the pooled 2017-18, 2018-19 award years containing data at the credential level and 4-digit CIP code combination for each institution. Includes information on cumulative debt at graduation and earnings one year after graduation.

One intriguing figure is the “debt-to-income” ratio (some people hated this term, but I’m still using it), or how much student debt recent graduates have compared to their annual earnings. Lower is better. (A slightly better way is to calculate what percentage of your monthly paycheck is required to service your monthly debt payment, or the debt-service-to-monthly-income ratio, but this gives a good idea of the relationship between debt and income.) It’s entirely imperfect, of course—graduates have interest accrued on that debt when they graduate; they may have other debt; and so on. It’s just one way of looking at the data!
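The monthly debt-service version mentioned above can be computed with the standard loan-amortization formula. A quick sketch with hypothetical inputs (a $120,000 balance, 7% annual interest, 10-year repayment, $72,000 salary—none of these are from the actual data):

```python
# Debt-service-to-monthly-income sketch (all inputs hypothetical).
# Standard amortization: payment = P*r / (1 - (1+r)**-n) for monthly rate r.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12           # monthly interest rate
    n = years * 12                 # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

payment = monthly_payment(120_000, 0.07, 10)
ratio = payment / (72_000 / 12)    # share of gross monthly pay going to the loan
print(round(payment, 2), round(ratio, 3))
```

Under these assumptions, roughly 23% of each gross paycheck would go to the loan, which is why ratios near 2.00 in the table below are so punishing.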

I took the raw data file and pulled out all domestic schools that had a concentration in “law” for a “doctoral degree” or “first professional degree.” I then compared the median debt load to the median earnings figures. (Of course, there’s no guarantee these figures are the same person, and there may be other mismatches, like high earners with low debt or low earners with high debt. Again, just one way of looking at the data!)
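As a sketch of that filtering step, here is roughly how it might look against a toy stand-in for the field-of-study file (the field names below are hypothetical stand-ins for illustration; the actual Department of Education file uses its own column names):

```python
# Toy stand-in for the Department of Education field-of-study rows
# (field names here are hypothetical, not the file's actual columns).
rows = [
    {"school": "School A", "field": "Law.", "credential": "Doctoral Degree",
     "median_debt": 100_000, "median_earnings": 80_000},
    {"school": "School A", "field": "Accounting.", "credential": "Master's Degree",
     "median_debt": 40_000, "median_earnings": 60_000},
    {"school": "School B", "field": "Law.", "credential": "First Professional Degree",
     "median_debt": 150_000, "median_earnings": 60_000},
    {"school": "School C", "field": "Law.", "credential": "Doctoral Degree",
     "median_debt": 90_000, "median_earnings": 45_000},
]

# Keep only law concentrations at the doctoral / first-professional level.
LAW_CREDENTIALS = {"Doctoral Degree", "First Professional Degree"}
law = [r for r in rows
       if "Law" in r["field"] and r["credential"] in LAW_CREDENTIALS]

# Debt-to-income ratio, sorted best (lowest) first.
ranked = sorted(
    ((r["school"], round(r["median_debt"] / r["median_earnings"], 2)) for r in law),
    key=lambda pair: pair[1],
)
print(ranked)  # → [('School A', 1.25), ('School C', 2.0), ('School B', 2.5)]
```

The real exercise is the same shape: filter to law programs at the right credential level, divide median debt by median earnings, and sort.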

Not all schools are listed due to some data issues—sometimes the Department of Education fails to disclose certain data for some institutions.

The Department of Education site defines these figures as follows:

Field of Study Median Earnings

The median annual earnings of individuals who received federal financial aid during their studies and completed an award at the indicated field of study. To be included in the median earnings calculation, the individuals needed to be working and not be enrolled in school during the year when earnings are measured. Median earnings are measured in the fourth full year after the student completed their award.

These data are based on school-reported information about students’ program of completion. The U.S. Department of Education cannot fully confirm the completeness of these reported data for this school.

For schools with multiple locations, this information is based on all of their locations.

These calculations are based, in part, on calendar year 2020 earnings which may have been impacted by the pandemic and may not be predictive of earnings values in non-pandemic years.

Field of Study Median Total Debt for Loans Taken Out at This School

The median federal loan debt accumulated at the school by student borrowers of federal loans (William D. Ford Federal Direct Loan Program, the Federal Family Education Loan Program, and Graduate PLUS Loans) who completed an award at the indicated field of study. Non-federal loans, Perkins loans, and federal loans not made to students (e.g., parents borrowing from the federal Parent PLUS loan program) are not included in the calculation. Only loans made at the same academic level as the award conferred are included (e.g., undergraduate loans are not included in the median debt calculation for graduate credential levels). Note that this debt metric only includes loans originated at this school, so this metric should be interpreted as the typical debt level for attending this school alone, not necessarily the typical total debt to obtain a credential for students who transfer from another school. For schools with multiple locations, this information is based on all of their locations.

These data are based on school-reported information about students’ program of completion. The U.S. Department of Education cannot fully confirm the completeness of these reported data for this school.

That means debt loads can of course be higher if undergraduate loans were factored in.

A number of elite schools are near the top—despite their high debt levels, they translate into high median incomes among their graduates. A number of lower-cost schools also fare well near the top.

A good rule of thumb might be that “manageable” debt loads are those where debt is about equal to expected income at graduation—i.e., a ratio of 1.00 or lower. Only 20 schools meet that definition among median debt and earnings, and a few others are close. That said, law graduates tend to have higher earnings and see their salaries rise faster than typical borrowers, so maybe it’s not the best rule of thumb, either.

Many ratios, however, are significantly higher than that. 59 have ratios above 2.00; of those, 13 have ratios above 3.00. Only a couple of schools in the USNWR “top 50” rankings cross the 2.00 ratio.

Many borrowers will be eligible for Public Service Loan Forgiveness programs, either at the federal level or at their own law schools. If schools have disproportionately higher percentages of students entering those programs, their debt levels will appear worse than they actually are, and their salaries will appear on the lower end of the income side. It’s another limitation in thinking about a single-figure metric.

Of course, medians are likely skewed in other ways—the highest-earning graduates likely received the largest scholarships and, accordingly, graduated with the lowest debt.

But, the figures are below. I sort by the lowest (i.e., best) debt-to-income ratio. (Due to size of chart, results may be best viewed on a desktop or on a phone turned sideways.) I noted a few years ago that schools at the bottom of the list (i.e., with the highest ratio) appeared at a much higher risk of facing “adverse situations.”

School Debt-to-Income Ratio Median Debt Median Income
Harvard Univ. 0.54 $93,235 $172,727
Northwestern Univ. 0.78 $154,286 $196,640
George Mason Univ. 0.81 $65,077 $80,019
Cornell Univ. 0.83 $162,160 $195,233
Univ. of California-Berkeley 0.83 $155,891 $186,967
Univ. of Nebraska-Lincoln 0.84 $54,456 $64,977
Univ. of Pennsylvania 0.87 $171,488 $196,219
The Univ. of Alabama 0.88 $61,500 $70,082
Univ. of Michigan-Ann Arbor 0.88 $132,524 $150,448
Duke Univ. 0.91 $158,000 $173,119
Boston Univ. 0.91 $117,740 $128,883
Univ. of Wisconsin-Madison 0.92 $61,500 $67,155
Fordham Univ. 0.93 $147,561 $158,382
Wayne State Univ. 0.93 $61,466 $65,928
Univ. of Virginia 0.94 $178,812 $189,235
Villanova Univ. 0.95 $69,861 $73,474
Univ. of New Hampshire 0.95 $61,500 $64,654
Vanderbilt Univ. 0.97 $139,857 $144,075
Columbia Univ. in the City of New York 0.99 $198,924 $201,681
Georgetown Univ. 0.99 $162,286 $164,429
Stanford Univ. 1.00 $153,302 $153,149
Washington Univ. in St Louis 1.01 $92,540 $91,359
Texas A & M Univ.-College Station 1.02 $71,446 $70,263
Univ. of Southern California 1.02 $138,518 $135,745
Univ. of Illinois Urbana-Champaign 1.03 $77,159 $75,235
Univ. of Kansas 1.03 $61,500 $59,724
Georgia State Univ. 1.03 $72,563 $70,243
Univ. of Chicago 1.04 $188,691 $181,658
Univ. of North Dakota 1.04 $61,500 $58,885
Univ. of Utah 1.06 $74,012 $69,909
The Univ. of Tennessee-Knoxville 1.06 $61,500 $57,949
Univ. of Houston 1.06 $86,372 $81,124
Boston College 1.07 $123,000 $114,959
Univ. of Florida 1.08 $71,483 $66,008
Northeastern Univ. 1.10 $70,571 $63,909
Univ. of Arkansas 1.13 $65,000 $57,557
Temple Univ. 1.14 $81,733 $71,731
Univ. of Nevada-Las Vegas 1.14 $82,985 $72,511
Univ. of California-Davis 1.16 $92,689 $80,209
Univ. of Cincinnati 1.16 $66,694 $57,672
Univ. of Iowa 1.16 $80,268 $69,147
The Pennsylvania State Univ. 1.17 $65,436 $56,119
Univ. of Oklahoma-Norman 1.17 $69,800 $59,521
Florida State Univ. 1.17 $66,707 $56,790
The Univ. of Montana 1.18 $72,126 $61,101
Univ. of Georgia 1.18 $82,694 $69,896
Univ. of North Carolina at Chapel Hill 1.20 $91,570 $76,259
Rutgers Univ.-New Brunswick 1.23 $71,218 $57,880
CUNY Sch. of Law 1.23 $81,666 $66,167
Indiana Univ.-Bloomington 1.24 $92,000 $74,327
Univ. of Wyoming 1.24 $70,488 $56,842
Ohio State Univ. 1.25 $91,529 $73,515
Univ. of Arkansas at Little Rock 1.25 $61,500 $49,115
Univ. of St Thomas 1.26 $74,968 $59,454
Univ. of Mississippi 1.27 $69,701 $55,037
Drake Univ. 1.28 $83,526 $65,460
Drexel Univ. 1.28 $72,191 $56,367
Ohio Northern Univ. 1.29 $61,500 $47,520
Texas Tech Univ. 1.31 $86,163 $65,990
Cleveland State Univ. 1.31 $71,500 $54,679
Brooklyn Law Sch. 1.32 $96,951 $73,383
Duquesne Univ. 1.35 $72,500 $53,684
Univ. of Connecticut 1.36 $96,386 $70,942
Univ. of Missouri-Columbia 1.36 $73,501 $53,923
Quinnipiac Univ. 1.37 $81,000 $59,034
Yeshiva Univ. 1.38 $101,500 $73,371
Northern Illinois Univ. 1.38 $75,688 $54,663
Univ. of Minnesota-Twin Cities 1.40 $98,423 $70,206
Washington and Lee Univ. 1.41 $97,335 $69,076
Louisiana State Univ. 1.41 $88,622 $62,823
Washburn Univ. 1.41 $77,330 $54,793
Univ. of Kentucky 1.43 $75,150 $52,479
Univ. of Akron Main 1.44 $71,000 $49,471
Univ. of Hawaii at Manoa 1.44 $98,536 $68,638
Univ. of Washington-Seattle 1.44 $108,519 $75,253
Univ. of Toledo 1.44 $76,000 $52,619
Illinois Institute of Technology 1.46 $97,727 $67,070
St. John's Univ.-New York 1.47 $112,017 $76,210
Univ. of Notre Dame 1.47 $128,413 $87,091
Saint Louis Univ. 1.48 $99,458 $67,352
William & Mary 1.50 $105,023 $70,191
Univ. of Maine 1.50 $85,950 $57,401
Indiana Univ.-Purdue Univ.-Indianapolis 1.50 $97,806 $65,061
Albany Law Sch. 1.51 $93,800 $62,238
Loyola Univ. Chicago 1.52 $119,367 $78,406
Univ. of California-Irvine 1.54 $133,605 $86,874
Florida International Univ. 1.55 $90,411 $58,150
Arizona State Univ. Immersion 1.56 $100,564 $64,489
Southern Methodist Univ. 1.57 $145,569 $92,581
Univ. of Richmond 1.58 $100,229 $63,433
Case Western Reserve Univ. 1.59 $98,460 $61,746
Univ. of New Mexico 1.59 $91,267 $57,225
Univ. at Buffalo 1.61 $94,242 $58,697
Univ. of Tulsa 1.61 $90,365 $56,260
Univ. of Colorado Boulder 1.61 $105,696 $65,704
Massachusetts Sch. of Law 1.63 $80,384 $49,371
Univ. of Oregon 1.64 $98,655 $60,241
Univ. of California-Hastings College of Law 1.64 $139,352 $84,760
Seton Hall Univ. 1.65 $115,179 $69,650
Mitchell Hamline Sch. of Law 1.66 $101,761 $61,445
Wake Forest Univ. 1.66 $105,023 $63,235
Yale Univ. 1.67 $140,977 $84,669
Univ. of Memphis 1.69 $92,250 $54,715
West Virginia Univ. 1.71 $93,735 $54,919
Michigan State Univ. 1.71 $103,630 $60,480
Regent Univ. 1.72 $85,898 $49,875
Univ. of Pittsburgh-Pittsburgh 1.74 $109,178 $62,907
Suffolk Univ. 1.75 $113,386 $64,945
Univ. of Idaho 1.76 $100,091 $56,904
Univ. of Missouri-Kansas City 1.78 $97,000 $54,597
Argosy Univ. 1.78 $106,114 $59,569
Emory Univ. 1.79 $134,617 $75,208
Univ. of Louisville 1.80 $96,424 $53,541
Syracuse Univ. 1.80 $113,050 $62,765
Univ. of Maryland, Baltimore 1.84 $118,506 $64,417
Univ. of the District of Columbia 1.84 $110,258 $59,909
Univ. of Baltimore 1.85 $106,102 $57,324
Gonzaga Univ. 1.85 $110,687 $59,741
Valparaiso Univ. 1.87 $89,751 $48,100
Univ. of San Diego 1.87 $145,850 $77,990
Capital Univ. 1.87 $106,377 $56,836
Southern Illinois Univ.-Carbondale 1.88 $91,500 $48,797
Belmont Univ. 1.88 $101,152 $53,747
San Joaquin College of Law 1.94 $109,339 $56,377
Samford Univ. 1.95 $108,958 $55,927
New York Law Sch. 1.99 $142,500 $71,646
Loyola Marymount Univ. 2.04 $155,436 $76,120
DePaul Univ. 2.09 $131,463 $62,841
Pepperdine Univ. 2.10 $154,886 $73,898
Santa Clara Univ. 2.10 $180,127 $85,894
Baylor Univ. 2.11 $172,756 $81,912
Univ. of South Carolina-Columbia 2.12 $115,354 $54,513
Mercer Univ. 2.13 $124,216 $58,393
George Washington Univ. 2.14 $176,325 $82,298
The Catholic Univ. of America 2.18 $147,964 $67,970
South Texas College of Law Houston 2.18 $142,976 $65,593
Western New England Univ. 2.19 $97,835 $44,639
Seattle Univ. 2.20 $144,542 $65,675
Univ. of Miami 2.21 $148,750 $67,424
Univ. of the Pacific 2.22 $147,082 $66,300
Elon Univ. 2.24 $119,023 $53,224
Touro Univ. 2.27 $132,011 $58,113
New England Law-Boston 2.27 $126,248 $55,545
Creighton Univ. 2.28 $128,182 $56,322
Univ. of Detroit Mercy 2.30 $122,626 $53,322
Northern Kentucky Univ. 2.31 $101,097 $43,718
Marquette Univ. 2.37 $137,200 $57,795
Lincoln Memorial Univ. 2.39 $108,228 $45,341
Roger Williams Univ. Sch. of Law 2.39 $122,459 $51,200
North Carolina Central Univ. 2.40 $117,597 $49,032
Widener Univ. 2.46 $131,126 $53,209
St. Mary's Univ. 2.47 $145,002 $58,704
American Univ. 2.47 $161,696 $65,460
Lewis & Clark College 2.49 $149,506 $60,132
Hofstra Univ. 2.54 $163,347 $64,417
Univ. of Massachusetts-Dartmouth 2.55 $123,227 $48,378
Campbell Univ. 2.56 $135,880 $53,113
Univ. of Denver 2.56 $161,053 $62,896
Chapman Univ. 2.58 $170,800 $66,272
Howard Univ. 2.58 $185,348 $71,861
Southern Univ. Law Center 2.58 $118,010 $45,662
Florida Agricultural and Mechanical Univ. 2.59 $115,500 $44,537
Stetson Univ. 2.70 $142,533 $52,813
Univ. of Illinois Chicago 2.71 $153,993 $56,822
Vermont Law and Graduate Sch. 2.75 $139,540 $50,783
Mississippi College 2.83 $143,299 $50,576
Willamette Univ. 2.85 $162,945 $57,152
Oklahoma City Univ. 2.90 $145,281 $50,180
Faulkner Univ. 2.91 $137,560 $47,349
Golden Gate Univ. 2.93 $154,813 $52,909
Ave Maria Sch. of Law 2.94 $144,259 $49,074
Nova Southeastern Univ. 2.95 $162,455 $54,987
Florida Coastal Sch. of Law 3.17 $158,836 $50,102
St. Thomas Univ. 3.18 $166,022 $52,281
California Western Sch. of Law 3.19 $179,866 $56,303
Southwestern Law Sch. 3.50 $203,702 $58,279
Barry Univ. 3.54 $154,477 $43,676
Univ. of San Francisco 3.58 $182,582 $50,987
Charleston Sch. of Law 3.66 $152,981 $41,855
Appalachian Sch. of Law 3.79 $123,970 $32,667
Atlanta's John Marshall Law Sch. 3.96 $193,041 $48,790
Inter American Univ. of Puerto Rico 4.00 $110,693 $27,693
Arizona Summit Law Sch. 4.31 $227,656 $52,864
Western Michigan Univ.-Thomas M. Cooley 4.95 $202,668 $40,967
Pontifical Catholic Univ. of Puerto Rico 8.43 $122,712 $14,563

This table has been updated with information from Maine. Some schools, including BYU and Texas, do not have complete data reported in the Department of Education data set and cannot be included here.

How did Big Law survive the "Death of Big Law"?

The second in an occasional series I call “dire predictions.”

In 2010, Professor Larry Ribstein published a piece called The Death of Big Law in the Wisconsin Law Review. Here are a few of the more dire claims Professor Ribstein made:

  • “Big Law’s problems are long-term, and may have been masked until recently by a strong economy, particularly in finance and real estate. The real problem with Big Law is the non-viability of its particular model of delivering legal services.”

  • “When big firms try to expand without the support structure they are prone to failure. Big Law recently has been subject to many market pressures that have exposed its structural weakness. The result, not surprisingly, is that large law firms are shrinking or dying and smaller firms that do not attempt to mimic the form of Big Law are rising in their place.”

  • “These Big Law efforts to stay big are not, however, sustainable. Hiring more associates makes it harder for firms to provide the training and mentoring necessary to back their reputational bond.”

  • “In a nutshell, these firms need outside capital to survive, but lack a business model for the development of firm-specific property that would enable the firms to attract this capital. These basic problems have left Big Law vulnerable to client demands for cheaper and more sophisticated legal products, competition among various providers of legal services, and national and international regulatory competition. The result is likely to be the end of the major role large law firms have played in the delivery of legal services.”

  • “The death of Big Law has significant implications for legal education, the creation of law and the role of lawyers. First, a major shift in market demand for law graduates ultimately will affect the demand for and price of legal education. Big Law’s inverted pyramid, by which law firms can bill out even entry-level associate time at high hourly rates, has created a high demand and escalating pay for top law students. The pressures on Big Law discussed throughout this Article are ending this era with layoffs, deferrals, pay reductions, and merit-based pay.”

The late Professor Ribstein’s piece is just one article among a wave of similar pieces that arose in the 2009-2010 reaction to the financial crisis. But large law firms appear to be thriving and continue to hire new law school graduates as associates at ever-increasing clips. Two charts to consider.

First, the number of law firms with gross total annual revenue exceeding $1 billion has climbed swiftly over the last decade or so. There were just 13 such firms in 2011, but 52 in 2021 (and down to 50 in 2022). True, inflation can account for rising total revenue. But it also reflects large law firms staying large—or becoming larger. (Figures from law.com AmLaw annual reports.)

Second, law student placement in those jobs. For the Class of 2011, nearly 4700 graduates ended up in those positions, just over 10% of the graduating class. Since then, graduating classes have shrunk by several hundred students, which has helped the overall placement rate as a percentage of graduates. But raw placement has nearly doubled in the last decade, too, to over 8500 for the Class of 2022, or nearly 25% of the graduating class.

Of course, one could find ways that “Big Law” is changing, whether that’s through the use of technology, the relationships it has with clients, its profits and salary structure, whatever it may be.

But “Big Law,” despite the dire predictions in the midst of the financial crisis, does not appear anywhere close to dead. To the extent there are large firms aggregating attorneys, with partners sharing significant profits among themselves and hiring a steady stream of associates for large and sophisticated work of large corporate clients, the model does not appear dead, but growing. Perhaps other types of disruption will appear in the future to change this model. But the financial stability of the model appears largely intact.

Overall legal employment for the Class of 2022 improves slightly, with large law firm and public interest placement growing

The aftermath of a pandemic, bar exam challenges, or a softening economy didn’t dampen the employment outcomes for law school graduates in 2022. Outcomes improved a touch. Below are figures for the ABA-disclosed data (excluding Puerto Rico’s three law schools). These are ten-month figures from March 15, 2023 for the Class of 2022.

  Graduates FTLT BPR Placement FTLT JDA
Class of 2012 45,751 25,503 55.7% 4,218
Class of 2013 46,112 25,787 55.9% 4,550
Class of 2014 43,195 25,348 58.7% 4,774
Class of 2015 40,205 23,895 59.4% 4,416
Class of 2016 36,654 22,874 62.4% 3,948
Class of 2017 34,428 23,078 67.0% 3,121
Class of 2018 33,633 23,314 69.3% 3,123
Class of 2019 33,462 24,409 72.9% 2,799
Class of 2020 33,926 24,006 70.8% 2,514
Class of 2021 35,310 26,423 74.8% 3,056
Class of 2022 35,638 27,607 77.5% 2,734

Placement is very good. There was an increase of over 1000 full-time, long-term bar passage-required jobs year-over-year, and the graduating class size was the largest since 2016. It yielded a placement of 77.5%. J.D. advantage jobs decreased somewhat, perhaps consistent with a hot law firm market last year.

It’s remarkable to compare the placement rates from the Class of 2012 to the present, from 56% to 78%. And it’s largely attributable to the decline in class size.
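The arithmetic behind that shift can be checked directly against the table above. A minimal sketch in Python, using the reported Class of 2012 and Class of 2022 figures:

```python
# Placement rate = FTLT bar passage-required jobs / graduates,
# using the reported figures for the Classes of 2012 and 2022.
classes = {
    2012: {"graduates": 45_751, "ftlt_bpr": 25_503},
    2022: {"graduates": 35_638, "ftlt_bpr": 27_607},
}

for year, d in classes.items():
    rate = d["ftlt_bpr"] / d["graduates"] * 100
    print(f"Class of {year}: {rate:.1f}%")  # 55.7% and 77.5%

# Most of the improvement comes from the denominator: graduates
# fell by roughly a fifth, while jobs rose by under a tenth.
grad_change = classes[2022]["graduates"] / classes[2012]["graduates"] - 1
job_change = classes[2022]["ftlt_bpr"] / classes[2012]["ftlt_bpr"] - 1
print(f"Graduates: {grad_change:+.1%}, jobs: {job_change:+.1%}")
```

Decomposing the rate this way makes the point concrete: the numerator grew modestly, but the shrinking class did most of the work.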

Here’s some comparison of the year-over-year categories.

FTLT Class of 2021 Class of 2022 Net Delta
Solo 234 160 -74 -31.6%
2-10 5,205 5,070 -135 -2.6%
11-25 2,004 2,115 111 5.5%
26-50 1,218 1,360 142 11.7%
51-100 1,003 1,175 172 17.1%
101-250 1,143 1,246 103 9.0%
251-500 1,108 1,145 37 3.3%
501+ 5,740 6,137 397 6.9%
Business/Industry 3,070 2,797 -273 -8.9%
Government 3,492 3,591 99 2.8%
Public Interest 2,573 2,875 302 11.7%
Federal Clerk 1,189 1,130 -59 -5.0%
State Clerk 2,094 2,053 -41 -2.0%
Academia/Education 328 375 47 14.3%
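The net and percentage changes in the table follow directly from the two count columns. A quick sketch recomputing a few rows:

```python
# Recompute net change and percentage change for selected categories
# from the Class of 2021 and Class of 2022 FTLT counts in the table.
counts = {
    "Solo": (234, 160),
    "501+": (5_740, 6_137),
    "Public Interest": (2_573, 2_875),
}

for category, (c2021, c2022) in counts.items():
    net = c2022 - c2021
    pct = net / c2021 * 100  # percentage change against the 2021 base
    print(f"{category}: {net:+d} ({pct:+.1f}%)")
```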

The trend continues last year’s uptick in public interest placement, which is not an outlier. Public interest job placement is up over 100% since the Class of 2017, and these eye-popping numbers continue to rise. It is likely no overstatement to say that law students are increasingly oriented toward public interest, and that there are ample funding opportunities in public interest work to sustain these graduates. (I include a visualization of the trend of raw placement into these jobs here.)

Sole practitioners continue to slide significantly (they were in the low 300s not long ago in raw placement).

Additionally, extremely large law firm placement continues to boom. Placement is up by thousands of graduates in the last several years. Placement in firms with at least 101 attorneys is around 8,500. Nearly 25% of all law school graduates landed in a “Big Law” firm, and more than 30% of those employed in a full-time, long-term, bar passage-required job did so.
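Those shares come straight from the tables above: sum the three largest firm-size categories and divide by total graduates and by FTLT bar passage-required placement, respectively.

```python
# "Big Law" here = FTLT placement in firms of 101+ attorneys,
# using the Class of 2022 counts from the tables above.
big_law = 1_246 + 1_145 + 6_137   # the three largest firm-size categories
graduates = 35_638                # Class of 2022 graduates
ftlt_bpr = 27_607                 # FTLT bar passage-required jobs

print(big_law)                         # 8528 -- "around 8,500"
print(f"{big_law / graduates:.1%}")    # share of all graduates (~24%)
print(f"{big_law / ftlt_bpr:.1%}")     # share of FTLT BPR-employed (~31%)
```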

Federal clerkship placement has dropped a bit, perhaps because more judges are hiring those with work experience rather than recent graduates, or perhaps because the pool of potential candidates is shrinking as more judges hire students for multiple clerkships.

Some law schools fundamentally misunderstand the USNWR formula, in part because of USNWR's opaque methodology

Earlier this week, USNWR announced it was indefinitely postponing release of its law school rankings, after delaying their release by a week. It isn’t the first data fiasco to hit USNWR’s law rankings. In 2021, it had four independent problems, two involving disputed methodology and two involving disputed data, that forced retraction and recalculation.

There are likely obvious problems with the data that USNWR collected. For instance, Paul Caron earlier noted the discrepancies in bar passage data as released by the ABA. I noticed similar problems back in January, but (1) I remedied some of them and (2) left the rest as is, assuming, for my purposes, close was good enough. (It was.) The ABA has a spreadsheet of data that it does not update, and individual PDFs for each law school that it does update—that means any discrepancies that are corrected must later be manually supplemented to the spreadsheet. It is a terrible system, exacerbated by the confusing columns the ABA uses to disclose data. But it only affected a small handful of schools, and it is possible USNWR has observed this issue and is correcting it.

A greater mistake, advanced by law school deans, however, relates to employment data. Administrators and deans at Yale, Harvard, and Berkeley, at the very least, have complained very publicly to Reuters and the New York Times that their employment figures are not accurate.

They are incorrect. The complaint reflects a basic misunderstanding of the USNWR data, though it is admittedly exacerbated by how opaque USNWR is in disclosing its metrics.

In 2014, I highlighted how USNWR publicly shares certain data with prospective law students, but then conceals other data that it actually uses in reaching its overall ranking. This is a curious choice: it shares data it does not deem relevant to the rankings, while concealing other data that is relevant to the rankings.

The obvious one is LSAT score. USNWR will display the 25th-75th percentile range of LSAT scores. But it uses the 50th percentile in its ranking. That could be found elsewhere in its publicly-facing data if one looks carefully. And it is certainly available in the ABA disclosures.

Another less obvious one is bar passage data. USNWR will display the first-time pass rate of the school in the modal jurisdiction, and that jurisdiction’s overall pass rate. But it uses the ratio of the school’s first-time pass rate to the overall pass rate, a number it does not show (though simple arithmetic reveals it). And in recent years, it uses the overall rate from all test-takers across all jurisdictions, which it also does not show. Again, this is certainly available in the ABA disclosures.
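That ratio is simple division. A sketch with hypothetical rates (the numbers below are made up for illustration):

```python
# USNWR's bar passage input is the school's first-time pass rate
# divided by the overall pass rate. The rates used here are
# hypothetical, for illustration only.
def bar_ratio(school_first_time_rate: float, overall_rate: float) -> float:
    """Ratio of a school's first-time pass rate to the overall rate."""
    return school_first_time_rate / overall_rate

# e.g., a school passing 85% where the jurisdiction passes 78% overall
# yields a ratio above 1.0, i.e., better than the jurisdiction average.
print(round(bar_ratio(0.85, 0.78), 3))
```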

Now, on to employment data. As my 2014 post shows, USNWR displays an “employed” statistic, for both at-graduation and 9 or 10 months after graduation. But it has never used that statistic in its rankings formula (EDIT: in recent years—in the pre-recession days, it weighed employment outcomes differently). It has, instead, weighed various categories to create its own “employment rank.” That scaled score is used in the formula. And it has never disclosed how it weighs the other categories.

Let’s go back to what USNWR publicly assured law schools earlier this month (before withdrawing this guidance):

The 2023-2024 Best Law Schools methodology includes:

. . .

Full credit for all full-time, long-term fellowships -- includes those that are school funded -- where bar passage is required or where the JD degree is an advantage

Maximum credit for those enrolled in graduate studies in the ABA employment outcomes grid

Note that the methodology will give “full credit” or “maximum credit” for these positions. That is, its rankings formula will give these positions, as promised to law schools based on their complaints, full weight in its methodology.

I had, and have, no expectation that this would change what it publicly shares with prospective law students about who is “employed.” Again, that’s a different category, not used in the rankings. I assume, for instance, USNWR believes its consumers do not consider enrollment in a graduate program as being “employed,” so it does not include them in this publicly-facing metric.

Now, how can law schools know that this publicly-facing metric is not the one used in the rankings methodology, despite what USNWR has said? A couple of ways.

First, as I pointed out back in January, “I assume before they made a decision to boycott, law schools modeled some potential results from the boycott to determine what effect it may have on the rankings.” So law schools can use their modeling, based on USNWR’s own public statements, to determine where they would fall. My modeling very closely matches the now-withdrawn rankings. Indeed, Yale was the single greatest beneficiary of the employment methodology change, as I pointed out back in January. It is very easy to run the modeling with school-funded and graduate positions given “full weight,” or given some discounted weight, and see the difference in results. It is impossible for Yale to be ranked #1 under the old formula—that is, in a world where its many graduates in school-funded or graduate positions did not receive “full weight” in the methodology. Again, very simple, publicly-available information (plus a little effort of reverse-engineering the employment metrics from years past) demonstrates the outcomes.
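That sensitivity check can be sketched simply. The weights and the distribution below are entirely hypothetical (USNWR does not disclose its actual weights); the point is only that a school with many school-funded and graduate-study positions moves sharply when those categories go from discounted to full credit:

```python
# Hypothetical employment-score model. The category weights and the
# school's distribution are invented for illustration; USNWR does not
# disclose its actual weights.
def employment_score(counts: dict, weights: dict, graduates: int) -> float:
    """Weighted share of graduates, given per-category weights in [0, 1]."""
    return sum(weights.get(cat, 0.0) * n for cat, n in counts.items()) / graduates

# A Yale-like distribution: many school-funded and graduate-study positions.
counts = {"ftlt_bpr": 150, "school_funded": 30, "grad_studies": 7, "other": 13}
graduates = 200

full = {"ftlt_bpr": 1.0, "school_funded": 1.0, "grad_studies": 1.0}
discounted = {"ftlt_bpr": 1.0, "school_funded": 0.5, "grad_studies": 0.5}

print(f"{employment_score(counts, full, graduates):.1%}")        # full credit
print(f"{employment_score(counts, discounted, graduates):.1%}")  # discounted
```

Running the same distribution through both weighting schemes shows a gap of several percentage points in the score, which is more than enough to move a school several places in a tightly packed ranking.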

Second, USNWR will privately share with schools subscribing to its service an “employment rank.” This raw “rank” figure is the output of the various weights it gives to employment metrics. It does not reveal how it gets there; but it does reveal where law schools stand.

It takes essentially no effort to see whether the relationship between the “employment” percentage and the “employment rank” diverges or tracks closely. And that’s even accounting for the fact that the “rank” can include subtle weights for many different positions. At schools like Yale, there are very few variables. In 2021, its graduates fell into just 10 categories. And given that a whopping 30 of them were in full-time, long-term, law school funded, bar passage-required positions, and another 7 in graduate programs, any mismatch between “employment” percentage and “employment rank” should be obvious; otherwise, the two should match pretty cleanly.

Third, one can also reverse engineer the “employment rank” to see how USNWR gives various weight to the various ABA categories. This takes some effort, but, again, it is entirely feasible to see how these jobs are given various weights to yield a “rank” that looks like what USNWR privately shares. And again, for schools that run these figures themselves, they can see if USNWR is actually giving full “weight” to certain positions or not.

USNWR’s opaque approach to “employment rank” certainly contributes to law schools misunderstanding the formula. But law schools—particularly elite ones who initiated the boycott and insisted they do not care about the rankings, only now to care very much about them—should spend more effort understanding the methodology before perpetuating these erroneous claims.

February 2023 MBE bar scores fall to all-time record low in test history

After all-time lows in 2020, matched in 2022, the February 2023 administration of the Multistate Bar Exam has hit a new low. The mean score was 131.1, down from 132.6 last year and 134.0 the year before, and off a recent high of 138.6 in 2011. We would expect bar exam passing rates to drop in most jurisdictions.

Given how small the February pool is in relation to the July pool, it's always hard to draw too many conclusions from the February test-taker pool. The February cohort is historically weaker than the July cohort, in part because it includes so many who failed in July and retook in February. The NCBE reports that 72% were repeaters, which also contributes to a weaker pool. That said, there are ominous signs for the near future, according to the NCBE report: “We saw a decrease in performance across all groups of examinees, and the decrease was the greatest (about two scaled score points) for likely first-time test takers.” The NCBE points to learning loss from the COVID pandemic, and it remains to be seen how much changes in law school education affected things. (But, frankly, I anticipated things could have been much worse last year, so there are a number of open questions, to be sure, about specific pedagogical issues, or specific issues relating to student wellness.)

As interest in law schools appears to be waning, law schools will need to ensure that class quality remains strong and that they find adequate interventions to assist at-risk student populations. Likewise, changes to the USNWR methodology may well increase the importance of the bar exam in the very near future.