Law school academic dismissal and conditional scholarship eliminations, 2023

Last year, I highlighted that law schools vary widely in how they handle academic dismissals of first-year law students and in how they reduce or eliminate scholarships. Both, I argued, are negatives for law schools and the kind of information that USNWR could (and perhaps should) incorporate into its rankings. I offered a few ways of comparing schools to one another.

Here’s a visualization of the percentage of first-year law students who were academically dismissed in 2023. These percentages are slightly different from the opaque percentages reported to the ABA. These figures look at enrollment as of October 5, 2022, and the total number of those first-year law students who were academically dismissed over the following year. The figures exclude transfers and those who withdrew for other reasons. I organize the chart by USNWR ranking and only look at the top 100 schools.
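To make the numerator and denominator concrete, here is a minimal sketch of the computation with hypothetical figures (the numbers are my own illustration, not ABA data):

```python
# A minimal sketch of the rate computed in the chart, with hypothetical
# figures. The numerator counts only academic dismissals; transfers and
# other withdrawals are simply not counted as dismissals.
enrolled_oct_2022 = 200   # 1Ls enrolled as of October 5, 2022
dismissed_next_year = 6   # academically dismissed over the following year

dismissal_rate = dismissed_next_year / enrolled_oct_2022
print(f"Academic dismissal rate: {dismissal_rate:.1%}")  # 3.0%
```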

You can see that most schools have zero or negligible academic attrition, and that it picks up slightly as the chart moves down. But a few schools have somewhat higher academic attrition, 5% or higher.

Now over to scholarship reductions or eliminations. The ABA does not distinguish between the two, or by the amount; any reduction or elimination is included. The percentage here is also slightly different from the ABA data—it is the percentage of the overall class, not the percentage among scholarship recipients. That is, even if you did not receive a scholarship, you are included in the denominator; this chart includes all 1Ls at each school.
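A quick hypothetical (my own numbers) shows how much the denominator choice matters:

```python
# The same 12 reductions/eliminations produce very different percentages
# depending on the denominator. Counts are hypothetical.
class_size = 200             # all 1Ls (the denominator used in this chart)
scholarship_recipients = 80  # conditional scholarship holders (ABA-style denominator)
reduced_or_eliminated = 12   # scholarships reduced or eliminated

print(f"Among recipients: {reduced_or_eliminated / scholarship_recipients:.1%}")  # 15.0%
print(f"Among all 1Ls:    {reduced_or_eliminated / class_size:.1%}")              # 6.0%
```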

There are far fewer schools that reduce or eliminate scholarships, because the vast majority of schools simply do not have “conditional” scholarship policies. But, again, as one moves down the chart, one can see more reductions or eliminations, with a handful eliminating or reducing scholarships for 10% or more of the class. And there is some overlap between the academic attrition rates and the conditional scholarship data.

Annual Statement, 2023

Site disclosures

Total operating cost: $192

Total content acquisition costs: $0

Total site visits: 81,587 (+38% over 2022)

Total unique visitors: 81,357 (+57% over 2022)

Total pageviews: 120,609 (+70% over 2022)

Top referrers:
Twitter (9189)
Reddit (3558)
LinkedIn (3548)
Leiter’s Law School Reports (3051)
TaxProf Blog (2830)
Buzzfeed (2001)
Instapundit (1080)
Reuters (562)

Most popular content (by pageviews):
Ranking the most liberal and conservative law firms among the top 140, 2021 edition (November 8, 2021) (23,677)
Modeling and projecting USNWR law school rankings under new methodologies (January 18, 2023) (8851)
Projecting the 2024-2025 USNWR law school rankings (to be released March 2024 or so) (May 15, 2023) (8124)
What does it mean to “render unto Caesar”? (May 3, 2020) (5057)
California’s “baby bar” is not harder than the main bar exam (May 28, 2021) (4314)
Who's likely to benefit from the new USNWR law school rankings formula? (January 2, 2023) (3161)

I have omitted "most popular search results" (99% of search results not disclosed by search engine, very few common searches in 2023).

Sponsored content: none

Revenue generated: none

Disclosure statement

Platform: Squarespace

Privacy disclosures

External trackers: one (Google Analytics)

Individuals with internal access to site at any time in 2023: one (Derek Muller)

*Over the course of the year, various spam bots began to visit the site at a high rate. As I identified them, I added them to a referral exclusion list, but their initial visits are not disaggregated from the overall totals. These sites are also excluded from the top referrers list. Additionally, all visits from my own computers are excluded.

**Google switched its analytics platform midyear. It is possible that some categories are overinclusive and others are underinclusive due to overlapping or mismatched data.

Hard questions about experiential learning and legal education

The American Bar Association created an Experiential Credits Working Group out of its Standards Committee, which has suggested three potential proposals, including increasing the number of “experiential credits” in legal education from 6 to 9 or 15. There are some major, and difficult, questions to address.

A decade ago, the ABA began requiring all law schools to include six “experiential” units in their curriculum as a condition of JD graduation. The new proposal would expand, perhaps significantly, that requirement.

Let me open with this, from the beginning of the new proposal:

We have assumed that the value of experiential education in a professional program has been established through the literature on adult pedagogy and professional education pointing to the activity of using doctrine and skills in context in combination with the exercise of critical perspectives, values, and habits as necessary for professional formation.

Respectfully, this is something of a string of thoughts, opening with an “assumption.” Yet the assumption, it says, has been “established.” But established how?

One might say, the ABA should examine whether, and how, the six-unit requirement has advanced the ends it was originally designed to achieve. From what I’ve seen, it has not achieved what it was designed to do. Professor Robert Kuehn, for instance, has chronicled how the implementation has been a “whimper,” with good evidence—there have been few substantive changes to curriculum; much occurred by “restructuring” how courses are labeled, something of an accounting issue; state bars have lamented the lack of preparedness of law school graduates; and so on.

Now, Professor Kuehn concludes, “The ABA should heed these calls for reform and revisit the proposals for fifteen-credits of experiential coursework and a mandatory, live-client clinical experience for all J.D. students.” I appreciate his thoughtful perspective (as he was quite involved in these matters a decade ago). But it’s unclear to me why, if the partial implementation has been unsuccessful, a larger implementation would be successful. Now, perhaps part of this is an acknowledgment that schools must be pressed to do something beyond “accounting” or “restructuring” credits, to something quite dramatic. But it remains a surprise to me that this ABA proposal offers no assessment of what has or has not worked in the six-unit model. Instead of evaluating the existing program, it simply assumes that more is better—and, in fact, this assumption does not identify what quantity is valuable, only some sense that more is always better in this context.

So, the “value of experiential education in a professional program has been established.” Established how? By the “literature on adult pedagogy and professional education.” This is surely right. Now, that said, there are many things that are valuable in professional programs—behavioral theories of a learning environment, or cognitive theories about internal motivation for learners, are also valuable. And this literature, the assumption notes, is “pointing to the activity of using [a] doctrine and skills [b] in context [c] in combination with the exercise of [d] critical perspectives, values, and habits as necessary for professional formation.” I add the subdivisions here. But I think [b] is doing most of the work. I think a lot of doctrine and skill can arise in combination with critical perspectives, values, and habits necessary for professional formation. And, again, the proposal offers no evaluation of the tradeoffs between this and other ways of professional formation.

To recap, then, I have some skepticism about (1) the efficacy of these programs, particularly in light of the change a decade ago and the absence of any evaluation of its success or of changes in law schools; and (2) a proposal that deems something “valuable” without much if any evaluation of tradeoffs.

To be fair, however, (2) does get some attention later:

However, this support for increasing the number of credits was coupled with concerns about [1a] challenges for part-time and/or evening students to meet an increased number of credits, [2] changes to the curriculum, and [3] financial and [1b] logistical ramifications. The roundtable unearthed concerns about the impact increasing the number of experiential learning credits might have on [4] bar exam performance and [2] the reduction of room for elective courses in a student’s schedule, among other things.

I also subdivided this section, and I think there are two [2]s, and two parts to [1]. I want to focus on a couple of these.

As to [2], I think the concern about changes to the curriculum and the reduction of elective courses does get at the tradeoffs.

[3] is a recognition that requiring more experiential courses could come at an added financial cost.

[4] is a recognition that there may be a relationship between course offerings and the bar exam, and some risk that shifting this balance may adversely affect bar exam performance. Professor Kuehn has an important paper on this topic, suggesting the answer is no—that experiential coursework does not hurt bar passage. At the macro level, at the two schools studied under existing standards, that may well be correct. It would be interesting to evaluate how this plays out on a broader scale.

These are, I think, pretty important concerns. And to be fair, some of these questions of tradeoffs and requests for information are a part of a new ABA request for information from schools and stakeholders, per Paul Caron. (See, e.g., “The impact that increasing the number of required experiential credits will have on the ability of students to take elective courses”; “Costs associated with an increase in experiential credits”; and so on.)

But note that none of the questions ask about the existing practices or the adoption of the existing decade-old requirement. Indeed, some of the questions ask for even broader and more dramatic changes. (See, e.g., “Whether the Council should consider requiring a full experiential semester (offered as a single semester) of all law students”; “Whether other types of experiential learning, for example judicial clerkships, should be included within the Standard’s definition of an experiential course,” a change to the status quo.)

It’s disappointing to see such a proposal of potentially radical changes to legal education with no effort to examine the more incremental measure adopted a decade ago that appears, at least implicitly, to have failed to achieve what proponents desired.

One more point, and that’s about heterogeneity. This point comes in two subpoints.

First, have any schools already adopted measures like the ones the ABA is proposing? If they have, have those measures succeeded in doing what they set out to accomplish? And if they have not been adopted, why are these ideas, which as the original memo points out rest on “assumed” values, being proposed? There are thorny questions to go through, to be sure. If changes aren’t adopted because of stagnant law faculty, that’s one thing. If they weren’t adopted because of systemic weaknesses in the cost-benefit analysis, that would be worth knowing, too. And if they have been adopted, wouldn’t it be beneficial to know how those experiments have actually played out?

Second, more on experiments. The ABA’s own Task Force on the Future of Legal Education a decade ago posited, “We think legal education would be improved if there were more room for trying different models. . . . The Task Force recommends that participants in the legal education system, but particularly law schools, universities, the Section of Legal Education, the Association of American Law Schools, and state bar admission authorities, pursue or facilitate this increased diversification of law schools as they each develop plans and initiatives to address the current challenges in legal education.” A decade ago, I pointed out the irony of proposals like this juxtaposed with the ABA’s additional new requirements.

Frankly, additional mandates on legal education continue to stifle innovation and heterogeneity among law schools. And this comes at a time when there’s greater skepticism from the ABA about homogeneity in, say, law school admissions testing requirements.

But while the ABA seems more open to heterogeneity in admissions, it seems to want homogeneity on many other things. Over the last couple of years, the ABA’s changes have also imposed significant new uniform requirements on law schools, including:

  1. Providing “substantial opportunities to students for . . . the development of a professional identity,” and that “students should have frequent opportunities for such development during each year of law school and in a variety of courses and co-curricular and professional development activities.”

  2. Providing “education to law students on bias, cross-cultural competency, and racism: (1) at the start of the program of legal education, and (2) at least once again before graduation.”

  3. Providing “resources related to financial aid and student loan debt and the availability of individual student loan counseling at the law school, the university of which it is a part, or from third party sources.”

  4. Providing “information on law student well-being resources,” “informing law students and providing guidance regarding relevant information and services, including assistance on where the information and services can be found or accessed.”

Several things aren’t clear to me. I don’t know how many schools do or don’t do these things. (I’m fairly sure nearly every law school has education on bias, cross-cultural competency, and racism at orientation, and that nearly every law school has information about well-being resources.) It’s not clear what the baseline looks like (e.g., how students currently respond to professional formation), or where it ought to lead (e.g., whether student loan outcomes are “better” in the future with the availability of that information). And each new requirement entails some allocation of law school resources (passed along to students in the form of tuition increases), sometimes for new programming, and sometimes simply for ensuring compliance by developing a record for ABA accreditation purposes.

Undoubtedly, there’s been plenty of praise for the ABA, here and before, for “doing something” in response to actual and perceived problems. But whether it yields any benefits in the future seems impossible to measure.

This proposal to change “experiential” learning, I think, is along the same lines. There’s no baseline comparison or evaluation. There’s no effort to figure out what schools are doing and what works. There’s no articulation of what a successful implementation of the proposal would look like. Instead, it’s some value-laden assumptions and some mandates—but this one (unlike the four more recent requirements above) has a direct consequence for the curriculum as a whole.

I look forward to the exchange of information among law schools in the months ahead as the ABA attempts to determine how it ought to proceed. This is not to say whether these proposals are good or bad. It’s to ask whether there’s enough to suggest they should be imposed as a requirement on all law schools in the United States, without exception. But I am doubtful that the right questions are being asked to gather the information we need to evaluate them without largely falling back on our priors.

NYU, Cornell, and the new USNWR law school rankings landscape

After I project next year’s USNWR law school rankings, as I did last May and again here in December, there’s always a lot of chatter about the changes, about schools moving up and down. But the more notable thing is why the schools have changed spots, and there’s not a lot of explanation built into a single ranking metric. And some schools attract more attention than others. I’ve received more questions about NYU (projected to be around 11) and Cornell (projected to be around 18) than about most other schools, as both are significantly lower than their typical ranking. Why?

It has much less to do with any changes at those institutions, and much more to do with the rankings methodology changes.

And one way of thinking about the change—and specifically the change that adversely affected NYU and Cornell the most—is a change from quality to quantity.

This is a crude approximation, and it’s also likely to be a bit controversial to frame it this way. But bear with me as I set it up.

As I noted last December, it was not clear to me what the “endgame” was for law schools “boycotting” the rankings. It certainly pressured USNWR to change its methodology, as it could now rely only on publicly-available data. And USNWR did change, not only in moving to publicly-available data but also in the weights of existing factors. That included a shift away from “inputs” (e.g., admissions statistics) and toward “outputs” (e.g., employment and bar passage). At a high level, that seems pretty ordinary.

But the more subtle shift is what I posit here, a shift from quality to quantity. Here’s what I mean.

Employment outcomes were, of course, always a part of the rankings, and a pretty big factor (18%). But 4% of the weight went to at-graduation jobs. Those were typically the most “elite” (or high quality) jobs—large law firms and federal clerkships, among others. That’s not publicly available data, so it dropped out. And the 10-month employment metric rose from 14% of the rankings to 33%. That metric treats a job as a job—all jobs count the same, as long as they are full-time, long-term, bar passage required or J.D. advantage (along with those pursuing an advanced degree). As you can see from my earlier blog post, how schools get to the best employment outcomes can vary dramatically.

NYU and Cornell place an overwhelming number of their graduates into “elite” law firm outcomes, so they suffered when the “at graduation” metric dropped off. And USNWR does not weigh the quality (actual or perceived) of jobs differently—a job is a job. NYU and Cornell had high reputational scores, likely in part because of their consistent elite placement into legal jobs. But those metrics dropped from 40% to 25% of the rankings, too. “Quality” metrics, if you will, began to drop off. Instead, quantity metrics increased—including raw employment placement 10 months after graduation. It’s simply a question of putting graduates through the bar exam and into a job. Maybe that sounds crass, but that’s the metric measured. And maybe it’s a good thing to measure these raw outputs—employed graduates are better than unemployed graduates.

So let’s see what happened in these employment metrics.

Recently I looked at the schools I estimated to be in the “top ten” of USNWR’s employment metrics. These are estimates, because USNWR does not release how it weights each category of employment. But three schools stand out among these top ten, as they are schools that typically do well in the rankings but are not among the top ten of the overall USNWR rankings: SMU, Texas A&M, and Washington University in St. Louis. Let’s compare their employment outcomes with NYU’s and Cornell’s.

"Full weight" All other employment Unemployed Approx. emp. rank
SMU 97.8% 1.1% 1.1% top 10
Texas A&M 99.4% 0.0% 0.6% top 10
Washington Univ. 97.8% 2.2% 0.0% top 10
NYU 97.5% 0.4% 2.1% 20
Cornell 94.1% 2.0% 4.0% 40

You can see that “full weight” jobs are lower (slightly for NYU, more so for Cornell) and unemployment is higher than at the three “top ten” schools I list here.

But the increased compression of the metrics leads to increased volatility—including dropping schools for even marginal differences in employment outcomes. NYU has around 2.1% (10 out of 473) of its graduates unemployed. That sinks its employment ranking to around 20th. Cornell has about 4% (8 of 202) of its graduates unemployed (or unknown). That puts its employment ranking around 40th. There’s tremendous compression at the top of these rankings. And given that employment is worth a whopping 33% of the overall rankings, marginal differences matter.
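The underlying arithmetic is simple, which is part of the point (a sketch using the counts above):

```python
# The arithmetic behind the volatility, using the counts from this post.
schools = {"NYU": (10, 473), "Cornell": (8, 202)}  # (unemployed, total grads)
for name, (unemployed, grads) in schools.items():
    print(f"{name}: {unemployed / grads:.1%} unemployed")
# NYU: 2.1%; Cornell: 4.0%. A handful of graduates separates an employment
# rank of roughly 20th from roughly 40th under the compressed metric.
```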

Let’s now compare the types of jobs. I’ll pull out three types: employment at firms with 501 or more attorneys; employment at firms with 101 to 500 attorneys; and federal judicial clerkships. I sum those three categories for a total, separate out all other outcomes, and list the unemployed figure. Finally, I include the raw total number of graduates at the end.

School 501+ firm 101-500 firm Fed Clerks Total All other Unemployed Total Grads
SMU 22.2% 9.3% 4.8% 36.3% 62.6% 1.1% 270
Texas A&M 7.6% 4.1% 4.7% 16.3% 83.1% 0.6% 172
Washington Univ. 31.7% 12.3% 8.4% 52.4% 47.6% 0.0% 227
NYU 54.5% 12.3% 6.3% 73.2% 24.7% 2.1% 473
Cornell 73.8% 7.4% 3.5% 84.7% 11.4% 4.0% 202

“Quality” is a controversial measure, to be sure. Students, of course, can have high quality employment outcomes in state court clerkships, public interest jobs, and government. The three categories above, however, are probably among the most competitive, if not the most sought after, categories of legal employment, offering students the highest salaries and the most options at the end. Again, it’s a controversial measure, to be sure, and not one size fits all. But it’s worth the point of comparison.

For NYU and Cornell, the at-graduation metric and the elevated reputational scores likely helped account for some of this elite job placement. At Cornell, 84.7% land in these elite jobs, an astonishingly high percentage. At NYU, it’s 73.2%. The other three schools I list here aren’t particularly close. And for NYU, it’s all the more impressive that it graduated 473 students, nearly double or more than double the graduating classes of the other schools here. It’s an extraordinary effort to secure that many high quality jobs for its graduates.

But note that, for USNWR purposes, those placement rates aren’t captured. It’s just jobs. It’s the quantity of placement. Getting students out of that “unemployed” bucket is basically the way to rise in the rankings these days.

I don’t mean to pick on any particular schools—they’re all strong schools in their own ways, and they offer some contrasts with one another and points of comparisons in the rankings. And it also doesn’t mean that students aren’t graduating into meaningful and successful careers. It’s simply a way of explaining why schools like NYU and Cornell are sliding in the new metrics, and others are finding success.

One more detail. Bar passage has become a major figure, too—up from 3% of the rankings to a whopping 18%. But NYU’s first-time bar passage rate ranked 32d, and Cornell’s 57th, among all schools.

These schools did not exactly have a bad bar passage rate. NYU had a first-time pass rate of 94.9%, nearly 14 points above the average of passers in jurisdictions where its graduates took the bar. Cornell was at 90.3%, nearly 10 points above. But those numbers pale in comparison to schools like North Carolina (20 points above), Harvard (20 points), UCLA (19), Chicago (19), and Berkeley (19).

The raw outputs aren’t that different. North Carolina, for instance, had a 93.75% first-time pass rate, even lower than NYU’s. But North Carolina’s statewide pass rates were quite low compared to New York’s, so UNC outperformed by 20 points, a bigger delta than NYU’s. This delta of outperformance is a good way of accounting for bar difficulty. But it sets up schools like NYU and Cornell for worse outputs, because the state bar is easier and the competition in the state is high quality. The first-time pass rate in North Carolina was 72%; in New York, it was 83%. (I noted this several years ago with the decision in California to lower the cut score of the bar exam—it has the incidental effect of reducing the apparent quality of bar passage stats at schools like Stanford and UCLA.) And maybe it’s a good reason for including ultimate bar passage as a separate metric—as I wrote earlier, “So maybe there’s some value in offsetting some of the distortions for some schools that have good bar passage metrics but are in more competitive states. If that’s the case, however, I’d think that absolute first-time passage, rather than cumulative passage, would be the better metric.” But it’s literally impossible for NYU or Cornell to overperform the New York bar by more than 17 points.
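A sketch of the outperformance arithmetic, using the statewide rates above (a simplification on my part: the real metric averages across every jurisdiction where a school’s graduates tested, weighted accordingly):

```python
# "Delta" = school first-time pass rate minus the pass rate of the
# jurisdiction(s) where graduates tested. Single-state figures for simplicity.
def bar_delta(school_rate: float, jurisdiction_rate: float) -> float:
    return school_rate - jurisdiction_rate

print(bar_delta(93.75, 72.0))  # UNC vs. North Carolina: ~21.75 points
print(bar_delta(94.9, 83.0))   # NYU vs. New York: ~11.9 points

# The ceiling on the delta is 100 minus the state rate; in New York,
# 100 - 83 = 17, so no school there can outperform by more than 17 points.
```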

In short, if you have a question about why a school moved up or down in the rankings, it can usually be distilled into “employment and bar passage.” And if methodological changes are coming, the most likely targets will be these areas, where compression and volatility can lead to surprising changes year over year.

Updating and projecting the 2024-2025 USNWR law school rankings (to be released March 2024 or so)

Last May, I projected the USNWR law school rankings based on the publicly-available employment and bar passage data. New ABA data fills out most of the rest of the rankings inputs. I thought I’d update and see what changed. In short, not much. That’s not surprising: as I mentioned in May, this data is not only weighted less in the rankings but also less subject to change. Most of the movement essentially came from rounding changes that pushed schools up or down a tied spot.

Some ABA data has errors, which I tried to fix as best I could. I also have to approximate certain measures (e.g., which GRE percentiles USNWR uses), in addition to estimates of the employment rankings, but these are about as robust as one can get. Of course, the peer and lawyer-judge scores will not be available until the spring, so I use last year’s scores. These are the stickiest of all and the least likely to change—but, given changes in survey response rates, it’s possible there’s slightly more volatility in these metrics.

As usual, I only list the top 100. Schools are sorted by rank, and then by estimated score within the rank (e.g., if six schools are tied at 50, the school at the top of the list has the highest score and is most likely to move up, and the school at the bottom of the list has the lowest score and is most likely to move down).
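In code, the sort order looks something like this (the scores are hypothetical, for illustration only):

```python
# Sort by projected rank ascending, then by estimated score descending
# within a tie. Scores here are hypothetical.
schools = [("School A", 50, 61.2), ("School B", 48, 64.0), ("School C", 50, 62.8)]
for name, rank, score in sorted(schools, key=lambda s: (s[1], -s[2])):
    print(rank, name, score)
# Within the tie at 50, School C (higher score) lists above School A and is
# the more likely of the two to move up.
```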

School December 2023 projected rank May 2023 projected rank Current rank
Stanford 1 1 1
Yale 2 2 1
Chicago 3 3 3
Harvard 4 4 5
Virginia 4 4 8
Penn 6 6 4
Duke 6 6 5
Michigan 6 8 10
Columbia 9 8 8
Northwestern 9 10 10
Berkeley 11 10 10
NYU 11 10 5
UCLA 13 13 14
Washington Univ. 14 14 20
Georgetown 14 14 15
North Carolina 16 16 22
Texas 16 16 16
Cornell 18 18 13
Minnesota 19 19 16
Notre Dame 19 19 27
Vanderbilt 19 19 16
USC 22 22 16
Georgia 22 23 20
Boston Univ. 24 24 27
Wake Forest 24 24 22
Texas A&M 24 27 29
Florida 27 24 22
Utah 28 28 32
Boston College 28 31 29
William & Mary 28 28 45
Alabama 28 28 35
Washington & Lee 32 31 40
Ohio State 32 31 22
Iowa 34 34 35
George Mason 34 35 32
Indiana-Bloomington 36 35 45
Fordham 36 35 29
Florida State 36 35 56
Colorado 39 39 56
Arizona State 39 39 32
BYU 39 39 22
SMU 39 39 45
Baylor 39 39 49
George Washington 44 39 35
Irvine 44 45 35
Illinois 46 46 43
Connecticut 46 46 71
Davis 48 46 60
Wisconsin 48 50 40
Emory 48 46 35
Washington 48 50 49
Tennessee 48 52 51
Villanova 53 52 43
Penn State-Dickinson 53 52 89
Kansas 55 52 40
Temple 55 52 54
Pepperdine 55 57 45
Missouri 55 60 71
San Diego 55 57 78
UNLV 60 60 89
Penn State Law 60 57 80
Oklahoma 60 60 51
Wayne State 60 65 56
Cardozo 60 60 69
Kentucky 60 60 60
Loyola-Los Angeles 66 65 60
Arizona 66 68 54
Northeastern 68 65 71
Maryland 68 68 51
Richmond 68 68 60
Seton Hall 68 72 56
Cincinnati 68 72 84
South Carolina 73 77 60
Drexel 73 68 80
Nebraska 73 72 89
Georgia State 76 77 69
St. John's 76 72 60
Tulane 76 72 71
Florida International 76 77 60
Houston 76 77 60
Loyola-Chicago 76 77 84
UC Law-SF 82 82 60
Catholic 82 85 122
Drake 82 82 88
Maine 85 82 146
LSU 85 85 99
Pitt 85 85 89
Marquette 85 85 71
Belmont 89 90 105
Denver 89 90 80
New Hampshire 89 85 105
Lewis & Clark 89 90 84
New Mexico 89 93 96
Oregon 94 93 78
Texas Tech 94 97 71
UMKC 96 93 106
Case Western 96 97 80
Rutgers 96 nr 109
Dayton 96 97 111
Samford 100 nr 131
Regent 100 93 125
Duquesne 100 nr 89
Indiana-Indianapolis 100 nr 99
Miami 100 nr 71
Cleveland State 100 nr 111
West Virginia 100 nr 111

In the near future, I hope to model a few alternative rankings based on potential changes to the USNWR methodology that may be coming.

Law school 1L JD enrollment hits six-year low, non-JD enrollment trends down

The 2023 law school enrollment figures have been released. They show a drop in JD enrollment and a drop in non-JD enrollment. About 16% of law school enrollees are not enrolled in a JD program.

For nine of the last 10 years, 1L JD enrollment has been between 37,000 and 38,500, a remarkable consistency. In 2021, it hit a recent high of 42,718, but it trended down last year, and again this year, down to 37,886, the lowest since 2017’s 37,398.

Total JD enrollment sits at 116,851, well off the 2010-2011 peak of 147,525.

Non-JD enrollment has been more fickle in recent years. It ballooned to more than 24,000 students last year, good for more than 17% of all law school enrollees, but settled down to 21,966, 15.8% of all law school enrollees, and still consistent with recent highs.
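The 15.8% figure follows directly from the JD and non-JD totals reported above:

```python
jd_total = 116_851     # total JD enrollment
non_jd_total = 21_966  # total non-JD enrollment
share = non_jd_total / (jd_total + non_jd_total)
print(f"Non-JD share of all law school enrollees: {share:.1%}")  # 15.8%
```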

Ten schools have at least 40% of their total overall law school enrollment made up of non-JD students in 2023.

Perhaps the most valuable legal education job in the new USNWR rankings landscape? Career development

After the USNWR law school rankings shakeup earlier this year, I pointed out that spending money on law professors would have less influence than in years past. So, where might there be incentives to spend more money?

Undoubtedly, career services and career development offices.

Now, I haven’t followed this closely, so I cannot know even anecdotally whether this is the case. But it would be worth considering whether more career development personnel are being hired at schools (to improve the counselor:student ratio), whether different strategies are being employed (e.g., the relationship between “placement” and “development,” targeting particular types of jobs for graduates, reconsidering categories of jobs for reporting purposes, reexamining school-funded positions, etc.), or whether those personnel are being paid more to retain successful career counselors.

But as the methodology has changed, tiny changes in employment outcomes can yield dramatically different law school rankings. Employment outcomes overwhelm every other category. Indeed, admissions is less important, and outcomes like employment are dramatically more important, so much so that one might rethink admissions in light of employment more than median LSAT and UGPA scores (with lots of promise and lots of peril).

So let’s take a look at what to expect in the next USNWR law school rankings as it relates to employment outcomes.

Here are the ten schools (in alphabetical order) I project to be in or near the top 10 in employment outcomes. I show three categories of jobs: “full weight” jobs, all other jobs, and unemployed/unknown. (As an aside, some law schools advertise their “full weight” employment of graduates, which is a meaningless term in the real world; it refers exclusively to categories that USNWR gives “full weight” in its rankings methodology.)

This is what USNWR sees: full weight jobs, a variety of categories of jobs it gives lesser weight to, and unemployed. Among the top ten, you’ll see the profiles look very similar. “Full weight” jobs range from 97.8% (SMU) to 99.4% (Texas A&M). Unemployed ranges from 0% (Washington University in St. Louis) to 1.1% (Northwestern). These are highly efficient outputs for law schools.

But let’s look under the hood. Not all of these law schools get to what USNWR sees the same way. To start, USNWR includes five categories in its “full weight” jobs: full-time, long-term bar passage required (BPR) jobs; BPR jobs funded by law schools; full-time, long-term JD advantage (JDA) jobs; JDA jobs funded by law schools; and students pursuing an advanced degree. Schools get there in different ways.
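As a sketch, the “full weight” number is just the sum of those five categories over total graduates (the counts below are hypothetical, and the key names are my own shorthand for the ABA classifications):

```python
# "Full weight" share: the five USNWR full-weight categories over all grads.
# Counts are hypothetical.
outcomes = {
    "ft_lt_bpr": 150,        # full-time, long-term bar passage required
    "school_funded_bpr": 5,
    "ft_lt_jda": 20,         # full-time, long-term JD advantage
    "school_funded_jda": 2,
    "advanced_degree": 3,
    "everything_else": 20,   # lesser-weight jobs and unemployed
}
total_grads = sum(outcomes.values())
full_weight = total_grads - outcomes["everything_else"]
print(f"Full weight: {full_weight / total_grads:.1%}")  # 90.0%
```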

Yale, for instance, has 6% of its grads in school-funded bar passage-required jobs, and another 6% in school-funded JD advantage jobs. The rest of the schools put between 0% and 3% of grads in school-funded bar passage-required jobs; school-funded JD advantage jobs are negligible at these other schools. Likewise, JD advantage jobs vary dramatically, from nearly zero (Virginia) to 11% (Texas A&M). Career development offices take different paths to get to “full weight” employment. (Relatively few graduates anywhere were pursuing an advanced degree.)

One more cut. The ABA data reveals rich classifications of jobs across several employment categories. I created six cohorts of jobs. The first is “biglaw” jobs, those at firms with 101 or more attorneys. Then “federal clerks.” Next, “midlaw” jobs, at firms with 26 to 100 attorneys. Then “state clerks.” Next, “small law,” sole practitioners or those at firms with 25 or fewer attorneys. Finally, “public interest” jobs. All other job categories (regardless of duration or funding) went into a final bucket. Again, we can see that schools get to “full weight” in different ways.

Not just different ways, but pretty dramatically different ways. Placement into “biglaw” ranged from 12% (Texas A&M) to 71% (Northwestern). Federal clerkship placement ranged from 4% (Columbia) to 24% (Yale). “Midlaw” was a significant category for Texas A&M (12%), SMU (9%), and Washington University (8%). State clerkships were most significant at Duke (5%). “Small law” was a major category for Texas A&M (33%) and SMU (30%). Yale dominates public interest placement here (20%). Jobs that don’t fit any of these six cohorts (e.g., business, government, education, etc.) were significant at Texas A&M (30%), Washington University (22%), SMU (19%), and Yale (15%).

In short, to get to the “top ten” of “full weight” jobs, schools have taken wildly divergent approaches. Career development offices have significantly different strategies depending on the school, the region, the student body, or however one wants to think about it.

This isn’t to say that some categories of jobs are better or worse than others, although I’m sure readers have their own thoughts. But it is to say that the USNWR rankings do not distinguish among them. And if they do not, the route to get there can be flexible and varied. This is just one snapshot of how varied the outcomes can be that reach the same USNWR end.

California has lost 4 ABA-accredited law schools in the last decade

Golden Gate has announced a closure plan for its law school program. Karen Sloan at Reuters highlights some of the trends of recent closures, a trickle we’ve seen over the years. Golden Gate was long at risk.

But California is a stark trend. Just a decade ago, in 2013, the state had 21 ABA-accredited law schools. That number has now dropped to 17. Whittier closed, and, at the time, I suggested it was due to problems unique to California. Thomas Jefferson and La Verne opted to give up their ABA accreditation and be accredited only by the state of California. California lowered the cut score for its bar exam in 2020, but that appears not to have been enough to save Golden Gate.

These four schools graduated 817 JD students in 2013. That was nearly 16% of the 5184 graduates of California’s ABA-accredited law schools that year. The closure of these schools is a major change in the legal education landscape in California.

And while the other law schools graduated 4367 students in 2013, they graduated just 3765 last year, which means they’re not exactly capturing many of the students who would otherwise have attended the now-closed schools.

It’s been a big decade for the shape of the legal education market in California, and how it plays out in the decades to come remains to be seen.

Law schools have shed 7% of their full time faculty in the last five years

The ABA disclosures reveal trends over time. And they reveal that law schools have shed about 7% of their full-time faculty in the last five years, from 2017 to 2022—around 700 people (from 10,026 to 9,342). They may be partially replacing them with part-time faculty, who have increased by around 300 in that time (from 16,783 to 17,081). Or law schools may be “right sizing” as the long tail of the recession and the decline of interest in legal education play out. Or it could be that law schools are facing challenges staffing faculties. Or maybe other things altogether. (Full-time faculty include tenured, tenure-track, and any other instructional faculty status, as long as they have full-time employment at the law school.)
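The headline figures reduce to simple percent changes on the totals above:

```python
ft_2017, ft_2022 = 10_026, 9_342   # full-time faculty
pt_2017, pt_2022 = 16_783, 17_081  # part-time faculty
print(f"Full-time: {(ft_2022 - ft_2017) / ft_2017:+.1%}")  # -6.8%, ~700 fewer
print(f"Part-time: {(pt_2022 - pt_2017) / pt_2017:+.1%}")  # +1.8%, ~300 more
```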

41 schools saw at least a 20% decline in full-time faculty over this period. And, of course, some declines can appear larger at schools with smaller starting faculty sizes, so I also list the total full-time faculty in 2022 and 2017.

Law School FT 2022 FT 2017 Change
Atlanta's John Marshall 15 26 -42.3%
Pittsburgh 35 55 -36.4%
Catholic 24 37 -35.1%
Faulkner 17 26 -34.6%
Buffalo 40 61 -34.4%
West Virginia 29 43 -32.6%
Western State 15 22 -31.8%
William & Mary 44 64 -31.3%
Kentucky 29 42 -31.0%
Denver 60 85 -29.4%
Nova Southeastern 40 56 -28.6%
Chicago-Kent 48 67 -28.4%
Barry 28 39 -28.2%
Akron 23 32 -28.1%
Chapman 36 50 -28.0%
Western New England 21 29 -27.6%
Arkansas 36 49 -26.5%
Liberty 20 27 -25.9%
Ohio Northern 18 24 -25.0%
Southern Illinois 24 32 -25.0%
American 71 94 -24.5%
Toledo 22 29 -24.1%
Cooley 41 54 -24.1%
Missouri 29 38 -23.7%
Washington 49 64 -23.4%
DePaul 36 47 -23.4%
Detroit Mercy 23 30 -23.3%
Pepperdine 40 52 -23.1%
Touro 34 44 -22.7%
Northern Illinois 24 31 -22.6%
UC-Davis 40 51 -21.6%
Loyola-New Orleans 40 51 -21.6%
Wake Forest 48 61 -21.3%
San Francisco 37 47 -21.3%
Montana 15 19 -21.1%
Indiana-Bloomington 49 62 -21.0%
New Mexico 38 48 -20.8%
District of Columbia 23 29 -20.7%
Oklahoma City 23 29 -20.7%
McGeorge 35 44 -20.5%
Samford 20 25 -20.0%

That said, 11 schools saw full-time faculty growth of at least 20% in this period.

Law School FT 2022 FT 2017 Change
Southern 54 33 63.6%
Lincoln Memorial 22 14 57.1%
Roger Williams 32 23 39.1%
UNT Dallas 22 16 37.5%
Appalachian 13 10 30.0%
Campbell 33 26 26.9%
South Dakota 20 16 25.0%
Penn State Law 52 42 23.8%
Washington University (St. Louis) 96 78 23.1%
Florida International 39 32 21.9%
Dayton 30 25 20.0%