NYU, Cornell, and the new USNWR law school rankings landscape
After I project next year’s USNWR law school rankings, as I did last May and again here in December, there’s always a lot of chatter about the changes, about schools moving up and down. But the more notable thing is why the schools have changed spots, and there’s not a lot of explanation built into a single ranking metric. And some schools attract more attention than others. I’ve received more questions about NYU (projected to be around 11) and Cornell (projected to be around 18) than about most other schools, as both are significantly lower than their typical rankings. Why?
It has much less to do with any changes at those institutions, and much more to do with the changes in the rankings methodology.
And one way of thinking about the change—and specifically the change that adversely affected NYU and Cornell the most—is a change from quality to quantity.
This is a crude approximation, and it’s also likely to be a bit controversial to frame it this way. But bear with me as I set it up.
As I noted last December, it was not clear to me what the “endgame” for law schools “boycotting” the rankings was. It certainly pressured USNWR to change its methodology, as it could only rely on publicly-available data. And USNWR did change, not only to publicly-available data, but also in its weights of existing factors. That included a shift away from “inputs” (e.g., admissions statistics) and toward “outputs” (e.g., employment and bar passage). At a high level, that seems pretty ordinary.
But the more subtle shift is what I posit here, a shift from quality to quantity. Here’s what I mean.
Employment outcomes were, of course, always a part of the rankings, and a pretty big factor (18%). But 4% of the weight went to at-graduation jobs. Those were typically the most “elite” (or high quality) jobs—large law firms and federal clerkships, among others. That’s not publicly available data, so it dropped out. And the 10-month employment metric rose from 14% of the rankings to 33%. And that metric treats a job as a job—all jobs are the same, if they are full-time, long-term, bar passage required or J.D. advantage (and pursuing an advanced degree). As you can see here from my earlier blog post, how schools get to the best employment outcomes can vary dramatically.
NYU and Cornell place an overwhelming number of their graduates into “elite” law firm outcomes, so they suffered when the “at graduation” metric dropped off. And USNWR does not weigh the quality (actual or perceived) of jobs differently—a job is a job. NYU and Cornell had high reputational scores, likely in part because of their consistent elite placement into legal jobs. But those metrics dropped from 40% to 25% of the rankings, too. “Quality” metrics, if you will, began to drop off. Instead, quantity metrics increased—including raw employment placement 10 months after graduation. It’s simply a question of putting graduates through the bar exam and into a job. Maybe that sounds crass, but that’s the metric measured. And maybe it’s a good thing to measure these raw outputs—employed graduates are better than unemployed graduates.
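The quantity-over-quality point can be made concrete with a toy sketch. The schools and numbers below are hypothetical, not from the rankings data; the point is only that a metric that counts jobs without weighting their type cannot distinguish two schools with identical placement rates but very different job mixes.

```python
# Toy illustration (hypothetical numbers): under a quantity-only metric,
# a job is a job, so two schools with the same share of graduates in
# full-weight jobs score identically regardless of which jobs they are.

def employment_rate(full_weight_jobs: int, total_grads: int) -> float:
    """Share of graduates in full-weight jobs 10 months after graduation."""
    return full_weight_jobs / total_grads

# School A: 190 of 200 grads employed, mostly large firms and clerkships.
# School B: 190 of 200 grads employed, mostly small firms and J.D.-advantage jobs.
school_a = employment_rate(190, 200)
school_b = employment_rate(190, 200)

assert school_a == school_b  # the metric cannot tell the two schools apart
```

An at-graduation or quality-weighted metric would separate these two schools; a raw 10-month employment rate does not.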
So let’s see what happened in these employment metrics.
Recently I looked at the schools I estimated to be in the “top ten” of USNWR’s employment metrics. These are estimates, because USNWR does not release how it weights each category of employment. But three schools stand out among these top ten, as they are schools that typically do well in the rankings but are not among the top ten of USNWR rankings: SMU, Texas A&M, and Washington University in St. Louis. Let’s compare their employment outcomes with NYU and Cornell.
You can see that “full weight” jobs are worse (slightly for NYU, more so for Cornell) and unemployment outcomes are higher than at the three “top ten” schools I list here.
But the increased compression of the metrics leads to increased volatility—including dropping schools for even marginal differences in employment outcomes. NYU has around 2.1% (10 out of 473) of its graduates unemployed. That sinks its employment ranking to around 20th. Cornell has about 4% (8 of 202) of its graduates unemployed (or unknown). That puts its employment ranking around 40th. There’s tremendous compression at the top of these rankings. And given that employment is worth a whopping 33% of the overall rankings, marginal differences matter.
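The unemployment figures above are simple ratios; what matters is how small the absolute differences are among schools whose rankings diverge sharply. A quick check of the arithmetic from the post:

```python
# Unemployment shares from the post: NYU has 10 of 473 graduates
# unemployed; Cornell has 8 of 202 unemployed (or unknown).
nyu_unemployed = 10 / 473
cornell_unemployed = 8 / 202

print(round(100 * nyu_unemployed, 1))      # 2.1 (percent)
print(round(100 * cornell_unemployed, 1))  # 4.0 (percent)
```

A gap of under two percentage points of unemployment is enough to separate an employment rank of roughly 20th from roughly 40th, which is the compression the post describes.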
Let’s now compare the types of jobs. I’ll pull out three types: employment at firms with 501 or more attorneys; at firms with 101 to 500 attorneys; and federal judicial clerks. I sum those three categories at the end for a total, separate out all other outcomes, and the final unemployed figure. Finally, I include the raw total number of graduates at the end.
“Quality” is a controversial measure, to be sure. Students, of course, can have high-quality employment outcomes in state court clerkships, public interest jobs, and government. The categories I pull out here, however, are probably among the most competitive, if not the most sought-after, categories of legal employment, offering students the highest salaries and the most options down the road. Again, it’s a controversial measure, and not one-size-fits-all. But it’s worth the point of comparison.
For NYU and Cornell, the at-graduation metric and the elevated reputational scores likely helped account for some of the elite job placement. At Cornell, 84.7% land in these elite jobs, an astonishingly high percentage. At NYU, it’s 73.2%. The other three schools I list here aren’t particularly close. And for NYU, it’s all the more impressive given that it graduated 473 students, nearly double (or more than double) the class size of the other schools listed here. It’s an extraordinary effort to secure that many high-quality jobs for its graduates.
But note that, for USNWR purposes, those placement rates aren’t captured. It’s just jobs. It’s the quantity of placement. Getting students out of that “unemployed” bucket is basically the way to rise in the rankings these days.
I don’t mean to pick on any particular schools—they’re all strong schools in their own ways, and they offer some contrasts with one another and points of comparisons in the rankings. And it also doesn’t mean that students aren’t graduating into meaningful and successful careers. It’s simply a way of explaining why schools like NYU and Cornell are sliding in the new metrics, and others are finding success.
One more detail. Bar passage has become a major factor, too, up from 3% to a whopping 18%. But NYU’s first-time bar passage rate ranked 32nd, and Cornell’s 57th, compared to all other schools.
These schools did not exactly have bad bar passage rates. NYU had a first-time pass rate of 94.9%, nearly 14 points above the average of passers in jurisdictions where its graduates took the bar. Cornell’s was 90.3%, nearly 10 points above. But those numbers pale in comparison to schools like North Carolina (20 points above), Harvard (20 points), UCLA (19), Chicago (19), and Berkeley (19).
The raw outputs aren’t that different. North Carolina, for instance, had a 93.75% first-time pass rate, slightly lower than NYU’s. But North Carolina’s statewide bar pass rates were quite low compared to New York’s, so UNC outperformed by about 20 points, a larger margin than NYU’s. This delta of outperformance is a good way of accounting for bar difficulty. But it sets up schools like NYU and Cornell for worse outputs, because the state bar is easier and the in-state competition is high quality. The first-time pass rate in North Carolina was 72%; in New York, it’s 83%. (I noted this several years ago with the decision in California to lower the cut score of the bar exam; it has the incidental effect of reducing the apparent quality of bar passage stats at schools like Stanford and UCLA.) And maybe it’s a good reason for including ultimate bar passage as a separate metric. As I wrote earlier, “So maybe there’s some value in offsetting some of the distortions for some schools that have good bar passage metrics but are in more competitive states. If that’s the case, however, I’d think that absolute first-time passage, rather than cumulative passage, would be the better metric.” But it’s literally impossible for NYU or Cornell to overperform the New York bar by more than 17 points.
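The ceiling argument is just arithmetic: a school’s outperformance delta is its pass rate minus the average in its graduates’ jurisdictions, and even a perfect 100% pass rate can only beat that average by 100 minus the average. A minimal sketch, using the state-level rates from the post (the real USNWR figure is a weighted average across all jurisdictions where a school’s graduates test, which this simplifies away):

```python
# Outperformance delta: school's first-time pass rate minus the average
# pass rate among its graduates' jurisdictions (simplified to one state).

def outperformance(school_rate: float, jurisdiction_avg: float) -> float:
    return school_rate - jurisdiction_avg

def max_possible_delta(jurisdiction_avg: float) -> float:
    # Even a 100% pass rate can only exceed the average by this much.
    return 100.0 - jurisdiction_avg

print(max_possible_delta(83.0))  # New York: 17.0, the cap noted in the post
print(max_possible_delta(72.0))  # North Carolina: 28.0, far more headroom
```

This is why a school in a hard-bar state can post a lower raw pass rate than NYU yet score a larger delta.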
In short, if you have a question about why a school moved up or down in the rankings, it can usually be distilled into “employment and bar passage.” And if methodological changes are coming, the most likely targets will be these areas, where compression and volatility can lead to surprising changes year over year.