One might expect, for instance, California law schools to see higher attrition, given that the bar exam “cut score” is higher in California than in most jurisdictions. But not all California schools have similar attrition rates, it seems. Florida has a more modest cut score, yet a few law schools there are high on the list. Virginia has a high cut score, and it has no law schools that appear to be outliers.
In our bar exam analysis earlier this year, CJ Ryan and I pointed out that academic attrition on the whole did not affect how we would model projected bar exam success from law schools—but a few law schools did appear to have unusually high academic attrition rates, and we cited to the literature on the robust debate over this topic (check out the footnotes in our piece!).
But to return to an earlier point, a solid majority of schools (about 60%) have negligible 1L academic attrition—including many schools with relatively low incoming predictors among students. Most law schools, then, conclude that even the “good” reasons for attrition aren’t all that compelling, all things considered. And, I think, many of these schools see good employment and bar passage outcomes for their graduates.
Now, the title of this post is about USNWR. What of it?
USNWR has now done two things that make academic attrition much more attractive to law schools.
First, it has devalued admissions statistics. Schools used to fight to ensure they had the highest possible median LSAT and UGPA scores. That, of course, meant many students could enter well below the median (a reason I used the 25th percentile above) without affecting the score. But the decision to admit a non-trivial number of students below the targeted median could put that median at risk—too many such matriculants, and the median might dip.
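To make that mechanic concrete, here is a minimal sketch in Python, using invented LSAT scores rather than any real school's data:

```python
from statistics import median

# Hypothetical LSAT scores for an entering class -- median is 165.
entering_class = [170, 168, 165, 165, 165, 158, 152]
print(median(entering_class))  # 165

# A few more matriculants below the median: the median holds.
entering_class += [150, 149]
print(median(entering_class))  # still 165

# One matriculant too many below the median, and it dips.
entering_class += [148]
print(median(entering_class))  # 161.5
```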
But students at the lower end of incoming class credentials also tend to receive the fewest scholarship dollars—that is, they tend to generate the most revenue for a law school. Academic dismissal is therefore a really poor economic decision for a law school: dismissing a student means a loss of revenue (remember the earlier figure… perhaps $100,000).
USNWR gave a relatively high weight to the median LSAT score in previous rankings methodologies. That meant schools needed to be particularly careful about the cohort of admitted students—the top half could not be outweighed by the bottom half. That kept some balance in place.
Substantially devaluing the admissions metrics, however (which on the whole seems like a good idea), creates different incentives. Schools no longer have as much incentive to keep those medians high. It can be much more valuable to admit students, see how they perform, and academically dismiss them at higher rates. (Previously, a higher-dismissal strategy effectively deprioritized the medians, since enrolling a smaller class with a higher median could have been more effective for rankings purposes.) It’s not clear that this will play out this way at very many schools, but it remains a distinct possibility to watch.
Second, it has dramatically increased the value of outputs, including bar exam and employment outcomes. Again, a sensible result. But if schools can improve their outputs by graduating fewer students (recall the bar exam point I raised above, and as others have raised), the temptation to dismiss students grows. That is, if the most at-risk students are dismissed, the students with the lowest likelihood of passing the bar exam and the most challenging time securing employment are out of the school’s “outputs” cohort.
I told you this would be a touchy subject.
So let’s get a bit more crass.
In next year’s projected rankings, five schools tie for 51st. What if each of these schools academically dismissed just five more students from its graduating class? (Regardless of class size, that amounts to between 2% and 6% of the class for these schools.) Recall, this is a significant financial cost to a law school—perhaps half a million dollars in tuition revenue over two years. And if a school did this with each incoming 1L class, the cumulative cost would be significant.
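The arithmetic behind that half-million figure is straightforward; here is the back-of-the-envelope version, using the earlier (rough) $100,000-per-student figure:

```python
# Rough revenue cost of five extra dismissals, using the earlier
# (assumed) figure of about $100,000 in remaining tuition per
# dismissed student over the final two years.
students_dismissed = 5
remaining_tuition_per_student = 100_000  # assumption from the figure above

lost_revenue = students_dismissed * remaining_tuition_per_student
print(f"${lost_revenue:,}")  # $500,000 -- roughly half a million dollars
```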
But let’s try a few assumptions: (1) 4/5 of the dismissed students would have failed the bar exam on the first attempt; (2) 2/5 would not have passed the bar within two years; (3) each of these students would have landed in one of the five most marginal categories of employment, spread out roughly according to the school’s distribution across those categories. These are not necessarily fair assumptions, but I try to cabin them. To start, while law school GPA (and the threshold for academic dismissal) is highly correlated with first-time bar passage, the correlation is not perfect, so I accommodate that with the 4/5 figure. It is less clear how persistent these students would be in retaking the bar, which is why I reduced that figure to 2/5. As for employment, it seems as though the most at-risk students would have the most difficulty securing employment, but that is not always the case, and I tried to accommodate that by spreading students across a few different buckets beyond just the “unemployed” bucket.
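To illustrate the mechanics of the adjustment, here is a simplified sketch, not the actual rankings model: the class size and baseline pass counts are invented, while the 4/5 and 2/5 figures are the assumptions above.

```python
# Hypothetical school: class size and baseline pass counts are invented;
# the 4/5 and 2/5 figures are assumptions (1) and (2) above.
class_size = 150
first_time_passers = 105   # invented baseline: 70% first-time pass rate
ultimate_passers = 135     # invented baseline: 90% pass within two years

dismissed = 5
would_fail_first_time = 4  # assumption (1): 4/5 fail on the first attempt
would_never_pass = 2       # assumption (2): 2/5 never pass within two years

# Dismissing five students removes one eventual first-time passer and
# three eventual within-two-years passers, but shrinks the denominator
# by all five.
new_first_time = (first_time_passers - (dismissed - would_fail_first_time)) / (class_size - dismissed)
new_ultimate = (ultimate_passers - (dismissed - would_never_pass)) / (class_size - dismissed)

print(f"first-time: {first_time_passers / class_size:.1%} -> {new_first_time:.1%}")  # 70.0% -> 71.7%
print(f"ultimate:   {ultimate_passers / class_size:.1%} -> {new_ultimate:.1%}")      # 90.0% -> 91.0%
```

The same mechanics apply to the employment buckets: removing students concentrated in the most marginal categories raises the share of graduates in the better ones.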
Among these five schools, all rose to a ranking between 35 and 45. (Smaller schools rose higher than larger schools, understandably.) But a jump from 51 to 39, or from 51 to 35, is a pretty significant event for a relatively small increase in the academic dismissal rate.
The incentive for law schools, then, is not only to stay small (more on that in another post)—which enables more elite admissions credentials and easier placement of students into jobs—but to get smaller as time goes on. Attrition is a way to do that.
That’s not a good thing.
Indeed, I would posit that attrition is, on the whole, a bad thing. I think there can be good reasons for it, as I noted above. But on the whole, schools should expect that every student they admit will be able to successfully complete the program of legal education. Schools’ failure to achieve that is on them. There can be exceptions, of course—particularly affordable schools, or schools that refund tuition to students who leave after a year, are cases where attrition is more justifiable. But I’m not persuaded that those describe the majority of cases. And given how many schools manage zero, or nearly zero, attrition, it strikes me as a sound expectation.
Publicly available data from the ABA clearly and specifically identifies attrition, including academic attrition, in each school’s disclosures.
I would submit that USNWR should consider incorporating academic attrition data into its law school rankings. Its college rankings already consider six-year graduation rates and first-year retention rates. (Indeed, it also has a predicted graduation rate, which it could likewise construct here.) While transfers out usually represent the best law students among the attrition figures, and “other” attrition can likely be attributed to personal or otherwise idiosyncratic circumstances, academic attrition reflects a school’s decision to dismiss some students rather than help them navigate the rest of the program. And from a consumer information perspective, this is important information for a prospective law student—if I enter the program, what are the odds that I’ll finish it?
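As a quick sketch of how a prospective student might answer that question from a school's disclosed attrition numbers (all of the figures below are invented for illustration):

```python
# Invented disclosure numbers for illustration only.
entering_1ls = 200       # entering 1L class
academic_attrition = 12  # academically dismissed after the first year
other_attrition = 4      # "other" (non-transfer) attrition

odds_of_continuing = 1 - (academic_attrition + other_attrition) / entering_1ls
print(f"{odds_of_continuing:.0%}")  # 92%
```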
I think some academic attrition is necessary as a check on truly poor academic performance. But as the charts above indicate, there are wide variances in how schools with similarly situated students use it. And I think a metric, even at a very low percentage of the overall USNWR rankings, would go a long way toward deterring abuse of academic attrition in pursuit of higher rankings.