Mixed motives, many questions as Yale, Harvard "drop out" of the USNWR law rankings

November 16, 2022 may be a date marking a sea change in legal education, or a blip that will yield a few concessions and tradeoffs in the near future. But the announcement by Yale Law School, and the swift ensuing announcement (an act of conscious parallelism) from Harvard Law School, that they would, if you will, “drop out” of the USNWR law rankings, is extraordinary. I wondered last year whether the weakened position of USNWR might set up a fatal blow. Perhaps that blow is here, now.

I’ve written a lot about USNWR law rankings, some of the good and some of the bad. So let’s walk through Yale’s and Harvard’s expressed justifications for dropping out.

First, employment metrics. From Yale:

One of the most troubling aspects of the U.S. News rankings is that it discourages law schools from providing critical support for students seeking public interest careers and devalues graduates pursuing advanced degrees. Because service is a touchstone of our profession, Yale Law School is proud to award many more public interest fellowships per student than any of our peers. These fellowships have enabled some of our finest students to serve their communities and the nation on our dime. Even though our fellowships are highly selective and pay comparable salaries to outside fellowships, U.S. News appears to discount these invaluable opportunities to such an extent that these graduates are effectively classified as unemployed. When it comes to brilliant students training themselves for a scholarly life or a wide-ranging career by pursuing coveted Ph.D. and master’s degrees, U.S. News does the same. Both of these tracks are a venerable tradition at Yale Law School, and these career choices should be valued and encouraged throughout legal education.

And from Harvard:

[T]he U.S. News methodology undermines the efforts of many law schools to support public interest careers for their graduates. We share, and have expressed to U.S. News, the concern that their debt metric ignores school-funded loan forgiveness programs in calculating student debt. Such loan forgiveness programs assist students who pursue lower paying jobs, typically in the public interest sector. We have joined other schools in also sharing with U.S. News our concern about the magazine’s decision to discount, in the employment ranking, professional positions held by those who receive public interest fellowships funded by their home schools. These jobs not only provide lawyers to organizations for critical needs, they also often launch a graduate’s career in the public sector.

The salient critique is a correct one. In the 2008-2010 era, schools often “concealed” the employment status of graduates, if you will (or so a number of those “shedding light” on employment practices alleged at the time), by hiring their own graduates. That boosted a school’s overall employment rate, even if those jobs were not meaningful careers at all—indeed, even if they were only short-term and part-time.

Once upon a time, USNWR treated all employment outcomes equally, and the ABA didn’t have granular employment data. No more. The ABA now collects granular data, and USNWR gives “full weight” to full-time, long-term, bar passage-required and JD-advantage positions, but it discounts those positions if they are school-funded.

In a different era, this made more sense, but it is harder to justify discounting these positions if they are full-time and long-term. They represent a substantial investment (Yale indicates its public interest fellows receive a $50,000 annual salary plus benefits) in a way that the short-term, part-time positions didn’t. In fact, those short-term positions have all but dried up now that USNWR double-discounts them (they’re discounted for not being full-time, long-term positions, and they’re further discounted as school-funded positions). Total law school-funded, short-term, part-time positions, regardless of type of employment: Class of 2011, 964; Class of 2016, 165; Class of 2021, 59. From nearly 1,000 such jobs to about 50 in a decade.

My recent look at the employment data suggests that a handful of elite schools (including Yale and Harvard) place a disproportionate number of graduates into these full-time, long-term, school-funded positions. At Yale, it’s at times over 10% of the class. At Harvard, it can exceed 20 students, a substantial number but smaller on a percentage basis at a school of Harvard’s size.

I think Yale and Harvard are right to critique this point of USNWR, but it’s fallen on deaf ears for many years.

Briefly, Yale mentions graduate programs. Discounting those placements has some justification: graduate programs could be a refuge for schools seeking to “conceal” a set of their graduates in, say, that school’s own master’s programs. The gist of this metric is that JD graduates should be pursuing careers in law. But Yale candidly sends more graduates into such programs (7, 2, and 10 in the last three graduating classes) than a typical cohort, and perhaps justifiably so. USNWR certainly does discount those placements. It’s a problem that may be unique to Yale (albeit modest in scope).

Second, debt metrics. From Yale:

In addition, the rankings exclude a crucial form of support for public interest careers — loan forgiveness programs — when calculating student debt loads. Loan forgiveness programs matter enormously to students interested in service, as they partially or entirely forgive the debts of students taking low-paying public interest jobs. But the rankings exclude them when calculating debt even though they can entirely erase a student’s loans. In short, when law schools devote resources to encouraging students to pursue public interest careers, U.S. News mischaracterizes them as low-employment schools with high debt loads. That backward approach discourages law schools throughout the country from supporting students who dream of a service career.

. . .

[T]he way U.S. News accounts for student debt further undercuts the efforts of law schools to recruit the most capable students into the profession. To its credit, U.S. News has recognized that debt can deter excellent students from becoming lawyers and has tried to help by giving weight to a metric that rests on the average debt of graduating students and the percentage of students who graduate with debt. Yet a metric based on debt alone can backfire, incentivizing schools to admit students with the means to pay tuition over students with substantial financial need. A far better measure is how much financial aid a law school provides to its students, rewarding schools that admit students from low-income backgrounds and support them along the way. That crucial measure receives inadequate weight in the rankings.

And from Harvard:

[T]he debt metric adopted by U.S. News two years ago risks confusing more than it informs because a school may lower debt at graduation through generous financial aid, but it may also achieve the same effect by admitting more students who have the resources to avoid borrowing. The debt metric gives prospective students no way to tell which is which. And to the extent the debt metric creates an incentive for schools to admit better resourced students who don’t need to borrow, it risks harming those it is trying to help.

The indebtedness metric was introduced in 2021. I certainly had questions about its methodology. I also suggested there might be value in considering alternative metrics, including ones that consider earnings potential. Undoubtedly, however, these metrics sink schools like Yale and Harvard. They rightly point out that the metric could distort admissions toward well-funded students over those who would need to take on debt. (I’ve chronicled the “first generation” issue here, too.)

That said, most students take on debt, and indebtedness is a real concern for graduates. First, I’m not sure the metric confuses students; I think there are not many schools that principally fund their institutions through wealthy admittees in a way that distorts average debt loads. Indeed, USNWR also separates those who incur no debt from those who do, so the average debt load offers a more accurate picture of what you face if you do end up incurring debt. Second, it does reflect the need-based or other scholarship-based opportunities for students at the output level—what will the debt look like after graduation?

But Yale identifies a related concern, which has some truth to it. If a school wants to repay the debts of those performing public interest work (those going into private practice will have the ability to repay their loans, and the loans are a “good” investment in their future earnings), this indebtedness metric can’t track that. If Yale offers its own loan forgiveness for public interest or government work, more generous than federal loan forgiveness, it solves an ability-to-pay problem, and a raw “indebtedness” metric doesn’t account for the fact that many of these graduates will never need to repay their loans.

And that raises a new question. How many graduates feel compelled to pursue such work because of their loans, as opposed to entering knowing they intend to have their loans forgiven later? That is, do high debt loads constrain students’ choices? Undoubtedly, lower loan totals free students up to make more choices. There’s a benefit, then, to reporting raw debt loads, even at a school with a robust forgiveness program. But this is something of an unknowable answer.

Furthermore, the flipside of indebtedness is expenditures per student. This has been, perhaps, the bane of the USNWR formula for many years. Yale and Harvard have benefited tremendously from this metric for decades. They report exceedingly high costs per student, aided by their generous endowments, which has helped keep them atop the rankings. But expenditures are opaque data, not readily auditable. Worse still, they incentivize simply spending more money, not spending it more effectively or to any particular benefit of the students. For these schools to be so concerned about the indebtedness metric but not to speak a word about the expenditures metric (which has lasted much longer, is a much larger component of the rankings, and exacerbates much greater inequality between schools, particularly between private and public) rings a bit hollow.

Third, admissions. From Yale:

The U.S. News rankings also discourage law schools from admitting and providing aid to students with enormous promise who may come from modest means. Today, 20% of a law school’s overall ranking is median LSAT/GRE scores and GPAs. While academic scores are an important tool, they don’t always capture the full measure of an applicant. This heavily weighted metric imposes tremendous pressure on schools to overlook promising students, especially those who cannot afford expensive test preparation courses. It also pushes schools to use financial aid to recruit high-scoring students. As a result, millions of dollars of scholarship money now go to students with the highest scores, not the greatest need. At a moment when concerns about economic equity stand at the center of our national dialogue, only two law schools in the country continue to give aid based entirely on need — Harvard and Yale. Just this year, Yale Law School doubled down on that commitment, launching a tuition-free scholarship for students who come from families below the poverty line. These students overcame nearly insurmountable odds to get to Yale, and their stories are nothing short of inspiring. Regrettably, U.S. News has made it difficult for other law schools to eliminate the financial barriers that deter talented minds from joining our profession.

And from Harvard:

[B]y heavily weighting students’ test scores and college grades, the U.S. News rankings have over the years created incentives for law schools to direct more financial aid toward applicants based on their LSAT scores and college GPAs without regard to their financial need. Though HLS and YLS have each resisted the pull toward so-called merit aid, it has become increasingly prevalent, absorbing scarce resources that could be allocated more directly on the basis of need.

I want to set aside “expensive” test preparation for a moment, as there is lower-cost test prep available than ever thanks to free services provided by LSAC and others. But I do want to discuss the LSAT/UGPA topic.

It’s been quite obviously correct that schools have obsessively focused on the median LSAT score and undergraduate GPA of their incoming students and directed scholarships toward protecting those medians. That has diverted resources from need-based aid. That said, of course, it’s hard to know how some schools’ turning to need-based aid would affect students who may have options elsewhere, at schools that are not focused on need-based aid.

But Yale’s note emphasizes a different issue: “the full measure of an applicant.” The ABA is on the verge of ending the requirement of an admissions test. It is entirely possible that schools that want to move toward a more “holistic” admissions process will need to find alternative pathways to measure applicants and do not want to rely on the LSAT. But as long as USNWR measures the LSAT, it will remain important for them to consider. Jettisoning the LSAT, then, requires a consideration of how to jettison USNWR.

Finally, the unstated reasons. As I’ve indicated, some of these reasons are more persuasive than others, but, of course, the timing matters too. Yale and Harvard each experienced an unprecedented drop in their peer scores this past year. Harvard dropped to a tie for 4th last year, down from its typical “top three” status. Some of the decision to withdraw may be attributable to these rankings results; some maybe not.

So I’m watching closely to see how other schools respond. Frankly, there are benefits to USNWR for law students. They roughly approximate school quality, albeit extremely imperfectly. They can help prospective students, especially those new to the legal profession, with some rough guides of quality. Obviously, they create some bad incentives for law schools, and law students can too easily conflate rank with quality, or over-rely on the rankings.

But given how difficult it is for the ABA to revoke accreditation, USNWR offers at least some quality control for law schools, or at least for some law schools. The bar exam is another form of quality control for law schools. We’ll see if other schools jump on board, and how it may affect school choices moving forward.

Furthermore, it’s worth considering how, if at all, USNWR responds. Much of the rankings is based on data USNWR collects itself (peer scores) or data it could independently collect from the ABA. Other data, like debt and expenditures, are not so readily available. Some schools currently do not report to USNWR, but they are unranked. Will USNWR attempt to impute or estimate data for Yale and Harvard, then rank them anyway? Will it just drop them to the “unranked” category? Will it change its criteria to use only information it can collect independently? Time will tell.

Elite federal clerkships don't reflect the whole universe of student clerkship opportunities

Much has been written about disputes over, or “boycotts” of, federal judges hiring clerks from particular law schools. But reviewing clerkships generally reveals that the current debate concerns a niche subset of student employment opportunities.

It’s a very small subset of elite, “credentialed” clerkships in dispute at the moment. There are about 800 active federal judges, not counting the dozens, probably hundreds, more senior federal judges still hearing cases. They hire around 1,200 or so recent law school graduates into term (one- or two-year) positions each year, not counting the many post-graduate hires or career clerks.

Schools like Montana, Alabama, Kentucky, Memphis, and West Virginia routinely outplace NYU, Georgetown, and Columbia as a percentage of their graduates going on to federal clerkships right after graduation.

But the dialogue obsesses over Yale (which places around 50 graduates a year into federal clerkships, and many more after graduation) and its dispute with a sliver of judges like Judge James Ho of the 5th Circuit. Why?

The subset of elite, “credentialed” clerkships.

A peril of the set of highly credentialed, very young former Supreme Court clerks nominated to the federal judiciary of late is increasingly sharp elbows among judges vying to be on the next Supreme Court “short list” (which, most recently for a Republican administration, appeared to include around 40 names, which is hardly short). That means increasing competition for elite, pedigreed clerks.

The discussions of these clerks (as hired through an ideological valence, about reverence for their former Supreme Court bosses, about being “feeders” of clerks to the Court, and so on) all run through this channel. But in truth, it’s a tiny fraction of the clerkship opportunities for law school graduates.

So many clerkships are about geographical fit, about helping launch careers of new graduates into a federal territory where they’ll ultimately practice and be ambassadors for the court (not judge the judge). The vast majority of the federal docket, too, is not about abortion or other hot-button topics, but grinding through 922(g) sentencing or suppression hearings or Social Security appeals or immigration disputes. It’s a judge working very closely with a very small team to draft work product.

In the elite subset of credentialed clerkships, there’s a lot to say about judicial hiring practices, complaints about it, ideological screens and preferences, and so on. But the vast majority of the federal clerkship experience, and the work, is nothing like the debate over the narrow subset that’s attracting significant attention. To the extent there are calls for “reform,” I hope those in positions of authority are mindful of this disparity as the conversation continues to play out.

Multistate Bar Exam scores hold steady, remain consistent with recent low scores

It has been difficult to project much about the bar exam given changes in administration and the pandemic. The July 2022 bar exam would reflect three potentially significant things: the decision of law schools to move to pass-fail grading (particularly affecting 1L courses) in Spring 2020; the decision of law schools to significantly reduce academic attrition for 1Ls in the summer of 2020; and the decision of law schools to offer a number of remote learning options to the bulk of law students who took the bar in July 2022.

Now the MBE scores have been released, and they show a slight drop from July 2021—but they remain consistent with scores between 2014 and 2019, and are certainly not an all-time low.

The score is comparable to last summer’s, but it remains near recent lows. It appears that these disruptions did not materially affect bar passage rates (of course, it’s impossible to know how rates may have differed without these variables—perhaps they would have improved markedly, or remained just the same!). Of some interest: the number of test-takers declined somewhat notably, from 45,872 to 44,705.

Puerto Rico lowers its bar exam cut score in response to threats that its law schools may lose accreditation

Back in 2019, I assessed the potential effect of the American Bar Association’s revised Standard 316, which requires an “ultimate” bar passage rate of 75% within two years for a graduating class. There, I noted:

Let’s start with the schools likely in the most dire shape: 7 of them. While the proposal undoubtedly may impact far more, I decided to look at schools that failed to meet the standard in both 2015 and 2016; and I pulled out schools that were already closing, schools in Puerto Rico (we could see Puerto Rico move from 3 schools to 1 school, or perhaps 0 schools, in short order), and schools that appeared on a list due to data reporting errors.

Will state bars lower their cut scores in response?

It’s possible. Several state bars (like South Dakota as mentioned above) have lowered their cut scores in recent years when bar passage rates dropped. If states like California and Florida look at the risk of losing accredited law schools under the new proposal, they may lower their cut scores, as I suggested back in 2016. If the state bar views it as important to protect their in-state law schools, they may choose the tradeoff of lowering cut scores (or they may add it to their calculus about what the score should be).

The ABA Journal recently reported on the plight of two of Puerto Rico’s law schools that have failed to meet that standard for several years. Indeed, Pontifical’s pass rates have worsened fairly dramatically in recent years: 71% for 2017, 52% for 2018, and 46% for 2019.

That article tipped me off to changes in Puerto Rico’s bar exam cut score. Puerto Rico does not use the UBE or a standardized bar exam score, so its passing score of “596 out of 1000 points” doesn’t offer a whole lot of information. But the Supreme Court of Puerto Rico did choose to lower the cut score to 569.

A 2021 report offers some reasons to be skeptical of this change, after studying predictors and exam performance:

For both set of analyses completed, the results did support the hypothesis that the applicants in the more recent years were not as well prepared than the applicants in previous years. Average P-values for a common set of items declined over time, and when comparing specific test administration pairs, the pattern consistently saw applicants from earlier test administrations performing better.

. . .

The hypothesis that the steady decline in overall pass rate on the Puerto Rico Bar Examination is a result of applicants being less prepared for the examination is supported by the decline in performance on the 14 anchor items administered on every test administration.

The Supreme Court of Puerto Rico expressly considered the effect of the new ABA Standard 316 on Puerto Rico’s law schools as an impetus for change.

Faced with the need to determine whether, in addition to the measures already implemented by the Judicial Branch to address the effects of the application of ABA Accreditation Standard 316 in our jurisdiction, it was necessary to lower or modify the passing score for the examinations for admission to the practice of the legal profession, in 2020 the Oficina de Administración de los Tribunales (OAT) commissioned the company ACS Ventures to conduct an analysis of this question.

A standard-setting study for the cut score included two rounds. One round recommended a score of 584 (with a range of 574 to 594), and the other 575 (with a range of 569 to 581). The Supreme Court took the lowest point of these ranges, 569. Even at that score, the pass rate would still be just 46.4%, better than the rate of closer to 33% under the present standard:

We recommend that the program consider a final passing score for the Bar Examination somewhere in the range of the recommended passing score (575) and a score that is two standard errors of the mean below this score (569). The rationale for this recommendation is that the reference point for the panelists during the study was the Minimally Competent Candidate and panelists made judgments to predict how these candidates would perform on the multiple-choice questions and essay questions for the examination. This means that the distribution of reference candidates was all intended to be minimally competent. In creating that distribution, the lower bound would likely best represent the threshold of minimum competency suggested by the panelists. Setting the passing score at 569 would mean that approximately 46.4% of candidates would pass the examination while setting the passing score at 575 would mean that approximately 41.5% of candidates would pass. This range is consistent with the recommendations of the panelists as characterizing the performance of the minimally competent candidate.
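The arithmetic in the recommendation above can be sketched briefly. Note that the implied standard error of roughly 3 points is my inference from the 575 and 569 figures, not a number stated in the report excerpt:

```python
# Sketch of the cut-score arithmetic described in the standard-setting report.
recommended = 575   # panelists' recommended passing score
lower_bound = 569   # "two standard errors of the mean" below it
implied_sem = (recommended - lower_bound) / 2
print(implied_sem)  # 3.0 (inferred, not stated in the report)

# Approximate pass rates the report associates with each endpoint.
pass_rates = {575: 0.415, 569: 0.464}

# The Supreme Court of Puerto Rico adopted the bottom of the range.
chosen = min(pass_rates)
print(chosen, pass_rates[chosen])  # 569 0.464
```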

The ABA has given Puerto Rican law schools an extra three years to try to comply. The lower cut score will make it easier to do so, although it remains unclear that even with this cut score all schools will be able to meet the standard.

But it also shows how rarely the ABA actually enforces this standard, other than by continuing to give schools more time to demonstrate compliance. We’ll see what happens in the next three years.

Comment on the ABA's proposal to end admissions tests as a requirement for law school admission

Earlier, I blogged about the ABA’s proposal to end the admissions test (typically, the LSAT) as a requirement for law school admissions. I’ve submitted a comment on the proposal, which you can read in its entirety here. The comment recommends disclosure of four pieces of information if the ABA accepts the proposal: the number of matriculants who do not have a standardized test score; the percentage of students receiving—and the 75th, 50th, and 25th percentile amounts of—grants among students admitted without a standardized test score; total academic attrition among students who lack a standardized test score; and the first-time and ultimate bar exam passage rates for students without a standardized test score. The comment explains why each item would be a useful disclosure.

You can view other comments here.

Biden experiences unprecedented hot streak with ABA judicial nominee ratings

I’ve blogged about the ABA’s judicial nominee ratings, wondering whether the ABA was any good at evaluating nominees. You can take a look at its historical ratings.

But President Joe Biden is experiencing an unprecedented hot streak. He’s had 100 ABA judicial nominee evaluations returned, and not one of them included a single “not qualified” vote.

Mr. Biden is the third president, joining Presidents George W. Bush and Donald Trump, to reject the ABA’s “pre-screening” power in evaluating judicial nominees. In the past, a president would submit potential nominees to the ABA and receive a rating back. Most of the time, a majority “not qualified” vote would sink the potential nominee, and the person would never face a formal nomination. Mr. Bush first broke the tradition on grounds that the ABA tended to give more conservative nominees lower ratings than more progressive nominees.

President Barack Obama resumed the tradition. In his first three years, the ABA, apparently, gave outright “not qualified” ratings (a majority vote of “not qualified”) to 14 potential nominees. For another 7 nominees, the ABA gave a minority vote of “not qualified.”

As a point of comparison (Democratic administration to Democratic administration), Mr. Biden has zero “not qualified” votes, whether majority or minority. That’s a remarkable achievement. Given how many candidates Mr. Obama named who received a “not qualified” rating, it suggests some combination of White House vetting and ABA reviewing has changed, although it’s entirely unclear how to measure this. But it does show that Mr. Biden is on an unprecedented hot streak.

California audit reveals significant underreporting and underenforcement of attorney discipline

The full report is here. The National Law Journal highlights a few things:

In a review of the agency’s disciplinary files, acting state auditor Michael Tilden’s office found one lawyer who was the subject of 165 complaints over seven years.

“Although the volume of complaints against the attorney has increased over time, the State Bar has imposed no discipline, and the attorney maintains an active license,” the report said.

In another instance, the bar closed 87 complaints against a lawyer over 20 years before finally recommending disbarment after the attorney was convicted of money laundering.

It’s a pretty remarkable story that highlights two things worth considering for future investigation.

First, when Professor Rob Anderson and I highlighted the relationship between bar exam scores and ultimate attorney discipline rates, we could only draw on publicly-available discipline records. In a sense, what we observed was a “tip of the iceberg.” Now, this could come out in a couple of different ways. On the one hand, it might be that the relationship is even stronger, and that attorney misconduct manifests earlier, if we had complete access to the kind of complaints that the California bar has. On the other hand, it might also be the case (as we point out in the paper) that some attorneys are better at concealing (or defending) their misconduct than others, and that might be hidden in the data we have. It would be a separate, interesting question to investigate.

Second, it highlights the inherent error in comparing attorney discipline rates across states. California’s process is susceptible to unique pressures or complications, as all states’ systems are. You cannot infer much from one state to another (unless you are looking at relative changes in states over time as a comparative benchmark), which is an effort some have (wrongly) attempted.

It will be interesting to see what comes out of the reforms proposed in California and if the effort improves public protection.

Where did the major political parties spend their legal dollars between 1Q2021 and 1Q2022?

I pulled the FEC data for the DCCC, DNC, DSCC, NRCC, NRSC, and RNC from January 1, 2021 to March 31, 2022 to see where the major political party arms spent their money. I looked at any expenditure labeled legal, law, or attorney, and I deduped and merged entries for this 15-month period. To start, here’s where Democratic-affiliated outlets spent money, listing any recipient receiving at least $25,000 in this period. (It excludes internal spending or transfers.)

PERKINS COIE WA (DC) $27,815,540
ELIAS LAW GROUP LLP DC $2,424,174
WILMER CUTLER PICKERING HALE AND DORR LLP DC $2,149,733
BROOKS PIERCE MCLENDON HUMPHREY & LEONARD LLP NC $1,043,825
KAPLAN HECKER FINK LLP NY $1,020,537
LATHAM & WATKINS LLP PA $645,175
DECHERT LLP PA $622,246
HEMENWAY & BARNES LLP MA $470,814
KREVOLIN HORST GA $385,765
DENTONS COHEN & GRIGSBY PC PA $382,242
BONDURANT MIXSON & ELMORE LLP GA $209,606
COVINGTON & BURLINGTON LLP DC $162,481
BALLARD SPAHR LLP PA $160,930
BALLARD SPAHR LLP AZ $160,930
CHERRY, BEKAERT & HOLLAND VA $150,333
MUNGER TOLLES OLSON LLP CA $143,955
MILLER CANFIELD PADDOCK AND STONE PLC MI $137,672
HIATT, JONATHAN MD $120,300
GREENBERG TRAURIG LLP PA (NY) $117,587
LAW OFFICE OF EVELYN GONG PLLC NY $116,910
LOCKRIDGE GRINDAL NAUEN PLLP MN $90,000
THE LAW OFFICE OF ADAM C BONIN PA $58,020
MELOY LAW FIRM MT $49,728
GUREWITZ, MARY ELLEN MI $40,800
HERRON, MICHAEL NH $39,025
SONNENFELDT, MICHAEL NY $36,500
FOX ONEILL SHANNON SC WI $35,656
STAFFORD ROSENBAUM LLP WI $34,822
CIVITECH TX $31,800
JACKSON LEWIS PC KS $29,500
JAMS INC CA $27,002
BRYAN D HOBEN ESQ NY $26,500
WOLF RIFKIN SHAPIRO SCHULMAN RABKIN LLP NV $26,410
JAMES & HOFFMAN DC $25,980

Firms with a second state in parentheses indicate that the spending was labeled as sent to another branch of that firm in that second state. Unfortunately, I cannot explain why identical amounts went to Ballard Spahr in two different states (there were not perfectly symmetrical transactions); I did not merge them in case they are duplicates, but left both as a point of comparison.
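The filtering and merging steps described above can be sketched roughly as follows. This is an illustration only: the record fields (`recipient`, `state`, `purpose`, `amount`) and the helper name `legal_spending` are my assumptions for the sketch, not the actual FEC export schema or the tooling actually used.

```python
import re
from collections import defaultdict

# Match expenditure purposes mentioning legal, law, or attorney.
LEGAL = re.compile(r"legal|law|attorney", re.IGNORECASE)

def legal_spending(rows, threshold=25_000):
    """Sum legal-related expenditures per (recipient, state) pair,
    keeping totals at or above the threshold, largest first."""
    totals = defaultdict(float)
    for r in rows:
        if LEGAL.search(r["purpose"] or ""):
            totals[(r["recipient"], r["state"])] += r["amount"]
    kept = [(k, v) for k, v in totals.items() if v >= threshold]
    return sorted(kept, key=lambda kv: -kv[1])

# Toy example (illustrative records, not real FEC data):
rows = [
    {"recipient": "FIRM A", "state": "DC", "purpose": "Legal services", "amount": 20_000},
    {"recipient": "FIRM A", "state": "DC", "purpose": "Attorney fees", "amount": 10_000},
    {"recipient": "FIRM B", "state": "NY", "purpose": "Media buy", "amount": 50_000},
]
print(legal_spending(rows))  # [(('FIRM A', 'DC'), 30000.0)]
```

Note that near-duplicate entries like the Ballard Spahr pair would survive this aggregation, because they carry different state labels; a manual pass over the output remains necessary.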

Here are the firms on the Republican side. (Note more firms but smaller totals, and more spending that may not precisely align with “legal” expenditures but more media or press-related costs.)

Unfortunately, I had the same duplication problem with Wiley Rein and with King & Spalding.

WILEY REIN LLP MD $2,681,866
WILEY REIN LLP NJ $2,681,866
JONES DAY DC $2,146,453
CONSOVOY MCCARTHY VA $1,858,158
HOLTZMAN VOGEL JOSEFIAK PLLC VA $1,744,962
SHUTTS & BOWEN LLP FL $1,550,270
KASOWITZ BENSON TORRES LLP NY $1,262,500
ON MESSAGE INC VA $1,251,610
MCGUIRE WOODS LLP VA $799,938
DHILLON LAW GROUP INC CA $699,618
NECHELESLAW LLP NY $676,039
BUTZEL LONG ATTORNEY'S AND COUNSELORS MI $506,095
BELL MCANDREWS & HILTACHK LLP CA $433,952
VAN DER VEEN HARTSHORN AND LEVIN PA $349,864
FISCHETTI & MALGIERI LLP NY $333,945
CONSTITUTIONAL LITIGATION & ADVOCACY GROUP PC DC $300,000
NEWMEYER & DILLION LLP CA $225,286
BLANK ROME PA $209,030
HALL BOOK SMITH GA $190,249
MICHAEL BEST & FRIEDRICH LLP WI $179,573
BAKER & HOSTETLER LLP OH $150,000
RASKIN & RASKIN PA FL $150,000
DIGENOVA & TOENSING LLP DC $141,146
GOLDSTEIN LAW PARTNERS LLC PA $140,823
DEROHANNESIAN & DEROHANNESIAN NY $137,526
DILLON MCCANDLESS KING COULTER & GRAHAM LLP PA $136,749
KLEINBARD LLC PA $126,129
CROSBY OTTENHOFF GROUP DC $110,000
SNELL & WILMER LLP AZ $107,411
SKADDEN ARPS SLATE MEAGHER & FLOM NY $106,233
JOHN CIAMPOLI ESQ. NY $95,388
LAW OFFICE OF LINDA A. KERNSLLC PA $88,647
ROBINSON GRAY STEPP & LAFFITTE LLC SC $83,805
KING & SPALDING LLP DC $81,649
KING & SPALDING LLP GA $81,649
TAYLOR ENGLISH DUMA LLP GA $81,421
IMPERIUM PUBLIC STRATEGIES TN $80,000
SPARTAN PUBLIC AFFAIRS LLC VA $80,000
LODGE, JOHN III TX $79,831
CLARK HILL PLC PA $75,000
KINCAID, ADAM VA $75,000
STATECRAFT PLLC AZ $71,785
BRICKER & ECKLER LLP OH $70,770
THE NATIONAL REPUBLICAN REDISTRICTING TRUST VA $70,000
ALAN R OSTERGREN PC IA $68,025
LANDSLIDE STRATEGIES VA $65,000
BULEY, JEFFREY NY $63,136
BRADLEY ARANT BOULT CUMMINGS LLP AL $60,614
BELIN MCCORMICK IA $57,988
VAN DE BOGART LAW P.A. FL $51,481
MARQUIS AURBACH ATTORNEYS AT LAW NV $50,357
PHELPS TX $48,297
SHANAHAN LAW GROUP PLLC NC $47,927
PORTER WRIGHT MORRIS & ARTHUR LLP OH $38,871
LITTEN & SIPE LLP VA $36,554
CROSS XAMINE INVESTIGATION INC MI $36,195
DAVIDSON, DONNA GARCIA TX $35,000
KEVIN CLINE LAW PLLC NC $34,158
2652 GROUP LLC VA $34,082
OGLETREE DEAKINS NASH SMOAK & STEWART P.C. SC $31,593
HUCKABY DAVIS LISKER VA $31,500
DANIEL K HAGOOD PC TX $30,098
BROWN, MICHAEL DC $30,000
CUTOLO BARROS LLC NJ $30,000
MR&A LLC PA $29,193
AMERICA RISING LLC VA $27,709

What happens if the ABA ends the requirement that law schools have an admissions test? Maybe less than you think

In 2018, the American Bar Association’s Council of the Section of Legal Education and Admissions to the Bar considered a proposal to drop the requirement of an admissions test for law schools. I wrote about it at the time over at PrawfsBlawg (worth a read!). The proposal did not advance. Many of those points hold true today, but I’ll look at how the new proposal differs and what might come of it. The proposal is still in its early stages. It’s possible, of course, that the proposal changes, or that it is never adopted (as the 2018 proposal wasn’t).

To start, many law schools currently admit a non-trivial number of students without the LSAT. Some of those admissions are with the GRE. A few are with the GMAT. Several schools admit students directly from undergraduate programs with a requisite ACT or SAT score. The GRE has gained acceptance as a valid and reliable test for law school admissions, although how USNWR uses GRE scores in calculating its rankings is not how ETS recommends using them.

The 2018 proposal concluded, “Failure to include a valid and reliable admission test as a part of the admissions process creates a rebuttable presumption that a law school is not in compliance with Standard 501.” The 2022 proposal is even more generous: “A law school may use admission tests as part of sound admission practices and policies.” No rebuttable presumption against.

There are varying levels of concern that might arise, so I’ll start with the point that I think inertia will keep many law schools using not just standardized tests generally but the LSAT in particular.

First, the most significant barrier to prevent a “race to the bottom” in law school admissions: the bar exam. As it is, schools must demonstrate an ultimate bar passage rate of 75% within two years of graduation. That itself is a major check against dropping admissions standards too low. Even then, many schools do not like an overly low first-time passage rate, and students take note of first-time bar passage rates, which have increased importance in the USNWR rankings.

Now, some states have been actively considering alternative paths to attorney licensing. My hunch—and it’s only a hunch—is that this move by the ABA may actually reduce the likelihood that state bars will consider alternative pathways to attorney licensing beyond the bar exam, such as versions of “diploma privilege.” If state bars are concerned that law schools are increasingly likely to admit students without regard to ability, state bars may decide that the bar exam becomes more important as a point of entry into the profession.

Of course, this isn’t necessarily true. If schools can demonstrate to the ABA, and perhaps to the state bars, that they are admitting (and graduating) students with the ability to practice law, then that could elevate trust. But state bar licensing authorities appear to have long distrusted law schools. We’ll see if these efforts complicate proposals for bar exam reform, or simply prompt closer working relationships between (in-state) law schools and bar licensing authorities.

In short, as long as schools come up with adequate alternatives on the admissions front to address bar passage at the back end, it’s unlikely to be a drastic change. And it might be that efforts in places like Oregon, which are focused both on the law school side and on protecting the consumer-facing public, will assuage any such concerns.

Second, a less obvious barrier is legal employment. That’s a tail-end problem for those unable to pass the bar exam. But there is also an independent interest among, say, large law firms or federal judges in choosing graduates with the highest legal ability. There are proxies for that, law school GPA or journal service among them. But the “prestige” of an institution also turns in part on its selectivity, measured in part by credentials like high LSAT scores. If firms or judges are less confident that schools are admitting the highest-caliber law students, they may begin to look elsewhere. This is a complicated and messy question (alumni loyalty, for instance, runs deep, and memories of institutional quality run long), but it may exert some pressure on law schools to preserve something mostly like the status quo.

Third, when evaluating prospective students for admission, there’s a risk about how to evaluate GPAs. For instance, it’s well known that many humanities majors applying to law school have disproportionately higher GPAs than their LSAT scores suggest, and that hard sciences majors have disproportionately lower GPAs than their LSAT scores suggest. The LSAT helps ferret out grade inflation and avoids collegiate major grading biases. It is not immediately clear that all admissions offices will grasp this point if the focus shifts more substantially to UGPA as the metric for admissions (which is a less accurate predictor of law school success than the LSAT, and less accurate still than the LSAT and UGPA combined).

Fourth, who benefits? At the outset, it’s worth noting that all schools will still indicate a willingness to accept the LSAT, and applicants interested in the broadest swath of schools are still going to take it. Additionally, it’s likely that schools will continue to seek to attract high-quality applicants with merit-based scholarships, and LSAT (or GRE) scores can demonstrate that quality.

One group of beneficiaries is, for lack of a better word, “special admittees.” Many law schools admit a select handful of students for, shall we say, political or donor reasons. These students likely do not come close to the LSAT standards and may now have the benefit of avoiding the test altogether. (Think of the Varsity Blues scandal.)

A second group of beneficiaries is law schools with a large cohort of undergraduates at a parent university that allows for the channeling of students into the law school. Right now, schools are capped in how many students they can admit under such programs without the LSAT, relying only on a UGPA and some ACT or SAT requirement. That cap would now be lifted.

Relatedly, pipeline programs become all the more significant. If law schools can develop relationships with undergraduate institutions or programs that can identify students who will be successful in law school upon completion of the program, it might be that the law school will seek to “lock” these students into the law school admissions pool.

In other words, it could most redound to the benefit of law schools with good relationships with undergraduate institutions, both as a channeling mechanism and as a way of preventing those students from applying to other schools (through a standardized test). We may see a significant shift in programming efforts.

There are some who may contend that racial minorities and those from socio-economically disadvantaged backgrounds will benefit, as they tend to score lower on standardized tests and bear the brunt of the cost of law schools adhering to standardized testing. That may happen, but I’m somewhat skeptical, with a caveat of some optimism. The LSAT is a good predictor of bar exam success (and of course, a great predictor of law school grades, which are a great predictor of bar exam success), so absent significant bar exam changes, there will remain problems if schools drop standardized testing in favor of metrics less likely to predict success. That said, if schools look for better measures in pipeline programs, things that prospective students from underrepresented communities can do that will improve their law school success, then it very well could redound to the benefit of these applicant pools and potentially improve diversification of the legal profession. But that will occur through alternative efforts that are more likely to predict success, efforts which we’re beginning to see but are hardly widespread.

Finally, what about USNWR? Unless many schools change their practices, it seems unlikely that USNWR would drop the LSAT and GRE as a metric. Many schools, as noted, already enroll a cohort that enters without any of the standardized test scores measured in the rankings.

But we can see how the rankings have been adjusted for undergraduate schools:

A change for the 2022 edition -- if the combined percentage of the fall 2020 entering class submitting test scores was less than 50 percent of all new entrants, its combined SAT/ACT percentile distribution value used in the rankings was discounted by 15 percent. In previous editions, the threshold was 75 percent of new entrants. The change was made to reflect the growth of test-optional policies through the 2019 calendar year and the fact that the coronavirus impacted the fall 2020 admission process at many schools.

. . .

. . . U.S. News again ranks 'test blind' schools, for which data on SAT and ACT scores were not available, by assigning them a rankings value equal to the lowest test score in their rankings. These schools differ from ones with test-optional or test-flexible admissions for which SAT and ACT scores were available and were always rank eligible.
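As quoted, the undergraduate methodology applies a flat discount when too few entrants submit scores. Here is a minimal sketch of that stated rule; the threshold and discount figures come from the quote above, while the function and parameter names are my own:

```python
def adjusted_test_value(percentile_value, share_submitting,
                        threshold=0.50, discount=0.15):
    """Apply USNWR's stated rule for the 2022 undergraduate edition:
    if less than 50 percent of the entering class submitted test
    scores, the SAT/ACT percentile distribution value used in the
    rankings is discounted by 15 percent."""
    if share_submitting < threshold:
        return percentile_value * (1 - discount)
    return percentile_value

adjusted_test_value(80.0, 0.40)  # 68.0 after the 15 percent discount
adjusted_test_value(80.0, 0.60)  # 80.0, no discount applied
```

One could imagine a similar discount being grafted onto the law rankings if test-optional cohorts grow, though nothing in the current methodology says so.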

It’s possible, then, that alternative rankings weights would be added to account for schools with growing cohorts that lack standardized test scores. But as long as the LSAT remains a factor and the incentives remain, I imagine most law schools will continue to do everything in their power to maximize the medians for USNWR purposes.

*

In short, it’s quite possible that we’ll see a number of innovative developments from law schools on the horizon if the proposal goes through. That said, I think there are major barriers to dramatic change in the short term, with a concession that changes in other circumstances (including the bar exam, improved undergraduate or pipeline programs, and USNWR) could make this more significant in the future.

But I’d like to suggest two points of data collection that may be useful to examine the change. First, it would be useful if law schools, perhaps only those where more than 10% of the incoming class enters without standardized test scores, disclosed the attrition rates of those who had a standardized test score and those who did not. Second, it would be useful if they disclosed the cumulative and ultimate bar passage rates of each cohort. I think this information would help demonstrate whether schools are maintaining high standards, both in admission and in graduation, regardless of the source of admission. Then again, law schools already disclose an extraordinary amount of information, and perhaps these figures will just be quietly disclosed to the ABA during reaccreditation rather than in some public-facing capacity.