NCBE has data to prove Class of 2014 was worst in a decade, and it's likely going to get worse

I have blogged extensively about the decline in bar pass rates around the country after the July 2014 test. My original take was more inquisitive, and I later discounted the impact that ExamSoft may have had. After examining the incoming LSAT scores for the Class of 2014, I concluded that it was increasingly likely that the NCBE had some role, positing elsewhere that perhaps there was a flaw in equating the test with previous administrations.

The NCBE has come back with rather forceful data to show that it wasn't the MBE (and that my most recent speculation was, probably, incorrect)--it was, in all likelihood, the graduates who took the test.

In a December publication (PDF), the NCBE described several quality-control measures that confirmed it was the test-takers, and not the test. First, on re-takers v. first-time test-takers:

Among the things I learned was that whereas the scores of those we know to be retaking the MBE dropped by 1.7 points, the score drop for those we believe to be first-time takers dropped by 2.7 points. (19% of July 2014 test takers were repeaters, and 65% were believed to be first-time takers. The remaining 16% could not be tracked because they tested in jurisdictions that collect inadequate data on the MBE answer sheets.) The decline for retakers was not atypical; however, the decline for first-time takers was without precedent during the previous 10 years.

I had suggested, based on earlier data from a few states, that re-takers and first-time test-takers performed similarly; but with the NCBE's much broader dataset, and using the more precise measure of MBE performance, first-time test-takers fared much worse.

Second, on equating the test:

Also telling is the fact that performance by all July 2014 takers on the equating items drawn from previous July test administrations was 1.63 percentage points lower than performance associated with the previous use of those items, as against a 0.57 percentage point increase in July 2013.

Because equating is probably the biggest potential flaw on the NCBE's end, it's extremely telling that performance on items reused from previous administrations declined so significantly, and in such sharp contrast with the July 2013 test.
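
To make that check concrete, here is a minimal sketch of the comparison the NCBE describes, using made-up numbers: for each repeated ("equating") item, percent correct in July 2014 is compared against percent correct when the item last appeared.

```python
# A minimal sketch of the NCBE's equating-item check, with made-up numbers.
# For each repeated ("equating") item, compare percent correct in July 2014
# against percent correct when the item was last used on a July exam.

# (item_id, pct_correct_when_previously_used, pct_correct_july_2014)
equating_items = [
    ("Q01", 72.4, 70.1),
    ("Q02", 65.0, 63.8),
    ("Q03", 81.3, 80.2),
    # ...in practice, many more anchor items
]

diffs = [current - previous for _, previous, current in equating_items]
mean_diff = sum(diffs) / len(diffs)

print(f"Mean change on equating items: {mean_diff:+.2f} percentage points")
# A negative mean (the NCBE reports -1.63 points for July 2014, against a
# +0.57 change for July 2013) points to the cohort, not the new items.
```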

Third, and, in my view, one of the most telling elements, the MPRE presaged this outcome:

The decline in MPRE performance supports what we saw in the July 2014 MBE numbers. In 2012, 66,499 candidates generated a mean score of 97.57 (on a 50–150 scale). In 2013, 62,674 candidates generated a mean score of 95.65. In 2014, a total of 60,546 candidates generated a mean score of 93.57. Because many MPRE test takers are still enrolled in law school when they test, these scores can be seen as presaging MBE performance in 2014 and 2015.

A steady decline in MPRE scores, then, foretold this problem. This further undermines any notion that ExamSoft or other test-specific factors impacted the outcome; the writing was on the wall years ago. But as few schools carefully track MPRE performance, it might not have been an obvious sign until after the fact.

The NCBE bulletin then points out additional factors that distort student quality: a decrease in quality at the 25th percentile of admitted students at many institutions (i.e., those at the highest risk of failing the bar), the impact of highest-LSAT score reporting rather than average-LSAT score reporting for matriculants (a change embraced by both the ABA and LSAC despite evidence that taking the highest score overstates student quality), and an increase in transfer students to higher-ranked institutions (which distorts the incoming student quality metrics at many institutions). Earlier, I blogged that a decline in LSAT scores likely could not explain all of the decline--it could explain part, but there are, perhaps, other factors at play.

The NCBE goes on to identify other possible factors, ones that may merit further investigation in the legal academy:

  • An increase in "experiential learning," including an increase in pass-fail course offerings, which often means students take fewer graded, more rigorous, "black-letter" courses;
  • A decline in credit hours required for graduation and a decline in required (i.e., often more rigorous) courses;
  • An increased reliance on bar-prep companies rather than semester-long coursework to prepare for the bar;
  • A lack of academic support for at-risk students as the 25th percentile LSAT scores of matriculants worsen at many institutions.

So, after I waffled, first blaming some decrease in student quality and then increasingly considering the NCBE as a culprit, this data moves me back to placing essentially all of the focus on student quality and law school decisionmaking. Law schools--through admissions decisions, curriculum decisions, academic support decisions, transfer decisions, reactions to non-empirical calls from the ABA or other advocacy groups, or some combination of these factors--are primarily in control of their students' bar pass rates, not some remarkable decision of the NCBE. How schools respond will be another matter.

Further, the NCBE report goes on to chart the decline in the 25th percentile LSAT scores at many institutions. The declines in many places are steep. They portend some dramatic results--the decline in bar pass rates this year is only the beginning of probably still-steep declines in the next couple of years, absent aggressive decisions within the present control of law schools. (The admissions decisions, after all, are baked in for the current three classes.)

Coupled with the decline of prospective law students, law schools are now getting squeezed at both ends--their prospective student quality is increasingly limited, and their graduates are going to find it still harder to pass the bar. And we'll see how they respond to this piece of news from the NCBE--I, for one, find the data quite persuasive.

Increasingly appears NCBE may have had role in declining MBE scores and bar pass rates

Despite protests from the National Conference of Bar Examiners to the contrary (PDF), it increasingly appears that the NCBE had some role in the decline of Multistate Bar Exam scores and, accordingly, the decline in bar passage rates around the country.

Causation is hard to establish from my end--I only see the data out there and can make guesses. But with California's bar results released, we now have 34 of 51 jurisdictions (excluding American territories but including the District of Columbia) that have released their overall bar pass rates. Comparing them to 2013, we see that 20 of them experienced at least a 5-point drop in pass rates. Louisiana is the only state that does not use the MBE, and it's quite the outlier this time around.

A single state, of course, cannot establish that the MBE is to blame. But it's a data point of note.

Some have blamed ExamSoft. On that, I remain skeptical. First, it would assume that the exam-takers on Tuesday were "stressed out" and sleepless as a result of the upload fiasco, causing them to perform poorly on Wednesday's MBE. Perhaps I'm too callous to think it's much of an excuse--it might be for some, but I doubt it would have a dramatic effect on so many. One problem is that reporting of the actual problems with ExamSoft has been spotty--no journalists have done the legwork of investigating which states had the problems, or to what extent.

But we have a couple of data points we can now use. First, jurisdictions that do not use ExamSoft, but use some other exam software like Exam4 or require handwriting. Second, the jurisdictions whose essay components occurred on Thursday, not Tuesday--meaning there was no ExamSoft debacle the night before the MBE.

Again, there does not appear to be a significant trend in any of these jurisdictions--they appear to be randomly distributed among the varying scores. While it might be a cause for some, I am not convinced it's a meaningful cause.

Finally, the NCBE has alleged that the class of 2014 was "less able." That's true, as I've pointed out, but only to a point--the decline in scores should not have been as sharp as it was. One small way of testing this point is to examine repeater data.

A problem with measuring repeater data right now is that few jurisdictions have disclosed it. Further, most bar exams are quite easy, and repeaters are few. Finally, if the test is valid, repeaters should fail the bar at extremely high rates, which skews the overall figures. But it might be useful to extract the data and compare first-time and repeater pass rates this cycle, at least in jurisdictions with a significant number of repeaters. If the Class of 2014 was "less able," then we might expect the first-time takers' pass rates to decline at a higher rate than the repeat takers' pass rates.

Places like California saw identical declines. Others, like Texas and Pennsylvania, actually saw a slightly larger increase in the failure rate among repeaters than among first-time takers. Ohio is on the other side, with a declining pass rate for first-time takers but a decent increase in the rate for repeat takers.
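
For anyone who wants to run the same comparison as more jurisdictions disclose data, a minimal sketch follows; the jurisdictions and figures are hypothetical placeholders, not actual state data.

```python
# Compare year-over-year changes in pass rates for first-time takers versus
# repeaters. All jurisdiction names and figures below are hypothetical.

rates = {
    # jurisdiction: (first-time 2013, first-time 2014, repeater 2013, repeater 2014)
    "State A": (78.0, 71.0, 30.0, 23.0),
    "State B": (85.0, 80.0, 35.0, 36.0),
}

for state, (f13, f14, r13, r14) in rates.items():
    first_change = f14 - f13
    repeat_change = r14 - r13
    verdict = ("first-timers fell faster" if first_change < repeat_change
               else "repeaters fell as fast or faster")
    print(f"{state}: first-time {first_change:+.1f}, repeaters {repeat_change:+.1f} ({verdict})")
```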

In short, I haven't been able to find an explanation that would identify the cause of the sharp decline in rates. Some, I think, is explained by a slightly lower-quality incoming class--one I've noted will lead to still sharper declines in the years ahead.

But after looking at all this information, I'm increasingly convinced that some decision in the NCBE's scoring of the MBE had some role in the decline of the scores, and of the pass rates around the country. That's speculation at this point--but it's a point, I think, worth investigating further, assuming additional data would be made available.

Previous posts on this subject

A more difficult bar exam, or a sign of declining student quality? (October 2, 2014)

Bar exam scores dip to their lowest level in 10 years (October 14, 2014)

Bar exam posts single-largest drop in scores in history (October 27, 2014)

Did ExamSoft cause the bar passage rate decline? (October 27, 2014)

National Conference of Bar Examiners: Class of 2014 "was less able" than Class of 2013 (October 28, 2014)

Class of 2014 LSAT scores did not portend sharp drop in MBE scores (November 11, 2014)

The bleak short-term future for law school bar passage rates (November 17, 2014)

The bleak short-term future for law school bar passage rates

This is the last post (for now!) about the bar exam. And it's not about what caused the MBE and bar passage rate declines--it's about what those declines mean for law schools going forward. The news is grim.

There's no question there was a decline in the law school applicant profile from the Class of 2013 to the Class of 2014. The dispute that Jerry Organ and I (and others) have had is whether the decline in bar passage rates should have been as stark. But going forward, the Classes of 2015, 2016, likely 2017, and probably 2018 will each have incrementally worse profiles still.

And it's not simply at the median LSAT and GPA. It's the below-median profiles, particularly the LSAT, that should concern schools.

Those with LSAT scores below 150, or even below 155, are at a substantially higher risk of failing the bar in most jurisdictions. For the Class of 2016, about 2/3 of schools have a 25th percentile LSAT of 155 or lower--that is, 25% of their incoming classes have an LSAT at or below 155. And over 80 schools have a 25th percentile LSAT of 150 or lower.

Furthermore, about half of schools have a 50th percentile LSAT of 155 or lower, and a full 30 schools have a 50th percentile of 150 or lower.
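
As a rough illustration of how those counts are tallied, here is a sketch; the school profiles below are hypothetical placeholders, not actual ABA data.

```python
# Count schools whose 25th and 50th percentile LSATs fall at or below given
# thresholds. The profiles below are hypothetical placeholders, not actual
# ABA Standard 509 data.

school_profiles = [
    {"school": "School A", "lsat_25": 148, "lsat_50": 152},
    {"school": "School B", "lsat_25": 155, "lsat_50": 159},
    {"school": "School C", "lsat_25": 160, "lsat_50": 164},
    {"school": "School D", "lsat_25": 144, "lsat_50": 149},
]

def count_at_or_below(profiles, key, threshold):
    """Number of schools whose percentile LSAT is at or below the threshold."""
    return sum(1 for p in profiles if p[key] <= threshold)

n = len(school_profiles)
print("25th percentile <= 155:", count_at_or_below(school_profiles, "lsat_25", 155), "of", n)
print("25th percentile <= 150:", count_at_or_below(school_profiles, "lsat_25", 150), "of", n)
print("50th percentile <= 155:", count_at_or_below(school_profiles, "lsat_50", 155), "of", n)
```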

The increasing willingness of schools to accept these low-LSAT performers is a function of a combination of decisions made years ago. U.S. News & World Report evaluates only LSAT medians. This decision distorts evaluation of law student quality. To ensure that their medians remained as strong as possible, schools increasingly admitted more "imbalanced" students--students with a median or better LSAT and a substantially below-median GPA, or a median or better GPA and a substantially below-median LSAT. That meant the 25th percentile LSAT began to sag at more schools--the bottom of the class became worse at a faster rate than the middle of the class. (There are other, less-measurable factors as well, such as reporting the highest LSAT score instead of the average LSAT score of matriculants, which probably overstates student quality; possible decisions to academically dismiss fewer students; educational programming decisions that may channel more students toward courses that do not sharpen legal analysis for the bar exam, to the extent that affects bar passage; transfer-related decisions at schools; and much more.)

As bar passage rates decline--perhaps sharply--we should see still-falling rates in the years ahead, particularly at institutions that made the admissions decision years ago to prop up the median while sacrificing the quality of the bottom of the class.

For schools that made this decision years ago, the results will become increasingly sharp in the years ahead. If a school did not sufficiently reduce its class size, or worried only about LSAT medians, it favored short-term interests. Those short-term interests become long-term problems as those classes graduate, likely carry more significant debt (below-median students are less likely to have obtained merit-based aid), pass the bar at lower rates, find employment at lower rates (if they are unable to pass the bar), and trickle back out to an already-reluctant applicant pool.

I've said before that I'm not a "doomsday" predictor. But these bar results portend a significantly worsening portrait for law school bar passage rates in the years ahead, if schools made short-term decisions years ago and are now facing the long-term results. For schools with visionary deans and faculties that anticipated the long-term future of the institution, the results may not be quite so grim. (But we shall see how many of those there are.)

Class of 2014 LSAT scores did not portend sharp drop in MBE scores

UPDATE: Jerry Organ (University of St. Thomas) has posted an even more thorough and thoughtful analysis of the LSAT scores and projected bar passage rates at the Legal Whiteboard. He, too, finds the NCBE's conclusion difficult to accept.

I've blogged (here and here and here and here and here) about the sharp drop in bar passage rates around the country from the July 2014 administration of the bar exam, largely due to the unprecedented drop in MBE scores. A recent Wall Street Journal blog post about the reaction of the dean of Brooklyn Law School shows the sides in the fight. Did the NCBE screw up its exam, yielding a sharp drop in scores? Or did law schools admit a disproportionately unqualified class?

Here's an attempt to measure the quality of the class and correlate it with MBE scores. (Maybe it's just awful math.)

The LSAT is fairly highly correlated with MBE scores. Consider this NCBE report (PDF). I extrapolated those figures for the LSAT and the average MBE scores. I then weighted them against the number of matriculants in law school: LSAC reports the number of matriculants with scores of 175+, 170-174, and so on. I took a rough estimate of the expected MBE score for each range; I then averaged it out for the entire class.

When I first charted it, the projected MBE scores were much higher than the actual MBE scores that arose three years later. (I used the 2009-2010 LSAT matriculant data, for instance, and mapped it onto the MBE results three years later, in 2013.) I attributed this to several possibilities, the most significant of which is that repeaters probably significantly drag down the MBE score. But subtracting five points from the projected MBE score led to an almost perfect match with the actual MBE score, with one exception.*
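
Here is a minimal sketch of that projection; the expected MBE value for each LSAT band and the matriculant counts below are placeholders for illustration, not the NCBE's or LSAC's actual figures.

```python
# A minimal sketch of the projection: weight a rough expected MBE score for
# each LSAT band by the number of matriculants in that band, then average
# across the class. All values below are placeholders for illustration, not
# the NCBE's or LSAC's actual figures.

expected_mbe_by_band = {   # rough expected mean scaled MBE per LSAT band
    "175+":    155.0,
    "170-174": 152.0,
    "165-169": 149.0,
    "160-164": 146.0,
    "155-159": 142.0,
    "150-154": 138.0,
    "<150":    132.0,
}

matriculants_by_band = {   # matriculant counts per band (placeholder)
    "175+":      900,
    "170-174":  2800,
    "165-169":  6500,
    "160-164": 10500,
    "155-159":  9500,
    "150-154":  8000,
    "<150":     7500,
}

total = sum(matriculants_by_band.values())
projected = sum(expected_mbe_by_band[band] * count
                for band, count in matriculants_by_band.items()) / total

# The matriculant year maps onto the bar exam three years later (e.g.,
# 2009-2010 matriculants -> July 2013 MBE), and a constant five-point
# subtraction roughly accounts for repeaters and other drag on the mean.
adjusted = projected - 5.0
print(f"Projected mean MBE: {projected:.1f}; adjusted projection: {adjusted:.1f}")
```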

Note that the LSAT score reporting changed beginning in the 2009-2010 cycle (i.e., the Class of 2013): schools could report the highest LSAT scores, rather than the average LSAT scores, of matriculants. That meant that the LSAT scores were probably overstated in the last two graduating classes.

But in the chart, we see a fairly significant correlation between my extremely rough approximation of a projected MBE score based on the LSAT scores of the matriculating classes and the actual MBE scores, with one exception: this cycle.

My math is rough--and maybe it's just bad. But as this comports with every other analysis I've done, and as I've not been able to find any other factors that would contribute to an across-the-board decline in scores, I'm increasingly convinced that a problem occurred on the NCBE's end--and not that the Class of 2014 was somehow disproportionately and dramatically worse than other classes.

That said, we should expect to see declining MBE scores (and bar passage rates) of some kind in the next few years, as academic quality of entering classes continues to decline; and, we should expect bar passage-required employment outcomes to see some (likely negative) effect due to a sharp drop-off in bar passage rates.

*I should add that I could have simply plotted the projected results so that you could observe the similarity (or differences) in the rise and fall; or, in the alternative, I could have plotted them on two different Y axes. Subtracting five points, however, seemed like the easiest way to make the visualization more obvious.

National Conference of Bar Examiners: Class of 2014 "was less able" than Class of 2013

Continuing a series about the decline in bar passage rates, the National Conference of Bar Examiners recently wrote a letter to law school deans that explained its theory of the reason for a 10-year low in Multistate Bar Exam scores and the single-biggest drop in MBE history. I've excerpted the relevant paragraphs below.

In the wake of the release of MBE scores from the July 2014 test administration, I also want to take this opportunity to let you know that the drop in scores that we saw this past July has been a matter of concern to us, as no doubt it has been to many of you. While we always take quality control of MBE scoring very seriously, we redoubled our efforts to satisfy ourselves that no error occurred in scoring the examination or in equating the test with its predecessors. The results are correct.
Beyond checking and rechecking our equating, we have looked at other indicators to challenge the results. All point to the fact that the group that sat in July 2014 was less able than the group that sat in July 2013. In July 2013 we marked the highest number of MBE test-takers. This year the number of MBE test-takers fell by five percent. This was not unanticipated: figures from the American Bar Association indicate that first-year enrollment fell 7% between Fall 2010 (the 2013 graduating class) and Fall 2011 (the 2014 class). We have been expecting a dip in bar examination numbers as declining law school applications and enrollments worked their way to the law school graduation stage, but the question of performance of the 2014 graduates was of course unknown.
Some have questioned whether adoption of the Uniform Bar Examination has been a factor in slumping pass rates. It has not. In most UBE jurisdictions (there are currently 14), the same test components are being used and the components are being combined as they were before the UBE was adopted. As noted above, it is the MBE, with scores equated across time, that reveals a decline in performance of the cohort that took July 2014 bar examinations.
In closing, I can assure you that had we discovered an error in MBE scoring, we would have acknowledged it and corrected it.

Well, that doesn't make any sense.

First, whether a class is "less able" is a matter of LSAT and UGPA scores. It is not a matter of the size of the class.

Second, to the extent that a brief window into the LSAT scores for the entering classes in the Fall of 2010 and 2011 is a useful metric, Jerry Organ has noted elsewhere that the dip in scores was fairly modest in that first year after the peak application cycle. It certainly gets dramatically worse later, but nothing suggests that admissions in that one-year window fell off a cliff. (More data on the class quality is, I hope, forthcoming.)

Third, to the extent that the size of the class matters, it does not adequately explain the drop-off. Below is a chart of the mean scaled MBE scores, and an overlay of the entering 1L class size (shifted three years, so that the 1L entering class corresponds with the expected year of graduation).
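
As a rough sketch of how such an overlay can be built (the score and enrollment series below are placeholder figures, not the actual NCBE and ABA numbers), one approach is a dual-axis plot with the 1L enrollment series shifted forward three years:

```python
# A rough sketch of the overlay: mean scaled July MBE scores on one axis,
# entering 1L class size shifted forward three years on a second axis.
# All series values below are approximate placeholders, not the actual
# NCBE or ABA figures.
import matplotlib.pyplot as plt

mbe_years = [2010, 2011, 2012, 2013, 2014]
mean_mbe = [143.6, 143.8, 143.4, 144.3, 141.5]               # placeholder scores

entering_years = [2007, 2008, 2009, 2010, 2011]
first_year_enrollment = [49000, 49400, 51600, 52500, 48700]  # placeholder counts
graduation_years = [year + 3 for year in entering_years]     # align with bar year

fig, ax1 = plt.subplots()
ax1.plot(mbe_years, mean_mbe, marker="o", color="tab:blue",
         label="Mean scaled MBE (July)")
ax1.set_xlabel("July bar exam year")
ax1.set_ylabel("Mean scaled MBE")

ax2 = ax1.twinx()                                            # second y-axis
ax2.plot(graduation_years, first_year_enrollment, marker="s", color="tab:red",
         label="1L enrollment (shifted +3 years)")
ax2.set_ylabel("Entering 1L class size")

fig.legend(loc="upper left")
fig.tight_layout()
plt.show()
```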

If there's supposed to be a drop-off in scores because of a drop-off in enrollment (and there are, indeed, some meaningful correlations between the two), it doesn't explain the severity of the drop in this case.

This explanation, then, isn't really an explanation, save an ipse dixit statement that the NCBE has "redoubled" its efforts and an assurance that "[t]he results are correct." An explanation is yet to be found.

Did ExamSoft cause the bar passage rate decline?

I’ve blogged about the sharp decline in the MBE scores and the corresponding drop in bar passage rates in a number of jurisdictions around the United States. I’m still struggling to find an explanation.

One theory is that the ExamSoft fiasco affected the MBE scores. Most states have two days of exams: a day of essays followed by a day of multiple choice on the MBE. The software most states use for the essay portion had problems in July 2014--test-takers were unable to upload their essay answers in a timely fashion. As a result, students slept less and stressed more the night before the MBE, which may have yielded lower scores on the MBE.

We can test this in one small way: several states do not use ExamSoft. Arizona, Kentucky, Maine, Nebraska, Virginia, and Wisconsin all use Exam4 software; the District of Columbia does not permit the use of computers. If ExamSoft yielded lower scores, then we might expect bar passage rates to remain unaffected in places that didn’t use it.

But it doesn’t appear that the non-ExamSoft jurisdictions did any better. Here are the disclosed changes in bar passage rates from July 2013 in jurisdictions that did not use ExamSoft:

Arizona (-7 points)

District of Columbia (-8 points)

Kentucky (unchanged)

Virginia (-7 points)

These states have already disclosed their statewide passage rates, and their declines do not appear to be materially better than those elsewhere around the country.

It might still be a factor in the jurisdictions that use ExamSoft in conjunction with other variables. But it doesn’t appear to be the single, magic explanation for the decline. There are likely other, yet-unexplained variables out there.

 (I’m grateful to Jerry Organ for his comments on this theory.)

Bar exam posts single-largest drop in scores in history

The Multistate Bar Exam, a series of multiple-choice questions administered across several jurisdictions, has existed since 1972. The NCBE has statistics disclosing the history of scaled MBE scores since 1976.

After tracking the decline in bar scores across jurisdictions this year, I noted that the MBE had reached a 10-year low in scores. It turns out that's only part of the story.

The 2.8-point drop in scores is the single largest drop in the history of the MBE.

The next-largest drop was in 1984, which saw a 2.3-point drop from the July 1983 test. The biggest increase was in 1994, which saw a 2.4-point increase over the July 1993 test. And the only other fluctuation exceeding two points was a 2.2-point increase in 1989 over the July 1988 test.

What might be behind this change? I've speculated about a few things earlier; I'll address some theories in later posts this week.

Bar exam scores dip to their lowest level in 10 years

Earlier, I noted that there had been a drop in bar passage rates in a handful of jurisdictions. (Follow that post to track state-by-state changes in the pass rates as the statistics come in.) A commenter theorized:

It's quite simple actually: the NCBE did a poor job of normalizing the MBE this year. The median MBE score is down a couple of points, and because states scale their essays to match the MBE results in their state, it also means median essay scores have decreased a small amount. Combine the two scores and you are seeing (in states using a 50/50 system), a 4-5 point drop in scores.

It's actually quite damning to the NCBE, because bar passage rates should be up and median MBEs also up if the historical correlation between LSAT and bar passage is taken into account.
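
To make the commenter's mechanism concrete: many jurisdictions scale raw essay scores to the distribution of their examinees' scaled MBE scores and then combine the two components. The sketch below assumes a simple mean-and-standard-deviation match and a 50/50 weighting; the numbers are illustrative, not any particular state's formula.

```python
# Sketch of the commenter's mechanism: rescale raw essay scores so they share
# the mean and standard deviation of the jurisdiction's scaled MBE scores,
# then combine the two components 50/50. Illustrative only; actual state
# scaling formulas vary.
import statistics

raw_essay = [62.0, 58.5, 71.0, 55.0, 66.5]        # hypothetical raw essay scores
scaled_mbe = [141.0, 136.5, 152.0, 131.0, 146.0]  # hypothetical scaled MBE scores

essay_mean, essay_sd = statistics.mean(raw_essay), statistics.stdev(raw_essay)
mbe_mean, mbe_sd = statistics.mean(scaled_mbe), statistics.stdev(scaled_mbe)

scaled_essay = [(x - essay_mean) / essay_sd * mbe_sd + mbe_mean for x in raw_essay]
combined = [0.5 * essay + 0.5 * mbe for essay, mbe in zip(scaled_essay, scaled_mbe)]

# Because the essay scores inherit the MBE distribution, a lower MBE mean
# drags down the combined score roughly point-for-point in a 50/50 state.
print([round(score, 1) for score in combined])
```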

Tennessee recently disclosed that the national mean scaled MBE score for July 2014 was 141.47. That's the lowest mean scaled MBE score for July since 2004, when the mean scaled MBE score was 141.2 (PDF). It's also almost three points lower than the July 2013 score.

There are innocuous reasons why the score might have dropped. It might be that there was a disproportionately high number of repeat test-takers. It might be that an increase in test-takers with non-American law degrees yielded a drop. Or there might be other reasons, too.

But for whatever reasons, the decline in MBE scores is almost assuredly the reason that bar passage rates have dropped in a number of jurisdictions. Whether similar declines are going to arise in places like New York and California in the weeks ahead is simply a matter of waiting.

A more difficult bar exam, or a sign of declining student quality?

I saw this thread at Top-Law-Schools about bar passage rates apparently somewhat lower than in previous years. Thanks to the link aggregation of bar statistics at Deceptively Blonde, I could start comparing results to the NCBEX annual statistics (PDF). Unfortunately, due to the selective release of statistics, it's not possible at this point to get a sufficiently granular analysis of bar passage. (For instance, most bars only report total pass rates, which include all takers, including repeaters and those from non-ABA accredited schools.) But we can start with a little anecdata until the full NCBEX data is released next spring.

These figures compare overall bar takers in July. Numbers are rounded to maintain consistency with NCBEX data.

Alabama, -6 points (July 2013: 71%; July 2014: 65%)

Florida, +1 point (July 2013: 71%; July 2014: 72%)

Idaho, -15 points (July 2013: 80%; July 2014: 65%)

Indiana, -8 points (July 2013: 76%; July 2014: 68%)

Iowa, -11 points (July 2013: 92%; July 2014: 81%)

New Mexico, +3 points (July 2013: 81%; July 2014: 84%)

North Carolina, -1 point (July 2013: 63%; July 2014: 62%)

Oklahoma, -3 points (July 2013: 82%; July 2014: 79%)

Oregon, -10 points (July 2013: 75%; July 2014: 65%)

Vermont, -6 points (July 2013: 72%; July 2014: 66%)

Washington, -8 points (July 2013: 85%; July 2014: 77%)

Of the eleven states that have disclosed overall bar passage rates, seven have passage rates that dropped at least five points, and three have passage rates that dropped at least ten points.

Why?

Have state bars begun increasing the difficulty of their exams? That seems unlikely, because it's usually a big deal, and a public deal, for a state to adjust an exam. The fact that this is happening in several places also makes it unlikely.

Has student quality declined? The graduating class of 2014 was admitted in 2011, at a time of a very large applicant pool and some of the highest standards for most schools--while we might see a decline in passage rates in the next couple of years as schools sacrifice LSAT medians, GPA medians, and, perhaps most importantly, index scores (as I blogged about here), it doesn't explain why there's a drop for this graduating class. That said, applications in 2011 were down slightly from the 2010 peak. (If anything, it may portend an even more dire situation as declining student quality at institutions makes its way to graduation.)

Is it simply a brief anomaly from a few states? It might be. Looking at 2012 results (PDF), North Carolina had a 72% passage rate in July 2012; Washington had a 64% passage rate. So perhaps some significant oscillation in a few jurisdictions is not unprecedented.

At this stage, it's a small data point to keep an eye on as the bar results come in. Additionally, if bar passage rates decline overall, we might see another wave of consequences: fewer students passing state bars in July means lower employment outcomes for students in bar passage-required positions that must be reported the following February. Schools that slashed admissions standards three years ago might be seeing the consequences if higher numbers of their graduates fail the bar.


Update: Here are a few additional results. This will occasionally be updated. For a chart identifying a sharp decline in MBE scores, please see this post.

Alaska, -3 points (July 2013: 68%; July 2014: 65%)

Arizona, -8 points (July 2013: 76%; July 2014: 68%)

California, -7 points (July 2013: 56%; July 2014: 49%)

Colorado, -4 points (July 2013: 79%; July 2014: 75%)

Connecticut, +3 points (July 2013: 77%; July 2014: 77%)

Delaware, -9 points (July 2013: 72%; July 2014: 63%)

District of Columbia, -8 points (July 2013: 47%; July 2014: 39%)

Georgia, -6 points (July 2013: 80%; July 2014: 74%)

Kentucky, unchanged (July 2013: 76%; July 2014: 76%)

Louisiana, +17 points (July 2013: 53%; July 2014: 70%)*

Massachusetts, -6 points (July 2013: 82%; July 2014: 76%)

Michigan,  +1 point (July 2013: 62%; July 2014: 63%)

Minnesota, -9 points (July 2013: 88%; July 2014: 79%)

Missouri, -4 points (July 2013: 89%; July 2014: 85%)

Nevada, -9 points (July 2013: 66%; July 2014: 57%)

New Jersey, -4 points (July 2013: 79%; July 2014: 75%)

New York, -4 points (July 2013: 69%; July 2014: 65%)

Ohio, -5 points (July 2013: 82%; July 2014: 77%)

Pennsylvania, -1 point (July 2013: 77%; July 2014: 76%)

South Carolina, -6 points (July 2013: 77%; July 2014: 71%)

Tennessee, -12 points (July 2013: 78%; July 2014: 66%)

Texas, -11 points (July 2013: 82%; July 2014: 71%)

Virginia, -7 points (July 2013: 75%; July 2014: 68%)

Running totals for change in passage rate (for 34 jurisdictions)

Drop of at least ten points: 5

Drop of five to nine points: 15

Essentially unchanged (drop of four points to increase of four points): 13

Increase of five or more points: 1*

*Louisiana is the only state that does not use the MBE.
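
The running totals above are a simple bucketing exercise; here is a minimal sketch of the tally, using a subset of the figures disclosed above.

```python
# Tally jurisdictions into the buckets used in the running totals above.
# A subset of the disclosed July 2013 -> July 2014 changes is shown here.

changes = {
    "Idaho": -15, "Tennessee": -12, "Texas": -11, "Iowa": -11, "Oregon": -10,
    "Delaware": -9, "Minnesota": -9, "Nevada": -9, "Washington": -8,
    "California": -7, "Virginia": -7, "Kentucky": 0, "Michigan": 1,
    "New Mexico": 3, "Louisiana": 17,  # Louisiana does not use the MBE
}

buckets = {
    "Drop of at least ten points": 0,
    "Drop of five to nine points": 0,
    "Essentially unchanged": 0,
    "Increase of five or more points": 0,
}

for change in changes.values():
    if change <= -10:
        buckets["Drop of at least ten points"] += 1
    elif change <= -5:
        buckets["Drop of five to nine points"] += 1
    elif change < 5:
        buckets["Essentially unchanged"] += 1
    else:
        buckets["Increase of five or more points"] += 1

for bucket, count in buckets.items():
    print(f"{bucket}: {count}")
```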