Why today’s law students are not less qualified

After setting a record for first-year enrollment in 2010, law schools have seen a significant decline in interest. Application volume is down by about a third; first-year enrollment is down by about a quarter. The aggregate numbers, while stunning, understate how dire the trends are at many schools. Of the 196 law schools in the contiguous 48 states and Hawaii that enrolled a class in fall 2010, all but three received fewer applications in 2013 than they did in 2010. The declines ranged from a modest 3 percent to a nightmarish 65 percent. The median decline was 38 percent.

This sudden and widespread constricting of the applicant pool has understandably prompted concerns about the quality of current entering cohorts. Two questions seem to predominate:

• Are today’s entering law students weaker than students from previous cohorts?

• Are desperate schools enrolling unqualified students in order to fill seats?  

From my observations, the assumed answers to both questions tend to range from “probably” to “certainly.” But are these assumptions correct? Are law schools preying on witless students as a survival strategy? An in-depth review of the data suggests that the assumptions are mostly, if not completely, unfounded. The likely culprit behind these erroneous perceptions is a fundamental misunderstanding of how to interpret Law School Admission Test scores.

Discussions of law student quality usually revolve around numerical indicators, such as LSAT scores and undergraduate GPAs. Therefore, a conclusion that one cohort of law students is weaker than another usually means the former has a lower LSAT/UGPA profile than the latter. Such beliefs often persist even when statistically unsupported. While an LSAT score difference may mean nothing statistically, it often carries much practical (and misplaced) significance.

So what does the LSAT/UGPA data say about how the fall 2013 entering cohort compares to the fall 2010 cohort? The median LSAT/UGPA profile for the fall 2010 cohort at the 196 law schools studied was 157/3.41; in fall 2013, it was 155/3.38.

So, in 2013, the typical first-year law student had an LSAT score that was two points lower and a UGPA that was three one-hundredths lower than in 2010. The UGPA difference is so insignificant that it is not even worth discussing. But what about the LSAT score difference? Is a difference of two points statistically significant? The very short answer to that question is no.

Understanding the insignificance of small differences in LSAT scores requires one to comprehend the limitations of the test. Perceptions of standardized tests too often take mythical forms. We believe they can see the unseen (the contents of a person’s brain) and with that insight predict the future (a person’s chances of success). And while standardized tests can be useful factors in the admission process, there is nothing mythical about them. Their predictive value is limited by the same variability of life to which we are all subject.  

The LSAT is intended to predict first-year law school performance. A perfect predictor (which probably does not exist) would have a correlation coefficient of 1.0. LSAT coefficients range from 0.12 to 0.56, depending on the school, with a median of 0.36. The LSAT’s imprecise predictive value relates not only to the limited range of skills and abilities the test measures, but also to the inexact nature of the score it produces.

Probably the least acknowledged piece of information in an applicant’s LSAT score report (except maybe the writing sample) is the applicant’s score band. The score band is a range of scores that serves as an estimate of the applicant’s “true” score. This estimate is necessary because every standardized test has what is called a standard error of measurement, or SEM. The SEM is essentially an estimate of the typical difference between a test taker’s observed score and her true score. No test can guarantee that an observed score is a true score, but SEM estimates allow test makers to offer a range within which the true score is likely to fall. The width of the score band is determined by the test’s SEM; its placement is centered on the applicant’s observed score.

The LSAT has an SEM of 2.6 points, so score bands extend from three points below to three points above an applicant’s observed score (one SEM, rounded to the nearest whole score point). So if an applicant has an observed score of 155, his score band would be 152-158. (The process for calculating score bands is different for the relatively few test takers with extreme observed scores.) Complicating the matter even more, a band of plus or minus one SEM captures the true score only about 68 percent of the time, meaning there is almost a 1-in-3 chance that an applicant’s true score falls outside the band, whether above or below it. All of this is instructive, given both the overreliance on the LSAT and the hypersensitivity to small score differences.
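To make the band arithmetic concrete, here is a minimal sketch, assuming the figures above: an SEM of 2.6 rounded to a half-width of three points, with simple clipping at the ends of the 120-180 scale. The function name is mine, and the clipping is a simplification; as noted, LSAC handles extreme scores differently.

```python
LSAT_SEM = 2.6           # standard error of measurement, in scaled-score points
BAND_HALF_WIDTH = 3      # one SEM, rounded to a whole score point
SCORE_MIN, SCORE_MAX = 120, 180  # LSAT scaled-score range

def score_band(observed: int) -> tuple[int, int]:
    """Return the ~68 percent score band around an observed LSAT score.
    Clipping at the scale's ends is a simplification of LSAC's actual
    treatment of extreme scores."""
    low = max(SCORE_MIN, observed - BAND_HALF_WIDTH)
    high = min(SCORE_MAX, observed + BAND_HALF_WIDTH)
    return low, high

print(score_band(155))  # (152, 158), matching the example above
```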

So, with an LSAT score that is two points lower, can it rightly be said that the typical 2013 entering student is weaker than her counterpart in the 2010 cohort? Again, the answer is no. A two-point difference in scores is statistically insignificant; the two students’ score bands overlap almost entirely. It is statistically possible that either student’s true score is higher or lower than the other’s; it is also possible that both students’ true scores are the same. Put simply, it is impossible to distinguish the scores with statistical certainty. Even when quartiles and means are assessed, the differences between the cohorts are statistically indistinguishable (even though each data point is lower for the 2013 cohort).
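As an illustration of why the two medians are indistinguishable, the comparison reduces to a single check: two bands of half-width three overlap whenever the observed scores differ by six points or fewer. The function below is my own sketch of that logic, not an LSAC procedure.

```python
def bands_overlap(score_a: int, score_b: int, half_width: int = 3) -> bool:
    """Bands [s - half_width, s + half_width] share at least one score
    exactly when the observed scores differ by at most 2 * half_width."""
    return abs(score_a - score_b) <= 2 * half_width

# The 2010 and 2013 median students: a two-point gap is well inside
# the six-point overlap threshold.
print(bands_overlap(157, 155))  # True (bands 154-160 and 152-158 overlap)
```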

But what about the belief that desperate law schools are enrolling unqualified students in order to fill seats? Practically every law school is admitting a larger proportion of its applicants. Among the 196 schools studied, only five had a lower admit rate in 2013 than in 2010. The overall admit rate increased from 36 percent for the fall 2010 class to 51 percent for fall 2013. Adding to (and reflecting) the competitive pressures are overall yield rates, which declined from 30 percent in 2010 to 25.5 percent in 2013. So law schools unquestionably have had to go deeper on the proverbial bench to fill their classes. But have some schools acted in bad faith? That question is harder to answer with certainty, but the contention deserves a considerable amount of skepticism.

In exploring the question, it seems useful to first identify schools that might have the most incentive to behave in this fashion. For purposes of this discussion, I will refer to them as “desperate” schools. Large declines in application volume and large increases in admit rates might be two signs of desperation. Once desperate schools are identified, it would be useful to assess their LSAT trends compared to those of non-desperate schools. The objective would be to use differences in the trends between the two groups to glean whether desperate schools have comported themselves differently from others.

Thirty-two law schools experienced declines in application volume of 50 percent or more. This type of drop would have been unimaginable just a few years ago and has surely caused many sleepless nights among enrollment managers. A robust applicant pool is essential to a strong class. Losing half or more of your application volume in just a few years is an undeniable blow. But have these schools experienced decreases in their median LSAT scores that may portend an unseemly admissions strategy?  

This group of 32 schools saw their median LSAT scores decrease an average of 2.5 points between 2010 and 2013; all other schools saw their medians decrease an average of 2.15 points, a seemingly insignificant difference of 0.35 points. The difference is rendered even more insignificant when you consider the decreased medians in terms of proportions: the 32 desperate schools lost an average of 1.6 percent of their median LSAT “value,” while all other schools lost an average of 1.4 percent. Lastly, the 2013 average median LSAT score for the desperate schools was 153, which produces a score band of 150-156. The average for the other schools was 156, which produces a score band of 153-159. The three-point overlap between the bands suggests a lack of statistical significance. But to the extent there is a difference, it is likely not attributable to bad faith.

How about schools with the largest increases in admit rates? Twenty-eight schools saw their admit rates increase 25 percent or more between 2010 and 2013. Could these trends be direct evidence of an unethical opening of the floodgates? This group of schools saw their median LSAT scores decrease an average of 3.17 points between 2010 and 2013; all other schools saw their medians decrease an average of 2.04 points, a difference of 1.13 points. This difference, while seemingly tangible, likely lacks statistical significance because of overlapping score bands. The average LSAT median of 153 for the 28 schools yields a score band of 150-156, which overlaps with the 154-160 score band for the other schools (whose average median is 156.5). So can these trends be used to support a conclusion that this group of law schools is preying on unqualified students? Not by my estimation.

Lastly, I thought it would be interesting to assess the outcome trends for the nine law schools that experienced both “desperate” trends. These schools saw their median LSAT scores decrease an average of 3.88 points between 2010 and 2013; all other schools saw their medians decrease an average of 2.12 points, a difference of 1.76 points. Once again, the difference, while seemingly tangible, likely lacks statistical significance. The average median LSAT for the group of nine schools is 152, which produces a score band of 149-155. The average for the other schools is 156, which produces an overlapping score band of 153-159. The data fails to support the contention.
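For completeness, running the three group comparisons above through the same overlap check tells the same story in each case. The averages are the ones reported in this article; the labels and code are mine, offered only as a sketch.

```python
# (average 2013 median for the "desperate" group, average median for all others)
comparisons = {
    "applications down 50%+": (153, 156),
    "admit rate up 25%+": (153, 156.5),
    "both trends": (152, 156),
}
for label, (group, others) in comparisons.items():
    # Bands of half-width 3 overlap when the gap is at most 6 points.
    print(f"{label}: bands overlap = {abs(group - others) <= 6}")
# Every line prints True: none of the gaps clears the overlap threshold.
```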

Law schools are employing an array of strategies to get through the current adversity. A few of these strategies directly contradict the contentions mentioned earlier. While admit rates increased between 2010 and 2013, the actual number of students admitted decreased an average of 7.5 percent per school. More significantly, the average entering class size was 21 percent smaller in 2013. Critics might say these trends were forced upon schools, and I would largely agree. But I would also note that if a school desired to simply fill its class with warm bodies, it would have little problem doing so — even today.  

So given the lack of statistical support for the cynical narratives, there may be another way to frame the trends we are seeing. It is possible that schools are being forced to take a more holistic approach to assessing applicants, with LSAT misuse being a luxury no longer afforded. If this is true, entering classes just might be getting stronger, despite our willingness to assume otherwise.

Aaron N. Taylor
