When the fraud lawsuit against New York Law School was dismissed last week, the school proclaimed vindication. It was anything but that. What Judge Schweitzer ruled is that no reasonable consumer would have relied upon the obviously inflated employment percentages and salary numbers posted by the school.
New York Law School, for example, claimed that 92.6% of its 2008 graduates were employed within nine months of graduation, and that the “midrange of full-time private sector salaries” was $71,250-$160,000 (representing that 75% of graduates in full-time private-sector jobs earned at least $71,250, and 25% earned at least $160,000). The judge observed that, for a low-ranked law school like NYLS, numbers like these are improbably high.
It was the students’ fault, Judge Schweitzer held, for accepting these claims at face value, failing to conduct a more thorough investigation to uncover the true, undoubtedly much worse, job results for graduates. According to the judge, the law against fraud only protects “a reasonable consumer acting reasonably”—gullible fools are out of luck. The New York Daily News put Judge Schweitzer’s message bluntly: “YOU SHOULD have known your law degree was a dud.” (Is this holding really a victory for NYLS?)
No one should celebrate a judicial ruling that representations by a law school cannot be relied upon by reasonable people. Law schools, after all, are educational institutions charged with training competent and ethical lawyers. In the judge’s opinion, people should view law schools with the same skepticism with which they view used car dealers: beware, investigate, double-check the claims about gas mileage and blue book value.
While I have doubts about the soundness of the judge’s decision in this case, he is absolutely correct that prospective law students must not take at face value employment numbers advertised by law schools. The recent US News ranking confirms that, despite new ABA reporting rules, law schools nationwide continue to advertise unbelievably high employment numbers for graduates.
By all accounts, 2010 was the worst year so far in the most dismal market for legal employment in several decades (though 2011 might yet turn out to be worse). Only 64% of 2010 graduates obtained full-time jobs as lawyers. Given the poor job market, and given the new reporting rules imposed by the ABA, one would have expected the employment figures law schools reported for their 2010 graduates to be substantially lower than for 2009. That did not happen, however—numerous law schools continue to claim that over 90 percent of their 2010 graduates obtained employment.
Law schools did generally report lower employment numbers, but typically the reduction was small, only a few percentage points beneath their 2009 numbers. The 2009 numbers posted by law schools under the old rules, however, were themselves substantially inflated. Had the new ABA rules been effective in producing greater transparency, law schools across the board would have shown large drops.
A few striking anomalies will expose the depth and pervasiveness of the problem (additional anomalies are identified here and here).
Yale Law School, to its credit, reported a significant drop in employment, falling from 96.5% in 2009 to 91.8% in 2010. That sounds about right for a top school like Yale in a historically tough employment environment. What’s puzzling is that every other top 15 school reported a higher employment rate than Yale, despite Yale’s two major advantages: it is ranked #1, with the most outstanding student credentials, and its graduating class is relatively small, making it easier to place nearly everyone.
Incredibly, a number of law schools ranked far beneath Yale reported a notably higher employment rate, including George Mason (96.4%), Loyola Marymount (94.1%), Kentucky (94.2%), and UNLV (93.2%). Even a few bottom schools reported employment on a par with Yale: Florida International (90.1%), Baltimore (91.2%), Akron (91.8%), Toledo (90.1%), and Atlanta’s John Marshall (91.6%). It is absurd to think that any of these schools exceeded or matched Yale’s employment rate.
Washington University (my own institution), even more so than Yale, reported a drastically reduced employment rate, falling from 95.5% in 2009 to 80.7% in 2010. (Ouch!) Our 80% employment rate is unusually low among top 25 law schools—not until the 76th ranked law school is a lower employment number reported. Indeed, a significant number of bottom 100 law schools (including New York Law School) report a higher employment percentage. Wash. U. paid a severe price in the ranking for reporting such a poor employment number, falling from 18th to 23rd, despite improving in other measures. Dean Syverud, the incoming Chair of the ABA Section on Legal Education, was determined to scrupulously comply with the new ABA reporting standards, knowing that the school would likely suffer as a result. Understandably, a number of students vocally questioned the wisdom of his decision, given the resulting fall in ranking.
A few ranking winners exhibited anomalies in the opposite direction. The two biggest climbers in the first tier were the University of Washington (up from 30 to 20) and Arizona State University (up from 40 to 27). Both schools somehow defied the legal recession and reported significant leaps in their employment rate. UW went from 89.5% in 2009 to 96% in 2010; ASU went from 89.8% in 2009 to 98.2% in 2010. (Both schools put Yale to shame, or perhaps the opposite.)
ASU’s feat is especially curious because its (usually) closely ranked peer school, Arizona (ranked 42 last year and 43 this year), has similar scores on most measures except employment. In a year when ASU reported a large jump in employment, claiming the highest rate in the entire country, Arizona’s employment rate dropped from 89.4% to 87.4% (a couple of percentage points down, like most schools). This is an odd disparity given that they are evenly matched competitors in the same legal market. These results are even more peculiar when one considers that Arizona’s bar pass rate (93.7%) was much higher than ASU’s (85.9%).
Looking at bar pass rates exposes another set of strange findings. At most law schools the bar pass rate is about the same as or higher than the employment rate—that makes sense because a graduate cannot hold a job as a lawyer without passing the bar. However, eighteen law schools in the top 100 report employment rates at least 10 percentage points above their bar pass rate. At the high-rising University of Washington, for example, 96% of graduates were reported as employed, but only 85% of the class passed the bar. ASU likewise had a much higher employment rate than bar pass rate (in contrast to Arizona).
Here is a list of the top 100 schools with the largest gaps between their reported employment rates and their bar pass rates (gaps in percentage points): San Diego (23 points, 88.2% employment versus 65% bar pass), UNLV (20), West Virginia (20), Washington and Lee (18), LSU (15), Hofstra (15), Pacific (15), George Mason (13), Loyola Marymount (13), Davis (12), ASU (12), University of Washington (11), Seattle (11), Syracuse (11), UCLA (10), Tulane (10), Lewis & Clark (10), and Kentucky (10). The explanation, presumably, is that many of their graduates are employed in non-lawyer jobs (it is also possible, though unlikely, that many of their graduates did not take the bar).
Inexplicably, several schools list students in “JD required” jobs at a percentage higher than the school’s bar pass rate. At San Diego, for example, 76% of the class had “JD required” jobs (25% of these jobs were part time), but only 65% of the class passed the bar. At Loyola Marymount, 84% were in “JD required” jobs, but only 80.9% passed the bar. At ASU, 89.3% were in “JD required” jobs, although only 85.9% passed the bar. It’s not clear how the number of students in “JD required” jobs can exceed the number of students who passed the bar, since the latter is necessary for the former. (The employment figures are taken 9 months after graduation, subsequent to the bar results.)
When a law school lists an employment rate that is substantially higher than its bar pass rate, a significant percentage of graduates typically are counted by the school in questionable categories, mainly “academic” and “business.” The combined total of these two categories accounts for about 30% of the claimed employment for Syracuse and Hofstra; about 25% of employment at George Mason, San Diego, and West Virginia; about 20% at Loyola Marymount and Tulane. That’s a lot of academics and business folks coming out of law schools.
And we must not forget the mushy “JD preferred” category into which many law schools stuff significant proportions of their graduating classes. Lots of law schools claim that 10% to 20% of their graduates land in “JD preferred” jobs. I guess that’s a bunch of corporate compliance officers and FBI agents.
Imagine Judge Schweitzer learning about these numbers put out recently by law schools, and shaking his head, thinking that any prospective student who believes any of this is an unsalvageable fool.
It bears emphasizing that the ABA has already implemented a stricter set of reporting requirements—and the above results occurred under the new ABA regime.
The fundamental problem here is that law schools can technically tell the “truth” even when they report absurdly high employment percentages. The categories themselves—“academic,” “business,” “JD preferred,” “employed job unknown”—invite abuse. They guarantee the situation will not improve.
The numbers this year bear a telling resemblance to US News employment numbers in the late 1990s and early 2000s, when certain law schools gamed more aggressively than others and the techniques used to boost scores were beginning to spread. By the late 2000s, when gaming had become pervasive, nearly all of the top 100 schools reported employment rates from the mid-nineties to 100%, reaching an equilibrium in which every school reported high employment. As a consequence, no school could gain a comparative advantage from employment numbers, which mostly washed out as a factor among peer schools.
To offer one illustration, for the class of 2008, ASU reported 99.7% employed and Arizona reported 97.3%—this is in stark contrast to the large gap that separates them this year, at 98.2% and 87.4%, respectively. This disparity gave ASU a huge ranking boost compared to its more modest neighboring school. When one digs into the underlying numbers, however, the disparity between them begins to dissipate (11.3% of ASU’s JD jobs are part time, compared to 1.7% at Arizona; ASU’s combined “academic” and “business” jobs constitute 17.7% of overall employment, while Arizona’s is 12.3%).
To regain its traditional parity with ASU, Arizona will be sorely tempted next year to more liberally categorize future graduates as employed in “academic” or “business” positions, or will use some other expedient to raise its employment rate to approximate ASU’s. ASU’s aggressive gaming thus puts pressure on Arizona to aggressively game.
Project this scenario nationwide and it becomes evident that advertised employment rates will once again creep up until they stabilize at a high equilibrium, with virtually all law schools misrepresenting their employment results to one degree or another.
The “weakest link” problem in game theory is a situation in which the worst actor(s) produce negative consequences that affect everyone. The law school ranking competition is a “weakest link” situation in the sense that, as long as aggressive massagers exist among law schools, there will be immense pressure on every law school to engage in aggressive massaging just to keep up. Good behavior under these circumstances will be punished in the ranking (and honorable deans will be fired)—and dubious behavior will be rewarded (and strategic deans will get raises). That is why the ABA reforms will inevitably fail.
There is only one possible solution: Only full-time “JD required” jobs should be counted and advertised by law schools. Any other category is susceptible to manipulation and will be exploited by law schools to conceal poor employment results. Current claims that the ABA reporting rules will improve transparency because they provide more “granular data” are wrong—they will produce more obfuscation by law schools.
If only full-time “JD required” jobs can be reported, law schools will be stripped bare and prospective students will finally get a clear look at their real job prospects. Law schools undoubtedly will vehemently oppose this proposal, calling it unfair, failing to credit them for all the great non-lawyer jobs their graduates are getting. If you find that argument persuasive, I have a used car I would like to sell you—it has terrific resale value and gets great gas mileage.
Brian Tamanaha is a law professor at Washington University in St. Louis and a renowned jurisprudence scholar. His forthcoming book, Failing Law Schools, is due out in June and will cover the problems with legal education. This column originally appeared on the blog, Balkinization, and is reprinted here with permission.