When the National Jurist released its law school rankings, it drew a few laughs and a lot of criticism from the law school establishment. The NJ ranked law schools based on the following criteria (courtesy of TaxProf Blog and Above the Law):
- Post Graduate Success: 50% (Sum of Employment Rate: 22.5%, Super Lawyers: 12.5%, Partners in NLJ 200: 10%, Bar Passage: 5%)
- Student Satisfaction: 35% (RateMyProfessors.com: 20%, Princeton Review: 15%)
- Affordability (10%) and Diversity (5%)
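Taken at face value, NJ's composite is just a weighted sum of those components. Here is a minimal sketch of how such a score would be computed, assuming (my assumption, not NJ's published method) that each component has already been normalized to a 0-100 scale; the school data below is entirely hypothetical:

```python
# Weights from NJ's published criteria; everything else here is hypothetical.
WEIGHTS = {
    "employment_rate": 0.225,
    "super_lawyers": 0.125,
    "nlj200_partners": 0.10,
    "bar_passage": 0.05,
    "ratemyprofessors": 0.20,
    "princeton_review": 0.15,
    "affordability": 0.10,
    "diversity": 0.05,
}

def composite_score(scores: dict) -> float:
    """Weighted sum of component scores, each assumed normalized to 0-100."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical example school:
example = {
    "employment_rate": 90, "super_lawyers": 70, "nlj200_partners": 60,
    "bar_passage": 85, "ratemyprofessors": 75, "princeton_review": 80,
    "affordability": 50, "diversity": 65,
}
print(composite_score(example))
```

Note that the weights sum to 1.0, so a school scoring 100 on every component would score 100 overall; the heavy lifting is done by whatever data feeds those components, which is exactly the problem discussed below.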
The results are not that surprising given that most of the USNEWS T14 schools were at the top of the list. The appearance of schools like Alabama, Texas Tech and North Carolina near the top was somewhat interesting. Comparing this list with the ridiculous Cooley rankings suggests that no ranking will be taken seriously unless Yale, Harvard and Stanford sit in the upper echelons.
While I will give NJ credit for trying to shake up the bullshit and heavily gamed rankings system set up by USNEWS, their proposed ranking system is really off the wall. I don’t have access to the full article, so I cannot determine how the above percentages were calculated. But NJ’s methodology also uses faulty criteria, in ways that may be worse than USNEWS’s.
First, NJ gives the heaviest weight to the school’s Employment Rate (presumably measured immediately post-graduation). As anyone with an LSAT score over 135 knows, most law schools distort their post-graduate employment statistics, and some outright lie. Assuming NJ does not audit the self-reported employment information, they (like USNEWS) will make this determination based on molested and unreliable data.
Second, NJ’s entire Student Satisfaction section uses biased, highly speculative and perhaps even false information. I assume that most reviewers at Ratemyprofessors.com post a good or bad review based on the final grade they got from a particular teacher. The reviewer who gets the grade he wants will sing high praises, while the student who got the D or F will call him a cocksucker, out-of-touch, grumpy asshole, or the bitch who crawled into academia because she couldn’t hack it at a real job. And I suspect Princeton Review’s ratings are primarily based on the small number of respondents who 1) bother to talk to the telemarketer during dinnertime; 2) talk to the cute girl/guy manning the PR table next to the all-you-can-drink frat party; or 3) respond to PR’s many spam emails in exchange for a 50% discount at the student bookstore. I also recall that PR ranks every little bit of minutiae about a school that no one really cares about, like Best Party School.
Finally, I wonder why information from Super Lawyers and NLJ 200 partner counts is even relevant. NJ probably assumes that most law school graduates ultimately want to be a Super Lawyer and/or an NLJ 200 partner. I suppose positions like Chief Counsel for a government agency, Chief Legal Officer of a Fortune 500 company, a successful solo or small-firm practitioner, and even Law School Dean are parting gifts for those who lose the law school game. And from what I have heard, being in Super Lawyers does not make your penis or breasts larger, nor does it give you a license to print money. If you make enough of an impression (or kiss enough ass) to get on this list, it really is nothing more than a line you put on your otherwise boring website to attract clients.
Again, props to NJ for trying, but it only showed how NOT to create an alternative ranking system. A ranking system that relies heavily on soft, subjective, and unreliable data will be criticized or, worse, ignored.
A ranking system should be based largely on objective information that is publicly available and easy for a disinterested party to audit. Examples would be LSAT scores (not GPA), tuition (including other school “fees” and books, but excluding cost of living), and bar passage rates.
Other factors should be considered but, because of their subjectivity, given a much lower weight (less than 5% each). These include undergraduate GPA and the number of graduates who obtain judicial clerkship positions (federal, state and some specialty courts) immediately after graduation.
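The proposal above can be sketched the same way. I have not committed to exact weights, so the numbers below are purely hypothetical placeholders that respect the stated constraints: the objective, auditable factors dominate, and the subjective ones each get under 5%.

```python
# Hypothetical weights for the proposed system; only the constraints
# (objective data dominates, subjective factors < 5% each) come from the post.
PROPOSED_WEIGHTS = {
    "lsat_median": 0.35,       # objective, publicly reported
    "tuition_and_fees": 0.30,  # assumes score is inverted: cheaper = higher
    "bar_passage": 0.27,       # objective, state-reported
    "ugpa": 0.04,              # subjective factor, kept under 5%
    "clerkships": 0.04,        # immediate judicial clerkships, under 5%
}

def proposed_score(scores: dict) -> float:
    """Weighted sum; assumes each input is already normalized to 0-100."""
    return sum(PROPOSED_WEIGHTS[k] * scores[k] for k in PROPOSED_WEIGHTS)
```

The point of the design is not the particular weights but that every heavily weighted input can be verified by an outside party.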
Some data, like post-graduate employment statistics, is so untrustworthy that it should not be included in the rankings at all. If Employment Rates were removed from ranking consideration, I’m willing to bet my left arm that most law schools would downsize or eliminate their almost useless Career Services offices in exchange for giving each graduate five years of paid Symplicity and LinkedIn memberships.
More on rankings in a future post.