Toward the 50 “Best in Class” Courts

The word “toward” is, of course, the rub. Today we have no such list of top performers representing the “best in class” at each level of court: supreme courts, intermediate appellate courts, general jurisdiction courts, limited jurisdiction courts, and specialized courts. Justice Served, the alliance of court management and justice experts headed by my friend Chris Crawford, is in its eighth year of reviewing thousands of courts’ online offerings around the world for its now-famous Top-10 Court Websites awards. Why can’t we do this for court organizations as a whole? Would it attract interest (or objections)?

I have a long-standing interest in comparative performance measurement and benchmarking (see “How Do We Stack Up Against Other Courts? The Challenges of Comparative Performance Measurement,” The Court Manager, Vol. 19, No. 4, Winter 2004-2005). It was rekindled last week when I read the cover story in the March 26 Business Week, “The Best Performers: The Business Week Fifty,” the “best in class” from the 10 economic sectors that make up the S&P 500. Can we do for courts what Business Week does yearly for companies and what Chris Crawford does for court websites?

How We Pick the Top 50
As Business Week acknowledges, any method of ranking organizations will be imperfect. So whatever we do with courts will be a work in progress. But remember: What gets measured gets attention. What gets measured gets done! And we Americans love lists and rankings.

First, we would need to focus on a handful of core performance measures for each level of court. The measures would need to represent a scorecard balanced across key success factors such as access, fairness, timeliness, and court user satisfaction. Courts that either do not take these measures or do not make them available would not be included in the rankings. (After all, a court without a website can’t make Justice Served’s top ten.)

We would need to make plenty of adjustments in the performance data to avoid comparing apples and oranges. This would be tricky but not impossible. (Running data screens and applying weights and algorithms would make great fun for folks like me.)
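To make the idea concrete, here is a minimal sketch in Python of one common adjustment, min-max normalization, which rescales a raw measure to a common 0-to-1 scale before any weighting is applied. The measure and the figures below are entirely hypothetical, invented only for illustration:

```python
# Hypothetical sketch: rescale a raw measure to a common 0-1 scale so that
# courts with very different caseloads can be compared on the same footing.

def min_max_normalize(values):
    """Rescale raw figures to the 0-1 range; for a lower-is-better measure
    such as days to disposition, 0 is the best observed value."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Raw median days to disposition for three hypothetical peer courts.
days_to_disposition = [180, 240, 365]
print(min_max_normalize(days_to_disposition))  # [0.0, 0.32..., 1.0]
```

Real adjustments would be messier, screening out bad data and weighting for caseload mix, but the principle is the same: put every court’s numbers on a comparable footing before ranking.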

Second, we would compare courts with their peers at the same level. Within each level, we would rank the courts on each of the core performance measures and then list them by their combined rankings (the lower the combined score, the better). For example, a court that finished third in on-time case processing (e.g., time-to-disposition) and ninth in court user satisfaction (a combined score of 12) would rank lower than a court that finished third in on-time case processing and fourth in court user satisfaction (a combined score of 7).
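The arithmetic is simple enough to sketch in a few lines of Python. The court names and per-measure ranks below are hypothetical, mirroring the example above:

```python
# Minimal sketch of the combined-ranking idea: sum each court's ranks across
# the core measures; the lower the combined score, the higher the listing.
# Court names and ranks are hypothetical, for illustration only.

courts = {
    "Court A": {"on_time_case_processing": 3, "court_user_satisfaction": 9},
    "Court B": {"on_time_case_processing": 3, "court_user_satisfaction": 4},
}

combined = {name: sum(ranks.values()) for name, ranks in courts.items()}

# Sort by combined score, best (lowest) first.
for name, score in sorted(combined.items(), key=lambda item: item[1]):
    print(f"{name}: combined score {score}")
# Court B (3 + 4 = 7) finishes ahead of Court A (3 + 9 = 12).
```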

We would construct a final ranked listing of all courts at each level based solely on the results of their performance measures. It would be reviewed by a panel of experts who would make slight adjustments in the rankings to inject wisdom that the numbers may not provide. Finally, like Business Week’s best 50, the ranked listing would be accompanied by profiles of the courts on the list.

Are We Ready for the 50 “Best” Courts?
Depending on whom you ask and how you frame the question, comparative performance measurement is either the best thing that can happen in court administration or the biggest threat to court governance in years. Both sides have legitimate arguments to make. With the emergence of business intelligence solutions that eliminate the barriers to data access and distribution, it may happen regardless of the debate.

In any event, I doubt that there are today 50 courts that would meet the entry requirement of a balanced scorecard of performance measures. However, the Utah courts made a major contribution to this goal when they launched their statewide CourTools initiative. Other states like Arizona, Oregon, and Minnesota may not be far behind. Just last week, 40 judges and court managers met in Phoenix at a conference sponsored by the Maricopa County Courts that focused exclusively on the measure of court user satisfaction.

So, which courts will be among the 50 best performers? Will yours be among them?


For the latest posts and archives of Made2Measure click here.

© Copyright CourtMetrics 2007. All rights reserved.
