A new study proposes an outcomes-based approach to rank medical schools, pointing to shortcomings in the oft-cited U.S. News & World Report rankings.
The study, published in the Journal of the Association of American Medical Colleges, was conducted by researchers at Brigham and Women's Hospital. It examined data on more than 600,000 doctors from 127 medical schools, assigning each school a score based on four factors: number of articles published, grants, clinical trials, and awards.
The authors compared their new method to developments in Major League Baseball, where the "subjective evaluation of players was supplanted by the data-driven system of sabermetrics." U.S. News & World Report relies heavily on survey responses and admissions data in determining its rankings.
Under the new rankings, some things stayed the same. For instance, Harvard Medical School was number one on both lists. However, in other cases, relatively low-ranked schools fared better under the new method. The Albert Einstein College of Medicine at Yeshiva University rose from number 34 to 13.
The researchers argue their method is a first step toward a more evidence-based system for evaluating medical schools, specifically their ability to produce physicians who go on to successful careers in biomedical research.
However, lead study author Matthew Goldstein acknowledges schools focusing on preparing primary care physicians, rather than research-oriented physicians, "would likely be disadvantaged from our model." His co-author, Mitchell Lunn, says the true goal of the study was to foster a "national discussion about the most meaningful criteria we should be measuring and reporting," and to "improve the quality of medical education across the nation" (Small, FierceHealthcare, 1/22; Brigham and Women's Hospital release, 1/21).