Ironic that the schools deploring college rankings sit atop the list.
Talk about being FOS. Love these woke pillars of virtue signaling charging $75,000/year while eating up 4 years of potential earnings. What do they sell? Prestige and the promise that the anointed possess high IQ & tenacity. Kind of like a glorified sorting hat, albeit a rather expensive one.
Rebellion Over College Rankings Seems Likely to Fail
U.S. News, others turn to public data to determine standings as some schools reject surveys
Harvard is among the universities that have stopped cooperating with U.S. News & World Report on the publication’s rankings of law and medical schools.
Josh Zumbrun, WSJ
Updated Jan. 27, 2023 3:31 pm ET
In the past two weeks, Harvard, Stanford and Columbia universities, the University of Pennsylvania and the Icahn School of Medicine at Mount Sinai said they would stop cooperating with U.S. News & World Report’s medical-school rankings.
That followed the decision last year by universities including Yale, Georgetown, Harvard, Stanford, Columbia and California, Berkeley to quit cooperating on the publication’s law-school rankings.
Critics are cheering the exodus from a process they say leads students to focus on external prestige rather than education quality and encourages schools to game rankings at the expense of students. The schools that are withdrawing say the rankings are elitist and penalize institutions that admit strong candidates without high test scores.
“In the 40 years of rankings, this is the biggest shock to the system—that gives me hope,” said Colin Diver, a former president of Reed College, which has long abstained from the U.S. News ranking. Mr. Diver is the author of “Breaking Ranks: How the Rankings Industry Rules Higher Education and What to Do About It.”
But hopes that this marks the death knell for college rankings are likely to be in vain. The reality is that what the schools themselves contribute to the rankings is relatively small: test scores, alumni giving, financial information and so on. Most of the data used to determine the rankings can be derived from publicly available information or from surveys conducted by U.S. News itself. Indeed, U.S. News has revised the survey over the years in response to criticism. There is even a case that the less the schools contribute, the more objective the rankings become in some respects.
Large sections of the U.S. News ranking are measures of a school’s reputation, and so are mostly unaffected by a few withdrawals.
For the undergraduate rankings, 20% of the score is assigned from a survey asking institutions to assess each other. For law schools, 25% comes from peer assessment and 15% from a survey of lawyers and judges. For medical schools, 30% of the score comes from the assessment of peer institutions and residency directors. By comparison, standardized tests are 5% of the weighting for national universities, 11.25% for law schools, 13% for research-focused medical schools and 9.75% for primary-care medical schools.
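U.S. News does not publish its exact scoring formula, only the category weightings. As an illustration only, a composite score of this kind is a weighted sum of normalized category scores; the category names and sub-scores below are invented for the example, with the weights for peer assessment, the lawyer/judge survey and test scores taken from the law-school figures above and the remainder lumped into a single "other" bucket.

```python
# Hypothetical law-school weightings: peer assessment 25%, lawyer/judge
# survey 15%, test scores 11.25% (from the article); "other" absorbs the
# remaining weight for this sketch.
weights = {
    "peer_assessment": 0.25,
    "lawyer_judge_survey": 0.15,
    "test_scores": 0.1125,
    "other": 0.4875,
}

# Invented normalized sub-scores (0-100) for a hypothetical school.
subscores = {
    "peer_assessment": 80.0,
    "lawyer_judge_survey": 75.0,
    "test_scores": 90.0,
    "other": 70.0,
}

def composite(weights, subscores):
    """Weighted sum of normalized category scores."""
    return sum(weights[k] * subscores[k] for k in weights)

print(composite(weights, subscores))  # 75.5
```

The point of the sketch is that dropping one input (say, a school's self-reported test scores) only removes one term of the sum; the bulk of the score, built from surveys and public data, is unaffected.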
When the U.S. News rankings began in 1983, peer assessment was the source of the ranking, with schools simply asked to rank which other schools were the best. Critics such as Leon Botstein, the president of Bard College, compared this to judging for “figure skating but with fewer rules.”
But the reality is that many applicants care what people think of their school, and the withdrawal of a few dozen institutions, even the prominent ones, doesn’t end the ability of U.S. News to conduct the survey with everyone else.
U.S. News collects much of its academic data via its questionnaire to schools. But versions of that data are available elsewhere, from accrediting groups or the Education Department. (U.S. News discloses its weightings, but not the exact methodology to create scores in each bucket.)
On the one hand, relying on public data makes U.S. News’s rankings somewhat less distinctive. Many initiatives are built largely from these same sources. For example, The Wall Street Journal also publishes college rankings. On the other hand, this same proliferation of rankings shows the hunger of students for such information.
In a written statement, a spokeswoman for U.S. News said it would continue to rank law schools by drawing on the voluminous data that law schools are required to make available to the American Bar Association, “whether or not schools respond to our annual survey.”
For example, the ABA requires schools to report bar-passage rates and employment rates. That data can still feed U.S. News’s ranking for placement success, currently an additional 26% of the law-school weighting.
U.S. News said that for the next law-school rankings, there will be changes in how data points are weighted, including a reduced emphasis on peer assessment.
The spokeswoman declined to comment on future medical-school rankings. With medical schools too, much data is available from sources other than the schools themselves. For primary-care medical schools, for example, 30% of the weighting is built with data from the American Academy of Family Physicians.
When schools haven’t provided data for the undergraduate rankings, U.S. News has been able to rank them anyway, using publicly available data. Any school receiving federal financial aid has to provide detailed information—average SAT scores, acceptance rates, graduation rates, debt burdens of graduates and more are posted on the Education Department’s College Scorecard website.
By switching to public data, the rankings might lose some indicators, but the data that remains might be more reliable because schools have less opportunity to influence the rankings with incorrect information. Last year, Columbia dropped from second place to 18th in the undergraduate rankings after a math professor discovered the university had submitted false information. A former dean of the business school at Temple University was sentenced to 14 months in prison for fraud after submitting falsified data for the M.B.A. rankings.
Even if people continue consuming the rankings, Mr. Diver said it would be a positive if schools stop behaving in dishonest ways and pandering to the rankings.
In a written statement, Eric Gertler, the chief executive of U.S. News, defended the rankings, saying that “where students attend school and how they use their education are among the most critical decisions of their life,” and “we believe students deserve access to all the data and information necessary to make the right decision.”
Would students have a better experience if they considered what they really wanted out of a college—beyond its ranking? Probably. But will applicants begin to ignore publicly available compilations of how their schools rank? Probably not anytime soon.
Write to Josh Zumbrun at email@example.com