Some 20 years ago, I spent my summer in Washington, D.C., as an intern for the U.S. News & World Report college rankings. Part of my job was to call colleges to collect missing data used to compile the rankings or to confirm figures the magazine already had on file. Princeton University ended up No. 1 in the rankings that year. Last year, Princeton was No. 1. In other words, not much has changed in two decades: the top of the list has remained remarkably consistent.

But the universities at the top of the U.S. News list represent a tiny fraction of American higher education. There is no doubt that Princeton is a good school, and most of its graduates get good jobs and go on to solid, rewarding careers. What parents and students really want to know, however, is how to differentiate among the outcomes of the thousands of colleges that are not at the top of the U.S. News rankings.

A spate of new rankings and other studies has emerged in recent years attempting to answer that question by looking more closely at the employment and earnings records of college graduates and weighing them against the cost of attending college and the chances of graduating on time. These rankings, from The Wall Street Journal, The Economist and Gallup, among others, are based largely on new data sources about recent college graduates.

Money magazine was one of the first to use data on the employment outcomes of graduates, and its rankings remain among the best for students and parents to consider (though certainly not the be-all and end-all of the college search). The latest edition of the Money rankings was released on Monday. Like other rankers, Money tweaks its formula every year, partly in response to complaints from colleges and partly because it wants to sell a new set of rankings to a new crop of prospective students. From a business perspective, rankings need to change slightly each year while still including enough familiar names, plus a few surprises.

This year, Money added a data set to its methodology known in higher ed circles as the “Chetty data.” That refers to Raj Chetty, a Stanford professor who leads a team of economists granted access to millions of anonymized tax records spanning generations. The group has published several headline-grabbing studies based on the data. In findings published in January, the group tracked students from nearly every college in the country and measured their earnings more than a decade after they left campus, whether or not they graduated.

The results were grim for a higher education system that claims to be a ladder to upward mobility. The data showed, for example, that the City University of New York propelled almost six times as many low-income students into the middle class and beyond as all eight Ivy League campuses, plus Duke, M.I.T., Stanford and Chicago, combined. The California State University system advanced three times as many.

Money included the Chetty data as one of the 27 measures it uses to rank schools, in an effort to capture each campus’s track record of moving less-affluent students into the upper middle class.

Several schools perform well on the economic mobility measure, and it clearly helped them in the final Money rankings. The magazine’s editors let me look at the rankings with and without the Chetty data included. The City University of New York’s Baruch College ended up No. 2 overall, behind Princeton (of course). Without the economic mobility data, Baruch would have ranked No. 30.

Other campuses that performed better as a result of the economic mobility data included the College of New Jersey, the University of California at Riverside and the University of Florida.

One downside of using the Chetty data in rankings like this is that it’s dated: it covers students who enrolled in college in the late 1990s and are now in their 30s. So the data, and thus the rankings, don’t capture the recent strides some colleges have made in enrolling more low-income and middle-income students.

But the Chetty data, and how Money used the numbers this year, are a good start in helping families differentiate among the outcomes of thousands of colleges. The effort comes amid concern that the U.S. Education Department under the Trump administration might abandon consumer tools started under the Obama administration, like the College Scorecard. Last month, Inside Higher Ed reported that the department appears to be planning to keep the Scorecard for at least another year.

It’s not clear, however, that the federal government needs to build data tools like the College Scorecard itself. It just needs to collect and disseminate the data. The proliferation of rankings and studies in recent years shows that plenty of outside groups, both nonprofit organizations and for-profit magazines, are willing to build the tools that bring us closer to better consumer information about one of the biggest purchases we’ll make in our lifetimes.

Jeffrey J. Selingo wrote this for The Washington Post.

