Published on 2016/04/22

Rankings Punish Innovation and Diminish Higher Ed Diversity

By understanding the limiting nature of popular ranking systems, higher education institutions can take great strides toward actually fulfilling their missions and creating access and opportunities for success for traditionally underserved demographics.

College rankings are seen by many institutions as the be-all and end-all of their brand, a major factor in their opportunities to succeed. But what happens when an institution can look at the postsecondary space from outside the rankings? Hampshire College was dropped from the US News rankings because of its decision to adopt “test-blind” admissions practices. In this interview, Meredith Twombly reflects on the true value of institutional rankings like those published by US News & World Report and shares her thoughts on the impact these rankings have on the diversity and richness of the American postsecondary environment.

The EvoLLLution (Evo): What role do you think college rankings play in prospective students’ enrollment decisions?

Meredith Twombly (MT): For many students, rankings play an enormous role in the college selection process. The college search is an overwhelming ordeal, and the rankings, whether they are valid or not, reduce a great deal of complexity for those families. They take the thousands of college options out there and turn them into neatly ranked lists. They make a research project that is so stressful suddenly seem so simple—too simple.

That said, I work for Hampshire College, a school that was banished from the rankings two years ago because we refuse to use SAT/ACT scores in our admissions process. The fact that our decision to stop using the scores was based on our own evidence that they were poor predictors of student success at our college didn’t matter to US News. Despite our omission from their rankings, our applications are actually increasing at Hampshire. My gut feeling is that students are calling many of the traditional, status quo structures in our education system—including college rankings—into question.

On my best days I feel like we are approaching a tipping point in which many of the less-than-supportive and outdated aspects of the education system will be widely recognized as such. The prospective students I meet when representing Hampshire are looking for authentic education systems that are attentive to the science of learning and that demonstrate their value and excellence through their true student outcomes. They are looking for educational opportunities that are aligned with their unique values and aspirations and they do not want to be reduced to a number.

Evo: How well do rankings capture and communicate the unique missions and aspects of different institutions that allow them to serve specific student demographics?

MT: Most rankings, especially the well-known US News rankings, don’t even attempt to quantify or consider the unique missions that various colleges pursue. On the contrary, such rankings focus on largely superficial “inputs” such as the high school GPAs, class rank and SAT/ACT scores of their freshmen classes.

Think about that: US News ranks colleges based on factors that were determined before the students ever set foot on the campus!

The rankings would be more accurately titled “US News Rankings of College Freshmen.” If the rankings could move away from student input measures that have nothing to do with the colleges and move toward meaningful process and output measures that are tied to mission—well, that might be helpful.

There is an old saying: “Not everything that counts can be counted, and not everything that can be counted counts.” When I look at the data involved in most college rankings systems, it is obvious that they all rely on easy-to-access data points with little thought as to whether they are the right data points. Most of the audience for these rankings are neither statisticians nor social scientists, so they tend to accept the results at face value. In truth, it would be difficult to integrate mission into the rankings, but not impossible. I have been asked a few times how I would construct a rankings formula, and while I love data and using data as one means of answering complex questions, I am not convinced that there is any utility in ranking colleges. College selection—and learning, for that matter—is a deeply personal process that needs to pay careful attention to the individual student and their fit with the college. I’m not sure a rankings scheme, at least not one available today, can help with that.

Evo: How would America’s rich tapestry of institutions—that currently creates access and opportunities for success for a wide variety of students with a wide variety of goals—be impacted if every institution began pushing for higher rankings?

MT: Sadly, we are not far from that being the reality for four-year colleges.

My primary concern is that the rankings force institutions to compromise or ignore their missions and chase the numbers that will increase the college’s ranking. They also create an atmosphere in which colleges don’t want to innovate or take risks for fear of negatively impacting their rankings. This is especially harmful to the prospects of first-generation students or students with health challenges, who might be less likely to have high SAT/ACT scores or might have lower first-year retention rates. Recently there was a story about a college president who wanted to encourage certain first-year students to drop out before the fall census day so that they would not be counted toward the first-year retention rate. He was afraid some students who were struggling might damage the college’s ranking. I’m pretty sure this strategy is not consistent with the mission of that college, but the rankings have such overwhelming power these days that these types of things are really happening. Quite simply, that college’s first duty should have been to support and work with its struggling students. I would also add that when you consider how Hampshire was dropped from US News when we stopped using standardized tests, it’s clear that the rankings punish innovation. Luckily, the faculty, president and board at Hampshire weren’t deterred by the US News banishment.

Still, while many admissions officers at peer colleges have applauded our move privately, none have come forward to follow suit. They simply cannot take the risk of getting dropped from US News.

Evo: What needs to change in order to minimize the number of institutions that choose the “rankings pathway”?

MT: There needs to be greater awareness of all the perverse incentives that are built into the rankings. I have spoken to many journalists in the two years since Hampshire went test-blind and was banished from the US News rankings. Many of those journalists acknowledged their dismay with the rankings arms race, but none of them were willing to write that story—the story of the maladaptive impact that rankings have on a constructive college admissions process and on the behavior of mission-driven institutions. There are plenty of stories floating around about college admissions, student anxiety, helicopter parents and so on.

What we need now is a more critical lens from journalists, one that asks, “Who benefits from these rankings?” “Why are they so superficial and based so heavily on student inputs instead of student outputs, and who loses in this system?” “If the rankings were to disappear tomorrow, which students would gain the most and why?” “Which colleges would gain the most and why?”

These are all fair questions and deserve widespread coverage.


Key Takeaways

  • Though factors like test scores have been shown to be poor predictors of students’ postsecondary success, declining to use those scores leads to institutions being dropped from the US News rankings.
  • Common rankings ignore the unique missions and goals of different institutions, seeking to lump them into homogenous categories and define quality based on inputs rather than outputs.
  • Ultimately, the rankings-centric higher education environment means that access to postsecondary education is limited for any students who could be seen as a risk to the ranking.