Avoiding Icebergs on Higher Ed’s Big Data Seas
Despite their reputation for being rigid or slow, universities have become surprisingly susceptible to a flavor-of-the-day mentality. Increased competition for students has turned MOOCs, nano-credentials, and competency-based education (CBE) into just a few examples of substantive education models whose potential was quickly outpaced by promises of rapid impact. Data-driven decision making is another concept that gets bandied about in ways that produce lots of frantic activity but little actual progress. While many institutions of higher education focus on Big Data, few articulate a clear vision of how it can be applied to drive institutional mission and goals forward. Most often, data application resembles the parable of the blind men and the elephant, with each stakeholder focused on their own view or interests but unable to assemble a comprehensive picture of strategic reality.
This is particularly true when trying to define and measure success for current programs, or when using analysis to decide whether a new program should be launched in an online environment. At the College of Online and Continuing Education (COCE), the decision to launch a new program is conditioned on mutual agreement between our marketing team and our academic team. Marketing must agree that key data points indicate a return on investment substantial enough to merit the investment and opportunity costs of a degree. Academics must review the data about students, curricular outcomes and frameworks, and the history of academic support to determine whether a rigorous online degree program is feasible for our student population. The question is not whether some school somewhere, with some students, can create a program that successfully supports student learning, but whether we can do so given our mission, model, and audience. More than once, marketing has proposed a program based on success elsewhere only to have academics reject it because we lacked the capability to build a rigorous online experience that would honor our commitment to student success; and more than once, marketing has denied approval of a program favored by academics because of its boutique nature.
SNHU faces the additional complexity of creating programs in three different models: a traditional face-to-face coming-of-age experience for 18- to 22-year-olds, a national online experience for non-traditional students, and a B2B competency-based experience. That makes it even more important to use data wisely, since evidence about student performance and experience in one model is not evidence of potential success in another. Achieving our mission across all these modalities requires an awareness of what the various data points are telling us about each given model.
Measuring institutional outcomes (critical and creative thinking, effective communication, the ability to successfully navigate diverse environments, etc.) as well as program-specific outcomes (“What do we need to do differently in game design for an online student to mirror the outcomes of our f2f model? In political science? In counseling?”) requires weighing data points such as “Can we create an ADA-compliant model?” and “What evidence do we have that we can provide students with additional academic support in this area when they need it?” Data points such as our track record with tutoring services or learning communities help us judge whether we are likely to succeed in new ventures. If we can’t, the only right thing to do is say no until we can say yes.
For programs already launched, we have to be doubly cautious: our model generates literally hundreds of thousands of data points each day, and reviewing the wrong ones can quickly turn a false sense of success or stability into sudden disaster. I sometimes compare it to the passengers standing on the deck of the Titanic, being told to abandon what initially seemed like a perfectly good, brand-new, unsinkable ship for lifeboats that would carry them out into the cold, dark Atlantic. The initial evidence did not reflect the reality of the doomed vessel, and for those who failed to size up the situation accurately, by the time they discovered their error it was too late. The most common and obvious case of this is a program with strong leads and high conversion rates producing explosive enrollments. While teams congratulate each other on meeting a KPI, the failure to build out data systems for tracking subsequent data points often results in low persistence rates over time or in one program cannibalizing another. Big Data analytics must be applied all the way through the lifecycle of the student experience, including post-graduation employment, if its value is to be fully realized. Those using it will come to see that it is circular, with data at the point of graduation and employment feeding back to the beginning of the student experience for increased efficacy going forward.
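To make the lifecycle point concrete, here is a minimal sketch, in Python with pandas, of the kind of cohort view that surfaces a persistence problem hiding behind a strong conversion KPI. The data model is purely illustrative; column names like converted and persisted are hypothetical stand-ins for what an institution would pull from its student information and CRM systems, not our actual schema.

```python
import pandas as pd

# Hypothetical per-student records; a real system would pull these
# from the SIS/CRM. All column names here are illustrative only.
students = pd.DataFrame({
    "program":   ["GameDesign"] * 4 + ["PoliSci"] * 4,
    "cohort":    ["2023FA", "2023FA", "2024SP", "2024SP"] * 2,
    "converted": [1, 1, 1, 1, 1, 0, 1, 1],  # lead became an enrollment
    "persisted": [1, 0, 0, 1, 1, 0, 1, 1],  # still enrolled next term
})

# A conversion-only KPI can look healthy...
conversion = students.groupby("program")["converted"].mean()

# ...while tracking the *next* data point in the lifecycle tells a
# different story for the very same programs.
persistence = (
    students[students["converted"] == 1]
    .groupby(["program", "cohort"])["persisted"]
    .mean()
)

print("Conversion rate by program:\n", conversion, "\n")
print("Persistence rate by program and cohort:\n", persistence)
```

The point is less the code than the habit: the same pipeline that celebrates conversion should keep reporting on that same cohort a term and a year later.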
A data-informed culture, particularly one that prides itself on innovation and pilots, must be wary of too much data: not all data is created equal, and data itself is not a panacea. Having a million data points may in fact be counterproductive, because it isn’t the quantity of data that is critical but having the key data points to answer the right questions. Our data team has a very strong voice at the table and has helped us appreciate the difference between correlation and causality. Because every course section has common milestones and outcomes, we are inundated daily with student and faculty data elements, and we know almost immediately when student success is trending down. In our early days of using data, negative information might have triggered frantic action to reverse a downward trend; over time we have developed the patience and maturity to work methodically through even critical data to make sure that whatever solutions are put forward solve the actual problem. If student performance is down, is it down in every section of the course? Among certain cohorts of students? On a single common activity? The answers to those questions will drive the proposed solutions for faculty, students, curriculum design teams, or other stakeholders.
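As a sketch of that kind of drill-down, again in Python with pandas and again under assumed data layout (course, section, cohort, activity, and score are hypothetical columns, not our warehouse):

```python
import pandas as pd

# Hypothetical assessment records; all column names are illustrative.
scores = pd.DataFrame({
    "course":   ["MAT101"] * 8,
    "section":  ["A", "A", "B", "B", "C", "C", "D", "D"],
    "cohort":   ["transfer", "transfer", "first-time", "first-time",
                 "transfer", "first-time", "first-time", "transfer"],
    "activity": ["quiz1", "essay1"] * 4,
    "score":    [85, 42, 88, 45, 83, 40, 86, 44],
})

# Is the drop uniform across sections, or localized to one?
by_section = scores.groupby("section")["score"].mean()

# Is a particular cohort of students struggling?
by_cohort = scores.groupby("cohort")["score"].mean()

# Or is one common activity dragging everything down?
by_activity = scores.groupby("activity")["score"].mean()

print(by_section, by_cohort, by_activity, sep="\n\n")
```

In this toy data, every section and cohort dips, but only on one activity, which would route the response to the curriculum design team rather than to individual faculty or student support.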
In the end, data is like fire: on its own it is neither good nor bad, but depending on how it is applied, it can bring relief or disaster. Strong leadership that keeps the team focused on the institutional mission will also help the team identify the data models appropriate to fulfilling that mission, and to course correct when bright, shiny pieces of data threaten to distract from it.
Leading toward a data-informed culture means creating dialogue with all stakeholders on the team rather than issuing dictates, including with academics who may initially be suspicious of both the methods and the motives of administration. Recognizing this and preparing for it will allow leaders to steer their teams through treacherous waters while keeping their eyes on the North Star that is their core mission.
Author Perspective: Administrator