
Cutting through the Clutter: How to Ensure Your Colleagues Can Leverage Data

Making data digestible is critical to its usefulness, but leaders across the institution—not just the folks in charge of the analytics operation—need to be involved in defining the need for data and in establishing a process that ensures its value.
Imagine this common practice in institutional research and survey offices. After a survey closes, the data team cleans and summarizes the data, identifies trends over time (where possible), edits Excel tables for size and format, turns some of those tables into graphs for visual interpretation, adds paragraphs of exposition to provide context for the data, posts everything to a website and, finally, sends a note asking constituents to read through the (perhaps upwards of 70-page!) report at their convenience.

You can then imagine how many busy campus administrators head to the site and thumb through the finished product.

That was the practice in our office and on our campus: so much data, and so few ways of presenting it meaningfully to decision makers. In fact, our Higher Learning Commission (HLC) liaisons said as much after their August 2015 visit for our Data Discovery Day:

“Indiana University Southeast has been collecting a lot of data for a lot of years,” read their formal response to us. “As the institution moves forward to make sure the pieces of evidence they have can be used in concert with each other, they should be able to get a handle on issues that affect persistence and possible remedies.”

With the way in which we presented survey data, at least, that “handle” was unlikely to be “gotten.”

The other liaison’s response to our data collection and presentation was, “The creation of a cross-functional team will be a great asset to IUS as it moves forward. Because persistence and completion is a multi-faceted issue, having as many perspectives at the table as possible will allow for rich discussions and interpretation of the data.”

So we created a small, cross-functional persistence team to collect, examine and make decisions on data to improve student outcomes in persistence and completion.

But the larger issue was still present: How do we deliver actionable data across the campus, in focused and digestible bits, for users to consider when making decisions? Over the past few years, we began taking a different approach to the reports we published, opting for smaller, focused (say, school-specific) reports that units could use to inform their decision making. Reports that used to be 30 pages were narrowed down to five or ten pages. Comments from open-ended survey questions were categorized by unit, so that comments related to satisfaction (or dissatisfaction) with a particular student service office or school were sent only to the appropriate unit. For instance, survey questions related to curriculum were included only in an academics report sent to our EVCAA, deans and academic directors, while questions on pride and alumni involvement were aggregated and contextualized in a report sent to the Alumni Director.

While we changed the process of data reporting for survey results, feedback from units did not indicate that any data-informed changes were being made as a result. Were we not reaching the schools and administrative units adequately? And if we couldn’t document change, were we truly connecting assessment to planning?

The solution appeared obvious to some: We needed to sit and talk about the data and how it might be used, not rely on busy administrators to read reports in isolation. So, in September of 2017, an Academic Council was resurrected by our Executive Vice Chancellor for Academic Affairs, Dr. Uric Dufrene. In a note to invitees, Dr. Dufrene stated that the council's purpose was to "serve as a data-driven, institutional leadership council for planning, progress, and quality." Its main outcomes were to increase awareness of the link between planning, assessment and budgeting as well as "continuous improvement in support of overall institutional effectiveness." And to keep the group on track, its standing items were to be: strategic planning and budgeting; accreditation; persistence and completion; program quality and assessment; co-curricular assessment; and closing the loop.

After the kickoff meeting in September, the council met four more times during the 2017-18 academic year. I was responsible for providing data pieces to discuss, which included alumni survey data and National Survey of Student Engagement (NSSE) High Impact Practice data. Others also filled the agenda with facts about our first-year students (from our Dean of Persistence and Student Success), online course enrollment and credit hour production data, demonstrations of job market data, and a Political Science program review.

There were two keys to this being a successful venture. First, the right people had to be in the room. An invite went out to all deans and academic directors as well as Vice Chancellors from non-academic units. Second, the data had to be useful and, if useful, actually used by constituents. And, if used, that usage had to be reported back to close the loop.

Are the right people in the room? For the most part, yes. When vice chancellors from non-academic units are unable to attend the monthly Friday morning meetings, they typically send a representative from their units. However, as with most meetings, you have to strike a balance between having too many people in the room (i.e. inviting all Student Affairs directors as well) and trusting that those in attendance report back to their subordinates and receive feedback about how data is used in their units. I doubt that is truly happening all of the time, but we are at least aware of the need to do so.

Is the data useful? I think so. Results from alumni surveys and graduating student surveys are peppered with insight from our successful students. So why would we ignore that, especially in an era of continuous improvement, when we are always trying to elicit our students' experiences? We encourage units to tout positive comments and results on their webpages, in their marketing materials, and certainly in their recruitment of students into their majors. I'd like to see us fold more retention and student success data into our meetings. Those reports and data points are meaningful and concrete (a student stays or a student goes) rather than abstract and indirect, as survey results can be.

Is the data used by constituents? We want to think so, and I think it might be. But I also believe there is opportunity to improve our use of survey data. One question I often receive is, "Is this data representative?" Institutional research emerged over a half century ago as a split from scholarly research: it centers on research within one's own university for planning and decision making, rather than on scholarship whose applicability depends on how generalizable the studied population is to one's home campus (a large residential public institution, say, or a small liberal arts college). So the answer is yes, this data is representative of our students. These are the voices of our students. And while there may be more business major respondents than English major respondents, these are still true experiences. By disaggregating the data by school, the results become even more useful for those units, because the results they receive are truly their own students' voices.

Lastly, are we reporting back how we use the data? Honestly, I don't think we've gotten that handle yet, but we are improving. We're striving for cycles of policy and procedural implementation, assessment and continuous improvement that are sustainable and organic.

IU Southeast has the right people at the table discussing some of the right data, using it to inform decision making some of the time and closing the loop on occasion. The leveraging of data across silos has to start somewhere. For us, it began in 70-page reports that sat idle on a seldom-used website.
