Striving for 60: Online Education, Instructional Design, and the Attainment Challenge

Paul Cochrane | Director of Online Teaching and Learning, University of Southern Maine
The workforce need is clear: More Americans will need a postsecondary credential in the next decade to be competitive in the labor market. We’ve already witnessed the impact of the advance of technology, machine learning, and automation on employment in the United States. According to the Lumina Foundation, the 2008 recession eliminated 6.3 million jobs that required a high school education or less. Nearly all of the 8 million-plus new jobs created since have required at least some higher education, such that Lumina estimates that by 2025, 60 percent of jobs in the United States will require at least some postsecondary education.
Achieving this 60 percent benchmark will require educating over 16 million more Americans. For those of us engaged in the work of developing programs for non-traditional students, this challenge is a call to evaluate our approaches to program design, evidence-based practice, and the use of data.
In spite of the Obama administration’s emphasis on increasing attainment, overall rates have hovered around 40 percent. Raising attainment by another 20 percentage points will challenge our institutions to serve a student audience that is older, more diverse, and more likely to be first-generation college attendees. More of these students will also need remedial work in English language skills and math. The task before us is both to persuade these students to engage in (or return to) higher education and to develop the structures that will help a larger proportion of this more fragile population persist to graduation. Engaging, flexible, high-quality course content that respects the experience and learning preferences of non-traditional students will help draw students in. Program design that integrates proactive advising and tailored student services will help retain them.
Those of us working in instructional design frequently find ourselves engaged in the cultural work of managing change at our institutions, where small pilot projects are often the seeds of larger change. This approach is at odds with a growing body of research suggesting that implementing effective practices systemically can have a substantially greater impact than small-scale application. There are numerous examples of institutions that have achieved significant gains in retention, graduation, and student satisfaction through deliberate program design. One leading example is Southern New Hampshire University, which Clayton Christensen highlights in his “Jobs to Be Done” theory of innovation. As an example of design that works from the student’s needs forward, Christensen notes that SNHU has been able to attract new students into programs with high graduation and student satisfaction rates relative to other online programs. Another example, rooted in the application of psychology research to program design, is the guided pathways model. Building from the science of choice, which suggests that the cognitive overload of too much choice hinders students’ success, the guided pathways approach offers students a default academic plan with a recommended course sequence.
Institutions that have developed these highly structured degree programs, coupled with proactive student support and clear alignment to students’ career goals, have seen significant gains in student retention and graduation rates. Additional work in areas such as high-impact instruction and the application of the science of cognition to instructional design offers the promise of significant further gains in evidence-based teaching and learning.
Expanded Use of Data
Ideally, data collected at the national and state levels would provide the information needed to help institutions compare effective practices across institutions and help prospective students understand the market value of various academic credentials. Clearly, more work remains to be done. Retention measures, for example, were designed to measure the success of traditional first-time, full-time students. As such, they do not adequately capture the performance of part-time students or older adult students who bring transfer credits, nor do they allow for effective comparison of retention rates across institutions for these populations. One bright spot is the recent expansion of IPEDS tracking to include questions about the distance education programs institutions offer and the students enrolled partially or fully in distance education.
Within institutions, the expanding availability of robust institutional- and course-level data, coupled with rapidly improving tools for putting that data to use, presents a significant opportunity for evidence-based instructional design practices to gain traction. Within instructional design in particular, this data can help guide decisions about programming and about how best to serve and retain students. We can also hope that better data will help change perceptions about online education within our institutions. In spite of the number of institutions that have advanced new forms of digital learning, as recently as 2015 the Babson Survey Research Group’s Online Report Card found that while 71 percent of academic leaders rated outcomes in online education as the same as or superior to face-to-face education, only about 29 percent reported that their faculty accept the value and legitimacy of online education.
To be engaged in online education is frequently to be engaged in the cultural work of managing change within our organizations. As we seek to help our organizations adapt to changing student needs, it may be that the most important work that we can do is to embrace our role as educators within the academic community. My experience has been that the demands on faculty time and attention are many and the spectrum of trends that faculty are expected to be aware of and respond to is extensive. Quite often, what seems like resistance to change is rooted in a misunderstanding about the differing needs of non-traditional students. The more that we can ground our work in concrete discussions of the research literature and the data driving our recommendations, the more likely we are to be able to guide change in positive directions.