
The Future of Online Education: Will Our Courses Foreshadow Our Ends?

Technology-enhanced higher education will move from anomaly to accepted practice to the norm as the next few decades unfold.

This article is based on material from my new book, Online Education Policy and Practice: The Past, Present, and Future of the Digital University (Taylor & Francis/Routledge), which examines the development of online education in American higher education.

Over the past twenty years, we have seen the development and expansion of online courses and programs, blended learning, the rise of for-profit colleges, and the MOOC phenomenon.

In this article, I will focus on the last two chapters of my book, which consider the future of higher education as it becomes more dependent upon online technology for the delivery of instruction: the near future (through the 2020s) and the more distant future (2030 and beyond). I am optimistic about the near future of the digital university. Faculty will continue to develop greater facility with instructional technology and will come to integrate it seamlessly into their academic programs. American higher education is moving to a model where almost every course offered will have an online component.

This is desirable during a time when enrollments will rise and perhaps get a boost if calls for debt-free public college education gain momentum. Because of state funding constraints, full-time, tenure-track faculty will likely shrink as a percentage of the total faculty, while contract faculty, untenured adjuncts, and tutors take on more of the teaching load. Instructional approaches such as learning analytics, adaptive learning, competency-based instruction, interactive media, and mobile technology will mature in the 2020s.

In the 2030s and beyond, it is likely that major new technology breakthroughs such as artificial intelligence, massive cloud computing, and brain-machine interfaces will emerge and change many aspects of human endeavor, including education. Nanotechnology will pave the way for quantum computing.

After all, history tells us that technology developed for one purpose or activity may have significant unforeseen effects on other activities. Steven Johnson, in How We Got to Now: Six Innovations that Made the Modern World, observes that technological innovations “have set in motion a much wider array of changes in society than you might reasonably expect.”[1] For instance, Johannes Gutenberg’s printing press created a surge in demand for eyeglasses because the “new” or expanded practice of reading made many Europeans realize that they were far-sighted. More recently, in 1999, when Google’s search software delivered a breakthrough improvement over any previous search mechanism, the entire internet became more useful and functional. Several years later, when Google began selling advertisements tied to search requests, the nature of advertising changed dramatically as ad agencies and their clients flocked to Google and other web-based services to promote products. Johnson suggests that Google’s evolution in advertising “hollowed out the advertising base of local newspapers,” with unexpected but serious consequences for newspaper journalism.[2] The same may be true for colleges and universities: a new technology may evolve that has significant repercussions for many postsecondary education endeavors.

Predicting what will happen in the future is difficult, and the timing of predictions in particular is speculative. The work of futurist Ray Kurzweil, especially his notion of the “singularity,” when man-machine technology will begin to outperform human brain functions, sheds light here. Technology that augments the human brain, such as neural implants, brainnets, and self-generating nanobots, will evolve in the post-2030 timeframe. Quantum computing will allow these technologies to communicate with highly advanced super cloud networks. Artificial intelligence will dominate many of the man-machine interface technologies. Lest we think that this is science fiction, research and development are well under way at major research institutions and corporations. In 2016, the Defense Advanced Research Projects Agency (DARPA) announced a new initiative to study and develop the implantation of neural devices that facilitate digital interfacing into the brains of soldiers; DARPA is the same agency that initiated the development of the internet in the 1960s. In 2015, scientists at Ohio State University announced that they had “grown” the first human brain in a laboratory. The brain was engineered from adult human skin cells and grown in a dish. Stephen Hawking has issued warnings about brain-machine interfaces and the potential loss of control over humanity’s future. Education will be but one of the human endeavors affected by these developments.

In closing, we can look back in time to a literary classic to form a picture of what the future might hold. In A Christmas Carol, during Ebenezer Scrooge’s visit to the future, he faces his own mortality in the form of a tombstone inscribed with his name.

He asks the Ghost of Christmas Yet to Come, “Before I draw nearer to that stone to which you point, answer me one question. Are these the shadows of the things that will be, or are they shadows of things that may be?”

The Ghost continued to point downward to the grave by which it stood.

“Men’s courses will foreshadow certain ends, to which, if persevered in, they must lead,” said Scrooge. “But if the courses be departed from, the ends will change.”[3]

– – – –

References

[1] Johnson, S. (2014). How we got to now: Six innovations that made the modern world. New York: Riverhead Books.

[2] Ibid.

[3] Dickens, C. (1843). A Christmas carol. Project Gutenberg e-book. Retrieved from http://www.gutenberg.org/files/46/46-h/46-h.htm (accessed September 7, 2015).
