Increase Revenue with Modern Continuing Education Software
How using modern eCommerce principles drives revenue in Continuing Education
This article is excerpted from America Needs Talent, authored by Jamie Merisotis and published in 2015.
Have you heard about the graduate from Stanford who went deeply in debt getting a degree in classics, only to find no job on the other side of graduation? Now she’s prepping iced caramel macchiatos for minimum wage.
How about the philosophy major from Michigan who’s paying off his student loans by stocking shelves at Target while living at home with his parents?
Let’s not forget the art history student who studied Michelangelo at Johns Hopkins and ended up flipping burgers at McDonald’s.
And have you read all the alarming headlines?
“College is dead,” declared Time. “For Most People, College Is a Waste of Time,” says The Wall Street Journal. The same publication also asks, “Do Too Many Young People Go to College?”
Elsewhere, The Huffington Post tells us “Why College Is Usually a Waste of Time,” while Rush Limbaugh decries the “Big Education Racket.”
Scan the blogs and you will see headlines like “College Is a Waste of Time.” Or “College Is a Ludicrous Waste of Money.” Or the slightly more multifaceted “College Is a Waste of Time and Money,” and also “College Is a HUGE Waste of Time.” Thanks for the clarification.
PayPal cofounder Peter Thiel created a fellowship in 2012 to encourage Americans under the age of twenty to drop out of college and pursue entrepreneurial work. He’s paying for 100 of them to do so. He hasn’t yet explained how the other nineteen million students who currently go to college can find their own Uncle Peter.
Thus we find what is certainly one of the most prevalent media memes of the past few years: Go to college and you will end up studying some useless field, get a degree that has zero applicable use in getting a job, go deeply in debt to do so, and go straight from commencement to waiting tables, tending bar, or sweeping floors while living in the bedroom you grew up in. Who on earth would sign up for and spend their money on that? So, the argument goes, fewer and fewer people should pursue higher education.
As far back as the 1970s, there was talk of “overeducated Americans” and dead-end educational opportunities. Newspapers were printing stories about college graduates who were scraping by at menial jobs that had no earthly relation to the rarefied field of study they pursued in college. The imminent demise of higher education is not a new theory. Nor is the argument that getting a postsecondary degree is futile. This is what I like to call a zombie doctrine. No matter how many times you try to slay it, it comes back to life, stronger each time.
But if we are to grow America’s talent, we will have to slay the undead once and for all.
As discussed, before the 1950s, the college degree was a rare commodity. But in the wake of the GI Bill, it went from being an accoutrement of the elite to the nationally acknowledged credential showing you are qualified for success in the job market. And no matter how many scary scenarios the mainstream media presents about the sad fate of college grads and the uselessness of their degrees, it still is the key to employment.
Let’s not kid ourselves: America’s higher education system is indeed in need of an overhaul. But let’s draw a distinction between murdering higher education and modernizing it. The former will cause us to slip behind in the talent contest. The latter is our first key to winning it.
Here is a quick rundown of what must change:
Big data, online programs, and the explosion of social media have transformed teaching and forced huge changes in every corner of the higher education system, from assessment and accreditation, to student recruitment and support, to faculty hiring and development. Each is part of the overall redesign of higher education already under way.
There is no shortage of ideas about how to redesign higher education, some better than others. Take Massive Open Online Courses, or MOOCs, for example. No innovation in higher education has received more attention from the national media than MOOCs. This may be in part because the biggest names in the world of MOOCs—Coursera, edX, Udacity—have their origins in universities with global reputations, including Stanford, Harvard, and MIT.
MOOCs are an interesting point on the continuum of non-classroom-based learning in higher education. Starting in the 1960s, “distance education” consisted of correspondence courses that offered independent study via the mail. Students would receive course materials, take exams, and be graded by a professor (usually one hired by a for-profit company that was offering the correspondence course), all for a fixed fee. Few correspondence courses were available where students could receive actual credits from an accredited college or university.
In the 1980s, technology took the idea of distance learning a step further. Satellite technology, videotapes, and other media were used to transmit information to students faster than the older mail-based correspondence courses. The advent of the global Internet in the 1990s continued this advancement, giving learners access to materials and courses that were previously available only through classroom-based settings. Some of these became connected to the regular curriculums of colleges and universities, allowing students to receive course credits.
In each of these cases, the material being transmitted was essentially the same as or similar to what was provided in classrooms. MOOCs are an extension of that model, with the added notion that there could be unlimited numbers of students participating—tens of thousands in some cases. It’s likely that the scale potential of MOOCs, combined with the reputations of the institutions and people behind their creation, was at least partially responsible for the near-hysteria about MOOCs that emerged in 2012 and 2013.
Yet the current limitation of MOOCs is fairly obvious. Even with the added value of being able to access video clips, problem sets, and other information beyond traditional course materials, MOOCs are largely still a transmission model, simply an extension of the model that has existed for decades.
Much has been written about the dramatic drop-off in interest in MOOCs from prospective learners—the thousands who sign up at the outset rapidly dwindle to hundreds once the course assignments and more serious work begin. Colleges' interest is waning as well. A survey of more than 3,000 academic leaders in 2014 showed a dramatic decline in the development of new MOOCs compared to 2012, when the craze really began. The Chronicle of Higher Education headline says it all: “The MOOC Hype Fades.”
Over the longer term, MOOCs may very well have a place in the ecosystem of learning, a sort of extension of the role of traditional colleges and universities. More interesting, though, are the ways in which technology is being used not as a transmission tool but as an interactive learning platform, where the technology adapts to and targets the lessons based on what students have already learned. The best example of this may be the Open Learning Initiative established by Carnegie Mellon University. OLI, as it is called, began in the early 2000s and has been widely tested in a variety of course contexts, ranging from English and French to chemistry and statistics. OLI essentially redesigns general education courses (using highly advanced learning science techniques) so that they can be completed faster than traditional courses—sometimes twice as fast. The kicker is that student performance and knowledge retention over time often are the same as or better than in traditional classrooms.
Many of the different approaches to redesigning higher education like OLI have value. But on their own they are unlikely to change the system overall. In part, this is because the idea of a “system” of higher education may be one of the most difficult concepts to alter in our psyche. For much of the past century and a half, the “system” consisted of different combinations of colleges and universities—elite and open-access; research-oriented and teaching-focused; and community colleges, public and private. All of these different combinations led us to the not-unreasonable conclusion that our system of higher education was the most diverse in nature, most inclusive in its admissions, and the “best” based on the international reputational ranking of our top research universities.
Yet the “system” has already begun to evolve. Many commentators, including Clayton Christensen and Thomas Friedman, have argued that the industry of higher education is ripe for disruption, and that a revolution is already underway in that industry. To paraphrase the seventies jazz poet Gil Scott-Heron, the revolution is being televised, blogged, tweeted, and MOOC’ed in ways that we could never even have dreamed just a few years ago. And it’s dramatically changing what people learn, where they learn it, and how they will use it in work and in life. As a result, we see evidence that “the system” is really an ecosystem of people, with students at the center. Institutions of higher education, with faculty playing an especially important role, are one major element of that ecosystem; policy and professional organizations, employers, and others are the rest. In other words, the colleges and universities aren’t the system, and the fact that we still see it that way confuses and perhaps even confounds progress at a time when the urgency for change is high.
Now, this doesn’t mean that colleges and universities as we know them are suddenly superfluous. Far from it. The knowledge-development role that colleges and universities play is critical, as is their broader role of service to community and society. But the institutional focus of the system—the idea that decisions and funding and policies should primarily respond to the needs of colleges and universities—is no longer appropriate, if it ever was. It must be replaced by a focus on meeting the needs of students and, by extension, the needs of society.
A redesign of college must also recognize that students themselves have changed. The traditional picture of a college student—eighteen years old, fresh out of high school, headed to campus for the first time, and destined to work the same job or career from graduation until retirement—is as antiquated as the system they are about to be submerged in. In fact, if you count the students who go to college with the support of their parents, attend full time, and live in college housing, you end up describing less than one out of every four college students today.
Today’s student population is huge and growing and remarkably diverse; in fact, it looks nothing like your father’s freshman class or even your own freshman classes. (I went to college in the eighties, and have the skinny ties and R.E.M. albums to prove it.) The 21st century student represents all ages and income groups, all races and ethnicities, part-time and full-time students, living on and way off campus, pursuing not just four-year degrees but also adding skills and credentials of all kinds to their personal portfolios. The main thing they all have in common is a search for a better and bigger return on their investment.
And college, for that matter, no longer looks like the college we still see in movies or on television: It’s not ivy-covered towers, professors in tweed, and students packed into lecture halls or poring over books in dorms. It ranges from students studying at community colleges—who alone represent almost one-half of the enrollment in US higher education—to adult learners taking undergraduate and graduate classes from for-profit entities. It’s online, on iPads, at a distance, at dinner tables, and in video conference rooms.
Indeed, it’s becoming increasingly clear that “college-level learning” does not even need to take place in a traditional institution of higher education as we know it. With the emergence of trends like taking a person’s prior military, education, national service, or work experience and assigning actual credit or value to it in a college context—prior learning assessment—we see the emergence of a new paradigm. In this new world, providing students smarter pathways into and through higher education will be critical. All learning should count. Everyone should know what degrees represent so they can be put to use most effectively, whether it’s for employment or further education, and everyone should know the next step they need to take to move toward their personal goals.
At its root, we need to rethink and reimagine the entire premise of higher education. We must ask ourselves what type of product we want to be produced and sold by the nation’s colleges and universities and other providers of postsecondary learning.
To me, one answer is a system that cultivates and tracks talent, and deploys a prepared and imaginative workforce that can obtain and create jobs, become this century’s bold innovators, and ensure that America thrives in the global economy. It’s a system that produces people who lead a good, moral, globally literate, and civically engaged and responsible life that we can all share. Higher education must be redesigned so that it truly serves our needs as a society.
One of the best commentators on, and critics of, the current model of American higher education, Kevin Carey, eloquently describes this path to higher education transformation in his 2015 book, The End of College. I count Kevin as a friend and ferociously smart colleague, and have learned a great deal from his insights on a variety of topics: college-level learning outcomes, student financing, and new postsecondary education delivery models, to name just a few. And his observation about getting to a true system redesign, I think, perfectly summarizes what’s ahead. “Many of those who have lived and learned in colleges as we know them cherish their memory and institutions,” Carey writes. “But the way we know them is not the only way they can be. Our lifetimes will see the birth of a better, higher learning.”
He’s right. And the single most drastic, effective, and, yes, revolutionary way to get there has to do with the end of time as we know it.
Perhaps the most outdated feature of our current higher education system is how we measure learning. Today, this is done according to the amount of time spent at desks and in classrooms—or sometimes even time spent online—rather than by how much students actually absorb and subsequently what they do with that knowledge.
But what would happen if we turned this system on its head? What if college credits were awarded based not on seat time, but rather on measurable learning? What if we prioritized outcomes over inputs?
Since the early days of the 20th century, the standard unit of college currency has been the credit hour. This, like so many of our institutions, came about by accident. When Andrew Carnegie attempted to adjust the low pay of professors at Cornell University, on whose board he served as a trustee, he created a pension system, which was then offered to other universities with a catch: All schools that bought into the pension plan, now known as TIAA-CREF, had to adopt the credit-hour system, which the National Education Association had used to determine high school credit. And with that, the Carnegie Unit became the standard metric of a student’s fluency in a subject, both in college and in high school. Today, 120 credit hours usually equals a bachelor’s degree.
And yet the credit hour was never meant to measure how much a student actually learned. The colleges and universities are merely measuring how many hours students put in, not what they are truly getting out of it or what they will take with them into the job market. To make matters worse, this outdated metric is totally unsuited to measuring many of the innovations that have the potential to reshape higher education. Online learning and curriculum customized to a student’s needs and ambitions, for example, don’t mesh with a system that relies solely on hours logged to determine what is being learned. And the credit hour isn’t even really “currency.” Many institutions don’t even accept credit transfers, or do so with so many caveats that no one who doesn’t already have a degree can figure out how it works. It’s a currency that all too often has no exchange rate.
So it’s time for a change. It’s time for a system that awards learning credits that are based on learning, not time. It’s time for a student-centered credentialing system that prioritizes what you know and can do over where and how you get your education. And the only way to do this is to remove and replace the credit hour.
How do we do this? Obviously, there are a lot of parts to this puzzle. Still, we know the basic aspects of the higher education system the nation needs: At its core, it’s a system that offers multiple, clearly marked pathways to various levels of student success—pathways that are affordable, clear and interconnected, with no dead ends, no cul-de-sacs and plenty of on- and off-ramps.
Second, these pathways must be built on the foundation of learning, with degrees and other postsecondary credentials representing those well-defined and transparent results mentioned a moment ago, validated through quality assessment. And as I noted earlier, all learning certified as high-quality should count—no matter how, when, or where it was obtained.
In the ideal scenario, then, in this new system every student will know where they are going, how much it will cost to get there, how much time it will take, and what to expect at journey’s end—both in terms of learning outcomes and career prospects.
We don’t have the luxury of commissioning long studies on how to make such a system a reality. Time really is our enemy at this point, given the urgency of our national need for talent. We also know that change is never easy. Still, the payoff for making this change will be huge, so let’s explore how it can happen.
Two major shifts in thinking need to undergird the system-redesign project, a pair of new perspectives that must drive all of the smaller changes. One has to do with how we might envision a system where institutions of higher education are no longer at the center. The other has to do with how the shift from a time-based system to a learning-focused one will actually happen.
In place of the time-based method of “keeping score,” a system must be constructed that defines, fosters, measures and rewards what truly matters: student learning.
I don’t think I can overstate this point when it comes to the redesigned higher education system: We must focus on learning outcomes as the true measure of educational quality. Not time, not institutional reputation (as U.S. News & World Report and other rankings do), but genuine learning. That is, those competencies that are informed by the real world in which students must thrive.
This will not be easy. Traditionally, Americans have closely identified where they went to school with who they are. Take a look at the Facebook account of anyone who has been to college. Most people list the school they attended as a key part of who they are—right alongside other important life information like relationship status, hometown, and current job.
More important, until fairly recently, employers often equated where people went to college with the quality of education they received. But a recent survey shows that employers are no longer on board with this outdated idea: Only 14 percent believe colleges and universities are preparing students adequately for work. Indeed, another survey of college alumni, conducted by Gallup and Purdue University, found that the factors associated with having a great job and a great life—workplace engagement and personal well-being—have no correlation with where you went to school. None. Ivy League versus public university, Top 100 rated versus others. It didn’t matter.
The proxy measures for quality have been just that—proxies. But as a better understanding has developed of what higher education produces for society—what we need it to produce in terms of more talent and more talented citizens—it has become clear that these proxies are insufficient.
– – – –
 Amanda Ripley, “College Is Dead. Long Live College!” Time, October 18, 2012.
 Charles Murray, “For Most People, College Is a Waste of Time,” The Wall Street Journal, August 13, 2008.
 Lauren Weber, “Do Too Many Young People Go to College?” The Wall Street Journal, June 21, 2012.
 Seth Roberts, “Why College Is Usually a Waste of Time,” The Huffington Post, May 25, 2011.
 “The Big Education Racket,” The Rush Limbaugh Show, October 27, 2011.
 “Billionaire Offers College Alternative,” CBS News, May 17, 2012.
 Richard B. Freeman, The Overeducated American, Academic Press, 1976.
 William G. Bowen, Matthew M. Chingos, Kelly A. Lack, and Thomas I. Nygren, “Interactive Learning Online at Public Universities: Evidence from Randomized Trials,” ITHAKA, May 2012.
 “Time Is the Enemy: The Surprising Truth About Why Today’s College Students Aren’t Graduating… and What Needs to Change,” Complete College America, September 2011.
 Kevin Carey, The End of College: Creating the Future of Learning and the University of Everywhere, Riverhead Books, 2015.
 Paul Fain, “Hour by Hour,” Inside Higher Ed, September 5, 2012.
 Lumina Gallup Poll, 2014.