Accreditation: Disruption or Evolution? (Part 1)

Chari Leader-Kelley | Senior Fellow, CAEL
A student graduates from high school, enrolls immediately in college and takes five years to complete a bachelor of science in sports management. He graduates with $25,000 in student loans and zero job prospects. Now he is considering graduate school: deferring his student loans, racking up another $25,000 in debt and hoping for a job three years from now.
Should this student go to graduate school? Would he be better off taking a job at a telemarketing firm or with a temporary agency instead?
A single parent earning $30,000 a year registers for college, seeking a better career pathway and some assurance of a stronger economic future. She struggles to complete two courses per semester and eventually drops out after two years, $10,000 in debt. When she later re-enrolls, she finds that many of her courses will not transfer and that her earlier course choices no longer fit her new career plan. Her future is uncertain, and her new college offers an accelerated program at a much higher tuition rate, bundled with financing.
Does she know her real chances of completing the degree, paying off her debt over time and truly earning more money with some job security?
These scenarios are very real for today’s college students, particularly first-generation students. In fact, over the next decade, increasing numbers of cash-strapped adults will be asking, “Should I complete a bachelor’s degree, or go on to get a master’s degree, in order to be eligible for better-paying jobs or career pathways?”
What assurances do prospective students have that the education they will be paying for (likely for many years to come) will indeed yield the opportunities of their dreams?
These are the underpinning issues that have caused many to question the effectiveness of accreditation in today’s environment. Accreditation has been a process (at least for the regional accrediting bodies) focused on self-study, strategic planning, adequate resources, governance and collegiality and peers reviewing peers — all in a somewhat voluntary system. Anyone who has worked at a college knows the main emphasis has been on how well a college articulates and embodies its mission. And a mission statement is drafted carefully to be sufficiently vague to allow virtually any activity the college/university seeks to undertake to relate to its mission. However, much that has been traditionally expected — the institutional self-study, evidence of data-driven decisions, student retention and graduation and faculty governance — has tilted the process and measures of accreditation off balance. Processes designed for traditional institutions serving traditional students no longer align with today’s mobile, older and more career-driven students attending college while balancing multiple commitments. The tension is ramping up as state and federal governments seek a better return on their investments in today’s colleges and universities, particularly with an uncertain economy ahead.
Policymakers are calling for a level of transparency and accountability that colleges and universities have long been immune to or easily able to dodge. At the same time, President Obama’s push for “institutional report cards” tied to Title IV funding, while unlikely to be approved as envisioned, may improve transparency for consumers and help accreditors distill key indicators of quality.
This is the first of a two-part series. To read the final installment, please click here.
Author Perspective: Association