Badges Won’t Save Us: What Skills Credentialing Actually Requires
Tara J. Konya | Director of MBA and Graduate Business Programs, University of New England
Sonja Strahl | Director of Learning Solutions, University of Maryland Global Campus
When we set out to embed skills-based digital credentials into our online graduate business programs, we were confident. We had institutional support for this pilot, redesigned courses with embedded skills and the conviction that graduates needed more than a transcript. The business world was evolving, and students needed verifiable evidence of what they could do. What followed was not easy. It felt like navigating uncharted terrain with no trail or map. What we share here are the lessons we wish someone had told us before we started.
The Problem We Thought We Were Solving
The problem was clear: Business school graduates struggled to articulate what they had learned in employer-facing terms. A student might say, “I took a financial management class and earned an A,” but an employer wants to know what that actually means about their abilities. Badging and digital credentials seemed like the obvious fix. What we quickly learned was that building skills and describing them in language employers understand are two different challenges. Without shared definitions and clear evidence, a credential carries little weight.
Academia is at a crossroads, and we can no longer afford to wait. The research is clear: technical skills are necessary but no longer sufficient. The WEF Future of Jobs Report 2025 finds employers expect 39% of core skills to change by 2030, with human skills such as resilience, flexibility, agility, creative thinking and leadership rising fastest alongside AI and big data competencies. Deloitte’s 2026 Global Human Capital Trends puts it plainly: “Competitive advantage is now primarily less driven by technology differentiation and more by cultivating the human edge.”
Most industry reports are written for employers, not for universities responsible for preparing students with these skills. It therefore falls to academia to translate what employers need into what we teach and make sure graduates can speak that language. When we began this project, we realized how much foundational work that translation required.
Lesson One: “Skills” Means Something Different to Everyone in the Room
“Skills” functions like a homonym. Everyone hears the same word but carries a different meaning. Faculty interpreted skills in two ways, and both were problematic for our work. One group oversimplified skills as course topics such as marketing or operations. The result was that credentials documented technical content covered in courses, not demonstrated capability. The second group assumed existing learning outcomes were already skills. They were not. Outcomes written as “students will summarize…” describe intentions, not observable performance in the context of an authentic experience.
Administrators frequently cited “critical thinking” without defining how to assess it. Marketing focused on search engine optimization (SEO). Career services spoke employer language disconnected from course content, and students described themselves as good decision makers but couldn’t show evidence.
The lesson? Translation across stakeholder groups must happen as the first step in the process, before making any platform decisions, before developing any marketing materials and before redesigning or updating any courses. Skills must be clearly defined and that definition agreed upon. Moving forward without a shared vocabulary produces confusion and misunderstanding, resulting in digital credentials that no one fully understands or trusts.
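To make the idea of an agreed-upon definition concrete, here is a minimal sketch of what one entry in a shared skills vocabulary might look like as a data record. The field names and the example skill are our own hypothetical choices, not part of any standard; the point is simply that every stakeholder group reads from the same structure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SkillDefinition:
    """One agreed-upon entry in a shared institutional skills vocabulary."""
    name: str                 # the label every stakeholder group uses
    observable_behavior: str  # what a student actually does to demonstrate it
    evidence: str             # the artifact that proves the behavior occurred
    employer_language: str    # how a job description would phrase the skill

# A hypothetical entry, settled before any platform or marketing decisions:
data_storytelling = SkillDefinition(
    name="Data Storytelling",
    observable_behavior="Presents a recommendation supported by analyzed data",
    evidence="Recorded stakeholder briefing with slide deck and dataset",
    employer_language="Communicates data-driven insights to non-technical audiences",
)
```

A record like this gives faculty, career services and marketing one source of truth to argue over once, rather than three private definitions to discover later.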
Lesson Two: Course Outcomes Are Not Skills
At first glance, our course outcomes appeared to align with employer job descriptions, so we assumed the alignment work was done. It wasn’t. True skills alignment requires connecting three elements: academic content (what is taught), assessments (the evidence students produce) and labor market demand. When we examined our assessments closely, we found they evaluated outcomes or objectives rather than skills. This lack of alignment is understandable, given that the courses were designed around the traditional academic focus on outcomes, but the expectation has shifted: the real-world skills that employers value must now drive course design.
A rubric rewarding case-study analysis in a discussion forum is not evidence a student can perform that analysis on the job. Assessments must produce a specific, verifiable artifact tied to a rubric that defines expected performance at a measurable quality level. These tools are essential for issuing a verifiable credential. Without them, all you have is a grade: a classroom measure, not a professional credential.
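The issuing logic above can be sketched in a few lines. This is a hypothetical illustration, not any platform’s actual API: the credential payload exists only when a specific artifact meets a rubric-defined proficiency threshold, which is exactly what separates a credential from a grade.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AssessmentEvidence:
    """Ties a specific student artifact to one rubric criterion and score."""
    artifact_url: str      # the verifiable work product
    rubric_criterion: str  # the performance being measured
    score: int             # level achieved on the rubric scale
    max_score: int         # top of the rubric scale

def credential_decision(evidence: AssessmentEvidence,
                        proficiency_threshold: int) -> Optional[dict]:
    """Issue a credential payload only when the rubric score meets the
    agreed proficiency threshold; otherwise issue nothing at all."""
    if evidence.score < proficiency_threshold:
        return None  # a passing grade alone is not credential-worthy evidence
    return {
        "criterion": evidence.rubric_criterion,
        "evidence": evidence.artifact_url,
        "achieved": f"{evidence.score}/{evidence.max_score}",
    }
```

Note the design choice: the function returns nothing rather than a weaker badge, reflecting the principle that not every accomplishment deserves a credential.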
Lesson Three: Data Is a Governance Issue, Not an IT Issue
A badge is only as credible as the data behind it. When we examined course assessment data, we assumed gathering it would be straightforward. Instead, we found traditional academic systems that didn’t easily communicate, data that was too aggregated to be actionable and inconsistencies in how performance was captured and used. Faculty could adjust grades and assignments to create individualized experiences, but uniqueness is difficult to measure or track and it breaks systems. We also had no mechanism for easily transferring student performance data to external platforms. In short, we had selected a credentialing platform before building the infrastructure to support it. What you track is what you trust. We had no reliable way to track anything.
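Had we built the infrastructure first, the minimum it would have needed is an audit step between the academic systems and the credentialing platform. The sketch below assumes hypothetical field names; the idea is simply that a record which cannot answer “who, which criterion, at what level, on what scale” never reaches the external platform.

```python
# Fields a credential-worthy assessment record must carry before export
# (hypothetical names for illustration, not any platform's schema).
REQUIRED_FIELDS = {"student_id", "course", "rubric_criterion", "score", "scale_max"}

def audit_record(record: dict) -> list[str]:
    """Return the problems that would make this record untrustworthy as
    evidence behind a badge; an empty list means it is safe to export."""
    problems = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "score" in record and "scale_max" in record:
        if not (0 <= record["score"] <= record["scale_max"]):
            problems.append("score outside rubric scale")
    return problems
```

Individualized assignments and adjusted grades are exactly the kind of uniqueness that fails a check like this, which is why governance, not IT, has to decide what the systems are allowed to vary.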
What We Would Tell Institutions Starting Today
- Build your team. The essential five: (1) an academic leader with faculty credibility, (2) a registrar who understands what data can be credentialed, (3) a data systems lead to map assessment-to-platform flow, (4) a career services partner to validate skills language against employer demands, and (5) an instructional designer as your linchpin—connecting outcomes, skills and rubrics while designing learning experiences that produce verifiable evidence of mastery.
- Build shared vocabulary. Begin using shared language across faculty, staff and leadership even if it isn’t fully finalized. Collaboration will close gaps over time. Bring real job descriptions into conversations on curricula, so faculty can see how employers describe the skills students develop through coursework.
- Start with a pilot, not the entire institution. Identify two or three courses, or a small program, with faculty who are curious and willing to be bold. Test the framework, create and validate data flows, and refine the translation between academic and employer language. Share what you learn broadly with the team and across your institution to begin shifting the culture toward a focus on skills, then scale!
- Don’t build a badge; build a system. Most institutions start with the credential and work backward. Start instead with governance, assessment design and data infrastructure. The badge is simply what comes out the other end. A solid framework is what allows you to scale across programs without starting from scratch each time.
- Keep it simple. If a student cannot name the skill, explain what they did to earn it and articulate why it matters to an employer, your work has failed the student. Institutional complexity should never burden the student, and not every accomplishment deserves a badge. Over-credentialing only overcomplicates the experience, creating confusion for both students and employers.
The Bigger Question
Students pursue degrees with purpose, expecting education that prepares them for career success. Graduate business programs exist to advance careers, but that promise is unfulfilled if graduates don’t fully understand the skills they’ve acquired or cannot demonstrate what they’ve learned. Making learning visible is no longer optional in today’s business world. The real question is whether institutions will evolve to meet these changing needs, providing verified evidence, employer-recognized language and infrastructure to capture true student learning, or remain in the traditional model of assumed mastery, where capability goes unarticulated. Students deserve to confidently name their skills and advocate for their potential.
We are still on this journey, especially as AI becomes the driver of new career opportunities. The wilderness metaphor still fits; there is no established trail or guardrails, but we are walking with better maps than when we started and with a hard-won conviction that the most important work happens long before the first badge is issued. For every academic leader considering this journey, the question is not “Are we ready to launch?” but “Are we ready to build?”
Additional Resources
World Economic Forum. (2025). Future of Jobs Report 2025. https://www.weforum.org/publications/the-future-of-jobs-report-2025/in-full/
Deloitte. (2026). 2026 Global Human Capital Trends: From tensions to tipping points. https://www.deloitte.com/us/en/insights/topics/talent/human-capital-trends.html
McKinsey & Company. (2023). We are all techies now: Digital skill building for the future. https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/we-are-all-techies-now-digital-skill-building-for-the-future
Lightcast. (2024). The workforce of the future: Harnessing AI for tomorrow’s jobs. https://lightcast.io/resources/blog/the-workforce-of-the-future-harnessing-ai-for-tomorrows-jobs