Measuring Prior Learning Success by Leveraging Data
Joseph D. Levy | Executive Director of Assessment and Accreditation, National Louis University
When thinking about Prior Learning Assessment (PLA) success, it’s all about the data.
PLA programs lend themselves to an array of data informed possibilities (both quantitative and qualitative). At the very least, PLA data can provide:
- PLA program usage (overall and individual program results and trends)
- Demographics of students using PLA programs (e.g., age, gender, race, major program)
- Student learning outcomes (how we know they learn, showing what learning looks like)
- Pass rates for PLA-related exams or courses
- Survey feedback for PLA-related programs or courses
This is not an exhaustive list, but even this sampling provides more than enough information to inform student successes or struggles by individual PLA programs. It can also inform who those students are and what achievement looks like for them. Such insights can benefit the student individually, as well as directly influence PLA program offerings or coordination in order to better promote future student success.
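As a hypothetical illustration of what working with such data can look like (the records, field names, and categories below are invented for the example, not NLU's actual system), a short script can turn raw PLA records into the usage, pass-rate, and demographic summaries described above:

```python
from collections import Counter, defaultdict

# Hypothetical PLA records; in practice these would come from the
# student information system. Field names are assumptions.
records = [
    {"program": "CLEP exam", "age_band": "25-34", "passed": True},
    {"program": "CLEP exam", "age_band": "35-44", "passed": False},
    {"program": "Portfolio review", "age_band": "25-34", "passed": True},
    {"program": "Portfolio review", "age_band": "45+", "passed": True},
]

# Overall program usage: how many students used each PLA option.
usage = Counter(r["program"] for r in records)

# Pass rate per PLA program.
attempts, passes = defaultdict(int), defaultdict(int)
for r in records:
    attempts[r["program"]] += 1
    passes[r["program"]] += r["passed"]
pass_rate = {p: passes[p] / attempts[p] for p in attempts}

# Demographic breakdown of PLA users by age band.
demographics = Counter(r["age_band"] for r in records)

print(usage)         # usage count per program
print(pass_rate)     # fraction passing per program
print(demographics)  # PLA users by age band
```

Summaries like these are the raw material for the interpretation and storytelling steps discussed below.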
PLA data can benefit a host of stakeholders by providing meaningful and useful information. Areas such as enrollment, marketing, academic advising, and learning support can leverage the data to better recruit, engage, advise, and support students to participate in PLA. Faculty members — even those not engaged in PLA-related work — can benefit from knowing about PLA results in order to recommend it as an option for their students. Most importantly, this data benefits prospective, current, and former students by making them aware of opportunities available at the institution to recognize their lived experiences and knowledge as credit toward a potential or existing credential.
Directions for Data
Leveraging data to make improvements that positively impact student achievement, as well as to grow offerings or reinforce PLA program sustainability, requires intentional effort. Data does not exist in isolation, nor is it magically interpreted on its own. Because data interpretation may not be second nature for everyone, consider the following five tips:
1. Connect your results to your objectives
Hopefully, you are not collecting data on things that do not matter to you. To that end, ground your results in purpose. For each data point you collect, know what you are reporting on and what question it is meant to answer or illuminate.
2. Make meaning of your results
It is not enough to just share or report data as it stands. Data alone doesn’t tell a story. Build a narrative around your numbers to explain your evidence. You might contextualize what high or low numbers mean in a given circumstance, or perhaps justify why a percentage is used instead of a count. As part of this process, be sure to indicate how the results may point to strengths or areas to improve in relation to your programs. Meaning and interpretation become your messages to inform and engage your audience.
3. Craft your content
Once you have interpreted your results, you’ll want to codify your findings as more than just data points strung together. Identify overarching themes; build an argument; present a beginning, middle, and end or next steps for your content. Compile your story and present it in a compelling way.
4. Share your story
Be ready to share versions of your story to multiple audiences through different media. While the data collected was grounded in your objectives, you want to present your results as relevant to your audiences. Cater to the needs and interests of your readers. Find ways to embed your results in regular communications, meetings, or activities to show connection and relatedness of your results to institutional operations.
5. Leverage as strategy
Let your data serve as a catalyst for collaboration and connection over common causes. Use your results as evidence and to amplify voices or efforts that need support. Additionally, your results should inform future plans, goals, and targets. Allow your findings to guide your intentions as you begin the assessment cycle anew.
At NLU, manageable initiatives to leverage PLA data have been created and implemented to provide meaningful information for program sustainability and growth. Operational objectives and learning outcomes were articulated and aligned. A formal assessment plan was created. Results were analyzed, interpreted, and reported to relevant audiences to invite support in acting on evidence for institutional betterment.
While we continue to gather and analyze data from our PLA programs, our short- and long-term goals include: increasing PLA program usage, monitoring demographic differences in PLA student success, expanding faculty engagement and the number of academic majors employing PLA programs, and executing a marketing plan to create a better understanding of PLA opportunities and the students who use PLA.
Our focus this year has been all about the data, but with that comes intentional grounding in our goals, commitment to making meaning of the results, and acting on our findings, including sharing our story in a way that resonates with different audiences. Our assessment efforts have proven enlightening and informative, and we are measuring the impact our actions have had on institutional support and student success. We hope these tips prove useful for you and your efforts. We look forward to hearing your story of turning data into information in the future!
Author Perspective: Administrator