Beyond Remediation: Using Technology to Maximize Retention and Completion
Raising completion and attainment rates nationally is of critical importance to the health of the economy, but accomplishing this has been challenging for higher education institutions. It requires them to attract and serve new groups of students who may not be as academically prepared as colleges and universities would like, and while remedial programming helps the most motivated of those learners, it often leads to stop-outs and drop-outs. Excelsior College recently received a grant from the Department of Education to create the Diagnostic Assessment and Achievement of College Skills (DAACS), an open-source assessment tool that will target resources and services to students based on their academic and non-academic needs. In this interview, Jason Bryer, who is leading the charge to create the tool, discusses the process of making this idea a reality and shares his thoughts on its value to the postsecondary environment.
The EvoLLLution (Evo): What is the Diagnostic Assessment and Achievement of College Skills (DAACS)?
Jason Bryer (JB): The DAACS is a low-stakes formative assessment designed to give students and academic advisors a sense of a student’s strengths and weaknesses. We came to this by looking at placement exams. At one point, at Excelsior College in particular, we were trying to evaluate what we know about students when they enter the college. Students bring a great many transcripts with them when they enter the college, in pursuit of transfer credit and prior learning assessment, but some of those transfer credits can be fairly old. This creates an issue for us that many other institutions run into, which is, “What does the student actually need to prove knowledge?”
Until recently, the answer was a placement exam, but that’s not really in line with the college’s mission or philosophy. We don’t want to track students into remedial courses, but we still want to have some valuable information about students’ preparedness. We wanted to fix what was wrong with placement exams while keeping what’s right about them.
One of the things that’s wrong with them is that they’re high stakes. If you look at the literature base, students who are identified as unprepared tend to, at best, delay graduation. At worst, they eventually withdraw from the institution without ever having begun credit-bearing activities.
DAACS is going to be module-based, in the version we will be making available, with a focus on reading, writing and math because those are core areas in which we assume students have some background knowledge when they enroll in a credit course. The innovative piece, though, will be on the non-academic side, where students will receive constant feedback about their performance. We’re going to help students understand their strengths and how to play to their strengths, but also identify their weaknesses and direct them toward resources that help them to shore up those deficiencies.
How do institutions benefit from implementing systems like DAACS?
DAACS tools help to improve the efficacy and accuracy of our predictive models. Many institutions are spending significant time, effort and money developing predictive analytics models and departments. Excelsior is no different. But the models are only as good as the data you have. Often, you see a great deal of research looking at how SAT scores or high school GPAs can be used to predict postsecondary success.
Those measures account for only 10 to 20 percent of the variance in first-year retention. If we can collect more student information, in a way that makes students want to share it, I think we can be more accurate with predictive analytics, especially in terms of retention. This allows us to make better use of the limited resources we have.
We can use academic advisors who can reach out to students we think will be “high risk” based on their responses to DAACS and other proven variables and get them the resources they need.
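To make this concrete, the kind of triage described above might look something like the sketch below. The field names, weights and threshold are purely illustrative assumptions (DAACS’s actual model is not specified in the interview); the point is simply how assessment results and other variables could combine into a risk score that flags students for advisor outreach.

```python
# Hypothetical risk-flagging sketch -- all weights, field names, and
# the threshold are illustrative assumptions, not the DAACS model.

def risk_score(student: dict) -> float:
    """Combine hypothetical predictors into a 0..1 risk score
    (higher means more likely to need advisor outreach)."""
    weights = {
        "math": 0.30,      # each score is 0..1; we weight the deficiency (1 - score)
        "reading": 0.25,
        "writing": 0.25,
        "self_reg": 0.20,  # non-academic: a self-regulation survey result
    }
    return sum(w * (1.0 - student[k]) for k, w in weights.items())

def flag_high_risk(students, threshold=0.5):
    """Return the ids of students whose risk score exceeds the threshold."""
    return [s["id"] for s in students if risk_score(s) > threshold]

students = [
    {"id": "A1", "math": 0.9, "reading": 0.8, "writing": 0.85, "self_reg": 0.7},
    {"id": "B2", "math": 0.3, "reading": 0.5, "writing": 0.4, "self_reg": 0.2},
]
print(flag_high_risk(students))  # ['B2'] -- low scores push B2 over the threshold
```

In practice such a model would be fit to institutional data rather than hand-weighted, but the output is the same: a short list of students for advisors to reach out to first.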
What will be involved in developing this tool?
DAACS is going to be an open-source solution that will be available for other institutions to use. We’re trying to use our available resources and build it on top of other open-source software. We’ll be using open-source computer-adaptive testing software from Cambridge University, along with other open-source tools, to do all the assessment. The assessments we’re using already exist: surveys of traits like grit and determination, college-readiness exams and other tools.
We’ll be spending our time integrating these resources into a single system and then taking students’ results and building the software that gives them a personalized, custom report immediately, and all the feedback that goes with it.
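The immediate, personalized report described above could be sketched as a simple mapping from module scores to feedback text. The domain names, score bands and wording below are illustrative assumptions, not the actual DAACS report format.

```python
# Hypothetical per-domain feedback sketch -- bands and wording are
# assumptions, not the actual DAACS report.

FEEDBACK = {
    "low":  "This looks like an area to strengthen before your first course.",
    "mid":  "You have a workable foundation; targeted review would help.",
    "high": "This is a strength -- plan to build on it.",
}

def band(score: float) -> str:
    """Map a 0..1 module score to a feedback band."""
    if score < 0.4:
        return "low"
    if score < 0.7:
        return "mid"
    return "high"

def build_report(results: dict) -> list:
    """Turn raw module scores into per-domain feedback lines."""
    return [f"{domain.title()}: {FEEDBACK[band(score)]}"
            for domain, score in results.items()]

for line in build_report({"reading": 0.8, "writing": 0.55, "math": 0.3}):
    print(line)
```

A real report would also link each weak domain to specific resources, which is where the advisor-facing side of the system comes in.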
What impact do you hope the DAACS will have on non-traditional student success in future?
We have short-term and long-term goals. From our own internal research, we know that the sooner students engage in successful credit acquisition, the more likely they are to persist and graduate—it’s a chain reaction. So, of course, our long-term goal is to increase completion rates, but it’s hard to attribute graduation outcomes to something that happens when students enroll.
Of course, if the student has a bad experience enrolling and fails their first course, the likelihood of that student leaving is very high. So we will be looking at these short-term goals: Can we increase a student’s likelihood for success in their first course? Can we help students start their educational experience in a positive way with realistic expectations and realistic goals about what they can achieve? What we find is that some students come in highly motivated but their goals are unrealistic, and that’s problematic because hitting one little speed bump can often derail them.
The short-term goals are getting students on-track to successful credit acquisition that hopefully starts a chain reaction getting them to a second course and eventually to completion.
What will be some of the most significant challenges you envision facing in the development of the DAACS?
The biggest challenge is getting students to take these results seriously and really reflect upon them. I don’t foresee huge challenges in getting an assessment tool up and running or in providing the feedback that students need. The challenge is in presenting the feedback in a way that the students can relate to, can act upon and be motivated by.
Given what we know in the K-12 space, in the wake of No Child Left Behind and even with the Common Core initiatives, we have students coming out of high school who have been “assessed to death.” And then, the first thing they do when they come to college is take another assessment. They need to understand that this is not the kind of summative assessment they’re used to. It’s an assessment designed to help them and to get them to a place where they can see this as a helpful tool to become better learners, not a hindrance.
That means we’ll need to work with academic advisors as well to ensure students can make sense of the reports.
What kind of timeline do you think there will be between the launch of the pilots at Excelsior and WGU and a wider rollout to the rest of the market?
We are anticipating piloting in early 2017, and we think this will take about six months. Both WGU and Excelsior have continuous enrollment, with large numbers of students coming in each month, so within six months we will have covered a large number of students.
It’s likely that we will roll out on a wider scale in a little over two years. Hopefully other institutions will look at this at that point and then contribute to it.
We’re building this in a modular fashion, especially on the non-academic side, and we’re including the core academic areas as well. We can foresee in the future, though, other institutions developing modules that pertain to areas important to them. So a STEM-focused institution, for example, may develop a sciences section.
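One common way to support that kind of extensibility is a registry that new assessment modules plug into without touching core code. This is a hypothetical design sketch, not DAACS’s actual architecture; the module names and scoring are made up for illustration.

```python
# Hypothetical plug-in registry sketch -- not DAACS's real architecture.

MODULES = {}

def register(name):
    """Decorator that adds an assessment module's scorer to the registry."""
    def wrap(fn):
        MODULES[name] = fn
        return fn
    return wrap

@register("math")
def math_module(responses):
    """Score a list of 0/1 item responses as a proportion correct."""
    return sum(responses) / len(responses)

# A STEM-focused institution could later add its own domain this way:
@register("sciences")
def sciences_module(responses):
    return sum(responses) / len(responses)

print(sorted(MODULES))  # ['math', 'sciences']
```

The core system only needs to iterate over whatever is registered, so an institution’s custom section behaves exactly like a built-in one.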
This interview has been edited for length.
Author Perspective: Administrator