Attract and Retain Learners with Digital Badges
Discover how digital badges create a positive experience for your learners.
Among the challenges facing higher education, few are more critical than effective instruction. In fact, for Carl Wieman, a Nobel Prize-winning physicist, the biggest problem with higher education today is its emphasis on publishing and research and its lack of focus on measuring teaching quality (Westervelt, 2017).
Professor Wieman’s concerns are corroborated by a 2015 report from the W.T. Grant, Bill and Melinda Gates, and Spencer Foundations, which found a dearth of tools to measure instruction at the postsecondary level. In fact, the report noted that it would make sense for foundations to “invest … in the development of measurement tools [whose] intended use [would be] to provide direction to [instructor’s] development” or “to support a taxonomy of instruction—a snapshot of what is happening in higher education today” (p. 3).
The School of Business and Technology at Excelsior College has always taken effective online instruction seriously but hasn’t always had the tools to measure it. Because of this, in 2014 a leadership team created an approach to measure online instructional quality through our Red-Yellow-Green (RYG) faculty development process. The goal of RYG is to create a shared vision and language for effective instruction, and ensure continuous improvement and professional development of instructors within our school.
As the college prepares to launch the second iteration of RYG’s observational audit, it is a good time to reflect on the lessons learned from RYG’s first implementation.
The Excelsior College Context
Founded by the New York State Board of Regents in 1971, Excelsior College (then known as Regents College) became host to one of the United States’ first external degree programs. It was not until 2006 that the College offered its own online courses.
Online courses at Excelsior College are delivered asynchronously in eight-week and 15-week formats to our working adult population, who are on average 37 years old and live throughout the country and the world. Courses are developed using a centralized, team-based approach. Multiple stakeholders—faculty, subject matter experts (SMEs), instructional designers, and assessment professionals—work collaboratively on the development of the syllabus, readings, assignments, discussions, and assessments.
The college relies on instructors—sometimes but not always the same as our SMEs—to breathe life into the course by providing content expertise, facilitating instructor-to-student and student-to-student engagement, giving timely and substantive feedback, and monitoring and evaluating student performance. The Community of Inquiry framework, an empirically tested online learning approach that focuses on social, cognitive, and teaching presence, is the college’s basis for fostering deep and meaningful learning (Garrison, Cleveland-Innes, & Vaughan, n.d.).
The Red-Yellow-Green Process
Our RYG process combines multiple methods to assess instructional quality: automated processes and checklists that track specific behaviors (e.g., logging into the course, number of discussion posts, assignments graded late); items from the end-of-course student evaluations that focus specifically on the instructor; and observational audits of courses, which use a protocol and rubric to identify instructors’ strengths and weaknesses across four areas.
Each of these components feeds into a final score, which serves as an indicator of overall instructional quality and helps determine the type of development and follow-up actions required for the instructor.
The most heavily weighted component is the observational audit, which views instruction within the context of a live section. Observational audits are conducted by a trained representative from the school, who uses a rubric to score instructors. The representative also provides narrative feedback to the faculty program director. The feedback is documented in a Microsoft Access database, which reports at the instructor and course levels and provides a longitudinal view of instructor performance.
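To make the scoring logic concrete, here is a minimal sketch of how weighted component scores might be combined into a composite rating. The weights, cutoffs, and function names are illustrative assumptions for this article, not Excelsior’s actual values; the only detail taken from the process described above is that the observational audit carries the heaviest weight.

```python
# Hypothetical sketch of combining RYG component scores.
# Weights and thresholds are illustrative assumptions only.

def ryg_score(automated, student_eval, audit, weights=(0.2, 0.3, 0.5)):
    """Weighted average of three component scores (each 0-100).
    The observational audit carries the heaviest weight."""
    w_auto, w_eval, w_audit = weights
    return w_auto * automated + w_eval * student_eval + w_audit * audit

def ryg_rating(score, green_cutoff=85, yellow_cutoff=70):
    """Map a composite score to a Red-Yellow-Green rating."""
    if score >= green_cutoff:
        return "green"
    if score >= yellow_cutoff:
        return "yellow"
    return "red"

# Example: strong audit performance, moderate student evaluations.
score = ryg_score(automated=90, student_eval=75, audit=88)
print(round(score, 1), ryg_rating(score))  # → 84.5 yellow
```

A rating near a cutoff would, in practice, be a prompt for the kind of follow-up conversation the process is designed to trigger, rather than a verdict on its own.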
The goal is to develop instructors by identifying their strengths and weaknesses. The RYG tool has helped highlight common issues in our courses, particularly around the provision of substantive feedback and general levels of engagement.
As Excelsior prepares to launch a revised evaluation tool and a full research study to examine the efficacy of the tool and its impact on learning, the college thought it was important to share our process and lessons learned with others in the community.
Buy-In: When the RYG process was launched, a presentation was provided to all stakeholders, including program directors and instructors. Expectations were shared and the tool’s purpose as a way of ensuring professional development and continuous improvement was stressed. Despite these efforts, some faculty remained wary of the approach and the use of terms like “red, yellow, and green.” They had questions about how it was being utilized and the potential impact it would have on their teaching assignments.
We learned that we needed to communicate more frequently and to use multiple methods to communicate our purpose. Furthermore, examples of how the process worked needed to be shared with stakeholders. Through frequent conversations, instructors came to understand that the tool is being used to establish clearer expectations and greater skill development.
Capacity: In the beginning, program directors were solely responsible for completing audits—a task that quickly became too onerous. An adjustment was made. Other members of the team began sharing this responsibility, resulting in more audits—and better quality audits—being completed.
Training: With multiple users, the question arose as to whether the tool was performing the way it was intended. Were the same behaviors being monitored and evaluated, and were the criteria being interpreted consistently? Over time, the importance of a systematic training process for the implementation and utilization of the tool became apparent.
Flexibility: In many ways, the initial tool led to conversations about our definitions of high-quality instruction. These conversations led to developing an inventory of good practices, habits and behaviors that were observed from our audits. As an outgrowth, we started establishing domains of quality instruction within our courses. In so doing, better operational definitions were developed and embedded in the next iteration of the tool, with the intention that it will more objectively measure high-quality instruction.
Ongoing Follow-up: We made many valuable observations of instructional quality in our courses, but did not always have a systematic process in place to manage our follow-up with individual instructors. The implementation of the Access database has allowed for tracking of our follow-up actions.
The new tool will be piloted across multiple schools at the college to validate the instrument. The pilot will evaluate RYG’s reliability for assessing instruction, as well as determine its impact on instruction within courses and student learning more generally. To accomplish this, Excelsior will be pursuing funding to support our research. The college is happy to share the tool and more lessons learned with other institutions.
– – – –
Garrison, D. R., Cleveland-Innes, M., & Vaughan, N. (n.d.). The Community of Inquiry [Web log]. Retrieved from https://coi.athabascau.ca/
Westervelt, E. (2017, June 7). Hey Higher Ed, Why Not Focus on Teaching? NPR. Retrieved from http://www.npr.org/sections/ed/2017/06/07/530909736/hey-higher-ed-why-not-focus-on-teaching
William T. Grant Foundation, Spencer Foundation, & Bill & Melinda Gates Foundation. (2015). Measuring Instruction in Higher Education: Summary of a Convening. New York, NY. Retrieved from http://wtgrantfoundation.org/library/uploads/2015/11/Measuring-Instruction-in-Higher-Education.pdf