Pursuing a Major Business Process Improvement Implementation: Establishing Priorities, Getting Started and Staying Focused

The EvoLLLution

This is the final installment of a three-part series exploring why you might take on a major project, reflecting on who needs to be involved (and when/how), and discussing what it takes to identify and prioritize the processes. In the first installment, the benefits of a major implementation were outlined. In the second installment, the authors dove into identifying the critical players who can move a project forward. In this final piece, Farthing reflects on the process of establishing priorities and staying on track.

At Ithaca College, we knew that we had to do this work; like any other organization, we just didn’t know how or where to start. Like many organizations, we hired a consultant with expertise in Enterprise Content Management to assist us. Even with a third party’s help in identifying the opportunities to improve, we still had a list of over 150 processes to choose from, which brought us back to the question: Where do we start?

The firm we engaged completed an assessment of the opportunities that would gain value from an institutional Enterprise Content Management (ECM) system. The assessment covered the capture and categorization of digital content from all sources, including email, born-digital content and scanned documents, as well as opportunities to use ECM to automate business processes and workflow, such as replacing paper forms with electronic ones. The analysis also included benefits, risks, cost estimates, staffing models, governance recommendations, document taxonomy classifications and deployment guidelines.

Their findings were organized into three categories: institutional administrative processes, core department findings and non-core department findings. These represented the processes we would improve upon. With this framework in hand, it was up to us to decide where to start and to realistically identify how much we could do in the time allotted and within the approved budget.

Analysis and Decision Making

With the Executive Steering Committee and a few subject matter experts, an internal analysis of the consultant’s findings was the first logical step. Based on what we learned, we were better positioned to identify the scope of the project and start making decisions. Ithaca College had never engaged in this type of work before, so we were charting new territory. We identified the high-level decisions that needed to be made: What application would we implement? What are the staffing requirements? How much budget will we need? When will we start? When will we be done? Where do we start?

Software Selection

We followed a standard RFP process that included vendor demonstrations of available products with the features and functionality we were looking for. Through this, we selected a vendor whose ECM platform was not only robust enough to meet our needs but also purpose-built for higher education. Once we knew what the application was, we could look at the technical skill sets required to administer and develop within it, and start defining the budget.


Information Technology

We looked at our existing resource pool within Information Technology to determine what skill sets we had in house. We made decisions to allocate development staff into the dedicated team for this effort and backfill their existing roles with temporary employees. This way, we would be able to allocate the resources back into their prior roles when the project ended.

We did identify a few roles that we did not have on staff: a project manager, a business analyst and a quality assurance developer. We hired a full-time temporary project manager for the implementation; the business analyst and quality assurance developer were dedicated full time to the project and retained in their positions after it ended to support new work.

We decided on a one-year ramp-up schedule for staffing so we could start making progress and gain experience before adding to the team. The ramp-up to full staffing was a great decision by the Steering Committee; if we had fully staffed the team at the start, we would have constrained ourselves. By allowing a ramp-up, we could pass the experience we gained on to new team members and, in turn, increase our development efficiency and overall velocity in delivering solutions.

Functional Office Staff

Each business process improvement through ECM involved a new team from each functional office. These resources were not dedicated to the project, so effective utilization of their time against their existing workloads and priorities presented a challenge.

Involvement of the business subject matter experts, with their knowledge of process and policy and their insight into areas of improvement, was critical to our success. What we found is that it is not always the manager, director or vice president of an area who has the requisite knowledge, but the office worker, specialist or, in some cases, the customer (the student). We needed these individuals available to inform us of their experience, to listen to their ideas and to generally brainstorm with our wildest imaginations to make the process better.


We had over 150 opportunities identified, and we needed to determine which was the highest priority to the institution, which was the lowest and everything in between. Essentially, we needed to know where to start and where we were going next. This was no easy task.

We identified the top 10 institutional priorities and assigned each a weight and a prioritization value. With this common scorecard, each opportunity was consistently evaluated to determine how important the business process was to the institution as a whole. Each process owner was responsible for scoring their improvement opportunities. The result was a stack-ranked list of the opportunities, in order, based on institutional importance. When the prioritized list was presented to the Steering Committee, there were very few debates about the order and, oddly, it followed the flow of student data from initial contact in the Office of Admission to the management of student data by Alumni Affairs after graduation.
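A weighted scorecard like the one described above can be sketched in a few lines of code. This is a hypothetical illustration only: the priority names, weights and ratings below are invented for the example and are not Ithaca College’s actual criteria.

```python
# Illustrative scorecard: each institutional priority carries a weight,
# each process owner rates their opportunity against each priority,
# and opportunities are stack-ranked by weighted score.

# Invented priorities and weights for illustration.
PRIORITY_WEIGHTS = {
    "student_experience": 5,
    "cost_reduction": 3,
    "compliance_risk": 4,
}

def score(opportunity):
    """Weighted sum of the owner's ratings across all priorities."""
    return sum(
        PRIORITY_WEIGHTS[priority] * rating
        for priority, rating in opportunity["ratings"].items()
    )

# Invented sample opportunities, rated 1-5 per priority.
opportunities = [
    {"name": "Admission application review",
     "ratings": {"student_experience": 5, "cost_reduction": 4, "compliance_risk": 3}},
    {"name": "Alumni address updates",
     "ratings": {"student_experience": 2, "cost_reduction": 2, "compliance_risk": 1}},
]

# Stack-rank: highest weighted score first.
ranked = sorted(opportunities, key=score, reverse=True)
for opp in ranked:
    print(opp["name"], score(opp))
```

Because every opportunity is scored against the same weights, the resulting order is consistent and easy to defend, which is what kept debate in the Steering Committee to a minimum.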

Having a common scorecard and a prioritized list is a nifty thing, but guess what: priorities change! No problem. When a shift in priorities occurred, we would reevaluate the list, account for new priorities and adjust as needed to stay consistent with institutional goals and values. This was a critical exercise, as it took guesswork, emotion, negotiation and a lot of meetings out of the equation.

Equally important were periodic reviews of the priority order. As time goes on, priorities change, new technologies are implemented, old processes are retired and new ones are created. Since this was a multi-year effort, we reviewed and reprioritized the overall list each semester. By the third year, each of the categorical subcommittees was able to self-govern and reprioritize with little intervention from the overall Steering Committee. This was due in part to the repeatable nature of the process and our ability to shift based on institutional need or where we could have the greatest impact.


We started with a pilot/proof of concept in our largest and most complex department: the Office of Admission. This made sense: not only was it prioritized first, but it is also where and when a student’s data enters our environment.

We had three key challenges: the team was not fully staffed, we had to adapt to new technology and, importantly, we had to adapt to change. These are not easy hurdles to overcome.

We were very fortunate to work with the dedicated individuals in this office; their commitment to change was critical to our success. This pilot project laid the groundwork for us to address the remaining opportunities and to learn from the experience we gained.


Requirements:

We worked with the functional office staff to identify how the process currently worked, develop an improved process and define how the new process would work with the application and automation. The requirements also covered how the system would work technically, how it would interface with and update ERP systems, and how it would function as a technical platform.


With a detailed requirements specification in hand, the development staff was better positioned to provide realistic estimates of the overall level of effort and to build a solution aligned with what the customer was actually asking for. Combine strong requirements with the feedback loops we created through Showcase and Feedback (discussed in Part 2 of this series) and you have the makings of a very strong development unit.

Testing, Remediation and More Testing:

Sometimes this can turn into an endless loop of enhancement and feature requests, and you will need a high-functioning testing resource to keep everything on track. We hired a quality assurance tester specifically for this project, solely tasked with creating test plans, use case scenarios and structured testing sessions with end users. When code was ready to be tested, it would first go to the QA resource, who would test the process and provide feedback to the development team. Structured testing sessions with the user base followed, to obtain their signoff on the delivered product. Errors identified during testing were documented, submitted back to development for remediation and then retested. Spending this much time in testing yielded excellent results, increased the overall quality of our work and supported our goal of getting it right the first time.

User Acceptance:

After multiple rounds of testing and remediation, we would be back in front of the Showcase and Feedback committee. Here we would walk through the entire process and obtain feedback on the delivered product. Our objective was to obtain acceptance from the user community in writing. As mentioned above, this was not an easy task; the community tends to try to include new features, functionality and enhancements. Having the scope of the delivered product identified up front provided an opportunity to control these requests so we could clearly showcase what was asked for at the outset.

Going Live:

Each new process represented a significant change for the people who used it. In an institution with over 6,000 students and nearly 2,000 faculty and staff, providing individualized training was not an option. We went with narrated screen videos and marketing through our institutional communications platform, the student newspaper and word of mouth. We found that, in most cases, because we had clear requirements and community feedback as we coded, the delivered product was intuitive and easy to use with little or no training. The key was marketing what was now available, how and where to access it, and who to go to for support.

Reflecting on the Process

Through this project, we changed the way business is conducted on our campus. We improved operations, increased efficiency and reduced cost: the project deliverable trifecta! We learned quite a bit about ourselves and created opportunities to improve how we work within IT and, even more, with all of the constituencies across our campus. To be successful, everyone needed to be totally committed to the work, and we needed support from every channel on campus that we impacted. This was no small task and, going back to the beginning, it took a college: not a single individual or department, but a community of diverse professionals interacting with the student population.

Previous Articles in this Series

Pursuing a Major Business Process Improvement Implementation: Why Take the Plunge? (Part 1)

Pursuing a Major Business Process Improvement Implementation: Reflecting on Who Needs to be Involved… and When (Part 2)