Published on 2014/01/08


Technology, Big Data and the Future of Educational Privacy
While technology partnerships are critical for operational efficiency, institutions must remain keenly aware of their obligations regarding information security and privacy.
Colleges and universities are increasingly relying on technology partnerships to improve administrative and academic services, to gain efficiencies and to produce better outcomes. In the United States, higher education is under siege for unrelenting tuition hikes that have consistently outstripped both inflation and real incomes, while policymakers and the public grow ever more vocal in their demands for greater accountability. Ever-grander deployment of technology has been one of the strategies colleges have adopted in the hope of containing costs and, ideally, improving outcomes. Not coincidentally, this greater reliance on technology produces massive volumes of new data, which a broad and amorphous crowd of Big Data boosters — researchers, state and federal policymakers, educational reformers and IT companies — believes will ensure not only more institutional accountability, but also better pedagogical diagnostics and better teaching and learning.

Technology has certainly had a salutary impact on administrative practices in higher education. Self-service, 24/7 availability of basic administrative transactions — ranging from admissions to registration to financial and student account issues — has vastly improved the student experience compared with the long lines, inconvenient hours and “arena events” that typified these functions a generation ago. Beyond administrative functions, the core functions of teaching and learning have likewise seen enormous improvements because of technology. Aside from more timely and broader access to information, course content and educational material, colleges can now offer much richer content, online lectures, real-time tutoring and technology-enabled learning analytics that promise to transform higher education in fundamental ways. These improvements should certainly be celebrated to the extent that they make higher education more effective, efficient, affordable and more broadly available.

As with other transformational shifts, however, it is important to pragmatically embrace the new while paying realistic attention to its potential risks. The ubiquity of technology on campus — often implemented through contracts with outside vendors — is generating massive amounts of very sensitive and commercially valuable personal data about students, and it is increasingly allowing third parties to access, warehouse, mine and use that data in ways that threaten well-established precepts of good privacy practice.

Such seemingly pedestrian functions as simple storage of enrollment records can quickly transform into massive longitudinal individual tracking systems that generate information previously known only to the students themselves and, in some cases, information that even the subjects don’t know about themselves. Enrollment tracking data repositories, for example, can now trace individuals across multiple institutions — from K-12 through graduate school — and even follow them into the labor market. Various efforts are underway to link these data marts with other data systems — health records, juvenile justice, police and court records, military service and unemployment insurance — to create “human capital” data systems that could be used to promote more effective policy and planning. The Big Brother undertones of even the most basic deployments of such systems are fairly obvious. What’s more, the well-intentioned aspirations of these attempts at social engineering often involve providing access to third parties, typically IT firms, and often start-ups with promising insights but without the deep pockets or the track record needed to ensure they will be around to face the consequences of possible breaches or internal misuses of the data entrusted to them.
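To make the mechanism concrete, the sketch below (all identifiers, records and field names are hypothetical; Python is used purely for illustration) shows how trivially separate record systems collapse into a single longitudinal profile once they share a persistent student identifier.

```python
# A minimal, illustrative sketch: once separate record systems share a
# persistent student identifier, joining them into one longitudinal
# profile is trivial. All data and field names here are hypothetical.

k12_records = {"S-1001": {"district": "Springfield USD", "hs_grad_year": 2010}}
college_records = {"S-1001": {"institution": "State University", "major": "Biology"}}
wage_records = {"S-1001": {"employer_sector": "Healthcare", "quarterly_wage": 9800}}

def link_profiles(*systems: dict) -> dict:
    """Merge per-system records into a single profile per shared identifier."""
    profiles: dict[str, dict] = {}
    for system in systems:
        for person_id, record in system.items():
            profiles.setdefault(person_id, {}).update(record)
    return profiles

# One "pedestrian" join yields a K-12-through-labor-market tracking record.
print(link_profiles(k12_records, college_records, wage_records))
```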

Insufficient privacy protections in the implementation of IT on campus can quickly lead to dystopian extremes that exceed Orwell’s worst fears. Consider, for example, the growing use of web-based instructional materials and e-textbooks. As more analytics are embedded into instructional material, more information about students’ habits, intellectual strengths and weaknesses, and modes of engagement with the subject matter is being collected, analyzed and stored. While this information can certainly lead to breakthroughs in teaching and the advancement of learning, its possible misuses and its potentially detrimental impact on individual privacy are fairly obvious. Colleges and universities understand the importance of protecting the privacy of their students, and realize that their excellent track record in that regard is the reason for the high levels of trust students and families place in them.
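As a purely illustrative sketch, the hypothetical event schema below (not any vendor's actual telemetry) shows the kind of per-interaction record an analytics-enabled e-textbook could emit; each field looks innocuous in isolation, but accumulated over a semester such records profile a student's habits and struggles.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical event schema, for illustration only. Each field is innocuous
# alone; accumulated over time, such records profile reading habits,
# struggles and study patterns.

@dataclass
class ReadingEvent:
    student_id: str         # persistent identifier tying every event to one person
    title: str              # which text and chapter were open
    page: int               # where in the material the student was
    seconds_on_page: float  # dwell time, a rough proxy for difficulty or attention
    highlights: int         # engagement signals captured by the reader app
    quiz_attempts: int      # how many tries an embedded practice question took
    timestamp_utc: str      # when the interaction happened

event = ReadingEvent(
    student_id="S-1001",
    title="Introductory Microeconomics, ch. 4",
    page=112,
    seconds_on_page=487.0,
    highlights=3,
    quiz_attempts=2,
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
)
print(asdict(event))  # the kind of record shipped off to an analytics backend
```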

But with the rapid growth of technology on campus, institutional attention to privacy has too frequently degenerated into platitudes and abstract contract provisions that confuse security with privacy. Security addresses the prevention of unauthorized access to information, which is indeed a huge problem in itself. Privacy, in contrast, has to do with authorized access, and contracts should, at a minimum, articulate with specificity and intentionality what information is collected, why it is collected, who will get to access and use it, whether and under what circumstances it may be re-disclosed, how long it will be retained, how it will be disposed of, what rights the subjects of the information have to review and correct their records, and what obligations and liabilities accrue to custodians of personal information. In the United States, many IT contracts fall far short of this level of concreteness, relying instead on vague references to confidentiality and to the federal educational privacy law known as FERPA, the Family Educational Rights and Privacy Act of 1974, as amended.
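One way to picture that minimum standard is to treat each of those questions as a required provision of the agreement rather than boilerplate. The sketch below (hypothetical field names, Python purely for illustration, not a legal template) models the checklist and flags the questions a bare confidentiality clause leaves unanswered.

```python
from dataclasses import dataclass

# Illustrative only: each privacy question becomes an explicit, inspectable
# provision rather than an unstated assumption behind a generic
# confidentiality clause. Field names are hypothetical.

@dataclass
class DataSharingProvisions:
    data_collected: list[str]         # what information is collected
    purpose: str                      # why it is collected
    authorized_users: list[str]       # who may access and use it
    redisclosure_terms: list[str]     # whether and when it may be re-disclosed
    retention_months: int             # how long it will be retained
    disposal_method: str              # how it will be disposed of
    subject_review_rights: str        # how subjects can review and correct records
    custodian_obligations: list[str]  # liabilities accruing to the data custodian

    def unanswered(self) -> list[str]:
        """Return the provisions this contract leaves empty."""
        return [name for name, value in vars(self).items() if not value]

# A contract that promises only "confidentiality" answers almost none of them.
vague_contract = DataSharingProvisions(
    data_collected=["enrollment records"],
    purpose="",
    authorized_users=[],
    redisclosure_terms=[],
    retention_months=0,
    disposal_method="",
    subject_review_rights="",
    custodian_obligations=[],
)
print(vague_contract.unanswered())
```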

Using FERPA compliance as a substitute for real privacy is sloppy practice for various reasons, not least of all because Big Data enthusiasts, in their zeal to access records, convinced the Obama administration to significantly rewrite and weaken FERPA in 2011. Beyond what privacy advocates (correctly) view as the wholesale evisceration of FERPA, amorphous references to a legal baseline are a poor replacement for articulating privacy ‘dos and don’ts’ in a clearheaded and intentional manner. Proper privacy contract provisions should track the Fair Information Practice Principles, such as those the Department of Homeland Security has articulated in connection with its own mission, and conform to that framework as appropriate. As medical researchers have learned and demonstrated through their development of Institutional Review Boards, sound privacy practices — far from being an obstacle to legitimate uses of data — promote public confidence in, and greater systemic integrity of, data-driven research and quality improvement.


Readers Comments

Quincy Adams 2014/01/08 at 8:10 am

Fascinating discussion on privacy rights and information security. We, in higher education, have never been down this path before, and I believe it will take some years before institutions have the knowledge and sense of responsibility to build privacy protections into their contracts with third parties. Reviewing what the medical field has done is a good place to start.

Lauri Nieminen 2014/01/08 at 12:16 pm

Universities today blindly enter into contracts with third parties because they want to quickly build the capacity to perform data mining. There’s no understanding at the top of the long-term consequences of allowing private, for-profit companies access to institutions’ valuable demographic data. To a certain extent, I think the pressure to develop the ‘best’ pedagogical models or outcomes has caused some institutions to be wilfully blind to this shortcoming. I see no viable solutions, unless the federal government, accreditors or students/families start demanding privacy protection as an accountability measure.

M William 2014/01/08 at 2:45 pm

I agree with the commenter above. The problem is that students haven’t demanded accountability from their institutions regarding their personal data. Contrary to Nassirian’s suggestion, I don’t believe students consider privacy protection as a factor in their trust of their institution. In general, when the public thinks of privacy, banking and medical records come to mind, but not student or other personal information. This is why students so readily share details about their lives on social media sites like Facebook. Until we start demanding real privacy measures and guarantees, institutions will continue feeding us platitudes.
