A Summary of the Project

We are conducting a three-year research project investigating faculty perspectives on student privacy and their practices in relation to emerging learning analytics tools and initiatives.

Learning Analytics, Defined

Learning analytics (LA) is defined as the “measurement, collection, analysis, and reporting of [student and other data] for the purposes of understanding and optimizing learning and the environments in which it occurs.”1 Advocates of data-driven education argue that “[e]very click, every Tweet or Facebook status update, every social interaction, and every page read online” leaves a digital trail of student behaviors available for aggregation and analysis.2 Analyzing these data may help to personalize educational experiences and resources, as well as potentially surface cost-saving processes and improve an institution’s fiscal administration.3

The Motivation for the Project

We recently conducted a study of over 8,000 LIS syllabi published since 2010 for student privacy language; only 2% of the syllabi included some form of student privacy language discussing policies, rights, and instructions for protecting one’s privacy.4 The lack of literature and the findings in our study signal that faculty may neither be aware of the emerging student privacy problems, nor able to address them in their instruction. The literature on instructors and learning analytics generally focuses on use practices and feature preferences.5 One article suggests that instructors are uneasy about access to some LA data and visualizations, fearing that such access would bias their instruction.6

The student privacy issues that learning analytics create are significant, and faculty are arguably on the frontline of student privacy. Their tool choices, instructional designs, and course policies affect the degree to which students retain privacy. Consequently, we need more information about instructor perceptions of student privacy, experiences with learning analytics, and instructional choices to protect their students' privacy.

The Project’s Phases

Phase One: Survey

The research team will conduct a survey investigating how faculty incorporate student privacy resources into their courses and address student privacy in their instructional designs and adoption of educational technologies. Furthermore, the survey will explore the student privacy issues faculty are aware of as they relate to learning analytics. The team will work with its statistical and survey consultant to develop, pilot, and validate the survey.

Phase Two: Interviews

Building on the results from the Phase One survey, the research team will aim to conduct 30 semi-structured interviews with faculty, librarians, and instructional technologists. Interviews with faculty will enable deeper inquiry into some of the findings from Phase One. Additionally, interviews will provide the conditions necessary to discuss values and ethics related to student privacy in ways surveys cannot. Interviews with librarians and instructional technologists will investigate how these professionals perceive their role in addressing student privacy and supporting faculty in their instructional efforts with respect to privacy.

Phase Three: Facilitated Discussions

The final phase of the project uses the empirical findings from Phases One and Two to facilitate student privacy discussions among faculty, librarians, and instructional technologists. The discussions will be led by the research team and supported by a privacy facilitation consultant. Working with our privacy facilitation consultant, we will develop a Facilitated Discussions Toolkit, parts of which will be used during the discussions. The toolkit will include summaries of the research, curated literature, presentation slides, marketing materials, and discussion guides.

Dissemination of Project Findings

Phases One and Two lead directly to peer-reviewed publications and presentations. We will target high-ranking, open-access journals in order to maximize the reach of disseminated findings. All pre-prints of journal articles will be accessible on this website. Additionally, the team will present emerging findings and final analyses of data at scholarly conferences; related artifacts (e.g., posters, slides) will be accessible on this website.

  1. Siemens, 2012.
  2. Shum & Ferguson, 2012; Siemens & Long, 2011.
  3. Goldstein & Katz, 2005.
  4. Jones & VanScoy, 2019.
  5. Bentham, 2017; Knight, Brozina & Novoselich, 2016.
  6. Knight, Brozina & Novoselich, 2016.