This toolkit was drafted by a Value in Academic Libraries subcommittee on Learning Analytics and Privacy. It is meant to assist academic librarians as they consider responsibly engaging with campus learning analytics at their respective institutions. It is intended to serve as an overview of, and bridge to, the considerable research recently undertaken on this rapidly developing subject.
This toolkit relies heavily on the comprehensive work of the 2018 LIILA Report.[i] It is hoped that this toolkit can continue to grow along with the subject matter.
[i] Oakleaf, M. "Library Integration in Institutional Learning Analytics" (LIILA, Nov. 15, 2018), available online at: https://library.educause.edu/~/media/files/library/2018/11/liila.pdf
Two commonly cited definitions of learning analytics are as follows:
the “collection and analysis of usage data associated with student learning…to observe and understand learning behaviors in order to enable appropriate interventions,”[i] and
“the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs.”[ii]
Within higher education, the term learning analytics refers to the use of institution-level systems that 1) collect individual-level student learning data, 2) centralize it in a warehouse or "record store," and 3) serve as a unified source for research seeking to understand and support student learning and success. Learning analytics is also used as an umbrella term that includes centralized or decentralized learning record stores (LRS), Integrated Planning and Advising for Student Success (iPASS) systems, early alert systems, and engagement tracking systems. Currently, most learning analytics efforts are descriptive in nature, though many aspire to become predictive in order to anticipate and correct systemic and structural obstacles to student learning and success.[iii]
Among librarians, learning analytics is sometimes used synonymously with information literacy "assessment" generally or, somewhat more specifically, with studies that attempt to find correlations between library interactions and student outcomes, often as "one-off" or episodic studies. This is a unique use of the term; others in the academy do not use "learning analytics" in this way. Such studies typically do not connect individual library data with institutional data--that is, longitudinally, using shared systems or regular contribution of library data to institutional systems. Applying the term "learning analytics" to them can therefore cause unnecessary confusion and undermine shared understanding, both among librarians using different definitions and with educational partners from elsewhere in academia.[iv]
Learning analytics is also not a panacea that can or should replace other forms of library assessment. Rather, it is only one tool in a broad library assessment toolbox that includes both quantitative and qualitative approaches. Because learning analytics takes a different approach from previously applied library assessment methods--leveraging large-scale data to understand and improve learning and success outcomes for students--it may offer new insights about prior assessment findings, most likely in the form of learning more about "what" is happening with regard to student-library interactions. It is not anticipated, however, to supplant assessment approaches that delve more deeply into "why" something is happening. Therefore, learning analytics will not overturn the best practice of using multiple methods to investigate educational research questions.[v]
Learning analytics helps educators discover, diagnose, and predict challenges to learning and learner success, and points the way to designing interventions that benefit all students, especially those less familiar with the unwritten and often opaque rules for success in higher education, including first-generation students, community college students, students of diverse backgrounds, students with disabilities, and veterans. In this way, learning analytics provides a tool to support the success of a wide range of students.[vi] Interventions typically include setting or refining policies, improving processes, making referrals, sending notifications, prompting student meetings in near real-time, and providing learners with insight into their own learning habits.[vii]
[i] EDUCAUSE Learning Initiative. (2011, April). Learning analytics: The coming third wave (brief). Louisville, CO: EDUCAUSE. Retrieved from https://library.educause.edu/~/media/files/library/2011/4/elib1101-pdf.pdf
[ii] Conole, G., Gasevic, D., Long, P., & Siemens, G. (2011). Message from the LAK 2011 general & program chairs. Proceedings of the 1st International Conference on Learning Analytics and Knowledge, LAK 2011. Banff, AB, Canada.
[iii] Oakleaf, M. (2019, October). Libraries and learning analytics: Anticipating change in student success initiatives. North Carolina State University Libraries, Raleigh, NC.; Oakleaf, M. (2019, September). Library integration in institutional learning analytics. Florida State University Libraries, Tallahassee, FL.
[iv] Oakleaf, M. (2019, October). Libraries and learning analytics: Anticipating change in student success initiatives. North Carolina State University Libraries, Raleigh, NC.; Oakleaf, M. (2019, September). Library integration in institutional learning analytics. Florida State University Libraries, Tallahassee, FL.
[v] Oakleaf, M. “Library Integration in Institutional Learning Analytics” (LIILA, Nov. 15, 2018), available online at https://library.educause.edu/~/media/files/library/2018/11/liila.pdf; Oakleaf, M. (2019, October). Libraries and learning analytics: Anticipating change in student success initiatives. North Carolina State University Libraries, Raleigh, NC.; Oakleaf, M. (2019, September). Library integration in institutional learning analytics. Florida State University Libraries, Tallahassee, FL.
[vi] Oakleaf, M. “Library Integration in Institutional Learning Analytics” (LIILA, Nov. 15, 2018), available online at https://library.educause.edu/~/media/files/library/2018/11/liila.pdf
[vii] ECAR-ANALYTICS Working Group. The Predictive Learning Analytics Revolution: Leveraging Learning Data for Student Success. ECAR working group paper. Louisville, CO: ECAR, October 7, 2015; 7 things you should know about analytics. (2011, April). EDUCAUSE Learning Initiative. Retrieved from http://er.educause.edu/articles/2011/9/penetrating-the-fog-analytics-in-learning-and-education.
As discussed in the ECAR working group paper (2015) cited below, there are two general types of predictive learning analytics: embedded tools and platform tools. The latter are more comprehensive, with the ability to pull in multiple streams of student information.
How institutions choose to implement learning analytics varies widely, depending on factors such as size, resources, and consortial relationships. Some institutions have chosen to build their own systems, some have purchased a product from a vendor such as EAB's Navigate, and some, working in collaborative groups, have built and developed open source learning analytics tools together.
Source: ECAR-ANALYTICS Working Group. The Predictive Learning Analytics Revolution: Leveraging Learning Data for Student Success. ECAR working group paper. Louisville, CO: ECAR, October 7, 2015.
Please share your suggestions and feedback about this toolkit via this Feedback Form.