Learning Analytics Policy: Guidance for staff
This document addresses questions raised during the consultation on the University of Glasgow’s Learning Analytics policy.
What is learning analytics?
Learning analytics is an umbrella term, often seen as encompassing closely related concepts such as learner analytics, academic analytics, student analytics and assessment analytics.
The University of Glasgow’s Learning Analytics Policy draws on the Jisc Code of Practice for Learning Analytics, which describes learning analytics as follows: “Learning analytics uses data about students and their activities to help institutions understand and improve educational processes, and provide better support to learners.” (Jisc Code of Practice for Learning Analytics, updated August 2018).
The Society for Learning Analytics Research (SoLAR), similarly, defines learning analytics as follows: “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Society for Learning Analytics Research, What is Learning Analytics?).
Approval
Do I need ethical approval to carry out a learning analytics project?
Ethical approval to undertake a learning analytics project is not necessarily required. The decision whether or not to seek ethical approval for a learning analytics project depends on the purpose of the data gathering and analysis. Please see Section 7 of the Learning Analytics Policy, which sets out when approval should be obtained.
Projects which use University of Glasgow data, are not intended to have externally-facing outputs (e.g. conference presentations or publications) and are carried out within the parameters of the University Learning Analytics Policy do not require approval. The following are some examples of learning analytics activities which do not require approval:
- Monitoring engagement with VLE activities to check whether students have participated – for example, a course convener may be trying to establish whether any students need a reminder to complete a mandatory task, or to find out whether students are having problems accessing resources;
- A project that uses University of Glasgow data for a routine purpose such as to enhance the quality of learning and teaching provision or to inform curriculum redesign in a School;
- An internal investigation that seeks to correlate the extent of VLE engagement with assessment outcomes, the findings of which are used internally to verify the usefulness of resources in helping students demonstrate attainment of intended learning outcomes as a quality assurance indicator.
Projects which will have externally-facing outputs require ethical approval, in the usual way for research projects. If the data is being collected and analysed for a research project, including a Scholarship of Teaching and Learning (SoTL) inquiry, then there is an assumption that the outcomes of the analyses will be shared publicly outside the institution, for instance as a blog post, conference presentation, journal article or book. In these cases, ethical approval should be sought. Examples include:
- A study that seeks to correlate the extent of engagement with online study resources with assessment outcomes, the outcomes to be externally disseminated as a journal article;
- A SoTL project that uses routinely collected VLE data to seek to correlate the extent of engagement with formative quizzes with assessment outcomes, which is intended for external publication on a blog.
Staff should apply to the appropriate College Ethics Committee before commencement of the project. Advice on ethics should also be sought from the appropriate College Ethics Committee where an academic or the institution has questions about using learning analytics data to examine the performance of students based on protected characteristics.
Can I get ethical approval to use data that has already been gathered for a different purpose?
If the person carrying out a learning analytics project originally intended to use the findings only internally, to improve learning and teaching practice or inform curriculum design, but subsequently wished to disseminate the findings externally, then ethical approval should be sought.
However, if the previously-gathered data involved human participant data which was not anonymised, it would not be acceptable to use it without permission from the participants. In such circumstances, the University Ethics Committee advises against seeking retrospective permission or ethical approval. If needed, advice should be requested from the relevant School/College/Institute Research Ethics Convenor.
Ethical approval is not required for the use of secondary data, that is, existing data collected by others, provided the data is in the public domain and was anonymised before it became secondary data.
Data
What data can I use?
The University of Glasgow Learning Analytics policy includes a list of data types that are commonly used in learning analytics projects. This list is not comprehensive, and new sources of data will undoubtedly appear in the coming years as we increase the number and range of digital tools we use. Broadly speaking, however, these can be seen to fall into two types:
Student engagement data
- General access data (e.g. Moodle logs)
- Enrolment data (e.g. MyCampus)
Student performance data
- Asynchronous discussion participation (e.g. in Moodle or Microsoft Teams)
- Synchronous online participation (e.g. in Zoom or Microsoft Teams)
- Course completion data (e.g. Moodle course completion option)
- Feedback to tutor from students (structured and/or unstructured)
- Assessment results/completion (e.g. from Moodle or MyCampus)
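To make the engagement data above more concrete, the short sketch below (in Python, assuming the pandas library is available) shows one way a course team might summarise a Moodle activity log export into per-student event counts. The column names, event names and values are invented and should be checked against your own export before adapting anything.

    # A minimal sketch: summarise a Moodle activity log into per-student
    # engagement counts. Column names, event names and values are invented;
    # check the headings in your own log export before adapting this.
    import pandas as pd

    # In practice this would come from a CSV export of the course log, e.g.:
    # log = pd.read_csv("course_log.csv")
    log = pd.DataFrame({
        "user_id": ["s1", "s1", "s2", "s3", "s3", "s3"],
        "event": ["Course viewed", "Quiz attempt started", "Course viewed",
                  "Course viewed", "Discussion viewed", "Quiz attempt started"],
        "time": pd.to_datetime([
            "2024-01-10 09:00", "2024-01-10 09:05", "2024-01-11 14:30",
            "2024-01-12 10:00", "2024-01-12 10:10", "2024-01-13 16:45",
        ]),
    })

    # Total events per student, and weekly counts, give a simple engagement
    # profile that can be checked against participation expectations.
    per_student = log.groupby("user_id")["event"].count().rename("total_events")
    weekly = (log.set_index("time")
                 .groupby("user_id")
                 .resample("W")["event"]
                 .count())
    print(per_student)
    print(weekly)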
How can I be sure that my learning analytics project is compliant with GDPR, and how do I identify the legal basis for it?
Individuals must satisfy themselves of the legal basis for processing personal data at the outset of the work. Please see the Data Protection & Freedom of Information Office information on GDPR and the Lawful Basis Interactive Guidance Tool by the Information Commissioner’s Office.
If you intend to undertake a learning analytics project involving personal data, a DPIA (Data Protection Impact Assessment) must be completed. Undertaking a DPIA provides an opportunity to set out the intended processing activities and purposes, which in turn helps to identify the applicable legal basis and any potential security or other risks, and ensures that the processing is proportionate to the project’s aims and begins with appropriate safeguards in place.
How long do I need to keep data that I’ve collected for a learning analytics project?
This depends on whether or not the data is anonymous and, for non-anonymous data, on the legal basis for processing it. Please see the Data Protection & Freedom of Information Office guidance on data retention for research data and record retention schedules.
What is the difference between anonymous and pseudonymous data?
Anonymous data has had identifying information removed in such a way that individuals can no longer be identified, directly or indirectly; truly anonymous data is no longer personal data and falls outside the scope of data protection legislation. Pseudonymous data has had direct identifiers (such as names or student numbers) replaced with a code or key, but individuals can still be re-identified using additional information held separately (for example, the key linking codes back to student records). Pseudonymised data therefore remains personal data and must be handled accordingly. If in doubt, contact the Data Protection & Freedom of Information Office.
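For illustration only, the sketch below (Python, standard library only) contrasts the two in practice: pseudonymisation replaces student identifiers with a keyed code that can be reversed by whoever holds the key, whereas discarding identifiers and reporting only aggregates moves the data towards anonymity. All identifiers and values are invented.

    # A minimal sketch of the distinction in practice (Python standard library
    # only). All identifiers and values are invented.
    import hashlib
    import hmac
    import secrets

    records = [
        {"student_id": "2412345a", "quiz_attempts": 3},
        {"student_id": "2467890b", "quiz_attempts": 0},
    ]

    # Pseudonymisation: replace the student ID with a keyed hash. Whoever holds
    # the key (which must be stored separately and securely) can re-identify
    # individuals, so the result is still personal data.
    key = secrets.token_bytes(32)

    def pseudonymise(student_id: str) -> str:
        return hmac.new(key, student_id.encode(), hashlib.sha256).hexdigest()[:12]

    pseudonymised = [
        {"pseudonym": pseudonymise(r["student_id"]),
         "quiz_attempts": r["quiz_attempts"]}
        for r in records
    ]

    # Anonymisation (in this simple case): discard identifiers entirely and keep
    # only an aggregate that cannot be traced back to an individual.
    aggregate = {
        "students": len(records),
        "mean_quiz_attempts": sum(r["quiz_attempts"] for r in records) / len(records),
    }

    print(pseudonymised)
    print(aggregate)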
Interpreting Data
How should I interpret the data I’m using?
Principle 5 in the policy states that: “Data should be interpreted in as full a context as possible, with recognition that learning analytics data does not give a complete picture of a student’s learning. We also need to remember that the information generated is only as good as the input.”
It is important to remember that learning analytics can only capture a snapshot of students’ activity at any one point in time (a ‘transient construct’, Slade and Prinsloo, 2013) and that, depending on the aims of the learning analytics activity, this evidence may need to be triangulated against other sources (including and beyond the VLE) collected over a period of time. The more sources of information available, the greater the likelihood that any predictions about student performance will be accurate.
Nevertheless, all data needs to be interpreted carefully. There is a tendency to equate engagement with connected learning, in terms of interactions between students, with ‘active’ collaborative learning seen as superior to more ‘passive’ (autonomous) forms of learning that do not generate the same interaction data (e.g. discussion forum contributions). Such judgements about the nature of learning influence the way that learning analytics data is captured and interpreted (Gourlay, 2017).
In a similar vein, evidence that students clicked links to access resources does not necessarily mean that they engaged with them meaningfully; conversely, an absence of clicks does not mean that meaningful learning did not take place offline. Wilson et al. (2017) refer to these ‘activity analytics’ as ‘questionable proxies for learning’, and outline a range of other limitations, including conflicting outcomes from empirical studies of the predictive nature of learning analytics, potentially biased algorithms, the ethics of personalising recommendations to learners, and disciplinary differences.
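For illustration, the sketch below (Python, assuming pandas is available) shows the kind of internal correlation check described earlier, relating per-student VLE activity counts to assessment marks using invented numbers. The caveats above apply: such a statistic is easily confounded and is a weak proxy for learning, not evidence on its own.

    # A minimal sketch of the kind of internal check described above: relating
    # per-student VLE activity counts to assessment marks. The numbers are
    # invented and the result is a single, easily confounded statistic.
    import pandas as pd

    df = pd.DataFrame({
        "vle_events": [12, 45, 3, 60, 22, 31, 9, 54],    # clicks/views per student
        "final_mark": [55, 72, 48, 68, 61, 70, 40, 66],  # assessment outcome (%)
    })

    # Spearman rank correlation is less sensitive to outliers than Pearson, but
    # it still says nothing about why engagement and marks move together.
    rho = df["vle_events"].corr(df["final_mark"], method="spearman")
    print(f"Spearman correlation between VLE activity and marks: {rho:.2f}")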
The bottom line is that learning analytics data is likely to be incomplete and should always be interpreted with caution, with students themselves made aware of the potential limitations of these investigations as well as their potential to enhance learning.
Sharing Findings
Should I share learning analytics outcomes with students?
As noted in the definitions above, the purpose of learning analytics should be to optimise and support student learning. The ultimate ethical principle is to ‘do no harm’ (Fritz and Whitmer, 2019), although the same authors also question the ethics of ‘doing nothing’ when data is available that could be harnessed to benefit student learning. Some institutions have invested in software that considers different measures of engagement and provides students with an indicator of their general progression, such as the Purdue Course Signals dashboard. However, the use of predictive analytics comes with caveats; the study by Wilson et al. (2017), for example, indicated that while student behaviours can be characterised as falling into certain ‘groups’ of study behaviour, there was no direct correlation between patterns of study deemed by staff to be successful and student success in assessments. It would therefore be unethical to encourage students to study in a specific way if there is no evidence that it would help them, or if doing so might compromise their success. Ellis (2013) raised similar concerns about the impact of demotivating students whose performance is predicted to be lower than average.
In their review of the ethical issues underpinning learning analytics, Slade and Prinsloo (2013) repeatedly emphasise transparency with students, to the extent of positioning students as change agents and co-owners of their data, as well as ensuring that students are informed about what data is being collected about them. It is therefore worth acknowledging students as stakeholders and valuing their input during discussions of what data should be collected and how it should be used.
How should I share learning analytics outcomes in the context of scholarship?
As noted above, a learning analytics project undertaken for scholarship purposes will require ethics approval, with the assumption that any outputs will be publicly disseminated (this includes the University of Glasgow’s annual Learning & Teaching conference, which is open to external delegates).
One potential grey area is the extent to which outcomes can be shared across the institution. While there is a clear remit for student data to feed into central administrative systems such as MyCampus, there are serious considerations to be made in terms of sharing the outcomes of learning analytics projects between other services, Schools and Colleges. Academics may wish to do this to share good practice across the institution before publicly sharing their findings, or because they have interesting findings to share internally but did not apply for ethics approval to share outcomes beyond the institution. Before doing so, consider the following guiding questions, which should shape how learning analytics outcomes are shared:
- Are the data being reported aggregated or individual student data?
- Are the data anonymised?
- Can the data be easily attributed to an individual student, even if de-identified, e.g. due to small cohort sizes? (see the sketch after this list)
- If you were this student, or the cohort represented by the data, how would you feel about it being shared beyond the course/programme of study?
- Do you have the students’ permission to share the outcomes of analyses that draws upon their personal data?
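As one illustration of how the small-cohort question above might be handled in practice, the hypothetical Python sketch below (assuming pandas is available) suppresses any group smaller than a minimum cell size before aggregated marks are shared. The threshold and field names are invented, and this is not a requirement of the policy.

    # A hypothetical sketch of small-cell suppression before sharing aggregated
    # outcomes. The threshold and field names are illustrative; this is not a
    # substitute for advice from the Data Protection & Freedom of Information
    # Office.
    import pandas as pd

    MIN_CELL_SIZE = 5  # groups smaller than this are withheld from the report

    results = pd.DataFrame({
        "programme": ["BSc A"] * 5 + ["MSc B"] * 2,
        "final_mark": [62, 58, 71, 66, 54, 78, 49],
    })

    summary = results.groupby("programme")["final_mark"].agg(n="count", mean_mark="mean")
    # Withhold the mean for any programme with too few students to report safely.
    summary.loc[summary["n"] < MIN_CELL_SIZE, "mean_mark"] = None
    print(summary)

Even with suppression applied, the other questions above (permissions, anonymisation and the purpose of sharing) still need to be answered before any outcomes leave the course or programme team.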
Another consideration is the privacy of student data. The age of mass surveillance, including the capture of learning analytics, has been likened to the panopticon, with its associated power differentials (Slade and Prinsloo, 2013; McMullan, 2015). While academics and professional services staff would never intentionally sell student data for profit, the use of commercial learning technologies such as Turnitin poses its own ethical issues, given that Turnitin is owned by Advance Publications, which also owns Condé Nast (Stommel, 2019). GDPR also raises questions about who has access to data stored in third-party platforms, even where there is a data sharing agreement in place. Any concerns about the storage of student data on third-party platforms should be directed to the Data Protection and Freedom of Information Office.
Further Support
What support is available at the University of Glasgow?
Depending on your query, the following may be of use:
- Data protection & FOI office
- University/College Ethics committees:
  - University Ethics committee
  - CoSS (including University Services)
  - Arts
  - MVLS
  - CoSE
Where can I find out more about learning analytics?
Jisc have assembled a range of helpful resources and guidance on learning analytics in UK higher education. Jisc also promote their own learning analytics system (Predictor), to which the University of Glasgow is not subscribed.
The SHEILA project has built a policy development framework for institutions looking to use digital data to enhance learning. The SHEILA webpages also include a list of institutional learning analytics policies.
The Society for Learning Analytics Research (SoLAR) publishes the online Journal of Learning Analytics.
References
Ellis, C., 2013. Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), pp.662-664.
Fritz, J. and Whitmer, J., 2019. Ethical Learning Analytics: “Do No Harm” Versus “Do Nothing”. New Directions for Institutional Research, 2019, pp.27-38. doi:10.1002/ir.20310
Gourlay, L., 2017. Student engagement, ‘learnification’ and the sociomaterial: Critical perspectives on Higher Education policy. Higher Education Policy, 30(1), pp.23-34.
McMullan, T., 2015. What does the panopticon mean in the age of digital surveillance? The Guardian, 23 July 2015. https://www.theguardian.com/technology/2015/jul/23/panopticon-digital-surveillance-jeremy-bentham
Slade, S. and Prinsloo, P., 2013. Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), pp.1510-1529.
Stommel, J., 2019. Critical pedagogy, civil disobedience and EdTech. Association for Learning Technology conference, 3-5 September 2019, University of Edinburgh. https://altc.alt.ac.uk/2019/sessions/altc-keynote-jesse-stommel/
Wilson, A., Watson, C., Thompson, T.L., Drew, V. and Doyle, S., 2017. Learning analytics: Challenges and limitations. Teaching in Higher Education, 22(8), pp.991-1007.
Not cited but potentially of interest:
Pardo, A. and Siemens, G., 2014. Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), pp.438-450.