Trends and Futures

Day 3: Key Issues in the Use of Learning Analytics: Ethics, Privacy and Engagement

Written by Dr. Patrick Tran, UNSW Canberra

Along with the promise and potential embodied in LA come mounting challenges, including ethical issues and pitfalls related to data interpretation. We will explore these issues and their potential solutions in this post.

Ethical Issues

Educational institutions have become more concerned about the privacy and security of their students as a result of the mass collection and centralization of student data and recent data breach incidents (including at ANU). The ethical and privacy issues in LA can be summarized into 7 categories: (1) Privacy; (2) Informed consent, transparency and de-identification of data; (3) Location and interpretation of data; (4) Management, classification, and storage of data; (5) Data ownership; (6) Possibility of error; and (7) Role of knowing and the obligation to act (Steiner et al., 2016).

Several frameworks have been proposed to address the ethical concerns of LA systematically, including the “Code of Practice for Learning Analytics” (Sclater, 2016) and the “OECD Privacy Framework” developed by the Organisation for Economic Co-operation and Development (OECD, 2019).

The privacy taxonomy introduced by Solove (2006) provides a comprehensive overview of potential harms related to wrongful treatment of personal information during the data analytics process.

Infographic: an overview of the privacy taxonomy, showing how surveillance and processing of users’ data can cause harms such as distortion, exposure, and breach of confidentiality.
Privacy Taxonomy (Solove, 2006)

From this framework, we can infer the impacts LA may have on learners if it is not used properly. For example, assessment analytics helps learners track their own progress but may also cause them performance-related stress. As reported by Mayer-Schönberger and Cukier (2014), misuse of past performance data could have grave consequences, e.g. dismissing students’ ability to change and forcing them out of higher education. Similarly, constant fear of surveillance can adversely impact learners’ psychological well-being. Furthermore, collection and correlation of learners’ data at an unprecedented scale, regardless of need, makes learners vulnerable to identity theft and other privacy breaches. These are real risks, and it would be negligent to disregard them.

To address these privacy concerns, many universities have implemented strict privacy policies, data management procedures and human research ethics processes. These measures attempt to define ownership and stewardship of LA data, the duty of care in data management, and the ethical requirements for the use, analysis and reporting of LA results. The processes and techniques used in LA applications must be transparent, to the highest possible degree, to everyone involved (Beattie et al., 2014). This includes what data is being collected, how it is being aggregated, what benefits and risks it may have, and who it is being shared with.
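To make one of these duty-of-care measures concrete, the sketch below shows how student identifiers might be pseudonymized before learning data is analyzed or shared. This is a minimal illustration, not an institutional standard: the column names (`student_id`, `name`, `email`), the salted SHA-256 scheme and the CSV layout are all assumptions made for the example.

```python
import csv
import hashlib

# Assumed secret salt, stored separately from the data (e.g. in a key vault)
# so that pseudonyms cannot be reversed by re-hashing known student IDs.
SALT = "replace-with-an-institutionally-managed-secret"

def pseudonymize(student_id: str) -> str:
    """Return a stable, non-reversible pseudonym for a student ID."""
    return hashlib.sha256((SALT + student_id).encode("utf-8")).hexdigest()[:16]

def deidentify(in_path: str, out_path: str) -> None:
    """Copy an activity log, replacing the assumed 'student_id' column with a
    pseudonym and dropping direct identifiers such as 'name' and 'email'."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        fields = [f for f in reader.fieldnames if f not in ("name", "email")]
        writer = csv.DictWriter(dst, fieldnames=fields)
        writer.writeheader()
        for row in reader:
            row["student_id"] = pseudonymize(row["student_id"])
            writer.writerow({f: row[f] for f in fields})
```

Pseudonymization of this kind supports analysis while reducing re-identification risk, but it is not anonymization: combining the pseudonymized log with other datasets can still single individuals out, which is why the governance measures above remain necessary.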

Data Interpretation and Intervention


Reaching valid conclusions from the analyzed data is a critical step in the LA process, as it directly shapes instructional decision making. Overgeneralization of learning data may be caused by a number of factors, including biased data or analytic techniques. This can disadvantage certain learners by, for example, imposing coarse indicators of academic success on the entire learner population, undermining individuality instead of jointly defining with individual students what success really means for them (Dishon, 2017).

“Pedagogical lurking” is reported as a widespread engagement phenomenon in online courses, with some perceiving it as problematic and others as a step towards more active participation. Learners who do not participate in an online learning environment in a visibly active way are referred to as “lurkers”, so it is important to be clear about what “active participation” means. We often measure participation and engagement through visible indicators such as the number of messages learners post in an online discussion forum, their word counts and their word choices. However, as many researchers have pointed out (Honeychurch et al., 2017; Dennen, 2008), these indicators may not show the entire picture of learner engagement. Some learners may be cognitively engaged in active information processing and motivated to learn without showing any visible signs. As a result, we should include other avenues, such as self-report questionnaires and in-class observations, in our attempts to measure learners’ engagement. For a further discussion of this, check out our previous course on Student Engagement.
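To make this limitation concrete, the short sketch below computes the usual visible indicators (post counts and word counts) from a generic forum export. The CSV layout and the `learner_id` and `message` column names are assumptions for illustration, not a specific LMS export format; the point is that a learner who reads every thread but never posts simply does not appear in these counts.

```python
import csv
from collections import defaultdict

def visible_engagement(forum_csv: str) -> dict:
    """Tally posts and total words per learner from an assumed export
    with 'learner_id' and 'message' columns."""
    stats = defaultdict(lambda: {"posts": 0, "words": 0})
    with open(forum_csv, newline="") as f:
        for row in csv.DictReader(f):
            s = stats[row["learner_id"]]
            s["posts"] += 1
            s["words"] += len(row["message"].split())
    # Lurkers never appear in the export, so they are invisible here --
    # an absence the analyst must not mistake for disengagement.
    return dict(stats)
```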

In a well-designed LA application, data and findings should be summarized in a dependable and accurate manner, interpreted not only through qualitative and quantitative lenses but also with consideration of the relevant factors and stakeholders. Beyond describing what is obvious in graphs and tables or counting occurrences, analysts should make sure appropriate measures are used and compare the observed findings with other cases. In many cases, education domain knowledge such as learning theories and course-specific context can guide this interpretation process. Finally, conclusions should be drawn to answer the research questions, if any. Where appropriate, these conclusions should be synthesized and generalized so that the discovered knowledge is actionable and can be applied in new or similar contexts.
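One simple way to go beyond raw counts and compare findings with other cases is to express an observed value relative to comparable offerings, for example as a z-score. The sketch below is a minimal illustration with hypothetical numbers; choosing a sensible measure and comparison group still requires the kind of domain knowledge described above.

```python
from statistics import mean, stdev

def zscore(observed: float, comparison: list[float]) -> float:
    """How far the observed value sits from the mean of comparable cases,
    measured in standard deviations of those cases."""
    return (observed - mean(comparison)) / stdev(comparison)

# Hypothetical figures: mean forum posts per student per week in previous
# offerings of the same course, versus the current offering.
previous_offerings = [4.1, 3.8, 4.5, 4.0]
this_offering = 2.2
print(zscore(this_offering, previous_offerings))  # strongly negative: well below past offerings
```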

Finally, changes and remedial actions resulting from LA findings must be designed and planned on strong pedagogical grounds and in careful consultation with both educators and learners. If a change is institutionalized without a proper learning context, a teacher could feel pressured to make arbitrary pedagogical decisions, such as forcing students to post in online forums even though discussion had already taken place in the classroom.

Conclusion

LA opens doors to great opportunities for educators and learners to gain insight into how learning takes place. Analytic outcomes and recommendations derived from learning data promise an enhanced learning experience, but not without precautions against misuse and misinterpretation of the data. Some people naively believe that “some analytics is better than nothing”, but this is not always true: LA can be worse than nothing if it systematically ignores important indicators beyond the available data and leads to harmful changes to the learning environment. Because there are so many ways in which learning data can be misinterpreted, the line between the promise of “personalization” and the danger of “discrimination by design” is fine and blurred.

Similarly, the obsession with more data and more analysis poses mounting challenges in protecting that data and safeguarding its use. Finally, the need to include an ethical dimension in applications of LA has become urgent and important, given the significant concerns about data privacy and security.


Activities

Choose any or all of the following questions and post your responses in the discussion forum:

  • Will you use LA to support your teaching, and how? Assume that you have access to the tools and support you need.
  • In your opinion, what are the ingredients of a successful LA application?
  • How would you measure students’ engagement in a blended course that combines face-to-face instruction with some online learning activities? Give an example of a conclusion that could be drawn from your measures.

We would love your feedback!

Our team is currently running an evaluation of the Coffee Courses, and we would really appreciate it if you could take our survey and share your experiences.

References:

S. Beattie, C. Woodley, and K. Souter, “Creepy Analytics and Learner Data Rights,” in Proceedings of the Australasian Society for Computers in Learning in Tertiary Education Conference (ascilite 2014), Dunedin, New Zealand, 23–26 November 2014. https://research.moodle.org/84/

V. P. Dennen, “Pedagogical lurking: Student engagement in non-posting discussion behavior,” Computers in Human Behavior, vol. 24, no. 4, pp. 1624-1633, 2008. https://www.sciencedirect.com/science/article/pii/S074756320700115X

G. Dishon, “New data, old tensions: Big data, personalized learning, and the challenges of progressive education,” Theory and Research in Education, vol. 15, no. 3, pp. 272–289, 2017. https://journals.sagepub.com/doi/full/10.1177/1477878517735233

S. Honeychurch, A. Bozkurt, L. Singh, and A. Koutropoulos, “Learners on the Periphery: Lurkers as Invisible Learners,” European Journal of Open Distance and E-Learning, vol. 20, no. 1, pp. 192-212, 2017. https://www.eurodl.org/?p=current&sp=full&article=752

OECD, “The OECD Privacy Framework,” 2019. http://oecd.org/sti/ieconomy/oecd_privacy_framework.pdf

V. Mayer-Schönberger and K. Cukier, Learning with Big Data: The Future of Education. Houghton Mifflin Harcourt, 2014.

N. Sclater, “Developing a Code of Practice for Learning Analytics,” Journal of Learning Analytics, vol. 3, no. 1, pp. 16-42, 2016. https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4512

D. J. Solove, “A taxonomy of privacy,” University of Pennsylvania Law Review, vol. 154, pp. 477, 2006. https://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2074&context=faculty_publications

C. M. Steiner, M. D. Kickmeier-Rust, and D. Albert, “LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox,” Journal of Learning Analytics, vol. 3, no. 1, pp. 66–90, 2016. https://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4588

7 thoughts on “Day 3: Key Issues in the Use of Learning Analytics: Ethics, Privacy and Engagement”

  1. Hi all, I recently attended the Digital Pedagogy Lab in the USA and heard a wonderful keynote from Ruha Benjamin which looked at how algorithms and big data are perceived to be neutral but in fact carry all the biases and subjectivities of humanity – as humans code and create them! She explores the often racist and sexist undertones of algorithms and how they are impacting decision making for people’s futures. It was really great and I encourage anyone interested to give her keynote a listen – it’s on YouTube here: https://www.youtube.com/watch?v=wJPhN4mucCQ

  2. Fascinating and highly topical Coffee Course, and today’s post and first comment in particular (for me). Katie’s comment makes me wonder where empathy sits in the design of the algorithms upon which LA are built. Is empathy considered (at all) in these algorithms? How and by who – and who is excluded in the “application” of empathy in algorithm design? I don’t know if/how these questions have been considered, but they do strike me as important – eg would algorithms built on empathy mitigate against embedded biases? What would it even mean to “build an algorithm on empathy”? I wonder what others think?

    1. Joseph, one way to have empathy, and reduce bias, in the LA design, I suggest would be to have designers with a background similar to the students. As an example, if your students are first-in-family to university, from a low socioeconomic rural background, and you were a rich city kid with parents with PhDs, then it is going to be difficult to understand the student experience. I signed up as an international student, in part, to understand what that was like.

    2. Hi Joseph and Katie,

      I’m not sure it is possible to get neutral data. Even building in empathy is problematic, because it would still be based on particular biases. Who gets to decide what is appropriate empathy, which situations require empathy, etc? I know my pedagogy leans towards the higher end of empathy and gives students the benefit of the doubt. Colleagues have called me naive and a bleeding heart. Students have commented that they wish more academics took such a holistic approach to their progress. No one is wrong (except the students who thought they could take advantage of my empathy – they learnt about their own naivety the hard way).

      So, how can LA be applied successfully? I believe by *not* relying on it. LA is nice to have. But, given how much information it does not convey, and the resulting challenges of interpretation, it should be treated as but one source of information, no more or less important than the myriad other sources of educational data. Moving forward, I plan to use LA to triangulate and confirm other information.

  3. I am cautious about using LA to support my teaching, or even claiming to be. ANU Computer Science has its own bespoke system called “FAIS” which calculates statistics for each course, and makes comparisons across courses. This is used live during the end of semester examiners meeting (run a bit like a Dutch auction). If I claimed to be doing LA in this highly analytical environment I would be laughed out of the room. http://www.tomw.net.au/blog/2009/11/examiners-meetings-for-e-learning.html

    If I was going to do LA, I guess I would use whatever the students use in the ANU Master of Applied Data Analytics (“R” or similar).

    The ingredients of a successful LA application, I suggest, are to address the stakeholders key concerns: administrators want to run the largest number of students through with the least staff cost, instructors want to not have students complaining about assessment, and students want to pass.

    Today I am finalizing the marks for a blended module run this semester with 245 students. Student engagement is not assessed or evaluated. Before considering doing that, I would need to ask why, and who it was for. There is a risk that this might creep into the student assessment, with students being punished for not being seen to participate. Also, I would need to do such measurement in a way that was compatible with non-blended courses.

    The simplest way to measure student engagement would be with the existing assessment: students are required to participate in online forums, and that could be a measure of engagement. However, more posting is not necessarily better. Some students get the idea they will get a higher mark for lots of long posts, and I have to tell them quality is better than quantity.

    The Moodle logs could be used to measure engagement. However, I design courses so students can download all the materials at the beginning of semester and study off-line (if they are located remotely). So students might only access the system a couple of times a week to check in and post, when they can access a satellite link.

  4. I’m open to the idea that LA could be useful and supportive if applied in ethical and equal ways (though I agree with others above that these would be difficult [maybe impossible?] to define and ensure). However, I was particularly struck by one line in this post: “misuse of past performance data could have grave consequences, e.g. dismissing students’ ability to change and forcing them out of higher education.” This, I think, is the issue at the heart of my hesitation to rely on LA or to draw any sweeping conclusions from it.
