Trends and Futures

Day 2: Using Learning Analytics to Support Learning and Teaching

Written by Dr. Patrick Tran, UNSW Canberra

There are many learning analytics (LA) applications reported in the literature. We will explore some simple LA tools available in Moodle (Wattle for ANU users) that should be accessible to teaching staff. Your own institution’s LMS will likely have similar tools available.

For more diverse LA examples, please check out the longer version of this post.

LA Example 1: Monitoring Students’ Online Behaviors With Course Reports

Because learning data is stored across various systems, retrieving and combining it in a meaningful way can be challenging. In Moodle, instructors can gain insights into student engagement with their courses and communicate with selected students. Most of the built-in reporting capabilities in Moodle are centered around the user access log, which records the user ID, course ID, action, event, target, timestamp and origin of each access entry. Based on this log data, a number of reports can be created to support different user views. Just go to the Settings / Reports menu to explore them. (For users at ANU, this is available under the Settings wheel -> More -> Reports.)

Logs

This data can be useful for monitoring students’ online behaviors, e.g. how often and when a course resource is viewed or modified. For example, you can check whether a particular student has accessed a particular item in your Moodle site.

Screen capture of Moodle Logs, showing a list of what students have accessed and when in this Moodle site.
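If you prefer to work outside Moodle, the Logs report can also be downloaded as a CSV and filtered with a few lines of code. Below is a minimal Python (pandas) sketch; the file name and column names such as “User full name” and “Event context” are assumptions based on a typical Moodle log export and may differ in your instance.

```python
import pandas as pd

# Load a log export downloaded from the Moodle Logs report.
# "logs.csv" and the column names below are assumptions; adjust
# them to match your own export.
logs = pd.read_csv("logs.csv", parse_dates=["Time"], dayfirst=True)

# Check whether a particular student accessed a particular item.
student = "Jane Citizen"       # hypothetical student name
item = "Assignment: Essay 1"   # hypothetical event context (course item)

hits = logs[(logs["User full name"] == student)
            & (logs["Event context"] == item)]

if hits.empty:
    print(f"No recorded access by {student} to {item}.")
else:
    print(f"{student} accessed {item} {len(hits)} time(s), "
          f"most recently at {hits['Time'].max()}.")
```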


Activity Report

This report displays the number of views for each activity (forum, page, assignment dropbox, etc.) and resource (file). It can give you a sense of how particular activities and resources are being used.

A screenshot of the activity report, which shows all the activities and resources in a Moodle site and how many times they have been accessed.
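The same kind of count can be reproduced from a downloaded log export. The sketch below reuses the hypothetical “logs.csv” file from above; the event name “Course module viewed” and the column names are assumptions to check against your own export.

```python
import pandas as pd

logs = pd.read_csv("logs.csv")

# Count views per activity/resource, mirroring the Activity report.
views = (logs[logs["Event name"] == "Course module viewed"]
         .groupby("Event context")
         .size()
         .sort_values(ascending=False))

print(views.head(10))  # the ten most-viewed activities and resources
```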


Activity Completion

If you enable the “Completion tracking” option under Course settings and configure “Activity completion” under an activity’s settings, you will find the Activity Completion report under the Reports menu.

Moodle Activity Completion, which lets you set how and what Moodle will track in its activity completion tracking.

This creates a customised report that you can use to track whether students have completed the course items according to the criteria you define.

Moodle Activity Completion tracking shows at a glance which students have completed the required items.
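The completion report itself can be downloaded in spreadsheet format and summarised outside Moodle. Here is a minimal sketch; the file name, the identifier columns and the convention that blank cells mean “not completed” are all assumptions about the export layout.

```python
import pandas as pd

# Hypothetical export of the Activity Completion report: one row per
# student, one column per tracked activity, blank = not completed.
report = pd.read_csv("activity_completion.csv")

id_cols = ["First name", "Surname", "Email address"]  # assumed columns
activity_cols = [c for c in report.columns if c not in id_cols]

# Treat any non-empty cell as "completed" and compute a percentage.
completed = report[activity_cols].notna()
report["completion_pct"] = completed.mean(axis=1) * 100

# Students with the lowest completion rates appear first.
print(report[["Email address", "completion_pct"]]
      .sort_values("completion_pct"))
```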

Statistics

This report displays access statistics for a course over a period of time and can be filtered by role and by action (views/posts). I use it to understand the patterns of user access to the course during the semester.

Moodle statistics graph demonstrates access to the site, and can be filtered by role.
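To reproduce this kind of access pattern from a log export, a simple weekly aggregation is enough. The sketch below reuses the hypothetical “logs.csv” export from earlier.

```python
import pandas as pd

logs = pd.read_csv("logs.csv", parse_dates=["Time"], dayfirst=True)

# Count all logged actions per week across the semester.
weekly = logs.set_index("Time").resample("W").size()
print(weekly)

# Optional quick plot of the pattern (requires matplotlib):
# weekly.plot(kind="bar", title="Course accesses per week")
```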

Course Participation

This report shows students’ participation in a particular activity, measured by the number of actions (views or posts) during a set time-frame. There is an option to send a message to selected students. I use this report all the time to identify and follow up with students who have not engaged well with the course.

Course participation report shows whether or not students have completed a task and allows you to quickly message them.
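A similar follow-up list can be built from a log export and a class roster. The sketch below is illustrative only: the roster file, column names, date window and engagement threshold are all assumptions to adapt to your own course.

```python
import pandas as pd

logs = pd.read_csv("logs.csv", parse_dates=["Time"], dayfirst=True)
roster = pd.read_csv("enrolled_students.csv")  # hypothetical roster file

# Count each student's actions within a chosen time-frame.
window = logs[logs["Time"] >= "2024-03-01"]    # assumed window start
actions = window.groupby("User full name").size()

# Flag students with few or no actions so you can follow up with them.
roster["actions"] = roster["Full name"].map(actions).fillna(0).astype(int)
low_engagement = roster[roster["actions"] < 3]  # arbitrary threshold
print(low_engagement[["Full name", "Email address", "actions"]])
```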

The data presented in the above reports is summarized in the following diagram:

Diagram of Moodle Built-in reports

For a detailed description of these reports, please refer to the Course Reports Moodle Documentation.

LA Example 2: Quiz Data

Scenario: Online quizzes are widely used in eLearning as both formative and summative assessment tools. Instructors can take advantage of the auto-grading, instant feedback and randomization features in quiz platforms such as Moodle Quiz. We can in fact collect much more data about the assessment process with these online quizzes than with paper-based tests. With some simple data processing and visualization steps, educators can gain a good understanding of how their students performed in a test (scores achieved, time spent) and of the indicative difficulty of the questions used. I often conduct an Item Analysis to measure the psychometric quality of the questions in my major assessment items. For more details, visit Item Analysis.
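To make the item-analysis idea concrete, here is a minimal sketch that computes a facility index (proportion of available marks earned) and a simple item-rest discrimination for each question. It assumes a hypothetical per-question score export, “quiz_grades.csv”, with one row per student and each question scored between 0 and 1; Moodle’s own quiz Statistics report computes comparable metrics.

```python
import pandas as pd

# One row per student, one column per question, scores scaled to 0..1.
scores = pd.read_csv("quiz_grades.csv", index_col="Student")

for q in scores.columns:
    facility = scores[q].mean()                # proportion of marks earned
    rest = scores.drop(columns=q).sum(axis=1)  # total on the other items
    discrimination = scores[q].corr(rest)      # item-rest correlation
    print(f"{q}: facility={facility:.2f}, discrimination={discrimination:.2f}")
```

A very low facility flags a hard (or possibly faulty) question, while a low or negative discrimination flags a question that does not separate stronger from weaker students and deserves a closer look.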

About the data and technology used: Quiz data is collected from the built-in “Results” reports found on the quiz settings page. This includes students’ grades, actual responses and time spent. We can also link students’ demographic information to this performance data if necessary. You can then graph this data in MS Excel; I prefer Tableau, however, for its fast and intuitive drag-and-drop approach to visualizing data.

About the LA application: Below are the findings, presented as a “data story” in Tableau, for an online course that involved two cohorts of students and three tests in Moodle. From the graph below, we found that students performed very poorly on question 5 of Test 3 across all cohorts. Further investigation may confirm whether the question was in fact faulty or whether the knowledge it tested was not covered sufficiently in class.

Data story graph showing that Question 5 of the quiz was failed by almost all the students in the course, while the other questions were successfully completed by the majority of students.

By analysing your quiz results, you can feed forward into the rest of the course, or into future offerings, to address areas where students are struggling. This information can help you adjust your teaching approach to support students in these specific areas of need.

LA Example 3: Open Text Data

Scenario: Student feedback can be collected in many ways, including interviews, surveys and suggestion boxes, and more often than not it contains open text. We can conduct a thematic analysis of this rich data to identify the common themes.

About the data: Text from the transcript of student interviews.

Technology used: I used NVIVO to analyse the text data, Tableau to visualize my findings, and wordart.com to create word clouds.

About the LA application: A word cloud is first created to highlight the issues the respondents talked about most. For each identified theme, the numbers of respondents who mentioned good aspects (what worked) and things that needed improvement (what didn’t work) are counted. This data is visualized in a dual-axis bar chart.

Word cloud showing the key words discovered through thematic analysis.


A visualisation of the data presented in the thematic analysis, which shows which key themes students had positive and negative feedback on.
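For readers who prefer code to drag-and-drop tools, the counting-and-charting step behind this visual can be sketched in a few lines of Python. The themes and counts below are made-up placeholders rather than the study’s actual data, and a grouped bar chart stands in for the dual-axis chart.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical theme counts from a thematic analysis: how many
# respondents mentioned each theme positively vs. negatively.
themes = pd.DataFrame({
    "theme": ["Feedback", "Workload", "Resources", "Assessment"],
    "worked": [12, 5, 9, 7],             # what worked
    "needs_improvement": [3, 10, 4, 6],  # what didn't work
})

ax = themes.plot(x="theme", y=["worked", "needs_improvement"],
                 kind="bar", title="What worked vs. what didn't, by theme")
ax.set_ylabel("Number of respondents")
plt.tight_layout()
plt.show()
```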


Other LA Tools

In addition to the built-in analytic reports, Moodle has a large repository of plugins related to LA. Check out the Moodle Documentation for a full list of LA tools.

Another sophisticated LA system, which helps instructors personalize engagement with students at scale, is the Student Relationship Engagement System (SRES). SRES has been piloted and used at a number of major universities, including UNSW, the University of Melbourne, and the University of Sydney. It can be used for checking attendance, sending personalized emails based on if-then rules, and giving personalized feedback to students. Visit SRES for further information.

Activity

Please share your experience with LA in your capacity as either a student, instructor or administrator.

If you know or have used an LA tool in your teaching and learning, please share:

(1) a brief use case;

(2) any details on the data and analytic techniques; and

(3) how the analytic results are delivered to you (e.g. a report, dashboard or other user interface).

Post your response to this activity in the discussion forum.

7 thoughts on “Day 2: Using Learning Analytics to Support Learning and Teaching”

  1. As a learning designer I build short quizzes and forum discussion questions into each part of a course. In the notes to the instructor, I suggest they sort the results in ascending order so they can see which students are having difficulties. I do this routinely each week when teaching. With Moodle/Wattle, it is just a matter of having the grade-book sorted in ascending order. The students with the lowest marks, or no marks, can then be easily identified.

    Where a student is having a problem, I then look at the logs to see if, and how often, they are using Moodle. I don’t look at the logs unless I have already decided the student has a problem, as I have ethical concerns about monitoring what students do.

    Identifying students who need help creates an obligation to help them. Early in an elective course, that help may be encouraging students to withdraw before they are recorded as a fail. In a core course at the end of a degree program, there may be nothing useful I can do.

    1. Hi Tom, this is a really interesting point about the obligation to help when the data indicates a student is struggling. In a small course this is likely feasible, but what if dashboards and other LA interfaces show a significant issue, with a larger cohort of students potentially failing an assignment, unit or program? As an institution, are we duty-bound to intervene, and how can this be managed? I was at an LA conference a few years ago where Charles Darwin Uni showed how their analytics led directly to phone calls to online students to check in and see how they were going, but this is obviously a resourcing issue that would need to be addressed. I also worry about the impact that student-facing learning analytics dashboards would have on the headspace of students who are struggling: showing them how much, and where, they are failing could help them or distress them further. Lots to think about; we will discuss more of these ethical issues in Day 3!

  2. In case you don’t already know, the built-in Moodle Quiz reports are quite useful. Under Quiz / Settings / Results, you can find three basic reports: Grades, Responses and Statistics.
    The Statistics report provides you with some basic visualization of the questions’ difficulty (facility index) and discrimination levels (discrimination efficiency). More details on these metrics can be found here: http://www.proftesting.com/test_topics/steps_9.php

  3. I’m an Educational Technologist and I tried to capture Faculty-level Moodle data via configurable reports (with a colleague’s help!). I found it very difficult to draw conclusions from the data captured. It can definitely help with measuring compliance (was X completed, was the course outline posted, etc.) but I don’t think I was able to accurately comment on student engagement. For one, it assumes activities/tools are being used correctly to begin with, and I think that is too big an assumption in our (my) case given we have oversight over so many courses.

    I otherwise mostly find myself using the Moodle logs to diagnose issues, to try and uncover the steps that were taken, or to investigate academic integrity matters. I’d like to move towards supporting academics to use this data at the course level in a more positive way: rather than investigating whether a student’s claims are true, I’d like it to help them see whether students are engaged with the content, and to help them adjust or improve practice. I also debate whether it can give us a truly accurate reflection of engagement. A colleague is writing a thesis about this, using LMS data. I will be interested to hear her findings!

    I have also found some reports in Moodle to be a little misleading. For instance, with a Book, the Course Participation report only marks a ‘Yes’ when all pages are clicked through, including the final button, and it won’t capture anything if the student uses the navigation menu on the left to move between chapters (this also affects Activity Completion). I believe the Activity Report captures each click in a Book, and the clicks-per-user breakdown is helpful. Even though all the reports use the logs data, I don’t think Moodle has made enough information available to see what data is being used in each report. Demystifying some of the terms in the logs would also be helpful. I think it will take a lot of playing and testing to work it out properly.

    1. I think you definitely hit the nail on the head, Natalie, about being cautious with the story LA tries to tell! For starters, it is NOT telling the whole story of student engagement, as I will argue on Day 3 of the course. Even with the appropriate “pedagogical insights” and “educational contexts” available to help guide the LA models and interpret LA results, LA is still far from perfect, and we cannot rely on these models alone. In some cases, with the right data and the right context, LA can perform very well, but its accuracy may not scale in a larger or automated context. A more sensible approach to LA, like you said, is to take LA results with a grain of salt and treat them as one more tool in your toolbox when you make decisions. Having access to some LA tools is a lot better than nothing, but again, this notion does not come without its baggage. We will look further into what “the more data the better” really implies later in this course.

  4. Other than my LA-reliant previous life, I have dabbled with the Moodle quiz reports. I find them useful for identifying particular concepts or questions that the class as a whole struggled with. In this regard, the reports, coupled with online (rather than paper) quizzes, provide much faster feedback to the teaching team.

    I leave logs and activity completion reports for verifying questionable student claims. For example, when a student insists they submitted an assignment, yet it mysteriously got lost in the ether. More often than not, a quick review behind the scenes suggests that the student never accessed the assignment, or only opened it for the first time 30 minutes before the submission deadline. Even in these situations, I try to resist drawing conclusions from the LA data alone.

    Actually, once a student reaches out to me, I occasionally keep tabs on their logs in the lead-up to deadlines. However, this is with their consent, and is almost exclusively for students who are struggling with time management issues. I guess this ties in with the sense of obligation Tom mentioned above.

  5. I provide Wattle grammar quizzes as part of my French course resources each week. Completion of these quizzes is a criterion for the 10% course participation grade, so I use LA by accessing the quiz report to see who’s done them. However, I wouldn’t feel comfortable using LA from these quizzes for anything else except this basic criterion of completion, because I don’t think the data tells me anything meaningful about the students’ learning. I also underline to my students that I don’t take their percentage result on the quiz into account, as it is a homework exercise rather than an assessment piece, to remove any pressure and anxiety from the activity. It seems to be working well so far, although I only know this from asking them directly in class!
