


Usability Evaluation of Learning Analytics: A Student-Centred Perspective / Évaluation de l’utilisabilité de l’analytique de l’apprentissage : une perspective centrée sur l’étudiant

Ali Shiri

Canadian Journal of Information and Library Science, Volume 42, Numbers 1–2, March–June/mars–juin 2018, pp. 94–112 (Article)

Published by University of Toronto Press



https://muse.jhu.edu/article/717389

Usability Evaluation of Learning Analytics: A Student-Centred Perspective / Évaluation de l’utilisabilité de l’analytique de l’apprentissage : une perspective centrée sur l’étudiant

Ali Shiri, School of Library and Information Studies, University of Alberta, [email protected]

Abstract: The purpose of this article is to report on the findings of a usability study of a learning analytics tool developed for the eClass learning management system at the University of Alberta. The study was conducted with 39 graduate students using and interacting with a learning analytics tool. The study design consisted of watching an online tutorial on how to use and interact with a learning analytics tool, completing a number of tasks, and, finally, completing usability survey questionnaires. The majority of students found the tool easy to use, easy to learn, and visually appealing. It was also noticeable that 62% of the participants felt that they gained new insight into their interaction with digital learning objects and 72% felt that this would provide them with insight as to how they could improve their interaction with learning objects. Concerns were raised by students in relation to the potential for misinterpretation of student data by instructors.

Keywords: learning analytics, data analytics, learning management systems, usability evaluation, student learning data, usability studies

Résumé : L’objectif de cet article est de présenter les résultats d’une étude d’utilisabilité d’un outil d’analytique de l’apprentissage développé pour le système de gestion de l’apprentissage en ligne à l’Université de l’Alberta. L’étude a été menée auprès de 39 étudiants diplômés utilisant et interagissant avec un outil d’analytique de l’apprentissage. La conception de l’étude consistait à regarder un didacticiel en ligne qui portait sur la façon d’utiliser et d’interagir avec un outil d’analytique de l’apprentissage, en effectuant un certain nombre de tâches et en remplissant pour finir des questionnaires de sondage d’utilisabilité. La majorité des étudiants ont trouvé l’outil facile à utiliser, facile à apprendre et visuellement attrayant. Il a également été remarqué que 62% des participants ont pensé avoir acquis de nouvelles perspectives sur leur interaction avec les objets d’apprentissage numérique et 72% ont estimé que cela leur donnait une perspective nouvelle sur la façon dont ils pourraient améliorer leur interaction avec les objets d’apprentissage. Certaines préoccupations ont été soulevées par les étudiants quant à la possibilité d’une mauvaise interprétation des données des élèves par les instructeurs.



Mots-clés : analytique de l’apprentissage, analyse des données, systèmes de gestion de l’apprentissage, évaluation de l’utilisabilité, données d’apprentissage des étudiants, études d’utilisabilité

Introduction

The widespread development of online teaching and learning and the introduction of numerous online courses and programs have presented new opportunities and demands for institutions of higher learning to develop new ways of monitoring and evaluating the online learning environment. Terms such as “educational data mining” and “academic analytics” as well as the more commonly adopted term “learning analytics” have been used in the literature to refer to the methods, tools, and techniques for gathering very large volumes of online data about learners and their activities and contexts. Learning analytics has more specifically been defined by the first International Conference on Learning Analytics and Knowledge (2011) as “the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” The interest in, and demand for, the development of software tools to perform learning analytics has increased over the past several years (Ferguson 2012). Such tools can offer new insight into online teaching and learning. Siemens et al. (2011) and Siemens and Long (2011), for instance, have proposed several advantages of such learning analytics tools, including:

• early detection of at-risk students and generation of alerts for learners and educators

• personalization and adaptation of learning processes and content

• extension and enhancement of learner achievement, motivation, and confidence by providing learners with timely information about their performance and that of their peers

• higher quality learning design and improved curriculum development

• more rapid achievement of learning goals by giving learners access to tools that help them to evaluate their progress.

While course and learning management systems, such as Moodle, hold very large datasets related to student interactions and activities, they typically lack, or have limited, built-in learning analytics capabilities (Dawson 2010). The depth of their extraction, aggregation, reporting, and visualization functionalities has often been basic or non-existent. With such limitations, instructors are not able to use available data points in a multidimensional way to make detailed and comparative inferences about student activities and interactions within one particular course activity or across the entire course content. For instance, it is not possible for an instructor to (1) comparatively and visually identify the most frequently used resources within a course; (2) detect the kinds of resources used by high-performing students in class; or (3) identify the nature of course materials not used by low- and average-performing students.

The purpose of this article is to report on the findings of a usability study conducted in October 2016 in relation to a learning analytics beta software application, the eClass learning analytics tool, that was developed for the University of Alberta’s eClass learning management system in 2016. Because this tool is incorporated into the university’s learning management system, it can be used for any course that is available within the system, regardless of whether it is offered face-to-face, blended, or completely online. The study involved students from a graduate course, LIS 502: Organization of Information, in the School of Library and Information Studies at the University of Alberta in Edmonton, Canada. Consistent with the literature, the learning analytics tool was developed with the intention of supporting instructors in their ability to monitor students’ online learning activities, interaction, and performance and to facilitate the provision of personalized and enhanced advice to students. It was also intended that the tool be used to provide students with a new means by which to regularly monitor and manage their learning activities and interactions and to enable them to compare their performance with their peers in an ongoing and real-time manner.

The study contributes to three areas of research and scholarship—namely, usability evaluation, learning analytics, and learning management systems. In particular, it provides an example of a user-centred study that makes use of a real-world operational learning management system and its associated learning analytics tools in a real classroom setting. The methodological framework and the use of the affordance strength approach for interface usability can be applied by any usability study that focuses on evaluating user interfaces in learning analytics systems. This study also sheds light on some of the emerging issues surrounding the use of student learning data for student assessment and the potential for misuse and misinterpretation of data that is processed and visualized by learning analytics tools.

Prior research and context

Learning analytics tools

There have been a number of learning analytics tools developed for various learning management systems, such as Moodle, Desire2Learn, Canvas, and Blackboard. There are two general categories of learning analytics tools. The first category is designed exclusively to be used by instructors and course designers, with features and functionalities for analysing and visualizing data related to student activities. The second category provides additional features and functionalities for students as well as instructors, with access to learners’ interaction and activity data. A significant number of these learning analytics tools are open source applications, some of which are still being developed and others that have not been kept up to date. Examples of recent open source learning analytics tools include the Social Networks Adapting Pedagogical Practice (SNAPP) (Dawson, Bakharia, and Heathcote 2010), the LOCO-Analyst (Jovanovic et al. 2008), the Graphical Interactive Student Monitoring Tool for Moodle (GISMO), and the Academic Analytic Tool (AAT) (Graf et al. 2011). Most of these applications are designed to allow only instructors to access and analyse student behaviour data in learning systems. One of the reasons for the limited number of learning analytics applications available in operational learning management systems is that the programmatic data generated in learning management systems is noisy and large and requires substantial resources to process, make sense of, and make effective use of in order to support learning. Furthermore, designing learning analytics tools that provide both students and instructors access to learning data in an easy-to-understand and meaningful manner is a major design and implementation challenge.

Rationale for the development of the eClass learning analytics tool

Course and learning management systems such as Moodle hold very large datasets related to student interactions and activities. However, student-tracking capabilities in these systems are usually limited, and, as a result, the depth of extraction and aggregation, reporting, and visualization functionality of these built-in analytics has often been basic or non-existent (Dawson 2010). While Moodle has reporting functions for students and instructors, these functionalities are not easy to use. For instance, Moodle data can be downloaded as an Excel file, but it still requires analysis in order to be useful for students and instructors. The Moodle learning management system does not currently have learning analytics tools to provide support for analysing, visualizing, and making sense of very large datasets of student and instructor activities. For instance, it is not possible for an instructor to (1) comparatively and visually identify the most frequently used resources within a course; (2) detect the kinds of resources used by high-performing students in class; or (3) identify the nature of course materials not used by low- and average-performing students. The overarching goal of the analytical tool developed at the University of Alberta was to facilitate access to, and sense-making of, learning data for students and instructors.
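To make the kind of analysis described above concrete, the sketch below shows how a raw activity-log export could be aggregated outside the learning management system to answer the three instructor questions. It is a minimal illustration only, assuming a hypothetical CSV export with columns named time, user, and event_context, plus a separate hypothetical grades file; it is not part of the eClass learning analytics tool, and real Moodle log exports use different column labels.

# Minimal sketch: aggregating an assumed activity-log export to answer questions
# such as "which resources are used most frequently?" Column and file names are
# assumptions for illustration and will differ from a real Moodle export.
import pandas as pd

# Load the exported activity log (hypothetical file and columns).
log = pd.read_csv("course_activity_log.csv", parse_dates=["time"])

# 1. Most frequently used resources within the course.
resource_use = (
    log.groupby("event_context")
       .size()
       .sort_values(ascending=False)
)
print(resource_use.head(10))

# 2. Resources used by high-performing students, assuming a hypothetical grades
#    file with columns "user" and "final_grade".
grades = pd.read_csv("grades.csv")
merged = log.merge(grades, on="user", how="left")
high_performers = merged[merged["final_grade"] >= 80]
print(high_performers["event_context"].value_counts().head(10))

# 3. Resources never touched by low- and average-performing students.
low_avg = merged[merged["final_grade"] < 80]
unused = set(resource_use.index) - set(low_avg["event_context"].unique())
print(sorted(unused))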

At the University of Alberta in Edmonton, Canada, we have developed a learning analytics application for eClass, a learning management system that is based on Moodle. The Moodle learning management system is currently used by many universities in Canada and around the world. The choice of Moodle also lies in the rationale that it is widely used by the instructors and students of the University of Alberta and serves as a realistic environment for development, implementation, and user evaluation. This new application provides learning analytics functionalities for both students and instructors, and, given that it was created for an open source learning management system, it can be adopted by other universities and colleges that use Moodle as their learning management system. The following figures depict a number of screenshots of our newly developed learning analytics application. The screenshots provide a visual representation of a learner’s data analysed using our application.

Visual dashboard for engagement data

Our application provides a number of features to analyse and visualize content engagement, forum engagement, forum usage over time, and events by the user over time. Several figures demonstrating various features are included in the appendix to this article, available online at http://muse.jhu.edu/journal/497. Figure A1 shows an example of a graph that depicts students’ engagement data across a number of activities, such as visiting various course web pages and interacting with forums, files, or blocks. Through this graph, students are able to visually identify how students within a class have viewed and interacted with different parts and components of a course. It also allows individual students to choose a particular time range to view their own engagement and activities for that time period.

Figure A2 provides a longitudinal view of engagement data for students. This functionality allows for a holistic view of all activities over a certain period of time. Students can also choose a particular activity, such as contributing to a forum or blog, and view how they have been using or interacting with the forum over time. This feature allows them to keep track of their own use of various learning objects over the period of a semester.

The granularity function shown in figure A3 allows students to narrow down the timeline to days and hours. This function will be useful for identifying how active students are before or after a particular quiz.
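As a rough illustration of the time-based view and the granularity function described above, the following sketch plots event counts per day and then per hour around an arbitrarily chosen date. It is a generic pandas/matplotlib example under the same assumed log format as the earlier sketch, not the implementation used in eClass.

# Minimal sketch of a time-based engagement view with adjustable granularity.
# Assumes the same hypothetical log format as the earlier example; the quiz
# dates below are invented for demonstration.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("course_activity_log.csv", parse_dates=["time"])
forum_events = log[log["event_context"].str.contains("Forum", na=False)]
forum_events = forum_events.set_index("time").sort_index()

# Daily engagement over the semester.
daily = forum_events.resample("D").size()

# Narrower granularity: hourly counts around a (hypothetical) quiz date.
hourly = forum_events.loc["2016-10-10":"2016-10-12"].resample("H").size()

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 6))
daily.plot(ax=ax1, title="Forum events per day")
hourly.plot(ax=ax2, title="Forum events per hour (around a quiz)")
plt.tight_layout()
plt.show()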

Visual dashboard for weekly discussion and selected students

In order to allow students to gain a collective perspective of performance within a class, engagement data for all students are shown in figure A4. This graph is useful for instructors and students to quickly and visually see how students contribute to a discussion forum across several weeks.

Figure A5 allows a student to gain a comparative perspective of how other students, on average, interacted with course content and also where the particular student stands in this regard. This will allow a student to see the level of engagement on the part of her fellow students.

Learning analytics usability

A number of user studies of learning analytics tools have been reported in the literature. For instance, in a qualitative study of instructors’ use of the LOCO-Analyst learning analytics tool, the researchers found that multiple ways of visualizing data increase the perceived value of the feedback types and that the participants wanted features that would support supplementing the textual (tabular) feedback with visual representations (Ali et al. 2012). Kelly et al. (2017), in a study of the relationships between student behaviours in a learning management system (LMS) and the learning outcomes of students, found evidence to support the idea that instructors do not strongly influence student use of an LMS and that higher-performing students use an LMS differently from lower-performing students. Dimopoulos et al. (2013) conducted a usability study of a learning analytics-enriched rubric and found that teachers and practitioners found the tool easy to accept and use and placed particular emphasis on enhancing the system in terms of visualization and social network analysis. Lukarov, Chatti, and Schroeder (2015) present three case studies of learning analytics evaluation and argue that there is no standard way of evaluating learning analytics tools that would measure their impact and usefulness for educators and students. They argue that “these LA [learning analytics] tools should not only be usable and interesting to use but also useful in the context of the goals: awareness, self-reflection, and above all, improvement of learning.” However, these studies have each examined the usability of a particular tool or prototype application. In this study, the findings shed light on a broader range of issues, such as privacy, misinterpretation of data by instructors, and students’ concerns about using these tools as an assessment mechanism.

In this study, we conducted a student-focused usability evaluation of the eClass learning analytics tool to gain a better understanding of how students evaluate the tool in terms of ease of use, visual appeal, support for understanding their progress in a course and their interaction with learning objects, and the affordances and usability of the interface.

Research design and method

Participants

The usability study reported here was completed as a student assignment in a graduate course on the organization of information at the School of Library and Information Studies at the University of Alberta in Edmonton, Canada. The researcher received ethics approval from the university to involve student participants and to conduct this study. In total, 39 graduate students, who had enrolled in the above-mentioned course in October 2016, participated in the study. All of the participants received full credit (10%) if they completed the assignment, regardless of the nature of the feedback they provided via the survey. Based on an initial question, it was confirmed that none of the participants had prior experience in using or interacting with learning analytics tools.

Procedure and data-gathering tools

The study consisted of three parts. Participants were first asked to watch an 11-minute online tutorial about the eClass learning analytics tool and its functionalities. The tutorial was created and shared with students on YouTube. They were then invited to use the eClass learning analytics tool to complete a series of tasks in their LIS 502 eClass website. Finally, participants completed a usability survey that served to gather data about their user experiences and impressions. The usability survey was designed using Google Forms.

The video tutorial consisted of two parts (i.e., two videos), figures from which are included in the appendix to this article, available online at http://muse.jhu.edu/journal/497. The first part provided instruction on how to access and use the “content engagement graph” (figure A6), while the second part provided instructions on how to access and use the “time-based engagement graph” (figure A7). A content engagement graph visualizes strictly the number of times a user viewed (i.e., accessed) or interacted (i.e., added or removed information) with various activities in a given site. The time-based engagement graph visualizes the number of times a user viewed or interacted with various activities within a given site over time. After viewing the two videos, participants were instructed to complete two sets of tasks corresponding to the video tutorials. They were asked to work through a set of activities related to each of the content engagement graphs and the time-based engagement graphs. Lastly, participants were asked to fill out a usability survey containing Likert-style and open questions, the data for which is reported and interpreted in this article.

Tasks

The tasks in this usability study were designed to test participants’ initial reactions and general impressions of the usability of the eClass learning analytics tool and to gather data about its affordance strength. As Ruecker (2006) notes, the concept of affordance helps to establish a semantic space related to the fluid mediation of understanding that occurs between people and their environment, as opposed to the unmarked term “functional.” Day (2011) introduces the idea of cultural, social, and physical affordances and notes that the mediation with which human–computer interaction deals begins with the “subject” needs, suggesting the importance of a user’s view of the functionalities of a user interface. The affordance strength approach has been used in a number of previous usability studies (Shiri et al. 2011; Ruecker, Radzikowska, and Sinclair 2011).

As part of the usability process, the goal was to solicit feedback on the participants’ overall sense of the tool, how easy or difficult it was to use, what functional or aesthetic elements worked well or could be refined, and how they could see themselves using the tool. The tasks in this usability study were designed to:

• provide participants with the opportunity to engage with the major functionalities of the tool, specifically the two types of available graphs

• familiarize participants with how to add and modify content engagement graphs so that they could access visualizations about their engagement during specified time periods for specified activities

• familiarize participants with how to add and modify time-based engagement graphs so that they could access visualizations about their engagement over a period of time for specified activities

• familiarize participants with how to compare their data to class averages when using both types of graphs

• encourage students to consider the potential uses (for students and/or instructors/professors) for the data visualized in both types of graphs.

Results

Given the nature of data-gathering methods, we collected quantitative and qualitative data from the participants.

Quantitative results

Quantitative data were mainly gathered using the Likert scale.

General usability questions

The participants were first asked to assess the eClass learning analytics tool with respect to its general usability. They were asked to provide a score between “strongly disagree” and “strongly agree” in relation to five questions, corresponding to the values of one through five on the five-point Likert scale. A single open question—asking students to enter any comments about their thought processes and impressions—followed this set of questions. The general usability questions (questions 1–5 of our post-test questionnaire about the usability of the eClass learning analytics tool) used the following five-point Likert scale:

1. strongly disagree
2. disagree
3. neutral
4. agree
5. strongly agree.

The following are the quantitative results for the usability questions:

1. I think this tool would help me understand my progress in the online course:
• 46% of participants chose “agree” or “strongly agree”
• 31% of participants chose “neutral”
• 23% of participants chose “disagree” or “strongly disagree.”

2. I think this tool would be easy to use:
• 87% of participants chose “agree” or “strongly agree”
• 10% of participants chose “neutral”
• 3% of participants chose “disagree” or “strongly disagree.”

3. This tool is visually appealing:
• 82% of participants chose “agree” or “strongly agree”
• 13% of participants chose “neutral”
• 5% of participants chose “disagree” or “strongly disagree.”

4. I feel confident that this tool will provide me with new insight into my interaction with digital learning objects:
• 62% of participants chose “agree” or “strongly agree”
• 26% of participants chose “neutral”
• 13% of participants chose “disagree” or “strongly disagree.”

5. The appearance of visualization features about my learning data helped me understand how I have been interacting with digital information sources and how to improve my interaction with, and use of, learning objects:
• 72% of participants chose “agree” or “strongly agree”
• 18% of participants chose “neutral”
• 10% of participants chose “disagree” or “strongly disagree.”

As can be seen from the above responses, the majority of students found the tool easy to use and visually appealing. Fewer than half of the students (46%) felt that the tool helped them to understand their progress in the online course. It was also noticeable that 62% of the participants felt that they gained new insight into their interaction with digital learning objects, and 72% felt that this would provide them with insight as to how they could improve their interaction with learning objects. This is particularly important, since gaining insight into one’s own interaction with digital learning objects, and into one’s standing relative to the rest of the class, provides a quick overview of how the class as a whole interacts with digital learning objects.
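The percentage bands reported above can be reproduced from raw responses with a simple tally. The snippet below is a generic illustration of that arithmetic only, using an invented list of responses rather than the study’s data.

# Generic illustration of collapsing five-point Likert responses into the three
# bands reported above. The responses list is invented for demonstration.
from collections import Counter

responses = [5, 4, 4, 3, 2, 5, 4, 3, 4, 1, 4, 5, 3, 4, 4]  # 1 = strongly disagree ... 5 = strongly agree

bands = Counter()
for r in responses:
    if r >= 4:
        bands["agree or strongly agree"] += 1
    elif r == 3:
        bands["neutral"] += 1
    else:
        bands["disagree or strongly disagree"] += 1

for band, count in bands.items():
    print(f"{band}: {round(100 * count / len(responses))}%")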

Affordance strength questions

Following the usability questions, participants were asked to answer six quantitative affordance strength questions. These questions focused on gathering data on ease of use and ease of understanding the tool, willingness and motivation to use the tool, helpfulness of the tool for study planning, and the capability of the tool to support learning. The affordance strength questions were divided into two parts: Part 1 comprised a set of two questions, and Part 2 comprised a set of four questions. A single open question, asking students to enter any comments they had, followed each set of questions.

Affordance strength questions: Part 1. In Part 1, students were asked to provide a score between “very difficult” and “very easy” for two five-point Likert-style questions (Part 1, questions 1–2 of our post-test questionnaire about the usability of the eClass learning analytics tool), using the following scale:

1. very difficult
2. difficult
3. neutral
4. easy
5. very easy.

The following are the quantitative results for the Part 1 affordance strength questions:

1. How easy or difficult was it to understand the learning analytics tool?
• 92% of participants chose “easy” or “very easy.”
• 8% of participants chose “difficult” or “very difficult.”

2. How easy or difficult was it to understand the visualization features?
• 82% of participants chose “easy” or “very easy.”
• 5% of participants chose “neutral.”
• 13% of participants chose “difficult” or “very difficult.”

The vast majority of students found the tool and its visualization features to be very easy or easy to understand. Prior research has found that understanding of visualization is influenced by learning and cognitive styles, which were not examined in this study (Shiri, Ruecker, and Murphy 2013).

Affordance strength questions: Part 2. For Part 2, participants were asked to give a score between “not at all” and “very much” for four additional five-point Likert-style questions (Part 2, questions 1–4 of our post-test questionnaire about the usability of the eClass learning analytics tool), using the following scale:

1. not at all
2. maybe
3. neutral
4. somewhat
5. very much.

The following are the quantitative results for the Part 2 affordance strength questions:

1. Would you want to use this interface?
• 36% of participants chose “somewhat” or “very much.”
• 13% of participants chose “neutral.”
• 51% of participants chose “maybe” or “not at all.”

2. Could this tool be a helpful tool in your study planning?
• 51% of participants chose “somewhat” or “very much.”
• 13% of participants chose “neutral.”
• 36% of participants chose “maybe” or “not at all.”

3. If this tool was offered in the eClass learning management system, how motivated would you be to use it to support your teaching/learning?
• 21% of participants chose “somewhat” or “very much.”
• 21% of participants chose “neutral.”
• 59% of participants chose “maybe” or “not at all.”

4. Would this tool be capable of supporting your teaching/learning?
• 41% of participants chose “somewhat” or “very much.”
• 21% of participants chose “neutral.”
• 38% of participants chose “maybe” or “not at all.”

The above responses provide mixed feedback on the helpfulness of the tool. Just over half of the participants (51%) thought the tool would be useful for study planning, and 41% felt that it would be capable of supporting learning. However, only 36% were of the view that they would want to use the interface, and only 21% stated that they would be motivated to use the tool. One explanation for this discrepancy is that the idea of such a tool as a mechanism to support learning is new to students, and it is compounded by the metacognitive aspect of the tool, which asks students to learn about their own learning process.

As mentioned above, a single open question followed each of Part 1 and Part 2, asking students to enter any comments (i.e., qualitative feedback) they had in relation to the quantitative questions in each part. The results of the qualitative feedback are discussed in the next section.

Qualitative results

Students were asked to provide written feedback on a total of four open questions as part of the usability study. After completing Part 1 of the affordance strength questions, participants were asked:


• Affordance Strength Question 1: Please enter any comments regarding the grid (i.e., Part 1 quantitative affordance strength questions) above.

After completing Part 2 of the affordance strength questions, participants were asked:

• Affordance Strength Question 2: Please enter any comments regarding the grid (i.e., Part 2 quantitative affordance strength questions) above.

Also, at the very end of the survey, participants were asked for their written responses to the two general “stand-alone” open questions:

• General Question 1: Do you have any comments about improving the look and feel of the interface?

• General Question 2: Can you describe how you would use this tool?

Affordance Strength Question 1: Please enter any comments regarding the grid (i.e., Part 1 quantitative affordance strength questions) above

Part 1 of the affordance strength quantitative questions focused on how easy or difficult it was to use the tool overall and how easy it was to understand the visualization component of the tool. This corresponding open question asked participants to provide any comments they wished to share. The most prominent themes identified within the comments made by the participants were (1) praise for general usability; (2) opportunities for refinement; and (3) data interpretation. Detailed analysis of these themes is provided below.

Praise for general usability. Generally, participants responded quite favourably with respect to the usability of the eClass learning analytics tool. It was complimented by several participants as being “easy to use,” “intuitive,” “straightforward,” and “understandable.” In addition, one participant specifically offered that “the use of [i]conography enhanced the ease of using the Tool,” suggesting that the visual design of the eClass learning analytics tool was instrumental in promoting usability. The role of the tutorial video in promoting ease of use was also acknowledged by several participants. They commented that the tutorial video enabled them to quickly and easily understand and use the eClass learning analytics tool.

Opportunities for refinement. While a later open question in the study’s survey specifically requested feedback in relation to potential changes that could be made to the eClass learning analytics tool, participants seemed interested and invested in improving the tool from the very beginning and immediately identified some excellent opportunities for refinement. Suggestions centred mainly on requests for clarification of terminology and units in the graphs, layout changes, and minor changes to functionality and default views.

With respect to clarification of terminology and units, some students indicated that they were uncertain as to the meaning of the terms “all core” and “all blocks” and about the corresponding bar graph data that appeared in relation to them when a default content engagement graph was accessed (see figure 2). “All core” refers to the number of views of certain key site functions, as defined by Moodle. While this explanation was provided in the study’s instructions, it continued to be confusing for users and, as such, has since been renamed as “system” and removed from the default view. In addition, the “all blocks” category has been removed altogether, as it had initially been included for testing purposes only and did not display useful data.

A few participants were also unclear as to what certain scales and units that sometimes appeared on the y-axis of graphs represented. For instance, the y-axis scale of the content engagement graph displayed partial events (i.e., views or interactions) in decimal form for some participants, causing confusion. In addition, the time-based engagement graph at times used the abbreviation “m” as its unit (e.g., 100m), but no explanation as to what this unit represented was provided, and a few participants also found this confusing. Upon investigation, it was determined that “m” represented the SI prefix milli (one thousandth), so that “100m” denoted 0.1. Such units appeared because a setting that applies SI prefixes to axis labels had inadvertently been enabled; this setting has since been turned off.
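The charting library behind the eClass tool is not identified in this article, so the following matplotlib snippet is only an illustration of how SI-prefix (engineering-notation) tick formatting renders the value 0.1 as a label such as “100m,” and how a plain formatter avoids it.

# Illustration of SI-prefix axis labels: with engineering-notation formatting,
# the value 0.1 is rendered as "100m", the kind of label that confused
# participants. matplotlib is used here only as a stand-in for whatever
# charting library the eClass tool actually uses.
import matplotlib.pyplot as plt
from matplotlib.ticker import EngFormatter, ScalarFormatter

values = [0.1, 0.4, 0.7, 1.0]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

ax1.plot(values)
ax1.yaxis.set_major_formatter(EngFormatter(sep=""))  # 0.1 shown as "100m"
ax1.set_title("SI-prefix ticks")

ax2.plot(values)
ax2.yaxis.set_major_formatter(ScalarFormatter())     # 0.1 shown as "0.1"
ax2.set_title("Plain ticks")

plt.tight_layout()
plt.show()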

In relation to layout, a few participants indicated that overlap occurred when multiple graphs were accessed. The text of graph legends and/or the x-axis of graphs also appeared to be “cut off” (i.e., not visible) for a couple of participants. As participants used their own laptops for the usability study, it was helpful to get a sense of minor changes to settings or code that could be made to optimize viewing of the learning analytics tool across a broad variety of computer models. One participant recommended additional functionality—namely, including a “back button” so that users could view previously generated graphs. This very practical function has been added to the current version of the eClass learning analytics tool.

Finally, as suggested in the tutorial video, several participants confirmed that, when first adding a new graph, it would be valuable for the default settings of that graph to reflect only the current semester’s data. At the time of the study, default graphs provided the previous 365 days’ worth of data (from the date of access). This proposed modification was echoed later on by several other students in response to question 3 of the survey and has been implemented.

Data interpretation. A number of participants raised concerns about the potential for misinterpretation of the data. For instance, two participants indicated that potentially incorrect conclusions about success could be drawn from the data. They suggested that engagement would not necessarily be a representative way to “signify success” since, for instance, offline use of resources that had been downloaded would not be captured as engagement by the tool.

Affordance Strength Question 2: Please enter any comments regarding the grid (i.e., Part 2 quantitative affordance strength questions) above

Part 2 of the affordance strength quantitative questions focused on whether students would want to use the eClass learning analytics tool, whether they thought it would be helpful to their study planning, how motivated they would be to use it to support their learning, and whether they thought this tool would be capable of supporting their learning. Participants were asked to provide any comments they wished to share in relation to the Part 2 quantitative questions. The most prominent themes derived from the comments were (1) potential uses and (2) potential for the instructor/professor to misinterpret the data. Detailed analysis of these themes is provided below.

Potential uses. The quantitative questions that preceded this open question referenced the potential uses of the eClass learning analytics tool and seemingly prompted participants to provide some preliminary ideas about use. One participant thought that the tool “could help students see where they are spending most of their time and which classes they have been dedicating more time to.” Another participant indicated that they might use this tool to plan how to spend their study time in the next semester by reviewing where they had spent most of their time and how this correlated with academic outcomes (grades). As was shared by many others later in the survey, two participants also indicated that the eClass learning analytics tool could be useful for monitoring their participation in the online forums component of eClass, particularly when participation was required for a course. Two participants simply indicated that they were unsure as to how it would inform their study planning in the future, while two others thought that access to the information would not impact their behaviour. Finally, one student insightfully suggested a specific course of action that should naturally flow from this study’s work; they proposed that it would be helpful to investigate the literature to determine which metrics have been identified as beneficial to student success, which, in turn, could inform potential uses.

Six participants echoed comments from question 1, stating that they mainly downloaded and used resources offline. However, these comments were made in the context of potential use, rather than in relation to concerns about how the data might be interpreted. These participants stated or implied that because they use eClass more sparingly, for instance by downloading and accessing resources offline, the resulting gaps in data could prevent them from taking advantage of what the eClass learning analytics tool has to offer. Somewhat relatedly, one student pointed out that a limitation of the tool is that it “does not account for the time you spend offline.” As a final note related to potential uses, a few participants indicated that they generally found the tool to be “interesting” and “fun.” These and related descriptors were repeated throughout the open questions of the survey. While such feedback does not speak to ways in which the tool could be used per se, it suggests quite positively that using the tool is an enjoyable experience, which may, in turn, promote students’ desire to use it moving forward.

Potential for instructor/professor misinterpretation of data. Some participants noted earlier in the survey that their offline access would not be captured and, therefore, that conclusions about how online engagement impacts success may be limited. Along the same lines, some participants were concerned that the data, when considered by instructors or professors, may lead to misinterpretation of student participation/engagement. The downloading of resources to use offline was again mentioned. Participants were concerned that this practice could affect how they might be perceived or assessed. One participant commented that “[the] number of views doesn’t necessarily mean much if, at the beginning of the term, students look through all of the material and download relevant PDFs and links to their personal computers and then refer to those, not eClass, throughout the term.” Another remarked that they were concerned that instructors might make “quantitative judgments about success without full knowledge of . . . [a student’s] learning style or strategies.” Also related to the potential for misinterpretation, two participants commented (in response to this question and to question 1—included here for convenience) that if this tool was used for assessment by instructors/professors, they might be motivated to “appear” to be engaged, rather than actually be engaged. Arguably, such behaviour could also result in misinterpretation of the data.

General Question 1: Do you have any comments about improving the look and feel of the interface?

Participants were generally very complimentary about the look and feel of the eClass learning analytics tool. As in response to Affordance Strength Question 1, it was described again as “easy to use,” “straightforward,” and understandable. It was also touted as “user friendly” and “easy to navigate.” In reference to its visual design, the tool was complimented as being “simple,” “appealing,” and “aesthetically [and] visually appealing,” and the graphs were felt to have an “intuitive layout.” Also, as mentioned earlier, the instructional video that introduced and guided students through the tool was described as being valuable for its clarity and for facilitating understanding of the tool. Participants also provided some specific and valuable points for improvement. The following four requests echoed responses to Affordance Strength Question 1 and have all been addressed in a newer version of the eClass learning analytics tool: (1) a request to set the default period for graphs to display the latest semester (this was by far the most frequent request); (2) a request for clarification around the units used on the axes of the graphs; (3) a request for reformatting of graph legends; and (4) a request to include a back button. Additional layout improvements were also suggested. Two participants indicated that the checkbox located next to the time period selection tool—which had to be selected prior to selecting a time period for a graph—was cumbersome and confusing and should be removed (see figure 2, to the left of the “dates” selector). A few participants also requested the ability to close or remove generated graphs. Finally, an option to be able to hover over some key terms in order to access definitions was requested. This feedback was extremely valuable, and the tool has been modified accordingly.

General Question 2: Can you describe how you would use this tool?

While some participants indicated at various points in the survey that they were unsure how to use this data to effect positive academic outcomes, many others offered, in response, a variety of interesting ideas about potential uses for instructors/professors and students.

Potential uses for instructors. Participants thought that instructors and faculty (professors) could use the eClass learning analytics tool mainly to (1) determine student participation and overall engagement; (2) determine with which elements of an eClass site students most commonly interacted; and (3) assess the overall “utility of their eClass for students.” Regarding the use of data by instructors and professors, several participants cautioned that instructors and professors could misinterpret the data, especially when site resources were downloaded and repeatedly accessed offline. One student, however, thought that they might actually shift their behaviour to access sources online (i.e., leverage the data-recording capabilities of the eClass learning analytics tool) if they knew that information about their activity would be captured. Arguably, this suggestion reveals a potential benefit of the tool—namely, it provides students with a means by which they can evidence their participation in a measurable and quantifiable way. Participants also articulated concerns about access to student information and impacts on stress. Two students commented that knowing that professors had access to their information would cause them stress/anxiety. In contrast, however, one participant indicated that the ability to “visualize” the effort they put into a class (i.e., access their own data) would actually reduce their stress and would serve as a motivator that could show them how productive they could be. Finally, two participants expressed concerns about their professors being able to “surveil” their actions. One participant proposed that the data should be made exclusively available to students, while a couple of others wanted the option to opt out.

Potential uses for students. Participants offered a number of creative and enthusiastic suggestions in relation to how students might use the tool. Most prominently, participants felt that the eClass learning analytics tool would be useful for tracking and monitoring their participation in eClass forums when such forums formed part of the class assessment. Eight of 39 participants identified this as a potential function. Several participants also mentioned that they would use the tool to either track or reflect on their participation generally. Another handful of participants offered that they might also use the visualized data to motivate online participation in some shape or form. On a related note, one student indicated that they would use it to “keep from procrastinating,” and another indicated that it would help them “gauge how active” they were and determine whether they needed to adjust their activities accordingly. Others were hopeful that the tool could reveal to them resources that they had “missed” out on, especially if their peers’ data was available and signalled activity in relation to those resources. In fact, several participants indicated that seeing their peers’ interaction with the tool might serve to prompt them to increase or change their participation. A number of other, more general uses were noted, including the comparison of participation to class averages, time planning, goal setting, time management, and informing study habits. A few concerns regarding potential student uses were also raised.


Two participants felt that this tool would not be useful, either because they believed they had a good sense of their degree of participation or because the courses they were taking were primarily face-to-face rather than online courses. As was stated earlier in this article, because the learning analytics tool is incorporated into the university’s learning management system, it can be used for any course that is available within the system, regardless of whether it is offered face-to-face, blended, or completely online. Also, one student indicated that they were prone to anxiety and that the tool might cause them stress because they would be concerned with having a record that demonstrated the irregular times at which they accessed the information.

Conclusion and further research

Learning analytics applications have the potential to provide insight into how students interact with, and make use of, learning objects in their learning process. This insight can provide instructors with new knowledge about the ways in which students interact with the course website and how the instructor can provide advice and help to improve teaching and learning. This article reports a study that has adopted a student-centred and empirically sound usability evaluation of a learning analytics tool within an operational learning management system and a real graduate course setting. The response to the eClass learning analytics tool was generally very positive. As for usability, both the quantitative and qualitative data reflect that an overwhelming majority of participants found the tool to be visually appealing and easy to use. In addition, and perhaps most encouragingly, a few participants indicated that they actually found the tool to be fun. It appears that using the tool is an enjoyable experience; this may motivate students to explore it moving forward. Another very positive outcome of the study was that participants were forthcoming with excellent constructive feedback. Many of their suggestions were practical and specific and have been incorporated into a newer, more refined version of the tool. An abundance of data was provided in relation to perceived potential instructor/professor uses of the data presented by the tool. Participants acknowledged that instructors and professors would be able to get a holistic sense of eClass site use by students and would benefit from being able to identify which students engaged online, with which activities, and to what extent.

These realizations were tempered by the potential for misinterpretation of student data, which counsels considerate (i.e., cautious) use of the tool, especially by instructors/professors. Misinterpretation of data could be minimized by using a broad range of sources of evidence for assessing students’ performance and success, such as instructor–student interaction, engagement data, usage data, and grades, as well as an individualized feedback mechanism. Students were eager to point out that records of their activity may not accurately represent their efforts and experience in relation to a given class. In particular, participants noted that downloading and use of materials offline would generate data that does not necessarily reflect student engagement. One student suggested, however, that they saw an opportunity to leverage the tool to demonstrate their engagement—they would be willing to move more of their activity online if they knew it would be accounted for. Ultimately, several students cautioned that drawing conclusions about student success or assessing students based on usage data should be done with care. It was further noted that students seemed to focus mostly on the direct impacts of the assessment of their work and less so on the ability of instructors or professors, for instance, to enhance the learning process or experience based on data or to identify students who are struggling or require additional challenge. As was noted earlier, it is particularly important that learning analytics tools be used as one of many sources of evidence for the assessment of student learning and not as the sole source, and that the analytical results be closely triangulated with various other data sources, such as students’ performance on different assignments, their success rate throughout the course, and their engagement throughout the term.

In relation to potential student uses of the data, students most frequently indicated that the ability to track interactions required by courses—forum postings being the example repeatedly mentioned—would be particularly valuable. Beyond that, various students reflected on the variety of uses for the tool, such as planning for studying, getting a sense of important course resources, reflecting on engagement and its implications, using its results for motivation, and even, in one case, using the data as reassurance of one’s participation. A few students also mentioned that it could be stressful to know that instructors and professors could “watch” (and judge) their online academic activity. Although a number of interesting ideas and considerations were generated, some students reported being at a loss as to how they might use the tool. Further research into impactful uses of student data in analytics tools would be valuable to help guide and support students and their learning. The eClass learning analytics tool was designed to be flexible and organic rather than prescriptive and to invite users to let their circumstances, needs, and imagination guide them in using and reflecting on the online engagement captured by the tool. The findings of this study have practical implications for designing visualization tools for learning management systems. The study also has social and ethical implications for instructors who would like to use learning analytics tools in their courses. The findings will be useful for the designers of learning analytics tools, instructors, and educational administrators who make use of learning analytics tools for assessment, as well as for researchers who have an interest in conducting usability evaluation of learning analytics tools.

Acknowledgements

I would like to thank Craig Jamieson, Senior Educational Instructional Design Specialist in the Faculty of Nursing at the University of Alberta, and Athena Photinopoulos for their excellent research assistance throughout this project.

References

Ali, Liaqat, Marek Hatala, Dragan Gašević, and Jelena Jovanović. 2012. “A Qualitative Evaluation of Evolution of a Learning Analytics Tool.” Computers and Education 58 (1): 470–89. https://doi.org/10.1016/j.compedu.2011.08.030.


Dawson, Shane. 2010. “‘Seeing’ the Learning Community: An Exploration of the Development of a Resource for Monitoring Online Student Networking.” British Journal of Educational Technology 41 (5): 736–52. https://doi.org/10.1111/j.1467-8535.2009.00970.x.

Dawson, Shane, Aneesha Bakharia, and Elizabeth Heathcote. 2010. “SNAPP: Realising the Affordances of Real-time SNA within Networked Learning Environments.” Paper presented at the Seventh International Conference on Networked Learning, Aalborg, Denmark, 3–4 May.

Day, Ronald E. 2011. “Death of the User: Reconceptualizing Subjects, Objects, and Their Relations.” Journal of the American Society for Information Science and Technology 62 (1): 78–88. https://doi.org/10.1002/asi.21422.

Dimopoulos, Ioannis, Ourania Petropoulou, Michail Boloudakis, and Symeon Retalis. 2013. “Using Learning Analytics in Moodle for Assessing Students’ Performance.” Paper presented at the Moodle Research Conference, Sousse, Tunisia, 4–5 October.

Ferguson, Rebecca. 2012. “Learning Analytics: Drivers, Developments and Challenges.” International Journal of Technology Enhanced Learning 4 (5–6): 304–17. http://doi.org/10.1504/IJTEL.2012.051816.

Graf, Sabine, Cindy Ives, Nazim Rahman, and Arnold Ferri. 2011. “AAT: A Tool for Accessing and Analysing Students’ Behaviour Data in Learning Systems.” In Proceedings of the First International Conference on Learning Analytics and Knowledge, 174–79. New York: Association for Computing Machinery.

International Conference on Learning Analytics and Knowledge. 2011. https://tekri.athabascau.ca/analytics/about.

Jovanovic, Jelena, Dragan Gasevic, Christopher Brooks, Vladan Devedzic, Marek Hatala, Timmy Eap, and Griff Richards. 2008. “LOCO-Analyst: Semantic Web Technologies in Learning Content Usage Analysis.” International Journal of Continuing Engineering Education and Life Long Learning 18 (1): 54–76. http://doi.org/10.1504/IJCEELL.2008.016076.

Kelly, Nick, Maximiliano Montenegro, Carlos Gonzalez, Paula Clasing, Augusto Sandoval, Magdalena Jara, Elvira Saurina, and Rosa Alarcón. 2017. “Combining Event- and Variable-Centred Approaches to Institution-facing Learning Analytics at the Unit of Study Level.” International Journal of Information and Learning Technology 34 (1): 63–78. https://doi.org/10.1108/IJILT-07-2016-0022.

Lukarov, Vlatko, Mohamed Amine Chatti, and Ulrik Schroeder. 2015. “Learning Analytics Evaluation – Beyond Usability.” Paper presented at the DeLFI Workshops co-located with the 13th e-Learning Conference of the German Computer Society, Munich, Germany, 1 September.

Ruecker, Stan. 2006. “Proposing an Affordance Strength Model to Study New Interface Tools.” Paper presented at the Digital Humanities Conference, Sorbonne, Paris, 5–9 July.

Ruecker, Stan, Milena Radzikowska, and Stéfan Sinclair. 2011. Visual Interface Design for Digital Cultural Heritage: A Guide to Rich-prospect Browsing. Farnham, UK: Ashgate Publishing.

Shiri, Ali, Stan Ruecker, Matt Bouchard, Amy Stafford, Paras Mehta, Karl Anvik, and Ximena Rossello. 2011. “User Evaluation of Searchling: A Visual Interface for Bilingual Digital Libraries.” Electronic Library 29 (1): 71–89. https://doi.org/10.1108/02640471111111442.

Shiri, Ali, Stan Ruecker, and Emily Murphy. 2013. “Linear vs. Visual Cognitive Style and Faceted vs. Visual Interaction in Digital Library User Interfaces.” In Proceedings of the Annual Conference of CAIS / Actes du congrès annuel de l’ACSI. Waterloo, ON: Canadian Association for Information Science.


Siemens, George, D. Gasevic, C. Haythornthwaite, S. Dawson, S. Buckingham Shum, R. Ferguson, et al. 2011. “Open Learning Analytics: An Integrated and Modularized Platform. Proposal to Design, Implement, and Evaluate an Open Platform to Integrate Heterogeneous Learning Analytics Techniques.” Society for Learning Analytics Research. https://solaresearch.org/wp-content/uploads/2011/12/OpenLearningAnalytics.pdf.

Siemens, George, and Phil Long. 2011. “Penetrating the Fog: Analytics in Learning and Education.” EDUCAUSE Review 46 (5): 30. https://doi.org/10.17471/2499-4324/195.