Investigating a learning analytics interface for automatically marked programming assessments

Master's Thesis

2021

Abstract
Student numbers at the University of Cape Town continue to grow, with an increasing number of students enrolling in programming courses. With this growth, it becomes difficult for lecturers to provide individualised feedback on programming assessments submitted by students. To address this, the university uses an automatic marking tool to mark assignments and provide feedback: students submit assignments and receive instant feedback on the marks allocated or the errors in their submissions. The tool saves time, as lecturers spend less time marking, and its immediate feedback gives students the opportunity to correct errors in their submitted code. However, many students have identified areas where the interface between the automatic marker and their submitted programs could be improved.

This study investigates the potential of a learning analytics-inspired dashboard interface to improve the feedback provided to students on their submitted programs. A focus group of computer science class representatives was organised, and its feedback was used to create dashboard mock-ups. These mock-ups were then developed into high-fidelity learning analytics-inspired dashboard prototypes, which were tested by first-year computer science students to determine whether the interfaces were useful and usable. The prototypes were built with the Python programming language and the Plotly Python library. User-centred design methods were employed by eliciting continual feedback from students during the prototyping and design of the learning analytics-inspired interface. In a usability study, students were required to use the dashboard and then provide feedback on its use by completing a questionnaire designed around Nielsen's usability heuristics and AttrakDiff; these instruments also assisted in the evaluation of the dashboard design.

The research showed that students consider a learning analytics dashboard an essential tool that could help them as they learn to program. Students found the dashboard useful and had a clear sense of the specific features they would like to see implemented on a learning analytics-inspired dashboard for the automatic marking tool, including overall performance, duly performed (DP) status needed to qualify for exams, highest score, assignment due dates, class average score, and most common errors. This research aims to provide insight into how automatically marked programming assessments could be displayed to students in a way that supports learning.
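The abstract notes that the prototypes were built with Python and the Plotly library. As a purely illustrative sketch of one possible dashboard panel, not code from the thesis itself, the following compares a student's assignment scores against the class average and marks a DP qualification threshold; the assignment names, scores, averages, and the 50% threshold are all hypothetical placeholders.

# Illustrative sketch of one dashboard panel; data values are hypothetical,
# not taken from the thesis.
import plotly.graph_objects as go

# Hypothetical assignment results for one student and the class.
assignments = ["Assignment 1", "Assignment 2", "Assignment 3", "Assignment 4"]
student_scores = [72, 58, 85, 64]
class_average = [65, 61, 70, 68]

fig = go.Figure()
fig.add_trace(go.Bar(name="Your score", x=assignments, y=student_scores))
fig.add_trace(go.Bar(name="Class average", x=assignments, y=class_average))

# Dashed line marking a hypothetical DP (duly performed) threshold.
fig.add_hline(y=50, line_dash="dash", annotation_text="DP threshold (50%)")

fig.update_layout(
    barmode="group",
    title="Performance per assignment vs class average",
    yaxis_title="Score (%)",
    yaxis_range=[0, 100],
)
fig.show()

A panel along these lines surfaces several of the features students asked for in one view: per-assignment performance, the class average for comparison, and the DP threshold needed to qualify for exams.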