Investigating a learning analytics interface for automatically marked programming assessments

dc.contributor.advisor: Suleman, Hussein
dc.contributor.author: Ndenge, Kinsley
dc.date.accessioned: 2022-03-10T09:48:44Z
dc.date.available: 2022-03-10T09:48:44Z
dc.date.issued: 2021
dc.date.updated: 2022-03-08T11:52:22Z
dc.description.abstract: Student numbers at the University of Cape Town continue to grow, with an increasing number of students enrolling in programming courses. With this increase in numbers, it becomes difficult for lecturers to provide individualised feedback on programming assessments submitted by students. To address this, the university uses an automatic marking tool to mark assignments and provide feedback. Students can submit assignments and receive instant feedback on marks allocated or errors in their submissions. This tool saves time, as lecturers spend less time marking, and its instant feedback gives students an opportunity to correct errors in their code. However, many students have identified areas where the interface between the automatic marker and the submitted programs could be improved. This study investigates the potential of a learning analytics-inspired dashboard interface to improve the feedback provided to students on their submitted programs. A focus group consisting of computer science class representatives was organised, and its feedback was used to create dashboard mock-ups. These mock-ups were then used to develop high-fidelity learning analytics-inspired dashboard prototypes, which were tested by first-year computer science students to determine whether the interfaces were useful and usable. The prototypes were built using the Python programming language and the Plotly Python library. User-centred design methods were employed by eliciting continuous feedback from students during the prototyping and design of the learning analytics-inspired interface. A usability study was conducted in which students used the dashboard and then provided feedback on its use by completing a questionnaire designed using Nielsen's Usability Heuristics and AttrakDiff.
These methods also assisted in the evaluation of the dashboard design. The research showed that students considered a learning analytics dashboard an essential tool that could help them as they learn to program. Students found the dashboard useful and had a clear sense of the specific features they would like to see implemented on a learning analytics-inspired dashboard for the automatic marking tool. Features mentioned by students include overall performance, the duly performed (DP) requirements needed to qualify for exams, highest score, assignment due dates, class average score, and most common errors. This research aims to provide insight into how automatically marked programming assessments could be displayed to students in a way that supports learning.
dc.identifier.citation: Ndenge, K. 2021. Investigating a learning analytics interface for automatically marked programming assessments. Faculty of Science, Department of Computer Science. http://hdl.handle.net/11427/36023
dc.identifier.uri: http://hdl.handle.net/11427/36023
dc.language.rfc3066: eng
dc.publisher.department: Department of Computer Science
dc.publisher.faculty: Faculty of Science
dc.subject: Computer Science
dc.title: Investigating a learning analytics interface for automatically marked programming assessments
dc.type: Master Thesis
dc.type.qualificationlevel: Masters
dc.type.qualificationlevel: MPhil
Files

Original bundle
Name: thesis_sci_2021_ndenge kinsley.pdf
Size: 3.05 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 0 B
Description: Item-specific license agreed upon to submission