Investigating a learning analytics interface for automatically marked programming assessments
| dc.contributor.advisor | Suleman, Hussein | |
| dc.contributor.author | Ndenge, Kinsley | |
| dc.date.accessioned | 2022-03-10T09:48:44Z | |
| dc.date.available | 2022-03-10T09:48:44Z | |
| dc.date.issued | 2021 | |
| dc.date.updated | 2022-03-08T11:52:22Z | |
| dc.description.abstract | Student numbers at the University of Cape Town continue to grow, with an increasing number of students enrolling in programming courses. With this increase in numbers, it becomes difficult for lecturers to provide individualised feedback on programming assessments submitted by students. To address this, the university uses an automatic marking tool for marking assignments and providing feedback. Students can submit assignments and receive instant feedback on the marks allocated or the errors in their submissions. This tool saves time, as lecturers spend less time marking, and its instant feedback on submitted code gives students an opportunity to correct errors in their code. However, most students have identified areas where the interface between the automatic marker and the submitted programs could be improved. This study investigates the potential of a learning analytics-inspired dashboard interface to improve the feedback provided to students on their submitted programs. A focus group consisting of computer science class representatives was organised, and feedback from this focus group was used to create dashboard mock-ups. These mock-ups were then used to develop high-fidelity learning analytics-inspired dashboard prototypes that were tested by first-year computer science students to determine whether the interfaces were useful and usable. The prototypes were built using the Python programming language and the Plotly Python library. User-centred design methods were employed by eliciting constant feedback from students during the prototyping and design of the learning analytics-inspired interface. A usability study was conducted in which students were required to use the dashboard and then provide feedback on its use by completing a questionnaire. The questionnaire was designed using Nielsen's Usability Heuristics and AttrakDiff.
These methods also assisted in the evaluation of the dashboard design. The research showed that students considered a learning analytics dashboard an essential tool that could help them as they learn to program. Students found the dashboard useful and had a clear understanding of the specific features they would like to see implemented on a learning analytics-inspired dashboard used by the automatic marking tool. The specific features mentioned by students include overall performance, the duly performed (DP) requirement needed to qualify for exams, highest score, assignment due dates, class average score, and most common errors. This research hopes to provide insight into how automatically marked programming assessments could be displayed to students in a way that supports learning. | |
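The class-level metrics students asked for (class average, highest score, most common errors) would need to be aggregated from submission records before being plotted with Plotly. A minimal Python sketch of that aggregation step, using an assumed record layout (the thesis does not publish its data model, so the field names here are illustrative only):

```python
from collections import Counter
from statistics import mean

# Hypothetical submission records; "student", "score" and "errors" are
# assumed field names, not taken from the thesis.
submissions = [
    {"student": "s1", "score": 72, "errors": ["SyntaxError"]},
    {"student": "s2", "score": 85, "errors": []},
    {"student": "s3", "score": 58, "errors": ["IndentationError", "SyntaxError"]},
]

def dashboard_summary(records):
    """Aggregate the class-level metrics the dashboard would display."""
    scores = [r["score"] for r in records]
    # Flatten every student's error list so errors can be counted class-wide.
    all_errors = [err for r in records for err in r["errors"]]
    return {
        "class_average": round(mean(scores), 1),
        "highest_score": max(scores),
        "most_common_errors": Counter(all_errors).most_common(2),
    }

summary = dashboard_summary(submissions)
```

A summary dictionary like this could then feed a Plotly bar or gauge chart per metric; keeping the aggregation separate from the charting makes the metrics easy to test independently of the visualisation.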
| dc.identifier.apacitation | Ndenge, K. (2021). <i>Investigating a learning analytics interface for automatically marked programming assessments</i>. Faculty of Science, Department of Computer Science. Retrieved from http://hdl.handle.net/11427/36023 | en_ZA |
| dc.identifier.chicagocitation | Ndenge, Kinsley. <i>"Investigating a learning analytics interface for automatically marked programming assessments."</i> Faculty of Science, Department of Computer Science, 2021. http://hdl.handle.net/11427/36023 | en_ZA |
| dc.identifier.citation | Ndenge, K. 2021. Investigating a learning analytics interface for automatically marked programming assessments. Faculty of Science, Department of Computer Science. http://hdl.handle.net/11427/36023 | en_ZA |
| dc.identifier.ris | TY - Master Thesis AU - Ndenge, Kinsley AB - Student numbers at the University of Cape Town continue to grow, with an increasing number of students enrolling in programming courses. With this increase in numbers, it becomes difficult for lecturers to provide individualised feedback on programming assessments submitted by students. To address this, the university uses an automatic marking tool for marking assignments and providing feedback. Students can submit assignments and receive instant feedback on the marks allocated or the errors in their submissions. This tool saves time, as lecturers spend less time marking, and its instant feedback on submitted code gives students an opportunity to correct errors in their code. However, most students have identified areas where the interface between the automatic marker and the submitted programs could be improved. This study investigates the potential of a learning analytics-inspired dashboard interface to improve the feedback provided to students on their submitted programs. A focus group consisting of computer science class representatives was organised, and feedback from this focus group was used to create dashboard mock-ups. These mock-ups were then used to develop high-fidelity learning analytics-inspired dashboard prototypes that were tested by first-year computer science students to determine whether the interfaces were useful and usable. The prototypes were built using the Python programming language and the Plotly Python library. User-centred design methods were employed by eliciting constant feedback from students during the prototyping and design of the learning analytics-inspired interface. A usability study was conducted in which students were required to use the dashboard and then provide feedback on its use by completing a questionnaire. The questionnaire was designed using Nielsen's Usability Heuristics and AttrakDiff.
These methods also assisted in the evaluation of the dashboard design. The research showed that students considered a learning analytics dashboard an essential tool that could help them as they learn to program. Students found the dashboard useful and had a clear understanding of the specific features they would like to see implemented on a learning analytics-inspired dashboard used by the automatic marking tool. The specific features mentioned by students include overall performance, the duly performed (DP) requirement needed to qualify for exams, highest score, assignment due dates, class average score, and most common errors. This research hopes to provide insight into how automatically marked programming assessments could be displayed to students in a way that supports learning. DA - 2021 DB - OpenUCT DP - University of Cape Town KW - Computer Science LK - https://open.uct.ac.za PY - 2021 T1 - Investigating a learning analytics interface for automatically marked programming assessments TI - Investigating a learning analytics interface for automatically marked programming assessments UR - http://hdl.handle.net/11427/36023 ER - | en_ZA |
| dc.identifier.uri | http://hdl.handle.net/11427/36023 | |
| dc.identifier.vancouvercitation | Ndenge K. Investigating a learning analytics interface for automatically marked programming assessments. Faculty of Science, Department of Computer Science, 2021 [cited yyyy month dd]. Available from: http://hdl.handle.net/11427/36023 | en_ZA |
| dc.language.rfc3066 | eng | |
| dc.publisher.department | Department of Computer Science | |
| dc.publisher.faculty | Faculty of Science | |
| dc.subject | Computer Science | |
| dc.title | Investigating a learning analytics interface for automatically marked programming assessments | |
| dc.type | Master Thesis | |
| dc.type.qualificationlevel | Masters | |
| dc.type.qualificationname | MPhil | |