An outcome evaluation of the LifeMatters Foundation's Numeracy Programme

dc.contributor.advisor: Louw-Potgieter, Joha (en_ZA)
dc.contributor.author: Hardwick, Nick (en_ZA)
dc.date.accessioned: 2017-09-06T07:09:25Z
dc.date.available: 2017-09-06T07:09:25Z
dc.date.issued: 2017 (en_ZA)
dc.description.abstract: This dissertation presents an outcome evaluation of the LifeMatters Foundation Numeracy Programme. This programme focuses on strengthening the foundational numerical skills of its participants, in this case a group of Grade 2 learners from two schools in the Western Cape area. In total, these two schools had five Grade 2 classes, which constituted the sample. While this programme has run before, the LifeMatters Foundation decided to redesign it and run a new pilot programme in 2016. This dissertation focuses on the evaluation of that pilot programme, with the goal of answering two outcome questions. The first question examined whether the programme participants' foundational numerical skills improved by the end of the programme, and whether they improved more than the skills of the comparison class. The comparison class for this evaluation was made up of 12 learners from one class who met the criteria for selection but did not receive the intervention. From each of the other four classes, the 12 weakest learners were selected based on the results of a class-based assessment administered by the teachers. In total, therefore, the evaluation included 60 participants. The second question examined whether programme dosage, or the amount of attendance, was a significant contributor to the improvement of participants' numerical skills. As the programme was conducted over the course of the year, this question sought to control for the impact of maturation on the results and to identify a programme effect. Secondary data, provided by the LifeMatters Foundation, were used to answer the two evaluation questions. These data consisted of the participants' results on eight measurements conducted throughout the year. These measurements were standardised tests, known as Formal Assessment Tasks, designed by the Western Cape Education Department.
The data analysis methods included descriptive and inferential statistics for learners' performance and average programme dosage, a repeated-measures ANOVA with a between-subjects factor for the differences between classes on each measurement, and a linear regression model for determining the effect of programme dosage on learners' final year mark. Results showed that two of the four classes were significantly different from the comparison class. Furthermore, the analysis revealed that, on average, the programme was not having the desired effect on the learners' performance. These results must be interpreted with caution, as there was an issue of overcoverage in the programme. Overcoverage refers to the proportion of programme participants who should not have been included, relative to the total number of participants. More than half of the participants should not have been included in the programme, as they were far more academically advanced than the rest of the participants. To address this, it is recommended that the LifeMatters Foundation develop a selection measure that is standardised, valid, and reliable. The second evaluation question dealt with the impact of programme dosage on the overall final mark; average attendance was approximately 50%, and attendance had no significant effect on the final year mark. It is suggested that the attendance requirements be re-evaluated, since the low attendance rates contributed to the weak programme effect. The evaluation was limited by the lack of an adequate baseline comparison between the groups, and by poor control of maturation, a threat to internal validity, owing to the low attendance. Despite these limitations, the evaluation has provided useful information for programme improvement, and if the recommendations are followed, further evaluations will provide more conclusive results regarding the programme effect. (en_ZA)
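The dosage analysis described in the abstract (a linear regression of final year mark on attendance) can be sketched as follows. This is an illustrative reconstruction, not the evaluator's actual code: the function name `dosage_effect` and the sample data are hypothetical, and a real analysis would use the 60 participants' secondary data.

```python
import numpy as np

def dosage_effect(attendance, final_mark):
    """Fit final_mark = b0 + b1 * attendance by ordinary least squares.

    Returns the slope (effect of dosage), the intercept, and R-squared
    (proportion of variance in final mark explained by attendance).
    """
    attendance = np.asarray(attendance, dtype=float)
    final_mark = np.asarray(final_mark, dtype=float)
    # Degree-1 polynomial fit is equivalent to simple linear regression.
    slope, intercept = np.polyfit(attendance, final_mark, 1)
    predicted = intercept + slope * attendance
    ss_res = np.sum((final_mark - predicted) ** 2)   # residual sum of squares
    ss_tot = np.sum((final_mark - final_mark.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    return slope, intercept, r_squared

# Hypothetical data: attendance proportion and final year mark for six learners.
attendance = [0.30, 0.45, 0.50, 0.55, 0.60, 0.70]
marks = [42, 48, 50, 49, 55, 58]
slope, intercept, r2 = dosage_effect(attendance, marks)
```

With attendance averaging around 50%, as reported, the regression slope would be estimated from a restricted range of dosage values, which is one reason a dosage effect can be hard to detect.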
dc.identifier.apacitation: Hardwick, N. (2017). <i>An outcome evaluation of the LifeMatters Foundation's Numeracy Programme</i>. (Thesis). University of Cape Town, Faculty of Commerce, Institute for Monitoring and Evaluation. Retrieved from http://hdl.handle.net/11427/25077 (en_ZA)
dc.identifier.chicagocitation: Hardwick, Nick. <i>"An outcome evaluation of the LifeMatters Foundation's Numeracy Programme."</i> Thesis, University of Cape Town, Faculty of Commerce, Institute for Monitoring and Evaluation, 2017. http://hdl.handle.net/11427/25077 (en_ZA)
dc.identifier.citation: Hardwick, N. 2017. An outcome evaluation of the LifeMatters Foundation's Numeracy Programme. University of Cape Town. (en_ZA)
dc.identifier.ris: TY - Thesis / Dissertation AU - Hardwick, Nick DA - 2017 DB - OpenUCT DP - University of Cape Town LK - https://open.uct.ac.za PB - University of Cape Town PY - 2017 T1 - An outcome evaluation of the LifeMatters Foundation's Numeracy Programme TI - An outcome evaluation of the LifeMatters Foundation's Numeracy Programme UR - http://hdl.handle.net/11427/25077 ER - (en_ZA)
dc.identifier.uri: http://hdl.handle.net/11427/25077
dc.identifier.vancouvercitation: Hardwick N. An outcome evaluation of the LifeMatters Foundation's Numeracy Programme. [Thesis]. University of Cape Town, Faculty of Commerce, Institute for Monitoring and Evaluation, 2017 [cited yyyy month dd]. Available from: http://hdl.handle.net/11427/25077 (en_ZA)
dc.language.iso: eng (en_ZA)
dc.publisher.department: Institute for Monitoring and Evaluation (en_ZA)
dc.publisher.faculty: Faculty of Commerce (en_ZA)
dc.publisher.institution: University of Cape Town
dc.subject.other: Programme Evaluation (en_ZA)
dc.title: An outcome evaluation of the LifeMatters Foundation's Numeracy Programme (en_ZA)
dc.type: Master Thesis
dc.type.qualificationlevel: Masters
dc.type.qualificationname: MPhil (en_ZA)
uct.type.filetype: Text
uct.type.filetype: Image
uct.type.publication: Research (en_ZA)
uct.type.resource: Thesis (en_ZA)
Files
Original bundle
Name: thesis_com_2017_hardwick_nick.pdf
Size: 2.1 MB
Format: Adobe Portable Document Format