An outcome evaluation of the LifeMatters Foundation's Numeracy Programme

Master's Thesis

2017

University of Cape Town

Abstract
This dissertation presents an outcome evaluation of the LifeMatters Foundation Numeracy Programme. The programme focuses on strengthening the foundational numerical skills of its participants, in this case Grade 2 learners from two schools in the Western Cape. In total, these two schools had five Grade 2 classes, which constituted the sample. While the programme had run before, the LifeMatters Foundation redesigned it and ran a new pilot programme in 2016. This dissertation evaluates that pilot programme with the goal of answering two outcome questions. The first question examined whether the programme participants' foundational numerical skills improved by the end of the programme, and whether they improved more than the skills of a comparison class. The comparison class consisted of 12 learners from one class who met the criteria for selection but did not receive treatment. In each of the other four classes, the weakest 12 learners were selected based on the results of a class-based assessment administered by the teachers. The evaluation therefore included 60 participants in total. The second question examined whether programme dosage, measured as attendance, was a significant contributor to the improvement of participants' numerical skills. As the programme ran over the course of the year, this question sought to control for the impact of maturation on the results and to isolate a programme effect. Secondary data provided by the LifeMatters Foundation were used to answer the two evaluation questions. These data consisted of the participants' results on eight measurements conducted throughout the year: standardised tests, known as Formal Assessment Tasks, designed by the Western Cape Education Department.
The data analysis methods included descriptive and inferential statistics for learners' performance and average programme dosage, a repeated measures ANOVA with a between-subjects factor for the differences between classes on each measurement, and a linear regression model for determining the effect of programme dosage on learners' final year mark. Results showed that two of the four treatment classes differed significantly from the comparison class. Furthermore, the analysis revealed that, on average, the programme was not having the desired effect on learners' performance. These results must be interpreted with caution, as the programme suffered from overcoverage: the proportion of participants who should not have been included relative to the total number of participants. More than half of the participants should not have been included, as they were far more academically advanced than the rest. To address this, it is recommended that the LifeMatters Foundation develop a standardised, valid, and reliable selection measure. The second evaluation question dealt with the impact of programme dosage on the final year mark; average attendance was approximately 50%, and no significant effect of attendance on the final year mark was found. It is suggested that the attendance requirements be re-evaluated, as the low attendance rates contributed to the weak programme effect. The evaluation was limited by the lack of an adequate comparison of groups at baseline, and by poor control of maturation, a threat to internal validity, owing to the low attendance. Despite these limitations, the evaluation has provided useful information for programme improvement, and if the recommendations are followed, further evaluations should provide more conclusive results on programme effect.
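The second evaluation question, testing whether attendance predicted the final year mark, can be illustrated with a simple linear regression. The sketch below is not the evaluation's actual code or data: the attendance and mark values are fabricated purely for demonstration, and SciPy's `linregress` stands in for whatever software the analysis actually used.

```python
# Illustrative sketch only: regressing final year mark on programme
# attendance, mirroring the dissertation's second evaluation question.
# All data below are fabricated; they are not the evaluation's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical attendance percentages, centred near the reported ~50% average
attendance = rng.uniform(20.0, 80.0, size=48)
# Hypothetical final year marks generated independently of attendance
final_mark = rng.normal(55.0, 10.0, size=48)

result = stats.linregress(attendance, final_mark)
print(f"slope={result.slope:.3f}, p-value={result.pvalue:.3f}")
```

A p-value above the conventional 0.05 threshold in such a model would correspond to the evaluation's finding that attendance was not a significant predictor of the final year mark.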