Browsing by Subject "Exercise and Sports Physiotherapy"
- Item (Open Access): A comparison of muscle damage, soreness, morphology, T2 changes and running performance following an ultramarathon race (2016). Van Niekerk, Wanda; Burgess, Theresa. Background: Exercise-induced muscle damage collectively describes the response to strenuous or unaccustomed exercise. It is well-established that endurance running causes muscle damage. Indirect indicators of muscle damage include the loss of muscle strength, increased blood levels of muscle proteins such as creatine kinase, and delayed onset muscle soreness. Magnetic resonance imaging (MRI) has been used to gain insight into the underlying mechanisms associated with exercise-induced muscle damage. The most common approach has focused on changes in transverse (T2) relaxation times after exercise. Given that inflammation and oedema are proposed as reasons for the changes in T2 times, there may also be changes in morphological measurements such as muscle volume and peak cross-sectional area. Few studies have used MRI morphological measurements to assess the effects of exercise-induced muscle damage, and there is a lack of evidence regarding changes in muscle morphology after endurance running. Aim: The aim of this study was to investigate changes in transverse (T2) relaxation times and muscle morphology in endurance runners after a 90 km ultramarathon race.
Specific objectives: (a) to determine the time course of recovery of muscle pain and plasma creatine kinase activity after a 90 km ultramarathon race; (b) to determine changes in 5 km time trial performance in an experimental group of endurance runners that took part in a 90 km ultramarathon race compared to a control group of endurance runners that did not; (c) to compare changes in muscle morphology (volume and average cross-sectional area) and T2 relaxation times of the quadriceps and hamstrings between the experimental and control groups; and (d) to evaluate potential relationships between indicators of muscle damage (plasma creatine kinase levels and muscle pain measurements), morphological muscle changes, and T2 relaxation times in both groups. Methods: This was a descriptive, correlational study that involved secondary analysis of previously collected data. No new participants were recruited for the study. Participants were allocated to groups based on whether they took part in a 90 km ultramarathon. The experimental group (n = 11) completed a 90 km ultramarathon. The control group (n = 11) consisted of endurance runners who ran a minimum of 60 km per week but did not take part in the ultramarathon. Magnetic resonance images were taken seven days before and 10-15 days after the ultramarathon as part of an earlier study. The magnetic resonance image analysis included the digital segmentation and reconstruction of the rectus femoris, combined quadriceps and combined hamstrings muscle groups. Muscle volume, peak cross-sectional area and T2 relaxation times were calculated.
These measurements were correlated with muscle pain and plasma creatine kinase activity measurements obtained during the initial study. Results: There was a significant difference in hamstrings muscle volume between the experimental and control groups: the experimental group had a significantly lower muscle volume than the control group (p = 0.03). There was also a significant positive relationship between T2 relaxation time and plasma creatine kinase (CK) activity (r = 0.74; p = 0.04). Conclusion: Changes in muscle morphology in endurance runners are evident after a 90 km ultramarathon. The significant relationship between T2 relaxation times and plasma CK activity suggests that T2 relaxation time may be used as a non-invasive indicator of exercise-induced muscle damage.
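The r = 0.74 reported above is a Pearson product-moment correlation between paired T2 and plasma CK measurements. A minimal sketch of that computation; the paired values below are invented for illustration only and are not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired observations: T2 relaxation times (ms) and CK (U/L)
t2 = [32.1, 33.4, 35.0, 34.2, 36.8, 33.9]
ck = [180, 260, 410, 350, 520, 300]
print(round(pearson_r(t2, ck), 2))  # strong positive correlation
```

A significance test (the p-value reported in the abstract) would additionally require the t-distribution with n - 2 degrees of freedom.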
- Item (Open Access): Cross-sectional study to determine whether there are central nervous system changes in rugby players who have sustained recurrent ankle injuries (2017). Rawlinson, Alice Jane; Parker, Romy; Burgess, Theresa. Background: Rugby is a popular game played around the world and has one of the highest recorded injury rates in sport. The literature identifies the ankle as one of the most commonly injured areas in sport, and this trend carries through to rugby, with lateral ankle sprains predominating. Recurrent ankle injuries are commonly reported in the literature and account for a high economic and social burden. Many intrinsic and extrinsic risk factors have been credited with causing lateral ankle injuries, but to date the literature does not show conclusive evidence for the management and prevention of recurrent injuries. A new area of research that has not previously been explored is the neurological influence on recurrent injury. Central processing is a recognised form of learning seen in adults and children during normal development and training, and more recently acknowledged in injury settings. This phenomenon has also been seen in abnormal states of development such as neglect and chronic pain. Aim: The purpose of this study was to investigate whether there are changes in the central nervous system of rugby players with recurrent ankle injuries. Methods: An experimental and a control group were used for this cross-sectional study. Participants were recruited from the Golden Lions Rugby Union. Forty-six players in total were recruited: the control group consisted of 22 players, and the recurrent injury group of 24 players. A Medical and Sports History Questionnaire was administered, as well as a battery of four physical test procedures.
The questionnaire asked participants to provide information regarding demographics, playing position, training and playing history, current general health, current and previous injury history, and specifically ankle injury history. The four testing procedures were: body image testing, laterality testing, two-point discrimination testing and pressure-pain threshold testing. Results: Between-group and within-group comparisons were made for the control and recurrent injury groups. The Medical and Sports History Questionnaire results indicated that the recurrent injury group participated in a significantly shorter preseason training period than the control group. The within-group laterality analysis showed a significant difference: the injured side had a slower recognition time [1.4 (1.3-1.6)] compared to the uninjured side [1.3 (1.15-1.5); p < 0.01]. Pressure-pain threshold testing produced a significant difference for the control group at the anterior talofibular ligament (ATFL) and posterior talofibular ligament (PTFL) test sites, and the PTFL site also demonstrated a significant difference in the between-group comparison. The two-point discrimination tests, performed within-group on both the recurrent injury group and the control group, showed significant differences at the ATFL between the affected and non-affected limbs. The between-group results were also significant for the injured versus control side at the ATFL site. The affected side showed a poorer ability to differentiate between one and two points, needing a larger area before two points were distinguished from one. Similarly, body image testing showed significant differences in the within-group comparison of total area drawn for the recurrent injury group only.
In the recurrent injury group, the drawing of the affected foot was significantly larger than the drawing of the unaffected side; the control group showed no differences between sides. Conclusion: The study suggests a relationship between central nervous system changes and recurrent ankle injuries in this sample of professional rugby players. The data indicate that preseason length is a factor to be considered in recurrent ankle injuries. The clinical tests focused specifically on central nervous system changes also produced illuminating results: the recurrent injury group demonstrated significant differences between injured and uninjured sides in both two-point discrimination testing at the ATFL and in body image drawing of the foot and ankle, whereas the control group did not yield any differences between sides on these same tests. The significant pressure-pain and laterality results further indicate central nervous system involvement in recurrent injury.
- Item (Open Access): Differences in Technical Contact Performance Between Pool and Knockout Stages in Men's International Rugby Sevens (2022). de Klerk, Stephanus; Hendricks, Sharief. Introduction: Rugby sevens is a high-intensity, intermittent, collision field sport requiring a combination of physical fitness and technical and tactical ability. Research on the running demands of rugby sevens matches has been synthesised to inform training and practice. In contrast, little research is available on the technical contact demands, and less is known about the technical performances of successful teams. Therefore, the first part of this thesis conducted a systematic review of the literature on tackle and/or ruck frequencies within rugby sevens matches to understand the technical contact demands of the game. The second part of this thesis is an original study that retrospectively analysed and compared tackle and ruck events between the pool and knockout stages of one full season of the 2018/2019 International Men's Rugby Sevens World Series. Methods: For part one, a systematic search according to the PRISMA guidelines was performed on three electronic databases. The keyword combinations included "Rugby Sevens" OR "Rugby" AND "Sevens" OR "Sevens" AND "Contact Demands". The initial search across the databases retrieved 812 titles. Abstracts and full-text articles that presented quantitative data on tackle and/or ruck frequencies or rates within a given match or tournament were included. After the screening process, a total of 15 articles were included in the final review. For the second part of the thesis, all matches from the 2018/2019 International Men's Rugby Sevens World Series were analysed for tackle and ruck events using Sports Code elite version 6.5.1. This equated to 21 226 tackle events and 6 345 ruck events across 450 matches.
Results: The systematic review found that mean rucks per match ranged from 7.1±4.6 (mean±SD) to 9.5±4.5 for winning teams and from 7.6±3.7 to 11.1±4.6 for losing teams at men's elite level. From a tackle demands perspective, studies at men's elite level found mean tackles per match of 20.3±6.7 for winning teams and 20.4±6.1 for losing teams. In the original study, the mean tackles per match were 47.2 (95% CI 46.4-48.0) across the season, with no significant difference between the pool and knockout stages of the tournaments. The mean rucks per match were 14.1 (95% CI 13.7-14.5) across the season, with a significant difference between the stages of competition (p < 0.001) (pool 14.8, 95% CI 14.2-15.4 vs knockout 13.3, 95% CI 12.7-13.9). Tackle variables that proved significant for tackle outcomes in pool matches included the type of tackle, point of body contact, tackle sequence, attacker intention, and match rank. For knockout matches, only point of body contact and attacker intention proved significant. Discussion: The systematic review provides a synthesis of the current state of technical contact demands in rugby sevens. The next step was to understand contact performance and identify the determinants of contact success in rugby sevens. In the original study, the pool and knockout stages had similar tackle frequencies but dissimilar ruck frequencies, with more rucks occurring in the pool stages. Higher-ranked teams and teams progressing to the knockout stages had fewer rucks and successful tackles, suggesting that these teams were more proficient at evasive play with regard to contact performance. Practitioners and coaches can use this information to plan contact training and optimise tournament preparation in the Sevens World Series. Together, the systematic review and original study give insight into the contact demands and performance of a rugby sevens match.
With stakeholder involvement, this research has the potential to inform innovative injury prevention and performance strategies that could be implemented across all rugby sevens platforms.
- Item (Open Access): Does a greater training load increase the risk of injury and illness in ultramarathon runners? A prospective, descriptive, longitudinal design (2020). Craddock, Nicole; Burgess, Theresa; Lambert, Mike; Buchholtz, Kim. Background: Ultramarathon running has become extremely popular over the years. Despite the numerous health benefits of running, it also carries negative effects such as an increased risk of musculoskeletal injury and illness. Training loads imposed on an athlete should induce positive physiological adaptations that improve performance, and monitoring an athlete's training load has become extremely important for injury prevention. Currently, the relationship between training loads and injury and illness incidence is uncertain, and more research is needed in this field to minimise the risk of injury and illness and maximise performance in ultramarathon runners. Aim: To determine whether there are any associations between injury and illness incidence and training loads among ultramarathon runners in the 12-week period preceding an ultramarathon event and the four-week period after the event. Specific Objectives:
- To describe the incidence rate of overall and region-specific running-related injuries in a population of ultramarathon runners in the 16-week period surrounding an ultramarathon event.
- To describe the incidence rate of illness and illness-related symptoms in the same population over the 16-week period.
- To describe the weekly and cumulative training parameters (training volume, frequency, intensity and duration) of the injured and uninjured groups and the ill and healthy groups over the 16-week period.
- To describe the weekly and cumulative absolute training load parameters (internal load, external load) of the injured and uninjured groups and the ill and healthy groups over the 16-week period.
- To describe the weekly relative training load parameters (acute:chronic workload ratio; ACWR) of the injured and uninjured groups and the ill and healthy groups over the 16-week period.
- To determine whether there are any significant differences between the injured and uninjured groups and the ill and healthy groups with regard to: a) mean training parameters; b) mean internal training load; and c) mean external training load, over the 16-week period.
- To identify any significant associations between: a) absolute training load (internal and external training load) and injury and illness incidence; and b) relative training load and injury and illness incidence over the 16-week period.
Methods: A prospective, descriptive, longitudinal study was conducted in runners who were training for the 2019 Two Oceans Ultramarathon. One hundred and nineteen participants were recruited and tracked over a period of 16 weeks (12 weeks leading up to the Two Oceans Ultramarathon and four weeks afterwards). Data were collected once a week via an online logbook. Training parameters measured included weekly average running distance, average duration, average frequency and average session RPE. Injury data included injury counts, the structure injured, the main anatomical location and time-loss from injury. Illness data included illness counts, the main illness-related symptoms and time-loss from illness. Results: The overall injury incidence proportion was 31%, with the highest weekly injury proportion (7%) in the week after the ultramarathon race. The overall injury incidence was 5 per 1000 training hours, and the average time-loss due to injury was three missed training sessions. The overall illness incidence proportion was 66%, with the highest weekly illness proportion (22%) also in the week after the race. The overall illness incidence was 16 per 1000 training days, and the average time-loss due to illness was three training sessions.
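The incidence figures above are exposure-normalised rates. A minimal sketch of how such rates are conventionally computed; the cohort counts and exposure total below are hypothetical, chosen only to echo the reported magnitudes:

```python
def incidence_per_1000(n_events: int, exposure: float) -> float:
    """Events per 1000 units of exposure (training hours or training days)."""
    return 1000.0 * n_events / exposure

def incidence_proportion(n_affected: int, n_participants: int) -> float:
    """Proportion of the cohort affected at least once during follow-up."""
    return n_affected / n_participants

# Hypothetical cohort: 119 runners, 37 injured, 7400 total training hours
print(round(incidence_per_1000(37, 7400), 1))      # injuries per 1000 h
print(round(100 * incidence_proportion(37, 119)))  # incidence proportion, %
```

The same helper applies to illness by switching the exposure denominator from training hours to training days.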
A moderate, significant negative association was found between external training load and injury (r = -0.56; p = 0.025). No associations were found between internal training load and injury, or between internal or external training load and illness. A significant relationship was found between external training load and injury incidence in weeks 5 to 8 for participants who ran less than 30 km per week, and between external training load and illness incidence in weeks 5 to 8, 9 to 12 and 13 to 16 for the same group. A significant relationship was found between an ACWR of >1.5 and injury incidence in weeks 1 to 4, 5 to 8 and 13 to 16, and between an ACWR of <0.5 and illness incidence in weeks 13 to 16. In conclusion, a lower training load could potentially predispose runners to running-related injuries or the development of illness. Specifically, a weekly mileage of less than 30 km may increase the risk of sustaining an injury or illness when training for an ultramarathon event. An ACWR greater than 1.5 may increase the risk of injury in the subsequent week of training, and an ACWR less than 0.5 may increase the risk of illness in the following week. Non-gradual changes to weekly training load, whether increases or decreases, could increase the risk of incurring a running-related injury or illness. Maintaining an ACWR between 0.5 and 1.5 appears optimal for minimising the risk of sustaining a running-related injury or illness. We therefore recommend using both absolute and relative workloads when monitoring an athlete's training load, with the aim of minimising injury and illness risk and maximising performance in ultramarathon runners.
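The ACWR discussed above is most commonly defined as the most recent week's load divided by the rolling four-week average load. A minimal sketch under that common definition; the weekly loads and the `flag` helper are illustrative assumptions, not taken from the study:

```python
def acwr(weekly_loads):
    """ACWR for the latest week, given at least 4 weekly load values
    (e.g. session RPE x duration, or weekly distance in km)."""
    acute = weekly_loads[-1]                 # most recent week
    chronic = sum(weekly_loads[-4:]) / 4     # rolling 4-week average
    return acute / chronic

def flag(ratio, low=0.5, high=1.5):
    """Flag ratios outside the 0.5-1.5 band the study found optimal."""
    if ratio > high:
        return "spike: elevated injury risk"
    if ratio < low:
        return "trough: elevated illness risk"
    return "within band"

loads = [50, 52, 48, 80]  # hypothetical weekly distances in km
r = acwr(loads)
print(round(r, 2), flag(r))  # 1.39 within band
```

Note that this coupled definition includes the acute week in the chronic average; some monitoring practices instead use an uncoupled rolling average of the preceding weeks.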
- Item (Open Access): Does the use of upper leg compression garments aid performance and reduce post-race Delayed Onset Muscle Soreness (DOMS)? (2021). Kabongo, Ken; Bosch, Andrew. Introduction: Despite the lack of scientific knowledge on the physiological and biomechanical effects of wearing compression garments, their use in endurance running has increased. The purpose of this study was to compare performance, pain and thigh circumference changes in endurance runners using upper leg compression garments with those of runners who did not use compression garments in the same marathon race. Methods: A randomised controlled intervention study was conducted in endurance runners (n = 18) participating in the 2019 Winelands Marathon (42.2 km). The compression garment group (n = 10) ran the race wearing upper leg compression garments, while the control group (n = 8) did not; participants in the compression garment group wore the garments during the marathon only. Various outcome measures of perceived exercise-induced muscle damage (EIMD) and running performance were assessed three days before, immediately post-race and two days post-race. Three days prior to the race, mid-thigh circumference measurements were performed. Immediately post-race, mid-thigh circumference measurements, Visual Analogue Scale (VAS) pain ratings and a Likert scale for determination of muscle soreness were assessed, and race performance times were recorded. Two days post-race, the mid-thigh circumference measurements, VAS pain ratings and Likert scale assessments were repeated.
Results: VAS pain ratings for the hamstring (compression garment 2.50 vs control 4.00; p = 0.04), knee flexion (2.50 vs 5.00; p = 0.02) and hip extension (2.50 vs 4.00; p = 0.04) differed significantly between the compression garment and control groups immediately post-race. Two days post-race, VAS pain ratings for the hamstring (0.00 vs 1.00; p = 0.04), knee flexion (1.00 vs 2.00; p = 0.02) and hip extension (1.00 vs 2.50; p = 0.04) again differed significantly between groups. There were no statistically significant differences in any other outcome measures (i.e. the Likert scale for determination of muscle soreness, mid-thigh circumference and race performance) between the groups. Conclusion: Upper leg compression garments appear to act as a recovery ergogenic aid, improving VAS pain ratings post-race, and the results suggest a protective effect on the hamstring muscle during the recovery phase. However, given that there were no statistically significant differences in race performance or thigh circumference measures, this minor difference during recovery may be of little practical advantage.
- Item (Open Access): Gastrocnemius muscle structure and function in habitually resistance-trained marathon runners and traditionally running-trained marathon runners: a comparative analysis (2017). Ellis, Tracy; Burgess, Theresa; Buchholtz, Kim. Background: Marathon running involves running long distances and is associated with a high prevalence of running-related injuries. The calf has been identified as one of the most commonly injured structures during running. Running training overloads muscle and stimulates a physiological adaptation, creating a training response; specific adaptations in the metabolic and physiological function of a muscle may be further achieved through specificity of exercise training. Resistance training programmes are commonly implemented to enhance specific muscle strength and endurance, and are effective methods of performance enhancement and injury prevention. While evidence-based guidelines for resistance training exist, it is unclear whether runners routinely incorporate evidence-based resistance training into marathon training programmes. If runners are performing habitual resistance training, it is also unknown whether that training is of sufficient magnitude or intensity to induce dose-related responses in calf muscle structure or function. Aim: The aim of this study was to evaluate gastrocnemius muscle structure and function in marathon runners who performed habitual resistance training in addition to regular endurance training, compared to marathon runners who performed traditional endurance running training only. Specific Objectives:
• To describe the demographic and training characteristics of habitually resistance-trained marathon runners and traditionally running-trained marathon runners.
• To determine whether there were differences in gastrocnemius endurance, power and flexibility between habitually resistance-trained marathon runners and traditionally running-trained marathon runners.
• To evaluate whether there were differences in gastrocnemius muscle structure and architecture between habitually resistance-trained marathon runners and traditionally running-trained marathon runners.
• To establish whether there were any differences in the number of calf injuries sustained by habitually resistance-trained marathon runners and traditionally running-trained marathon runners.
Methods: Healthy male runners between 20 and 50 years of age were included in the study. Participants were required to have completed at least one marathon in the 12-month period prior to the study. Runners forming the "traditionally running-trained" group were required to be participating in regular endurance running training only, while runners in the "habitually resistance-trained" group were required to be performing resistance training in addition to regular endurance running training. Runners with any injury at the time of recruitment, or who reported a calf injury within the six-month period prior to the study, were excluded, as were participants with any medical abnormalities detected during screening. Eight marathon runners participating in habitual resistance training plus standard running training and eleven marathon runners participating in traditional running training only were recruited. Runners who met the criteria attended two testing sessions at least three days apart. During the first session, informed consent was obtained and the Physical Activity Readiness Questionnaire (PAR-Q) was completed to ensure participants could safely complete physical testing. A questionnaire was completed to determine relevant training and injury history. Body mass, height and the sum of seven skinfolds were recorded. Muscle architecture measurements, including fascicle length, pennation angle, thickness and volume, were performed via ultrasound imaging. Participants were then familiarised with the physical testing procedures.
In the second testing session, calf muscle flexibility and endurance were assessed, and isokinetic testing was performed for the left and right triceps surae. Results: There were no significant differences in descriptive characteristics between groups. Participants in the habitually resistance-trained group performed an average of two hours (range 0.5-2.5 hours) of resistance training across one to four sessions per week, combining upper and lower body training in the form of circuit training, body weight training, and core and proprioceptive training. Resistance training sessions were performed at varied load intensities (light to high) according to an estimated 1RM. Participants in the habitually resistance-trained group had completed a significantly greater number of 21.1 km races than the traditionally running-trained group (p < 0.05), but there were no other differences in running training or competition history between groups. There were also no significant differences in the number of reported injuries between groups. Average pennation angle was significantly greater in the habitually resistance-trained group than in the traditionally running-trained group (p < 0.05); no other significant differences in architectural measurements were identified. There were no significant differences in calf muscle flexibility, strength, power or endurance between the two groups. However, the small sample size limits the interpretation of the study findings. Conclusion: Wide variability in habitual resistance training patterns was identified. While pennation angle was significantly greater in the habitually resistance-trained group, no differences in any other architectural measurements, or in calf muscle strength, power, endurance or flexibility, were identified between groups.
However, one of the key findings emerging from this study is the variability of resistance training practices in endurance runners, and that these practices were not aligned with current evidence-based guidelines for resistance training. Resistance training has a critical role in enhancing endurance running performance, injury prevention and rehabilitation. Future research should investigate the knowledge, attitudes and practices of endurance runners regarding resistance training, to facilitate the development of appropriate education interventions and to effectively disseminate evidence-based training guidelines to lay communities.
- Item (Open Access): Identifying risk factors contributing to the development of shoulder pain and injury in male, adolescent water polo players (2020). Jameson, Yale; Gray, Janine; Roche, Stephen. Water polo is a fast-growing adolescent sport that consists of swimming, defending and overhead shooting in an aquatic environment. The high demands placed on the shoulder by these tasks are proposed to cause the high injury incidence reported in the sport. The novelty of this research rests in its clinically valuable contribution to understanding shoulder injury aetiology in adolescent water polo players as overhead throwing athletes. The overall research aim of this thesis was to explore the musculoskeletal profile of the male adolescent water polo player's shoulder and the intrinsic factors associated with shoulder injury risk. An overview of the literature (Chapter 2) explores the biomechanics of water polo, including swimming and overhead throwing; the musculoskeletal adaptations of overhead throwing in water polo compared to other overhead sports; and the epidemiology of shoulder injury in water polo players relative to other overhead sports. Due to the absence of a consensus-based definition of injury in water polo, comparison of existing quality epidemiological studies in the sport was limited. Additionally, although a limited number of studies have proposed potential risk factors for shoulder injury in water polo players, significant correlations are yet to be found. As with other overhead sports, the water polo shoulder is prone to injury due to the generation of high force during a modified upright swimming posture, a repetitive swimming stroke and overhead throwing at high velocities. Male adolescent water polo players were recruited for this study. Chapter 3 describes the adolescent water polo player's shoulder musculoskeletal profile and its association with shoulder injury prevalence throughout a single water polo season.
The musculoskeletal variables included pain provocation, range of motion, strength, flexibility and shoulder stability tests, which have been used previously in overhead athletes to investigate injury prevention and performance. There were three steps in the data collection process. Firstly, informed consent and assent were obtained, and demographic, competition, training and injury history, together with a shoulder-specific functional questionnaire, were acquired from participants. Secondly, a battery of pre-season musculoskeletal tests was performed, comprising anthropometry, pain provocation, glenohumeral and upward scapular range of motion, glenohumeral and scapular muscle strength, glenohumeral flexibility and shoulder stability measurements. Thirdly, at the end of the season, participants completed an injury report and training load questionnaire. Participants who experienced shoulder pain, with or without medical management, were categorised into the injury group, and those who did not were categorised as uninjured. Chapter 3 documents the adolescent water polo players' shoulder musculoskeletal profile, shoulder injury prevalence and the association between these intrinsic risk factors and injury. Specifically, adolescent water polo players presented with significant side-to-side asymmetry in the lower trapezius (p = 0.01), upward scapular rotation ROM at 90° glenohumeral elevation (p = 0.03), glenohumeral internal and external rotation ROM (p = 0.01), glenohumeral internal and external rotation strength (p = 0.05 and p = 0.01 respectively) and the pectoralis minor index (p = 0.01). Twenty-four participants (49%) sustained a shoulder injury during the season, with the dominant shoulder more commonly affected (54.2%). The most common aggravating factors were throwing (41.7%) and shooting (20.8%).
Although significantly lower scores on the pre-season shoulder-specific functional questionnaire (p = 0.01) and significantly greater upward scapula rotation at 90° glenohumeral elevation (p = 0.01) on the dominant shoulder were found in the injured group compared to the uninjured group, no factors were significantly associated with increased injury risk. In conclusion, the findings suggest that male adolescent water polo players are a high-risk population for shoulder injury. It is suggested that improving players', coaches' and parents' health literacy, particularly regarding the shoulder, and incorporating preventative exercises targeting modifiable risk factors and side-to-side asymmetry into pre-season conditioning programmes may reduce the prevalence of shoulder injury in this sporting population. While this research contributes to the epidemiology of shoulder injuries in water polo players, further research is needed to continue to report on injury incidence and associated risk factors, particularly training and workload characteristics, in the water polo population.
- ItemOpen AccessIncidence of musculoskeletal injuries in professional dancers(University of Cape Town, 2020) Brooker, Heather; Buchholtz, Kim; Burgess, TheresaBackground: Professional ballet dancers rely on high levels of discipline, perfection and mobility to achieve the fluid, controlled lines of movement presented on the stage. Dancers undergo long hours of strenuous, repetitive training, which increases the risk of developing overuse or traumatic injuries and may compromise the longevity of dancers' careers. Relevant research, particularly in the South African context, is needed to provide recommendations on the intrinsic and extrinsic factors contributing to musculoskeletal injuries in professional ballet dancers. Aim: The aim of this study was to determine the incidence of musculoskeletal injuries and their associated risk factors over a three-month period in adult female professional ballet dancers in South Africa. Specific Objectives: The specific objectives of this study were: • To determine the incidence of traumatic and overuse injuries per 1000 dance hours over a three-month training and performance period in South African female professional ballet dancers; • To determine the relationships between a) Functional Lower Extremity Evaluation (FLEE) scores and injury incidence; b) intrinsic factors (amenorrhoea; body mass index; skinfold measurements; caloric intake) and injury incidence; and c) extrinsic factors (training hours; performance hours) and injury incidence respectively, in South African female professional ballet dancers. Methods: This study had a prospective, descriptive design. Eighteen female dancers were recruited from professional dance companies in the Gauteng, Western Cape and North West provinces of South Africa. Data were collected over a three-month period and included a subjective questionnaire, a three-day food diary, skinfold measurements and the Functional Lower Extremity Evaluation (FLEE).
Injuries were reported using an injury reporting form over the three-month period. Results: Participants had an average age of 22.1 ± 3.0 years. The dancers had an average BMI of 21.4 ± 2.1 kg.m⁻², lean body mass of 41.7 ± 4.9 kg and body fat percentage of 24.7 ± 2.9%. Injury incidence was 3.3 injuries per 1000 dance hours, with a total of 4605.58 dance hours reported overall. Of the 15 injuries reported, 13 occurred in the lower limb, with eight in the ankle and foot. Overuse injuries accounted for 93.3% of the total injuries, with only one traumatic injury reported. None of the descriptive characteristics was associated with increased injury risk. The average caloric intake of 1810.0 ± 503.7 calories, while lower than what is recommended for female athletes, also showed no significant relationship to injury. There were also no significant associations between pre-injury FLEE measurements or training loads and injury incidence over the course of the study. Conclusion: An overall injury incidence of 3.3 injuries per 1000 dance hours was found in professional female ballet dancers in South Africa, which is higher than the injury incidences identified in previous studies in high-income countries. Regarding the injury profile, overuse injuries were far more prevalent than traumatic injuries in this population (93.3% vs 6.7% of injuries). We were unable to identify any intrinsic or extrinsic risk factors associated with injury incidence; however, we recognise the limitations of the small sample size in this study. With a high injury incidence and inconclusive results on injury risk factors, there is a clear need for further research in the field of injury prevention in professional ballet dancing. Further, this study identified a strong need for research in South African dance companies to facilitate injury prevention and management in South Africa.
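The exposure-based incidence figures reported in several of these abstracts (injuries per 1000 dance or match hours) follow a standard calculation; a minimal sketch in Python, using only the totals stated in the abstract above (15 injuries over 4605.58 reported dance hours):

```python
def injury_incidence_per_1000h(n_injuries: int, exposure_hours: float) -> float:
    """Injuries per 1000 hours of exposure (dance, training or match hours)."""
    return n_injuries / exposure_hours * 1000

# Totals reported in the abstract above: 15 injuries over 4605.58 dance hours.
rate = injury_incidence_per_1000h(15, 4605.58)
print(round(rate, 1))  # → 3.3, matching the reported 3.3 injuries per 1000 dance hours
```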
- ItemOpen AccessInvestigation of the impact of compression garments on endurance running performance and exercise induced muscle damage in the lower leg(2018) Geldenhuys, Alda Grethe; Bosch, Andrew; Swart, JeroenIntroduction: Compression garment utilisation is very popular among runners despite the relative lack of consensus in the literature regarding a beneficial impact. Methods: A randomised controlled experimental study was conducted in healthy, uninjured endurance runners (n=41) participating in the Old Mutual Two Oceans 56 km race. The experimental group (n=20) trained for six weeks and participated in the race wearing below-knee compression garments, while the control group (n=21) did not. Participants were tested on four occasions for various markers of exercise induced muscle damage (EIMD) and running performance. Six weeks prior to the race, baseline ultrasound scans of the medial gastrocnemius, mid-calf circumference and figure-of-8 ankle circumference measurements were performed. Shortly prior to the race, these measurements were repeated in addition to a countermovement jump (CMJ) test. Immediately following the race, circumference measurements and CMJ testing were repeated, in addition to pain ratings on a visual analogue scale (VAS). Race performance times were also obtained. Two days following the race, the ultrasound scans, circumference measurements and VAS pain ratings were repeated. Results: Ankle circumference measurements increased significantly less (p=0.01, Cohen's d=0.9) in the experimental group from immediately after the race until two days post-race compared to the control group. There were no further statistically significant changes over time in any other objective outcome measure (i.e. mean mid-calf circumference, medial gastrocnemius mean muscle thickness and mean pennation angle, mean CMJ height and estimated peak power output) or in race performance between the experimental and control groups.
Selected pain ratings were statistically significantly worse in the experimental group. Muscle thickness and pennation angles were significantly greater in the control group compared to the experimental group two days following the race. Conclusion: There were limited indications of a beneficial impact of compression garments, with minor improvements in ankle circumference measurements, but no further significant effects related to EIMD were detected. Furthermore, no ergogenic impact was detected. Based on the results of the study, there is limited evidence to support the continued utilisation of commercially available below-knee compression garments during running.
- ItemOpen AccessMatching the density of the rugby playing population to the medical services available in the Eastern Cape, South Africa(2017) Moore, Simon; Lambert, Michael I; Burgess, TheresaBackground: Rugby Union is a popular contact sport played worldwide. The physical demands of the game are characterised by short-duration, high-intensity bouts of activity, with collisions between players, often while running fast. The head, neck, upper limb and lower limb are common sites of injury. Although catastrophic injuries are rare in rugby, they do occur. Immediate action within a 4-hour window must be taken after a catastrophic injury to minimise the damage incurred. This implies that a well-functioning medical infrastructure should be available to anticipate injuries of this nature and provide treatment for the best possible outcome. Currently there is no system or map in South Africa describing the medical infrastructure in relation to the places where clubs and schools practise and play matches. Such a system may assist in the early and immediate transfer of injured players to the appropriate treatment facility, minimising the damaging effects caused by delays in medical treatment. Therefore the aims of this study were to: (i) investigate and report on the location, distance and travel time from rugby playing/training venues in the Eastern Cape to the nearest specialist hospital where a player may be able to receive adequate treatment for a catastrophic injury; and (ii) report on the safety equipment available at these playing venues to facilitate this transport in a safe manner. Methods: All the clubs (n=403) and schools (n=264) that played rugby in the Eastern Cape were accounted for in the study. However, only 15 clubs and 35 schools were included in the analysis, as they had their own facilities for training and playing matches.
Distances between clubs/schools and the nearest public, private and specialised hospital (able to treat catastrophic injuries) were measured. In addition, driving time was estimated between the clubs/schools and the nearest specialised hospital to determine whether an injured player could be transported within four hours to receive medical treatment for a catastrophic injury. Medical safety equipment was also audited (according to information provided by SA RUGBY) for each club and school to identify whether they were meeting the minimum safety standards set by SA RUGBY. Results: Twenty schools were identified as being less than one hour away from the nearest hospital equipped to deal with catastrophic rugby injuries; nine schools were between 1-2 hours away and six schools were between 2-3 hours away. All schools were within 100 km driving distance of the nearest public hospital; 28 schools were within 100 km driving distance of the nearest private hospital. For seven schools, the nearest private hospital was between 100 and 150 km away. Fourteen schools had spinal boards, eleven had neck braces, ten had harnesses, nine had change rooms, five had floodlights, and twenty-two had trained first aiders. The six schools located 2-3 hours away were at higher risk due to a lack of first aid equipment. Ten clubs were less than an hour away from the nearest hospital equipped to treat catastrophic injuries; two clubs were between 1-2 hours away, two were between 2-3 hours away and one was between 3-4 hours away. All clubs were within 100 km driving distance of the nearest public hospital. Nine clubs were within 100 km driving distance of the nearest private hospital, three clubs were between 100 and 150 km from the nearest private hospital and three were over 150 km from the nearest private hospital.
Twelve clubs had a spinal board, eleven clubs had neck braces, ten clubs had harnesses, ten clubs had change rooms, seven clubs had floodlights and twelve clubs had trained first aiders. One club was classified as high risk, as it was located 2-3 hours away from the nearest hospital equipped to manage a catastrophic injury and had no first aid equipment. Discussion/Conclusion: No clubs or schools included in the study were more than four hours away from a hospital equipped to deal with a catastrophic rugby injury. Therefore, any player who suffers a catastrophic injury should be able to reach treatment within the 4-hour window period. Another finding was that not all clubs or schools possessed the minimum equipment required to host training or a rugby match. SA RUGBY can take appropriate action with these clubs and schools to ensure that they maintain the safest possible practice and do not put their own players at increased risk.
- ItemOpen AccessMotivation and behaviour change in Parkrun participants in Western Cape, South Africa(University of Cape Town, 2020) Chivunze, Edgar; Buchholtz, Kim; Burgess, TheresaBackground: Participation in physical activity is a cost-effective way to reduce the risk of over 25 chronic diseases. Despite the many dangers of physical inactivity, more than a quarter of the South African population remains inactive. One initiative aimed at increasing engagement in physical activity is parkrun, a free weekly 5 km running/walking event. There has been an increase in the number of parkrun participants in South Africa since its inception. An understanding of the motivation for participation and health-related behaviour change is important for organisers and public health professionals seeking to increase participation in this weekly mass participation event. Aim: The aim of this study was to describe the motivations for participation in parkrun and physical activity related behaviour changes among parkrun participants registered in the Western Cape Province of South Africa. Specific objectives: The specific objectives of this study were: to identify demographic characteristics of parkrun participants in the Western Cape Province of South Africa; to describe the motivations for participating in parkruns in the Western Cape Province of South Africa; and to investigate physical activity related behaviour changes resulting from participation in parkruns in South Africa's Western Cape Province, based on pre- and post-participation physical activity levels. Methods: A cross-sectional study was performed on 1787 parkrun participants registered at 40 parkrun sites in the Western Cape Province of South Africa. Participants from 37 of these sites were invited via the parkrun South Africa mailing list to participate in an online survey. Participants from the remaining three parkrun sites responded on paper-based questionnaires at the parkrun sites.
The questionnaire included sections on demographic characteristics, including employment status, gym membership and educational level; physical activity programmes before joining parkrun; and changes in physical activity after joining parkrun. Results: The median age of participants was 50 years (IQR: 38-59). Female participants formed 53.3% of the sample. Approximately 80% of participants were educated to diploma or degree level (Technikon/College/University), and participants reported high employment rates (71%). Fifty-one percent of the sample were gym members. A total of 64.8% reported having very good to excellent health. A total of 86.1% reported health/fitness as the biggest motivation for participation in parkrun, and a further 71.8% of the sample were motivated by enjoyment. A safe environment (58.7%), earning Discovery Health Vitality Points (46.4%), stress relief (40.8%), cost (40.4%) and socialisation (39.4%) were other common motivations in the sample. After joining parkrun, 24% of participants took up new physical activity programmes, with a further 24% of participants increasing their weekly volume of physical activity. More female participants (50.9%) than male participants (44.7%) increased their physical activity levels or took up new physical activity programmes (χ² = 7.331, p = 0.007). Running was the most widely adopted physical activity, attracting 18.2% of the sample as new runners. Conclusion: In conclusion, we found that parkrun in the Western Cape is mostly taken up by participants in their sixth decade of life, with half of them being overweight. Most participants were physically active before joining parkrun, with more than half exceeding recommended global physical activity levels. Similar results have been described in previous studies in Australia and the UK. We also found health/fitness to be the biggest motivation for parkrun participation, followed by enjoyment and the safe environment provided at parkrun sites.
Running and walking are the most common activities taken up by participants after joining parkrun. Further prospective studies are recommended to determine cause-and-effect relationships and to describe health-related physical activity behaviour changes in detail.
- ItemOpen AccessPrevalence and risk factors of chronic diseases of lifestyles in endurance runners(2018) Language, Sarah; Burgess, Theresa; Blockman, MarcBackground: Chronic diseases of lifestyle (CDL) are associated with high rates of morbidity and mortality in South Africa. Although the prevalence of CDL has been established in the general population, there is limited research regarding the prevalence of and risk factors for CDL in individuals taking part in regular physical activity. Endurance running is a popular sport with growing levels of participation. Anecdotally, many individuals who participate in endurance running do not undergo formal pre-participation cardiovascular screening. It is also unclear whether endurance runners are meeting the World Health Organisation's recommended weekly moderate to vigorous intensity physical activity hours, or whether they have other risk factors for CDL. It is therefore important to establish the prevalence and risk factors of CDL in this active population. Aim and Objectives: The aim of this study was to determine the prevalence of CDL and the associated risk factors in endurance runners in South Africa.
The specific objectives of the study were: (a) to determine the presence of risk factors for the development of chronic diseases of lifestyle, including body mass index (BMI), waist circumference, body fat percentage, blood pressure, blood glucose, blood cholesterol, smoking history, dietary intake and weekly physical activity time, in South African endurance runners; (b) to determine the presence of non-modifiable risk factors for the development of CDL, namely age and income, in South African endurance runners; (c) to determine whether South African endurance runners are fulfilling the World Health Organization's recommended weekly moderate to vigorous intensity physical activity hours; and (d) to assess whether there are any relationships between running characteristics, namely weekly training hours, running speed and level of competition, and the risk factors for chronic diseases of lifestyle. Methods: This study had an analytical, cross-sectional design. Two hundred participants between the ages of 18 and 69 years, who reported endurance running as their main sport and had run at least three kilometres twice a week for the past year, were included in the study. Participants were excluded if they were pregnant or within six months post-partum, had an injury that required a minimum of two weeks' rest, or did not complete the questionnaire or physical testing component of the testing process. Participants were recruited through local running clubs and running races in the areas of Nelspruit, Mpumalanga and Cape Town, Western Cape. All participants gave written informed consent and completed a questionnaire including socio-demographic characteristics, running training characteristics, the International Physical Activity Questionnaire (short form), the modified Borg scale of perceived exertion, and the five-a-day community evaluation tool. Body mass, stature, skinfolds and waist circumference were assessed.
Blood pressure was measured using an automatic blood pressure monitor. A finger prick test was used to determine random blood glucose and cholesterol concentrations. Participants were requested to refrain from eating for three hours prior to testing to standardise the test in a non-fasted state (20). Results: One hundred and twenty-four (62%) participants were found to have at least one risk factor for CDL. A high BMI was the most common risk factor for CDL (n=90; 45%). Nineteen participants (9.5%) did not meet the recommended duration of 150 minutes of physical activity per week. Seven percent of female participants (n=7) smoked, which is equivalent to the female population average of South Africa. Multiple risk factors were identified in fifty-seven (28.5%) participants, ranging from two risk factors (n=37; 18.5%) to six risk factors (n=1; 0.5%). The majority of participants had no prior medical diagnosis of CDL or risk factors for CDL. The overall self-reported prevalence of a medically diagnosed CDL was 5.5% (n=11). Type 2 diabetes was the most commonly diagnosed CDL (n=6; 3%). Waist circumference, systolic blood pressure and cholesterol were significantly elevated in the older age group. There were no significant differences in risk factors for CDL according to income status. Female runners had significantly higher average sitting times compared to male runners. In addition, participants with a BMI ≥ 25 kg.m-2 had significantly slower 10 km running speeds and lower average weekly training distances compared to participants with a BMI within the normal range. Conclusion: A high prevalence of risk factors for CDL was identified in South African endurance runners. The majority of endurance runners in this sample are fulfilling the World Health Organisation's recommended weekly moderate to vigorous intensity hours. However, the endurance runners in this study remain at risk of developing a CDL due to the presence of other risk factors for CDL.
The knowledge and awareness of risk factors for CDL among South African endurance runners need to be further investigated. Health care professionals should improve the prevention and management of risk factors for CDL through education and the promotion of healthy lifestyles. A stronger emphasis on the prevention of risk factors for CDL in South African endurance runners is needed.
- ItemOpen AccessThe relationship between performance (tournament progression), daily stress and perceived exertion in male participants of professional squash tournaments(2016) Montanus, Munro; Jelsma, Jennifer; Burgess, TheresaSquash is a popular sport played by over 15 million people in 120 countries, and requires extreme levels of fitness and skill to be played proficiently. As a high-impact, fast sport that relies on consistency, strength and skill, squash frequently exposes players to stress. This stress is mainly due to the intensity of the matches, but also to the short duration of the tournaments, which places considerable pressure on participants to do well. Stress in sport has been shown to be a critical component in the performance of an individual athlete as well as in team sports. Stress in sport may be categorised as competitive, organisational or acute. Not being able to cope with stress may have varied effects for athletes. These include increased anxiety and aggression; decreased enjoyment and self-esteem; and, most importantly, a decrease in performance expectations and performance difficulties. Furthermore, if an athlete believes he or she cannot meet the demands of the competitive environment, negative physical and emotional responses can affect performance. The ability to compete in the presence of different stressors is thus necessary for an athlete to perform at his or her best. Aim and objectives: The specific objectives were to establish whether (a) anthropometric and demographic characteristics, (b) daily stress as measured by the Daily Analysis of Life Demands for Athletes (DALDA) and (c) rating of perceived exertion (RPE) as measured by the Borg scale were associated with competition performance, as measured by winning/losing games in national squash tournaments.
- ItemOpen AccessScreening for risk factors associated with non-specific shoulder pain in male adolescent water polo players(2022) Tully, Paula Lauren; Gray, Janine; Roche, StephenWater polo is a fast-growing aquatic sport that combines swimming, overhead throwing, defending and grappling. Great demands are placed on the shoulder to complete these activities, and shoulder pain is the most common musculoskeletal complaint among water polo players. The aetiology of shoulder injury amongst water polo players is not well understood, and there is limited research investigating the adolescent water polo population. The aim of this thesis was to identify the incidence of shoulder pain over a 12-week period and determine the contribution of intrinsic and extrinsic risk factors to the development of non-specific shoulder pain in male adolescent water polo players. An overview of the literature (Chapter 2) includes the biomechanics of throwing and swimming; the epidemiology of shoulder injury in water polo players; and the current understanding of risk factors for shoulder injuries and the screening thereof. Risk factors for shoulder injury in swimming have been identified as weakness of the glenohumeral (GH) internal rotator muscles, altered GH range of motion (ROM), GH joint laxity, high training loads, pectoralis minor tightness and altered scapular control. In other overhead throwing sports the risk factors include altered GH ROM and glenohumeral internal rotation deficit (GIRD), shoulder muscle weakness, altered scapular control, pitching velocity, age, height, early sport specialisation, throwing with arm fatigue and a heavy workload. A few studies have proposed potential risk factors for shoulder injury in water polo players, but significant associations have not been found and little is known about the musculoskeletal risk factors.
However, water polo players are susceptible to shoulder pain due to repetitive overhead throwing at high velocities, the repetitive swimming stroke, as well as the unique upright swimming style. Chapter 3 presents the research findings. This study recruited male adolescent water polo players aged 14-18 years who were not currently experiencing shoulder pain. Participants underwent a pre-season screening session followed by a period of in-season monitoring for 12 weeks. The pre-season screening included a demographic questionnaire, the Kerlan-Jobe Orthopaedic Clinic (KJOC) Shoulder and Elbow Score, anthropometry and maturation testing, as well as shoulder-specific tests to assess shoulder pain, shoulder range of movement, shoulder strength, shoulder flexibility and shoulder stability. The experience of shoulder pain and participant training load were then monitored using a self-report questionnaire. Participants were categorised into two groups (shoulder pain and no shoulder pain) based on their report of pain, irrespective of a medical diagnosis. The shoulder musculoskeletal profile of the water polo players, the incidence of shoulder pain and the players' training loads are presented (Chapter 3). Shoulder pain was reported by 52% of the participants at least once during the 12-week monitoring period, with pain in both shoulders simultaneously (56%) or in the dominant shoulder only (42%) commonly reported. The activity most commonly associated with the onset of pain was swimming (55%), followed by throwing (38%). Participants with shoulder pain had mean KJOC scores lower than 90, and were significantly older (p = 0.003), heavier (p = 0.050), and had a greater predicted number of years from peak height velocity (PHV) (p = 0.029) than those without shoulder pain. An interaction was found between pain/no pain and dominant/non-dominant side for isometric internal rotation (IR) strength (p = 0.049), with stronger IR muscles in the dominant shoulder of the group with shoulder pain.
Significant shoulder asymmetries were identified; however, there was no association between these variables and the development of shoulder pain. In general, the participants presented with greater external rotation (ER) ROM and total range of motion (TROM) in the dominant shoulder; greater isometric strength of the IR, serratus anterior (SA), upper trapezius (UT) and lower trapezius (LT) muscles; as well as reduced pectoralis minor length (PML) and a lower pectoralis minor index (PMI) on the dominant side. There was a significant difference between the pain/no pain groups in hours of water polo matches in weeks 3-4, with a higher workload in the shoulder pain group compared to the no shoulder pain group (p = 0.008). Participants with shoulder pain reported significantly lower self-perceived strength scores for passing, shooting, swimming, defending and gym training compared to those without shoulder pain. In conclusion (Chapter 4), there is a high incidence of shoulder pain among male adolescent water polo players, which is in line with the findings of other studies. The players who developed shoulder pain were significantly older, heavier and had a greater predicted number of years from PHV than those without shoulder pain. This may suggest a trend towards cumulative overloading, and it is likely that the key players of water polo teams are at greater risk of developing shoulder pain. Greater IR strength was observed in the dominant shoulder of those players with shoulder pain, indicating that the more powerful throwers are developing shoulder pain. The relative weakness of the ER muscles suggests that players are unable to effectively control the deceleration phase of the throwing motion. This cohort presented with significant asymmetries in GH ROM, rotator and scapular muscle strength, and shoulder flexibility; however, these variables were not associated with shoulder pain.
Asymmetries have been associated with pain in previous studies, so these variables should not be ruled out as risk factors for injury. Participants in this study reported that the activity most commonly associated with shoulder pain was swimming, not throwing, and bilateral shoulder pain was commonly reported. This suggests that the musculoskeletal profile of the non-dominant side is indeed important and that the implications of significant asymmetries should be evaluated further in a larger population. KJOC scores appear to be in line with those of baseball players, and a score below 90 may indicate an at-risk athlete. An increase in competitive match play was associated with an increase in shoulder pain. This should inform coaches to structure training and recovery appropriately during tournaments or weeks with a high load of matches. This study provides a basis for further investigation into shoulder injuries among adolescent water polo players, as well as their prevention and management. It is advised that coaches and medical staff endeavour to identify at-risk players. Rehabilitation programmes should be implemented to target the modifiable risk factors identified in this study, in order to reduce the incidence and prevalence of shoulder pain.
- ItemOpen AccessSport-related Concussion Incidence and Mechanism of Injury in Male and Female Players at the South African Youth Week Rugby Tournaments: 2011-2018(2019) Cardis, Sheenagh; Lambert, Michael Ian; Burgess, TheresaBackground: Rugby is a popular international sport for male and female youth and adult players (6). Injury incidence, including sport-related concussion (SRC), is high in youth rugby (7, 8). This is concerning, as youth are more vulnerable to SRC and take longer to recover from SRC than adults (9, 10). Females are also more susceptible to sustaining a SRC, take longer to recover from SRC and have a higher incidence of SRC complications than males (11-15). Most research has focused on SRC in adult male players; there are fewer studies on youth, in particular female youth. Further research into SRC in youth male and female players is thus required. Aim: The aim of this study was to determine the incidence and mechanism of SRC among youth male and female rugby players at the 2011 to 2018 and 2015 to 2018 South African Rugby Union Youth Week Tournaments respectively. Specific objectives: a) To determine the incidence of SRC among boys U13-U18 and girls U16-U18 players; b) to describe the SRC mechanism of injury in boys U13-U18 and girls U16-U18 players; c) to determine whether a difference in SRC incidence exists between boys U13-U18 and girls U16-U18 players, and also between age groups; d) to determine whether a difference in the mechanism of SRC exists between boys U13-U18 and girls U16-U18 players, and also between age groups; and e) to describe factors associated with SRC in boys U13-U18 and girls U16-U18 players. Methods: The study had a retrospective, epidemiological design, reviewing SRC injury data collected at the 2011-2018 South African Rugby Union Youth Week Rugby Tournaments. SRC injury data for boys were collected at the 2011-2018 South African Rugby Union Youth Week Rugby Tournaments.
SRC injury data for girls was collected only at the 2015-2018 South African Rugby Union Youth Week Rugby Tournaments, as the girls' tournaments were only introduced in 2015. Results: Data from 266 SRC events were analysed in the study. Overall SRC incidence was 7.0 SRC per 1000 match playing hours (95% CI, 6.2-7.8). Overall SRC incidence for boys was 6.9 SRC per 1000 match playing hours (95% CI, 6.0-7.8). Overall SRC incidence for girls was 7.9 SRC per 1000 match playing hours (95% CI, 5.3-9.9). There was no significant difference in SRC incidence between boys and girls. SRC incidence from 2011-2018 was 10.7 (95% CI, 8.2-13.1), 7.5 (95% CI, 5.5-9.6) and 5.3 (95% CI, 3.4-6.5) SRC per 1000 match playing hours for the boys U13, U16 and U18 age groups respectively. SRC incidence from 2015-2018 was 7.2 (95% CI, 3.7-10.2) and 7.9 (95% CI, 4.7-10.9) SRC per 1000 match playing hours for the girls U16 and U18 age groups respectively. The incidence of SRC was significantly higher in the boys U13 age group than in the boys U18 age group (IRR 2.0; 95% CI, 1.5-2.7; p=0.00014); boys U13 players were twice as likely to sustain a SRC as their U18 counterparts. The tackle (65%) and ruck (20%) were responsible for the majority of SRC. Boys U13 players were significantly more likely to sustain a SRC from a tackle than boys U18 players (p=0.01). Boys U16 players had a significantly greater incidence of SRC resulting from the ruck than boys U18 players (p=0.02). Overall, the most common primary mechanisms of SRC were front-on tackles (27%) and collisions (18%). Boys U16 players had a significantly higher rate of SRC due to front-on tackles than boys U18 players (p=0.00007), and also a significantly higher rate of SRC caused by collisions than boys U18 players (p=0.00007). Similarly, boys U13 players had a significantly higher incidence of SRC due to collisions than boys U18 players (p=0.003).
Factors that were associated with SRC incidence were tournament day and the use of headgear. SRC was more likely to occur on day two than on day four (p=0.0008), day five (p=0.0002) or day six (p<0.001). Players who did not wear headgear were more likely to sustain a concussion than those who did (p<0.001). Conclusion: Overall SRC incidence at the 2011 to 2018 South African Youth Week Rugby Tournaments was 7.0 SRC per 1000 match playing hours. This study is unique in that it reports SRC incidence for youth female players. The overall SRC incidence for the girls U16 and U18 groups was 7.7 SRC per 1000 match playing hours. As no significant difference was found in the incidence, injury event or mechanism of SRC between male and female players, similar injury prevention strategies can be implemented for both groups. Injury prevention strategies should focus on teaching safe contact technique in the tackle and ruck. Particular attention should be focused on teaching safe contact technique to U13 boys, as SRC incidence was highest in this group. Injury prevention strategies should also focus on teaching boys U13 and U16 players how to avoid collisions, and teaching boys U16 players how to execute safe front-on tackles and rucks. Further research should focus on identifying which aspects of the tackle and ruck result in SRC, so that more tailored and specific injury prevention strategies can be implemented.
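The incidence rates, confidence intervals and incidence rate ratio reported in this abstract follow the standard exposure-based definitions (events per 1000 match playing hours). A minimal Python sketch of those calculations; the event counts and exposure hours below are hypothetical, chosen only so the outputs mirror the reported overall rate of 7.0 per 1000 hours and the U13 vs U18 IRR of 2.0, and the normal approximation to the Poisson count is an assumption about how the CIs were derived:

```python
import math

def incidence_rate(events, exposure_hours, per=1000):
    """Injuries per `per` playing hours, with an approximate 95% CI
    based on a normal approximation to the Poisson event count."""
    rate = events / exposure_hours * per
    se = math.sqrt(events) / exposure_hours * per  # Poisson SE of the count, rescaled
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

def incidence_rate_ratio(events_a, hours_a, events_b, hours_b):
    """IRR comparing group A to group B (e.g. boys U13 vs boys U18)."""
    return (events_a / hours_a) / (events_b / hours_b)

# Hypothetical figures for illustration only (not the study's raw data):
rate, ci = incidence_rate(events=70, exposure_hours=10_000)
print(round(rate, 1))                              # 7.0 per 1000 hours
print(tuple(round(x, 1) for x in ci))              # approximate 95% CI
print(incidence_rate_ratio(32, 3000, 16, 3000))    # IRR = 2.0
```

Exact Poisson intervals (e.g. via the chi-squared relationship) are wider at small event counts, which is one reason published CIs may not match this approximation exactly.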
- ItemOpen AccessTraining loads, injury profiles and illness in elite South African rugby players(2019) Barnes, Curt; Buchholtz, Kim; Burgess, TheresaBackground: Professional Rugby Union is a popular international team sport and has one of the highest reported incidences of injury and illness across sporting codes. The Super Rugby tournament is played annually between professional Rugby Union teams and is one of the most competitive sports tournaments in the world. The demanding nature of the tournament has been associated with high rates of injury and illness, but the relationships between training loads and injury and illness profiles are unclear. As a result, the Super Rugby tournament provides a platform for further investigation of injury, illness and training load patterns within Rugby Union. Epidemiological data on training loads, injury profiles and illness patterns assist the development of preventative measures. Aim: The aim of this study was to assess the relationships between training loads, injury profiles and illness rates in elite South African rugby players competing in the 2017 Super Rugby tournament. Specific objectives: (a) To determine the incidence of training and match injuries during pre-season training, and early and late competition, during the 2017 Super Rugby tournament; (b) To determine the incidence of illness during pre-season training, and early and late competition, during the 2017 Super Rugby tournament; (c) To determine the anatomical site, type, mechanism and time-loss of injuries sustained during pre-season training, and early and late competition, during the 2017 Super Rugby tournament; (d) To determine potential associations between internal and external training loads and injury and illness, respectively. Methods: A descriptive, observational surveillance study was conducted during the 2017 Super Rugby tournament.
Thirty-nine adult participants were recruited from one South African team over a complete season, including pre-season, early and late competition. Data were collected by the team medical personnel, who routinely recorded data on a daily basis. Training load data included squad size, training or match day, the duration of training or matches, and internal and external training load measures for training and matches. Injury data included the participant's age, injury counts, the type of injury, the main and specific anatomical location, and the mechanism and severity of injury. Illness data included illness counts, the bodily system affected, symptoms and cause of illness, the specific diagnosis and time-loss. Results: The overall incidence of injury was 12.8 per 1000 player hours. The majority (48.8%) of injuries occurred in the early competition phase. The incidence of match injuries (241.0 per 1000 player hours) was significantly higher than that of training injuries (3.3 per 1000 player hours). The lower limb sustained the greatest proportion of injuries (62.5%). Muscle or tendon injuries accounted for 64.9% of all injuries. The tackle accounted for 28.8% of all injuries, and 37.5% of all injuries were of 'moderate' severity. The proportion of players who sustained a time-loss injury was 76.9% (n = 30), and 25.6% (n = 10) of players sustained a time-loss injury severe enough to prevent eight days or more of participation in training or matches. The overall incidence of illness was 1.8 per 1000 player days. The proportion of players who acquired an illness was 28.3% (n = 11). Acute respiratory tract infections (28.6%) were the most common specific diagnosis, and the large majority of illnesses (64.3%) did not result in time-loss. A significant negative correlation between injury and internal training loads was detected in the pre-season phase (r = -0.34, p = 0.03). There were no significant correlations between external training load and injury incidence. No significant correlations were observed between internal and external training loads and illness incidence. No significant odds ratios were demonstrated between internal and external acute-to-chronic workload ratios and injury or illness risk. Conclusion: The incidence of match injuries in this study was significantly higher than previously reported incidence rates in the Super Rugby tournament. The profiles of match and training injuries, including anatomical location, type, mechanism and severity, were similar to previous studies. Illness rates were significantly lower than reported in previous studies. Internal training load and injury were significantly correlated in the pre-season phase. Further studies are required to determine the relationships between training loads and injury and illness over consecutive seasons and in multiple teams.
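The internal and external acute-to-chronic workload ratios examined in this study are conventionally computed as a rolling acute load (often a 7-day window) divided by a rolling chronic average load (often 28 days), with internal load commonly taken as session-RPE (rating of perceived exertion multiplied by session duration in minutes). The window lengths, load measure and example numbers below are assumptions for illustration, not details confirmed by the abstract; a minimal sketch:

```python
def acute_chronic_ratio(daily_loads, acute_days=7, chronic_days=28):
    """Acute:chronic workload ratio from a list of daily training loads
    (most recent day last). Returns None until a full chronic window exists."""
    if len(daily_loads) < chronic_days:
        return None
    acute = sum(daily_loads[-acute_days:]) / acute_days        # mean acute load
    chronic = sum(daily_loads[-chronic_days:]) / chronic_days  # mean chronic load
    return acute / chronic if chronic else None

# Hypothetical session-RPE loads: three steady weeks, then a spike in the final week
loads = [400] * 21 + [800] * 7
print(round(acute_chronic_ratio(loads), 2))  # 1.6
```

A ratio well above 1.0, as in this spike example, is the pattern usually flagged as a potential injury risk marker, which is why such ratios are tested against injury and illness outcomes as in this study.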