Browsing by Department "Division of Exercise Science and Sports Medicine"
Now showing 1 - 20 of 27
- Item (Open Access): A description of the profiles of U18 rugby players who attended the Craven Week tournament between 2002-2012 (2018). Durandt, Justin; Lambert, Mike.
Rugby union has a rich tradition in South Africa, with the national team having won the Rugby World Cup in 1995 and 2007. The major rugby nations South Africa competes against have clearly defined rugby talent identification (TID) and development (TDE) pathways. These pathways are not as well described in South Africa, where the South African Rugby Union (SARU) has adopted a model of identifying talent at an early age through competition. For example, national competitions occur at U13 (Craven Week), U16 (Grant Khomo Week) and U18 (Craven Week and Academy Week) levels. Previous research on talent identification has highlighted the pitfalls of early talent identification. In particular, different rates of maturation can influence the manifestation of talent. In a collision sport such as rugby, early maturers have a distinct advantage. An added complexity in the South African context is the need to provide an appropriate development environment within which transformation can take place. At all levels in South African professional rugby, white players dominate team selection. One of the reasons suggested for this dominance is the physical size of white players compared to their black and mixed race (coloured) counterparts. Rugby is a contact sport and physical size is associated with success, so the need to quantify physical differences between racial groups at a junior level over time is important. The first objective of the thesis was to examine the profiles of U18 Craven Week rugby players to gain insight into the development pathway from U13 to U18. A second aim was to understand factors influencing transformation by measuring the physical profiles of the various racial groups over time. The thesis consists of two studies. The specific objective of the first study was to quantify how many players in the 2005 U13 Craven Week (n=349) participated in the subsequent U16 Grant Khomo and U18 Craven Week tournaments. The study showed that 31.5% of the players who played in the U13 Craven Week were selected to play at the U16 Grant Khomo Week and 24.1% were selected for the U18 Craven Week tournaments. Another interpretation is that 76% of the players selected for the U13 tournament did not play at the U18 Craven Week tournament. The objective of the second study was to determine whether there are differences in body mass, stature and body mass index (BMI) between racial groups in U18 Craven Week players. Another objective was to determine whether these measurements changed between 2002 and 2012. Self-reported body mass and stature were obtained from U18 players (n=4007) who attended the national tournament during this period. BMI was calculated for each player. The body mass, stature and BMI of these players were significantly different between racial groups. For example, white players were 9.8 kg heavier than black players, who were 2.3 kg heavier than coloured players (p<0.0001). The body mass of all groups increased from 2002 to 2012 (p<0.0001). White players were 7.0 cm taller than black players, who were 0.5 cm taller than coloured players (p<0.0001). The stature of players did not change significantly during the study period. The average BMI of white players was 0.9 kg.m⁻² greater than that of black players, who were on average 0.7 kg.m⁻² greater than coloured players (p<0.0001). The BMI of all groups changed similarly over the study period.
To conclude, these results question the effectiveness of the U13 tournament in identifying talent and providing an effective development pathway to the U18 Craven Week. SARU also needs to be aware of the ongoing disparities in size between the racial groups playing rugby at U18 level in South Africa. These size differences may have implications for transforming the game and making it representative of the South African population.
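The BMI comparison above reduces to a simple calculation from self-reported mass and stature. A minimal Python sketch follows, using hypothetical player values (the thesis data are not reproduced here) to show how BMI and a between-group difference might be derived.

```python
# Minimal sketch of the BMI calculation used in the second study,
# assuming hypothetical self-reported values (not the actual dataset).

def bmi(mass_kg: float, stature_m: float) -> float:
    """Body mass index in kg.m^-2: mass divided by stature squared."""
    return mass_kg / stature_m ** 2

# Hypothetical players from two groups (illustrative values only)
group_a = [bmi(85.0, 1.82), bmi(90.5, 1.85), bmi(78.0, 1.78)]
group_b = [bmi(74.0, 1.74), bmi(80.0, 1.77), bmi(71.5, 1.70)]

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)
print(f"Group A mean BMI: {mean_a:.1f} kg.m^-2")
print(f"Group B mean BMI: {mean_b:.1f} kg.m^-2")
print(f"Difference: {mean_a - mean_b:.1f} kg.m^-2")
```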
- Item (Open Access): An apparent reduction in the incidence and severity of spinal cord injuries in schoolboy rugby players in the Western Cape since 1990 (1999). Noakes, Timothy; Jakoet, I; Baalbergen, E.
OBJECTIVE: To determine the impact of the 1990 rugby law changes in South African schoolboy rugby on the number of schoolboys suffering paralysing spinal cord injuries in the subsequent eight rugby seasons (1990-1997) in the former Cape Province (now the Western Cape, but including Port Elizabeth and East London).
METHODS: The study was a retrospective analysis of all patients with rugby-related spinal cord injuries admitted to the Conradie and Libertas Spinal Units, Cape Town, between 1990 and 1997. Data were initially collected annually from patient files. From 1993 patients were interviewed in hospital and a standardised questionnaire was completed. Data were collated and analysed.
RESULTS: There were 67 spinal cord injuries in adult and schoolboy rugby players in the eight seasons studied. Fifty-four (80%) injuries were in adults and 13 (20%) in schoolboys, representing a 23% increase and a 46% reduction in the number of injured adults and schoolboys, respectively. Fifty-two per cent of those injuries for which the mechanism was recorded occurred in the tackle phase of the game; of these, approximately equal numbers were due to vertex impact of the tackler's head with another object, or to illegal (high) tackles. Twenty-five per cent of injuries occurred in the ruck and maul and the remainder (23%) in the collapsed scrum. The only striking difference in the proportion of injuries occurring in the different phases of play was the absence of high-tackle injuries among schoolboys. The majority of injuries occurred at vertebral levels C4/5 (32%) and C5/6 (42%). Five players (8%) died, tetraplegia occurred in 48% and 35% recovered either fully or with minor residual disability. Playing position was recorded for half the injured players. Front-row forwards (props 33%, hookers 9%), locks (12%), wings and centres (21%) and loose forwards (15%) accounted for 90% of all injuries.
CONCLUSIONS: Introduction of rugby law changes in South African schoolboy rugby in 1990 may have led to a 46% reduction in the number of spinal cord injuries in this group. In contrast, the number of these injuries in adult rugby players increased during the same time period, due either to an increase in the number of adult players or to a real increase in the incidence of these injuries. More injured schoolboy than adult rugby players made total or near-complete recoveries from initially paralysing injuries (61% v. 28%). The reduced number of schoolboy injuries could not have resulted directly from the specific law changes introduced in 1990, which targeted scrum laws. Rather, the absence of illegal (high) tackle injuries among schoolboys appears to be the principal factor explaining fewer injuries in schoolboys, who suffered a higher proportion of injuries in the ruck and maul than did adult players. Accordingly we conclude that a further reduction in spinal cord injuries in adult and schoolboy rugby players in the Western Cape requires: (i) the elimination of injuries occurring in the ruck and maul, and to the tackler; (ii) the strict application of the high-tackle rule in adult rugby; and (iii) a continuing, high level of vigilance. Concern must be expressed about the continuing number of paralysing spinal cord injuries in adult rugby players.
- Item (Open Access): Cardiovascular risk status of Afro-origin populations across the spectrum of economic development: findings from the Modeling the Epidemiologic Transition Study (2017). Dugas, Lara R; Forrester, Terrence E; Plange-Rhule, Jacob; Bovet, Pascal; Lambert, Estelle V; Durazo-Arvizu, Ramon A; Cao, Guichan; Cooper, Richard S; Khatib, Rasha; Tonino, Laura; Riesen, Walter; Korte, Wolfgang; Kliethermes, Stephanie; Luke, Amy.
Background: Cardiovascular risk factors are increasing in most developing countries. To date, however, very little standardized data have been collected on the primary risk factors across the spectrum of economic development. Data are particularly sparse from Africa.
Methods: In the Modeling the Epidemiologic Transition Study (METS) we examined population-based samples of men and women, ages 25-45, of African ancestry in metropolitan Chicago; Kingston, Jamaica; rural Ghana; Cape Town, South Africa; and the Seychelles. Key measures of cardiovascular disease risk are described.
Results: The risk factor profile varied widely in both total summary estimates of cardiovascular risk and in the magnitude of component factors. Hypertension ranged from 7% in women from Ghana to 35% in US men. Total cholesterol was well under 200 mg/dl for all groups, with a mean of 155 mg/dl among men in Ghana, South Africa and Jamaica. Among women, total cholesterol values varied relatively little by country, falling between 160 and 178 mg/dl for all 5 groups. Levels of HDL-C were virtually identical in men and women from all study sites. Obesity ranged from 64% among women in the US to 2% among Ghanaian men, with a roughly corresponding trend in diabetes. Based on the Framingham risk score, a clear trend toward higher total risk in association with socioeconomic development was observed among men, while among women there was considerable overlap, with the US participants having only a modestly higher risk score.
Conclusions: These data provide a comprehensive estimate of cardiovascular risk across a range of countries at differing stages of social and economic development and demonstrate the heterogeneity in the character and degree of emerging cardiovascular risk. Severe hypercholesterolemia, as was characteristic of the US and much of Western Europe at the onset of the coronary epidemic, is unlikely to be a feature of the cardiovascular risk profile in these countries in the foreseeable future, suggesting that stroke may remain the dominant cardiovascular event.
- Item (Open Access): Comparison of body fatness measurements by near-infrared reactance and dual-energy X-ray absorptiometry in normal-weight and obese black and white women (2010). Jennings, Courtney L; Micklesfield, Lisa K; Lambert, Mike I; Lambert, Estelle V; Collins, Malcolm; Goedecke, Julia H.
The aim of the present study was to compare body fat percent (BF%) using single-site near-IR reactance (NIR) and dual-energy X-ray absorptiometry (DXA) in a cohort of normal-weight (BMI < 25 kg/m²) black (n 102) and white (n 71), and obese (BMI ≥ 30 kg/m²) black (n 117) and white (n 41) South African women (18-45 years). NIR-derived BF% was significantly correlated with DXA-derived BF% in all groups: normal-weight black (r 0.55, 95% CI: 0.40, 0.67, P < 0.001) and white (r 0.69, 95% CI: 0.53, 0.79, P < 0.001) women; obese black (r 0.59, 95% CI: 0.46, 0.70, P < 0.001) and white (r 0.56, 95% CI: 0.30, 0.74, P < 0.001) women. NIR under-predicted BF% compared to DXA in black women (normal-weight, -4.36 (sd 4.13)% and obese, -3.41 (sd 3.72)%), while smaller mean differences were observed in white women (normal-weight, -0.29 (sd 4.19)% and obese, -0.81 (sd 3.09)%), irrespective of normal-weight or obese status (P < 0.001). In obese subjects, NIR-derived BF% did not measure values greater than approximately 45%, while the maximum DXA-derived measure was 58%. In conclusion, although there was a significant relationship between NIR- and DXA-derived BF%, NIR under-predicted BF% in normal-weight and obese black South African women compared to DXA, but to a greater extent in subjects with very high levels of adiposity (>45%). The results of single-site NIR as a measure of BF% should therefore be interpreted with caution, particularly in women of African descent and in those with very high levels of adiposity.
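The mean differences reported above (NIR minus DXA, with sd) are the bias term of a Bland-Altman style agreement analysis. A minimal Python sketch of that calculation follows, using hypothetical paired BF% measurements rather than the study's data.

```python
import numpy as np

# Hypothetical paired BF% measurements (illustrative values only)
nir = np.array([28.5, 33.1, 41.0, 37.2, 44.5])   # NIR-derived BF%
dxa = np.array([31.9, 36.8, 45.2, 40.1, 49.8])   # DXA-derived BF%

diff = nir - dxa
bias = diff.mean()                          # mean difference (NIR minus DXA)
sd = diff.std(ddof=1)                       # sd of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"Bias: {bias:.2f}% (sd {sd:.2f})")
print(f"95% limits of agreement: {loa[0]:.2f}% to {loa[1]:.2f}%")
```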
- Item (Open Access): Conversion of the Knee Osteoarthritis Outcome Score – Physical Shortform into a Video Format (2018). de Roos, Jordy Anterio; Held, Michael; Kruger, Neil.
Introduction: Patient Reported Outcome Measures (PROMs) are an integral part of evidence-based medicine and provide the necessary information for clinicians to make decisions in patient management. The Knee Osteoarthritis Outcome Score-Physical Function Short Form (KOOS-PS) was developed to assess patients' perception of their knee's function. Yet there are cultural and language barriers when implementing PROMs in a setting for which they were not originally designed, particularly in low-middle income countries with low levels of education. To address these challenges, this study introduces a video version of the KOOS-PS with the aim of validating it in a local setting.
Methods: This is a validation study of a video version of the KOOS-PS against various other knee scores. The KOOS-PS was converted into videos, and a Likert scale in the form of icons was used as the grading system. The videos were reviewed by a panel for acceptance and comprehensibility. Second, the video score was tested in a prospective study against other internationally accepted and validated knee PROMs. Patients were recruited from both the public and private sectors of healthcare. Descriptive statistics, Pearson's correlation coefficient and Cronbach's alpha were used for psychometric testing.
Results: The mean time taken to complete the video score was 79 seconds. Internal consistency received an excellent Cronbach's alpha of 0.89. Reproducibility yielded a Pearson correlation coefficient of r=0.91, indicating no significant difference between administrations. Pearson correlation coefficients between the converted video score and other validated scores indicated high correlation.
Conclusion: This is the first validation study to convert a written PROM into a video format. The results show that the video score is reliable, acceptable, and valid, and can therefore be used in clinical practice.
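Cronbach's alpha, reported above as 0.89, is computed from the item variances and the variance of the total score. A minimal Python sketch with hypothetical Likert responses (not the study's data) follows.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical Likert responses: 6 respondents x 4 items (illustrative only)
scores = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 5, 4],
    [1, 2, 1, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```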
- Item (Open Access): Differences in Technical Contact Performance Between Pool and Knockout Stages in Men's International Rugby Sevens (2022). de Klerk, Stephanus; Hendricks, Sharief.
Introduction: Rugby sevens is a high intensity, intermittent, collision field sport requiring a combination of physical fitness, and technical and tactical ability. Research on the running demands of rugby sevens matches has been synthesised to inform training and practice. In contrast, only a paucity of research is available on the technical contact demands. Moreover, less is known about the technical performances of successful teams. Therefore, the first part of this thesis conducted a systematic review of the literature on tackle and/or ruck frequencies within rugby sevens matches to understand the technical contact demands of rugby sevens. The second part of this thesis is an original study that retrospectively analysed and compared tackle and ruck events between the pool and knockout stages in one full season of the 2018/2019 International Men's Rugby Sevens World Series.
Methods: For part one, a systematic search according to the PRISMA guidelines was performed on three electronic databases. The key word combinations included "Rugby Sevens" OR "Rugby" AND "Sevens" OR "Sevens" AND "Contact Demands". The initial search across the databases retrieved 812 titles. Abstracts and full-text articles that presented quantitative data on tackle and/or ruck frequencies or rates within a given match or tournament were included. After the screening process, a total of 15 articles were included in the final review. For the second part of the thesis, all matches from the 2018/2019 International Men's Rugby Sevens World Series were analysed for tackle and ruck events using Sports Code elite version 6.5.1. This equated to 21 226 tackle events and 6 345 ruck events across 450 matches.
Results: The systematic review found that the mean rucks per match ranged from 7.1±4.6 (mean±SD) to 9.5±4.5 for winning teams and from 7.6±3.7 to 11.1±4.6 for losing teams at men's elite level. From a tackle demands perspective, studies at men's elite level found that the mean tackles per match were 20.3±6.7 for winning teams and 20.4±6.1 for losing teams. In the original study, the mean tackles per match were 47.2 (95% CI 46.4-48.0) across the season, with no significant difference between the pool and knockout stages of the tournaments. The mean rucks per match were 14.1 (95% CI 13.7-14.5) across the season, with a significant difference between the stages of competition (p < 0.001) (pool 14.8, 95% CI 14.2-15.4 vs knockout 13.3, 95% CI 12.7-13.9). Tackle variables that proved significant for tackle outcomes in pool matches included the type of tackle, point of body contact, tackle sequence, attacker intention, and match rank. For knockout matches, only point of body contact and attacker intention proved to be significant.
Discussion: The systematic review provides a synthesis of the current state of technical contact demands in rugby sevens. The next step was to understand contact performance and identify the determinants of contact success in rugby sevens. In the original study, pool and knockout stages had similar tackle frequencies but dissimilar ruck frequencies, with more rucks occurring in the pool stages. Higher ranked teams and teams progressing to the knockout stages of competition had fewer rucks and successful tackles, which showed that these teams were more proficient at evasive play with regard to contact performance.
Practitioners and coaches can use this information to plan contact training and optimise tournament preparation for the Sevens World Series. Together, the systematic review and original study give insight into the contact demands and performance of a rugby sevens match. With stakeholder involvement, this research has the potential to create innovative injury prevention and performance strategies to be implemented across all rugby sevens platforms.
- Item (Open Access): Does a greater training load increase the risk of injury and illness in ultramarathon runners? A prospective descriptive, longitudinal design (2020). Craddock, Nicole; Burgess, Theresa; Lambert, Mike; Buchholtz, Kim.
Background: Ultramarathon running has become extremely popular over the years. Despite the numerous health benefits of running, there are also negative effects, such as an increased risk of musculoskeletal injury and illness. Training loads imposed on an athlete should induce positive physiological adaptations to improve performance. Monitoring an athlete's training load has become extremely important for injury prevention. Currently, the relationship between training loads and injury and illness incidence is uncertain. More research is needed in this field to minimise the risk of injury and illness and maximise performance in ultramarathon runners.
Aim: To determine whether there are any associations between injury and illness incidence and training loads among ultramarathon runners in the 12 week period preceding an ultramarathon event and the four week period after the event.
Specific objectives:
- To describe the incidence rate of overall and region-specific running-related injuries in a population of ultramarathon runners in the 16 week period surrounding an ultramarathon event.
- To describe the incidence rate of illness and illness-related symptoms in a population of ultramarathon runners in the 16 week period surrounding an ultramarathon event.
- To describe the weekly and cumulative training parameters (training volume, training frequency, training intensity, training duration) of the injured and uninjured groups and the ill and healthy groups over the 16 week period.
- To describe the weekly and cumulative absolute training load parameters (internal load, external load) of the injured and uninjured groups and the ill and healthy groups over the 16 week period.
- To describe the weekly relative training load parameters (acute:chronic workload ratio, ACWR) of the injured and uninjured groups and the ill and healthy groups over the 16 week period.
- To determine whether there are any significant differences between the injured and uninjured groups and the ill and healthy groups with regard to: a) mean training parameters; b) mean internal training load; and c) mean external training load, over the 16 week period.
- To identify any significant associations between: a) absolute training load (internal training load; external training load) and injury and illness incidence; and b) relative training load and injury and illness incidence over the 16 week period.
Methods: A prospective, descriptive, longitudinal study was conducted in runners who were training for the 2019 Two Oceans Ultramarathon. One hundred and nineteen participants were recruited and tracked over a period of 16 weeks (12 weeks leading up to the Two Oceans Ultramarathon event and four weeks afterwards). Data were collected once a week via an online logbook. Training parameters measured included weekly average running distance, average duration, average frequency and average session RPE. Injury data included injury counts, the structure injured, the main anatomical location and time-loss from injury. Illness data included illness counts, the main illness-related symptoms and time-loss from illness.
Results: The overall injury incidence proportion was 31%. The week after the ultramarathon race had the highest injury proportion of 7%. The overall injury incidence was 5 per 1000 training hours. The average time-loss due to injury was three training sessions missed. The overall illness incidence proportion was 66%. The week after the ultramarathon race also had the highest illness proportion of 22%. The overall illness incidence was 16 per 1000 training days. The average time-loss due to illness was three training sessions. A moderate significant negative association was found between external training load and injury (r=-0.56; p=0.025). No associations were found between internal training load and injury, or between internal and external training load and illness. A significant relationship was found between external training load and injury incidence in weeks 5 to 8 for participants who ran less than 30 km per week. A significant relationship was found between external training load and illness incidence in weeks 5 to 8, 9 to 12 and 13 to 16 for participants who ran less than 30 km per week. A significant relationship was found between an ACWR of >1.5 and injury incidence in weeks 1 to 4, 5 to 8 and 13 to 16. A significant relationship was found between an ACWR of <0.5 and illness incidence in weeks 13 to 16.
Conclusion: A lower training load could potentially predispose runners to running-related injuries or the development of illness. Specifically, a weekly mileage of less than 30 km may increase the risk of sustaining an injury or illness when training for an ultramarathon event. An ACWR greater than 1.5 may increase the risk of injury in the subsequent week of training, and an ACWR less than 0.5 may increase the risk of illness in the following week. Non-gradual changes to weekly training load, whether increases or decreases, could increase the risk of incurring a running-related injury or illness. Maintaining an ACWR between 0.5 and 1.5 appears optimal for minimising the risk of sustaining a running-related injury or illness. We therefore recommend the use of both absolute and relative workloads in the monitoring of an athlete's training load, with the aim of minimising injury and illness risk and maximising performance in ultramarathon runners.
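The ACWR thresholds discussed above (0.5-1.5) compare a runner's most recent week of load against the average of the preceding weeks. A minimal Python sketch follows, assuming the common rolling-average formulation (acute = latest week, chronic = mean of the last four weeks); the weekly distances are hypothetical, not the study's data.

```python
# Minimal sketch of an acute:chronic workload ratio (ACWR), assuming the
# rolling-average formulation: acute = latest week's load, chronic = mean
# weekly load over the last four weeks. Values are hypothetical.

def acwr(weekly_loads: list[float]) -> float:
    """ACWR from at least four consecutive weekly loads (e.g. km or sRPE units)."""
    if len(weekly_loads) < 4:
        raise ValueError("need at least four weeks of load data")
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-4:]) / 4
    return acute / chronic

weeks = [42.0, 45.0, 50.0, 85.0]  # hypothetical weekly running distance (km)
ratio = acwr(weeks)
print(f"ACWR: {ratio:.2f}")
if ratio > 1.5:
    print("Spike in load: elevated injury risk per the thresholds above.")
elif ratio < 0.5:
    print("Sharp drop in load: elevated illness risk per the thresholds above.")
```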
- Item (Open Access): Does the use of upper leg compression garments aid performance and reduce post-race Delayed Onset Muscle Soreness (DOMS)? (2021). Kabongo, Ken; Bosch, Andrew.
Introduction: Despite the lack of scientific knowledge on the physiological and biomechanical effects of wearing compression garments, there has been an increase in the use of these garments in endurance running. The purpose of this study was to compare the performance, pain and thigh circumference changes in endurance runners using upper leg compression garments with those of runners who did not use compression garments in the same marathon race.
Methods: A randomised controlled intervention study was conducted in endurance runners (n=18) participating in the 2019 Winelands Marathon (42.2 km). The compression garment group (n=10) ran the race wearing upper leg compression garments while the control group (n=8) did not. Participants in the compression garment group wore the compression garments only during the marathon. Various outcome measures of perceived exercise-induced muscle damage (EIMD) and running performance were assessed three days before, immediately post-race and two days post-race. Three days prior to the race, mid-thigh circumference measurements were performed. Immediately post-race, mid-thigh circumference measurements, Visual Analogue Scale (VAS) pain ratings and a Likert scale for determination of muscle soreness were assessed, and race performance times were recorded. Two days post-race, mid-thigh circumference measurements, VAS pain ratings and the Likert scale for determination of muscle soreness were repeated.
Results: VAS pain ratings for the hamstring (compression garment 2.50 vs control group 4.00; p=0.04), knee flexion (compression garment 2.50 vs control group 5.00; p=0.02) and hip extension (compression garment 2.50 vs control group 4.00; p=0.04) differed significantly between the compression garment and control groups immediately post-race. VAS pain ratings for the hamstring (compression garment 0.00 vs control group 1.00; p=0.04), knee flexion (compression garment 1.00 vs control group 2.00; p=0.02) and hip extension (compression garment 1.00 vs control group 2.50; p=0.04) differed significantly between the compression garment and control groups two days post-race. There were no statistically significant differences in any other outcome measures (i.e. the Likert scale for determination of muscle soreness, mid-thigh circumference and race performance) between the compression garment and control groups.
Conclusion: The use of upper leg compression garments is a recovery ergogenic aid which improves VAS pain ratings post-race. The results suggest that upper leg compression garments have a protective effect on the hamstring muscle in runners in the recovery phase. However, since a runner would be in a recovery phase after a marathon, a minor difference would be of little practical advantage since, importantly, there were no statistically significant differences in race performance and thigh circumference measures.
- Item (Open Access): Identifying risk factors contributing to the development of shoulder pain and injury in male, adolescent water polo players (2020). Jameson, Yale; Gray, Janine; Roche, Stephen.
Water polo is a fast-growing adolescent sport that consists of swimming, defending and overhead shooting in an aquatic environment. The high demands on the shoulder to complete these tasks are proposed to cause the high injury incidence reported in the sport. The novelty of this research rests in its clinically valuable contribution to understanding shoulder injury aetiology in adolescent water polo players as overhead throwing athletes. The overall research aim of this thesis was to explore the musculoskeletal profile of the male adolescent water polo player's shoulder and the intrinsic factors associated with shoulder injury risk. An overview of the literature (Chapter 2) explores the biomechanics of water polo, including swimming and overhead throwing; the musculoskeletal adaptations to overhead throwing in water polo compared to other overhead sports; and the epidemiology of shoulder injury in water polo players relative to other overhead sports. Due to the absence of a consensus-based definition of injury in water polo, comparison of existing quality epidemiological studies in the sport was limited. Additionally, although a limited number of studies have proposed potential risk factors for shoulder injury in water polo players, significant correlations are yet to be found. As with other overhead sports, the water polo shoulder is prone to injury due to the generation of high force during a modified upright swimming posture, a repetitive swimming stroke and overhead throwing at high velocities. Male adolescent water polo players were recruited for this study. Chapter 3 describes the adolescent water polo player's shoulder musculoskeletal profile and its association with shoulder injury prevalence throughout a single water polo season. The musculoskeletal variables included pain provocation, range of motion, strength, flexibility and shoulder stability tests, which have been used previously in overhead athletes to investigate injury prevention and performance. There were three steps in the data collection process. Firstly, informed consent and assent, demographic, competition, training and injury history, and a shoulder-specific functional questionnaire were acquired from participants. Secondly, a battery of pre-season musculoskeletal tests was performed, including anthropometry, pain provocation, glenohumeral and upward scapula rotation range of motion, glenohumeral and scapula muscle strength, glenohumeral flexibility and shoulder stability measurements. Thirdly, at the end of the season participants completed an injury report and training load questionnaire. Participants who experienced shoulder pain, with or without medical management, were categorised into the injured group and those who did not were categorised as uninjured. Chapter 3 documents the adolescent water polo player's shoulder musculoskeletal profile, shoulder injury prevalence and the association between these intrinsic risk factors and injury. Specifically, adolescent water polo players presented with significant side-to-side asymmetry in the lower trapezius (p = 0.01), upward scapula rotation ROM at 90° glenohumeral elevation (p = 0.03), glenohumeral internal and external rotation ROM (p = 0.01), glenohumeral internal and external rotation strength (p = 0.05 and p = 0.01 respectively) and the pectoralis minor index (p = 0.01).
Twenty-four participants (49%) sustained a shoulder injury during the season, with the dominant shoulder more commonly affected (54.2%). The most common aggravating factors were identified as throwing (41.7%) and shooting (20.8%). Although significantly lower scores on the pre-season shoulder-specific functional questionnaire (p = 0.01) and significantly greater upward scapula rotation at 90° glenohumeral elevation (p = 0.01) on the dominant shoulder were found in the injured group compared to the uninjured group, no factors were significantly associated with increased injury risk. In conclusion, the findings suggest that male adolescent water polo players are a high-risk population for shoulder injury. It is suggested that improving players', coaches' and parents' health literacy, particularly regarding the shoulder, and incorporating preventative exercises targeting modifiable risk factors and side-to-side asymmetry into pre-season conditioning programmes may reduce the prevalence of shoulder injury in this sporting population. While this research contributes to the epidemiology of shoulder injuries in water polo players, further research is needed to continue to report on injury incidence and associated risk factors, particularly training and workload characteristics, in the water polo population.
- Item (Open Access): Independent association of resting energy expenditure with blood pressure: confirmation in populations of the African diaspora (2018). Dugas, Lara R.
Obesity is a major risk factor for hypertension; however, the physiologic mechanisms linking increased adiposity to elevations in blood pressure are not well described. An increase in resting energy expenditure (REE) is an obligatory consequence of obesity. Previous survey research has demonstrated that REE is an independent predictor of blood pressure and eliminates the co-linear association of body mass index. This observation has received little attention, and there have been no attempts to provide a causal explanation.
- Item (Open Access): Injuries and illnesses in athletes with spinal cord injury during the 2012 London Summer Paralympic Games (2018). Swart, Thomas Frederick.
Background: The Summer Paralympics have grown from the participation of a mere 16 athletes at the 1948 Stoke-Mandeville Games to a large multi-code event of 4176 athletes competing in 20 different sporting codes at the 2012 London Summer Paralympic Games. Unlike able-bodied athletes, Paralympic athletes represent a heterogeneous group of people with varied degrees of physical, mental and physiological impairment. Despite the growth in Paralympic sport, limited research exists describing injury and illness in Paralympic athletes. For athletes with impairment to perform optimally and not jeopardise their health, studies should identify and eventually address risk factors for both injury and illness.
Aim: The main aim of this study was to determine the incidence and nature of illnesses and injuries in a cohort of athletes with spinal cord injury (SCI) during the 3-day pre-competition and 11-day competition period at the 2012 London Summer Paralympic Games. This knowledge could provide an initial framework for future research regarding injury and illness prevention strategies in athletes with SCI.
Methods: This study was a component of a large prospective cohort study conducted over the 14-day period of the London 2012 Summer Paralympic Games, coordinated by the Medical Committee of the International Paralympic Committee (IPC). The data were collected during the 3-day pre-competition and 11-day competition periods. Three data sources were used. Firstly, the IPC provided a comprehensive athlete database containing accreditation number, country code, sports code (20 sports), gender and age. The second data source was the medical encounters recorded by staff who provided care to their own teams; at the London 2012 Summer Paralympic Games, a novel system (WEB-IIS) was used to collect these data via desktop computer, tablet or smart phone. The third data source was an electronic medical data capture system (EMDCS) (ATOS, France), in which the medical staff of the Local Organizing Committee of the London Summer Paralympic Games (LOCOG) were requested to enter all medical encounters, at both the Paralympic Village polyclinic and the sports venues wherever the athlete reported for care. A standardized form was used for this purpose. After comparing all the data, a total of 3009 athletes, of whom 709 were athletes with SCI, formed part of this study. The incidence rate (IR) of illnesses and injuries in athletes with SCI was calculated as the number of illnesses and injuries per 1000 athlete days and was compared to that of a group of all other Paralympic athletes with injury and illness (who had other impairments).
Results: There were significantly more upper limb injuries in athletes with SCI (p=0.0001), with an IR of 6.4 injuries / 1000 athlete days (95% CI 4.6-8.9). The IR for all other athletes was 4.4 injuries / 1000 athlete days (95% CI 3.4-5.8). For lower limb injuries, the IR for athletes with SCI was significantly lower (p=0.0001) at 1.4 injuries / 1000 athlete days (95% CI 0.8-2.5) compared to an average IR of 4.2 injuries / 1000 athlete days (95% CI 3.3-5.4) for all other athletes participating at the 2012 London Paralympic Games. Athletes with SCI had a significantly higher IR for illness than the group of all other athletes (p=0.0004). The IR for illness in athletes with SCI was 15.4 illnesses / 1000 athlete days (95% CI 11.8-20.1), whereas the average for all other athletes was 11.0 illnesses / 1000 athlete days (95% CI 8.7-14.1). The IRs for skin and genito-urinary illness were significantly greater in athletes with SCI (p=0.0001), with an IR of 3.9 illnesses / 1000 athlete days (95% CI 2.5-6.2) for skin illness and 2.3 / 1000 athlete days (95% CI 1.8-4.6) for genito-urinary illness. For all other athletes, the IR for skin illness was 1.8 illnesses / 1000 athlete days (95% CI 1.1-2.7) and for genito-urinary illness 0.5 illnesses / 1000 athlete days (95% CI 0.3-0.8).
Summary: The results of this study present an insight into injuries and illnesses in athletes with SCI. Athletes with SCI have a greater incidence rate of upper limb injuries, and a lower incidence of lower limb injuries, than other Paralympic athletes. Total, skin and genito-urinary illnesses were also significantly greater in athletes with SCI compared to other Paralympic athletes. For clinicians caring for athletes with SCI, the results indicate that more attention should be given to the prevention of upper limb injuries and specifically skin and genito-urinary illnesses.
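Incidence rates of this form are exposure-normalised counts. The sketch below (Python, hypothetical counts rather than the study's data) shows one common way to compute an IR per 1000 athlete days with an approximate 95% CI, assuming Poisson-distributed event counts.

```python
import math

def incidence_rate(events: int, athletes: int, days: int) -> tuple[float, float, float]:
    """IR per 1000 athlete days with an approximate 95% CI, assuming
    events follow a Poisson distribution (log-normal approximation)."""
    exposure = athletes * days            # total athlete days at risk
    ir = events / exposure * 1000
    se_log = 1 / math.sqrt(events)        # SE of log(rate) for a Poisson count
    lo = ir * math.exp(-1.96 * se_log)
    hi = ir * math.exp(1.96 * se_log)
    return ir, lo, hi

# Hypothetical example: 64 injuries among 709 athletes over 14 days
ir, lo, hi = incidence_rate(64, 709, 14)
print(f"IR: {ir:.1f} per 1000 athlete days (95% CI {lo:.1f}-{hi:.1f})")
```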
- Item (Open Access): Injury incidence and severity at the South African Rugby Union (SARU) Youth Weeks Tournaments: a four year study (2018). Marsh, Jarred; Lambert, Michael I; Brown, James.
Introduction: Rugby union (hereinafter referred to as 'rugby') is a contact sport in which players are exposed to repetitive collisions throughout a match. As the risk of injury is relatively high, injury surveillance studies within rugby have become popular. However, most of these studies have focussed on senior players; data on injuries among youth rugby players are limited. This makes it difficult to develop the game to make it safer for youth of all ages.
Objectives: The first objective of this study was to establish whether any injury trends exist across different ages of youth rugby players (13 to 18 years). The second objective was to determine whether the patterns of injuries changed over four years (2011 to 2014).
Methods: The South African Rugby Union (SA Rugby) hosts four local youth tournaments annually for local rugby talent: Craven Week under-13, Grant Khomo under-16, Academy Week under-18 and Craven Week under-18. Injury data were collected from the four SARU Youth Week Tournaments between 2011 and 2014 and compiled into one central SARU injury surveillance database. Injury categories were used to group the data: 'Type', 'Location', 'Event' and 'Severity' of injury were assessed. Injuries were defined as either 'time-loss' (injuries that prevented a player from match participation for one or more days) or 'medical attention' (injuries that required the player to seek medical attention at the time of or after injury but did not require the player to miss a match). Injury rates were expressed as injury incidence densities (IIDs) per 1000 hours of match play, with corresponding 95% confidence intervals (95% CIs) calculated for the number of injuries regardless of whether one person was injured more than once. Incidence densities were considered significantly different from each other if their 95% CIs did not overlap, and Poisson regression analysis was also used.
Results: The 'overall' combined IID across all four years was 54.6 injuries per 1000 hours of match play (95% CI: 51.0-58.2). The combined 'time-loss' IID was 18.9 injuries per 1000 hours of match play (95% CI: 16.8-21.0). 'Time-loss' injury rates were greatest in 2011 (23.2 per 1000 match hours (95% CI: 18.5-28.0)) and were significantly lower in 2013 (13.3 per 1000 match hours (95% CI: 9.7-17.0)). Craven Week under-13 presented a significantly greater 'overall' injury incidence density than the older age groups (71.9 per 1000 match hours (95% CI: 62.4-81.4)). Joint/ligament/tendon injuries were the most common 'overall' and 'time-loss' injuries sustained by players between 2011 and 2014 (30% and 33% respectively). These were followed closely by concussion, which accounted for 29% of 'time-loss' and 12% of 'overall' injuries. A large proportion of both 'overall' (57%) and 'time-loss' (55%) injuries occurred during the tackle event, with the tackler being injured more often than the ball-carrier (37% and 18% respectively). There were, however, no other statistically significant differences in 'overall' and 'time-loss' IIDs between the different tournaments from 2011 until 2014.
Discussion: Significant differences were found when comparing 'overall' and 'time-loss' IIDs between the tournaments: Craven Week under-13 presented significantly greater 'overall' injury incidence densities, a finding that contradicts previous literature within youth rugby research. The tackle (tackler and ball-carrier combined) still accounts for the highest proportion of both 'overall' and 'time-loss' injury events (57% and 55% respectively), in accordance with previous studies. A point of concern, however, was that concussion accounted for 29% of all 'time-loss' injuries and 12% of all 'overall' injuries, suggesting a gradual increase in the number of concussions suffered during the SARU Youth Week Tournaments between 2011 and 2014. Further research is required to determine the reason for this pattern.
Conclusion: Further research within youth rugby cohorts is required to determine the risk associated with involvement at various levels of participation. Injury prevention programmes should focus on reducing the prevalence of concussion at youth level by educating players and coaches about safe tackle techniques. Future studies should focus on local youth cohorts for seasonal injury surveillance.
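The non-overlapping-CI comparison used above can be made concrete. The sketch below (Python, hypothetical injury counts and exposure, not the study's data) computes an IID per 1000 match hours with a Poisson-approximate 95% CI for two tournaments and applies the same overlap rule, under the simplifying assumptions noted in the comments.

```python
import math

def iid_with_ci(injuries: int, match_hours: float) -> tuple[float, float, float]:
    """Injury incidence density per 1000 match hours with an approximate
    95% CI, assuming Poisson-distributed injury counts."""
    rate = injuries / match_hours * 1000
    se_log = 1 / math.sqrt(injuries)
    return rate, rate * math.exp(-1.96 * se_log), rate * math.exp(1.96 * se_log)

def cis_overlap(a: tuple, b: tuple) -> bool:
    """True if the two (rate, lo, hi) intervals overlap."""
    return a[1] <= b[2] and b[1] <= a[2]

# Hypothetical tournaments: (injuries, total player exposure in match hours)
t1 = iid_with_ci(210, 2920.0)   # e.g. a younger age-group tournament
t2 = iid_with_ci(140, 2920.0)   # e.g. an older age-group tournament

for name, t in [("Tournament 1", t1), ("Tournament 2", t2)]:
    print(f"{name}: {t[0]:.1f} per 1000 match hours (95% CI {t[1]:.1f}-{t[2]:.1f})")
print("Significantly different (non-overlapping CIs):", not cis_overlap(t1, t2))
```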
- Item (Open Access): Innovative spinal cord injury rehabilitation in the context of a middle-income country: a pilot randomised control study investigating physiological and psychological effects (2021). Evans, Robert William; Albertus, Yumna; West, Sacha; Derman, Wayne.
A spinal cord injury (SCI) is life-altering, resulting in neurological deficits and a multitude of secondary complications. South Africa has one of the highest traumatic SCI incidence rates in the world, and the social need for SCI prevention and rehabilitation is immense. Robotic locomotor training (RLT) is a novel rehabilitation technique that may improve health and wellbeing after SCI. A systematic review was conducted across 27 studies and 308 participants to explore the systemic effects of RLT. This review demonstrated that RLT shows promise as a tool for improving neurological rehabilitation outcomes, providing individuals with a SCI the ability to walk safely while improving their walking performance, and potentially improving cardiovascular outcomes and psychosocial factors. However, the studies reviewed were non-controlled, with small, heterogeneous samples. Further high-powered, randomised controlled trials with homogeneous samples are required to investigate these effects. If widespread adoption of these new technologies is to occur, sound evidence demonstrating efficacy and long-term cost saving is required. This dissertation aimed to explore some of these under-researched areas in a sample of sixteen persons with incomplete tetraplegia. Areas of focus included: 1) rehabilitation feasibility, adherence, and research challenges in an under-resourced environment; 2) cardiovascular functioning and adaptation to a rehabilitation programme; and 3) psychological well-being. We implemented two interventions, robotic locomotor training (RLT) and activity-based training (ABT), over a 24-week pilot randomised control trial. Adherence to the interventions was high (93.9 ± 6.2%). Challenges to the study's feasibility included ethical approval, medical clearance, transport and limited human/financial resources. Cardiovascular parameters demonstrated that the efficiency of exoskeleton walking improved during the intervention. RLT may be more effective than ABT in improving cardiac responses to orthostatic stress, with standing heart rate at 24 weeks being significantly lower in the RLT group (75.1 ± 15.0 beats/min) than in the ABT group (95.6 ± 12.6 beats/min). Standing and RLT had similar effects on the parasympathetic nervous system, whilst both interventions were limited in their effect on brachial and ankle blood pressure. Despite experiencing past trauma, participants possessed psychological resources, including resilience, self-efficacy and post-traumatic growth, which contributed to high perceptions of quality of life. The use of an exoskeleton may have had a greater positive impact on subjective psychological well-being. Expectations of participants entering the study centred around regaining the ability to walk again, despite past experiences and medical advice suggesting otherwise. Hope aids in buffering against negative emotions; however, a thin line exists between supporting high expectations and confronting unrealistic hope. Initial high expectations of recovery decreased and became more realistic during the intervention. This dissertation demonstrates potential physiological and psychological benefits that RLT provides.
Despite this potential, barriers exist to the use of RLT in low- and middle-income countries such as South Africa, primarily due to a lack of financial and human resources. The development of lower-cost exoskeletons would lessen the burden of conducting large-scale trials and increase the likelihood of adopting these innovative rehabilitation tools into current standard-of-care practices.
- Item (Open Access): Mechanisms underlying the development of weakness in idiopathic inflammatory myopathies: an in vitro single muscle fibre contractility study (2018). Henning, Franclo; Kohn, Tertius A; Carr, Jonathan A.
Introduction: Polymyositis (PM), dermatomyositis (DM) and necrotising autoimmune myopathy (NAM) form part of the spectrum of idiopathic inflammatory myopathies (IIMs). Although the pathogenic mechanisms differ, the unifying feature is weakness caused, in some way or another, by an inflammatory attack on muscle. The mechanism by which weakness develops is still unclear, but experimental animal data suggest that dysfunction of the contractile apparatus might contribute to muscle weakness in these conditions. This study investigated the contractile function of single muscle fibres from patients with IIMs in vitro.
Methods: Muscle biopsies obtained from patients with IIMs and healthy controls were dissected and chemically permeabilised. Single muscle fibres were dissected out and subjected to contractility measurement based on standard protocols utilising a permeabilised single fibre system. Specific force (SF; maximum force normalised to cross-sectional area) was calculated for each fibre and compared between the two groups. In addition, maximum shortening velocity and power output were assessed in some of the fibres, and calcium sensitivity in the rest. The myosin heavy chain composition of each fibre was determined by means of gel electrophoresis.
Results: A total of 178 fibres from IIM cases and 174 fibres from controls were studied. Specific (normalised) force was 23%, 24% and 29% lower in the IIM group for all fibre types combined, type I fibres, and type IIa fibres, respectively. Shortening velocity and maximum power output were significantly higher in the IIM group for both type I and IIa fibres compared to controls, while calcium sensitivity was higher in type IIa fibres from IIM cases than controls.
Discussion: The findings from this study suggest that weakness in IIMs may, at least in part, be caused by dysfunction of the contractile apparatus leading to impaired contractile force. The higher shortening velocity, power output and calcium sensitivity in fibres from IIM cases probably represent compensatory mechanisms. Although the mechanism by which contractile function is affected was not investigated, animal studies suggest a role for TNF-α. The findings of this study provide a basis for further investigation into the mechanisms underlying weakness in IIMs.
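Specific force as defined above is maximum force divided by fibre cross-sectional area. A minimal Python sketch follows, assuming a circular cross-section estimated from a measured fibre diameter (a common convention in permeabilised-fibre work) and hypothetical values rather than the study's measurements.

```python
import math

def specific_force(max_force_mn: float, diameter_um: float) -> float:
    """Specific force in kN/m^2, assuming a circular cross-section:
    SF = P0 / CSA, where CSA = pi * (d/2)^2."""
    csa_um2 = math.pi * (diameter_um / 2) ** 2   # cross-sectional area, um^2
    csa_m2 = csa_um2 * 1e-12                     # um^2 -> m^2
    force_n = max_force_mn * 1e-3                # mN -> N
    return force_n / csa_m2 / 1000.0             # N/m^2 -> kN/m^2

# Hypothetical fibre: 0.45 mN peak force, 80 um diameter (illustrative only)
print(f"SF: {specific_force(0.45, 80.0):.0f} kN/m^2")
```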
- Item (Open Access): Peripheral arterial disease and intermittent claudication: Efficacy of short term upper body strength training, dynamic exercise training, and advice to exercise at home (2009).
OBJECTIVE: To compare the effect of two training programmes and advice to exercise at home on physiological adaptations in patients with peripheral arterial disease (PAD).
DESIGN: 30 patients with a typical history of PAD and intermittent claudication were randomised to an upper body strength training programme (UBST), a dynamic (walking, cycling, circuit) conventional exercise rehabilitation programme (CER), or advice to 'walk as much as possible at home' (CONT). Before and after the intervention, groups performed a standard graded treadmill exercise test (GTET) and a 6-minute walk test (SMWT) to determine peak physiological parameters and walking distances. Maximal walking distance (MWD), pain-free walking distance (PFWD), peak oxygen uptake (VO2), heart rate and perceived pain were measured.
RESULTS: MWD on the GTET increased significantly in the CER group compared with the CONT and UBST groups (93.9 ± 79% v. 7.0 ± 19.8% v. 7.3 ± 46%; CER v. UBST v. CONT; p = 0.003). Similarly, peak VO2 increased with CER compared with the CONT and UBST groups (28.4 ± 20% v. -6.2 ± 15% v. -1.0 ± 21%; CER v. UBST v. CONT; p = 0.004). During the SMWT the CER and UBST groups improved in PFWD compared with the CONT group (37 ± 47% v. 27 ± 71% v. -30 ± 29%; CER v. UBST v. CONT; p = 0.03), and perceived pain decreased in the CER group compared with the UBST group (-24 ± 39% v. 27 ± 48%; CER v. UBST; p = 0.01).
CONCLUSION: CER improves physiological parameters and walking distances more than UBST does, and is effective within 6 weeks. Verbal encouragement to exercise is an ineffective form of management.
- Item (Open Access): Physical activity and gross motor skills in rural South African preschool children (2018). Tomaz, Simone Annabella; Draper, Catherine; Hinkley, Trina; Jones, Rachel.
Background: Global levels of overweight and obesity in preschool-aged children have increased dramatically in the last two decades, with most overweight and obese children younger than five years living in low- and middle-income countries (LMICs). Statistics from the 2013 South African National Health and Nutrition Examination Survey (SANHANES-1) confirm that levels of overweight and obesity are high in South African preschool-aged children, with prevalence rates of overweight and obesity up to 18.2% and 4.7%, respectively. This increasing problem of overweight and obesity in South African preschool-aged children highlights the need for intervening in this age group. Overweight and obesity interventions in preschool children typically target one or more of the following behaviours: physical activity, sedentary behaviour and screen time.
Aim and objectives: The aim of this study was to characterise the preschool environment in rural South Africa, and to explore physical activity, gross motor skill proficiency, sedentary behaviour and screen time in rural South African preschool-aged children. Additional aims were to explore the associations between gross motor skills, body composition and physical activity, and to assess compliance with current physical activity and sedentary behaviour guidelines.
Methods: Preschool-aged children (3-5 years old, n=131) were recruited from three Preschools and two Grade R (reception year) settings in Agincourt, a rural village in north-eastern South Africa. To gain an understanding of the Preschool and Grade R settings, an observation of the preschool environments was conducted using a tool adapted from the Outdoor Play Environmental Categories scoring tool, the Environmental and Policy Assessment and Observation instrument, and the Early Learning Environments for Physical Activity and Nutrition Environments Telephone Survey. Each child's height and weight were measured. Physical activity and sedentary behaviour were measured objectively using a hip-worn ActiGraph GT3X+ accelerometer for 7 days (24 hours, only removed for water-based activities). Gross motor skills were assessed using the Test for Gross Motor Development–Version 2 (TGMD-2). Physical activity and sedentary behaviour during the preschool day (08h00 until ±12h00), including contextual information for these behaviours, were measured using the Observational System for Recording Physical Activity in Children (Preschool Version). A separate sample of parents/caregivers (n=143) was recruited to complete a questionnaire adapted from the Healthy Active Preschool Years questionnaire and the Preschool Physical Activity Questionnaire. Parents reported on their child's screen time, and on factors within the home and community contexts in which physical activity and sedentary behaviours occur.
Results: In terms of the environment, the Preschool and Grade R settings differed in that fixed play equipment featured only in the Preschool settings, while Grade R settings had more open space in which to play. All Preschool and Grade R settings provided children with limited portable play equipment, and none of the schools had access to screens. Although all children recruited for the study were preschool-aged, the Grade R children were significantly older than the Preschool children (5.6±0.3 years vs. 4.4±0.4 years, p < 0.05). According to IOTF cut-offs, the prevalence of overweight/obesity was low (5.0%) in the sample, and 68.1% of children were classified as normal weight. On average, children spent 477.2±77.3 minutes in light- to vigorous-intensity physical activity (LMVPA) per day, and 93.7±52.3 minutes in moderate- to vigorous-intensity physical activity (MVPA). In terms of the new current guidelines (180 min/day of LMVPA, including 60 min of MVPA, described as 'energetic play'), and using the average daily LMVPA and MVPA, 78.2% met current guidelines. Observed and objectively measured sedentary behaviour results revealed that children were more sedentary during preschool time (between 08:00 and 12:00) than in the afternoons. Overall, boys were significantly more physically active than girls, and Preschool children did more physical activity during preschool time than Grade R children (all p < 0.05). Over 90% of the sample achieved an 'average' or better ranking for gross motor skill proficiency. The Grade R children were significantly more proficient than the Preschool children on all gross motor skill components (raw scores and standardised scores). Overall, boys achieved significantly better object control raw scores than girls, and displayed greater proficiency than girls in the strike (p=0.003), stationary dribble (p < 0.001) and kick (p < 0.001). None of the Preschool or Grade R settings had access to screens such as televisions or iPads, and parent-reported screen time was low for the total sample (0.5±0.3 hr/day). The majority of the sample (97.9%) met current screen time guidelines (<1 hour per day). Most parents (82.5%) reported believing that their child did sufficient physical activity for their health, but 81.8% also reported believing that television time would not affect their child's health. Parent responses revealed neighbourhood safety as a potential barrier to being physically active in the community.
Conclusions: Rural preschool-aged children in South Africa appear to engage in adequate amounts of physical activity, particularly LMVPA, and are adequately proficient in gross motor skills. The children did not engage in excessive amounts of screen time. Overweight and obesity were not prevalent in this sample of rural preschool-aged children, and therefore it would appear that an intervention to reduce or prevent obesity by increasing physical activity, improving gross motor skills and reducing screen time is unnecessary. Rather, interventions that facilitate an increase in levels of MVPA in order to meet current physical activity guidelines are warranted. Additionally, it is essential that the high levels of physical activity (LMVPA) and the good foundation of gross motor skills observed in this sample are promoted in an effort to maintain them throughout childhood. Future research may want to determine whether these activity patterns (high levels of LMVPA, low levels of screen time) track throughout childhood and into adolescence.
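Guideline compliance as reported above is a per-child check of average daily minutes against the stated thresholds. A minimal Python sketch follows, assuming the thresholds quoted above (180 min/day LMVPA including 60 min/day MVPA) and hypothetical accelerometer summaries rather than the study's data.

```python
# Minimal sketch of checking movement-guideline compliance, assuming the
# thresholds quoted above (>=180 min/day LMVPA including >=60 min/day MVPA).
# The daily-average minutes below are hypothetical, not the study's data.

children = [
    {"id": 1, "lmvpa_min": 450.0, "mvpa_min": 95.0},
    {"id": 2, "lmvpa_min": 200.0, "mvpa_min": 40.0},
    {"id": 3, "lmvpa_min": 510.0, "mvpa_min": 120.0},
]

def meets_guidelines(child: dict) -> bool:
    return child["lmvpa_min"] >= 180 and child["mvpa_min"] >= 60

compliant = sum(meets_guidelines(c) for c in children)
print(f"{compliant}/{len(children)} children "
      f"({100 * compliant / len(children):.1f}%) met the guidelines")
```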
- ItemOpen AccessPhysiological evaluation of sleep surfaces in healthy volunteers and patients with acute-upon-chronic lower back pain(1998) Hulse, Bronwyn Leigh; Derman, WayneStudies have documented that the use of a lumbar support while in the sitting position results in reduced back and leg pain, centralisation of pain and reduced erector spinae muscle activity in patients with lower back pain (LBP). While the positive effects of a lumbar support in sitting have been studied, few researchers have attempted to document the value of such a support in the supine position. Since many patients with LBP suffer from insomnia and nocturnal discomfort, it may be possible that the use of a foam surface overlay could positively influence their symptoms. Several foam surface overlays are currently used as a popular form of management for patients presenting with LBP. These include the convoluted foam surface ("egg box'' shape), which to my knowledge has not been scientifically studied and the lumbar body support, the value of which has only recently been reported. That study found that patients with chronic LBP have decreased electromyographic (EMG) activity of the erector spinae muscles, lower heart rates (HR) and decreased perception of discomfort (ROD) when lying on this locally designed, triple density, contoured, lumbar body support system (LBS) compared with a conventional flat innerspring mattress (CM). Accordingly the aim of this thesis was to measure the EMG activity, heart rate response, perception of comfort and pattern of pressure distribution after lying on a variety of different surfaces, thus endeavouring to determine a mechanism of action of the LBS. In the first study of this thesis, ten patients with LBP were exposed to a random order, 30 minute period on three sleep surfaces: Lumbar body support on top of a conventional mattress (LBS+ CM), 60 mm convoluted foam surface on top of a conventional mattress (CFS + CM), and a conventional mattress (CM) alone. Each patient acted as his/her own control. Recordings of EMG activity, HR and ROD were measured for each patient. Average HR over the 30 minute period was lower after acute exposure to the LBS+ CM (60 ± 11 b/min) compared to the CM (66 ± 10 b/min, p < 0.05; LBS+ CM vs. CM). Although average HR response to the LBS+ CM was lower compared to CFS + CM (64 ± 9 b/min), this difference was not significant. ROD reported after acute exposure to the LBS+ CM was improved (1.9 ± 0.7 units), compared to the CFS+ CM (3.9 ± 1.0 units) and CM (4.7 ± 2.2 units; p < 0.05). Average EMG activity was lower after 30 minutes on the LBS + CM (2.68 ± 1.1 mv) compared to the CFS+ CM (4.46 ± 2.7 mv) and CM (4.19 ± 2.4 mv; p < 0.05). These results suggest that patients with LBP have reduced EMG activity and HR measurements with lower ROD when lying on a LBS + CM compared with a CM and CFS + CM. The second series of experiments involved a further ten patients with lower back pain, who were required to lie supine in random order on the LBS + CM, on a polystyrene mould (PM) (identical to the shape of the LBS) and on a CM. Recordings of EMG activity, HR and ROD were measured for each patient. Average HR over the 30 minute period was lower on the LBS + CM (60 ± 7 b/min) vs. PM + CM (66 ± 10 b/min) and CM (68 ± 9 b/min; p < 0.01 ). Average ROD was improved when patients lay on the LBS+ CM (1.8 ± 0.6 units) vs. PM + CM (5. 7 ± 2.5 units) and CM (4.1 ± 1.8 units; p < 0.05). 
Furthermore, average EMG activity was significantly reduced after lying on the LBS + CM (2.5 ± 1.0 mV) vs. PM + CM (4.3 ± 1.9 mV) and CM (4.6 ± 1.8 mV; p < 0.01). The findings of this study mirror our initial findings. The elevated EMG activity, heart rate and perception of discomfort after lying on the PM suggest that it could be a combination of both the correct density and the correct contour features that is important in reducing muscle spasm in patients with acute-upon-chronic lower back pain. The third study examined the pattern of pressure distribution on the different surfaces. Pressure distribution on the lumbar body support is altered and pressures are more equally distributed compared with the pressure distribution of the other surfaces measured, without increases in pressure at any point on the body. Similar average and peak pressure results were obtained for the 90 mm CFS + CM and the LBS. Since these results were not mirrored by the 60 mm CFS, the thickness of a foam surface possibly plays a role in reducing pressure. The data of these three separate studies could have implications in the adjunctive treatment of (i) low back pain and (ii) pressure sores. Firstly, the results of this thesis suggest that the use of a 60 mm foam overlay may not be the optimum form of management for patients presenting with paraspinal muscular spasm. Further, it is postulated that the density and contour features of the lumbar body support are likely to play a role in reducing EMG activity and heart rate, while improving perception of comfort, compared to the flat surfaces (CM and 60 mm CFS), which offer little support to the lumbar region of these patients.
Secondly, either the LBS or the 90 mm CFS is likely to reduce the incidence of pressure sores in patients required to lie supine for prolonged periods, owing to the reduction in peak and average pressures. In view of its contoured surface, it is unlikely that pressure sores would develop in patients lying on the LBS. This hypothesis needs to be confirmed in longer-term studies in patients who are severely debilitated or paraplegic, as they are often most at risk of developing pressure sores.
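The two experiments above share a within-subject design: every patient lies on all the surfaces, so conditions are compared within patients rather than between groups. The abstract does not name the statistical test used; the sketch below illustrates one conventional way such data might be analysed (a nonparametric Friedman test across three matched conditions), using invented placeholder values rather than the study's measurements.

```python
# Invented placeholder data for a within-subject, three-surface comparison of
# mean EMG activity (mV); each index across the three arrays belongs to one
# patient who lay on all three surfaces, mirroring the design described above.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
n_patients = 10
emg_lbs_cm = rng.normal(2.7, 1.1, n_patients)  # LBS + CM (hypothetical)
emg_cfs_cm = rng.normal(4.5, 2.7, n_patients)  # CFS + CM (hypothetical)
emg_cm = rng.normal(4.2, 2.4, n_patients)      # CM alone (hypothetical)

# Friedman test: a nonparametric repeated-measures comparison across the
# three matched conditions.
statistic, p_value = friedmanchisquare(emg_lbs_cm, emg_cfs_cm, emg_cm)
print(f"Friedman chi-square = {statistic:.2f}, p = {p_value:.3f}")
```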
- ItemOpen AccessPreventing musculoskeletal injuries among recreational adult volleyball players: design of a randomised prospective controlled trial(2017) Gouttebarge, Vincent; Zwerver, Johannes; Verhagen, EvertBACKGROUND: Both acute and overuse injuries are common among recreational volleyball players, especially finger/wrist, ankle, shoulder and knee injuries. Consequently, an intervention ('VolleyVeilig') was developed to prevent or reduce the occurrence of finger/wrist, shoulder, knee and ankle injuries among recreational volleyball players. This article describes the design of a study evaluating the effectiveness of the developed intervention on the one-season occurrence of finger/wrist, shoulder, knee and ankle injuries among recreational adult volleyball players. METHODS: A randomised prospective controlled trial with a follow-up period of one volleyball season will be conducted. Participants will be healthy recreational adult volleyball players (18 years of age or older) practising volleyball (training and/or match) at least twice a week. The intervention ('VolleyVeilig') consists of a warm-up programme based on more than 50 distinct exercises (with different variations and levels). The effect of the intervention programme on the occurrence of injuries will be compared to volleyball as usual. Outcome measures will be the incidence of acute injuries (expressed as the number of injuries per 1000 h of play) and the prevalence of overuse injuries (expressed as a percentage). DISCUSSION: This study will be one of the first randomised prospective controlled trials evaluating the effectiveness of an intervention on the occurrence of both acute and overuse injuries among recreational adult volleyball players. The outcome of this study could lead to the nationwide implementation of the intervention in all volleyball clubs in The Netherlands, ultimately resulting in fewer injuries. TRIAL REGISTRATION: Dutch Trial Registration NTR6202, registered 1 February 2017. PROTOCOL: Version 3, February 2017.
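For readers unfamiliar with the two outcome measures named in this protocol, the short sketch below shows how they are conventionally computed; the function names and example figures are illustrative, not taken from the trial.

```python
# A short sketch of the two outcome measures defined in this protocol; the
# names and example figures below are illustrative, not the trial's own code.

def acute_injury_incidence(n_injuries: int, exposure_hours: float) -> float:
    """Acute injuries per 1000 hours of volleyball play (training and matches)."""
    return n_injuries / exposure_hours * 1000.0

def overuse_injury_prevalence(n_players_affected: int, n_players_total: int) -> float:
    """Prevalence of overuse injuries, expressed as a percentage of players."""
    return n_players_affected / n_players_total * 100.0

# e.g. 12 acute injuries over 4800 recorded player-hours:
print(acute_injury_incidence(12, 4800))    # 2.5 injuries per 1000 h of play
# e.g. 18 of 120 players reporting an overuse problem:
print(overuse_injury_prevalence(18, 120))  # 15.0 (%)
```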
- ItemOpen AccessProprioception, jumping capacity and agility in beach versus indoor volleyball players(2021) Glossop-von Hirschfeld, Christine; Smits-Engelsman, Bouwien; Ferguson, GillianBackground: Beach volleyball (BVB) is rapidly developing into a popular activity for both recreational and competitive athletes. The majority of injuries sustained playing volleyball (beach and indoor) are considered non-contact. While the commonly injured areas (ankle, knee, lower back and shoulder) are similar in BVB and indoor volleyball (IVB), injury incidence in BVB players is reported as 3.9-4.9 per 1000 hours, which is significantly lower than in IVB (1.7-10.7 per 1000 hours). Several factors contribute to the level of performance as well as to injury risk in volleyball players: body composition, changes in training load, previous injury, balance, proprioception, joint kinematics and muscle strength. There has been recent growth in the literature investigating the role of proprioception in the assessment, management and prevention of musculoskeletal injuries. Proprioceptive retraining strategies are diverse, yet no conclusive evidence demonstrating the superiority of one exercise over another is available. However, consensus exists that proprioceptive training requires movement on an unstable or uneven surface. Although proprioceptive exercises are commonly integrated into sports rehabilitation, there is a lack of high-quality evidence proving that proprioception can be trained. Maximal vertical jumping, lateral cutting sprints and diving to play the ball are repetitively demanded of volleyball players. BVB players complete these actions on sand (an uneven and unstable surface), making this sport, by definition, a continuous proprioceptive training exercise. By comparing two groups (IVB and BVB players) who perform a very similar sport on different surfaces (indoor and sand), we wish to investigate whether this may have led to differences in proprioception. Furthermore, we would like to measure the possible influence of this training aspect on functional capacity (lower limb range and strength, agility and vertical jump height). Aim: To compare proprioception, functional lower limb capacity, agility and jumping capacity on two different surfaces between non-professional BVB and IVB players. Methods: A descriptive, cross-sectional, analytical study was conducted. The study adhered to the research ethics guidelines of the Declaration of Helsinki, and the study protocol was approved by the Faculty of Health Sciences Human Research Ethics Committee, University of Cape Town. Convenience sampling was used to recruit 30 non-professional volleyball players (15 BVB and 15 IVB players) in the Western Cape who met the inclusion criteria. Each player attended a testing session, where written informed consent was obtained and a screening questionnaire was administered to determine eligibility to participate. Due to the COVID-19 pandemic, participants also completed a COVID-19 screening tool.
If eligible to take part in the study, participants completed two questionnaires (a Training and Injury History questionnaire and the OSTRC (Oslo Sports Trauma Research Centre) questionnaire), after which they completed seven physical tests (the wedge test, two-point discrimination test, modified balance error scoring system (mBESS) test, modified star excursion balance test (mSEBT), knee-to-wall test, single leg hamstring bridge test (SLHBT) and eccentric-concentric calf raise test). They then performed two jumping tests (the countermovement jump with arm swing (CMJA) test and the single leg triple hop for distance (SLTHD) test) and an agility test (the modified agility T-test (MAT)), both on sand and on hard surfaces. Descriptive statistics were used to present the demographic data, training history and injury history. A t-test was used to determine whether the two groups were comparable on anthropometric data. Differences between the two groups (BVB and IVB players) in proprioception, agility and jumping capacity were analysed using Mann-Whitney U and unpaired t-tests. Repeated measures ANOVA was used to determine any differences in agility, jumping height or hopping distance between IVB and BVB players when tested on different surface conditions (surface being the within-group factor and player type the between-group factor). Effect sizes were also reported for the physical outcome measures, to determine the strength of any trends in differences between the two player groups. Results: The IVB and BVB groups were similar regarding demographics, training history and injury prevalence. Age was the only variable found to be significantly higher in BVB players than in IVB players (p < .001). There were no significant differences in most measures of balance, strength, agility or jumping capacity between the groups. While the result of the proprioception measure (wedge test) was also not significant (p = 0.08), a medium effect size (Cohen's d = 0.66) was found, with the BVB group identifying more differences in wedge heights correctly. There was a significant difference in the anterior reach of the Y-balance test (right and left legs) between the groups (p < 0.05), with the BVB group outperforming the IVB group. The study showed no significant correlations between proprioceptive measures and functional outcomes. A repeated-measures ANOVA determined that there was a significant main effect of surface type on mean CMJA heights (Wilks' Lambda = 0.799, F(1,28) = 7.040, p = .013), mean left leg SLTHD distances (Wilks' Lambda = 0.522, F(1,28) = 25.654, p < .001) and mean right leg SLTHD distances (Wilks' Lambda = 0.473, F(1,28) = 31.169, p < .001). However, no surface-by-player-group interactions emerged, indicating that the impact of the surface did not differ between the groups of players: all volleyball players ran faster, jumped higher and hopped further on the indoor floor than on the sand. Discussion and conclusion: The findings of this study suggest that there are no consistent differences in functional capacity between IVB and BVB players. Despite limited findings, the current study contributes to the literature, as it is one of only a few studies to assess the effect of habitual sand training on functional performance measures in IVB and BVB players. It is hoped that this study provides a basis for further investigation into training on different surfaces to improve functional outcome measures, for overall performance improvement.
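The wedge-test result in this abstract is a case where a significance test (p = 0.08) and an effect size (Cohen's d = 0.66) tell complementary stories. The sketch below shows the standard pooled-SD computation of Cohen's d for two independent groups of 15; the score arrays are invented placeholders, not the study's data.

```python
# Pooled-SD Cohen's d for two independent groups, as reported for the wedge
# test above (d = 0.66 despite p = 0.08). The score arrays are invented
# placeholders for 15 players per group, not the study's measurements.
import numpy as np

def cohens_d(group_a: np.ndarray, group_b: np.ndarray) -> float:
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = (((n_a - 1) * group_a.var(ddof=1)
                   + (n_b - 1) * group_b.var(ddof=1)) / (n_a + n_b - 2))
    return (group_a.mean() - group_b.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
bvb = rng.normal(8.0, 1.5, 15)  # hypothetical wedge-test scores, BVB group
ivb = rng.normal(7.0, 1.5, 15)  # hypothetical wedge-test scores, IVB group
print(f"Cohen's d = {cohens_d(bvb, ivb):.2f}")
```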
- ItemOpen AccessSocio-ecological factors in talent development in cricketers in a diverse society(2018) Dove, Mary Ann; Draper, Catherine E; Gray, Janine; Taliep, SharhidIntroduction: In recent years, there has been a move to understand the environment and context in which athletes develop. South Africa’s unique context provides an opportunity to understand how environmental factors could influence talent development in cricket. Since the advent of democracy, there has been limited representation of Black African cricketers at the elite levels in South Africa. Therefore, the aim of this thesis was to determine the role that socio-ecological factors may play in the development of cricket talent in a diverse society. Methods: Qualitative research methods were used to explore the experiences and perceptions of South Africa’s male cricketers as they progressed through the talent pathway, from exposure to the game to the elite level. The perceived effectiveness of the introduction of an ethnic target policy was also explored. Seventy-one semi-structured interviews were conducted with a purposive sample of players from all ethnic groups (n=43) and with knowledgeable and experienced key informants (n=16). A thematic analysis of the data resulted in the identification of themes, which are presented using a multi-level socio-ecological framework. Results: All players progressed to the elite level; however, their access points to and routes through the pathway varied. This progress was influenced by the inter-relationship of distal and proximal socio-ecological factors that they experienced during their cricketing careers. These influences can be summarised into five talent development components that acted either as barriers or enablers to progress: (1) access to opportunities and competition, (2) holistic player development, (3) effective support networks, (4) inclusive team environments, and (5) adaptive mind-sets. In addition, various intrapersonal characteristics were identified that further affect a player’s ability to achieve elite cricketing success. Finally, it was determined that an ethnic target policy alone is not an effective intervention for developing cricket talent in a diverse society undergoing transition. Conclusion: A socio-ecological framework for talent development lends additional support to the idiosyncratic, multifactorial, dynamic and complex way in which cricket expertise is achieved, particularly in diverse societies. It provides stakeholders involved in the talent development process with evidence to inform policy and practice, as well as to design effective interventions.