
Browsing by Author "Lambert, Michael I"

  • Open Access
    A 12 week pre-season fitness training programme for senior male high school rugby players: the effect of supervision on anthropometric, physiological and physical performance variables
    (1998) Clark, David Rodney; Lambert, Michael I
    The study comprises two sections: (i) a survey to determine the attitude towards fitness training for rugby and the current fitness training habits of elite high school rugby players in their penultimate year at school, and (ii) a training study on a sample of the same population group, to measure the effect of a 12 week fitness training programme, based on scientific principles, on anthropometric, physiological and performance variables. The training study also measured the efficacy of training supervision compared with no supervision on these variables.
  • Open Access
    The assessment of the efficacy of the mobile training system after implementation in South African rugby playing schools
    (2014) Van Aarde, Roedolf Frederik; Lambert, Michael I
    Introduction: Rugby Union is a sport where physical size matters and the bigger, stronger and better conditioned players have an advantage over smaller and less powerful opponents. Research on adolescent rugby players in South Africa showed that Coloured and Black players weighed 8 kg less than their White counterparts. A possible explanation for the difference in size was the lack of weight training facilities in the disadvantaged areas. Therefore, to address the potential handicap for these players having to compete against bigger players, the South African Rugby Union and the High Performance Centre at the Sport Science Institute of South Africa developed a mobile schools training system (MSTS). These are fully equipped units with sufficient weight training equipment for an entire team. The aim of this study was to determine whether the fitness characteristics associated with rugby changed in players after the MSTS was given to a school for several months. Training of players was not controlled or supervised by any personnel outside the infrastructure of the school. A secondary aim was to interview the staff member at each school responsible for the MSTS to enquire about their perceptions of the MSTS and whether there were any barriers to the uptake by the schools and players. Methods: Schools with a “rugby ethos” and from a previously disadvantaged background were selected by SARU for the MSTS Programme. Players (U16 and U18 age groups) at these schools participated in the study. A total of 382 players were tested both before they had exposure to the MSTS and approximately 16 weeks later. They were divided into two age groups: U18 (n = 224 forwards and backs) and U16 (n = 158 forwards and backs). The following characteristics were measured: stature, body mass, % body fat, muscular strength (bilateral grip strength and bench press), muscular endurance (1 min push-ups), sprint times (10 m and 40 m) and aerobic capacity (multi-stage shuttle run test).
All tests were conducted during February and October of 2013. A rating of the extent to which the players used the MSTS was also calculated and this was used to categorise schools. Data are represented as means ± standard deviation. A repeated-measures analysis of variance (ANOVA) was used to determine whether there were significant differences between the ‘pre’ and ‘post’ rounds of testing using ‘age’, ‘province’ and whether the ‘gym was used or not’ as main effects. The interactions ‘age x time’, ‘province x time’ and ‘gym usage x time’ were calculated. If any interactions were significant, a Tukey post hoc test was used to identify specific differences. Statistical significance was accepted when p < 0.05. Coaches at the schools participated in interviews to determine the barriers to implementation of the programme, and which areas needed to be improved. Results: Changes over time were only shown for body mass (p < 0.037) and bench press (p < 0.001) in schools where the gym was used compared with schools that did not use the gym. When comparing U16 vs. U18 age groups, the U18 players were significantly taller and heavier, had less % body fat, and had better performance for grip strength, bench press, push-ups, 10 m and 40 m sprint time and the multi-stage shuttle test (MSST) compared to the U16 players (p < 0.04). There was also a significant interaction (age x time) for stature (p < 0.002), body mass (p < 0.011) and % body fat (p < 0.002). When comparing the 5 provinces of the U16 age group, pre-post differences were noted for stature, body mass, bench press and the MSST between provinces (p < 0.0001). Interactions (province x time) for changes over time between the 5 provinces were shown for stature, body mass, % body fat, bench press, push-ups, 10 m sprint time and MSST.
There were significant pre-post differences between provinces (U18) for stature, body mass, skinfolds, % body fat, bench press and the multi-stage shuttle test (MSST), all p < 0.0001 except skinfolds (p < 0.041). Interactions (province x time) for changes over time between the 5 provinces were shown for stature, body mass, % body fat, bench press and push-ups. An interaction was accepted for a variable if the level of significance was p < 0.05. The interviews with the coaches raised various issues which compromised the usage of the MSTS, the most important being lack of resources at the school, inadequate knowledge of strength and conditioning training, lack of facilities to store the mobile gym and poor nutrition of the players. Conclusion: There is overwhelming evidence in the literature about the benefits of resistance training for youth, from the perspective of improving performance to reducing the risk of injury. The results from the MSTS programme were not as overwhelming as one would believe from the literature. This can be attributed to various reasons: inadequate facilities to house the MSTS, inadequate coaches’ knowledge and experience in strength and conditioning, and poor nutrition. Without adequate support from trained strength and conditioning specialists at each school, increased provision of equipment alone will be ineffective.
To ensure future success of the programme it is recommended that: (i) a needs analysis is done at each school to determine whether the school has the correct facilities to house the mobile gym so that regular training sessions can take place, (ii) SARU employs qualified trainers at the schools involved in the MSTS programme to supervise all strength and conditioning sessions, (iii) there are regular follow-up visits at schools to check on compliance, and (iv) objective and subjective assessments are conducted at regular intervals to determine if there are improvements in the targeted variables.
  • Open Access
    The association between cardiorespiratory fitness and performance in a submaximal stepping test standardised for external workload
    (2016) Huchu, Linet; Lambert, Michael I
    Submaximal step tests are used to predict maximal oxygen consumption and work capacity. However, if the external workload is not controlled the interpretation of the test results may be inaccurate. The purpose of the research was to develop a submaximal test of cardiorespiratory fitness using a novel step test designed specifically to overcome the weaknesses of the previously published step tests. A series of studies contributed to the theoretical development of the submaximal step protocol, piloting the protocol, reliability studies, validation of the protocol and finally a cross-validation of the protocol. The first study tested the hypothesis that stepping tests configured for the same external workload, but varying in stepping frequency, elicit the same physiological stress. Participants (n = 31) performed three step tests at 16, 20 and 24 steps per minute in random order. External workload was standardised at 45 kJ. Energy expenditure, heart rate recovery, rating of perceived exertion, maximum heart rate and total heart beats were significantly different between tests (p < 0.05), with the biggest differences occurring between 16 and 24 steps per minute. Maximum heart rate as a percentage of age-predicted heart rate increased from 70% at 16 steps per minute to 81% at 24 steps per minute. The study concluded that standardisation of external workload with different exercise intensities does not result in the same physiological responses. The second study tested the reliability of the step test. Participants (n = 34) performed a step test three times in a week at a cadence of their choice (16, 20 or 24 steps per minute). The study showed that the step test is repeatable for most variables measured and therefore is a reliable test of fitness. The third study used the outcome variables measured during the step test to develop equations which predicted VO₂max measured directly in a maximal test on a treadmill.
A diverse sample of participants (n = 273), differing in sex, level of habitual physical activity and age, were recruited for the study. Several models for predicting VO₂max were determined. The most parsimonious equation was: VO₂max (ml.kg⁻¹.min⁻¹) = -0.10911 (age) - 0.06178 (body mass) - 0.75481 (body fat %) + 0.00208 (METS) + 0.11636 (HRR) - 0.019551 (MHR) + 0.07955 (Av HR) + 83.34846 (R² = 0.75, standard error of estimate = 5.51 ml.kg⁻¹.min⁻¹), where METS is metabolic equivalent, HRR is heart rate recovery, MHR is maximum heart rate and Av HR is average heart rate. Cross-validation was done (n = 50) to test the accuracy of the prediction equation. The relationship between the predicted VO₂max and the measured VO₂max was r = 0.87. In conclusion, the standardised step test can predict VO₂max in a heterogeneous population of males and females of varied ages (20 to 60 years), physical activity levels and fitness levels.
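As an illustration, the most parsimonious prediction model above can be applied directly. The coefficients below are taken from the abstract; the input measurements are hypothetical values invented purely to show the arithmetic, not data from the study.

```python
# Sketch: applying the published VO2max prediction equation.
# Coefficients come from the abstract; the example inputs are hypothetical.

def predict_vo2max(age, body_mass, body_fat_pct, mets, hrr, mhr, avg_hr):
    """Predicted VO2max (ml.kg-1.min-1) from the most parsimonious model."""
    return (-0.10911 * age
            - 0.06178 * body_mass
            - 0.75481 * body_fat_pct
            + 0.00208 * mets
            + 0.11636 * hrr
            - 0.019551 * mhr
            + 0.07955 * avg_hr
            + 83.34846)

# Hypothetical participant: 30 years old, 70 kg, 18% body fat, 6 METS,
# heart rate recovery of 25 beats, maximum HR 170, average HR 140.
vo2 = predict_vo2max(30, 70, 18, 6, 25, 170, 140)  # about 72.9 ml.kg-1.min-1
```

Any such prediction carries the reported standard error of estimate of about 5.5 ml.kg⁻¹.min⁻¹.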
  • Open Access
    The association between exercise-induced muscle damage and cortical activity in the alpha and beta frequency range
    (2011) Plattner, Kristina; Lambert, Michael I; Baumeister, Jochen
    This thesis examines the regulation of muscle function following exercise-induced muscle damage (EIMD), in an attempt to determine whether regulation occurs primarily in the muscle (neuromuscular) or further upstream. Upstream regulation has been hypothesized to occur in the lower brain structures, but one may assume that the efferent output to the muscle should be guided by the motor and pre-motor cortex alongside other associated cortical areas.
  • Open Access
    Body size, socioeconomic status and training background of a select group of U16 South African rugby union players (2010-2013): The impact on national selection
    (2016) Arkell, Robin; Lambert, Michael I
    Background: Rugby Union is an international sport characterized by bouts of short duration, high intensity exercise in which players frequently collide with one another while running at high speeds. Players are commonly required to engage in phases of play involving contact such as tackling, rucking, mauling and scrumming. These phases of play require certain physical qualities, including strength, aerobic power, speed and explosive power. The growth and professionalization of the game has resulted in more emphasis being placed on the physical preparation of the players. Physical preparation of players not only happens at elite senior levels, but has also filtered down into the junior ranks, where it is common for school teams to be trained by professional strength and conditioning coaches. The rules of the game have changed, which has influenced the physical demands. For example, ball-in-play time has increased, and players are covering more distance per game, making more tackles and engaging in more scrums. It is therefore important to identify the various physical characteristics that are required to be successful at a particular level of rugby union. The socioeconomic status and ethnicity of the player, in association with the physical characteristics, can determine the success of an adolescent rugby player. Objective: To determine the association of body mass and stature (referred to as physical characteristics for this study), race, socioeconomic status, and weight training (referred to as non-physical characteristics for this study) with the chances of success among U16 provincial rugby union players. In particular, size, socioeconomic status and ethnicity of players in the U16 national training squad were compared to players who represented their provinces but did not get selected for the national squad. Methods: Data were collected for each player who attended the Coca Cola National Grant Khomo week from 2010 to 2013.
Players participating in this tournament had already undergone a process of selection trials to be selected to represent their province at U16 level. The national squad players were chosen based on performances at the Coca Cola National Grant Khomo week. The characteristics of the players selected for the national squad vs. players who did not get selected for the squad from 2010 to 2013 were compared using an ANOVA, and the magnitude of the differences was quantified using effect sizes. Results: White players were heavier (ES = 0.59) and taller (ES = 0.82) than black players, as well as heavier (ES = 0.87) and taller (ES = 0.82) than coloured players, over the four-year period from 2010 to 2013. Players selected into the National squad were on average heavier (ES = 0.50) and taller (ES = 0.40) than those players not selected into the National squad. White players were the heaviest and tallest of the race groups selected into the National squad (p < 0.00002). Players with a high socioeconomic status were heavier (ES = 0.30), taller (ES = 0.40), and had more playing experience (ES = 0.30) than players from a low socioeconomic status background. Grouping according to socioeconomic status did not differentiate between race groups and selection for the national squad. Conclusion: This study showed that the taller and heavier players were more likely to get selected for the national U16 squad. Since size was also associated with socioeconomic status, the players with a high socioeconomic status had an advantage over players with a low socioeconomic status. These findings have implications for transforming the game to ensure that the representative teams reflect the composition of the South African population.
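The effect sizes (ES) reported above are standardised mean differences; a common formulation, and an assumption here since the abstract does not name the exact formula, is Cohen's d with a pooled standard deviation. The group summaries below are invented for illustration, not the study's data.

```python
# Sketch of an effect size as Cohen's d (assumed formulation, not confirmed by
# the abstract): difference of group means divided by the pooled SD.
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardised mean difference between two independent groups."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical body-mass summaries: selected players 82 kg (SD 8, n = 120)
# vs non-selected players 78 kg (SD 8, n = 300) gives d = 0.5,
# comparable in magnitude to the ES = 0.50 reported for body mass.
d = cohens_d(82, 8, 120, 78, 8, 300)
```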
  • Open Access
    Contemporary perspectives of core stability training for dynamic athletic performance: a survey of athletes, coaches, sports science and sports medicine practitioners
    (Springer International Publishing, 2018-07-16) Clark, David R; Lambert, Michael I; Hunter, Angus M
    Background Core stability training has grown in popularity over 25 years, initially for back pain prevention or therapy. Subsequently, it developed as a mode of exercise training for health, fitness and sport. The scientific basis for traditional core stability exercise has recently been questioned and challenged, especially in relation to dynamic athletic performance. Reviews have called for clarity on what constitutes the anatomy and function of the core, especially in healthy and uninjured people. Clinical research suggests that traditional core stability training is inappropriate for development of fitness for health and sports performance. However, commonly used methods of measuring core stability in research do not reflect the functional nature of core stability in uninjured, healthy and athletic populations. Recent reviews have proposed a more dynamic, whole body approach to training core stabilization, and research has begun to measure and report the efficacy of these modes of training. The purpose of this study was to assess the extent to which these developments have informed people currently working and participating in sport. Methods An online survey questionnaire was developed around common themes on core stability training as defined in the current scientific literature and circulated to a sample population of people working and participating in sport. Survey results were assessed against key elements of the current scientific debate. Results Perceptions on the anatomy and function of the core were gathered from a representative cohort of athletes, coaches, sports science and sports medicine practitioners (n = 241), along with their views on the effectiveness of various current and traditional exercise training modes. The most popular method of testing and measuring core function was subjective assessment through observation (43%), while almost a quarter (22%) believed there was no effective method of measurement.
Perceptions of people in sport reflect the scientific debate, and practitioners have adopted a more functional approach to core stability training. There was strong support for loaded, compound exercises performed upright, compared to moderate support for traditional core stability exercises. Half of the participants (50%) in the survey, however, still supported traditional isolation core stability training. Conclusion Perceptions in applied practice on core stability training for dynamic athletic performance are aligned to a large extent with the scientific literature.
  • Open Access
    The development of an evidence-based submaximal cycle test designed to monitor and predict cycling performance: the Lamberts and Lambert submaximal cycle test (LSCT)
    (2009) Lamberts, Robert Patrick; Lambert, Michael I; Noakes, Tim
    The HIMS test, which consists of controlled exercise at increasing workloads, has been developed to monitor changes in training status and accumulated fatigue in athletes. As the workload can influence the day-to-day variation in heart rate, the exercise intensity which is associated with the highest sensitivity needs to be established, with the goal of refining the interpretability of these heart rate measurements. The aim of the study was to determine the within-subject day-to-day variation of submaximal and recovery heart rate in subjects who reached different exercise intensities.
  • Open Access
    Differences in muscle pain and plasma creatine kinase activity after 'up' and 'down' Comrades marathons
    (2008) Burgess, Theresa L; Lambert, Michael I
    Objective. The aim of this study was to compare the acute changes in muscle pain and plasma creatine kinase (CK) activity following the 'up' and 'down' Comrades marathons. Design. This was a quasi-experimental design. Eleven male runners (39.7±9.3 years) completed the 'up' Comrades marathon, and 11 male runners (41.0±8.4 years) completed the 'down' Comrades marathon the following year. Maximum oxygen consumption and peak treadmill running speed were measured 2 weeks before the race. Daily measurements of muscle pain and plasma creatine kinase (CK) activity were recorded 1 day before, and for 7 days after, the race. Results. Muscle pain remained significantly elevated for up to 7 days after the Comrades marathon, compared with pre-race values (p<0.0009). The pain scores following the 'down' run were significantly higher than the pain scores following the 'up' run for at least 7 days after the race (p<0.004). Plasma CK activity remained significantly elevated for up to 5 days after the Comrades marathon, compared with pre-race values (p<0.007). Plasma CK activity following the 'down' run was significantly higher than the plasma CK activity following the 'up' run for 5 days after the race (p<0.04). A high degree of intra-individual variability in plasma CK activity was observed. Conclusions. The 'down' Comrades marathon causes significantly more muscle pain and plasma CK activity compared with the 'up' Comrades marathon. Further studies are required to accurately define the regeneration of muscle following the Comrades marathon.
  • Open Access
    The effect of branched-chain amino acid ingestion on physical performance during prolonged exercise
    (1996) Velloza, Peter Edward; Lambert, Michael I
    It has been hypothesized that an increase in the ratio of plasma tryptophan (TRP) to branched-chain amino acid (BCAA) concentrations may mediate an increase in cerebral serotonin synthesis, through an increased cerebral tryptophan uptake. It is postulated that the increased brain serotonin content may induce central fatigue during prolonged exercise. Until now, this postulate had not been subjected to rigorous scientific testing during prolonged exercise. Therefore the aim of this study was to investigate whether ingesting a BCAA supplement during prolonged exercise improves physical performance and reduces central fatigue. The use of such a supplement during prolonged exercise could then be expected to have a large effect on performance. Eight trained cyclists (VO₂max = 61.9 ± 4.3 ml O₂/kg/min) ingested, in random order, a drink containing either 10% carbohydrate (CHO), 10% CHO and 0.16% branched-chain amino acid (BCAA), or 0.16% BCAA. Every hour, for the duration of the exercise (4 hours, 55% VO₂max), blood samples were analysed for amino acid, ammonia, free fatty acid, glycerol, glucose and insulin concentrations. Urine was analysed for urea and creatinine concentrations. Heart rate, oxygen consumption (VO₂), respiratory exchange ratio (RER) and rating of perceived exertion were also analysed. Thereafter, the subjects' 40 km time trial performance and RPE were assessed on a Velodyne windtrainer. Central fatigue following the time trial was quantified using the Sternberg reaction-time paradigm. The serum concentration of the BCAAs declined as a result of the exercise in the BCAA-only trial. Tryptophan concentration, however, did not change during the exercise. The serum TRP:BCAA ratio increased (0.16 ± 0.06 to 0.20 ± 0.10; p ≤ 0.05) in the CHO trial only.
The BCAA trial differed from the two trials in which CHO was ingested because plasma ammonia and glucose concentrations did not increase, while free fatty acid (FFA) and glycerol concentrations increased significantly (p ≤ 0.05). The lower RER in the BCAA trials suggests a higher proportion of fat was oxidised in these trials, compared to the other two trials. Cycling performance over a 40 km time trial (CHO = 68.59 ± 6.02; CHO + BCAA = 68.00 ± 3.01; BCAA = 69.43 ± 5.35 min:sec), ratings of perceived exertion, submaximal or maximal heart rates, and mental performance were not different between trials. Data from this study appear to refute the hypothesis that an increase in the serum TRP:BCAA ratio decreases physical performance and increases central fatigue during prolonged exercise.
  • Open Access
    The effect of repeated bouts of downhill training on running performance and recovery after a 30-km time trial
    (2001) Schutte, Lynne; Lambert, Michael I; Rogers, G; Lombard, R
    Purpose: The present study was designed to examine the effect of repeated bouts of either downhill or level running on running performance in, and recovery from, a 30-km time trial. Methods: Sixteen male subjects with a mean (± SD) age of 33.8 ± 5.8 years, body mass of 72.0 ± 7.3 kg and a stature of 176.6 ± 4.5 cm were randomly allocated to either a downhill (n=9) or a level group (n=7). The protocol consisted of a training phase, followed by a 30-km time trial and a recovery phase. During the training phase subjects ran either at a -10% grade (downhill group) or a 0% grade (level group) on a treadmill for nine 40-minute training runs [70% of peak treadmill running speed (PTRS)]. Thereafter, all the subjects participated in a 30-km time trial (70% of PTRS), where heart rate (HR), rating of perceived exertion (RPE) and stride length (SL) were recorded, followed by five 15-minute submaximal recovery runs. The first recovery run was performed before the start of the training phase and again on four occasions after the 30-km time trial. HR, RPE, SL, minute ventilation (Vi), oxygen consumption (VO₂), carbon dioxide production (VCO₂) and respiratory exchange ratio (RER) were recorded during these 15-minute runs. Plasma creatine kinase (CK) activity and muscular soreness were assessed for the duration of the study. Results: HR decreased in the downhill group during the training phase, suggesting a HR training effect. Muscle pain and plasma CK activity in the downhill group increased after the first 40-minute downhill training run. These indicators of muscle damage did not show any further increases during the training phase, suggesting a "repeated bout effect". Towards the end of the 30-km time trial the level group showed a greater heart rate drift (HRD) and an increased RPE, suggesting that they were not able to resist fatigue to the same extent as the downhill group.
HR and RPE recorded during the recovery phase suggested that the downhill group showed a better recovery after the 30-km time trial. During the recovery phase the downhill group experienced no increase in muscle pain after performing the 30-km time trial, in contrast to the level group, who experienced muscle pain for five days after the 30-km time trial. Plasma CK activity was blunted after the 30-km time trial in the downhill group, in contrast to the level group. Conclusion: The results of the investigation support the hypothesis that the inclusion of downhill training in a training program causes changes which can be interpreted as enhancing performance during an endurance event and recovery after the event.
  • Open Access
    Immediate post catastrophic injury management in rugby union. Does it have an effect on outcomes?
    (2017) Suter, Jason Alexander; Lambert, Michael I; Brown, James Craig
    Background: Rugby union ('rugby') has a high injury risk. These injuries range from having minor consequences to catastrophic injuries with major life-changing consequences. In South Africa, previous research indicated that the risk of catastrophic injury was high and that the immediate management was sub-optimal, worsening the injury outcome. In response, the South African Rugby Union launched the BokSmart nationwide injury prevention programme in 2008. Through education - mainly of coaches and referees - this programme aims to improve the prevention and management of catastrophic injuries. Moreover, the programme began administering a standardised questionnaire for all catastrophically injured players to assess the prevention and management of these injuries. Objectives: To assess whether factors in the immediate pre- and post-injury management of catastrophic injuries in rugby were associated with their outcome. In addition, as part of the BokSmart programme in South Africa, modules were developed as part of the education material delivered to referees and coaches in their workshops, dealing specifically with safety in the playing environment and the correct management of catastrophic injuries. We assessed whether the protocols within these modules were implemented. Design: A prospective cohort study conducted on all catastrophic injuries in rugby collected through a standardised questionnaire by BokSmart between 2008 and 2014. Methods: Secondary analyses were performed on the information collected on all rugby-related catastrophic injuries in BokSmart's serious injury database. Injury outcomes were split into 'permanent' (permanently disabling and fatal) and 'non-permanent' (full recovery / 'near miss'). Immediate post-injury management factors, as well as protective equipment and ethnicity, were analysed for their association with injury outcome using a Fisher's exact test.
Results: There were 87 catastrophic injuries recorded between 2008 and 2014. Acute spinal cord injuries (ASCI) made up most of the catastrophic injuries (n = 69), with traumatic brain injuries (TBI) the second most common (n = 11). There were 7 cardiac events. Black African players were associated with a 2.4 times higher proportion of permanent outcomes than the injured White players (p = 0.001). There was no association between any protective equipment or injury management (including optimal immobilisation, and the time and method of transport taken to hospital) and ASCI outcome (non-permanent vs. permanent). Conclusions: Neither immediate post-injury management nor the wearing of protective equipment was associated with catastrophic injury outcome in these South African rugby-related injuries. This might indicate that the initial injury is more important in determining the outcome than the post-injury management and associated secondary metabolic cascade, as proposed by some experts in this area. Moreover, that ethnicity was associated with ASCI outcome in this study is indicative of the wider problems in South Africa, not only specific to rugby. It is recommended that BokSmart continue to focus their programme on low socioeconomic areas that play rugby in South Africa.
  • Open Access
    Injury incidence and severity at the South African Rugby Union (SARU) Youth Weeks Tournaments: a four year study
    (2018) Marsh, Jarred; Lambert, Michael I; Brown, James
    Introduction Rugby Union (hereinafter referred to as ‘rugby’) is a contact sport with players being exposed to repetitive collisions throughout a match. As the risk of injury is relatively high, injury surveillance studies within rugby have become popular. However, most of the studies have focussed on senior players. The data on injuries among youth rugby players are limited. This makes it difficult to develop the game to make it safer for youth of all ages. Objectives The first objective of this study was to establish whether any injury trends exist across different ages of youth rugby players (13 to 18 years). The second objective was to determine whether the patterns of injuries changed over four years (2011 to 2014). Methods The South African Rugby Union (SA Rugby) hosts four local youth tournaments annually for local rugby talent: Craven Week under-13, Grant Khomo under-16, Academy Week under-18 and Craven Week under-18. Injury data were collected from the four SARU Youth Week Tournaments between 2011 and 2014. These data were compiled into one central SARU injury surveillance database. Injury categories were used to group data: ‘Type’, ‘Location’, ‘Event’ and ‘Severity’ of injury were assessed. Injuries were defined as either ‘Time-loss’ (those injuries that prevented a player from match participation for one or more days) or ‘Medical attention’ (injuries that required the player to seek medical attention at the time of or after injury but did not require the player to miss a match). Injury rates were represented by injury incidence densities (IIDs) per 1000 hours of match play; corresponding 95% confidence intervals (95% CIs) for IID were calculated for the number of injuries regardless of whether one person was injured more than once. Incidence densities were considered to be significantly different from each other if their 95% CIs did not overlap, and Poisson regression analysis was also used.
Results The ‘overall’ combined IID across all four years was 54.6 injuries per 1000 hours of match play (95% CI: 51.0-58.2). The combined ‘time-loss’ IID was 18.9 injuries per 1000 hours of match play (95% CI: 16.8-21.0). ‘Time-loss’ injury rates were highest in 2011 (23.2 per 1000 match hours (95% CI: 18.5-28.0)) and were significantly lower in 2013 (13.3 per 1000 match hours (95% CI: 9.7-17.0)). Craven Week under-13 presented a significantly greater ‘overall’ injury incidence density than the older age groups (71.9 per 1000 match hours (95% CI: 62.4-81.4)). Joint/ligament/tendon injuries were the most common ‘overall’ and ‘time-loss’ injuries sustained by players between 2011 and 2014 (30% and 33% respectively). These were followed closely by concussions, which accounted for 29% of ‘time-loss’ and 12% of ‘overall’ injuries. A large proportion of both ‘overall’ (57%) and ‘time-loss’ (55%) injuries occurred during the tackle event, with the tackler being injured more often than the ball-carrier (37% and 18% respectively). However, there were no statistically significant differences in ‘overall’ and ‘time-loss’ IID between the different tournaments from 2011 until 2014. Discussion Significant differences were found when comparing ‘overall’ and ‘time-loss’ IID between the different tournaments from 2011 until 2014, with Craven Week under-13 presenting a significantly greater ‘overall’ injury incidence density. This finding contradicts previous literature within youth rugby research. The tackle (tackler and ball-carrier combined) still accounted for the highest proportion of both ‘overall’ and ‘time-loss’ injury events (57% and 55% respectively), in accordance with previous studies. However, a point of concern was that concussion accounted for 29% of all ‘time-loss’ injuries and 12% of all ‘overall’ injuries.
This finding suggests a gradual increase in the number of concussions suffered during the SARU Youth Week Tournaments between 2011 and 2014. Further research is required to determine the reason for this pattern. Conclusion Further research within youth rugby cohorts is required to determine the risk associated with involvement at various levels of participation. Injury prevention programmes should focus on reducing the prevalence of concussion at youth level by educating players and coaches about safe tackle techniques. Future studies should focus on local youth cohorts for seasonal
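The incidence-density calculation used throughout the abstract above (injuries per 1000 hours of match play, with a 95% CI and the non-overlapping-CI comparison rule) can be sketched as follows. This is a minimal illustration, not the authors' code: the function names are ours, and the normal approximation to the Poisson count for the confidence interval is an assumption about how such CIs are typically derived in injury-surveillance studies.

```python
import math


def injury_incidence_density(injuries: int, exposure_hours: float):
    """Return (IID, CI lower, CI upper) as injuries per 1000 player-hours.

    The 95% CI uses the normal approximation to the Poisson count:
    injuries +/- 1.96 * sqrt(injuries), scaled to the exposure.
    """
    scale = 1000.0 / exposure_hours
    iid = injuries * scale
    half_width = 1.96 * math.sqrt(injuries) * scale
    return iid, iid - half_width, iid + half_width


def cis_overlap(ci_a, ci_b):
    """Crude screening rule from the abstract: rates whose 95% CIs
    do not overlap are taken to differ significantly."""
    return not (ci_a[1] < ci_b[0] or ci_b[1] < ci_a[0])
```

For example, 23 time-loss injuries observed over 1000 player-match hours give an IID of 23.0 (95% CI: 13.6-32.4). Note that CI overlap is a conservative screen; the abstract states that differences were additionally confirmed with Poisson regression.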
  • Item
    Open Access
    Key performance indicators and predictors in Varsity Cup rugby
    (2014) Sewry, Nicola; Lambert, Michael I
Rugby union is a popular sport worldwide, and owing to the professional nature of the sport the demands on players continue to increase, resulting in acute and chronic fatigue. The aims of the study were to investigate the tools used to monitor and predict changes in training status and to determine their effectiveness, specifically: (i) to measure the subjective nature of the coaches' player selection in relation to the players' performance, and (ii) to correlate performance indicators with team performance. The University of Cape Town Varsity Cup rugby team (First XV squad) was monitored from pre-pre-season until the end of the competitive season. Players completed a testing battery (anthropometry, strength, muscular endurance, speed and aerobic fitness) during the season, and Rating of Perceived Exertion and body mass were recorded at every practice. Players also completed the HIMS test (a measure of heart rate recovery) weekly. Coaches rated players at every practice on three variables, and matches were recorded and analysed by video to determine key performance variables. Most players improved in their testing battery between pre-pre-season and pre-season. Average session load varied across the phases of the season and was highest in the pre-pre-season. Changes in load, however, were not reflected by changes in heart rate recovery, which remained relatively stable across the season. The players' body mass varied throughout the tournament, with certain players having a larger coefficient of variation than others. There was no relationship between performance in the testing battery and selection for matches. The coaches all rated the players differently, with no correlation between players selected and those not selected. There was a correlation between the subjective rating of players in the week leading up to a match and the match ratings of Coach 3 (the head coach).
The Varsity Cup rugby union players followed trends similar to those described in previous literature for physiological testing batteries, training loads and player management. The novel aspect of this study was the collection of data from the coaches involved. These qualitative data provide insight into the coaches' selection process, or lack thereof, within a team environment. The data also illustrate the differences between the coaches' interpretations of the players' "performance". The Varsity Cup is a relatively young tournament and should be investigated further to properly understand the differences between it and professional and amateur rugby union.
  • Item
    Open Access
    Long-term player development in rugby - how are we doing in South Africa?
    (2010) Lambert, Michael I; Durandt, Justin
Rugby is a sport where size does matter. Players who are bigger, stronger and faster have an advantage over smaller, less powerful players. These differences in size are exacerbated at junior levels, where players reach puberty at different stages. Furthermore, the problem is compounded in South Africa, where children from a low socio-economic environment are generally smaller and less powerful than their counterparts from more affluent areas.1 There is a strong likelihood that smaller talented players will rather participate in sports in which they can express their talent without being limited by their lack of size, as is the case in rugby. Some players in this group may be late developers, who possess the skills associated with success in rugby but lack the size. If these players are not managed appropriately, their superior skills may never have an opportunity to manifest and develop fully. This raises the point of having a well-constructed long-term talent development model2 which considers that talent development is multi-factorial and dynamic in nature.3 Such a model would consider the differences in size during puberty and cater as much for the late developers as it does for the early developers. Developing talent is not an easy task and requires ongoing monitoring to ensure that there are progressions in skill, physical ability and cognitive maturation.3 Failure to adopt a long-term talent development model, in which talent and skills are developed systematically, will result in many players who may be late developers choosing to play other sports where size is not such a distinguishing factor. This raises the question of whether rugby in South Africa needs to be managed differently to cater for these smaller players, particularly during the pre-pubertal years, where most of the variation in size exists.
  • Item
    Open Access
    Matching the density of the rugby playing population to the medical services available in the Eastern Cape, South Africa
    (2017) Moore, Simon; Lambert, Michael I; Burgess, Theresa
Background: Rugby Union is a popular contact sport played worldwide. The physical demands of the game are characterised by short-duration, high-intensity bouts of activity, with collisions between players, often while running fast. The head, neck, upper limb and lower limb are common sites of injury. Although catastrophic injuries are rare in rugby, they do occur. Immediate action (within a 4-hour window) must be taken after a catastrophic injury to minimise the damage incurred. This implies that a well-functioning medical infrastructure should be available to anticipate injuries of this nature and provide treatment for the best possible outcome. Currently there is no information system or map in South Africa describing the medical infrastructure in relation to the places where clubs and schools practise and play matches. Such a system may assist in the early and immediate transfer of injured players to the appropriate treatment facility, minimising the damaging effects caused by delays in medical treatment. Therefore the aims of this study were to: (i) investigate and report on the location, distance and travel time from rugby playing/training venues in the Eastern Cape to the nearest specialist hospital where a player may receive adequate treatment for a catastrophic injury, and (ii) report on the safety equipment available at these playing venues to facilitate this transport in a safe manner. Methods: All the clubs (n = 403) and schools (n = 264) that played rugby in the Eastern Cape were accounted for in the study. However, only 15 clubs and 35 schools were included in the analysis, as they had their own facilities for training and playing matches. Distances between clubs/schools and the nearest public, private and specialised hospital (able to treat catastrophic injuries) were measured.
In addition, driving time was estimated between the clubs/schools and the nearest specialised hospital to determine whether an injured player could be transported within four hours to receive medical treatment for a catastrophic injury. Medical safety equipment was also audited (according to information provided by SA RUGBY) for each club and school to identify whether they met the minimum safety standards set by SA RUGBY. Results: Twenty schools were less than one hour away from the nearest hospital equipped to deal with catastrophic rugby injuries; nine schools were 1-2 hours away and six schools were 2-3 hours away. All schools were within 100 km driving distance of the nearest public hospital; 28 schools were within 100 km driving distance of the nearest private hospital. For seven schools, the nearest private hospital was between 100 and 150 km away. Fourteen schools had spinal boards, eleven had neck braces, ten had harnesses, nine had change rooms, five had floodlights, and twenty-two had trained first aiders. Six schools were located 2-3 hours away and were at higher risk due to a lack of first aid equipment. Ten clubs were less than an hour away from the nearest hospital equipped to treat catastrophic injuries; two clubs were 1-2 hours away, two were 2-3 hours away and one was 3-4 hours away. All clubs were within 100 km driving distance of the nearest public hospital. Nine clubs were within 100 km driving distance of the nearest private hospital, three clubs were between 100 and 150 km from the nearest private hospital and three were over 150 km away. Twelve clubs had a spinal board, eleven had neck braces, ten had harnesses, ten had change rooms, seven had floodlights and twelve had trained first aiders.
One club was classified as high risk, as it was located 2-3 hours away from the nearest hospital equipped to manage a catastrophic injury and had no first aid equipment. Discussion/Conclusion: No clubs or schools included in the study were more than four hours away from a hospital equipped to deal with a catastrophic rugby injury. Therefore, any player who suffers a catastrophic injury should be able to receive treatment within the 4-hour window period. Another finding was that not all clubs or schools possessed the minimum equipment required to host training or a rugby match. SA RUGBY can take appropriate action with these clubs and schools to ensure that they maintain the safest possible practice and do not put their own players at increased risk.
  • Item
    Open Access
    Monitoring of training and racing of long distance runners using heart rate monitors
    (1999) Mbambo, Ziphelele Hazlitt; Lambert, Michael I
Aim: The aim of this thesis was to contribute to a better understanding of heart rate during exercise, in order to improve the precision with which heart rate can be used to measure intensity during running. Accordingly, heart rate responses were examined in long distance runners during different types of training and racing. The thesis also examined the effects of environmental and body temperature on heart rate during submaximal and maximal running. Study 1: Ten male provincial and national class road runners (VO₂max = 67.1 ± 3.8 mlO₂.kg⁻¹.min⁻¹) were recruited for the study. All the subjects completed questionnaires on their training history and recorded their training sessions in diaries. The subjects wore heart rate monitors during training and racing. There was no convincing evidence that competitive runners who train at higher intensities have better running performance. A poor relationship was found between %VO₂max and %HRmax. Finally, heart rate during races was higher than heart rate during training; the cause of the elevated heart rate during races was not clear. Study 2: The relationship between heart rate and running speed during competition was not well understood. Accordingly, an elite long distance male runner (25 years, VO₂max = 71 mlO₂.kg⁻¹.min⁻¹) was studied over a 5-month period, during which he participated in 9 races (5 km - 28 km). The subject wore a heart rate monitor which measured his heart rate throughout each race, and his split times were recorded each kilometre. The subject also underwent a field test during which the heart rate/running speed relationship was determined under non-competitive conditions (r = 0.99). However, in the race situation there was no relationship between heart rate and running speed (r = 0.02). It was concluded that during competition there was no relationship between heart rate and running speed, whereas in a non-competitive situation heart rate was proportional to running intensity.
Study 3: Given the poor relationship found between heart rate and running speed during races in the previous study, other factors such as environmental conditions and core temperature were hypothesised to affect heart rate. Accordingly, twelve highly trained distance runners were recruited for the study. Each subject ran on a treadmill (30 minutes at 70% of peak treadmill running speed, followed by an 8 km time trial) in different ambient temperatures (15°C, 25°C and 35°C) with humidity (60%) and wind speed (15 km.h⁻¹) kept constant. Heart rate, RPE and rectal temperature (Tre) were recorded every 5 minutes during the submaximal and maximal trials. When subjects were exercising at 70% of peak treadmill running speed at 15°C, no cardiovascular drift was observed, at least for 30 minutes. However, during the same exercise test at 25°C and 35°C there was a significant increase in heart rate. In the maximal exercise test the average heart rate was significantly higher during the trial at 35°C than during the trials at 15°C and 25°C. It was concluded that heart rate can be used as an accurate measure of running intensity in a cooler (15°C) ambient temperature. In summary, this thesis described the practical use of heart rate monitors during training and competition and at different temperatures. Data are provided which suggest that heart rate can accurately assess exercise intensity, provided that the factors which affect the heart rate/running speed relationship are controlled.
  • Item
    Open Access
    The relationship between training/match load and injuries in academy players during a provincial under 19 rugby union season
    (2015) Van Wyk, Johan; Lambert, Michael I; Burgess, Theresa
Background: The influence of professionalism has filtered down to junior levels in rugby union. The increased demands on junior professional rugby players have an impact on their fitness characteristics, training load, match load and injury profiles. Although many studies have been conducted on senior rugby union players, not much is known about junior players as they make the transition into the senior ranks. The aim of this study was to describe the training/match load during the pre-season and competitive in-season in a squad of under-19 academy rugby players and to relate this to the injuries (contact and non-contact) sustained during the different phases of the season. A secondary aim was to measure the physical ability of the players through the season. Methods: Injury and training data from players in the Western Province under-19 Currie Cup squad (n = 34) were recorded daily throughout the rugby season (42 weeks). Training load was represented by the time (minutes) spent in each activity associated with training, conditioning and match play. The players also underwent measurements of body mass, stature, body fat percentage, upper body muscular endurance (pull-ups), muscular strength (1RM bench press), sprint times (10 m and 40 m) and anaerobic capacity (5 m shuttle run). All tests were conducted in January and June, coinciding with the beginning of the pre-season and the beginning of the competition phase respectively. Results: Over the season 71 injuries were recorded, comprising 17 pre-season injuries, 18 pre-competition injuries and 36 competition-phase injuries. There was no difference between the occurrence of contact and non-contact injuries during the different phases of the season.
Although there was no significant difference between the injury rates during the different phases of the season, there was a significant difference between the injury rates in training (4.4/1000 player hours) and matches (74.1/1000 player hours). The most commonly injured body parts were the thighs, hip/groin, ankles and shoulders, with injuries to the hand/finger and knee being the most severe. Muscles and ligaments were the most frequently injured structures. The average time to return to play after an injury was 17 days. There were significant changes in the physical characteristics of the players in the six months between the test batteries. In addition to getting taller, players generally improved their fitness characteristics, with significant improvements in the bench press (8%), pull-ups (113%), vertical jump (13%) and the 5 m shuttle run (6%). Conclusion: The training load of the junior professional rugby players is similar to that of senior professional rugby players. This represents a sudden increase compared with the previous year, when the players were at school. A long-term research project with a database of rugby schools would assist in bridging the gap between the demands of junior rugby and junior professional rugby. Players joining a professional academy system after school need physical, emotional and tactical fast-tracking, as they are competing in a highly competitive environment for senior professional contracts. This accounts for the relatively high rate of injury throughout the season. Players need to be carefully monitored and managed during the season to detect symptoms reflecting poor adaptation to the training load.
  • Item
    Open Access
    The reliability of 10 km treadmill time trial performance and the effect of different high intensity interval training strategies on 10 km running performance and associated physiological parameters
    (2015) Kirkman, Mark Courtney; Bosch, Andrew; Lambert, Michael I
The reliability and validity of a performance test are important in research to detect meaningful performance differences following an intervention. Accordingly, the aim of the first study of this thesis was to investigate the reliability and validity of a self-paced 10 km treadmill time trial. This performance measure was then used in the main section of the thesis, which comprised a large training intervention study designed to answer specific questions about three different high intensity interval training programmes. In particular, changes in 10 km running performance were investigated with respect to various physiological parameters, both immediately following the training intervention and during a subsequent three-week taper period. Methods In the first study, a group of well-trained male runners (n = 8) completed four 10 km treadmill time trials and two 10 km track time trials. Performance times were compared between the 10 km treadmill time trials to determine the typical percent error between trials. Additionally, comparisons were made between the track and treadmill time trials. In the second study, well-trained male runners (n = 32) were randomly assigned to one of four groups: a control group, a 400 m interval group, a 1600 m interval group and a mixed (400 m and 1600 m) interval group. The intensity of the intervals was based on the participants' current 10 km time trial time. The high intensity training interventions consisted of eight interval sessions (twice per week) over a four-week period, followed by a three-week single-step 30% reduction in total training volume (while maintaining training frequency and some intensity) in all groups.
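The "typical percent error" reliability statistic mentioned above is conventionally computed as the standard deviation of the between-trial difference scores divided by √2, expressed as a percentage of the grand mean. The sketch below illustrates that convention for paired time trials; it is our assumption about the calculation, not code from the thesis, which may use a variant (e.g. on log-transformed times).

```python
import math
import statistics


def typical_error(trial1, trial2):
    """Typical error of measurement between two repeated trials:
    SD of the paired difference scores divided by sqrt(2)."""
    diffs = [b - a for a, b in zip(trial1, trial2)]
    return statistics.stdev(diffs) / math.sqrt(2)


def typical_percent_error(trial1, trial2):
    """Typical error expressed as a percentage of the grand mean,
    i.e. a coefficient of variation for the test."""
    te = typical_error(trial1, trial2)
    grand_mean = statistics.mean(list(trial1) + list(trial2))
    return 100.0 * te / grand_mean
```

For example, with 10 km times (in seconds) of [2400, 2500, 2600] on trial one and [2410, 2520, 2590] on trial two, the typical error is about 10.8 s, or roughly 0.43% of the mean time.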
  • Item
    Open Access
    Risk factors for lower limb musculoskeletal injuries in novice runners: a prospective study
    (2015) Greybe, Rykie; Burgess, Theresa; Lambert, Michael I
    The aim of this study was to identify the possible risk factors for the development of lower limb musculoskeletal injuries in novice runners. The specific objectives of this study were: (a) to describe the demographic and training characteristics of novice runners; (b) to establish the incidence of self-reported running-related injuries in novice runners; (c) to determine if specific intrinsic factors, namely age, gender, body mass index, quadriceps angle, foot alignment, hamstring flexibility, balance, muscle power and a history of previous injury were risk factors for lower limb musculoskeletal injuries in novice runners; and (d) to determine if specific extrinsic factors, namely training frequency, session duration, and intensity were risk factors for developing lower limb musculoskeletal injuries in novice runners.
  • Item
    Open Access
    Secular changes in anthropometric and physical characteristics of South African National U/20 rugby union players (1998-2010).
    (2012) Lombard, Wayne; Lambert, Michael I
The aim of this study was to compare changes in the physical and morphological characteristics of South Africa's National U/20 rugby union players (Forwards and Backs) over a 12-year period from 1998 to 2010. This period spans an era starting just after the onset of professionalism in the game to the modern era, in which the influence of professionalism has filtered down to junior (U/20) levels. Any changes in the morphology and physical ability of the players can be attributed to changes in the demands of the game and reflect the characteristics required for a player to be successful at that level. Players who were U/20 at the time of representing the Junior South African National Team over the period 1998-2010 were included in the study. A total of 453 players, split into two groups, Forwards (n = 256) and Backs (n = 197), underwent measurements of body mass, stature, body fat percentage, muscular endurance (pull-ups), muscular strength (1RM bench press), sprint times (10 m and 40 m) and aerobic capacity (multistage shuttle run test). All tests were conducted once a year, in either January or December. Data are represented as means ± 95% confidence intervals. Levene's test of homogeneity was used to determine whether the variance for each variable was equal. A two-way analysis of variance was used to determine whether there were significant differences for the main effects of 'year' or 'position', or for the 'year x position' interaction. If the main effect of 'year' or the interaction was significant, a Tukey post hoc test was used to identify specific differences. Statistical significance was accepted at p < 0.05.

Contact us

Jill Claassen

Manager: Scholarly Communication & Publishing

Email: openuct@uct.ac.za

+27 (0)21 650 1263
