As with previous literature updates, I have performed a PubCrawler search for football-related articles in the NCBI Medline (PubMed) and GenBank databases.
The following studies were retrieved this week:
#1 Field location and player roles as constraints on emergent 1-vs-1 interpersonal patterns of play in football
Reference: Hum Mov Sci. 2017 Jul 5;54:347-353. doi: 10.1016/j.humov.2017.06.008. [Epub ahead of print]
Authors: Laakso T, Travassos B, Liukkonen J, Davids K
Summary: This study examined effects of player roles on interpersonal patterns of coordination that sustain decision-making in 1-vs-1 sub-phases of football in different field
locations near the goal (left, middle and right zones). Participants were fifteen under-16 players from a local competitive amateur team. To measure interpersonal patterns of coordination in the
1-vs-1 dyads we recorded: (i) the relative distance of each attacker and defender to the centre of the goal, and (ii) the relative angle between the centre of the goal, each defender
and attacker. Results revealed how variations in field locations near the goal (left, middle and right zones) constrained the relative distance and relative angle values that emerged between
attackers, defenders and the goal. This indicates that the relative position of the goal is a key informational variable sustaining participants' behaviours for dribbling and shooting. Higher values of relative
distance and angle were observed in the middle zone, compared to other zones. Players' roles also constitute a constraint on the interpersonal coordination for dribbling and shooting.
Additionally, it seems that players' foot preference constrains the dynamics of interpersonal patterns of coordination between participants, especially in left and right zones. The findings
suggest that to increase participants' opportunities for action, coaches should account for field position, players' roles and foot preference.
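For readers working with positional tracking data, the two dyadic variables used in this study can be reproduced with simple plane geometry. The Python sketch below is a minimal illustration, not the authors' code; the coordinate values are hypothetical and the reference frame (goal centre at the origin) is an assumption.

```python
import math

def relative_distance_and_angle(attacker, defender, goal_centre):
    """Compute two illustrative dyadic variables from 2D pitch coordinates.

    Returns (relative_distance, relative_angle_deg):
    - relative_distance: defender-to-goal distance minus attacker-to-goal distance
    - relative_angle_deg: angle at the goal centre between the vectors pointing
      to the defender and to the attacker, in degrees.
    """
    ax, ay = attacker
    dx, dy = defender
    gx, gy = goal_centre

    dist_att = math.hypot(ax - gx, ay - gy)
    dist_def = math.hypot(dx - gx, dy - gy)
    relative_distance = dist_def - dist_att

    angle_att = math.atan2(ay - gy, ax - gx)
    angle_def = math.atan2(dy - gy, dx - gx)
    relative_angle_deg = abs(math.degrees(angle_def - angle_att))
    if relative_angle_deg > 180:
        relative_angle_deg = 360 - relative_angle_deg
    return relative_distance, relative_angle_deg

# Hypothetical example: attacker 12 m from goal, defender closer and offset laterally
print(relative_distance_and_angle((12.0, 2.0), (9.5, -1.0), (0.0, 0.0)))
```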
#2 Influence of Cleats-Surface Interaction on the Performance and Risk of Injury in Soccer: A Systematic Review
Reference: Appl Bionics Biomech. 2017;2017:1305479. doi: 10.1155/2017/1305479. Epub 2017 Jun 8.
Authors: Silva DCF, Santos R, Vilas-Boas JP, Macedo R, Montes AM, Sousa ASP
Download link: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5480019/pdf/ABB2017-1305479.pdf
Summary: The purpose of the study was to review the influence of cleat-surface interaction on performance and injury risk in soccer athletes. Full experimental and
original papers written in English that studied the influence of soccer cleats on sports performance and injury risk on artificial or natural grass were eligible. Twenty-three articles were included in this
review: nine related to performance and fourteen to injury risk. On artificial grass, the soft-ground model in dry and wet conditions and the turf model in wet conditions are related to worse
performance. Compared with rounded studs, bladed studs improve performance during changes of direction on both natural and synthetic grass. Cleat models providing better traction on the stance leg
improve ball velocity, while those presenting a homogeneous pressure distribution across the foot promote better kicking accuracy. Bladed studs can be considered less secure because they increase plantar pressure on
the lateral border. The turf model decreases peak plantar pressure compared with other studded models. The soft-ground model provides lower performance, especially on artificial grass, while the turf
model provides a high protective effect on both surfaces.
#3 Deviating running kinematics and hamstring injury susceptibility in male soccer players: Cause or consequence?
Reference: Gait Posture. 2017 Jun 27;57:270-277. doi: 10.1016/j.gaitpost.2017.06.268. [Epub ahead of print]
Authors: Schuermans J, Van Tiggelen D, Palmans T, Danneels L, Witvrouw E
Summary: Although the vast majority of hamstring injuries in male soccer are sustained during high speed running, the association between sprinting kinematics and hamstring
injury vulnerability has never been investigated prospectively in a cohort at risk. This study aimed to objectify the importance of lower limb and trunk kinematics during full sprint in hamstring
injury susceptibility. At the end of the 2013 soccer season, three-dimensional kinematic data of the lower limb and trunk were collected during sprinting in a cohort consisting of 30 soccer
players with a recent history of hamstring injury and 30 matched controls. Subsequently, a 1.5 season follow up was conducted for (re)injury registry. Ultimately, joint and segment motion
patterns were submitted to retro- and prospective statistical curve analyses for injury risk prediction. Statistical analysis revealed that index injury occurrence was associated with higher
levels of anterior pelvic tilting and thoracic side bending throughout the airborne (swing) phases of sprinting, whereas no kinematic differences during running were found when comparing players
with a recent hamstring injury history with their matched controls. Deficient core stability, enabling excessive pelvis and trunk motion during swing, probably increases the primary injury risk.
Although sprinting carries an inherent risk of hamstring muscle failure in every athlete, running coordination was shown to be essential in hamstring injury prevention.
#4 Effects of lower-limb strength training on agility, repeated sprinting with changes of direction, leg peak power, and neuromuscular
adaptations of soccer players
Reference: J Strength Cond Res. 2017 Jan 24. doi: 10.1519/JSC.0000000000001813. [Epub ahead of print]
Authors: Hammami M, Negra Y, Billaut F, Hermassi S, Shephard RJ, Chelly MS
Summary: We examined the effects on explosive muscular performance of incorporating 8 weeks of strength training into the preparation of junior male soccer players, allocating
subjects between an experimental group (E, n=19) and a matched control group (C, n=12). Controls maintained their regular training program, while the experimental group replaced part of this
schedule with strength training. Performance was assessed using running times (5, 10, 20, 30 and 40 m), a sprint test with 180° turns (S180°), a 9-3-6-3-9 m sprint with backward and forward
running (SBF), a 4 x 5 m sprint test with turns, repeated shuttle sprinting, repeated changes of direction, squat (SJ) and counter-movement (CMJ) jumping, back half-squatting, and a
force-velocity test. Electromyographic (EMG) activity of the vastus lateralis (VL), vastus medialis (VM) and rectus femoris (RF) muscles was recorded during jumping. Two-way ANOVA showed
significant gains in E relative to C during the straight sprint (all distances). Scores of E increased substantially (p≤0.01) on S4 x 5 and SBF, and moderately on S180°. Leg peak power, SJ and
CMJ were also enhanced, with significant increases in EMG activity. However, repeated-sprint parameters showed no significant changes. We conclude that biweekly strength training improves key
components of performance in junior soccer players relative to standard in-season training.
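The group-by-time comparison reported here can be illustrated with a generic two-way factorial ANOVA. The Python sketch below uses synthetic sprint times and statsmodels; it is not the authors' analysis (which presumably handled repeated measures), and the group sizes and effect magnitudes are only loosely based on the summary above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)

# Synthetic 30 m sprint times (s): the experimental group improves after training
rows = []
for group, n, post_shift in [("E", 19, -0.15), ("C", 12, 0.0)]:
    base = rng.normal(4.60, 0.12, n)               # hypothetical baseline times
    for time_point, shift in [("pre", 0.0), ("post", post_shift)]:
        for value in base + shift + rng.normal(0, 0.03, n):
            rows.append({"group": group, "time": time_point, "sprint_30m": value})
df = pd.DataFrame(rows)

# Two-way factorial ANOVA; the group x time interaction is the term of interest
model = smf.ols("sprint_30m ~ C(group) * C(time)", data=df).fit()
print(anova_lm(model, typ=2))
```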
#5 Physiological Characteristics of Projected Starters and Non-Starters in the Field Positions from a Division I Women's Soccer
Team
Reference: Int J Exerc Sci. 2017 Jul 1;10(4):568-579. eCollection 2017.
Authors: Risso FG, Jalilvand F, Orjalo AJ, Moreno MR, Davis DL, Birmingham-Babauta SA, Stokes JJ, Stage AA, Liu TM, Giuliano DV, Lazar A, Lockie RG
Download link: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5466405/pdf/ijes-10-04-568.pdf
Summary: NCAA soccer features different substitution rules compared to FIFA-sanctioned matches, with a greater availability of players who can enter the game. This could
influence the physiological characteristics of the field position starters (ST) and non-starters (NST) within a collegiate women's team, which has not been previously analyzed. Thus, 22 field
players from the same Division I women's soccer squad completed: vertical and standing broad jumps; 30-meter (m) sprint (0-5, 0-10, 0-30 m intervals); pro-agility and 60-yard shuttle; and the
Yo-Yo Intermittent Recovery Test Level 1. Players were classified as ST (n=10) or NST (n=12) by the coaching staff. A one-way ANOVA identified any significant (p≤0.05) between-group differences, and
effect sizes were used for a magnitude-based inference analysis. Z-scores were also calculated to document worthwhile differences above or below the squad mean for the groups. The results showed
no significant between-group differences for any of the performance tests. ST did have a worthwhile difference above the squad mean in the 0-10 and 0-30 m sprint intervals, while NST had a
worthwhile difference below the squad mean in the 0-30 m interval. Physiological characteristics between ST and NST from the analyzed Division I squad were similar, although ST were generally
faster. The similarities between ST and NST may be a function of the team's training, in that all players may complete the same workouts. Nonetheless, if all players exhibit similar physiological
capacities, with appropriate substitutions by the coach a collegiate team should be able to maintain a high work-rate throughout a match.
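The "worthwhile difference" analysis rests on Z-scores of each group mean against the squad mean and SD. A minimal Python sketch follows, assuming hypothetical 0-10 m sprint times and a 0.2 SD threshold for a worthwhile difference; both the data and the threshold are illustrative, not taken from the paper.

```python
import numpy as np

def group_z_score(group_values, squad_values):
    """Z-score of a group mean relative to the whole-squad mean and SD."""
    squad_mean = np.mean(squad_values)
    squad_sd = np.std(squad_values, ddof=1)
    return (np.mean(group_values) - squad_mean) / squad_sd

# Hypothetical 0-10 m sprint times (s) for starters (ST) and non-starters (NST)
starters = [1.78, 1.80, 1.75, 1.82, 1.79, 1.77, 1.81, 1.76, 1.80, 1.78]
non_starters = [1.84, 1.86, 1.83, 1.88, 1.85, 1.82, 1.87, 1.84, 1.86, 1.83, 1.85, 1.88]
squad = starters + non_starters

for label, grp in [("ST", starters), ("NST", non_starters)]:
    z = group_z_score(grp, squad)
    # Assumption: |z| > 0.2 flagged as a "worthwhile" difference (0.2 SD threshold)
    flag = "  (worthwhile)" if abs(z) > 0.2 else ""
    print(f"{label}: z = {z:+.2f}{flag}")
```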
#6 Epidemiologic comparisons of soccer-related injuries presenting to emergency departments and reported within high school and collegiate
settings
Reference: Inj Epidemiol. 2017 Dec;4(1):19. doi: 10.1186/s40621-017-0116-9. Epub 2017 Jul 3.
Authors: Kerr ZY, Pierpoint LA, Currie DW, Wasserman EB, Comstock RD
Summary: Few studies compare sports injury patterns in different settings. This study described the epidemiology of soccer injuries presenting to emergency departments (EDs) and
compared ED-presenting injuries with those reported to collegiate and high school athletic trainers (ATs). Soccer-related injuries (product code 1267) in the National Electronic Injury
Surveillance System (NEISS) that were sustained by individuals at least two years of age in 2004-2013 were included. High School Reporting Information Online (HS RIO) data for high school soccer
injuries during the 2005/06-2013/14 academic years were compared to NEISS data for those aged 14-17 years in 2005-2013. National Collegiate Athletic Association Injury Surveillance Program
(NCAA-ISP) data for collegiate soccer injuries during the 2009/10-2013/14 academic years were compared to NEISS data for those aged 18-22 years in 2009-2013. All datasets included weights to
calculate national estimates. Injury proportion ratios (IPRs) with 95% confidence intervals (CIs) compared nationally estimated injury distributions between the HS RIO/NCAA-ISP and NEISS data
subsets. During the study period, 63,258 soccer-related injuries were captured by NEISS, which translates to an estimated 2,039,250 injuries seen at US EDs nationwide. Commonly injured body parts
included the head/face (19.1%), ankle (17.6%), hand/wrist (15.3%), and knee (12.2%). Common diagnoses included sprains/strains (34.0%), fractures (22.2%), and contusions (17.7%). Compared to
their respective age ranges in NEISS, sprains/strains comprised a larger proportion of injuries in HS RIO (48.3% vs. 33.7%; IPR = 1.38; 95% CI: 1.33, 1.42) and NCAA-ISP (51.3% vs. 37.0%;
IPR = 1.39; 95% CI: 1.31, 1.46). In contrast, fractures comprised a smaller proportion of injuries in HS RIO than in NEISS (7.5% vs. 18.6%; IPR = 0.43; 95% CI: 0.39, 0.47) and NCAA-ISP (2.8% vs.
15.7%; IPR = 0.18; 95% CI: 0.14, 0.22). ATs more commonly reported injuries that are easily diagnosed and treated (e.g., sprains/strains); EDs more commonly reported injuries with longer recovery
times and rehabilitation (e.g., fractures). Although ED surveillance data can identify the most severe sports-related injuries, high school and college sports surveillance may better describe the
breadth of sports-related injuries. Our findings may provide further support for school-based sports medicine professionals, but further research is needed to comprehensively examine the
potential economic and health-related benefits.
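The injury proportion ratio (IPR) comparison reduces to a ratio of two proportions with a log-normal confidence interval. The Python sketch below is a generic, unweighted version of that calculation (the study itself used weighted national estimates), and the counts are hypothetical.

```python
import math

def injury_proportion_ratio(a, n1, b, n2, z=1.96):
    """Ratio of injury proportions with a log-normal approximate 95% CI.

    a/n1: injuries of a given type / all injuries in dataset 1 (e.g., HS RIO)
    b/n2: the same quantities in dataset 2 (e.g., the age-matched NEISS subset)
    """
    p1, p2 = a / n1, b / n2
    ipr = p1 / p2
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of ln(IPR)
    lo = math.exp(math.log(ipr) - z * se_log)
    hi = math.exp(math.log(ipr) + z * se_log)
    return ipr, lo, hi

# Hypothetical unweighted counts: sprains/strains vs all injuries in each source
print(injury_proportion_ratio(a=4830, n1=10000, b=3370, n2=10000))
```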
#7 Internal training load and its longitudinal relationship with seasonal player wellness in elite professional soccer
Reference: Physiol Behav. 2017 Jun 28;179:262-267. doi: 10.1016/j.physbeh.2017.06.021. [Epub ahead of print]
Authors: Clemente FM, Mendes B, Nikolaidis PT, Calvete F, Carriço S, Owen AL
Summary: Monitoring internal training load (ITL) has been extensively used and described within team sport environments; however, when compared with internal physiological measures such
as heart rate (HR) and rating of perceived exertion (RPE), the literature on wellness monitoring is sparse. The primary aim of this investigation was to assess the effect of playing position on ITL, session-RPE
and wellness across two different training microcycles (1 vs. 2 competitive games), in addition to examining the relationship between ITL and Hooper's Index (HI) across an entire season. Thirty-five
professional soccer players from the Portuguese premier league participated in the study (25.7±5.0 years; 182.3±6.4 cm; 79.1±7.0 kg). Analysis of variance revealed higher values of delayed onset muscle soreness (DOMS) (mean (M):
3.33 vs. 3.10; p=0.001; effect size (ES)=0.087), fatigue (M: 3.18 vs. 2.99; p=0.001; ES=0.060) and HI (M: 11.85 vs. 11.56; p=0.045; ES=0.034) in 2-game weeks compared with 1-game weeks.
Correlation between ITL and HI levels found significant negative correlations between ITL and DOMS (ρ=-0.156), ITL and sleep (ρ=-0.109), ITL and fatigue (ρ=-0.225), ITL and stress (ρ=-0.188), and
ITL and HI (ρ=-0.238) in 2-game weeks. Results from the 1-game microcycle highlighted only a negative correlation between ITL and stress (ρ=-0.080). It was concluded from the study that greater
fatigue, muscle soreness, stress and ITL were significantly more apparent within a 2-game microcycle. As a result, care should be taken when planning the lead into and out of a 2-game
fixture microcycle, with key recovery strategies highlighted to dampen the increased stress effect. Additionally, the use of squad rotation strategies may be a positive
approach to managing the fatigue effect.
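The season-long correlation analysis pairs a session-RPE training load with the four Hooper wellness items. A minimal Python sketch with hypothetical data follows; the item scales, session counts and load values are assumptions, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_sessions = 40

# Hypothetical session-RPE internal load: CR10 RPE x session duration (arbitrary units)
duration_min = rng.integers(60, 100, n_sessions)
rpe = rng.integers(3, 9, n_sessions)
itl = rpe * duration_min

# Four Hooper items rated 1-7 (higher = worse); Hooper's Index is their sum
wellness = {item: rng.integers(1, 8, n_sessions)
            for item in ("fatigue", "stress", "DOMS", "sleep")}
hooper_index = sum(wellness.values())

rho, p = spearmanr(itl, hooper_index)
print(f"Spearman rho (ITL vs Hooper Index): {rho:.3f} (p = {p:.3f})")
```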
#8 Epidemiology of time loss groin injuries in a men's professional football league: a 2-year prospective study of 17 clubs and 606
players
Reference: Br J Sports Med. 2017 Jun 30. pii: bjsports-2016-097277. doi: 10.1136/bjsports-2016-097277. [Epub ahead of print]
Authors: Mosler AB, Weir A, Eirale C, Farooq A, Thorborg K, Whiteley RJ, Hӧlmich P, Crossley KM
Summary: Groin injury epidemiology has not previously been examined in an entire professional football league. We recorded and characterised time loss groin injuries sustained in
the Qatar Stars League. Male players were observed prospectively from July 2013 to June 2015. Time loss injuries, individual training and match play exposure were recorded by club doctors using
standardised surveillance methods. Groin injury incidence per 1000 playing hours was calculated, and descriptive statistics used to determine the prevalence and characteristics of groin injuries.
The Doha agreement classification system was used to categorise all groin injuries. 606 footballers from 17 clubs were included; groin injuries accounted for 206/1145 (18%) of all time loss injuries and were sustained by 150
players, at an incidence of 1.0/1000 hours (95% CI 0.9 to 1.1). At a club level, 21% (IQR 10%-28%) of players experienced groin injuries each season and 6.6 (IQR 2.9-9.1) injuries were sustained
per club per season. Of the 206 injuries, 16% were minimal (1-3 days), 25% mild (4-7 days), 41% moderate (8-28 days) and 18% severe (>28 days), with a median absence of 10 days/injury (IQR
5-22 days). The median days lost due to groin injury per club was 85 days per season (IQR 35-215 days). Adductor-related groin pain was the most common entity (68%) followed by iliopsoas (12%)
and pubic-related (9%) groin pain. Groin pain caused time loss for one in five players each season. Adductor-related groin pain comprised 2/3 of all groin injuries. Improving treatment outcomes
and preventing adductor-related groin pain has the potential to improve player availability in professional football.
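The headline incidence figure is a simple rate per 1000 player-hours. The Python sketch below shows that calculation with a normal-approximation confidence interval; the exposure value is hypothetical, back-calculated only to match the reported rate of 1.0/1000 h.

```python
import math

def incidence_per_1000h(n_injuries, exposure_hours, z=1.96):
    """Injury incidence per 1000 player-hours with a normal-approximation 95% CI."""
    rate = n_injuries / exposure_hours * 1000
    se = math.sqrt(n_injuries) / exposure_hours * 1000  # Poisson SE on the count
    return rate, rate - z * se, rate + z * se

# Hypothetical exposure, roughly consistent with the reported 1.0/1000 h incidence
print(incidence_per_1000h(n_injuries=206, exposure_hours=206_000))
```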
American Football
#1 Association of Playing High School Football With Cognition and Mental Health Later in Life
Reference: JAMA Neurol. 2017 Jul 3. doi: 10.1001/jamaneurol.2017.1317. [Epub ahead of print]
Authors: Deshpande SK, Hasegawa RB, Rabinowitz AR, Whyte J, Roan CL, Tabatabaei A, Baiocchi M, Karlawish JH, Master CL, Small DS
Summary: American football is the largest participation sport in US high schools and is a leading cause of concussion among adolescents. Little is known about the long-term
cognitive and mental health consequences of exposure to football-related head trauma at the high school level. The purpose was to estimate the association of playing high school football with
cognitive impairment and depression at 65 years of age. A representative sample of male high school students who graduated from high school in Wisconsin in 1957 was studied. In this cohort study
using data from the Wisconsin Longitudinal Study, football players were matched between March 1 and July 1, 2017, with controls along several baseline covariates such as adolescent IQ, family
background, and educational level. For robustness, 3 versions of the control condition were considered: all controls, those who played a noncollision sport, and those who did not play any sport.
A composite cognition measure of verbal fluency, memory and attention was constructed from results of cognitive assessments administered at 65 years of age. A modified Center for Epidemiological
Studies' Depression Scale score was used to measure depression. Secondary outcomes included results of individual cognitive tests, anger, anxiety, hostility, and heavy use of alcohol. Among the
3904 men (mean [SD] age, 64.4 [0.8] years at time of primary outcome measurement) in the study, after matching and model-based covariate adjustment, compared with each control condition, there
was no statistically significant harmful association of playing football with a reduced composite cognition score (-0.04 reduction in cognition vs all controls; 97.5% CI, -0.14 to 0.05) or an
increased modified Center for Epidemiological Studies' Depression Scale depression score (-1.75 reduction vs all controls; 97.5% CI, -3.24 to -0.26). After adjustment for multiple testing,
playing football did not have a significant adverse association with any of the secondary outcomes, such as the likelihood of heavy alcohol use at 65 years of age (odds ratio, 0.68; 95% CI,
0.32-1.43). Cognitive and depression outcomes later in life were found to be similar for high school football players and their nonplaying counterparts from the mid-1950s in Wisconsin. The risks of
playing football today may differ from those in the 1950s, but for current athletes, this study provides information on the risk of playing sports today that carry a similar risk of head trauma
as high school football played in the 1950s.
#2 Return to Sport and Performance After Anterior Cruciate Ligament Reconstruction in National Football League Linemen
Reference: Orthop J Sports Med. 2017 Jun 20;5(6):2325967117711681. doi: 10.1177/2325967117711681. eCollection 2017 Jun.
Authors: Cinque ME, Hannon CP, Bohl DD, Erickson BJ, Verma NN, Cole BJ, Bach BR Jr
Download link: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5480637/pdf/10.1177_2325967117711681.pdf
Summary: Tears of the anterior cruciate ligament (ACL) are common in the National Football League (NFL). The impact of these injuries on the careers of NFL linemen is unknown.
The purpose was to define the percentage of NFL linemen who return to sport (RTS) after ACL reconstruction, the mean time to RTS, and the impact on performance compared with matched controls.
Data on NFL offensive and defensive linemen who sustained an ACL tear and underwent ACL reconstruction between 1980 and 2015 were analyzed. Players were identified through NFL team websites,
publicly available injury reports, player profiles, and press releases. Demographics and mean in-game performance data preinjury and post-ACL reconstruction were recorded. A player was deemed to
have returned to sport if he played in at least 1 NFL game after his ACL reconstruction. A healthy control group was selected to compare in-game performance data and was matched with the study
group on several parameters. Overall, 73 NFL linemen met the inclusion criteria; 47 (64.3%) returned to play after ACL reconstruction (62.5% of offensive linemen, 65.9% of defensive linemen). All
offensive linemen who returned did so the season after injury. No difference existed in the number of seasons, total number of games played, mean number of games played, or mean number of
games started per season when offensive linemen who RTS after ACL reconstruction were compared with matched controls (all P > .05). Among defensive linemen who RTS, most returned the season
after injury (88.9%). There was no difference between defensive linemen who RTS after ACL reconstruction and matched controls in any performance metrics as an average over the remainder of their
career (all P > .05). However, NFL defensive linemen who tore their ACL played fewer total seasons than matched controls (P = .020). Overall, 64.3% of NFL offensive and defensive linemen who
underwent ACL reconstruction returned to play. Linemen who RTS did so at a high level, with no difference in in-game performance or career duration when compared with matched controls.
Australian Football
#1 Return to Play and Player Performance After Anterior Cruciate Ligament Injury in Elite Australian Rules Football Players
Reference: Orthop J Sports Med. 2017 Jun 21;5(6):2325967117711885. doi: 10.1177/2325967117711885. eCollection 2017 Jun.
Authors: Liptak MG, Angel KR
Download link: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5482352/pdf/10.1177_2325967117711885.pdf
Summary: Australian Rules football is a highly aerobic and anaerobic game that at times requires players to perform cutting or pivoting maneuvers, potentially exposing them to
anterior cruciate ligament (ACL) injury. At present, there are limited data available addressing the impact of ACL injury on return to play and preinjury form after ACL reconstruction. The
purpose was to determine the prevalence of ACL injury and the incidence of further ACL injury, and to consider player return to play and return to preinjury form after ACL reconstruction. It was
hypothesized that elite-level Australian Football League (AFL) players do not return to preinjury form until, at minimum, 2 years after returning to play. A retrospective analysis was undertaken
on a cohort of elite AFL players who injured their ACL between 1990 and 2000. Return to preinjury form after ACL reconstruction was assessed using the mean number of ball disposals (releases of the ball
by hand or foot) at 1, 2, and 3 years after return to play, compared with preinjury values. Associations between player and injury characteristics, method of reconstruction, and outcomes
(return to play, preinjury form, and further ACL injury) were examined. During the included seasons, a total of 2723 AFL players were listed. Of these, 131 (4.8%) sustained an ACL injury, with
115 players eligible for inclusion. Of 115 players, 26% did not return to elite competition, while 28% of those who did return experienced further ACL injury. The adjusted mean number of
disposals (± standard error of the mean) was significantly lower at 1 year (12.21 ± 0.63; P = .003), 2 years (12.09 ± 0.65; P = .008), and 3 years (11.78 ± 0.77; P = .01) after return to play
compared with preinjury (14.23 ± 0.67). On average, players did not return to preinjury form by 3 years (P < .01). Players aged 30 years or older were less likely to return to play compared
with younger players (P = .0002), moderate-weight players were more likely to return to play compared with lighter-weight players (P = .007), and the odds of returning to play were significantly
lower if the dominant side was injured (odds ratio, 0.10; 95% CI, 0.03-0.34; P = .0002). On average, AFL players do not return to their preinjury form after ACL injury and
reconstruction, a common injury in this sporting population. This, along with the high rate of reinjury, highlights the career-threatening nature of ACL injury for elite AFL players.