The Relationship Between Blast-related Hearing Threshold Shift and Insomnia in U.S. Military Personnel.
Hearing loss and insomnia emerged as preeminent sources of morbidity among military service members and veterans who served in the recent Iraq and Afghanistan conflicts. Significant threshold shift (STS), an early indicator of hearing loss, has not been studied in relation to insomnia. This study's objective was to examine the co-occurrence of STS and insomnia among U.S. military personnel with blast-related injury. A total of 652 service members who were blast-injured during military operations in Iraq or Afghanistan between 2004 and 2012 were identified from the Blast-Related Auditory Injury Database. Pre- and post-injury audiometric data were used to ascertain new-onset STS, defined as a 30 dB or greater increase in the sum of thresholds at 2,000, 3,000, and 4,000 Hz for either ear. Insomnia diagnosed within 2 years post-injury was abstracted from electronic medical records. Multivariable logistic regression analysis examined the relationship between STS and insomnia, while adjusting for age, year of injury, occupation, injury severity, tinnitus and concussion diagnosed in-theater, and PTSD. A majority of the study sample was aged 18-25 years (79.9%) and sustained mild-to-moderate injuries (92.2%). STS was present in 21.1% of service members. Cumulative incidence of diagnosed insomnia was 22.3% and 11.1% for those with and without STS, respectively. After adjusting for covariates, those with STS had nearly twofold higher odds of insomnia (odds ratio [OR] = 1.91, 95% CI = 1.12-3.24) compared with those without STS. In multivariable modeling, the strongest association was between PTSD and insomnia (OR = 5.57, 95% CI = 3.35-9.26). A secondary finding of note was that military personnel with STS had a significantly higher frequency of PTSD compared with those without STS (28.1% vs. 15.2%). Hearing threshold shift was associated with insomnia in military personnel with blast-related injury and could be used to identify service members at risk. Multidisciplinary care is needed to manage the co-occurrence of both conditions during the post-deployment rehabilitation phase. Future research should evaluate the specific mechanisms involved in this relationship and further explore the association between hearing threshold shift and PTSD.
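The STS definition above is directly computable from paired audiograms, and the same rule recurs in the next study below. A minimal sketch in Python, assuming audiograms stored as dicts mapping frequency to threshold in dB HL; function and variable names are illustrative, not from the study.

```python
# Sketch of the STS rule described above: a 30 dB or greater increase in
# the sum of pure-tone thresholds at 2,000, 3,000, and 4,000 Hz in either
# ear, post-injury relative to pre-injury. Names are illustrative.

STS_FREQS_HZ = (2000, 3000, 4000)
STS_CUTOFF_DB = 30

def threshold_sum(audiogram: dict[int, float]) -> float:
    """Sum the thresholds (dB HL) at the STS frequencies."""
    return sum(audiogram[f] for f in STS_FREQS_HZ)

def has_sts(pre: dict[int, float], post: dict[int, float]) -> bool:
    """True if the post-injury sum exceeds the pre-injury sum by >= 30 dB."""
    return threshold_sum(post) - threshold_sum(pre) >= STS_CUTOFF_DB

def has_sts_either_ear(pre_by_ear: dict, post_by_ear: dict) -> bool:
    """Apply the rule per ear; STS is present if either ear meets it."""
    return any(has_sts(pre_by_ear[e], post_by_ear[e]) for e in ("left", "right"))

# Example: a 15 dB worsening at 3,000 Hz plus 20 dB at 4,000 Hz sums to 35 dB.
pre = {"right": {2000: 5, 3000: 10, 4000: 10}, "left": {2000: 5, 3000: 5, 4000: 10}}
post = {"right": {2000: 5, 3000: 25, 4000: 30}, "left": {2000: 5, 3000: 5, 4000: 15}}
print(has_sts_either_ear(pre, post))  # True
```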
- Research Article
1
- 10.1097/aud.0000000000001359
- Apr 5, 2023
- Ear and hearing
Military personnel are exposed to multiple risk factors for hearing loss, particularly on the battlefield. The objective of this study was to determine whether pre-existing hearing loss predicted hearing threshold shift in male U.S. military personnel following injury during combat deployment. This was a retrospective cohort study of 1573 male military personnel physically injured in Operations Enduring and Iraqi Freedom between 2004 and 2012. Audiograms before and after injury were analyzed and used to calculate significant threshold shift (STS), defined as a 30 dB or greater change in the sum of hearing thresholds at 2000, 3000, and 4000 Hz in either ear on the postinjury audiogram, relative to the same frequencies on the preinjury audiogram. Twenty-five percent (n = 388) of the sample had preinjury hearing loss, which mostly occurred in the higher frequencies (i.e., 4000 and 6000 Hz). The prevalence of postinjury STS ranged from 11.7% to 33.3% as preinjury hearing level moved from better to worse. In multivariable logistic regression, preinjury hearing loss was a predictor of STS, and there was a dose-response relationship between severity of preinjury hearing threshold and postinjury STS, specifically for preinjury hearing levels of 40 to 45 dBHL (odds ratio [OR] = 1.99; 95% confidence interval [CI] = 1.03 to 3.88), 50 to 55 dBHL (OR = 2.33; 95% CI = 1.17 to 4.64), and >55 dBHL (OR = 3.77; 95% CI = 2.25 to 6.34). These findings suggest that better preinjury hearing confers greater resistance to threshold shift than impaired preinjury hearing. Although STS is calculated using 2000 to 4000 Hz, clinicians must closely attend to the pure-tone response at 6000 Hz and use this test frequency to identify service members at risk for STS prior to combat deployment.
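For illustration, the dose-response bands reported above can be encoded as a simple lookup of preinjury hearing level to reported odds ratio. Only the bands named in the abstract are included; how the study treated levels between or below these bands is not stated here, so the fall-through behavior is an assumption.

```python
# Reported odds ratios for postinjury STS by preinjury hearing-level band
# (dB HL), as given in the abstract. Handling of levels outside these
# bands is an assumption, not part of the study's analysis.

REPORTED_OR_BY_BAND = [
    ((40.0, 45.0), 1.99),          # OR = 1.99, 95% CI = 1.03-3.88
    ((50.0, 55.0), 2.33),          # OR = 2.33, 95% CI = 1.17-4.64
    ((55.0, float("inf")), 3.77),  # OR = 3.77, 95% CI = 2.25-6.34 (>55 dB HL)
]

def reported_or(preinjury_db_hl: float):
    """Return the abstract's OR for a preinjury hearing level, if any."""
    for (lo, hi), odds_ratio in REPORTED_OR_BY_BAND:
        if lo <= preinjury_db_hl <= hi:
            return odds_ratio
    return None  # level falls outside the reported bands

print(reported_or(52))  # 2.33
```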
- Research Article
3
- 10.1097/aud.0000000000001285
- Oct 18, 2022
- Ear & Hearing
To examine the association between tinnitus and hearing outcomes among US military personnel after blast injury, including any hearing loss, low-frequency hearing loss, high-frequency hearing loss, early warning shift, and significant threshold shift. In this retrospective study, the Blast-Related Auditory Injury Database was queried for male military service members who had audiometric data 2 years before and after blast injury between 2004 and 2012 with no history of hearing loss or tinnitus before injury (n = 1693). Tinnitus was defined by diagnostic codes in electronic health records. Multivariable logistic regression examined the association between tinnitus and hearing outcomes, while adjusting for covariates. Overall, 14.2% (n = 241) of the study sample was diagnosed with tinnitus within 2 years after blast injury. The proportions of all examined hearing outcomes were higher among service members with tinnitus than those without (p < 0.001). In multivariable analysis, service members with tinnitus had higher adjusted odds of any hearing loss (odds ratio [OR] = 1.72, 95% confidence interval [CI] = 1.20-2.47), low-frequency hearing loss (OR = 2.77, 95% CI = 1.80-4.26), high-frequency hearing loss (OR = 2.15, 95% CI = 1.47-3.16), early warning shift (OR = 1.83, 95% CI = 1.36-2.45), and significant threshold shift (OR = 2.15, 95% CI = 1.60-2.89) compared with service members without tinnitus. The findings of this study demonstrate that tinnitus diagnosed within 2 years after blast injury is associated with the examined hearing outcomes in US military personnel. Service members with blast injury who subsequently experience tinnitus should receive routine audiometric hearing conservation testing and be carefully examined for poor hearing outcomes by an audiologist.
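As a representative sketch of the multivariable analysis described above (and in several of the other studies listed here), the following fits a logistic regression on synthetic data and exponentiates a coefficient to obtain an adjusted odds ratio. All data and variable names are fabricated for illustration; the study's actual covariates and modeling choices are only partially described in the abstract.

```python
# Adjusted odds ratio via multivariable logistic regression on synthetic
# data, illustrating the kind of analysis described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1693  # sample size from the abstract; everything else is synthetic
df = pd.DataFrame({
    "tinnitus": rng.integers(0, 2, n),
    "age": rng.integers(18, 45, n),
    "injury_severity": rng.integers(1, 4, n),
})
# Simulate an outcome that depends on tinnitus and age.
logit_p = -2.0 + 0.7 * df["tinnitus"] + 0.02 * df["age"]
df["hearing_loss"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("hearing_loss ~ tinnitus + age + C(injury_severity)", data=df).fit()
print(np.exp(model.params["tinnitus"]))  # adjusted OR for tinnitus
```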
- Research Article
19
- 10.1186/s12889-020-08696-4
- Apr 28, 2020
- BMC Public Health
Background: Blast injury emerged as a primary source of morbidity among US military personnel during the recent conflicts in Iraq and Afghanistan, and led to an array of adverse health outcomes. Multimorbidity, or the presence of two or more medical conditions in an individual, can complicate treatment strategies. To date, there is minimal research on the impact of multimorbidity on long-term patient-reported outcomes. We aimed to define multimorbidity patterns in a population of blast-injured military personnel, and to examine these patterns in relation to long-term quality of life (QOL). Methods: A total of 1972 US military personnel who sustained a blast-related injury during military operations in Iraq and Afghanistan were identified from clinical records. Electronic health databases were used to identify medical diagnoses within the first year postinjury, and QOL was measured with a web-based assessment. Hierarchical cluster analysis using Ward's minimum-variance method was employed to identify clusters of related medical diagnosis categories. Duncan's multiple range test was used to group clusters into domains by QOL. Results: Five distinct clusters were identified and grouped into three QOL domains. The lowest QOL domain contained one cluster with a clinical triad reflecting musculoskeletal pain, concussion, and mental health morbidity. The middle QOL domain had two clusters, one with concussion/anxiety predominating and the other with polytrauma. The highest QOL domain had two clusters with little multimorbidity aside from musculoskeletal pain. Conclusions: The present study described blast-related injury profiles with varying QOL levels that may indicate the need for integrated health services. Implications exist for current multidisciplinary care of wounded active duty and veteran service members, and future research should determine whether multimorbidity denotes distinct post-blast injury syndromes.
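A minimal sketch of the clustering step described above, using Ward's minimum-variance linkage from SciPy on synthetic diagnosis indicators. The abstract does not specify the preprocessing, the exact feature matrix, or how the five-cluster solution was chosen, so those details here are assumptions.

```python
# Hierarchical clustering with Ward's minimum-variance linkage, as named
# in the Methods above. Rows are individuals, columns are binary diagnosis
# indicators; all data are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
diagnoses = rng.integers(0, 2, size=(200, 12)).astype(float)

Z = linkage(diagnoses, method="ward")            # Ward linkage (Euclidean)
labels = fcluster(Z, t=5, criterion="maxclust")  # cut dendrogram into 5 clusters
print(np.bincount(labels)[1:])                   # cluster sizes
```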
- Research Article
63
- 10.3109/02699052.2010.536195
- Nov 30, 2010
- Brain Injury
Primary objective: To assess the occurrence of ocular and visual disorders following blast-related traumatic brain injury (TBI) in Operation Iraqi Freedom. Research design: Retrospective cohort study. Methods and procedures: A total of 2254 US service members with blast-related combat injuries were identified for analysis from the Expeditionary Medical Encounter Database. Medical record information near the point of injury was used to assess factors associated with the diagnosis of ocular/visual disorder within 12 months after injury, including severity of TBI. Main outcomes and results: Of 2254 service members, 837 (37.1%) suffered a blast-related TBI and 1417 (62.9%) had other blast-related injuries. Two hundred and one (8.9%) were diagnosed with an ocular or visual disorder within 12 months after blast injury. Compared with service members with other injuries, the odds of ocular/visual disorder were significantly higher for service members with moderate TBI (odds ratio (OR) = 1.58, 95% confidence interval (CI) = 1.02–2.45) and serious to critical TBI (OR = 14.26, 95% CI = 7.00–29.07). Conclusions: Blast-related TBI is strongly associated with visual dysfunction within 1 year after injury, and the odds of disorder appear to increase with severity of brain injury. Comprehensive vision examinations following TBI in theatre may be necessary.
- Research Article
4
- 10.1044/leader.ftr1.15132010.10
- Nov 1, 2010
- The ASHA Leader
Using Telehealth to Treat Combat-Related Traumatic Brain Injury
- Research Article
30
- 10.1093/milmed/usz440
- Dec 28, 2019
- Military Medicine
Studies examining the mental health outcomes of military personnel deployed into combat zones have focused on the risk of developing post-traumatic stress disorder conferred by mild or moderate traumatic brain injury (TBI). However, other mental health outcomes among veterans who sustained critical combat injuries have not been described. We examined the associations of moderate and severe TBI and combat injury with the risk for anxiety and mood disorders, adjustment reactions, schizophrenia and other psychotic disorders, cognitive disorders, and post-traumatic stress disorder. We conducted a retrospective cohort study of U.S. military service members critically injured in combat during military operations in Iraq and Afghanistan from February 1, 2002, to February 1, 2011. Health care encounters were drawn from (1) the Department of Defense (DoD) Trauma Registry (TR), (2) acute and ambulatory care in military facilities, and (3) civilian facilities reimbursed by Tricare. Service members who sustain severe combat injury require critical care. We estimated the risk of mental health outcomes using logit models adjusted for demographic and clinical factors, and explored the relationship between TBI and the total number of mental health diagnoses. Of the 4,980 subjects who met inclusion criteria, most injuries occurred among members of the Army (72%) or Marines (25%), with a mean (SD) age of 25.5 (6.1) years. The prevalence of moderate or severe TBI was 31.6%, with explosion as the most common mechanism of injury (78%). We found that 71% of the cohort was diagnosed with at least one of the mental health conditions examined, and the adjusted risk conferred by TBI ranged from a modest increase for anxiety disorder (odds ratio, 1.27; 95% confidence interval [CI], 1.11-1.45) to a large increase for cognitive disorder (odds ratio, 3.24; 95% CI, 2.78-3.77). TBI was also associated with an increased number of mental health diagnoses (incidence rate ratio, 1.52; 95% CI, 1.42-1.63). Combat-associated TBI may have a broad effect on several mental health conditions among critically injured combat casualties. Early recognition and treatment of trauma-associated mental health conditions are crucial to improving outcomes among service personnel as they transition to post-deployment care in the DoD, Department of Veterans Affairs, or community health systems.
- Dataset
- 10.1037/e537272008-001
- Jan 1, 1976
Hearing conservation audiometry reports received at the USAF Hearing Conservation Data Registry from January through June 1975 were grouped according to the Air Force Specialty Code (job description) shown for the individual. The 48,271 records surveyed included 46 job codes with 50 or more reports and 47 with fewer than 50; 5,298 records had no identifiable job code. The percentage of records showing significant threshold shift was 23.21% for the total group. The percentage of significant threshold shift was also calculated for each job code with 50 or more reports so that each could be compared with the average for the entire group.
- Research Article
14
- 10.1093/milmed/167.1.48
- Jan 1, 2002
- Military Medicine
This study presents audiometric information from 54,057 Navy enlisted personnel in the Navy and Marine Corps Hearing Conservation Program database from 1995 to 1999. The purpose was to compare current threshold shift patterns for the Navy enlisted population with the historical literature and to review programmatic effectiveness issues. The data suggest that 82% of the population did not display significant threshold shift (STS) on the "annual" and "termination" audiograms, which increased to 94% after the "follow-up 2" examination. Compared with historical data, STS rates were significantly lower for the most junior enlisted personnel (E1-E3) (odds ratio = 0.34, p = 0.00, 95% confidence interval = 0.30-0.39) but not significantly different for more senior enlisted personnel (odds ratio = 0.96, p = 0.22, 95% confidence interval = 0.90-1.03). STS rates did not appear to correlate with expected "high" and "low" noise-exposure Navy enlisted occupations, suggesting that further investigation is needed to address possible risk factors other than noise intensity/duration.
- Research Article
6
- 10.1080/14992027.2020.1743884
- Apr 15, 2020
- International Journal of Audiology
Objective: To identify clinical audiometric patterns of hearing loss following blast-related injury (BRI) in US military personnel. Design: Retrospective cohort study. Study sample: A total of 1186 male Navy and Marine Corps service members with normal hearing thresholds on pre-injury audiograms who had post-injury audiograms in the Blast-Related Auditory Injury Database. Results: Low- and high-frequency pure-tone averages (PTAs) were significantly higher in those with BRI than non-blast-related injury (NBRI) for both ears (p < 0.001 for all comparisons). Overall, 172 (15%) service members met criteria for post-injury hearing loss and were categorised into PTA or single-frequency hearing loss subgroups. PTA hearing loss was more common in the BRI group (50% vs. 33%, p = 0.036), whereas single-frequency hearing loss was more common in the NBRI group. Most hearing loss was mild to moderate in degree, and three distinct audiometric patterns emerged (i.e. flat, sloping and rising). A flat pattern was the most prevalent configuration among those with PTA hearing loss, especially bilateral loss. Single-frequency hearing loss was mostly unilateral and high frequency. Conclusions: In this study, BRI produced hearing loss across test frequencies, generating more clinically actionable post-injury audiograms than NBRI. We found that post-injury audiometric patterns of hearing loss among military personnel may vary.
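The abstract names three audiometric configurations (flat, sloping, rising) without giving a numeric rule. A hedged sketch of one plausible classification follows; the frequency sets and the 10 dB flatness tolerance are assumptions, not the study's criteria.

```python
# A plausible way to label an audiogram configuration as flat, sloping
# (worse at high frequencies), or rising (worse at low frequencies). The
# abstract names these patterns but gives no numeric rule; the frequency
# sets and the 10 dB flatness tolerance below are assumptions.

LOW_FREQS_HZ = (500, 1000, 2000)
HIGH_FREQS_HZ = (3000, 4000, 6000)
FLAT_TOLERANCE_DB = 10

def pta(audiogram: dict[int, float], freqs) -> float:
    """Pure-tone average (dB HL) over the given frequencies."""
    return sum(audiogram[f] for f in freqs) / len(freqs)

def configuration(audiogram: dict[int, float]) -> str:
    low, high = pta(audiogram, LOW_FREQS_HZ), pta(audiogram, HIGH_FREQS_HZ)
    if abs(high - low) <= FLAT_TOLERANCE_DB:
        return "flat"
    return "sloping" if high > low else "rising"

print(configuration({500: 30, 1000: 30, 2000: 35, 3000: 50, 4000: 60, 6000: 65}))  # sloping
```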
- Research Article
2
- 10.1002/lio2.746
- Jan 25, 2022
- Laryngoscope Investigative Otolaryngology
Objective: To evaluate the effectiveness of nicergoline in preventing temporary threshold shift (TTS) in military personnel. Study design: A randomized controlled trial. Methods: Two hundred and twenty-four participants were enrolled. Nicergoline 30 mg twice daily was prescribed to the study group (n = 119) for 3 weeks, and a placebo was prescribed to the control group (n = 105) for 3 weeks. Audiometric thresholds were measured at baseline and within 24 h after the participants attended a 1-day weapons firing practice; all participants wore foam earplugs during the firing practice. TTS was assessed using a variety of published significant threshold shift (STS) definitions, and the effect of treatment group on the magnitude of pre- to postexposure threshold shifts was estimated. Tinnitus and other adverse effects of the medication were recorded. Results: The incidence of STS was 65.4% in the study group and 75% in the control group. Negative STS (improved thresholds) occurred in 68.6% of the study group and 44.7% of the control group; positive STS (worsened thresholds) occurred in 31.4% and 55.3%, respectively. The effect of treatment in participants receiving nicergoline showed significant coefficients (change in dB) in both ears (p = .001). The mean threshold difference in participants receiving nicergoline showed negative STS at all tested frequencies without statistical significance, whereas the mean threshold difference in participants receiving placebo showed positive STS with statistical significance. Additionally, 16 ears showed a warning sign of permanent hearing loss; these participants, who were from the control group, had a longer duration of tinnitus (p = .042). Serious adverse effects of nicergoline were considerably low. Conclusion: The study results suggest that nicergoline may attenuate noise-related TTS and tinnitus, justifying further investigation of the effectiveness of this drug as an otoprotectant. Level of evidence: 2.
- Research Article
22
- 10.3766/jaaa.21.5.3
- May 1, 2010
- Journal of the American Academy of Audiology
There is disagreement about ototoxicity monitoring methods. Controversy exists about what audiometric threshold shift criteria should be used, which frequencies should be tested, and with what step size. An evaluation of the test performance achieved using various criteria and methods for ototoxicity monitoring may help resolve these issues. The aims were to (1) evaluate test performance achieved using various significant threshold shift (STS) definitions for ototoxicity monitoring in a predominantly veteran population; and (2) determine whether testing in 1/6- or 1/3-octave steps improves test performance compared with 1/2-octave steps. A prospective, observational study design was used in which STSs were evaluated at frequencies within an octave of each subject's high-frequency hearing limit at two time points, an early monitoring test and the final monitoring test. Data were analyzed from 78 ears of 41 patients receiving cisplatin and from 53 ears of 28 hospitalized patients receiving nonototoxic antibiotics. Cisplatin-treated subjects received a cumulative dosage ≥350 mg by the final monitoring test. Testing schedule, age, and pre-exposure hearing characteristics were similar between the subject groups. Threshold shifts relative to baseline were examined to determine whether they met criteria based on magnitudes of positive STS (shifts of ≥5, 10, 15, or 20 dB) and numbers of frequencies affected (shifts at ≥1, 2, or 3 adjacent frequencies) for data collected using approximately 1/6-, 1/3-, or 1/2-octave steps. Thresholds were confirmed during monitoring sessions in which shifts were identified. Test performance was evaluated with receiver operating characteristic (ROC) curves developed using a surrogate "gold standard": true positive (TP) rates were derived from the cisplatin-exposed group and false positive (FP) rates from the nonexposed control group. Best STS definitions were identified that achieved the greatest areas under the ROC curves or resulted in the highest TP rates for a fixed FP rate near 5%, chosen to minimize the number of patients incorrectly diagnosed with ototoxic hearing loss. At the early monitoring test, average threshold shifts differed only slightly across groups. Test-frequency step size did not affect performance, and changes at one or more frequencies yielded the best test performance. At the final monitoring test, average threshold shifts were +10.5 dB for the cisplatin group, compared with -0.2 dB for the control group. Compared with the 1/2-octave step size used clinically, use of smaller frequency steps improved test performance for threshold shifts at ≥2 or ≥3 adjacent frequencies. Best overall test performance was achieved using a criterion cutoff of ≥10 dB threshold shift at ≥2 adjacent frequencies tested in 1/6-octave steps. Best test performance for the 1/2-octave step size was achieved for shifts ≥15 dB at one or more frequencies. An ototoxicity monitoring protocol that uses an individualized, one-octave range of frequencies tested in 1/6-octave steps is quick to administer and has an acceptable FP rate. Similar test performance can be achieved using 1/3-octave test frequencies, which further reduces monitoring test time.
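The best-performing criterion above (a shift of ≥10 dB at ≥2 adjacent frequencies) reduces to a run-length check over per-frequency shifts. A minimal sketch, with illustrative names; the 1/2-octave rule (≥15 dB at one or more frequencies) is the same check with different parameters.

```python
# The best-performing criterion above: a shift of >= 10 dB at >= 2 adjacent
# test frequencies relative to baseline. Thresholds are assumed to be listed
# in order of ascending frequency (e.g., 1/6-octave steps); names are
# illustrative, not from the paper.

def meets_sts_criterion(baseline_db, monitor_db, shift_db=10, n_adjacent=2):
    """True if >= n_adjacent consecutive frequencies shift by >= shift_db."""
    run = 0
    for b, m in zip(baseline_db, monitor_db):
        run = run + 1 if (m - b) >= shift_db else 0
        if run >= n_adjacent:
            return True
    return False

baseline = [20, 20, 25, 25, 30, 35]
monitor  = [20, 25, 40, 40, 35, 40]  # 15 dB shifts at two adjacent frequencies
print(meets_sts_criterion(baseline, monitor))                             # True
print(meets_sts_criterion(baseline, monitor, shift_db=15, n_adjacent=1))  # True (1/2-octave rule)
```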
- Research Article
12
- 10.1503/cmaj.100244
- Feb 14, 2011
- Canadian Medical Association Journal
Nonmilitary personnel play an increasingly critical role in modern wars. Stark differences exist between the demographic characteristics, training and missions of military and nonmilitary members. We examined the differences in types of injury and rates of returning to duty among nonmilitary and military personnel participating in military operations in Iraq and Afghanistan. We collected data for nonmilitary personnel medically evacuated from military operations in Iraq and Afghanistan between 2004 and 2007. We compared injury categories and return-to-duty rates in this group with previously published data for military personnel and identified factors associated with return to duty. Of the 2155 medically evacuated nonmilitary personnel, 74.7% did not return to duty. War-related injuries in this group accounted for 25.6% of the evacuations, the most common causes being combat-related injuries (55.4%) and musculoskeletal/spinal injuries (22.9%). Among individuals with non-war-related injuries, musculoskeletal injuries accounted for 17.8% of evacuations. Diagnoses associated with the highest return-to-duty rates in the group of nonmilitary personnel were psychiatric diagnoses (15.6%) among those with war-related injuries and noncardiac chest or abdominal pain (44.0%) among those with non-war-related injuries. Compared with military personnel, nonmilitary personnel with war-related injuries were less likely to return to duty (4.4% v. 5.9%, p = 0.001) but more likely to return to duty after non-war-related injuries (32.5% v. 30.7%, p = 0.001). Compared with military personnel, nonmilitary personnel were more likely to be evacuated with non-war-related injuries but more likely to return to duty after such injuries. For evacuations because of war-related injuries, this trend was reversed.
- Research Article
292
- 10.1242/jeb.00755
- Jan 22, 2004
- Journal of Experimental Biology
Fishes are often exposed to environmental sounds such as those associated with shipping, seismic experiments, sonar and/or aquaculture pump systems. While efforts have been made to document the effects of such anthropogenic (human-generated) sounds on marine mammals, the effects of excess noise on fishes are poorly understood. We examined the short- and long-term effects of increased ambient sound on the stress and hearing of goldfish (Carassius auratus; a hearing specialist). We reared fish under either quiet (110-125 dB re 1 microPa) or noisy (white noise, 160-170 dB re 1 microPa) conditions and examined animals after specific durations of noise exposure. We assessed noise-induced alterations in physiological stress by measuring plasma cortisol and glucose levels and in hearing capabilities by using auditory brainstem responses. Noise exposure did not produce long-term physiological stress responses in goldfish, but a transient spike in plasma cortisol did occur within 10 min of the noise onset. Goldfish had significant threshold shifts in hearing after only 10 min of noise exposure, and these shifts increased linearly up to approximately 28 dB after 24 h of noise exposure. Further noise exposure did not increase threshold shifts, suggesting an asymptote of maximal hearing loss within 24 h. After 21 days of noise exposure, it took goldfish 14 days to fully recover to control hearing levels. This study shows that hearing-specialist fishes may be susceptible to noise-induced stress and hearing loss.
- Research Article
72
- 10.1288/00005537-198307000-00014
- Jul 1, 1983
- The Laryngoscope
Hearing conservation in industry relies heavily on monitoring audiometry to detect early noise-induced hearing loss in workers who are exposed to potentially damaging noise, with or without hearing protectors. The "real-world" reliability and validity of these measurements, as well as otoscopic observations in industry, have not been extensively investigated. In addition, there is considerable controversy over the selection of a definition of "significant threshold shift" in industrial audiometry. These and related issues were considered in a series of three studies utilizing data from an active hearing conservation program. Test-retest variability in industry is much higher than has been reported for clinical settings; this variability is reduced by pure-tone averaging. Workers referred for otologic evaluation were found to have hearing levels which were, on the average, about 5 dB better than indicated by plant audiometry, even without excluding 4% of referred workers who had unilateral deafness and showed "shadow curves" on the plant audiograms. Otoscopic data obtained by the plant audiometrists were uncorrelated with the results of otoscopy by consultant otologists. Techniques borrowed from decision theory and signal detection theory were used to evaluate possible criteria for significant threshold shift. Criteria based on pure-tone averaging were superior to those based on a certain amount of threshold shift for any frequency tested. It is proposed that a significant threshold shift be defined as a 10 dB or greater change for the worse for either the 0.5, 1, 2 kHz pure-tone average or the 3, 4, 6 kHz pure-tone average, in either ear, and that such shifts be validated by prompt retesting. Even with this criterion, a substantial number of shifts (most shifts, in some situations) will be either spurious or attributable to disorders other than noise-induced hearing loss, such as presbycusis. Otologic referral in cases of large or repeated shifts may prevent unjustified administrative actions, to the advantage of both workers and management. A practical consequence of the use of monitoring audiometry may be a de facto lowering of the permissible exposure level to 85 dBA TWA.
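The criterion proposed above (a 10 dB or greater worsening in either the 0.5/1/2 kHz or the 3/4/6 kHz pure-tone average, in either ear) can be sketched as follows. Names are illustrative, and the validation-by-prompt-retest step is omitted.

```python
# Proposed industrial STS: >= 10 dB worsening in either the low (0.5/1/2 kHz)
# or high (3/4/6 kHz) pure-tone average, in either ear. Higher dB HL = worse.

LOW_PTA_HZ = (500, 1000, 2000)
HIGH_PTA_HZ = (3000, 4000, 6000)
SHIFT_DB = 10

def pta(audiogram: dict[int, float], freqs) -> float:
    return sum(audiogram[f] for f in freqs) / len(freqs)

def significant_shift(baseline: dict, current: dict) -> bool:
    """True if either pure-tone average worsened by >= 10 dB."""
    return any(
        pta(current, freqs) - pta(baseline, freqs) >= SHIFT_DB
        for freqs in (LOW_PTA_HZ, HIGH_PTA_HZ)
    )

def flag_either_ear(baseline_by_ear: dict, current_by_ear: dict) -> bool:
    return any(
        significant_shift(baseline_by_ear[e], current_by_ear[e])
        for e in ("left", "right")
    )

baseline = {"left": {500: 10, 1000: 10, 2000: 15, 3000: 20, 4000: 25, 6000: 30},
            "right": {500: 10, 1000: 15, 2000: 15, 3000: 25, 4000: 30, 6000: 35}}
current  = {"left": {500: 10, 1000: 15, 2000: 15, 3000: 35, 4000: 40, 6000: 45},
            "right": {500: 10, 1000: 15, 2000: 15, 3000: 25, 4000: 30, 6000: 35}}
print(flag_either_ear(baseline, current))  # True: left high-frequency PTA worsened 15 dB
```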
- Research Article
2
- 10.1121/10.0009824
- Mar 1, 2022
- The Journal of the Acoustical Society of America
Standard clinical protocols require hearing protection during magnetic resonance imaging (MRI) for patient safety. This investigation prospectively evaluated the impact of acoustic noise exposure on auditory function during a 3.0T MRI in healthy adults. Twenty-nine participants with normal hearing underwent a comprehensive audiologic assessment before and immediately following a clinically indicated head MRI. Appropriate hearing protection with earplugs (and pads) was used per standard of practice. To characterize noise hazards, current sound monitoring tools were used to measure the noise levels of the pulse sequences. A third audiologic test was performed within 30 days post MRI if a significant threshold shift (STS) was identified at the second test. Some sequences produced high levels (up to 114.5 dBA; 129 dB peak SPL) that required hearing protection but did not exceed 100% daily noise dose. One participant exhibited an STS in the frequency region most highly associated with noise-induced hearing loss, but no participants experienced OSHA-defined STS in either ear. Overall, OAE measures did not show evidence of changes in cochlear function after MRI. In conclusion, hearing threshold shifts associated with hearing loss, or OAE level shifts reflecting underlying cochlear damage, were not detected in any of the 3.0T MRI study participants who used the currently recommended hearing protection.
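For reference, the OSHA-defined standard threshold shift cited above is an average change from baseline of 10 dB or more at 2000, 3000, and 4000 Hz in either ear (29 CFR 1910.95). A minimal check, omitting the age correction the regulation permits; names are illustrative.

```python
# OSHA standard threshold shift: mean shift over 2, 3, and 4 kHz of >= 10 dB
# relative to the baseline audiogram (age correction omitted here).

OSHA_FREQS_HZ = (2000, 3000, 4000)

def osha_sts(baseline: dict[int, float], current: dict[int, float]) -> bool:
    shifts = [current[f] - baseline[f] for f in OSHA_FREQS_HZ]
    return sum(shifts) / len(shifts) >= 10

baseline = {2000: 10, 3000: 15, 4000: 20}
current = {2000: 20, 3000: 30, 4000: 30}
print(osha_sts(baseline, current))  # True: mean shift = (10 + 15 + 10) / 3 ≈ 11.7 dB
```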