Publication Region: USA

Bottled and Well Water Quality in a Small Central Appalachian Community: Household-Level Analysis of Enteric Pathogens, Inorganic Chemicals, and Health Outcomes in Rural Southwest Virginia

Abstract/Summary: Consumption of unsafe drinking water is associated with a substantial burden of disease globally. In the US, ~1.8 million people in rural areas lack reliable access to safe drinking water. Our objective was to characterize and assess household-level water sources, water quality, and associated health outcomes in Central Appalachia. We collected survey data and water samples (tap, source, and bottled water) from consenting households in a small rural community without utility-supplied water in southwest Virginia. Water samples were analyzed for physicochemical parameters, total coliforms, E. coli, nitrate, sulfate, metals (e.g., arsenic, cadmium, lead), and more than 30 enteric pathogens. Among the 69% (n = 9) of households that participated, all had piped well water, though 67% (n = 6) used bottled water as their primary drinking water source. Total coliforms were detected in water samples from 44% (n = 4) of homes, E. coli in one home, and enteric pathogens (Aeromonas, Campylobacter, Enterobacter) in 33% (n = 3) of homes. Tap water samples from 11% (n = 1) of homes exceeded the EPA maximum contaminant level (MCL) for nitrate, and 33% (n = 3) exceeded the EPA secondary maximum contaminant level (SMCL) for iron. Among the 19 individuals residing in study households, reported diarrhea was 25% more likely in homes with measured E. coli and/or specific pathogens (risk ratio = 1.25, cluster-robust standard error = 1.64, p = 0.865). Although our sample size was small, our findings suggest that a considerable number of lower-income residents without utility-supplied water in rural areas of southwest Virginia may be exposed to microbiological and/or chemical contaminants in their water, and many, if not most, rely on bottled water as their primary source of drinking water.
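The household-level risk comparison reported above is, at its core, a ratio of incidence proportions between exposed homes (contaminants detected) and unexposed homes. A minimal sketch in Python; the counts below are hypothetical (the abstract does not give the underlying 2×2 table) and are chosen only so the ratio matches the reported 1.25:

```python
# Illustrative risk-ratio calculation for a household-level exposure analysis.
# The counts are hypothetical; the study reports RR = 1.25 for diarrhea in
# homes with detected E. coli and/or specific enteric pathogens.

def risk_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio = incidence proportion in exposed / incidence proportion in unexposed."""
    risk_exposed = cases_exposed / n_exposed
    risk_unexposed = cases_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical counts that reproduce the reported point estimate:
rr = risk_ratio(5, 8, 5, 10)   # 0.625 / 0.5 = 1.25
print(f"risk ratio = {rr:.2f}")
```

In practice, the reported cluster-robust standard error would come from a regression model (e.g., a modified Poisson model with household-level clustering), but the point estimate reduces to this simple ratio.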

Subsewershed SARS-CoV-2 Wastewater Surveillance & COVID-19 Epidemiology Using Building-specific Occupancy & Case Data

Abstract/Summary: To evaluate the use of wastewater-based surveillance and epidemiology to monitor and predict SARS-CoV-2 virus trends, over the 2020–2021 academic year we collected wastewater samples twice weekly from 17 manholes across Virginia Tech’s main campus. We used data from external door swipe card readers and student isolation/quarantine status to estimate building-specific occupancy and COVID-19 case counts at a daily resolution. After analyzing 673 wastewater samples using reverse transcription quantitative polymerase chain reaction (RT-qPCR), we reanalyzed 329 samples from isolation and nonisolation dormitories and the campus sewage outflow using reverse transcription droplet digital polymerase chain reaction (RT-ddPCR). Population-adjusted viral copy means from isolation dormitory wastewater were 48% and 66% higher than unadjusted viral copy means for the N and E genes (1846/100 mL to 2733/100 mL/100 people and 2312/100 mL to 3828/100 mL/100 people, respectively; n = 46). Prespecified analyses with random-effects Poisson regression and dormitory/cluster-robust standard errors showed that detection of the N and E genes was associated with increases of 85% and 99% in the likelihood of COVID-19 cases 8 days later (incidence-rate ratio (IRR) = 1.845, p = 0.013 and IRR = 1.994, p = 0.007, respectively; n = 215), and one-log increases in swipe-card-normalized viral copies (copies/100 mL/100 people) for N and E were associated with increases of 21% and 27% in the likelihood of observing COVID-19 cases 8 days following sample collection (IRR = 1.206, p < 0.001, n = 211 for N; IRR = 1.265, p < 0.001, n = 211 for E). One-log increases in swipe-normalized copies were also associated with 40% and 43% increases in the likelihood of observing COVID-19 cases 5 days after sample collection (IRR = 1.403, p = 0.002, n = 212 for N; IRR = 1.426, p < 0.001, n = 212 for E).
Our findings highlight the use of building-specific occupancy data and add to the evidence for the potential of wastewater-based epidemiology to predict COVID-19 trends at subsewershed scales.
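The percentage increases quoted in this abstract follow directly from the incidence-rate ratios: a rate ratio of IRR corresponds to a (IRR − 1) × 100 percent change in the expected case rate, so IRR = 1.206 is the quoted ~21% increase (the abstract's figures are these values rounded). A quick check of the reported numbers:

```python
# Converting incidence-rate ratios (IRRs) from the Poisson regression into
# the percentage increases quoted in the abstract.

def pct_increase(irr):
    """Percent change in the expected rate implied by a rate ratio."""
    return (irr - 1) * 100

reported = [
    ("N gene detected, 8-day lag", 1.845),   # quoted as ~85%
    ("E gene detected, 8-day lag", 1.994),   # quoted as ~99%
    ("N, per one-log increase",    1.206),   # quoted as ~21%
    ("E, per one-log increase",    1.265),   # quoted as ~27%
]
for label, irr in reported:
    print(f"{label}: IRR = {irr} -> {pct_increase(irr):.1f}% increase")
```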

Vaccine Effectiveness During an Outbreak of COVID-19 Alpha Variant (B.1.1.7) in a Men’s Correctional Facility, United States

Abstract/Summary: In April 2021, a COVID-19 outbreak occurred at a correctional facility in rural Virginia, USA. Eighty-four infections were identified among 854 incarcerated persons by facilitywide testing with quantitative reverse transcription PCR (qRT-PCR). We used whole-genome sequencing to link all infections to 2 employees infected with the B.1.1.7 (Alpha) variant. The relative risk comparing unvaccinated to fully vaccinated persons (mRNA-1273 [Moderna]) was 7.8 (95% CI 4.8–12.7), corresponding to a vaccine effectiveness of 87.1% (95% CI 79.0%–92.1%). Average qRT-PCR cycle threshold values were lower, suggesting higher viral loads, among unvaccinated than among vaccinated case-patients for the nucleocapsid, envelope, and spike genes. Vaccination was highly effective at preventing SARS-CoV-2 infection in this high-risk setting. This approach can be applied to similar settings to estimate vaccine effectiveness as variants emerge and to guide public health strategies during the ongoing pandemic.
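The effectiveness figure above follows from the relative risk by the standard identity VE = (1 − 1/RR) × 100 when RR compares unvaccinated to vaccinated (equivalently, VE = (1 − RR) × 100 with RR in the vaccinated-vs-unvaccinated direction). A quick check; the small gap versus the reported 87.1% comes from rounding the published RR to 7.8:

```python
# Vaccine effectiveness from a relative risk that compares unvaccinated to
# vaccinated persons, as reported in the outbreak investigation above.

def vaccine_effectiveness(rr_unvax_vs_vax):
    """VE (%) = (1 - 1/RR) * 100, for RR in the unvaccinated-vs-vaccinated direction."""
    return (1 - 1 / rr_unvax_vs_vax) * 100

ve = vaccine_effectiveness(7.8)
print(f"VE from rounded RR = {ve:.1f}%")  # close to the reported 87.1%
```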

Smoking-Cessation Interventions in Appalachia: A Systematic Review & Meta-Analysis

Abstract/Summary: Context: Appalachia, a socioeconomically disadvantaged rural region in the eastern U.S., has one of the nation’s highest smoking prevalences and some of the poorest health outcomes. Effective interventions that lower smoking rates in Appalachia have great potential to reduce health disparities and preventable illness; however, a better understanding of which interventions are effective is needed. Evidence acquisition: This review included trials that evaluated the impact of smoking-cessation programs among populations living in Appalachia. The search, carried out on October 9, 2018, covered the Cochrane Central Register of Controlled Trials, Medline, Embase, and Scopus for academic journal articles published in English, with no date restrictions. After preliminary screening, potentially relevant full-text articles were independently reviewed by the authors (Cohen’s κ = 0.72), leading to the final inclusion of 9 articles. Evidence synthesis: Eligible studies were assessed qualitatively for heterogeneity and risk of bias. Six of the 9 included studies had extractable data on dichotomous smoking status and reported a measure of association suitable for inclusion in a meta-analysis. For those 6 studies, the pooled RR and pooled OR were estimated using random-effects models, with an I² index demonstrating substantial heterogeneity. A funnel plot of the 6 trials appeared relatively symmetric. Conclusions: Participation in smoking-cessation interventions increased the probability of smoking abstinence among Appalachian smokers an estimated 2.33-fold (pooled RR = 2.33, 95% CI = 1.03, 5.25, p = 0.04). Given the low number of studies, their substantial heterogeneity, and high risk of bias, evidence of the effectiveness of smoking-cessation interventions in Appalachia must be interpreted with caution.
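A pooled RR from a random-effects model, as reported above, is typically computed with the DerSimonian-Laird estimator: study effects are combined on the log scale with weights 1/(vᵢ + τ²), where τ² is the between-study variance estimated from Cochran's Q, and I² = (Q − (k − 1))/Q summarizes heterogeneity. A minimal sketch; the six log risk ratios and variances below are hypothetical illustrations, not the extracted study data:

```python
# Minimal DerSimonian-Laird random-effects pooling sketch (the standard method
# behind a pooled RR like the 2.33 reported in this review). Inputs are
# hypothetical per-study log risk ratios and their variances.
import math

def dersimonian_laird(log_effects, variances):
    k = len(log_effects)
    w = [1 / v for v in variances]                      # fixed-effect weights
    y_fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, log_effects))  # Cochran's Q
    # DerSimonian-Laird between-study variance (truncated at zero):
    tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    w_star = [1 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci, i2

# Hypothetical six-study input:
rr, ci, i2 = dersimonian_laird(
    [0.3, 1.2, 0.9, 0.1, 1.5, 0.7],
    [0.10, 0.25, 0.15, 0.30, 0.20, 0.12],
)
print(f"pooled RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {i2:.0f}%")
```

With only six heterogeneous studies, as the review notes, such a pooled estimate carries wide uncertainty, which is visible here in the confidence interval widened by τ².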