{ "items": [ "\n\n
Introduction\n4.2 million individuals in the UK have type 2 diabetes, a known risk factor for dementia and mild cognitive impairment (MCI). Diabetes treatment may modify this association, but existing evidence is conflicting. We therefore aimed to assess the association between metformin therapy and the risk of incident all-cause dementia or MCI compared with other oral glucose-lowering therapies (GLTs).\n\nResearch design and methods\nWe conducted an observational cohort study using the Clinical Practice Research Datalink among UK adults diagnosed with diabetes at ≥40 years of age between 1990 and 2019. We used an active-comparator new-user design to compare the risks of dementia and MCI among individuals initially prescribed metformin versus an alternative oral GLT, using Cox proportional hazards regression controlling for sociodemographic, lifestyle and clinical confounders. We assessed for interaction by age and sex. Sensitivity analyses included an as-treated analysis to mitigate potential exposure misclassification.\n\nResults\nWe included 211 396 individuals (median age 63 years; 42.8% female), of whom 179 333 (84.8%) were initiated on metformin therapy. Over a median follow-up of 5.4 years, metformin use was associated with a lower risk of dementia (adjusted HR (aHR) 0.86 (95% CI 0.79 to 0.94)) and of MCI (aHR 0.92 (95% CI 0.86 to 0.99)). Metformin users aged under 80 years had a lower dementia risk (aHR 0.77 (95% CI 0.68 to 0.85)), which was not observed for those aged ≥80 years (aHR 0.95 (95% CI 0.87 to 1.05)). There was no interaction with sex. The as-treated analysis showed a smaller effect size than the main analysis (aHR 0.90 (95% CI 0.83 to 0.98)).\n\nConclusions\nMetformin use was associated with lower risks of incident dementia and MCI compared with alternative GLTs among UK adults with diabetes. While our findings are consistent with a neuroprotective effect of metformin against dementia, further research is needed to reduce the risk of confounding by indication and to assess causality.
\n \n\n \n \nBackground\nGlucose-6-phosphate dehydrogenase (G6PD) deficiency represents a barrier to the full deployment of anti-malarial drugs for vivax malaria elimination and of first-line antibiotics. The lack of established reference ranges for G6PD activity in breast-fed infants puts them at risk of drug-induced haemolysis and restricts access to safe treatment for their mothers.\n\nMethods\nThe present work was undertaken to establish age-specific normal values for G6PD activity, using the gold-standard spectrophotometric assay, to support the future clinical use of tafenoquine in lactating women and safer antibiotic treatment in infants.\n\nResults\nSpectrophotometric measurements collected at the Thai-Myanmar border from 78 healthy infants aged between 2 and 6 months showed a trend of decreasing enzymatic activity with increasing age (which did not reach statistical significance when comparing infants aged 2-3 months with those aged 4-6 months) and provided a reference normal value (100% activity) of 10.18 IU/g Hb for infants aged 2-6 months.\n\nConclusions\nNormal reference G6PD activity in 2-6-month-old infants was approximately 140% of that observed in G6PD-normal adults from the same population. Age-specific G6PD activity thresholds should be used in paediatric populations to avoid drug-induced haemolysis.
\n \n\n \n \nOBJECTIVE: Blood culture (BC) sampling is recommended for all patients with suspected sepsis prior to antibiotic administration. We examined barriers and enablers to BC sampling in three Southeast Asian countries. DESIGN: A Theoretical Domains Framework (TDF)-based survey, comprising a case scenario of a patient presenting with community-acquired sepsis and all 14 TDF domains of barriers/enablers to BC sampling. SETTING: Hospitals in Indonesia, Thailand and Viet Nam, December 2021 to 30 April 2022. PARTICIPANTS: 1070 medical doctors and 238 final-year medical students participated in this study. Half of the respondents were women (n=680, 52%) and most worked in governmental hospitals (n=980, 75.4%). OUTCOME MEASURES: Barriers and enablers to BC sampling. RESULTS: The proportion of respondents who answered that they would definitely take a BC in the case scenario was highest in Thailand at 89.8% (273/304), followed by Viet Nam at 50.5% (252/499) and Indonesia at 31.3% (157/501) (p<0.001). Barriers/enablers in nine TDF domains were considered key in influencing BC sampling, including 'priority of BC (TDF-goals)', 'perception about their role to order or initiate an order for BC (TDF-social professional role and identity)', 'perception that BC is helpful (TDF-beliefs about consequences)', 'intention to follow guidelines (TDF-intention)', 'awareness of guidelines (TDF-knowledge)', 'norms of BC sampling (TDF-social influence)', 'consequences that discourage BC sampling (TDF-reinforcement)', 'perceived cost-effectiveness of BC (TDF-environmental context and resources)' and 'regulation on cost reimbursement (TDF-behavioural regulation)'. There was substantial heterogeneity between the countries. In most domains, a lower proportion of Thai respondents reported the barriers (and a higher proportion the enablers) compared with Indonesian and Vietnamese respondents. A range of suggested intervention types and policy options was identified.
CONCLUSIONS: Barriers and enablers to BC sampling are varied and heterogeneous. Cost-related barriers are more common in more resource-limited countries, while many barriers are not directly related to cost. Context-specific, multifaceted interventions at both the hospital and policy levels are required to improve diagnostic stewardship practices.
\n \n\n \n \nSerum biomarkers and lung ultrasound are important measures for prognostication and treatment allocation in patients with COVID-19. Currently, there is a paucity of studies investigating relationships between serum biomarkers and ultrasonographic biomarkers derived from lung ultrasound. This study aims to assess correlations between serum biomarkers and lung ultrasound findings. This study is a secondary analysis of four prospective observational studies in adult patients with COVID-19. Serum biomarkers included markers of epithelial injury, endothelial dysfunction and immune activation. The primary outcome was the correlation between biomarker concentrations and lung ultrasound score assessed with Pearson's (r) or Spearman's (rs) correlations. Forty-four patients (67 [41-88] years old, 25% female, 52% ICU patients) were included. GAS6 (rs = 0.39), CRP (rs = 0.42) and SP-D (rs = 0.36) were correlated with lung ultrasound scores. ANG-1 (rs = -0.39) was inversely correlated with lung ultrasound scores. No correlations were found between lung ultrasound score and several other serum biomarkers. In patients with COVID-19, several serum biomarkers of epithelial injury, endothelial dysfunction and immune activation correlated with lung ultrasound findings. The lack of correlations with certain biomarkers could offer opportunities for precise prognostication and targeted therapeutic interventions by integrating these unlinked biomarkers.
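The correlations above were assessed with Spearman's rank correlation (rs). As a minimal, pure-Python illustration of how such a coefficient is computed (not the study's analysis code), one can assign tie-averaged ranks to both variables and take the Pearson correlation of the ranks:

```python
def rank(xs):
    """Tie-averaged (fractional) ranks, 1-based."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of equal values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rs = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

print(spearman([1, 2, 3], [1, 4, 9]))  # -> 1.0 (monotone, not linear)
```

Because rs depends only on ranks, a perfectly monotone but nonlinear relationship still yields rs = 1, which is why it suits skewed biomarker concentrations.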
\n \n\n \n \nSummary\nMultiple alleles at the kelch13 locus conferring artemisinin resistance (ART-R) are currently spreading through malaria parasite populations in Southeast Asia, providing a unique opportunity to directly observe an ongoing soft selective sweep, to investigate why resistance alleles have evolved multiple times, and to determine fundamental population genetic parameters for Plasmodium. We sequenced the kelch13 gene (n = 1,876), genotyped 75 flanking SNPs, and measured clearance rate (n = 3,552) in parasite infections from Western Thailand (2001-2014). We describe 32 independent coding mutations, including common mutations outside the kelch13 propeller region associated with significant reductions in clearance rate. Mutations were first observed in 2003 and rose to 90% frequency by 2014, consistent with a selection coefficient of ~0.079. There was no change in diversity at flanking markers, but resistance allele diversity rose until 2012 and then dropped as one allele (C580Y) spread to high frequency. The rapid spread of C580Y suggests that the genomic signature may be considerably harder in the near future, and that retrospective studies may underestimate the complexity of selective sweeps. The frequency with which adaptive alleles arise is determined by the rate of mutation to beneficial alleles and by the population size. Two factors drive this soft sweep: (1) multiple amino-acid mutations in kelch13 can confer resistance, providing a large mutational target (we estimate the target size is between 87 and 163 bp); (2) the population mutation parameter (Θ = 2Neμ) can be estimated from the frequency distribution of resistant alleles and is ~5.69, suggesting that the short-term effective population size is between 88 thousand and 1.2 million. This is 52- to 705-fold greater than Ne estimates based on fluctuation in allele frequencies, suggesting that we have previously underestimated the capacity for adaptive evolution in Plasmodium. Our central conclusions are that retrospective studies may underestimate the complexity of selective events, that ART-R evolution is not limited by the availability of mutations, and that the Ne relevant for adaptation in malaria is considerably higher than previously estimated.\n\nSignificance Statement\nPrevious work has identified surprisingly few origins of resistance to antimalarial drugs such as chloroquine and pyrimethamine. This has led to optimism about prospects for minimizing resistance evolution through combination therapy. We studied a longitudinal collection of malaria parasites from the Thai-Myanmar border (2001-14) to examine an ongoing selective event in which ≥32 independent alleles associated with ART-R evolved. Three factors appear to explain the large number of origins observed: the large number of amino acid changes that result in resistance (i.e., a large mutational "target size"), the large estimated effective population size (Ne), and the fact that we documented this selective event in real time rather than retrospectively.
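The population-genetic arithmetic above can be made explicit. Inverting Θ = 2Neμ gives Ne = Θ/(2μ); the sketch below uses the reported Θ ≈ 5.69 with hypothetical per-target beneficial mutation rates chosen only to bracket the abstract's 88 thousand to 1.2 million Ne range (the μ values are illustrative assumptions, not estimates from the study):

```python
def effective_population_size(theta, mu):
    # Invert the population mutation parameter theta = 2 * Ne * mu
    return theta / (2.0 * mu)

theta = 5.69  # estimate reported in the abstract
# Hypothetical per-target beneficial mutation rates (assumptions, not data)
for mu in (3.2e-5, 2.4e-6):
    ne = effective_population_size(theta, mu)
    print(f"mu = {mu:.1e} -> Ne ~ {ne:,.0f}")
```

The two assumed rates recover Ne of roughly 89 thousand and 1.2 million respectively, matching the order of magnitude quoted in the summary.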
\n \n\n \n \nBACKGROUND: Improving screening and triage practices is essential for early severity assessments at the first point of contact and for ensuring timely attention by healthcare workers (HCWs). The main objective of this study was to explore the triage process among febrile patients and HCWs in the emergency department (ED) of a tertiary care hospital in a resource-constrained setting. METHODS: This qualitative study was conducted from March to May 2023 at the ED of Tribhuvan University Teaching Hospital (TUTH), Nepal. The study included in-depth interviews with febrile patients (n = 15) and HCWs (n = 15). Additionally, direct observation notes (n = 20) were collected to document the triage process and patients' experiences in the ED. Data underwent thematic analysis using the Interpretative Phenomenological Analysis (IPA) approach. RESULTS: The ED of TUTH offered comprehensive triage services with clear delineation of the severity of febrile patients in line with World Health Organization (WHO) guidelines. Nonetheless, challenges and constraints were identified. In the ED, evenings were generally the busiest period, and the triage process was not thorough during night shifts. Perception of triage was limited among patients and variable among HCWs. Digitalized recording of patient information, including payments, was deemed necessary for effective management of patients' waiting times at the triage station. High patient throughput added pressure on HCWs and potentially influenced the delivery of services. The availability of medical equipment and space was also identified as a challenge, with patients sometimes compelled to share beds. There were constraints related to waste disposal, hygiene, cleanliness, and the availability and maintenance of washrooms.
Febrile patients experienced delays in receiving timely consultations and laboratory investigation reports, which affected their rapid diagnosis and discharge; nonetheless, patients were satisfied with the overall healthcare services received in the ED. CONCLUSIONS: Improving current triage management requires resource organization, including optimizing patients' waiting times through a digitalized system. Urgent priorities involve upgrading visitor facilities, patient consultations, laboratory investigations, hygiene, and sanitation. HCWs' recommendations to resource the ED with more equipment, space, and beds, to appoint a dedicated triage officer ensuring 24-hour service, and to provide training and incentives warrant further attention.
\n \n\n \n \nAbstract\nAims\nWe investigated the antibacterial efficacy of Umonium38 and Virkon® against Burkholderia pseudomallei, Escherichia coli, Pseudomonas aeruginosa and methicillin-resistant Staphylococcus aureus (MRSA) up to 14 days following treatment.\n\nMethods and results\nUmonium38 was diluted to 0.5%, 1.0%, 1.5%, 2.0%, 2.5% and 3.0%, tested against the bacterial strains at various contact times (15 min to 24 h), and incubated for up to 14 days. A minimum concentration of 0.5% Umonium38 with a contact time of 15 min effectively killed approximately 10⁸ CFU/ml of all four bacterial species. No growth was observed on agar plates from day 0 until day 14 for any of the six concentrations. The bacteria were also inactivated by a 30-minute treatment with a 1% Virkon® solution.\n\nConclusions\nUmonium38 effectively inactivates B. pseudomallei, E. coli, P. aeruginosa and MRSA at a concentration of ≥0.5% with a contact time of at least 15 min. The antimicrobial effect of Umonium38 persisted for 14 days.\n
\n \n\n \n \nIn our recent paper on the clinical pharmacology of tafenoquine (Watson et al., 2022), we used all available individual patient pharmacometric data from the tafenoquine pre-registration clinical efficacy trials to characterise the determinants of anti-relapse efficacy in tropical vivax malaria. We concluded that the currently recommended dose of tafenoquine (300 mg in adults, average dose of 5 mg/kg) is insufficient for cure in all adults, and a 50% increase to 450 mg (7.5 mg/kg) would halve the risk of vivax recurrence by four months. We recommended that clinical trials of higher doses should be carried out to assess their safety and tolerability. Sharma and colleagues at the pharmaceutical company GSK defend the currently recommended adult dose of 300 mg as the optimum balance between radical curative efficacy and haemolytic toxicity (Sharma et al., 2024). We contend that the relative haemolytic risks of the 300 mg and 450 mg doses have not been sufficiently well characterised to justify this opinion. In contrast, we provided evidence that the currently recommended 300 mg dose results in sub-maximal efficacy, and that prospective clinical trials of higher doses are warranted to assess their risks and benefits.
\n \n\n \n \nBackground: A critical and persistent challenge to global health and modern health care is the threat of antimicrobial resistance (AMR). Previous studies have reported a disproportionate burden of AMR in low-income and middle-income countries, but there remains an urgent need for more in-depth analyses across Africa. This study presents one of the most comprehensive sets of regional and country-level estimates of bacterial AMR burden in the WHO African region to date. Methods: We estimated deaths and disability-adjusted life-years (DALYs) attributable to and associated with AMR for 23 bacterial pathogens and 88 pathogen\u2013drug combinations for countries in the WHO African region in 2019. Our methodological approach consisted of five broad components: the number of deaths in which infection had a role, the proportion of infectious deaths attributable to a given infectious syndrome, the proportion of infectious syndrome deaths attributable to a given pathogen, the percentage of a given pathogen resistant to an antimicrobial drug of interest, and the excess risk of mortality (or duration of an infection) associated with this resistance. These components were then used to estimate the disease burden by using two counterfactual scenarios: deaths attributable to AMR (considering an alternative scenario where infections with resistant pathogens are replaced with susceptible ones) and deaths associated with AMR (considering an alternative scenario where drug-resistant infections would not occur at all). We obtained data from research hospitals, surveillance networks, and infection databases maintained by private laboratories and medical technology companies. We generated 95% uncertainty intervals (UIs) for final estimates as the 25th and 975th ordered values across 1000 posterior draws, and models were cross-validated for out-of-sample predictive validity. 
Findings: In the WHO African region in 2019, there were an estimated 1\u00b705 million deaths (95% UI 829 000\u20131 316 000) associated with bacterial AMR and 250 000 deaths (192 000\u2013325 000) attributable to bacterial AMR. The largest fatal AMR burden was attributed to lower respiratory and thorax infections (119 000 deaths [92 000\u2013151 000], or 48% of all estimated bacterial pathogen AMR deaths), bloodstream infections (56 000 deaths [37 000\u201382 000], or 22%), intra-abdominal infections (26 000 deaths [17 000\u201339 000], or 10%), and tuberculosis (18 000 deaths [3850\u201339 000], or 7%). Seven leading pathogens were collectively responsible for 821 000 deaths (636 000\u20131 051 000) associated with resistance in this region, with four pathogens exceeding 100 000 deaths each: Streptococcus pneumoniae, Klebsiella pneumoniae, Escherichia coli, and Staphylococcus aureus. Third-generation cephalosporin-resistant K pneumoniae and meticillin-resistant S aureus were shown to be the leading pathogen\u2013drug combinations in 25 and 16 countries, respectively (53% and 34% of the whole region, comprising 47 countries) for deaths attributable to AMR. Interpretation: This study reveals a high level of AMR burden for several bacterial pathogens and pathogen\u2013drug combinations in the WHO African region. The high mortality rates associated with these pathogens demonstrate an urgent need to address the burden of AMR in Africa. These estimates also show that quality and access to health care and safe water and sanitation are correlated with AMR mortality, with a higher fatal burden found in lower resource settings. Our cross-country analyses within this region can help local governments to leverage domestic and global funding to create stewardship policies that target the leading pathogen\u2013drug combinations. Funding: Bill & Melinda Gates Foundation, Wellcome Trust, and Department of Health and Social Care using UK aid funding managed by the Fleming Fund.
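The Methods above report 95% uncertainty intervals as the 25th and 975th ordered values across 1000 posterior draws; a minimal sketch of that convention (illustrative only, not the study's estimation pipeline) is:

```python
def uncertainty_interval(draws, lo_rank=25, hi_rank=975):
    """95% UI as the 25th and 975th ordered values of 1000 posterior draws."""
    s = sorted(draws)
    if len(s) != 1000:
        raise ValueError("expected 1000 posterior draws")
    # lo_rank/hi_rank are 1-based ordered values, hence the -1
    return s[lo_rank - 1], s[hi_rank - 1]

draws = list(range(1, 1001))  # toy posterior: the values 1..1000
print(uncertainty_interval(draws))  # -> (25, 975)
```

Taking ordered values rather than interpolated quantiles keeps the interval endpoints equal to actual posterior draws.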
\n \n\n \n \nBackground\nCopper (Cu), an essential trace mineral regulating multiple actions of inflammation and oxidative stress, has been implicated in risk for preterm birth (PTB).\n\nObjectives\nThis study aimed to determine the association of maternal Cu concentration during pregnancy with PTB risk and gestational duration in a large multicohort study including diverse populations.\n\nMethods\nMaternal plasma or serum samples from 10,449 singleton live births were obtained from 18 geographically diverse study cohorts. Maternal Cu concentrations were determined using inductively coupled plasma mass spectrometry. The associations of maternal Cu with PTB and gestational duration were analyzed using logistic and linear regressions for each cohort, and the estimates were then combined using meta-analysis. Associations between maternal Cu, acute-phase reactants (APRs) and infection status were analyzed in 1239 samples from the Malawi cohort.\n\nResults\nMaternal prenatal Cu concentrations in our study samples followed a normal distribution with a mean of 1.92 μg/mL and a standard deviation of 0.43 μg/mL, and Cu concentrations increased with gestational age up to 20 wk. The random-effects meta-analysis across the 18 cohorts revealed that a 1 μg/mL increase in maternal Cu concentration was associated with a higher risk of PTB (odds ratio 1.30; 95% confidence interval [CI]: 1.08, 1.57) and a gestational duration shorter by 1.64 d (95% CI: 0.56, 2.73). In the Malawi cohort, higher maternal Cu concentrations, concentrations of multiple APRs, and infections (malaria and HIV) were correlated and associated with greater risk of PTB and shorter gestational duration.\n\nConclusions\nOur study supports a robust negative association between maternal Cu and gestational duration and a positive association with risk for PTB. Cu concentration was strongly correlated with APRs and infection status, suggesting a potential role in inflammation, a pathway implicated in the mechanisms of PTB. Maternal Cu could therefore be used as a potential marker of integrated inflammatory pathways during pregnancy and of risk for PTB.
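The per-cohort regression estimates above were pooled with a random-effects meta-analysis. One standard estimator for this is DerSimonian-Laird; the sketch below assumes that method and uses invented toy numbers, since the abstract does not specify the exact estimator:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-cohort effect estimates.

    effects:   per-cohort estimates (e.g. log-odds ratios for PTB)
    variances: their squared standard errors
    """
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    k = len(effects)
    denom = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)                # between-cohort variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Toy log-odds ratios from two hypothetical cohorts (illustration only)
log_or, se, tau2 = dersimonian_laird([0.2, 0.4], [0.01, 0.01])
print(round(math.exp(log_or), 2))  # -> 1.35 (pooled odds ratio)
```

The τ² term widens the pooled standard error when cohorts disagree more than their sampling error alone would explain, which is the point of the random-effects model.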
\n \n\n \n \nBackground\nProper evaluation of therapeutic responses in Chagas disease is hampered by the prolonged persistence of antibodies to Trypanosoma cruzi measured by conventional serological tests and by the lack of sensitivity of parasitological tests. Previous studies indicated that tGPI-mucins, an α-Gal (α-d-Galp(1→3)-β-d-Galp(1→4)-d-GlcNAc)-rich fraction obtained from the T. cruzi trypomastigote surface coat, elicit a strong and protective antibody response in infected individuals, which disappears soon after successful treatment. The cost and technical difficulties associated with tGPI-mucin preparation, however, preclude its routine implementation in clinical settings.\n\nMethods/principal findings\nWe herein developed a neoglycoprotein consisting of a BSA scaffold decorated with several units of a synthetic α-Gal antigenic surrogate (α-d-Galp(1→3)-β-d-Galp(1→4)-β-d-Glcp). Serological responses to this reagent, termed NGP-Tri, were monitored by means of an in-house enzyme-linked immunosorbent assay (α-Gal-ELISA) in a cohort of 82 T. cruzi-infected children (3 days to 16 years old) treated with benznidazole or nifurtimox. This cohort was split into three groups based on the age of patients at treatment initiation: Group 1 comprised 24 babies (3 days to 5 months old; median = 26 days old), Group 2 comprised 31 children (7 months to 3 years old; median = 1.0 year old) and Group 3 comprised 26 patients (3 to 16 years old; median = 8.4 years old). A second, control cohort (Group 4) included 39 non-infected infants (3 days to 5 months old; median = 31 days old) born to T. cruzi-infected mothers. Despite its suboptimal seroprevalence (58.4%), the α-Gal-ELISA yielded a shorter median time to negativization (23 months [95% CI 7 to 36 months] vs 60 months [95% CI 15 to 83 months]; p = 0.0016) and a higher rate of negative seroconversion (89.2% vs 43.2%, p < 0.005) compared with conventional serological methods. The same effect was verified for every group when analyzed separately. Most remarkably, 14 out of 24 (58.3%) patients from Group 3 achieved negative seroconversion by α-Gal-ELISA, whereas none of them seroconverted to negative by conventional serology. Detailed analysis of patients showing unconventional serological responses suggested that, in addition to providing a novel tool to shorten follow-up periods after chemotherapy, the α-Gal-ELISA may assist in other diagnostic needs in pediatric Chagas disease.\n\nConclusions/significance\nThe tools evaluated here provide the cornerstone for the development of an efficacious, reliable, and straightforward post-therapeutic marker for pediatric Chagas disease.
\n \n\n \n \nINTRODUCTION: Community engagement and participatory research are widely used and considered important for ethical health research and interventions. Based on calls to unpack their complexity and observed biases in their favour, we conducted a realist review with a focus on non-communicable disease prevention. The aim was to generate an understanding of how and why engagement or participatory practices enhance or hinder the benefits of non-communicable disease research and interventions in low- and middle-income countries. METHODS: We retroductively formulated theories based on existing literature and realist interviews. After initial searches, preliminary theories and a search strategy were developed. We searched three databases and screened records with a focus on theoretical and empirical relevance. Insights about contexts, strategies, mechanisms and outcomes were extracted and synthesised into six theories. Five realist interviews were conducted to complement literature-based theorising. The final synthesis included 17 quality-appraised articles describing 15 studies. RESULTS: We developed six theories explaining how community engagement or participatory research practices either enhance or hinder the benefits of non-communicable disease research or interventions. Benefit-enhancing mechanisms include community members' agency being realised, a shared understanding of the benefits of health promotion, communities feeling empowered, and community members feeling solidarity and unity. Benefit-hindering mechanisms include community members' agency remaining unrealised and participation being driven by financial motives or reputational expectations. CONCLUSION: Our review challenges assumptions about community engagement and participatory research being solely beneficial in the context of non-communicable disease prevention in low- and middle-income countries. 
We present both helpful and harmful pathways through which health and research outcomes are affected. Our practical recommendations relate to maximising benefits and minimising harm by addressing institutional inflexibility and researcher capabilities, managing expectations on research, promoting solidarity in solving public health challenges and sharing decision-making power.
\n \n\n \n \nBackground\nMalaria is a parasitic disease that affects many of the poorest economies, resulting in approximately 241 million clinical episodes and 627,000 deaths annually. Piperaquine, when administered with dihydroartemisinin, is an effective drug against the disease. Drug concentration measurements taken on day 7 after treatment initiation have been shown to be a good predictor of therapeutic success with piperaquine. A simple capillary blood collection technique, in which blood is dried onto filter paper, is especially suitable for drug studies in remote areas or resource-limited settings, or when taking samples from children, toddlers, and infants.\n\nMethods\nThree 3.2 mm discs were punched out from a dried blood spot (DBS) and extracted in a 96-well plate using solid-phase extraction on a fully automated liquid handling system. The analysis was performed using LC-MS/MS with a calibration range of 3-1000 ng/mL.\n\nResults\nThe recovery rate was approximately 54-72%, and the relative standard deviation was below 9% for the low, middle and high quality control levels. The LC-MS/MS quantification limit of 3 ng/mL is sensitive enough to detect piperaquine for up to 4-8 weeks after drug administration, which is crucial when evaluating recrudescence and the development of drug resistance. While different hematocrit levels can affect DBS drug measurements, the effect was minimal for piperaquine.\n\nConclusion\nA sensitive LC-MS/MS method, in combination with fully automated extraction in a 96-well plate format, was developed and validated for the quantification of piperaquine in DBS. The assay was implemented in a bioanalytical laboratory for processing large-scale clinical trial samples.
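Recovery and relative standard deviation, as quoted in the validation results above, are simple ratios; the sketch below uses invented quality-control replicate concentrations purely for illustration (not the study's data):

```python
from statistics import mean, stdev

def recovery_pct(measured, nominal):
    """Extraction recovery: measured / nominal concentration, in percent."""
    return 100.0 * measured / nominal

def rsd_pct(replicates):
    """Relative standard deviation (coefficient of variation), in percent."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical QC replicates at a nominal 9 ng/mL (low QC) -- illustrative only
low_qc = [5.2, 5.5, 5.0, 5.4, 5.3]
print(round(recovery_pct(mean(low_qc), 9.0), 1))  # -> 58.7
print(round(rsd_pct(low_qc), 1))                  # -> 3.6
```

A recovery near 59% with an RSD under 9% would fall inside the ranges quoted in the abstract, illustrating that moderate recovery is acceptable as long as it is precise and reproducible.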
\n \n\n \n \nBackground\nPlasmodium falciparum variant surface antigens (VSAs) contribute to malaria pathogenesis by mediating cytoadhesion of infected red blood cells to the microvascular endothelium. In this study, we investigated the association between anti-VSA antibodies and clinical outcome in a controlled human malaria infection (CHMI) study.\n\nMethods\nWe used flow cytometry and ELISA to measure levels of IgG antibodies to the VSAs of five heterologous and one homologous P. falciparum parasite isolates, and to two PfEMP1 DBLβ domains, in blood samples collected a day before challenge and 14 days after infection. We also measured the ability of an individual's plasma to inhibit the interaction between PfEMP1 and ICAM1 using a competition ELISA. We then assessed the association between antibody levels, antibody function, and CHMI-defined clinical outcome during a 21-day follow-up period post infection using Cox proportional hazards regression.\n\nResults\nAntibody levels to the individual isolate VSAs, or to the two ICAM1-binding DBLβ domains of PfEMP1, were not associated with a significantly reduced risk of developing parasitemia or of meeting treatment criteria after challenge, after adjusting for exposure. However, anti-VSA antibody breadth (i.e., the cumulative response to all the isolates) was a significant predictor of a reduced risk of requiring treatment (HR 0.23 [0.10-0.50], p = 0.0002).\n\nConclusion\nThe breadth of IgG antibodies to VSAs, but not antibodies to individual isolate VSAs, is associated with protection in CHMI.
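Antibody breadth as described above is a cumulative response across isolates. One minimal way to operationalize such a score counts above-cutoff responses per isolate; the function, thresholds and numbers below are illustrative assumptions, not the study's exact definition:

```python
def breadth_score(responses, cutoffs):
    """Count isolates with an above-cutoff IgG response.

    responses: antibody level per isolate (e.g. MFI from flow cytometry)
    cutoffs:   per-isolate seropositivity threshold
    """
    return sum(r > c for r, c in zip(responses, cutoffs))

# Hypothetical levels against five heterologous + one homologous isolate
levels = [1.8, 0.4, 2.6, 1.1, 0.9, 3.0]
cuts = [1.0] * 6
print(breadth_score(levels, cuts))  # -> 4
```

A breadth score of this kind can then be entered as a covariate in a Cox model, which is the role the cumulative response plays in the analysis described above.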
\n \n\n \n \n