RSS news feeds

Transcriptomic and functional analyses of the piRNA pathway in the Chagas disease vector <i>Rhodnius prolixus</i>

PLoS Neglected Tropical Diseases News - 10 October 2018 - 9:00pm

by Tarcisio Brito, Alison Julio, Mateus Berni, Lisiane de Castro Poncio, Emerson Soares Bernardes, Helena Araujo, Michael Sammeth, Attilio Pane

The piRNA pathway is a surveillance system that guarantees oogenesis and adult fertility in a range of animal species. The pathway is centered on PIWI clade Argonaute proteins and the associated small non-coding RNAs termed piRNAs. In this study, we set out to investigate the evolutionary conservation of the piRNA pathway in the hemimetabolous insect Rhodnius prolixus. Our transcriptome profiling reveals that core components of the pathway are expressed during previtellogenic stages of oogenesis. The Rhodnius genome harbors four putative piwi orthologs. We show that Rp-piwi2, Rp-piwi3 and Rp-ago3 transcripts, but not Rp-piwi1, are produced in germline tissues and maternally deposited in mature eggs. Consistent with a role in Rhodnius oogenesis, parental RNAi against Rp-piwi2, Rp-piwi3 and Rp-ago3 results in severe defects in egg laying and adult female fertility. Furthermore, we show that reducing Rp-piwi2 levels by parental RNAi disrupts oogenesis by causing a dramatic loss of trophocytes, egg chamber degeneration and oogenesis arrest. Intriguingly, the putative Rp-Piwi2 protein features a polyglutamine tract at its N-terminal region, which is conserved in PIWI proteins encoded in the genomes of other triatomine species. Together with R. prolixus, these hematophagous insects are primary vectors of Chagas disease. Thus, our data shed further light on the evolution of the piRNA pathway and provide a framework for the development of new control strategies against Chagas disease insect vectors.

Intrinsic activation of the vitamin D antimicrobial pathway by <i>M</i>. <i>leprae</i> infection is inhibited by type I IFN

PLoS Neglected Tropical Diseases News - 9 October 2018 - 9:00pm

by Kathryn Zavala, Carter A. Gottlieb, Rosane M. Teles, John S. Adams, Martin Hewison, Robert L. Modlin, Philip T. Liu

Following infection, virulent mycobacteria persist and grow within the macrophage, suggesting that the intrinsic activation of an innate antimicrobial response is subverted by the intracellular pathogen. For Mycobacterium leprae, the intracellular bacterium that causes leprosy, the addition of exogenous innate or adaptive immune ligands to infected monocytes/macrophages was required to detect a vitamin D-dependent antimicrobial activity. We investigated whether there is an intrinsic immune response to M. leprae in macrophages that is inhibited by the pathogen. Upon infection of monocytes with M. leprae, there was no upregulation of CYP27B1 or of its enzymatic activity, which converts the inactive prohormone form of vitamin D (25-hydroxyvitamin D) to the bioactive form (1α,25-dihydroxyvitamin D). Given that M. leprae-induced type I interferon (IFN) inhibited monocyte activation, we blocked the type I IFN receptor (IFNAR), revealing the intrinsic capacity of monocytes to recognize M. leprae and upregulate CYP27B1. Consistent with these in vitro studies, an inverse relationship between expression of CYP27B1 and the type I IFN downstream gene OAS1 was detected in leprosy patient lesions, leading us to study cytokine-derived macrophages (MΦ) to model cellular responses at the site of disease. M. leprae infection of IL-15-derived MΦ, which resemble MΦ in lesions from the self-limited form of leprosy, did not inhibit induction of the vitamin D antimicrobial pathway. In contrast, infection of IL-10-derived MΦ, which resemble MΦ in lesions from patients with the progressive form of leprosy, resulted in induction of type I IFN and suppression of the vitamin D-directed pathway. Importantly, blockade of the type I IFN response in infected IL-10 MΦ decreased M. leprae viability. These results indicate that M. leprae evades the intrinsic capacity of human monocytes/MΦ to activate the vitamin D-mediated antimicrobial pathway via the induction of type I IFN.

The impact and cost-effectiveness of controlling cholera through the use of oral cholera vaccines in urban Bangladesh: A disease modeling and economic analysis

PLoS Neglected Tropical Diseases News - 9 October 2018 - 9:00pm

by Ashraful Islam Khan, Ann Levin, Dennis L. Chao, Denise DeRoeck, Dobromir T. Dimitrov, Jahangir A. M. Khan, Muhammad Shariful Islam, Mohammad Ali, Md. Taufiqul Islam, Abdur Razzaque Sarker, John D. Clemens, Firdausi Qadri

Background

Cholera remains an important public health problem in major cities in Bangladesh, especially in slum areas. In response to growing interest among local policymakers in controlling this disease, this study estimated the impact and cost-effectiveness of preventive cholera vaccination over a ten-year period in a high-risk slum population in Dhaka, in order to inform decisions about the use of oral cholera vaccines as a key tool for reducing cholera risk in such populations.

Methodology/Principal findings

Assuming use of a two-dose killed whole-cell oral cholera vaccine to be produced locally, the number of cholera cases and deaths averted was estimated for three target group options (1–4 year olds, 1–14 year olds, and all persons aged 1 year and above), using cholera incidence data from Dhaka, estimates of vaccination coverage rates from the literature, and a dynamic model of cholera transmission based on data from Matlab, which incorporates herd effects. Local estimates of vaccination costs, minus savings in treatment costs, were used to obtain incremental cost-effectiveness ratios for one- and ten-dose vial sizes. Vaccinating 1–14 year olds every three years, combined with annual routine vaccination of children, would be the most cost-effective strategy, reducing incidence in this population by 45% (assuming 10% annual migration) at a cost per disability-adjusted life year (DALY) averted of $823 (2015 USD) for single-dose vials and $591 for ten-dose vials. Vaccinating all ages one year and above would reduce incidence by >90%, but would be 50% less cost-effective ($894–1,234/DALY averted). Limiting vaccination to 1–4 year olds would be the least cost-effective strategy (preventing only 7% of cases and costing $1,276–$1,731/DALY averted), due to the limited herd effects of vaccinating this small population and the lower vaccine efficacy in this age group.
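
The cost-per-DALY figures above come from a simple ratio: net programme cost (vaccination costs minus averted treatment costs) divided by DALYs averted. A minimal sketch of that calculation; the function name and all input numbers below are illustrative assumptions, not values from the study:

```python
def cost_per_daly_averted(vaccination_cost_usd, treatment_savings_usd, dalys_averted):
    """Incremental cost-effectiveness ratio: net cost per DALY averted."""
    return (vaccination_cost_usd - treatment_savings_usd) / dalys_averted

# Hypothetical campaign: $1.2M gross vaccination cost, $0.2M saved in
# averted cholera treatment, 1,692 DALYs averted.
example_icer = cost_per_daly_averted(1_200_000, 200_000, 1_692)
```

Each strategy's ratio is then compared against willingness-to-pay thresholds or against the alternatives, such as the single-dose versus ten-dose vial options costed above.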

Conclusions/Significance

Providing cholera vaccine to slum populations in Dhaka through periodic vaccination campaigns would significantly reduce cholera incidence and inequities, and be especially cost-effective if all 1–14 year olds are targeted.

Ligand binding properties of two <i>Brugia malayi</i> fatty acid and retinol (FAR) binding proteins and their vaccine efficacies against challenge infection in gerbils

PLoS Neglected Tropical Diseases News - 8 October 2018 - 9:00pm

by Bin Zhan, Sridhar Arumugam, Malcolm W. Kennedy, Nancy Tricoche, Lu-Yun Lian, Oluwatoyin A. Asojo, Sasisekhar Bennuru, Maria Elena Bottazzi, Peter J. Hotez, Sara Lustigman, Thomas R. Klei

Parasitic nematodes produce an unusual class of fatty acid and retinol (FAR)-binding proteins that may scavenge host fatty acids and retinoids. Two FARs from Brugia malayi (Bm-FAR-1 and Bm-FAR-2) were expressed as recombinant proteins, and their ligand binding, structural characteristics, and immunogenicities were examined. Circular dichroism showed that rBm-FAR-1 and rBm-FAR-2 are similarly rich in α-helix structure. Unexpectedly, however, their lipid binding activities were readily differentiated. Both FARs bound retinol and cis-parinaric acid similarly, but, while rBm-FAR-1 induced a dramatic increase in fluorescence emission and a blue shift in peak emission by the fluorophore-tagged fatty acid dansyl-undecanoic acid, rBm-FAR-2 did not. Recombinant forms of the related proteins from Onchocerca volvulus, rOv-FAR-1 and rOv-FAR-2, were found to be similarly distinguishable. This is the first FAR-2 protein from a parasitic nematode to be characterized. The relative protein abundance of Bm-FAR-1 was higher than that of Bm-FAR-2 in lysates of different developmental stages of B. malayi. Both FAR proteins were targets of strong IgG1, IgG3 and IgE antibody responses in infected individuals and in individuals classified as endemic normal or putatively immune. In a B. malayi infection model in gerbils, immunization with rBm-FAR-1 and rBm-FAR-2 formulated in a water-in-oil emulsion (Montanide ISA 720) or alum elicited high titers of antigen-specific IgG, but only gerbils immunized with rBm-FAR-1 formulated with the former showed a statistically significant reduction in adult worms (68%) following challenge with B. malayi infective larvae. These results suggest that FAR proteins may play important roles in the survival of filarial nematodes in the host, and represent potential candidates for vaccine development against lymphatic filariasis and related filarial infections.

The design of schistosomiasis monitoring and evaluation programmes: The importance of collecting adult data to inform treatment strategies for <i>Schistosoma mansoni</i>

PLoS Neglected Tropical Diseases News - 8 October 2018 - 9:00pm

by Jaspreet Toor, Hugo C. Turner, James E. Truscott, Marleen Werkman, Anna E. Phillips, Ramzi Alsallaq, Graham F. Medley, Charles H. King, Roy M. Anderson

Monitoring and evaluation (M&E) programmes collect the data required to assess the progress of current interventions towards the World Health Organization (WHO) goals for schistosomiasis: morbidity control and elimination as a public health problem. Prevalence and intensity of infection data are typically collected from school-aged children (SAC), as they are relatively easy to sample and are thought to be the group most likely to be infected by schistosome parasites. However, adults are also likely to be infected. We use three different age-intensity profiles of infection for Schistosoma mansoni, with low, moderate and high burdens of infection in adults, to investigate how the age distribution of infection affects the model-generated recommendations of the preventive chemotherapy coverage levels required to achieve the WHO goals. We find that for moderate prevalence regions, regardless of the burden of infection in adults, treating SAC only may achieve the WHO goals. However, for high prevalence regions with a high burden of infection in adults, adult treatment is required to meet the WHO goals. Hence, we show that the optimal treatment strategy for a defined region requires consideration of the burden of infection in adults; it cannot be based solely on the prevalence of infection in SAC. Although past epidemiological data have informed mathematical models for the transmission and control of schistosome infections, more accurate and detailed data are required from M&E programmes to determine the optimal treatment strategy for a defined region. We therefore highlight the importance of collecting prevalence and intensity of infection data from a broader age range, specifically including adult data at baseline (prior to treatment) and, where possible, throughout the treatment programme, rather than sampling SAC only.
Furthermore, we discuss additional epidemiological data, such as individual longitudinal adherence to treatment, that should ideally be collected in M&E programmes.

Quantifying the value of surveillance data for improving model predictions of lymphatic filariasis elimination

PLoS Neglected Tropical Diseases News - 8 October 2018 - 9:00pm

by Edwin Michael, Swarnali Sharma, Morgan E. Smith, Panayiota Touloupou, Federica Giardina, Joaquin M. Prada, Wilma A. Stolk, Deirdre Hollingsworth, Sake J. de Vlas

Background

Mathematical models are increasingly being used to evaluate strategies aiming to achieve the control or elimination of parasitic diseases. Recently, owing to the growing realization that process-oriented models are useful for ecological forecasts only if the biological processes are well defined, attention has focused on data assimilation as a means to improve the predictive performance of these models.

Methodology and principal findings

We report on the development of an analytical framework to quantify the relative values of various longitudinal infection surveillance data collected in field sites undergoing mass drug administrations (MDAs) for calibrating three lymphatic filariasis (LF) models (EPIFIL, LYMFASIM, and TRANSFIL), and for improving their predictions of the required durations of drug interventions to achieve parasite elimination in endemic populations. The relative information contribution of site-specific data collected at the time points proposed by the WHO monitoring framework was evaluated using model-data updating procedures, and via calculations of the Shannon information index and weighted variances from the probability distributions of the estimated timelines to parasite extinction made by each model. Results show that data-informed models provided more precise forecasts of elimination timelines in each site compared to model-only simulations. Data streams that included year 5 post-MDA microfilariae (mf) survey data, however, reduced each model’s uncertainty the most, compared to data streams containing only baseline and/or year 3 or later post-MDA mf survey data, irrespective of MDA coverage, suggesting that data up to this monitoring point may be optimal for informing the present LF models. We show that the improvements observed in the predictive performance of the best data-informed models may be a function of temporal changes in inter-parameter interactions. Such best data-informed models may also produce more accurate predictions of the durations of drug interventions required to achieve parasite elimination.
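
The Shannon information index used here summarizes how concentrated each model's posterior distribution of elimination timelines is; lower values mean a tighter, more informative forecast. A small sketch under invented distributions (these numbers are assumptions for illustration, not data from the study):

```python
import math

def shannon_index(probs):
    """Shannon information (entropy) of a discrete probability distribution;
    lower values indicate less forecast uncertainty."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical posteriors over years-to-elimination for one site:
model_only = [0.10, 0.20, 0.30, 0.20, 0.10, 0.10]   # diffuse, model-only forecast
data_informed = [0.05, 0.10, 0.60, 0.20, 0.05]      # sharpened after assimilating mf survey data
```

Here the data-informed posterior has the lower index, mirroring the finding that assimilating mf survey data tightens the predicted timelines.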

Conclusions/Significance

Knowledge of the relative information contributions of model-only versus data-informed models is valuable for improving the usefulness of LF model predictions in management decision making, learning system dynamics, and supporting the design of parasite monitoring programmes. The present results further pinpoint the crucial need for longitudinal infection surveillance data to enhance the precision and accuracy of model predictions of the intervention durations required to achieve parasite elimination in an endemic location.

The effect of assortative mixing on stability of low helminth transmission levels and on the impact of mass drug administration: Model explorations for onchocerciasis

PLoS Neglected Tropical Diseases News - 8 October 2018 - 9:00pm

by Anneke S. de Vos, Wilma A. Stolk, Sake J. de Vlas, Luc E. Coffeng

Background

Stable low pre-control prevalences of helminth infection are not uncommon in field settings, yet it is poorly understood how such low levels can be sustained, thereby challenging efforts to model them. Disentangling possible facilitating mechanisms is important, since these may differently affect intervention impact. Here we explore the role of assortative (i.e. non-homogeneous) mixing and exposure heterogeneity in helminth transmission, using onchocerciasis as an example.

Methodology/Principal findings

We extended the established individual-based model ONCHOSIM to allow for assortative mixing, assuming that individuals who are relatively more exposed to fly bites are more connected to each other than other individuals in the population as a result of differential exposure to a sub-population of blackflies. We used the model to investigate how transmission stability, equilibrium microfilarial (mf) prevalence and intensity, and impact of mass drug administration depend on the assumed degree of assortative mixing and exposure heterogeneity, for a typical rural population of about 400 individuals. The model clearly demonstrated that with homogeneous mixing and moderate levels of exposure heterogeneity, onchocerciasis could not be sustained below 35% mf prevalence. In contrast, assortative mixing stabilised onchocerciasis prevalence at levels as low as 8% mf prevalence. Increasing levels of assortative mixing significantly reduced the probability of interrupting transmission, given the same duration and coverage of mass drug administration.

Conclusions/Significance

Assortative mixing patterns are an important factor to explain stable low prevalence situations and are highly relevant for prospects of elimination. Their effect on the pre-control distribution of mf intensities in human populations is only detectable in settings with mf prevalences <30%, where high skin mf density in mf-positive people may be an indication of assortative mixing. Local spatial variation in larval infection intensity in the blackfly intermediate host may also be an indicator of assortative mixing.

Identifying a sufficient core group for trachoma transmission

PLoS Neglected Tropical Diseases News - 8 October 2018 - 9:00pm

by Thomas M. Lietman, Michael S. Deiner, Catherine E. Oldenburg, Scott D. Nash, Jeremy D. Keenan, Travis C. Porco

Background

In many infectious diseases, a core group of individuals plays a disproportionate role in transmission. If these individuals were effectively prevented from transmitting infection, for example with a perfect vaccine, then the disease would disappear in the remainder of the community. No vaccine has yet proven effective against the ocular strains of chlamydia that cause trachoma. However, repeated treatment with oral azithromycin may be able to prevent individuals from effectively transmitting trachoma.

Methodology/Principal findings

Here we assess several methods for identifying a core group for trachoma, assuming varying degrees of knowledge about the transmission process. We determine the minimal core group from a completely specified model, fitted to results from a large Ethiopian trial. We compare this benchmark to a core group that could actually be identified from information available to trachoma programs, for example one determined from the rate of return of infection in a community after mass treatments, or from the equilibrium prevalence of infection.

Conclusions/Significance

Sufficient groups are relatively easy for programs to identify, but will likely be larger than the theoretical minimum.

The role of case proximity in transmission of visceral leishmaniasis in a highly endemic village in Bangladesh

PLoS Neglected Tropical Diseases News - 8 October 2018 - 9:00pm

by Lloyd A. C. Chapman, Chris P. Jewell, Simon E. F. Spencer, Lorenzo Pellis, Samik Datta, Rajib Chowdhury, Caryn Bern, Graham F. Medley, T. Déirdre Hollingsworth

Background

Visceral leishmaniasis (VL) is characterised by a high degree of spatial clustering at all scales, and this feature persists even with successful control measures. VL is targeted for elimination as a public health problem in the Indian subcontinent by 2020, and incidence has been falling rapidly since 2011. Current control is based on early diagnosis and treatment of clinical cases, and blanket indoor residual spraying (IRS) of insecticide in endemic villages to kill the sandfly vectors. Spatially targeting active case detection and/or IRS to higher-risk areas would greatly reduce the costs of control, but the effectiveness of such targeting as a control strategy is unknown. It depends on two key unknowns: how quickly transmission risk decreases with distance from a VL case, and how much asymptomatically infected individuals contribute to transmission.

Methodology/Principal findings

To estimate these key parameters, a spatiotemporal transmission model for VL was developed and fitted to geo-located epidemiological data on 2494 individuals from a highly endemic village in Mymensingh, Bangladesh. A Bayesian inference framework that could account for the unknown infection times of the VL cases, and missing symptom onset and recovery times, was developed to perform the parameter estimation. The parameter estimates obtained suggest that, in a highly endemic setting, VL risk decreases relatively quickly with distance from a case—halving within 90m—and that VL cases contribute significantly more to transmission than asymptomatic individuals.
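
The estimate that risk halves within 90 m implies a distance kernel of roughly exponential form. A toy sketch of such a kernel; the functional form and parameter here are illustrative, and the paper's fitted kernel may differ:

```python
def relative_risk(distance_m, half_distance_m=90.0):
    """Relative transmission risk at a given distance from a VL case,
    assuming risk halves every `half_distance_m` metres (illustrative)."""
    return 0.5 ** (distance_m / half_distance_m)
```

Under this form, risk at 300 m is below 10% of the risk immediately next to a case, which is consistent with the ≥300 m intervention radius suggested in the conclusions.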

Conclusions/Significance

These results suggest that spatially-targeted interventions may be effective for limiting transmission. However, the extent to which spatial transmission patterns and the asymptomatic contribution vary with VL endemicity and over time is uncertain. In any event, interventions would need to be performed promptly and in a large radius (≥300m) around a new case to reduce transmission risk.

Inter- and intra-host sequence diversity reveal the emergence of viral variants during an overwintering epidemic caused by dengue virus serotype 2 in southern Taiwan

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Hui-Ying Ko, Yao-Tsun Li, Day-Yu Chao, Yun-Cheng Chang, Zheng-Rong T. Li, Melody Wang, Chuan-Liang Kao, Tzai-Hung Wen, Pei-Yun Shu, Gwong-Jen J. Chang, Chwan-Chuen King

Purifying selection during dengue viral infection has been suggested as the driving force of viral evolution, and the higher complexity of the intra-host quasi-species is thought to offer an adaptive advantage for arboviruses as they cycle between arthropod and vertebrate hosts. However, very few studies have investigated the viral genetic changes within (intra-host) and between (inter-host) humans on a spatio-temporal scale. Viruses of different serotypes imported to Taiwan from various countries cause annual outbreaks. During 2001–2003, two consecutive outbreaks were caused by dengue virus serotype 2 (DENV-2) and resulted in a larger-scale epidemic with more severe dengue cases in the following year. Phylogenetic analyses showed that the viruses from both events were similar and related to the 2001 DENV-2 isolate from the Philippines. We comprehensively analyzed viral sequences from representative dengue patients and identified three consensus genetic variants, groups Ia, Ib and II, with different spatio-temporal population dynamics. The phylodynamic analysis suggested that group Ib variants, characterized by lower genetic diversity, transmission rate, and intra-host variant numbers, might play the role of maintenance variants. The residential locations of patients infected by group Ib variants were in the outer rim of case clusters throughout the 2001–2003 period, whereas group Ia and II variants were located in the centers of case clusters, suggesting that group Ib viruses might serve as “sheltered overwintering” variants in an undefined ecological niche. Further deep sequencing of the viral envelope (E) gene directly from individual patient serum samples confirmed the emergence of variants belonging to the three quasi-species (groups Ia, Ib, and II), and showed that viral variants from the latter phase of the 2001 outbreak contributed ancestrally to the later, larger-scale epidemic beginning in 2002.
These findings enhance our understanding of how epidemic severity increased over time in the same epidemic area. They also highlight the importance of combining phylodynamic and deep sequencing analyses as surveillance tools for detecting dynamic changes in viral variants, in particular for finding and monitoring specific viral subpopulations. Such subpopulations might have selection advantages in both fitness and transmissibility, leading to increased epidemic severity.

Community-based prevalence of typhoid fever, typhus, brucellosis and malaria among symptomatic individuals in Afar Region, Ethiopia

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Biruk Zerfu, Girmay Medhin, Gezahegne Mamo, Gezahegn Getahun, Rea Tschopp, Mengistu Legesse

Background

In sub-Saharan Africa, where proper diagnostic tools are scarce, febrile illness-related symptoms are often misdiagnosed as malaria. Information on the causative agents of febrile illness-related symptoms among pastoral communities in Ethiopia has rarely been described.

Methodology

In this community-based cross-sectional survey, we assessed the prevalence of typhoid fever, typhus, brucellosis and malaria among individuals with a set of given symptoms in Amibara district, Afar Region, Ethiopia. Blood samples were collected from 650 study participants and examined by the Widal and Weil-Felix direct card agglutination test (DCAT), as well as a test tube-based titration test, for Salmonella enterica serotype Typhi (S. Typhi) and Rickettsia infections. The Rose Bengal Plate Test (RBPT) and Complement Fixation Test (CFT) were used to screen for Brucella infection. Thin and thick blood smears were used to diagnose malaria.

Principal findings

Out of 630 sera screened by DCAT, 83 (13.2%) were reactive to H and/or O antigens for S. Typhi infection. Among these, 46 (55.4%) were reactive by the titration test at a cut-off value of ≥ 1:80. The combined seroprevalence for S. Typhi by the two tests was 7.3% (46/630). The seroprevalence for Rickettsia infection by DCAT was 26.2% (165/630), of which 53.3% (88/165) were reactive by the titration test at a cut-off value of ≥ 1:80. The combined seroprevalence for Rickettsia infection by the two tests was 14.0% (88/630). The seroprevalence for Brucella infection was 12.7% (80/630) by RBPT, of which 28/80 (35%) were positive by CFT. The combined seroprevalence for Brucella infection by the two tests was 4.4% (28/630). Out of 650 individuals suspected of malaria, 16 (2.5%) were positive for P. falciparum infection.
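
The combined seroprevalences above are simply the confirmatory-test positives expressed as a fraction of all sera screened. A quick check of the arithmetic, using the counts reported in this abstract:

```python
def combined_seroprevalence(confirmed_positive, total_screened):
    """Per cent of all screened sera positive on both the screening and
    the confirmatory test, rounded to one decimal place."""
    return round(100 * confirmed_positive / total_screened, 1)

typhi = combined_seroprevalence(46, 630)       # S. Typhi
rickettsia = combined_seroprevalence(88, 630)  # Rickettsia
brucella = combined_seroprevalence(28, 630)    # Brucella
```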

Conclusions/Significance

In this study, typhoid fever, typhus, brucellosis and malaria were all observed among symptomatic individuals. The study also highlights that brucellosis cases can be misdiagnosed as malaria or other diseases when based solely on clinical diagnosis. Therefore, efforts are needed to improve disease awareness and laboratory services for the diagnosis of brucellosis and other zoonotic diseases, and to identify other causes of febrile illness in this pastoral setting.

Population genetic analysis of Chadian Guinea worms reveals that human and non-human hosts share common parasite populations

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Elizabeth A. Thiele, Mark L. Eberhard, James A. Cotton, Caroline Durrant, Jeffrey Berg, Kelsey Hamm, Ernesto Ruiz-Tiben

Following almost 10 years of no reported cases, Guinea worm disease (GWD or dracunculiasis) reemerged in Chad in 2010 with peculiar epidemiological patterns and unprecedented prevalence of infection among non-human hosts, particularly domestic dogs. Since 2014, animal infections with Guinea worms have also been observed in the other three countries with endemic transmission (Ethiopia, Mali, and South Sudan), causing concern and generating interest in the parasites’ true taxonomic identity and population genetics. We present the first extensive population genetic data for Guinea worm, investigating mitochondrial and microsatellite variation in adult female worms from both human and non-human hosts in the four endemic countries to elucidate the origins of Chad’s current outbreak and possible host-specific differences between parasites. Genetic diversity of Chadian Guinea worms was considerably higher than that of the other three countries, even after controlling for sample size through rarefaction, and demographic analyses are consistent with a large, stable parasite population. Genealogical analyses eliminate the other three countries as possible sources of parasite reintroduction into Chad, and sequence divergence and distribution of genetic variation provide no evidence that parasites in human and non-human hosts are separate species or maintain isolated transmission cycles. Both among and within countries, geographic origin appears to have more influence on parasite population structure than host species. Guinea worm infection in non-human hosts has been occasionally reported throughout the history of the disease, particularly when elimination programs appear to be reaching their end goals. However, no previous reports have evaluated molecular support for the parasites’ species identity.
Our data confirm that Guinea worms collected from non-human hosts in the remaining endemic countries of Africa are Dracunculus medinensis and that the same population of worms infects both humans and dogs in Chad. Our genetic data and the epidemiological evidence suggest that transmission in the Chadian context is currently being maintained by canine hosts.
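
Rarefaction, used above to control for unequal sample sizes, compares the expected number of distinct alleles in equally sized random subsamples. A toy sketch of the idea (the data are invented and this is not the authors' implementation):

```python
import random

def rarefied_richness(alleles, subsample_size, reps=2000, seed=1):
    """Mean number of distinct alleles across random subsamples of fixed size,
    allowing diversity comparisons between samples of unequal size."""
    rng = random.Random(seed)
    return sum(len(set(rng.sample(alleles, subsample_size)))
               for _ in range(reps)) / reps

# Invented genotype samples: one diverse, one dominated by a single allele.
diverse_sample = list(range(20)) * 2   # 40 worms carrying 20 alleles
uniform_sample = [0] * 35 + [1] * 5    # 40 worms carrying 2 alleles
```

Even when subsampled to the same size, the diverse sample retains higher expected richness; this is the sense in which Chadian worms remained more diverse after rarefaction.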

Engineered nanoparticles bind elapid snake venom toxins and inhibit venom-induced dermonecrosis

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Jeffrey O’Brien, Shih-Hui Lee, José María Gutiérrez, Kenneth J. Shea

Envenomings by snakebites constitute a serious and challenging global health issue. The mainstay of therapy for snakebite envenomings is the parenteral administration of animal-derived antivenoms. Significantly, antivenoms are only partially effective in the control of local tissue damage. We propose a novel approach to mitigate the progression of local tissue damage that could complement antivenom therapy. We describe an abiotic hydrogel nanoparticle engineered to bind to and modulate the activity of a diverse array of phospholipase A2 (PLA2) and three-finger toxin (3FTX) isoforms found in Elapidae snake venoms. These two families of protein toxins share features associated with their common (membrane) targets, allowing for nanoparticle sequestration by a mechanism that differs from immunological (epitope) selection. The nanoparticles are non-toxic in mice and dose-dependently inhibit the dermonecrotic activity of Naja nigricollis venom.

A rare case of visceral leishmaniasis in an immunocompetent traveler returning to the United States from Europe

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Lamia Haque, Merceditas Villanueva, Armand Russo, Youzhong Yuan, Eun-Ju Lee, Jeffrey Topal, Nikolai Podoltsev

A young, healthy traveler returning to the United States presented with fever, night sweats, splenomegaly, and pancytopenia. Bone marrow biopsy revealed leishmaniasis (Leishmania infantum), likely acquired in southern France. Although many cases of endemic visceral leishmaniasis (VL) have been reported in Europe, this is a rare case of imported VL in a healthy traveler returning from Europe to the US. Despite successful initial treatment with liposomal amphotericin B (LAmB), relapse occurred. Treatments for VL in immunocompetent individuals are highly effective, but relapse can occur. Experience with treating relapse is more extensive in endemic areas and may be lacking in North America. This case alerts physicians in the US that immunocompetent adults can acquire VL during brief visits to endemic areas in Europe. It is important that travelers be counseled on preventive measures, and patients should be monitored for relapse after treatment.

Dengue illness index—A tool to characterize the subjective dengue illness experience

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Stephen J. Thomas, Liane Agulto, Kim Hendrickx, Martin Erpicum, Kay M. Tomashek, M. Cristina Cassetti, Catherine Laughlin, Alexander Precioso, Alexander C. Schmidt, Federico Narvaez, João Bosco Siqueira, Hasitha Tissera, Robert Edelman

Dengue virus infections are a major cause of febrile illness that significantly affects individual and societal productivity and drives up health care costs, principally in the developing world. Two dengue vaccine candidates are in advanced clinical efficacy trials in Latin America and Asia, and another has been licensed in more than fifteen countries, but its uptake has been limited. Despite these advances, standardized metrics for comparing protective efficacy between dengue vaccines remain poorly defined. The Dengue Illness Index (DII) is a tool that we developed through refinement of previous similar iterations, in an attempt to improve and standardize the measurement of vaccine and drug efficacy in reducing moderate dengue illness. The tool is designed to capture an individual’s overall disease experience based on how the totality of their symptoms affects their general wellness and daily functionality. We applied the DII to a diary card, the Dengue Illness Card (DIC), which was examined and further developed by a working group. The card was then refined with feedback garnered from a Delphi methodology-based query that addressed the adequacy and applicability of the tool in clinical dengue research. There was overall agreement that the tool would generate useful data and provide an alternative perspective on the assessment of drug or vaccine candidates, which, in the case of vaccines, are currently assessed by their reduction of virologically confirmed dengue of any severity, with a focus on more severe disease. The DIC now needs to be evaluated in the field in the context of vaccine or drug trials, prospective cohort studies, or experimental human infection studies. Here, we present the final DIC resulting from the Delphi process and offer its further development and use to the dengue research community.

Development of standard clinical endpoints for use in dengue interventional trials

PLoS Neglected Tropical Diseases News - 4 October 2018 - 9:00pm

by Kay M. Tomashek, Bridget Wills, Lucy Chai See Lum, Laurent Thomas, Anna Durbin, Yee-Sin Leo, Norma de Bosch, Elsa Rojas, Kim Hendrickx, Martin Erpicum, Liane Agulto, Thomas Jaenisch, Hasitha Tissera, Piyarat Suntarattiwong, Beth Ann Collers, Derek Wallace, Alexander C. Schmidt, Alexander Precioso, Federico Narvaez, Stephen J. Thomas, Robert Edelman, João Bosco Siqueira, M. Cristina Cassetti, Walla Dempsey, Duane J. Gubler

Dengue is a major public health problem worldwide. Although several drug candidates have been evaluated in randomized controlled trials, none has proven effective; at present, early recognition of severe dengue and timely supportive care are used to reduce mortality. While the first dengue vaccine was recently licensed, and several other candidates are in late-stage clinical trials, future decisions regarding widespread deployment of vaccines and/or therapeutics will require evidence of product safety, efficacy and effectiveness. Standard, quantifiable clinical endpoints are needed to ensure reproducibility and comparability of research findings. To address this need, we established a working group of dengue researchers and public health specialists to develop standardized endpoints and work towards consensus opinion on those endpoints. After discussion at two working group meetings and presentations at international conferences, a Delphi methodology-based query was used to finalize and operationalize the clinical endpoints. Participants were asked to select the best endpoints from proposed definitions or to offer revised or new definitions, and to indicate whether contributing items should be designated as optional or required. After the third round of inquiry, 70% or greater agreement was reached on moderate and severe plasma leakage, moderate and severe bleeding, acute hepatitis and acute liver failure, and moderate and severe neurologic disease. There was less agreement regarding moderate and severe thrombocytopenia and moderate and severe myocarditis. Notably, 68% of participants agreed that a platelet count of 20,000 to 50,000 per mm3 be used to define moderate thrombocytopenia; however, they remained divided on whether a rapidly decreasing trend or a single platelet count should be case-defining. While at least 70% agreement was reached on most endpoints, the process identified areas for further evaluation and standardization within the context of ongoing clinical studies.
These endpoints can be used to harmonize data collection and improve comparability between dengue clinical trials.
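As an illustration of how an endpoint like the proposed moderate-thrombocytopenia band might be operationalized in a trial data pipeline, here is a minimal sketch in Python. The band of 20,000 to 50,000 platelets per mm3 comes from the abstract; the boundary inclusivity and the use of a single count (rather than a decreasing trend) are assumptions made for the sketch, since the working group left those points unresolved.

```python
def is_moderate_thrombocytopenia(platelets_per_mm3: int) -> bool:
    """Classify a single platelet count against the proposed moderate band.

    Assumptions (not settled by the working group): inclusive boundaries,
    and a single count rather than a decreasing trend is case-defining.
    """
    return 20_000 <= platelets_per_mm3 <= 50_000
```

A harmonized definition like this is what allows the same laboratory value to be graded identically across trial sites and sponsors.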

Typhoid fever outbreak in the Democratic Republic of Congo: Case control and ecological study

PLoS Neglected Tropical Diseases News - 3 October 2018 - 9:00pm

by Julii Brainard, Rob D’hondt, Engy Ali, Rafael Van den Bergh, Anja De Weggheleire, Yves Baudot, Frederic Patigny, Vincent Lambert, Rony Zachariah, Peter Maes, Donat Kuma-Kuma Kenge, Paul R. Hunter

During 2011, a large outbreak of typhoid fever affected an estimated 1430 people in Kikwit, Democratic Republic of Congo. The outbreak started in military camps in the city but then spread to the general population. This paper reports the results of an ecological analysis and a case-control study undertaken to examine water and other possible transmission pathways. Attack rates were determined for health areas, and risk ratios were estimated with respect to spatial exposures. Approximately 15 months after the outbreak, demographic, environmental and exposure data were collected for 320 cases and 640 controls residing in the worst-affected areas, using a structured interview questionnaire. Unadjusted and adjusted odds ratios were estimated. Complete data were available for 956 respondents. Residents of areas with water supplied via gravity on the mains network were at much greater risk of disease acquisition (risk ratio = 6.20, 95% CI 3.39–11.35) than residents of areas not supplied by this mains network. In the case-control study, typhoid was found to be associated with ever using tap water from the municipal supply (OR = 4.29, 95% CI 2.20–8.38). Visible urine or faeces in the latrine was also associated with increased risk of typhoid, while having chosen a water source because it is protected was negatively associated with disease. Knowledge that washing hands can prevent typhoid fever, and a stated habit of handwashing before cooking or after toileting, were associated with increased risk of disease. However, the observed associations of handwashing and plate-sharing with disease risk could very likely be due to recall bias. This outbreak of typhoid fever was strongly associated with drinking water from the municipal supply, based on the descriptive and analytic epidemiology and the finding of high levels of faecal contamination of drinking water.
Future outbreaks of potentially waterborne disease need an integrated response that includes epidemiology and environmental microbiology during early stages of the outbreak.
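The associations reported above rest on standard 2x2-table epidemiology. As a rough illustration of how an odds ratio and its 95% Wald confidence interval are computed from exposure counts, here is a short sketch; the counts in the usage line are hypothetical and are not the study's data.

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls,
                  unexposed_cases, unexposed_controls, z=1.96):
    """Odds ratio with a 95% Wald confidence interval from a 2x2 table.

    Assumes all four cells are non-zero (no continuity correction applied).
    """
    # Cross-product ratio: odds of exposure among cases vs. controls.
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    # Standard error of log(OR) is the root of the summed reciprocal cells.
    se_log_or = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                          + 1 / unexposed_cases + 1 / unexposed_controls)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts for illustration only:
# 20 exposed cases, 10 exposed controls, 30 unexposed cases, 60 unexposed controls.
print(odds_ratio_ci(20, 10, 30, 60))
```

Adjusted odds ratios, as reported in the study, would additionally require a regression model (e.g. conditional logistic regression) rather than a single 2x2 table.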

Beating the odds: Sustained Chagas disease vector control in remote indigenous communities of the Argentine Chaco over a seven-year period

PLoS Neglected Tropical Diseases News - 2 October 2018 - 9:00pm

by M. Sol Gaspe, Yael M. Provecho, María P. Fernández, Claudia V. Vassena, Pablo L. Santo Orihuela, Ricardo E. Gürtler

Background
Rapid reinfestation of insecticide-treated dwellings hampers the sustained elimination of Triatoma infestans, the main vector of Chagas disease in the Gran Chaco region. We conducted a seven-year longitudinal study, including community-wide spraying with pyrethroid insecticides combined with periodic vector surveillance, to investigate the house reinfestation process in connection with baseline pyrethroid resistance, housing quality and household mobility in a rural section of Pampa del Indio mainly inhabited by deprived indigenous people (Qom).

Methodology/Principal findings

Despite evidence of moderate pyrethroid resistance in local T. infestans populations, house infestation dropped from 31.9% at baseline to 0.7% at 10 months post-spraying (MPS), with no triatomine found at 59 and 78 MPS. Household-based surveillance corroborated the rare occurrence of T. infestans and house invasion by four other triatomine species. The annual rates of loss of initially occupied houses and of household mobility were high (4.6–8.0%). Housing improvements did not translate into a significant reduction of mud-walled houses and refuges for triatomines because most households kept the former dwelling or built new ones with mud walls.

Conclusions/Significance

Our results refute the assumption that vector control actions performed in marginalized communities of the Gran Chaco are doomed to fail. The larger-than-expected impacts of the intervention program were likely associated with the combined effects of high-coverage, professional insecticide spraying followed by systematic vector surveillance-and-response, broad geographic coverage creating a buffer zone, frequent housing replacement and residential mobility. The dynamic interactions among housing quality, mobility and insecticide-based control largely affect the chances of vector elimination.

Leprosy in children under 15 years of age in Brazil: A systematic review of the literature

PLoS Neglected Tropical Diseases News - 2 October 2018 - 9:00pm

by Michelle Christini Araújo Vieira, Joilda Silva Nery, Enny S. Paixão, Kaio Vinicius Freitas de Andrade, Gerson Oliveira Penna, Maria Glória Teixeira

Background
Leprosy is a neglected chronic infectious disease caused by Mycobacterium leprae. It is considered a public health problem because it may cause permanent physical disabilities and deformities, leading to severe limitations. This review presents an overview of the results of epidemiological studies on the occurrence of leprosy in childhood in Brazil, aiming to alert health planners and managers to the actual need to institute special control strategies.

Methodology/Principal findings

Data collection consisted of an electronic search of eight databases for papers published up to 2016: Literatura Latino-Americana e do Caribe em Ciências da Saúde (LILACS), Scientific Electronic Library Online (SciELO), PubMed, Biblioteca Virtual em Saúde (BVS), SciVerse Scopus (Scopus), the CAPES theses database, the CAPES journals database and Web of Science. After applying the selection criteria, twenty-two papers on studies conducted in four different regions of Brazil and published between 2001 and 2016 were included in the review. The leprosy detection rate ranged from 10.9 to 78.4 per 100,000 inhabitants. Despite affecting both sexes, leprosy was more common in boys and in 10-14-year-olds. Although the authors reported a high cure proportion (82–90%), between 1.7% and 5.5% of the individuals developed a disability resulting from the disease.

Conclusions/Significance

The findings of this review show that the leprosy situation among Brazilian children under 15 years of age is extremely adverse, in that the leprosy detection rate remained high in the majority of studies. The proportion of cases involving disability is also high and reflects the difficulties and poor effectiveness of actions aimed at controlling the disease. The authors suggest developing studies in spatial clusters of leprosy in which, beyond the established routine actions, new strategies of active case-finding, campaigns and educational actions are implemented within the clusters. The new agenda needs to involve the precepts of ethical, humane and supportive care in order to achieve a new level of leprosy control in Brazil.

Defining stopping criteria for ending randomized clinical trials that investigate the interruption of transmission of soil-transmitted helminths employing mass drug administration

PLoS Neglected Tropical Diseases News - 1 October 2018 - 9:00pm

by Marleen Werkman, Jaspreet Toor, Carolin Vegvari, James E. Wright, James E. Truscott, Kristjana H. Ásbjörnsdóttir, Arianna Rubin Means, Judd L. Walson, Roy M. Anderson

The current World Health Organization strategy to address soil-transmitted helminth (STH) infections in children is based on morbidity control through routine deworming of school-aged and pre-school-aged children. However, given that transmission continues to occur as a result of persistent reservoirs of infection in untreated individuals (including adults) and in the environment, in many settings such a strategy will need to be continued for very extended periods of time, or until social, economic and environmental conditions result in interruption of transmission. As a result, there is currently much discussion surrounding the possibility of accelerating the interruption of transmission using alternative strategies of mass drug administration (MDA). However, the feasibility of achieving transmission interruption using MDA remains uncertain due to challenges in sustaining high MDA coverage levels across entire communities. The DeWorm3 trial, designed to test the feasibility of interrupting STH transmission, is currently ongoing. In DeWorm3, three years of high treatment coverage (indicated by mathematical models as necessary for breaking transmission) will be followed by two years of surveillance. Given the fast reinfection (bounce-back) rates of STH, a two-year no-treatment period is regarded as adequate to assess whether bounce-back or transmission interruption has occurred in a given location. In this study, we investigate whether criteria to determine that transmission interruption is unlikely can be defined at earlier timepoints. A stochastic, individual-based simulation model is employed to simulate core aspects of the DeWorm3 community-based cluster-randomized trial, which compares a control arm (annual MDA treatment of children alone) with an intervention arm (community-wide biannual MDA treatment). Simulations were run for each scenario for both Ascaris lumbricoides and hookworm (Necator americanus).
A range of threshold prevalences measured at six months after the last round of MDA, and the impact of MDA coverage levels, were evaluated to see whether the likelihood of bounce-back or elimination could reliably be assessed at that point, rather than after two years of subsequent surveillance. The analyses suggest that all clusters should be assessed for transmission interruption after two years of surveillance, unless transmission interruption can be effectively ruled out through evidence of low treatment coverage. The models suggest a tight range of homogeneous prevalence estimates following high-coverage MDA across clusters, which does not allow for discrimination between bounce-back and transmission interruption within 24 months following cessation of MDA.
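The stochastic, individual-based model used in the study is far richer than anything that fits in an abstract, but the bounce-back dynamic it examines can be sketched with a toy SIS-style simulation: individuals clear infection under MDA, then reacquire it at a rate proportional to current prevalence (a shared environmental reservoir). Every parameter and rate below is illustrative only and is not taken from DeWorm3 or the paper.

```python
import random

def simulate_prevalence(n=500, mda_rounds=6, coverage=0.9, efficacy=0.94,
                        infection_rate=0.08, clearance_rate=0.02,
                        surveillance_steps=24, seed=42):
    """Toy stochastic individual-based model of STH infection under MDA.

    One step ~ one month. Biannual MDA rounds are followed by a two-year
    no-treatment surveillance period. Returns prevalence (fraction infected)
    at the end of surveillance. All parameters are illustrative assumptions.
    """
    rng = random.Random(seed)
    infected = [rng.random() < 0.4 for _ in range(n)]  # ~40% baseline prevalence

    def step(treat):
        prev = sum(infected) / n  # force of reinfection scales with prevalence
        for i in range(n):
            if treat and infected[i] and rng.random() < coverage * efficacy:
                infected[i] = False   # cleared by MDA
            elif infected[i] and rng.random() < clearance_rate:
                infected[i] = False   # natural clearance
            elif not infected[i] and rng.random() < infection_rate * prev:
                infected[i] = True    # reinfection from the reservoir

    for _ in range(mda_rounds):
        step(treat=True)
        for _ in range(5):            # ~6-month gap between biannual rounds
            step(treat=False)
    for _ in range(surveillance_steps):  # 2 years of monthly surveillance
        step(treat=False)
    return sum(infected) / n
```

Running many stochastic realizations of a model like this across simulated clusters, and inspecting the spread of prevalence estimates at six months versus 24 months post-MDA, mirrors the kind of analysis the abstract describes, in a drastically simplified form.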