Dr Joaquin M. Prada
Academic and research departments
School of Veterinary Medicine, Faculty of Health and Medical Sciences, Institute for Sustainability, Surrey Institute for People-Centred Artificial Intelligence (PAI)
Biography
My research focuses on the development of mathematical and statistical models to inform policy decision-making for the control and elimination of infectious diseases, with a particular interest in neglected tropical diseases (NTDs).
My work has supported global policy, for example during the development of the 2021-2030 NTD roadmap, and I am also part of several informal WHO-led working groups and the United Against Rabies forum working group 1 (a FAO-OIE-WHO initiative). My research group at Surrey combines mathematical modelling with economic evaluation and stakeholder elicitation techniques to develop sustainable portfolios of interventions against NTDs (in particular zoonotic NTDs such as Echinococcosis and Rabies).
I hold a visiting position at North Carolina State University. I previously held similar visiting positions at the University of Warwick and the University of Oxford.
University roles and responsibilities
- School of Veterinary Medicine Impact Lead
Publications
As the complexity of health systems has increased over time, there is an urgent need for developing multi-sectoral and multi-disciplinary collaborations within the domain of One Health (OH). Despite the efforts to promote collaboration in health surveillance and overcome professional silos, implementing OH surveillance systems in practice remains challenging for multiple reasons. In this study, we describe the lessons learned from the evaluation of OH surveillance using OH-EpiCap (an online evaluation tool for One Health epidemiological surveillance capacities and capabilities), the challenges identified with the implementation of OH surveillance, and the main barriers that contribute to its sub-optimal functioning, as well as possible solutions to address them. We conducted eleven case studies targeting the multi-sectoral surveillance systems for antimicrobial resistance in Portugal and France, Salmonella in France, Germany, and the Netherlands, Listeria in The Netherlands, Finland and Norway, Campylobacter in Norway and Sweden, and psittacosis in Denmark. These evaluations facilitated the identification of common strengths and weaknesses, focusing on the organization and functioning of existing collaborations and their impacts on the surveillance system. Lack of operational and shared leadership, adherence to FAIR data principles, sharing of techniques, and harmonized indicators led to poor organization and sub-optimal functioning of OH surveillance systems. In the majority of studied systems, the effectiveness, operational costs, behavioral changes, and population health outcomes brought by the OH surveillance over traditional surveillance (i.e. compartmentalized into sectors) have not been evaluated. To this end, the establishment of a formal governance body with representatives from each sector could assist in overcoming long-standing barriers. 
Moreover, demonstrating the impacts of OH-ness of surveillance may facilitate the implementation of OH surveillance systems.
Although international health agencies encourage the development of One Health (OH) surveillance, many systems remain mostly compartmentalized, with limited collaborations among sectors and disciplines. In the framework of the OH European Joint Programme “MATRIX” project, a generic evaluation tool called OH-EpiCap has been developed to enable individual institutes/governments to characterize, assess and monitor their own OH epidemiological surveillance capacities and capabilities. The tool is organized around three dimensions: organization, operational activities, and impact of the OH surveillance system; each dimension is then divided into four targets, each including four indicators. A semi-quantitative questionnaire enables the scoring of each indicator at one of four levels, according to the degree of satisfaction in the studied OH surveillance system. The evaluation is conducted by a panel of surveillance representatives (during a half-day workshop or through a back-and-forth process to reach a consensus). An R Shiny-based web application facilitates implementation of the evaluation and visualization of the results, and includes a benchmarking option. The tool was piloted on several foodborne hazards (i.e., Salmonella, Campylobacter, Listeria), emerging threats (e.g., antimicrobial resistance) and other zoonotic hazards (psittacosis) in multiple European countries in 2022. These case studies showed that the OH-EpiCap tool supports the tracing of strengths and weaknesses in epidemiological capacities and the identification of concrete and direct actions to improve collaborative activities at all steps of surveillance. It appears complementary to the existing EU-LabCap tool, designed to assess the capacity and capability of European microbiology laboratories. In addition, it provides an opportunity to reinforce trust between surveillance stakeholders from across the system and to build a good foundation for a professional network for further collaboration.
The global 2030 goal set by the World Organization for Animal Health (WOAH), the World Health Organization (WHO), and the Food and Agriculture Organization (FAO), to eliminate dog-mediated human rabies deaths, has undeniably been a catalyst for many countries to re-assess existing dog rabies control programmes. Additionally, the 2030 agenda for Sustainable Development includes a blueprint for global targets which will benefit both people and secure the health of the planet. Rabies is acknowledged as a disease of poverty, but the connections between economic development and rabies control and elimination are poorly quantified, yet they are critical evidence for planning and prioritisation. We have developed multiple generalised linear models to model the relationship between health care access, poverty, and the death rate as a result of rabies, with separate indicators that can be used at country level: total Gross Domestic Product (GDP) and current health expenditure as a percentage of total GDP (% GDP) as indicators of economic growth, and a metric of poverty assessing the extent and intensity of deprivation experienced at the individual level (the Multidimensional Poverty Index, MPI). Notably, there was no detectable relationship between GDP or current health expenditure (% GDP) and the death rate from rabies. However, MPI showed statistically significant relationships with per capita rabies deaths and with the probability of receiving lifesaving post-exposure prophylaxis. We highlight that those most at risk of not being treated, and of dying due to rabies, live in communities experiencing health care inequalities, readily measured through poverty indicators. These data demonstrate that economic growth alone may not be enough to meet the 2030 goal. Indeed, other strategies, such as targeting vulnerable populations and promoting responsible pet ownership, are also needed in addition to economic investment.
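As an illustration of the kind of generalised linear model described in this abstract, the sketch below fits a Poisson regression of death counts against a poverty index with a population offset, implemented from scratch with numpy via iteratively reweighted least squares. All data and coefficient values are synthetic and purely hypothetical; this is not the study's model or data.

```python
import numpy as np

def poisson_glm_irls(X, y, offset=None, n_iter=50):
    """Fit a Poisson GLM (log link) by iteratively reweighted least squares."""
    n, p = X.shape
    beta = np.zeros(p)
    off = np.zeros(n) if offset is None else offset
    for _ in range(n_iter):
        eta = X @ beta + off
        mu = np.exp(eta)
        z = eta - off + (y - mu) / mu   # working response for the log link
        W = mu                          # IRLS weights
        XtW = X.T * W
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

# Hypothetical country-level data: deaths ~ Poisson(pop * exp(b0 + b1 * MPI))
rng = np.random.default_rng(0)
n = 200
mpi = rng.uniform(0.0, 0.6, n)        # Multidimensional Poverty Index
pop = rng.uniform(1e5, 5e6, n)        # population at risk
true_beta = np.array([-13.0, 4.0])    # made-up coefficients
deaths = rng.poisson(pop * np.exp(true_beta[0] + true_beta[1] * mpi))

X = np.column_stack([np.ones(n), mpi])
beta_hat = poisson_glm_irls(X, deaths, offset=np.log(pop))
print(beta_hat)  # estimates should be close to the true values
```

The log-population offset makes the model a regression on per-capita death rates, which is how a country-level indicator such as MPI would naturally enter such an analysis.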
The Middle East, Eastern Europe, Central Asia and North Africa Rabies Control Network (MERACON) is built upon the achievements of the Middle East and Eastern Europe Rabies Expert Bureau (MEEREB). MERACON aims to foster collaboration among Member States (MS) and develop shared regional objectives, building momentum towards dog-mediated rabies control and elimination. Here we assess the epidemiology of rabies and preparedness in twelve participating MS, using case and rabies capacity data for 2017, and compare our findings with previously published reports and a predictive burden model. Across MS, the number of reported cases of dog rabies per 100,000 dogs and the number of reported human deaths per 100,000 population as a result of dog-mediated rabies appeared weakly associated. Compared to 2014, there has been a decrease in the number of reported human cases in five of the twelve MS; three MS reported an increase, two MS continued to report zero cases, and the remaining two MS were not listed in the 2014 study, so no comparison could be drawn. Vaccination coverage in dogs has increased since 2014 in half (4/8) of the MS where data are available. Most importantly, it is evident that there is a need for improved data collection, sharing and reporting at both the national and international levels. With the formation of the MERACON network, MS will be able to align with international best practices, while also fostering international support with other MS and international organisations.
Lymphatic filariasis (LF) is a neglected tropical disease targeted for elimination as a public health problem by 2030. Although mass treatments have led to huge reductions in LF prevalence, some countries or regions may find it difficult to achieve elimination by 2030 owing to various factors, including local differences in transmission. Subnational projections of intervention impact are a useful tool in understanding these dynamics, but correctly characterizing their uncertainty is challenging. We developed a computationally feasible framework for providing subnational projections for LF across 44 sub-Saharan African countries using ensemble models, guided by historical control data, to allow assessment of the role of subnational heterogeneities in global goal achievement. Projected scenarios include ongoing annual treatment from 2018 to 2030, enhanced coverage, and biannual treatment. Our projections suggest that progress is likely to continue. However, highly endemic locations currently deploying strategies with the lower World Health Organization recommended coverage (65%) and frequency (annual) are expected to see only slow decreases in prevalence. Increasing intervention frequency or coverage can accelerate progress by up to 5 or 6 years, respectively. While projections based on baseline data have limitations, our methodological advancements provide assessments of potential bottlenecks for the global goals for LF arising from subnational heterogeneities. In particular, areas with high baseline prevalence may face challenges in achieving the 2030 goals, extending the "tail" of interventions. Enhancing intervention frequency and/or coverage will accelerate progress. Our approach facilitates preimplementation assessments of the impact of local interventions and is applicable to other regions and neglected tropical diseases.
The WHO aims to eliminate schistosomiasis as a public health problem by 2030. However, standard morbidity measures correlate poorly with infection intensities, hindering disease monitoring and evaluation. This is exacerbated by insufficient evidence on Schistosoma's impact on health-related quality of life (HRQoL). We conducted community-based cross-sectional surveys and parasitological examinations in moderate-to-high Schistosoma mansoni endemic communities in Uganda. We determined parasitic infection status and used EQ-5D instruments to estimate and compare HRQoL utilities in these populations. We further employed Tobit/linear regression models to identify determinants of HRQoL. Two-thirds of the 560 participants were diagnosed with parasitic infection(s), 49% having S. mansoni. No significant negative association was observed between HRQoL and S. mansoni infection status/intensity. However, severity of pain when urinating (beta = -0.106; s.e. = 0.043), body swelling (beta = -0.326; s.e. = 0.005), increasing age (beta = -0.016; s.e. = 0.033), reduced socio-economic status (beta = 0.128; s.e. = 0.032), and being unemployed predicted lower HRQoL. Symptom severity and socio-economic status were better predictors of short-term HRQoL than current S. mansoni infection status/intensity. This is key to disentangling the link between infection(s) and short-term health outcomes, and highlights the complexity of correlating current infection(s) with long-term morbidity. Further evidence is needed on long-term schistosomiasis-associated HRQoL, and on health and economic outcomes, to inform the case for upfront investments in schistosomiasis interventions.
Cystic echinococcosis (CE), a prevalent tapeworm infection of humans and herbivorous animals worldwide, is caused by accidental ingestion of Echinococcus granulosus eggs excreted by infected dogs. CE is endemic in the Middle East and North Africa, and is considered an important parasitic zoonosis in Iran. It is transmitted between dogs, the primary definitive host, and different livestock species, the intermediate hosts. One of the most important measures for CE control is dog deworming with praziquantel. Because dogs are frequently reinfected, intensive deworming campaigns are critical for breaking CE transmission. The dog reinfection rate can be used as an indicator of the intensity of local CE transmission in endemic areas. However, our knowledge of the extent of reinfection in endemic regions is poor. The purpose of the present study was to determine the E. granulosus reinfection rate after praziquantel administration in a population of owned dogs in Kerman, Iran. A cohort of 150 owned dogs was recruited, with stool samples collected before praziquantel administration as a single oral dose of 5 mg/kg. The dogs were re-sampled at 2, 5 and 12 months following the initial praziquantel administration. Stool samples were examined microscopically using the Willis flotation method. Genomic DNA was extracted, and E. granulosus sensu lato-specific primers were used to PCR-amplify a 133-bp fragment of a repeat unit of the parasite genome. Survival analysis was performed using the Kaplan-Meier method to calculate cumulative survival rates, used here to capture reinfection dynamics, and the monthly incidence of infection, also capturing the spatial distribution of disease risk. The survival analysis showed total reinfection rates of 8, 12 and 17% at 2, 5 and 12 months following the initial praziquantel administration, respectively, indicating that 92, 88 and 83% of the dogs had no detectable infection over those same time periods.
The monthly incidence of reinfection in the total owned-dog population was estimated at 1.5% (95% CI 1.0-2.1). The prevalence of echinococcosis in owned dogs, using a copro-PCR assay, was 42.6%, whereas, using conventional microscopy, 8% of fecal samples were positive for taeniid eggs. Our results suggest that regular treatment of the dog population with praziquantel every 60 days would be ideal; however, such frequent dog dosing faces major logistical and cost challenges, threatening the sustainability of control programs. Understanding the nature and extent of dog reinfection in endemic areas is essential for the successful implementation of control programs and for understanding patterns of CE transmission. Cystic echinococcosis (CE), caused by the small tapeworm of dogs, Echinococcus granulosus, is a prevalent zoonotic infection of humans and livestock worldwide. Dogs play a crucial role in the parasite life cycle, serving as definitive hosts for the tapeworm and contributing to the transmission of the disease to humans and other livestock. Praziquantel (PZQ) dosing of dogs is proposed as one of the main elements of CE control programs. Frequent PZQ dosing is required because farm dogs are reinfected after being fed offal, and this presents major logistical and financial problems. The frequency of PZQ dosing depends on the rate of dog reinfection in each endemic region. We explored the significance of farm-dog reinfection in an endemic area in southeastern Iran, to provide a better understanding of CE transmission and to support the development of effective control programs. We monitored a cohort of 150 dogs previously treated for cystic echinococcosis with praziquantel. We showed that the prevalence of echinococcosis in the farm dogs, using a PCR assay, was 42.6% before praziquantel administration. To our surprise, a significant proportion of these dogs, approximately 17%, experienced reinfection with E. granulosus within 12 months of the initial praziquantel administration. This finding was both alarming and informative. Our study emphasizes the importance of responsible dog ownership and proper disposal of livestock offal in endemic areas.
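The Kaplan-Meier analysis described above can be sketched in a few lines of Python. The follow-up data below are hypothetical, constructed only so that the estimator reproduces reinfection-free proportions close to the reported 92, 88 and 83%; they are not the study's data.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survivor function.

    times  : follow-up time for each dog (months)
    events : 1 = reinfection observed at that time, 0 = censored
    Returns event times and the estimated probability of remaining
    reinfection-free just after each event time.
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times):
        mask = times == t
        d = events[mask].sum()          # reinfections at time t
        if d > 0:
            surv *= 1 - d / at_risk     # KM product-limit update
            out_t.append(float(t))
            out_s.append(surv)
        at_risk -= mask.sum()           # everyone observed at t leaves risk set
    return out_t, out_s

# Hypothetical follow-up data mimicking the study design (sampling at
# 2, 5 and 12 months; 0 = censored, i.e. still reinfection-free)
times = [2] * 12 + [5] * 6 + [12] * 8 + [12] * 124
events = [1] * 12 + [1] * 6 + [1] * 8 + [0] * 124
t, s = kaplan_meier(times, events)
for ti, si in zip(t, s):
    print(f"month {ti:>4.0f}: reinfection-free {si:.2%}")
```

With these made-up counts the survivor function evaluates to 92%, 88% and 83% at 2, 5 and 12 months, mirroring the figures in the abstract.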
The role of transportation vehicles, pig movement between farms, proximity to infected premises, and feed deliveries has not been fully considered in the dissemination dynamics of porcine epidemic diarrhea virus (PEDV). This has limited efforts for disease control and elimination, restricting the development of risk-based resource allocation to the most relevant modes of PEDV dissemination. Here, we modeled nine between-farm transmission pathways, including farm-to-farm proximity (local transmission), the contact network of pig movements between sites, four different contact networks of transportation vehicles (vehicles transporting pigs from farm to farm, pigs to markets, feed, and crew), and the volume of animal by-products within feed diets (e.g. animal fat and meat-and-bone meal), to reproduce PEDV transmission dynamics. The model was calibrated in space and time against weekly PEDV outbreaks. We investigated the model's performance in identifying outbreak locations and the contribution of each route to the dissemination of PEDV. The model estimated that 42.7% of the infections in sow farms were related to vehicles transporting feed, 34.5% of infected nurseries were associated with vehicles transporting pigs to farms, and for both farm types, pig movements or local transmission were the next most relevant routes. On the other hand, finishers were most often (31.4%) infected via local transmission, followed by the networks of vehicles transporting feed and pigs to farms. Feed ingredients did not significantly improve model calibration metrics. The proposed modeling framework provides an evaluation of PEDV transmission dynamics, ranking the most important routes of PEDV dissemination and granting the swine industry valuable information to focus efforts and resources on the most important transmission routes.
Neglected tropical diseases (NTDs) largely impact marginalised communities living in tropical and subtropical regions. Mass drug administration (MDA) is the leading intervention method for five NTDs; however, it is known that some populations and demographic groups lack access to treatment. It is also likely that those individuals without access to treatment are excluded from surveillance. It is important to consider the impacts of this on the overall success, and monitoring and evaluation (M & E), of intervention programmes. We use a detailed individual-based model of the infection dynamics of lymphatic filariasis to investigate the impact of excluded, untreated, and therefore unobserved groups on the true versus observed infection dynamics and subsequent intervention success. We simulate surveillance in four groups: the whole population eligible to receive treatment, the whole eligible population with access to treatment, the transmission assessment survey (TAS) focus of six- and seven-year-olds, and finally >20-year-olds. We show that the surveillance group under observation has a significant impact on perceived dynamics. Exclusion from treatment and surveillance negatively impacts the probability of reaching public health goals, though in populations that do reach these goals there are no signals to indicate excluded groups. Increasingly restricted surveillance groups over-estimate the efficacy of MDA. The presence of non-treated groups cannot be inferred when surveillance occurs only in the group receiving treatment. Author summary: Mass drug administration (MDA) is the cornerstone of control for many neglected tropical diseases. As we move towards increasingly ambitious public health targets, it is critical to investigate ways in which MDA weaknesses can be strengthened. It is known that some individuals systematically choose not to participate in treatment. It is also becoming evident that others systematically do not have access to treatment.
What is less clear, however, is how access to treatment correlates with inclusion in surveillance efforts, and in turn, how this impacts the monitoring and evaluation of intervention programmes. If individuals with access to treatment are more likely to be included in surveillance efforts, then those without access to treatment are likewise more likely to be excluded from surveillance. Extending the individual-based lymphatic filariasis model, TRANSFIL, we show that exclusion from treatment and surveillance negatively impacts the probability of reaching public health goals, though in populations that do reach these goals there are no signals to indicate excluded groups. Increasingly restricted surveillance groups over-estimate the efficacy of MDA. The presence of non-treated groups cannot be inferred when surveillance occurs only in the group receiving treatment.
Lymphatic filariasis (LF) is a debilitating, poverty-promoting, neglected tropical disease (NTD) targeted for worldwide elimination as a public health problem (EPHP) by 2030. Evaluating progress towards this target for national programmes is challenging, due to differences in disease transmission and interventions at the subnational level. Mathematical models can help address these challenges by capturing spatial heterogeneities and evaluating progress towards LF elimination and how different interventions could be leveraged to achieve elimination by 2030. Here we used a novel approach to combine historical geo-spatial disease prevalence maps of LF in Ethiopia with 3 contemporary disease transmission models to project trends in infection under different intervention scenarios at subnational level. Our findings show that local context, particularly the coverage of interventions, is an important determinant for the success of control and elimination programmes. Furthermore, although current strategies seem sufficient to achieve LF elimination by 2030, some areas may benefit from the implementation of alternative strategies, such as using enhanced coverage or increased frequency, to accelerate progress towards the 2030 targets. The combination of geospatial disease prevalence maps of LF with transmission models and intervention histories enables the projection of trends in infection at the subnational level under different control scenarios in Ethiopia. This approach, which adapts transmission models to local settings, may be useful to inform the design of optimal interventions at the subnational level in other LF endemic regions.
Parasitic neglected tropical diseases (NTDs) or 'infectious diseases of poverty' continue to affect the poorest communities in the world, including in the Philippines. Socio-economic conditions contribute to persisting endemicity of these infectious diseases. As such, examining these underlying factors may help identify gaps in implementation of control programs. This study aimed to determine the prevalence of schistosomiasis and soil-transmitted helminthiasis (STH) and investigate the role of socio-economic and risk factors in the persistence of these diseases in endemic communities in the Philippines. This cross-sectional study involving a total of 1,152 individuals from 386 randomly-selected households was conducted in eight municipalities in Mindanao, the Philippines. Participants were asked to submit fecal samples which were processed using the Kato-Katz technique to check for intestinal helminthiases. Moreover, each household head participated in a questionnaire survey investigating household conditions and knowledge, attitude, and practices related to intestinal helminthiases. Associations between questionnaire responses and intestinal helminth infection were assessed. Results demonstrated an overall schistosomiasis prevalence of 5.7% and soil-transmitted helminthiasis prevalence of 18.8% in the study population. Further, the household questionnaire revealed high awareness of intestinal helminthiases, but lower understanding of routes of transmission. Potentially risky behaviors such as walking outside barefoot and bathing in rivers were common. There was a strong association between municipality and prevalence of helminth infection. Educational attainment and higher "practice" scores (relating to practices which are effective in controlling intestinal helminths) were inversely associated with soil-transmitted helminth infection. Results of the study showed remaining high endemicity of intestinal helminthiases in the area despite ongoing control programs. 
Poor socio-economic conditions and low awareness about how intestinal helminthiases are transmitted may be among the factors hindering success of intestinal helminth control programs in the provinces of Agusan del Sur and Surigao del Norte. Addressing these sustainability gaps could contribute to the success of alleviating the burden of intestinal helminthiases in endemic areas.
As the complexity of health systems has increased over time, there is an urgent need for developing multi-sectoral and multidisciplinary collaborations within the domain of One Health (OH) (1). Despite the efforts to promote such collaborations and break through discipline silos, implementing OH surveillance in practice remains difficult. Thus, it is key to identify the main challenges and barriers to the effective implementation and functioning of an integrated OH surveillance system (2).
Accurate epidemiological classification guidelines are essential to ensure the implementation of adequate public health and social measures. Here, we investigate two frameworks, published in March 2020 and November 2020 by the World Health Organization (WHO), for categorising the transmission risk of COVID-19, and assess how well countries' self-reported classifications tracked their underlying epidemiological situation. We used three modelling approaches: an ordinal longitudinal model, a proportional odds model and a machine-learning One-Rule (OneR) classification algorithm. We applied these models to 202 countries' daily transmission classifications and epidemiological data, and studied classification accuracy over time for the period April 2020 to June 2021, when WHO stopped publishing country classifications. Overall, the first published WHO classification, purely qualitative, lacked accuracy. The incidence rate within the previous 14 days was the best predictor, with an average accuracy throughout the period of study of 61.5%. However, when each week was assessed independently, the models returned predictive accuracies above 50% only in the first weeks of April 2020. In contrast, the second classification, quantitative in nature, significantly increased the accuracy of transmission labels, with values as high as 94%.
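The One-Rule (OneR) algorithm mentioned above is simple enough to sketch directly: it discretises a single predictor and assigns each bin its majority class. The incidence thresholds and category labels below are illustrative placeholders, not the WHO's actual cut-offs or transmission categories.

```python
from collections import Counter, defaultdict

def one_rule(feature_values, labels):
    """OneR: map each (binned) feature value to its majority class;
    return the rule and its training accuracy."""
    by_value = defaultdict(list)
    for v, y in zip(feature_values, labels):
        by_value[v].append(y)
    rule = {v: Counter(ys).most_common(1)[0][0] for v, ys in by_value.items()}
    correct = sum(rule[v] == y for v, y in zip(feature_values, labels))
    return rule, correct / len(labels)

# Illustrative 14-day incidence per 100,000 population, with made-up
# bin thresholds and made-up transmission labels
def bin_incidence(x):
    return "low" if x < 20 else "medium" if x < 100 else "high"

incidence = [5, 12, 30, 80, 150, 400, 8, 95, 220, 18]
labels = ["sporadic", "sporadic", "clusters", "clusters", "community",
          "community", "sporadic", "clusters", "community", "clusters"]
binned = [bin_incidence(x) for x in incidence]
rule, accuracy = one_rule(binned, labels)
print(rule, accuracy)
```

In a full analysis one would build one such rule per candidate predictor and keep the predictor whose rule classifies best, which is how a single indicator like 14-day incidence can emerge as the "best predictor".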
Background Human, animal, and environmental health are increasingly threatened by the emergence and spread of antibiotic resistance. Inappropriate use of antibiotic treatments commonly contributes to this threat, but it is also becoming apparent that multiple, interconnected environmental factors can play a significant role. Thus, a One Health approach is required to comprehensively understand the environmental dimensions of antibiotic resistance and to inform science-based decisions and actions. The broad and multidisciplinary nature of the problem poses several open questions, drawing upon a wide and heterogeneous range of studies. Objective This study seeks to collect and catalogue the evidence of the potential effects of environmental factors on the abundance or detection of antibiotic resistance determinants in the outdoor environment, i.e., antibiotic-resistant bacteria and mobile genetic elements carrying antibiotic resistance genes, and the effect on those caused by local environmental conditions of either natural or anthropogenic origin. Methods Here, we describe the protocol for a systematic evidence map (SEM) to address this, which will be performed in adherence to best-practice guidelines. We will search the literature from 1990 to the present, using the following electronic databases: MEDLINE, Embase, and the Web of Science Core Collection, as well as the grey literature. We shall include full-text scientific articles published in English. Reviewers will work in pairs to screen titles, abstracts and keywords first, and then full-text documents. Data extraction will adhere to a purpose-designed code book. Risk-of-bias assessment will not be conducted as part of this SEM.
We will combine tables, graphs, and other suitable visualisation techniques to i) compile a database of studies investigating the factors associated with the prevalence of antibiotic resistance in the environment and ii) map the distribution, network, cross-disciplinarity, impact and trends in the literature.
Toxocara canis and Toxocara cati are globally distributed, zoonotic roundworm parasites. Human infection can have serious clinical consequences, including blindness and brain disorders. In addition to ingesting environmental eggs, humans can become infected by eating infective larvae in raw or undercooked meat products. To date, no studies have assessed the prevalence of Toxocara spp. larvae in meat from animals consumed as food in the UK or assessed tissue exudates for the presence of anti-Toxocara antibodies. This study aimed to assess the potential risk to consumers eating meat products from animals infected with Toxocara spp. Tissue samples (n=226), either muscle or liver, were obtained from 155 different food-producing animals in the south, southwest and east of England, UK, and processed by artificial digestion followed by microscopic sediment evaluation for Toxocara spp. larvae; tissue exudate samples (n=141) were tested for the presence of anti-Toxocara antibodies using a commercial ELISA kit. A logistic regression model was used to compare anti-Toxocara antibody prevalence by host species, tissue type and source. While no larvae were found by microscopic examination after tissue digestion, the overall prevalence of anti-Toxocara antibodies in tissue exudates was 27.7%. By species, 35.3% of cattle (n=34), 15.0% of sheep (n=60), 54.6% of goats (n=11) and 61.1% of pigs (n=18) had anti-Toxocara antibodies. Logistic regression analysis found that pigs were more likely to be positive for anti-Toxocara antibodies (odds ratio (OR) = 2.89, P=0.0786) than the other species sampled, but only at the 10% significance level. The high prevalence of anti-Toxocara antibodies in tissue exudates suggests that exposure of food animals to this parasite is common in England.
Tissue exudate serology on meat products within the human food chain could be applied in support of food safety and to identify practices that increase risks of foodborne transmission of zoonotic toxocariasis.
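As a minimal illustration of the odds-ratio comparison reported above, the sketch below computes a crude (unadjusted) OR with a Wald 95% confidence interval from a 2x2 table. The counts are reconstructed approximately from the reported percentages (pigs: 11/18 positive; other species combined: 27/105), so the crude OR naturally differs from the adjusted, model-based estimate of 2.89 in the abstract.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio and 95% Wald CI for a 2x2 table:
       exposed:   a positive, b negative
       unexposed: c positive, d negative"""
    orr = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # s.e. of log(OR)
    lo = math.exp(math.log(orr) - 1.96 * se)
    hi = math.exp(math.log(orr) + 1.96 * se)
    return orr, (lo, hi)

# Approximate counts back-calculated from the reported prevalences
orr, ci = odds_ratio(11, 7, 27, 78)   # pigs vs all other species
print(f"crude OR = {orr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

The published estimate came from a logistic regression adjusting for tissue type and source, which is why an adjusted OR can be smaller than this crude two-group comparison.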
Background: The COVID-19 pandemic has disrupted planned annual antibiotic mass drug administration (MDA) activities that have formed the cornerstone of the largely successful global efforts to eliminate trachoma as a public health problem. Methods: Using a mathematical model, we investigate the impact of interruption to MDA in trachoma-endemic settings. We evaluate potential measures to mitigate this impact and consider alternative strategies for accelerating progress in those areas where the trachoma elimination targets may not otherwise be achievable. Results: We demonstrate that for districts that were hyperendemic at baseline, or where the trachoma elimination thresholds have not already been achieved after three rounds of MDA, the interruption to planned MDA could lead to a delay to reaching elimination targets greater than the duration of the interruption. We also show that an additional round of MDA in the year following MDA resumption could effectively mitigate this delay. For districts where the probability of elimination under annual MDA was already very low, we demonstrate that more intensive MDA schedules are needed to achieve agreed targets. Conclusion: Through appropriate use of additional MDA, the impact of COVID-19, in terms of delay to reaching trachoma elimination targets, can be effectively mitigated. Additionally, more frequent MDA may accelerate progress towards the 2030 goals.
This study assessed the elemental status of cross-bred dairy cows in smallholder farms in Sri Lanka, with the aim of establishing an elemental baseline and identifying possible deficiencies. For this purpose, 458 milk, hair, serum and whole blood samples were collected from 120 cows in four regions of Northern and Northwestern Sri Lanka (namely Vavaniya, Mannar, Jaffna and Kurunegala). Farmers also provided a total of 257 samples of feed, which included local fodder as well as 79 supplement materials. The concentrations of As, Ca, Cd, Co, Cr, Cu, Fe, I, K, Mg, Mn, Mo, Na, Ni, Pb, Se, V and Zn were determined by inductively coupled plasma mass spectrometry (ICP-MS). Evaluation of the data revealed that all cows in this study could be considered deficient in I and Co (18.6-78.5 µg/L I and 0.06-0.65 µg/L Co in blood serum) when compared with deficiency upper boundary levels of 50 µg/L I and 0.70 µg/L Co. Poor correlations were found between the composition of milk or blood and that of hair, suggesting that hair is not a good indicator of mineral status. Most local fodders meet dietary requirements, with Sarana grass offering the greatest nutritional profile. Principal component analysis (PCA) was used to assess differences in the elemental composition of the diverse types of feed, as well as regional variability, revealing clear differences between forage, concentrates and nutritional supplements, with the latter showing higher concentrations of non-essential or even toxic elements, such as Cd and Pb.
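PCA, as used above to separate feed types, projects each sample's element concentrations onto axes of maximal variance. A dependency-free sketch of extracting the first principal component via power iteration on the sample covariance matrix, using made-up two-element data (the real analysis covers 18 elements and would use standard tooling):

```python
def first_pc(data):
    """First principal component (unit vector) of row-samples,
    found by power iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    # sample covariance matrix (d x d)
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(200):  # repeatedly apply cov and renormalize
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Synthetic samples where variation lies along the (1, 0.1) direction
pc1 = first_pc([[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [3.0, 0.3]])
print(pc1)
```

The sign of the returned vector is arbitrary, as is usual for eigenvector methods.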
Trachoma is a neglected tropical disease and the leading infectious cause of blindness worldwide. The current World Health Organization goal for trachoma is elimination as a public health problem, defined as reaching a prevalence of trachomatous inflammation-follicular below 5% in children (1-9 years) and a prevalence of trachomatous trichiasis in adults below 0.2%. Targets to achieve elimination were originally set for 2020 but are being extended to 2030. Mathematical and statistical models suggest that 2030 is a realistic timeline for elimination as a public health problem in most trachoma-endemic areas. Although the goal can be achieved, it is important to develop appropriate monitoring tools for surveillance after the elimination target has been reached, to check for the possibility of resurgence. For this purpose, a standardized serological approach, or the use of multiple diagnostics in combination, would likely be required.
The increasing overlap of resources between human and long-tailed macaque (Macaca fascicularis) (LTM) populations has escalated human-primate conflict. In Malaysia, LTMs are labeled as a 'pest' species due to the macaques' opportunistic nature. This study investigates the activity budget of LTMs in an urban tourism site and how human activities influence it. Observational data were collected from LTMs daily for a period of four months. The observed behaviors were compared across differing levels of human interaction, between different times of day, and between high, medium, and low human traffic zones. LTMs exhibited varying ecological behavior patterns across zones of differing human traffic, e.g., higher inactivity when human presence was high. More concerning is the impact on these animals' welfare and group dynamics as interactions with humans increase; we noted increased inactivity and reduced intra-group interaction. This study highlights the connection that LTMs make between human activity and sources of anthropogenic food. Only through understanding LTM interaction with humans can the cause of human-primate conflict be better understood, and thus more sustainable mitigation strategies be generated.
Visceral leishmaniasis (VL) is a neglected tropical disease (NTD) caused by Leishmania protozoa that are transmitted by female sand flies. On the Indian subcontinent (ISC), VL is targeted by the World Health Organization (WHO) for elimination as a public health problem by 2020, defined as fewer than 1 new case per 10,000 inhabitants per year at the (sub-)district level.
A limited understanding of the transmission dynamics of swine diseases is a significant obstacle to preventing and controlling disease spread. Understanding between-farm transmission dynamics is therefore crucial to developing disease forecasting systems that predict outbreaks and allow the swine industry to tailor control strategies. Our objective was to forecast weekly porcine epidemic diarrhoea virus (PEDV) outbreaks by generating maps to identify current and future PEDV high-risk areas, and by simulating the impact of control measures. Three epidemiological transmission models were developed and compared: PigSpread, a novel modelling framework developed specifically to model disease spread in swine populations, and two models built on previously developed ecosystems, SimInf (a stochastic disease spread simulation framework) and PoPS (Pest or Pathogen Spread). The models were calibrated on true weekly PEDV outbreaks from three spatially related swine production companies. Prediction accuracy across models was compared using the receiver operating characteristic area under the curve (AUC). Model outputs were in general agreement with observed outbreaks throughout the study period. PoPS had an AUC of 0.80, followed by PigSpread with 0.71, and SimInf had the lowest at 0.59. Our analysis estimates that the combined strategies of herd closure, controlled exposure of gilts to live virus (feedback) and on-farm biosecurity reinforcement reduced the number of outbreaks: on average by 76% to 89% in sow farms, and by 33% to 61% in gilt development units (GDUs), when deployed to sow and GDU farms located in probabilistic high-risk areas. Our multi-model forecasting approach can be used to prioritize surveillance and intervention strategies for PEDV and other diseases, potentially leading to more resilient and healthier pig production systems.
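The model comparison above is based on ROC AUC. For intuition, AUC equals the probability that a randomly chosen true outbreak week receives a higher risk score than a randomly chosen non-outbreak week (the Mann-Whitney formulation); a minimal illustration with made-up labels and scores:

```python
def auc(labels, scores):
    """Empirical ROC AUC: fraction of (positive, negative) pairs
    ranked correctly, counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfectly separated scores give AUC = 1.0; chance level is 0.5
print(auc([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.1]))  # -> 1.0
```

An AUC of 0.80 (as reported for PoPS) therefore means the model ranks a true outbreak week above a non-outbreak week 80% of the time.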
Cystic echinococcosis (CE) is a zoonotic disease of global relevance that leads to significant morbidity and economic losses in the livestock industry. Effective intervention and surveillance strategies are essential, and available, for controlling and preventing the spread of this disease. As echinococcosis is a Neglected Tropical Disease (NTD), one of the main obstacles to control and surveillance is resource constraints and a lack of sustainable investment. The Modelling Approaches for Cystic Echinococcosis (MACE) project aims to address these issues through an interdisciplinary approach that combines spatial analysis, individual-based transmission modelling, and stakeholder elicitation techniques.
Objective: Cystic echinococcosis (CE) is a parasitic zoonosis caused by Echinococcus granulosus sensu lato. Immunodiagnostic techniques such as Western blot (WB) or enzyme-linked immunosorbent assay (ELISA), with different antigens, can be applied to the diagnosis of sheep for epidemiological surveillance purposes in control programs. However, their use is limited by antigenic cross-reactivity between the different taeniid species present in sheep. Therefore, the usefulness of establishing surveillance systems based on the identification of infection present in a livestock establishment, known as the (Epidemiological) Implementation Unit (IU), needs to be evaluated. Materials and Methods: A new ELISA diagnostic technique has recently been developed and validated using the recombinant EgAgB8/2 antigen for the detection of antibodies against E. granulosus. To determine detection of infection at the IU level using this diagnostic technique, simulations were carried out to evaluate the sample size required to classify IUs as likely infected, using outputs from a recently developed Bayesian latent class analysis model. Results: Relatively small sample sizes (14-29) are sufficient to achieve a high probability of detection (above 80%) across a range of prevalences, using the recently recommended optical density cut-off value for this novel ELISA (0.496), which optimizes diagnostic sensitivity and specificity. Conclusions: This diagnostic technique could potentially be used to estimate the prevalence of infection in an area under control, measured as the percentage of IUs with infected sheep present, or to individually identify IUs with ongoing transmission, given the presence of infected lambs, on which control measures should be intensified.
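The sample-size logic above can be sketched with a simple closed-form calculation: with imperfect sensitivity and specificity, each sampled animal tests positive with the apparent prevalence, and an IU is flagged if at least one animal tests positive. The sensitivity, specificity and prevalence values below are hypothetical placeholders, not the paper's Bayesian latent class estimates.

```python
def min_sample_size(prev, se, sp, target=0.80, n_max=500):
    """Smallest n such that P(at least one test-positive animal
    among n sampled) >= target, given true prevalence `prev`,
    test sensitivity `se` and specificity `sp`."""
    p_pos = prev * se + (1 - prev) * (1 - sp)  # apparent prevalence
    for n in range(1, n_max + 1):
        if 1 - (1 - p_pos) ** n >= target:
            return n
    return None

# Hypothetical: 10% true prevalence, Se = 0.85, Sp = 0.96
print(min_sample_size(0.10, 0.85, 0.96))  # -> 13
```

Note this simple version also counts false positives towards "detection"; a fuller treatment, as in the paper, would propagate uncertainty in prevalence and test performance.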
Many aspects of the porcine reproductive and respiratory syndrome virus (PRRSV) between-farm transmission dynamics have been investigated, but uncertainty remains about the significance of farm type and different transmission routes in PRRSV spread. We developed a stochastic epidemiological model calibrated on weekly PRRSV outbreaks, accounting for the population dynamics in the different pig production phases (breeding herds, gilt development units, nurseries and finisher farms) of three hog producer companies. Our model accounted for indirect contacts through close distance between farms (local transmission), between-farm animal movements (pig flow) and reinfection of sow farms (re-break). The fitted model was used to examine the effectiveness of vaccination strategies and complementary interventions, such as enhanced PRRSV detection and vaccination delays, and to forecast the spatial distribution of PRRSV outbreaks. The results of our analysis indicated that for sow farms, 59% of the simulated infections were related to local transmission (e.g. airborne, feed deliveries, shared equipment), whereas 36% and 5% were related to animal movements and re-break, respectively. For nursery farms, 80% of infections were related to animal movements and 20% to local transmission, while at finisher farms infections were split between local transmission and animal movements. Assuming that the current vaccines are 1% effective in mitigating between-farm PRRSV transmission, vaccination of weaned pigs would reduce the incidence of PRRSV outbreaks by 3%; indeed, under no scenario was vaccination alone sufficient to completely control PRRSV spread. Our results also showed that intensifying PRRSV detection and/or vaccinating pigs at placement increased the effectiveness of all simulated vaccination strategies. Our model reproduced the incidence and spatial distribution of PRRSV; it could therefore also be used to map current and future at-risk farms.
Finally, this model could be a useful tool for veterinarians, allowing them to identify the effect of transmission routes and of different vaccination interventions to control PRRSV spread.
Over 240 million people are infected with schistosomiasis. Detecting Schistosoma mansoni eggs in stool using Kato–Katz thick smears (Kato-Katzs) is highly specific but lacks sensitivity. The urine-based point-of-care circulating cathodic antigen test (POC-CCA) has higher sensitivity, but issues include specificity, discrepancy between batches and interpretation of trace results. A semi-quantitative G-score and latent class analyses making no assumptions about trace readings have helped address some of these issues. However, intra-sample and inter-sample variation remains unknown for POC-CCAs. We collected 3 days of stool and urine from 349 and 621 participants, from high- and moderate-endemicity areas, respectively. We performed duplicate Kato-Katzs and one POC-CCA per sample. In the high-endemicity community, we also performed three POC-CCA technical replicates on one urine sample per participant. Latent class analysis was performed to estimate the relative contribution of intra- (test technical reproducibility) and inter-sample (day-to-day) variation on sensitivity and specificity. Within-sample variation for Kato-Katzs was higher than between-sample, with the opposite true for POC-CCAs. A POC-CCA G3 threshold most accurately assesses individual infections. However, to reach the WHO target product profile of the required 95% specificity for prevalence and monitoring and evaluation, a threshold of G4 is needed, but at the cost of reducing sensitivity. This article is part of the theme issue ‘Challenges and opportunities in the fight against neglected tropical diseases: a decade from the London Declaration on NTDs’.
Mass drug administration (MDA) is the main strategy towards lymphatic filariasis (LF) elimination. Progress is monitored by assessing microfilaraemia (Mf) or circulating filarial antigenaemia (CFA) prevalence, the latter being more practical for field surveys. The current criterion for stopping MDA requires the CFA prevalence in 6-7-year-old children to fall below a critical threshold. Using an established transmission model, we assessed reliable (>95% positive predictive value) thresholds for stopping MDA. The model captured trends in Mf and CFA prevalences reasonably well. Elimination cannot be predicted with sufficient certainty from CFA prevalence in 6-7-year olds. Resurgence may still occur if all children are antigen-negative, irrespective of the number tested. Mf-based criteria also show unfavourable results (PPV below 95%).
Fasciolosis (Fasciola hepatica) and paramphistomosis (Calicophoron daubneyi) are two important infections of livestock. Calicophoron daubneyi is the predominant Paramphistomidae species in Europe, and its prevalence has increased in the last 10–15 years. In Italy, evidence suggests that the prevalence of F. hepatica in ruminants is low in the southern part, but C. daubneyi has been recently reported at high prevalence in the same area. Given the importance of reliable tools for liver and rumen fluke diagnosis in ruminants, this study evaluated the diagnostic performance of the Mini-FLOTAC (MF), Flukefinder® (FF) and sedimentation (SED) techniques to detect and quantify F. hepatica and C. daubneyi eggs using spiked and naturally infected cattle faecal samples.
The accuracy of screening tests for detecting cystic echinococcosis (CE) in livestock depends on characteristics of the host–parasite interaction and the extent of serological cross-reactivity with other taeniid species. The AgB8 kDa protein is considered to be the most specific native or recombinant antigen for immunodiagnosis of ovine CE. A particular DNA fragment coding for rAgB8/2 was identified that provides evidence of a specific reaction in the serodiagnosis of metacestode infection. We developed and validated an IgG enzyme-linked immunosorbent assay (ELISA) using a recombinant antigen B subunit EgAgB8/2 (rAgB8/2) of Echinococcus granulosus sensu lato (s.l.) to estimate CE prevalence in sheep. A 273 bp DNA fragment coding for rAgB8/2 was expressed as a fusion protein (∼30 kDa) and purified by affinity chromatography. Evaluation of the analytical and diagnostic performance of the ELISA followed the World Organisation for Animal Health (OIE) manual, including implementation of serum panels from: uninfected lambs (n = 79); experimentally infected sheep (2,000 E. granulosus s.l. eggs each) with subsequent evidence of E. granulosus cysts at necropsy (n = 36); and animals carrying other metacestode/trematode infections (n = 20). The latter, naturally infected with Taenia hydatigena, Thysanosoma actinioides and/or Fasciola hepatica, were used to assess the cross-reactivity of rAgB8/2. EgAgB8/2 showed cross-reaction with only one serum sample, from a sheep infected with Ta. hydatigena, out of the 20 animals tested. Furthermore, the kinetics of the humoral response over time was evaluated in five 6-month-old sheep, each experimentally infected with 2,000 E. granulosus s.l. eggs, for up to 49 weeks (approximately one year) post infection. The earliest detectable IgG responses against rAgB8/2 were observed in sera from two and four sheep at 7 and 14 days after experimental infection, respectively.
The highest immune response across all five animals was found 16 to 24 weeks post infection.
Reducing the morbidities caused by neglected tropical diseases (NTDs) is a central aim of ongoing disease control programmes. The broad spectrum of pathogens under the umbrella of NTDs lead to a range of negative health outcomes, from malnutrition and anaemia to organ failure, blindness and carcinogenesis. For some NTDs, the most severe clinical manifestations develop over many years of chronic or repeated infection. For these diseases, the association between infection and risk of long-term pathology is generally complex, and the impact of multiple interacting factors, such as age, co-morbidities and host immune response, is often poorly quantified. Mathematical modelling has been used for many years to gain insights into the complex processes underlying the transmission dynamics of infectious diseases; however, long-term morbidities associated with chronic or cumulative exposure are generally not incorporated into dynamic models for NTDs. Here we consider the complexities and challenges for determining the relationship between cumulative pathogen exposure and morbidity at the individual and population levels, drawing on case studies for trachoma, schistosomiasis and foodborne trematodiasis. We explore potential frameworks for explicitly incorporating long-term morbidity into NTD transmission models, and consider the insights such frameworks may bring in terms of policy-relevant projections for the elimination era. This article is part of the theme issue ‘Challenges and opportunities in the fight against neglected tropical diseases: a decade from the London Declaration on NTDs’.
Background: Co-infection with multiple soil-transmitted helminth (STH) species is common in communities with a high STH prevalence. The life histories of STH species share important characteristics, particularly in the gut, and there is the potential for interaction, but evidence on whether interactions are facilitating or antagonistic is limited. Methods: Data from a pretreatment cross-sectional survey of STH egg deposition in a tea plantation community in Sri Lanka were analysed to evaluate patterns of co-infection and changes in egg deposition. Results: There were positive associations between Trichuris trichiura (whipworm) and both Necator americanus (hookworm) and Ascaris lumbricoides (roundworm), but N. americanus and Ascaris were not associated. N. americanus and Ascaris infections had lower egg deposition in single infections than when co-infecting. There was no clear evidence of a similar effect of co-infection on Trichuris egg deposition. Conclusions: Associations in prevalence and egg deposition in STH species may vary, possibly indicating that the effects of co-infection are species dependent. We suggest that between-species interactions that differ by species could explain these results, but further research in different populations is needed to support this theory.
Defense against infection incurs costs as well as benefits that are expected to shape the evolution of optimal defense strategies. In particular, many theoretical studies have investigated contexts favoring constitutive versus inducible defenses. However, even when one immune strategy is theoretically optimal, it may be evolutionarily unachievable. This is because evolution proceeds via mutational changes to the protein interaction networks underlying immune responses, not by changes to an immune strategy directly. Here, we use a theoretical simulation model to examine how underlying network architectures constrain the evolution of immune strategies, and how these network architectures account for desirable immune properties such as inducibility and robustness. We focus on immune signaling because signaling molecules are common targets of parasitic interference but are rarely studied in this context. We find that in the presence of a coevolving parasite that disrupts immune signaling, hosts evolve constitutive defenses even when inducible defenses are theoretically optimal. This occurs for two reasons. First, there are relatively few network architectures that produce immunity that is both inducible and also robust against targeted disruption. Second, evolution toward these few robust inducible network architectures often requires intermediate steps that are vulnerable to targeted disruption. The few networks that are both robust and inducible consist of many parallel pathways of immune signaling with few connections among them. In the context of relevant empirical literature, we discuss whether this is indeed the most evolutionarily accessible robust inducible network architecture in nature, and when it can evolve.
Vampire bat-transmitted rabies has recently become the leading cause of rabies mortality in both humans and livestock in Latin America. Evaluating risk of transmission from bats to other animal species has thus become a priority in the region. An integrated bat-rabies dynamic modeling framework quantifying spillover risk to cattle farms was developed. The model is spatially explicit and is calibrated to the state of São Paulo, using real roost and farm locations. Roost and farm characteristics, as well as environmental data through an ecological niche model, are used to modulate rabies transmission. Interventions aimed at reducing risk in roosts (such as bat culling or vaccination) and in farms (cattle vaccination) were considered as control strategies. Both interventions significantly reduce the number of outbreaks in farms and disease spread (based on distance from source), with control in roosts being a significantly better intervention. High-risk areas were also identified, which can support ongoing programs, leading to more effective control interventions.
Simple Summary: Echinococcosis is a zoonotic disease relevant to public health in many countries. The disease is present in Brazil; however, it is often underreported due to the lack of mandatory notification of cases across all Brazilian states. The records of two national databases were accessed for the period 1995-2016 to describe the registered cases of, and deaths from, echinococcosis in the country. Demographic, epidemiological, and health care data related to the occurrence of disease and to deaths attributed to echinococcosis are described. During the study period, 7955 hospitalizations were recorded due to echinococcosis, with 185 deaths. In a second database recording just mortality, a further 113 deaths were documented. Deaths were observed in every state of Brazil. When comparing between states, there was great variability in mortality rates, possibly indicating differences in the quality of health care received by patients and reinforcing the need to expand the compulsory notification of the disease across the country. Abstract: Echinococcosis is a zoonotic disease relevant to public health in many countries, on all continents except Antarctica. The objective of this study is to describe the registered cases of, and mortality from, echinococcosis in Brazil from 1995 to 2016. The records of two national databases, the Hospital Information System (HIS) and the Mortality Information System (MIS), were accessed for the period 1995-2016. Demographic, epidemiological, and health care data related to the occurrence of disease and deaths attributed to echinococcosis in Brazil are described. The results showed that 7955 records of hospitalizations were documented in the HIS during the study period, with 185 deaths from echinococcosis, and 113 records of deaths were documented in the MIS, with deaths recorded in every state of Brazil over the period.
When comparing between states, the HIS showed great variability in mortality rates, possibly indicating heterogeneity in diagnosis and in the quality of health care received by patients. Less severe cases that do not require specialized care are not recorded by the information systems, thus the true burden of the disease could be underrepresented in the country. A change in the coding of disease records in the HIS in the late 1990s, (the integration of echinococcosis cases with other pathologies), led to the loss of specificity of the records. The records showed a wide geographic distribution of deaths from echinococcosis, reinforcing the need to expand the notification of the disease in Brazil. Currently, notification of cases is compulsory in the state of Rio Grande do Sul.
Taenia solium is an important cause of acquired epilepsy worldwide and remains endemic in Asia, Africa, and Latin America. Transmission of this parasite is still poorly understood despite the design of infection experiments to improve our knowledge of the disease, with estimates for critical epidemiological parameters, such as the probability of human-to-pig infection after exposure to eggs, still lacking. In this paper, a systematic review was carried out and eight pig infection experiments were analyzed to describe the probability of developing cysts. These experiments included different pathways of inoculation: with ingestion of proglottids, eggs, and beetles that ingested eggs, and direct injection of activated oncospheres into the carotid artery. In these experiments, different infective doses were used, and the numbers of viable and degenerated cysts in the body and brain of each pig were registered. Five alternative dose-response models (exponential, logistic, log-logistic, and exact and approximate beta-Poisson) were assessed for their accuracy in describing the observed probabilities of cyst development as a function of the inoculation dose. Dose-response models were developed separately for the presence of three types of cysts (any, viable only, and cysts in the brain) and considered for each of the four inoculation methods ("Proglottids", "Eggs", "Beetles" and "Carotid"). The exact beta-Poisson model best fit the data for the three types of cysts and all relevant exposure pathways. However, observations for some exposure pathways were too scarce to reliably define a dose-response curve with any model. A wide enough range of doses and sufficient sample sizes was only found for the "Eggs" pathway and a merged "Oral" pathway combining the "Proglottids", "Eggs" and "Beetles" pathways. 
Estimated parameter values from this model suggest that a low infective dose is sufficient to result in a 50% probability for the development of any cyst or for viable cyst infections. Although this is a preliminary model reliant on a limited dataset, the parameters described in this manuscript should contribute to the design of future experimental infections related to T. solium transmission, as well as the parameterization of simulation models of transmission aimed at informing control.
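The dose-response families compared above each map an inoculation dose to a probability of cyst development. A sketch of two of them, the exponential and the approximate beta-Poisson, with made-up parameter values (the exact beta-Poisson, which best fit the data, involves a confluent hypergeometric function and is omitted here):

```python
import math

def p_exponential(dose, r):
    """Exponential model: each ingested egg independently establishes
    with probability r, so P(infection) = 1 - exp(-r * dose)."""
    return 1 - math.exp(-r * dose)

def p_beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson: the per-egg success probability varies
    between hosts following a Beta(alpha, beta) distribution."""
    return 1 - (1 + dose / beta) ** (-alpha)

def id50_exponential(r):
    """Dose giving a 50% infection probability under the exponential model."""
    return math.log(2) / r

# Illustrative parameters only (not fitted values from the paper)
print(p_exponential(100, 0.01), p_beta_poisson(100, 0.5, 200.0))
```

In practice the parameters would be fitted by maximum likelihood to the observed infected/uninfected counts at each inoculation dose, as done in the paper.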
Despite decades of interventions, 240 million people have schistosomiasis. Infections cannot be directly observed, and egg-based Kato-Katz thick smears lack sensitivity, affecting treatment efficacy and reinfection rate estimates. The point-of-care circulating cathodic antigen (POC-CCA) test is advocated as an improvement on the Kato-Katz method, but improved estimates are limited by ambiguities in the interpretation of trace results. We collected repeated Kato-Katz egg counts from 210 school-aged children and scored POC-CCA tests according to the manufacturer's guidelines (referred to from here as POC-CCA+) and the externally developed G score. We used hidden Markov models parameterized with Kato-Katz; Kato-Katz and POC-CCA+; and Kato-Katz and G-Scores, inferring latent clearance and reinfection probabilities at four timepoints over six months through a more formal statistical reconciliation of these diagnostics than previously conducted. Our approach required minimal but robust assumptions regarding trace interpretations. Antigen-based models estimated higher infection prevalence across all timepoints compared with the Kato-Katz model, corresponding to lower clearance and higher reinfection estimates. Specifically, pre-treatment prevalence estimates were 85% (Kato-Katz; 95% CI: 79%-92%), 99% (POC-CCA+; 97%-100%) and 98% (G-Score; 95%-100%). Post-treatment, 93% (Kato-Katz; 88%-96%), 72% (POC-CCA+; 64%-79%) and 65% (G-Score; 57%-73%) of those infected were estimated to clear infection. Of those who cleared infection, 35% (Kato-Katz; 27%-42%), 51% (POC-CCA+; 41%-62%) and 44% (G-Score; 33%-55%) were estimated to have been reinfected by 9 weeks. Treatment impact was shorter-lived than Kato-Katz-based estimates alone suggested, with lower clearance and rapid reinfection. At 3 weeks after treatment, longer-term clearance dynamics are captured.
At 9 weeks after treatment, reinfection was captured, but failed clearance could not be distinguished from rapid reinfection. Therefore, frequent sampling is required to understand these important epidemiological dynamics.
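The clearance and reinfection probabilities estimated above have a simple population-level reading: treating each person's infection status as a two-state Markov chain, prevalence evolves from per-interval clearance and reinfection probabilities. A sketch with illustrative values (not the fitted estimates):

```python
def prevalence_trajectory(p0, clear, reinfect, steps):
    """Prevalence over time under a two-state (infected/uninfected)
    Markov chain: `clear` is the per-step probability an infected
    person clears; `reinfect` the per-step probability an uninfected
    person becomes (re)infected."""
    p = p0
    out = [p]
    for _ in range(steps):
        p = p * (1 - clear) + (1 - p) * reinfect
        out.append(p)
    return out

# Illustrative: start at 85% prevalence, 30% clearance, 20% reinfection per step
print(prevalence_trajectory(0.85, 0.3, 0.2, 5))
```

The chain settles at the equilibrium prevalence reinfect / (clear + reinfect), which is why low clearance combined with high reinfection (as the antigen-based models suggest) implies a persistently high prevalence despite treatment.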
Intestinal helminths are extremely widespread and highly prevalent infections of humans, particularly in rural and poor urban areas of low- and middle-income countries. These parasites have chronic and often insidious effects on human health and child development, including abdominal problems, anaemia, stunting and wasting. Certain animals play a fundamental role in the transmission of many intestinal helminths to humans. However, understanding of the contribution of zoonotic transmission to the overall burden of human intestinal helminth infection, and of the relative importance of different animal reservoirs, remains incomplete. Moreover, control programmes and transmission models for intestinal helminths often do not consider the role of zoonotic reservoirs of infection. Such reservoirs will become increasingly important as control is scaled up and there is a move towards interruption and even elimination of parasite transmission. With a focus on southeast Asia, and the Philippines in particular, this review summarises the major zoonotic intestinal helminths and risk factors for infection, and highlights knowledge gaps related to their epidemiology and transmission. Various methodologies are discussed, including parasite genomics, mathematical modelling and socio-economic analysis, that could be employed to improve understanding of intestinal helminth spread, reservoir attribution and the burden associated with infection, as well as to assess the effectiveness of interventions. For sustainable control and ultimately elimination of intestinal helminths, there is a need to move beyond scheduled mass deworming and to consider animal and environmental reservoirs. A One Health approach to the control of intestinal helminths is proposed, integrating interventions targeting humans, animals and the environment, including improved access to water, hygiene and sanitation. This will require coordination and collaboration across different sectors to achieve the best health outcomes for all.
Existing action recognition methods are typically actor-specific due to the intrinsic topological and apparent differences among the actors. This requires actor-specific pose estimation (e.g., humans vs. animals), leading to cumbersome model design complexity and high maintenance costs. Moreover, they often focus on learning the visual modality alone and on single-label classification, whilst neglecting other available information sources (e.g., class name text) and the concurrent occurrence of multiple actions. To overcome these limitations, we propose a new approach called 'actor-agnostic multi-modal multi-label action recognition,' which offers a unified solution for various types of actors, including humans and animals. We further formulate a novel Multi-modal Semantic Query Network (MSQNet) model in a transformer-based object detection framework (e.g., DETR), characterized by leveraging visual and textual modalities to represent the action classes better. The elimination of actor-specific model designs is a key advantage, as it removes the need for actor pose estimation altogether. Extensive experiments on five publicly available benchmarks show that our MSQNet consistently outperforms the prior art of actor-specific alternatives on human and animal single- and multi-label action recognition tasks by up to 50%. Code is made available at https://github.com/mondalanindya/MSQNet.
Human toxocariasis is a neglected tropical disease which is, in fact, global in distribution and has a significant impact on public health. The infection can lead to several serious conditions in humans, including allergic, ophthalmic and neurological disorders such as epilepsy. It is caused by the common roundworm species Toxocara canis and Toxocara cati, with humans becoming accidentally infected via the ingestion of eggs or larvae. Toxocara eggs are deposited on the ground when infected dogs, cats and foxes defecate, with the eggs contaminating crops, grazing pastures, and subsequently food animals. However, transmission of Toxocara to humans via food consumption has received relatively little attention in the literature. To establish the risks that contaminated food poses to the public, a renewed research focus is required. This review discusses what is currently known about food-borne Toxocara transmission, highlighting the gaps in our understanding that require further attention, and outlining some potential preventative strategies which could be employed to safeguard consumer health.
Toxocara canis and T. cati are zoonotic roundworm parasites of dogs, cats and foxes. These definitive hosts pass eggs in their faeces, which contaminate the environment and can subsequently be ingested via soil or contaminated vegetables. In humans, infection with Toxocara can have serious health implications. This proof-of-concept study aimed to investigate the presence of Toxocara spp. eggs on ‘ready-to-eat’ vegetables (lettuce, spinach, spring onion and celery) sampled from community gardens in southern England. The contamination of vegetables with Toxocara eggs has never been investigated in the UK before, and more widely, this is the first time vegetables grown in community gardens in Europe have been assessed for Toxocara egg contamination. Sixteen community gardens participated in the study, providing 82 vegetable samples fit for analysis. Study participants also completed an anonymous questionnaire on observed visits to the sites by definitive hosts of Toxocara. Comparison of egg recovery methods was performed using lettuce samples spiked with a series of Toxocara spp. egg concentrations, with sedimentation and centrifugal concentration retrieving the highest number of eggs. A sample (100 g) of each vegetable type obtained from participating community gardens was tested for the presence of Toxocara eggs using the optimised method. Two lettuce samples tested positive for Toxocara spp. eggs, giving a prevalence of 2.4% (95% CI = 1.3–3.5%) for vegetable samples overall, and 6.5% (95% CI = 4.7–8.3%; n = 31) specifically for lettuce. Questionnaire data revealed that foxes, cats and dogs frequently visited the community gardens in the study, with 88% (68/77) of respondents reporting seeing a definitive host species or the faeces of a definitive host at their site. This proof-of-concept study showed for the first time the presence of Toxocara spp. 
eggs on vegetables grown in the UK, as well as within the soil where these vegetables originated, and highlights biosecurity and zoonotic risks in community gardens. This study establishes a method for assessment of Toxocara spp. eggs on vegetable produce and paves the way for larger-scale investigations of Toxocara spp. egg contamination on field-grown vegetables.
•First report of Toxocara eggs on vegetables grown in community gardens in Europe.
•Toxocara also detected in the soil where these vegetables originated.
•Definitive hosts for Toxocara canis and T. cati frequently visit community gardens.
•Sedimentation method recovered the most Toxocara eggs from spiked lettuce samples.
Using an appropriate diagnostic tool is essential to soil-transmitted helminth control and elimination efforts. Kato-Katz (KK) is the most commonly used diagnostic, but recently other tools, such as real-time quantitative polymerase chain reaction (multiplex qPCR), are starting to be employed more. Here, we evaluated the performance of these two diagnostic tools for five helminth species in Thailand. In the absence of a gold standard, diagnostic performance can be evaluated using latent class analysis. Our results suggest that in moderate to high prevalence settings (above 2%), multiplex qPCR could be more sensitive than KK; this was particularly apparent for Opisthorchis viverrini in the northeastern provinces. However, at low prevalence, both diagnostics suffered from low sensitivity. Specificity of both diagnostics was estimated to be high (above 70%) across all settings. For some specific helminth infections, such as O. viverrini, multiplex qPCR is still the preferable choice of diagnostic test. KK performed equally well in detecting Ascaris lumbricoides and Taenia solium when the prevalence is moderate to high (above 2%). Neither test performed well when the prevalence of infection is low (below 2%), particularly in the case of hookworm and Trichuris trichiura. Combining two or more diagnostic tests can improve performance, although at a higher cost. Development of new methods for helminth surveillance at the pre-elimination phase is therefore very important. This article is part of the theme issue ‘Challenges and opportunities in the fight against neglected tropical diseases: a decade from the London Declaration on NTDs’.
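As a toy illustration of why diagnostic interpretation shifts with prevalence (this is not the latent class model fitted in the study), the sketch below applies Bayes' theorem to combined results from two imperfect tests, assuming the tests are conditionally independent given infection status. All sensitivity, specificity and prevalence values are hypothetical.

```python
# Hypothetical sketch: posterior probability of true infection given results
# from two imperfect diagnostics (e.g. KK and multiplex qPCR), assuming
# conditional independence given true status. Numbers are illustrative only.

def prob_infected(prev, tests):
    """tests: list of (result, sensitivity, specificity), result True/False."""
    p_pos = prev      # joint probability of the data and being infected
    p_neg = 1 - prev  # joint probability of the data and being uninfected
    for result, sens, spec in tests:
        p_pos *= sens if result else (1 - sens)
        p_neg *= (1 - spec) if result else spec
    return p_pos / (p_pos + p_neg)

# A hypothetical low-prevalence setting (1%) with a more sensitive qPCR and a
# less sensitive KK, both highly specific; both tests return positive:
kk = (True, 0.55, 0.97)
qpcr = (True, 0.85, 0.95)
print(round(prob_infected(0.01, [kk, qpcr]), 3))  # → 0.759
```

Even with two concordant positives, the posterior stays well below certainty at 1% prevalence, which is one intuition behind the abstract's finding that both tests struggle in low-prevalence settings.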
The successful prevention, control, and elimination of dog-mediated rabies is challenging due to insufficient resource availability and inadequate placement. An integrated dog bite case management (IBCM) system plus dog vaccination can help address these challenges. Based on data from the IBCM system in Haiti, we conducted a cost-effectiveness evaluation of a newly established IBCM system plus sustained vaccination and compared it with (1) a no-bite-case-management (NBCM) program and (2) a non-risk-based (NRB) program, in which bite victims presenting at a health clinic would receive post-exposure prophylaxis regardless of risk assessment. We also provide cost-effectiveness guidance for an ongoing IBCM system and for sub-optimal dog vaccination coverages, considering that not all cost-effective interventions are affordable. Cost-effectiveness outcomes included average cost per human death averted (USD/death averted) and per life-year gained (LYG). The analysis used a governmental perspective. Considering a sustained 5-year implementation with 70% dog vaccination coverage, IBCM had a lower average cost per death averted (IBCM: $7,528, NBCM: $7,797, NRB: $15,244) and cost per LYG (IBCM: $152, NBCM: $158, NRB: $308) than NBCM and NRB programs. As a sensitivity analysis, we estimated cost-effectiveness for alternative scenarios with lower dog-vaccination coverages (30%, 55%) and lower implementation costs. Our results suggest that better health and cost-effectiveness outcomes are achieved with the continued implementation of an IBCM program ($118 per life-year saved) compared with a newly established IBCM program ($152 per life-year saved). Our results suggest that IBCM is more cost-effective than non-integrated programs to eliminate dog-mediated human rabies.
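The two headline outcomes above reduce to simple ratios. The sketch below shows the arithmetic with made-up placeholder inputs, not the Haiti programme data; the function name and all values are illustrative.

```python
# Hypothetical sketch of the two cost-effectiveness outcomes used in the
# study: average cost per death averted and per life-year gained (LYG).
# Every number below is a placeholder, not the published Haiti estimates.

def cost_effectiveness(total_cost, deaths_averted, life_years_per_death):
    cost_per_death = total_cost / deaths_averted
    cost_per_lyg = total_cost / (deaths_averted * life_years_per_death)
    return cost_per_death, cost_per_lyg

cost_per_death, cost_per_lyg = cost_effectiveness(
    total_cost=1_500_000,     # 5-year programme cost in USD (hypothetical)
    deaths_averted=200,       # deaths averted vs. the comparator programme
    life_years_per_death=50,  # discounted life-years gained per death averted
)
print(f"${cost_per_death:,.0f} per death averted, ${cost_per_lyg:,.0f} per LYG")
# prints "$7,500 per death averted, $150 per LYG"
```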
Background Schistosomiasis is a major socio-economic and public health problem in many sub-Saharan African countries. After large mass drug administration (MDA) campaigns, prevalence of infection rapidly returns to pre-treatment levels. The traditional egg-based diagnostic for schistosome infections, Kato-Katz, is being substituted in many settings by circulating antigen recognition-based diagnostics, usually the point-of-care circulating cathodic antigen test (CCA). The relationship between these diagnostics is poorly understood, particularly after treatment in both drug-efficacy studies and routine monitoring. Results We created a model of schistosome infections to better understand and quantify the relationship between these two egg- and adult worm antigen-based diagnostics. We focused particularly on the interpretation of “trace” results after CCA testing. Our analyses suggest that CCA is generally a better predictor of prevalence, particularly after treatment, and that trace CCA results are typically associated with truly infected individuals. Conclusions Even though prevalence rises to pre-treatment levels only six months after MDAs, our model suggests that the average intensity of infection is much lower, probably due in part to a small burden of juvenile worms that survived treatment. This work helps to better understand CCA diagnostics and the interpretation of post-treatment prevalence estimations.
During the COVID-19 pandemic, questions were raised about whether SARS-CoV-2 can infect pets and the potential risks posed to and by their human owners. We performed a systematic review of studies on SARS-CoV-2 infection prevalence in naturally infected household dogs and cats conducted worldwide and published before January 2022. Data on SARS-CoV-2 infection prevalence, as determined by either molecular or serological methods, and accompanying information, were summarized. Screening studies targeting the general dog or cat populations were differentiated from those targeting households with known COVID-19-positive people. Studies focusing on stray, sheltered or working animals were excluded. In total, 17 studies were included in this review. Fourteen studies investigated cats, 13 investigated dogs, and 10 investigated both. Five studies reported molecular prevalence, 16 reported seroprevalence, and four reported both. All but two studies started and ended in 2020. Studies were conducted in eight European countries (Italy, France, Spain, Croatia, Germany, the Netherlands, UK, Poland), three Asian countries (Iran, Japan, China) and the USA. Both molecular and serological prevalence in the general pet population were usually below 5%, but exceeded 10% when COVID-19 positive people were known to be present in the household. A meta-analysis provided pooled seroprevalence estimates in the general pet population: 2.75% (95% Confidence Interval [CI]: 1.56-4.79%) and 0.82% (95% CI: 0.26-2.54%) for cats and dogs, respectively. This review highlighted the need for a better understanding of the possible epizootic implications of the COVID-19 pandemic, as well as the need for global standards for SARS-CoV-2 detection in pets. 
•We summarized SARS-CoV-2 prevalence data in naturally infected pet dogs and cats.
•Most studies reported seroprevalence and were conducted in Europe in 2020.
•Molecular and serological prevalence in screening studies was generally below 5%.
•Prevalence exceeded 10% when pets were exposed to known COVID-19 positive owners.
•Epizootic implications of SARS-CoV-2 and testing standards for pets need more attention.
Schistosomiasis is a parasitic disease affecting over 240 million people. World Health Organization (WHO) targets for Schistosoma mansoni elimination are based on Kato-Katz egg counts, without translation to the widely used, urine-based, point-of-care circulating cathodic antigen diagnostic (POC-CCA). We aimed to standardize POC-CCA score interpretation and translate these scores to Kato-Katz-based standards, broadening diagnostic utility in progress towards elimination. A Bayesian latent-class model was fit to data from 210 school-aged children over four timepoints, pre- to six-months-post-treatment. We used (1) Kato-Katz and established POC-CCA scoring (Negative, Trace, +, ++ and +++), and (2) Kato-Katz and G-Scores, a new alternative POC-CCA scoring system (G1 to G10). We established the functional relationship between Kato-Katz counts and POC-CCA scores, and the score-associated probability of true infection. This was combined with measures of sensitivity, specificity, and the area under the curve to determine the optimal POC-CCA scoring system and positivity threshold. A simulation parametrized with model estimates established antigen-based elimination targets. True infection was associated with POC-CCA scores of ≥ + or ≥ G3. POC-CCA scores cannot predict Kato-Katz counts because low infection intensities saturate the POC-CCA cassettes. Post-treatment POC-CCA sensitivity/specificity fluctuations indicate a changing relationship between egg excretion and antigen levels (living worms). Elimination targets can be identified by the POC-CCA score distribution in a population. A population with ≤2% ++/+++, or ≤0.5% G7 and above, indicates achieving current WHO Kato-Katz-based elimination targets. Population-level POC-CCA scores can be used to assess WHO elimination targets prior to treatment. 
Caution should be exercised on an individual level and following treatment, as POC-CCAs lack resolution to discern between WHO Kato-Katz-based moderate- and high-intensity-infection categories, with limited use in certain settings and evaluations.
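To make the idea of an optimal positivity threshold concrete, the sketch below scores each candidate POC-CCA cut-off by Youden's J (sensitivity + specificity − 1) against a notional reference classification. The per-score counts are invented for illustration and are not the study data; the published analysis used a Bayesian latent-class model rather than a direct tabulation like this.

```python
# Hypothetical threshold selection over ordinal POC-CCA scores using
# Youden's J. counts: (score, truly infected, truly uninfected) per score
# band — all numbers invented for illustration.
counts = [("Neg", 5, 120), ("Trace", 20, 60), ("+", 40, 4),
          ("++", 25, 1), ("+++", 10, 0)]

total_inf = sum(i for _, i, _ in counts)
total_uninf = sum(u for _, _, u in counts)

best = None
for k in range(1, len(counts)):  # threshold: score >= counts[k][0] is positive
    tp = sum(i for _, i, _ in counts[k:])
    fp = sum(u for _, _, u in counts[k:])
    sens = tp / total_inf
    spec = 1 - fp / total_uninf
    j = sens + spec - 1
    if best is None or j > best[1]:
        best = (counts[k][0], j, sens, spec)

print(best[0])  # the cut-off with the highest Youden index; prints "+"
```

With these invented counts the "+" cut-off wins, echoing (but not reproducing) the abstract's finding that true infection is associated with scores of ≥ +.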
Background Cystic echinococcosis (CE) is a zoonotic neglected tropical disease (zNTD) which imposes a considerable financial burden on endemic countries. The World Health Organization's 2021-2030 roadmap on NTDs has proposed that intensified control be achieved in hyperendemic areas of 17 countries by 2030. Successful interventions for disease control, and the scale-up of programmes applying such interventions, rely on understanding the associated costs and relative return for investment. We conducted a scoping review of existing peer-reviewed literature on economic evaluations of CE control strategies focused on Echinococcus granulosus zoonotic hosts. Methodology/Principal findings Database searches of Scopus, PubMed, Web of Science, CABI Direct and JSTOR were conducted and comprehensively reviewed in March 2022, using predefined search criteria with no date, field or language restrictions. A total of 100 papers were initially identified and assessed for eligibility against strict inclusion and exclusion criteria, following the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines. Bibliography review of included manuscripts was used to identify additional literature. Full review of the final manuscript selection (n = 9) was performed and cost data for control interventions were extracted. Conclusions/Significance There are very few published data pertaining to the cost and cost-effectiveness of CE control interventions targeting its zoonotic hosts. Data given for costs are often incomplete, thus we were unable to perform an economic analysis and cost-effectiveness study, highlighting a pressing need for this information. There is much scope for future work in this area. More detailed information and disaggregated costings need to be collected and made available. 
This would increase the accuracy of any cost-effective analyses to be performed and allow for a greater understanding of the opportunity cost of healthcare decisions and resource allocation by stakeholders and policy makers for effective and cost-effective CE control. Author summary Cystic echinococcosis (CE) is a zoonotic neglected tropical disease which predominantly affects poor pastoral communities globally. The parasite cycles between farm dogs and livestock, and is associated with livestock farming and feeding of infected offal to dogs. Although no noticeable clinical signs are seen in livestock, some production losses, such as reduced milk yield and live weight gain may be observed, and offal condemnation at slaughter is common. The disease can also affect people, due to accidental ingestion of parasite eggs on contaminated food and contact with dogs. Human morbidity and mortality occur due to cyst formation in body organs, exerting a substantial health and financial burden to the health sector of affected countries. Control interventions to reduce CE transmission include sheep vaccination and dog deworming. Long-term control programmes are often expensive, and the true costs of such programmes poorly documented. This scoping review aims to examine published literature on the costs of CE control in zoonotic hosts and report detailed costs of individual elements of a control programme, thereby furthering our understanding of the true economic cost of CE control.
Background In line with movement restrictions and physical distancing essential for the control of the COVID-19 pandemic, WHO recommended postponement of all neglected tropical disease (NTD) control activities that involve community-based surveys, active case finding, and mass drug administration in April, 2020. Following revised guidance later in 2020, and after interruptions to NTD programmes of varying lengths, NTD programmes gradually restarted in the context of an ongoing pandemic. However, ongoing challenges and service gaps have been reported. This study aimed to evaluate the potential effect of the programmatic interruptions and strategies to mitigate this effect. Methods For seven NTDs, namely soil-transmitted helminths, schistosomiasis, lymphatic filariasis, onchocerciasis, trachoma, visceral leishmaniasis, and human African trypanosomiasis, we used mathematical transmission models to simulate the effect of programme interruptions on the dynamics of each of these diseases in different endemic settings. We also explored the potential benefit of implementing mitigation strategies, primarily in terms of minimising the delays to control targets. Findings We show that the effect of the COVID-19-induced interruption in terms of delay to achieving elimination goals might in some cases be much longer than the duration of the interruption. For schistosomiasis, onchocerciasis, trachoma, and visceral leishmaniasis, a mean delay of 2–3 years for a 1-year interruption is predicted in areas of highest prevalence. We also show that these delays can largely be mitigated by measures such as additional mass drug administration or enhanced case-finding. Interpretation The COVID-19 pandemic has brought infectious disease control to the forefront of global consciousness. It is essential that the NTDs, so long neglected in terms of research and financial support, are not overlooked, and remain a priority in health service planning and funding.
Introduction Cystic echinococcosis (CE) is a zoonotic disease caused by the cestode Echinococcus granulosus sensu lato (s.l.) which predominantly affects livestock. The disease is endemic in central-southern and insular Italy, with CE particularly infecting sheep, goats, cattle, and water buffalo. The spatial distribution of CE in endemic regions is not well understood, with surveillance efforts varying across the region. Methods In this study, we investigated the spatial distribution of CE in livestock using farm-level samples from different livestock species and a Stochastic Partial Differential Equations (SPDE) model. Samples were collected during a survey conducted in central-southern and insular Italy between 2019 and 2021. Results A total of 3141 animal samples (126 goats, 601 sheep and 2414 cattle and water buffalo) were inspected for Echinococcus s.l. cysts through routine surveillance in abattoirs by post-mortem visual examination, palpation and incision of target organs. The geographic location of the farm of origin (2,878 farms in total) was recorded for each sample. CE prevalence of 46.0% (1,323/2,878) was estimated at the farm level, with 78.3% (462/590) of farms with sheep, 28.6% (36/126) of farms with goats, 36.5% (747/2,049) of farms with cattle, and 23.5% (102/434) of farms with water buffalo infected. Discussion The spatial model evaluated the probability of infection in farms across the sampled regions, with the distribution of CE showing high clustering of infected cattle farms in the Sardinia and Sicily regions, and of sheep farms in Salerno province (Campania region). The output of this study can be used to identify CE hot-spots and to improve surveillance and control programs in endemic areas of Italy.
Campylobacter jejuni and Campylobacter coli are important bacterial causes of human foodborne illness. Despite several years of reduced antibiotic usage in livestock production in the United Kingdom (UK) and United States (US), a high prevalence of antimicrobial resistance (AMR) persists in Campylobacter. Both countries have instigated genome sequencing-based surveillance programs for Campylobacter, and in this study, we have identified AMR genes in 32,256 C. jejuni and 8,776 C. coli publicly available genome sequences to compare the prevalence and trends of AMR in Campylobacter isolated in the UK and US between 2001 and 2018. AMR markers were detected in 68% of C. coli and 53% of C. jejuni isolates, with 15% of C. coli isolates being multidrug resistant (MDR), compared to only 2% of C. jejuni isolates. The prevalence of aminoglycoside, macrolide, quinolone, and tetracycline resistance remained fairly stable from 2001 to 2018 in both C. jejuni and C. coli, but statistically significant differences were observed between the UK and US. There was a significantly higher prevalence of aminoglycoside and tetracycline resistance in US C. coli and C. jejuni isolates, and of macrolide resistance in US C. coli isolates. In contrast, UK C. coli and C. jejuni isolates showed a significantly higher prevalence of quinolone resistance. Specific multilocus sequence type (MLST) clonal complexes (e.g., ST-353/464) showed >95% quinolone resistance. This large-scale comparison of AMR prevalence has shown that the prevalence of AMR remains stable for Campylobacter in the UK and the US. This suggests that antimicrobial stewardship and restricted antibiotic usage may help contain further expansion of AMR prevalence in Campylobacter but are unlikely to reduce it in the short term.
Antigen banks have been established to supply foot-and-mouth disease virus (FMDV) vaccines at short notice to respond to incursions or upsurges in cases of FMDV infection. Multiple vaccine strains are needed to protect against specific FMDV lineages that circulate within six viral serotypes that are unevenly distributed across the world. The optimal selection of distinct antigens held in a bank must carefully balance the desire to cover these risks with the costs of purchasing and maintaining vaccine antigens. PRAGMATIST is a semi-quantitative FMD vaccine strain selection tool combining three strands of evidence: (1) estimates of the risk of incursion from specific areas (source area score); (2) estimates of the relative prevalence of FMD viral lineages in each specific area (lineage distribution score); and (3) the effectiveness of each vaccine against specific FMDV lineages based on laboratory vaccine matching tests (vaccine coverage score). The output is a vaccine score, which identifies vaccine strains that best address the threats, and consequently which are the highest priority for inclusion in vaccine antigen banks. In this paper, the data used to populate PRAGMATIST are described, including the results from expert elicitations regarding FMD risk and viral lineage circulation, while vaccine coverage data are provided from vaccine matching tests performed at the WRLFMD between 2011 and 2021 (n = 2,150). These data were tailored to working examples for three hypothetical vaccine antigen bank perspectives (Europe, North America, and Australia). The results highlight the variation in the vaccine antigens required for storage in these different regions, dependent on risk. While the tool outputs are largely robust to uncertainty in the input parameters, variation in the vaccine coverage score had the most noticeable impact on the estimated risk covered by each vaccine, particularly for vaccines that provide substantial risk coverage across several lineages.
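One plausible reading of how the three scores could combine is a risk-weighted sum: for each vaccine, sum over source areas and lineages the product of source area score, lineage distribution score and vaccine coverage score. The sketch below illustrates that aggregation with invented numbers; it is not PRAGMATIST's published parameterisation, and the area, lineage and vaccine names are placeholders.

```python
# Hypothetical combination of PRAGMATIST's three evidence strands into a
# vaccine score. All structures and values below are illustrative.

source_area = {"A": 0.7, "B": 0.3}             # relative incursion risk
lineage_dist = {                                # lineage prevalence per area
    "A": {"O/ME-SA": 0.6, "A/ASIA": 0.4},
    "B": {"O/ME-SA": 0.2, "A/ASIA": 0.8},
}
coverage = {                                    # vaccine match per lineage
    "vax1": {"O/ME-SA": 0.9, "A/ASIA": 0.1},
    "vax2": {"O/ME-SA": 0.5, "A/ASIA": 0.7},
}

def vaccine_score(vax):
    """Risk covered by one vaccine, summed over areas and lineages."""
    return sum(
        source_area[a] * lineage_dist[a][l] * coverage[vax][l]
        for a in source_area
        for l in lineage_dist[a]
    )

scores = {v: round(vaccine_score(v), 3) for v in coverage}
print(scores)  # here vax2 scores higher despite a weaker O/ME-SA match
```

Note how the broader-coverage vaccine can outrank the one with the single best lineage match, mirroring the paper's point that vaccines covering several lineages are most sensitive to coverage-score uncertainty.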
Measles is a target for elimination in all six WHO regions by 2020, and over the last decade, there has been considerable progress towards this goal. Surveillance is recognised as a cornerstone of elimination programmes, allowing early identification of outbreaks, thus enabling control and preventing re-emergence. Fever-rash surveillance is increasingly available across WHO regions, and this symptom-based reporting is broadly used for measles surveillance. However, as measles control increases, symptom-based cases are increasingly likely to reflect infection with other diseases with similar symptoms such as rubella, which affects the same populations and can have a similar seasonality. The WHO recommends that cases from suspected measles outbreaks be laboratory-confirmed, to identify 'true' cases, corresponding to measles IgM titres exceeding a threshold indicative of infection. Although serological testing for IgM has been integrated into the fever-rash surveillance systems in many countries, the logistics of sending in every suspected case are often beyond the health system's capacity. We show how age data from serologically confirmed cases can be leveraged to infer the status of non-tested samples, thus strengthening the information we can extract from symptom-based surveillance. Applying an age-specific confirmation model to data from three countries with divergent epidemiology across Africa, we identify the proportion of cases that need to be serologically tested to achieve target levels of accuracy in estimated infected numbers and discuss how this varies depending on the epidemiological context. Our analysis provides an approach to refining estimates of incidence leveraging all available data, which has the potential to improve allocation of resources, and thus contribute to rapid and efficient control of outbreaks.
There is clear empirical evidence that environmental conditions can influence Ascaris spp. free-living stage development and host reinfection, but the impact of these differences on human infections, and interventions to control them, is variable. A new model framework reflecting four key stages of the A. lumbricoides life cycle, incorporating the effects of rainfall and temperature, is used to describe the level of infection in the human population alongside the environmental egg dynamics. Using data from South Korea and Nigeria, we conclude that settings with extreme fluctuations in rainfall or temperature could exhibit strong seasonal transmission patterns that may be partially masked by the longevity of A. lumbricoides infections in hosts; we go on to demonstrate how seasonally timed mass drug administration (MDA) could impact the outcomes of control strategies. For the South Korean setting the results predict a comparative decrease of 74.5% in mean worm days (the number of days the average individual spends infected with worms across a 12-month period) between the best and worst MDA timings after four years of annual treatment. The model found no significant seasonal effect on MDA in the Nigerian setting due to a narrower annual temperature range and no rainfall dependence. Our results suggest that seasonal variation in egg survival and maturation could be exploited to maximise the impact of MDA in certain settings.
Introduction All six WHO regions currently have goals for measles elimination by 2020. Measles vaccination is delivered via routine immunization programmes, which in most sub-Saharan African countries reach children around 9 months of age, and supplementary immunization activities (SIAs), which target a wider age range at multi-annual intervals. In the absence of endemic measles circulation, the proportion of individuals susceptible to measles will gradually increase through accumulation of new unvaccinated individuals in each birth cohort, increasing the risk of an epidemic. The impact of SIAs, and the financial investment they require, depend on coverage and target age range. Materials and methods We evaluated the impact of target population age range for periodic SIAs, evaluating outcomes for two different levels of coverage, using a demographic and epidemiological model adapted to reflect populations in 4 sub-Saharan African countries. Results We found that a single SIA can maintain elimination over short time-scales, even with low routine coverage. However, maintaining elimination for more than a few years is difficult, even with large (high coverage/wide age range) recurrent SIAs, due to the build-up of susceptible individuals. Across the demographic and vaccination contexts investigated, expanding SIAs to target individuals over 10 years of age did not significantly reduce outbreak risk. Conclusions Elimination was not maintained in the contexts we evaluated without a second opportunity for vaccination. In the absence of an expanded routine program, SIAs provide a powerful option for providing this second dose. We show that a single high coverage SIA can deliver most key benefits in terms of maintaining elimination, with follow-up campaigns potentially requiring smaller investments. This makes post-campaign evaluation of coverage increasingly relevant to correctly assess future outbreak risk.
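The susceptible build-up mechanism described above can be caricatured in a few lines: each birth cohort adds children missed by routine immunization, and a periodic SIA removes a fraction of the accumulated susceptibles. This is a toy bookkeeping sketch with invented parameters, not the demographic and epidemiological model used in the paper.

```python
# Toy sketch of susceptible accumulation between SIAs. All parameters are
# hypothetical; no transmission dynamics are modelled, only bookkeeping.

def susceptibles_over_time(years, birth_cohort, routine_cov, sia_years, sia_cov):
    susceptible = 0.0
    trajectory = []
    for year in range(years):
        susceptible += birth_cohort * (1 - routine_cov)  # missed by routine
        if year in sia_years:
            susceptible *= (1 - sia_cov)                 # SIA reaches a fraction
        trajectory.append(susceptible)
    return trajectory

# 10 years, 100,000 births/year, 70% routine coverage, 90%-coverage SIAs in
# years 3 and 7:
traj = susceptibles_over_time(10, 100_000, 0.70, {3, 7}, 0.90)
print([round(s) for s in traj])
```

Even a high-coverage SIA only resets the susceptible pool temporarily; the pool climbs again at the same rate afterwards, which is the intuition behind the paper's finding that recurrent campaigns (or a routine second dose) are needed to maintain elimination.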
Abstract Background A large number of studies have assessed risk factors for infection with soil-transmitted helminths (STH), but few have investigated the interactions between the different parasites or compared these across host species. Here, we assessed the associations between Ascaris, Trichuris, hookworm, strongyle and Toxocara infections in the Philippines in human and animal hosts. Methods Faecal samples were collected from humans and animals (dogs, cats and pigs) in 252 households from four villages in southern Philippines and intestinal helminth infections were assessed by microscopy. Associations between worm species were assessed using multiple logistic regression. Results Ascaris infections showed a similar prevalence in humans (13.9%) and pigs (13.7%). Hookworm was the most prevalent infection in dogs (48%); the most prevalent infection in pigs was strongyles (42%). The prevalences of hookworm and Toxocara in cats were similar (41%). Statistically significant associations were observed between Ascaris and Trichuris and between Ascaris and hookworm infections in humans, and also between Ascaris and Trichuris infections in pigs. Dual and triple infections were observed, which were more common in dogs, cats and pigs than in humans. Conclusions Associations are likely to exist between STH species in humans and animals, possibly due to shared exposures and transmission routes. Individual factors and behaviours will play a key role in the occurrence of co-infections, which will have effects on disease severity. Moreover, the implications of co-infection for the emergence of zoonoses need to be explored further.
Schistosomiasis is a parasitic disease acquired through contact with contaminated freshwater. The definitive hosts are terrestrial mammals, including humans, with some Schistosoma species crossing the animal-human boundary through zoonotic transmission. An estimated 12 million people live at risk of zoonotic schistosomiasis caused by Schistosoma japonicum and Schistosoma mekongi, largely in the World Health Organization’s Western Pacific Region and in Indonesia. Mathematical models have played a vital role in our understanding of the biology, transmission, and impact of intervention strategies; however, these have mostly focused on non-zoonotic Schistosoma species. Whilst these non-zoonotic-based models capture some aspects of zoonotic schistosomiasis transmission dynamics, the commonly used frameworks are yet to adequately capture the complex epi-ecology of multi-host zoonotic transmission. However, overcoming these knowledge gaps goes beyond transmission dynamics modelling. To improve model utility and enhance zoonotic schistosomiasis control programmes, we highlight three pillars that we believe are vital to sustainable interventions at the implementation (community) and policy level, and discuss the pillars in the context of a One Health approach, recognising the interconnection between humans, animals and their shared environment. These pillars are: (1) human and animal epi-ecological understanding; (2) economic considerations (such as treatment costs and animal losses); and (3) sociological understanding, including inter- and intra-human and animal interactions. These pillars must be built on a strong foundation of trust, support and commitment of stakeholders and involved institutions.
Existing action recognition methods are typically actor-specific due to the intrinsic topological and apparent differences among the actors. This requires actor-specific pose estimation (e.g., humans vs. animals), leading to cumbersome model design complexity and high maintenance costs. Moreover, they often focus on learning the visual modality alone and single-label classification, whilst neglecting other available information sources (e.g., class name text) and the concurrent occurrence of multiple actions. To overcome these limitations, we propose a new approach called 'actor-agnostic multi-modal multi-label action recognition', which offers a unified solution for various types of actors, including humans and animals. We further formulate a novel Multi-modal Semantic Query Network (MSQNet) model in a transformer-based object detection framework (e.g., DETR), characterised by leveraging visual and textual modalities to represent the action classes better. The elimination of actor-specific model designs is a key advantage, as it removes the need for actor pose estimation altogether. Extensive experiments on five publicly available benchmarks show that our MSQNet consistently outperforms prior actor-specific alternatives on human and animal single- and multi-label action recognition tasks by up to 50%. Code will be released at https://github.com/mondalanindya/MSQNet.
Background Neglected tropical diseases (NTDs) disproportionately affect populations living in resource-limited settings. In the Amazon basin, substantial numbers of NTDs are zoonotic, transmitted by vertebrate (dogs, bats, snakes) and invertebrate species (sand flies and triatomine insects). However, no dedicated consortia exist to find commonalities in the risk factors for or mitigations against bite-associated NTDs such as rabies, snake envenoming, Chagas disease and leishmaniasis in the region. The rapid expansion of COVID-19 has further reduced resources for NTDs, exacerbated health inequality and reiterated the need to raise awareness of NTDs related to bites. Methods The nine countries that make up the Amazon basin have been considered (Bolivia, Brazil, Colombia, Ecuador, French Guiana, Guyana, Peru, Surinam and Venezuela) in the formation of a new network. Results The Amazonian Tropical Bites Research Initiative (ATBRI) has been created, with the aim of creating transdisciplinary solutions to the problem of animal bites leading to disease in Amazonian communities. The ATBRI seeks to unify the currently disjointed approach to the control of bite-related neglected zoonoses across Latin America. Conclusions The coordination of different sectors and inclusion of all stakeholders will advance this field and generate evidence for policy-making, promoting governance and linkage across a One Health arena.
Background. With the 2020 target year for elimination of lymphatic filariasis (LF) approaching, there is an urgent need to assess how long mass drug administration (MDA) programs with annual ivermectin + albendazole (IA) or diethylcarbamazine + albendazole (DA) would still have to be continued, and how elimination can be accelerated. We addressed this using mathematical modeling. Methods. We used 3 structurally different mathematical models for LF transmission (EPIFIL, LYMFASIM, TRANSFIL) to simulate trends in microfilariae (mf) prevalence for a range of endemic settings, both for the current annual MDA strategy and alternative strategies, assessing the required duration to bring mf prevalence below the critical threshold of 1%. Results. Three annual MDA rounds with IA or DA and good coverage (≥65%) are sufficient to reach the threshold in settings that are currently at mf prevalence
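The threshold logic in these results can be illustrated with a deliberately simplified sketch. This is not EPIFIL, LYMFASIM, or TRANSFIL (which are structured transmission models); it assumes each well-covered annual MDA round removes a fixed fraction of microfilariae prevalence and counts the rounds needed to fall below the 1% threshold. The starting prevalences and the 60% per-round reduction are hypothetical values for illustration only.

```python
def rounds_to_threshold(mf_prevalence, reduction_per_round, threshold=0.01):
    """Count MDA rounds until mf prevalence drops below the threshold,
    assuming a fixed proportional reduction per round (a toy model)."""
    rounds = 0
    while mf_prevalence >= threshold:
        mf_prevalence *= (1 - reduction_per_round)
        rounds += 1
    return rounds

# Hypothetical 60% reduction per annual round, starting from 5% prevalence:
print(rounds_to_threshold(0.05, 0.60))  # prints 2
```

Under this toy rule, lower-prevalence settings reach the threshold in fewer rounds, which mirrors (but does not reproduce) the qualitative pattern reported by the three transmission models.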
Mathematical models are increasingly being used to evaluate strategies aiming to achieve the control or elimination of parasitic diseases. Recently, owing to the growing realisation that process-oriented models are useful for ecological forecasts only if the biological processes are well defined, attention has focused on data assimilation as a means to improve the predictive performance of these models.
As the World Health Organization seeks to eliminate trachoma by 2020, countries are beginning to control the transmission of trachomatous inflammation-follicular (TF) and discontinue mass drug administration (MDA) with oral azithromycin. We evaluated the effect of MDA discontinuation on TF1-9 prevalence at the district level. From the available data, we extracted districts with an impact survey at the end of their program cycle that initiated discontinuation of MDA (TF1-9 prevalence <5%) and a subsequent surveillance survey. TF1-9 prevalence increased by more than 5% from impact to surveillance survey in 9% of districts. Regression analysis indicated that impact survey TF1-9 prevalence was a significant predictor of surveillance survey TF1-9 prevalence. The proportion of simulations with >5% TF1-9 prevalence in the surveillance survey was 2%, assuming the survey was conducted 4 years after MDA. An increase in TF1-9 prevalence may represent disease resurgence but could also be due to measurement error. Improved diagnostic tests are crucial to the elimination of trachoma as a public health problem.
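The role of measurement error mentioned above can be illustrated with a small Monte Carlo sketch, not the study's actual simulation framework: repeatedly draw a binomial survey sample at a fixed true prevalence and count how often the observed prevalence alone exceeds the 5% threshold. The sample size, true prevalence, and simulation count are all hypothetical.

```python
import random

def prob_exceeds_threshold(true_prev, n_children, threshold=0.05,
                           n_sims=10_000, seed=1):
    """Fraction of simulated surveys whose observed prevalence exceeds
    the threshold purely through binomial sampling variation."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sims):
        positives = sum(rng.random() < true_prev for _ in range(n_children))
        if positives / n_children > threshold:
            exceed += 1
    return exceed / n_sims

# Hypothetical survey: 300 children examined, true TF prevalence 3%:
print(prob_exceeds_threshold(true_prev=0.03, n_children=300))
```

Even with a true prevalence comfortably below the threshold, a small but non-zero fraction of surveys will appear above 5%, which is why an observed increase cannot by itself distinguish resurgence from sampling noise.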
The World Health Organization recently launched its 2021-2030 roadmap, Ending the Neglect to Attain the Sustainable Development Goals, an updated call to arms to end the suffering caused by neglected tropical diseases. Modelling and quantitative analyses played a significant role in forming these latest goals. In this collection, we discuss the insights, the resulting recommendations and identified challenges of public health modelling for 13 of the target diseases: Chagas disease, dengue, gambiense human African trypanosomiasis (gHAT), lymphatic filariasis (LF), onchocerciasis, rabies, scabies, schistosomiasis, soil-transmitted helminthiases (STH), Taenia solium taeniasis/cysticercosis, trachoma, visceral leishmaniasis (VL) and yaws. This piece reflects the three cross-cutting themes identified across the collection, regarding the contribution that modelling can make to timelines, programme design, drug development and clinical trials.
Background Toxocara canis and Toxocara cati are intestinal parasites of dogs, cats and foxes, with infected animals shedding eggs of the parasite in their faeces. If humans accidentally ingest embryonated Toxocara spp. eggs from the environment, severe clinical consequences, including blindness and brain damage, can occur. Previous work has demonstrated the presence of Toxocara spp. eggs on vegetable produce grown in the UK, but only in small-scale community gardens. The aim of this study was to determine whether Toxocara spp. eggs are also present on vegetables grown on commercial farms in the UK, which supply produce to a greater number of people. Methods A total of 120 samples (300 g each) of spinach (Spinacia oleracea) were collected across four farms in the south of England, UK. The samples were processed using a sieving approach followed by multiplex quantitative polymerase chain reaction analysis. Results Overall, 23.0% of samples were positive for T. canis (28/120; 95% confidence interval 16.7–31.7%) and 1.7% for T. cati (2/120; 95% confidence interval 0.5–5.9%). There was a statistically significant difference in the number of positive samples between farms (P = 0.0064). To our knowledge, this is the first report of the isolation of Toxocara spp. from vegetables grown on commercial farms in the UK. Conclusions The results of this study highlight the requirement for the thorough washing of vegetables prior to their consumption, especially those such as spinach which may be eaten without first peeling or cooking, and effective farm biosecurity measures to minimise access to farmland by definitive host species of Toxocara spp.
Background: In this study, two traits related to resistance to gastrointestinal nematodes (GIN) were measured in 529 adult sheep: faecal egg count (FEC) and activity of immunoglobulin A in plasma (IgA). In dry years, FEC can be very low in semi-extensive systems, such as the one studied here, which makes identifying animals that are resistant or susceptible to infection a difficult task. A zero-inflated negative binomial (ZINB) model was used to calculate the extent of zero inflation for FEC; the model was extended to include information from the IgA responses. Results: In this dataset, 64% of animals had zero FEC, while the ZINB model suggested that 38% of sheep had not been recently infected with GIN. Therefore 26% of sheep were predicted to be infected animals with egg counts that were zero or below the detection limit, and likely to be relatively resistant to nematode infection. IgA activities of all animals were then used to decide which of the sheep with zero egg counts had been exposed and which had not been recently exposed. Animals with zero FEC and high IgA activity were considered resistant, while animals with zero FEC and low IgA activity were considered not recently infected. For the animals considered exposed to the infection, the correlations among the studied traits were estimated, and the influence of these traits on the discrimination between unexposed and infected animals was assessed. Conclusions: The model presented here improved the detection of infected animals with zero FEC. The correlations calculated here will be useful in the development of a reliable index of GIN resistance that could be of assistance for the study of host resistance in studies based on natural infection, especially in adult sheep, and also the design of breeding programs aimed at increasing resistance to parasites.
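The zero-decomposition idea behind the ZINB analysis above can be sketched in a few lines: an observed zero FEC is either a "structural" zero (animal not recently infected, with probability pi estimated by the model) or a "sampling" zero (an infected animal whose egg count fell at or below the detection limit). The two probabilities below are taken from the abstract; this is only the arithmetic of the decomposition, not the fitted model itself.

```python
def sampling_zero_fraction(p_zero_observed, pi_structural):
    """Fraction of all animals that are infected yet show zero FEC:
    the observed zero fraction minus the structural (uninfected) zeros."""
    return p_zero_observed - pi_structural

# From the abstract: 64% of sheep had zero FEC, and the ZINB fit
# attributed 38% of all sheep to the not-recently-infected class.
frac = sampling_zero_fraction(0.64, 0.38)
print(f"{frac:.2f}")  # prints 0.26, matching the 26% reported
```

It is exactly this 26% of infected-but-zero-FEC animals that the IgA measurements are then used to separate from the truly unexposed sheep.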