And so started the journey that – five years later – would result in the back-to-back publication this writing refers to.
Inactivating or neutralizing antibiotics before they reach the large intestine, where most of the gut microbiota resides, is challenging: the goal is to prevent antibiotic-induced disruption of the gut microbiota without affecting the pharmacokinetics of the antibiotics themselves. Once a working concept is in hand, demonstrating it in hamster models is not too difficult. Proving the concept in healthy human volunteers (prevention of antibiotic-induced microbiota changes while maintaining pharmacokinetic properties) is also quite doable.
To many this may sound like sufficient reason to use such a compound whenever antibiotics are needed. However, to bring it to market one must actually prove (in a so-called phase-III trial) that it avoids not just “damage to a biomarker” (i.e. the microbiota) but actually improves health, meaning that it prevents a defined disease state. In this case, that arguably means demonstrating a reduction in the incidence of Clostridioides difficile infection (CDI).
C. difficile is a Gram-positive, spore-forming bacterium that frequently resides in the human intestine. About 10% of adults carry C. difficile upon admission to the hospital, and others may acquire it during their stay. C. difficile is resistant to many frequently used antibiotics, so disruption of the microbiota by antibiotic treatment can result in increased abundance of C. difficile in the gut. Its toxins subsequently cause profound harm to the intestinal epithelium, resulting in C. difficile-associated diarrhea that can be severe and even fatal. It is one of the most important healthcare-associated infections and the number one cause of infectious diarrhea, associated with substantial morbidity and healthcare costs due to the need for isolation and prolonged hospital stays. Put like this, it sounds like an easy job to demonstrate the efficacy of a drug that prevents all this misery. But here come the numbers.
In the currently published ANTICIPATE study (“AssessmeNT of the Incidence of Clostridium difficile Infections in hospitalized Patients on Antibiotic TrEatment”), the incidence of CDI in hospitalized patients over 50 years of age in Europe was 1.9% within 90 days after starting broad-spectrum antibiotics. Designing a trial to demonstrate a putative 50% reduction of this incidence would require over 3,000 participants per intervention arm, each followed for 3 months – an expensive undertaking.
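For illustration, the order of magnitude behind that figure can be sketched with the classical two-proportion sample-size formula. The α level and power below are assumptions for this sketch (the study text does not state the exact design parameters); with a two-sided α of 0.05 and 90% power, the formula roughly reproduces the "over 3,000 per arm" estimate:

```python
from math import sqrt, ceil
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.9):
    """Classical two-proportion sample-size formula: participants per arm
    needed to detect a drop in event rate from p1 to p2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_b = NormalDist().inv_cdf(power)          # quantile for the desired power
    p_bar = (p1 + p2) / 2                      # pooled proportion under H0
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

# 1.9% baseline incidence, halved to 0.95% by the intervention
print(n_per_arm(0.019, 0.0095))  # → 3269 per arm, in line with "over 3,000"
```

The formula also makes the enrichment logic below tangible: because the required sample size scales roughly inversely with the baseline event rate, raising the incidence in the enrolled population shrinks the trial dramatically.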
So here is the challenge: to make a prevention strategy available to all, we are required to demonstrate an impact on C. difficile infection, which – and this may sound strange given the described large public health impact – is a relatively rare outcome. So how can we make sure that such a trial is feasible? Would it be possible to enrich the trial population with high-risk individuals so as to decrease its sample size?
In our study we tested five distinct features (one clinical parameter and four biomarkers) for their usefulness as trial enrichment factors. The first, a very simple approach, was to select only specific antibiotic classes. We found that patients treated with carbapenems had a 90-day CDI incidence of 8%, an incidence that would make a trial feasible. However, only 6% of patients received carbapenems, so it may be difficult to recruit enough patients for the trial. Another, less successful, parameter was the 3-indoxyl sulphate level in urine. This molecule is indirectly derived from certain gut bacteria, and its excretion in urine could reflect intestinal microbiota diversity. However, prior to antibiotic treatment there were simply too few patients with intermediate 3-indoxyl sulphate levels and none with low levels, so this parameter is not relevant for trial enrichment. The third feature of interest was carriage of toxigenic C. difficile, determined by PCR testing at enrolment. This was quite predictive, with a 90-day CDI incidence of 11% among carriers. However, only a small fraction of the patient population (4.5%) were carriers. Imagine having to run a rapid PCR test on 22 patients in order to find one suitable candidate for the trial – that would not improve the feasibility of the trial, let alone the enthusiasm of the researchers.
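The trade-off these two features illustrate can be summarized with some back-of-the-envelope screening arithmetic (the percentages are taken from the text; the dictionary layout is purely illustrative):

```python
# Enrichment features from the text: what fraction of patients qualify
# (prevalence) and what CDI incidence the selected subgroup has.
features = {
    "carbapenem use":         {"prevalence": 0.06,  "cdi_incidence": 0.08},
    "toxigenic C. diff (PCR)": {"prevalence": 0.045, "cdi_incidence": 0.11},
}

for name, f in features.items():
    # Patients screened, on average, to find one eligible candidate
    screened_per_enrolled = 1 / f["prevalence"]
    print(f"{name}: screen ~{screened_per_enrolled:.0f} patients per candidate, "
          f"enriched CDI risk {f['cdi_incidence']:.0%}")
```

The 1/0.045 ≈ 22 figure is exactly the "test 22 patients to find one candidate" burden described above: a feature can be strongly predictive yet impractical if too few patients carry it.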
Our fourth parameter was the gut microbiota α-diversity level (i.e. the diversity within a given sample) before the onset of antibiotic treatment: we found that Shannon index values below 2.586 and Inverse Simpson index values below 7.674 increased the probability of developing CDI within 90 days by factors of 10.2 and 7.4, respectively. Our fifth and final parameter of interest was the gut microbiota composition. In a parallel publication from this study, led by the University of Antwerp (see also their blog post), we describe that several operational taxonomic units (OTUs), mainly representing species from the class Clostridia and the phylum Bacteroidetes, were negatively associated with CDI, whereas one OTU representing an Enterococcus species was positively associated with CDI. From this we were able to define a relatively simple rule for identifying patients at increased risk of CDI, and to validate it in an independent dataset. Patients whose relative abundance of an Enterococcus OTU is at least 8.5 times higher than that of a Ruminococcus OTU – 17% of our study population – have a CDI risk close to 6%. This would make a trial quite feasible. The main challenge now is to develop one of these microbiota-based biomarkers into a rapid and affordable test so that it can serve as an inclusion criterion for trials.
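For readers less familiar with these indices: both are simple functions of taxon proportions, so the thresholds above translate directly into code. The sketch below uses the standard definitions (Shannon with natural log, Inverse Simpson as 1/Σp²); the toy counts and the `high_risk` helper are illustrative, not taken from the study data:

```python
from math import log

def shannon(counts):
    """Shannon diversity index H = -sum(p * ln p) from raw abundance counts."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in ps)

def inverse_simpson(counts):
    """Inverse Simpson diversity index 1 / sum(p^2) from raw abundance counts."""
    total = sum(counts)
    return 1 / sum((c / total) ** 2 for c in counts)

def high_risk(entero_rel, rumino_rel, threshold=8.5):
    """Illustrative version of the ratio rule: Enterococcus relative abundance
    at least `threshold` times that of Ruminococcus flags elevated CDI risk."""
    return entero_rel >= threshold * rumino_rel

# Toy sample dominated by one taxon: both indices fall below the thresholds
counts = [900, 50, 25, 15, 10]
print(shannon(counts) < 2.586, inverse_simpson(counts) < 7.674)  # True True
```

Low values of either index mean the community is dominated by few taxa, which is exactly the state left behind by microbiota-disrupting antibiotics.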
There is no doubt that this knowledge is extremely helpful to drug developers embarked on the long journey to register novel microbiome protectors, vaccines, microbiome replenishers, hygiene measures or any other smart approach to combat C. difficile infection or the other consequences of microbiota disruption. For Da Volterra, this means that the phase-III trial should certainly be performed in patients who generally have a substantially higher incidence of CDI: patients with hematologic malignancies. Although not anticipated five years ago, recent suggestions of collateral benefits of microbiome-conserving strategies in this population, together with the results of ANTICIPATE, have convinced us that this is the right population for the phase-III trial. In parallel, building on the results of this study, tools will be developed to improve the feasibility of trials in lower-risk populations. Beyond the scientific discoveries, this too is a success of the COMBACTE-NET consortium that makes us proud of what we have achieved so far.