AI in preclinical development
The primary goal of preclinical drug development is to gather sufficient data on a drug’s efficacy and safety, allowing for optimization of formulation and dosage. The data collected during this stage is used to support the design and conduct of clinical trials. Incorporating artificial intelligence (AI) and machine learning (ML) into preclinical drug discovery offers numerous advantages, confirms Christian Baber, Head of Scientific & Pharmaceutical Data, Informatics & Systems at The Janssen Pharmaceutical Companies of Johnson & Johnson. “In its most simple format, data can be used to describe what is happening, basically fitting a curve to a number of data points. The next step is to use data in diagnostics – why do we see certain trends, what is the cause of outliers, why did experiments fail – all in the framework of quality assessment.”
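Baber’s describe-and-diagnose framing can be made concrete with a toy example. The sketch below uses invented numbers and a deliberately simple straight-line fit – it illustrates the idea only, not any actual Janssen workflow: fitting a curve describes the trend, and residuals from that curve flag candidate outliers for diagnosis.

```python
import numpy as np

# Hypothetical assay readouts (invented numbers for illustration only).
dose = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
response = np.array([0.1, 0.9, 2.1, 2.9, 4.2, 4.8])

# Descriptive: fit a simple curve (here a straight line) to the points.
slope, intercept = np.polyfit(dose, response, deg=1)

# Diagnostic: flag outliers as points that sit far from the fitted trend.
predicted = np.polyval([slope, intercept], dose)
residuals = response - predicted
outliers = np.abs(residuals) > 2 * residuals.std()

print(f"fitted trend: response = {slope:.2f} * dose + {intercept:.2f}")
print("outlier flags:", outliers)
```

In real preclinical pipelines the “curve” is rarely a line and the diagnostics are far richer, but the two-step pattern – summarize the trend, then interrogate deviations from it – is the same.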
“Using data, we can also predict what will happen in experiments. By relying on a digital twin, we can model real life events, enabling processes to go faster.” – Christian Baber
However, the true value of AI-ML exceeds descriptive or diagnostic applications. “Using data, we can also predict what will happen in experiments. By relying on a digital twin, we can model real-life events, enabling processes to go faster. Going one step further, data can even have a prescriptive role – informing scientists of what to do, which experiments to conduct and which conditions to study.” AI-ML has the potential to revolutionize preclinical drug development by complementing routine strategies such as pharmacokinetic and pharmacodynamic studies and in vitro/in vivo toxicity testing. This will, among other benefits, enable faster and more accurate prediction of drug safety and optimal formulation.
Reducing animal experiments
Throughout history, animal models have played an important part in the preclinical drug development process. While animal testing remains crucial in certain studies, there are many aspects of preclinical research that can be performed without them, or even improved with modern alternatives that more accurately replicate human anatomy, physiology, and diversity. Mathieu Vinken is professor of in vitro and mechanistic toxicology at the VUB and a strong advocate for combining innovative in vitro methods, such as organoids or organ-on-a-chip systems, with computer modeling to assess the safety of drugs. “In many cases, there is a wealth of toxicological data on a chemical substance already available from earlier experiments,” Vinken says. “AI-ML is a critical asset in the endeavor to make sense of this information and extract useful insights.”
“With the ONTOX consortium, we combine cutting-edge AI with toxicological, clinical, mechanistic, and kinetic data to generate computational systems that enable human risk assessment of chemicals without using animal models.” – Mathieu Vinken
Vinken is also coordinator of ONTOX, an international, interdisciplinary, and intersectoral five-year project supported by the European Commission aiming to develop advanced animal-free techniques for assessing the safety of chemical substances. “With the ONTOX consortium, we combine cutting-edge AI with toxicological, clinical, mechanistic, and kinetic data to generate computational systems that enable human risk assessment of chemicals without using animal models,” explains Vinken. ONTOX is part of the ASPIS cluster, which represents the most extensive public research funding in Europe aimed at promoting chemical safety assessment without the use of animal testing. “Our combined efforts to optimize in vitro and in silico tools will lead to proofs-of-concept of innovative NAMs [New Approach Methodologies, ed.] that can be used as a blueprint by any other research group, thereby generating true impact.”
A means to an end, not an end in itself
At present, there is a lot of buzz and excitement surrounding AI, but this does not mean that physical experimentation will become obsolete. “We need to evaluate which tool – be it AI-ML, traditional modeling and simulation, or physical experimentation – is best suited for each specific application,” Baber states. “Moreover, our experience has shown that the best outcomes are achieved by using a combination of methods.” In addition, real-life data is needed to feed algorithms, and will therefore remain an important part of the development process. “For optimal model training, you need sufficient, high-quality data. The ONTOX partnership with the pharmaceutical industry and clinical centers is therefore essential to our project,” Vinken says. The prescriptive character of AI-ML can even be used to recommend targeted in vitro or in silico tests to close data gaps, helping researchers proceed efficiently with their research strategy.
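This prescriptive, gap-closing idea is closely related to what the machine-learning literature calls active learning: recommend the untested condition where the model is least certain. The sketch below is a rough illustration with invented numbers and a deliberately simple bootstrap ensemble of linear fits – not an ONTOX or Janssen method – where disagreement between ensemble members stands in for model uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: toxicity readouts exist at low and high concentrations,
# leaving a gap in the middle (all numbers invented for illustration).
measured_x = np.array([0.5, 1.0, 1.5, 4.0, 4.5, 5.0])
measured_y = np.array([0.3, 0.6, 0.9, 2.0, 2.3, 2.6])

# Untested concentrations inside the data gap.
candidates = np.array([2.0, 2.5, 3.0, 3.5])

# Bootstrap an ensemble of simple linear fits; disagreement between
# ensemble members serves as a crude uncertainty estimate.
preds = []
for _ in range(200):
    idx = rng.integers(0, len(measured_x), len(measured_x))
    coeffs = np.polyfit(measured_x[idx], measured_y[idx], deg=1)
    preds.append(np.polyval(coeffs, candidates))
uncertainty = np.array(preds).std(axis=0)

# Prescriptive step: recommend the experiment where the model
# is least certain.
recommended = candidates[np.argmax(uncertainty)]
print(f"suggested next measurement: concentration {recommended}")
```

Real prescriptive systems would weigh cost, feasibility, and mechanistic relevance alongside raw uncertainty, but the loop is the same: model, quantify what you do not know, and let that drive the next experiment.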
Although some scientists may be concerned that AI will replace their work, Baber believes such fears are unfounded: “AI cannot replicate the cognitive and creative abilities of human researchers. In silico work is just another type of experiment, helping our researchers to make optimal decisions.” Moreover, data-based computer modeling isn’t without its flaws. “We need critical human minds to continuously challenge the system,” Vinken agrees. “On top of that, data accommodation, management, and regulation need to be considered and evaluated at all times.”
Digital transformation does, however, represent a change in our standard way of working and in the culture of companies and institutions. Although AI-ML will not take people’s jobs in the short term, it will alter them, sometimes even fundamentally. To make the most of these new tools, researchers will need to learn and adapt to the changing times. The results will be well worth it: a reduced reliance on animal models combined with improved drug development.