The pharmaceutical business is one of the riskiest industries to venture into. Drug discovery is an artisanal process: a carefully designed drug takes about 10 years and approximately 2.5 billion dollars to be approved and launched onto the market. The complexity of biological systems puts the failure rate at a staggering 90%. In recent years, the declining efficiency of R&D efforts has kept the pharma industry on its toes.
In the past decade, Artificial Intelligence (AI) has already revolutionized several industries, including automotive, entertainment, and fintech. AI dictates routes and ETAs on Google Maps, executes stock-exchange transactions, enables facial recognition, and powers the voice assistants Siri and Alexa. However, the adoption of AI in pharma has been restricted by the limited data available about what works (the successful 10%) and the innate complexity of the drug discovery process.
The sudden hype of AI in pharma
The pharma industry has been using elements of machine learning (a type of AI algorithm) in drug discovery R&D for at least two decades. The most common software used by medicinal chemists, Schrödinger's suite, has offered regression analysis for quite some time. So, what's new?
The recent widespread interest has been fueled by breakthroughs in neural networks, beginning with 'AlexNet' in 2012. AlexNet, a convolutional neural network trained with supervised learning, could classify images with outstanding precision. Before long, these algorithms found their way into chemistry to classify drug molecules. The subsequent publication of Generative Adversarial Networks (GANs), often referred to as 'creative AI', in 2014, combined with reinforcement learning, made it possible to generate novel molecular entities with a desired set of pharmacological properties.
By 2018, Natural Language Processing (NLP) and computer vision algorithms, which can generate insights by crawling through millions of papers, patents, grants, clinical-trial records, and more, had matured enough to make sense of vast amounts of fragmented data. Simultaneously, advances in -omics and other high-throughput techniques generated the big data that allowed pharma to use AI efficiently.
These advanced methods can be trained to predict or generate novel chemical structures with desired pharmacological properties, learn systems biology to identify new targets and biomarkers, predict toxicity, and support many other applications in drug discovery and development.
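As a toy illustration of the "predict properties" side, a property classifier can be sketched as logistic regression over binary molecular fingerprints. Everything below — the 4-bit fingerprints, the labels, and the hyperparameters — is a hypothetical stand-in; real pipelines use learned representations or cheminformatics fingerprints trained on far larger datasets.

```python
import math

def predict(weights, bias, fp):
    """Probability that a molecule (binary fingerprint fp) is 'active'."""
    z = bias + sum(w * x for w, x in zip(weights, fp))
    return 1.0 / (1.0 + math.exp(-z))

def train(data, n_bits, epochs=200, lr=0.5):
    """Plain SGD on the logistic log-loss; no regularization, toy scale."""
    weights, bias = [0.0] * n_bits, 0.0
    for _ in range(epochs):
        for fp, label in data:
            err = predict(weights, bias, fp) - label  # gradient of loss w.r.t. z
            bias -= lr * err
            weights = [w - lr * err * x for w, x in zip(weights, fp)]
    return weights, bias

# Hypothetical training set: bit 0 marks the 'active' pharmacophore.
data = [
    ([1, 0, 1, 0], 1), ([1, 1, 0, 0], 1),
    ([0, 1, 0, 1], 0), ([0, 0, 1, 1], 0),
]
weights, bias = train(data, n_bits=4)
print(predict(weights, bias, [1, 0, 0, 0]))  # > 0.5: carries the active bit
print(predict(weights, bias, [0, 0, 0, 1]))  # < 0.5: lacks it
```

The same shape of model, scaled up and swapped for deep architectures, underlies the classification work that followed AlexNet into chemistry.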
The conundrum of data sharing
Seeing the promise as real, numerous pharma companies have since forayed into AI by partnering with 'AI-specialized start-ups' to explore the process. The big pharma companies, having deep pockets, have also started investing heavily in developing their internal AI capabilities for the long term, rather than completely outsourcing the work to the numerous specialized start-ups. This trend is the opposite of what we observed with the contract research organisation (CRO) industry, which grew by building scale for the services that pharma companies and biotechs wanted to outsource. We need to understand why.
AI works only as well as the data it is fed, and the pharma companies own that data
It is extremely important to integrate a vast variety of datasets and create structured data lakes from sanitized data, so that deep learning can reasonably be applied to make sense of it. This can be a daunting, transformational task, especially for the pharma industry, which deals with a gargantuan amount of diverse data, ranging from billions of molecules to large omics datasets, and from in vitro and animal testing data to clinical trial data.
“It
took us about 18 months only to integrate data across 11 different data systems
to actually build the databank to actually then start applying the artificial
intelligence…We were so focused on the endpoint, in terms of the FDA or the
EMA, that we never collected the 80% of the data that could be pulled out from
CROs, like imaging datasets, biomarker data etc.”, says Vas Narasimhan, CEO of
Novartis.
There is a rising sentiment about organizing data in a structured manner and sharing it with the community to improve the overall process of drug discovery while still protecting IP. COVID-19 got the pharma industry sharing data and coordinating at unprecedented levels to get treatments and vaccines to market as early as possible. It will be interesting to observe whether this trend survives or withers away like the leaves of autumn.
“Our
philosophy is truly that all boats rise with the tide.., we think that there’s
other ways that you can create competitive differentiation around your
algorithm, quality of the data, the analysts, the companies you partner with”,
says Lee Lehman-Becker, Senior Director for Digital and Personalized Health
Care Partnering at Roche.
The conundrum is that without exclusive data, pharma companies might face unnecessary external competition and risk the expected upside of the process. It is difficult to ascertain whether data sharing will become common practice or whether pharma will remain cautious; either way, the outcome will determine the future of many 'niche' AI drug discovery start-ups. It is likely that only the AI start-ups that distinguish themselves with robust algorithms and clean, therapeutically valuable datasets will see the light of day.
AI productivity in pharma
A recent study, 'The upside of being a digital pharma player', compares 21 big pharma companies by their internal and external AI projects, publications and patents mentioning AI, investments in AI start-ups, and consortiums and alliances between 2014 and 2018. Many companies are showing early and clear signs of leadership in digitalization. For example, Novartis has shown leadership in adopting AI with a patient-centric approach, and AstraZeneca has published very impressive results in generative chemistry.
The study also cited peers in the industry that are doing less than expected given their total revenues relative to their state of AI adoption. "A lack of coordinated strategy in many companies has made AI adoption look like a re-branding exercise and has not yielded the desired results", says Dr. Alex Zhavoronkov, co-founder and CEO of Insilico Medicine, one of the top 10 companies leading the way in applications of AI to drug discovery. One of the authors of the study, Alexander Schuhmacher, adds in an interview with Dr. Zhavoronkov that "AI is still not a part of core strategies for some of the leading companies".
Structuring and sanitizing data is the obvious place to begin, but it is equally imperative to understand the uniqueness and robustness of the algorithms and models developed to solve these increasingly complex problems.
Not all AI is created equal. It takes a delicate interplay between chemists, biologists, and data scientists to come up with algorithms that work effectively. "Most AI conversations with pharma begin with questions like 'How is your platform distinct from others?', even from some heads of digital/AI departments", continues Dr. Zhavoronkov.
Different AI models may be needed for the diverse types of applications within even a single function. For example, optimizing a lead molecule in drug discovery requires simultaneously tuning selectivity, toxicity, blood-brain-barrier penetration, and many more parameters. Building a robust AI model that optimizes multiple parameters simultaneously is a feat only a few groups in big pharma and a few start-ups have achieved.
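The multi-parameter balancing act described above can be sketched as a weighted desirability score: each property is mapped into [0, 1] against a target window, and the per-property scores are combined with a weighted geometric mean, so that a single bad liability drags the whole candidate down. All of the windows, weights, and property values below are hypothetical placeholders, not real medicinal-chemistry guidelines.

```python
def desirability(value, low, high):
    """Map a raw property value into [0, 1]: 1 inside the target window,
    falling off linearly outside it."""
    if low <= value <= high:
        return 1.0
    span = high - low
    dist = (low - value) if value < low else (value - high)
    return max(0.0, 1.0 - dist / span)

# Target windows per parameter (illustrative only).
TARGETS = {
    "selectivity": (100.0, 1000.0),     # fold-selectivity over off-target
    "toxicity_ic50_um": (50.0, 500.0),  # higher IC50 = less cytotoxic
    "bbb_logps": (-2.0, 0.0),           # blood-brain-barrier permeability
}

def score(candidate, weights):
    # Weighted geometric mean: one near-zero desirability tanks the total,
    # mirroring how a single liability can kill a lead series.
    total, wsum = 1.0, sum(weights.values())
    for prop, (low, high) in TARGETS.items():
        d = desirability(candidate[prop], low, high)
        total *= d ** (weights[prop] / wsum)
    return total

candidates = {
    "mol_A": {"selectivity": 300, "toxicity_ic50_um": 120, "bbb_logps": -1.0},
    "mol_B": {"selectivity": 800, "toxicity_ic50_um": 10,  "bbb_logps": -0.5},
}
weights = {"selectivity": 1.0, "toxicity_ic50_um": 2.0, "bbb_logps": 1.0}
best = max(candidates, key=lambda name: score(candidates[name], weights))
print(best)  # mol_A: mol_B's toxicity falls outside its target window
```

In practice the "predicted property" inputs come from separate models, which is exactly why the interplay between chemists, biologists, and data scientists matters: the scoring scheme is only as sensible as the windows and weights the domain experts set.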
Progress in big pharma has been much slower than in academia or the start-up space in terms of developing robust machine- and deep-learning algorithms; however, big pharma is slowly catching up.
Other very important factors affecting AI productivity are leadership and culture. Pharma companies are traditionally accustomed to slow turnaround cycles, taking 5-10 years to launch one drug and multiple years just to finish a clinical trial. In the digital world, you must act like a tech company and get comfortable with rapid-cycle innovation and rapid failure.
The transformation drive has to take a top-down approach, in which the leadership understands what benefits AI can provide in each vertical and creates a new AI unit that can talk across all of those verticals. "When your data science efforts are primarily led by business leadership, it certainly adds to the data science efficiency. It is essential to find leadership that has expertise in multiple domains and can adapt to the data science domain fairly quickly to construct an economically viable model", says Dr. Bülent Kiziltan, an accomplished AI executive working at a stealth-mode AI start-up.
Since the big pharma companies will not be able to accomplish such transformational changes in the short term, they will likely continue to partner with, acquire, or invest in start-ups, as they do not want to miss the opportunity. In contrast, small biotechs and pharma companies, with much less data and relatively high flexibility, may be able to bend their strategies to incorporate AI tools across the value chain simply by partnering with the right start-ups. Whatever the size of your company, AI is likely to impact at least one vertical of your value chain. It is imperative to begin your AI journey as early as possible.