
Breaking the bias in healthcare AI

EMJ GOLD
If you think it’s expensive to eliminate bias in healthcare AI algorithms, imagine the cost if you don’t. How can pharma work to remove bias so patients are treated equally by AI tools?
Words by Saša Janković

With key strategic partnerships accelerating market growth, innovation in artificial intelligence is improving various areas of healthcare, including surgery, radiology and pathology, as well as tackling challenges in product development, improving supply chains and designing smarter clinical trials – and that’s only scratching the surface.

Increasing investment and development in AI saw the healthcare AI market valued at $7.68bn in 2021 – according to GlobalData’s most recent ‘AI in Healthcare’ market intelligence report – with analysts predicting a CAGR of 39.05% from 2022 to 2027.

Inputs matter

While AI offers great potential in the pharmaceutical industry, it also has a concerning downside: because AI algorithms learn from human data, there is a risk that bias can be introduced into these models. “[Although] the results created by an AI model can be considered impartial or objective, those results are entirely based on the input data used to train the model,” says Shehmir Javaid, Industry Analyst, AIMultiple. “This means that the person collecting the data to train the model may inadvertently transfer their bias to the dataset.”

We seek to ensure our use of AI is sensitive to social, socio-geographic and socio-economic issues

It’s a well-known fact that AI is only as good as the data it’s based on, and, in many cases, the influence of bias in this data – whether ethnic, gender-based, socio-economic, linguistic or otherwise – can have detrimental effects. Ramakant Vempati, Founder, Wysa – a global AI-driven mental health support tool – says a commonly cited example is training a face recognition algorithm only with people of certain ethnicities. This can result in “bias in image recognition algorithms for radiology applications, for example, leading to erroneous conclusions or diagnoses that can have serious consequences – ethically as well as clinically”, he says.
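By way of a rough illustration of that point, the sketch below checks whether the demographic make-up of a training dataset matches the population a tool is meant to serve – a first-pass test that can be run before any model is trained. The column name, group labels and reference shares are all assumptions made for the example, not details drawn from any of the companies mentioned.

```python
import pandas as pd

# Hypothetical training set with a single demographic attribute; in practice
# this would come from imaging archives or clinical trial enrolment records.
train = pd.DataFrame({
    "ethnic_group": ["A"] * 800 + ["B"] * 150 + ["C"] * 50,
})

# Assumed share of each group in the patient population the tool will serve.
population_share = {"A": 0.60, "B": 0.25, "C": 0.15}

dataset_share = train["ethnic_group"].value_counts(normalize=True)

# Flag groups that are clearly under-represented relative to the population.
for group, expected in population_share.items():
    observed = dataset_share.get(group, 0.0)
    status = " <-- under-represented" if observed < 0.8 * expected else ""
    print(f"Group {group}: {observed:.0%} of training data vs {expected:.0%} of population{status}")
```

A gap flagged at this stage is far cheaper to fix by collecting more representative data than by discovering skewed diagnoses after deployment.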

Mitigating AI bias

Thankfully, most companies recognise the risk of bias in healthcare AI, and some are already developing tools to mitigate this. The Mayo Clinic, for example, has developed Platform_Validate – a tool that confirms the efficacy and credibility of newly developed algorithms to ensure they are fit for their intended purpose. It reports on how an AI algorithm performs under different constraints, including racial, gender and socio-economic demographics. Pharma companies can follow this lead by “ensuring that their clinical trials provide sufficient data to cover each segment of the population”, advises Reeve Benaron, Founder and Co-CEO, Intrivo – a US health technology company.
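The article does not describe how Platform_Validate works internally, but a minimal sketch of the general idea – reporting how an algorithm performs for each demographic segment of a hold-out set, rather than only in aggregate – might look like the following. The data, group labels and review threshold are purely illustrative assumptions.

```python
import pandas as pd

# Hypothetical hold-out results: model predictions alongside ground truth and
# a demographic attribute for each patient.
results = pd.DataFrame({
    "group":      ["A", "A", "A", "B", "B", "C", "C", "C", "C", "C"],
    "prediction": [1,   0,   1,   0,   1,   1,   0,   0,   1,   0],
    "label":      [1,   0,   1,   1,   1,   0,   0,   1,   1,   0],
})

overall = (results["prediction"] == results["label"]).mean()
print(f"Overall accuracy: {overall:.0%}")

# Accuracy stratified by demographic group.
per_group = (
    results.assign(correct=results["prediction"] == results["label"])
           .groupby("group")["correct"]
           .mean()
)

# Flag segments whose accuracy falls well below the overall figure, a crude
# signal that the algorithm may not be fit for purpose for that group.
for group, acc in per_group.items():
    flag = " <-- review" if acc < overall - 0.10 else ""
    print(f"Group {group}: {acc:.0%}{flag}")
```

The point of this kind of report is that a strong headline accuracy can hide a segment for which the model is effectively unreliable, which is exactly the failure mode demographic stratification is meant to expose.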

AstraZeneca is applying AI in its discovery and development process from target identification to clinical trials, with Jim Weatherall, Vice President, Data Science, Artificial Intelligence and R&D, AstraZeneca, saying: “We seek to ensure our use of AI is sensitive to social, socio-geographic and socio-economic issues, and protect against discrimination or negative bias to the best of our ability. We monitor our AI systems to maintain fairness throughout their lifecycle [and] will continually adapt and improve our AI systems and training methods to drive inclusiveness.”

This can be easier said than done, according to Vempati, who warns: “Clinical data is hard, expensive and time consuming to obtain. The cost of creating datasets increases exponentially with the size of the dataset, yet the cost of not doing so is, in a sense, higher.”

Nevertheless, smart and cost-effective solutions will need to be sourced and implemented. As AI’s prevalence in pharma increases, eliminating bias will become ever more important to safeguard clinical and ethical integrity and deliver optimal outcomes for patients.
