Navigating the double-edged sword of medical AI
“AI will most likely lead to the end of the world, but in the meantime, there’ll be great companies.” – Sam Altman (CEO, OpenAI)
The question, “Can artificial intelligence make healthcare more efficient?” is a hot topic of debate. If you haven’t heard the buzz about AI in healthcare, you must be living under a rock.
We published an infographic last week that examined AI’s applications within the industry, segmented into software, hardware, and services. Software currently leads the market with a 41% share, covering machine learning platforms, NLP tools, and deep learning frameworks, and this segment is expected to see the fastest growth.
Speaking of machine learning, AI is increasingly being consulted for medical advice. Chatbots and other AI tools have become popular sources of instant clinical guidance, often replacing traditional Google searches.
A report by auditing and consulting firm PwC on AI in healthcare caught my attention with its opening line: “One of AI’s biggest potential benefits is to help people stay healthy so they don’t need a doctor, or at least not as often.”
While AI can assist healthcare professionals in many areas, can it truly replace doctors, especially when it comes to seeking medical advice?
A recent survey by Deloitte (another auditing and consultancy firm) revealed that over half (57%) of UK AI users would trust generative AI to provide medical advice and guide them toward appropriate care.
Does this surprise you?
Another recent piece of research from healthcare media company Medscape UK showed mixed attitudes toward generative AI among UK doctors.
While 47% of the 745 surveyed physicians would not use generative AI to treat patients, 48% were open to the idea. The survey showed a more favourable attitude toward using AI for diagnostic support: 57% of doctors were enthusiastic about applications such as interpreting medical images, and half believed AI could help reduce medical and prescribing errors.
Nadia El-Awady, Editorial Director at Medscape UK, said: “What these results are telling us is that doctors have the same concerns that we all have about AI in relation to privacy, misinformation and the need for regulation. Our research showed that half of those surveyed felt the use of AI could reduce the risk of medical error and can also be a useful information source for physicians themselves, both of which present significant benefits to patients.”
However, the same survey also pointed out that 86% of the surveyed physicians were concerned that patients using generative AI for medical advice might receive misinformation, and 82% worry that patients could take AI-generated self-diagnoses more seriously than their doctor’s guidance.
Another study published in the American Journal of Preventive Medicine found that large language models (LLMs) are not yet reliable enough to replace human medical professionals, highlighting the need for caution and scepticism regarding medical information from AI sources.
While AI has transformed healthcare by streamlining hospital operations, improving efficiency, addressing workforce challenges, and facilitating precision treatments, it is crucial to acknowledge its limitations, as these tools can sometimes generate inaccurate advice. LLMs are particularly prone to ‘hallucinations’, where the model outputs grammatically and syntactically correct but factually wrong and misleading information.
These errors stem from limitations or faults in the model’s training, gaps in the underlying dataset, and the complexities of natural language.
Consequently, providers need to be aware of the clinical and legal risks that direct patient-facing engagement with AI tools can introduce.
AI is here to stay, and its integration into healthcare presents valuable investment opportunities. However, it is essential to engage in discussions about regulation and responsible use, especially when it comes to medical advice, as AI cannot replace the critical role of doctors in providing accurate care.
We would welcome your thoughts on this story. Email your views to Rakshitha Narasimhan or call 0207 183 3779.