News

4 examples of AI in present-day healthcare

Over recent years there has certainly been hype around artificial intelligence, along with questions over whether the reality of the technology meets those expectations.

As part of our Health Tech Trends Series, sponsored by InterSystems, we tasked our reporter Matt White with exploring four examples of how artificial intelligence is being used to support healthcare teams today.

AI in Radiotherapy

Addenbrooke’s hospital in Cambridge has utilised AI technology in administering radiotherapy to cancer patients. The technology supports doctors who would previously study hundreds of images to calculate precisely where to target radiotherapy beams.

Raj Jena, a neuro-oncologist at Addenbrooke’s, says of the procedure: “Until we define where the tumour is and have defined the healthy tissues we want to protect, we cannot start the treatment. This is the bottleneck. The quicker you get this done, the quicker you can get the patient into treatment.”

With the implementation of AI into the procedure, the process of identifying and marking out borders for radiotherapy beams can take just minutes. A Microsoft system called InnerEye is able to automatically mark up scanned images of cancerous tissue.

For prostate cancer patients, InnerEye outlines the prostate, creates a 3D model and sends the information back. As is typical of narrow, task-specific AI, the algorithm was trained on a large set of images from past patients that had already been marked up by consultants.
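At its core, automated contouring of this kind is a per-voxel labelling problem: every point in the scan is marked as target tissue or not. The toy Python sketch below illustrates the idea with simple intensity thresholding on a synthetic scan; this is purely illustrative and is not InnerEye's actual method, which relies on trained neural networks rather than a fixed threshold.

```python
import numpy as np

def auto_contour(scan: np.ndarray, threshold: float) -> np.ndarray:
    """Toy stand-in for automated contouring: label every voxel whose
    intensity exceeds a threshold as part of the target structure."""
    return scan > threshold

# A tiny synthetic 2D "scan": a bright region in the centre, dark elsewhere.
scan = np.zeros((5, 5))
scan[1:4, 1:4] = 0.9

mask = auto_contour(scan, threshold=0.5)
print(int(mask.sum()))  # 9 voxels fall inside the contour
```

The clinical system's output is analogous to `mask`: a marked-up region a consultant can review and adjust before treatment planning begins.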

AI & Heart Disease

HeartFlow’s AI technology has been adopted by the NHS to better tackle coronary heart disease and is being used at 13 NHS hospitals across the UK.

According to HeartFlow, many patients undergo unnecessary invasive testing because non-invasive diagnostic tests provide inadequate information about the cause of chest pain symptoms.

“More than half of patients who undergo these invasive tests have no significant blockage. AI enables clinicians to identify significant coronary artery disease and determine the optimal pathway.”

The technology works by analysing CT scan results, the method commonly used across healthcare to identify blockages in and around the heart. The CT scan data is uploaded to HeartFlow, whose software analyses it and delivers the results via its own interface.

A 3D model of the heart and arteries is created, within which millions of complex equations are solved to assess blood flow through the vessels. The model is colour-coded to further assist clinicians in identifying blockages.
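The colour-coding step can be pictured as mapping a computed flow value for each segment of the artery model to a display colour. The sketch below does this for a toy model; the thresholds and segment names are invented for illustration and are not HeartFlow's clinical cut-offs.

```python
def colour_code(flow_ratio: float) -> str:
    """Map a simulated flow ratio (1.0 = unimpeded flow) to a display
    colour. Thresholds here are illustrative, not clinical values."""
    if flow_ratio > 0.8:
        return "green"   # flow largely unimpeded
    if flow_ratio > 0.5:
        return "amber"   # moderate restriction
    return "red"         # severe restriction, likely blockage

# Colour each segment of a toy artery model (hypothetical values).
segments = {"proximal": 0.95, "mid": 0.72, "distal": 0.40}
colours = {name: colour_code(ratio) for name, ratio in segments.items()}
print(colours)  # {'proximal': 'green', 'mid': 'amber', 'distal': 'red'}
```

In the real system the flow values come from fluid-dynamics simulation over the patient-specific 3D model; only the final presentation step resembles this lookup.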

According to study data on HeartFlow, the AI analysis has demonstrated higher diagnostic performance than other non-invasive cardiac tests, as well as reducing the time taken to diagnose heart disease.

HeartFlow president and CEO John Stevens said: “We will be working hard to ensure HeartFlow can help improve the overall patient experience, by both helping physicians identify heart disease which may have otherwise been missed and delivering significant cost benefits to the NHS.”

AI in Cancer Diagnosis

C the Signs is an AI decision support tool that helps GPs identify patients at risk of cancer earlier. The app can be used during a consultation, on either a desktop or mobile device, to identify which referral and investigation a patient requires.

The app allows combinations of signs, symptoms and risk factors to be checked, and an AI algorithm assesses the patient’s likely cancer risk to guide onward referral.

The app can be searched by body system or by symptom, with the patient’s age entered alongside their symptoms.
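A decision-support tool of this kind can be pictured as cross-referencing the entered details against referral criteria for each cancer pathway. The sketch below uses two entirely hypothetical, drastically simplified rules; the real tool cross-references NICE-based diagnostic pathways that are far richer than this.

```python
# Hypothetical, simplified pathway rules for illustration only.
PATHWAYS = {
    "lung": {"symptoms": {"persistent cough", "haemoptysis"},
             "min_age": 40},
    "colorectal": {"symptoms": {"rectal bleeding", "weight loss"},
                   "min_age": 50},
}

def flag_pathways(age: int, symptoms: set[str]) -> list[str]:
    """Return the cancer pathways whose criteria the patient meets:
    at least one matching symptom and the minimum age for that pathway."""
    return [name for name, rule in PATHWAYS.items()
            if age >= rule["min_age"] and symptoms & rule["symptoms"]]

print(flag_pathways(55, {"persistent cough"}))  # ['lung']
```

The point of the sketch is the shape of the logic, checking combinations rather than single symptoms, which is what lets such a tool surface risks a symptom-by-symptom lookup would miss.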

C the Signs says of the app: “C the Signs covers the entire spectrum of cancer, cross-referencing multiple diagnostic pathways, to support clinicians in identifying what cancer or cancers a patient is at risk of, as well as the most appropriate next step for the patient; whether a test, investigation or urgent referral.

“It is fast enough to be used during the consultation to speed up decision-making, ensuring at risk patients are identified and access the right service at the right time for their clinical needs.”

A pilot for C the Signs has already been launched in Sutton which has led to ‘improved GP consultations and a smoother referral process’.

AI in Eye Health

Moorfields Eye Hospital NHS Foundation Trust and the UCL Institute of Ophthalmology have been using AI and machine learning technology, trained on thousands of historic eye scans, to identify eye disease and recommend referrals for care.

According to Moorfields, the AI can state the correct referral decision for over 50 eye diseases with 94% accuracy, a level which, as in the radiotherapy case study above, matches the diagnosis that clinicians in this field would give.

Dr Pearse Keane, consultant ophthalmologist at Moorfields Eye Hospital NHS Foundation Trust and NIHR Clinician Scientist at the UCL Institute of Ophthalmology said: “The number of eye scans we’re performing is growing at a pace much faster than human experts are able to interpret them.”

“There is a risk that this may cause delays in the diagnosis and treatment of sight-threatening diseases, which can be devastating for patients.”

“The AI technology we’re developing is designed to prioritise patients who need to be seen and treated urgently by a doctor or eye care professional.”

“If we can diagnose and treat eye conditions early, it gives us the best chance of saving people’s sight.”

“With further research it could lead to greater consistency and quality of care for patients with eye problems in the future.”

Across the case studies above, a recurring theme is that AI is making diagnosis more efficient, saving clinicians time, cutting costs and, crucially, improving patient care.

The AI at Moorfields is no different in this regard. Previously, ophthalmologists analysed OCT scans manually to identify eye disease, a process that can take hours and lengthens the time patients must wait to discuss their diagnosis and subsequent treatment.

The AI uses two types of neural network – essentially networks of nodes loosely modelled on the brain’s biological neurons – where ‘the AI system quickly learnt to identify ten features of eye disease from highly complex optical coherence tomography (OCT) scans’.

The system can provide information to the clinician as to how it arrived at its referral decision and recommendation. It highlights features of the eye on the OCT scan and provides a percentage of how confident it is in its recommendation. The clinician can then scrutinise the AI advice before deciding on the treatment and type of care for the patient.
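One common way a confidence percentage like this is produced is by applying a softmax over the network's raw scores for each candidate decision, so the confidences sum to 100%. The sketch below shows that final step only; the decision labels and scores are invented, and the source does not confirm that the Moorfields system computes its confidence exactly this way.

```python
import math

def softmax(scores: list[float]) -> list[float]:
    """Convert raw network scores into confidences that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical raw scores for three possible referral decisions.
decisions = ["urgent", "routine", "observation"]
confidences = softmax([2.0, 0.5, 0.1])

# The recommendation shown to the clinician: the most confident option.
best = max(zip(decisions, confidences), key=lambda pair: pair[1])
print(f"{best[0]}: {best[1]:.0%}")  # urgent: 73%
```

Presenting the percentage alongside the highlighted scan features is what lets the clinician scrutinise the advice rather than accept it blindly.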

Interestingly, the AI system is also interoperable with other types of scanner, future-proofing it: as OCT scanners are upgraded or replaced over time, the AI system can still be utilised.