HTN was joined for a recent HTN Now webinar by an expert panel to discuss AI in primary care, covering successes and challenges with implementation, governance, adoption, and other practical learnings.
Katie Baker, UK & Ireland director at Tandem Health; Mateen Ellahi, GP at Elm Tree Surgery, South Stockton PCN, and member of the NHSE primary care advisory group; and Paul Miller, head of IT at Nottingham and Nottinghamshire ICB, made up our panel.
Katie introduced us to Tandem Health, sharing details on the Tandem medical AI assistant, a partnership with Accurx, and current presence in around 97 percent of NHS GP practices and an estimated 40 NHS acute and community trusts. “We basically started with clinical documentation, using voice technology to create structured documentation and coded documentation, before moving to become the full breadth of a medical AI assistant,” she told us. “On a personal level, I’ve spent about eight years working to try and improve health systems across the UK, EU, and the US, so I’m very excited to talk about AI in primary care today.”
“We like to think we’re quite a forward-thinking practice,” Mateen said of the Elm Tree Medical Centre. “We not only adopt technology, but we’re starting to make our own, which has been really exciting over the last 12 months or so. I have several roles – positions in the PCN, with the ICB as an ambassador, consultancy roles, and I’ve been part of projects with NHSE, as well.”
Paul shared his responsibility as head of IT for the Nottingham and Nottinghamshire region, covering all primary care and corporate IT services, working with GP practices and partners to ensure digital services are safe, effective, and financially sustainable. He noted a few current and recent projects for AI and automation, including pilots of auto filing and auto review of pathology results, as well as AVT being trialled in practices. “We’re very much in the pilot, exploratory phase at the moment within our particular region,” he added.
Automating primary care workflows
The panel started out by considering the potential for automation to be used in primary care workflows, with Katie suggesting the best place to start is in the lowest-risk areas, such as documentation, where the benefit is giving clinicians back time to care. “That covers anything clinicians and healthcare staff are spending time at home working on – referral letters, clinical documentation, and so on. Further down the line, with regulation, there are things like triage we can get into, but keeping humans in-the-loop and doing it from a responsible, safe perspective.”
Mateen emphasised that the biggest opportunities for AI in the primary care space are not in replacing clinicians, but in working alongside clinicians, removing the low-value work that detracts from patient care. “In general practice, the main pain point is not a single task,” he went on. “It’s a buildup of bloods, documents, letters, coding, prescriptions, checking your inbox every hour, triage, safeguarding, following up things like DNA appointments. AI needs to be used to reduce the cognitive load on both clinical and non-clinical staff, and to improve safety as much as possible.” As well as documentation management, he considered the possibilities of using AI in analysing practice-level demand and in population risk stratification in the near future.
“My practical learning would be not to start by looking at a shiny new tool,” Mateen reflected. “Start by analysing the workflows in your practice, looking at where the bottlenecks are, and so on. For our practice, it was analysing which members of staff were staying late, what recurring errors we had, and where patients were spending time waiting.” Drawing out the patient journey is a good place to begin, he continued, to understand where admin tasks are being passed along, or where duplication is happening, and to see where AI could have the biggest impact or solve a particular problem.
Paul agreed with Mateen’s point about framing the problem you are trying to solve first, rather than going straight to the technology, then working on the governance and other preparatory requirements. “What has worked well for us in Nottingham so far is in our pathology results reporting work, using some of the tools and facilities built into our clinical systems to set up auto review rules,” he shared. “We’ve also been working with a third party on auto review, and one of the key things is to start small, identify the high volume, low risk things, as Katie said. Start small and build that confidence and assurance within the tool set, before you look to build out further automations and volume.”
Considering ambient voice technology (AVT), Katie pointed out that access “doesn’t necessarily mean adoption”, but rather implementing AI tools is a transformation journey. “You need to pick a partner for that who is almost going to hold your hand through it,” she said, “not only from a governance or safety perspective, but who is also going to work with the clinicians on the ground. A large proportion of our team are clinicians, who can work along that clinical workflow and ensure it’s working appropriately, so I would say there have been lots of learnings from that perspective.” Using clinical champions who work within practices to show others how to use the tech and build trust is also helpful, she noted.
Challenges for primary care with introducing and implementing AI
From an ICB perspective, Paul highlighted the challenges arising from the large volume of available products, technologies, and solutions. “It’s difficult to keep up, because the AI world is moving so quickly, and we have to be able to offer that assurance,” he shared. “My recommendation would be to get the governance and clinical safety aspects in place, because ultimately it’s a human that needs to be accountable for AI – the biggest challenge for me is making sure everything is in place so we can support primary care with being innovative in this space.”
“For me, the biggest challenge isn’t a technical challenge; it’s the operational and cultural aspects,” Mateen told us. “Clinicians need to understand what AI actually is, where it works well, and then understand how it can fail or where the risks are.” Issues also arise with workflows, he continued, where the use of AI can create extra clicks, extra checks, or other steps that will make it more difficult to adopt in general practice. Digital maturity can be another barrier, he suggested, with practices struggling under workforce and infrastructure pressures perhaps less able to adopt tech like AI. “Lastly, the culture aspect. NHS organisations and practices have been promised transformation several times, but it’s never actually happened, and I think clinicians are scared it’s a false promise, where you don’t get enough procurement, funding, or help to make it happen.”
From a safety perspective there also remain challenges, Mateen shared, with having a clinical safety officer (CSO) in practice not necessarily being enough, and with clinical safety courses often not effectively covering real-world practice. “Taking into account the information governance, the DPIAs, the procurement checks and everything else is important, and then you need to look at who is monitoring the AI in case it drifts from what it’s supposed to do, or who is reviewing errors if they happen.” Also of note are requirements for informed consent and assigning responsibility for informing patients, he added.
Picking up on the topic of informed consent, Katie explained how Tandem Health tackle this by providing leaflets for patients in waiting rooms, and by having a code that can be entered to say that a patient has given consent. “There’s no legal obligation to do that, but we think it’s best practice,” she said. “There are multiple other things we’ve created a governance pack to support, and we provide CSOs into regions as we know not every practice has one.”
Although considering the risks of implementing AI is key, it’s also important to look at the risks of not implementing, Katie commented. “We’ve got 76 percent of the GP workforce telling us that the amount of work is going to lead to clinical errors, so it’s balancing that with a new tool that is regulated and has post-market surveillance – we’ve developed several techniques to deal with that, and we’re working with the MHRA to make sure we have that adaptive post-market surveillance. Across Accurx and Tandem we have about ten CSOs, and 80 percent of our team are clinicians, so we’ve got people constantly checking, validating, and auditing.”
Katie discussed how hallucinations and omissions are particularly important, and the support that Tandem Health offers to pinpoint where those are most likely to happen, and how to spot them. “It would be impossible to sit here and say they won’t happen; it’s reducing those as much as possible and getting people to check outputs – most errors actually come from audio setup, so ensuring you have the right audio in your practice is key. We do lots of work with our clinical team to keep that low, and we have to because of the device being regulated, but from a clinician point of view it’s best to check, and if you spot something, there’s a button to press to report it, which takes it through an internal process.” This is why it’s integral to keep a human-in-the-loop, she went on, “and on balance, hallucinations might seem scary, but we have to remember that humans also make errors, and the error rate is lower than a human being actually writing the note”.
Governance and safety
It’s GP practices that are accountable for decisions that are made in the AI space, so ICBs need to offer support for things like clinical safety and data protection, Paul suggested. “I think it’s also important to look at it from a sustainability angle, to see if you can sustain anything you’re implementing from a workforce and financial point of view. System leaders should be asking the right questions, and understanding what’s available to support primary care.”
One of the ways Nottingham and Nottinghamshire ICB has approached evidence and evaluation is by linking into the Midlands Regional AI and Automation Group run by NHS England, Paul went on, “so we’ve got a community of interest there where we can share ideas and see what’s happening in the area”. A group has also been set up with system partners in the region to share best practices on AI governance and so on, with the intention of helping to avoid duplication. “Having those conversations locally helps build understanding of what’s going on in your region, collaborating with colleagues, seeing if there are any groups you can link into to get that evidence,” he added.
For Mateen, the education piece is integral, and AI literacy should be taught in the same way as prescribing. “Everyone is going to have to adopt this at some point, and it will be important to know what hallucinations mean, what data can be used or produced, and how to check outputs,” he noted. He also discussed the potential to standardise information given to patients, letting them know that the tool will be in use and how it will be used, to allow them to give “full and informed” consent. In the future, he said, there will be so much data available for tech startups to use in a pilot before going into a practice that they will be able to enter the market “way ahead” of where a startup coming in now would be.
“We’ve done studies on the patient perspective and the impact on patients,” Katie explained. “We’ve also looked at things such as clinician experience, cognitive load, staff retention, the financial benefits, quality, and so on, preparing a number of different case studies. What we don’t yet have is the clinical outcomes – my belief is if we structure the data properly and get the foundation right, we can use that to look at population health data, patterns for disease recognition. But if we can use it to make sure nothing is missed during a consultation, that’s when we may get better outcomes down the line, and that’s the bit we don’t yet have the data on.”
Katie also talked about being regulated as an AI device, saying: “We have Class IIa certification for coding, and that means you have to be audited, you have to have clinical evidence, and it’s much more strict than the self-certified Class I devices out there. So depending on the type of AI device you have, that will come in, and I expect there will be more from the AI Commission around evidence that’s needed for these devices to be scaled across the NHS, because we don’t want multiple fragmented tools everywhere, we want evidence-based tools.”
Having a lifecycle approach to governance and having mechanisms in place to account for evolving systems will be central moving forward, Paul considered, “and it will be necessary to develop that approach so it supports safe implementation, but also allows for innovation and doesn’t hold anything back”. Good governance and what that looks like is a regular topic of discussion in a variety of different forums, he noted, “but it’s a really challenging area in terms of getting that right”. Offering a piece of advice to anyone looking to adopt AI, he said: “Don’t wait until you feel as though the governance is 100 percent, because I don’t think it ever will be – be safe and assured in what you’re doing, but don’t be held back.”
Promoting behaviour change, adoption, and workforce AI literacy
Looking to the workforce and how to promote behaviour change around the implementation of AI technologies and AI literacy, Katie referred to a study by Tandem Health of 1,500 people using ambient scribe technology, which showed that around 79 percent of clinicians reported it allowed them to spend more time focusing on the patient, and around 76 percent of patients felt more listened to. “If we drill down a bit more into what that looks like in terms of behaviour change, there’s the obvious of not typing as much, but there are also other things to think about, like the fact the technology can hear, but it can’t see. That means you might need to commentate on your examinations, which is an example of a behavioural change.”
On the education piece, there has been talk about the fact that current medical students are not being given any kind of AI training, according to Katie, and will be leaving medical school in the next few years with no mandatory training on AI tools. Vendors should take some level of responsibility, she put forward, noting Tandem Health’s involvement in providing workshops, webinars, and onboarding that has been shown to “work really well” for adoption. “I do think there’s something that needs to happen also in medical education on a national level,” she said, “but also at different levels depending on where they sit, which might mean the questions that need to be asked in procurement, or things to be aware of when actually using the technologies on the frontline.”
Answering a question from our live audience about how things like AVT can be used with people with communication challenges, Katie also talked about a use case in a hospital for speech and language therapy. “It actually worked really well, and it can still be used for documentation,” she noted. “It can be adapted for that, and we’ve gone through processes with people with all kinds of difficulties, from eyesight to aphasias, because the technology needs to be accessible – it’s adapting it, and then talking through what’s happened, and letting the patient review that afterwards.”
Mateen shared the importance of engaging with both clinical and non-clinical staff members to find out where bottlenecks are, and then progressing on to design a workflow and a patient journey. “Start with the low-risk pieces, so non-clinical workflows, to get those quick wins, and then that will make it easier to get your team on board,” he suggested. “That’s how the cycle of change normally starts: quick wins, getting people on board, then scaling that up through the practice. It does depend on culture, so culture from the top – if your partners are engaged in the process, then it’s more likely the managers and other staff will also be engaged, so take slow steps, report incidents if required, and celebrate the quick wins.”
What does success look like for AI in primary care?
Success for AI in primary care would mean the tech becoming invisible and just part of everybody’s normal workflow, Paul stated. “I don’t think we’ll ever be in a position where AI will replace humans, certainly not in a GP practice or in the wider primary care space, but it’s there to act as a support tool to reduce some of the administrative burden we see, to create capacity.”
Katie acknowledged: “My hope is that we’ll be using this technology to bring back the humanity that patients deserve, and that it’s helping to support our workforce. I hope it’s created the foundation on which we can build agentic AIs to improve patient outcomes, because that’s what all of us ultimately want.”
Mateen also shared his own hopes: “From a GP perspective, it would be that hopefully we can finish on time; from a patient perspective, improved clinical safety and making sure they don’t have to repeat their history multiple times; and from an administrative point of view, that staff feel supported rather than overwhelmed. Also, safer follow-up for abnormal results, particularly blood results, and improved continuity of care.” AI scribe technology is the gateway to a wealth of potential, he continued, “and having AI embedded into workflows off the electronic health records, so not only can it just scribe, it can do discharge letters, referrals, order blood tests, file away blood tests, really transforming primary care”.
Tandem Health can currently offer some of those capabilities, including advice and guidance, discharge summaries, or patient letters that can be generated immediately within consultations, Katie told us. “What’s coming in future is summarisation of the medical record so clinicians don’t have to search back through patient history, task management, and we’re also looking at improving patient flow from reception through to triage and then into face-to-face consultation, as well as the enterprise layer on top for organisations to use their data to improve care.”
For patients, the biggest benefits will be in clinics running to time, having the opportunity to explain symptoms more clearly, reduced waiting lists, and less duplication around telling their story, Mateen considered. “That will have a downstream effect on A&E and secondary care, as well, helping the NHS save a lot of time and resources that can be used in a more efficient way.”
“In Nottingham and Nottinghamshire in our work on automation of pathology results, we’ve got about 50 practices using auto review rules,” Paul shared. “They’re reviewing and auto filing about 6,000 results per week, so we can monitor that from a data perspective to see what’s happening, and extrapolate that out in terms of minutes spent per clinician in reviewing a normal result and filing it. We’ve then got some benefits work we have done around that to understand the impact. When you start out with this, it’s mapping out what you want to achieve and then monitoring it.”
We’d like to thank our panellists for taking the time to share these insights with us.