
Cheshire and Wirral Partnership shares new digital strategy and AI framework

Cheshire and Wirral Partnership NHS Foundation Trust’s latest digital strategy and new AI framework have set out objectives, principles, and guidelines on the implementation of technology across the trust and the future use of AI.

The digital strategy to 2030 is centred around three main objectives: enhancing patient care and outcomes, using digital solutions for personalised care and leveraging data to support preventative care; supporting and empowering staff with digital tools, AI, enhanced digital literacy, and a culture of innovation; and integrating data for seamless care, delivering a single patient record, ensuring data is used responsibly to inform care, and promoting interoperability and real-time data sharing.

Priority areas of focus for the next five years include electronic prescribing in the community, exploring patient portal options, employing ambient voice technology (AVT), focusing on data-driven care, and pursuing EPR developments. Setting out an ideal patient journey in 2030, CWP looks to use voice technology, digital appointment management, options for direct communication with GPs, electronic prescriptions “straight from health record to pharmacy”, and shared care records to reduce duplication.

The trust also sets out a five-year clinical digital transformation roadmap, with a year one focus on “getting the basics right”, EPR optimisation, and improving data quality. In year two, it plans to introduce visualisation tools to improve data capture, explore AVT to reduce admin burden, and expand secure remote access to support flexible working. Year three will shift to integration and intelligence, with a unified patient record, dashboards to support decision making, and AI to improve flow and efficiency; whilst years four and five see the deployment of predictive tooling and AI into clinical education and practice, and personalised care supported by patient portals and NHS App integration.

Integrating data for seamless care will see digital priorities defined by clinical teams, the co-production of tools and services, shared records, interoperability, cloud-first principles, and strengthened cyber security. Elsewhere, digital and AI will play a role in finance and resource management, workforce and HR, estates and facilities management, procurement and supply chain, governance and compliance, and communications and engagement.

Recognising the increasing integration of AI in health and care as “both a significant opportunity and a complex challenge”, CWP presents its AI framework, aiming to offer practical guidance, governance mechanisms, and decision-making tools to promote the “safe and consistent” adoption of new technologies. The framework offers a phased approach around proof-of-concept, pilot, and business-as-usual, supported by operational checklists across clinical safety, explainability, procurement, evaluation, monitoring, and more.

“The trust operates within a financially constrained system, with limited capacity for new investment. Despite these pressures, digital and AI technologies are recognised as key enablers of long-term sustainability,” CWP states. “The framework is designed to support the trust in navigating this complex environment, ensuring that AI adoption is both safe and aligned with organisational priorities.”

CWP’s framework splits AI applications into four domains: clinical support, operational automation, patient-facing technologies, and data analytics, with each assigned defined criteria for evaluation. Implementation will follow a phased approach beginning with pilot programmes in high-impact areas, with pilots evaluated against baseline metrics and refined for wider rollout. Integration with existing systems such as the EPR will be of central importance, and training and support will be provided to promote staff confidence in the use of new tools.

All AI tools must be pre-approved, and tools used for work purposes must only be accessed via corporate devices. Office 365 AI functionality will be considered for adoption ahead of third-party tools, to help leverage value from centrally funded CWP licensing investment and to reduce training and support overheads compared with third-party solutions.

Pilots will be prioritised in areas with high administrative burden or unmet clinical need, with clear pathways established, scope and success criteria defined, a valid return on investment demonstrated, continuous monitoring in place, and the potential to transition successful pilots into full deployments. Evaluation will measure metrics such as time spent on documentation, user experience, and operational efficiency, with unintended consequences also monitored and addressed.

“There is no new resource,” the trust highlights, “therefore benefits realisation to digital solutions will be key to understand potential efficiencies for example scaling automation in business and back office functions (HR, Finance etc.), trust wide administration and clinical workflows.”

The trust also notes key steps to be taken across the proof-of-concept, pilot, and business-as-usual phases. For clinical safety and medical devices, for example, steps for products used in clinical care include identifying the clinical safety officer (CSO) lead, reviewing supplier DCB0129 documentation, updating and noting new hazards or mitigations, and ensuring the DCB0160 clinical safety case remains up to date with changes such as reduced monitoring or expanded scope during business-as-usual rollout.

Also included is an AI ethical reflection template, which encourages those implementing AI to consider how it might exacerbate inequalities, risks around increased surveillance, environmental impact, and how use of the system might impact relationships between patients and health professionals.

To read the digital strategy and AI framework in full, please click here.

Wider trend: AI in health and care

For a practical HTN Now webinar taking a deep dive into AI in health and care, we were joined by expert panellists Peter Thomas, chief clinical information officer and director of digital development at Moorfields Eye Hospital; Sally Mole, senior digital programme manager in the digital portfolio delivery team at The Dudley Group; and Ananya Datta, associate director of primary care digital delivery at South East London ICS. The session shared approaches, best practices, challenges, successes and learnings for the practical implementation of AI technologies across health and care, with our panel offering insight into current work, future plans, and ongoing collaborations in areas such as ambient AI.

HTN was joined by a panel of experts from across the health sector for a focused webinar on the use of ambient scribe technology in NHS trusts. Panellists included Lauren Riddle, transformation programme manager at Hampshire and Isle of Wight Healthcare (HIoW); Ynez Symonds, CNIO at HIoW; Dom Pimenta, co-founder and CEO at Tortus AI; and Stuart Kyle, consultant rheumatologist and clinical lead for outpatient transformation at Royal Devon University Hospital. Our panel discussed the practicalities and considerations for ambient scribe implementations, from operating procedures and policies, integration and functionality, through to best practices around patient-practitioner interactions.

An AI Policy from Doncaster and Bassetlaw Teaching Hospitals NHS Foundation Trust has set out a guiding framework to ensure “the appropriate deployment, management, and oversight of AI systems across the DBTH partners”. Its scope covers all departments and services, and AI systems either internally developed or procured from external suppliers.

The Welsh Ambulance Services University NHS Trust board has shared its digital KPIs and work on information governance, and discussed the risks surrounding unauthorised or inappropriate use of AI. The unauthorised or inappropriate use of AI tools such as ChatGPT “outside of approved organisational channels or without appropriate governance” was considered and granted high risk status. The trust points to the potential for information shared with or returned by such tools to breach information security and data protection controls, adding that “use of the output may breach transparency, medical device, equality, Welsh Language and ethical requirements”.