Feature: Ideal Health and The Christie analytics case study

As HTN continues to focus on data strategy, we talk to Mark Sulston, a Consultant for Ideal Health, to find out about the health tech company’s recent analytics project with The Christie NHS Foundation Trust.

The Manchester-based trust, which provides cancer care, research, and education, is the largest single site cancer centre in Europe and treats more than 60,000 patients a year. The Christie asked Ideal to help develop an analytics roadmap, based around the HIMSS AMAM (Adoption Model for Analytics Maturity), which would help shape the upcoming new Christie Data and Clinical Outcomes strategy. This involved working with an ‘unconventional mix of solutions’, focused on the trust’s in-house Electronic Patient Record (EPR).

Mark describes the current EPR as ‘a more forms-based solution, which is currently being migrated and rearchitected in alignment with openEHR and its data archetype model, which will enable a much more patient-centric solution’.

Against this backdrop, Ideal’s work included interviewing around 70 stakeholders (clinical and non-clinical) before making a series of recommendations, aligned to the AMAM model, based on the identified and prioritised topics that ‘surfaced’ through discussions.

Below, Mark tells us more about the case study and explains how Ideal can help trusts shape their strategies…

Hi Mark, tell us about yourself and your role at Ideal

My work with Ideal is mainly focused around HIMSS (Healthcare Information and Management Systems Society) capability, competency, and maturity assessments. It also involves electronic medical records infrastructure – I’m a bit of a generalist when it comes to health IT. Once upon a time I was an anaesthetic nurse, but I’ve spent nearly 25 years in IT.

[This case study] isn’t about data strategy as such, it’s about strands that can come out of looking at an organisation’s data management through an analytics lens – with a focus on end-to-end data management within an organisation, to elevate capability.

What did your work at The Christie involve?

The Christie is an interesting organisation for several reasons. Firstly, because they have a very strong research focus and a sophisticated data culture. This isn’t the same as analytics but certainly, as far as data consumption is concerned, there’s a lot of planning associated with pulling all that together and making sure that you’ve got robust, reliable, and repeatable datasets for both functions. I saw some advanced skill sets and data knowledge within the analytics team, and use of complex datasets that already meets higher levels of the HIMSS AMAM, plus good multi-team clinical collaboration on predictive work.

The other aspect that’s interesting is that they’ve gone down a path of ‘Best of Breed’ and done a lot of in-house development for the EPR solution. They’re building out their medical records solution with specialist departmental solutions in several areas.

From the trust, the ask was for us to help develop a roadmap for the analytics service. The output of that was a series of prioritised recommendations which looked at analytics, but also end-to-end data management.

How did you approach the project?

What was nice about this project was that we were using the AMAM but – rather than using it to give a gap report and say, ‘this is where you are’, ‘these are the things that you need to do’ – we were able to go to the next level. We spoke to over 70 stakeholders during the process. You’re not going to find 70 people in an NHS trust, however interested in data they might be, who are going to want to talk only about analytics.

We focused on data availability, and perceived problems, as well as the things that worked well. We tried to have unstructured conversations on the topics that they wanted to talk about, working with the trust’s Head of Analytics and the Director of Clinical Outcomes. They were my local experts, who could fill in gaps and put context around some of the things that had been said, although it was still an independent set of findings.

The AMAM assessment was used as a framework to evaluate their service. It wasn’t about getting a HIMSS stage – it was about identifying the trust’s analytics capabilities, what is done well, what needs to change, where things need to be enhanced, and how to deliver a better, more flexible service that’s robust for the foreseeable future.

When you’re doing research – say, a 10-year study – you need to know your dataset up front, stick to it, and understand where things are going. That requirement, although it’s not essential for the operational or analytics side of things, is obviously very important to the trust.

It was around six weeks of engagement and we spoke to about 70 people over 30 interviews, then we put all of that unstructured information back into the AMAM model in a systematic way, which was a challenge. We’ve delivered it and provided that external, independent voice, to give some guidance and perspective. The recommendations were broadly accepted across the board by the clinical and information stakeholders we presented to.

What were your learnings from working with The Christie?

I think the big-ticket item, the topic that shone through as the cohesive factor in the whole piece, was a set of principles called ‘master data management’, which was born out of finance but forms a fundamental thread within the AMAM framework. It looks to challenge an organisation to stop thinking about systems and start thinking about data flows in totality.

This is not unique to The Christie, but may have a particular resonance with them because the cancer pathways are often focused on the different modes of delivering treatment and the associated systems, which gives a specific set of data requirements that need to come together more effectively.

The idea with master data management is that you develop data owners and stewards, so you have the subject matter expertise owning the data flows, rather than the containing systems. It’s easy when implementing systems to say, ‘this is the system we need – let’s get it live’. But master data management is about stepping back from the systems implementation focus and then asking, ‘where do we store, use and modify drug data?’ and addressing all of those data types and flows, regardless of system.

You’ve then got a very different view of the world because, for example, you’ve got a chemotherapy system that also records information about adjunctive therapies, a new EPMA solution for everyone else and reference to medications management in the EPR. Master data management is a great way of assuring that the data that flows from each of those is managed, displayed and used in a consistent way that enables more predictable use of these data, as well as knitting it all together in that data platform used by analytics.

Stepping back and taking that whole system view, there are clear potential advantages to be gained from looking at those data flows in totality, to show that the dataset is completely representative of the data model.

As a tool for bringing together all the topics across the data scope, analytics competency, data governance and infrastructure into a set of linked and prioritised recommendations, I think that was the big one.

You never get to stop time for the period whilst you’re doing an evaluation or a project, so at the same time as we were doing this, there were a number of big-ticket items that were going to significantly impact the analytics solution and the richness and scope of data available, all of which needs factoring in when commencing this sort of approach.

What was it like working with a specialist trust?

It was nice to have the opportunity to dig a bit deeper and to look at that bigger picture. There’s a lot of work that we do around these assessments, where we do come out with a gap analysis and say ‘these are the things you need to do, here are some recommendations’ – but this was a level deeper, and we received more perspectives than we usually would. We were able to identify the strengths, highlight positive findings, and then get a lot deeper into the “how” questions which makes the exercise that bit more specific and pertinent to the trust.

It was an interesting challenge to pull it all together and make it meaningful and coherent. Ultimately, because we had that extra time and more of a roadmap to produce rather than a gap analysis, it was certainly one of the more enjoyable projects, from my perspective, because of that depth and the level of engagement across the trust.

Sometimes having a new, fresh set of experienced eyes coming in can bring additional insights. It’s a helpful way to clarify some of those priorities – it’s about getting to a less iterative, more complete delivery of solutions.

These are all big things to bite off and chew – but it’s nice to have that early recognition that they were all very helpful recommendations and that they fit with the understanding that was already there.

Finally, do you have any tips for how to begin the process of reviewing a data strategy?

There are a lot of different demands being made. Some hospitals have information departments or analytics functions, but all trusts have had to develop significant capabilities during the pandemic, with a whole new dataset that needs reporting and a whole new audience. That’s formed a big part of people’s work in the short-term.

There have also been a lot of changes – more broadly – in the way that patients are seen, with a lot more remote work, all of which creates new questions for the organisations to address. Layered on top, topics like population health analytics and predictive analytics are in a formative period with Integrated Care Systems (ICSs). There’s a huge amount in there, so [it’s about] breaking it down into bitesize chunks.

Break down the data management and analytics projects into manageable pieces. Focus efforts on moving away from data-review and data-fix report production processes towards upfront data quality processes, so that people working in analytics can focus on analysing data, rather than fixing it.

Focus on getting it right and repeatable – rather than going through those cycles of fix work.

All the trusts we’ve worked with recently have said they have a data strategy but need a new one now because so much has changed. A lot of organisations are looking to take that step back, evaluate where they are and do some planning across infrastructure, EPR, and analytics.

Supporting roadmaps, planning and strategy is the ‘bread and butter’ value of doing HIMSS assessments. Accepting these maturity models as a framework and working within that, and folding those models into the work that’s being done, can bring a lot of clarity to the organisations we’re working with.