
Interview Series: Joe Frost, Chief Technology Officer, Draper and Dash

In our latest interview we explore data, analytics and modelling with Joe Frost, Chief Technology Officer at healthcare analytics company Draper and Dash.

Can you tell me about yourself and your role?

I’ve worked in the NHS for around 10 years in various trusts around the north west of England. I’m an informatics person, a self-proclaimed data ninja in some ways! I’ve gone from analyst to director level in the NHS, working through warehousing and dashboard reporting, and into data science towards the latter part of my NHS career.

I then started as a director at Draper and Dash, where I was in charge of data services, which included modelling, cloud services, data science, and now our data science platform. Recently, over the last few months, I was promoted to CTO, and that’s where I am now.

Could you tell me about the data science platform?

The purpose of the Data Science Platform, or DSP for short, is to remedy all the frustrations and problems I’ve encountered over my career. The DSP takes all the learning we’ve acquired over the past few years; we build models for our clients on a frequent basis, and picking a model up and moving it is a challenge. The DSP is a platform built from a collection of pre-curated models that have been developed over those years.

The platform provides the ability to put data into a tool to create a very specific model based on what the informatics team is trying to achieve: identifying which patients are at risk of readmission, identifying which patients are going to be long-stay patients, forecasting what the A&E department will look like, or estimating what the ‘R’ rate is for positive Covid cases tested within that specific hospital.

The platform answers those types of questions, which we at D&D curate into the platform itself; you load the data in, select a model, and it provides you with an output. It is not aimed at the traditional data scientist who wants to get hands-on; it is aimed at informatics teams, the majority of which do not have data scientists, and the platform costs less than hiring one.
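
To make that pattern concrete, here is a minimal, hypothetical sketch of a “select a model by question” registry; the question names and algorithm choices are our own illustrative assumptions, not details of the DSP itself.

```python
# Hypothetical sketch of the "pick a pre-curated model" pattern described
# above. The question names and algorithms are illustrative assumptions,
# not the DSP's actual catalogue.
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import PoissonRegressor

# Map the question an informatics team wants answered to a pre-configured
# model, so users choose by question rather than by algorithm.
MODEL_REGISTRY = {
    "readmission_risk": GradientBoostingClassifier(),
    "long_length_of_stay": RandomForestClassifier(class_weight="balanced"),
    "ae_attendance_forecast": PoissonRegressor(),
}

def run_model(question, X, y):
    """Fit the pre-curated model for a question and return its predictions."""
    model = MODEL_REGISTRY[question]
    model.fit(X, y)
    return model.predict(X)
```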

In a previous trust I worked in, we didn’t know what technology we were going to use, so we needed to hire data scientists and then spend money training them up on NHS terminology and understanding the data.

The next issue, following on from that process, is the difficulty of moving a model from one trust to another, for example when a merger is taking place between trusts; it is not as easy as simply picking the model up and moving it. The technology may be different, the structure may be different, the recorded data may be different, so the question is how to move that model between hospitals in a plausible way.

So, the DSP takes standardised data sets, brings them in, cleans them, validates them, and builds a model from them. You can do that several times based on need, which results in standardised models that you can tweak if so inclined.

Essentially, the platform makes models portable by collating them into one single platform, while standardising and sanitising the incoming data used to build them.
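
As a rough illustration of that standardise, clean, validate, model flow, here is a minimal sketch; the column names, validation rules, and the choice of a readmission model are assumptions for the example, not the DSP’s internals.

```python
# A minimal, hypothetical sketch of the load -> clean -> validate -> model
# flow described above; column names and thresholds are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def clean(df: pd.DataFrame) -> pd.DataFrame:
    # Normalise column names and drop duplicate spells so every trust's
    # extract ends up in the same shape.
    df.columns = [c.strip().lower() for c in df.columns]
    return df.drop_duplicates(subset="spell_id")

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Keep only rows that could plausibly be real admissions.
    required = {"spell_id", "age", "readmitted_within_30_days"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {missing}")
    return df[df["age"].between(0, 120)]

def build_model(df: pd.DataFrame) -> LogisticRegression:
    # Fit one of the standardised, pre-curated models, here a simple
    # 30-day readmission classifier.
    X = df[["age"]]
    y = df["readmitted_within_30_days"]
    return LogisticRegression().fit(X, y)

# Usage, assuming a standardised CSV extract (file name is illustrative):
# model = build_model(validate(clean(pd.read_csv("trust_extract.csv"))))
```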

What have been the challenges around data?

The challenges are around localisation; although we’ve had standard data protocols for years now, hospitals still record and store their data in unique ways. However, at their core there is some level of standardisation, and our models look for and define the differences in how hospitals manage data, which is an automated process.

The other challenge is around data sanitisation: we cannot sanitise a hospital’s data for them – we are not going to be able to understand the nuances of an individual hospital’s data – so we provide the tools to allow them to sanitise their own data. Once the data has been cleaned and is in the correct structure, the model will take care of itself.
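
One way to picture “tools that let a hospital sanitise its own data” is a rule-driven filter where the trust, not the vendor, supplies the rules; the sketch below is hypothetical and the example rules are invented.

```python
# Hypothetical sketch of self-service data sanitisation: the platform
# applies the rules, but each trust declares its own.
import pandas as pd

def sanitise(df: pd.DataFrame, rules: dict) -> pd.DataFrame:
    """Apply a trust's own rules; rows failing any rule are dropped."""
    for name, rule in rules.items():
        before = len(df)
        df = df[df.apply(rule, axis=1)]
        print(f"rule {name!r}: dropped {before - len(df)} rows")
    return df

# Rules a hospital might supply for its own extract (invented examples):
local_rules = {
    "plausible_age": lambda row: 0 <= row["age"] <= 120,
    "discharge_after_admission":
        lambda row: row["discharge_date"] >= row["admission_date"],
}
```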

Is there room for the platform to be integrated into other care settings such as primary care?

Yes, we are working with one of our clients on this at the moment; we are looking at rolling out a really large, comprehensive demand and capacity model. Essentially, it uses five models found in the DSP, and if you load the data in exactly the way the model’s logic expects, you get the output of a massive demand and capacity model. We are looking at using this model in an Integrated Care System (ICS), so multiple hospitals get the same outputs from that demand and capacity model for the ICS to utilise.
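
A toy illustration of stitching several model outputs into one ICS-wide view is sketched below; the sub-models, service names, and figures are invented, and the real composition of the five DSP models is not described in the interview.

```python
# Hypothetical sketch of combining several sub-model forecasts into one
# ICS-level demand-and-capacity table.
import pandas as pd

def combine_forecasts(forecasts: dict) -> pd.DataFrame:
    """Stack per-service demand forecasts into one ICS-wide table."""
    frames = [f.assign(service=name) for name, f in forecasts.items()]
    return pd.concat(frames, ignore_index=True)

# Each sub-model would produce a daily demand forecast for one service;
# these two frames stand in for outputs of the five DSP models.
ae = pd.DataFrame({"date": pd.date_range("2020-11-01", periods=3),
                   "expected_demand": [210, 225, 240]})
beds = pd.DataFrame({"date": pd.date_range("2020-11-01", periods=3),
                     "expected_demand": [480, 495, 510]})
print(combine_forecasts({"a_and_e": ae, "bed_occupancy": beds}))
```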

How is the platform supporting organisations currently?

We are supplying the ICS for Thames Valley with a suite of dashboards, where all the modelling behind that is being done using the DSP.

We also deployed the platform to a hospital in Abu Dhabi, where the tool provided daily information around the ‘R’ rate and whether it was increasing or decreasing, to inform what the peak of infection would look like and when it would occur, and also what a second wave would look like. No one knows what a second wave would look like, but we provide them with scenarios around what could happen and what the impact on their services would be.
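
The interview does not say how the DSP estimates ‘R’, so as background only, here is a deliberately crude textbook-style ratio estimate from daily positive case counts; the serial interval and figures are illustrative.

```python
# NOT the DSP's method: a simple ratio estimate of the reproduction
# number from daily positive case counts, for intuition only.
import numpy as np

def crude_r_estimate(daily_cases, serial_interval_days=5):
    # Compare cases in the most recent serial interval with the one
    # before it: a ratio above 1 suggests the epidemic is growing.
    cases = np.asarray(daily_cases, dtype=float)
    recent = cases[-serial_interval_days:].sum()
    previous = cases[-2 * serial_interval_days:-serial_interval_days].sum()
    return recent / previous if previous > 0 else float("nan")

# Ten days of (invented) daily positive tests; prints ~1.36, i.e. rising.
print(crude_r_estimate([40, 42, 45, 50, 48, 55, 60, 58, 63, 70]))
```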

We are also doing a lot of demand and capacity work with ICSs, as previously mentioned. It looks at the services and how Covid could affect them going forward, essentially giving them demand scenarios, for example “what would a winter peak look like?”, and, at the other end, “what would the response scenario look like?”.

Is there a particular model that you are excited about?

The model which stands out to me the most is our ‘long length of stay’ model, which deals with stranded patients – patients who have been in the hospital for more than 7 days. From a tech perspective, it brings together a lot of elements; it is a classification model which drills down to the individual patient and gives the reasons why that patient is having to stay in hospital.

Some of these reasons are obvious, due to age or time of admission for example, but analysing them provides true insight and helps find the patterns behind why patients stay in hospital longer. Often it comes down to social care issues, but if you can work out the patterns in the initial phase, you can stop these types of stay from repeating.
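
A minimal sketch of the idea behind a long length of stay model follows: classify which spells will exceed 7 days, then read the fitted model’s feature importances as candidate “reasons”. The features and data are invented for illustration and are not the DSP’s.

```python
# Hypothetical sketch of a 'long length of stay' classifier: flag spells
# likely to exceed 7 days, then inspect what drives that risk.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.DataFrame({
    "age": [34, 81, 76, 45, 88, 67, 29, 90],
    "admitted_out_of_hours": [0, 1, 1, 0, 1, 0, 0, 1],
    "awaiting_social_care": [0, 1, 0, 0, 1, 1, 0, 1],
    "length_of_stay": [2, 14, 9, 3, 21, 8, 1, 30],
})
X = df.drop(columns="length_of_stay")
y = df["length_of_stay"] > 7  # "stranded" = more than 7 days in hospital

model = RandomForestClassifier(random_state=0).fit(X, y)

# Per-feature importances hint at *why* patients stay longer, e.g. the
# social care pattern mentioned in the interview.
for name, imp in sorted(zip(X.columns, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")
```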

Hospitals often do not know what to do with the information the models produce. For example, we can precisely predict the peak within A&E departments, yet hospitals do not change staffing levels based on that prediction; it is about grappling with the data and acting on the answers you asked it to find. One of the things we are going to have to do on this journey is get people comfortable with the data and help them understand what to do with it.

What are the key learnings around this project?

People have a habit of wanting to predict without understanding why they are predicting. For me, models should always be about answering a question; you should really think about what you are trying to achieve or where you want to intervene. From a tech perspective, you’ve got to map it out from the beginning: what technology you are going to use, and how it will work for you.

NHS trusts have tight budgets, and a lot of this tech is expensive, so the tech needs to be narrowed down to precisely meet needs. We need to really think about how we structure the technology so that it is comfortable for users; not every trust is that forward-thinking.

90% of our customers are not comfortable putting their data into the cloud, so we cannot build a ‘one size fits all’ solution. We have to think about standardisation and what tech to use.