Voice

HTN Voice: Mental health apps must win trust in how they use data if they are to achieve potential

By Evelina Georgieva, co-founder of Pryv

As the long-feared second wave of Covid-19 begins to materialise across parts of Europe, we are also seeing the continued growth of a second major concern: a global mental health crisis.

Studies have found a huge rise in the number of people experiencing stress, anxiety and depression and experts predict that these will worsen as infections increase and lockdowns are reimposed.

The impact on mental health is unsurprising. Many people have lost their lives to this pandemic and even more have lost loved ones. We have seen whole sectors close down and jobs lost en masse. Paying the bills and meeting family responsibilities both become sources of stress when one loses income. And yet, while worries like these are the easiest to see, there are many more under the surface. We have lost so many simple things that helped us to stay connected and that we previously took for granted. We fear hugging and kissing friends and relatives and meeting new people. Our worries are many, but many of us still avoid discussing our wellbeing with the people around us.

Pre-Covid-19, if someone was experiencing a mental health worry they might go along to see a therapist or counsellor. Even those people who like to keep their feelings private would be comforted by the fact that the therapist’s professional code meant they would not share this information with anyone else. However, the nature of the Covid-19 infection has meant that even accessing this support in-person is now far less straightforward. So where can people find the trusted help they need?

The behaviour change tool in the palm of our hands

In recent years academics and developers have talked a lot about ‘Nudge theory’, a behavioural economics concept that is based around the understanding that if you change a person’s environment – or choice architecture – you can influence their decisions and help them to make the ‘right’ choice. Nudge has been well-used by advertisers to influence consumption habits for many years but it is now increasingly being used to encourage healthier choices, with traffic light nutrition labelling that highlights foods that are high in fat, salt or sugar being an obvious example.

Our smartphones are with us all the time, have proven themselves to be excellent tools for gamification and carry huge volumes of data about us, so it’s unsurprising that they have become a vehicle for delivering digital health behaviour change interventions that nudge us towards healthier choices. Fitness apps are among the most effective at this and developers have established gamification methods to increase engagement with these activity interventions to improve health. Common examples include encouraging users to share progress and compete with others in the community, providing challenges and recognising achievements with badges, leaderboards and personal records. As users, we give away our data to access these tools but, for many, the trade-off feels worth it.

This technique is increasingly making it into the mental health app world. A recent study published in the journal PLOS One found that turning a mental health application into a game with levels that need passing and points that can be scored kept more people engaged with the intervention and led to better outcomes. This is important as it shows that by making mental health exercises fun you can lengthen the adoption period which, in turn, increases the chances of habit formation. It’s worth noting that the engagement techniques used here do not encourage data sharing within a community as people are less likely to wish to share information with others when it relates to their mental health. So if digital mental health interventions are becoming more fun, and leading to better outcomes, isn’t it about time they went mainstream in a major way?

Lack of trust could be holding them back

With the huge mental health impact of Covid-19, the reduction in face-to-face therapy services and the development of increasingly engaging mental health apps, you might think that this is the perfect storm to create an explosion of growth in that particular category.

However, at the time of writing, mental health apps still appear to be struggling to make a dent in Apple’s free or paid-for app charts and Google search data suggests that – other than a short-lived spike in April – interest in mental health apps has failed to really pick up, despite the increase in mental health challenges.

One major issue here is, I believe, a lack of trust. A study published in the British Medical Journal a couple of years ago found that the majority of people wouldn’t download a mental health app because they were worried it wouldn’t keep their data secure. That’s the privacy issue again. Many of us are reluctant to share our deepest feelings with anyone other than our closest family and friends – and some of us not even with them – so the idea of this most personal of personal data getting into the wrong hands would fill many potential users with dread.

These worries are not unfounded either. Last year an assessment of the privacy and data sharing practices of 36 popular apps for depression or smoking cessation, published in JAMA Network Open, found that 29 of them routinely shared users’ personal data with Google and Facebook. Yet only 12 accurately disclosed this in a privacy policy. That simply isn’t good enough.

Seeing stories like this one in the media increases people’s sense that they need to be extra careful when it comes to recording personal information on their smartphone. There are few things that are as personal as our mental health and wellbeing data and so the issue of trust is absolutely vital here.

So what’s the legal requirement?

It’s important to remember that taking steps to build trust isn’t just important from an ethical point of view and to attract users; it is written in law. Developers of these digital therapeutic and personal data collection tools need to be careful that the increasing focus on gamification and engagement doesn’t lead to them taking their eye off the ball on data privacy.

Meeting the GDPR’s requirements is essential if the application is to be available in the EU. Its seven principles – lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability – roll off the tongue easily enough, but there are a further 261 pages of detail to work through to ensure that an intervention complies. It’s a long read, yet failing to comply can lead to a fine of €20 million or four per cent of the violator’s annual turnover, whichever is higher, as well as massively undermine user trust, which would have further commercial ramifications. If the app is recognised as a medical grade solution, the new EU Medical Device Regulation (MDR) may also come into play.

More than 80 countries have now enacted privacy laws and many have their own take, from Switzerland’s Federal Act on Data Protection to the California Consumer Privacy Act, which applies in – you guessed it – California. If a developer wants their product to be available in a specific market then they must understand and comply with that area’s data privacy laws.
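The “whichever is higher” rule matters more than it might seem: for any business turning over more than €500 million, the percentage cap is the binding one. A minimal sketch of that arithmetic (the function name is illustrative, not from any regulation or library):

```python
# Sketch of the GDPR upper fine limit for the most serious infringements:
# EUR 20 million or 4% of worldwide annual turnover, whichever is HIGHER.

def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Return the upper bound of the fine, in euros."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A company with EUR 1 billion turnover faces a cap of EUR 40 million,
# because 4% of turnover exceeds the EUR 20 million floor.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

For smaller developers the flat €20 million figure dominates, which is precisely why compliance cannot be treated as a large-company problem.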

However, in my view, simply aiming to meet the minimum legal requirements isn’t enough – the spirit in which you build your app is crucial.

Data privacy must be foundational to build trust

Having co-founded Pryv, a company that leads the way in personal data and privacy management software, I have made it my purpose to persuade developers to build their digital health solutions on a solid data privacy foundation. It is clear to me that doing this is paramount to building trust, which in turn is fundamental to the success of the application.

In the case of mental health apps it’s even more clear-cut. If your users don’t trust you, they won’t provide you with their most sensitive personal information about their mental state, nor will they feel comfortable enough to use your app consistently. You need trust if your app is to be successful – so build the whole thing around the principles of transparency, openness and clarity. Your users should fully understand how you will collect and use their data, and their consent must be given freely – rather than gained through deceptive ‘dark patterns’, user interface (UI) designs that trick people into behaving in a way they wouldn’t usually – and be easy to revoke at any time.
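To make the “freely given and easy to revoke” requirement concrete, here is a minimal sketch of a per-purpose consent record. All names are hypothetical illustrations, not Pryv’s API: the point is simply that consent is stored per purpose, defaults to absent, and revocation is a single explicit action.

```python
# Illustrative consent record: one entry per processing purpose,
# revocable at any time. Names are hypothetical, not a real API.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    purpose: str                        # e.g. "mood tracking"
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        # Revocation is one explicit call, available at any time.
        self.revoked_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.revoked_at is None

consent = ConsentRecord("mood tracking", datetime.now(timezone.utc))
assert consent.active        # user opted in for this purpose only
consent.revoke()
assert not consent.active    # data processing for it must now stop
```

Designing the record around an explicit purpose string also mirrors the GDPR’s purpose limitation principle: consent for mood tracking does not cover, say, advertising analytics.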

Common drivers for software developers failing to meet privacy regulations in their applications include a desire to get to market faster and a lack of legal understanding. However, while we shouldn’t expect developers to be legal experts, they must work with those who are to ensure that their engaging evidence-based mental health application takes a data privacy approach that will build trust and attract users. Building the tool around ready-to-go data privacy middleware like Pryv can take a lot of the headache away from this and save time.

With all that is happening in the world, the time feels right for mobile mental health apps to play a greater role in protecting people’s mental health and wellbeing. Gamification and other engagement techniques can do a lot to help as they keep people focused on their journey and can aid the development of positive long-term habits. However, if these applications are to attract the users that they need – and that need them – then they need to be built upon data privacy foundations that facilitate trust. Only then will people feel comfortable to share the most personal of all their personal data and only then will mobile mental health apps reach their huge commercial and health supporting potential.

Evelina Georgieva is the co-founder of Pryv, a Swiss maker of personal data and privacy-management software that provides a solid foundation for digital health solutions that wish to collect, store, share and rightfully use personal data in a way that meets regulations and builds user trust.