Are mental health apps failing users on privacy protection?


If you’ve downloaded any mental health apps, you might want to consider deleting them. According to a study conducted by researchers at the Mozilla Foundation, many of these applications do not offer sufficient protection of users’ privacy and security.

Are mental health applications failing users on privacy and security? Although they deal with particularly important and often sensitive topics – such as depression, anxiety, violence, eating disorders, post-traumatic stress disorder or suicide – many of these applications share user data freely.

A study conducted by Mozilla researchers on 32 such applications highlights their lack of compliance with privacy standards. Moreover, these applications were found to collect more data than the vast majority of other applications and other connected devices.

“The vast majority of mental health and prayer apps are exceptionally creepy,” warns Jen Caltrider, Mozilla’s ‘Privacy Not Included’ lead. “They track, share, and capitalise on users' most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”

These mental health applications are often targeted at younger people, who are especially vulnerable to mental health issues and may not pay close attention to how their data is used. Some of this data could be used to target them with personalised ads for years to come.

‘Data-sucking machines’

Of the 32 applications studied by the Mozilla team, 28 received a “Privacy Not Included” warning label. This label indicates that the researchers have concerns about how the application handles its users' data.

The study describes these companies’ privacy policies as vague. Yet this doesn’t stop the apps from collecting as much personal data from their users as possible. Moreover, a majority of these applications offer poor account security despite holding highly personal information about their users.

“In some cases, they operate like data-sucking machines with a mental health app veneer,” says Mozilla researcher Misha Rykov in a statement. “In other words: A wolf in sheep's clothing.”

According to Mozilla, the applications with the worst security and privacy practices are Woebot, Youper, Better Stop Suicide, Talkspace and Better Help.

Finally, despite Mozilla’s attempts to find out more about their privacy policies, only three (Hallow, Calm and Wysa) out of 32 applications responded. Only two platforms respected basic privacy and security standards: PTSD Coach, an application created by the US Department of Veterans Affairs, and the AI chatbot Wysa. – AFP Relaxnews
