Are mental health apps failing users on privacy protection?

If you’ve downloaded any mental health apps, you might want to consider deleting them. According to a study conducted by researchers at the Mozilla Foundation, many of these applications do not offer sufficient protection of users’ privacy and security.

Are mental health applications failing users on privacy and security? Although they deal with particularly important and often sensitive topics – such as depression, anxiety, violence, eating disorders, post-traumatic stress disorder and suicide – many of these applications share user data freely.

A study conducted by Mozilla researchers on 32 such applications highlights their lack of compliance with privacy standards. The apps were also found to collect more data than the vast majority of other applications and connected devices.

“The vast majority of mental health and prayer apps are exceptionally creepy,” warns Jen Caltrider, Mozilla’s ‘Privacy Not Included’ lead. “They track, share, and capitalise on users' most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”

These mental health applications are particularly targeted at younger people, who are especially vulnerable to mental health issues and do not necessarily pay attention to how their data is used. Some of this data could allow them to be targeted by personalised ads for years to come.

‘Data-sucking machines’

Of the 32 applications studied by the Mozilla team, 28 received a “Privacy Not Included” warning label. This label indicates that the researchers have concerns about how the application handles its users' data.

The study describes these companies' privacy policies as vague. Yet that doesn't stop the apps from collecting as much personal data as possible from their users. Moreover, a majority of these applications offer poor account security despite holding highly personal information about their users.

“In some cases, they operate like data-sucking machines with a mental health app veneer,” says Mozilla researcher Misha Rykov in a statement. “In other words: A wolf in sheep's clothing.”

According to Mozilla, the applications with the worst security and privacy practices are Woebot, Youper, Better Stop Suicide, Pray.com, Talkspace and Better Help.

Finally, despite Mozilla’s attempts to find out more about the companies’ privacy policies, only three of the 32 applications (Hallow, Calm and Wysa) responded. Only two platforms respected basic privacy and security standards: PTSD Coach, an application created by the US Department of Veterans Affairs, and the AI chatbot Wysa. – AFP Relaxnews
