Need to know
- Academics from the University of Melbourne have raised concerns about data collection by mental health apps
- Some apps share user data for marketing and research purposes
- Questions have also been raised about the scientific claims behind some apps
During Sydney's 2021 COVID-19 lockdowns, Jack's mental health began to deteriorate, so he signed up for the paid mental health app Headspace after a friend recommended it.
The US-based app, which claims to have over 70 million users worldwide, promises to reduce stress by 14% in ten days through "science-backed meditation and mindfulness tools". The app costs Australian users $92 a year.
Jack says he found the sleepcasts and meditations on the app helpful at first, but over time he stopped using them and later let his annual subscription lapse.
"I never really thought about the data, I didn't worry about it too much because I wasn't giving away anything specific," he says.
But Jack wasn't aware of just how widely his personal data was being shared.
Popular apps
A CHOICE survey of over 1,000 people conducted in June 2022 found that 30% of Australians had one or more mental health apps downloaded on their phone.
Samsung Health, Headspace, Calm and Smiling Mind were the most popular.
Only 20% of people who downloaded a mental health app said they usually looked at the privacy policy before downloading it.
A range of data collection practices
Jeannie Paterson, professor of law at the University of Melbourne Centre for AI and Digital Ethics, has been looking into the data collection and privacy policies of 15 mental health and mindfulness apps available in Australia and the United States.
She says the data collection practices of the apps she studied vary widely, and that many collect and share data beyond what users signing up for mental health support would expect.
When you sign up to a mental health app, it's kind of like signing up to any app or social media, in the sense that your data may or may not be kept within the app
Jeannie Paterson, University of Melbourne Centre for AI and Digital Ethics
"Normally if we go to see a mental health professional, we know that our conversations with that professional are private and that they will not share the information," she says.
"But when you sign up to a mental health app, it's kind of like signing up to any app or social media, in the sense that your data may or may not be kept within the app."
The privacy policies of the apps Paterson reviewed gave the companies behind them permission to use the data for a range of purposes, including sales and marketing and sharing with external researchers. User consent was sometimes obtained as part of the sign-up process.
Headspace
Zoe, who has attention-deficit/hyperactivity disorder (ADHD), says she found the Headspace app helpful for getting to sleep when she downloaded it around six years ago. Some time after she let her subscription lapse, she started receiving ads on YouTube for Headspace sleepcasts and re-subscribed to the service.
She says she now finds it "frustrating" that the company may have used her personal data to target her with advertising after she unsubscribed.
Paterson says the privacy policy of Headspace gives it wider permissions than many of the other apps she reviewed.
"Headspace collects a lot of information and shares that with marketing, with employers (of the user, if it is paid for through work), sponsors and for research purposes. It asks people to sign up and it's very hard to unsubscribe," she says.
Headspace's privacy policy states that when it has partnered with an organisation such as a business or university "the partner may also have access to your aggregated and anonymized general usage data".
Headspace did not respond to questions sent by CHOICE.
Questionable claims
Nicholas Van Dam, an associate professor at the Melbourne School of Psychological Sciences, also conducted a review of mental health apps and says many of them make claims about health benefits that are not backed by acceptable scientific research.
"What often happens is they'll say our app is based on good evidence or it's based on good science," he says.
"The problem is you can look in some of these websites' footnotes and the evidence that a number of these apps are providing is often not related to their particular app per se, but are about meditation more broadly."
Van Dam says the problem with some of the claims by leading apps such as Smiling Mind is that the evidence cited comes from face-to-face meditation studies, which don't necessarily measure the effectiveness of the apps themselves.
Smiling Mind responds
A spokesperson for Smiling Mind, an Australian not-for-profit organisation that also runs programs in schools and workplaces, says its work is informed by the latest evidence and research.
"Our in-house research team designs and delivers our evaluation programs and partners with external researchers as appropriate, to enhance our research capacity and capability and independence of research data collected," they say.
"Smiling Mind shares the potential mental health benefits of mindfulness on its website and each benefit has a direct citation, ensuring people can explore the evidence base independently. When directly referring to the potential benefits of Smiling Mind programs these benefits are linked to program evaluation conducted by Smiling Mind or independent parties and citations are provided," they add.
Mental health apps collect sensitive data from people who may have sensitive conditions.
TGA regulation needed
Both Paterson and Van Dam say a potential solution would be to bring mental health apps under the supervision of the Therapeutic Goods Administration (TGA) and let the regulator decide whether the health benefit claims stack up.
"When they're saying that they can address issues like anxiety and depression and stress, these are things that fall quite firmly within the space of mental health… they should be able to back up these claims," Van Dam says.
With no industry-specific regulation in place for mental health apps, the veracity of their claims falls under the oversight of the Australian Competition and Consumer Commission (ACCC), the same as for any other app.
Paterson says seeking the "consent" of consumers through long privacy policy statements at sign-up is not sufficient, and that a "fair and reasonable usage" test for data, such as the one being considered under the federal Privacy Act review, could be meaningful. Such a test would ensure companies only use data in ways a reasonable person would expect for the purposes of the app.
"At the end of the day you can't be collecting sensitive data and data from people who may have sensitive conditions without being really careful about what you're collecting and what you're doing with it," she says.
Our privacy laws need to do more to protect app users from being taken advantage of
Rafi Alam, CHOICE senior campaigns and policy advisor
Rafi Alam, senior campaigns and policy advisor at CHOICE, says consumers using mental health apps may be in a vulnerable situation and the last thing they would want is to "feel exposed".
"Our privacy laws need to do more to protect app users from being taken advantage of. We're asking the government to strengthen the Privacy Act and align it to consumer expectations. That means requiring businesses to use personal data fairly and safely and not exploiting it for profit, particularly when it's about something as sensitive as our mental health," he says.
Readers seeking support with mental health can contact Beyond Blue on 1300 224 636 or at beyondblue.org.au