By Linda Carroll

(Reuters Health) – While nine out of 10 phone apps for depression and smoking cessation assessed in a recent study were found to be sharing user data with third parties, only two out of three disclosed they were doing so.

Much of that data, including linkable identifiers, was shared with Google and Facebook, among others, but barely half of apps sharing data with those two giant companies told users about it, researchers report in JAMA Network Open.

“If you download a mental health or smoking cessation app, there’s a high chance it will share marketing, advertising or usage tracking data with either Facebook or Google,” said the study’s lead author Kit Huckvale, a postdoctoral research fellow at the Black Dog Institute in Randwick, Australia. “Unfortunately, in many cases, there’s no way to tell that this is happening and you can’t rely on the privacy policy to tell you.”

While the apps studied by Huckvale and his colleagues didn’t appear to be directly sharing mental health information, “what we did see was information indicating the kinds of apps people are using,” Huckvale said in an email. “That can be enough to reveal what conditions people might have.”

Huckvale and his colleagues assessed 36 top-ranked depression and smoking cessation apps designed for Android and iOS. The researchers downloaded the apps onto an Android phone or an iPhone and then put the apps through their paces. All network traffic generated during those simulated uses was intercepted with specialized software.

The destination and content of each transmission were tagged automatically to identify whether the app’s developer or a third party was being contacted. The researchers also noted instances in which personal and other user-generated data was contained in the transmissions.
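The paper does not name the interception software, but the general approach can be pictured as a man-in-the-middle proxy that the test phone’s traffic is routed through. The short sketch below is only an illustration of that idea, assuming a mitmproxy-style setup; the list of third-party hosts and the identifier heuristic are hypothetical examples, not details taken from the study.

# illustrative_capture.py - a minimal sketch, not the researchers' actual tooling
from mitmproxy import http

# Example third-party analytics/advertising hosts (illustrative list only).
THIRD_PARTY_HOSTS = ("graph.facebook.com", "app-measurement.com", "google-analytics.com")

def request(flow: http.HTTPFlow) -> None:
    # Tag each outgoing request by destination.
    host = flow.request.pretty_host
    party = "third party" if host.endswith(THIRD_PARTY_HOSTS) else "developer/other"
    # Crude check for fields that could act as weak or strong identifiers.
    identifier_like = [key for key in flow.request.urlencoded_form.keys()
                       if "id" in key.lower() or "email" in key.lower()]
    print(f"{party}: {host} identifier-like fields: {identifier_like}")

A script like this would be loaded with mitmdump -s illustrative_capture.py while the phone is proxied through the capture machine, and anything sent to hosts other than the developer’s own servers would then be reviewed by hand.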

More than two-thirds of the apps, 25 of 36, provided a privacy policy, but in many cases those policies didn’t describe all the ways data collected by the app would be used.

Of the 33 apps that transmitted any data to a third party, nine included strong identifiers such as a fixed device identifier or a username and 26 included weak identifiers, such as an advertising identifier, which could be used to track user behavior over time and across different products and technology platforms.

Two of the apps included user-generated health status information, such as the contents of a health diary or reports of substance use, in the data sent to third-party analytics services.

Twenty-nine of the apps transmitting data to a third party sent it to just two commercial entities, the authors note: Google and Facebook.

There are some ways consumers can protect themselves from all that covert sharing, Huckvale said.

“Checking the privacy policy is a good place to start,” he said. “If a policy explicitly states it isn’t going to share data with third parties, then there is a reasonable chance that this is true. Be suspicious if the policy is vague or missing altogether.”

Another tactic is to minimize the amount of personal information you enter into the apps, Huckvale said. “There’s usually little need to put your real name or date of birth,” he added. “Facebook and Google social login are convenient, but they also allow your usage to be tracked.”

“At this point, you have to think about the app economy and business model that exists for software,” said Jennings Aske, chief information security officer at NewYork-Presbyterian Hospital, who wasn’t involved in the research. “I’m surprised the study findings aren’t even darker.”

“I don’t think people understand the nature of the profiles being built about them,” Aske said. “If I were to think of the worst-case scenario, it would start with the way the data is being used by people who make decisions about you with respect to employment and health benefits, for example.”

That data, which might not even be accurate, could also come into play when auto insurance or life insurance companies are deciding how much to charge, Aske said.

Vermont recently passed a law to regulate data brokers, Aske said. “Those are the organizations behind the scenes selling data about you and me,” he added. “And there were about 120 that had to acknowledge that they had data on residents of Vermont. It shows there’s a lot more out there that we don’t know about in this data broker economy. That’s scary.”

SOURCE: https://bit.ly/2PnUref JAMA Network Open, online April 19, 2019.
