At least two menstruation-tracking apps, Maya and MIA Fem, were sharing intimate details of users' sexual health with Facebook and other entities, according to a new report from Britain-based privacy watchdog Privacy International. In some cases, those details, which are self-recorded by users in the app, included when the user last had sex, the type of contraception used, their moods and whether they were ovulating.
The findings raise questions about the security of the most private information in an age when employers, insurers and advertisers can use data to discriminate or target certain categories of people.
Facebook spokesman Joe Osborne said advertisers did not have access to the sensitive health information shared by these apps. In a statement, he said Facebook's ad system "does not leverage information gleaned from people's activity across other apps or websites" when advertisers choose to target users by interest. BuzzFeed first reported the news.
Period- and pregnancy-tracking apps such as Maya and MIA have climbed in popularity as fun, friendly companions that provide insights into the often daunting world of fertility and pregnancy. They can also be used to track sexual health more generally, along with moods and other intimate data. But many such apps aren't subject to the same rules that govern most health data.
That has raised privacy concerns as some of the apps have come under scrutiny as powerful monitoring tools for employers and health insurers, which have aggressively pushed to gather more data about their workers' lives than ever before under the banner of corporate wellness. Plus, it appears the data could be shared more broadly than many users recognise, as flagged by the Privacy International study.
Several period- and pregnancy-tracking apps have been called out for sharing health data with women's employers and insurance companies, as well as for security flaws that reveal intimate information. As a result, many women say they've devised strategies to use the apps without revealing all of their most sensitive information. Among those strategies: using fake names, documenting only scattered details and even inputting incorrect data.
Users and experts alike worry that the data could be exposed in security breaches, or used by employers and insurance companies to discriminate against women by increasing their premiums or not offering them leadership positions.
Deborah Peel, a psychiatrist and founder of the nonprofit Patient Privacy Rights, said people expect that their health data will be protected by the same laws that protect their health information in a doctor's office, but many apps aren't subject to the same rules.
"Most people would want to make their own decisions about what's known about their sex life, about whether it's shared or not," said Peel. "Right now we have no ability to do that."
Facebook, the world's largest social media platform, with 1.2 billion daily users, is asking users to trust it with more and more sensitive information than at any time in the past. Last week, the company launched Facebook Dating in the US, a matchmaking service that suggests potential love interests to users based on preferences, interests and Facebook activity.
At the same time, Facebook has come under fire in recent years for scandals involving misinformation, fake accounts and breaches of trust. That includes the 2018 revelation from a whistleblower that Facebook had allowed political consultancy Cambridge Analytica to improperly access data from millions of users. In that case, the data was harvested through a third-party quiz app.
In statements included in the report, Facebook said its terms of service prohibit app developers from sharing health or sensitive data, and that it has been in contact with both Maya and MIA to notify them of a possible violation of those terms. Facebook also said that while it has systems in place to automatically detect and delete information such as Social Security numbers and passwords from the data shared by apps, the company is "looking at ways to improve our system/products to detect and filter out more types of potentially sensitive data."
Plackal Tech, which developed Maya, said in its statement to Privacy International that it would remove the Facebook Software Development Kit from a new version of its service. There was no published response from Mobapp Development Limited, the company behind MIA, and the company did not have immediate comment.
© The Washington Post 2019