Interview Dating apps ask people to disclose all kinds of personal information in the hope of finding them love, or at least a hook-up.
What many may not know is that the majority of these lonely-hearts corners vacuum up far more user info than they need, and do a terrible job of safeguarding the private data they collect. The Mozilla Foundation’s latest research scoped out popular dating apps, and slapped Privacy Not Included warning labels on 22 of the 25 reviewed.
The Register sat down with Zoë MacDonald, Privacy Not Included researcher and writer, to discuss the findings, and you can watch our conversation below.
In addition to harvesting things like photos and details about your race, religion, political views, sexual preferences, HIV status, and weight, one-quarter of the apps surveyed also collect metadata from users’ content. And at least one may store video chats, we’re told.
“To be clear, none of the apps are particularly good at privacy,” MacDonald said, noting that the team only doled out one “thumbs up,” and that went to a queer dating app called Lex.
“If I had to name one that I thought was the worst, I would have to say Grindr because they have such an awful history of protecting their users’ information,” MacDonald said. “And, of course, it’s also an app that is targeting a vulnerable population, which is gay men.”
Grindr, along with OkCupid and other dating apps used by gay men including Scruff, Growlr, and Jack’d, was among those sharing users’ data with a digital advertising network that then sold the info to a Catholic group. The group reportedly used this intelligence to out a priest.
In fact, 80 percent of the dating apps say they may share or sell users’ personal information for advertising.
Also of concern to the folks at Mozilla, or anyone worried about the intersection of privacy and artificial intelligence, is that half of the reviewed dating apps already use AI, and most of them plan to integrate the tech further in the future. Some of these uses may be for good, or at least useful, such as Bumble’s Deception Detector or Tinder’s AI-based feature that helps users pick a profile picture.
“But privacy wise, AI is a bit of a can of worms,” MacDonald said. “Two of the major players, and two of the groups that we’re most concerned about, Grindr and Match Group, have both stated their intention to invest in AI in the future.”
- Lawsuit accuses Grindr of illegally sharing users’ HIV status
- Quarter of polled Americans say they use AI to make them hotter in online dating
- Stalkerware usage surging, despite data privacy concerns
Match Group, the world’s biggest dating outfit, which owns a ton of sites including Tinder, OkCupid, Hinge, and Plenty of Fish, sparked up a relationship with OpenAI shortly after Valentine’s Day.
But considering Match Group’s track record, Moz’s privacy team says OpenAI should have swiped left.
In 2022, the US Federal Trade Commission began investigating an alleged data-sharing deal between Match Group-owned OkCupid and AI firm Clarifai, after OkCupid images were reportedly used to train facial recognition software.
“We don’t necessarily trust them to tackle that integration with the kind of care that they would need to, to really ensure that their users’ agency is respected and that their privacy is protected,” MacDonald said. ®