“These apps are designed to collect a ton of personal information,” says Jen Caltrider, the project lead for Mozilla’s Privacy Not Included team, which conducted the analysis. “They push you toward role-playing, a lot of sex, a lot of intimacy, a lot of sharing.” For instance, screenshots from the EVA AI chatbot show text saying “I love it when you send me your photos and voice,” and asking whether someone is “ready to share all your secrets and desires.”
Caltrider says there are multiple issues with these apps and websites: many may not be clear about what data they share with third parties, where they are based, or who creates them; some allow people to set weak passwords, while others provide little information about the AI they use. The apps analyzed all had different use cases and weaknesses.
…For AI girlfriends and their ilk, Caltrider says people should be cautious and follow security best practices: use strong passwords, don’t sign in to the apps with Facebook or Google, delete data, and opt out of data collection where it’s offered. “Limit the personal information you share as much as possible—not giving up names, locations, ages,” Caltrider says, adding that with some of these services, even that may not be enough. “Even doing those things might not keep you as safe as you would like to be.”
Read more:
Burgess, M. (2024, February 14). ‘AI Girlfriends’ Are a Privacy Nightmare. Wired. https://www.wired.com/story/ai-girlfriends-privacy-nightmare/