Lensa AI reportedly tops both Apple’s and Google’s app charts. The app uses artificial intelligence to touch up or entirely change your looks, but some experts take issue with Lensa’s terms of use, which grant the company broad rights to license users’ images.

“Such apps use AI models trained on images scraped from the internet, uploaded by people who never considered, let alone intended, that their photos be processed in this way,” Irina Raicu, the director of internet ethics at the Markkula Center for Applied Ethics at Santa Clara University, told Lifewire in an email interview. “Users of such apps then upload pictures of themselves in order to create their ‘avatars,’ and those pictures will be used to further refine the underlying model for future users.”

Your Photos, Their Property?

Like much software, photo apps ask you to give up a number of rights when you accept their terms of use. Part of Lensa’s terms of use reads, “By posting or submitting on the Site or otherwise disclosing to us User Data you give us a royalty-free, transferable, sublicensable, perpetual, non-exclusive, worldwide license to use all such User Data in whole or in part, and in any form, media or technology, whether now known or hereafter developed.”

Raicu said software terms often give companies a lot of power over your information. “Rather than asking individual users to protect themselves, we should consider ways in which we, as a society, might hold app-makers accountable for the ways in which they acquire data and for the promises they make in their privacy policies,” she added.

Lensa did not immediately respond to a request for comment. However, Patricia Thaine, the CEO of Private AI, an online privacy company, said in an email to Lifewire that Lensa’s privacy policy is “decent” and involves deleting the metadata associated with each picture. “But even without that metadata, you never know how the company might use your photos in the future,” Thaine said. “For example, what would happen if another unknown entity bought the company?”

Thaine also said you should keep in mind that there is always the risk of a data leak. “Photos submitted by users are liable to be made available to the public,” she added. She pointed out that other photo apps have less stringent privacy policies, raising a further set of concerns. “Will they use the pictures to determine health or pregnancy status, the latter of which is a particular concern now in the US?” Thaine said. “Can they determine sexual orientation in countries where homosexuality may be illegal? Will they turn into a Clearview AI?”

Mark McCreary, co-chair of the privacy and data security group at the law firm Fox Rothschild, told Lifewire in an email that Lensa creates fake photos of the user that are incredibly realistic. “That’s concerning from the point-of-view of what Lensa may do with the data,” he added. “They have publicly stated they delete the data within 24 hours (or immediately for at least one usage), but there still exist concerns about that 24-hour window. I am also concerned about what ‘delete’ means, and if that data is recoverable, on back-ups, [or] on a third party storage server.”

Losing Control Over Your Data

The debate over Lensa and similar apps is part of a broader conversation about how much information you should give up when using software, McCreary said. “Personally, I believe that biometric data is the most dangerous data to share and lose control over,” he added. “The world is quickly moving into a more biometric world, and while it may be problematic to lose your social security number, it is so much more permanent and potentially devastating to lose your biometric data. Most consumers have not considered this when sharing so many photos with TikTok, Lensa, [or] really any app provider that has the cool, new social media craze.”

McCreary warned that once you give up your data, there’s not much you can do to stop it from being used in ways you disapprove of. “Unfortunately, it really is an all-or-nothing proposition,” he added. “What I think is most terrifying about these solutions is that anyone can take photos of you (versus your photos) and submit them to these services to create fake/AI photo, video, and audio output. The ‘deep fake’ products out there are about to start testing some interesting legal theories and existing case law.”