
Artists Allege Meta’s AI Data Deletion Request Process Is a ‘Fake PR Stunt’


As the generative artificial intelligence gold rush intensifies, concerns about the data used to train machine learning tools have grown. Artists and writers are fighting for a say in how AI companies use their work, filing lawsuits and publicly agitating against the way these models scrape the internet and incorporate their art without consent.

Some companies have responded to this pushback with “opt-out” programs that give people a choice to remove their work from future models. OpenAI, for example, debuted an opt-out feature with the latest version of its text-to-image generator, DALL-E. This August, when Meta began allowing people to submit requests to delete personal data from third parties used to train its generative AI models, many artists and journalists interpreted the new process as Meta’s own, very limited version of an opt-out program. CNBC explicitly referred to the request form as an “opt-out tool.”

This is a misconception. In reality, there is no functional way to opt out of Meta’s generative AI training.

Artists who have tried to use Meta’s data deletion request form have learned this the hard way and have been deeply frustrated with the process. “It was horrible,” illustrator Mignon Zakuga says. Over a dozen artists shared with WIRED an identical form letter they received from Meta in response to their queries. In it, Meta says it is “unable to process the request” until the requester submits evidence that their personal information appears in responses from Meta’s generative AI.

Mihaela Voicu, a Romanian digital artist and photographer who has tried to request data deletion twice using Meta’s form, says the process feels like “a bad joke.” She’s received the “unable to process request” boilerplate language, too. “It’s not actually intended to help people,” she believes.

Bethany Berg, a Colorado-based conceptual artist, has received the “unable to process request” response to numerous attempts to delete her data. “I started to feel like it was just a fake PR stunt to make it look like they were actually trying to do something,” she says.

As artists are quick to point out, Meta’s insistence that people provide evidence that its models have trained on their work or other personal data puts them in a bind. Meta has not disclosed specifics about the data its models were trained on, so the setup requires anyone who wants their information removed to first figure out which prompts might elicit responses that include details about themselves or their work.
