As the generative AI gold rush intensifies, concerns about the data used to train machine-learning tools have grown. Artists and writers are fighting for a say in how AI companies use their work, filing lawsuits and publicly agitating against the way these models scrape the internet and incorporate their art without consent.
Some companies have responded to this backlash with "opt-out" programs that give people the choice to remove their work from future models. OpenAI, for example, debuted an opt-out feature with the latest version of its text-to-image generator, DALL-E. In August, when Meta began allowing people to submit requests to delete the third-party personal data used to train Meta's generative AI models, many artists and journalists interpreted this new process as Meta's very limited version of an opt-out program. CNBC explicitly referred to the request form as an "opt-out tool."
That's a misconception. In reality, there is no functional way to opt out of Meta's generative AI training.
Artists who tried to use Meta's data deletion request form learned this the hard way and were left deeply frustrated by the process. "It was horrible," says illustrator Mignon Zakuga. More than a dozen artists shared with WIRED an identical form letter they received from Meta in response to their queries. In it, Meta says it is "unable to process the request" until the requester submits evidence that their personal information appears in responses from Meta's generative AI.
Mihaela Voicu, a Romanian digital artist and photographer who twice tried to request data deletion using Meta's form, says the process feels like "a bad joke." She, too, received the boilerplate "unable to process" language. "It's not really intended to help people," she believes.
Bethany Berg, a Colorado-based conceptual artist, received the "unable to process" response to numerous attempts to delete her data. "I started to feel like it was just fake PR to make it look like they were trying to do something," she says.
Meta's insistence that people provide evidence that its models trained on their work or other personal data puts them in a bind. Meta has not disclosed details about the data its models were trained on, so this setup requires people seeking to delete their information to first figure out which prompts might elicit responses that include details about them or their work.
Even if they submit evidence, it may not matter. Asked about the mounting frustration with this process, Meta responded that the data deletion request form is not an opt-out tool, stressing that it does not intend to delete information found on its own platforms. "I think there is some confusion about what that form is and the controls we offer," Meta spokesperson Thomas Richards told WIRED by email. "We don't currently offer a feature for people to opt out of their information from our products and services being used to train our AI models."