Over the years, Meta has trained its AI models on billions of public images uploaded by users to Facebook and Instagram servers. But apparently the company has now decided to try its AI on billions of photos that users have NOT uploaded to those servers.
On Friday, TechCrunch reported that Facebook users attempting to post something to the Stories feature were met with pop-up messages asking whether they would like to opt in to "cloud processing," which would allow Facebook to "select media from your camera roll and upload it to our cloud on a regular basis" in order to generate "ideas like collages, recaps, and restyling, or themes like birthdays."
By enabling this feature, the message continues, users agree to Meta's AI terms, which allow its AI to analyze the "media and facial features" of those unpublished photos, as well as the date they were taken and the presence of other people or objects in them. It also grants Meta the right to "retain and use" that personal information.
Meta recently acknowledged that it had scraped the data of all content published on Facebook and Instagram since 2007 to train its generative AI models. Although the company stated that it only used public posts from adult users over 18, it has long been unclear what exactly "public" entails, or what counted as an "adult user" back in 2007.
Unlike Google, which clearly states that it does not train generative AI models with personal data gathered from Google Photos, Meta's current AI usage terms, which have been in effect since June 23, 2024, offer no clarity as to whether unpublished photos accessed through "cloud processing" are exempt from being used as training data. Meta did not respond to TechCrunch's request for comment; The Verge has also reached out for comment.
Fortunately, Facebook users can turn off cloud processing in their settings, which, once deactivated, will also begin removing unpublished photos from the cloud after 30 days. But the workaround, disguised as a feature, suggests a fresh incursion into our private data, one that bypasses the point of friction known as conscientiously deciding to publish a photo for public consumption. And according to Reddit posts found by TechCrunch, Meta is already offering AI restyling suggestions on users' previously uploaded photos, even when users were unaware of the feature: one user reported that Facebook had restyled her wedding photos without her knowledge.
