Thursday, May 1, 2025

Think Twice Before Creating That ChatGPT Action Figure

At the start of April, an influx of action figures began appearing on social media sites, including LinkedIn and X. Each figure depicted the person who had created it with uncanny accuracy, complete with personalized accessories such as reusable coffee cups, yoga mats, and headphones.

All this is possible because of OpenAI's new GPT-4o-powered image generator, which supercharges ChatGPT's ability to edit pictures, render text, and more. OpenAI's ChatGPT image generator can also create pictures in the style of the Japanese animation studio Studio Ghibli, a trend that quickly went viral too.

The images are fun and easy to make: all you need is a free ChatGPT account and a photo. Yet to create an action figure or Ghibli-style image, you also have to hand over a lot of data to OpenAI, data that could be used to train its models.

Hidden Data

The data you give away when you use an AI image editor is often hidden. Every time you upload an image to ChatGPT, you are potentially handing over "an entire bundle of metadata," says Tom Vazdar, area chair for cybersecurity at the Open Institute of Technology. "That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot."
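To make that concrete, here is a minimal sketch (not from the article) of how the EXIF metadata embedded in a typical photo, including any GPS coordinates, can be read, and how a clean copy can be saved before sharing. It assumes the Python Pillow library and a hypothetical file name, portrait.jpg.

# Minimal sketch: inspect and strip the EXIF metadata that travels with a photo.
# Assumes the Pillow library (pip install Pillow) and a hypothetical file name.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def show_exif(path):
    """Print the EXIF tags embedded in an image file, including the GPS block."""
    img = Image.open(path)
    exif = img.getexif()
    for tag_id, value in exif.items():
        print(f"{TAGS.get(tag_id, tag_id)}: {value}")
    # GPS coordinates live in a nested IFD (tag 0x8825)
    for tag_id, value in exif.get_ifd(0x8825).items():
        print(f"GPS {GPSTAGS.get(tag_id, tag_id)}: {value}")

def strip_exif(src, dst):
    """Re-save the image from raw pixel data only, so no metadata is carried over."""
    img = Image.open(src)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save(dst)

show_exif("portrait.jpg")                         # hypothetical input file
strip_exif("portrait.jpg", "portrait_clean.jpg")  # copy with metadata removed

If location tagging is enabled on a phone, the GPS coordinates sit in exactly that metadata block, so re-saving a clean copy is a simple precaution before uploading a photo anywhere.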

OpenAI also collects data about the device you use to access the platform: your device type, operating system, browser version, and unique identifiers, says Vazdar. "And because platforms like ChatGPT are conversational, there's also behavioral data, such as what you typed, what kinds of images you asked for, how you interacted with the interface, and how often you did so."

It's not just your face. If you upload a high-resolution photo, you're giving OpenAI whatever else is in the image too: other people, things in your room, and anything readable such as documents or badges, says Camden Woollven, group head of AI product marketing at risk-management firm GRC International Group.

This kind of voluntarily provided, consent-backed data is "a gold mine for training generative models," especially multimodal ones that rely on visual inputs, says Vazdar.

OpenAI denies that it orchestrates viral photo trends as a ploy to collect user data, but the company certainly gains an advantage from them. OpenAI doesn't need to scrape the web for your face if you're happy to upload it yourself, Vazdar points out. "This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from a diverse range of age groups, ethnicities, and geographies."

OpenAI says it does not actively seek out personal information to train its models, and that it does not use public data on the internet to build profiles about people in order to advertise to them or sell their data, an OpenAI spokesperson says. However, under OpenAI's current privacy policy, images submitted through ChatGPT can be retained and used to improve its models.

Any data, prompts, or requests you share helps teach the algorithm, and personalized information helps fine-tune it further, says Jake Moore, global cybersecurity adviser at ESET, who created his own action figure on LinkedIn to demonstrate the privacy risks of the trend.

Uncanny Likeness

In some markets, your photos are protected by regulation. In the UK and the EU, data-protection rules, including the GDPR, offer robust protections, including the right to access or delete your data. At the same time, the use of biometric data requires explicit consent.

However, photographs count as biometric data only when processed through a specific technical means that allows the unique identification of a particular individual, says Melissa Hall, a senior associate at law firm MFMac. Processing an image to create a cartoon version of the subject in the original photograph is "unlikely to meet this definition," she says.
