OpenAI said Thursday night that it has “paused” Martin Luther King Jr. deepfakes on its social media app Sora after users created “disrespectful” AI-generated videos of the slain civil rights leader. The company said representatives or estates of other historical figures will now be able to opt out of having their likenesses used on the platform.
The company said it took the action in response to complaints from King’s estate and his daughter, Bernice King, who had pleaded on social media for people to stop sending her AI videos of her father. King is one of many deceased celebrities and historical figures whose likenesses have frequently appeared on Sora in vulgar and offensive ways.
At the request of King, Inc., OpenAI has paused generations depicting Dr. King while it strengthens guardrails for historical figures.
While there are strong free speech interests in depicting historical figures, OpenAI believes that public figures and their families should ultimately have control over how their likenesses are used. Authorized representatives or estate owners can request that a figure’s likeness not be used in Sora videos.
OpenAI’s shifting stance on historical figures mirrors its approach to copyright when Sora first launched. That strategy proved controversial, with the company making an abrupt pivot to an “opt-in” policy for rights holders after the platform was flooded with depictions of characters such as Pikachu, Rick and Morty, and SpongeBob SquarePants.
Unlike copyright, there is no federal framework protecting a person’s likeness, but various state laws allow people to sue over the unauthorized use of a living person’s image, or in some states, a deceased person’s. California, where OpenAI is headquartered, has explicitly stated that post-mortem rights of publicity apply to AI digital replicas. For living people, OpenAI has taken an opt-in approach from the outset: users must consent to appear in videos by creating their own AI likenesses.
