Sep 12, 2024, 12:00 AM

White House secures AI vendor pledges to fight deepfake nudes

Highlights
  • Several major AI vendors, including Adobe, Microsoft, and OpenAI, have committed to combating nonconsensual deepfakes and child sexual abuse material.
  • These companies will responsibly source datasets and implement strategies to prevent the generation of harmful images.
  • The White House views these commitments as a significant win in its broader effort to reduce the harm caused by deepfake nudes.
Story

The White House has secured voluntary commitments from several prominent AI vendors to address nonconsensual deepfake pornography and child sexual abuse material. Adobe, Microsoft, Anthropic, and OpenAI have pledged to responsibly source and safeguard the datasets used to train their AI systems, ensuring they exclude image-based sexual abuse content. With the exception of Common Crawl, the participants have also agreed to build feedback loops and other strategies into their development processes to prevent their AI from generating harmful images, and to remove nude images from training datasets where appropriate, depending on a model's intended purpose.

These commitments are self-regulated, however, with no external mechanism to enforce them. Some AI vendors, including Midjourney and Stability AI, declined to participate, raising concerns about the initiative's overall effectiveness. OpenAI's involvement has drawn particular scrutiny after CEO Sam Altman said earlier this year that the company would explore the responsible generation of AI pornography, a stance that sits uneasily alongside the pledge and has prompted questions about the sincerity and impact of these commitments.

Despite these concerns, the White House has framed the commitments as a significant step forward in its ongoing effort to mitigate the harms of deepfake nudes and make digital environments safer.
