Technology Companies Pledge to Remove Nudity From AI Datasets

(AmericanProsperity.com) – Several artificial intelligence companies have pledged to remove nudity from the data sources they use to train their AI products. They have also agreed to put safeguards in place to prevent harmful deepfake imagery.

The deal, brokered by the Biden administration, included Adobe, Cohere, Anthropic, Microsoft, and OpenAI. The companies voluntarily pledged to remove nude images from AI training datasets “when appropriate and depending on the purpose of the model.”

The pledge is part of a broader campaign against image-based sexual abuse of children and the creation of inappropriate AI deepfake images of people. According to a statement from the White House’s Office of Science and Technology Policy, such images have “skyrocketed, disproportionately targeting women, children, and LGBTQ+ people, and emerging as one of the fastest growing harmful uses of AI to date.”

In a separate pledge, companies including Discord, Bumble, Meta, Microsoft, Match Group, and TikTok have adopted voluntary principles to prevent image-based sexual abuse.

These commitments follow widespread concerns about misinformation, deepfakes, and the growing volume of sexual imagery produced by artificial intelligence software. These tools are highly capable and can generate a wide variety of images, including some that may not be considered appropriate.

AI software companies have faced pushback over their lack of regulation, which is why the White House is stepping in with this pledge.

Copyright 2024, AmericanProsperity.com