Political advertisers must notify Meta when deepfakes are being used.
Political advertisers will have to notify Meta when their ads on Facebook and Instagram use artificial intelligence (AI) or digital manipulation. Although the social media platform already has rules in place on the use of deepfakes, it says this requirement goes a step further.

From January, advertisements relating to politics, elections, or social issues will need to disclose any digitally altered images or videos. The global policy will be overseen by a combination of AI and human fact-checkers.

In a statement, Meta said this would cover removing remarks people made in videos, altering photos or footage of real events, and depicting realistic-looking but fictitious people. When an advertisement is identified as having been digitally altered, users will be informed. Meta did not specify how the notice would be presented, but it told the BBC that the information would be included in the ad.