Thursday, September 7th, 2023 8:16 am EDT
Election ads on Google and YouTube that use artificial intelligence to create or alter their content will soon have to carry a clear disclosure, under new rules announced by the company.
The new disclosure requirement for digitally altered or created content comes as campaigning for the 2024 presidential and congressional elections kicks into high gear. New AI tools such as OpenAI’s ChatGPT and Google’s Bard have contributed to concerns about how easily deceptive information can be created and spread online.
“Given the growing prevalence of tools that produce synthetic content, we’re expanding our policies a step further to require advertisers to disclose when their election ads include material that’s been digitally altered or generated,” a Google spokesperson said in a statement. “This update builds on our existing transparency efforts — it’ll help further support responsible political advertising and provide voters with the information they need to make informed decisions.”
The policy will take effect in mid-November and will require election advertisers to disclose when their ads contain AI-generated or digitally altered content, making clear that the material was computer-generated or does not show real events. Minor changes, such as brightening or resizing an image, will not require a disclosure.
Election ads that have been digitally created or altered must include a disclosure such as, “This audio was computer-generated,” or “This image does not depict real events.”
Google and other digital ad platforms such as Meta’s Facebook and Instagram already have some policies around election ads and digitally altered posts. In 2018, for example, Google began requiring an identity verification process to run election ads on its platforms. Meta in 2020 announced a general ban on “misleading manipulated media” such as deepfakes, which can use AI to create potentially convincing false videos.