
# Meta’s Oversight Board Calls for Policy Change Following Viral Biden Video Controversy


Last year, a manipulated video of Biden voting with his granddaughter sparked controversy after it was reported to Meta as hate speech. The company’s Oversight Board criticized Meta’s standards on manipulated content, highlighting concerns about disinformation during a critical election period.

Last year, a video of Biden voting with his adult granddaughter, altered to make it appear as if he had improperly touched her chest, went viral.

It was reported to Meta, and then to the company’s Oversight Board, as hate speech.

The tech giant’s oversight board, which independently monitors Meta’s content moderation decisions, stated that the platform was technically correct in leaving the video available.

However, it said that the company’s standards on manipulated content were no longer fit for purpose.

The board’s warning came amid concerns about the widespread use of artificial intelligence-powered tools to spread disinformation on social media platforms during a critical election year, not only in the United States but around the world, with large segments of the global population voting.

The board stated that Meta’s present policy was “incoherent, lacking in persuasive justification, and inappropriately focused on how content has been created,” rather than on the “specific harms it aims to prevent (for example, to electoral processes).”

Meta issued a statement stating that it was “reviewing the Oversight Board’s guidance and will respond publicly to their recommendations within 60 days in accordance with the bylaws.”

According to the board, the rules were not broken in the Biden case “because the video was not manipulated using artificial intelligence nor did it depict Biden saying something he did not.”

However, the board argued that “non-AI-altered content is prevalent and not necessarily any less misleading.”

Most cellphones, for example, have simple editing features that can be used to create disinformation, often known as “cheap fakes,” the board noted.

The board also stated that, unlike videos, altered audio content was not covered by the current rules, even though deepfake audio can be highly effective at deceiving people.

One US robocall mimicking Biden has already urged New Hampshire citizens not to vote in the Democratic primary, prompting state officials to open an investigation into potential voter suppression.

The Oversight Board recommended that Meta review its manipulated media policy “quickly, given the number of elections in 2024.”
