YouTube to delete new false election videos -- even Trump’s
YouTube said that its efforts would apply to content posted as of today, given that enough states have certified their election results to determine a president-elect.
YouTube said it will start deleting videos that mislead people about the outcome of the U.S. presidential election, a move that could include future postings from President Donald Trump.
The video platform owned by Alphabet Inc.'s Google said it has seen more people turn to the site in recent weeks for information about the fiercely fought election. More than a month after the Nov. 3 vote, Trump has refused to concede his loss to President-elect Joe Biden and continues to promote the disproven claim that there was widespread voting fraud and mismanaged ballot counting.
Trump's YouTube channel, which has 2.5 million subscribers, currently has a video of him explaining “ongoing efforts to expose the tremendous voter fraud and irregularities which took place during the ridiculously long Nov. 3 elections.” On Twitter, where he has an even greater following, Trump has made similar claims, which have been refuted by courts in multiple states and election officials from both political parties. Twitter labels those claims of election fraud as “disputed” and restricts their circulation.
YouTube said in a blog post Wednesday that its efforts would apply to content posted as of today, given that enough states have certified their election results to determine a president-elect. “We will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election,” YouTube said. “We enforce our policies regardless of speaker,” a spokesperson added.
Videos posted before Wednesday won't be affected. Trump's latest video was uploaded six days ago to his YouTube channel.
Ahead of the election, YouTube didn't have a stated plan for how to deal with videos that disputed the results or lied to viewers about the outcome, setting it apart from Facebook Inc. and Twitter Inc., which had detailed strategies for dealing with misinformation.
As the official vote count dragged on for days, many videos promoted unfounded theories of fraud or vote manipulation via software. YouTube said it ranked these sorts of videos lower in search results and recommended them less often to viewers, but didn't take them down because they were considered to be “expressing views” on the election.
After Biden was declared the winner, Trump and his supporters continued to promote the idea that the election was rigged against him. In late November a group of Democratic U.S. senators appealed to YouTube to take down videos with “false and misleading” information about the election, a sign of political condemnation usually reserved for social networks Facebook and Twitter. The senators said the videos “seek to undermine our democracy and cast doubt on the legitimacy” of Biden's incoming administration.
In its post on Wednesday, YouTube said the site continues to be an important source of election news. On average, 88% of the videos in the top 10 search results related to elections came from authoritative news sources, and the most-viewed channels and videos were from outlets such as NBC and CBS.