YouTube said Wednesday it’s cracking down on content on its video-sharing platform that alleges fraud or errors changed the outcome of the 2020 presidential election.
“Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect,” YouTube said on its blog. “Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections.”
The company said examples of content it would remove include videos alleging that widespread software glitches or vote-counting errors changed the outcome of the 2020 election.
The shift in policy comes after four Senate Democrats wrote to YouTube CEO Susan Wojcicki late last month to request that her company remove videos they say contain election misinformation.
Democratic Sens. Mazie Hirono of Hawaii, Amy Klobuchar of Minnesota, Robert Menendez of New Jersey and Gary Peters of Michigan wrote to Ms. Wojcicki to request that YouTube take urgent action given the Jan. 5 runoff elections for Georgia’s two Senate seats.
On Wednesday, YouTube said it’s continuing to look for ways to limit the visibility of political content it deems misinformation.
“[W]hile problematic misinformation represents a fraction of 1% of what’s watched on YouTube in the U.S., we know we can bring that number down even more,” YouTube said on its blog. “And some videos, while not recommended prominently on YouTube, continue to get high views, sometimes coming from other sites. We’re continuing to consider this and other new challenges as we make ongoing improvements.”