A couple of weeks before the 2020 US elections, several social media giants rolled out new policies to keep accounts from spreading misinformation about election results. YouTube, however, did not issue its policy until December 2020, when it announced that any channel posting content promoting false claims or theories about fraudulent votes would face restrictions on the platform.
Rather than removing offending videos outright, the company temporarily restricts the channel under a three-strike system. A channel that receives one strike is barred from uploading new videos to YouTube for a week, while any channel that accumulates three strikes within a 90-day period is permanently terminated.
The company also removed a video uploaded by former US President Donald Trump about the mob attack on the Capitol; the video additionally contained false claims about the 2020 election results.
Other companies, including Twitter, Facebook, and Instagram, have likewise taken strict action against Trump’s misleading posts for violating their policies. Twitter banned Donald Trump’s account for 24 hours, and Facebook and Instagram similarly blocked him from posting content for 24 hours.