Rudy Giuliani has been hit with another suspension after YouTube announced that the former NYC mayor broke the platform's rules once more by making claims of election fraud about the 2020 US presidential election.
Giuliani, the personal attorney of former US President Donald Trump, was already suspended from the video-sharing platform in January for spreading conspiracy theories about the election within the preceding 90 days.
The second hammer from YouTube fell on Giuliani on Monday. For two weeks, the former NYC mayor cannot post videos or livestream. Under YouTube's policy, an account that receives a third strike within 90 days of its first is permanently banned from the platform, The Verge noted.
"We have clear Community Guidelines that govern what videos may stay on YouTube, which we enforce consistently, regardless of speaker," stated a YouTube spokeswoman in a CNET report.
She said the company removed content from the Rudy W. Giuliani channel for violating its sale of regulated goods policy, which prohibits content that facilitates the use of nicotine. She added that the channel also violated the presidential election integrity policy.
Giuliani received his first strike in January for allegedly violating several YouTube policies, including those on election misinformation. At that time, he was also barred from YouTube's monetization program, cutting him off from advertising revenue. With the second strike, he cannot upload new videos to his channel; a third strike within 90 days of the first would get him banned.
YouTube is not the only one to turn against the former NYC mayor. In January, Giuliani was named a defendant in Dominion Voting Systems' $1.3 billion defamation lawsuit after the 76-year-old politician repeatedly accused the company of stealing the election from Trump.
Social media platforms began enforcing their policies more strictly after the attack on the US Capitol, when it emerged that the rioters had communicated mainly through social media channels.
Aside from YouTube, Facebook and Twitter also issued their own account suspensions. TikTok, in a transparency report, announced it had removed almost 350,000 videos in the US for violating its election misinformation rules.