Social media companies are to blame for Andrew Tate


The TikTok app's logo seen on a mobile phone screen in this picture illustration taken February 21, 2019. REUTERS/Danish Siddiqui

Deplatforming is effective in limiting the reach and toxicity of spreaders of hate and misinformation, and can reduce harm

Imran Ahmed is the founder and chief executive of the Center for Countering Digital Hate.

We were one of the first organisations to help expose Andrew Tate’s use of social media to turn hatred of women into a profitable business. We had hoped to break his lucrative relationships with the platforms.

But while the subsequent coverage exposed Tate and damaged his business, it also reinforced his status as a cultural figure, and sparked endless speculation about what his rise tells us about a new generation of young men and their views of women.

Tate’s views represent hatred of women in its most naked form. And we know that Tate’s toxic content is having a real impact on young men.

Journalists have gathered evidence from teachers that boys as young as 12 are expressing support for Tate’s views, including his claim that victims are to blame for sexual assault. Teachers have been forced to distribute advice for parents on how to talk to their children about the influence Tate has had on them.


But assertions that Tate’s rise is a product of today’s young men and their views are wrong. In fact, Tate repeatedly failed to build a personal brand until TikTok gave him the means to saturate news feeds.

Archives of his website show that his business and his pitch to young men were the same four years ago as they are today. As far back as 2019, his website promised courses that would “have your girlfriend obey every command”.

So if Tate’s ambitions and pitch to young men have stayed the same, what changed? The answer lies in how Tate learned to game TikTok’s algorithm, allowing his content to flood millions of news feeds.

In August 2022, we carried out experiments to see what content TikTok was recommending to 13-year-old users. We found that Tate’s content was being repeatedly recommended to teen boys within minutes of opening a TikTok account.

New data gathered by our researchers shows that videos posted to just one hashtag associated with Tate were viewed 11.3 billion times in the last 12 months. Over 74% of those views came from users under the age of 24.

Tate’s rapid rise was built on a unique brand of controversial hate, dominance over women and vulgar displays of wealth that caused users to stop scrolling and pay attention, something rewarded by TikTok’s algorithm.

But thanks to the investigative work of a journalist at The Observer, we now know that Tate’s attempts to game the algorithm went much further. Tate was instructing paying members of his “Hustler’s University” to re-post thousands of clips of his content to social media platforms every day, creating a totally artificial impression of popularity, and pushing TikTok’s algorithm to place Tate’s content even higher in news feeds.

This work had an impact: we can see from new analytics data released by TikTok that the popularity of Tate’s content, and particularly the “Hustler’s University” course he used to turn engagement into money, dropped sharply following our work to deplatform Tate and the copycat accounts his acolytes use to artificially boost his content.

This proves that deplatforming and demonetisation are effective: they radically cut Tate’s reach and damaged his ability to make money. This is supported by academic evidence that deplatforming is effective in limiting the reach and toxicity of the biggest spreaders of hate and misinformation, as well as our own experience in deplatforming leading spreaders of anti-vaccine content and anti-Semitic hate.

Platforms were correct to deplatform Tate, but they were wrong not to follow through. They have allowed paying members of Tate’s online academy, now renamed “The Real World”, back onto their platforms, where they continue posting Tate’s content. On Twitter, Elon Musk decided to hand back Tate’s personal account, allowing him to lead a comeback on one of the world’s most influential social media platforms.

Platforms have every reason to act: Tate breached their policies on hateful misogyny, broke their rules on artificially boosting content, and his business is now implicated in a criminal inquiry after he was detained in Romania last month in an investigation into rape and human trafficking allegations.

TikTok made Andrew Tate. By simply enforcing its own standards and deplatforming his content, it can break his hold on young men’s news feeds and minds.


Any views expressed in this opinion piece are those of the author and not of Context or the Thomson Reuters Foundation.






