How I fell into a terrifying conspiracy theory wormhole on YouTube

A protester holds a Q sign while waiting in line with others to enter a campaign rally with President Donald Trump - Matt Rourke/AP

Jitarth Jadeja was at a low point in his life when he stumbled upon the QAnon conspiracy theory on social media.

“I was suffering from undiagnosed mental illness and was also socially isolated,” recalls the 32-year-old in a phone call from his home in Australia.

“I was looking at all these conspiracy videos and the algorithm keeps giving you more and more of the same kind of stuff you're looking at.”

That's when Jadeja came across QAnon, a baseless internet conspiracy theory whose followers believe that an anonymous internet user known as Q is a US government insider. Q claims Donald Trump is secretly working to bring down an American “deep state” made up of child-abusing celebrities.

For nearly two years, Jadeja spent hours each day browsing online forums such as Reddit and watching QAnon videos on YouTube. He became locked in an algorithmic echo chamber. The more he searched and clicked on conspiracy theories, the more QAnon content was suggested to him online.

"It was like a drug," he says. "I latched on to it pretty much straight away."

It came at a high price. His relationships suffered and Jadeja became increasingly withdrawn and paranoid.

Looking back, he feels lucky to have escaped the virtual cult.

People are drawn to conspiracy theories because they provide a neat – if far-fetched – explanation for complex events, leaving subscribers with a false sense of control and agency.

Jitarth Jadeja - Jitarth Jadeja

"The world is becoming much more complex, economically, [and through] globalisation," says Kevin Munger, an internet and politics assistant professor at Penn State University. "People are looking for some understanding of why it's all happening and conspiracy theories offer an answer."

Karen Douglas, a professor of social psychology at the University of Kent, says people are most susceptible to conspiracy theories when important psychological needs are not being satisfied. People need knowledge and certainty, to feel safe, secure and in control.

The coronavirus pandemic left many feeling like they lack control, and as a result, conspiracy theories have flourished.

Julius, another former conspiracy theory believer, says he became drawn to these theories while in a depressive phase in 2016. Through Reddit, he found the Pizzagate community, where followers believe a powerful cabal of paedophiles exists around the world.

“You want to consume more and more of this content,” he says in a video call, “you feel more and more anxious. It makes you very obsessive and have compulsive behaviours.”

“When you're in a conspiracy and supporting an idea you just reject all the information that comes from the other side, be that fact checking or anything else.”

Some experts argue there is a technical explanation too: the human tendency to conspire has been put on steroids by the recommendation algorithms used by social media platforms.

These sites have long encouraged their users to spend more time on their platforms by recommending new content that might interest them.

In a leaked Facebook presentation from 2016, obtained by the Wall Street Journal, the company acknowledged that “64pc of all extremist group joins are due to our recommendation tools” – pointing to how the platform’s algorithm powered the “Groups You Should Join” and “Discover” features.

In 2018, YouTube's product chief said 70pc of views come from the recommendation algorithm.

Caleb Cain, who says he was radicalised by YouTube and spent years living in a white nationalist rabbit hole on the platform, explains one of the main ways he found far-right content was through recommendation algorithms on the site's homepage and through the sidebar that appears alongside videos.

Looking back, he says the YouTubers he was watching in that period - between 2013 and 2017 - had learnt to exploit the recommendation system. They would invite like-minded creators onto their channel, knowing the algorithm would then start promoting them too and create what Cain calls a "propaganda ecosystem".
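One way to picture that ecosystem is as a graph of guest appearances. The sketch below is a toy illustration with invented channel names, not YouTube's actual channel-recommendation logic: once two creators appear together, a viewer of one is routed towards the other, and a cluster forms.

```python
# Toy sketch of the cross-promotion "propaganda ecosystem" Cain
# describes: guest appearances link channels, and the co-appearance
# graph then routes a viewer of one channel towards its neighbours.
# All channel names are invented; this is an illustration, not
# YouTube's real channel-recommendation logic.
from collections import defaultdict

co_appearances = defaultdict(set)

def record_guest_spot(host, guest):
    """A guest appearance links the two channels both ways."""
    co_appearances[host].add(guest)
    co_appearances[guest].add(host)

# Hypothetical like-minded creators inviting each other on:
record_guest_spot("channel_a", "channel_b")
record_guest_spot("channel_b", "channel_c")

def related_channels(channel, depth=2):
    """Channels reachable within `depth` hops of the co-appearance graph."""
    seen, frontier = {channel}, {channel}
    for _ in range(depth):
        frontier = {n for c in frontier for n in co_appearances[c]} - seen
        seen |= frontier
    return seen - {channel}

# A viewer of channel_a is nudged towards the whole cluster:
print(related_channels("channel_a"))  # {'channel_b', 'channel_c'}
```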

Cain, who has since deradicalised and now studies extremism at American University's PERIL Center in Washington, says: "The algorithm will [then] keep recommending more rightwing content because it's picking up that's what you're watching."

Guillaume Chaslot, a former YouTube engineer who now campaigns for algorithmic transparency, says: "When you build a recommendation system based on the previous experience, you're going to create this filter bubble or echo chamber because you know what [the user] watched so… you recommend more and more of the same thing."
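Chaslot's point can be made concrete in a few lines of code. The sketch below is emphatically not YouTube's recommender, whose details are proprietary; it is a deliberately crude illustration, with invented video names and topic tags, of how scoring candidates against watch history produces "more and more of the same thing".

```python
# Toy watch-history-based recommender, illustrating the filter-bubble
# feedback loop Chaslot describes. NOT YouTube's algorithm; the
# catalogue, topics and video names are all invented.
from collections import Counter

# Hypothetical catalogue: each video is tagged with topics.
CATALOGUE = {
    "cooking_basics":    {"cooking"},
    "qanon_explainer":   {"conspiracy", "politics"},
    "deep_state_expose": {"conspiracy"},
    "election_news":     {"politics"},
    "pasta_recipe":      {"cooking"},
}

def recommend(watch_history, k=3):
    """Score unwatched videos by topic overlap with the history.

    The more conspiracy videos a user watches, the higher conspiracy
    content scores -- a crude echo chamber in a few lines.
    """
    interest = Counter()
    for video in watch_history:
        interest.update(CATALOGUE[video])
    scored = [
        (sum(interest[t] for t in topics), video)
        for video, topics in CATALOGUE.items()
        if video not in watch_history
    ]
    return [v for score, v in sorted(scored, reverse=True)[:k] if score > 0]

history = ["qanon_explainer"]
print(recommend(history))              # conspiracy/politics now outrank cooking
history.append(recommend(history)[0])  # each click reinforces the bubble
print(recommend(history))
```

Because every click feeds back into the interest profile, the suggestions narrow with each round, which is the mechanism Jadeja and Cain both describe from the inside.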

Chaslot says a lot has changed at YouTube since Cain's experience. YouTube is keen to stress that it has removed much of the kind of content encountered by Cain, Jadeja and Julius, taking down tens of thousands of Q-related videos and terminating hundreds of Q-related channels since 2018.

A spokesperson for the platform adds: "In early 2019, we also began updating our systems to reduce recommendations of borderline content, including Q-related conspiracy theories and have seen the number of views that come from recommendations to prominent Q-channels drop by over 80pc."

But there is a catch, says Chaslot: "The caveat is it's only for the conspiracy theories they identify as harmful. That means some conspiracy theories are still recommended because they are not considered harmful content."

He adds it's not only YouTube recommendation algorithms which promote conspiracy theories. "YouTube definitely played a major role in propelling QAnon into the mainstream," he says. "[But] it's not the only one. Facebook group recommendations played a huge role."

Facebook said it made fundamental changes to its algorithm in 2018, prioritising posts from friends and family. "We have made significant progress limiting the spread of misinformation, reducing polarisation and we have banned content from dangerous conspiracy theory groups like QAnon," added a spokesperson.

A paper in the Journal of Medical Internet Research also directed blame towards Twitter's algorithm for promoting conspiracy theories linking 5G to the coronavirus via its trending feature, which recommends topics to users based on what it perceives to be popular.

A Twitter spokesperson said the company was investigating the claims made in the paper and prioritising the removal of Covid-19 content that could cause harm.

In another study, released in June, Daniel Allington, a lecturer in social and cultural artificial intelligence at King’s College London, found that people whose information about coronavirus comes from YouTube, Facebook, WhatsApp or Twitter are more likely to believe in conspiracy theories, and less likely to be following public health advice.

"Statistically, YouTube had the strongest association both with conspiracy beliefs  and with not following public health advice," he says.

"Social media platforms are geared up to give you more of what's popular already. So if people are sharing a particular YouTube video, and it's taking off and going viral, YouTube's algorithm thinks to itself, 'Oh, this is a good thing. Let's make it go even more viral by pushing you to more people'."

This was a particular problem for Jadeja. “They're giving you stuff that you will likely click on. So they build up an echo chamber. If you're a conspiracy theorist, it can lead to a significant problem.”

Despite this, Penn State's Kevin Munger says it's hard to pin the blame for conspiracy theories directly onto algorithms. Instead, the reality is messier.

"It's very difficult to tell [what's to blame] because we've had algorithms happening at the same time as lots of other stuff. The other stuff is the ability of everyone in a given country or something or the overwhelming majority to produce and consume unlimited text, images and videos. I personally think that latter thing is much more important than the algorithms."

But he does agree that algorithms accelerate the process of connecting like-minded people.

Jadeja is also reluctant to blame the internet for his experience. “I think there were external factors. This is my fault. I did this. No-one did it to me,” he says.

Jadeja counts himself lucky that he began to see factual issues in what Q was posting online.

Just as YouTube and Reddit brought him into the conspiracy, the same sites helped him to realise the absurdity of his beliefs. He eventually watched a YouTube video which pointed out that a key piece of proof for the conspiracy was a coincidence.

“It was literally like my world was turned upside down in just a split second,” he says.

Julius also realised the unhealthy nature of his beliefs and changed his lifestyle, moving country and job and eventually rejecting the theories which had dominated his life.

“It is really horrible to be somebody who is invested in some kind of conspiracy. It takes years of your life,” he says.

Jadeja is now speaking out about his experience in an attempt to stop other people being corrupted by these beliefs. “I'm so happy I got out,” he says. “The way I was, it could have gone on forever. It could have gone on for the rest of my life.”

A Reddit spokesman said: "Reddit has banned communities devoted to Pizzagate and QAnon that violated our site-wide policies, in 2016 and 2018 respectively."