Oct 15 (Reuters) - Alphabet Inc's YouTube said on
Thursday it was tightening its rules on conspiracy theory
content, including QAnon, that targets individuals and groups.
The move is the latest in a series of platform crackdowns on the
unfounded and sprawling conspiracy theory.
WHAT IS QANON?
QAnon followers espouse an intertwined series of beliefs,
based on anonymous Web postings from "Q," who claims to have
insider knowledge of the Trump administration.
A core tenet of the conspiracy theory is that U.S. President
Donald Trump is secretly fighting a cabal of child-sex predators
that includes prominent Democrats, Hollywood elites and "deep
state" enemies.
QAnon, which borrows some elements from the bogus
"pizzagate" theory about a pedophile ring run out of a
Washington, D.C., restaurant, has become a "big tent" conspiracy
theory encompassing misinformation about topics ranging from
alien landings to vaccine safety.
Followers of QAnon say a so-called Great Awakening is coming
to bring salvation.
HOW HAS IT SPREAD ONLINE?
The "Q" posts, which started in 2017 on the message board
4chan, are now posted on 8kun, a rebranded version of the
shuttered web board 8chan.
QAnon has been amplified on Twitter, Facebook
, Instagram and YouTube, the video streaming service of
Alphabet Inc.
Media investigations have shown that social media
recommendation algorithms can drive people who show an interest
in conspiracy theories toward more material.
A report by the Institute for Strategic Dialogue (ISD) found
that the number of users engaging in discussion of QAnon on
Twitter and Facebook has surged this year, with membership of
QAnon groups on Facebook growing by 120% in March.
Researchers say Russian government-supported organizations
are playing a small but increasing role in amplifying the
conspiracy theory.
QAnon backers helped organize real-life protests against
child trafficking in August and were involved in a pro-police
demonstration in Portland, Oregon.
QAnon also looks poised to gain a toehold in the U.S. House
of Representatives, with at least one Republican candidate who
espouses its beliefs on track to win in the Nov. 3 elections.
WHAT ARE ONLINE PLATFORMS DOING ABOUT IT?
Twitter in July said it would stop recommending QAnon
content and accounts in a crackdown it expected would affect
about 150,000 accounts. It also said it would block QAnon URLs
and permanently suspend QAnon accounts coordinating abuse or
violating its rules.
Facebook Inc in October stepped up its earlier
actions against QAnon by saying it would remove any Facebook
pages, groups and Instagram accounts "representing" QAnon. It
had previously announced a ban on ads that praised or
represented militarized social movements and QAnon.
Following these moves, video platform YouTube said it was
banning content that targets an individual or a group using
conspiracy theories such as QAnon or pizzagate that have "been
used to justify real-world violence."
YouTube has also said it reduces its recommendations of
certain QAnon videos that "could misinform users in harmful
ways." ISD researchers found that about 20% of all QAnon-related
Facebook posts contained YouTube links. YouTube does not have a
specific ban on monetizing QAnon content.
A spokeswoman for the short-form video app TikTok said QAnon
content "frequently contains disinformation and hate speech" and
that it has blocked dozens of QAnon hashtags.
A Reddit spokeswoman told Reuters the site has removed QAnon
communities that repeatedly violated its rules since 2018, when
it took down forums such as r/greatawakening.
E-commerce site Etsy also said in October it was
removing all QAnon merchandise from its marketplace. A review of
Amazon.com Inc and eBay Inc showed sellers
listing QAnon-branded items including books, face masks,
T-shirts and hats.
Amazon and eBay did not immediately respond to questions
about whether they were taking specific actions against QAnon.
(Reporting by Elizabeth Culliford in Birmingham, England,
Joseph Menn in San Francisco and Ted Hesson in Washington;
Editing by Matthew Lewis)