TECHNOLOGY
Bloomberg Businessweek
May 14, 2018
THE BOTTOM LINE Facebook has reduced ISIS- and al-Qaeda-
related material, but posts from similar groups with thousands of
followers don’t seem to have suffered the same crackdowns.
For years, Facebook has tried to take down pages
associated with U.S.-designated terrorist groups. In
2014, within hours of Bloomberg Businessweek
inquiring about pages for Hezbollah, Facebook removed
those for Al-Manar, Hezbollah news site Al-Ahed,
and the Islamic Resistance in Lebanon, a char-
ity associated with Hezbollah. All three, however,
quickly reappeared with tweaks to make them seem
new. At the end of April, Al-Ahed’s website linked to
an Arabic Facebook page with more than 33,000 fol-
lowers. Content on the page included a video of
masked snipers targeting Israeli soldiers. Another
Al-Ahed Facebook page had more than 47,000 fol-
lowers, and one in English had 5,000.
Facebook’s policies prohibit material that sup-
ports or advances terrorism. The company’s defini-
tion of the term, published last month for the first
time, includes a ban on nongovernmental organi-
zations that use violence to achieve political, reli-
gious, or ideological aims. It specifies that such
groups include religious extremists, white suprema-
cists, and militant environmental groups. Facebook
also says content that violates its policies is “not
allowed” on the site.
The company only recently began scanning
more actively for content from Islamic State and
al-Qaeda after pressure from governments and is
training its artificial intelligence systems to get bet-
ter at flagging bad posts. Meanwhile, journalists and
researchers frequently find supposedly banned con-
tent just by searching for it. A report in the New York
Times in April uncovered hundreds of fake accounts
on Facebook and Instagram posing as Mark Zuckerberg
and Chief Operating Officer Sheryl Sandberg. A day
earlier, science and tech publication Motherboard
noted that some pages on Facebook store stolen
data, including social security numbers.
When asked about that story on a conference call,
Sandberg said Facebook takes down such informa-
tion as soon as employees become aware of it. “Posts
containing information like social security numbers
or credit cards are not allowed on our site,” she said.
To help prune out the worst offenders, Facebook
has added content reviewers. It has 7,500, up 40 per-
cent from the year before. They work in about
40 languages; the company plans to add staff flu-
ent in the languages that require the most attention.
Terrorists’ enthusiastic embrace of social
media has long caused angst at Facebook and its
global competitors. Like Twitter Inc. and Google’s
YouTube LLC, Facebook has historically put the
onus on users to flag content to moderators.
When pressed by Congress about the failures
to respond quickly in those instances, Zuckerberg
spoke of how, when starting the company in his
Harvard dorm, he simply didn’t have the resources
to vet everything. Having users speak up about hor-
rors was the easiest way to get things off Facebook.
That strategy had support from Section 230 of the
Communications Decency Act, which limits web-
sites’ liability for what users post. That protection
is being gradually weakened; last month, President
Trump approved an exception that allows prosecu-
tors to go after online platforms if they’re being used
for sex trafficking. Zuckerberg now says he considers
Facebook responsible for what’s posted on the site.
That doesn’t necessarily mean legal responsibility.
Instead, the company has tried to frame its attempts
to clean itself up as a public service.
While Facebook has made its guidelines public,
it hasn’t been clear how they evolved, and some
view them as open to interpretation. “They should
be transparent about the laws or regulations that
they’re using to underpin their policies, but they’re
unfortunately not,” says Jillian York, the Electronic
Frontier Foundation’s director for international free-
dom of expression. York, who’s based in Berlin, says
Facebook risks meddling in local politics by picking
and choosing which groups are terrorist. One could
argue that blocking material from Hezbollah, which
is also a political party with seats in Lebanon’s
Parliament, can hand its
political competitors an advantage, she says.
It’s also sometimes difficult to determine who’s
behind a Facebook page, even if it sports the logos
and content of known terrorist groups. In the
case of Boko Haram, the Nigerian group loyal to
Islamic State, research published by the Jamestown
Foundation in December said the group went by
the name “Khairul Huda” on Facebook. A profile
under that name exists, featuring plenty of photos of
friends holding rifles or wearing balaclavas. Among
them: a Facebook member who posted an appeal
in December for volunteers to fight in Jerusalem “to
raise the banner of God” and liberate the city. “Will
you join me?” he wrote. “Inbox us.”
Once Facebook kicks these groups off, it doesn’t
appear to use sophisticated means to prevent
them from coming back. In April nine Hezbollah-
related Facebook pages disappeared after the non-
profit Counter Extremism Project publicized links,
including a tribute page to martyrs; it had more than
60,000 followers. Within two weeks, a replacement
popped up. Bloomberg Businessweek found it by
searching on Facebook for the website that had been
listed on the original page. All that had changed was
the language of the word “martyr,” from English to
Persian.
—Vernon Silver and Sarah Frier