
EXCERPT

Facebook’s role in the genocide in Myanmar, as revealed in the new book ‘An Ugly Truth’

Image composite: The Reading List

Facebook has been under constant fire for the past five years, roiled by controversies and crises. It’s currently a sponsor of the Tokyo Olympic Games, which makes the appearance of An Ugly Truth: Inside Facebook’s Battle for Domination all the more apt. In this new book, award-winning New York Times reporters Sheera Frenkel and Cecilia Kang offer a riveting, behind-the-scenes exposé, the definitive account of Facebook’s fall from grace.

In An Ugly Truth, Frenkel and Kang focus on what went on inside Facebook during the four years between the 2016 US presidential election that brought Donald Trump to power and Joe Biden’s election in 2020. They demonstrate how, while the tech giant was “connecting the world”, it was also mishandling users’ data, spreading fake news, and amplifying dangerous, polarising hate speech.

In this excerpt, the authors show how Facebook’s expansion project facilitated the beginnings of a genocidal campaign against Myanmar’s Rohingya Muslim people. 

***

In August 2013, ahead of his thirtieth birthday and Facebook’s tenth anniversary, [Mark] Zuckerberg had tapped out a blog post on his iPhone to announce his vision for the next decade. The post was titled “Is Connectivity a Human Right?,” and in it, he announced that Facebook, already the world’s biggest communications network, with 1.15 billion users, was aiming to reach the next 5 billion customers.

It was his moonshot, a personal ambition that would put him in the company of tech visionaries like his mentors, Steve Jobs, who had revolutionized mobile computing in 2007 with the iPhone, and Bill Gates, who had transformed global philanthropy after revolutionizing personal computing. People close to Zuckerberg said he was frustrated with the negative press around his first major effort at philanthropy, a $100 million donation to the Newark, New Jersey, public school system in 2010 for education reforms. Critics panned the effort for doing little to significantly change the education system in the beleaguered city. Zuckerberg had begun to think more about his legacy and had shared with people close to him that he wanted to be remembered as an innovator and philanthropist in the mold of Gates, whom he publicly acknowledged as a personal hero in an interview at the TechCrunch Disrupt conference in 2013.

Internet connectivity was the great bridge to close the gap on global economic inequality, Zuckerberg wrote in the blog. Internet access led to stronger economies and higher gross domestic product; the vast majority of the 2.7 billion people with online access were from developed Western nations. In emerging markets, internet access was often restricted to male heads of households. “By bringing everyone online, we’ll not only improve billions of lives, but we’ll also improve our own as we benefit from the ideas and productivity they contribute to the world.”

The proposition was hardly original. As David Kaye, former UN special rapporteur on freedom of expression, pointed out, “It was an idea that was being discussed on the world stage, by politicians and activists. The idea that people around the world should be brought online was part of the popular discourse.” For years, the United Nations, Human Rights Watch, and Amnesty International had been advocating for internet companies to turn their attention to the developing world. But within their pleas was a note of caution that companies muscling into the new markets be cautious of local politics and media environments. Too much access to the internet too soon could be dangerous. “Everyone agreed that the internet was necessary to enjoy information and that universal internet access was an important goal,” Kaye recalled. “But there was concern about whether the companies were doing the appropriate assessments about the countries and markets they were entering. Would a private company have the motivation to behave responsibly?”

In August 2013, Zuckerberg established Internet.org, a project with six global telecommunications partners aimed at bringing the whole world online. Internet.org struck business deals with cell carriers to offer a stripped-down internet service to developing nations. Facebook came preloaded and was compressed so it could be used even with slow and patchy internet connections. For remote areas with no cellular infrastructure, Zuckerberg created a laboratory for telecommunications projects like Aquila, an autonomous drone designed to beam down the internet to people below, or Catalina, which envisioned bird-size drones that could boost smartphone data speeds. Neither project made it beyond the testing phase.

Google had its own laboratory for broadband projects, which included hot-air balloons that beamed internet connections to rural areas of the world. The global race to acquire new internet customers was under way: Microsoft, LinkedIn, and Yahoo were also investing heavily in global expansion. Chinese companies like Weibo and WeChat were aggressively trying to expand beyond Asia, into Latin America and Africa. Zuckerberg was particularly focused on competing against Chinese companies head-on on their own turf. He had personally begun to lobby China’s regulators and leaders, meeting with President Xi Jinping twice in 2015. The first to capture untapped global markets would be the best positioned for future financial growth.

“It was clear to everyone at Facebook that this was the thing Mark was most excited about, it had buzz,” said a Facebook employee who worked on the initiative. In his weekly meetings with executives, Zuckerberg would regularly ask how new products in development would help with the “Next One Billion” project, and whether engineers were designing with the needs of the developing world in mind. “The message was clear that he wanted to get us there and get us there fast.”

He wasn’t thinking about the consequences of expanding so quickly, especially in nations that did not have democratic systems. As Facebook entered new nations, no one was charged with monitoring the rollouts with an eye toward the complex political and cultural dynamics within those countries. No one was considering how the platform might be abused in a nation like Myanmar, or asking if they had enough content moderators to review the hundreds of new languages in which Facebook users across the planet would be posting. The project didn’t include an overseer role, which, as part of the policy and security staff, would have fallen under [Sheryl] Sandberg’s charge. It would have been a natural fit, given Sandberg’s experience at the World Bank, but she acted more as a promoter and public advocate. “I can’t recall anyone at the company directly questioning Mark or Sheryl about whether there were safeguards in place or raising something that would qualify as a concern or warning for how Facebook would integrate into non-American cultures,” said one former Facebook employee who was closely involved with the Next One Billion project.

As it was, Facebook entered the markets, hired a few moderators to help review the content, and assumed the rest of the world would use the platform in much the same way it had been used in the United States and Europe. What happened in other languages was invisible to leaders in Menlo Park.

Zuckerberg, for his part, was encouraged by the early results. After Internet.org began rolling out in 2014, he touted how women in Zambia and India used the internet to support themselves and their families financially. He was excited about Facebook entering new markets like the Philippines, Sri Lanka, and Myanmar, and was undaunted by early critics. “Whenever any technology or innovation comes along and it changes the nature of something, there are always people who lament the change and wish to go back to the previous time,” he conceded in an interview with Time magazine. “But, I mean, I think that it’s so clearly positive for people in terms of their ability to stay connected to folks.”

***

Lost in the excitement were the clear warning signs. On March 3, 2014, Matt Schissler was invited to join a call with Facebook on the subject of dangerous speech online. He had connected with a Harvard professor named Susan Benesch, who had published papers on hate speech and was communicating her concerns to members of Facebook’s policy team. She asked Schissler to listen in on the call and give his perspective from Myanmar.

When he dialed in to the video link, he was introduced to a half-dozen Facebook employees and a handful of academics and independent researchers. Arturo Bejar, Facebook’s head of engineering, was also on the call. Toward the end of the meeting, Schissler gave a stark recounting of how Facebook was hosting dangerous Islamophobia. He detailed the dehumanizing and disturbing language people were using in posts and the doctored photos and misinformation being spread widely.

The severity of what Schissler was describing didn’t seem to register with the Facebook representatives. They seemed to equate the harmful content in Myanmar to cyberbullying: Facebook wanted to discourage people from bullying across the platform, they said, and they believed that the same set of tools they used to stop a high school senior from intimidating an incoming freshman could be used to stop Buddhist monks in Myanmar from spreading malicious conspiracy theories about Rohingya Muslims. “That was how Facebook thought about these problems. They wanted to figure out a framework and apply it to any problem, whether that was a classroom bully or a call for murder in Myanmar,” said one academic who joined the call and who recalled that no one at Facebook seemed to probe Schissler for more information on the situation in Myanmar.

Schissler had spent nearly seven years in the region—first, along the border between Thailand and Myanmar and, later, within Myanmar itself. He had become conversant in Burmese and had studied the region’s culture and history. What Schissler and other experts were seeing in Myanmar was far more dangerous than a one-off remark or an isolated Facebook post. Myanmar was consumed by a disinformation campaign against the Rohingya, and it was taking place on Facebook.

In the month following the call, a handful of Facebook employees started an informal working group to connect Facebook employees in Menlo Park with activists in Myanmar. The activists were told it would be a direct channel of communication, used to alert the company to any problems. Various members of Facebook’s policy, legal, and communications teams floated in and out of the group, depending on the topics under discussion.

Just four months later, in the first week of July, the Myanmar activists had an opportunity to put the communication channel to the test as rumors began to spread on Facebook that a young Buddhist woman in Mandalay had been raped by Muslim men. Within days, riots broke out across the country. Two people were killed, and fourteen were injured.

In the days leading up to the riots, NGO workers tried to warn the company in the private Facebook group, but they hadn’t heard back from anyone. Now people were getting killed, and there was still no response.

On the third day of the riots, the Burmese government decided to shut down Facebook for the entire country. With the flick of a switch, the nation lost access to the platform. Schissler reached out to Facebook through a contact to ask if they knew about the problem, and he heard back almost immediately. “When it came to answering our messages about the riots, Facebook said nothing. When it came to the internet being shut down, and people losing Facebook, suddenly, they are returning messages right away,” recalled another activist in the group. “It showed where their priorities are.” DM/ML

An Ugly Truth: Inside Facebook’s Battle for Domination by Sheera Frenkel and Cecilia Kang is published by Little, Brown (R355). Visit The Reading List for South African book news – including excerpts! – daily.
