LONDON — In Sri Lanka and Myanmar, Facebook kept up posts that it had been warned contributed to violence. In India, activists have urged the company to combat posts by political figures targeting Muslims. And in Ethiopia, groups pleaded for the social network to block hate speech after hundreds were killed in ethnic violence inflamed by social media.
“The offline troubles that rocked the country are fully visible on the online space,” activists, civil society groups and journalists in Ethiopia wrote in an open letter last year.
For years, Facebook and Twitter have largely rebuffed calls to remove hate speech or other comments made by public figures and government officials that civil society groups and activists said risked inciting violence. The companies stuck to policies, driven by American ideals of free speech, that give such figures more leeway to use their platforms to communicate.
But last week, Facebook and Twitter cut off President Trump from their platforms for inciting a crowd that attacked the U.S. Capitol. Those decisions have angered human rights groups and activists, who are now urging the companies to apply their policies evenly, particularly in smaller countries where the platforms dominate communications.
“When I saw what the platforms did with Trump, I thought, ‘You should have done this before, and you should do this consistently in other countries around the world,’” said Javier Pallero, policy director at Access Now, a human rights group involved in the Ethiopia letter. “Around the world, we are at the mercy of when they decide to act.”
“Sometimes they act very late,” he added, “and sometimes they act not at all.”
David Kaye, a law professor and former United Nations monitor for freedom of expression, said that political figures in India, the Philippines, Brazil and elsewhere deserved scrutiny for their behavior online. But he said the actions against Mr. Trump raised difficult questions about how the power of American internet companies is applied, and whether their actions set a precedent for more aggressively policing speech around the world.
“The question going forward is whether this is a new kind of standard they intend to apply for leaders worldwide, and do they have the resources to do it,” Mr. Kaye said. “There is going to be a real increase in demand to do this elsewhere in the world.”
Facebook, which also owns Instagram and WhatsApp, is the world’s largest social network, with more than 2.7 billion monthly users; more than 90 percent of them live outside the United States. The company declined to comment, but has said that the actions taken against Mr. Trump stem from his violation of existing rules and do not represent a new global policy.
“Our policies are applied to everyone,” Sheryl Sandberg, Facebook’s chief operating officer, said in a recent interview with Reuters. “The policy is that you can’t incite violence, you can’t be part of inciting violence.”
Twitter, which has about 190 million daily users globally, said its rules for world leaders were not new. In reviewing posts that could incite violence, Twitter said the context of the events was crucial.
“Offline harm as a result of online speech is demonstrably real, and what drives our policy and enforcement above all,” Jack Dorsey, Twitter’s chief executive, said in a post on Wednesday. Yet, he said, the decision “sets a precedent I feel is dangerous: the power an individual or corporation has over a part of the global public conversation.”
There are signs that Facebook and Twitter have begun acting more assertively. After the Capitol attack, Twitter updated its policies to say that repeat offenders of its rules around political content would have their accounts permanently suspended. Facebook took action against a number of accounts outside the United States, including deleting the account of a state-run media outlet in Iran and shutting down government-run accounts in Uganda, where there has been violence ahead of elections. Facebook said the takedowns were unrelated to the Trump decision.
Many activists singled out Facebook for its global influence and for not applying its rules uniformly. They said that in many countries the company lacks the cultural understanding to identify when posts may incite violence. Too often, they said, Facebook and other social media companies do not act even when they receive warnings.
In 2019 in Slovakia, Facebook did not take down posts by a member of parliament who was convicted by a court and stripped of his seat for incitement and making racist comments. In Cambodia, Human Rights Watch said the company was slow to act on the involvement of government officials in a social media campaign to smear a prominent Buddhist monk championing human rights. In the Philippines, President Rodrigo Duterte has used Facebook to target journalists and other critics.
After a wave of violence, Ethiopian activists said Facebook was being used to incite violence and encourage discrimination.
“The truth is, despite good intentions, these companies do not guarantee uniform application or enforcement of their rules,” said Agustina Del Campo, director of the Center for Studies on Freedom of Expression at the University of Palermo in Buenos Aires. “And oftentimes, when they attempt it, they lack the context and understanding needed.”