Facebook’s Taliban ban will prove costly for Afghans
The tech giant is on the wrong side of history yet again
As the Afghan state collapsed, the international community headed for the exits, and hundreds of thousands of Afghans struggled to follow, social media remained on the ground as the only reliable eyes and ears on the unfolding calamity. But even those eyes and ears may be at risk of being lost. Citing its stance on "dangerous individuals and organisations," Facebook reaffirmed its ban on the Taliban and pro-Taliban content on Aug. 17. After landing on the wrong side of history so many times for spreading misinformation and propaganda, the beleaguered tech company seems eager to nip another reputational risk in the bud.
I fear Facebook will end up on the wrong side of history yet again. Notwithstanding the Taliban's record of violence and oppression and likely brutal rule, deplatforming the group may make a monumental failure of US intelligence and nation-building even worse. For several reasons, it is ill-advised.
For one, Facebook's ban is half-hearted and full of holes. Pro-Taliban users can still freely use WhatsApp, the wildly popular Facebook-owned messaging app. This is because the app is end-to-end encrypted, which means its content cannot be read by Facebook itself. While it is true that Facebook has blocked official Taliban WhatsApp accounts, such tactics have also had unintended consequences: For example, a WhatsApp hotline set up by the Taliban for Afghan citizens to report violent incidents and looting also got shut down.
Meanwhile, in anticipation of being kicked off the Facebook platform itself, pro-Taliban users put into action time-tested bypass techniques used by such groups as Hezbollah, the Islamic State, and Mexican drug cartels. These techniques include changing hashtag spellings to evade detection and posting the same content in multiple accounts with the expectation that some will make it through Facebook's leaky filter even if others are shut down. As a result of these and other techniques, one-third of 68 US-designated terrorist groups or their leaders have an official presence on Facebook—despite bans imposed by the company.
Today's Taliban are not the pre-2001 Taliban that banned movies, music, and the internet; the Taliban 2.0 are a media-savvy organisation. Like the Islamic State before them, they have mastered the art of the viral video. To gather even more followers, they have captured popular hashtags by inundating them with messages. Followership of pro-Taliban accounts has already spiked since the Taliban takeover. Even in the unlikely event that Facebook's wall proves to be impenetrable, the Taliban are not without options. Other platforms, such as Telegram and Twitter, are still available for use. The Taliban's propaganda machine, in other words, is far from being silenced.
Now, consider another issue. The presumptive head of the new government, Taliban co-founder Abdul Ghani Baradar, is in the same boat as former US President Donald Trump. Neither can post on Facebook—never mind that they were both principals to the so-called "peace deal" reached in Doha in 2020, which ultimately resulted in this month's US withdrawal. But this is where the similarities end. Baradar is the presumptive head of a government, while Trump is just a private citizen (albeit one who still gets a disproportionate amount of attention). Blocking access to an administration running a country or its leader, no matter how awful, is different from blocking an insurgency group or a terrible former president. It's a valid question: Doesn't the official leader of a country deserve to have a presence on Facebook?
If Baradar is kept off Facebook, the Afghan people could pay the price for the ban. He could very easily say, "If I can't use Facebook, neither can anyone else." If he has no way to tell his side of the story, why should he allow ordinary Afghans, let alone dissidents, to post on Facebook about their misery under Taliban rule? The Taliban can use their power over the Afghan telecommunications system to block access to selected social media platforms or even the internet as a whole. Such a ban would not be unique. Just ask the governments of China, North Korea, Iran, Vietnam, Turkey, or Bangladesh for instructions on how to turn off Facebook in a country.
A general blackout of major social media platforms or the internet would add to the Afghan tragedy. Today, independent sources of information—journalists, international organisations, relief agencies, and other nongovernmental organisations—are taking flight. The primary sources of information are now ordinary people, on the ground, armed with their smartphones and posting videos of their desperation on social media. We could therefore be looking at a news and intelligence blackout of the country. But it gets worse: Afghanistan is bordered by countries crucial to US national security, such as Iran, Pakistan, and China. A blackout in Afghanistan could lead to a wider intelligence blackout, given its location in the heart of one of the world's most volatile regions.
Moreover, functioning social media is essential for civil resistance and for groups seeking to organise grassroots institutions, local relief, and humanitarian aid from the bottom up. For Afghan women, in particular, social media—even if used anonymously—may remain the only means of connection, expression, and personal and professional growth as their rights are severely curtailed by the new regime. Facebook is a particularly important platform across Afghanistan. Roshan, the country's largest telecom company, offers special low-cost data plans with Facebook already bundled in. Facebook is also taking steps to protect the security of its Afghan users by hiding their friends lists on their pages on the platform. It ought to pay attention to ensuring that its other policies—such as the Taliban ban—don't end up putting the pages themselves out of reach.
There's no doubt that Facebook is in a bind. There is a risk associated with giving the incoming regime tools to project extremist and dehumanising propaganda, misinformation, and intimidation. The costs of the Taliban's digital power can thus be very high. Sadly, the costs of stifling the voices of everyone else across the country may be even higher.
If, in light of all these considerations, Facebook makes the difficult but ultimately right choice of lifting the ban on the Taliban, the company must then do the hard work of monitoring pro-Taliban posts and pages, tracking the content, and recognising evolving narratives. It must take responsibility and block, deprioritise, and suspend based on a thorough analysis of content, weeding out material that violates community standards, peddles misinformation, propagates violence, or advocates the suppression of human rights. Given the rudimentary state of Facebook's automated content monitoring—which often blocks harmless content while being oblivious to the latest extremist narratives—this will require large numbers of well-trained human editors, which the company has been at pains to avoid so far. But investment in such monitoring is more essential than ever, most immediately in Afghanistan.
Furthermore, given its impact on the humanitarian situation and intelligence gathering, Facebook needs to coordinate its actions with the US government and the international organisations that are still engaging with Afghanistan. This means that the company, like other social media companies, cannot operate independently and needs to be part of a larger effort. The bottom line: It would be irresponsible for Facebook executives to take the easy way out by just banning the Taliban and thinking they have washed their hands of the problem.
This would also be a good occasion to reflect on Facebook's organisational structures and internal checks and balances. The tech giant's leadership should have learned its lesson from a recent similar case of banning a new regime. After Myanmar's military coup in February, Facebook said it would reduce distribution of content from the military, subsequently going further and banning all military-related entities, including ads from military-linked businesses. But these bans were blunt instruments. Facebook's blind algorithms began promoting pro-military propaganda from other accounts, according to a report from the human rights group Global Witness.
Whatever steps Facebook takes in Afghanistan, it should make sure that its technology does not displace human judgment. Its own left hand must track what its right one is doing, even as Facebook joins hands with the larger international community to not get things wrong yet again in Afghanistan. To the Afghan people now left to their fate under the Taliban, we owe at least this much.
Bhaskar Chakravorti is the dean of global business at Tufts University's Fletcher School of Law and Diplomacy. He is the founding executive director of Fletcher's Institute for Business in the Global Context, where he established and chairs the Digital Planet research programme.
Disclaimer: This article first appeared on Foreign Policy, and is published by special syndication arrangement.