Instagram must protect free speech — lives depend on it
June 18, 2021
For the past few months, India’s second COVID-19 wave has devastated the country. The world has seen the news bulletins of people lining the streets in makeshift beds attached to quickly-assembled oxygen machines. We have seen and heard the pleas for oxygen cylinders and hospital beds through family and friends on social media.
We learned that people who urgently needed beds were denied them, and that official government channels failed in the face of the crisis.
In parallel with the crisis came an epidemic of corruption, with Indian politician Tejasvi Surya alleging in May that bed allocation irregularities in Bangalore were linked to an assistant of a local BJP politician. In the face of these challenges, people had no option but to turn to the black market to find the oxygen and medicine they needed, if they could afford to do so.
The breakdown of the public health infrastructure, the enormous pressure on medical professionals and the scarcity of testing kits have been an abject failure of the Indian state.
Throughout this crisis, volunteers stepped up to provide vital services, delivering food, medicine, oxygen, and protective clothing. Most of these efforts were coordinated on social media, especially Instagram. However, many of these posts appeared to vanish. People began to question why.
Why did Instagram take down these coordination posts?
Reports of content removals by Instagram began emerging on May 6. Removed posts included appeals supporting relief efforts for frontline workers handling bodies in crematoriums, as well as shared news stories from CNN and the BBC about rising case numbers.
Criticism of Prime Minister Narendra Modi’s government’s handling of the crisis was also removed, including references to a scandal involving an MP from Modi's party who allegedly lashed out at a doctor for hiring Muslims to work on his team.
On May 7, Instagram issued a public statement that a "widespread global technical issue not related to any topic" was to blame; later that day it claimed to have "fixed the issue." It referred to takedowns in Colombia, Canada, the United States and East Jerusalem, but did not mention India.
Then, in another statement on May 8, the company explained that the problem had arisen from an update to its automated systems intended to detect whether media reshared in a story was still available; the update affected Instagram Stories, Highlights and Archives.
Automated content moderation: A threat to free speech
For years, experts have warned that automated content moderation poses a serious threat to freedom of expression. These tools almost invariably have a disproportionate impact on marginalized and vulnerable communities, and this is a case in point. Even with the limited knowledge available, it is clear that automated systems do not understand context and are susceptible to glitches that have profound consequences on the ground, particularly during times of crisis.
Adam Mosseri, head of Instagram, stated on Twitter that the bug "wasn’t related to the content itself." But all of the screenshots sent to ARTICLE 19 were about COVID-19 relief, politics and activism, and these were being taken down around the same time that posts about East Jerusalem were also being removed — a trend that received widespread media attention.
Mosseri's explanation does not account for disappearing messages or for the takedown of volunteer-driven and activist accounts, many of which remained banned from the platform as of May 20, nearly two weeks after Instagram claimed to have fixed the problem.
Social media platforms need to be transparent
Free speech and digital rights organizations, including ARTICLE 19, are urging Instagram, which is owned by Facebook, to be transparent about what has happened and to do everything it can to ensure people can share the vital information they need during the current crisis. Accounts and content that were taken down should be reinstated in line with Facebook's Corporate Human Rights Policy, which explicitly references international human rights treaties.
Even if we don’t know what prompted the takedowns of Indian content (and we should), we must be wary of the pressure that governments and political agendas may exert in future. Instagram and other platforms must foster relationships with civil society, journalists and opinion-makers, including groups at risk, to learn about the problems they encounter on the platform, and then find ways to address those that result in censorship, whether accidental or not.
Quinn McKew has been Executive Director of ARTICLE 19 since May 2020. She has worked for the organization since 2011.