
Government pressure pushes platforms toward more censorship and less privacy for users


Introduction

This article by Carola Frediani was published in Italian on Valigia Blù, a collective blog run by quality journalists who pay close attention to verifying their sources.

The article is an impressive review of the many cases in which governments around the world are pressuring service providers (messaging, email, social networks, etc.) in ways that undermine users' freedom of expression and privacy.

In particular, one of the most recent topics it addresses is the derogation from the ePrivacy Directive: a derogation that clears the way for service providers to introduce automatic message-scanning systems to detect child sexual abuse content.

This measure, unfortunately approved by a large majority of the European Parliament, will have a questionable impact on the fight against child sexual abuse and, above all, represents the greatest attack on the confidentiality of correspondence carried out in democratic Europe since the end of the Second World War.

Despite the importance of the provision, the European press has paid little attention to the matter: newspapers have mostly reprinted the press releases of the European Parliament or, at most, those of Pirate MEP Patrick Breyer, one of the fiercest opponents of the measure.

In the face of this silence, it therefore seemed important to us to ensure that Carola Frediani's article could resonate beyond Italy's borders.

Since Valigia Blù publishes its articles under the CC BY-SA 4.0 licence, and after contacting the author in any case, we looked for a way to circulate this article in as many European languages as possible.

The Twitter user @TheIdealist_0 has therefore translated Carola Frediani's article into English, French and German with the help of automatic tools.

We ask all users to contribute to improving the translation, also by using the "PAD" indicated in @TheIdealist_0's tweet; but above all, we ask you to give the article the widest possible visibility.

Thanks to everyone from the Pirati association & the InformaPirata blog

Original Italian version by: Carola Frediani @carolafrediani

June 6, 2021 · 12 min read

(Translated with DeepL Translate)

Government pressure pushes platforms toward more censorship and less privacy for users

In order to justify its decision to force a Ryanair flight to land in Minsk and to arrest journalist Roman Protasevich and his girlfriend Sofia Sapega, Belarus clung to the threat of a bomb on the plane, and in particular to an alleged email sent by Hamas militants (the group denied any involvement). But when researchers at the Dossier Center obtained and published what appeared to be the email in question, sent from an address of the encrypted email provider Protonmail, the timestamp belied Minsk: the message had been sent only after the Belarusian authorities had already warned the plane of the possible bomb.

Protonmail and Belarus

The interesting part of this technical detail within a broader geopolitical affair is that, shortly afterwards, the encrypted email service in question, Protonmail, based in Switzerland, came out into the open and decided to confirm those journalistic revelations. Although it could not access the contents of the messages in its inboxes, the company was able and willing to confirm the date and, above all, the time the email was sent, which followed the report of the bomb to the Ryanair crew. "We have not seen any credible evidence that what Belarus claims is true," the company added, saying it was ready to cooperate with European investigations (at which point the Belarusian authorities claimed that two emails had been received).

“Due to the use of Protonmail by Belarusian citizens to protect their privacy, Lukashenko’s government has attempted to block access to Protonmail since summer 2020,” the Swiss service stressed in a note. “We condemn these actions and also the recent ones related to the Ryanair flight.”

Repression of media and communication tools

It is a stance that struck some observers, but it is hardly surprising. Belarus has not only tried to block Protonmail, but also independent news outlets. A few days before the hijacking of the Ryanair flight, the Belarusian news site Tut.by, which had covered the anti-regime demonstrations that broke out last August after allegations of electoral fraud, was blocked following a police raid on its offices. At the end of May, the director of another news website, Hrodna.life, was detained by police and interrogated for publishing "extremist" content. At least 27 media workers are currently in prison, convicted or awaiting trial, according to the Belarusian Association of Journalists.

Protasevich himself was the director of two channels on the messaging app Telegram that carried information about the anti-government protests and had millions of subscribers. The channels and the app remained accessible to citizens despite state repression and internet blackouts.

Nexta Live, one of the channels co-founded by Protasevich, which published real-time news and information about the protests, grew from 300,000 subscribers to 2 million in the three days after the August elections. When the authorities sought to prosecute the channel administrators, Telegram hurriedly changed its functionality to allow them to post to groups anonymously. "The incognito admin will be hidden in the list of group members, and his messages in the chat will be marked with the group name, similar to channel posts," the app announced in September.

“From the very beginning Telegram has become an integral part of the Belarusian protests (…) and the app itself is not shy about its political alignment,” the Institute for Internet & the Just Society wrote some time ago.

Read also >> Belarusian President Lukashenko hijacks a plane to arrest a dissident blogger: "A state terrorist act".

Pro-privacy apps and the policy of data minimisation

Protonmail and Telegram are not the only examples of communication services that have taken on authoritarian regimes or those accused of doing business with them.

Signal, another encrypted messaging app, which is particularly respected by cybersecurity experts and journalists, fired a shot across the bow of a well-known Israeli company, Cellebrite, which sells tools for forensic analysis and the extraction of data and messages from phones to the police forces of various states (and is suspected, according to some journalistic investigations, of also selling to authoritarian states). Signal went so far as to claim to have hacked one of the company's products, to the point that Cellebrite sent a security update to its customers to mitigate a vulnerability, Vice reported.

Threema, another encrypted messaging app that does not require a telephone number or email address, originated in Switzerland and is mainly used by German, Austrian and Swiss users; it has recently won a legal victory. The Swiss Federal Supreme Court confirmed an earlier ruling that the company cannot be treated as a telecommunications provider, with the consequence that it is not obliged to retain a range of user data. "The authorities' attempt to expand their sphere of influence to gain access to even more user data has finally failed," said Roman Flepp, head of the app's sales and marketing division.

This line – that of minimising user data – is also proudly defended by Signal itself. At the end of April, Signal wrote on its blog that it had received a subpoena from an American court requesting a set of information "that falls into this non-existent category, including users' addresses, their correspondence, the name associated with the account". But Signal could not provide any data, because it did not have it. "It is impossible to provide data you don't have," the company wrote on its blog, "apart from the date an account was created and the date of its last connection to the service."

Growing demands from states

Despite this small group of pro-privacy companies and organisations (Signal, in particular, is a non-profit foundation) that, whether for business reasons or on principle, take very clear positions in defence of the right to freedom of expression and privacy, the reality is that most digital platforms, especially the largest ones, are currently being cornered by states, sometimes with demands that severely restrict these rights.

Since January, as anti-government protests and demonstrations in support of the political opponent Alexey Navalny have grown, Russia has intensified its pressure on Google, Twitter and Facebook. Not only has the government ordered the platforms to store all data on Russian users inside the country by 1 July, but it has also stepped up demands for the removal of content deemed illegal. If companies do not comply, they risk fines or having access to their services slowed down. TikTok has also been fined for failing to remove posts that the government believes encouraged young people to take part in demonstrations deemed illegal.

Meanwhile, in Nigeria, the government has just announced that it will suspend Twitter operations in the country (presumably with blocks at the level of telcos and ISPs) because the social network deleted tweets by President Buhari that threatened violence against certain groups.

India, Twitter and WhatsApp

But the toughest clash at the moment is taking place in India, a great democracy and, above all, a huge market. In February, the government announced new rules for digital platforms and messaging services (the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules), effective from the end of May. They require platforms to have legally responsible contact persons on the ground, resident in India; to provide mechanisms for verifying accounts, for example by phone number; and to be prepared to remove content deemed harmful or dangerous by a specific government body. If platforms do not comply, they risk losing legal protection for the content they host.

These rules came just after a tug-of-war between the government and Twitter, when the latter, not without initial uncertainty, had finally resisted censoring a series of accounts and tweets linked to the farmers’ protests. The tug-of-war recently culminated in a visit by the Delhi police to the company’s offices. In this case, Twitter had labelled some tweets of prominent politicians of the ruling nationalist BJP party as “manipulated media”, after fact-checkers had also found them misleading. But the government did not like the move, and in response sent policemen to local offices to deliver a notice of investigation into the matter. Last month, India had also asked Facebook, Instagram and Twitter to remove content criticising Prime Minister Narendra Modi’s handling of the pandemic. 

Read also >> India: Farmers' protests, Internet shutdown and government's request to block several Twitter accounts: "Democracy is being killed"

End-to-end encryption in the crosshairs

But it is not only the removal of content posted on social networks that is at stake. Among the requirements introduced by the new rules, one in particular has alarmed encrypted messaging services: they must identify the original sender of a piece of information or a message spread through their service if required to do so by a court or a government order. This traceability mechanism is incompatible with end-to-end encryption, i.e. the kind of encryption in which only the sender and the recipient can read messages, implemented by Whatsapp, Signal and other apps and services (and by Telegram in its secret chats). Traceability would in fact require breaking the encryption. "The minute you build a system that can go back in time and unmask some users who sent a certain content, you've built a system that can unmask anyone who sends any content," cryptographer Matthew Green commented to Wired USA.
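To make the incompatibility concrete, here is a minimal, purely illustrative sketch of end-to-end encryption using the PyNaCl library (this is not the Signal Protocol actually used by Whatsapp or Signal, and the names are chosen only for the example): the relaying service never holds the private keys, so it has nothing it could scan and no way to trace a message back to an "original sender".

    from nacl.public import PrivateKey, Box

    # Each user generates a key pair on their own device; only public keys are shared.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts for Bob with her private key and Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

    # The service only relays this opaque ciphertext: it holds no keys,
    # so it cannot read the message, let alone attribute or trace its content.
    relayed = bytes(ciphertext)

    # Bob decrypts with his private key and Alice's public key.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(relayed)
    assert plaintext == b"meet at noon"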

This explains why Whatsapp took an unprecedented step on 26 May, deciding to sue the Indian government over these rules, which the messaging app claims are unconstitutional because they violate citizens’ right to privacy. 

Global escalation

We are facing an escalation, Bloomberg wrote in an article a few days ago, referring to Russia and Belarus, but also to other states. And it added: we cannot allow autocrats to reshape the Internet after Covid. Yet the problem is that, as we have seen in India, democracies are also intervening in substantial ways, with consequences that could weigh on freedom of expression and the right to privacy. Censorship is the new social media crisis, and governments are increasingly taking draconian measures to suppress citizens' expression of dissent, wrote journalist Casey Newton, who analyses the relationship between politics and platforms in his newsletter Platformer. "While it has long been the norm in countries like China or Russia, the movement has more recently spread to democratic governments," he added.

Britain and encryption

Britain, for example, is trying to thwart Facebook's plan to implement end-to-end encryption in Messenger and Instagram as well (in addition to Whatsapp, where, as mentioned, it is already present). On the table is a proposed law, the Online Safety Bill, under which platforms must demonstrate that they are taking concrete action to counter the spread of harmful content. This has already raised concerns among those who fear it could turn into excessive censorship by social media, quite apart from the dispute over what should be defined as "harmful". But these initiatives against harmful content are also likely to take aim at end-to-end encryption. Moreover, there is an alternative and probably worse scenario: the Home Office could issue an order forcing Facebook to assist with an interception request. In jargon, this is a Technical Capability Notice (TCN), which in this specific case would amount to an injunction preventing the company from applying end-to-end encryption. In such a scenario, notes Wired UK, Facebook would not even be able to tell anyone about it.

The European Union's proposals

As far as the European Union is concerned, there are two delicate steps. The first is a proposed regulation for a derogation from certain communications privacy protections in the ePrivacy Directive, which would serve to combat child sexual abuse more effectively. "The proposal could force email and messaging services to scan all content for possible illegal material," Patrick Breyer, an MEP of the German Pirate Party and rapporteur for the Committee on Civil Liberties, Justice and Home Affairs, comments to Valigia Blù. "But these mechanisms produce many errors and also flag legal material. The Commission has not yet decided whether end-to-end encrypted communication services should be included. If they were, apps like Whatsapp would have to implement backdoors, access routes, in their clients to scan content before it is sent. This system would effectively create a backdoor that could be used for something else or that could create security risks."
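Purely to illustrate the mechanism Breyer describes, and not any provider's actual design, a client-side scanning obligation would mean running a check like the following on the user's device before encryption is applied (the hash list and function name here are hypothetical):

    import hashlib

    # Hypothetical blocklist of hashes of known illegal material,
    # pushed to the client by the provider or an authority (placeholder value).
    BLOCKED_HASHES = {
        "9f2feb0f1ef425b292f2f94bcbf4d6ad36e5e38bdce515e0f737e733102cbe3b",
    }

    def scan_before_send(attachment: bytes) -> bool:
        """Return True if the outgoing content matches the blocklist.

        In a client-side scanning design this check runs before
        end-to-end encryption, so the client inspects content that
        the server itself could never see.
        """
        return hashlib.sha256(attachment).hexdigest() in BLOCKED_HASHES

This is exactly why critics call such a system a backdoor: the scanning code, and whoever controls its blocklist, sits inside the client before the protection of encryption ever applies.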

The second crucial step is the Digital Services Act (DSA), the proposed regulation on digital services and platform liability that amends the e-Commerce Directive with new provisions on transparency and accountability for content moderation.

"The DSA does not deal directly with encryption, but it requires platforms to mitigate systemic risks, and this definition could lead to an indirect attack on encryption," Breyer further comments. "Moreover, the Commission encourages the use of automatic filters, but this is a risk for freedom of expression, because what is illegal in one context could be legal in another (think of a photo of a terrorist attack, which can be read as propaganda if shared by terrorists or as legitimate reporting if shared by the media). Filters do not differentiate, and the result is that they over-censor."

Breyer has tabled amendments to the Digital Services Act to ensure greater protection of fundamental rights in the digital age. These include the possibility of using digital services anonymously; limits on tracking, i.e. the collection of data on users' online activities; and the safeguarding of secure encryption (authorities should not be able to restrict end-to-end encryption, as it is essential for online security). In addition, the MEP asks that only judicial authorities should be able to decide on the legality of content; that no prior filtering should be required; and that material published legally in one European country should not be deleted just because it violates the laws of another EU country (a request intended to prevent illiberal laws in certain states – Breyer gives the example of Poland and Hungary – from leading to the deletion of content published elsewhere).

Automation and collateral censorship

As Jillian York, director for international freedom of expression at the Electronic Frontier Foundation, writes in her recent book Silicon Values: The Future of Free Speech Under Surveillance Capitalism, artificial intelligence (machine learning) technologies are increasingly being used to enforce platform policies and thus decide which expressions are acceptable, assisting or replacing human moderators. This shift towards near-total automation, coupled with intense scrutiny and increasing state pressure to remove content deemed harmful, has made accurate moderation even more difficult, resulting in increased 'collateral censorship'.

Recent victims of this process have included many Palestinian or pro-Palestinian users who have had their posts removed from Facebook, Instagram or Twitter, simply because they may have used a certain hashtag or words automatically associated with ‘violent or dangerous organisations’ (one case reported by the Washington Post and Slate is the name of the Al-Aqsa mosque).

Transparency and human rights as a guide

"I believe that any regulation of platforms must be in line with the international human rights framework, and this is especially true for any restriction of expression," Jillian York herself comments to Valigia Blù. "From governments, beyond this, I would expect above all a demand for more transparency from platforms. We have seen this in the Digital Services Act, and in the Santa Clara Principles on transparency and accountability in content moderation."

Read also >> For a regulation of digital platforms

These are some of the basic recommendations highlighted by digital rights organisations and experts to ensure that content moderation is fair, unbiased, proportional and respectful of users' rights. The principles state that platforms must, first, provide detailed data on content removals; second, inform users of the precise reason for a removal and whether the initial flag was automated, came from other users, or resulted from a legal process or a government request; and finally, guarantee the possibility of an appeal, handled by someone other than the person who made the first decision.

"From India to Australia to Palestine, every day we get new stories of outrage about content removal," Casey Newton also wrote. "In some cases, these removals were made at the request of a government. In others, platform policies work against minorities, making their posts harder to see. But whatever the cause, complaints about censorship are only getting louder – and how platforms respond will have huge implications around the world."

But so will the way democracies choose to lead by example.

Photo preview via wiredforlego under CC BY-NC 2.0 licence

https://www.europarl.europa.eu/doceo/document/LIBE-PA-692898_EN.pdf

https://santaclaraprinciples.org/