End-to-End Encryption: Meta’s Battle For Consistency

To encrypt, or not to encrypt, that is Meta's question.
23 August 2022

What’s new? End-to-end encryption, possibly?

One of the main selling points of WhatsApp is the fact that it uses end-to-end encryption. There are other messaging apps that do the same – many of them advertising themselves by playing on weaknesses in even WhatsApp’s security. Telegram, Surespot, Signal and others all promise to be more secure and to let you chat with the people you like, about whatever you like, encrypted end-to-end and safe from scrutiny, whether by governments, law enforcement agencies, or the company that provides the service.
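For readers who want the principle made concrete, here is a minimal sketch of what "end-to-end" means in practice, written with the PyNaCl library purely for illustration. It is not the Signal Protocol that WhatsApp and Signal actually use (that adds key agreement and the Double Ratchet on top); it only shows the core idea that the keys live on the users' devices, so the relaying server never sees anything but ciphertext.

```python
# Illustrative sketch only: end-to-end encryption with PyNaCl (libsodium bindings).
# Real messengers use the far more elaborate Signal Protocol; the point here is
# simply that private keys stay on the endpoints, so the provider relays ciphertext.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; the private key never leaves it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at 7pm?")

# This ciphertext is all the service provider ever stores or forwards.
# Without Bob's private key it cannot be read, so there is nothing
# meaningful to hand over to anyone who later asks for the conversation.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext).decode())  # prints: Meet at 7pm?
```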

So, when Meta (the four-headed social media dragon behind the friendly faces of Instagram, Messenger, Facebook, and WhatsApp) announced it was going to implement end-to-end encryption on Facebook Messenger, it probably failed to anticipate the reaction the news would get.

The reaction included uproar from children’s charities, and denunciation from a British Home Secretary who does not get squeamish easily, having found no issue with deporting immigrants to Rwanda as a deterrent.

What’s the difference between the Meta-owned WhatsApp and other messenger services applying end-to-end encryption, and the idea of Facebook Messenger doing the same?

Leverage.

The Power of Facebook

While the power of a soccer moms’ WhatsApp group might seem terrifying and socially coercive to some of those who are in it, it’s unlikely that the group’s existence will have catastrophic real-world consequences. Facebook and Instagram (which is also scheduled for the end-to-end upgrade), on the other hand, carry massive organizational power with them – power that exists irrespective of the use to which it is put.

Whether you’re simply planning a school reunion, intending to overthrow the government, planning to prove that aliens are real, or looking for children to groom for abuse, the ease with which Facebook and Instagram allow you to switch from on-board or on-profile communication to DMs or to Facebook Messenger makes these platforms excellent tools for organizing your way toward an outcome.

That’s the issue – the nature of the outcome makes no difference to the technology or the platform. What end-to-end encryption would mean is that everyone from January 6th insurrectionists to union organizers at Amazon, from perfectly legitimate users to would-be sexual predators, could use Messenger or Instagram to achieve their aims, and the data of their chats would be inaccessible to law enforcement agencies. Crimes and potential crimes, as well as dating plans and protest movements, could take place on Messenger and Instagram without any legal recourse, because the private conversations on which they depended would not be accessible to anyone after the fact.

The fact that Meta’s plans to introduce end-to-end encryption on Messenger and Instagram – which have now been pushed back to 2023 – have been flagged by children’s charities as an aid to child abusers is not a hysterical response plucked from a clear blue sky, though.

Think Of The Children

The tech industry referred more than 21m cases of identified child sexual abuse from various platforms to the US National Center for Missing and Exploited Children in 2020. More than 20m of those reports were from Facebook. The main way the cases were identified was by analysis of chat conversations on Messenger. End-to-end encryption would have made reporting those criminal activities at the very least significantly more difficult, and probably impossible.

But if we’re emotionally prodded to argue on the basis of that data that end-to-end encryption on Messenger would be a bad thing and should be resisted, we’re not seeing the whole picture.

Meta announced its plans to apply end-to-end encryption shortly after law enforcement agencies compelled it to hand over data from the private Messenger conversations of a 17-year-old who had illegally acquired an abortifacient in Nebraska. The main evidence that they had done so was in those conversations (which they had clearly regarded as private), and that evidence was used to try the teenager as an adult. End-to-end encryption would have meant that Facebook did not have the data to turn over, and the prosecution would probably not have gone ahead.

Facebook is also currently in hot water over its possession and potential misuse of users’ medical data. With the post-Roe environment making it illegal to procure or provide abortions in ten states (with more expected to join the number), it can be argued that end-to-end encryption would be a move towards ensuring that those who seek “illegal” abortions in future are not put at risk of prosecution by the data of their Messenger conversations, as the 17-year-old Nebraskan was.

That’s the issue with end-to-end encryption – it has no conscience, but it cuts across a hundred different ethical dilemmas, imposing a one-size-fits-all technological solution to deeply different and emotionally weighted problems.

An Independent Study

In the wake of the criticism of its plans, Meta commissioned its own “independent” study into the benefits of applying end-to-end encryption, which found that encryption amounted to a guarantee of free speech, as well as “enabling the realization of a diverse range of human rights,” while recommending 45 integrity and safety measures to ensure the move had minimal adverse impacts. Meta said it would be fully implementing 34 of the 45.

The company has said it aims to “prevent” criminal activity through the use of its algorithms, rather than prosecute it after the fact through invasive chat-scanning. That will ring a little hollow given that those involved in the January 6th insurrection of 2021 made heavy use of Facebook and Messenger to organize and coordinate the event.

The question “to encrypt or not to encrypt” will probably dog Meta all the way to the point of implementation, but the fundamental point remains – end-to-end encryption already exists on WhatsApp and other platforms, so the argument that not adding it to Messenger and Instagram somehow violates the right to free speech feels instinctively invalid. But the ability of law enforcement agencies to use Messenger chat data to convict both insurrectionists and seekers of healthcare means there is unlikely to be a clear and simple, morally satisfying answer.