“There is no way to implement client-side scanning that sends the information demanded to law enforcement or a third party safely and privately. No, there is no technology there. You can do on-device… you can have a software package that talks to itself and maybe uses some on-device technology. But the issue is that the second you insert a third party into an end-to-end relationship, you have created an exploit,” said Meredith Whittaker, President of the Signal Foundation, addressing a question on whether a client-side scanning mechanism could give law enforcement access to messages on a messaging app without breaking end-to-end encryption.
Governments in several countries, including India, the UK, the US, Australia, and Spain, are advancing steps to secure backdoor access to end-to-end encrypted communication on platforms for law enforcement purposes. The global trend has raised apprehensions about surveillance and the violation of user privacy, but are privacy rights absolute? The efforts to weaken encryption have also presented tech companies and privacy advocates with a critical question: how to tackle crimes like the distribution of child sexual abuse material with the least intrusive methods.
Why is there a demand for backdoor access to information by law enforcement agencies, and is it technically feasible to provide such data securely? Why is encryption important, and what are the major arguments against it? In the second part of this hour-long interview with MediaNama Founder and Editor Nikhil Pahwa, Whittaker addresses some of these questions and also highlights how Signal plans to strengthen user privacy on its platform. You can read the first part of the transcript, focusing on AI and surveillance, here.
The following transcript has been edited for clarity.
Nikhil: On the privacy side of things at Signal, you’re just coming off a battle to protect end-to-end encryption in the UK. How has that battle turned out? Was there a victory? Because it looks like the text of the law still remains as is, but there was seemingly some assurance given in Parliament. Where do we stand on that debate?
Meredith: Yeah, I think, you know, the push to minimize or eliminate end-to-end encryption is not new, and I doubt it’s going away. The debate in the UK is a very fierce instance of that wish, in which there’s a large omnibus bill that has been going through the UK’s political process for many years. There is a clause in that bill, Clause 122, that would give their telecom and competition regulator the power to mandate “accredited technology”, a very vague term that ultimately points to client-side scanning, in order to scan the content of everyone’s messages to detect prohibited expression, effectively. This is an alarming step back in terms of privacy and a step forward in terms of creating an illiberal surveillance state, particularly in the context of a self-professed liberal democracy. And so, of course, we fought that, because there is no way to safely and privately scan the contents of end-to-end encrypted messaging. Anyone who understands these technologies will look at that sentence with a little perplexity, like, you must have mistaken something, because we all know this is not possible: the property of end-to-end encryption is that only the ends, only Nikhil and Meredith, can actually see what we are saying to each other.
That’s the privacy guarantee encoded in the math that we use to encrypt this conversation. So what are you talking about, scanning all of those messages in some way that would detect prohibited expression? This is not the only law of this kind. I’m enumerating the terms of that law because, as with so much tech policy, there are global trends and memes that pop up, and this is one: the idea that you can create an AI technology that can somehow magically undermine but not undermine end-to-end encryption. So the law moved forward. There was incredible pressure from the Home Office, the security services, and from a coalition of, I think, some well-meaning child advocacy groups and some AI companies dressed up like child advocacy groups who are looking for those contracts to sell client-side scanning to the government as accredited technology, which has made this a bit more complicated. But the law moved forward, and we were able by the end to get an assurance, to get the government on the record in Parliament acknowledging that the technology that could do this scanning for end-to-end encryption does not exist, which is a fairly significant statement given the vendor pressures and the hype and snake oil being sold by those AI companies.
We didn’t do this, but I think the environment was created such that Apple was able to weigh in and say, in effect: we are one of the most powerful, well-resourced companies on Earth, we have some of the best cryptographers and software engineers on Earth, we tried to build such a system, and now we’re coming out publicly and categorically saying it’s not possible. I think we were able to, if I’m going to put it in shorthand, win the narrative argument. By the end, it was very, very clear that on one side there were the aspirations of the security services, which have never changed (law enforcement has always wanted to undermine end-to-end encryption), together with AI hype. And on the other hand, there was a long-standing technical consensus that back doors are simply dangerous, and that you cannot put a back door into end-to-end encryption and consider it private. The vague terms of this clause are now being turfed to the regulator. The regulator has to create an enforcement framework. It’s going to be a very long process, but we will continue pushing on Ofcom, the regulator, to make clear publicly that they will not leverage this power to fundamentally undermine private communications infrastructure in the UK in a way that would redound to the world at large.
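To ground the property Whittaker keeps returning to, that only the ends of a conversation can read its contents, here is a minimal, hypothetical sketch using the PyNaCl library. It is not the Signal Protocol (which adds forward secrecy, deniability and much more); it only illustrates that a message encrypted to one endpoint’s key cannot be opened by a third party who does not hold that key.

```python
# Minimal illustration of the end-to-end property: only the intended
# endpoints, holding the right private keys, can read the message.
from nacl.public import PrivateKey, Box
from nacl.exceptions import CryptoError

nikhil = PrivateKey.generate()
meredith = PrivateKey.generate()
third_party = PrivateKey.generate()  # e.g. a scanner or server in the middle

# Meredith encrypts a message to Nikhil.
ciphertext = Box(meredith, nikhil.public_key).encrypt(b"only the ends can read this")

# Nikhil, holding his private key, can decrypt it.
print(Box(nikhil, meredith.public_key).decrypt(ciphertext))

# Anyone else, even with the ciphertext in hand, cannot.
try:
    Box(third_party, meredith.public_key).decrypt(ciphertext)
except CryptoError:
    print("third party cannot decrypt")
```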
Watch the full interview here:
Nikhil: This is a never-ending battle between technology and law. What we’re seeing now, by the looks of it, looking at what’s happening in India, is that the lawmakers are pushing the law as it is and saying that solving the technology problem is your problem, not ours: we’ve made the law, and you can’t overrule the law of the land, in a sense. Are you fighting unwinnable battles here? Is it possible for technology to prevent what the law has enforced and still survive?
Meredith: Yeah. No battle is winnable if you don’t try to fight. We can’t just stand back behind a Cartesian window and pontificate on how we think the world will turn out. We are agents of history, and we can continue to push to make the world safe for a livable future. I think the stakes of this are so, so high, and I think it’s always difficult. I’ve never found the right way to connect the beating heart of these issues, the humanity of these issues. Without the ability to have privacy, and I’m not saying digital privacy, I don’t think there’s a difference at this moment in history between digital privacy and some other form of embodied privacy; they have collapsed. We’re surveilled in the streets, we’re surveilled in school, we’re surveilled at work. And if we don’t have a place where we can be honest with each other, where we can have intimate conversations, where we can express things that we think are wrong, we’re not going to be able to organize. We’re not going to be able to make the dramatic shifts that are necessary to at least reduce the harm of climate collapse.
We’re not going to be able to create and recreate our world democratically. So I will keep fighting these battles. I don’t fight them because I have a little calculator that says I think they’re winnable and this will be a good career move. I fight them because I think they’re existential. And I don’t think there’s another choice if I want to be able to wake up in the morning with dignity.
Nikhil: What’s the argument against end-to-end encryption that you found the hardest to counter? What is it that people really don’t get either from a societal perspective or from a technology perspective?
Meredith: There are two I’m going to name. The first is that I don’t think [there’s] actually an argument against end-to-end encryption; it is a powerful narrative that gets thrown up as a pretext to argue against it. This is the long-standing pattern. The first instance of this I found goes back to Stewart Baker in 1993 and an op-ed in Wired. Stewart Baker was the head of the NSA, and he has an op-ed in Wired defending key escrow and the Clipper chip. So he’s like, encryption is okay, but we need a backdoor. And the Clipper chip was… one of the canonical fights that started the first crypto war. But in this op-ed, he makes the claim, at that early date, that PGP, the end-to-end encryption for email, is a tool for pedophiles. So already the specters that get thrown up to scare us away from end-to-end encryption, to, by association, affiliate end-to-end encryption with the bad guys, the worst people, the demons, the monsters, are very, very emotionally powerful. We do not want pedophiles in our world. We hate them. I feel that right now talking about that.
But then, if you are defending the right to privacy, there’s a way that a story about the people you are keeping private substitutes for a meaningful argument against end-to-end encryption, and actually works to compel people to associate encryption with bad guys and to argue against it. I don’t think this is an actual argument. I think this is an incredibly powerful emotional smokescreen: that end-to-end encryption protects pedophiles, that it protects the worst people in the world, and that if you defend it, you are in favor of the worst people in the world. And that makes it also very unpleasant to go in there and argue. Which brings me to the second point: the other misunderstanding is that this is a… it’s not a technology you can compromise on. There are only a handful of robust crypto systems, and we all use them because it’s very easy to get it wrong and very hard to get it right. The Signal Protocol is one of those handful of crypto systems that has been proven robust. But if you undermine that, we actually undermine, fundamentally, the ability of anyone to have privacy at all, because we don’t have many of these.
And because if we’re going to go into our garage and do DIY cryptography, we’re almost certainly going to mess it up and there will be an exploit in there somewhere. So what you see, even in the way my answer is divided, is, on the one hand, an incredibly emotionally compelling story, one that there is no human alive, whatever they think about encryption, whose heart doesn’t react to it; they feel it. That’s powerful. Then on the other hand, there are these weird little nuances that you just have to trust me on, relating to mathematics and the way these arcane little systems work, where I’m just like, you can’t touch them. It’s hard to counterbalance; there’s no symmetry there. One is not that interesting to lay people, and they’re used to tech being full of nonsense and not quite believing in it, and that’s not something I understand, and the other is something that will grip any human being who’s able to listen… We’re on the back foot when we have to go in and argue math versus the specter of endangered children, even if I believe that what we’re talking about is really, often, using these emotional narratives as a pretext to fulfill a long-standing wish by law enforcement to have the efficient power to view everyone, and to eliminate encryption to do it.
Nikhil: Is there any means of having client-side scanning, or a mechanism for accessing messages on device, that is feasible within a messaging app, to highlight or identify some of these issues? I’m just speaking… purely from what law enforcement agencies demand.
Meredith: There is no way to implement client-side scanning that sends the information demanded to law enforcement or a third party safely and privately. No, there is no technology there. You can do on-device… You can have a software… I’m trying to put a nuanced point in layperson’s terms, so bear with me. But you can have a software package that talks to itself and maybe uses some on-device technology. But the issue is that the second you insert a third party into an end-to-end relationship, you have created an exploit. Even Apple proposed a system like that, one that would use both procedural and technological safeguards. It would scan on device, but it would have to detect some threshold number of matches before it would actually flag them to workers, who would then decrypt and review those images. But somebody somewhere, there is a database involved that could be easily hacked. And there is the issue that you have a third party in there, so: government pressure. We know Apple rolled for China, so why wouldn’t they roll for… “we need you to add an image of our political opposition to your hash database. We want to understand who’s sharing this image and get a sense of that. Then we’ll have our security services in your office reviewing those.” That may seem far-fetched, but it’s not far-fetched given how much power these systems have and how motivated a lot of these states are to make use of them.

I think the issue here is that this is not a technical problem to solve technologically. It is really a problem of how this type of power asymmetry operates in the context of the possibility of surveillance, of information asymmetry. Apple is a for-profit company. We saw how they acted in China. How much leverage would you need to leverage that system? And that doesn’t even touch the fact that these machine learning-based detection systems are easily gamed, right? One of the things that happened when Apple publicized the client-side scanning system that they’ve now disavowed, that they’ve now said “we can’t do this safely and privately”, was that people very quickly were able to access the model. I think they decompiled the client and got access to the model. And within, I believe, about 24 hours, in any case a very short period of time, they were able to find huge vulnerabilities in the system that allowed them to manufacture false positives or false negatives: that is, either hide an image that should, in fact, have been flagged under the rubric of the system, or cause an image that should not have been flagged to be flagged, by using hash collisions, which is a property of how these machine learning systems work, not a bug, and it hasn’t been solved.
I get into the technical details there, but the real issue is that you cannot build a client-side scanning system in which you send information out to third parties about everyone’s communications, based on a list of what to scan for that has been enumerated behind our backs without any auditability, and consider that anything other than a back door that breaks end-to-end encrypted messaging. And I’ll add on to this: I think, in some sense, the fact that this argument has been turfed to the technical community and seen as a technical problem has done us a big disservice, because it has led to attempts to solve this technologically, through client-side scanning, when in fact, again, this is not something that technology can solve; it’s about power dynamics.
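For readers who want a feel for the hash-collision problem Whittaker describes, here is a deliberately toy sketch in Python. The hash function below is a crude stand-in invented for this illustration, not Apple’s NeuralHash or any real perceptual hash, but it shows the structural issue she points to: a scanner that matches hashes against a hidden database can both flag content it should not (collisions) and miss lightly altered content (evasion).

```python
# Toy perceptual hash: threshold each pixel of a tiny grayscale "image"
# against the image's mean brightness. Real systems are far more complex,
# but they share the property that similar-looking inputs hash alike.
def toy_perceptual_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# A database of hashes of prohibited images, held by the scanner's operator.
flagged_db = {toy_perceptual_hash([200, 200, 10, 10, 200, 10, 200, 10])}

# A completely different image that happens to hash the same way:
# a collision, so an innocent image gets flagged (false positive).
innocent = [255, 180, 40, 0, 160, 20, 170, 5]
print(toy_perceptual_hash(innocent) in flagged_db)  # True

# A lightly perturbed copy of the flagged image that hashes differently,
# so it slips past the scanner entirely (false negative / evasion).
evasive = [200, 200, 10, 10, 200, 10, 200, 255]
print(toy_perceptual_hash(evasive) in flagged_db)  # False
```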
Nikhil: From a technology perspective, one of the demands has been to introduce an identifier for the sender of a message and have that identifier included when the message gets forwarded from person to person, so that the originator of the message is identified, or is identifiable. Is that feasible with end-to-end encryption, from a tech perspective?
Meredith: No, it’s not feasible with Signal, and you would have to make some pretty significant modifications. You basically have to add that feature on top, which would undermine the deniability properties of end-to-end encryption.
Nikhil: Because plausible deniability is built into the system.
Meredith: Yes.
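One way to see the tension here, sketched below with standard primitives (the Python standard library’s hmac module and the PyNaCl signing API, not Signal’s actual code): deniable authentication rests on a secret the two parties share, so a message tag convinces the recipient but proves nothing to outsiders about who wrote it, whereas a traceability scheme that pins an originator identity to every forward behaves like a transferable, non-repudiable signature.

```python
# Toy contrast (not Signal's code) between deniable authentication and a
# traceability tag that identifies the originator of a forwarded message.
import hashlib
import hmac
from nacl.signing import SigningKey

message = b"forwarded message"

# Deniable: a MAC under a key both parties share. The recipient can verify it,
# but since either party could have produced the tag, it proves nothing to a
# third party about who actually wrote the message.
shared_key = b"\x01" * 32
tag = hmac.new(shared_key, message, hashlib.sha256).digest()
assert hmac.compare_digest(tag, hmac.new(shared_key, message, hashlib.sha256).digest())

# Traceable: the originator signs the message. Anyone holding the public key
# can later prove, to anyone, exactly who originated it, which is the
# opposite of deniability.
originator = SigningKey.generate()
signed = originator.sign(message)
originator.verify_key.verify(signed)  # transferable, non-repudiable proof
```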
Nikhil: How does interoperability impact end-to-end encryption? Facebook has been talking about wanting to bring interoperability with end-to-end encryption between its multiple messaging mechanisms, whether WhatsApp, Facebook Messenger, or Instagram DMs. Is it possible? Can this work?
Meredith: I think we could say it could work, but we need to use the subjunctive, because everyone would need to be using the same standard. Facebook uses the Signal Protocol to encrypt WhatsApp messages and Instagram DMs, and they’re going to be rolling it out on Messenger; they’re testing that now. So basically, if we all agreed to that standard, then we could all apply that standard and interoperate, right? The issue is that we’re dealing with unwilling actors with a lot of power. Apple doesn’t want to interoperate per the standard that Facebook put forward; they want to use their own standard, and Google the same. So I think the DMA is going to be a long, hard process, and I would probably bet on it never happening, honestly. But that is deep cynicism that comes from having been in too many standards-body processes before.

For end-to-end encryption of the content of the messages, there are ways it could happen, but they’re very hypothetical and they don’t take into account the interests of the different actors that are expected to interoperate. They don’t take into account the fact that, honestly, you can write a specification, but if your specification is missing things or the implementation you give is shitty, it’s not going to work. There are so many ways it can not work and so few ways it might work, and you would need perfect cooperation, and again, we have unwilling actors here. I think we need to look at the politics here. For Signal, we are not looking to interoperate with any of the gatekeepers under the DMA, in part because we encrypt our metadata, and it’s very unclear to us how we would interoperate and maintain our high privacy bar for metadata. That gets into complexities: metadata is necessary for routing, it’s necessary for other things. And that’s before we even get to the question of which companies are fulfilling which law enforcement requests. What type of data would they hand over? Are Apple’s policies commensurate with Meta’s policies? Or would an Apple user be subjected to Meta’s policies if they were interoperating with a Facebook Messenger user? This is freighted with so much complexity that I have seen almost no answers to, so for Signal, we’re just stepping back, because we’re not expecting it to move forward that quickly.
Nikhil: How much control does Signal have over the deployment of the Signal Protocol? Because you’re encrypting metadata and you’re collecting much less data, whereas WhatsApp is not.
Meredith: Yeah. Our protocol is open source, so if we have an implementation online, you can use it if you want. But in order to claim that you’re using the Signal Protocol… to call it that, you need to license it from us. We have some control, in that what we don’t want is some shoddy implementation that is just licensed under our name, because… We recognize that security and privacy are things that are… We want to be responsible for everything that has our name on it, and we don’t want people to be confused. And so Meta licenses it for WhatsApp, Google licenses it, Twitter has licensed it, although I don’t know if they’re using it right now; I don’t think so. It is the gold standard for messaging security. It was a paradigm shift in 2013 when it was released. We continue to improve it as well. We just launched the first phase of post-quantum resistance for the protocol. That is now rolled out in production and will be rolling out to everyone in the next months. The improvements we make are also licensable and usable as open source.
Nikhil: Why don’t we have more deployment of end-to-end encryption globally? What will it take? Is it a complexity issue? Is it a cost issue? Just as HTTPS is now a standard applied everywhere, why isn’t end-to-end encryption in all forms of messaging and email?
Meredith: Well, I think because the business model of the tech industry is monetizing surveillance. Let’s remember, HTTPS wasn’t rolled out at the scale it is now, and there wasn’t a push for it, until after Snowden, when the companies faced a massive blowback that threatened their trust. So I think there isn’t a structural incentive within the tech industry to prevent the collection of data. In fact, all of the norms and assumptions and unwritten rules are structured around collecting as much data as possible. Really, we’re facing an uphill battle because privacy is diametrically opposed to the engine of profit at the center of this industry.
Nikhil: But profit is not a bad word, in the sense that not everyone has access to donations like Signal does. You have a guaranteed amount of funding from Brian Acton, so what is Signal’s approach to monetization, to surviving in the future? How do you make Signal sustainable on its own?
Meredith: Well, profit isn’t a bad word unless the model, or the way that… the engine generating profit is surveillance, in my view. That’s the issue. I think the reason there’s only one Signal has a lot to do with the fact that to have a business model in the tech industry, more or less you’re going to be monetizing surveillance or selling services and infrastructure to those who do. So Signal’s model is unique. We believe, I’m confident, that we can move forward and make this work. But what we’re really pushing on is getting many, many small donations from a percentage of the people who use us. So we have many millions of users. We are… luckily, we have a network effect and we have significant scale. And we want to be supported by the people who can afford to and who rely on us, because that’s who we want to be accountable to. So we’re really grateful that Brian put in a lot of money that gave us an on-ramp. But let’s be clear: it costs tens of millions of dollars a year just to maintain and develop Signal, and that’s a cost that’s never going away. That’s our server infrastructure, that is the registration cost, that is our bandwidth cost, that is our labor cost. That’s how much it costs to build high-availability tech. And it’s also a reflection of how profitable the surveillance business model is, right? We’re using these products for free, but what’s paying for that is the ability to monetize our data, the ability to sell us things, the ability to build AI models based on that data, et cetera.
Nikhil: What’s your plan in terms of making Signal even more privacy-preserving in the future? The mobile number is also a privacy vulnerability; it’s being used as an identifier for many services. In fact, there are services in India that don’t use anything but a mobile number as an identifier. What’s your plan?
Meredith: I really like that you brought up the mobile number, because this is something we’re actively working on now. I want to say I appreciate all of the people, yourself and others outside of the United States, who are, you know, in contexts that I’m less familiar with, raising the issue of mobile numbers: the deep link in some regions between mobile numbers and government identity, or the fact that those are identifiers and very sensitive. Right now we are in the final stages of working on usernames. Usernames will allow you to connect with other people on Signal without sharing your number. And that means if your device is seized and someone is going through your groups or your contact list, they won’t be able to just enumerate all of the phone numbers and identify those people through mobile numbers. You will still need to use a mobile number to register, but that mobile number can be kept invisible to anyone who touches your device or anyone who communicates with you. That’s one way we’re working to raise the privacy bar.

The other, as I mentioned earlier, is that we just upgraded our encryption protocol to be resistant to quantum computer attacks. There aren’t sufficiently powerful quantum computers in existence right now, at least that we know of, but there are informed projections that assume that in 10-30 years, and it’s a little bit foggy, but at some point in the future, we will have sufficiently powerful quantum computers that could break the encryption we’re using now, which is secure against classical computers. So what we did is upgrade the protocol so that it will prevent those potential future attacks and prevent anyone who is collecting Signal-encrypted data now from decrypting it later. With this rollout, none of the conversations protected under our post-quantum protocol will be decryptable, even in a future with a sufficiently powerful quantum computer.
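The upgrade Whittaker describes is Signal’s own protocol work; as a loose illustration of the general “hybrid” idea behind it, and not of Signal’s actual PQXDH implementation, the sketch below uses the Python cryptography library to mix a classical X25519 secret with a stand-in for a post-quantum KEM secret, so that traffic harvested today stays protected unless an attacker can break both.

```python
# Hypothetical sketch of hybrid key agreement: derive one session key from
# both a classical Diffie-Hellman secret and a post-quantum KEM secret.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical X25519 key agreement between the two ends of the conversation.
alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = alice.exchange(bob.public_key())

# Placeholder for the shared secret a post-quantum KEM (e.g. ML-KEM/Kyber)
# would produce; a real implementation would run the KEM, not urandom.
post_quantum_secret = os.urandom(32)

# Mix both secrets through a KDF. Recording traffic now and breaking X25519
# later with a quantum computer still leaves the KEM secret unknown.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-session"
).derive(classical_secret + post_quantum_secret)
print(session_key.hex())
```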
Also Read:
- ‘We Want Backdoors To E2E Encrypted Platforms For Law Enforcement’: India, Japan, Five Eyes To Companies
- UK Asks Meta To Not Roll Out End To End Encryption On Instagram And Messenger
- Here’s Why Element Says Diluting Encryption Cannot Help Prevent Child Abuse Online
- Encryption An Important Cybersecurity Measure As Per EU’s Latest Child Sexual Abuse Proposal