Signal is one of the apps competing in the security- and privacy-first market, offering encrypted communication to users who put those features first.
Signal also promises not to collect the huge amounts of data that could be abused by any number of malicious actors – from hackers to governments.
When governments and law enforcement criticize this kind of app or service, the argument usually comes down to the claim that encryption allegedly reduces security, both for a country as a whole and for individuals – since criminals can also use it to communicate safely. And lately, another narrative in favor of undermining encryption – the need to stop the spread of “misinformation” – has also been gaining popularity in those circles.
These governments, naturally, don’t go into how individuals and societies are threatened by government overreach when messaging and chat apps are unencrypted and open to data harvesting and warrantless mass surveillance.
India is among the many countries looking for ways to “curb” encryption, with approaches ranging from weakening and effectively destroying it via built-in backdoors, all the way to banning outright a technology that experts agree is vital to the health of the internet as a whole.
Signal operates in India, but reports say it does not intend to comply with the announced new rules – coming on top of the already existing law against online misinformation and abusive communications – which critics see as yet another attempt to get encrypted communications out of the picture.
This time, the plan is to force platform providers to help the government intercept encrypted messages – either by intercepting or decrypting that content themselves and handing it over, or by giving the authorities access to one end of what would then no longer be end-to-end encryption.
Recently, Signal president Meredith Whittaker gave an extensive interview to The Verge, suggesting that the app would withdraw from the Indian market if forced to comply with any of these requirements.
Whittaker – formerly a Google employee of over a decade, and a senior advisor on AI at the Federal Trade Commission – gave a lengthy answer to this question, ending with: “Yes, we would walk. We will not hand over the keys to our encryption, we will not break the encryption. In fact, with the way we are built, we don’t have access to those keys.”