A nasty new surprise is doing the rounds on social media this week, claiming Facebook’s privacy breaches extend to WhatsApp. That fear fueled the data backlash earlier this year, and it underpins this new warning that Facebook is reading encrypted WhatsApp messages, “undermining privacy protections for its 2 billion users.”
There is an inherent and awkward sensitivity across WhatsApp’s user base, that the world’s largest secure messenger is owned by the world’s most avaricious data harvester. That’s why there was such a fierce backlash when WhatsApp ill-advisedly insisted users accept new terms of service, designed to facilitate tighter integration with Facebook, seemingly opening the door to more data sharing. And that’s why any suggestion Facebook is compromising WhatsApp security hits extremely hard.
The latest warnings stem from an original article in ProPublica this week, reporting on WhatsApp’s “1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore… using special Facebook software to sift through millions of private messages, images and videos.”
The very idea that Facebook hires content reviewers to sift through millions of seemingly end-to-end encrypted WhatsApp messages can easily undermine confidence in a platform used by 2 billion people. But the detail here is everything.
Initially there was confusion over whether WhatsApp’s encryption had been breached, whether its end-to-end encryption is not as private as we all think. This shows the level of misunderstanding about what end-to-end encryption is, and what it is not. There is no encryption breach here, and thankfully ProPublica clarified the misunderstanding.
“A previous version of this story,” the update said, “caused unintended confusion about the extent to which WhatsApp examines its users’ messages and whether it breaks the encryption that keeps the exchanges secret. We’ve altered language in the story to make clear that the company examines only messages from threads that have been reported by users as possibly abusive. It does not break end-to-end encryption.”
WhatsApp has a long-standing reporting mechanism available to all its users, where you can click to report abusive messages. Open the settings for any 1:1 chat and you will see the option to “Report Contact.” If you click on this, you can report the user and you’re told that “the most recent messages will be forwarded to WhatsApp.” You’re told the other user will not be informed, and you also have the option to block the user and delete the chat. It’s only these forwarded messages that are being reviewed.
“WhatsApp provides a way for people to report spam or abuse,” the platform confirmed this week, “which includes sharing the most recent messages in a chat. This feature is important for preventing the worst abuse on the internet.”
When those messages are forwarded to WhatsApp, those human reviewers check what has been sent, determining what action (if any) to take. “Once reported,” the platform says, “WhatsApp receives the most recent messages sent to you by a reported user or group, as well as information on your recent interactions with the reported user.” It is hard to see how else this process could work, and what users might expect happens.
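The flow described above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not WhatsApp’s actual code: the names (`Chat`, `build_report`), the payload shape, and the count of forwarded messages are all assumptions; the only grounded point is that plaintext leaves the device solely when the user explicitly files a report.

```python
from dataclasses import dataclass, field

# Assumed count; WhatsApp says only "the most recent messages" are forwarded.
RECENT_MESSAGES_TO_FORWARD = 5

@dataclass
class Chat:
    reported_user: str
    messages: list = field(default_factory=list)  # plaintext, already decrypted on-device

    def receive(self, text):
        self.messages.append(text)

def build_report(chat, reporter):
    """Package only the most recent messages from the reported user, plus
    interaction metadata. Nothing is sent until the user taps Report."""
    return {
        "reporter": reporter,
        "reported_user": chat.reported_user,
        "recent_messages": chat.messages[-RECENT_MESSAGES_TO_FORWARD:],
        "message_count": len(chat.messages),
    }

chat = Chat("abusive_user")
for i in range(8):
    chat.receive(f"message {i}")

report = build_report(chat, "victim_user")
# Only the tail of the conversation is included, not the full history.
assert report["recent_messages"] == [f"message {i}" for i in range(3, 8)]
```

The point of the sketch is the trigger: the report is built from content the recipient already holds in decrypted form, so no encryption is broken in forwarding it.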
“We strongly disagree,” WhatsApp says, “with the notion that accepting reports a user chooses to send us is incompatible with end-to-end encryption.”
It is entirely sensible for WhatsApp to offer this means to report abusive or illegal content, letting users forward messages under their own control, now beyond the platform’s end-to-end encryption, to a moderator. A user could achieve the same result by attaching screenshots to an email; it would just take longer.
ProPublica rightly points out that WhatsApp also shares a lot of metadata with law enforcement. “WhatsApp shares metadata, unencrypted records that can reveal a lot about a user.” This has always been an awkward subject for WhatsApp—rival Signal collects no metadata and proudly markets its inability to provide data when asked.
Social graphs, group names, memberships and logins sit outside end-to-end encryption and can be logged and shared. IP addresses can disclose locations; social graphs can track groups. The content itself, though, cannot be accessed.
But there are real misunderstandings across the wider user base as to what end-to-end encryption means. This comes to the fore whenever there is an iMessage or WhatsApp breach, and it was similarly a major issue when Apple announced its ill-advised (now stalled) plans to filter iMessage photos for kids.
End-to-end encryption protects your content when it’s not on your devices or the devices of those you are messaging with. It protects content in transit, from “end to end,” and it can protect content backed up off your device, such as Apple’s Messages in iCloud (subject to setting tweaks) and WhatsApp’s new encrypted backups.
Specifically, the “end-to-end” covers the encryption of data between you and your counterparties, where only you and those counterparties hold the decryption keys. The platform over which you are communicating cannot hold copies of those keys or decrypt and re-encrypt mid-transit, as doing so means it is not truly “end-to-end.”
So, for example, Telegram and Facebook Messenger (excepting secret 1:1 messages) are not end-to-end encrypted. Your content is encrypted from your device to their servers, and then from their servers to your contacts—the platforms can access your content. The fact that Apple can access end-to-end encryption keys in iCloud backups is not technically the same, but undermines your security in much the same way.
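The key arrangement behind that distinction can be shown with a deliberately toy sketch: a minimal Diffie-Hellman exchange in which only the two endpoints can derive the shared key, so a relaying server sees nothing but public values and ciphertext. This is purely conceptual and not secure; real deployments use the Signal protocol with X25519 key agreement, double ratchets, and authenticated ciphers.

```python
import hashlib
import secrets

# Toy Diffie-Hellman over a small prime, plus XOR "encryption".
# Illustrative only -- do not use for anything real.
P, G = 2**127 - 1, 5  # toy group parameters

def keypair():
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    # Both ends compute G^(a*b) mod P and hash it into a key.
    return hashlib.sha256(str(pow(their_pub, my_priv, P)).encode()).digest()

def xor_cipher(key, data):  # symmetric: the same call encrypts and decrypts
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Only the two "ends" can derive this key from the exchanged public values.
k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob

ciphertext = xor_cipher(k_alice, b"hello bob")  # this is all the server relays
assert xor_cipher(k_bob, ciphertext) == b"hello bob"
```

The server in this picture handles only `alice_pub`, `bob_pub` and `ciphertext`; holding neither private key, it cannot derive the shared key or read the message. A transport-encrypted platform, by contrast, terminates the encryption at its own servers and re-encrypts outbound, which is exactly why it can access content.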
There are no serious suggestions that protocols like Signal’s—which powers its own messaging platform, as well as WhatsApp’s and Google’s newly encrypted RCS—can be compromised in transit. The weak point is each of those “ends.” With compromises like the recently reported Pegasus attacks, malware on one of those “ends” captures the now decrypted content (it must be decrypted for you to read it) and then forwards it to its handlers. This we call an endpoint compromise.
And so, to the nuances. When Apple announced its plans to add a filter to iMessage to warn minors sending or receiving sexual imagery, we warned that this would also compromise end-to-end encryption. The reason being that the platform itself was taking control of one of the “ends” to monitor user content.
With the WhatsApp example, a user at one of the “ends” must manually send content to WhatsApp; it is not done automatically. This is not a compromise. If WhatsApp were to add an AI monitoring filter to its app, automatically forwarding content to those Facebook reviewers without any manual intervention, that would be different. That would be a compromise and reason enough to question WhatsApp’s security. But there is no suggestion or claim that WhatsApp is adding app-based AI monitoring.
If we think this through, the reason for the distinction is simple. Apple was questioned as to how it would handle government pressure to go beyond child safety in its on-device monitoring. With an AI-based filter, this is easily achieved technically. With a manual reporting option, that isn’t possible at all. There is no automatic monitoring technology built into the app that can evolve, widening in scope.
This is all very important. The security community encourages users to opt for end-to-end encrypted platforms, like WhatsApp and Signal, and we don’t want to do anything to undermine confidence in these platforms. Suggestions that Facebook might be monitoring your messages are electric and play to general Facebook fears.
That said, the encryption debate is complex. There are lawmakers and security agencies that want to be able to do exactly what is being suggested—backdoor into encrypted user content through inbuilt endpoint compromises. The tech and security industries push back, arguing that any backdoor would be exploited. And they’re right. Any backdoor is a vulnerability and will be found and compromised.
The other news this weekend is that WhatsApp is (finally!) offering end-to-end encrypted backups. This is a major step forward and a must-have for all users backing up chats to the cloud. It will be available very soon. It’s important we don’t complicate user confidence in WhatsApp security just as it gets better.