
WhatsApp Is Walking a Tightrope Between AI Features and Privacy


Last year, Apple debuted a similar scheme, known as Private Cloud Compute, for its Apple Intelligence AI platform. And users can turn the service on in Apple’s end-to-end encrypted communication app, Messages, to generate message summaries and compose “Smart Reply” messages on both iPhones and Macs.

Looking at Private Cloud Compute and Private Processing side by side is like comparing, well, Apple(s) and oranges, though. Apple's Private Cloud Compute underpins all of Apple Intelligence everywhere it can be applied. Private Processing, on the other hand, was purpose-built for WhatsApp and doesn't underpin Meta's AI features more broadly. Apple Intelligence is also designed to do as much AI processing as possible on-device and only send requests to the Private Cloud Compute infrastructure when necessary. Since such "on-device" or "local" processing requires powerful hardware, Apple designed Apple Intelligence to run only on its recent generations of mobile hardware. Old iPhones and iPads will never support Apple Intelligence.

Apple is a manufacturer of high-end smartphones and other hardware, while Meta is a software company whose roughly 3 billion users carry all types of smartphones, including old and low-end devices. Rohlf and Colin Clemmons, one of the Private Processing lead engineers, say that it wasn't feasible to design AI features for WhatsApp that could run locally across the spectrum of devices WhatsApp serves. Instead, WhatsApp focused on designing Private Processing to be as unhelpful as possible to attackers if it were to be breached.

“The design is one of risk minimization,” Clemmons says. “We want to minimize the value of compromising the system.”

The whole effort raises a more basic question, though: why does a secure communication platform like WhatsApp need to offer AI features at all? Meta is adamant that users now expect these features and will go wherever they have to in order to get them.

“Many people want to use AI tools to help them when they are messaging,” WhatsApp head Will Cathcart told WIRED in an email. “We think building a private way to do that is important, because people shouldn’t have to switch to a less-private platform to have the functionality they need.”

“Any end-to-end encrypted system that uses off-device AI inference is going to be riskier than a pure end-to-end system. You’re sending data to a computer in a data center, and that machine sees your private texts,” says Matt Green, a Johns Hopkins cryptographer who previewed some of the privacy guarantees of Private Processing but hasn’t audited the complete system. “I believe WhatsApp when they say that they’ve designed this to be as secure as possible, and I believe them when they say that they can’t read your texts. But I also think there are risks here. More private data will go off device, and the machines that process this data will be a target for hackers and nation-state adversaries.”

WhatsApp also says that, beyond basic AI features like text summarization and writing suggestions, it hopes Private Processing will create a foundation for more complex AI features in the future that involve processing, and potentially storing, more data.

As Green puts it, “Given all the crazy things people use secure messengers for, any and all of this will make the Private Processing computers into a very big target.”


