
WhatsApp privacy and encryption are a mess, finds new investigation

Andy Walker

TL;DR

  • A new report sheds light on the inner workings of WhatsApp’s content moderation system.
  • The report suggests that, despite claims that its staff cannot read messages, WhatsApp employs contractors to review content.
  • WhatsApp says its reviewers can only read messages that users have reported to the company.

WhatsApp’s privacy-centric claims may not be as watertight as users might expect, according to an in-depth new report. ProPublica has detailed the inner workings of the company’s moderation system, and its findings suggest that WhatsApp contractors may, under certain circumstances, be able to read messages sent between users.

According to the report, WhatsApp employs at least 1,000 contractors who use “special Facebook software” to scan content flagged by the company’s machine learning system or reported by users. This content ranges from child abuse material to spam, terrorist activity, and more.

WhatsApp regularly notes that, thanks to end-to-end encryption, which debuted on the platform in 2016, only senders and recipients can see their chats. The feature has been an important marketing tool for the Facebook-owned service ever since. The existence of a content review system, however, arguably undercuts the company’s privacy messaging.

WhatsApp content review system

However, WhatsApp has good reason to implement a message reporting and review system. It told ProPublica that the process enables the company to ban abusive and harmful users from the platform. The company also notes that users must initiate the reporting process. When a message is reported, the offending message and the four preceding messages in that thread are sent, unencrypted, to WhatsApp. Moderators can view these messages, but they do not have access to a user’s wider chat history, nor can they use the machine learning system to retrieve it. Reviewers can then dismiss the report, ban the reported user’s account, or place it on a “watch list.”
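To make those mechanics concrete, here is a minimal sketch of how a reporting client might bundle that slice of a conversation. This is not WhatsApp’s actual code; the names (Message, build_report_payload) are invented for illustration, and the only detail taken from the report is the “reported message plus four preceding messages” rule.

from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    sender: str
    text: str  # plaintext: end-to-end encryption terminates on the user's device

def build_report_payload(thread: List[Message], reported_index: int) -> List[Message]:
    # Per the report: the flagged message plus up to the four preceding
    # messages in the same thread are forwarded, unencrypted, to WhatsApp.
    start = max(0, reported_index - 4)
    return thread[start:reported_index + 1]

# Example: reporting the sixth message in a thread sends five messages total.
thread = [Message("alice", f"message {i}") for i in range(6)]
payload = build_report_payload(thread, reported_index=5)
assert len(payload) == 5

The key point the sketch captures is that no encryption is “broken” here: the reporting user’s own device already holds the plaintext and simply forwards a small window of it to moderators.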

Reported messages aren’t the only data available to the company, though. According to the report, unencrypted metadata from accounts placed on a “proactive” list can be benchmarked against suspicious patterns of behavior. This information ranges from the details of a user’s groups and their phone number to their status message, unique mobile device ID, battery level, and signal strength.
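As a rough illustration of what that metadata bundle might look like, here is a hypothetical schema based only on the categories the report lists. The field names and the toy heuristic are assumptions, not WhatsApp’s real data model or detection logic.

from dataclasses import dataclass
from typing import List

@dataclass
class AccountMetadata:
    phone_number: str
    group_names: List[str]        # details of the user's groups
    status_message: str
    device_id: str                # unique mobile device identifier
    battery_level_percent: int
    signal_strength_dbm: int

def flag_for_review(meta: AccountMetadata) -> bool:
    # Toy stand-in for the benchmarking the report describes, e.g. treating
    # membership in a very large number of groups as a suspicious pattern.
    return len(meta.group_names) > 100

Note that none of these fields contain message content, which is why WhatsApp can collect them without touching end-to-end encryption.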

See also: Everything you need to know about encryption

It is understandable that a chat platform would want to implement a screening and reporting system so that users can flag abuse, but WhatsApp’s lack of clarity about how the system works is perhaps the bigger problem. In a statement to ProPublica, Facebook said it does not believe the content review system is at odds with users’ expectations. “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp, we receive the content they send us,” it said.

Still, the report is likely a blow to WhatsApp’s privacy optics, especially given its divisive privacy policy changes. The company announced the changes in January, which would allow some data to be shared with Facebook; it has since revised its rollout plans. WhatsApp was also recently fined $267 million for violating data protection laws in the EU.

