
iPhone privacy called into question by new child abuse scanning

Image: Apple iPhone 12 Pro vs iPhone 12 Max camera (Robert Triggs / Android Authority)

TL;DR

  • A new report claims that Apple plans to undermine iPhone privacy to stop child abuse.
  • The company reportedly plans to scan user photos for evidence of child abuse. If found, the algorithm would pass that photo on to a human reviewer.
  • The prospect of Apple employees inadvertently viewing legal photos of a user’s children is certainly worrying.

Update, Aug. 5, 2021 (4:10 p.m. ET): Not long after we published the following article, Apple confirmed the existence of its child abuse detection software. In a blog post titled “Expanded Protections for Children,” the company laid out plans to curb the spread of child sexual abuse material (CSAM).

As part of these plans, Apple will introduce new technology in iOS and iPadOS that will “allow Apple to recognize known CSAM images stored in iCloud Photos”. Essentially, the scanning happens on the device for all media stored in iCloud Photos. When the software determines that an image is suspicious, it sends it to Apple, which decrypts and reviews the image. If the content is confirmed to be illegal, Apple notifies the authorities.
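Apple’s description amounts to matching photos against a database of known images rather than classifying new ones. The sketch below is a deliberately simplified illustration of that idea using an exact cryptographic hash (CryptoKit’s SHA-256) instead of Apple’s perceptual NeuralHash and threshold cryptography; the digest set and the shouldFlag helper are hypothetical, not part of any Apple API.

```swift
import Foundation
import CryptoKit

// Illustrative only: exact-hash lookup against a set of known image
// fingerprints. Apple's real system reportedly uses perceptual hashing,
// which tolerates small image changes; this toy version does not.

// Hypothetical digests representing known, already-identified images.
let knownDigests: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Hashes raw image data and checks the result against the known set.
func shouldFlag(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownDigests.contains(hex)
}

// Example usage with placeholder bytes standing in for a photo.
let sample = Data("placeholder image bytes".utf8)
print(shouldFlag(imageData: sample) ? "match: escalate for human review" : "no match")
```

The key design point, as reported, is that only images matching an existing database are flagged; the device is not judging what a brand-new photo depicts.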

Apple claims there is “less than a one in one trillion chance per year of incorrectly flagging a given account”.


Original article, Aug. 5, 2021 (3:55 p.m. ET): Over the past few years, Apple has worked hard to cement its reputation as a privacy-focused company. It frequently cites its “walled garden” approach as a boon to privacy and security.

A new report from the Financial Times calls this reputation into question. According to the report, Apple plans to roll out a new system that would scan user-created photos and videos on Apple products, including the iPhone. The reason Apple would sacrifice iPhone privacy in this way is to hunt down child abusers.

See also: What you need to know about privacy screen protectors

The system is reportedly known as “neuralMatch”. Essentially, it would use software to scan user-created images on Apple products. When the software finds media that could depict child abuse, including child sexual abuse material, a human reviewer is notified. The reviewer would then assess the photo and decide what action should be taken.
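As reported, the pipeline has two stages: an automated match flags a photo, and only then does a person review it and decide on an action. The sketch below illustrates that reported flow; the type names, the confidence threshold, and the review queue are all hypothetical and do not correspond to any real Apple API.

```swift
import Foundation

// Illustrative sketch of the reported "automated flag, then human review" flow.
// ReviewDecision, FlaggedPhoto, and the 0.9 threshold are hypothetical.

enum ReviewDecision {
    case dismiss              // false positive: no further action
    case reportToAuthorities  // confirmed illegal content
}

struct FlaggedPhoto {
    let id: UUID
    let matchScore: Double    // hypothetical confidence from the automated matcher
}

var reviewQueue: [FlaggedPhoto] = []

// The automated stage only queues photos above a confidence threshold.
func flagForReview(id: UUID, score: Double, threshold: Double = 0.9) {
    if score >= threshold {
        reviewQueue.append(FlaggedPhoto(id: id, matchScore: score))
    }
}

// The human stage makes the final call.
func applyDecision(_ photo: FlaggedPhoto, _ decision: ReviewDecision) {
    switch decision {
    case .dismiss:
        print("Photo \(photo.id) cleared; no action taken")
    case .reportToAuthorities:
        print("Photo \(photo.id) escalated to authorities")
    }
}
```

The concern raised later in this article is precisely the first branch: a photo that should have been dismissed still gets seen by a human before that decision is made.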

Apple declined to comment on the report.

Is this the end of iPhone privacy?

Obviously, child exploitation is a huge problem, and anyone with a heart knows it should be dealt with swiftly and forcefully. However, the idea of someone at Apple looking at harmless photos of your children that neuralMatch accidentally flagged as illegal seems like an all-too-real problem waiting to happen.

There’s also the concern that software designed to detect child abuse today could be trained to detect something else tomorrow. What if, instead of child abuse, it were drug use, for example? How far is Apple willing to go to help governments and law enforcement agencies catch criminals?

It is possible that Apple will release this system in a few days. We’ll have to wait and see how the public reacts if and when it happens.

