
Apple responds to criticism about new child safety features

Tan KW
Publish date: Wed, 11 Aug 2021, 04:47 PM

Tech giant Apple has responded to some of the concerns privacy advocates have raised about its new child safety tools. On Aug 6, the company announced new software that will scan images on its devices, its messaging platform iMessage and its cloud service iCloud to look for child sexual abuse material (CSAM). The system has been widely criticised by the Electronic Frontier Foundation, Edward Snowden and WhatsApp CEO Will Cathcart, among others.

In an FAQ document, Apple said the new tools do not break the privacy assurances the company has given its users, do not break end-to-end encryption, and do not give the company access to any user communications. The child safety feature in iMessage, called Communication Safety, is distinct from CSAM detection in iCloud.

Notably, Communication Safety uses on-device machine learning to identify and blur sexually explicit images, meaning the analysis does not involve any cloud server; it can also notify the parents of a child who views or sends such an image. CSAM detection, on the other hand, scans images as they are uploaded to iCloud and then notifies Apple, which verifies the alert before contacting the authorities.
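In simplified terms, the on-device check is an image classifier that runs entirely on the phone: the photo is never sent anywhere, and the only output is a local decision to blur and, optionally, alert a parent. The sketch below is a minimal illustration of that flow under those assumptions, not Apple's implementation; the `Photo` type, `is_explicit` classifier stub and notification flags are hypothetical names.

```python
# Minimal sketch of an on-device "Communication Safety"-style check.
# All names here are hypothetical; Apple's actual implementation is not public.

from dataclasses import dataclass


@dataclass
class Photo:
    pixels: bytes  # raw image data; in this flow it never leaves the device


def is_explicit(photo: Photo) -> bool:
    """Placeholder for an on-device ML classifier (e.g. a bundled Core ML model)."""
    return False  # stub result so the sketch is self-contained


def handle_message_image(photo: Photo, child_account: bool) -> dict:
    """Decide locally whether to blur the image and alert a parent.

    Everything runs on the device: neither the image nor the
    classification result is sent to a server.
    """
    if child_account and is_explicit(photo):
        return {"display": "blurred", "notify_parent": True}
    return {"display": "normal", "notify_parent": False}
```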

Further, the company clarified that CSAM detection will not scan every photo stored on a user's device, only those uploaded to iCloud. Apple devices can upload photos to the cloud automatically, but users can turn that off, and disabling the iCloud Photos feature disables CSAM detection as well.

"Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM," the company claimed in the FAQ.

WhatsApp CEO Will Cathcart had said Apple's new system could be misused by governments, or by the company itself, by adding other images to the CSAM database that iCloud photos are matched against. Apple says its system is designed to prevent this. "CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations," the company said.

"This set of image hashes is based on images acquired and validated to be CSAM by child safety organisations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities," the company added.

Perhaps more importantly, Apple claims that it will refuse any demands from governments to add images to the CSAM database. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before and have steadfastly refused those demands. We will continue to refuse them in the future," the company said.

 

 - TNS
