Apple Scraps CSAM Detection Tool for iCloud Photos

2022-12-08 14:29

Apple has scrapped plans to ship a controversial child sexual abuse material (CSAM) detection tool for iCloud Photos, a concession to privacy rights advocates who warned it could have been used for government surveillance.

Instead, the Cupertino, California-based device maker said it would expand investment in tools and features that warn children when they receive or attempt to send photos that may contain nudity.

Rather than pursuing the proposed CSAM detection tool for iCloud Photos, Apple said it will focus engineering efforts on a feature called Communication Safety in Messages, which protects children from viewing or sharing photos that contain nudity in the Messages app.

The feature, which is off by default, uses on-device machine learning to analyze image attachments and determine if a photo appears to contain nudity, according to Apple’s documentation.

“If Messages detects that a child receives or is attempting to send this type of photo, Messages blurs the photo before it’s viewed on your child’s device and provides guidance and age-appropriate resources to help them make a safe choice, including contacting someone they trust if they choose,” the company said.

[ Read: Apple Adding End-to-End Encryption to iCloud Backups ]

In a statement sent to Wired, Apple acknowledged concerns from privacy rights advocates who warned that the proposed CSAM detection tool would give governments a de-facto backdoor into iPhones and iPads.

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in confirming plans to abandon the scanner.

Apple had previously delayed the rollout of the tool after a debate erupted over concerns it gave authoritarian governments a way to spy on mobile devices.

Apple also announced separate plans to beef up data security protections on its devices with the addition of new encryption tools for iCloud backups and a feature to help users verify identities in the Messages app.

[ Read: Apple Revives Encryption Debate With Move on Child Exploitation ]

The security-themed upgrades, scheduled to ship in 2023, include a new feature called Advanced Data Protection for iCloud, which offers end-to-end encryption to protect iCloud backups even in the case of a data breach.

Apple devices currently offer end-to-end encryption by default for some data categories, such as health data and passwords; when the new features ship, the protection will be expanded to iCloud backups, Notes and Photos.

The new enhanced security features also include iMessage Contact Key Verification, which will allow users to verify they are communicating only with whom they intend. 

The company also plans to add support for third-party physical security keys, a feature aimed at giving celebrities, journalists and government figures an additional layer of multi-factor authentication.

Related: Apple Revives Encryption Debate With Move on Child Exploitation

Related: Apple Announces Delay of Child Protection Measures

Related: Encrypted Services Providers Criticize EU Proposal for Encryption Backdoors

Related: Encryption Battle Reignited as US Govt at Loggerheads With Apple


Source: www.securityweek.com/apple-scraps-csam-detection-tool-icloud-photos
