In August 2021, Apple announced a feature that would scan for child sexual abuse material (CSAM) stored on users' devices. The feature drew a strong backlash over security and privacy concerns, and its rollout was postponed. Now, the news site MacRumors has pointed out that references to CSAM detection that previously appeared on the "Child Safety" page of Apple's website have been removed.

Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated] - MacRumors
https://www.macrumors.com/2021/12/15/apple-nixes-csam-references-website/

Apple scrubs controversial CSAM detection feature from webpage but says plans haven't changed - The Verge
https://www.theverge.com/2021/12/15/22837631/apple-csam-detection-child-safety-feature-webpage-removal-delay

On the page, Apple describes its goal as creating technology that empowers people and enriches their lives while helping them stay safe, and explains a feature in the Messages app that warns children when they send or receive photos containing nudity.

Child Safety - Apple
https://www.apple.com/child-safety/
Until December 10, 2021, the page also stated, in addition to the goal above, that Apple wants to "help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)." This language disappeared around December 13, 2021, coinciding with the release of iOS 15.2. The Internet Archive preserved the page as it appeared on December 10:

Child Safety - Apple
https://web.archive.org/web/20211210163051/https://www.apple.com/child-safety/
MacRumors speculates that, with the reference removed, Apple may have abandoned its plans for CSAM scanning. However, when the news site The Verge contacted Apple, spokesperson Shane Bauer replied that "Apple's position has not changed since September 2021," when the company announced the feature's postponement, so Apple does not appear to have given up on the plan.
・Related articles
Apple announces it will scan iPhone photos and messages to prevent child sexual exploitation; the Electronic Frontier Foundation and others warn it compromises user security and privacy - GIGAZINE
Concerns about the iPhone photo and message scanning announced by Apple are being voiced even from within the company - GIGAZINE
Apple publishes FAQ claiming that detecting child sexual abuse material does not impair privacy - GIGAZINE
Apple has been scanning iCloud email attachments since 2019 - GIGAZINE
in Notes, Posted by logc_nt