Hacker News

You're conflating the CSAM detection for photos uploaded to iCloud with the sexually explicit image detection in Messages on children's devices. The latter is loosely described here: https://www.apple.com/child-safety/.

> Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
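To make the distinction concrete, here's a minimal sketch of the on-device flow that quote describes: a local classifier scores the attachment and only the boolean decision is used, so neither the image nor the score ever leaves the device. The threshold, function names, and stub model below are all hypothetical illustrations, not Apple's actual implementation (which would run a Core ML model).

```python
# Hypothetical sketch of on-device explicit-image detection.
# Nothing here is Apple's real API; the classifier is a stand-in stub.

THRESHOLD = 0.9  # assumed decision threshold, not a documented value


def local_classifier_score(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model returning an 'explicitness' score.

    A real implementation would run a locally stored model; this stub
    just returns a fixed score so the control flow is runnable.
    """
    return 0.0


def should_blur(image_bytes: bytes) -> bool:
    # The decision is made entirely on-device: only this boolean is
    # consumed by the UI, and no data is sent to a server.
    return local_classifier_score(image_bytes) >= THRESHOLD


print(should_blur(b"\x89PNG..."))  # stub score 0.0 -> False
```

The key property (and the contrast with the iCloud CSAM system) is that the server never participates: classification input, model, and output all stay on the device.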



Ah yeah, you're right: two posts about different Apple photo-scanning features on a single day threw me!





