You're conflating the CSAM detection of photos uploaded to iCloud with the sexually explicit image detection on children's devices. The latter is loosely described here: https://www.apple.com/child-safety/.
> Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.