
"Without directly naming SafetyCore, Google explained that the optional setting can blur photos that may contain nudity and display a warning before you view or share them. Sensitive Content Warnings appears to use SafetyCore to analyze images locally on your device. Google has emphasized that SafetyCore runs entirely on your phone -- images don't leave your device, and Google doesn't know if nudity was flagged."
"Security researchers back this up. The team behind GrapheneOS, an AOSP-based security-focused distro, confirmed on X that SafetyCore isn't secretly reporting things to "Google or anyone else." Google also told ZDNET that users control SafetyCore, and said it "only classifies specific content when an app requests it through an optionally enabled feature." SafetyCore isn't limited to nudity detection. It's a general-purpose system service that provides local machine-learning models that apps can use to classify other types of unwanted content -- including spam, scams, and malware -- in order to warn users."
Google added Android System SafetyCore to Android 9 and later via a system update. The component provides privacy-preserving, on-device machine-learning models that apps can use to classify content such as nudity, spam, scams, and malware. Sensitive Content Warnings in Messages can use SafetyCore to blur potentially nude images and warn users without sending images off the device. SafetyCore runs entirely on the phone and only operates when an app requests classification through an optionally enabled feature. Security-focused projects confirmed that SafetyCore does not report classifications to Google or third parties. Users retain control over enabling and using SafetyCore.
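To make the described flow concrete: SafetyCore's app-facing API is not documented in the article, so the Kotlin sketch below uses hypothetical names (LocalContentClassifier, SensitiveContentWarnings, Presentation) purely to illustrate the architecture as reported: the app checks the user-controlled setting, asks a local classifier for a verdict, and blurs the image with a warning if it is flagged, with nothing sent off the device. This is an illustrative sketch, not SafetyCore's actual interface.

// Hypothetical names throughout: this only models the flow described above,
// not SafetyCore's real app-facing API.

/** Stand-in for an on-device ML classifier; the image never leaves the phone. */
fun interface LocalContentClassifier {
    fun isSensitive(image: ByteArray): Boolean
}

/** How the app should present an image after local screening. */
sealed interface Presentation {
    data class Blurred(val warning: String) : Presentation
    data object ShowAsIs : Presentation
}

class SensitiveContentWarnings(
    private val classifier: LocalContentClassifier,
    private val featureEnabled: () -> Boolean  // the optional, user-controlled setting
) {
    fun prepareForDisplay(image: ByteArray): Presentation {
        // Classification runs only when the user has enabled the feature,
        // and the verdict stays local -- nothing is reported to Google or anyone else.
        return if (featureEnabled() && classifier.isSensitive(image)) {
            Presentation.Blurred("This image may contain nudity. View anyway?")
        } else {
            Presentation.ShowAsIs
        }
    }
}

fun main() {
    // Dummy classifier standing in for the local model.
    val screen = SensitiveContentWarnings(
        classifier = LocalContentClassifier { image -> image.size % 2 == 0 },
        featureEnabled = { true }
    )
    println(screen.prepareForDisplay(ByteArray(4)))  // Blurred(warning=...)
    println(screen.prepareForDisplay(ByteArray(3)))  // ShowAsIs
}

The key design point the article emphasizes is that both the opt-in check and the classification happen on the device, so turning the setting off simply means the classifier is never consulted.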