Some Android users are starting to see blurred images on their devices while using Google Messages. It's part of a Sensitive Content Warning system, announced last year and now rolling out on Android devices, that obscures images suspected of containing nudity.
In a Help Center post, Google explains that when the feature is turned on, the phone can detect and blur images with nudity and generate a warning when one is being received, sent or forwarded.
“All detection and blurring of nude images happens on the device. This feature doesn’t send detected nude images to Google,” the company says in its post. Resources on how to deal with nude images are also offered with these warnings.
It’s possible, Google said, that images not containing nudity may be accidentally flagged.
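Google hasn't published the details of how its classifier works, but the general on-device approach the Help Center post describes can be sketched as a local image classifier that blurs anything scoring above a confidence threshold. The sketch below is a generic TensorFlow Lite pattern, not Google's implementation; the model file name, input size and threshold are hypothetical.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil
import org.tensorflow.lite.support.image.ImageProcessor
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.support.image.ops.ResizeOp

// Hypothetical on-device screener: the model file, input size and threshold
// are illustrative assumptions, not details of Google's SafetyCore.
class SensitiveImageScreener(context: Context) {

    // Load a bundled TensorFlow Lite model from app assets; inference runs
    // entirely on the device, so the image is never uploaded anywhere.
    private val interpreter = Interpreter(
        FileUtil.loadMappedFile(context, "sensitive_content_classifier.tflite")
    )

    // Resize incoming images to the model's expected input dimensions.
    private val preprocessor = ImageProcessor.Builder()
        .add(ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
        .build()

    /**
     * Returns true when the classifier's confidence that the image contains
     * nudity meets the threshold, signalling the UI to blur it and warn the user.
     */
    fun shouldBlur(image: Bitmap, threshold: Float = 0.8f): Boolean {
        val input = preprocessor.process(TensorImage.fromBitmap(image))
        val score = Array(1) { FloatArray(1) }  // model assumed to emit one probability
        interpreter.run(input.buffer, score)
        return score[0][0] >= threshold
    }
}
```

A threshold-based approach like this is also why false positives happen: an image that merely resembles nudity can clear the cutoff.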
For adults, the feature is off by default. For unsupervised teens aged 13 to 17, it's on by default but can be turned off in Google Account settings. For supervised accounts, it can't be turned off, but parents can adjust the settings in the Google Family Link app.
How to enable or disable the feature
For adults who want a heads-up about nude photos, or those who want to turn the warnings off, the toggle is in Google Messages under Settings > Protection & Safety > Manage sensitive content warnings > Warnings in Google Messages.
The nudity-detection feature is part of SafetyCore, available on devices running Android 9 and later, which also underpins features Google has been developing to protect against text scams and dangerous links and to verify contacts.
How effective will this be?
Filters that screen for objectionable images have become more sophisticated as AI has given them a better understanding of context.
“Compared to older systems, today’s filters are far more adept at catching explicit or unwanted content, like nudity, with fewer mistakes,” said Patrick Moynihan, the co-founder and president of Tracer Labs. “But they’re not foolproof. Edge cases, like artistic nudity, culturally nuanced images, or even memes, can still trip them up.”
Moynihan says that his company combines AI systems with Trust ID tools to flag content without compromising privacy.
“Combining AI with human oversight and continuous feedback loops is critical to minimizing blind spots and keeping users safe,” he said.
Compared with Apple's iOS, Android offers more flexibility, but its openness to third-party app stores, sideloading and customization creates more potential entry points for the kind of content Google is trying to protect people from.
“Android’s decentralized setup can make consistent enforcement trickier, especially for younger users who might stumble across unfiltered content outside curated spaces,” Moynihan said.
Making the system off by default for adults and on by default for minors, Moynihan says, is a practical way to start. But, he said, “The trick is keeping things transparent. Minors and their guardians need clear, jargon-free info about what’s being filtered, how it works, and how their data is protected.”