The UK government is planning to ask Apple and Google to block nude photos and explicit images on smartphones unless users verify their age first. This could mark a major change in how phone operating systems handle sensitive content.
Under the proposal, Apple’s iOS and Google’s Android would include built-in nudity-detection tools that prevent users from taking, sharing, or viewing explicit images unless they confirm they are adults, possibly through biometric scans or official ID checks.
Officials from the Home Office are expected to announce the proposal soon. Unlike current rules, which target apps, this approach would build the safeguard into the device itself: the phone would block explicit content at the point of capture, upload, or viewing unless the user has proved their age.
According to the report, the Home Office wants:
Apple and Google to build nudity detection into their operating systems, so that genital images cannot be photographed or shared without age verification.
Devices to block any display of nudity unless users verify they are adults, using methods such as ID checks or biometrics (a rough sketch of such a gate follows below).
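To make the gating idea concrete, the sketch below pairs a real on-device classifier with a hypothetical verification hook. iOS 17 and later already expose Apple’s nudity classifier to apps through the SensitiveContentAnalysis framework (the same machinery behind Communication Safety); the `AgeVerifier` type here is purely an assumption, since no OS-level age-verification API exists today.

```swift
import CoreGraphics
import SensitiveContentAnalysis  // real Apple framework, iOS 17+

// Hypothetical age-verification hook; no real iOS API exposes this.
enum AgeVerifier {
    static func userIsVerifiedAdult() -> Bool {
        // A real system might check an ID- or biometric-backed
        // credential held by the OS. Stubbed for illustration.
        return false
    }
}

/// Mirrors the proposed device-level flow: run on-device nudity
/// detection, then require age verification before showing a flagged
/// image. Using the analyzer requires Apple's Sensitive Content
/// Analysis entitlement.
func mayDisplay(_ image: CGImage) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer is inactive unless the user (or a parent, on a
    // child account) has enabled a sensitive-content setting.
    guard analyzer.analysisPolicy != .disabled else { return true }

    do {
        // Classification happens entirely on-device.
        let analysis = try await analyzer.analyzeImage(image)
        return analysis.isSensitive ? AgeVerifier.userIsVerifiedAdult() : true
    } catch {
        // Fail closed if analysis errors out.
        return false
    }
}
```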
This proposal builds on the UK’s Online Safety Act 2023, which already requires age verification for online pornography and harmful content. However, the law currently targets platforms, and users can still bypass it using VPNs or proxy services. Device-level protection would be the next step to better shield minors from explicit material.
Supporters say building nudity detection directly into the operating system would protect children more consistently and effectively, since it would cover the camera, photo gallery, messaging apps, and web browsing alike.
Similar discussions are happening in the US, Canada, and Europe, as governments try to balance child safety with privacy rights. Even Pakistan is debating limits on social media content.
Apple and Google have traditionally prioritized user privacy and control, but the UK’s request increases pressure on them to detect and restrict explicit content on devices.
Apple has already shipped a related safeguard in the Messages app, known as Communication Safety. If a child in an iCloud Family group receives a sexually explicit image, it appears blurred with a warning; if the child chooses to view it anyway, a follow-up screen explains why it was flagged, asks for confirmation, and offers ways to get help, such as messaging an adult they trust.
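The classifier behind that feature is available to third-party apps through the same SensitiveContentAnalysis framework, so a messaging app can reproduce the blur-and-warn flow itself. In this sketch the analyzer calls are the framework’s real API, while `display`, `blurAndWarn`, and `showDescriptiveWarning` are assumed stand-ins for the app’s own UI.

```swift
import Foundation
import SensitiveContentAnalysis

// Stand-ins for the app's own rendering; not part of any Apple API.
func display(_ url: URL) { /* show the attachment normally */ }
func blurAndWarn(_ url: URL) { /* blurred thumbnail + brief warning */ }
func showDescriptiveWarning(for url: URL) { /* blur + age-appropriate explanation */ }

/// Handles an incoming attachment the way Communication Safety does:
/// analyze on-device, then blur and warn before anything is shown.
func handleIncomingAttachment(at fileURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()
    let policy = analyzer.analysisPolicy

    guard policy != .disabled,
          let analysis = try? await analyzer.analyzeImage(at: fileURL),
          analysis.isSensitive else {
        display(fileURL)  // not flagged (or analysis unavailable)
        return
    }

    switch policy {
    case .descriptiveInterventions:
        // Child account with Communication Safety enabled: blur the
        // image and explain why it was flagged before showing it.
        showDescriptiveWarning(for: fileURL)
    default:
        // Adult who opted into Sensitive Content Warning: simple blur.
        blurAndWarn(fileURL)
    }
}
```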
Age verification is becoming a key topic in tech, with companies like Meta supporting regulations that would hold Apple and Google accountable for keeping minors safe.