The UK government is preparing to ask major technology companies, including Apple and Google, to make significant changes to how their operating systems handle explicit content. According to The Financial Times, ministers want these tech giants to incorporate nudity-detection algorithms directly into device operating systems by default. The primary goal is to protect children from viewing or sharing explicit images on their phones and computers.
This proposal is expected to be part of a broader strategy by the Home Office to tackle violence against women and girls. Unlike Australia, which recently moved to ban social media for users under 16, Britain is currently focusing on preventing the consumption of harmful content rather than a blanket ban on access. The proposal suggests that unless a user proves they are over 18, the device would automatically block the creation or viewing of images containing genitalia.
The ambition extends beyond mobile phones. Officials have noted that similar models could apply to desktop computers, citing how enterprise software such as Microsoft Teams already scans for inappropriate content. The Home Office has specifically pointed to “HarmBlock,” software from the UK company SafeToNet that ships on HMD Global devices, as a positive example of effective, automated intervention.
Apple already offers a feature called “Communication Safety,” designed to protect children from sensitive content. The system detects nude photos and videos in apps like Messages, AirDrop, and FaceTime; when enabled, it blurs the image and warns the child before anything is shown. However, the government argues this does not go far enough: teenagers can often simply dismiss the warning and view the image anyway, and the protection does not extend to third-party apps such as WhatsApp or Telegram.
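Apple does expose the same on-device classifier to third-party developers through the SensitiveContentAnalysis framework introduced in iOS 17, which is how an app outside Messages could adopt the check today. The sketch below is a minimal illustration, assuming an app that holds the required Sensitive Content Analysis entitlement; the analyzer returns no useful result unless the user has turned on Communication Safety or Sensitive Content Warnings.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: checking a received image against Apple's on-device
// classifier. Requires the com.apple.developer.sensitivecontentanalysis.client
// entitlement, and analysis is unavailable unless the user has enabled
// Communication Safety or Sensitive Content Warnings in Settings.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // .disabled means neither safety feature is on, so no analysis runs.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive  // true -> blur and warn before display
    } catch {
        // Whether to fail open or closed is an app policy decision;
        // this sketch fails closed and blurs on error.
        return true
    }
}
```

The gap the government is pointing at is visible here: the check only runs if each individual app adopts it and the user has opted in, whereas the proposal would make an equivalent check the operating system's default.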
The UK proposal seeks a solution that works across the entire ecosystem, regardless of which app is being used. While Apple’s current tools are privacy-preserving and process images on-device, the government is asking for a stricter, default-on approach that requires proactive age verification to override.
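The report does not describe any concrete API, but the logic ministers are asking for amounts to a simple decision gate applied wherever images are captured or rendered. The following sketch is purely hypothetical; every name in it (DeviceAgePolicy, isVerifiedAdult, gate) is invented for illustration and corresponds to nothing that ships on any platform.

```swift
// Purely illustrative: no such OS API exists. This sketches the decision
// flow described in the proposal, with every type and function hypothetical.
enum DisplayDecision {
    case allow
    case block   // the default outcome for unverified users
}

struct DeviceAgePolicy {
    /// Hypothetical flag, set only after the device owner completes
    /// proactive age verification (e.g. through a trusted ID provider).
    var isVerifiedAdult: Bool
}

/// Hypothetical OS-level gate: runs on every image capture or render,
/// regardless of which app produced the content.
func gate(imageIsExplicit: Bool, policy: DeviceAgePolicy) -> DisplayDecision {
    // Default-on: explicit content is blocked unless the user has already
    // proven they are over 18. Unlike Communication Safety, there is no
    // warning the user can tap through.
    if imageIsExplicit && !policy.isVerifiedAdult {
        return .block
    }
    return .allow
}
```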
Despite the well-intentioned goals of protecting children from grooming and early exposure to pornography, the proposal faces significant hurdles. Privacy advocates and civil liberties groups are likely to object to operating systems scanning all visual content by default. There are also questions about technical feasibility and effectiveness: previous attempts at age verification on pornographic websites in the UK were frequently circumvented with VPNs, or defeated by presenting fake photographs to facial age-estimation checks.