Today Apple announced that its controversial child safety features have been delayed following feedback from various groups. In a statement, the company said:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple recently announced that the new child safety features would scan iCloud Photos for child sexual abuse material by comparing image hashes against those of reported images. A new machine learning feature would also complement this by detecting explicit images.
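In broad strokes, the hash-matching step amounts to a set-membership check: compute a hash of each photo and see whether it appears in a database of hashes of known reported images. The sketch below is purely illustrative and uses a cryptographic hash (SHA-256) with made-up image bytes; Apple's actual system uses a perceptual hash (NeuralHash), which is designed to tolerate minor image edits, plus additional cryptographic protections.

```python
import hashlib

# Hypothetical database of hashes of reported images (illustrative only;
# the real system uses perceptual hashes, not cryptographic ones).
known_hashes = {hashlib.sha256(b"reported-image-bytes").hexdigest()}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_image(b"reported-image-bytes"))  # True
print(matches_known_image(b"other-image-bytes"))     # False
```

Because a cryptographic hash changes completely if even one byte of the image differs, a real matching system relies on perceptual hashing so that resized or re-compressed copies of the same image still match.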
The announcement sparked controversy among users, forcing Apple to publish a variety of FAQs and clarifications on how the features would operate.
The backlash grew to the point that Craig Federighi appeared in an interview, outlining the company's reasoning and admitting that Apple could have communicated the features more clearly to its users.
As this story develops, we will keep you posted on any changes.
Follow Appleosophy on Twitter and Instagram to stay up to date on the latest news and rumors, and check out Appleosophy Weekly, now streaming on YouTube and Apple Podcasts.