Web Desk (October 25, 2018): Facebook’s previously undisclosed machine learning tool has flagged 8.7 million child abuse images on its platform.
The software automatically flags such photos. Rolled out over the past year, the machine learning tool identifies images that contain both nudity and a child, allowing stricter enforcement of Facebook’s ban on photos that show minors in a sexualized context.
A similar system, also disclosed on Wednesday, catches users engaged in “grooming,” or befriending minors for sexual exploitation. Facebook’s global head of safety, Antigone Davis, told an international news agency in an interview that the “machine helps us prioritise” and “more efficiently queue” problematic content for its reviewers.
Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.
In response, Davis said: “We’d rather err on the side of caution with children.” Shares of Facebook fell 5 percent on Wednesday. The company plans to apply the same technology to its Instagram app.
Before the new software, Facebook relied on user reports and its adult nudity filters to catch such images. A separate system blocks child abuse material that has previously been reported to authorities.