FACEBOOK HAS REMOVED 8.7 MILLION CHILD PORN IMAGES



Facebook has deployed a new tool to identify and remove child pornography circulating on its platform. In the last quarter, the social network removed 8.7 million images that depicted child pornography or otherwise violated its rules on child nudity.
Mark Zuckerberg's company announced the figure in a statement on Wednesday, crediting artificial intelligence for making moderation possible at this scale.


“In addition to this photo-matching technology (a tool used until now to block child pornography photos already identified in the past, ed.), we use artificial intelligence to detect child nudity and child pornography content that is unknown to our services at the time it is uploaded,” the social network says in its release.
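To illustrate the photo-matching half of that description, here is a minimal sketch of how matching an upload against a database of already-identified images can work. This is not Facebook's actual system (which relies on far more robust perceptual hashing such as PhotoDNA); the function names, the simple average-hash scheme, and the distance threshold are all illustrative assumptions.

from PIL import Image

def average_hash(path, hash_size=8):
    # Shrink the image to hash_size x hash_size grayscale pixels and compare
    # each pixel to the mean brightness, producing a 64-bit fingerprint that
    # survives resizing and mild recompression.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    # Number of bits that differ between two fingerprints.
    return bin(a ^ b).count("1")

def matches_known_image(path, known_hashes, threshold=5):
    # Flag an upload whose fingerprint is close to that of any known banned image.
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in known_hashes)

The second half of the quote, detecting content that has never been seen before, is a different problem: it requires a trained classifier rather than a lookup against known fingerprints, which is where the artificial intelligence Facebook refers to comes in.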


“HARMLESS PHOTOS OF CHILDREN IN THE BATH” ALSO MODERATED
Until now, user reports had also been a lever for moderating child pornography content. In the last quarter, however, they were not what tipped the balance.

Of the 8.7 million pieces of sensitive content removed during the last quarter, “99% were removed before anyone reported them,” says Facebook, a sign of the efficiency and speed of the new technology.

Facebook’s moderation policy also extends to more mundane images, as the company explains in its press release: “(…) we also act on non-sexual content, like harmless photos of children in the bath.”

Finally, Facebook also stated that it collaborates with the National Center for Missing and Exploited Children (NCMEC) and with organizations fighting against the exploitation of children.
