Facebook Removes 8.7mn Pieces of Content to Fight Child Exploitation
29 October 2018 10:36 WIB
TEMPO.CO, Jakarta - Facebook is determined to keep children safe on its platform. The social media giant founded by Mark Zuckerberg will not tolerate any behavior or content that exploits them online.
“We develop safety programs and educational resources with more than 400 organizations around the world to help make the internet a safer place for children. For years our work has included using photo-matching technology to stop people from sharing known child exploitation images,” said Facebook Global Head of Safety Antigone Davis in a statement, Sunday, Oct. 28.
Facebook reports violations to the National Center for Missing and Exploited Children (NCMEC) and requires users to be at least 13 years old to access the platform.
In addition to photo-matching technology, Facebook uses artificial intelligence and machine learning to proactively detect child nudity and exploitative content before it is posted.
“We’re using this and other technology to more quickly identify this content and report it to NCMEC, and also to find accounts that engage in potentially inappropriate interactions with children on Facebook so that we can remove them and prevent additional harm,” Davis added.
With this comprehensive approach, Davis continued, Facebook deleted 8.7 million pieces of content that violated its Community Standards over the past three months; 99 percent of that content was removed before anyone reported it.
Facebook collaborates with safety experts, NGOs, and other companies to stop and prevent child exploitation across online technologies. “Next month, Facebook will join Microsoft and other industry partners to begin building tools for smaller companies to prevent the grooming of children online,” Davis concluded.
MOH KHORY ALFARIZI