From fake news to cyberbullying, Facebook has prioritized making its platform a safer place for users in 2018. In line with this mission, CNBC reports that Facebook moderators used a machine learning tool to remove 8.7 million user images of child nudity over the last quarter.
The tool identifies images that contain both nudity and a child, enabling Facebook to enforce its strict ban on sexualized photos featuring minors. Another system uses machine learning to spot “groomers,” adults who befriend minors with the ultimate intent of sexual exploitation.
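Facebook has not published the tool’s internals, but the logic described, flagging an image only when it appears to contain both nudity and a minor, can be sketched as two classifier scores combined with a threshold. The model outputs and thresholds below are illustrative assumptions, not Facebook’s actual system:

```python
# Illustrative sketch only: the real models and thresholds are not public.
from dataclasses import dataclass

@dataclass
class ModerationResult:
    nudity_score: float  # hypothetical classifier: P(image contains nudity)
    minor_score: float   # hypothetical classifier: P(image contains a child)
    flagged: bool

NUDITY_THRESHOLD = 0.8  # assumed values for illustration
MINOR_THRESHOLD = 0.8

def moderate_image(nudity_score: float, minor_score: float) -> ModerationResult:
    """Flag an image for human review only when BOTH signals fire,
    mirroring the combined nudity-plus-child check described above."""
    flagged = nudity_score >= NUDITY_THRESHOLD and minor_score >= MINOR_THRESHOLD
    return ModerationResult(nudity_score, minor_score, flagged)

# A high nudity score alone is not enough under this rule:
print(moderate_image(0.95, 0.10).flagged)  # False
print(moderate_image(0.95, 0.91).flagged)  # True
```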
The ability of machine learning programs to sift through billions of data points makes them a valuable asset in Facebook’s latest user safety push, which comes amid mounting criticism from regulators and lawmakers.
Facebook’s global head of safety, Antigone Davis, recently told Reuters the “machine helps us prioritize” and “more efficiently queue” problematic content, with human moderators giving the final say.
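Davis’s description of prioritizing and queueing content for human review suggests a severity-ranked work queue. A minimal sketch, assuming each flagged item already carries a single severity score (the scoring itself is not public):

```python
# Minimal sketch of a severity-ranked review queue; the scoring and
# queueing details are assumptions, not Facebook's actual pipeline.
import heapq
import itertools

class ReviewQueue:
    """Pops the highest-severity item first, so human moderators
    see the most urgent content before anything else."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker for equal scores

    def push(self, item_id: str, severity: float) -> None:
        # heapq is a min-heap, so negate severity for max-first ordering.
        heapq.heappush(self._heap, (-severity, next(self._counter), item_id))

    def pop(self) -> str:
        _, _, item_id = heapq.heappop(self._heap)
        return item_id

queue = ReviewQueue()
queue.push("img_001", severity=0.42)
queue.push("img_002", severity=0.97)
queue.push("img_003", severity=0.65)
print(queue.pop())  # img_002 (the most severe item surfaces first)
```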
As with any AI tool, its findings are subject to error: news agencies and advertisers, for example, have inadvertently run afoul of the automation, though users can appeal when necessary.
At the end of the day, these are mistakes Facebook is willing to live with. “We’d rather err on the side of caution with children,” Davis told Reuters. The platform’s rules are far-reaching, extending even to innocuous family photos that have the potential to be abused.
The company previously relied on community reporting or its adult nudity filters to catch illicit photos of minors. Child pornography already reported to authorities is blocked using yet another tool.
The machine learning program was trained on a collection of nude adult photos and photos of clothed children. Art and history, such as the iconic image of a young girl fleeing a Vietnam War napalm attack, are exempt from removal.
According to Michelle DeLaune, chief operating officer at the National Center for Missing and Exploited Children (NCMEC), Facebook and other tech companies are expected to submit 16 million child porn tips worldwide this year, an increase of more than 50 percent over last year.
DeLaune is encouraged by Facebook’s efforts but warns that encrypted chat apps (whose data is off-limits to machine learning) and secretive “dark web” sites remain major obstacles to stopping the spread of child pornography. She sees this as an area of opportunity and hopes her organization and tech companies can “use creativity” to overcome the hurdle.