Continuing the fight against child sexual abuse online

We can all agree that content that exploits or endangers
children is abhorrent and unacceptable. Google has a zero tolerance
approach to child sexual abuse material (CSAM) and we are committed
to stopping any attempt to use our platforms to spread this kind of
abuse.

So this week our experts and engineers are taking part in an
industry “hackathon” where technology companies and NGOs are
coming together to collaborate and create new ways to tackle child
sexual abuse online. This hackathon marks the latest milestone in our two-decade effort to fight this abuse through technology, teams and partnerships.

In 2006, we joined the Technology Coalition, partnering with other technology companies on technical solutions to tackle the proliferation of images of child exploitation. Since then, we've developed and shared new technologies to help organizations globally root out and stop child abuse material from being shared.

In 2008, we began using “hashes,” or unique digital
fingerprints, to identify, remove and report copies of known images
automatically, without humans having to review them again. In
addition to receiving hashes from organizations like the Internet Watch Foundation and the National Center for Missing and Exploited Children, we also add hashes of newly discovered content to a shared industry database so that other organizations can collaborate on detecting and removing these images.
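As a rough illustration, here is a minimal Python sketch of this kind of matching against a hypothetical hash database. It uses a plain cryptographic hash for simplicity; production systems rely on robust perceptual fingerprints (such as PhotoDNA) that still match after an image is resized or re-encoded.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images, e.g.
# hashes received from the Internet Watch Foundation or NCMEC. A real
# shared industry database holds millions of entries.
KNOWN_HASHES = set()

def fingerprint(image_bytes: bytes) -> str:
    # SHA-256 only matches byte-identical copies; in practice a
    # perceptual hash is used so edited copies still match.
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes) -> bool:
    # A match against the shared database means the copy can be
    # removed and reported automatically, without a human having
    # to review the image again.
    return fingerprint(image_bytes) in KNOWN_HASHES
```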

In 2013, we made changes to the Google Search algorithm to
further prevent images, videos and links to child abuse material
from appearing in our search results. We’ve implemented this
change around the world in 40 languages. We’ve launched
deterrence campaigns, including a partnership with the Lucy Faithfull Foundation
in the UK, to show warning messages in response to search terms associated with child sexual abuse. As a result of these
efforts, we’ve seen a thirteen-fold reduction in the number of
child sexual abuse image-related queries in Google Search.

In 2015, we expanded our work on hashes by introducing
first-of-its-kind fingerprinting and matching technology for videos
on YouTube, to scan and identify uploaded videos that contain known
child sexual abuse material. This technology, CSAI Match, is unique in its
resistance to manipulation and obfuscation of content, and it
dramatically increases the number of violative videos that can be
detected compared to previous methods. As with many of the new
technologies we develop to tackle this kind of harm, we shared this
technology with industry free of charge.  
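CSAI Match itself is proprietary, so the following is only a generic sketch of the idea behind manipulation-resistant video fingerprinting: hash sampled frames perceptually and match them by Hamming distance, so that small edits such as cropping, re-encoding or overlays change only a few bits. The frame format, distance bound and hit threshold are all illustrative assumptions.

```python
from typing import Iterable, List, Set

def average_hash(frame: List[List[int]]) -> int:
    """64-bit perceptual hash of a frame downscaled to 8x8 grayscale:
    each bit records whether a pixel is brighter than the frame mean."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def matches_known_video(frames: Iterable[List[List[int]]],
                        known: Set[int],
                        max_distance: int = 6,
                        min_hits: int = 3) -> bool:
    # A video is flagged when enough sampled frames fall within a
    # small Hamming distance of fingerprints from known material,
    # which tolerates minor manipulation of the content.
    hits = 0
    for frame in frames:
        h = average_hash(frame)
        if any(hamming(h, k) <= max_distance for k in known):
            hits += 1
            if hits >= min_hits:
                return True
    return False
```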

This work has been effective in stopping the spread of known CSAM online over the years. In 2018, we announced new AI technology that steps up the fight against abusers by identifying potential new CSAM for the first time. Our new image classifier assists human reviewers in sorting through images by prioritizing the content most likely to be CSAM. It already enables us to find and report almost 100 percent more CSAM than was possible using hash matching alone, and helps reviewers find that content seven times faster.
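A minimal sketch of that prioritization step, assuming a hypothetical per-image score from the classifier (training the model itself is the hard part and is not shown):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PendingImage:
    image_id: str
    score: float  # classifier's estimated likelihood of CSAM, 0.0 to 1.0

def build_review_queue(pending: List[PendingImage]) -> List[PendingImage]:
    # Hash matching already catches known images; ranking the rest by
    # classifier score puts previously unseen material in front of
    # reviewers far sooner than a first-in, first-out queue would.
    return sorted(pending, key=lambda img: img.score, reverse=True)
```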

Since we made the new technology available for free via our Content Safety API in September, more than 200 organizations have requested access to it to support their work to protect children. Identifying and removing new images more quickly, often before they have even been viewed, means children who are being sexually abused today are more likely to be identified and protected from further abuse. It also reduces the toll on reviewers by requiring fewer people to be exposed to this material.

Because this kind of abuse can manifest through text as well as images, we recently made substantial changes to tackle predatory behavior in YouTube comments, using a classifier that flags inappropriate sexual or predatory comments on videos featuring minors and surfaces them for review. This has led to a significant reduction in violative comments this year.
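In the same hedged spirit, here is a sketch of how such surfacing might combine two signals, both hypothetical here: whether the video features minors, and a per-comment score from a classifier.

```python
from typing import Iterable, List, Tuple

COMMENT_THRESHOLD = 0.8  # illustrative score cutoff

def comments_for_review(video_features_minors: bool,
                        scored: Iterable[Tuple[str, float]]) -> List[str]:
    # Each entry is a (comment_id, predatory_score) pair from a
    # hypothetical classifier; on videos featuring minors, comments
    # scoring above the cutoff are queued for human review.
    if not video_features_minors:
        return []
    return [cid for cid, score in scored if score >= COMMENT_THRESHOLD]
```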

Underpinning all of this work is a deep collaboration with
partners. As well as the Technology Coalition, we’re members of
the Internet Watch Foundation and the WePROTECT Global Alliance, and we
report any CSAM we find to the National Center for Missing and Exploited Children, which in turn reports to law enforcement.

Technology, and the methods used by those who seek to exploit
it, are constantly evolving and there will always be more to do to
tackle this heinous crime. We are crystal clear about our
responsibility to ensure our products and services offer safe
experiences, and we are fully committed to protecting children from
sexual exploitation.
