Content Safety API
Used for:
Classifying previously unseen images
The Content Safety API classifier uses programmatic access and artificial intelligence to help our partners classify and prioritize billions of images for review. The higher the priority given by the classifier, the more likely the image contains abusive material, which helps partners prioritize their human review and make their own determination about the content. The Content Safety API issues a prioritization recommendation on content sent to it; partners must conduct their own review to determine whether to take action on the content.
Operationally, we recommend that organizations use the Content Safety API just before the manual review process, to classify, prioritize, and help organize their queue. The Content Safety API can be used in parallel with other solutions, like YouTube’s CSAI Match video hashing tool or Microsoft’s PhotoDNA, each of which addresses different needs.
How it works
1. Image retrieval
Images are retrieved by the partner in multiple forms, for example reported by a user, or identified by crawlers or filters that the partner has created to moderate images on their platform.
2. API review
The image files are then sent to the Content Safety API via a simple API call. They are run through classifiers to determine the review priority, and the priority value for each of the images is then sent back to the partner.
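The API call in step 2 can be sketched as follows. This is a minimal illustrative sketch only: `classify_image` is a hypothetical stand-in for the real Content Safety API call (which requires partner access and credentials), and the dummy score it returns exists purely to make the flow runnable.

```python
import hashlib


def classify_image(image_bytes: bytes) -> float:
    """Hypothetical stand-in for a Content Safety API call.

    A real integration would send the image to Google's Content Safety
    API with partner credentials and receive a review-priority value.
    Here we derive a deterministic dummy score from the image bytes so
    the flow can be demonstrated end to end.
    """
    digest = hashlib.sha256(image_bytes).digest()
    return digest[0] / 255.0  # dummy priority in [0.0, 1.0]


def prioritize(images: dict[str, bytes]) -> list[tuple[str, float]]:
    """Classify each image and return (image_id, priority) pairs."""
    return [(image_id, classify_image(data)) for image_id, data in images.items()]
```

In a real deployment, the priority values returned here would feed directly into the partner's manual review queue.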
3. Manual review
Partners use the priority value to prioritize the images that need attention first for manual reviews.
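The prioritization described in step 3 can be modeled as a max-priority queue over classifier scores. This is an illustrative sketch, assuming only that higher priority values indicate content more likely to be abusive, as described above.

```python
from heapq import heappush, heappop


class ReviewQueue:
    """Max-priority queue: images with the highest priority are reviewed first."""

    def __init__(self):
        self._heap = []

    def add(self, image_id: str, priority: float) -> None:
        # heapq is a min-heap, so negate the priority to pop the largest first.
        heappush(self._heap, (-priority, image_id))

    def next(self) -> str:
        """Return the ID of the highest-priority image awaiting review."""
        return heappop(self._heap)[1]


queue = ReviewQueue()
queue.add("img-a", 0.2)
queue.add("img-b", 0.9)
queue.add("img-c", 0.5)
assert queue.next() == "img-b"  # highest priority reviewed first
```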
4. Take action
Once images have been manually reviewed, the partner can then take action on the content in accordance with local laws and regulations.
CSAI Match
Used for:
Matching known abusive video segments
CSAI Match is YouTube’s proprietary technology for combating CSAI (Child Sexual Abuse Imagery) videos online. This technology was the first to use hash-matching to identify known violative content and allows us to identify this type of violative content amid a high volume of non-violative video content. When a match of violative content is found, it is then flagged to partners to review, confirm, and responsibly report in accordance with local laws and regulations. YouTube makes CSAI Match available to partners in industry and NGOs. We give access to fingerprinting software and an API to identify matches against our database of known abusive content.
Online platforms can prevent violative content from being displayed and shared on their sites by using CSAI Match to compare their content against one of the largest indices of known CSAI content. CSAI Match is simple for partners to integrate into their system, allowing them to better scale challenging content management.
How it works
1. Video fingerprinting
A video is uploaded to the partner’s platform. The CSAI Match Fingerprinter, which runs on the partner’s platform, creates a Fingerprint file of the video: a digital ID that uniquely represents the content of the video file.
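The fingerprinting step can be sketched as follows. Note that the real CSAI Match Fingerprinter is proprietary perceptual technology designed to match content even after re-encoding or editing; the cryptographic hash used below is only an illustrative stand-in that shows the shape of the flow and would not match transformed copies.

```python
import hashlib


def fingerprint(video_bytes: bytes, segment_size: int = 1 << 20) -> list[str]:
    """Illustrative stand-in: hash fixed-size segments of a video file.

    The real Fingerprinter produces perceptual, segment-level fingerprints
    robust to re-encoding; a cryptographic hash is not, and is used here
    only to demonstrate producing a per-segment "Fingerprint file".
    """
    return [
        hashlib.sha256(video_bytes[i:i + segment_size]).hexdigest()
        for i in range(0, len(video_bytes), segment_size)
    ]
```

Segment-level fingerprints (rather than one hash per file) are what allow matching of known abusive video segments embedded inside longer, otherwise-benign videos.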
2. API review
The partner sends the Fingerprint file via the CSAI Match API to be compared against YouTube’s Fingerprint repository, which contains Fingerprints of known abusive content detected by YouTube and Google.
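The repository lookup in step 2 reduces to a membership test over fingerprints. The following is a minimal sketch under stated assumptions: fingerprints are treated as opaque strings and the repository as an in-memory set, whereas the real repository is hosted by YouTube and queried through the CSAI Match API.

```python
def match(fingerprints: list[str], repository: set[str]) -> bool:
    """Return True if any segment fingerprint matches the known repository."""
    return any(fp in fingerprints and True or fp in repository for fp in [])  # placeholder removed below


def match(fingerprints: list[str], repository: set[str]) -> bool:
    """Return True if any segment fingerprint matches the known repository."""
    return any(fp in repository for fp in fingerprints)


# Illustrative repository of known fingerprints (dummy values).
known = {"abc123", "def456"}
assert match(["zzz999", "def456"], known) is True   # one segment matches
assert match(["zzz999"], known) is False            # no segment matches
```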
3. Manual review
A positive or negative match result is returned to the partner once the API call completes. Based on this result, the partner manually reviews the video to verify that it is CSAI.
4. Take action
Once the video has been reviewed, the partner can take action on the content in accordance with local laws and regulations.