You need content moderation to determine whether your videos are safe or NSFW (not safe for work). With Hive + AVflow, you can add content moderation to your application in a few minutes!
Read below to manually set this up or clone a similar flow (that uses Hive and Mux) to get started even quicker.
Summary: a video gets uploaded to an S3 bucket, which triggers the flow, and the video is sent to Hive for analysis. Hive returns a JSON result identifying the parts of the video that get flagged for certain categories (such as nudity, violence, etc.), each with a probability score. AVflow then runs a script that analyzes the JSON result for scores that exceed a threshold and determines whether the video is "safe" or "unsafe". This result can be saved to a Table or sent to the client application via a webhook, where the client app can decide, for example, not to make a streaming URL available to the public.
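For reference, the script in step 4 below expects the Hive result to contain an output array whose entries each carry a classes array with a score per class. Here is a trimmed, illustrative shape of that JSON (the "class" field name and the values are assumptions for illustration, not Hive's exact schema):

{
  "output": [
    {
      "classes": [
        { "class": "yes_female_nudity", "score": 0.03 },
        { "class": "general_nsfw", "score": 0.01 }
      ]
    }
  ]
}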
Here's how to set this up:
1. Add "Hive" as a step to the Flow

2. Set up the step options

Video Source: The source video to be checked before it is made publicly available
Classes Filter: The array of tags/labels (aka classes) to check, e.g. ["nsfw", "sexual"]. If not provided, this step returns all of the tags detected in the video; if provided, the result returns only the specified tags/labels with their associated scores.
Here's an example:
["nsfw", "general_nsfw", "yes_realistic_nsfw", "yes_male_nudity", "yes_female_nudity"]
Note: Use double quotes around each tag string.
The Hive API Key: The Hive API key associated with the account/project.
3. Add a Transfer to Storage step to save the JSON result file to S3 so later steps can process it.
4. Function script: Add a function that reads Hive's output JSON and determines whether the content is safe or unsafe. You can use our script below if you would like, or clone it from here:
const https = require('https');
const fs = require('fs');
// Download the Hive result JSON from its URL to a local file.
async function downloadFromS3(srcUrl, dstFile) {
  let downloaded = 0;
  return new Promise((resolve, reject) => {
    const file = fs.createWriteStream(dstFile);
    https.get(srcUrl, (response) => {
      response.on('data', (chunk) => {
        downloaded += chunk.length;
      });
      response.pipe(file);
      // 'finish' fires once the piped response has been fully written out.
      file.on('finish', () => {
        console.log('Saved to: ' + dstFile);
        console.log('File length: ' + downloaded);
        resolve({
          fileName: dstFile,
          length: downloaded
        });
      });
    }).on('error', reject);
  });
}
// Returns true ("safe") only when every class score stays below the threshold;
// any class score at or above the threshold flags the video as unsafe.
async function checkingThreshold(theHiveResult, threshold) {
  if (!threshold) return false; // no threshold provided: treat as unsafe
  for (let i = 0; i < theHiveResult.length; i++) {
    for (let j = 0; j < theHiveResult[i].classes.length; j++) {
      if (theHiveResult[i].classes[j].score >= parseFloat(threshold)) {
        return false;
      }
    }
  }
  return true;
}
async function main(service, context) {
  try {
    const threshold = service.threshold;
    const local_thehive_file = './thehive.json';
    await downloadFromS3(service.theHiveJSON, local_thehive_file);
    const objectResultTheHive = JSON.parse(fs.readFileSync(local_thehive_file, 'utf8'));
    const checking = await checkingThreshold(objectResultTheHive.output, threshold);
    return { result: checking ? "safe" : "unsafe" };
  } catch (e) {
    console.log("ERROR SCRIPTING", e);
    return e;
  }
}
module.exports = main;
Input parameters:
[
  {
    dataTypeId: 4,
    supportedDataTypeIds: [ '4', '16', '14', '12', '15', '1', '2', '3', '6', '7' ],
    require: true,
    key: 'theHiveJSON',
    dataType: 'string'
  },
  {
    dataTypeId: 4,
    supportedDataTypeIds: [ '4', '16', '14', '12', '15', '1', '2', '3', '6', '7' ],
    require: true,
    key: 'threshold',
    dataType: 'string'
  }
]
Output Parameters:
[ { dataTypeId: 4, key: 'result', dataType: 'string' }]
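If you want to sanity-check the function outside of AVflow, you can call it directly with the two inputs. This is a rough local sketch only: the file name main.js and the S3 URL below are placeholders for your own script file and a readable link to the Hive JSON saved in step 3.

const main = require('./main'); // assumes the script above is saved as main.js

main(
  {
    theHiveJSON: 'https://your-bucket.s3.amazonaws.com/results/thehive.json', // placeholder URL
    threshold: '0.9'
  },
  {}
).then((res) => console.log(res)); // logs { result: 'safe' } or { result: 'unsafe' }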
5. Webhook: Then add a webhook back to your client app so it can decide whether to make the video private. Here is an example:

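On the client app side, a minimal receiver might look roughly like the sketch below. It assumes the webhook POSTs JSON containing the result value from step 4 plus a video identifier; the route, the videoId field, and markVideoPrivate are placeholders for your own app, so adjust them to match your flow's payload.

const express = require('express');

const app = express();
app.use(express.json());

// Placeholder for your app's own logic (e.g. flag the video record in your database).
function markVideoPrivate(videoId) {
  console.log('Marking video private:', videoId);
}

// Hypothetical route and payload field names; match them to your webhook step.
app.post('/hooks/moderation', (req, res) => {
  const { videoId, result } = req.body;
  if (result === 'unsafe') {
    markVideoPrivate(videoId); // e.g. keep the streaming URL out of public reach
  }
  res.sendStatus(200);
});

app.listen(3000);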
6. Tada! Turn the Flow on and check the logs to see the results.