
Automatically Moderate Images

This sample demonstrates how to automatically moderate offensive images uploaded to Firebase Storage. It uses the Google Cloud Vision API to detect whether an image contains adult or violent content and, if so, uses ImageMagick to blur the image.

Functions Code

See the file functions/index.js for the moderation code.

The detection of adult and violent content in an image is done using the Google Cloud Vision API. The image blurring is performed using ImageMagick, which is installed by default on all Cloud Functions instances. The image is first downloaded locally from the Firebase Storage bucket to the tmp folder using the google-cloud SDK.
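The decision step can be sketched as a small helper that inspects the Vision API's SafeSearch annotation. This is a minimal sketch, not the sample's actual code: the helper name and the LIKELY threshold are illustrative assumptions, while the likelihood values themselves are the ones the Vision API returns.

```javascript
// Likelihood scale returned by the Vision API's SafeSearch annotation.
const LIKELIHOODS = ['UNKNOWN', 'VERY_UNLIKELY', 'UNLIKELY', 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'];

// Hypothetical helper: returns true when adult or violent content is
// rated at least LIKELY. The threshold is an illustrative choice.
function shouldBlur(safeSearch) {
  const atLeastLikely = (value) =>
      LIKELIHOODS.indexOf(value) >= LIKELIHOODS.indexOf('LIKELY');
  return atLeastLikely(safeSearch.adult) || atLeastLikely(safeSearch.violence);
}

console.log(shouldBlur({adult: 'VERY_LIKELY', violence: 'UNLIKELY'})); // true
console.log(shouldBlur({adult: 'UNLIKELY', violence: 'UNLIKELY'}));    // false
```

When the helper returns true, the downloaded copy of the image is blurred with ImageMagick and re-uploaded in place of the original.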

The dependencies are listed in functions/package.json.
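As a rough illustration, the dependency list has approximately this shape (package versions omitted, and the exact set of packages may differ from the sample's actual manifest):

```json
{
  "name": "moderate-images-functions",
  "dependencies": {
    "firebase-functions": "...",
    "google-cloud": "..."
  }
}
```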

Trigger rules

The function triggers on upload of any file to the project's Firebase Storage bucket.
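Because the trigger fires for every change in the bucket, the function typically has to filter events before moderating anything. The sketch below shows the kind of guard checks involved; the field names mirror the Storage object metadata, but the helper name and the "blurred_" prefix convention are illustrative assumptions, not the sample's actual code.

```javascript
// Hypothetical guard: decide whether a Storage change event should be
// processed. Skips deletions, non-image files, and images that a
// previous invocation already blurred (assumed "blurred_" name prefix).
function shouldProcess(object) {
  if (object.resourceState === 'not_exists') return false;  // deletion/move event
  if (!object.contentType || !object.contentType.startsWith('image/')) return false;
  const fileName = object.name.split('/').pop();
  if (fileName.startsWith('blurred_')) return false;  // already moderated
  return true;
}

console.log(shouldProcess({resourceState: 'exists', contentType: 'image/jpeg', name: 'uploads/cat.jpg'})); // true
console.log(shouldProcess({resourceState: 'not_exists', contentType: 'image/jpeg', name: 'uploads/cat.jpg'})); // false
```

The already-moderated check also prevents an infinite loop: re-uploading the blurred image triggers the function again, and without a guard it would blur its own output forever.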

Setting up the sample

Create a Firebase project on the Firebase Console. Enable Billing on your project by switching to the Blaze or Flame plan, then visit the Storage tab.

Replace the placeholder FIREBASE_STORAGE_BUCKET_NAME with the name of your Firebase Storage bucket, which can be found in the Storage tab of your Firebase project's console. It is typically of the form <project-id>.appspot.com.
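One way to do this substitution from the command line is with sed. The bucket name below (example-project.appspot.com) is a hypothetical stand-in for your own, and the command is demonstrated on a throwaway file rather than the sample's actual functions/index.js:

```shell
# Illustrative only: in the real sample you would run the sed command
# against functions/index.js with your own bucket name.
echo 'const bucket = "FIREBASE_STORAGE_BUCKET_NAME";' > /tmp/index_demo.js
sed -i 's/FIREBASE_STORAGE_BUCKET_NAME/example-project.appspot.com/' /tmp/index_demo.js
cat /tmp/index_demo.js
```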

In your Google Cloud Console, enable the Google Cloud Vision API.

Deploy and test

To test the sample:

  • Deploy your project using firebase deploy
  • Go to the Firebase Console Storage tab and upload an image that contains adult or violent content. After a short time the image will be replaced by a blurred version of itself.