This sample demonstrates how to automatically moderate offensive images uploaded to Firebase Storage. It uses the Google Cloud Vision API to detect whether an image contains adult or violent content and, if so, uses ImageMagick to blur the image.
## Functions Code
See file [functions/index.js](functions/index.js) for the moderation code.
The detection of adult and violent content in an image is done using the [Google Cloud Vision API](https://cloud.google.com/vision/).
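As a rough illustration, here is what such a check could look like with the `@google-cloud/vision` Node.js client. This is a sketch, not the sample's actual code (see `functions/index.js` for that); the function name `isOffensiveImage` and the likelihood threshold are illustrative.

```js
const vision = require('@google-cloud/vision');

const client = new vision.ImageAnnotatorClient();

// Runs SafeSearch detection on an image stored in a Cloud Storage bucket and
// returns true when Vision marks it as likely adult or violent content.
async function isOffensiveImage(bucketName, filePath) {
  const [result] = await client.safeSearchDetection(`gs://${bucketName}/${filePath}`);
  const annotation = result.safeSearchAnnotation;
  const flagged = ['LIKELY', 'VERY_LIKELY'];
  return flagged.includes(annotation.adult) || flagged.includes(annotation.violence);
}
```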
The image blurring is performed using ImageMagick, which is installed by default on all Cloud Functions instances. The image is first downloaded from the Firebase Storage bucket to the instance's local `tmp` folder using the [google-cloud](https://github.com/GoogleCloudPlatform/google-cloud-node) SDK.
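Below is a minimal sketch of that download → blur → re-upload flow, assuming the current `@google-cloud/storage` client and ImageMagick's `convert` binary; the helper name `blurImage` and the blur radius are illustrative.

```js
const {Storage} = require('@google-cloud/storage');
const {execFile} = require('child_process');
const {promisify} = require('util');
const os = require('os');
const path = require('path');

const storage = new Storage();
const execFileAsync = promisify(execFile);

// Downloads a flagged image, blurs it in place, and overwrites the original.
async function blurImage(bucketName, filePath) {
  const bucket = storage.bucket(bucketName);
  const tempLocalFile = path.join(os.tmpdir(), path.basename(filePath));

  // Download the image to the instance's writable local directory.
  await bucket.file(filePath).download({destination: tempLocalFile});

  // Blur the image with ImageMagick, which ships on Cloud Functions instances.
  await execFileAsync('convert', [tempLocalFile, '-channel', 'RGBA', '-blur', '0x24', tempLocalFile]);

  // Re-upload the blurred image to the same path, replacing the original.
  await bucket.upload(tempLocalFile, {destination: filePath});
}
```

Writing the blurred file back to the same path is what makes the offending image appear to be replaced in place when you check the Storage tab.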
## Setting up the sample

- Create a Firebase project on the [Firebase Console](https://console.firebase.google.com).
- Enable Billing on your project by switching to the Blaze (pay-as-you-go) plan, then visit the **Storage** tab.
- Enable the **Google Cloud Vision API** in the [Google Cloud Console](https://console.cloud.google.com/apis/api/vision.googleapis.com/overview?project=_).
## Deploy and test
To test the sample:
- Deploy your project using `firebase deploy`
- Go to the Firebase Console **Storage** tab and upload an image that contains adult or violent content. After a short time, the image will be replaced by a blurred version of itself. (A command-line alternative to the Console upload is sketched below.)
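If you would rather trigger the function from a script than from the Console, a short Node.js upload can stand in for the manual step. This assumes the `@google-cloud/storage` client; the bucket and file names are placeholders for your own project's values.

```js
const {Storage} = require('@google-cloud/storage');

// Uploads a local test image to the project's default Storage bucket.
// Replace <PROJECT_ID> with your Firebase project ID and adjust the paths.
async function uploadTestImage() {
  const storage = new Storage();
  await storage.bucket('<PROJECT_ID>.appspot.com').upload('./test-image.jpg', {
    destination: 'test-image.jpg',
  });
  console.log('Uploaded test-image.jpg; watch the Storage tab for the blurred version.');
}

uploadTestImage().catch(console.error);
```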