In this project, we will use an image recognition cloud service (SafeSearch in the Google Cloud Vision API) to detect whether any images uploaded to a website contain explicit content.
The administrators of social media sites run scripts on a pre-defined schedule to check for newly uploaded images. If there are new uploads, the script copies those images to a separate folder for the bot to process, because the administrator does not want the bot working directly on the social media website folder. A sketch of this copy step follows.
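The following is a minimal sketch of that scheduled copy step. The folder names ("uploads" for the website folder, "staging" for the bot's working folder) and the restriction to .jpg files are illustrative assumptions, not part of the original setup.

```python
# Hypothetical copy step: move new uploads into a staging folder for the bot.
import shutil
from pathlib import Path

UPLOADS = Path("uploads")   # assumed name for the social media upload folder
STAGING = Path("staging")   # assumed name for the bot's working folder

def copy_new_images() -> list[Path]:
    """Copy images not yet present in the staging folder and return them."""
    STAGING.mkdir(exist_ok=True)
    copied = []
    for src in UPLOADS.glob("*.jpg"):
        dest = STAGING / src.name
        if not dest.exists():          # only pick up new uploads
            shutil.copy2(src, dest)    # copy, leaving the website folder untouched
            copied.append(dest)
    return copied

if __name__ == "__main__":
    print(f"Copied {len(copy_new_images())} new image(s) to staging")
```

In practice this script would be triggered by a scheduler (for example, cron or Windows Task Scheduler) rather than run by hand.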
Once the bot detects that new files have been uploaded, it reads each file and creates an Excel log to pass to the administrator. It then loops through all the image entries in the spreadsheet and invokes SafeSearch in the Google Cloud Vision API to check each image. The API returns an output that indicates whether the image contains adult, medical, violent, or racy content. We add this output to the spreadsheet and send it to the administrator.
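Below is a minimal sketch of the SafeSearch check and the Excel log, assuming the google-cloud-vision and openpyxl packages are installed and that credentials are configured via the GOOGLE_APPLICATION_CREDENTIALS environment variable. The folder name "staging", the log file name "safesearch_log.xlsx", and the .jpg filter are placeholders for illustration.

```python
# Hypothetical bot step: run SafeSearch on each staged image and log the results.
from pathlib import Path

from google.cloud import vision
from openpyxl import Workbook

def check_images(folder: str = "staging", log_path: str = "safesearch_log.xlsx") -> None:
    client = vision.ImageAnnotatorClient()
    wb = Workbook()
    ws = wb.active
    ws.append(["File", "Adult", "Medical", "Violence", "Racy"])

    for image_file in sorted(Path(folder).glob("*.jpg")):
        with open(image_file, "rb") as f:
            image = vision.Image(content=f.read())

        # SafeSearch returns a likelihood (UNKNOWN .. VERY_LIKELY) per category.
        annotation = client.safe_search_detection(image=image).safe_search_annotation
        ws.append([
            image_file.name,
            vision.Likelihood(annotation.adult).name,
            vision.Likelihood(annotation.medical).name,
            vision.Likelihood(annotation.violence).name,
            vision.Likelihood(annotation.racy).name,
        ])

    wb.save(log_path)  # the Excel log handed to the administrator

if __name__ == "__main__":
    check_images()
```

Recording the likelihood names (such as LIKELY or VERY_UNLIKELY) rather than raw numbers keeps the spreadsheet readable for the administrator reviewing the flagged images.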