The ModerateContent Automatic Image Moderation API enables automated filtering of images from various web sources. The API flags adult-rated and violent images along with other types of inappropriate content, including depictions of alcohol, smoking, and abusive material. It also provides gender and age detection, as well as detection of faces, tags, and text (OCR) in scenes. Supported formats are .jpg, .png, .bmp, .gif, and animated GIF.
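As a rough illustration of how such an image-moderation API might be called, the sketch below builds a request URL for checking a single image. The endpoint path and the `key`/`url` parameter names are assumptions for illustration, not confirmed details of ModerateContent's documented interface.

```python
import urllib.parse

# Hypothetical endpoint: assumed for illustration, not taken from official docs.
BASE_URL = "https://api.moderatecontent.com/moderate/"

def build_moderation_request(api_key: str, image_url: str) -> str:
    """Build a query URL for moderating one image (parameter names assumed)."""
    params = {"key": api_key, "url": image_url}
    return BASE_URL + "?" + urllib.parse.urlencode(params)

request_url = build_moderation_request("YOUR_API_KEY", "https://example.com/photo.jpg")
print(request_url)
```

Sending the resulting URL with any HTTP client (e.g. `requests.get(request_url)`) would return the moderation verdict; the response schema would need to be taken from the provider's actual documentation.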