The Sightengine Image Moderation Service is designed to automatically and easily detect adult content within user-submitted photos.
The service is available through an API and works in real time: an image is moderated as it is uploaded, with no waiting times and no delays. The API works for any type of image, whether your photos are small or large, in color or black-and-white. It even works on complex scenes with multiple actions or people.
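To illustrate what an API-based realtime check might look like, here is a minimal Python sketch. The endpoint URL, parameter names (`api_user`, `api_secret`, `models`), and the `nudity`/`safe` fields in the response are assumptions for illustration; consult the official API documentation for the exact interface.

```python
import json
import urllib.parse
import urllib.request

# Illustrative endpoint; check the official documentation for the real one.
API_ENDPOINT = "https://api.sightengine.com/1.0/check.json"

def check_image(image_url, api_user, api_secret):
    # Build the query string; parameter names are assumptions for illustration.
    params = urllib.parse.urlencode({
        "url": image_url,
        "models": "nudity",
        "api_user": api_user,
        "api_secret": api_secret,
    })
    # The image is moderated synchronously, so the scores come back in
    # this single HTTP response.
    with urllib.request.urlopen(f"{API_ENDPOINT}?{params}") as resp:
        return json.load(resp)

def is_safe(result, threshold=0.5):
    # Accept the image only if the assumed 'safe' score meets the threshold.
    return result.get("nudity", {}).get("safe", 0.0) >= threshold
```

An upload handler could call `check_image` before publishing a photo and reject it when `is_safe` returns `False`, keeping the moderation step inline with the upload flow.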
This Service is based on state-of-the-art developments in the fields of Deep Learning and Artificial Intelligence. The API is therefore in constant evolution.
Since it is automated, the service offers three major benefits over human moderation: speed, low cost, and scalability.
We offer per-image pricing, starting at $0.001 per photo (1,000 photos moderated for one dollar).
If you handle large volumes of images and have specific requirements, we can set up a custom service for you. This can include on-premise deployment, calls from client devices such as mobile apps, and more.
This Image Moderation Service is currently used by a variety of customers, including social networks, messaging apps, dating sites, gaming platforms, and media in the fashion industry. All our customers share the need to detect and remove sexual content in images, whether to provide an improved user experience or to comply with the guidelines of advertisers (such as Google Adsense) or distribution platforms (such as the Apple App Store, Google Play, and Facebook).