
Image Moderation Recommendations for a Perfect UX

Uploading, sharing or sending an image is often a pain for end users. Connectivity issues, format issues, all kinds of things can go wrong. You shouldn't make things worse with your image moderation process.

Don't harm your UX

We've seen sites that let users submit photos and only review them some time later. Users get no feedback when their photos are removed, which results in a very bad user experience.

Ideally, you want such feedback to arrive as quickly as possible. For your upload stream to remain acceptable, it has to take less than 8 seconds in total. Beyond that, your users will start to think about other things and engagement will decrease.

If you use automated image moderation, you can provide feedback within the upload stream itself. You should provision approximately 1 second for the complete process (uploading the photo to the moderation API, analysis by the remote servers, and receiving and parsing the response).
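As a minimal sketch of this synchronous flow, the snippet below runs a moderation check inside the upload handler and answers the user immediately. The `moderate_image` function is a stand-in: a real integration would POST the image to a moderation API over HTTP and parse its verdict; the names and the ~1 second budget check are illustrative assumptions, not a specific API.

```python
import time

def moderate_image(image_bytes: bytes) -> bool:
    """Stubbed check; a real version would call a remote moderation API.
    Returns True if the image is acceptable."""
    return b"nudity" not in image_bytes  # placeholder rule, not a real classifier

def upload_with_feedback(image_bytes: bytes, budget_s: float = 1.0) -> str:
    """Run moderation inside the upload stream and give immediate feedback."""
    start = time.monotonic()
    accepted = moderate_image(image_bytes)
    elapsed = time.monotonic() - start
    if elapsed > budget_s:
        # In production, log this: the check is eating into the ~1 s budget
        # and the overall upload may exceed the 8 s engagement limit.
        pass
    return "published" if accepted else "rejected: this image violates our content policy"
```

The point is that the user learns the outcome before leaving the upload screen, rather than discovering a silent removal later.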

If you rely on human moderation, your checks have to be asynchronous. Your upload stream will finish and your user will move on to their next task before the image is reviewed. The question then becomes whether you let the image go live before it is reviewed. If not, you should be upfront and tell your users so.
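A minimal sketch of this asynchronous pattern, assuming an in-memory queue (a production system would use a persistent job queue and a real notification channel): the upload completes immediately with an explicit status, and a moderator resolves it later.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Submission:
    image_id: str
    status: str = "pending_review"  # shown to the uploader if you are upfront

review_queue: deque[Submission] = deque()

def submit(image_id: str, publish_before_review: bool) -> Submission:
    """Finish the upload right away; human review happens later."""
    status = "live" if publish_before_review else "pending_review"
    sub = Submission(image_id, status)
    review_queue.append(sub)
    return sub

def moderator_reviews_next(approved: bool) -> Submission:
    """Called later from a human moderator's tooling."""
    sub = review_queue.popleft()
    sub.status = "live" if approved else "removed"
    # Either way, notify the uploader so they are not left guessing.
    return sub
```

The two policies discussed above map to the `publish_before_review` flag: publish-then-review versus hold-until-approved.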


Private or semi-public content - protect your users but let them decide

Be subtle when it comes to moderating private or semi-public images, such as those in private messages or private groups. You should probably not use human moderation here: your users don't expect private content to be reviewed by outside eyes. You could, though, use automated moderation. In that case, make sure to apply the filters on the receiving end rather than on the sending end. Someone receiving a private message would see something along the lines of "We detected that this image contains nudity. Do you really want to see it?". Protect your users, but let them decide.
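A receiving-end filter could look like the sketch below. It assumes a hypothetical classifier output (a nudity probability between 0 and 1); the function names, threshold, and response shape are all illustrative, not a real API.

```python
def render_private_image(nudity_score: float, threshold: float = 0.5) -> dict:
    """Decide how to present a private image to its recipient.

    nudity_score is assumed to come from an automated classifier run
    on the receiving end; above the threshold, blur and ask first.
    """
    if nudity_score >= threshold:
        return {
            "blurred": True,
            "notice": "We detected that this image contains nudity. "
                      "Do you really want to see it?",
        }
    return {"blurred": False, "notice": None}

def recipient_confirms(view: dict) -> dict:
    """The recipient opts in: remove the blur. Protect users, let them decide."""
    return {"blurred": False, "notice": None}
```

Note that the sender's upload is never blocked; only the recipient's view is gated, which keeps the moderation out of the private sending flow.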

Be transparent in your terms

If you use human moderation, let your users know in your Terms of Use that the content will be reviewed by your employees or outside contractors.


Sightengine is an Artificial Intelligence company that develops image moderation APIs to empower business owners and developers. Sightengine's powerful image and video analysis technology is built on proprietary state-of-the-art Deep Learning systems and is made available through simple and clean APIs.