Use our proprietary Physical Violence models to detect scenes and situations involving violent assault, battery, and harm:
Regular fight
Combat sport
Non-combat sport
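As an illustration, a platform consuming such a model might pick the dominant violence class from the returned scores. This is a minimal sketch; the class names and response shape below are assumptions for illustration, not the actual API schema:

```python
# Hypothetical: pick the most likely Physical Violence class from model output.
# Class names and the dict-shaped response are assumptions, not the real schema.

VIOLENCE_CLASSES = ["regular_fight", "combat_sport", "non_combat_sport"]

def top_violence_class(probs: dict) -> tuple:
    """Return the highest-scoring violence class and its probability."""
    name = max(VIOLENCE_CLASSES, key=lambda c: probs.get(c, 0.0))
    return name, probs.get(name, 0.0)

# Example model response (illustrative values)
response = {"regular_fight": 0.91, "combat_sport": 0.06, "non_combat_sport": 0.03}
label, score = top_violence_class(response)  # ("regular_fight", 0.91)
```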
Detect firearms (guns, rifles...), knives, and blades with the Weapon API. Apply smart rules adapted to the context in which the weapon appears, in line with your community guidelines. You can block any weapon, or block depending on how the weapon is used, picking from the following categories:
Level 1
Level 2
Level 3
Level 4
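The level-based rules above could be applied on the platform side along these lines. The mapping of levels to block/review/allow actions and the detection format are assumptions made for illustration; the actual levels and their meanings come from the API:

```python
# Hypothetical rule table: map Weapon API usage levels to moderation actions.
# The level-to-action mapping is an assumed policy, not part of the API itself.

RULES = {
    "level_1": "allow",   # assumed mildest category
    "level_2": "review",
    "level_3": "review",
    "level_4": "block",   # assumed most severe category
}

def decide(detections: list) -> str:
    """Return the strictest action triggered by any detected weapon."""
    severity = {"allow": 0, "review": 1, "block": 2}
    action = "allow"
    for d in detections:
        candidate = RULES.get(d["level"], "review")  # unknown levels go to review
        if severity[candidate] > severity[action]:
            action = candidate
    return action

decide([{"type": "knife", "level": "level_1"},
        {"type": "gun", "level": "level_4"}])  # "block"
```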
Self-harm and mental-health related images and videos encompass a large variety of situations. These include actual self-inflicted harm such as cuts, wounds, scars, and stabbing oneself. Images and videos can also suggest suicide or self-harm, for instance through depictions of hanging or of a weapon aimed at oneself.
Cutting
Self-harm scars
Self-threatening
Hanging
Other high-risk situations
...
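Given these categories, a conservative platform policy would typically block an image flagged with any self-harm class above a threshold. A minimal sketch, where the class keys mirror the categories listed above but the response shape and threshold are assumptions:

```python
# Hypothetical: block content if any self-harm class scores above a threshold.
# Class names follow the categories above; the schema and 0.5 cutoff are assumed.

SELF_HARM_CLASSES = {"cutting", "self_harm_scars", "self_threatening",
                     "hanging", "other_high_risk"}

def should_block(classes: dict, threshold: float = 0.5) -> bool:
    """True if any self-harm class meets or exceeds the threshold."""
    return any(classes.get(c, 0.0) >= threshold for c in SELF_HARM_CLASSES)
```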
Gore images and videos can be extremely disturbing and harmful to anyone viewing them. Detecting them is usually of the highest priority for all apps and platforms.
Using the Hate Detection Model, you can detect images of extremist groups and displays of terrorist signs and symbols such as the ISIS flag, as well as other instances of hate. You can also detect extremist references and hate speech in text messages.
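Since hate can surface in both the image and the accompanying text, a platform would typically combine both signals into a single decision. A minimal sketch, where the field names and threshold are assumptions rather than the model's actual output format:

```python
# Hypothetical: combine image and text hate signals into one moderation decision.
# The "hate_symbol" / "hate_speech" keys and the 0.5 threshold are assumptions.

def moderate_post(image_result: dict, text_result: dict,
                  threshold: float = 0.5) -> str:
    """Block a post if either the image or the text signal crosses the threshold."""
    hate_score = max(image_result.get("hate_symbol", 0.0),
                     text_result.get("hate_speech", 0.0))
    return "block" if hate_score >= threshold else "allow"
```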
Empower your business with powerful content analysis and filtering.