Detect Violence in Images and Videos at scale

Violence can take many forms. Use our best-in-class models to detect and moderate any type of violence: physical violence, gore, threats, extremism, self-harm...


Detect Physical Violence, threats and abuse

Use our proprietary Physical Violence models to detect scenes and situations with violent assault, battery and harm:

  • Fights, with the ability to distinguish between combat sports and unruly or street fights
  • Strangling, choking or asphyxiation, whether inflicted by a person or a device
  • Hanging and public executions
  • Displays of someone being tied up, gagged...
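Checks like the ones above are typically run through a REST call. The sketch below only builds the request URL; the endpoint, model name and parameter names are assumptions based on common moderation APIs, so verify them against the API reference before use.

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names -- check the API docs for the real ones.
API_ENDPOINT = "https://api.sightengine.com/1.0/check.json"

def build_check_request(image_url, api_user, api_secret, models=("violence",)):
    """Build the query URL for a moderation check on a hosted image."""
    params = {
        "url": image_url,            # publicly reachable image URL
        "models": ",".join(models),  # comma-separated list of models to run
        "api_user": api_user,        # account credentials
        "api_secret": api_secret,
    }
    return API_ENDPOINT + "?" + urlencode(params)

print(build_check_request("https://example.com/image.jpg", "my_user", "my_secret"))
```

Sending the resulting URL with any HTTP client returns a JSON verdict that you can feed into your moderation rules.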

Differentiate fights based on context

Example classifications: regular fight, combat sport, non-combat sport, unruly fight.

Detect Weapons with multiple threat levels

Detect firearms (guns, rifles...), knives and blades with the Weapon API. Apply smart rules adapted to the weapon's context and in line with your community guidelines. You can block all weapons, or block based on how the weapon is used, by picking from the following categories:

  • Level 1: Immediate threat (e.g. a rifle aimed at a person)
  • Level 2: Aiming, no threat (e.g. aiming at a target with a rifle)
  • Level 3: In hand, not aiming (e.g. a rifle held in hand)
  • Level 4: Not in hand (e.g. a gun not held by anyone)
  • Image type: Animated (e.g. an animated gun or rifle)
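Guideline-specific rules over these levels can be expressed as a small policy table. The class identifiers below are hypothetical stand-ins for whatever labels the API actually returns; only the block/review/allow mapping is the point of the sketch.

```python
# Hypothetical class names mirroring the four threat levels plus the animated type.
BLOCK_LEVELS = {"immediate_threat", "aiming_no_threat"}  # Levels 1-2: always block
REVIEW_LEVELS = {"in_hand_not_aiming"}                   # Level 3: human review
# Level 4 ("not_in_hand") and animated imagery fall through to "allow".

def weapon_action(detected_class, confidence, threshold=0.5):
    """Map a detected weapon category and confidence to a moderation action."""
    if confidence < threshold:
        return "allow"  # detection too weak to act on
    if detected_class in BLOCK_LEVELS:
        return "block"
    if detected_class in REVIEW_LEVELS:
        return "review"
    return "allow"

print(weapon_action("immediate_threat", 0.92))  # block
print(weapon_action("not_in_hand", 0.88))       # allow
```

A stricter community could simply move Level 3 or Level 4 classes into the block set.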

Detect Self-harm and suicide-related imagery

Self-harm and mental-health-related images and videos encompass a wide variety of situations. These include actual self-inflicted harm, such as cuts, wounds, scars and stabbing oneself. Images and videos can also suggest suicide or self-harm, for instance through depictions of hanging or of a weapon aimed at oneself.


Example detections: self-harm scars and other high-risk situations.


Detect Gore content, graphic violence and blood

Gore images and videos can be extremely disturbing and hurtful to anyone viewing them, so detecting them is usually a top priority for all apps and platforms. Gore content includes:

  • Scenes with lots of blood or bleeding
  • Open wounds, guts and disturbing medical imagery
  • Corpses and lifeless bodies
  • Mutilation, human remains
  • Human skulls and imagery related to death
  • Animal suffering
  • Other horrific imagery
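For gore, a simple thresholding rule over the model's score is often enough. The response layout below (a `gore` object with a `prob` field) is an assumption for illustration, not the documented schema; the two-threshold block/review split is the part to reuse.

```python
GORE_BLOCK_THRESHOLD = 0.8   # block outright at or above this score
GORE_REVIEW_THRESHOLD = 0.4  # queue for human review at or above this score

def gore_decision(response):
    """Turn a (hypothetical) gore probability into a moderation decision."""
    prob = response.get("gore", {}).get("prob", 0.0)
    if prob >= GORE_BLOCK_THRESHOLD:
        return "block"
    if prob >= GORE_REVIEW_THRESHOLD:
        return "review"
    return "allow"

print(gore_decision({"gore": {"prob": 0.93}}))  # block
print(gore_decision({"gore": {"prob": 0.05}}))  # allow
```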


Detect Extremism and Hate

Using the Hate Detection Model, you can detect images of extremist groups, displays of terrorist signs and symbols such as the ISIS flag, as well as other instances of hate. You can also detect extremist references and hate in messages.

Nazi symbols:
  • Swastika
  • Nazi flag
  • SS bolts
  • Sturmabteilung
  • Iron cross

Supremacist or Extremist:
  • KKK imagery
  • Confederate flag
  • ISIS flag
  • Burning cross
  • Valknut
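If the model reports which symbol classes it detected, enforcing a guideline is a set intersection against your blocklists. The class identifiers below are hypothetical labels chosen to match the symbols listed above; substitute whatever identifiers the API actually returns.

```python
# Hypothetical class identifiers for the hate symbols listed above.
NAZI_SYMBOLS = {"swastika", "nazi_flag", "ss_bolts", "sturmabteilung", "iron_cross"}
EXTREMIST_SYMBOLS = {"kkk", "confederate_flag", "isis_flag", "burning_cross", "valknut"}

def flag_hate_symbols(detected_classes):
    """Return the sorted subset of detected classes that violate the hate policy."""
    banned = NAZI_SYMBOLS | EXTREMIST_SYMBOLS
    return sorted(set(detected_classes) & banned)

print(flag_hate_symbols(["isis_flag", "cat", "swastika"]))  # ['isis_flag', 'swastika']
```

Keeping the two groups as separate sets makes it easy to report which policy category a violation falls under.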
