
About AiLert

  • Founders: Joachim Levy
  • Founded in: 2017
  • Employees: 10
  • Money raised: almost $2 million
  • Ultimate goal: Prevent (mass) shootings with our SAMSON weapon detection software and become the benchmark in this sector.

One of the tasks AI has proven to be really good at is image recognition. Countless algorithms can now check lung scans to identify cancer, or tree leaves to identify pests. Image recognition can also serve another purpose: security.

AiLert's software enables security cameras to detect weapons. When a potential weapon is recognized, the system alerts security, sending a picture and footage in which the suspected weapon is circled. This way, officers already have an idea of the situation. In a previous episode of the Start-up of the Day series, we brought you the story of AiLert – back then, it was still called 1702ai – and of SAMSON, its Weapon Detection Tool (WDT). We reached out again to co-founder and CEO Joe Levy, who told us more about the company's latest developments.

What has happened since our last interview? 

“Our team grew from a handful to about ten employees. We have also improved our weapon detector for security cameras: it is faster, it produces fewer false positives, and our technology is now compatible with standard messaging security protocols. This means we are almost plug-and-play when working with security companies like Securitas or ADT. Moreover, we are now equipping our first US-based university, and a high-end jewelry brand has decided to equip all of its stores in Greece with AiLert technology.”

How did you further develop SAMSON? 

“As of today, SAMSON has gained credibility with the European Commission. It can successfully detect small magazine-fed handguns in public spaces.”

How do you process data? 

“Overall, our process is straightforward, encrypted, and complies with the GDPR and the AI Act. First, the incoming data is continuously analyzed by the algorithm for weapon detection patterns. Whenever the Weapon Detection Tool issues an alert, a human operator must check the footage and flag the detection as either a positive threat – the checker confirms the need for further action – or a non-threatening situation – a false identification of a weapon or the spotting of an armed security person.

Only during the evaluation phase can the human operator at the alarm center see the person(s) in the footage. During the first seconds of the alert footage and imagery, any information related to actual persons – face, body, and so on – is removed in a video buffer to prevent direct access to personal data. This technique is known as pseudonymization, and AiLert has implemented it wherever the WDT tool is installed. Protecting personal data through pseudonymization is therefore incorporated into the design of the WDT tool: it is privacy by design, available as a default option.”
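The workflow Levy describes – continuous analysis, a pseudonymized alert clip, and a mandatory human verdict – can be sketched roughly as follows. This is only an illustrative Python sketch; all names (`Alert`, `pseudonymize`, `strip_personal_data`, `Verdict`) are hypothetical and do not reflect AiLert's actual implementation.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, List, Tuple

class Verdict(Enum):
    POSITIVE_THREAT = "positive_threat"    # operator confirms further action is needed
    NON_THREATENING = "non_threatening"    # false identification or armed security person

@dataclass
class Alert:
    camera_id: str
    frames: List[bytes]                    # buffered clip around the detection
    weapon_box: Tuple[int, int, int, int]  # box drawn around the suspected weapon

def strip_personal_data(frame: bytes) -> bytes:
    """Placeholder: a real system would mask faces and bodies in the frame."""
    return frame

def pseudonymize(frames: List[bytes]) -> List[bytes]:
    """Remove person-related information from the buffered frames by default,
    so the alert never exposes personal data directly (privacy by design)."""
    return [strip_personal_data(frame) for frame in frames]

def handle_detection(alert: Alert,
                     operator_review: Callable[[List[bytes], Tuple[int, int, int, int]], Verdict]) -> Verdict:
    # 1. The algorithm has already flagged a suspected weapon in `alert`.
    # 2. Personal data is removed in the video buffer before anyone sees the clip.
    safe_frames = pseudonymize(alert.frames)
    # 3. A human operator at the alarm center reviews the clip and returns a verdict;
    #    only a POSITIVE_THREAT verdict leads to further action.
    return operator_review(safe_frames, alert.weapon_box)
```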

Does the algorithm have any bias like other AI tools? 

“The WDT tool does not negatively discriminate against people or groups. The algorithm's training data contains a variety of artificially generated objects and people, as well as AiLert's self-produced CCTV footage of real people and objects. AiLert constantly improves the training data set(s) with additional AI-generated material to diversify the object and people recognition database. The recognition capacity is not oriented toward personal data of any nature but rather focuses on classifying general categories – such as a person, a cellphone, or a hand weapon.”
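As a rough illustration of that class-level approach, here is a hypothetical detection record that carries only a coarse category label, a confidence score, and a bounding box – no identity attributes. The class list and the 0.8 threshold are assumptions made for this sketch, not AiLert's actual values.

```python
from dataclasses import dataclass
from typing import Iterable, Tuple

# Coarse, non-personal categories only (illustrative list, not AiLert's).
WEAPON_CLASSES = {"hand_weapon", "rifle", "knife"}
OTHER_CLASSES = {"person", "cellphone"}

@dataclass
class Detection:
    label: str                      # one coarse category, never an identity
    confidence: float               # model score in [0, 1]
    box: Tuple[int, int, int, int]  # (x, y, width, height) in the frame

def is_weapon_alert(detections: Iterable[Detection], threshold: float = 0.8) -> bool:
    """Raise an alert only when a weapon-class object passes the threshold;
    'person' or 'cellphone' detections alone never trigger anything."""
    return any(d.label in WEAPON_CLASSES and d.confidence >= threshold
               for d in detections)

# Example: a person on its own does not alert, but a confident
# hand_weapon detection in the same frame does.
print(is_weapon_alert([Detection("person", 0.97, (10, 10, 80, 200)),
                       Detection("hand_weapon", 0.91, (40, 120, 30, 20))]))  # True
```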

How do you keep improving the algorithm?

“AiLert continuously works on improving the algorithm's training data set(s), monitors the WDT tool's success rate, and improves the algorithm when false detections are registered. The system also implements a traceability mechanism that allows self-assessment – and, when necessary, external independent audits – and records segments of algorithm training. Each version update and its subsequent performance can therefore be traced back to specific improvements in the code.

Much effort continuously goes into enhancing the diversity of the learning materials. A good example is the use of generative AI software that, in combination with camera footage shot on the company's premises with the company's employees, allows for the creation of artificially generated persons of all genders, skin colors, and racial and social backgrounds, who are then used as models to train the algorithm. Since the WDT tool primarily focuses on recognizing a certain class of objects – hand weapons, rifles, knives, other sharp objects, and the like – none of the above characteristics play an important role in recognizing a specific object. Therefore, bias arising out of discrimination is not expected.”
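The traceability Levy mentions – linking each model version to the exact training data and to the monitored error rates – could look something like the following registry sketch. The file names, the JSONL registry, and the metric fields are assumptions made for illustration only, not AiLert's actual tooling.

```python
import hashlib
import json
import time
from typing import Dict, Iterable

def training_set_fingerprint(file_paths: Iterable[str]) -> str:
    """Hash the training files so a model version can be traced back to the
    exact data it was trained on (useful for self-assessment and audits)."""
    digest = hashlib.sha256()
    for path in sorted(file_paths):
        with open(path, "rb") as handle:
            digest.update(handle.read())
    return digest.hexdigest()

def record_model_version(registry_path: str, version: str,
                         train_files: Iterable[str], metrics: Dict[str, float]) -> None:
    """Append an audit record: version, data fingerprint, monitored metrics."""
    entry = {
        "version": version,
        "timestamp": time.time(),
        "training_set_sha256": training_set_fingerprint(train_files),
        "metrics": metrics,   # e.g. the false-detection rate observed per site
    }
    with open(registry_path, "a", encoding="utf-8") as registry:
        registry.write(json.dumps(entry) + "\n")

# Hypothetical usage: log version "1.4" with its data fingerprint and the
# error rate observed in the field (all values here are placeholders).
# record_model_version("wdt_registry.jsonl", "1.4",
#                      ["clips/synthetic_001.mp4", "clips/cctv_002.mp4"],
#                      {"false_detection_rate": 0.0})
```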

What are your future plans?

“Offering SAMSON weapon detection tools via the cloud.”

In our previous chat, you said you wanted to become the best and best-known weapon detection software company. How far are you from reaching this goal?

“We are getting there – half of our new clients came to us from our competitors. As of today, we cannot process hundreds of CCTV cameras per site; we only handle about a dozen, but the system is precise and blazing fast.”