This Russian Facial Recognition Startup Plans To Take Its ‘Aggression Detection’ Tech Global With $15 Million Backing From Sovereign Wealth Funds

Next year, in cities across the world, expect to have your face scanned for levels of aggression. NtechLab, a Russian facial recognition company best known for the FindFace app once labelled a harbinger of the end of online privacy, says it's currently testing "aggression detection" tech with plans for a full rollout to its surveillance partners and customers in 2021.

To help it along the way, it's just received $15 million from two sovereign wealth funds – one the Russian Direct Investment Fund, the other a mysterious, unnamed Middle East partner. NtechLab had previously received support from the U.A.E. sovereign investor Mubadala Investment Company, but it wouldn't tell Forbes who was putting up the funds in this round. The money is helping the startup open a Latin American office, with plans for a Middle East base and a possible Asian HQ too. "Our main advantage is that our solution is really scalable for any size of city," claims NtechLab cofounder Alexander Kabakov, adding that it can work with hundreds of thousands of cameras across a metropolis.

Back in 2016, NtechLab caused a stir when FindFace launched, promising users they could take any image of a face and tie it to social media profiles on Russian site Vkontakte (VK). The company has since gone on to become a surveillance supplier, most notably in Moscow, where its algorithms are powering a massive facial recognition surveillance project. According to Kabakov, in the first ten days of January, the Moscow system helped catch 34 "criminal persons," though he couldn't say how many were actually prosecuted.

Facial recognition was already a controversial technology, labelled not just harmful to individual privacy but also potentially racist, with some algorithms flagging more Black individuals as suspicious than white. But aggression and emotion detection are also ethically and technically questionable. In a study from the University of Maryland's Lauren Rhue, tools from Microsoft and Chinese company Face++ were tested on NBA players. The Face++ AI consistently interpreted Black players as angrier than white counterparts when looking at how they smiled. Microsoft's AI decided Black players were more "contemptuous" when facial expressions were ambiguous. Meredith Whittaker, cofounder of the AI Now research institute and founder of Google's Open Research Group, previously told Forbes: "The idea that someone's interior characteristics and feelings are mapped in some regular and universal way to people's physicality or actions has not been proven, to say the least, and the claims being made by these systems have not been validated."

Faces of fury

Such concerns are perhaps why NtechLab isn't rushing the feature out, with cofounder Artem Kukharenko noting that it needs to be "100% sure" before launch. The company later clarified that its aggression detection is currently in its "early development phase," and so it couldn't provide figures pertaining to its accuracy. "It's very complex as it incorporates silhouette detection, action detection and detection of emotions. Our existing algorithm for silhouettes and emotions has a strong rate of accuracy and incorporating this with action detection will require further research and development," the company said in a statement. "We are looking forward to presenting it to the public next year."

Its "silhouette tracking" is already in use and attempts to pick up on the unique silhouette patterns created by an individual's body shape. This is useful when the target's face is obscured in some way. Then there's the ability to turn on "violence detection," where surveillance footage can flag individuals acting violently. Together with vehicle recognition, these allow security and policing units to track an individual behaving violently across a city, even if their face is hidden, said Kabakov.

“Right now, that accuracy is quite impressive… so that we can use it in pilot projects. We’re still improving it so that we can use it [on the] scale of the whole city,” added Kukharenko.

Kabakov said that beyond Moscow the company has customers in Argentina, Brazil and Serbia. It's unlikely the tech will make it to the U.S. or the U.K. given the Russian provenance of NtechLab, he admits.

Regardless of their ethics, non-Russian companies won't face the same barriers to entry if they can come up with similar tools to scan a face for warning signs of violence.