Surveillance startup wins Microsoft backing as a ‘tool for good’


AnyVision in March joined peers including Amazon and Microsoft in indicating they were open to regulation of facial recognition systems in the US. — AFP

A month ago, AnyVision was preparing to announce an investment round and tell the world that the Israeli computer-vision and facial-recognition startup had the backing of two new high-profile investors. 

But with funding secured and a press release ready to go, AnyVision hit pause. Instead, its founders spent a few weeks ensuring that the policies governing the use of its technology met the ethical standards of incoming investor Microsoft Corp. Facial recognition technology is under a microscope these days amid worries that it can be used to facilitate mass surveillance, amplify human bias in policing and otherwise violate people’s civil rights. 

“All of our investors agree that AI is here to stay,” said Max Constant, AnyVision’s chief commercial officer. “They’re asking the question now of how do we leverage this in a way so that we can actually make sure that this is a tool for good.” 

AnyVision, whose full name is AnyVision Interactive Technologies Ltd, on June 18 said Microsoft and Silicon Valley venture capital firm DFJ had joined in the company’s Series A investment round, which raised US$74mil (RM309.13mil) over the last year or so. That sum includes previously disclosed investments from Robert Bosch GmbH, Qualcomm Inc and Lightspeed Venture Partners. 

AnyVision’s core product, Better Tomorrow, is designed to alert operators when people on a watchlist are caught on camera or track them as they move through an area. The company says it has signed up customers in more than 45 countries, including state and local police departments, casinos, airports and stadiums. 

While the technology could help thwart terrorism or crimes, civil liberties groups have raised alarms about the proliferation of facial recognition tools and their potential misuse by governments. They say such abuse is already going on in China as part of a government crackdown on the minority Muslim Uighur population there. 

In the US, Amazon.com Inc’s Rekognition software has been singled out by the American Civil Liberties Union. But the retail and technology giant is hardly alone in developing software capable of quickly identifying people in still images or video. Microsoft and Alphabet Inc’s Google have their own versions of the technology. 

A report by the ACLU last week warned that equipping the tens of millions of surveillance cameras in use in the US with artificial intelligence capabilities would lead to discrimination, over-enforcement and abuse. The group called for the prohibition of mass government surveillance and “policies that will prevent private-sector deployments from becoming equally oppressive”. 

AnyVision in March joined peers including Amazon and Microsoft in indicating they were open to regulation of facial recognition systems in the US. 

Microsoft is rolling out ethical principles for its own facial recognition technology. Leaders of M12, the Microsoft corporate venture fund that invested in AnyVision, have also discussed the need to build AI tools designed to do good in the world. 

AnyVision’s ethical standards remain a work in progress. Draft guidelines for a new AI advisory board that will guide executives and contribute to public discussion on the matter note the potential for misuse of powerful technology and the company’s “inherent responsibility to ensure the work and products are put to proper use”. 

The committee does not yet have a quorum. Constant said two people, whom he declined to name, had agreed to join. “It’s imminent,” he said. “It’s just about formalising everything.” 

Sam Fort, a partner with DFJ who led the firm’s investment in AnyVision, said the Israeli company has always prioritised the ethical use of its technology. “Now they want to be very public about all the things they are doing.” – Bloomberg