Legal Aid Society


SI District Attorney Spent Controversial Forfeiture Funds on Facial Recognition Tech

Last summer, The Legal Aid Society revealed that Staten Island District Attorney Michael McMahon had used the controversial facial recognition technology Clearview AI on unsuspecting New Yorkers.

Clearview AI is primarily used by law enforcement to match photos of unknown suspects to online images. Clearview has garnered intense criticism for infringing on basic privacy and civil rights by collecting and storing data on people without their knowledge or consent.

New documents, obtained by Legal Aid via a Freedom of Information Law (FOIL) request, show that McMahon’s office paid for the facial recognition software using the Federal Equitable Sharing Program (FESP), a large-scale asset forfeiture program that allows local municipalities to access a shared pool of funds in exchange for cooperating with federal investigations, as reported by The Intercept.

Asset forfeiture programs have come under scrutiny for undermining basic constitutional rights. Homes, money, and property are often seized without due process and on mere suspicion. The practice incentivizes “policing for profit,” and has devastating consequences for communities that are already over-policed. FESP is particularly problematic because it enables law enforcement agencies to circumvent local asset forfeiture restrictions.

Essentially, the Richmond County DA (RCDA) is using property and money taken from communities, with little to no due process, to fund surveillance of those same communities.

“Law enforcement and government agencies have a proven history of concealing the use of facial recognition and utilizing it against the public, without any oversight or regulation,” said Diane Akerman, Staff Attorney with the Digital Forensics Unit at The Legal Aid Society. “The revelation that the funds used to access the Clearview AI service were derived from property obtained without due process, from the same individuals who are most at risk of the devastating consequences of its flaws, is nearly dystopian.”

“Though their own guidelines required regular auditing of the use of Clearview AI, no such measures were taken,” she continued. “The lack of oversight, and the RCDA’s failure to engage transparently with the public, is further proof that government and law enforcement agencies cannot be trusted with this highly invasive technology.”