In January, when New York’s Public Oversight of Surveillance Technology (POST) Act came into effect, the New York City Police Department was suddenly forced to detail the tools it had long kept out of public view. But instead of giving New Yorkers transparency, the NYPD released boilerplate, error-filled statements that conceal almost anything of value. Almost none of the policies list specific vendors, surveillance tool models, or information-sharing practices. The department’s facial recognition policy says it can share data “in accordance with ongoing criminal investigations, civil litigation and disciplinary proceedings,” a standard so broad as to be virtually meaningless.
This is the largest test to date of Community Control Over Police Surveillance (CCOPS), a growing effort to ensure that the public regains control over decisions about police surveillance, deciding whether tools like facial recognition, drones, and predictive policing are acceptable for their neighborhoods. The battle unfolding in New York City – not only over what technology police are allowed to use, but how they use it, how that use is overseen, and how it is disclosed – contains broad lessons about the future of surveillance. As more cities and municipalities across the country implement policies on surveillance technologies like facial recognition, and more citizens lobby for CCOPS in their own communities, the challenges and gaps encountered in New York City show that transparency requirements on paper only matter when police departments actually comply.
Surveillance technologies already widely used by law enforcement agencies across the country often make surveillance cheaper, faster, and more passive. Take facial recognition: when run on video cameras in public places, it can monitor faces constantly by algorithm (i.e., cheaper and faster), from afar and in passing (requiring no physical search), and even outside the confines of the traditional Fourth Amendment warrant process. Other examples abound: drones flown over protest crowds; police cars equipped with automatic license plate readers that digitize and centrally store license plates as law enforcement vehicles drive down streets or through parking lots. Algorithms are used throughout the criminal justice system, from police precincts that “predict” crime, to bail hearings, to the sentencing bench.
Despite examples like that of the NYPD, CCOPS has seen many successes. The first city to adopt the CCOPS model was Oakland, Calif., where generations of advocacy against police violence, led mostly by Black and Latino advocates, culminated in 2015 with the creation of the Oakland Privacy Commission. Oakland’s was not only the first but also the strongest CCOPS ordinance, granting the Privacy Commission independence and full power to approve or prohibit police surveillance tools. Since its creation, the Privacy Commission has repeatedly questioned department officials, restricted the use of predictive policing and biometric surveillance software, and more recently voted to recommend that Oakland police stop using automatic license plate readers.
Across the bay, San Francisco followed suit with its own CCOPS law in 2019. While it did not go so far as to create an independent commission, it empowered the city legislature to approve or ban police surveillance tools. Notably, the bill also included a ban on government use of facial recognition, the first in the country. Many cities followed in the months after, banning targeted technologies like facial recognition or improving overall accountability. Four jurisdictions have also banned police from signing nondisclosure agreements with surveillance vendors, removing a common excuse for police opacity. Other successes include San Diego, whose city council adopted a surveillance oversight ordinance at the end of 2020 after a backlash over a police “smart streetlight” program.
None of these decisions came out of nowhere; a confluence of community activism, media reports, attention from local politicians, and other factors made these surveillance reform ideas a reality. New York City’s current struggles with its own surveillance oversight highlight the need for constant work to make such oversight not just transparency on paper, but a source of meaningful, enforceable change in policing practice.
In accordance with the Public Oversight of Surveillance Technology Act, the NYPD published an initial list of deployed surveillance technologies that includes audio recording devices, cell-site simulators, license plate readers, and facial and iris recognition. The public has until February 25 to submit comments in response. But there are problems with these newly required disclosures – proper democratic control over these surveillance technologies doesn’t come simply from knowing they exist. The department’s published documentation on facial recognition contains the same copied-and-pasted assurances as all its other policies, stating that such tools will only be used for lawful law enforcement purposes.