In early June, Julie Weiner was at a Black Lives Matter rally in downtown Yonkers, New York, when she spotted a small drone in the sky, monitoring the protest. Weiner, a long-time Yonkers resident, immediately asked her city councilperson, Shanae Williams, who had organized the rally, whether the drone was being operated by the Yonkers police. Williams went over to talk to a group of police and returned to report that, yes, the drone belonged to the Yonkers Police Department.
“That’s a problem,” Weiner told Williams. But Williams didn’t seem concerned and, according to Weiner, explained that it wasn’t an issue because anyone can own a drone.
Shortly after, Weiner, a licensed mental health counselor, former secretary of the Yonkers NAACP and a member of a local peace and justice group, sent an email to Yonkers Mayor Michael Spano, Police Commissioner John Mueller and City Council President Michael Khader, expressing concern that the Yonkers police drone surveillance that she had witnessed and photographed “may be a violation of our rights freely to assemble, and to be free from unwarranted searches or seizures.”
In response to the letter, Weiner was granted an interview with Commissioner Mueller, a conversation that sheds light on the pressures driving the growing national use of police drones and police surveillance.
According to Mueller, Yonkers police are not saving any drone video footage of protests, nor are they taking still images of individuals captured on the video. Nor is Yonkers using facial recognition technology, which has often been found to be biased against people of color.
Yonkers will supply drone video footage to the FBI, the Department of Homeland Security, the New York State Police or any other law enforcement agency only if such a request is made in writing on “official letterhead” and is determined to be legal by the City Corporation Counsel, Mueller said.
Meanwhile, Yonkers has drafted a policy for use of police drones that will be submitted for approval to the U.S. Justice Department under a 2016 oversight agreement. There is no plan to hold public hearings on the policy before it is submitted, nor is there a plan to put the policy to a vote by the Yonkers City Council. “Not that it couldn’t happen, but I don’t think there’s any mechanism in place for the council to review police policy.… One is the executive branch, and the other one is the legislative branch,” Mueller said.
Enter Big Data
According to Mueller, Yonkers is going to buy police body cameras. While a vendor has not yet been selected, he appears to favor the cameras produced by Axon Enterprise Inc., formerly known as TASER International. Publicly available information indicates that Axon may control around 80 percent of the police body camera market, a figure the Axon Press Team says is inflated.
Yonkers police also use Axon Tasers, which are equipped with video cameras to record each incident in which a Taser is used. This footage is currently held on a Yonkers Police Department server, but it may at some point be fed into Evidence.com, a massive data collection, record-keeping and sorting system operated by Axon, which in turn is hosted on Microsoft’s Azure, a mega-cloud computing system that serves military and federal customers as well as local police. (Note: Axon no longer produces Tasers equipped with their own cameras, but according to Fast Company, Axon sells Taser battery packs that can activate police body cams and dashboard cameras when the weapon is armed.)
DJI, the maker of the drones used in Yonkers, is the foremost seller of drones to police and is in partnership with Axon, meaning that police drone footage could easily be fed into Evidence.com if Mueller permitted it. However, Yonkers police spokesperson Dean Politopoulos said that no video from Yonkers drones would enter Evidence.com. As of press time, Mueller had not responded to Truthout’s follow-up request to clarify whether video footage is being saved and entered into the Evidence.com system or any other such system.
The Yonkers case illustrates how local police can be swept up in the rush to feed surveillance data into massive, unregulated cloud computing systems and artificial intelligence programs that threaten intensified police and military vigilantism, particularly against people of color.
The Pressure of Big Data
The extent of police drone surveillance, as well as other police monitoring, is dramatically represented by the national map of The Atlas of Surveillance, launched on July 13, 2020, by the Electronic Frontier Foundation. However, the Atlas undercounts drone use and surveillance in general: its authors say their data is gathered from published reports and crowdsourcing, and not all surveillance is publicly reported.
“Although we have amassed more than 5,000 datapoints in 3,000 jurisdictions,” the report writers say, “our research only reveals the tip of the iceberg and underlines the need for journalists and members of the public to continue demanding transparency from criminal justice agencies.” For example, the Yonkers police drones are not noted in the report.
A March 2020 study by the Bard College Center for the Study of the Drone finds that at least 1,103 law enforcement agencies across the United States have purchased drones and that there has been a dramatic increase in their use since 2014. The report says, however, that the number of police drones in use is likely higher because the study did not include “agencies with undisclosed drone programs or federal agencies.”
The pressure on police to conduct surveillance with Evidence.com (and similar systems) is largely self-generating: it builds through the almost inevitable use of digitized police data and the addition of one piece of data-gathering gear after another.
Evidence.com is the foremost police data platform because Axon leads in selling Taser stun guns, police body cameras and DJI drones, all of which steer agencies toward subscribing to Evidence.com. Use of Evidence.com, according to the investor’s guide Motley Fool, can cost taxpayers up to $199 a month per officer, which includes “a body camera, holster, cloud computer storage of records, records management, and more.”
“Locking customers into the ecosystem with software packages is key to inducing them to add to the list of Axon products they use,” the Motley Fool reports. “A law enforcement agency may start by buying Tasers [as has Yonkers], then add body cameras, and eventually upgrade to the $199 do-it-all package. But moving them up the revenue stack starts with bringing them into the system.”
If Yonkers bought the full $199-per-month Axon package for its roughly 600 police officers, the cost would be about $120,000 per month.
Once the police force’s data is stored in the system, the Motley Fool continues, and “officers have been trained in its use, it’ll be painful and costly to switch to a rival offering, which will mean that Axon will be able to raise prices more easily. The company will be able to expand its offering from there, adding artificial intelligence capabilities to its software and products.”
Targeting With Big Data AI
Artificial intelligence (AI) offerings have included facial recognition, emotion analysis and other tools that predict what specific individuals and groups are likely to do, known as “predictive policing.”
Writing in The Conversation, American University Professor of Law Andrew Guthrie Ferguson says, “Police use of these national security-style surveillance techniques — justified as cost-effective techniques that avoid human bias and error — has grown hand-in-hand with the increased militarization of law enforcement. Extensive research, including my own, has shown that these expansive and powerful surveillance capabilities have exacerbated rather than reduced bias, overreach and abuse in policing, and they pose a growing threat to civil liberties.”
In 2018, 42 organizations — including the ACLU, Color of Change and the NAACP — wrote to Axon’s ethics board, asking that the company submit to forms of ethical review. They also asked that the company commit to categorically rejecting certain forms of data collection, primarily the real-time face recognition analysis of live video captured by body cameras. In 2019, the ethics board issued a report supporting the call with respect to body camera facial recognition, and Axon announced immediately that it would not begin incorporating “face matching” technology into its police body cameras.
The Axon ethics board’s report says that Axon makes a distinction between categories of technology using facial data. “Axon is conducting research and product development around face detection and face re-identification, although the latter has not yet been released in a product.” This work is being done to “facilitate the redaction of body camera footage so that the footage can be disclosed,” the report said. Face matching, the report notes, uses algorithms to “identify a particular face by matching it to one in a target database — this is what people most commonly mean when they refer to ‘face recognition’ technology.” Face matching is the application the board vehemently objects to putting into play.
“The company’s blog post makes it clear that Axon will continue to research and pursue face matching technology, including an effort to de-bias algorithms in the future,” The Verge reported. “That implies that the company is still hoping it is a case of ‘when’ rather than ‘if’ it’s able to add the technology to its products.”
In the wake of the killing of George Floyd, as MIT Technology Review reports, Amazon said it would suspend police use of its facial recognition program for a year, but it did not make clear whether it would also pause sales of the software to agencies other than police, such as U.S. Immigration and Customs Enforcement and the Department of Homeland Security. Microsoft said it would stop selling its facial recognition system to police until there is federal regulation of its use. IBM said it would stop developing and selling facial recognition software.
But other companies continue to provide facial recognition technology to police, according to a recent article in The Intercept, which adds: “Despite claims to the contrary, Microsoft is providing facial recognition services to law enforcement through partnerships and services to companies like Veritone and Genetec, and through its Domain Awareness System.”
With respect to “predictive policing,” in June 2020, 1,500 mathematicians signed a letter to their colleagues, urging that they boycott work on predictive policing software. “Given the structural racism and brutality in U.S. policing, we do not believe that mathematicians should be collaborating with police departments in this manner,” the signatories wrote.
Who Is Using the Data and for What Purpose?
Another question about mass collection of data is whether and to what degree federal law enforcement and military entities may be using Evidence.com, openly or not, to gain access to information generated in Yonkers and other municipalities. Further, is this information being used to develop lists of persons who will have predictive AI analysis applied to their lives? Ferguson points out that:
One of the natural limiting factors of first-generation big data policing technology was the fact that it remained siloed. Databases could not communicate with one another. Data could not be easily shared. That limiting factor has shrunk as more aggregated data systems have been developed within government and by private vendors.
This problem is further magnified by the fact that the contents of Axon’s Evidence.com are handled by Microsoft’s massive Azure cloud storage system, which, The Intercept reports, hosts similar services for other police data management companies in addition to Axon. The Microsoft mass surveillance platform was initially built for the New York City Police Department and later expanded to Atlanta, Brazil and Singapore, The Intercept says, and “Microsoft has partnered with scores of police surveillance vendors who run their products on a ‘Government Cloud’ supplied by the company’s Azure division and that is pushing platforms to wire police field operations, including drones, robots and other devices.”
Policy and Practice
On June 25, in releasing a United Nations report that discussed the impact of surveillance technology on the rights of protesters, UN High Commissioner for Human Rights Michelle Bachelet said, “New technologies can be used to mobilize and organize peaceful protests, form networks and coalitions … thus driving social change. But, as we have seen, they can be — and are being used to restrict and infringe on protesters’ rights, to surveil and track them, and invade their privacy.”
Bachelet’s statement and the UN report amount to a fervent plea for global regulations to “avoid unlawful limitations on the right of peaceful assembly and related rights,” including in the U.S.
The ACLU, which like the UN can only advise on U.S. regulation, asserts: “A drone should be deployed by law enforcement only with a warrant, in an emergency, or when there are specific and articulable grounds to believe that the drone will collect evidence relating to a specific criminal act.”
At this point, since there is no U.S. federal law governing the use of drones for police surveillance, regulation of this surveillance is left to the states. According to a 2019 report published by the University of Texas, 18 states have passed laws requiring law enforcement agencies to get a warrant before using drone surveillance, with varying types of exceptions.
New York has no such law. While Yonkers will submit its drone policy for review by the Justice Department because it must do so under the aforementioned 2016 agreement, it is not clear what legal standards the Justice Department will apply.
In this situation, in which surveillance technology and data collection and analysis have far outpaced legislation, police and the federal government have ranged freely with high-tech aerial surveillance, apparently without restraint or any sense of accountability.
U.S. Border Patrol Reaper drones and other aerial surveillance, including military surveillance, flew over Minneapolis and other cities during recent Black Lives Matter protests.
In a particularly concerning incident, an Air Force intelligence-gathering airplane associated with the U.S. Special Operations Command was flown over the Portland, Oregon, area during the recent invasion there by federal agents. According to The Drive, the plane appears to have been outfitted with powerful video cameras used for aerial surveillance and with antennae that enable transmission of video for long distances. The Air Force denied the plane was gathering intelligence on the protest and claimed that the mission was planned before the protests.
The Drive reported that other aerial surveillance of Portland protests was conducted by local and federal law enforcement agencies, apparently including a manned Customs and Border Protection aircraft.
These flights were not announced to the public, nor has their purpose been fully explained.
Additionally, these surveillance flights raise the possibility, if not the probability, of the aforementioned commingling of police and military surveillance information on protesters.
Drones, Big Data and Police Vigilantism
The tremendous growth in the technical capacity of police to gather information and target individuals comes in large part from the post-9/11 explosion of military surveillance data gathering and targeting technology, as documented by William Arkin.
This extraordinary surge has created what Arkin calls the “Data Machine,” which he describes as an extremely powerful entity in its own right that can drive decisions of its users.
Beyond the influence of big data, policies and admonitions against police drone surveillance are up against a deeper reality: the emotional sense of power that watching a crowd of people through an eye in the sky can generate in any person.
From a height of 100 feet, the watcher may feel not only a sense of superiority and control but also a sense of separation and invulnerability. And from this height, a crowd of fellow citizens may suddenly become “other” — people who must be monitored, controlled and, if necessary, repressed. If this happens to police using a drone, the result can be tragic, whether or not the drone is recording video.
The United States has used its drones and big data overseas without respect for national boundaries and commonly accepted standards of judicial process and human rights. It is not hard to imagine that the introduction of this technology into domestic policing will enable tendencies toward official vigilantism at home, as it has abroad.
In Yonkers and elsewhere, the prohibition of police surveillance drones at protests may only be achieved in practice when we, like Weiner, act in our home towns to protect our First Amendment rights.
This article has been updated on August 13, 2020, to reflect that the Axon Press Team: disputes the figure of the company controlling around 80 percent of the police body camera market, saying this number is inflated; states that Axon no longer produces Tasers with built-in video cameras; and states that Axon has never marketed police body cameras involving facial recognition software, nor has it marketed predictive policing technology.
Copyright, Truthout.org. Reprinted with permission.