Seattle Police and ALPR

Last week I saw a post on Twitter from the Seattle Department of Transportation (SDOT) announcing a series of meetings by SDOT, the Seattle Fire Department (SFD) and the Seattle Police Department (SPD) seeking public feedback on surveillance technologies already in use by the city agencies. In particular, they were to review the use of automatic license plate reader (ALPR) technology.

Interestingly, SPD did not promote these meetings on either their blog, SPD Blotter, or their Twitter feed. Also, all the meetings started right at 5 pm on weekdays. This is not the outreach of an agency that is really seeking public input. These are the actions of an agency that is going through the motions. The meeting I attended on 29 Oct 2018 had in attendance only three members of the public as well as a few other folks who perhaps were from the ACLU or other organizations. During the “small group discussion” only three of us participated.

The ALPR programs in use by SPD are PIPS (link) and Autovu (link). PIPS is used for automatic reading and storage of license plate data by the SPD Patrol division, as well as a few “boot vans” used by parking enforcement for “booting” vehicles that have 4 or more unpaid tickets. Autovu is used by their parking enforcement vehicles for “digital chalking”.

Briefly, PIPS cameras on patrol cars capture the license plate information, location, and time for as many cars as possible as the patrol cars drive around the city. A hot list is loaded into the system periodically, and if a license plate matches one on the hot list, the officer is alerted and, after confirming with dispatch, may take action based on the match. All the data on license plates that have been read (the “read data”) are additionally “uploaded to a secure server” where trained officers may search for and retrieve data for up to 90 days after it has been captured. After 90 days, the data is to be deleted.
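To make the flow concrete, here is a minimal sketch of the read-capture, hot-list match, and 90-day retention logic as described above. This is purely illustrative: the function names, data model, and matching rules are my assumptions, not anything published about the actual PIPS system.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # per the stated SPD policy, read data is deleted after 90 days


def process_read(plate, location, timestamp, hotlist, read_store):
    """Store one plate read and check it against the current hot list.

    Hypothetical sketch of the described PIPS flow. Note that EVERY read
    is retained in read_store, whether or not it matches the hot list.
    """
    read = {"plate": plate, "location": location, "time": timestamp}
    read_store.append(read)
    if plate in hotlist:
        # The officer is alerted and must confirm with dispatch before acting.
        return {"alert": True, "read": read}
    return {"alert": False, "read": read}


def purge_expired(read_store, now):
    """Delete reads older than the retention window."""
    cutoff = now - timedelta(days=RETENTION_DAYS)
    read_store[:] = [r for r in read_store if r["time"] >= cutoff]
```

The key privacy-relevant detail, which the sketch makes visible, is that non-matching reads sit in the store for the full 90 days rather than being discarded at capture time.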

Autovu similarly records license plates, times, and locations. If, on a later drive by the same location, the vehicle is noted by the system to have exceeded the time limitations for parking, the parking enforcement officer is alerted and they may take action, such as ticketing the vehicle. The license plate data, other than what triggered a ticket, is discarded at the end of the day.
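The chalking logic might look something like the following sketch. The two-hour limit, the per-block keying, and the function names are all assumptions for illustration; the real Autovu behavior is only known to me at the level described above.

```python
from datetime import datetime, timedelta

PARKING_LIMIT = timedelta(hours=2)  # assumed time limit, for illustration only


def check_chalk(plate, block, now, chalk_log):
    """Digital chalking: on a later pass of the same block, alert if a
    previously seen vehicle has exceeded the parking limit.
    """
    key = (plate, block)
    first_seen = chalk_log.get(key)
    if first_seen is None:
        chalk_log[key] = now  # "chalk" the vehicle on first sighting
        return False
    return now - first_seen > PARKING_LIMIT


def end_of_day(chalk_log):
    """Per the described policy, non-ticketed read data is discarded daily."""
    chalk_log.clear()
```

The `end_of_day` step is what distinguishes Autovu from PIPS for privacy purposes: nothing accumulates beyond a single shift unless it resulted in a ticket.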

I had a number of concerns about the PIPS program, and after asking some questions at the meeting, those concerns only intensified. I have submitted them to SPD for this review, but I wanted to write them out publicly as well. Some of my concerns about PIPS may also apply to Autovu, particularly those regarding the collection of the data, but because Autovu does not retain read data I am less worried about that program. Keep in mind that while I am relatively savvy technically speaking, I am not a security professional, nor have I studied ALPR systems in particular. That means that my worries may be somewhat uninformed. Nevertheless…

My first concern is that nowhere in the program description was there any description of their threat models. I asked SPD’s Director of Transparency and Privacy what threat modeling had been done with respect to the ALPR technology and programs, and she did not think any had been done. If an organization hasn’t modeled its threats, we have no idea if it’s protecting against the right things, if it’s protecting against anything at all. And given the tenor of the meeting, I suspect SPD isn’t protecting against anything at all. The department is focused almost entirely on the benefits the technology gives them in chasing down crimes, particularly stolen cars.

Here’s where my not being a security professional is apparent: I do not know how to do any formal threat modeling. But I tried to look at various categories of possibly malevolent actors and review the program description for ways the system might be misused. Some of these came from other people at the meeting.

SPD’s use of the system for its intended purposes

This is where the program is used by SPD for finding cars or investigating crimes, but through bad policy the system infringes on the liberty of the people. In this category of concern, I asked the SPD representatives if the agency had used a racial equity toolkit (RET) to analyze the impact of the program on marginalized communities in Seattle. They had not yet. Looking at the process outlined in the description, most of the RET is completed after public feedback. But some of the initial answers they have already given are obviously wrong. For instance, to the question “Which of the following inclusion criteria apply to this technology?” they left unchecked the following:

  • The technology disparately impacts disadvantaged groups.
  • There is a high likelihood that personally identifiable information will be shared with non-City entities that will use the data for a purpose other than providing the City with a contractually agreed-upon service.
  • The technology raises reasonable concerns about impacts to civil liberty, freedom of speech or association, racial equity, or social justice.

To the first unchecked item, SPD simply doesn’t know because they haven’t studied the information. And they later state “An additional potential civil liberties concern is that the SPD would over-surveil vulnerable or historically targeted communities, deploying ALPR to diverse neighborhoods more often than to other areas of the City.”

Additionally, we give heightened protection to political speech. But by deploying ALPR cars around protests, rallies, and other such “free speech activities,” SPD creates the possibility that criminal pretexts could be used for fishing expeditions against political opponents; SPD would have 90 days to fish through the location data. These are just a couple of possibilities that I can think of off the top of my head. The technology obviously raises reasonable concerns about impacts to freedom of speech.

Out of policy use by SPD officers

This is where SPD officers use the system for purposes outside what is allowed. Officers are required to undergo training, and of course they are all sworn and background checked. The program administrator is supposed to approve all searches of stored read data, and the system automatically logs the officer, the terms searched for, the case number, and the purpose for which the search is conducted. The SPD Inspector General (theoretically independent of SPD) can audit the system for misuse, as can the program administrator. When I asked SPD command staff how many instances of misuse of the system had been found during the 10 years the program has been in use, they answered “none to our knowledge.” It is unlikely in the extreme that not one officer has ever misused the system. Possibilities include officers tracking the vehicles of girlfriends or rivals, or of locals they want to keep tabs on; taking bribes or favors to feed read hits to outside people; or simply getting fed up with onerous logging requirements and doing things like re-using case numbers. An audit system that has uncovered no instances of misuse is either not recording the right information or is not being conducted thoroughly.
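Given the search-log fields described above (officer, search terms, case number, purpose), even a trivial automated check could surface one of the misuse patterns I mention. The following is a hypothetical sketch of such a check, not anything SPD is known to run; the field names and the threshold are my assumptions.

```python
from collections import Counter


def flag_case_number_reuse(search_log, threshold=5):
    """Return case numbers attached to an implausible number of searches.

    One simple check an audit could run against the kind of search log
    described in the program: officers who re-use a case number to avoid
    the logging requirements would show up here. Threshold is arbitrary.
    """
    counts = Counter(entry["case_number"] for entry in search_log)
    return [case for case, n in counts.items() if n >= threshold]
```

That ten years of such logs have apparently never produced a single finding suggests checks even this simple are not being run.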

Out of policy use by other agencies

Agencies such as King County, the Washington State Patrol, the FBI, or Immigration and Customs Enforcement (ICE) do not have direct access to the system. However, they may submit requests for information to SPD, which sends them responsive data. Such requests and responses are memorialized, but it’s unclear how, and whether that record is part of the same audit trail. Additionally, SPD did not articulate how they vet such requests, particularly with respect to Seattle’s policy of non-cooperation on immigration enforcement. ICE may be making direct requests for ALPR read data for nominally within-policy reasons (e.g., for customs investigations) that are really for deportation purposes. Or they may be routing such requests through other agencies. Or there may be no issue at all. We have no way of knowing. This concern was brought to my attention by another attendee at the meeting.

Misuse of the data by the public

According to SPD, ALPR read data is subject to public records requests. There is nothing to stop me from submitting a request every 90 days for a CD of all ALPR read data, circumventing whatever protection the 90-day deletion policy provides. While there may be restrictions on the legal use of such data, once it leaves SPD’s hands, we’ve lost effective control of it.

Misuse of the data by the vendor

According to the staff present, no security review has ever been performed of the software from the vendor, Neology, to make sure it does only what it’s supposed to do. The software is closed source as well. Are there backdoors for support? Are there security vulnerabilities that allow exfiltration of the data?

Misuse of the data by IT

The City of Seattle has consolidated almost all IT within a central department. The technical staff are not sworn officers, though they are background checked. According to staff present, as well as some hints in the program description, ALPR read data is stored in a SQL system, which suggests to me that the data is both unencrypted and reviewable outside of the audit system that constrains SPD personnel.

Most of my privacy concerns could be mitigated by a policy of discarding all read data that does not match a hot list, and/or by much stronger audit processes. That would not eliminate all concerns, however. Additionally, I have some other concerns that I am giving a lower priority and not including here, because this is already long and some of them verge on movie-plot-threat territory.

I hope SPD takes all of these concerns seriously.

Featured image “Street-level Surveillance” created by the Electronic Frontier Foundation and used under a Creative Commons Attribution license.