Web-based traffic signs seem like the perfect solution for agencies that have speed enforcement problems. With the ability to change the sign’s message online — as well as receive alerts and data from the sign — no longer do supervisors need to send precious units to the signs to perform these functions manually.
But in January 2009, signs in Austin (Texas) were hacked. Displaying messages like “Caution! Zombies Ahead!!!” they slowed traffic and sparked debate over “harmless fun” (reminiscent of the MIT hacks) versus vandalism that endangers public safety.
The signs were not connected to the internet, so the hackers had to be physically present to break the locks and crack the passwords on the controller computers inside. Nevertheless, advancing technology means that law enforcement administrators need to remember: information security isn’t just about sensitive employee and crime-related data.
- In late 2010, Iowa’s AMBER Alert system was hacked for the third time that year. The culprit: a vulnerability in the site, which also hosted the Iowa State Patrol’s crash report website.
- Between April and November the same year, confidential Mesa County (Colorado) Sheriff’s Department data were exposed due to a mistake by an employee. The 20-year-old database contained confidential informant records along with police intelligence.
- And in England, North Yorkshire police were found to have violated data protection laws more than three dozen times over three years.
Why are these stories important? They reflect that the more law enforcement agencies rely on information technology to make police work more efficient, the more threats they will face from both outside and inside. Whether the threat comes from student pranksters (as was speculated in Austin), foreign operatives (as was speculated in Iowa), or ill-informed employees, it can take many different forms.
For example, remote-controlled robots are increasingly being deployed in bomb and hostage situations, as in Milwaukee in December. However, as early as last year, cybercrime and security expert Marc Goodman warned of vulnerabilities in battlefield robots, which could easily translate into vulnerabilities for police robotics as well.
The point is not to spread fear, uncertainty and doubt (FUD) about deploying new technologies. Rather, as Goodman puts it: “While electronic warfare is a relatively old domain, the presence of battlefield (and perhaps police robots) means there is a whole additional set of technologies which need to be fully understood and protected prior to deployment in real world scenarios.”
The same can be said of social media, “the cloud,” or even computer-controlled traffic signs. Nothing is completely secure; the human factor trumps all. However, the public- and officer-safety, force-multiplying, and investigative benefits of each kind of technology are too great to forgo entirely.