Surveillance technology has advanced far beyond the laws that govern it

Ars Technica Live #2: Law professor Elizabeth Joh predicts the future of high-tech policing.

Joh spent much of the evening on the legal landscape we’re creating with cutting-edge technologies like self-driving cars, facial recognition, and body cams. The perennial problem in law and policy, she argued, is that adoption of devices like body cams precedes careful thought about the rules that will govern them. After the Ferguson protests, for example, police departments rushed to adopt body cams as an accountability measure. But there are still no federal guidelines for how police use them. Can officers turn the cameras off whenever they want? Who has access to the footage they collect? Can facial recognition be run on that footage? All of these questions remain unanswered, yet body cams are in widespread use across the US.

A similar problem dogs our use of DNA databases, Joh explained. The US government gives states financial incentives to build databases and biological sample libraries containing the DNA of everyone who gets arrested. These aren’t convicts, mind you; the databases cover anyone who gets arrested, regardless of whether they were released the next day or later convicted of a felony. Again, the question is how to regulate these databases, along with the other digital databases full of our “information microbiome.” The answer, Joh argued, isn’t going to come from the courts or Congress. Instead, “public vigilance” is the only social force that moves fast enough to push government to behave responsibly with new surveillance technologies.

Of course, public vigilance is only as good as public information, and if the public doesn’t know what data law enforcement has, we can’t push for better rules. That’s why the rise of private security forces is so troubling. Joh estimated that private security forces, from the guards at your local 7-Eleven to “Target’s private crime lab,” are three to five times larger than public police forces. And they are not regulated by the government in any way, which means it’s impossible for the public to know what kinds of data private forces are gathering.