It’s Time We Took a Look at Our Own Sprawling Police State, Before We Turn into China
For a few years now, we’ve been hearing reports of Chinese authorities using facial recognition and mass surveillance technology to assert authoritarian control over their population. In recent months, Chinese repression of the Uyghur minority using new-age technology has been a cause of great concern for advocates of civil liberties. A consistent thread in their concerns has been the use of such technologies in the U.S., given the public’s faltering trust in U.S. law enforcement and the now open secret of the National Security Agency’s surveillance operations within the country.
The once science-fiction model of a robotic cop who uses instant facial recognition along with AI software to discern criminal intent is no longer confined to Hollywood. These technologies are fast gaining a foothold within American law-enforcement agencies that use body cameras. Body cameras were, at first, an improvement: by recording police interactions with the public, they helped build public trust and police accountability.
Robocop, a late-1980s blockbuster, is an oft-bandied-about analogy in cases where technology seems to merge with law enforcement. It’s not a bad one, but the writers behind Robocop couldn’t possibly have envisioned the advancements in machine learning and artificial intelligence that are now part of our everyday lives.
Body Cam – A Good Tool Going Rogue
For example, the seemingly innocuous body cam is being turned into a constant surveillance device hooked to a citizens’ database. Technology companies have a vested interest in further developing software applications that will match the face of every person a cop encounters against a perpetually connected database, allowing officers to base their judgment on algorithms in high-intensity situations.
There are myriad problems here, some of which are alluded to in Robocop, like corruption, privatization of homeland security and authoritarianism. California, the home of Silicon Valley wizards, has been driven to consider Assembly Bill 1215, which would ban police agencies from using facial and biometric tracking devices as part of their body cameras.
“Having every patrol officer constantly scanning faces of everyone that walks into their field of view to identify people, run their records, and record their location and activities is positively Orwellian,” said ACLU attorney Peter Bibring.
Consider where this is headed: tie facial-recognition software into the security cameras that are already practically everywhere, and the tech becomes creepy and Orwellian instantly. The California lawmaker who penned Bill 1215, Phil Ting (D–San Francisco), points to China, where authorities used facial-recognition software to grab someone from a crowd of 20,000 people during a concert.
Before you jump to defend our men in blue and champion their integrity, consider that police have already admitted they want to use these technologies to form wholesale dragnets, scanning everyone at public events and not only those they suspect of having committed a crime. So if you think it’ll be like a mug shot, nothing could be further from the truth.
The Riverside Sheriffs’ Association, though, argued against the bill: “Huge events…and scores of popular tourist attractions should have access to the best available security—including the use of body cameras and facial-recognition technology.” They want it, and they are not shy about it either.
I mentioned one practical concern earlier: having officers rely on the judgment of an algorithm in high-intensity situations. A routine traffic stop could be exactly that. Say you are pulled over for speeding on the freeway and the officer approaches your vehicle. The body camera scans your face and compares it against a database of hundreds of thousands of people in your age, race, and width-of-nose group. A match is triggered, and even if the software retracts it a second later, the officer has already drawn his weapon. And you being you, who knows what you would do?
“Facial recognition technology has misidentified members of the public as potential criminals in 96 percent of scans so far in London, new figures reveal,” according to a May report in the Independent. A Commerce Department study found a high rate of accuracy, but misidentification was still common. I don’t doubt that the technology will improve in accuracy, but the contention here isn’t only about accuracy; it is about our rights as citizens, and about having a jury, rather than an algorithm, mete out justice the way one makes instant coffee.
According to the Assembly analysis, the ACLU used such software to compare photos of all federal legislators and “incorrectly matched 28 members of Congress with people who had been arrested.” The analysis further indicts the technology: “The test disproportionately misidentified African-American and Latino members of Congress as the people in mug shots.” The company that produced the software disputed the ACLU’s approach, but as disturbing as racial bias in technology is, it is not new. The filters on your social media apps are still trying to figure out racial identities. Tech companies don’t need to see this as a permanent failure; advertising took ages to catch up, and similarly, technology developed mostly by Caucasians will struggle with racial identifiers at first, since implicit bias and stereotypes hinder our perception.
Republicans must be torn over this bill, considering they are the “protector” party but also the anti-big-government party. The only Republican to support it has been state lawmaker Tyler Diep (R–Orange County), who was born in communist Vietnam. Kudos, Tyler! Sci-fi has long warned us of a dystopian police state, and reality has caught up fast. Our model of a free society requires a continuous struggle to defend against encroachment by the forces of authoritarianism.