Law experts warn of threat to reasonable suspicion
July 30, 2012
Police and other law enforcers in the US are increasingly turning to ‘Minority Report’ style “pre-crime” strategies to fight crimes by predicting them before they take place.
A report out today from the AFP notes that "predictive analytics" software, computer algorithms that predict where and when patterns of crime will occur, has been adopted by police departments from Santa Cruz, California, to Memphis, Tennessee.
Law enforcement agencies in Washington, D.C. are already using a software database, developed by the University of Pennsylvania, that they claim can predict when crimes will be committed and who will commit them.
Officials in some areas claim to have seen up to a 30 percent drop in serious crime since adopting the technology.
The technology sifts through a database of thousands of past crimes, applying algorithms to variables such as geographical location, criminal records and the ages of previous offenders to predict where, when and how a crime might be committed, and by whom.
The program operates without any direct evidence that a crime will be committed; it simply takes datasets and computes possibilities.
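The kind of computation described above can be sketched in a few lines. The grid cells, half-life weighting and threshold below are illustrative assumptions, not the actual method used by the software in the article; the sketch only shows the general idea of ranking locations by recency-weighted counts of past crimes.

```python
from collections import Counter
from datetime import date

# Hypothetical past-crime records as (grid_cell, offense_date) pairs.
# These values are invented for illustration.
past_crimes = [
    ("cell_a", date(2012, 7, 1)),
    ("cell_a", date(2012, 7, 15)),
    ("cell_b", date(2012, 1, 5)),
    ("cell_a", date(2012, 6, 20)),
    ("cell_c", date(2012, 7, 28)),
]

def hotspot_scores(records, today, half_life_days=30):
    """Score each grid cell by recency-weighted crime counts:
    a crime loses half its weight every `half_life_days` days."""
    scores = Counter()
    for cell, when in records:
        age_days = (today - when).days
        scores[cell] += 0.5 ** (age_days / half_life_days)
    return scores

scores = hotspot_scores(past_crimes, date(2012, 7, 30))
# Cells ranked from highest to lowest predicted risk.
ranked = [cell for cell, _ in scores.most_common()]
print(ranked[0])  # prints "cell_a"
```

Note that nothing here is evidence about any individual; the output is simply a ranking of past-crime statistics, which is exactly the point the legal critics raise later in the article.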
The AFP report quotes Colleen McCue, a behavioral scientist working with the Department of Homeland Security on pre-crime programs. McCue says that "People are creatures of habit," comparing criminals to shoppers.
“When you go shopping you go to a place where they have the things you’re looking for… the criminal wants to go where he will be successful also,” McCue says.
The report also cites comments from Mark Cleverly, an IBM predictive crime analytic expert.
“It’s not saying a crime will occur at a particular time and place; no one can do that. But it can say you can expect a wave of vehicle thefts based on everything we know,” Cleverly notes, adding that he believes the technology can improve privacy.
“You can pinpoint the record of who has access to information, you have a solid history of what’s going on, so if someone is using the system for ill you have an audit trail,” he said.
Cleverly adds that the technology should not be compared to that envisaged by Philip K. Dick in ‘The Minority Report’. “It was a great film and great short story, but it’s science fiction and will remain science fiction. That’s not what this is about,” Cleverly said.
However, Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia, warns that the technology may represent a threat to constitutional protections on “unreasonable” searches.
“To stop you and frisk you and search you, a police officer needs reasonable suspicion, so my question is how will this affect reasonable suspicion?” he said.
“How do you cross-examine a computer?” Ferguson asked, noting that any court case built on “evidence” generated by such software would enter a legal gray area.
The technology is also being combined with surveillance cameras, which are being rolled out in major cities across the country.
“Manufacturers BRS Labs said it has installed the cameras at tourist attractions, government buildings and military bases in the U.S. In its latest project BRS Labs is to install its devices on the transport system in San Francisco, which includes buses, trams and subways,” reported the Daily Mail last month.
The cameras are programmed with a list of behaviors considered “normal”. Anything that deviates from usual activity is classified as suspicious and guards are immediately alerted via text message or a phone call.
Equipped with the ability to track up to 150 suspects at a time, the cameras build up a “memory” of suspicious behavior to determine what constitutes potential criminal activity.
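The "deviation from normal" alerting described above can be sketched as a simple frequency baseline. The behavior labels, threshold and alert list below are illustrative assumptions, not BRS Labs' actual system, which the article does not describe at that level of detail.

```python
from collections import Counter

# Hypothetical stream of behaviors the cameras have observed;
# this is the system's "memory" of what is normal at this location.
observed = ["walk", "walk", "stand", "walk", "board_train",
            "walk", "stand", "walk", "walk", "board_train"]

baseline = Counter(observed)
total = sum(baseline.values())

def is_suspicious(behavior, threshold=0.05):
    """Flag behaviors seen rarely (or never) in the baseline."""
    frequency = baseline[behavior] / total
    return frequency < threshold

# Anything below the frequency threshold would trigger an
# operator alert (the text message or phone call in the article).
alerts = [b for b in ["walk", "abandon_bag"] if is_suspicious(b)]
print(alerts)  # prints "['abandon_bag']"
```

The design choice worth noting is that "suspicious" here means only "statistically unusual for this camera," which is why critics argue such systems flag novelty rather than criminal intent.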
A total of 288 cameras will be installed across 12 transport hubs.
Other forms of pre-crime technology in use or under development include neurological brain scanners that can read people’s intentions before they act, thus detecting whether or not a person has “hostile intent”.
Pre-crime technology is also being rolled out in airports and other public venues in order to identify suspect travelers and single them out for interrogations. This face-scanning system “successfully discriminates between truth and lies in about two-thirds of cases,” little better than chance alone, making it even less reliable than the notorious polygraph test, which has been widely discredited and is habitually inaccurate.
As we have previously documented, the Department of Homeland Security’s FAST program is based around similar technology that professes to detect “malintent” by means of pre-crime interrogations and physiological scans.
A promotional video for the program shows individuals who attend “security events” being led into trailers and interrogated as to whether they are terrorists, while lie detector-style computer programs analyze their physiological responses. The subjects are asked about their whereabouts and whether they are attempting to smuggle bombs or recording devices into the “expo,” showing that the technology is intended for use at public events, not just airports. Individuals who do not satisfy the first lie detector-style test are then asked “additional questions”.
As surveillance cameras become more sophisticated, the temptation to use pre-crime technology is likely to intersect with the rollout of so-called “smart” street lighting systems that double as “homeland security” spying hubs.
As we have documented, talking surveillance cameras that bark orders at passers-by and can also record conversations are heading for U.S. streets, with the government-backed introduction of the ‘Intellistreets’ system.
Steve Watson is the London-based writer and editor for Alex Jones’ Infowars.com and Prisonplanet.com. He has a Master’s degree in International Relations from the School of Politics at the University of Nottingham in England.