It’s happened. Arrests have been made thanks to evidence collected from connected digital devices such as the Amazon Echo Dot and a Fitbit. This is just the beginning of the transformation that law enforcement will experience because of the Internet of Things (IoT), artificial intelligence and robots. There are certainly benefits to applying this new technology to help fight crime, but it also raises challenging questions about our right to privacy and the risk of security breaches.
Internet of Things Used to Help Fight Crime
Law enforcement agencies across the world are getting trained on what to look for at crime scenes and how to handle digital evidence. Gaming consoles, Echo devices and even Fitbits have provided valuable information to help solve crimes. Most people don’t comprehend the power of these connected devices to contradict alibis and catch lies. As our reliance on these digital devices for entertainment and convenience continues to grow—watches, phones, televisions, pacemakers and more—there will be a longer trail for detectives to analyze when trying to solve a crime.
It’s commonplace now for officers to wear body cams while on patrol. These cameras provide another set of eyes for sorting through an interaction after the fact, and studies suggest they can improve self-awareness and discourage unacceptable behavior by officers and the people they interact with. Knowing an interaction will be recorded is a powerful deterrent to bad behavior.
Some squad cars are equipped with GPS projectiles that can be fired via remote control and hook onto the back of an alleged perpetrator’s vehicle. These allow officers to know where a suspect is located and thereby avoid dangerous high-speed car pursuits. Smart sensors have also been developed that can be fixed to the inside of an officer’s gun to track how the gun is being used, including whether it has been unholstered or discharged. This information could prove valuable in criminal trials.
Artificial Intelligence Aids in Predictive Policing
Several law enforcement agencies have dabbled in predictive policing, including one of my customers, the UK police force in the city of Durham, England. They used a system called HART (Harm Assessment Risk Tool) that classifies individuals and ranks the probability that they will commit another offense in the future. The system was fed data gathered between 2008 and 2013 and assesses people based on the severity of the current crime, criminal history, flight risk and more. Although HART’s forecasts were accurate a high percentage of the time, other studies warn against using algorithms and predictive software tools because they flag minority defendants as high risk at double the rate of white defendants. One such study from ProPublica shows how human bias gets injected into such formulas: the flawed judgment of humans was used to create the programs in the first place.
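To make the idea concrete, here is a minimal sketch of how a risk-ranking tool of this kind combines case features into a risk band. This is purely illustrative: the real system is a statistical model trained on historical custody data, and the feature weights and thresholds below are invented for the example, not taken from HART.

```python
# Illustrative sketch of a predictive-policing risk ranker.
# All weights and thresholds are hypothetical; a real tool learns
# them from historical data rather than hard-coding them.

def risk_score(offence_severity: int, prior_offences: int, flight_risk: int) -> int:
    """Combine the factors the article names (severity of the current
    crime, criminal history, flight risk) into one score.
    Weights here are made up for illustration."""
    return 3 * offence_severity + 2 * prior_offences + 1 * flight_risk

def risk_band(score: int) -> str:
    """Map a numeric score to the kind of low/moderate/high band
    such tools report to officers. Cut-offs are hypothetical."""
    if score >= 20:
        return "high"
    if score >= 10:
        return "moderate"
    return "low"

# Example: a serious current offence, several priors, some flight risk.
print(risk_band(risk_score(offence_severity=4, prior_offences=3, flight_risk=2)))
```

A sketch like this also makes the bias concern easy to see: if the historical data used to choose the weights reflects skewed policing, the scores reproduce that skew.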
By: Bernard Marr