Automated Threat Detection and the Future of Policing

By Robert Davidson, M.A.

Law enforcement officers have a difficult job. They face a criminal element that goes to extreme efforts to avoid detection, capture, and incarceration. In addition, an increasingly vocal segment of society objects to any use of force by the police. This puts officers in a precarious position.

However, what if technology could relieve officers of determining whether to employ force? Could that decision-making process be enhanced or even replaced by artificial intelligence (AI)? How could AI impact the use of force by officers in the course of their duties?1

Advancements in Equipment

A century ago, police gear consisted of a gun, handcuffs, and a baton or sap. Since that time, agencies have added mace, pepper spray, radios, cellular devices, stun guns, Tasers, and a host of other weapons and protective devices.

Tomorrow’s police officer may don a uniform festooned with microphones, cameras, and sensors that continuously monitor and evaluate the environment for signs of danger and criminal behavior. Current science fiction video games and movies perhaps foreshadow how this might look. But the capabilities such a system would employ hold more importance than the stylized uniforms and protective gear.

A computer interface could apply AI to body-worn camera footage. Continuous monitoring, facial recognition, gait analysis, weapons detection, and voice-stress analysis would all actively evaluate potential danger to the officer. Upon identifying a threat, the system could enact an automated response scaled to its severity.
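
To ground the idea, the minimal Python sketch below shows one way such a system might fuse detector outputs into a single severity score and choose a graduated response. Every detector name, weight, and threshold here is an illustrative assumption, not a description of any fielded product.

```python
from dataclasses import dataclass

# Hypothetical, normalized [0, 1] outputs from the detectors named above.
# All names, weights, and thresholds are illustrative assumptions.
@dataclass
class SensorReadings:
    face_match: float      # facial-recognition watch-list hit confidence
    gait_anomaly: float    # gait-analysis anomaly score
    weapon: float          # weapons-detection confidence
    voice_stress: float    # voice-stress analysis score

WEIGHTS = {"face_match": 0.25, "gait_anomaly": 0.15,
           "weapon": 0.40, "voice_stress": 0.20}

def threat_score(r: SensorReadings) -> float:
    """Fuse the detector outputs into one severity score in [0, 1]."""
    return sum(WEIGHTS[name] * getattr(r, name) for name in WEIGHTS)

def automated_response(score: float) -> str:
    """Map severity to a graduated, automated response."""
    if score >= 0.75:
        return "ALERT_EARPIECE_AND_REQUEST_BACKUP"
    if score >= 0.40:
        return "ALERT_EARPIECE"
    return "CONTINUE_MONITORING"

reading = SensorReadings(face_match=0.9, gait_anomaly=0.3,
                         weapon=0.2, voice_stress=0.8)
print(automated_response(threat_score(reading)))  # ALERT_EARPIECE (score 0.51)
```

In a real system, the weights and thresholds would presumably be learned and validated against recorded encounters rather than set by hand.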

For example, a deputy stops a vehicle for speeding. Behind the wheel sits a probationer from two counties over who knows that a search would reveal 3 pounds of methamphetamine in the trunk. This subject, who has a history of violence and weapons possession, will look for any opportunity to attack or flee. Historically, an officer would not know of the potential threat until running a records check via radio or mobile data terminal and identifying the individual.

However, the deputy’s use of force system already has pinpointed the subject through facial recognition and voice analysis. After discerning signs of aggression in the subject’s facial features, it signals the officer through an earpiece that the detained person presents a possible threat. It also requests that dispatch send backup.

The system delivers this real-time information at the most opportune moment in the stop. Now, the officer can continue with heightened awareness and use appropriate tactics to maintain the advantage. The subject never gets a chance to assault the deputy. Backup units arrive, and officers find the drugs and make the arrest. AI helped avert a tragedy.

Perhaps technologies currently in development could make this situation a reality.

Trends in Research

Studies to support the mission of the U.S. military hold particular importance for the viability and potential future of this technology. A “synthetic battlefield” leverages AI to make battle plans and deploy munitions without human interaction.2

In military operations, this battlefield would consist of autonomous ground vehicles, smart mortars, and unmanned aerial vehicles (UAVs) circling the area, all deployed to deny enemy access and, when feasible, advance an attack based on sensor data. Missiles, mortars, .50-caliber chain guns, and larger artillery could operate independently to defeat an enemy.

A system that can autonomously monitor, detect, and act against a threat certainly has significant implications for policing. Although years removed from deployment in modern U.S. law enforcement, the technology ideally would translate into civilian models that afford greater protection and accountability in policing. If the adaptation proves successful, law enforcement would inherit the breakthroughs once they are miniaturized enough for seamless deployment in a policing environment.

Also encouraging for AI applications is private industry’s adoption of battlefield technology for use in consumer products. Facial recognition is an early advancement already incorporated into cellular phones, doorbells, and laptops.3 Such innovations provide early indications of what the future may hold. Combinations of existing and near-future systems could lead to breakthroughs in safety enhancement for any police contact.

However, one of the most impactful trends in autonomous threat recognition extends beyond the simple recognition of facial features to the analysis of the face as a means to identify emotion and intention. Computer laboratories at universities, private companies, and the Defense Advanced Research Projects Agency (DARPA) are pursuing this challenge.4 Scientists and behaviorists are working to understand how the brain identifies, evaluates, and responds to changes in facial expression.

Early work in this area is promising.5 U.S. Navy SEAL teams participate in studies to identify which facial changes these warriors focus on when making friend-or-foe decisions.6 These data will help scientists “teach” AI components how to quickly discern threatening gestures and expressions; eventually, they will help program computers to perform the same function autonomously.7

Closely tied to this topic is the fusion of visual inputs with the evaluation and analysis of sounds and voices. The human mind seamlessly evaluates the interplay of what it hears and sees; however, technology does not accomplish this task as easily, and researchers have faced challenges.8 According to one expert, “[N]o framework is fully in place that provides a common format for the importing of these data into the analytical structure of the organizations and agencies.”9

One of the primary struggles has involved the raw computing power needed to fuse and analyze both visual and aural data.10 Technologists strive to identify efficient and nimble computer code that can process data quickly and with less need for such power.11
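
As a rough illustration of why researchers often start with lightweight approaches, the Python sketch below performs simple late fusion: two separately trained classifiers (one visual, one aural) each emit per-second aggression probabilities, and a weighted average combines them. The classifiers, weights, and scores are assumptions invented for illustration, not values from the cited studies.

```python
import numpy as np

def fuse_scores(video_probs: np.ndarray, audio_probs: np.ndarray,
                w_video: float = 0.6, w_audio: float = 0.4) -> np.ndarray:
    """Late fusion: weighted average of time-aligned per-second scores."""
    assert video_probs.shape == audio_probs.shape, "streams must be time-aligned"
    return w_video * video_probs + w_audio * audio_probs

# Hypothetical per-second aggression probabilities from each modality.
video = np.array([0.2, 0.3, 0.8, 0.9])   # e.g., agitated posture detected
audio = np.array([0.1, 0.6, 0.7, 0.95])  # e.g., shouting, rising voice stress
print(fuse_scores(video, audio) > 0.5)   # [False False  True  True]
```

Late fusion of this kind is computationally cheap because each stream is reduced to a score before combination; richer joint models that mix raw video and audio features are what drive the demand for raw computing power noted above.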

Evolution of Technology

Of course, this technology will not suddenly hit the shelves, fully formed, in 8 to 10 years. History shows that a progression must happen. For instance, the early stun gun preceded the Taser, and mini-cassette recorders led to digital recorders and, later, body-worn cameras.

Product development and life cycles, although not fully predictable, are measurable and can provide insight into how long a given technology may take to reach the market.12 With automated threat detection still undergoing development in laboratories and through theoretical methods, a turnkey solution may take decades to become possible.

What might happen by way of incremental advancement in this area? Possibilities include—

  1. facial recognition built into body-worn cameras;13
  2. voice-stress analysis; and
  3. basic facial analysis for preassaultive indicators.

As the earlier scenario described, a proven and reliable way to identify and respond to threats would safeguard officers’ lives and allow them to protect the community through proactive mitigation. To this end, a fully integrated system of warnings and preemptive force options would shield officers from those who would do harm and keep them ahead of a suspect’s decision-making. Perhaps officers could avoid force altogether.

Plans for the Future

What can agencies do to prepare for this potential advancement? Fortunately, two steps could make a transition to autonomous threat detection possible. Departments should take these actions to manage any data source effectively for the future.

  1. Clearly define and label data. For instance, personnel should label the color of someone’s eyes consistently from one database to another. Although this seems intuitive, programmers define and label data sets differently when creating software. Without basic commonality, personnel will spend a lot of time harmonizing data in a process called “crosswalking,” which entails matching common data points and “telling” the computer that a data set in one program equals another.14 Doing this increases processing time and proves inefficient. (A brief sketch after this list illustrates the process.)
  2. Store data in a common, or nonproprietary, format. Agencies should not allow a software company to encode data in such a way that it cannot be synthesized with other systems later. If necessary, personnel should ensure that the software contract includes a clause that requires the vendor to help translate data if needed. According to researchers, open-source programming is preferable because the resultant code is shared communally and adds to the public good.15 Keeping data and the underlying code universal makes collaboration much easier, both in terms of effort and finances.
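
To make these recommendations concrete, here is a minimal Python sketch of crosswalking: mapping one hypothetical agency’s field names and coded values onto another’s, then emitting the result in a plain, nonproprietary format. All field names and codes are invented for illustration.

```python
import json

# Agency A's record layout (hypothetical).
record_a = {"subj_eye_clr": "BRO", "subj_last": "Doe"}

# Crosswalk tables "telling" the computer that Agency A's fields and
# codes equal Agency B's (all names and codes are illustrative).
FIELD_MAP = {"subj_eye_clr": "eye_color", "subj_last": "surname"}
VALUE_MAPS = {"eye_color": {"BRO": "brown", "BLU": "blue", "GRN": "green"}}

def crosswalk(record: dict) -> dict:
    """Translate Agency A field names and coded values into Agency B's schema."""
    out = {}
    for src_field, value in record.items():
        dst_field = FIELD_MAP.get(src_field, src_field)
        out[dst_field] = VALUE_MAPS.get(dst_field, {}).get(value, value)
    return out

# Plain JSON output keeps the translated record in a common,
# nonproprietary format that other systems can ingest.
print(json.dumps(crosswalk(record_a)))  # {"eye_color": "brown", "surname": "Doe"}
```

Even this tiny example hints at the overhead: every pair of systems needs its own mapping tables, which is precisely the work that consistent, shared labeling avoids.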

Without such efforts, law enforcement agencies would face the monumental task of adopting this new technology while simultaneously gaining control of their data. Understanding and controlling the data prove necessary to avoid misidentifications, inaccurate information, and gaps in databases.

Conclusion

In the future, artificial intelligence could be embedded into a police officer’s uniform and equipment. It is a possibility suspended between technology, philosophy, and sociology. However, once the technology is appropriately developed and tested, society could reap the benefits of truly transparent law enforcement agencies whose use of force depends on data and on recorded, definable factors evaluated independently by an AI system during a police incident.

While public information remains unavailable to indicate the level of success experienced by the U.S. Navy SEALs in their experiments, continued work in the field signals a promise of advancement. Clearly, using AI to actively scan what happens around an officer in real time represents a probable technology in the profession’s future.

Commander Davidson serves with the Ventura County, California, Sheriff’s Office. He can be reached at rob.davidson@ventura.org.


1 This article is a futures study of an emerging issue of relevance to law enforcement. Its purpose is not to predict the future, but to project a variety of possible scenarios useful for planning and action in anticipation of the emerging landscape facing police organizations. Further, as with any technology, artificial intelligence is imperfect, and no claim is intended to the contrary.
2 C.A. Lua, K. Altenburg, and K.E. Nygard, “Synchronized Multi-Point Attack by Autonomous Reactive Vehicles with Simple Local Communication,” abstract (lecture, IEEE Swarm Intelligence Symposium, Indianapolis, IN, April 26, 2003), accessed June 6, 2019, https://ieeexplore.ieee.org/document/1202253.
3 William White, “Nest Hello: Smart Doorbell to Feature Facial Recognition, Alarm,” Investor Place, September 21, 2017, accessed June 6, 2019, https://investorplace.com/2017/09/nest-doorbell-facial-recognition-alphabet-goog-googl/.
4 I. Lefter, L.J.M. Rothkrantz, and G.J. Burghouts, “A Comparative Study on Automatic Audio-Visual Fusion for Aggression Detection Using Meta-Information,” abstract, Pattern Recognition Letters 34, no. 15 (November 2013): 1953-1963, accessed June 6, 2019, https://doi.org/10.1016/j.patrec.2013.01.002.
5 Martin P. Paulus et al., “Differential Brain Activation to Angry Faces by Elite Warfighters: Neural Processing Evidence for Enhanced Threat Detection,” PLOS ONE, April 14, 2010, accessed June 6, 2019, https://doi.org/10.1371/journal.pone.0010096.
6 Ibid.
7 Ibid.; and C.L. Larson, J. Aronoff, and J.J. Stearns, “The Shape of Threat: Simple Geometric Forms Evoke Rapid and Sustained Capture of Attention,” abstract, Emotion 7, no. 3 (August 2007): 526-534, accessed June 6, 2019, https://psycnet.apa.org/doiLanding?doi=10.1037%2F1528-3542.7.3.526.
8 Lefter, Rothkrantz, and Burghouts.
9 John Wulff, Artificial Intelligence and Law Enforcement (SANS Institute, 2017), accessed June 7, 2019, https://www.sans.org/reading-room/whitepapers/threatintelligence/artificial-intelligence-law-enforcement-37925.
10 Ibid.
11 Lefter, Rothkrantz, and Burghouts.
12 Abbie Griffin, “Metrics for Measuring Product Development Cycle Time,” abstract, Journal of Product Innovation Management 10, no. 2 (March 1993): 112-125, accessed June 7, 2019, https://www.sciencedirect.com/science/article/pii/0737678293900039.
13 Patrick Tucker, “Facial Recognition Coming to Police Body Cameras,” Defense One, July 17, 2017, accessed June 7, 2019, https://www.defenseone.com/technology/2017/07/facial-recognition-coming-police-body-cameras/139472/.
14 Margaret St. Pierre and William P. LaPlant, Jr., “Issues in Crosswalking Content Metadata Standards,” Information Standards Quarterly 11, no. 1 (January 1999): 2-6, accessed June 7, 2019, https://groups.niso.org/publications/white_papers/crosswalk/.
15 Eric Von Hippel and Georg Von Krogh, “Open Source Software and the ‘Private-Collective’ Innovation Model,” abstract, Organization Science 14, no. 2 (March-April 2003): 209-223, accessed June 7, 2019, https://doi.org/10.1287/orsc.14.2.209.14992.