
Publication
Algorithmic Bias and the Weaponization of Increasingly Autonomous Technologies

22/08/2018

18 pages

AI-enabled systems depend on algorithms, and those algorithms are susceptible to bias. Algorithmic biases come in many forms, arise for a variety of reasons, and call for different standards and mitigation techniques. This primer characterizes algorithmic biases, explains their potential relevance to decision-making by autonomous weapons systems, and raises key questions about the impacts of these biases and possible responses to them.

Linked programmes

Security and Technology
