Link: AI-Influenced Weapons Need Better Regulation by Branka Marijan (Scientific American, Mar 30, 2022)
Summary: Autonomous weapons systems, commonly called killer robots, are weapons programmed to find, select, and attack targets within a defined class, with little human oversight of those decisions. Unsurprisingly, the technology behind some of these systems is immature and error-prone, and there is little clarity on how they function and make decisions. Some of these weapons will inevitably strike the wrong targets, killing innocent civilians and destroying critical infrastructure. Moreover, AI technologies have been shown to be biased, particularly against women and members of minority communities, and false identifications disproportionately harm already marginalized and racialized groups. Given these risks, the author argues for the strongest possible diplomatic effort to regulate, or even prohibit, these weapons, including the AI technology behind them.
Audience: This article may interest Political Science and LJST students, as well as anyone interested in Military Studies, AI, and ML.