Human Rights Watch, a co-founding member of Stop Killer Robots, expressed concerns about the implications of autonomous weapons systems for human rights during a session at the United Nations General Assembly. The organization highlighted that these systems could be used in both warfare and law enforcement, raising significant issues under human rights law, which applies in times of peace as well as conflict.
The right to life was emphasized as a critical concern. “To avoid arbitrarily depriving someone of their right to life, use of force must be necessary to achieve a legitimate aim and applied in a proportionate manner,” Human Rights Watch stated. The organization argued that autonomous weapons systems lack the capacity for human judgment needed to assess the necessity and proportionality of force.
Beyond the right to life, other human rights such as peaceful assembly could also be at risk. “The use or threat of use of autonomous weapons systems could strike fear among protesters and thus have a chilling effect on free expression and peaceful assembly,” they noted.
Human dignity and non-discrimination are two fundamental principles potentially undermined by these technologies. Autonomous weapons might dehumanize targets by reducing them to data points through algorithms, while biases in AI design could lead to discriminatory outcomes.
Privacy concerns were also raised, because deploying these systems could require mass surveillance. Such practices may not meet the legal standards of legitimacy and proportionality.
Accountability remains another challenge with autonomous weapons systems. Legal obstacles stand in the way of holding operators or developers responsible for the actions of machines they cannot fully understand or control. This creates an accountability gap under international humanitarian law, criminal law, and human rights law.
A call was made for new legally binding instruments to address these issues by prohibiting autonomous weapons systems that violate international human rights law. A proposal specifically targeting systems that lack meaningful human control was suggested as a means to prevent abuses.
For further insights into these threats posed by autonomous weaponry, Human Rights Watch recommended consulting “A Hazard to Human Rights,” a report released in collaboration with Harvard Law School’s International Human Rights Clinic.