Sunday 21 April 2019

Slaughterbots

LAWs (Lethal Autonomous Weapons): Dystopian science fiction or future reality?


Published on Nov 12, 2017 by Stop Autonomous Weapons
YouTube: Slaughterbots (7:47)
Many of the world's leading AI researchers and humanitarian organizations are concerned about the potentially catastrophic consequences of allowing lethal autonomous weapons to be developed.

Wikipedia: Slaughterbots
Slaughterbots is a 2017 arms-control advocacy video presenting a dramatized near-future scenario where swarms of inexpensive microdrones use artificial intelligence and facial recognition to assassinate political opponents based on preprogrammed criteria. The video was released onto YouTube by the Future of Life Institute and Stuart Russell, a professor of computer science at Berkeley, on 12 November 2017. The video quickly went viral, gaining over two million views. The video was also screened to the November 2017 United Nations Convention on Certain Conventional Weapons meeting in Geneva.




An Observation
At first glance, this seems far-fetched: more the stuff of science fiction, a dystopian interpretation of the future. However, can we extrapolate from current events a logical progression to such a scenario?

* Drones or UAVs (Unmanned Aerial Vehicles) have been used by the United States for over a decade in its war on terror.

* A.I. is well known from commercial products such as Amazon Alexa, Google Home, and the voice assistants built into cell phones.

* Drones or quadcopters are a staple of videographers and aerial surveyors.

* Facial recognition is used by Facebook and Google to detect and identify people in photographs.

The pieces of the puzzle are all there: the technologies currently exist. The idea laid out in this video is within the realm of possibility. As with landmines, will we see an international treaty banning LAWs? Then again, would such a treaty stop them? I note that the Mine Ban Treaty has 32 non-signatory states, including the United States, Russia, and China: the three largest arms exporters in the world.

What will the future hold? I can't help noting that a person without a gun can't shoot you.


References

Ban Lethal Autonomous Weapons
Take Action
The development of lethal autonomous weapons would be catastrophically destabilizing to society, and time is running out to prevent them from being developed.


Wikipedia: Lethal autonomous weapon
Lethal autonomous weapons (LAWs) are a type of autonomous military robot that can independently search for and engage targets based on programmed constraints and descriptions. LAWs are also called lethal autonomous weapon systems (LAWS), lethal autonomous robots (LAR), robotic weapons, or killer robots. LAWs may operate in the air, on land, on water, under water, or in space. The autonomy of current systems as of 2018 is restricted in the sense that a human gives the final command to attack, though there are exceptions with certain "defensive" systems.

Wikipedia: Stuart J. Russell
Stuart Jonathan Russell (born 1962) is a computer scientist known for his contributions to artificial intelligence. He is a Professor of Computer Science at the University of California, Berkeley and Adjunct Professor of Neurological Surgery at the University of California, San Francisco.

Published on Mar 26, 2019 by Future of Life Institute
YouTube: Why We Should Ban Lethal Autonomous Weapons (5:43)
Top AI researchers -- including deep-learning co-inventor Yoshua Bengio and AAAI President-elect Bart Selman -- explain what you need to know about lethal autonomous weapons. Thanks to Joseph Gordon-Levitt for the narration.


