PROBLEMS OF USING AUTONOMOUS MILITARY AI AGAINST THE BACKGROUND OF RUSSIA'S MILITARY AGGRESSION AGAINST UKRAINE

Published: Jan 2, 2023

Abstract

The application of modern technologies with artificial intelligence (AI) in all spheres of human life is growing exponentially, and so is concern over their controllability. The lack of public, state, and international control over AI technologies creates large-scale risks that such software and hardware will (un)intentionally harm humanity. The events of recent months and years, specifically the Russian Federation’s war against its democratic neighbour Ukraine and other notable international conflicts, support the thesis that the uncontrolled use of AI, especially in the military sphere, may lead to deliberate disregard for the moral standards expected of controlled AI or to the spontaneous emergence of aggressive autonomous AI. The development of legal regulation for the use of AI technologies lags far behind the rapid development of these artefacts, which now reach into every area of public relations. Therefore, control over the creation and use of AI should be exercised not only through purely technical regulation (e.g., technical standards and conformance assessments, corporate and developer regulations, and requirements enforced through industry-wide ethical codes), but also through comprehensive legislation and intergovernmental oversight bodies that codify and enforce specific changes in the rights and duties of legal persons. This article presents the “Morality Problem” and the “Intentionality Problem” of AI and reflects upon various lacunae that arise when implementing AI for military purposes.

How to Cite

Kostenko, O., Jaynes, T., Zhuravlov, D., Dniprov, O., & Usenko, Y. (2023). PROBLEMS OF USING AUTONOMOUS MILITARY AI AGAINST THE BACKGROUND OF RUSSIA’S MILITARY AGGRESSION AGAINST UKRAINE. Baltic Journal of Legal and Social Sciences, (4), 131-145. https://doi.org/10.30525/2592-8813-2022-4-16

Keywords

artificial intelligence, autonomous systems, disinformation, international law, military ethics

References

1. Aldasoro, I., Gambacorta, L., Giudici, P., Leach, T. (2022). The drivers of cyber risk. J Financ Stab, 60. https://doi.org/10.1016/j.jfs.2022.100989 [in English].
2. Ali, S. (2020). Coming to a battlefield near you: Quantum computing, artificial intelligence, & machine learning’s impact on proportionality. St Clara J Int L, 18, 1–47 [in English].
3. Allen, T., Widdison, R. (1996). Can computers make contracts? Harv J L Technol, 9, 25–52 [in English].
4. Alper, A., Freifeld, K. (2022). White House tells chip industry to brace for Russian supply disruptions. Reuters. https://www.reuters.com/article/ukraine-crisis-chips-idCAKBN2KG111 [in English].
5. Athanasia, G., Arcuri, G. (2022). Russia's invasion of Ukraine impacts gas markets critical to chip production. Center for Strategic & International Studies. https://www.csis.org/blogs/perspectives-innovation/russias-invasion-ukraine-impacts-gas-markets-critical-chip-production [in English].
6. Ayalew, Y.E. (2019). The Internet shutdown muzzle(s) freedom of expression in Ethiopia: Competing narratives. Inf & Comm Technol L. https://doi.org/10.1080/13600834.2019.1619906 [in English].
7. Barfield, W. (2006). Intellectual property rights in virtual environments: Considering the rights of owners, programmers and virtual avatars. Akron L Rev, 39, 649–700 [in English].
8. Bennett, D.S. (2006). Chimera and the continuum of humanity: Erasing the line of constitutional personhood. Emory L J, 55, 347-388 [in English].
9. Bertolini, A. (2015). Robotic prostheses as products enhancing the rights of people with disabilities: Reconsidering the structure of liability rules. Int Rev L Comput & Technol, 29, 116-136 [in English].
10. Billauer, B.P. (2021). The bionic plaintiff and the cyborg defendant: Liability in the age of brain-to-computer interface. Va J L & Technol, 25 (2), 38-111 [in English].
11. Bistron, M., Piotrowski, Z. (2021). Artificial intelligence applications in military systems and their influence on sense of security of citizens. Electron. 10 (7), 1-19 [in English].
12. Biswas, S. (2021). The Indian government's war with Twitter. BBC World News – India. https://www.bbc.com/news/world-asia-india-56007451 [in English].
13. Bostrom, N. (2016). Superintelligence: Paths, dangers, strategies, paperback ed. Oxford University Press, Oxford, UK [in English].
14. Bridy, A. (2012). Coding creativity: Copyright and the artificially intelligent author. Stanf Technol L Rev, 5, 1–28 [in English].
15. Butun, I., Tuncel, Y.K., Oztoprak, K. (2021). Application layer packet processing using PISA switches. Sens. https://doi.org/10.3390/s21238010 [in English].
16. Chappell, B. (2022). Charging Putin for potential war crimes is difficult, and any penalty hard to enforce. NPR. https://www.npr.org/2022/04/05/1090837686/putin-war-crimes-prosecution-bucha [in English].
17. Chen, X. (2001). Limitation of liability for maritime claims: A study of U.S. law, Chinese law and international conventions. Kluwer Law International, The Hague [in English].
18. Coeckelbergh, M. (2020). Artificial intelligence, responsibility attribution, and a relational justification of explainability. Sci & Eng Eth. https://doi.org/10.1007/s11948-019-00146-8 [in English].
19. Coguic, L. (2021). Forward thinking or right on time? A proposal to recognize authorship and inventorship to artificial intelligence. Indones J Int & Comp L, 8, 223–248 [in English].
20. Czernecki, J.L. (2003). The United Nations' paradox: the battle between humanitarian intervention and state sovereignty. Duquesne L Rev, 41, 391–408. https://dsc.duq.edu/dlr/vol41/iss2/7 [in English].
21. Delvaux, M. (2017). Report with recommendations to the Commission on Civil Law Rules on Robotics. Eur Parliam A8-0005/2017. https://www.europarl.europa.eu/doceo/document/A-8-2017-0005_EN.html [in English].
22. Derviş, K., Ocampo, J.A., (2022). Will Ukraine's tragedy spur UN Security Council reform? Project Syndicate. https://www.project-syndicate.org/commentary/ukraine-war-proposal-for-un-security-council-reform-by-kemal-dervis-and-jose-antonio-ocampo-2022-03 [in English].
23. European Commission (2021). Proposal for a regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (artificial intelligence act) and amending certain Union legislative Acts. Publications Office of the European Union. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021PC0206 [in English].
24. Fall, J.J. (2020). Territory, sovereignty and entitlement: Diplomatic discourses in the United Nations Security Council. Polit Geogr. https://doi.org/10.1016/j.polgeo.2020.102208 [in English].
25. Farrow, R. (2022). How democracies spy on their citizens. The New Yorker. https://www.newyorker.com/magazine/2022/04/25/how-democracies-spy-on-their-citizens [in English].
26. Faulconbridge, G. (2022). Russia fights back in information war with jail warning. Reuters. https://www.reuters.com/world/europe/russia-introduce-jail-terms-spreading-fake-informationabout-army-2022-03-04/ [in English].
27. Formerand, J. (2007). The a to z of the United Nations. Scarecrow Press, Lanham, MD [in English].
28. Freire, I.T., Urikh, D., Arsiwalla, X.D., Verschure, P.F. (2020). Machine morality: From harm-avoidance to human-robot cooperation. Biomimetic and biohybrid systems: 9th international conference, living machines 2020, Freiburg, Germany, July 28–30, 2020, Proceedings. Springer Nature Switzerland, Cham, 116-127. https://doi.org/10.1007/978-3-030-64313-3_13 [in English].
29. Froomkin, A.M., Colangelo, P.Z. (2015). Self-defense against robots and drones. Conn L Rev, 48, 1–70. https://repository.law.miami.edu/fac_articles/63/ [in English].
30. Gellers, J.C. (2021). Rights for robots: Artificial intelligence, animal and environmental law. Routledge, London [in English].
31. Göksu, R. (2022). Can a river be considered a legal person? Investigating the possibility for the Atrato River. Arch für Rechts- und Sozialphilosophie [Arch Philos L & Soc Philos]. https://doi.org/10.25162/arsp-2022-0005 [in English].
32. Gordon, J.S. (2020). Building moral machines: Ethical pitfalls and challenges. Sci & Eng Eth. https://doi.org/10.1007/s11948-019-00084-5 [in English].
33. Graham, K. (2012). Of frightened horses and autonomous vehicles: Tort law and its assimilation of innovations. St Clara L Rev, 52, 1241-1270. https://digitalcommons.law.scu.edu/lawreview/vol52/iss4/4 [in English].
34. Haney, B.S. (2020). Applied artificial intelligence in modern warfare and national security policy. Hastings Sci & Technol L J, 11, 61-98. https://repository.uchastings.edu/hastings_science_technology_law_journal/vol11/iss1/5/ [in English].
35. Hodgkinson, D., Johnston, R. (2018). Aviation law and drones: Unmanned aircraft and the future of aviation. Routledge, Oxon, UK [in English].
36. Jacobs, C.J. (2015). End the popularity contest: A proposal for second amendment type of weapon analysis. Tenn L Rev, 83, 231-290. https://scholarship.law.bu.edu/faculty_scholarship/651 [in English].
37. Jaynes, T.L. (2020). Legal personhood for artificial intelligence: Citizenship as the exception to the rule. AI & Soc, 35, 343-354. https://doi.org/10.1007/s00146-019-00897-9 [in English].
38. Jaynes, T.L. (2021a). The question of algorithmic personhood and being (or: On the tenuous nature of human status and humanity tests in virtual spaces—why all souls are ‘necessarily’ equal when considered as energy). Multidisciplinary Scientific Journal, 4(3), 452-475. https://doi.org/10.3390/j4030035 [in English].
39. Jaynes, T.L. (2021b). On human genome manipulation and Homo technicus: The legal treatment of non-natural human subjects. AI & Eth, 1, 331-345. https://doi.org/10.1007/s43681-021-00044-5 [in English].
40. Jaynes, T.L. (2021c). The legal ambiguity of advanced assistive bionic prosthetics: Where to define the limits of ‘enhanced persons’ in medical treatment. Clin Eth, 16 (3), 171–182. https://doi.org/10.1177/1477750921994277 [in English].
41. Jaynes, T.L. (2021d). Citizenship as the exception to the rule: An addendum. AI & Soc, 36 (3), 911-930. https://doi.org/10.1007/s00146-020-01105-9 [in English].
42. Jaynes, T.L. (2021e). “I am not your robot:” The metaphysical challenge of humanity’s AIS ownership. AI & Soc, 37, 1689-1702. https://doi.org/10.1007/s00146-021-01266-1 [in English].
43. Kareng, Y. (2020). International aviation/airspace law: An overview. Int J L Reconstr, 4(1), 56–68. https://doi.org/10.26532/ijlr.v4i1.10941 [in English].
44. Kerr, I.R. (1999). Spirits in the material world: Intelligent agents as intermediaries in electronic commerce. Dalhous L J, 22, 190-249. https://digitalcommons.schulichlaw.dal.ca/dlj/vol22/iss2/4 [in English].
45. Kester, C.M. (1994). Is there a person in that body?: An argument for the priority of persons and the need for a new legal paradigm. Georget L J, 82, 1643-1688 [in English].
46. Kim, S.W., Douai, A. (2012). Google vs. China’s “Great Firewall”: Ethical implications for free speech and sovereignty. Technol & Soc, 34(2), 174-181. https://doi.org/10.1016/j.techsoc.2012.02.002 [in English].
47. Kinsara, O.A. (2021). Clash of dilemmas: How should UK copyright law approach the advent of autonomous AI creations? Camb L Rev, 6, 62-85. https://www.cambridgelawreview.org/_files/ugd/fb0f90_b4883bafdd4142618688f92db068177c.pdf [in English].
48. Kirchner, S. (2013). Personhood and the right to life under the European Convention of Human Rights: Current and future challenges of modern (bio-)technology. Legal journal «Law of Ukraine», 3, 292-302 [in English].
49. Kolff, D.W. (2011). 'Missile strike carried out with Yemeni cooperation'—using UCAVs to kill alleged terrorists: A professional approach to the normative bases of military ethics. J Mil Eth, 2 (3), 240-244. https://doi.org/10.1080/15027570310000793 [in English].
50. Kostenko, O.V. (2022). Electronic jurisdiction, metaverse, artificial intelligence, digital personality, digital avatar, neural networks: theory, practice, perspective. World Science, 1(73). https://doi.org/10.31435/rsglobal_ws/30012022/775 [in English].
51. Krishnan, M. (2021). Why are Twitter and WhatsApp miffed with Indian authorities? Deutsche Welle. https://www.dw.com/en/india-social-media-conflict/a-57702394 [in English].
52. Kurki, V.A.J., Pietrzykowski, T. (eds) (2017). Legal personhood: Animals, artificial intelligence and the unborn. Springer International Publishing, Cham. https://doi.org/10.1007/978-3-319-53462-6 [in English].
53. Kurki, V.A.J. (2019). A theory of legal personhood. Oxford University Press, Oxford, UK [in English].
54. Lee, M.K. (2018). Understanding perception of algorithmic decisions: Fairness, trust, and emotion in response to algorithmic management. Big Data & Soc, 5(1). https://doi.org/10.1177/2053951718756684 [in English].
55. Leibold, J. (2019). Surveillance in China’s Xinjiang region: Ethnic sorting, coercion, and inducement. J Contemp China, 29(121), 46-60. https://doi.org/10.1080/10670564.2019.1621529 [in English].
56. Lewis, T.G. (2020). Critical infrastructure protection in homeland security: Defending a networked nation, 3rd ed. John Wiley & Sons, Hoboken, NJ [in English].
57. Liebenberg, S. (2020). Between sovereignty and accountability: the emerging jurisprudence of the United Nations Committee on Economic, Social and Cultural Rights under the Optional Protocol. Human Rights Quart, 42(1), 48-84. https://doi.org/10.1353/hrq.2020.0001 [in English].
58. Linder, D.O. (1995). The other right-to-life debate: When does fourteenth amendment “life” end? Ariz L Rev, 37, 1183–1208 [in English].
59. Lostal, M. (2021). De-objectifying animals: Could they qualify as victims before the International Criminal Court? J Int Crim Justice, 19(3), 583-610. https://doi.org/10.1093/jicj/mqab039 [in English].
60. Mawani, R. (2018). Across oceans of law : The Komagata Maru and jurisdiction in the time of empire. Duke University Press, Durham, NC [in English].
61. McCarthy, J., Minsky, M.L., Rochester, N., Shannon, C.E. (2006). A proposal for the Dartmouth summer research project on artificial intelligence, August 31, 1955. AI Mag. https://doi.org/10.1609/aimag.v27i4.1904 [in English].
62. Meisler, S. (1995). United Nations: The first fifty years. Atlantic Monthly Press, New York [in English].
63. Mokhtarian, E. (2018). The bot legal code: Developing a legally compliant artificial intelligence. Vanderbilt J Entertain & Technol L, 21, 145-208. https://scholarship.law.vanderbilt.edu/jetlaw/vol21/iss1/3 [in English].
64. Mostow, J. (1985). Foreword: What is AI? And what does it have to do with software engineering? IEEE Trans Softw Eng, 11, 1253–1256. https://doi.org/10.1109/TSE.1985.231876 [in English].
65. Muehlhauser, L., Bostrom, N. (2014). Why we need friendly AI. Think, 13(36), 41-47. https://doi.org/10.1017/S1477175613000316 [in English].
66. Mullerson, R. (1993). New developments in the former USSR and Yugoslavia. Va J Int L, 33, 299–322 [in English].
67. Owe, A., Baum, S. (2021). Moral consideration of nonhumans in the ethics of artificial intelligence. AI & Eth, 1, 517-528. https://doi.org/10.1007/s43681-021-00065-0 [in English].
68. Peachey, K. (2022). Ukraine war: Fraudsters exploit crisis to steal money. BBC News – Business. https://www.bbc.com/news/business-60765116 [in English].
69. Quantum Technology and Application Consortium – QUTAC, Bayerstadler, A., Becquin, G., Binder, J., Botter, T., Ehm, H., Ehmer, T., et al. (2021). Industry quantum computing applications. EPJ Quantum Technol, 8, 25. https://doi.org/10.1140/epjqt/s40507-021-00114-x [in English].
70. Ruijgrok, K. (2021). The authoritarian practice of issuing internet shutdowns in India: The Bharatiya Janata Party’s direct and indirect responsibility. Democratization, 29(4), 611–633. https://doi.org/10.1080/13510347.2021.1993826 [in English].
71. Sampedro, V., López-Ferrándes, F.J., Hidalgo, P. (2021). Digital disintermediation, technical and national sovereignty: the Internet shutdown of Catalonia’s ‘independence referendum’. Eur J Comm, 37(2), 127-144. https://doi.org/10.1177/02673231211012143 [in English].
72. Savulescu, J., Maslen, H. (2015). Moral enhancement and artificial intelligence: moral AI? Beyond artificial intelligence: The disappearing human-machine divide. Springer International Publishing, Cham, 9, 79-95. https://doi.org/10.1007/978-3-319-09668-1_6 [in English].
73. Shank, D.B., DeSanti, A. (2018). Attributions of morality and mind to artificial intelligence after real-world moral violations. Comput in Hum Behav, 86, 401-411. https://doi.org/10.1016/j.chb.2018.05.014 [in English].
74. Solum, L.B. (1992). Legal personhood for artificial intelligences. N C L Rev, 70, 1231–1287. https://scholarship.law.unc.edu/nclr/vol70/iss4/4 [in English].
75. Stone, C.D. (1972). Should trees have standing—toward legal rights for natural objects. South Calif L Rev, 45, 450-501 [in English].
76. de Swarte, T., Boufous, O., Escalle, P. (2019). Artificial intelligence, ethics and human values: The cases of military drones and companion robots. Artif Life & Robot, 24, 291–296. https://doi.org/10.1007/s10015-019-00525-1 [in English].
77. Taddeo, M., Floridi, L. (2018). Regulate artificial intelligence to avert cyber arms race. Nat, 556, 296-298. https://doi.org/10.1038/d41586-018-04602-6 [in English].
78. Tarisayi, K.S., Munyaradzi, E. (2021). A teacher perspective on the impact of internet shutdown on the teaching and learning in high schools in Zimbabwe. Human Behav & Emerg Technol, 3, 169-175. https://doi.org/10.1002/hbe2.230 [in English].
79. Terzian, D. (2013). The right to bear (robotic) arms. Penn State L Rev, 117, 755–796. http://www.pennstatelawreview.org/117/3/Terzian%20final.pdf [in English].
80. The Associated Press (2022). U.N. takes step to put veto users under global spotlight. NPR. https://www.npr.org/2022/04/27/1094971703/u-n-takes-step-to-put-veto-users-under-globalspotlight [in English].
81. Tilovska-Kechedji, E., Bojović, M., Čvorović, D.S. (2018). Artificial intelligence influencing foreign policy and security. J East Eur L, 4, 7-18 [in English].
82. Treisman, R. (2022). Zelenskyy urges U.N. Security Council to boot Russia or dissolve for the world's sake. NPR. https://www.npr.org/2022/04/05/1091050554/zelenskyy-un-security-council-speech [in English].
83. Troianovski, A. (2022). Russia takes censorship to new extremes, stifling war coverage. New York Times. https://www.nytimes.com/2022/03/04/world/europe/russia-censorship-media-crackdown.html [in English].
84. United Nations Educational, Scientific and Cultural Organization [UNESCO] (2021). UNESCO member states adopt the first ever global agreement on the ethics of artificial intelligence. UNESCO. https://en.unesco.org/news/unesco-member-states-adopt-first-ever-global-agreement-ethics-artificial-intelligence [in English].
85. Van Noorden, R. (2020). The ethical questions that haunt facial-recognition research. Nat, 587, 354-358. https://doi.org/10.1038/d41586-020-03187-3 [in English].
86. Wallach, W., Allen, C. (2009). Moral machines: Teaching robots right from wrong. Oxford University Press, New York [in English].
87. Wang, D., Wang, K. (2016). Paradise, panopticon, or laboratory? A tale of the Internet in China. Pierre Musso and the network society: From Saint-Simonianism to the Internet. Springer International Publishing, Cham, 27, 155-168. https://doi.org/10.1007/978-3-319-45538-9_7 [in English].
88. Wein, L.E. (1992). The responsibility of intelligent artifacts: Toward an automation jurisprudence. Harv J L Technol, 6, 103–154. https://jolt.law.harvard.edu/assets/articlePDFs/v06/06Harv-JLTech103.pdf [in English].
89. Williamson, E.D., Osborn, J.E. (1993). A U.S. perspective on treaty succession and related issues in the wake of the breakup of the USSR and Yugoslavia. Va J Int L, 33, 261–274 [in English].
90. Wrigley, S. (2019). Bots, artificial intelligence, and the general data protection regulation: Asking the right questions. Trinity Coll L Rev, 22, 199-211 [in English].
91. Zhu, H.H., Zou, J., Zhang, H., Shi, Y.Z., Luo, S.B., Wang, N., Cai, H., et al. (2022). Space-efficient optical computing with an integrated chip diffractive neural network. Nat Commun, 13, 1–9. https://doi.org/10.1038/s41467-022-28702-0 [in English].
