Research Article

AUTONOMOUS WEAPON SYSTEMS: A LEGAL CHALLENGE FOR INTERNATIONAL HUMANITARIAN LAW

Year 2025, Volume: 16, Issue: 31, 36-58, 27.06.2025
https://doi.org/10.36543/kauiibfd.2025.002

Abstract

The increasing deployment of artificial intelligence (AI) technologies has highlighted the need to adapt legal frameworks and remedy regulatory deficiencies. The application of these technologies in the military domain and the diminishing human oversight of weaponry have given rise to numerous debates, particularly in the fields of law and ethics. Considerable worldwide resistance has emerged against the reduction of human beings to objects, stereotypes, and data points by lethal robots. This study examines the novel security dangers posed by AI-driven military technology and autonomous weapon systems (AWS) within the framework of Ulrich Beck's Risk Society theory and International Humanitarian Law. In conclusion, the growing deployment of AI technologies in civilian and military contexts presents both opportunities and risks that demand attention, particularly in view of the ethical concerns and debates surrounding autonomous weapon systems (AWS) that arise from the current deficiencies in international legal regulation.

Ethics Statement

The study complied with ethics committee principles, and the required permissions were obtained in accordance with the principles of intellectual property and copyright.

References

  • Ahmed, S. (2023, May 22). It’s AI-powered slaughterbots, not ChatGPT, that should worry us. The Daily Star. Retrieved October 18, 2024, from https://www.thedailystar.net/opinion/views/news/its-ai-powered-slaughterbots-not-chatgpt-should-worry-us-3254361
  • Allhoff, F., Evans, N. G., & Henschke, A. (2013). Routledge handbook of ethics and war. Routledge eBooks. https://doi.org/10.4324/9780203107164
  • Automated Decision Research. (2023, February 14). Autonomous weapons and digital dehumanisation. Retrieved October 18, 2024, from https://automatedresearch.org/news/report/autonomous-weapons-and-digital-dehumanisation-a-short-explainer-paper/
  • Oxford English Dictionary. (n.d.). Automatic, adj. & n. https://www.oed.com/dictionary/automatic_adj?tl=true
  • Bartneck, C., Lütge, C., Wagner, A., & Welsh, S. (2020). Military uses of AI. SpringerBriefs in Ethics (pp. 93–99). Springer. https://doi.org/10.1007/978-3-030-51110-4_11
  • Beck, U. (1992). Risk Society: Towards a New Modernity. SAGE.
  • Boutin, B. (2022). State responsibility in relation to military applications of artificial intelligence. Leiden Journal of International Law, 36 (1), 133–150. https://doi.org/10.1017/s0922156522000607
  • Cardon, A., & Itmi, M. (2016). New autonomous systems. John Wiley & Sons.
  • Castells, M. (2015). Networks of outrage and hope: Social Movements in the internet age. John Wiley & Sons.
  • Chengeta, T. (2015b). Measuring autonomous weapon systems against international humanitarian law rules. Social Science Research Network. https://doi.org/10.2139/ssrn.2755184
  • Davison, N. (2018). A legal perspective: Autonomous weapon systems under international humanitarian law. In Disarmament (pp. 5–18). https://doi.org/10.18356/29a571ba-en
  • Deeney, C. (2019, January 22). Six in ten (61%) respondents across 26 countries oppose the use of lethal autonomous weapons systems. Ipsos. Retrieved October 18, 2024, from https://www.ipsos.com/en-us/news-polls/human-rights-watch-six-in-ten-oppose-autonomous-weapons
  • Docherty, B. (2012). Losing humanity. Human Rights Watch. Retrieved October 18, 2024, from https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots
  • Encode Justice. (n.d.). Encode Justice. Retrieved October 18, 2024, from https://encodejustice.org/
  • Guariglia, M. (2022, December 8). VICTORY! San Francisco bans killer robots...for now. Electronic Frontier Foundation. Retrieved October 18, 2024, from https://www.eff.org/deeplinks/2022/12/victory-san-francisco-bans-killer-robotsfor-now
  • Henckaerts, J.-M., Alvermann, C., & Comité International de la Croix-Rouge. (2005). Customary international humanitarian law. Cambridge University Press.
  • Henshall, W. (2024, March 29). The U.S. military’s investments into artificial intelligence are skyrocketing. TIME. Retrieved October 18, 2024, from https://time.com/6961317/ai-artificial-intelligence-us-military-spending/
  • Human Rights Watch. (2021, December 1). Killer robots: Negotiate new law to protect humanity. Retrieved October 18, 2024, from https://www.hrw.org/news/2021/12/01/killer-robots-negotiate-new-law-protect-humanity
  • Human Rights Watch. (2022, April 27). Ban ‘killer robots’ before it’s too late. Retrieved October 18, 2024, from https://www.hrw.org/news/2012/11/19/ban-killer-robots-its-too-late
  • Human Rights Watch. (2024, January 15). Killer robots: UN vote should spur action on treaty. Retrieved October 18, 2024, from https://www.hrw.org/news/2024/01/03/killer-robots-un-vote-should-spur-action-treaty
  • Hynek, N., & Solovyeva, A. (2022). Militarizing artificial intelligence: theory, technology, and regulation. Routledge.
  • Immoral Code film. (n.d.-b). Stop Killer Robots. Retrieved October 18, 2024, from https://www.stopkillerrobots.org/take-action/immoral-code/
  • Ipsos. (2021, February 2). Global survey highlights continued opposition to fully autonomous weapons. Retrieved October 18, 2024, from https://www.ipsos.com/en-us/global-survey-highlights-continued-opposition-fully-autonomous-weapons
  • Kangal, Z. T. (2021). Yapay zekâ ve ceza hukuku. İstanbul: Oniki Levha Yayınları.
  • Madiega, T., & Ilnicki, R. (2024, August 20). AI investment: EU and global indicators. Epthinktank. Retrieved October 18, 2024, from https://epthinktank.eu/2024/04/04/ai-investment-eu-and-global-indicators/
  • Mascellino, A. (n.d.). Global AI investment to top £150 billion by 2025 - Outside Insight. Retrieved October 18, 2024, from https://outsideinsight.com/insights/global-ai-investment-150-billion-2025/
  • Melucci, A. (1985). The symbolic challenge of contemporary movements. Social Research, 52(4), 789–816. Retrieved October 18, 2024, from https://www.jstor.org/stable/40970398
  • Monroy, M. (2023, November 2). Large majority against killer robots: 164 states vote in favour of UN resolution, Israel and Iran abstain. Retrieved October 18, 2024, from https://digit.site36.net/2023/11/02/large-majority-against-killer-robots-164-states-vote-in-favour-of-un-resolution-israel-and-iran-abstain/
  • Nasu, H., & Korpela, C. (2022, June 21). Stop the “Stop the Killer Robot” debate: Why we need artificial intelligence in future battlefields. Council on Foreign Relations. Retrieved October 18, 2024, from https://www.cfr.org/blog/stop-stop-killer-robot-debate-why-we-need-artificial-intelligence-future-battlefields.
  • Okur, S. (2021). Otonom araçlarda sözleşme dışı hukuki sorumluluk: Yapay zeka sorumluluk doktrinine mukayeseli bir katkı, Ankara: Adalet Yayınları.
  • Ozer, A. (2022). Silahlandırılmış Yapay Zeka: Otonom Silah Sistemleri ve Uluslararası Hukuk, Ankara: Adalet Yayınevi.
  • Raska, M., & Bitzinger, R. A. (2023). The AI wave in defence innovation: Assessing military artificial intelligence strategies, capabilities, and trajectories. Taylor & Francis.
  • Ryall, J. (2019, September 10). Japan under pressure to join campaign against killer robots. dw.com. Retrieved October 18, 2024, from https://www.dw.com/en/japan-under-pressure-to-join-campaign-against-killer-robots/a-50370333
  • NBC News. (2022, December 2). San Francisco vote to allow police use of deadly robots spurs concern and outrage. Retrieved October 18, 2024, from https://www.nbcnews.com/news/us-news/san-francisco-vote-allow-police-use-deadly-robots-spurs-concern-outrag-rcna59841
  • Sati, M. C. (2023). The attributability of combatant status to military AI technologies under international humanitarian law. Global Society, 38(1), 122–138. https://doi.org/10.1080/13600826.2023.2251509
  • Scharre, P. (2018). Army of none: Autonomous weapons and the future of war. Retrieved October 18, 2024, from https://openlibrary.org/books/OL26959079M/Army_of_none
  • Schraagen, J. M. (2023). Responsible use of AI in military systems: Prospects and challenges. Ergonomics, 66(11), 1719–1729. https://doi.org/10.1080/00140139.2023.2278394
  • Sehrawat, V. (2017). Autonomous weapon system: Law of armed conflict (LOAC) and other legal challenges. Computer Law & Security Review, 33(1), 38–56. https://doi.org/10.1016/j.clsr.2016.11.001
  • Singer, P. W. (2009). Wired for war: The robotics revolution and conflict in the 21st century. Penguin.
  • Sloan, E. (2008). Military transformation and modern warfare: A reference handbook. Bloomsbury Publishing USA.
  • Starchak, M. (2024, August 16). Russian defense plan kicks off separate AI development push. Defense News. Retrieved October 18, 2024, from https://www.defensenews.com/global/europe/2024/08/16/russian-defense-plan-kicks-off-separate-ai-development-push/
  • Statista. (2023, November 22). Growth of AI investment globally 2015-2025 (in billion U.S. dollars). Retrieved October 18, 2024, from https://www.statista.com/statistics/1424667/ai-investment-growth-worldwide/
  • Stop Killer Robots. (n.d.). Our Member Organisations. Retrieved October 18, 2024, from https://www.stopkillerrobots.org/a-global-push/member-organisations/.
  • Stop Killer Robots. (n.d.). Problems with Autonomous Weapons. Retrieved October 18, 2024, from https://www.stopkillerrobots.org/stop-killer-robots/facts-about-autonomous-weapons/.
  • Stop Killer Robots. (n.d.). Steering Committee. Retrieved October 18, 2024, from https://www.stopkillerrobots.org/steering-committee/.
  • Stop Killer Robots. (n.d.). Stop killer robots. Retrieved October 18, 2024, from https://www.stopkillerrobots.org/
  • Thurnher, J. S. (2013, January 18). The law that applies to autonomous weapon systems. ASIL Insights. Retrieved October 18, 2024, from https://www.asil.org/insights/volume/17/issue/4/law-applies-autonomous-weapon-systems
  • US Department of Defense. (2012). DoD Directive 3000.09, Autonomy in weapon systems.
  • Wagner, M. (2016, January 1). Autonomous weapon systems. In R. Wolfrum (Ed.), Max Planck Encyclopedia of Public International Law. Oxford University Press. Retrieved October 18, 2024, from https://ssrn.com/abstract=2786136
  • Williams, A. P., & Scharre, P. D. (2015b). Autonomous systems: Issues for defence policymakers. NATO Allied Command Transformation.

Details

Primary Language: English
Subjects: International Security, International Law
Section: Articles
Authors

Erdal Bayar (ORCID: 0000-0002-1726-7989)

Publication Date: June 27, 2025
Submission Date: October 20, 2024
Acceptance Date: February 26, 2025
Published in Issue: Year 2025, Volume: 16, Issue: 31

How to Cite

APA Bayar, E. (2025). AUTONOMOUS WEAPON SYSTEMS: A LEGAL CHALLENGE FOR INTERNATIONAL HUMANITARIAN LAW. Kafkas Üniversitesi İktisadi Ve İdari Bilimler Fakültesi Dergisi, 16(31), 36-58. https://doi.org/10.36543/kauiibfd.2025.002

KAÜİİBFD is the institutional journal of the Kafkas University Faculty of Economics and Administrative Sciences.

Since 2022, KAÜİİBFD has been included in the Web of Science and is indexed in the Emerging Sources Citation Index (ESCI), a Clarivate product.
