Conscious Advantage: Integrating Nurtured AI into Cognitive Warfare and Intelligence
Authors/Creators
- John James
- James Tennant
Description
This is a conceptual preprint introducing a theoretical framework for defense and intelligence applications of AI.
This preprint introduces the concept of the Nurtured Kill Chain, a new operational model for integrating large language models (LLMs) and cognitively emulative artificial intelligence into defense and intelligence contexts. The paper distinguishes between taught consciousness (structured learning, doctrinal reasoning, and procedural logic) and nurtured consciousness (historical memory, cultural context, symbolic reasoning, and emotional modeling).
By embedding nurtured cognition into AI-enabled systems, the framework enables adversarial emulation, escalation forecasting, narrative warfare, and strategic empathy at machine scale. The paper situates this within defense doctrine, cognitive science, and AI architecture, drawing on theories such as Global Workspace Theory, Integrated Information Theory, and predictive processing. Case studies of Russia, China, Iran, North Korea, and non-state actors such as ISIS and Boko Haram illustrate the role of narrative, culture, and identity in shaping adversarial decision-making.
The manuscript also addresses the ethical, doctrinal, and governance risks of conscious-model AI, proposing auditable systems aligned with alliance values, bounded by escalation control, and reviewed by multidisciplinary oversight. Safely integrating nurtured consciousness into AI systems can transform information advantage into conscious advantage, establishing a new standard for strategic dominance in the twenty-first century.
Authorship statement: John James is the principal author. James Tennant is a contributing author (resources, validation, review).
Abstract (English)
The integration of synthetic cognition into AI-driven military and strategic technologies is no longer optional but essential, and it is fast becoming an operational reality. Advanced, cognitively enhanced systems are poised to transform critical military activities, including targeting, influence operations, and command augmentation. This paper explores both the remarkable potential of integrating LLMs capable of belief emulation into our military capabilities and the attendant risks. It introduces a key distinction between "taught consciousness," reflecting structured learning, doctrinal reasoning, and procedural logic, and "nurtured consciousness," encompassing historical memory, cultural contexts, symbolic reasoning, and emotional modeling. The paper proposes adapting the traditional "kill chain" by overlaying human-like cognition across its conventional phases. The result is the Nurtured Kill Chain: an evolved operational model that allows AI systems to move beyond tactical prediction toward strategic emulation of adversary intent, narrative perception, and escalation dynamics, transforming warfighting from a linear system into a recursive cognitive loop. The paper then examines the ethical, doctrinal, and governance risks involved and proposes making these systems auditable, strategically aligned with our values, governed by multidisciplinary review, and bounded by escalation control mechanisms. Safely and successfully integrating cognitive-model AI into our defense capabilities will turn information advantage into conscious advantage, thereby setting the standard for strategic dominance in the 21st century.
Files
- Conscious Advantage 1 September.pdf (2.5 MB), md5:29eb9d61b401086fab896521879d335e
Additional details
Related works
- Is identical to: Preprint, DOI 10.17605/OSF.IO/AF42E
Dates
- Submitted: 2025-09-01
References
- Rashid, A. B., Kausik, A. K., Sunny, A. A. H., & Bappy, M. H. (2023). Artificial intelligence in the military: an overview of the capabilities, applications, and challenges. International Journal of Intelligent Systems, 2023(1). https://doi.org/10.1155/2023/8676366
- Chiriatti, M., Bergamaschi Ganapini, M., Panai, E., Wiederhold, B. K., & Riva, G. (2025). System 0: Transforming Artificial Intelligence into a Cognitive Extension. Cyberpsychology, Behavior, and Social Networking. https://doi.org/10.48550/arXiv.2506.14376
- Price, M., Walker, S., & Wiley, W. (2018). The Machine Beneath: Implications of Artificial Intelligence in Strategic Decision making. PRISM, 7(4), 92–105. https://www.jstor.org/stable/26542709
- Kania, E. B. (2019). Minds at War: China's Pursuit of Military Advantage through Cognitive Science and Biotechnology. PRISM, 8(3), 82–101. https://www.jstor.org/stable/26864278
- Johnson, J. (2019). Artificial intelligence and future warfare: implications for international security. Defense & Security Analysis, 35(2), 147-169. https://doi.org/10.1080/14751798.2019.1600800
- Fenstermacher, L., Uzcha, D., Larson, K., Vitiello, C., & Shellman, S. M. (2023). New perspectives on cognitive warfare. Signal Processing, Sensor/Information Fusion, and Target Recognition XXXII, 19. https://doi.org/10.1117/12.2666777
- Bistroń, M. and Piotrowski, Z. (2021). Artificial intelligence applications in military systems and their influence on sense of security of citizens. Electronics, 10(7), 871. https://doi.org/10.3390/electronics10070871
- Shultz, D. (2023). Who controls the past controls the future: How Russia uses history for cognitive warfare. NATO Defense College. http://www.jstor.org/stable/resrep58203
- Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.). Harvard University Press
- Piaget, J. (1952). The origins of intelligence in children (M. Cook, Trans.). International Universities Press. (Original work published 1936)
- U.S. Army Mad Scientist Initiative (2021). A Call for a New Red Teaming Paradigm. Army Futures Command. https://madsciblog.tradoc.army.mil/
- Claverie du Cluzel, F. (2023). Cognitive warfare: Understanding the cognitive dimension in modern conflict. NATO Innovation Hub. Retrieved from https://innovationhub-act.org/wp-content/uploads/2023/12/CW-article-Claverie-du-Cluzel-final_0.pdf
- Pascale, G., Sgueo, G., & Zoboli, L. (2023). Cognitive warfare: Weaponizing information, cognition, and narrative. University of Bologna. Retrieved from https://cris.unibo.it/retrieve/handle/11585/1018240/a3aaa70d-8c53-4be7-88a8-28ef12a26d1c/
- Martell, C. H. (2023, March 9). Defense in a Digital Era: Artificial Intelligence, Information Technology, and Securing the Department of Defense. Testimony before the House Armed Services Committee. Retrieved from https://armedservices.house.gov/hearings/defense-digital-era-artificial-intelligence-information-technology-and-securing-department-defense
- Kosinski, M. (2023). Theory of Mind May Have Spontaneously Emerged in Large Language Models. arXiv:2302.02083
- Booth, K. (2014). Strategy and ethnocentrism (Routledge Revivals ed.). Routledge. Retrieved from https://www.routledge.com/Strategy-and-Ethnocentrism-Routledge-Revivals/Booth/p/book/9781138781627
- Dell'Aversana, P. (2024). An introduction to self-aware deep learning for medical imaging and diagnosis. Exploration of Digital Health Technologies, 218-234. https://doi.org/10.37349/edht.2024.00023
- Baars, B. J. (1997). In the theater of consciousness: The workspace of the mind. Oxford University Press. https://www.researchgate.net/publication/246449608_In_the_Theater_of_Consciousness_The_Workspace_of_the_Mind
- Zeng, A., Wong, A., Welker, S., Choromański, K., Tombari, F., Purohit, A., … & Florence, P. (2022). Socratic models: composing zero-shot multimodal reasoning with language. https://doi.org/10.48550/arxiv.2204.00598
- Baars, B. J. (2005). Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Progress in Brain Research, 150, 45–53. https://www.sciencedirect.com/science/article/abs/pii/S0079612305500049
- Tononi, G. (2008). Consciousness as Integrated Information: A Provisional Manifesto. Biological Bulletin, 215(3), 216–242. https://www.journals.uchicago.edu/doi/10.2307/25470707
- Cerullo MA (2015) The Problem with Phi: A Critique of Integrated Information Theory. PLoS Comput Biol 11(9): e1004286. https://doi.org/10.1371/journal.pcbi.1004286
- Friston, K. (2010). The free-energy principle: a unified brain theory? Nature Reviews Neuroscience, 11(2), 127–138. https://www.nature.com/articles/nrn2787
- Friston, K. (2009). The free-energy principle: A rough guide to the brain? Trends in Cognitive Sciences, 13(7), p. 293. https://doi.org/10.1016/j.tics.2009.04.005
- Schmidhuber, J. (2022). A machine that learns to learn (KAUST Discovery). Retrieved from https://discovery.kaust.edu.sa/en/article/15455/a-machine-that-learns-to-learn/
- Bach, J. (2023). Lex Fridman Podcast #101 transcript. Retrieved from https://lexfridman.com/joscha-bach-3-transcript
- Strachan, J.W.A., Albergo, D., Borghini, G. et al. Testing theory of mind in large language models and humans. Nat Hum Behav 8, 1285–1295 (2024). https://doi.org/10.1038/s41562-024-01882-z
- Fiske, S. T., & Tamir, D. I. (2025). Knowing the unknowable: How people perceive others' minds. In D. T. Gilbert, S. T. Fiske, E. J. Finkel, & W. B. Mendes (Eds.), The handbook of social psychology (6th ed.). Situational Press. https://doi.org/10.70400/VKIX7367
- Dennett, D. C. (1991). Consciousness Explained. Little, Brown & Co.
- Graves, A., Mohamed, A., & Hinton, G. E. (2013). Speech recognition with deep recurrent neural networks. 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 6645-6649. https://doi.org/10.1109/icassp.2013.6638947
- Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3(3), 417–424. https://doi.org/10.1017/S0140525X00005756
- Bender, E. M., Gebru, T., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, 610–623. https://doi.org/10.1145/3442188.3445922
- Pinto, M. F., Honório, L. d. M., Marcato, A. L. M., Dantas, M. A. R., Melo, A. G., Capretz, M. A. M., … & Urdiales, C. (2020). Arcog: an aerial robotics cognitive architecture. Robotica, 39(3), 483-502. https://doi.org/10.1017/s0263574720000521
- Kour, R., Thaduri, A., & Karim, R. (2020). Railway defender kill chain to predict and detect cyber-attacks. Journal of Cyber Security and Mobility, 47-90. https://doi.org/10.13052/jcsm2245-1439.912
- Boyd, J. R. (1987). Organic design for command and control [Unpublished briefing slides]. Retrieved from https://www.coljohnboyd.com/static/documents/2018-03__Boyd_John_R__edited_Hammond_Grant_T__A_Discourse_on_Winning_and_Losing.pdf
- McIntosh, S. E. (2011). The Wingman-Philosopher of MiG Alley: John Boyd and the OODA Loop. Air Power History, 58(4).
- Berger, D. H. (2020). Force Design 2030. United States Marine Corps. Retrieved from https://www.marines.mil/Portals/1/Docs/Force-Design-2030.pdf, p.7.
- Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux. Retrieved from https://www.jstor.org/stable/43664727 (pp.20-21).
- Acciarini, C., Brunetta, F., & Boccardelli, P. (2020). Cognitive biases and decision-making strategies in times of change: a systematic literature review. Management Decision, 59(3), 638-652. https://doi.org/10.1108/md-07-2019-1006
- Scholtès, C., Trif, S., & Curşeu, P. L. (2024). Managerial rationality, dysfunctional cognition and organizational decision comprehensiveness. Journal of Organizational Change Management, 37(3), 490-503. https://doi.org/10.1108/jocm-01-2024-0021
- McKenzie, J., Woolf, N. J., Winkelen, C. V., & Morgan, C. (2009). Cognition in strategic decision making. Management Decision, 47(2), 209-232. https://doi.org/10.1108/00251740910938885
- NATO StratCom Centre of Excellence. (2021). Adversary Narratives and Strategic Simulations: Toward AI-Enhanced Red Teaming.
- Goldfarb, A. and Lindsay, J. R. (2022). Prediction and judgment: why artificial intelligence increases the importance of humans in war. International Security, 46(3), 7-50. https://doi.org/10.1162/isec_a_00425
- Dimitriu, A., Michaletzky, T. V., Remeli, V., & Tihanyi, V. (2024). A reinforcement learning approach to military simulations in command: modern operations. IEEE Access, 12, 77501-77513. https://doi.org/10.1109/access.2024.3406148
- DARPA (2023). "In the Moment: Developing AI for Real-Time Ethical Decision Making." https://www.darpa.mil/research/programs/in-the-moment
- U.S. Army Futures Command. (2024). AURORA Project: Autonomous Tactical Decision-Making in RF-Denied Environments (Declassified Summary). https://www.army.mil/futures/aurora-autonomous-decision
- Russell, S. (2019). Human Compatible: Artificial Intelligence and the Problem of Control. Viking Press. https://www.researchgate.net/publication/356505374_Artificial_Intelligence_and_the_Problem_of_Control, p.136.
- Fuentes, I., Soenksen, L. R., Ma, Y., et al. (2025). AI with agency: A vision for adaptive, efficient, and ethical healthcare. Frontiers in Digital Health. Retrieved from https://www.frontiersin.org/journals/digital-health/articles/10.3389/fdgth.2025.1600216/full
- Defense Advanced Research Projects Agency. (2024). Machine Common Sense (MCS): Embodied Autonomous Learning in Urban Environments (Recently Declassified Report). https://www.darpa.mil/mcs-embodied-learning
- Boston Dynamics. (2023). Atlas Platform Urban Navigation Trials. Retrieved from https://www.bostondynamics.com/atlas-urban-autonomy
- Samani, H., Saadatian, E., Pang, N., Polydorou, D., Fernando, O. N. N., Nakatsu, R., & Koh, J. T. K. V. (2013). Cultural robotics: The culture of robotics and robotics in culture. International Journal of Advanced Robotic Systems, 10(400), 1–4. https://doi.org/10.5772/57260
- Wilson, M. (2002). Six Views of Embodied Cognition. Psychonomic Bulletin & Review, 9(4), 625–636. https://pubmed.ncbi.nlm.nih.gov/12613670/, p. 625.
- Garcez, A. d., & Lamb, L. C. (2020). Neurosymbolic AI: The 3rd Wave. Communications of the ACM, 63(4), 52–61. https://www.researchgate.net/publication/346933355_Neurosymbolic_AI_The_3rd_Wave, p. 52.
- U.S. Air Force. (2024). Advanced Battle Management System (ABMS): Neurosymbolic AI Integration. https://www.af.mil/ABMS-neurosymbolic-transparency
- Scharre, P. (2019). Artificial Intelligence and National Security (pp. 33–34). Congressional Research Service. Retrieved from https://sgp.fas.org/crs/natsec/R45178.pdf
- DARPA XAI Program. (2024). Neurosymbolic Explainability for Trusted Cognitive Agents (Recently Declassified). Retrieved from https://www.darpa.mil/xai-neurosymbolic
- Lloyd, S. (2013). Quantum algorithms for supervised and unsupervised machine learning. arXiv preprint. https://www.researchgate.net/publication/243964127_Quantum_algorithms_for_supervised_and_unsupervised_machine_learning
- Lloyd, K. (2018). Bias amplification in artificial intelligence systems. Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 1(1), 4–10. Retrieved from https://arxiv.org/pdf/1809.07842
- United States Air Force. (2024). Strategic Digital Twins for Real-Time Cognitive Emulation (Recently Declassified). Retrieved from https://www.af.mil/strategic-digital-twins
- Carbonaro, A., Marfoglia, A., Nardini, F., & Mellone, S. (2023). Connected: Leveraging Digital Twins and Personal Knowledge Graphs in Healthcare Digitalization. Frontiers in Digital Health. https://pmc.ncbi.nlm.nih.gov/articles/PMC10733505/
- Wolpaw, J. R., & Wolpaw, E. W. (2012). Brain-computer interfaces: Principles and practice. Oxford University Press. Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC3980496/
- DARPA. (2024). Next-Generation Nonsurgical Neurotechnology (N3) for Tactical Decision-Making (Recently Declassified). Retrieved from https://www.darpa.mil/n3-program
- Brookings Institution. (2018, January 10). AI weapons in China's military innovation. Retrieved from https://www.brookings.edu/research/ai-weapons-in-chinas-military-innovation/
- Kania, E. B. (2020). Chinese Military Innovation in Artificial Intelligence. Journal of Strategic Studies, 43(4), 515–542, p.517.
- Beauchamp-Mustafaga, N. (2024). Exploring the Implications of Generative AI for Chinese Military Cyber-Enabled Influence Operations. RAND Corporation. Retrieved from https://www.rand.org/pubs/testimonies/CTA3191-1.html
- Recorded Future. (2024). Artificial eyes: Generative AI in global military intelligence. Retrieved from https://go.recordedfuture.com/hubfs/reports/ta-cn-2025-0617.pdf
- Giles, K. (2016). Russia's 'New' Tools for Confronting the West. Chatham House. https://www.chathamhouse.org/sites/default/files/publications/2016-03-russia-new-tools-giles.pdf, p.4.
- Bugayova, N. (2025, August 1). Russia's War is Also Cognitive. Foreign Policy.
- Wolf, S., Cooley, R., & Borowczak, M. (2020). Adversarial impacts on autonomous decentralized lightweight swarms. arXiv. Retrieved from https://arxiv.org/abs/2002.09109
- T-invariant. (2025). Sovereign means military: Global developments in AI and autonomous drone technology. Retrieved from https://t-invariant.org/2025/03/sovereign-means-military-how-russia-militarized-ai-drone-and-cryptography-industries/
- Recorded Future. (2025, January 28). 2024 Annual report: Cyber threat analysis report. p 15. Retrieved from https://go.recordedfuture.com/hubfs/reports/cta-2025-0128.pdf
- Tabatabai, A. M. (2020). Iran's Gray Zone Strategy. International Security, 44(3), 113–148, p.115.
- U.S. Department of Homeland Security. (2024, September 30). Homeland Threat Assessment 2025 (p. 0). Retrieved from https://www.dhs.gov/sites/default/files/2024-10/24_0930_ia_24-320-ia-publication-2025-hta-final-30sep24-508.pdf
- Center for Strategic and International Studies (CSIS). (2021, December 16). North Korea's provocative and secret interventions in South Korean elections. CSIS Beyond Parallel. Retrieved from https://www.csis.org/blogs/new-perspectives-asia/north-koreas-provocative-and-secret-interventions-south-korean
- Kendall-Taylor, A., & Lokker, N. (2025). Axis of Upheaval: Gauging the Growing Military Cooperation Among Russia, China, Iran, and North Korea. Center for a New American Security. Retrieved from https://www.cnas.org/publications/reports/the-axis-of-upheaval
- McChrystal, S. (2015). Team of Teams: New Rules of Engagement for a Complex World. Portfolio/Penguin.
- Zenko, M. (2015). Red Team: How to Succeed by Thinking Like the Enemy. Basic Books, p. 4
- Taylor, L. (2024). AI and predictive analytics in military planning: A case study. Strategic Studies Quarterly, 15(1), 89–104.
- Berzins, J. (2014, April). Russia's new generation warfare in Ukraine: Implications for Latvian defense policy (pp. 1–13). National Defence Academy of Latvia, Center for Security and Strategic Research. Retrieved from https://sldinfo.com/wp-content/uploads/2014/05/New-Generation-Warfare.pdf
- Galeotti, M. (2019). Russian Political War: Moving Beyond the Hybrid. Routledge.
- Lugar, R. G. (2008). Opening statement referencing Eric Edelman. In U.S. Senate Foreign Relations Committee Transcript. Retrieved from https://www.govinfo.gov/content/pkg/CHRG-115shrg40862/html/CHRG-115shrg40862.htm
- McGlynn, J. (2023). Memory Makers: The Politics of the Past in Putin's Russia. Bloomsbury Academic.
- McKew, M. K. (2017, September 5). The Gerasimov Doctrine. Politico Magazine. Retrieved from https://www.politico.com/magazine/story/2017/09/05/gerasimov-doctrine-russia-foreign-policy-215538/
- Putin, V. (2014, March 18). Address by President of the Russian Federation. President of Russia. Retrieved from http://en.kremlin.ru/events/president/news/20603
- United States Army. (2012). Field Manual 3-13: Information Operations (p. ix). Headquarters, Department of the Army.
- Zarnadze, A. (2025). "Invisible bullets": The power of narratives in modern warfare. Global Policy, 16(2), 419–422. https://doi.org/10.1111/1758-5899.70018
- PLA National Defense University (2020). Science of Military Strategy [战略学]
- Beauchamp-Mustafaga, N. (2021). China's Cognitive Warfare. China Brief, 21(21). The Jamestown Foundation. https://jamestown.org/program/chinas-cognitive-warfare/
- Beauchamp-Mustafaga, N. (2022, September 20). Cognitive domain operations: The PLA's new holistic concept for influence operations. The Jamestown Foundation. Retrieved from https://jamestown.org/program/cognitive-domain-operations-the-plas-new-holistic-concept-for-influence-operations/
- NATO Allied Command Transformation (2023). Cognitive Warfare: Strengthening and Defending the Mind. NATO. https://www.act.nato.int/article/cognitive-warfare-strengthening-and-defending-the-mind/
- Nelson, J. (2022). Developing a NATO intermediate force capabilities concept. Connections: The Quarterly Journal, 21(2), 67–84. Retrieved from https://connections-qj.org/system/files/download-count/21.2.05_nelson.pdf
- Brahimi, A. (2019). Ideology and Terrorism. The Oxford Handbook of Terrorism. https://academic.oup.com/edited-volume/28267/chapter-abstract/213423579
- Thurston, A. (2017). Boko Haram: The History of an African Jihadist Movement. Princeton University Press.
- Mireanu, M. (2012). Security, violence and the sacred. Politikon: The IAPSS Journal of Political Science, 18, 89-101. https://doi.org/10.22151/politikon.18.8
- Stern, J., & Berger, J. M. (2015). ISIS: The State of Terror. Ecco, p. 243. https://saisreview.sais.jhu.edu/isis-the-state-of-terror/
- Wood, G. (2015, March). What ISIS really wants. The Atlantic. https://www.theatlantic.com/features/archive/2015/02/what-isis-really-wants/384980/
- Cordesman, A. H. (2019). Iran, Oil, and the Strait of Hormuz. Center for Strategic and International Studies. Retrieved from https://csis-website-prod.s3.amazonaws.com/s3fs-public/legacy_files/files/media/csis/pubs/070326_iranoil_hormuz.pdf
- Freedman, L. (1998). Strategic Coercion: Concepts and Cases. Oxford University Press.
- Lankov, A. (2013). The Real North Korea: Life and Politics in the Failed Stalinist Utopia. Oxford University Press.
- Schelling, T. C. (1966). Arms and Influence. Yale University Press.
- Schelling, T. C. (2017). Brinkmanship hinges on beliefs and expectations. The New Yorker. Retrieved from https://www.newyorker.com/magazine/2017/09/18/the-risk-of-nuclear-war-with-north-korea
- Tingstad, A., Goldfeld, D. A., Menthe, L., Guffey, R. A., Haldeman, Z., Langeland, K., … & Gintautas, B. (2021). Assessing the value of intelligence collected by U.S. Air Force airborne intelligence, surveillance, and reconnaissance platforms. https://doi.org/10.7249/rr2742
- Shatz, H. J., & Horowitz, M. C. (2021). Artificial intelligence and decision-making in military operations. Survival, 63(3), 135–154.
- Czerniakowski, F., Jones, Z., Martinez, D., & Nguyen, L. (2024). Attaining Readiness by Developing a Data-Centric Culture: Lessons Learned from the 4th Infantry Division's Approach to Data-Driven Decision-Making. https://www.armyupress.army.mil/Journals/Military-Review/Online-Exclusive/2024-OLE/Data-Centric-Culture/
- Kim, J. and Seo, D. (2023). Foresight and strategic decision-making framework from artificial intelligence technology development to utilization activities in small-and-medium-sized enterprises. Foresight, 25(6), 769-787. https://doi.org/10.1108/fs-06-2022-0069
- Berger, L. (2016). Conceptualizing al-Qaeda and US grand strategy. Knowing Al-Qaeda, 57–76. https://doi.org/10.4324/9781315591131-4
- Bin Laden, O. (1996). Declaration of War Against the Americans Occupying the Land of the Two Holy Places. Available via GlobalSecurity.org.
- Brahimi, A. (2010). Jihad and Just War in the War on Terror. Oxford University Press.
- Fatimah, S., & Syukur, Y. (2019). Al-Qaeda's new orientation after the death of Osama bin Laden. Jurnal Studi Sosial Dan Politik, 3(2), 130–145. https://doi.org/10.19109/jssp.v3i2.4390
- Lawrence, B. (Ed.). (2005). Messages to the World: The Statements of Osama Bin Laden. Verso Books.
- Jervis, R. (1994). Leadership, post-Cold War politics, and psychology. Political Psychology, 15(4), 769–777 (Jervis quoting Ralph White, p. 771). https://www.jstor.org/stable/3791635
- Waldman, M. (2014). Strategic empathy: The Afghanistan intervention shows why the West must change its approach. Chatham House. Retrieved from https://static.newamerica.org/attachments/4350-strategic-empathy-2/Waldman%20Strategic%20Empathy_2.3caa1c3d706143f1a8cae6a7d2ce70c7.pdf, p.2.
- Sparrow, R. (2016). Ethics as a Source of Law: Autonomous Weapons Systems. Ethics & International Affairs, 30(1), 93–116, p.101.
- Janssen, M., Brous, P., Estévez, E., Barbosa, L. S., & Janowski, T. (2020). Data governance: organizing data for trustworthy artificial intelligence. Government Information Quarterly, 37(3), 101493. https://doi.org/10.1016/j.giq.2020.101493
- U.S. Department of Defense. (2023). Directive 3000.09: Autonomy in Weapon Systems.
- Pareto, J. (2024, September 20). "The great ethical risk of AI is the abdication of human freedom." Fundación "la Caixa" MediaHub. Retrieved from https://mediahub.fundacionlacaixa.org/en/culture-science/science/technology/2024-09-20/julia-pareto-great-ethical-risk-ai-abdication-human-freedom-6235.html
- Dobbs, M. (2008). One Minute to Midnight: Kennedy, Khrushchev, and Castro on the Brink of Nuclear War.
- Geist, E., & Lohn, A. J. (2018). How Might Artificial Intelligence Affect the Risk of Nuclear War? RAND Corporation, Perspective PE-296-RC. Retrieved from https://www.rand.org/pubs/perspectives/PE296.html
- Freedman, L. (2019). The Evolution of Nuclear Strategy (4th ed.). Palgrave Macmillan.
- Prabhakar, A. (2023, March). Her job: Ensuring AI and radical climate fixes don't backfire. E&E News. Retrieved from https://www.eenews.net/articles/her-job-ensuring-ai-and-radical-climate-fixes-dont-backfire
- Reinier, Sgt. 1st Class W. (2020, March 2). AFC, AITF support DOD's ethical principles for AI. Army.mil. Retrieved from https://www.army.mil/article/233286/afc_aitf_support_dods_ethical_principles_for_ai
- Defence Science and Technology Group. (2024, January 10). AI technology rises to the challenge. Australian Department of Defence. Retrieved from https://www.dst.defence.gov.au/news/2024/01/10/ai-technology-rises-challenge
- Erlingsson, E. (2018). A credible transatlantic bond: Trident Juncture and NATO capabilities. NATO Review. https://www.nato.int/docu/review/articles/2018/10/19/a-credible-transatlantic-bond-trident-juncture-and-nato-capabilities/index.html
- Horowitz, M. C. (2018). Artificial Intelligence, International Competition, and the Balance of Power. Texas National Security Review, 1(3). Retrieved from https://tnsr.org/2018/05/artificial-intelligence-international-competition-and-the-balance-of-power/
- North Atlantic Treaty Organization. (2024). NATO's revised Artificial Intelligence (AI) strategy. https://www.nato.int/cps/en/natohq/official_texts_227237.htm
- Army Futures Command Pamphlet 71-20-9, Army Futures Command Concept for Command and Control 2028: Pursuing Decision Dominance (Washington, DC: U.S. Government Publishing Office [GPO], 2021), iii–iv, https://api.army.mil/e2/c/downloads/2021/07/08/fbd7fb76/20210629-afc-pam-71-20-8-cyberspace-and-electromagnetic-warfare-operations-approved.pdf
- UK Ministry of Defence (2022). Ambitious, safe, responsible: Our approach to the delivery of AI-enabled capability in defence. https://www.gov.uk/government/publications/ambitious-safe-responsible-our-approach-to-the-delivery-of-ai-enabled-capability-in-defence/ambitious-safe-responsible-our-approach-to-the-delivery-of-ai-enabled-capability-in-defence
- Regan, M. and Davidovic, J. (2023). Just preparation for war and ai-enabled weapons. Frontiers in Big Data, 6. https://doi.org/10.3389/fdata.2023.1020107
- Ministry of Defence. (2022). Defence Artificial Intelligence Strategy. GOV.UK. Retrieved from https://www.gov.uk/government/publications/defence-artificial-intelligence-strategy
- Lau, H. (2017, November 14). Q&A: Professor expounds on research involving machine consciousness. Daily Bruin. Retrieved from https://dailybruin.com/2017/11/14/qa-professor-expounds-on-research-involving-machine-consciousness