
Automation in military operations

Advances in robotics and digital technologies, such as artificial intelligence (AI), are enabling greater levels of automation across many sectors, including defence. The UK Government expects automation to be crucial to maintaining military advantage in the future. In June 2022, the Ministry of Defence published its Defence AI Strategy, which sets out how it plans to adopt and exploit AI; automation was cited as a key application. This POSTnote discusses current and future applications of automation and AI, their impact on militaries and global stability, and the challenges around their development and implementation.


Automation refers to the use of systems to perform tasks that would ordinarily involve human input. Automation and autonomy are often viewed as lying on a spectrum defined by the level of human supervision a system requires. This can range from manually controlled systems to those that independently make decisions about how to achieve certain human-set goals. Although automation has been used in various forms for decades, it has attracted growing military interest as technology has improved. Many military systems can feature automation, including robotic systems that carry out physical tasks, and entirely software-based systems for tasks such as data analysis.

The Government has recognised the military advantages of autonomous systems and artificial intelligence (AI), which can include increased efficiency and reduced risk to personnel. In its 2021 Integrated Review and 2020 Integrated Operating Concept, the Government stated its commitment to embracing new and emerging technologies, including autonomous systems and AI. It has also established a Defence AI Centre (DAIC) to coordinate the UK’s development of AI-enabled technologies for defence, including via collaborations with academia and industry. Globally, the UK, US, China, and Israel have some of the most advanced autonomous and AI-based military capabilities. 

While the use of automated technology in the military offers a wide range of potential benefits, it also poses challenges, including those relating to processing data, cybersecurity, and communication between systems. AI-enabled autonomous systems require suitable training data, which may be challenging to prepare. In addition, AI and automation create testing and assurance challenges that differ from those of traditional software systems. Automated and autonomous systems also raise ethical concerns, particularly in the context of autonomous weapons systems, which are the subject of extensive international debate.

Key Points: 

  • Automated functionality already exists in a range of physical and digital military systems. For example, automated data analysis software is used to improve and accelerate decision-making, and ‘uncrewed’ vehicles (such as aerial drones) can autonomously search for, identify, and track targets.
  • Weapons systems featuring automation have also been developed for defensive and offensive applications. These include guided missiles and defence systems which can automatically fire at incoming missiles, vehicles, or soldiers.  
  • Although the technical capability exists, uncrewed offensive weapons are generally not used to make firing decisions without human authorisation. Reported exceptions are rare and raise ethical concerns.
  • Autonomous systems are expected to play a supporting role for military personnel, relieving them of dangerous or repetitive tasks. This is likely to impact the nature of some roles and the skills required: for example, there may be increased demand for developers and operators of autonomous systems, requiring a greater level of technical knowledge.
  • Some experts predict that automated systems and AI will reduce costs in the long term through increased efficiency and reduced demand for personnel. However, boosting expertise in automation and AI may involve recruitment from other industries that typically offer higher salaries, and militaries may have to raise salaries to compete. 
  • There are concerns that increasing use of autonomy in weapons systems may lead to a greater risk of conflict escalation. Removing humans from the battlefield may make militaries less hesitant to use force. Unintended behaviour of automated systems might also lead to escalation.
  • There are also concerns about automated and AI-based technology becoming more accessible to non-state actors, making their attacks more effective. 
  • Key technical challenges for developing autonomous systems include limitations on processing power, the amount of data transfer possible in the field, and cyber security. The quality and accessibility of data required to train machine learning systems can also be a limiting factor, and ensuring appropriate data privacy is a related concern.
  • AI and automation create testing and assurance challenges that differ from those of traditional software systems. Experts have highlighted a lack of fit-for-purpose tools and processes for testing and are developing new tools and guidelines.
  • There is currently no legislation specific to the use of automation or AI for military applications. Although their use in warfare is governed by existing International Humanitarian Law, how this law relates to new technologies is debated.
  • There is specific international debate around the use of ‘lethal autonomous weapons systems’ (LAWS). This term has no universally agreed definition and is used to refer to a wide range of weapons with different autonomous capabilities.
  • The UN Convention on Certain Conventional Weapons (CCW) has discussed possible regulation of LAWS since 2014. While most nations represented at the CCW support new regulation of LAWS, others, including the UK, US, and Russia, have argued that existing International Humanitarian Law is adequate.
  • Many stakeholders believe that some form of human control over weapons and targeting systems must be maintained for their use to be legally and ethically acceptable. There is debate around what this means in practice.

Acknowledgements

POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including:

  • Prof Hendrik Huelss, University of Southern Denmark* 
  • Prof Ingvild Bode, University of Southern Denmark* 
  • Dr Mohammad Divband Soorati, University of Southampton* 
  • Prof Alvin Wilby, University of Southampton* 
  • Courtney Bowman, Palantir* 
  • Philip Morris, Palantir* 
  • Dr Alexander Blanchard, Alan Turing Institute* 
  • Ben Kelly, Centre for Data Ethics and Innovation* 
  • Ben Pritchard, Thales, University of Southampton* 
  • Dr Yoge Patel, Blue Bear Systems* 
  • Rob Solly, Improbable* 
  • Prof Nick Colosimo, BAE Systems*
  • Prof Noel Sharkey, University of Sheffield*
  • Prof Kenneth Payne, King's College London*
  • Prof Sarvapali D. (Gopal) Ramchurn, University of Southampton* 
  • Dr Stuart Middleton, University of Southampton* 
  • Eur Ing. Raj Takhar, MSc, CEng, PhD, Assent*
  • Dr Catherine Connolly, Campaign to Stop Killer Robots* 
  • Elizabeth Minor, Article 36* 
  • Richard Moyes, Article 36* 
  • Ariel Conn, IEEE Standards Association* 
  • Prof Christian Enemark, University of Southampton* 
  • Eleanor Scarnell, House of Commons Select Committee Team* 
  • Ministry of Defence
  • David McNeish, Centre for Data Ethics and Innovation
  • Eleanor Watson, IEEE Standards Association 
  • Dr/Prof Adam Svendsen, Bridgehead Institute (Research & Consulting) 

*denotes people and organisations who acted as external reviewers of the briefing.


Channel website: https://www.parliament.uk/post

Original article link: https://post.parliament.uk/research-briefings/post-pn-0681/

The Parliamentary Office of Science and Technology (POST) is Parliament’s in-house source of scientific advice.

 
