Dr Sarah Logan¹
¹Australian National University, Canberra, Australia
Biography:
Dr Sarah Logan is a Senior Lecturer in the Department of International Relations in the Coral Bell School of Asia Pacific Affairs at the Australian National University. Prior to joining the department in 2019, Sarah was a postdoctoral research fellow in the Faculty of Law at the University of New South Wales. Sarah's primary research focus is the impact of technology, especially the internet, on international relations. She is interested in how technology interacts with traditional understandings of statehood, power, and agency. Sarah's previous research project concerned the history of counter-extremism policy in the US and the UK.
Abstract:
In Western democracies, the decision to go to war is made in ways designed to ensure that decision-makers can be held accountable. In particular, bureaucracies rely on the production of a range of documents, such as records of meetings, to support accountability. Inserting AI into the decision-making process therefore requires finding ways to hold AI accountable for decisions. But problems of accountability arise in this context because AI does not produce the kinds of documents associated with bureaucratic accountability: it is this gap in documentary capacity that lies at the core of the search for accountable AI, including in the context of the decision to go to war. This paper argues that the search for accountable AI is essentially an attempt to solve problems of epistemic uncertainty, but that accountability can be achieved without resolving those problems. Drawing on the example of new forms of evidence in the International Criminal Tribunal for the former Yugoslavia (ICTY), the paper shows that accountability can be apportioned without absolute epistemic certainty and without documentation in the sense commonly associated with accountability in a bureaucratic context. It analyses the ICTY's approach to new types of forensic evidence and its first use of evidence derived from satellite imagery, and draws on this analysis to offer three lessons for addressing epistemic uncertainty and facilitating accountability in the use of AI in the decision to go to war.