Edinburgh to lead £3.2 million project into the Governance and Regulation of Decision-making Machines

[2020] Researchers across the University and beyond have come together to ensure the trustworthiness of systems that put machines in charge of decisions, from virtual assistants like Alexa to aircraft autopilots.

This project is part of the UKRI Trustworthy Autonomous Systems (TAS) programme, funded through the UKRI Strategic Priorities Fund and delivered by the Engineering and Physical Sciences Research Council (EPSRC). The TAS programme brings together research communities and key stakeholders to drive forward cross-disciplinary, fundamental research to ensure that autonomous systems are safe, reliable, resilient, ethical and trusted.

UKRI Trustworthy Autonomous Systems Programme

The TAS programme is a collaborative UK-based platform comprising Research Nodes and a Hub, united by the purpose of developing world-leading best practice for the design, regulation and operation of autonomous systems. The central aim of the programme is to ensure that autonomous systems are ‘socially beneficial’, protect people’s personal freedoms and safeguard physical and mental wellbeing.

The project will address public concern and potential risks associated with autonomous systems by making sure they are both trustworthy by design and trusted by those who use them, from individuals to society and government. Only by addressing these concerns and risks will autonomous systems earn the trust needed for their use to grow.

TAS comprises seven distinct research strands, termed Nodes: trust, responsibility, resilience, security, functionality, verifiability, and governance and regulation. Each Node will receive just over £3 million in funding from UKRI to conduct its research. The Node led by Professor Ramamoorthy is tasked with exploring governance and regulation.

TAS Governance and Regulation Research Node

Led by Professor Subramanian Ramamoorthy from the School of Informatics and Edinburgh Centre for Robotics, the team are tasked with developing the governance and regulation of Trustworthy Autonomous Systems (TAS). By developing a novel framework for the certification, assurance and legality of TAS, the project will address whether such systems can be used safely.

The core motivation for this programme is that autonomous systems are becoming increasingly prevalent in society, including software decision-making systems in finance and medical diagnostics, conversational agents like Alexa, and physical systems ranging from home robots to self-driving cars and planes. Basically, Trustworthy Autonomous Systems covers the full range of the application of AI in real systems in our lives. Our specific project is on the governance and regulation of such systems.

Professor Subramanian Ramamoorthy
Principal Investigator of the TAS Governance and Regulation Research Node

Current governance and regulation of automated systems is still based on the idea that a person is always in charge of the system and can step in if need be. For example, most planes have an autopilot feature that facilitates limited autonomy in flight, but much of the regulatory framework depends on the pilot being there to anticipate all eventualities and take control at any time. As autonomous systems become more capable and incorporate increasingly automated, typically AI-based, decision-making mechanisms, the existing frameworks for certification and assurance become strained: issues of liability, accountability and responsibility become harder to pin down.

These issues are even more salient when new forms of specification must be considered. For instance, as AI-based decisions are deployed in healthcare, for automated diagnostics or for socially assistive care systems, we must ask whether the resulting decisions embody values of fairness, lack of bias and justice. Devising regulatory frameworks that can address these issues will require an entirely new interdisciplinary perspective, weaving together technical considerations with legal, social and ethical ones. The TAS Governance and Regulation research node aims to create new and improved methods for governing autonomous systems that reflect these emerging use cases.

The project will establish a new software engineering framework to support TAS governance and trial it with external stakeholders in areas including mobile autonomous systems and health and social care. Newly developed computational tools for regulators and developers will complement the new methods of governance. In particular, this will include a deeper understanding, from multiple disciplinary perspectives, of how and why autonomous systems fail. The team also aim to improve understanding of the iterative nature of the design processes associated with such technologies, and to recommend ways to better govern those processes.

A Collaborative and Multidisciplinary Project

The project is a collaboration with academics from across the University of Edinburgh as well as King’s College London, the University of Nottingham, Heriot-Watt University, University of Glasgow and the University of Sussex. The researchers will also work closely with a diverse group of external project partners, including Microsoft, BAE Systems, NASA Ames, Thales, NVidia, Optos, Digital Health & Care Institute (DHI), Legal and General’s Advanced Care Research Centre (ACRC), Civil Aviation Authority (CAA) Innovation Hub, NPL, DSTL, DRisQ, Imandra, Adelard Partners, Altran UK, MSC Software, Ethical Intelligence, Craft Prospect Ltd, Vector Four Ltd and SICSA.

The Node takes a multidisciplinary approach to its work, bringing together researchers with backgrounds in Computer Science and AI, Law, AI ethics, Social Studies of Information Technology, and Design Ethnography. The diverse team offers a uniquely holistic perspective that combines technical, social science and humanities research to help ensure that autonomous systems can be trusted and integrated into society with confidence.

Related Links

Trustworthy Autonomous Systems Hub

UKRI Engineering and Physical Sciences Research Council (EPSRC)

Subramanian Ramamoorthy