AI systems will be redesigned to value people as more than passive providers of data through a prestigious new Turing Artificial Intelligence Acceleration Fellowship at the University of Southampton.
The novel research, led by Electronics and Computer Science's Dr Sebastian Stein, will create AI systems that are aware of citizens' preferences and act to maximise the benefit to society.
In these systems, citizens are supported by trusted personal software agents that learn an individual's preferences. Importantly, rather than share this data with a centralised system, the AI agents keep it safe on private smart devices and only use it in their owners' interests.
Over the next five years, the £1.4m fellowship will develop and trial citizen-centric AI systems in a range of application areas, such as smart home energy management, on-demand mobility and disaster response, including the provision of advice and medical support during epidemics like COVID-19.
Dr Stein, of the Agents, Interaction and Complexity (AIC) research group, says: "AI systems are increasingly used to support and often automate decision-making on an unprecedented scale. Such AI systems can draw on a vast range of data sources to make fast, efficient, data-driven decisions to address important societal challenges and potentially benefit millions of people.
"However, building AI systems on such a large and pervasive scale raises a range of important challenges. First, these systems may need access to relevant information from people, such as health-related data, which raises privacy issues and may also encourage people to misrepresent their requirements for personal benefit. Furthermore, the systems must be trusted to act in a manner that aligns with societys ethical values. This includes the minimisation of discrimination and the need to make equitable decisions.
"Novel approaches are needed to build AI systems that are trusted by citizens, that are inclusive and that achieve their goals effectively. To enable this, citizens must be viewed as first-class agents at the centre of AI systems, rather than as passive data sources."
The new vision for AI systems will be achieved by developing techniques that learn the preferences, needs and constraints of individuals to provide personalised services, incentivise socially beneficial behaviour changes, make choices that are fair, inclusive and equitable, and provide explanations for these decisions.
The Southampton team will draw upon a unique combination of research in multi-agent systems, mechanism design, human-agent interaction and responsible AI.
Dr Stein will work with a range of high-profile stakeholders over the duration of the fellowship. This will include citizen end-users, to ensure the research aligns with their needs and values, as well as industrial partners, to put the research into practice.
Specifically, collaboration with EA Technology and Energy Systems Catapult will generate incentive-aware smart charging mechanisms for electric vehicles. Meanwhile, work with partners including Siemens Mobility, Thales and Connected Places Catapult will develop new approaches for trusted on-demand mobility. Within the Southampton region, the fellowship will engage with the Fawley Waterside development to work on citizen-centric solutions to smart energy and transportation.
The team will also work with Dstl to create disaster response applications that use crowdsourced intelligence from citizens to provide situational awareness, track the spread of infectious diseases or issue guidance to citizens. Further studies with Dstl and Thales will explore applications in national security and policing, and joint work with UTU Technologies will investigate how citizens can share their preferences and recommendations with trusted peers while retaining control over what data is shared and with whom.
Finally, with IBM Research, Dr Stein will develop new explainability and fairness techniques for citizen-centric AI systems.