Multi-Agent Intention Recognition and Progression

Michael Dann, Yuan Yao, Natasha Alechina, Brian Logan, Felipe Meneguzzi, John Thangarajah

Research output: Chapter in Book/Report/Conference proceeding › Published conference contribution


Abstract

For an agent in a multi-agent environment, it is often beneficial to be able to predict what other agents will do next when deciding how to act. Previous work in multi-agent intention scheduling assumes a priori knowledge of the current goals of other agents. In this paper, we present a new approach to multi-agent intention scheduling in which an agent uses online goal recognition to identify the goals currently being pursued by other agents while acting in pursuit of its own goals. We show how online goal recognition can be incorporated into an MCTS-based intention scheduler, and evaluate our approach in a range of scenarios. The results demonstrate that our approach can rapidly recognise the goals of other agents even when they are pursuing multiple goals concurrently, and has similar performance to agents that know the goals of other agents a priori.
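The proceedings paper itself gives the details of the recognition and scheduling machinery. Purely as an illustrative sketch, and not the authors' implementation, the fragment below shows one standard way online goal recognition can be set up: a posterior over another agent's candidate goals is updated from its observed actions under a Boltzmann-rationality likelihood, and the resulting distribution is what an MCTS-based intention scheduler could sample from when simulating that agent. The one-dimensional world, the goal names, and the rationality parameter are all assumptions made for the example.

```python
import math

# Hypothetical 1-D world: the observed agent moves -1 or +1 and may be
# heading for any of several candidate goal positions.
CANDIDATE_GOALS = {"g_left": 0, "g_right": 9, "g_mid": 5}
BETA = 2.0  # assumed rationality parameter for the Boltzmann likelihood


def action_likelihood(pos, action, goal_pos, beta=BETA):
    """P(action | goal): actions that reduce the distance to the goal are
    exponentially more likely, assuming approximately rational behaviour."""
    scores = {}
    for a in (-1, +1):
        progress = abs(pos - goal_pos) - abs(pos + a - goal_pos)
        scores[a] = math.exp(beta * progress)
    return scores[action] / sum(scores.values())


def update_posterior(posterior, pos, action):
    """One step of online goal recognition: a Bayesian update of the
    distribution over candidate goals after observing a single action."""
    new = {g: p * action_likelihood(pos, action, CANDIDATE_GOALS[g])
           for g, p in posterior.items()}
    z = sum(new.values())
    return {g: p / z for g, p in new.items()}


if __name__ == "__main__":
    # Uniform prior over goals, then observe the other agent moving right twice.
    posterior = {g: 1.0 / len(CANDIDATE_GOALS) for g in CANDIDATE_GOALS}
    pos = 4
    for action in (+1, +1):
        posterior = update_posterior(posterior, pos, action)
        pos += action
    print(posterior)  # mass shifts toward "g_right" and away from "g_left"
    # An MCTS-based intention scheduler could sample the other agent's goal
    # from this posterior at the start of each rollout when simulating it.
```

Sampling a goal per rollout is one simple way to couple the two components: the scheduler's search procedure is left unchanged, while the recognition uncertainty is reflected in how the other agent is simulated.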
Original language: English
Title of host publication: Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence
Subtitle of host publication: Main Track
Publisher: IJCAI
Pages: 91-99
Number of pages: 9
ISBN (Electronic): 978-1-956792-03-4
DOIs
Publication status: Published - 19 Aug 2023
Event: IJCAI 2023: 32nd International Joint Conference on Artificial Intelligence - Macao, China
Duration: 19 Aug 2023 - 25 Aug 2023
Conference number: 32
https://ijcai-23.org/

Conference

Conference: IJCAI 2023
Country/Territory: China
City: Macao
Period: 19/08/23 - 25/08/23
Internet address: https://ijcai-23.org/

Bibliographical note

Acknowledgements
For the purpose of open access, the authors have applied a Creative Commons Attribution (CC BY) licence to any Author Accepted Manuscript version arising from this submission.
