Robust Neuro-Symbolic Goal and Plan Recognition

Leonardo Rosa Amado, Ramon Fraga Pereira, Felipe Meneguzzi

Research output: Contribution to conference › Unpublished paper › peer-review

3 Citations (Scopus)

Abstract

Goal Recognition is the task of discerning the intended goal an agent aims to achieve given a sequence of observations, whereas Plan Recognition consists of identifying the plan to achieve such a goal. Regardless of the underlying techniques, most recognition approaches are directly affected by the quality of the available observations. In this paper, we develop neuro-symbolic recognition approaches that combine learning and planning techniques, compensating for noise and missing observations using prior data. We evaluate our approaches in standard human-designed planning domains as well as in domain models automatically learned from real-world data. Empirical experimentation shows that our approaches reliably infer goals and compute correct plans in the experimental datasets. An ablation study shows that we outperform existing approaches that rely exclusively on the domain model, or exclusively on machine learning, in problems with both noisy observations and low observability.
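
To make the recognition task defined above concrete, the following is a minimal illustrative sketch, not the paper's neuro-symbolic method: in a toy grid world, candidate goals are ranked by how much each observed move reduces the distance to them, loosely in the spirit of cost-difference approaches to goal recognition. The domain, function names, and scoring rule are all assumptions made purely for illustration.

```python
# Toy goal-recognition sketch (NOT the paper's neuro-symbolic approach):
# score each candidate goal by how much the observed moves progress
# toward it, measured with Manhattan distance on a grid.

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def score_goals(start, observations, candidate_goals):
    """Rank candidate goals by agreement with a (possibly partial) observation trace.

    `observations` is a sequence of visited cells; a goal scores higher
    when the observed trajectory tends to move toward it.
    """
    scores = {}
    for goal in candidate_goals:
        progress = 0
        prev = start
        for cell in observations:
            # Each observed step that gets closer to `goal` counts as evidence.
            progress += manhattan(prev, goal) - manhattan(cell, goal)
            prev = cell
        scores[goal] = progress
    return sorted(scores.items(), key=lambda kv: -kv[1])

if __name__ == "__main__":
    start = (0, 0)
    goals = [(4, 0), (0, 4), (4, 4)]
    observed = [(1, 0), (2, 0), (2, 1)]  # partial observation sequence
    for goal, score in score_goals(start, observed, goals):
        print(goal, score)
```

Running this ranks (4, 4) highest, since the partial trace drifts toward it. Missing or noisy observations simply contribute weaker or misleading evidence under such a purely symbolic scheme, which is the kind of degradation the paper's approaches compensate for with learned prior data.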
Original language: English
Publication status: Accepted/In press - 2 Dec 2022
Event: AAAI-23: The 37th AAAI Conference on Artificial Intelligence - Walter E. Washington Convention Center, Washington, United States
Duration: 7 Feb 2023 - 14 Feb 2023
Conference number: 37th
https://aaai.org/Conferences/AAAI-23/

Conference

Conference: AAAI-23
Country/Territory: United States
City: Washington
Period: 7/02/23 - 14/02/23
Internet address: https://aaai.org/Conferences/AAAI-23/

Bibliographical note

Acknowledgements
Felipe Meneguzzi acknowledges support from the CNPq with project 302773/2019-3 (PQ Fellowship). Ramon Fraga Pereira acknowledges support from the ERC Advanced Grant WhiteMech (No. 834228) and the EU ICT-48 2020 project TAILOR (No. 952215).
