TY - GEN
T1 - Unsupervised learning of qualitative motion behaviours by a mobile robot
AU - Duckworth, Paul
AU - Gatsoulis, Yiannis
AU - Jovan, Ferdian
AU - Hawes, Nick
AU - Hogg, David
AU - Cohn, Anthony
N1 - Copyright © 2016, International Foundation for Autonomous Agents and Multiagent Systems (www.ifaamas.org). All rights reserved.
PY - 2016
Y1 - 2016
AB - The success of mobile robots in daily living environments depends on their capability to understand human movements and interact in a safe manner. This paper presents a novel unsupervised qualitative-relational framework for learning human motion patterns using a single mobile robot platform. It is capable of learning human motion patterns in real-world environments in order to predict future behaviours. This previously untackled task is challenging because of the limited field of view provided by a single mobile robot, which can observe only one location at any time, resulting in incomplete and partial human detections and trajectories. Central to the success of the presented framework is mapping the detections into an abstract qualitative space and then characterising motion invariant to exact metric position. This framework was used by a physical robot autonomously patrolling a company's office during a six-week deployment. Experimental results from this deployment are discussed and demonstrate the effectiveness and applicability of the system.
UR - https://ora.ox.ac.uk/objects/uuid:2c813b1f-0d59-4c96-b122-856a8cd6e652
M3 - Published conference contribution
BT - Proceedings of the 15th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2016)
PB - International Foundation for Autonomous Agents and Multiagent Systems (IFAAMAS)
ER -