Distributed multi-modal data fusion using graph-based deep learning for situational awareness in intelligent transport systems. 01/11/2021 - 31/10/2025

Abstract

Reliability and accuracy are the two fundamental requirements for intelligent transport systems (ITS). The reliability of active perception algorithms for situational awareness has improved significantly in recent years thanks to advances in AI. Situational awareness can be improved further through the exchange of information between multiple agents, but achieving high accuracy at low computational cost in such a cooperative setting is complex, and doing so is critical to ensuring safe and reliable transport systems. This research tackles the main challenges of shared situational awareness, which requires perception from multiple sensor streams and multiple agents. First, the local sensor fusion problem will be addressed with graph-based deep learning. Local sensor fusion is fusion at the agent level, where multiple mounted sensors are combined to solve a defined task; the proposed solution will construct graph-based deep learning models that exploit the structural information present in the different modalities. Then, distributed fusion will be accomplished by fusing the predictions of multiple agents, producing a richer situational awareness than any single agent can achieve. The advantage of distributed fusion is most evident in situations where a single agent's perception is insufficient. This will be achieved by modelling spatio-temporal graph networks and studying dynamic updates in the graphs. The results will be validated on real-life benchmark datasets and in a simulation engine.
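The two fusion levels described above can be illustrated with a minimal sketch: one round of message passing over a graph of sensor nodes (local fusion), followed by confidence-weighted averaging of per-agent predictions (distributed fusion). All shapes, the random projection weights, and the weighted-averaging rule are illustrative assumptions for this sketch, not the project's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_graph_fusion(node_feats, adj, weight):
    """One round of message passing over an agent's sensor graph.

    node_feats: (n_sensors, d) feature vectors, one per sensor modality.
    adj:        (n_sensors, n_sensors) adjacency matrix (1 = edge).
    weight:     (d, d) projection matrix (random here, learned in practice).
    """
    # Normalise by node degree so neighbour messages are averaged, not summed.
    deg = adj.sum(axis=1, keepdims=True) + 1e-8
    messages = (adj @ node_feats) / deg
    # Combine each node's own features with its aggregated messages.
    fused = np.tanh((node_feats + messages) @ weight)
    # Collapse to a single graph-level embedding by mean pooling.
    return fused.mean(axis=0)

def distributed_fusion(predictions, confidences):
    """Fuse per-agent class probabilities by confidence-weighted averaging."""
    w = np.asarray(confidences, dtype=float)
    w = w / w.sum()
    return (w[:, None] * np.asarray(predictions)).sum(axis=0)

# Local fusion: three sensors on one agent (e.g. camera, lidar, radar),
# 4-dimensional features, fully connected sensor graph.
feats = rng.normal(size=(3, 4))
adj = np.array([[0, 1, 1],
                [1, 0, 1],
                [1, 1, 0]])
W = rng.normal(size=(4, 4)) * 0.1
embedding = local_graph_fusion(feats, adj, W)

# Distributed fusion: two agents each emit a 3-class probability vector.
preds = [np.array([0.7, 0.2, 0.1]), np.array([0.4, 0.5, 0.1])]
fused = distributed_fusion(preds, confidences=[0.9, 0.6])
print(embedding.shape, fused)
```

In a full system the projection weights would be trained end to end and the agent graph would be rebuilt as agents move, which is where the dynamic graph updates mentioned above come in.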

Researcher(s)

Research team(s)

Project type(s)

  • Research Project