Published March 9, 2026 | Version v1 | Preprint | Open
Robust Sensor Fusion for Autonomous Navigation in Dynamically Changing Environments
Description
Autonomous navigation in complex, real-world environments presents significant challenges due to sensor noise, occlusions, and dynamically changing conditions. This paper proposes a novel sensor fusion framework that combines data from multiple sensors, including LiDAR, cameras, and inertial measurement units (IMUs), using a Kalman filter-based approach augmented with deep learning-based anomaly detection. The framework prioritizes robustness by identifying and mitigating the impact of unreliable sensor data, enabling more reliable and accurate state estimation for autonomous robots operating in challenging scenarios. Experimental results demonstrate the effectiveness of the proposed approach in improving navigation performance compared to traditional sensor fusion techniques.
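To illustrate the core idea described above, here is a minimal sketch of a Kalman-filter measurement update that rejects unreliable sensor readings before they can corrupt the state estimate. This is not the paper's implementation: the deep-learning anomaly detector is stood in for by a simple Mahalanobis residual gate, and all dimensions, matrices, and the `gate` threshold are illustrative assumptions.

```python
# Hypothetical sketch of the abstract's fusion idea: a standard Kalman
# measurement update in which readings flagged as anomalous (here via a
# simple residual gate standing in for the paper's learned detector)
# are skipped so they cannot corrupt the state estimate.
import numpy as np

def kalman_update(x, P, z, H, R, gate=9.0):
    """One measurement update; rejects z if its squared Mahalanobis
    distance to the predicted measurement exceeds `gate`."""
    y = z - H @ x                            # innovation (residual)
    S = H @ P @ H.T + R                      # innovation covariance
    d2 = float(y.T @ np.linalg.inv(S) @ y)   # squared Mahalanobis distance
    if d2 > gate:                            # anomaly: ignore this reading
        return x, P, False
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new, True

# Toy 1-D position state; two readings: one plausible, one outlier.
x = np.array([[0.0]]); P = np.array([[1.0]])
H = np.array([[1.0]]); R = np.array([[0.5]])
x, P, used1 = kalman_update(x, P, np.array([[0.3]]), H, R)   # accepted
x, P, used2 = kalman_update(x, P, np.array([[50.0]]), H, R)  # gated out
print(used1, used2, float(x[0, 0]))
```

The outlier reading leaves the state untouched, which is the robustness property the abstract claims; in the paper's framework the gating decision would instead come from the learned anomaly detector.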
Files
| Name | Size | Checksum |
|---|---|---|
| preprint_liam_o'connor_20260309_005008.pdf | 6.4 kB | md5:095e4ade327d763d43c25645a37026cb |
Additional details
Related works
- Cites — Journal article: https://mattiainml.com/blog/designing-ml-systems-that-capture-real-world-signals/
References
- Mattia Gaggi. Robust Sensor Fusion for Autonomous Navigation in Dynamically Changing Environments. mattiainml.com. https://mattiainml.com/blog/designing-ml-systems-that-capture-real-world-signals/