Perception uncertainty quantification for Integrity-Aware SLAM (M/F)
- FTC PhD student / Offer for thesis
- 36 months
- Doctorate
Offer at a glance
The Unit
Heuristique et Diagnostic des Systèmes Complexes
Contract Type
FTC PhD student / Offer for thesis
Working Hours
Full Time
Workplace
60203 COMPIEGNE
Contract Duration
36 months
Date of Hire
01/10/2026
Remuneration
2300 € gross monthly
Application Deadline: 03 June 2026, 23:59
Job Description
Thesis Subject
Simultaneous Localization and Mapping (SLAM) is a highly dynamic research field that has seen fast progress in recent years. The objective of this thesis is to contribute to the development of consistent SLAM methods, with a focus on quantifying the uncertainty associated with perception sensors. Specifically, this research targets 3D Lidar measurements to ensure precise and consistent localization in complex environments where GNSS signals are unavailable or severely degraded.
In state estimation problems, measurement noise parameters are typically tuned manually based on human expertise. However, the resulting position uncertainties depend heavily on this choice, which can compromise the reliability and safety of autonomous systems. Estimating these uncertainties is challenging but remains crucial for obtaining reliable localization and usable long-term maps. Recently, measurement uncertainty quantification has been the subject of several deep-learning-based research works [1], [2].
Unlike existing approaches that focus on segmentation, object detection, or uncertainty quantification for visual odometry based on point clouds, our goal is to obtain an uncertainty estimate for the observation derived from the entire point cloud processing pipeline, enabling precise and consistent state estimation. In this thesis, we will study the propagation of uncertainties from the Lidar point cloud and the segmentation stage through to the extracted features, ultimately leading to an observation-level covariance estimation. This analysis can be conducted either via an end-to-end approach or by focusing on the final element of the processing chain. Furthermore, we aim to obtain an uncertainty estimate that adapts to noise variations in real time. Currently, we are leaning towards deep-learning-based approaches, with the possibility of leveraging Bayesian optimization [3].
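As a minimal illustration of what "observation-level covariance estimation" means here (a sketch, not part of the offer — the feature extractor and all names are hypothetical), per-point Lidar noise can be propagated through a feature-extraction function by Monte Carlo sampling, yielding a covariance for the extracted observation:

```python
import numpy as np

def extract_feature(points):
    # Hypothetical feature: the centroid of a segmented patch of points.
    # In practice this would be the output of the segmentation/feature pipeline.
    return points.mean(axis=0)

def observation_covariance_mc(points, sigma, n_samples=500, seed=None):
    """Propagate i.i.d. per-point Lidar noise (std `sigma`, metres)
    through the feature extractor via Monte Carlo sampling, and return
    the resulting observation-level covariance matrix."""
    rng = np.random.default_rng(seed)
    feats = np.stack([
        extract_feature(points + rng.normal(0.0, sigma, points.shape))
        for _ in range(n_samples)
    ])
    return np.cov(feats, rowvar=False)

# Usage: 100 points on a patch, 2 cm range noise.
pts = np.random.default_rng(0).uniform(0.0, 1.0, size=(100, 3))
R = observation_covariance_mc(pts, sigma=0.02, seed=1)
```

An end-to-end or learned approach would replace the sampling loop with a network predicting `R` directly; the Monte Carlo version serves as a reference against which learned covariances can be checked for consistency.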
To evaluate the quality of the uncertainty quantification, a data-fusion-based state estimation will be performed. The thesis can build upon expertise already acquired in our previous work, considering approaches based on coupling Bayesian and set-membership methods [4], as well as methods based on factor graphs.
Moreover, uncertainty quantification can benefit from measurement redundancy. Such redundancy occurs naturally in multi-robot systems, where each robot improves state estimation through shared information and measurements [5]. Another feasible approach involves analytical redundancy. The thesis can also exploit the paradigm of ensemble methods, which combine multiple networks to achieve better uncertainty quantification [6].
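The ensemble idea referenced above [6] can be sketched in a few lines (a toy illustration with bootstrap-resampled least-squares members standing in for neural networks; all names are illustrative): disagreement between members is used as an estimate of epistemic uncertainty.

```python
import numpy as np

def fit_member(X, y):
    # Least-squares regressor: a stand-in for one network of the ensemble.
    A = np.c_[X, np.ones(len(X))]
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def ensemble_predict(members, X):
    # Mean prediction and member disagreement (epistemic variance).
    A = np.c_[X, np.ones(len(X))]
    preds = np.stack([A @ w for w in members])  # shape (M, N)
    return preds.mean(axis=0), preds.var(axis=0)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + rng.normal(0.0, 0.1, 200)

# Each member is trained on a bootstrap resample of the data,
# so the members disagree most where the data constrain them least.
members = []
for _ in range(10):
    idx = rng.integers(0, len(X), len(X))
    members.append(fit_member(X[idx], y[idx]))

mu, var = ensemble_predict(members, X)
```

Deep ensembles follow the same recipe with independently trained networks in place of the linear members.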
The thesis will use existing datasets as well as data to be acquired with the vehicles of the Heudiasyc laboratory. These vehicles are equipped with various sensors: GNSS (Global Navigation Satellite System) receivers, odometry systems, Lidar, and cameras. High-definition maps are also available.
[1] A. De Maio and S. Lacroix, “Deep Bayesian ICP covariance estimation,” in 2022 International Conference on Robotics and Automation (ICRA), IEEE, 2022, pp. 6519–6525.
[2] M. Dolatabadi, F. Ayar, E. Javanmardi, M. Tsukada, and M. Javanmardi, “Towards Robust LiDAR Localization: Deep Learning-based Uncertainty Estimation,” Sep. 23, 2025, arXiv:2509.18954.
[3] M. Salhi and J. Al Hage, “Zonotopic and gaussian information filter for high integrity localization,” in 2024 27th International Conference on Information Fusion (FUSION), IEEE, 2024, pp. 1–8.
[4] M. Salhi and J. Al Hage, “Zonotopic and gaussian information filter for high integrity localization,” in 2024 27th International Conference on Information Fusion (FUSION), IEEE, 2024, pp. 1–8.
[5] M. Escourrou, J. Al Hage, and P. Bonnifait, “Fault tolerant decentralized collaboration for simultaneous localization and prior map update with stable 2D features,” Robotics and Autonomous Systems, p. 105402, 2026.
[6] J. Gawlikowski et al., “A survey of uncertainty in deep neural networks,” Artif Intell Rev, vol. 56, no. S1, pp. 1513–1589, Oct. 2023.
Skills: Engineering degree and Master's in robotics, computer science, and/or control systems; strong programming skills (Python, C++); experience with experimentation and embedded algorithms; good knowledge of machine learning for vision/LiDAR applications.
Your Work Environment
Heudiasyc, UMR CNRS 7253, Université de Technologie de Compiègne (UTC), with research stays at LAAS-CNRS, Toulouse.
Compensation and benefits
Compensation
2300 € gross monthly
Annual leave and RTT
44 days
Remote Working practice and compensation
Remote working is practiced and compensated
Transport
75% coverage of transport costs and a sustainable mobility allowance of up to €300
About the offer
| Offer reference | UMR7253-JOEALH-017 |
|---|---|
| CN Section(s) / Research Area | Information sciences: processing, integrated hardware-software systems, robots, commands, images, content, interactions, signals and languages |
About the CNRS
The CNRS is a major player in fundamental research worldwide and the only French organization active in all scientific fields. Its unique multidisciplinary position allows it to bring together different disciplines to address the most important challenges of the contemporary world, in connection with the actors of change.