
Post-Doc: embedded perception system for aircraft navigation assistance

General information

Reference : UPR8001-PATDAN-001
Workplace : TOULOUSE
Date of publication : Monday, January 13, 2020
Type of Contract : FTC Scientist
Contract Period : 24 months
Expected date of employment : 2 March 2020
Proportion of work : Full time
Remuneration : Gross monthly salary: 2500€ to 3000€, depending on experience
Desired level of education : 5-year university degree
Experience required : 1 to 4 years

Missions

Postdoctoral position: architecture of an embedded system for aircraft navigation in airport areas

The postdoc will take part in the design of a perception system implemented on a heterogeneous architecture (multi-core CPU and GPU), selected from algorithmic specifications in order to provide good performance under critical real-time constraints. An SME partner in the consortium will develop a physical prototype of this architecture, to be validated with HIL simulations.

The postdoc will develop algorithms to process sensor data: algorithms required to characterize and model the sensors to be embedded on the aircraft, and to provide new navigation and safety functions. These algorithms will have two objectives:
1. estimate the intrinsic capabilities of each sensor, for example the maximum obstacle-detection range as a function of the object class (pedestrian, car, bus, truck, other aircraft, etc.), of the position of the sensor on the aircraft, and of the visibility conditions;
2. measure the performance of sensor/algorithm pairs, with functions optimized for the selected embedded architecture in order to satisfy real-time constraints. These functions will address obstacle detection, visual odometry, line tracking, etc., using geometric or machine-learning methods (a minimal timing sketch is given after this list).
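
As an illustration of objective 2, here is a minimal sketch of timing one sensor/algorithm pair, a recorded camera frame paired with the stock OpenCV HOG pedestrian detector, against a per-frame latency budget. The file name and the detector choice are illustrative assumptions, not the project's actual algorithms or embedded target.

// Minimal sketch: timing one sensor/algorithm pair (camera + HOG pedestrian
// detector) on one recorded frame. The file name is illustrative only.
#include <chrono>
#include <iostream>
#include <vector>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/objdetect.hpp>

int main() {
    cv::HOGDescriptor hog;
    hog.setSVMDetector(cv::HOGDescriptor::getDefaultPeopleDetector());

    // Placeholder for one image of a recorded sensor sequence.
    cv::Mat frame = cv::imread("frame_0001.png");
    if (frame.empty()) { std::cerr << "missing frame\n"; return 1; }

    std::vector<cv::Rect> obstacles;
    auto t0 = std::chrono::steady_clock::now();
    hog.detectMultiScale(frame, obstacles);   // obstacle (pedestrian) detection
    auto t1 = std::chrono::steady_clock::now();

    double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
    std::cout << obstacles.size() << " detections in " << ms << " ms\n";
    // A real evaluation would loop over whole sequences and sensor
    // configurations and compare the latency with the real-time budget.
    return 0;
}

In practice such measurements would be repeated per sensor configuration and per candidate architecture, since the same algorithm can meet or miss the budget depending on where it runs (CPU cores vs. GPU).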

The postdoc will work with the permanent researchers of the RAP team involved in this project and will take part in its technical management (documentation, deliverables, interaction with the other partners, etc.). He/she will also take part in (1) the project meetings, formally presenting the team's results to the scientific reviewers appointed by the DGAC, and (2) scientific dissemination at international conferences or workshops.

He/she will cooperate with a PhD student funded by the project, who designs and develops new multi-spectral vision algorithms (image filtering, dehazing, fusion, object detection, etc.); the postdoc will be in charge of algorithms based on RADAR or LIDAR data. He/she will take part in the algorithm selection, so as to make their integration on the embedded system easier and more efficient, in particular by better exploiting parallelism, and will be responsible for integrating the algorithms on the system (multi-core CPU+GPU).

He/she will take part in the final validation tasks with the other partners, under the responsibility of the project coordinator, AIRBUS. These tests will be conducted with a simulator provided by a project partner, in several modes: (1) a study mode, to acquire sequences of sensor data and test the algorithms offline while exploring different sensor configurations; (2) an interactive mode (using an AIRBUS setup), to evaluate with a pilot how the algorithm results should be presented; and (3) a HIL (Hardware in the Loop) mode, to evaluate the real-time performance of the algorithms implemented on different architectures.

Activities

State of the art and contributions on 3D vision and/or on matching heterogeneous architectures (multi-core CPU + GPU) to embedded vision algorithms, in order to provide good performance under real-time constraints.

Participation in the development of vision algorithms in C/C++

Low-level optimization for porting algorithms to a heterogeneous architecture, as sketched below.
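
A minimal sketch of one form of such low-level optimization, assuming an ARM target with NEON support (the project's actual target and kernels are not specified here): an element-wise addition vectorized with NEON intrinsics, processing four floats per iteration with a scalar tail.

// Minimal NEON sketch: element-wise addition of two float buffers.
// Illustrative only; real kernels would be the project's vision functions.
#include <arm_neon.h>
#include <cstddef>

// Scalar reference implementation.
void add_scalar(const float* a, const float* b, float* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i) out[i] = a[i] + b[i];
}

// NEON version: loads, adds and stores 4 floats at a time.
void add_neon(const float* a, const float* b, float* out, std::size_t n) {
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        float32x4_t va = vld1q_f32(a + i);        // load 4 floats of a
        float32x4_t vb = vld1q_f32(b + i);        // load 4 floats of b
        vst1q_f32(out + i, vaddq_f32(va, vb));    // store their element-wise sum
    }
    for (; i < n; ++i) out[i] = a[i] + b[i];      // scalar tail
}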

Project management (documentation, organization, tests, formal presentations...)

Skills

The applicant must have defended a PhD on a topic related to 3D perception for robotics or intelligent vehicles, and/or to the matching of algorithms to architectures (AAA).

Other competences:
- Autonomy, teamwork, student supervision
- Project management
- Communication: scientific publication and presentation…
- Vision and machine learning tools: OpenCV, Matlab, CNN libraries and architectures…
- 3D vision (projective geometry, camera model, multi-sensor fusion, calibration...); see the projection sketch after this list
- Development of embedded software (C, SIMD NEON optimization...): porting real-time vision algorithms to a heterogeneous architecture (multi-core CPU+GPU).
- Design methodology for heterogeneous architectures.
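
As a small illustration of the camera-model skill mentioned above, here is a minimal sketch projecting a 3D point, e.g. a LIDAR return already expressed in the camera frame after extrinsic calibration, into pixel coordinates with OpenCV's pinhole model. All numeric values are illustrative assumptions, not project parameters.

// Minimal pinhole-projection sketch; intrinsics and the 3D point are made up.
#include <iostream>
#include <vector>
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>

int main() {
    // Intrinsic matrix K (focal lengths fx, fy and principal point cx, cy).
    cv::Mat K = (cv::Mat_<double>(3, 3) << 800, 0, 320,
                                             0, 800, 240,
                                             0, 0, 1);
    cv::Mat dist = cv::Mat::zeros(1, 5, CV_64F);   // no lens distortion assumed

    // Extrinsics: identity rotation, zero translation, i.e. the point is
    // already in the camera frame (real values come from calibration).
    cv::Mat rvec = cv::Mat::zeros(3, 1, CV_64F);
    cv::Mat tvec = cv::Mat::zeros(3, 1, CV_64F);

    std::vector<cv::Point3d> object = { {1.0, 0.5, 10.0} };  // one 3D point (metres)
    std::vector<cv::Point2d> pixels;
    cv::projectPoints(object, rvec, tvec, K, dist, pixels);  // pinhole projection

    std::cout << "pixel: " << pixels[0] << "\n";  // u = fx*x/z + cx = 400, v = 280
    return 0;
}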

Work Context

This position is part of a research project carried out by a consortium coordinated by AIRBUS. The project, funded by the French Directorate General for Civil Aviation (DGAC), aims at developing an assistance system for the pilot of an aircraft navigating in airport areas, mainly under poor visibility conditions (night, fog, rain, etc.). Several kinds of sensor data (2D multi-spectral images, RADAR and LIDAR 3D data...) are acquired, fused and processed:
- first, to provide a synthetic and realistic view of the environment in which the aircraft navigates. This view, augmented with symbolic and textual information, is displayed in the cockpit to warn the pilot about potential risks (situational awareness) while navigating on the taxiways from the parking area to the runway, and back.
- and later, to generate alerts for the control system of the aircraft, which will in the near future be endowed with autonomous capabilities so that it can decide which action to execute when such risks are detected.

The postdoctoral researcher (hereafter postdoc) will join the RAP team (Robotics, Action and Perception) of the Robotics Department of LAAS-CNRS.

Additional Information

For further information, please contact
- Ariane Herbulot: aherbulo@laas.fr, 06 61 33 69 12
- Michel Devy: michel@laas.fr, 05 61 33 63 31
