The key idea of the project is to place robust state estimation at the center of the learning process for feature detection and fusion. The project offers the opportunity to pursue a first-principles approach: beginning with raw sensor measurements, you will design a self-evolving system that learns the interconnections between sensors in a data-driven manner, while ensuring that failure modes of a single modality do not degrade the overall state estimate. In particular, you will:

  • Look into sensor fusion as a task-based learning problem
  • Explore representation learning for efficient feature detection algorithms
  • Research different learning paradigms for state estimation, such as reinforcement learning and unsupervised learning
  • Investigate different sensor modalities, such as lidar, cameras, and imaging radar, and establish methods to handle sensor degradation
  • Ensure that the state estimator performs robustly in varying environmental conditions, such as localization in tunnels or in bad weather
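To illustrate the robustness goal above, here is a minimal sketch of inverse-variance weighted fusion, a classical baseline that a learned fusion system would generalize: when one modality degrades (its uncertainty grows), its contribution to the fused state estimate shrinks automatically. The function name and the example variances are illustrative assumptions, not part of the project description.

```python
import numpy as np

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of independent scalar estimates.

    A degraded sensor (high variance) receives a small weight, so its
    failure does not dominate the fused state estimate.
    """
    var = np.asarray(variances, dtype=float)
    est = np.asarray(estimates, dtype=float)
    weights = 1.0 / var
    fused = float((weights * est).sum() / weights.sum())
    fused_var = float(1.0 / weights.sum())
    return fused, fused_var

# Example (hypothetical numbers): a healthy lidar estimate (variance 0.1)
# and a camera estimate degraded by bad weather (variance 100.0).
fused, fused_var = fuse_estimates([1.0, 5.0], [0.1, 100.0])
# The fused estimate stays close to the healthy lidar's value of 1.0,
# and the fused variance is below that of either single sensor.
```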

vacancy link