An Artificial Neural SLAM Framework for Event-Based Vision


GELEN A. G., ATASOY A.

IEEE Access, vol. 11, pp. 58436-58450, 2023 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 11
  • Publication Date: 2023
  • DOI: 10.1109/access.2023.3282637
  • Journal Name: IEEE Access
  • Indexed In: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, INSPEC, Directory of Open Access Journals
  • Pages: pp. 58436-58450
  • Keywords: convolutional neural networks, event-based cameras, neural SLAM, visual SLAM
  • Affiliated with Erzincan Binali Yıldırım Üniversitesi: Yes

Abstract

The SLAM problem for autonomous robots can be greatly improved by using event-based cameras. Compared to conventional frame-based cameras, event-based cameras consume very little power while providing high temporal resolution and dynamic range. In this study, we propose a convolutional neural SLAM framework based solely on event data. Event-based cameras generate events for pixels whose brightness changes; the event data are therefore rich in motion and edge information. The purpose of the proposed framework is to make all estimations using the information encoded in the event data. The proposed solution takes the form of keyframe-based visual SLAM, consisting of three neural networks that estimate the relative camera pose, the log-depth, and features for loop-closure detection. Network architectures and learning curves for the trained networks are presented, showing that the networks learn their tasks successfully. The proposed method was developed and tested on a new dataset generated with the CARLA simulator. It is shown that the proposed method constitutes a complete SLAM solution and that it keeps global drift under control through loop-closure estimations. Evaluation metrics for the estimations, an evaluation of the global model, and an analysis of run-time performance are also presented.
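The abstract describes a three-network layout (relative pose, log-depth, loop-closure features) but the paper's code is not reproduced here. The following is a minimal, hypothetical PyTorch sketch of that structure, assuming events are binned into voxel-grid tensors; every class name, channel width, and shape below is an illustrative assumption, not the authors' architecture.

```python
# Hypothetical sketch of the three-network layout described in the
# abstract; all names and shapes are assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def down(c_in, c_out):
    # 3x3 convolution with stride 2: halves the spatial resolution.
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, 2, 1), nn.ReLU(inplace=True))

class PoseNet(nn.Module):
    """Relative camera pose from a pair of event tensors (6-DoF vector)."""
    def __init__(self, bins=5):
        super().__init__()
        self.net = nn.Sequential(
            down(2 * bins, 32), down(32, 64), down(64, 128),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 6))
    def forward(self, ev_prev, ev_curr):
        # Stack the two event tensors along the channel axis.
        return self.net(torch.cat([ev_prev, ev_curr], dim=1))

class DepthNet(nn.Module):
    """Dense log-depth map from a single event tensor (encoder-decoder)."""
    def __init__(self, bins=5):
        super().__init__()
        self.enc = nn.Sequential(down(bins, 32), down(32, 64))
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, 4, 2, 1))
    def forward(self, ev):
        return self.dec(self.enc(ev))  # (B, 1, H, W) log-depth

class LoopNet(nn.Module):
    """Global descriptor per keyframe, compared for loop-closure detection."""
    def __init__(self, bins=5, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            down(bins, 32), down(32, 64),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, dim))
    def forward(self, ev):
        return F.normalize(self.net(ev), dim=1)  # unit-length descriptor

# Toy usage with random event voxel grids.
bins, H, W = 5, 128, 128
ev0, ev1 = torch.rand(1, bins, H, W), torch.rand(1, bins, H, W)
pose = PoseNet(bins)(ev0, ev1)                                # (1, 6)
log_depth = DepthNet(bins)(ev1)                               # (1, 1, H, W)
similarity = (LoopNet(bins)(ev0) * LoopNet(bins)(ev1)).sum()  # cosine score
```

In a keyframe-based pipeline of this kind, a loop closure would typically be declared when the cosine score between a new keyframe's descriptor and a stored one exceeds a threshold, after which the pose graph can be corrected to limit global drift.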