CNN-based Omnidirectional Object Detection for HermesBot Autonomous Delivery Robot with Preliminary Frame Classification

Saian Protasov, Pavel Karpyshev, Ivan Kalinov, Pavel Kopanev, Nikita Mikhailovskiy, Alexander Sedunin, Dzmitry Tsetserukou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

Mobile autonomous robots carry numerous sensors for environment perception. Cameras are an essential tool for a robot's localization, navigation, and obstacle avoidance. To process the large flow of data from these sensors, it is necessary either to optimize the algorithms or to use substantial computational power. In this work, we propose an algorithm that optimizes neural-network object detection using preliminary binary frame classification. An autonomous outdoor mobile robot with six rolling-shutter cameras mounted along the perimeter, providing a 360-degree field of view, was used as the experimental setup. The experimental results show that the proposed optimization reduces the inference time of the neural network when up to 5 of the 6 cameras contain target objects.
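The abstract describes a two-stage pipeline: a lightweight binary classifier first decides, per camera frame, whether any target object is present, and the heavier object detector runs only on the frames flagged as non-empty. The sketch below illustrates this idea only; the model choices (a MobileNetV3 classifier and a torchvision Faster R-CNN detector) and the function name detect_omnidirectional are illustrative assumptions, not the networks or code used in the paper.

# Minimal sketch of preliminary frame classification before detection,
# assuming stand-in torchvision models rather than the paper's networks.
import torch
import torchvision

# Lightweight binary classifier: "does this frame contain any target object?"
classifier = torchvision.models.mobilenet_v3_small(num_classes=2).eval()
# Heavier detector, invoked only on frames the classifier flags as non-empty.
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

@torch.no_grad()
def detect_omnidirectional(frames):
    """frames: list of six CHW float tensors, one per perimeter camera."""
    batch = torch.stack(frames)              # (6, 3, H, W)
    logits = classifier(batch)               # (6, 2) class scores per camera
    has_objects = logits.argmax(dim=1) == 1  # boolean mask per camera

    results = [None] * len(frames)           # None marks a skipped frame
    selected = [i for i, flag in enumerate(has_objects) if flag]
    if selected:
        # The detector only sees the selected subset of frames, which is
        # where the reported speed-up comes from when some cameras are empty.
        detections = detector([frames[i] for i in selected])
        for i, det in zip(selected, detections):
            results[i] = det
    return results

In this sketch the classifier runs on all six frames as one batch, so its cost is roughly constant, while the detector's cost scales with the number of cameras that actually see objects.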

Original language: English
Title of host publication: 2021 20th International Conference on Advanced Robotics, ICAR 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 517-522
Number of pages: 6
ISBN (Electronic): 9781665436847
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: 20th International Conference on Advanced Robotics, ICAR 2021 - Ljubljana, Slovenia
Duration: 6 Dec 2021 - 10 Dec 2021

Publication series

Name: 2021 20th International Conference on Advanced Robotics, ICAR 2021

Conference

Conference: 20th International Conference on Advanced Robotics, ICAR 2021
Country/Territory: Slovenia
City: Ljubljana
Period: 6/12/21 - 10/12/21
