Integrating sensor and motion models to localize an autonomous AR.Drone

Open Access
Publication date 2011
Journal International Journal of Micro Air Vehicles
Volume 3, Issue 4
Pages 183-200
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
This article describes a generic approach to acquiring navigation capabilities for
the standard platform of the IMAV indoor competition, the Parrot AR.Drone. Our
development is partly based on simulation, which requires both a realistic sensor
model and a realistic motion model. The AR.Drone simulation model is described and
validated. Furthermore, this article describes how a visual map of the indoor
environment can be built, including the effect of sensor noise. This visual map
consists of a texture map and a feature map: the texture map supports human
navigation, while the feature map is used by the AR.Drone to localize itself. To
that end, a localization method is presented. An experiment demonstrates how well
the localization works under the circumstances encountered during the IMAV
competition.
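
The abstract's localization step matches features observed by the drone's camera against the stored feature map. As an illustrative sketch only (the paper's actual method is not reproduced here), once observed 2D feature positions have been matched to map features with known planar coordinates, the drone's pose (position and yaw) can be recovered with a Procrustes/Kabsch alignment; all function and variable names below are hypothetical:

```python
import numpy as np

def estimate_pose_2d(map_pts, obs_pts):
    """Estimate the 2D rigid transform (rotation R, translation t) such that
    map_pts[i] ~= R @ obs_pts[i] + t, via the Kabsch/Procrustes method.
    Both inputs are (N, 2) arrays of matched feature positions."""
    mu_m = map_pts.mean(axis=0)      # centroid of map features
    mu_o = obs_pts.mean(axis=0)      # centroid of observed features
    # Cross-covariance of the centered point sets.
    H = (obs_pts - mu_o).T @ (map_pts - mu_m)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = mu_m - R @ mu_o
    yaw = np.arctan2(R[1, 0], R[0, 0])   # heading angle in radians
    return R, t, yaw

# Usage: simulate a known pose, then recover it from matched features.
rng = np.random.default_rng(0)
map_pts = rng.uniform(-5.0, 5.0, size=(20, 2))
theta, t_true = 0.3, np.array([1.0, -2.0])
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
# Observed positions consistent with map = R_true @ obs + t_true.
obs_pts = (map_pts - t_true) @ R_true
R_est, t_est, yaw_est = estimate_pose_2d(map_pts, obs_pts)
```

In practice the matches come from descriptor matching and would contain outliers, so a robust wrapper (e.g. RANSAC over minimal point pairs) would surround this least-squares core.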
Document type Article
Language English
DOI https://doi.org/10.1260/1756-8293.3.4.183
Downloads
Post-print version of article (Accepted author manuscript)