Abstract
Real-time relative pose (RP) estimation is a cornerstone for effective multi-agent collaboration. When conventional global positioning infrastructure such as GPS is unavailable, equipping each agent with Ultra-Wideband (UWB) technology provides a practical means to measure inter-agent range. Due to UWB's precise range measurements and robust communication capabilities, no external hardware installations are needed. However, when only a single UWB device per agent is used, the relative pose between the agents can be unobservable, resulting in a complex solution space with multiple possible RPs. This paper proposes a novel method based on an Unscented Particle Filter (UPF) that fuses single UWB ranges with visual-inertial odometry (VIO). The proposed decentralized method resolves the multi-modal solution for the RP in 3D (4-DoF) when it is unobservable. Moreover, a pseudo-state is introduced to correct the rotational drift of the agents. Through simulations and experiments involving two robots, the proposed solution is shown to be competitive with, and less computationally expensive than, state-of-the-art algorithms. Additionally, the proposed solution provides all possible relative poses from the first measurement. The code and a link to the video are available at https://github.com/y2d2/UPF RPE.
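To give an intuition for why a single UWB range yields a multi-modal RP posterior, the following minimal Python sketch shows a generic particle-based range update: particles over the 4-DoF relative pose (x, y, z, yaw) are reweighted by the likelihood of one range measurement, which only constrains them to a sphere. This is not the authors' UPF implementation (see the linked repository); the particle count, state bounds, and noise standard deviation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical particle set over the 4-DoF relative pose (x, y, z, yaw).
N = 2000
particles = np.column_stack([
    rng.uniform(-10.0, 10.0, N),    # relative x [m]
    rng.uniform(-10.0, 10.0, N),    # relative y [m]
    rng.uniform(-3.0, 3.0, N),      # relative z [m]
    rng.uniform(-np.pi, np.pi, N),  # relative yaw [rad]
])
weights = np.full(N, 1.0 / N)

def range_update(particles, weights, measured_range, sigma=0.1):
    """Reweight particles by the likelihood of a single UWB range,
    assuming Gaussian measurement noise with std `sigma` (illustrative)."""
    predicted = np.linalg.norm(particles[:, :3], axis=1)
    likelihood = np.exp(-0.5 * ((measured_range - predicted) / sigma) ** 2)
    weights = weights * likelihood
    return weights / weights.sum()

# After one update, high-weight particles lie near a sphere of radius 5 m:
# the RP remains ambiguous until agent motion makes it observable.
weights = range_update(particles, weights, measured_range=5.0)
```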
| Original language | English |
| --- | --- |
| Article number | 2377-3766 |
| Pages (from-to) | 11754-11761 |
| Number of pages | 8 |
| Journal | IEEE Robotics and Automation Letters |
| Volume | 9 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 24 Nov 2024 |
Bibliographical note
Funding Information: This work was supported in part by imec vzw, in part by the Research Program Artificiële Intelligentie Vlaanderen from the Flemish Government, in part by the EU project SPEAR under Grant 101119774, and in part by the Eurobin project through the Horizon Europe framework under Grant 101070596.
Funding Information:
Manuscript received June 25, 2024; revised August 21, 2024; accepted November 6, 2024. This paper was recommended for publication by Editor Sven Behnke upon evaluation of the Associate Editor and Reviewers' comments. This work was supported by imec vzw, the research program Artificiële Intelligentie Vlaanderen from the Flemish Government, the EU project SPEAR (Grant 101119774), and the Eurobin grant (101070596) from the Horizon Europe framework. 1 The authors are with Brubotics, Vrije Universiteit Brussel, Belgium, and affiliated with imec vzw. [email protected] 2 Adrian Munteanu is with ETRO, Vrije Universiteit Brussel, Belgium, and affiliated with imec vzw. Digital Object Identifier (DOI): see top of this page.
Publisher Copyright:
© 2024 IEEE.
Keywords
- Multi-Robot Systems
- Localization
- Sensor Fusion