Nguyen et al., 2021 - Google Patents
VIRAL SLAM: Tightly coupled camera-IMU-UWB-lidar SLAM
- Document ID
- 2946986701761293006
- Author
- Nguyen T
- Yuan S
- Cao M
- Nguyen T
- Xie L
- Publication year
- 2021
- Publication venue
- arXiv preprint arXiv:2105.03296
Snippet
In this paper, we propose a tightly-coupled, multi-modal simultaneous localization and mapping (SLAM) framework, integrating an extensive set of sensors: IMU, cameras, multiple lidars, and Ultra-wideband (UWB) range measurements, hence referred to as VIRAL (visual …
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/10—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in preceding groups by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0011—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
- G05D1/0044—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0287—Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
- G05D1/0291—Fleet control
- G05D1/0295—Fleet control by at least one leading vehicle of the fleet
Similar Documents
| Publication | Title |
|---|---|
| Nguyen et al. | Viral slam: Tightly coupled camera-imu-uwb-lidar slam |
| Ebadi et al. | Present and future of slam in extreme environments: The darpa subt challenge |
| US11140379B2 (en) | Mapping and tracking system with features in three-dimensional space |
| Nguyen et al. | Viral-fusion: A visual-inertial-ranging-lidar sensor fusion approach |
| Forster et al. | Collaborative monocular slam with multiple micro aerial vehicles |
| Mur-Artal et al. | Visual-inertial monocular SLAM with map reuse |
| Nguyen et al. | LIRO: Tightly coupled lidar-inertia-ranging odometry |
| Nguyen et al. | Miliom: Tightly coupled multi-input lidar-inertia odometry and mapping |
| Tian et al. | Resilient and distributed multi-robot visual slam: Datasets, experiments, and lessons learned |
| US20230314548A1 (en) | Unmanned aerial vehicle and localization method for unmanned aerial vehicle |
| CN110726406A (en) | An improved nonlinear optimization method for monocular inertial navigation SLAM |
| Yang et al. | Vision-based localization and robot-centric mapping in riverine environments |
| Oleynikova et al. | Real-time visual-inertial localization for aerial and ground robots |
| Zhu et al. | Cooperative visual-inertial odometry |
| Zachariah et al. | Self-motion and wind velocity estimation for small-scale UAVs |
| CN114485640A (en) | Monocular visual-inertial synchronous positioning and mapping method and system based on point and line features |
| Chen et al. | Stereo visual inertial pose estimation based on feedforward-feedback loops |
| Zhang et al. | DUI-VIO: Depth uncertainty incorporated visual inertial odometry based on an RGB-D camera |
| Yusefi et al. | A generalizable D-VIO and its fusion with GNSS/IMU for improved autonomous vehicle localization |
| CN112991400A (en) | Multi-sensor auxiliary positioning method for unmanned ship |
| Chang et al. | Target-free stereo camera-gnss/imu self-calibration based on iterative refinement |
| Zhang et al. | UWB/INS-based robust anchor-free relative positioning scheme for UGVs |
| Hosen et al. | Vision-aided nonlinear observer for fixed-wing unmanned aerial vehicle navigation |
| CN117782050A (en) | Robust laser-vision-inertia fusion SLAM method |
| Zhong et al. | Colrio: Lidar-ranging-inertial centralized state estimation for robotic swarms |