
Nguyen et al., 2021 - Google Patents

Viral slam: Tightly coupled camera-imu-uwb-lidar slam

Document ID
2946986701761293006
Author
Nguyen T
Yuan S
Cao M
Nguyen T
Xie L
Publication year
2021
Publication venue
arXiv preprint arXiv:2105.03296

Snippet

In this paper, we propose a tightly-coupled, multi-modal simultaneous localization and mapping (SLAM) framework, integrating an extensive set of sensors: IMU, cameras, multiple lidars, and Ultra-wideband (UWB) range measurements, hence referred to as VIRAL (visual …

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in preceding groups
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0011 - Control associated with a remote control arrangement
    • G05D1/0044 - Remote control by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in preceding groups
    • G01C21/20 - Instruments for performing navigational calculations
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 - Control involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291 - Fleet control
    • G05D1/0295 - Fleet control by at least one leading vehicle of the fleet

Similar Documents

Publication / Title
Nguyen et al. Viral slam: Tightly coupled camera-imu-uwb-lidar slam
Ebadi et al. Present and future of slam in extreme environments: The darpa subt challenge
US11140379B2 (en) Mapping and tracking system with features in three-dimensional space
Nguyen et al. Viral-fusion: A visual-inertial-ranging-lidar sensor fusion approach
Forster et al. Collaborative monocular slam with multiple micro aerial vehicles
Mur-Artal et al. Visual-inertial monocular SLAM with map reuse
Nguyen et al. LIRO: Tightly coupled lidar-inertia-ranging odometry
Nguyen et al. Miliom: Tightly coupled multi-input lidar-inertia odometry and mapping
Tian et al. Resilient and distributed multi-robot visual slam: Datasets, experiments, and lessons learned
US20230314548A1 (en) Unmanned aerial vehicle and localization method for unmanned aerial vehicle
CN110726406A (en) An Improved Nonlinear Optimization Method for Monocular Inertial Navigation SLAM
Yang et al. Vision‐based localization and robot‐centric mapping in riverine environments
Oleynikova et al. Real-time visual-inertial localization for aerial and ground robots
Zhu et al. Cooperative visual-inertial odometry
Zachariah et al. Self-motion and wind velocity estimation for small-scale UAVs
CN114485640A (en) Monocular visual-inertial synchronous positioning and mapping method and system based on point and line features
Chen et al. Stereo visual inertial pose estimation based on feedforward-feedback loops
Zhang et al. DUI-VIO: Depth uncertainty incorporated visual inertial odometry based on an RGB-D camera
Yusefi et al. A generalizable D-VIO and its fusion with GNSS/IMU for improved autonomous vehicle localization
CN112991400A (en) Multi-sensor auxiliary positioning method for unmanned ship
Chang et al. Target-free stereo camera-gnss/imu self-calibration based on iterative refinement
Zhang et al. UWB/INS-based robust anchor-free relative positioning scheme for UGVs
Hosen et al. Vision-aided nonlinear observer for fixed-wing unmanned aerial vehicle navigation
CN117782050A (en) Robust laser-vision-inertia fusion SLAM method
Zhong et al. Colrio: Lidar-ranging-inertial centralized state estimation for robotic swarms