CN111781929A - AGV trolley and 3D laser radar positioning and navigation method - Google Patents
AGV trolley and 3D laser radar positioning and navigation method
- Publication number
- CN111781929A (Application CN202010650707.2A)
- Authority
- CN
- China
- Prior art keywords
- agv
- laser radar
- map
- point cloud
- personal computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0251—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
The invention relates to the field of robot positioning and navigation, and discloses an AGV (automatic guided vehicle) comprising a vehicle body and a 3D laser radar; a speed sensor, a gyroscope sensor and an industrial personal computer are arranged in the vehicle body; the industrial personal computer is connected with a display screen or a touch screen; the 3D laser radar is arranged at the topmost end of the vehicle body; and the vertical detection height of the 3D laser radar is at least 1 meter. By installing the 3D laser radar, the AGV gains three-dimensional scene perception and can avoid obstacles in the vertical direction. The positioning and navigation scheme raises the sensing ability of the AGV to a new level and improves its robustness without reducing its adaptability; the laser radar can work in any scene, giving the scheme good flexibility and practicability.
Description
Technical Field
The invention relates to the field of robot positioning and navigation, in particular to an AGV trolley and a 3D laser radar positioning and navigation method.
Background
At present, AGVs are widely applied in many industries, and engineers have developed numerous functions on the AGV to meet the working requirements of different industries. For example, a logistics robot for carrying express parcels mounts a conveyor belt on top of the AGV; a service robot mounts a touch screen and a voice system on top of the AGV; a sanitizing robot mounts spray- or light-based disinfection equipment on top of the AGV. Such robots can be roughly divided into two parts: an upper functional part and a lower motion part. The lower motion part is the main execution and control part of the robot and comprises a driving module, a sensor module and a control module. The upper part carries the various practical functions and is driven by the AGV chassis.
The positioning and navigation systems of current AGV moving chassis mainly come in three types: magnetic-strip, radar and visual. The magnetic-strip type moves along magnetic strips pasted in advance; it has largely been phased out because of insufficient flexibility and convenience and high maintenance cost. The radar type positions itself by emitting laser or ultrasonic waves in the plane where the laser radar or ultrasonic radar is mounted and computing the distance and position of the surrounding environment from the reflected signals. The visual type uses a monocular or binocular camera to capture images and processes them to obtain the environment around the trolley; however, it is strongly affected by lighting and often cannot judge distance accurately when it encounters objects such as glass, mirrors, or white walls under strong illumination.
For an AGV positioned by laser radar or ultrasonic radar, the radar on the chassis can only detect obstacles in the plane of the chassis. When the AGV encounters obstacles such as tables, chairs or shelves, the following situation can occur: the chassis judges that there is no obstacle in front, but the functional part mounted on the chassis cannot pass through.
Disclosure of Invention
The invention provides an AGV trolley and a 3D laser radar positioning and navigation method, which solve the problems described in the background.
The working principle of the invention is as follows: an AGV comprises a vehicle body and a 3D laser radar; a speed sensor, a gyroscope sensor and an industrial personal computer are arranged in the vehicle body; the industrial personal computer is connected with a display screen or a touch screen; and the 3D laser radar is arranged at the topmost end of the vehicle body.
Further, the vertical detection height of the 3D laser radar is at least 1 meter, which ensures that all obstacles encountered by the upper functional part of the AGV are perceived.
A 3D laser radar positioning and navigation method for an AGV comprises the following steps:
S1: the 3D laser radar collects surrounding environment information data and sends the data to the industrial personal computer;
S2: a radar driver program in the industrial personal computer converts the collected information into three-dimensional point cloud data to obtain an initial point cloud image of the current position;
S3: features of the initial point cloud image are extracted by a deep learning algorithm;
S4: the AGV trolley moves, and the 3D laser radar continuously scans the surrounding scene while moving; the scan with the most complete image information is taken as a comparison point cloud image, and features are extracted from it by deep learning;
S5: the comparison point cloud image is compared with the initial point cloud image, and a global map is drawn from the distance the trolley moved between the two images and the distance and height information contained in the two point clouds; because the moving distance is small and the scene is basically unchanged, the same structural features, distance features and so on inevitably appear in both images;
S6: step S5 is repeated, extracting features of each newly obtained image and optimizing the global map, until the three-dimensional map of the whole scene is built;
S7: the map built and stored in S6 is presented on the display screen or touch screen as a bird's-eye view;
S8: navigation points are set in the map obtained in S7; the built three-dimensional map is viewed from the bird's-eye view, in which the whole three-dimensional image is projected onto a two-dimensional plane and represented as a grayscale map, where a larger gray value indicates a taller obstacle in the vertical direction and obstacle-free areas are white;
S9: the AGV automatically plans a route on the map, scans the nearby scene in real time, writes sensed obstacles into a cache map, and re-plans the route.
The data content in S1 includes distance information, xy-plane angle information, and xz-plane angle information (a coordinate-conversion sketch is given below).
The bird's-eye view in S7 is a grayscale view.
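Steps S1 and S2 amount to turning each radar return, given as a distance plus an xy-plane angle and an xz-plane angle, into Cartesian point cloud coordinates. The patent does not spell out the conversion; the Python sketch below shows one common interpretation (azimuth measured in the xy plane, elevation measured from that plane), and the function and variable names are illustrative assumptions rather than part of the disclosure.

```python
import numpy as np

def returns_to_point_cloud(distances, xy_angles, xz_angles):
    """Convert raw radar returns to an (N, 3) Cartesian point cloud.

    Assumed interpretation (not specified in the patent):
      - distances: range r of each return, in metres
      - xy_angles: azimuth in the xy (horizontal) plane, in radians
      - xz_angles: elevation above the horizontal plane, in radians
    """
    r = np.asarray(distances, dtype=float)
    az = np.asarray(xy_angles, dtype=float)
    el = np.asarray(xz_angles, dtype=float)

    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=1)

# Example: three returns at 2 m spread across a 90-degree vertical field of view
points = returns_to_point_cloud([2.0, 2.0, 2.0],
                                [0.0, 0.1, 0.2],
                                [-np.pi / 4, 0.0, np.pi / 4])
print(points.shape)  # (3, 3)
```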
The invention has the following beneficial effects: by installing the 3D laser radar, the AGV gains three-dimensional scene perception and can avoid obstacles in the vertical direction (for example, it will not run over a person's feet or bump into tables and chairs). Meanwhile, because a high-precision, ultra-wide-angle 3D laser radar is adopted and the laser covers a large area, a wider region can be covered with fewer scans, which increases mapping efficiency while keeping mapping precision and resolution high. The positioning and navigation scheme raises the sensing ability of the AGV to a new level and improves its robustness without reducing its adaptability; the laser radar can work in any scene and has good flexibility and practicability.
Drawings
FIG. 1 is a flowchart for creating a navigation map;
FIG. 2 is a 3D lidar point cloud diagram;
FIG. 3 is a point cloud diagram of an actual scene.
Detailed Description
For the purpose of enhancing the understanding of the present invention, the present invention will be described in further detail with reference to the accompanying drawings and examples, which are provided for the purpose of illustration only and are not intended to limit the scope of the present invention.
An AGV trolley, characterized in that: it comprises a vehicle body and a 3D laser radar; a speed sensor, a gyroscope sensor and an industrial personal computer are arranged in the vehicle body; the industrial personal computer is connected with a display screen or a touch screen; and the 3D laser radar is arranged at the topmost end of the vehicle body.
The vertical detection height of the 3D laser radar is at least 1 meter.
A 3D laser radar positioning and navigation method for an AGV trolley comprises the following steps:
S1: the 3D laser radar collects surrounding environment information data and sends the data to the industrial personal computer;
S2: a radar driver program in the industrial personal computer converts the collected information into three-dimensional point cloud data to obtain an initial point cloud image of the current position;
S3: features of the initial point cloud image are extracted by a deep learning algorithm (a minimal stand-in encoder sketch is given below);
S4: the AGV trolley moves, and the 3D laser radar continuously scans the surrounding scene while moving; the scan with the most complete image information is taken as a comparison point cloud image, and features are extracted from it by deep learning; objects whose positions in the point cloud image are uncertain, such as moving vehicles, people and animals, are treated as interference and are not drawn into the map;
S5: the comparison point cloud image is compared with the initial point cloud image, and a global map is drawn from the distance the trolley moved between the two images and the distance and height information contained in the two point clouds;
S6: step S5 is repeated, extracting features of each newly obtained image and optimizing the global map, until the three-dimensional map of the whole scene is built;
S7: the map built and stored in S6 is presented on the display screen or touch screen as a bird's-eye view;
S8: navigation points are set in the map obtained in S7;
S9: the AGV automatically plans a route on the map, scans the nearby scene in real time, writes sensed obstacles into a cache map, and re-plans the route.
The data content in S1 includes distance information, xy-plane angle information, and xz-plane angle information; the bird's-eye view in S7 is a grayscale view.
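Steps S3 and S4 call for deep-learning feature extraction from point cloud images, but the patent does not name a network. Purely as a stand-in, the sketch below shows a minimal PointNet-style encoder in PyTorch that maps a point cloud to a fixed-length feature vector suitable for comparing the initial and comparison point cloud images; the architecture, layer sizes and names are assumptions, not the patented method.

```python
import torch
import torch.nn as nn

class PointCloudEncoder(nn.Module):
    """Minimal PointNet-style encoder: per-point MLP followed by max pooling."""

    def __init__(self, feature_dim: int = 256):
        super().__init__()
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, feature_dim),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (N, 3) -> per-point features (N, feature_dim)
        per_point = self.point_mlp(points)
        # Symmetric max pooling makes the global feature order-invariant.
        feature, _ = per_point.max(dim=0)
        return feature

encoder = PointCloudEncoder()
scan = torch.randn(1024, 3)   # a dummy point cloud image
feature = encoder(scan)       # (256,) global feature vector
print(feature.shape)
```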
This scheme uses an Ouster laser radar, whose four key performance figures are: a 90-degree ultra-wide vertical field of view (45 degrees above the horizontal plane and 45 degrees below, 90 degrees in total), 128-line ultra-high-definition resolution, a minimum detection distance of 0 cm, and millimeter-level precision.
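For reference, the stated sensor figures can be gathered into a single configuration record; the field names below are illustrative assumptions, and the 10 Hz scan rate and roughly 3-meter working range are taken from later in this description.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LidarSpec:
    """Key figures of the 3D laser radar as stated in the description."""
    vertical_fov_deg: float = 90.0     # 45 degrees above and below the horizontal plane
    channels: int = 128                # 128-line resolution
    min_range_m: float = 0.0           # 0 cm minimum detection distance
    range_precision_m: float = 0.001   # millimetre-level precision
    scan_rate_hz: float = 10.0         # scanning frequency given later in the description
    obstacle_range_m: float = 3.0      # environment detected around the AGV, per the description

SENSOR_SPEC = LidarSpec()
```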
The mapping process is shown in FIG. 1. First, an Ouster laser radar is installed on top of the AGV; the AGV perceives the surrounding scene and draws the data into a point cloud image, shown in FIG. 2. The map at this moment is incomplete; only a plan of the current scene is established, to be completed by matching with subsequent images. The image data are transmitted to the industrial personal computer, and features are extracted by the deep learning algorithm.
The AGV then moves, continuously scanning the surrounding environment. The point cloud images obtained by scanning the scene are sent to the industrial personal computer, features are extracted and compared with the features of the initial image, and a global map of the current position is drawn.
The AGV automatically moves towards the incomplete areas of the global map and continuously optimizes the current global map until the complete global map is established.
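The description does not name the algorithm used to align the comparison point cloud with the initial point cloud before the global map is drawn. A minimal point-to-point ICP sketch is given below as one plausible way to recover the relative motion between two scans; the helper names and iteration count are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(source, target, iterations=20):
    """Align `source` to `target`; returns the transformed source and (R, t)."""
    tree = cKDTree(target)
    current = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(current)                      # nearest neighbours in the target scan
        R, t = best_rigid_transform(current, target[idx])
        current = current @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t   # accumulate the overall transform
    return current, (R_total, t_total)
```

Aligning successive scans this way yields the trolley's relative motion, which, combined with the distance and height information in the point clouds, lets the global map grow as described in steps S5 and S6.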
After the map is built, it is automatically stored and can be displayed as a bird's-eye view on the human-computer interaction interface: the three-dimensional model of the whole scene is projected onto a two-dimensional plane as a grayscale map, and the taller the three-dimensional model, the larger the gray value. An operator can set navigation points in the map; once the navigation points are set and the AGV is started, it automatically plans a path through the obstacle-free areas of the map.
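The bird's-eye grayscale map described here can be produced by projecting the accumulated 3D map points onto a horizontal grid and shading each cell by the tallest obstacle found in it, leaving obstacle-free cells white. The sketch below follows that convention; the grid resolution and height cap are arbitrary choices, not values from the patent.

```python
import numpy as np

def birds_eye_gray_map(points, cell_size=0.05, max_height=2.0):
    """Project an (N, 3) map point cloud to a 2D grayscale image.

    Obstacle-free cells are white (255); taller obstacles appear as deeper gray,
    following the convention described in the text.
    """
    xy = points[:, :2]
    z = np.clip(points[:, 2], 0.0, max_height)

    mins = xy.min(axis=0)
    cols, rows = np.ceil((xy.max(axis=0) - mins) / cell_size).astype(int) + 1
    grid = np.zeros((rows, cols))                 # max obstacle height per cell

    cx, cy = ((xy - mins) / cell_size).astype(int).T
    np.maximum.at(grid, (cy, cx), z)              # keep the tallest point in each cell

    # Height 0 -> white (255); height max_height -> darkest shade used.
    gray = np.where(grid > 0, 255 - (grid / max_height) * 200, 255)
    return gray.astype(np.uint8)
```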
During navigation, the AGV scans the surrounding scene with the laser radar and compares the scene features with the navigation map features, realizing real-time positioning throughout the navigation process.
When the AGV encounters an obstacle while moving, the features of the area covered by the obstacle are written into the cache map, and the path is re-planned in the cache map, realizing real-time obstacle avoidance. The cache map does not affect the previously built global map.
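A minimal way to realise this cache-map behaviour is to keep a working copy of the global occupancy grid, mark newly sensed obstacle cells only in that copy, and re-run the planner on it. The grid planner below uses plain breadth-first search purely for illustration, since the patent does not specify the planning algorithm; all names are assumptions.

```python
import numpy as np
from collections import deque

def replan_with_cache(global_grid, new_obstacle_cells, start, goal):
    """Write sensed obstacles into a cache copy of the map and re-plan a route.

    global_grid: 2D array, 0 = free, 1 = obstacle (the stored global map, untouched)
    new_obstacle_cells: iterable of (row, col) cells the lidar has just flagged
    Returns the new path as a list of (row, col) cells, or None if fully blocked.
    """
    cache = global_grid.copy()          # the global map itself is never modified
    for r, c in new_obstacle_cells:
        cache[r, c] = 1

    queue, parents = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:     # walk back through parents to recover the route
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < cache.shape[0] and 0 <= nc < cache.shape[1]
                    and cache[nr, nc] == 0 and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # the obstacle completely blocks the current path
```

Because only the cache copy is marked, the previously built global map stays intact, matching the behaviour described above.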
The Ouster laser radar can detect the environment within 3 meters around the AGV, ensuring that the functional part of the AGV is not blocked by obstacles. Its ultra-wide field of view allows a wider spatial range to be covered with fewer scans, reducing the time required for map construction while providing the required high-precision data. A scene point cloud obtained by scanning is shown in FIG. 3 (the shaded region in the figure is where the 3D laser radar is installed on the AGV; the mounting chassis below the radar is larger than the radar's bottom surface, so part of the laser is blocked). The scanning frequency of the laser radar is 10 Hz, which avoids overloading the processor with high-frequency refreshes while also avoiding the delayed map optimization, and the inability to update the map and avoid obstacles in real time, that an excessively low refresh rate would cause.
To meet the requirements of some applications, multiple AGVs must work together, so a cloud management system is provided for the AGVs: each AGV is connected to the server through a WIFI module, the map information of the AGVs and the server is unified, the large scene map is divided into several small scene maps, and each AGV is assigned to work within its own scene map. While working in its scene, an AGV still scans and localizes. When it encounters a moving obstacle, it treats the obstacle as interference and avoids it automatically; when it encounters a small obstacle, the obstacle is stored in the cache map and the AGV moves on after avoiding it; when it encounters a large obstacle, or an obstacle that completely blocks the current path (for example, a newly placed shelf in a factory or a newly parked large truck), the information is stored in the cache map and uploaded to the server. The server optimizes the large map according to the newly obtained map and can re-partition the small scene maps, so that the scene map assigned to each AGV has unobstructed roads, allowing the AGVs to avoid obstacles and improving their work efficiency.
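In the simplest case, the described division of the large scene map into small scene maps can be done by tiling the global occupancy grid and assigning one tile per AGV. The sketch below is illustrative only, since the patent does not state how the server partitions the map; the tile layout and names are assumptions.

```python
import numpy as np

def partition_scene_map(global_grid, n_rows, n_cols):
    """Split the global occupancy grid into n_rows x n_cols small scene maps.

    Returns a dict mapping an AGV index to (row_slice, col_slice, tile), so each
    AGV works only inside its own tile while the server keeps the full map and
    can re-partition after receiving cache-map updates.
    """
    tiles = {}
    h, w = global_grid.shape
    row_edges = np.linspace(0, h, n_rows + 1, dtype=int)
    col_edges = np.linspace(0, w, n_cols + 1, dtype=int)
    agv_id = 0
    for i in range(n_rows):
        for j in range(n_cols):
            rs = slice(row_edges[i], row_edges[i + 1])
            cs = slice(col_edges[j], col_edges[j + 1])
            tiles[agv_id] = (rs, cs, global_grid[rs, cs].copy())
            agv_id += 1
    return tiles
```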
The above embodiments should not limit the present invention in any way, and all technical solutions obtained by using equivalent alternatives or equivalent transformations fall within the protection scope of the present invention.
Claims (5)
1. An AGV trolley, characterized in that: it comprises a vehicle body and a 3D laser radar; a speed sensor, a gyroscope sensor and an industrial personal computer are arranged in the vehicle body; the industrial personal computer is connected with a display screen or a touch screen; and the 3D laser radar is arranged at the topmost end of the vehicle body.
2. The AGV trolley of claim 1, wherein the vertical detection height of the 3D laser radar is greater than or equal to 1 meter.
3. A 3D laser radar positioning and navigation method for an AGV trolley, characterized by comprising the following steps:
S1: the 3D laser radar collects surrounding environment information data and sends the data to the industrial personal computer;
S2: a radar driver program in the industrial personal computer converts the collected information into three-dimensional point cloud data to obtain an initial point cloud image of the current position;
S3: features of the initial point cloud image are extracted by a deep learning algorithm;
S4: the AGV trolley moves, and the 3D laser radar continuously scans the surrounding scene while moving; the scan with the most complete image information is taken as a comparison point cloud image, and features are extracted from it by deep learning;
S5: the comparison point cloud image is compared with the initial point cloud image, and a global map is drawn from the distance the trolley moved between the two images and the distance and height information contained in the two point cloud images;
S6: step S5 is repeated, extracting features of each newly obtained image and optimizing the global map, until the three-dimensional map of the whole scene is built;
S7: the map built and stored in S6 is presented on the display screen or touch screen as a bird's-eye view;
S8: navigation points are set in the map obtained in S7;
S9: the AGV automatically plans a route on the map, scans the nearby scene in real time, writes sensed obstacles into a cache map, and re-plans the route.
4. The 3D laser radar positioning and navigation method for an AGV trolley according to claim 3, wherein the data content in S1 includes distance information, xy-plane angle information, and xz-plane angle information.
5. The 3D laser radar positioning and navigation method for an AGV trolley according to claim 3, wherein the bird's-eye view in S7 is a grayscale view.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010650707.2A CN111781929A (en) | 2020-07-08 | 2020-07-08 | AGV trolley and 3D laser radar positioning and navigation method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN111781929A true CN111781929A (en) | 2020-10-16 |
Family
ID=72759608
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010650707.2A Pending CN111781929A (en) | 2020-07-08 | 2020-07-08 | AGV trolley and 3D laser radar positioning and navigation method |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111781929A (en) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112455991A (en) * | 2020-12-04 | 2021-03-09 | 太普动力新能源(常熟)股份有限公司 | Method for unmanned carrying system of thread ends |
| CN112611374A (en) * | 2020-10-29 | 2021-04-06 | 华中科技大学鄂州工业技术研究院 | Path planning and obstacle avoidance method and system based on laser radar and depth camera |
| CN112947427A (en) * | 2021-02-01 | 2021-06-11 | 三一机器人科技有限公司 | Target object sensing system and sensing method |
| CN112947425A (en) * | 2021-02-01 | 2021-06-11 | 湖北迈睿达供应链股份有限公司 | Indoor outdoor AGV robot of multi-line radar |
| CN116107321A (en) * | 2023-04-13 | 2023-05-12 | 无锡科技职业学院 | Unmanned vehicle path planning system and method based on vision and laser radar fusion |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104596533A (en) * | 2015-01-07 | 2015-05-06 | 上海交通大学 | Automatic guided vehicle based on map matching and guide method of automatic guided vehicle |
| CN108226938A (en) * | 2017-12-08 | 2018-06-29 | 华南理工大学 | A kind of alignment system and method for AGV trolleies |
| CN108776474A (en) * | 2018-05-24 | 2018-11-09 | 中山赛伯坦智能科技有限公司 | Robot embedded computing terminal integrating high-precision navigation positioning and deep learning |
| CN108958250A (en) * | 2018-07-13 | 2018-12-07 | 华南理工大学 | Multisensor mobile platform and navigation and barrier-avoiding method based on known map |
| CN109855624A (en) * | 2019-01-17 | 2019-06-07 | 宁波舜宇智能科技有限公司 | Navigation device and air navigation aid for AGV vehicle |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |