WO2018180175A1 - Mobile object, signal processing device, and computer program - Google Patents
Mobile object, signal processing device, and computer program
- Publication number
- WO2018180175A1 (PCT/JP2018/007787)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- map
- sensor data
- sensor
- maps
- intermediate maps
- Prior art date
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the present disclosure relates to a mobile object, a signal processing device, and a computer program.
- An “environment map” is a map showing the geometrical state of an entity in space.
- An autonomous mobile robot measures the surrounding shape using an external sensor such as a laser distance sensor, and geometrically matches that shape against the environment map to estimate (identify) its own current position and orientation. As a result, the autonomous mobile robot can move along a route created on the environment map.
- JP 2012-93811 discloses a technique for updating an environmental map.
- the update of the environment map is a process of combining new measurement data with the shape in the past environment map and overwriting the past environment map with the new measurement data. Because errors contained in the shape of the past environment map affect the accuracy with which the new measurement data is combined, the updated environment map may contain new errors. Therefore, when the update process is repeated, errors accumulate and the accuracy of the environment map can decrease.
- an attribute that the shape is invariable is set in an area on the environment map where the actual shape is invariable (invariable area).
- the environmental map is not updated for the invariant area having the attribute.
- no error is accumulated in the invariant region, and accuracy can be maintained for shapes other than the invariant region.
- the environmental map created from the measurement data reflects the position and size of the objects that existed in the space at the time of measurement. If the object is a movable object such as a box and is moved after the measurement data is acquired, inconsistency occurs between the arrangement of the objects in the actual space and the environment map.
- an autonomous mobile robot has to identify its own position with reference to an existing environment map. In this case, there is a possibility that the self-position is erroneously identified.
- exemplary embodiments of the present disclosure provide technology for generating a map with which a self-position can be accurately identified, even when an object moves, by using a plurality of intermediate maps generated before and after the object moves.
- the moving body of the present disclosure includes a motor, a driving device that controls the motor to move the moving body, a sensor that senses the surrounding space and outputs sensor data, and a storage device that stores data of a plurality of intermediate maps.
- each of the plurality of intermediate maps is generated from sensor data of the surrounding space sensed by the sensor at intervals of time.
- the moving body further includes a processing circuit that generates one map from the plurality of intermediate maps stored in the storage device. The one map includes a plurality of feature points in an identifiable manner, and each of the plurality of feature points indicates a position whose degree of coincidence among the plurality of intermediate maps is greater than a predetermined value.
- the processing circuit performs processing for identifying the self-position by comparing the one map with sensor data newly output from the sensor.
- a mobile system includes a mobile body and a signal processing device.
- the moving body includes a motor, a driving device that moves the moving body by controlling the motor, a sensor that senses the surrounding space and outputs sensor data, a storage device that stores data of a plurality of intermediate maps, and a control circuit.
- Each of the plurality of intermediate maps is generated from sensor data of the surrounding space sensed by the sensor with a time interval.
- the signal processing device includes a processing circuit that generates one map from a plurality of intermediate maps stored in the storage device of the moving body.
- the one map includes a plurality of feature points in an identifiable manner, and each of the plurality of feature points is a point having a degree of matching greater than a predetermined value between the plurality of intermediate maps.
- the control circuit of the moving body performs processing for identifying the self-position by comparing the one map generated by the processing circuit of the signal processing device with sensor data newly output from the sensor. Thereby, the moving body can identify its own position with higher accuracy.
- FIG. 1 is a diagram illustrating an exemplary AGV 10 that travels in a passage 1 in a factory.
- FIG. 2 is a diagram illustrating an outline of an exemplary management system 100 that manages the traveling of the AGV 10.
- FIG. 3 is a diagram illustrating an example of the target positions set in the travel route of the AGV 10.
- FIG. 4A is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 4B is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 4C is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 5 is an external view of an exemplary AGV 10.
- FIG. 6 is a diagram illustrating a hardware configuration of the AGV 10.
- FIG. 7 is a diagram showing the AGV 10 that scans the surrounding space using the laser range finder 15 while moving.
- FIG. 8 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 9 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 10 is a diagram illustrating an AGV 10 that generates a map while moving.
- FIG. 11 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 12 is a diagram schematically showing the completed intermediate map 30.
- FIG. 13 is a diagram schematically illustrating a general position identification process.
- FIG. 14 is a diagram schematically illustrating a general position identification process.
- FIG. 15 is a diagram schematically illustrating a general position identification process.
- FIG. 16 is a diagram illustrating an example in which one or a plurality of obstacles 38 are placed in a part of a space where the AGV 10 travels.
- FIG. 17 is a diagram illustrating an example of sensor data 40 that reflects the position of the obstacle 38.
- FIG. 18 is a schematic diagram of a matching process between the reference map 30 and the sensor data 40.
- FIG. 19 is a diagram showing an area 34e including a set 6 of a plurality of points erroneously determined to match as a result of the matching process.
- FIG. 20 is a diagram schematically illustrating the procedure of the second process performed by the positioning device 14e.
- FIG. 21 is a diagram showing one map 50b that includes distinguishable feature points and non-feature points.
- FIG. 22 is a schematic diagram of the matching process between the map 50a and the sensor data 40.
- FIG. 23 is a diagram illustrating points that are determined to match as a result of matching.
- FIG. 24 is a flowchart illustrating a processing procedure for generating a new map.
- FIG. 25 is a diagram illustrating a hardware configuration of the travel management device 20 and a configuration of the mobile system 60.
- an automatic guided vehicle is mentioned as an example of a moving body.
- the automated guided vehicle is also called AGV (Automated Guided Vehicle), and is also described as “AGV” in this specification.
- FIG. 1 shows an AGV 10 that travels in a passage 1 in a factory, for example.
- FIG. 2 shows an overview of a management system 100 that manages the running of the AGV 10 according to this example.
- the AGV 10 has map data and travels while recognizing its current position.
- the travel route of the AGV 10 follows a command from the travel management device 20.
- the AGV 10 moves by rotating a plurality of built-in motors according to a command and rotating wheels.
- the command is transmitted from the traveling management device 20 to the AGV 10 by radio. Communication between the AGV 10 and the travel management device 20 is performed using wireless access points 2a, 2b, etc. provided near the ceiling of the factory.
- the communication conforms to, for example, the Wi-Fi (registered trademark) standard.
- In FIG. 1, a plurality of AGVs 10 may travel.
- the traveling of each of the plurality of AGVs 10 may or may not be managed by the traveling management device 20.
- the outline of the operation of the AGV 10 and the travel management device 20 included in the management system 100 is as follows.
- the AGV 10 moves from the nth position toward the (n + 1)th position (hereinafter, “position M n + 1 ”) as the target position, in accordance with a command (the nth command (n: positive integer)) from the travel management device 20.
- the target position can be determined by the administrator for each AGV 10, for example.
- when the AGV 10 reaches the target position M n + 1 , it transmits an arrival notification (hereinafter referred to as “notification”) to the travel management device 20.
- the notification is sent to the travel management device 20 via the wireless access point 2a.
- the AGV 10 collates the output of the sensor that senses its surroundings against the map data to identify its own position, and may then determine whether that position matches the position M n + 1 .
- when the notification is received, the travel management device 20 generates the next command (the (n + 1)th command) for moving the AGV 10 from the position M n + 1 to the position M n + 2 .
- the (n + 1) th command includes the position coordinates of the position M n + 2 and may further include numerical values such as acceleration time and moving speed during constant speed traveling.
- the traveling management device 20 transmits the (n + 1) th command to the AGV 10.
- the AGV 10 analyzes the (n + 1) th command and performs a preprocessing calculation necessary for the movement from the position M n + 1 to the position M n + 2 .
- the preprocessing calculation is, for example, calculation for determining the rotation speed, rotation time, etc. of each motor for driving each wheel of the AGV 10.
- FIG. 3 shows an example of the target positions set in the travel route of the AGV 10.
- the interval between two adjacent target positions does not have to be a fixed value and can be determined by an administrator.
- the AGV 10 can move in various directions according to commands from the travel management device 20.
- 4A to 4C show examples of movement paths of the AGV 10 that moves continuously.
- FIG. 4A shows a moving path of the AGV 10 when traveling straight. After reaching the position M n + 1 , the AGV 10 can perform a pre-processing calculation, operate each motor according to the calculation result, and continue to move linearly to the next position M n + 2 .
- FIG. 4B shows a moving path of the AGV 10 that makes a left turn at the position M n + 1 and moves toward the position M n + 2 .
- the AGV 10 performs a preprocessing calculation after reaching the position M n + 1 and rotates at least one motor located on the right side in the traveling direction according to the calculation result.
- after the AGV 10 rotates counterclockwise by an angle θ on the spot, all the motors rotate at a constant speed and the AGV goes straight toward the position M n + 2 .
- FIG. 4C shows a movement path of the AGV 10 when moving in a circular arc shape from the position M n + 1 to the position M n + 2 .
- the AGV 10 performs a preprocessing calculation after reaching the position M n + 1, and increases the rotational speed of the outer peripheral motor relative to the inner peripheral motor in accordance with the calculation result. As a result, the AGV 10 can continue to move along an arcuate path toward the next position M n + 2 .
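The arc move just described can be sketched as a simple differential-drive kinematics calculation: the wheel on the outer side of the turn must run faster than the inner one, which is what the AGV achieves by raising the rotational speed of the outer motors relative to the inner ones. This is an illustrative sketch; the two-wheel model, the function name, and the parameters are assumptions and are not taken from the patent.

```python
def arc_wheel_speeds(v, radius, track_width):
    """Wheel linear speeds for a body following an arc of the given
    radius at center speed v. The inner wheel travels a smaller arc,
    the outer wheel a larger one, in proportion to their radii."""
    v_inner = v * (radius - track_width / 2) / radius
    v_outer = v * (radius + track_width / 2) / radius
    return v_inner, v_outer

# 1 m/s along a 2 m radius arc with a 0.5 m wheel track:
vi, vo = arc_wheel_speeds(1.0, 2.0, 0.5)  # -> (0.875, 1.125)
```

A pre-processing calculation of this kind would then map the two wheel speeds to per-motor rotation speeds and times.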
- FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment.
- FIG. 6 shows the hardware configuration of the AGV 10.
- the AGV 10 includes four wheels 11a to 11d, a frame 12, a transfer table 13, a travel control device 14, and a laser range finder 15.
- the front wheel 11a, the rear wheel 11b, and the rear wheel 11c are shown, but the front wheel 11d is hidden behind the frame 12 and is not clearly shown.
- the traveling control device 14 is a device that controls the operation of the AGV 10.
- the traveling control device 14 mainly includes a plurality of integrated circuits including a microcomputer (described later), a plurality of electronic components, and a substrate on which the plurality of integrated circuits and the plurality of electronic components are mounted.
- the travel control device 14 performs data transmission / reception with the travel management device 20 and pre-processing calculation described above.
- the laser range finder 15 is an optical device that measures the distance to a target by, for example, irradiating the target with infrared laser light 15a and detecting the reflected light of the laser light 15a.
- the laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction every 0.25 degrees within a range of 135 degrees to the left and right (270 degrees in total) with respect to the front of the AGV 10, and detects the reflected light of each laser beam 15a. This yields distance data to the reflection point in each of a total of 1080 directions, one every 0.25 degrees.
- the arrangement of objects around the AGV can be obtained from the position and orientation of the AGV 10 and the scan result of the laser range finder 15.
- the combination of the position and the posture of a moving body is called a pose.
- the position and orientation of the moving body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
- the position and posture of the AGV 10, that is, the pose (x, y, θ), may be simply referred to as a “position” hereinafter.
- the positioning device described later collates (matches) local map data created from the scan results of the laser range finder 15 against wider-range environment map data, thereby identifying the self-position (x, y, θ) on the environment map.
- the position of the reflection point seen from the radiation position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
- the laser range finder 15 outputs sensor data expressed in polar coordinates.
- the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
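The polar-to-orthogonal conversion mentioned here can be sketched as follows. This assumes a counterclockwise-positive angle convention and illustrative names; it is not the patent's implementation.

```python
import math

def scan_to_points(scan, pose):
    """Convert sensor data given as (angle_deg, distance) pairs,
    measured from a robot at pose (x, y, theta_deg), into Cartesian
    points in the world frame."""
    x0, y0, theta = pose
    points = []
    for angle_deg, dist in scan:
        a = math.radians(theta + angle_deg)  # absolute beam direction
        points.append((x0 + dist * math.cos(a), y0 + dist * math.sin(a)))
    return points

# A beam straight ahead (0 deg, 2.0 m) from pose (0, 0, 0) lands at (2, 0);
# a beam at +90 deg, 1.0 m lands at (0, 1).
pts = scan_to_points([(0.0, 2.0), (90.0, 1.0)], (0.0, 0.0, 0.0))
```

Note that the patent expresses angles with the right side as positive relative to the AGV's front; flipping the sign of `angle_deg` would adapt the sketch to that convention.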
- Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
- the laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
- Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
- the “sensor data” output from the laser range finder 15 is a plurality of sets of vector data, each set consisting of an angle θ and a distance L.
- the angle θ changes in steps of 0.25 degrees within a range of −135 degrees to +135 degrees.
- the angle may be expressed with the right side as positive and the left side as negative with respect to the front of the AGV 10.
- the distance L is the distance to the object measured for each angle θ.
- the distance L is obtained by multiplying half of the difference between the emission time of the infrared laser beam 15a and the reception time of its reflected light (that is, half the round-trip time of the laser beam) by the speed of light.
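The time-of-flight relation above (half the round-trip time multiplied by the speed of light) can be written directly. The function name is illustrative, not from the patent.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(t_emit, t_receive):
    """Distance to the reflection point: half the round-trip time of
    the laser pulse multiplied by the speed of light."""
    return (t_receive - t_emit) / 2.0 * SPEED_OF_LIGHT

# A 20 ns round trip corresponds to roughly 3 m.
d = tof_distance(0.0, 20e-9)
```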
- FIG. 6 also shows a specific configuration of the traveling control device 14 of the AGV 10.
- the AGV 10 includes a travel control device 14, a laser range finder 15, four motors 16 a to 16 d, and a drive device 17.
- the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
- the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
- the laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as measurement results to the microcomputer 14a, the positioning device 14e and / or the memory 14b.
- the microcomputer 14a is a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
- the microcomputer 14a is a semiconductor integrated circuit.
- the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal to the driving device 17 to control the driving device 17 and adjust the current flowing through the motor. As a result, each of the motors 16a to 16d rotates at a desired rotation speed.
- the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
- the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
- the storage device 14c is a non-volatile semiconductor memory device that stores map data.
- the storage device 14c may instead be a magnetic recording medium typified by a hard disk, or an optical recording medium typified by an optical disk, together with a head device for writing and/or reading data on the recording medium and a control device for that head device.
- details of the map data will be described later with reference to FIG.
- the communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Wi-Fi (registered trademark) standard.
- the positioning device 14e receives sensor data from the laser range finder 15 and reads map data stored in the storage device 14c.
- the positioning device 14e performs a process of comparing the sensor data and the map data to identify the self position. The specific operation of the positioning device 14e will be described later.
- the microcomputer 14a and the positioning device 14e are shown as separate components, but this is an example; a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both may be used instead.
- FIG. 6 shows a chip circuit 14g including the microcomputer 14a and the positioning device 14e.
- the microcomputer 14a, the positioning device 14e, and / or the chip circuit 14g may be referred to as a computer or a processing circuit.
- an example in which the microcomputer 14a and the positioning device 14e are separately provided will be described.
- the four motors 16a to 16d are attached to the four wheels 11a to 11d, respectively, and rotate each wheel.
- the number of motors is an example; two or three motors may be used, or five or more.
- the drive device 17 has motor drive circuits 17a to 17d for adjusting the current flowing through each of the four motors 16a to 16d.
- Each of the motor drive circuits 17a to 17d is a so-called inverter circuit, and the current flowing to each motor is turned on or off by the PWM signal transmitted from the microcomputer 14a, thereby adjusting the current flowing to the motor.
- the process of generating a map can be divided into a first process and a second process.
- the first process is a process for generating a plurality of intermediate maps.
- the second process is a process of determining a plurality of feature points from the plurality of intermediate maps and generating one map.
- the first process is realized by SLAM (Simultaneous Localization and Mapping) technology as an example.
- the AGV 10 scans the surrounding space by operating the laser range finder 15 while actually traveling in a factory where the AGV 10 is used, and generates a map while estimating its own position.
- the AGV 10 may travel on a specific route while being controlled by the administrator, and generate a map from the sensor data acquired by the laser range finder 15.
- FIGS. 7 to 11 each show an AGV 10 that generates a map while moving.
- FIG. 7 shows an AGV 10 that scans the surrounding space using the laser range finder 15. Laser light is emitted at every predetermined step angle, and scanning is performed.
- the positions of the reflection points of the laser beam are indicated using a plurality of points represented by the symbol “•”, such as point 4.
- the positioning device 14e accumulates the position of the black spot 4 obtained as a result of traveling, for example, in the memory 14b.
- the map is gradually completed by continuously scanning while the AGV 10 travels. In FIGS. 8 to 11, only the scan range is shown for the sake of simplicity.
- the scan range is also an example, and is different from the above-described example of 270 degrees in total.
- FIG. 12 schematically shows the completed intermediate map 30.
- the positioning device 14e accumulates the data of the intermediate map 30 in the memory 14b or the storage device 14c.
- the number or density of black spots shown in the figure is an example.
- the AGV 10 generates a plurality of intermediate maps by running a plurality of times at intervals, and stores them in the memory 14b or the storage device 14c.
- the AGV 10 runs at a time interval that can be appropriately determined by those skilled in the art, such as several days, hours, tens of minutes, minutes, etc., and generates a plurality of intermediate maps.
- the reason why the AGV 10 generates a plurality of intermediate maps is to distinguish objects that have not moved (referred to as “fixed objects”) from objects that have moved (referred to as “movable objects”). Reflection points existing at the same position across a plurality of intermediate maps can be regarded as representing a fixed object. On the other hand, a reflection point whose position changes between intermediate maps means that an existing movable object has been removed or a new movable object has been placed. In this specification, an object that can in fact be moved, but was not moved during the generation of the plurality of intermediate maps, is also called a “fixed object”. Conversely, even an object generally considered to move rarely, such as a wall, is called a “movable object” if it was moved during the generation of the plurality of intermediate maps.
- by using the map, the AGV can identify its own position.
- an example of processing for identifying a self-location using a single map will be described.
- FIG. 13 to FIG. 15 schematically show a procedure for general position identification processing.
- the AGV has acquired in advance a map (hereinafter referred to as “reference map 30”) corresponding to the intermediate map 30 of FIG.
- the AGV acquires the sensor data 32 shown in FIG. 13 at a predetermined time interval or at all times during traveling, and executes a process for identifying the self position.
- the AGV sets various regions (for example, regions 34a, 34b, 34c) whose positions and angles are varied on the reference map 30, and collates the plurality of reflection points included in each region with the reflection points included in the sensor data 32.
- FIG. 14 shows the points (for example, point 5) determined to match as a result of the collation, indicated by a distinct symbol. If the ratio of the number of matching points is larger than a predetermined reference value, the AGV determines that the sensor data 32 coincides with the plurality of black points in the region 34d.
- the AGV determines the emission position of the laser beam that would produce each black spot included in the region 34d, that is, its own position.
- the identified self-position 36 is represented by the symbol “X”.
- in this way, the AGV can identify its own position.
- the AGV cannot always correctly identify the self position, and a position different from the actual position may be erroneously identified as the self position.
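The matching criterion described above — accept a candidate region when the fraction of coinciding points exceeds a reference value — can be sketched as a brute-force check. Function and parameter names are illustrative assumptions, not the patent's.

```python
def match_ratio(scan_points, map_points, tol=0.05):
    """Fraction of scan points lying within `tol` of some map point.
    Brute force for clarity; a real implementation would use a grid
    or k-d tree for the nearest-point lookup."""
    def near(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol * tol
    if not scan_points:
        return 0.0
    hits = sum(1 for p in scan_points if any(near(p, q) for q in map_points))
    return hits / len(scan_points)

map_pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
scan_pts = [(0.01, 0.0), (1.0, 0.01), (5.0, 5.0)]
r = match_ratio(scan_pts, map_pts)  # 2 of 3 points coincide -> 0.666...
```

The misidentification the patent warns about occurs when such a ratio happens to exceed the reference value for a region that does not correspond to the robot's actual surroundings.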
- FIG. 16 shows an example in which one or more obstacles 38 are placed in a part of the space where the AGV 10 travels.
- the sensor data acquired by the AGV differs depending on the presence or absence of the obstacle 38.
- FIG. 17 shows an example of the sensor data 40 reflecting the position of the obstacle 38.
- FIG. 18 is a schematic diagram of a matching process between the reference map 30 and the sensor data 40.
- the AGV sets various regions 34a, 34b, 34c, and the like, and collates the plurality of reflection points included in each region with the sensor data 40.
- FIG. 19 shows a region 34e including a set 6 of a plurality of points erroneously determined to match as a result of the matching process.
- the first process is performed a plurality of times to obtain a plurality of intermediate maps, and then the second process is further performed.
- FIG. 20 schematically shows the procedure of the second process performed by the positioning device 14e.
- the positioning device 14e generates one map 50a from the six intermediate maps 30a to 30f acquired with a time interval.
- the number of intermediate maps may be any number as long as it is plural. For example, the number may be 2 to 5, or 7 or more.
- Each black dot constituting the six intermediate maps 30a to 30f indicates the position of the reflection point of the laser beam.
- the positioning device 14e determines feature points from the intermediate maps 30a to 30f.
- a “feature point” is a point having a relatively high “degree of coincidence” among the plurality of intermediate maps acquired over multiple runs; more specifically, it means a virtual coordinate point whose degree of coincidence is greater than a predetermined value.
- Each virtual coordinate point is one unit for determining “free space” through which laser light can pass or “occupied space” through which laser light cannot pass.
- an example of the “occupied space” is the surface of the object existing in the space and the inside of the object.
- the degree of coincidence can be calculated by various methods. For example, when the “degree of coincidence” is expressed in terms of how well the positions of the black dots constituting the intermediate maps 30a to 30f coincide, it can be calculated by the following method.
- the positioning device 14e adjusts the angles of the intermediate maps 30a to 30f on the basis of one or a plurality of feature amounts that are commonly included in the intermediate maps 30a to 30f, and superimposes the intermediate maps 30a to 30f.
- the “feature amount” is represented by a positional relationship (arrangement) of a plurality of black spots representing a position where a movable object is considered to be substantially not placed, for example, a staircase or a lifting device.
- the positioning device 14e determines whether there are four or more intermediate maps having black spots at the same position. When there are four or more intermediate maps having black spots at the same position, the black spots at the positions are reflected as “feature points” in the newly generated map. On the other hand, when there are less than three intermediate maps having black spots at the same position, the black spots at the positions are excluded from “feature points”. Note that “four” is an example of a threshold value. The feature point only needs to be included in common on at least two intermediate maps, and in this case, the threshold value is “two”.
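The thresholded vote just described (keep a black spot as a feature point if at least four of the six intermediate maps contain it) can be sketched as follows, with maps modeled as sets of discretized positions. The names are illustrative, not from the patent.

```python
from collections import Counter

def select_feature_points(intermediate_maps, threshold=4):
    """Keep the positions that appear as black spots in at least
    `threshold` of the intermediate maps (four of six in the
    description's example); positions below the threshold are
    excluded from the feature points."""
    votes = Counter()
    for m in intermediate_maps:
        votes.update(set(m))  # each map votes at most once per position
    return {pos for pos, n in votes.items() if n >= threshold}

maps = [
    {(0, 0), (1, 0)}, {(0, 0), (1, 0)}, {(0, 0), (2, 2)},
    {(0, 0), (1, 0)}, {(0, 0), (1, 0)}, {(0, 0), (3, 3)},
]
fp = select_feature_points(maps)  # (0,0) appears 6 times, (1,0) 4 times
```

Lowering `threshold` to 2 corresponds to the patent's minimal case in which a feature point need only be shared by two intermediate maps.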
- the positioning device 14e sets a virtual coordinate position, determines a black spot closest to the coordinate position for each intermediate map, obtains a deviation amount from the coordinate position, and further obtains a sum of deviation amounts.
- the positioning device 14e may obtain the difference or ratio between the obtained sum and a predetermined allowable value as the degree of coincidence. When the degree of coincidence is obtained for each virtual coordinate position on the generated map, the positioning device 14e can determine the feature point.
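The deviation-sum scheme just described — for each virtual coordinate, find the nearest black spot in every intermediate map, sum the deviations, and compare the sum with an allowance — is sketched below. The difference form is chosen arbitrarily (the description also permits a ratio), and all names are assumptions.

```python
import math

def coincidence_degree(coord, intermediate_maps, allowance):
    """Sum, over the intermediate maps, of the distance from `coord`
    to each map's nearest black spot, then express the degree of
    coincidence as the difference from a predetermined allowance
    (larger = better coincidence)."""
    total = sum(min(math.dist(coord, p) for p in m)
                for m in intermediate_maps)
    return allowance - total

# A coordinate that sits exactly on a black spot in both maps has zero
# total deviation, so its degree of coincidence equals the allowance.
deg = coincidence_degree((0.0, 0.0),
                         [{(0.0, 0.0)}, {(0.0, 0.0), (1.0, 1.0)}],
                         1.0)
```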
- the coordinate position determined to be each feature point represents the surface of a fixed object that exists in common on a plurality of maps.
- the coordinate position that has not been determined to be a feature point represents the position where the surface of the removed movable object exists or the position where the surface of the newly placed movable object exists.
- the positioning device 14e of the AGV 10 generates one map 50a in which the feature points are determined. As shown in FIG. 20, the map 50a shows the positions of a plurality of feature points 51, while positions 52 where no feature point 51 exists are left blank. The map 50a is a map in which only the feature points are identified.
- FIG. 21 shows one map 50b that includes distinguishable feature points and non-feature points.
- a plurality of feature points 51 and a plurality of non-feature points 53 are shown to be identifiable.
- each point is collated against the points acquired as sensor data with a weight corresponding to the value of its ratio. That is, the “ratio” represents the “weight” used in the calculation during the position identification process.
- a black point having a weight of 1, which is always included in the calculation during the position identification process, may be referred to as a “feature point”, and a black point with a weight less than 1 as a non-feature point.
- a black point having a weight greater than or equal to a threshold, for example 0.6, may be referred to as a “feature point”, and a black point having a weight less than 0.6 as a non-feature point.
- a black point with a weight greater than 0 may be referred to as a “feature point”, and a black point with a weight of 0 as a non-feature point.
- a black point having a weight greater than or equal to, or strictly greater than, an arbitrary threshold value may be referred to as a feature point, and a black point whose weight falls below that threshold as a non-feature point.
- a black point with a weight set may be referred to as a “feature point”, and a black point with no weight set may be referred to as a “non-feature point”.
- the positioning device 14e performs a process of identifying the self-position by comparing the sensor data with the plurality of feature points included in the map 50a or 50b. Below, the process of collating the sensor data 40 shown in FIG. 17 with the map 50a is described as an example.
- FIG. 22 is a schematic diagram of the matching process between the map 50a and the sensor data 40.
- the positioning device 14e sets various regions 34a, 34b, 34c, etc., and collates a plurality of feature points included in each region with the sensor data 40 by the method described with reference to FIG.
- FIG. 23 marks each point determined to be a match as a result of the collation (for example, point 7) with a dedicated symbol.
- the example of FIG. 23 differs from the example of FIG. 19 in that, in FIG. 23, positions (or regions) 52 where no feature point 51 exists are not used in the collation process. Because the positioning device 14e performs the matching process using sensor data and feature points with a high degree of coincidence, a more reliable matching result can be obtained. As a result, the positioning device 14e can correctly identify the actually existing position 42 as the self-position.
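A minimal sketch of this kind of collation, assuming a grid map reduced to a set of feature-point cells and candidate offsets standing in for the regions 34a, 34b, 34c (names and data layout are illustrative, not the patented method):

```python
def best_offset(feature_map, sensor_points, candidate_offsets):
    # Score = number of sensor points that, shifted by the candidate offset,
    # land on a feature point; blank positions 52 contribute nothing.
    def score(off):
        dx, dy = off
        return sum((x + dx, y + dy) in feature_map for x, y in sensor_points)
    return max(candidate_offsets, key=score)

features = {(2, 2), (3, 4)}
scan = [(0, 0), (1, 2)]                              # sensor data in the AGV frame
print(best_offset(features, scan, [(0, 0), (2, 2)]))  # (2, 2)
```

The offset with the highest score plays the role of the correctly identified self-position 42.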
- the positioning device 14e limits the range in which the area is set in order to reduce the amount of calculation.
- the AGV 10 travels by receiving from the travel management device 20 an instruction about the next target position Mn+1. Therefore, regions may be set so as to include the travel route between the positions Mn and Mn+1, and the plurality of feature points in each region may be compared with the sensor data 40. The comparison calculation can thereby be omitted for regions far away from the current travel route.
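The route-based restriction might be sketched like this, treating each region as its center point and keeping only regions near the segment from Mn to Mn+1 (the margin value and this representation are assumptions):

```python
import math

def near_route(m_n, m_n1, region_centers, margin=2.0):
    # Keep only regions whose center lies within `margin` of the segment
    # from the current position Mn to the next target position Mn+1.
    (x1, y1), (x2, y2) = m_n, m_n1
    dx, dy = x2 - x1, y2 - y1
    def dist(p):
        px, py = p
        if dx == 0 and dy == 0:
            return math.hypot(px - x1, py - y1)
        # Project the point onto the segment, clamping to its endpoints.
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (x1 + t * dx), py - (y1 + t * dy))
    return [p for p in region_centers if dist(p) <= margin]

print(near_route((0, 0), (10, 0), [(5, 1), (5, 10), (12, 0)]))  # [(5, 1), (12, 0)]
```

Regions far from the route, such as (5, 10) above, are simply dropped from the comparison calculation.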
- the positioning device 14e may perform a process of matching the map 50b with the sensor data 40 by giving different weights to the feature points 51 and the non-feature points 53.
- the weight of each feature point 51 is set larger than the weight of the non-feature point 53.
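Under that weighting, a matching score could be computed as below; map 50b is modeled as a dictionary from point to weight, and the concrete weight values are illustrative assumptions:

```python
def weighted_score(weighted_map, sensor_points, offset=(0, 0)):
    # Each matched point contributes its weight: feature points 51 carry a
    # larger weight than non-feature points 53; unmatched positions add 0.
    dx, dy = offset
    return sum(weighted_map.get((x + dx, y + dy), 0.0) for x, y in sensor_points)

map_50b = {(2, 2): 1.0, (3, 3): 0.3}   # feature point vs. non-feature point (assumed weights)
print(weighted_score(map_50b, [(2, 2), (3, 3), (9, 9)]))  # 1.3
```

With feature points weighted higher, a pose that explains the feature points dominates one that only grazes non-feature points.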
- FIG. 24 is a flowchart illustrating a processing procedure for generating a new map.
- In step S10, the positioning device 14e of the AGV 10 senses the surrounding space while traveling in the factory and acquires a plurality of intermediate maps.
- In step S11, the positioning device 14e determines feature points having a degree of coincidence greater than a predetermined value between the plurality of intermediate maps. The method for calculating the degree of coincidence and the method for determining the feature points are as described above.
- In step S12, the positioning device 14e generates a map that includes the feature points in an identifiable manner.
- the map generated in step S12 may be the map 50a shown in FIG. 20 or the map 50b shown in FIG.
- the above-described processing may be performed by the microcomputer 14a instead of the positioning device 14e, or may be performed by the chip circuit 14g in which the microcomputer 14a and the positioning device 14e are integrated. That is, the “processing circuit” may perform the above processing.
- Although the above-described process uses all of the plurality of intermediate maps, it is sufficient to use at least two intermediate maps instead of all of them.
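Steps S10-S12 can be condensed into a short sketch; the fractional weighting rule is an assumption (the embodiment only requires that feature and non-feature points be distinguishable, as in map 50b):

```python
from collections import Counter

def generate_weighted_map(intermediate_maps):
    # S11: count, per position, how many intermediate maps contain a black
    # point, and use that fraction as the point's weight (assumed rule).
    counts = Counter()
    for m in intermediate_maps:          # S10's output: maps acquired while traveling
        counts.update(m)
    n = len(intermediate_maps)
    # S12: the returned dict is a map in which each point is identifiable as
    # feature (high weight) or non-feature (low weight), like map 50b.
    return {cell: c / n for cell, c in counts.items()}

print(generate_weighted_map([{(0, 0)}, {(0, 0), (1, 1)}]))  # {(0, 0): 1.0, (1, 1): 0.5}
```

Thresholding these weights (for example at 1.0) would instead reproduce a map-50a-style output with only feature points.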
- the positioning device 14e of the AGV 10 generates a plurality of intermediate maps from the sensor data, and further generates one map from the plurality of intermediate maps.
- an external signal processing device other than the AGV 10, for example the travel management device 20, may perform the process of generating one final map from the sensor data.
- a system including the AGV 10 and the signal processing device is referred to as a “map generation system”.
- a system in which the AGV 10 performs a process of identifying its own position using one map generated from a plurality of intermediate maps is called a “mobile system”.
- In the following, the travel management device 20 is exemplified as the external signal processing device.
- the AGV 10 makes a round of the factory, acquires the group of sensor data necessary for the intermediate maps, and writes it to, for example, a removable recording medium.
- the administrator removes the recording medium and causes the traveling management device 20 to read the written group of sensor data.
- the AGV 10 may wirelessly transmit the obtained sensor data to the travel management device 20 for each scan or several scans while traveling in the factory.
- the travel management device 20 can acquire a group of sensor data necessary for the intermediate map.
- the intermediate map is created not by the AGV 10 but by the travel management device 20.
- FIG. 25 shows the configuration of the mobile system 60.
- FIG. 25 also shows a hardware configuration of the travel management device 20.
- the travel management device 20 includes a CPU 21, a memory 22, a storage device 23, a communication circuit 24, and an image processing circuit 25.
- the CPU 21, the memory 22, the storage device 23, the communication circuit 24, and the image processing circuit 25 are connected by a communication bus 27 and can exchange data with each other.
- the CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20.
- the CPU 21 is a semiconductor integrated circuit.
- the memory 22 is a volatile storage device that stores a computer program executed by the CPU 21.
- the memory 22 can also be used as a work memory when the CPU 21 performs calculations.
- the storage device 23 stores a group of sensor data received from the AGV 10.
- the storage device 23 may store data of a plurality of intermediate maps generated from a group of sensor data, and may further store data of one map generated from a plurality of intermediate maps.
- the storage device 23 may be a nonvolatile semiconductor memory, a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
- the storage device 23 can also store position data indicating each position that can be a destination of the AGV 10, which is necessary for the device to function as the travel management device 20.
- the position data can be represented by coordinates virtually set in the factory by an administrator, for example.
- the location data is determined by the administrator.
- the communication circuit 24 performs wired communication based on, for example, the Ethernet (registered trademark) standard.
- the communication circuit 24 is connected to the wireless access points 2a, 2b and the like by wire, and can communicate with the AGV 10 via the wireless access points 2a, 2b and the like.
- the travel management device 20 receives individual data of a plurality of intermediate maps or a group of data from the AGV 10 via the communication circuit 24, and transmits the generated data of one map to the AGV 10.
- the communication circuit 24 receives the data of the position to which the AGV 10 should go from the CPU 21 via the bus 27 and transmits it to the AGV 10.
- the communication circuit 24 transmits data (for example, notification) received from the AGV 10 to the CPU 21 and / or the memory 22 via the bus 27.
- the image processing circuit 25 is a circuit that generates video data to be displayed on the external monitor 29.
- the image processing circuit 25 operates only when the administrator operates the travel management device 20; further details are omitted in the present embodiment.
- the monitor 29 may be integrated with the travel management apparatus 20. Further, the CPU 21 may perform the processing of the image processing circuit 25.
- the CPU 21 or the image processing circuit 25 of the travel management device 20 reads a group of sensor data with reference to the storage device 23.
- Each item of sensor data is, for example, vector data including a set of the position and orientation of the AGV 10 when the sensor data was acquired, the angle at which the laser light was emitted, and the distance from the emission point to the reflection point.
- the CPU 21 or the image processing circuit 25 converts each item of sensor data into the position of a coordinate point in the orthogonal coordinate system. By converting all of the sensor data, a plurality of intermediate maps is obtained.
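The conversion of one sensor reading into an orthogonal-coordinate point can be sketched as follows (a standard polar-to-Cartesian transform; the function name and argument layout are assumptions):

```python
import math

def to_world_point(pose_x, pose_y, pose_theta, beam_angle, distance):
    # One sensor reading -> one coordinate point in the orthogonal (world)
    # coordinate system: rotate the beam by the AGV's heading, then translate
    # by the AGV's position at the moment the reading was taken.
    a = pose_theta + beam_angle
    return (pose_x + distance * math.cos(a), pose_y + distance * math.sin(a))

# AGV at (1, 1) facing +y (theta = pi/2), beam straight ahead, 2 m to the wall:
x, y = to_world_point(1.0, 1.0, math.pi / 2, 0.0, 2.0)
print(round(x, 6), round(y, 6))  # 1.0 3.0
```

An intermediate map is then the collection of such points over all scans in one traversal.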
- the CPU 21 or the image processing circuit 25 generates one map from a plurality of intermediate maps. That is, in the process of FIG. 24, the travel management device 20 may perform the above-described steps S11 and S12.
- the generated map data is temporarily stored in the memory 22 or the storage device 23.
- the generated map data is then transmitted to the AGV 10 that generated the intermediate map data, either wirelessly via the access points 2a and 2b or via a removable recording medium.
- the intermediate map data may be transmitted to another AGV 10.
- the travel management device 20 generates one map from a group of sensor data.
- the travel management device 20 is only an example. Any signal processing device that includes a signal processing circuit (or computer) corresponding to the CPU 21, a memory, and a communication circuit can generate one map from a group of sensor data. In that case, the functions and configuration required to act as the travel management device 20 can be omitted from the signal processing device.
- Modification 2 is an application of Modification 1.
- the entire factory may be divided into a plurality of smaller zones, and an intermediate map may be created using one AGV 10 for each zone. For example, a 150 m × 100 m factory is divided into six 50 m × 50 m zones.
- the AGV 10 in each zone acquires a group of sensor data, and causes an external signal processing device other than the AGV 10, such as the travel management device 20, to acquire the group of sensor data by any of the methods described in the first modification.
- the travel management device 20 generates a plurality of intermediate maps for each zone by the above-described processing, and further generates one zone map from the plurality of intermediate maps. Then, the travel management device 20 creates an integrated map that combines the zone maps into one.
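The zone-map integration step might look like the following sketch, where each zone map is paired with the origin offset of its zone in factory coordinates (an assumed representation of the administrator's zone layout):

```python
def integrate_zone_maps(zone_maps):
    # Shift each zone map from its local coordinates into factory coordinates
    # using the zone's origin offset, then take the union of all points.
    integrated = set()
    for (ox, oy), cells in zone_maps:
        integrated |= {(x + ox, y + oy) for x, y in cells}
    return integrated

# Two 50 m zones side by side: the second zone's origin is at x = 50.
zones = [((0, 0), {(1, 1)}), ((50, 0), {(2, 3)})]
print(integrate_zone_maps(zones))  # the points (1, 1) and (52, 3), in set order
```

The resulting integrated map can be sent whole to every AGV 10, or cut back down per zone before transmission.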
- the travel management device 20 may transmit the resulting integrated map to all of the AGVs 10, or may cut out and transmit only the relevant zone to each AGV 10 whose movement zone is determined in advance.
- the AGV 10 generates one map from a plurality of intermediate maps.
- after obtaining a plurality of intermediate maps, the positioning device 14e of the AGV 10 collates the sensor data with the data of each intermediate map, and determines, for each intermediate map, a plurality of feature points having a degree of coincidence greater than a predetermined value.
- the positioning device 14e collates the sensor data with the data of the first intermediate map, and determines a plurality of first feature points for the first intermediate map.
- the positioning device 14e collates the sensor data with the data of the second intermediate map, and determines a plurality of second feature points for the second intermediate map.
- the positioning device 14e determines a plurality of feature points (referred to as “common feature points”) included in common with the plurality of first feature points and the plurality of second feature points.
- the common feature point corresponds to a feature point included in the map 50a or 50b in the above-described embodiment.
- the positioning device 14e may determine, for each position, whether the ratio of the number of intermediate maps having a feature point at that position to the total number of intermediate maps is greater than or equal to a threshold. As in the above-described embodiment, the positioning device 14e may determine that the position is a common feature point when the ratio is greater than, or greater than or equal to, the threshold value.
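This ratio test of Modification 3 can be sketched as follows (hypothetical names; each map's own feature points are given as a set):

```python
from collections import Counter

def common_feature_points(per_map_feature_points, ratio_threshold):
    # A position is a common feature point when the ratio of intermediate maps
    # whose own feature points include it reaches the threshold.
    counts = Counter()
    for pts in per_map_feature_points:
        counts.update(pts)
    n = len(per_map_feature_points)
    return {p for p, c in counts.items() if c / n >= ratio_threshold}

per_map = [{(0, 0), (1, 1)}, {(0, 0)}, {(0, 0), (2, 2)}]
print(common_feature_points(per_map, 0.5))  # {(0, 0)} — the only point on >= 50% of maps
```

The returned set plays the role of the feature points of map 50a or 50b in the main embodiment.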
- the positioning device 14e identifies the self-position by using the collation result between the common feature point data and the sensor data.
- the “matching result” may be the result of the positioning device 14e collating the common feature point data with the sensor data, or it may be the result of processing performed before the common feature points were determined.
- Modification 4: In the examples described above, a map of the two-dimensional space in which the AGV 10 travels is assumed. However, a three-dimensional map may be generated using a laser range finder that can also scan the space in the height direction.
- the AGV 10 generates a plurality of intermediate maps including not only the plane direction but also the height direction. Then, taking the height-direction element into account as well, points having a degree of coincidence greater than a predetermined value between the plurality of intermediate maps are determined as the plurality of feature points. Using a single map that includes these feature points in an identifiable manner, the self-position can be identified using feature amounts in the height direction as well.
- the technology of the present disclosure can be widely used in a mobile body that performs processing for identifying a self-position, a travel management device that controls the mobile body, and a management system that includes the mobile body and the travel management device.
- 2a, 2b wireless access point 10 automatic guided vehicle (AGV), 14 travel control device, 14a microcomputer, 14b memory, 14c storage device, 14d communication circuit, 14e positioning device, 15 laser range finder, 16a-16d motor, 17 drive Device, 17a-17d motor drive circuit, 20 travel management device
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention relates to a mobile body (10) comprising: motors (16a-16d); a drive device (17) for the motors; a sensor (15) that senses surrounding spaces and outputs sensor data; a storage device (14c) that stores data of a plurality of intermediate maps, each of the plurality of intermediate maps being generated from the sensor data on the surrounding spaces sensed by the sensor at different points in time; and processing circuits (14a, 14e, 14g) that generate a single map from the plurality of intermediate maps. The single map includes a plurality of feature points (51) in an identifiable manner. Each of the plurality of feature points indicates a position where the degree of coincidence between the plurality of intermediate maps exceeds a prescribed value. The processing circuits perform a process of matching the single map against the sensor data newly output by the sensor and identifying the self-position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019509044A JPWO2018180175A1 (ja) | 2017-03-27 | 2018-03-01 | 移動体、信号処理装置およびコンピュータプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-061668 | 2017-03-27 | ||
JP2017061668 | 2017-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018180175A1 true WO2018180175A1 (fr) | 2018-10-04 |
Family
ID=63676989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/007787 WO2018180175A1 (fr) | 2017-03-27 | 2018-03-01 | Corps mobile, dispositif de traitement de signal et programme informatique |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2018180175A1 (fr) |
WO (1) | WO2018180175A1 (fr) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005326944A (ja) * | 2004-05-12 | 2005-11-24 | Hitachi Ltd | レーザー計測により地図画像を生成する装置及び方法 |
JP2010277548A (ja) * | 2009-06-01 | 2010-12-09 | Hitachi Ltd | ロボット管理システム、ロボット管理端末、ロボット管理方法およびプログラム |
JP2012093811A (ja) * | 2010-10-25 | 2012-05-17 | Hitachi Ltd | ロボットシステム及び地図更新方法 |
JP2015111336A (ja) * | 2013-12-06 | 2015-06-18 | トヨタ自動車株式会社 | 移動ロボット |
WO2015193941A1 (fr) * | 2014-06-16 | 2015-12-23 | 株式会社日立製作所 | Système de génération de carte et procédé de génération de carte |
JP2016045874A (ja) * | 2014-08-26 | 2016-04-04 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
- 2018-03-01: WO PCT/JP2018/007787 patent/WO2018180175A1/fr active Application Filing
- 2018-03-01: JP JP2019509044A patent/JPWO2018180175A1/ja active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109724612A (zh) * | 2019-01-14 | 2019-05-07 | 浙江大华技术股份有限公司 | 一种基于拓扑地图的agv路径规划方法及设备 |
US12253383B2 (en) | 2019-01-14 | 2025-03-18 | Zhejiang Huaray Technology Co., Ltd. | Systems and methods for route planning on topographical map using vehicle motion |
Also Published As
Publication number | Publication date |
---|---|
JPWO2018180175A1 (ja) | 2020-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6825712B2 (ja) | 移動体、位置推定装置、およびコンピュータプログラム | |
US10866587B2 (en) | System, method, and computer program for mobile body management | |
JP6816830B2 (ja) | 位置推定システム、および当該位置推定システムを備える移動体 | |
US7899618B2 (en) | Optical laser guidance system and method | |
US9239580B2 (en) | Autonomous mobile robot, self position estimation method, environmental map generation method, environmental map generation apparatus, and data structure for environmental map | |
CN110673612A (zh) | 一种自主移动机器人二维码引导控制方法 | |
JP7111424B2 (ja) | 移動体、位置推定装置、およびコンピュータプログラム | |
JP6711138B2 (ja) | 自己位置推定装置、及び、自己位置推定方法 | |
WO2019054209A1 (fr) | Système et dispositif de création de carte | |
WO2016067640A1 (fr) | Dispositif mobile autonome | |
JP2020064011A (ja) | レーザスキャナのキャリブレーション方法、運搬機械 | |
CN108363395A (zh) | 一种agv自主避障的方法 | |
CN114714357A (zh) | 一种分拣搬运方法、分拣搬运机器人及存储介质 | |
WO2018179960A1 (fr) | Corps mobile et dispositif d'estimation de position locale | |
JP2020166702A (ja) | 移動体システム、地図作成システム、経路作成プログラムおよび地図作成プログラム | |
JP7255676B2 (ja) | 搬送車システム、搬送車、及び、制御方法 | |
JP2019079171A (ja) | 移動体 | |
WO2018180175A1 (fr) | Corps mobile, dispositif de traitement de signal et programme informatique | |
JP7396353B2 (ja) | 地図作成システム、信号処理回路、移動体および地図作成方法 | |
WO2021049227A1 (fr) | Système de traitement d'informations, dispositif de traitement d'informations et programme de traitement d'informations | |
JP2021056764A (ja) | 移動体 | |
JP6795730B6 (ja) | 移動体の管理システム、移動体、走行管理装置およびコンピュータプログラム | |
JP2020107116A (ja) | 自律移動体 | |
JP6687313B1 (ja) | 搬送システム | |
KR20240096070A (ko) | Ros 기반 무인지게차의 슬램 네비게이션 시스템 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18776353; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2019509044; Country of ref document: JP; Kind code of ref document: A
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18776353; Country of ref document: EP; Kind code of ref document: A1