WO2018179960A1 - Corps mobile et dispositif d'estimation de position locale - Google Patents
Corps mobile et dispositif d'estimation de position locale
- Publication number
- WO2018179960A1 (international application PCT/JP2018/005266, also referenced as JP2018005266W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time
- moving body
- positioning device
- estimated value
- moving
- Prior art date
Links
- 238000000034 method Methods 0.000 claims abstract description 30
- 238000004364 calculation method Methods 0.000 claims description 16
- 238000012937 correction Methods 0.000 claims description 6
- 238000012545 processing Methods 0.000 description 23
- 238000004891 communication Methods 0.000 description 22
- 238000010586 diagram Methods 0.000 description 21
- 238000007781 pre-processing Methods 0.000 description 6
- 239000004065 semiconductor Substances 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 4
- 238000004422 calculation algorithm Methods 0.000 description 3
- 238000005259 measurement Methods 0.000 description 3
- 238000004590 computer program Methods 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 239000002245 particle Substances 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 206010027146 Melanoderma Diseases 0.000 description 1
- 238000000342 Monte Carlo simulation Methods 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 230000004807 localization Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 239000000758 substrate Substances 0.000 description 1
- 238000012795 verification Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the present disclosure relates to a moving body and a self-position estimation apparatus used for the moving body.
- development of moving bodies such as drones (unmanned aerial vehicles), autonomous driving cars, and autonomous mobile robots is underway.
- a moving body that performs self-position estimation includes an external sensor such as a laser range sensor, and senses the surrounding space while moving to acquire sensor data. For example, the self-position on an environment map can be identified by collating (matching) local map data around the moving body, created from the sensor data, with wider-range environment map data.
- Japanese Patent Laying-Open No. 2016-224680 discloses a self-position estimation apparatus that includes a first self-position estimation unit and a second self-position estimation unit, and executes an estimation process for each step.
- the first self-position estimating unit obtains a probability distribution of the latest position of the moving body from the sensor data and the environment map, and estimates the first self-position based on the probability distribution.
- the second self-position estimation unit estimates the second self-position by adding the movement distance and movement direction from the previous step to the current step, acquired by odometry, to the final self-position estimated in the previous step.
- the weighted average value of the first self-position and the second self-position is set as the final self-position in the current step.
- the second self-position is calculated based on the movement of the moving body (the movement of one step) during the fixed time from the previous step to the current step. That one-step movement is measured by odometry over the fixed time from the previous step to the current step.
- the embodiment of the present disclosure provides a moving body capable of realizing position estimation with high accuracy regardless of the moving speed, and a self-position estimating apparatus used for the moving body.
- the mobile body of the present disclosure includes a motor, a drive device that controls the motor to move the mobile body, and an external sensor that senses the surrounding space and periodically outputs sensor data.
- it further includes a storage device that stores map data, a positioning device that performs a process of estimating the position of the moving body using the sensor data and the map data and sequentially outputs the estimated value of the position, and an arithmetic circuit.
- when the positioning device is to start the process of estimating the position of the moving body at a second time after outputting the estimated value of the position of the moving body at a first time, the arithmetic circuit corrects the estimated value of the position at the first time based on the distance and direction in which the moving body moves from the first time to the second time, and gives the corrected estimated value to the positioning device as the initial position of the moving body at the second time.
- in an exemplary embodiment, the self-position estimation apparatus of the present disclosure is used by being mounted on a moving body that includes an external sensor that senses the surrounding space and periodically outputs sensor data.
- it includes a storage device that stores map data, a positioning device that performs a process of estimating the position of the moving body using the sensor data and the map data and sequentially outputs the estimated value of the position, and an arithmetic circuit.
- when the positioning device is to start the process of estimating the position of the moving body at a second time after outputting the estimated value of the position of the moving body at a first time, the arithmetic circuit corrects the estimated value of the position at the first time based on the distance and direction in which the moving body moves from the first time to the second time, and gives the corrected estimated value to the positioning device as the initial position of the moving body at the second time.
- with these configurations, the moving distance and direction of the moving body can be calculated according to the processing time of the positioning device. For this reason, the correction is not limited to the period of one step of the sequential processing, and it becomes possible to give a more accurate initial position to the positioning device.
- FIG. 1 is a block diagram illustrating a basic configuration example of a moving object according to the present disclosure.
- FIG. 2 is a diagram illustrating an example of the timing at which sensor data is periodically output from the external sensor 106 and the timing at which a position estimation value is output from the positioning device 110 in the conventional example.
- FIG. 3 is a diagram illustrating an example of a timing at which sensor data is periodically output from the external sensor 106 and a timing at which a position estimation value is output from the positioning device 110 for a moving object according to the present disclosure.
- FIG. 4 is a diagram illustrating an exemplary AGV 10 that travels in a passage 1 in a factory.
- FIG. 5 is a diagram illustrating an overview of an exemplary management system 1000 that manages the running of the AGV 10.
- FIG. 6 is a diagram illustrating an example of each target position set in the travel route of the AGV 10.
- FIG. 7A is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 7B is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 7C is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 8 is an external view of an exemplary AGV 10.
- FIG. 9 is a diagram illustrating a hardware configuration of the AGV 10.
- FIG. 10 is a diagram illustrating the AGV 10 that scans the surrounding space using the LRF 15 while moving.
- FIG. 11 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 12 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 13 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 14 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 15 is a diagram schematically showing the completed map 30.
- FIG. 16 is a diagram schematically illustrating a general position identification process.
- FIG. 17 is a diagram schematically illustrating a general position identification process.
- FIG. 18 is a diagram schematically illustrating a general position identification process.
- FIG. 19 is a flowchart illustrating an example of position identification processing after losing sight of the self position.
- FIG. 20 is a diagram illustrating a hardware configuration of the travel management device 20.
- FIG. 1 shows a basic configuration example of a moving object according to the present disclosure.
- the moving body 100 in this example includes an electric motor (hereinafter simply referred to as "motor") 102, a driving device 104 that moves the moving body 100 by controlling the motor 102, and an external sensor 106 that senses the surrounding space and periodically outputs sensor data.
- a typical example of the moving body 100 is a moving body that has at least one drive wheel (not shown) mechanically coupled to the motor 102 and can travel on the ground by the traction of the drive wheel.
- the moving body 100 further includes a self-position estimation device 200.
- the self-position estimation device 200 includes a storage device 108 that stores map data of the surrounding environment, a positioning device 110 that estimates the position of the moving body 100 using the sensor data and the map data, and an arithmetic circuit 120 that executes various calculations.
- the positioning device 110 sequentially outputs estimated values of the position (self-position) of the moving body 100 when the moving body 100 is moving or stopped.
- the arithmetic circuit 120 calculates position information necessary for the positioning device 110 and supplies the position information to the positioning device 110.
- FIG. 2 is a diagram illustrating an example of timing at which sensor data is periodically output from the external sensor 106 and timing at which a position estimation value is output from the positioning device 110. Black dots in FIG. 2 indicate sensor data output timing.
- the sensor data is output from the external sensor 106 at a cycle Ts.
- the position estimation value is output from the positioning device 110 at the cycle Tp.
- the estimated position values x1, x2, and x3 were normally output at times t1, t2, and t3, respectively. However, at time t4, information indicating “undefined” as the position estimation value or a position estimation value with very low reliability was output. In such a case, the positioning device 110 loses the current position information necessary for the next position estimation, and cannot continuously output the position estimated value at the period Tp. For this reason, it is necessary to provide the positioning device 110 with the highly reliable position estimation value x3 output at time t3 as the current position of the moving body 100.
- the positioning device 110 performs self-position estimation using, for example, an ICP (Iterative Closest Point) matching algorithm.
- the positioning device 110 may perform probabilistic position estimation using a particle filter based on the Monte Carlo method. When the positioning device 110 acquires the estimated position value x3 as the initial position, the positioning device 110 starts position identification processing.
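- as a rough illustration of how a particle-filter-based positioning device might make use of an initial position such as x3, the following minimal sketch (in Python) seeds a set of candidate poses around the supplied initial position; the function name, particle count, and noise parameters are illustrative assumptions and not part of the disclosure.

```python
import math
import random

def init_particles(x0, y0, theta0, n=500, xy_sigma=0.3, th_sigma=0.1):
    """Seed a particle set around a given initial pose (x0, y0, theta0).

    Each particle is one candidate pose with a weight; the spread
    (xy_sigma, th_sigma) reflects how uncertain the supplied initial
    position is.  All names and values here are illustrative assumptions.
    """
    particles = []
    for _ in range(n):
        particles.append({
            "x": random.gauss(x0, xy_sigma),
            "y": random.gauss(y0, xy_sigma),
            "theta": random.gauss(theta0, th_sigma),
            "w": 1.0 / n,
        })
    return particles

# Example: the positioning device receives x3 = (12.0 m, 4.5 m, heading 90 deg)
# as the initial position and starts position identification from there.
particles = init_particles(12.0, 4.5, math.radians(90.0))
```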
- when the traveling speed (moving speed) of the moving body 100 is increased, the distance the moving body 100 moves while the period Tp elapses also increases. For example, when the moving speed of the moving body 100 is 5 meters/second and the period Tp is 100 milliseconds, the moving body 100 has shifted by 0.5 meters from the estimated position x3 of time t3 by time t4. In addition, at time t5, when a further time Tp has elapsed from time t4, the moving body 100 has shifted by 1 meter from the estimated position value x3 at time t3. Unless the traveling of the moving body 100 is stopped, the position of the moving body 100 moves further and further away from the estimated position value x3 at time t3.
- sensor data is continuously output from the external sensor 106 at a cycle Ts shorter than the cycle Tp.
- the position identification process should be performed using the latest sensor data.
- if the initial position of the moving body 100 required for the position identification process deviates greatly from the actual position, there is a problem that the position identification process takes a long time or the position identification cannot be realized.
- the apparatus disclosed in Japanese Patent Laying-Open No. 2016-224680 acquires the moving distance and direction of the moving body 100 in each step of sequential processing by odometry.
- the period of one step corresponds to the period Tp. Therefore, according to this apparatus, when the self-position is lost at time t4, a value (the second self-position) obtained by adding the movement distance and direction from time t3 to time t4 to the position estimated value x3 at time t3 is used as the initial position.
- probabilistic self-position estimation is executed using a particle filter.
- the self-position estimation algorithm does not need to be probabilistic and may be an algorithm based on pattern matching.
- Such a conventional example has the following problems.
- the arithmetic circuit 120 in FIG. 1 executes the following processing in order to solve the above-described problem. This process will be described with reference to FIG. 3.
- assume that the positioning device 110 outputs the estimated value x3 of the position of the moving body 100 at time t3 (first time), and then loses the self-position at time t4 (the position estimated value becomes "undefined"). In this case, the arithmetic circuit 120 starts a position identification process for estimating the position of the moving body 100 at time t6 (second time). At this time, the arithmetic circuit 120 corrects the estimated value x3 of the position at time t3 (first time) based on the distance and direction in which the moving body 100 moves from time t3 (first time) to time t6 (second time). The arithmetic circuit 120 then gives the corrected estimated value (x3 + Δx) to the positioning device 110 as the initial position of the moving body 100 at time t6 (second time).
- the timing at which the corrected estimated value (x3 + Δx) is given to the positioning device 110 is time t5.
- the time from time t5 to time t6 may be zero in an extreme case.
- the correction amount Δx is, for example, the movement distance and direction acquired by odometry from time t3 to time t5.
- when the period (t6-t5) from time t5 to time t6 cannot be ignored, the moving distance and direction of the moving body in the period (t6-t5) may be obtained by calculation and included in the correction amount Δx.
- the period from time t3 to time t6 can be very long. In such a case, it is preferable to determine the position of the moving body 100 at time t6 as the initial position according to the processing time.
- the corrected estimated value (x3 + Δx) may be called the "predicted position" of the moving body 100 at time t6 (second time).
- in the present embodiment, the initial position for the self-position identification process that starts after losing sight of the self-position is calculated as a predicted position based on the moving speed and direction of the moving body 100, taking into account the time required for processing after sensor data acquisition. For this reason, the accuracy of the initial position is improved, and the self-position identification process can be started appropriately. As a result, smooth traveling can be continued without decelerating or stopping the moving body.
- when the positioning device 110 outputs the estimated value of the position of the moving body 100, it may also output information indicating the reliability of the estimation.
- when the reliability of the estimated value of the position of the moving body output from the positioning device 110 between the first time and the second time is less than a set value, the arithmetic circuit 120 provides the corrected estimated value to the positioning device 110 as the initial position of the moving body at the second time.
- the reliability of the estimation can be an evaluation value of the degree of coincidence by pattern matching or a probabilistic index. When the reliability is lower than a preset value, it can be assumed that the self-position has been lost.
- the arithmetic circuit 120 may also respond to a request from the positioning device 110 between the first time and the second time and give the corrected estimated value to the positioning device 110 as the initial position of the moving body 100 at the second time.
- the positioning device 110 can also instruct the arithmetic circuit 120 to calculate the "corrected estimated value" before a position estimated value that is "undefined" or of low reliability is output from the positioning device 110. If the start of the calculation of the "corrected estimated value" by the arithmetic circuit 120 can be moved up in this way, the interval from the first time to the second time can be shortened, so that the period in which the self-position is lost can also be shortened.
- the positioning device 110 estimates the position of the moving body 100 at the second time using the initial position of the moving body 100 at the second time given from the arithmetic circuit 120 and the sensor data output from the external sensor 106 at or before the second time.
- the positioning device 110 outputs the estimated value of the position of the moving body 100 at the second time at the third time.
- the arithmetic circuit 120 can control the traveling of the moving object 100 based on the estimated position value.
- the arithmetic circuit 120 can determine the distance and direction in which the moving body 100 moves between the first time and the second time according to the operating state of the driving device 104. Such distance and direction can be grasped from the operating state of the drive device 104 that controls the motor 102. For example, when the number of motors 102 that generate traction is two, the moving speed and moving direction of the moving body 100 can be determined from the rotational speeds of the individual motors 102. Since the rotational speed of the motor 102 is defined by the operating state of the driving device 104, the moving speed of the moving body 100 can be determined based on the operating state of the driving device 104 without a sensor that directly detects the rotating state of the motor 102. Further, the operating state of the driving device 104 can be known from the contents of the command given to the driving device 104 by the arithmetic circuit 120.
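- a minimal sketch of this determination for the two-motor case follows, assuming a differential-drive layout with one traction motor per side; the wheel radius, track width, and function names below are illustrative assumptions, not values taken from the disclosure.

```python
import math

def body_velocity(rpm_left, rpm_right, wheel_radius=0.1, track_width=0.5):
    """Convert left/right motor speeds [rpm] into the forward speed v [m/s]
    and yaw rate omega [rad/s] of the moving body (differential drive).
    wheel_radius and track_width are illustrative values."""
    v_left = rpm_left * 2.0 * math.pi / 60.0 * wheel_radius
    v_right = rpm_right * 2.0 * math.pi / 60.0 * wheel_radius
    v = (v_right + v_left) / 2.0              # forward speed of the body center
    omega = (v_right - v_left) / track_width  # positive = counterclockwise turn
    return v, omega

def integrate_pose(x, y, theta, v, omega, dt):
    """Advance the pose (x, y, theta) by dt seconds at constant (v, omega),
    giving the distance and direction moved over that interval."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```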
- the moving body 100 further includes an internal sensor that outputs odometry information.
- examples of the internal sensor include a rotary encoder that measures the rotational speed of the motor or wheels, and an inertial measurement unit such as a gyro sensor. The rotational speed of the wheels can also be estimated from the operating state of the drive device.
- the arithmetic circuit 120 in such an embodiment can determine the distance and direction in which the moving body moves from the first time to the second time using the odometry information output from the internal sensor.
- the map data may be data created based on the sensor data periodically output from the external sensor 106 while the moving body 100 is moving, or it may be map data created by another method.
- the map data may be data obtained by integrating the map data of a plurality of zones.
- a typical example of the map data is an occupancy grid map, but the map data is not limited to this.
- the driving device 104 controls the motor 102 based on the command value of the position of the moving body 100 and the estimated value of the position of the moving body 100 output from the positioning device 110. This command value may be given to the driving device 104 from a controller not shown in FIG.
- an automatic guided vehicle is mentioned as an example of a moving body.
- the automatic guided vehicle is also called AGV (Automated Guided Vehicle), and is also described as “AGV” in this specification.
- FIG. 4 shows, for example, the AGV 10 that travels in the passage 1 in the factory.
- FIG. 5 shows an outline of a management system 1000 that manages the running of the AGV 10 according to this example.
- the AGV 10 has map data and travels while recognizing at which position on the map it is currently traveling.
- the travel route of the AGV 10 follows a command from the travel management device 20 of FIG.
- the AGV 10 moves by rotating its wheels with a plurality of built-in motors in accordance with the command.
- the command is transmitted from the traveling management device 20 to the AGV 10 by radio.
- the communication between the AGV 10 and the travel management device 20 can be performed using, for example, the wireless access points 2a, 2b provided near the ceiling of the factory.
- the communication conforms to, for example, the Wi-Fi (registered trademark) standard.
- in the environment of FIG. 4, a plurality of AGVs 10 may travel. The traveling of each of the plurality of AGVs 10 may or may not be managed by the traveling management device 20.
- the outline of the operation of the AGV 10 and the travel management device 20 included in the management system 1000 is as follows.
- the AGV 10 moves from the n-th position toward the (n + 1)-th position (hereinafter, "position M n + 1").
- the target position can be determined by the administrator for each AGV 10, for example.
- when the AGV 10 reaches the target position M n + 1, the AGV 10 transmits an arrival notification (hereinafter referred to as "notification") to the travel management device 20.
- the notification is sent to the travel management device 20 via the wireless access point 2a.
- the AGV 10 collates the output of the external sensor, which senses the surroundings, with the map data, and identifies its self-position. It may then determine whether or not the self-position matches the position M n + 1.
- when the notification is received, the traveling management apparatus 20 generates the next command (the (n + 1)-th command) for moving the AGV 10 from the position M n + 1 to the position M n + 2.
- the (n + 1) th command includes the position coordinates of the position M n + 2 and may further include numerical values such as acceleration time and moving speed during constant speed traveling.
- the traveling management device 20 transmits the (n + 1) th command to the AGV 10.
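- as a simple way to picture the content of such a command, the sketch below defines a container for the values the disclosure mentions; the field names and default values are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class TravelCommand:
    """One travel command sent from the travel management device 20 to the AGV 10.

    The disclosure states only that the command includes the position
    coordinates of the next target position and may further include values
    such as acceleration time and the moving speed during constant-speed
    travel; the field names below are assumptions for illustration."""
    target_x: float            # position coordinates of the next target [m]
    target_y: float
    accel_time: float = 1.0    # time spent accelerating [s] (illustrative)
    cruise_speed: float = 1.0  # speed during constant-speed travel [m/s] (illustrative)
```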
- the AGV 10 analyzes the (n + 1) th command and performs a preprocessing calculation necessary for the movement from the position M n + 1 to the position M n + 2 .
- the preprocessing calculation is, for example, calculation for determining the rotation speed, rotation time, etc. of each motor for driving each wheel of the AGV 10.
- FIG. 6 shows an example of each target position set in the travel route of the AGV 10.
- the interval between two adjacent target positions does not have to be a fixed value and can be determined by an administrator.
- the AGV 10 can move in various directions according to commands from the travel management device 20.
- FIGS. 7A to 7C show examples of movement paths of the AGV 10 that moves continuously.
- FIG. 7A shows the movement path of the AGV 10 when traveling straight. After reaching the position M n + 1 , the AGV 10 can perform a pre-processing calculation, operate each motor according to the calculation result, and continue to move linearly to the next position M n + 2 .
- FIG. 7B shows a movement path of the AGV 10 that makes a left turn at the position M n + 1 and moves toward the position M n + 2 .
- the AGV 10 performs a preprocessing calculation after reaching the position M n + 1 and rotates at least one motor located on the right side in the traveling direction according to the calculation result.
- after the AGV 10 has rotated counterclockwise by an angle θ on the spot, all the motors rotate at a constant speed and the AGV 10 goes straight toward the position M n + 2.
- FIG. 7C shows a movement path of the AGV 10 when moving in a circular arc shape from the position M n + 1 to the position M n + 2 .
- the AGV 10 performs a preprocessing calculation after reaching the position M n + 1, and increases the rotational speed of the outer peripheral motor relative to the inner peripheral motor in accordance with the calculation result. As a result, the AGV 10 can continue to move along an arcuate path toward the next position M n + 2 .
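- one way to picture the preprocessing calculation for the arc-shaped path of FIG. 7C is the sketch below, which derives inner and outer wheel speeds from a desired arc radius and body speed; the track width, function name, and example values are illustrative assumptions.

```python
def arc_wheel_speeds(v_center, arc_radius, track_width=0.5):
    """Wheel speeds for following a circular arc of radius arc_radius [m]
    at body-center speed v_center [m/s].  The outer wheel travels a longer
    path, so its speed is raised relative to the inner wheel.
    track_width is an illustrative value."""
    v_inner = v_center * (arc_radius - track_width / 2.0) / arc_radius
    v_outer = v_center * (arc_radius + track_width / 2.0) / arc_radius
    return v_inner, v_outer

# Example: a left turn along a 2 m radius arc at 1.0 m/s body speed.
v_in, v_out = arc_wheel_speeds(1.0, 2.0)   # inner 0.875 m/s, outer 1.125 m/s
```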
- FIG. 8 is an external view of an exemplary AGV 10 according to the present embodiment.
- FIG. 9 shows the hardware configuration of the AGV 10.
- the AGV 10 includes four wheels 11a to 11d, a frame 12, a transport table 13, a travel control device 14, and a laser range finder (LRF) 15.
- the traveling control device 14 is a device that controls the operation of the AGV 10, and also functions as a self-position estimation device.
- the traveling control device 14 mainly includes a plurality of integrated circuits including a microcomputer (described later), a plurality of electronic components, and a substrate on which the plurality of integrated circuits and the plurality of electronic components are mounted.
- the travel control device 14 performs data transmission / reception with the travel management device 20 and pre-processing calculation described above.
- the LRF 15 is a range sensor that measures the distance to the target by, for example, irradiating the target with an infrared laser beam 15a and detecting the reflected light of the laser beam 15a.
- the LRF 15 corresponds to the external sensor 106 in FIG.
- Other examples of external sensors for sensing the surrounding space to acquire sensor data include an image sensor and an ultrasonic sensor.
- the LRF 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction every 0.25 degrees in a space in the range of 135 degrees to the left and right (total 270 degrees) with respect to the front of the AGV 10, for example. Then, the reflected light of each laser beam 15a is detected.
- distance data from the AGV 10 to the reflection point can be obtained for each of a total of about 1080 different directions obtained by dividing the range of angle 270 degrees every 0.25 degrees.
- the values of 270 degrees and 0.25 degrees are examples, and the scanning mode varies depending on the type of LRF 15.
- the time required for one scan by the LRF 15 is several milliseconds to several tens of milliseconds, for example.
- the LRF 15 outputs sensor data periodically (for example, every several tens of milliseconds) while sensing the surrounding space.
- from the position and posture of the AGV 10 and the scan result of the LRF 15, it is possible to know the arrangement of objects around the AGV 10.
- the position and posture of a moving body are collectively called a pose.
- the position and orientation of the moving body in the two-dimensional plane are expressed by position coordinates (x, y) in the XY orthogonal coordinate system and an angle ⁇ with respect to the X axis, respectively.
- the position and orientation of the AGV 10, that is, the pose (x, y, ⁇ ) may be simply referred to as “position” or “position coordinates” below.
- the positioning device, which will be described later, can identify the self-position (x, y, θ) on the environment map by collating (matching) local map data created from the scan results of the LRF 15 with wider-range environment map data.
- the position of the reflection point viewed from the center position of the radiation of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
- the polar coordinates are local coordinates that move with the AGV 10.
- the LRF 15 outputs sensor data expressed in polar coordinates.
- the LRF 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
- since the structure and operation principle of the LRF 15 are known, further detailed description is omitted in this specification. Examples of objects that can be detected by the LRF 15 are people, luggage, shelves, and walls.
- the “sensor data” output from the LRF 15 is a plurality of sets of vector data in which the angle ⁇ and the distance L are set as one set.
- the angle ⁇ changes by 0.25 degrees within a range of, for example, ⁇ 135 degrees to +135 degrees.
- the angle may be expressed with the right side as positive and the left side as negative with respect to the front of the AGV 10.
- the distance L is the distance to the object measured for each angle ⁇ .
- the distance L is obtained by multiplying half of the difference between the emission time of the laser beam 15a and the reception time of the reflected light (that is, half of the time required for the round trip of the laser beam) by the speed of light.
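- to illustrate how such vector data (angle θ, distance L) can be turned into points on the environment map, the sketch below converts each pair into Cartesian coordinates in the AGV-fixed frame and then into the map frame using the pose (x, y, θ); the function and variable names are illustrative assumptions.

```python
import math

def scan_to_world(scan, pose):
    """Convert LRF sensor data into points in the environment-map frame.

    scan : list of (phi, dist) pairs -- angle [rad] and measured distance [m].
           phi is taken counterclockwise from the front of the AGV; if the
           sensor reports right-of-front as positive (as in the text above),
           negate phi before calling this function.
    pose : (x, y, theta) of the AGV in the map frame.
    """
    x, y, theta = pose
    points = []
    for phi, dist in scan:
        # reflection point in the AGV-fixed (local) frame
        lx = dist * math.cos(phi)
        ly = dist * math.sin(phi)
        # rotate by theta and translate by (x, y) into the map frame
        wx = x + lx * math.cos(theta) - ly * math.sin(theta)
        wy = y + lx * math.sin(theta) + ly * math.cos(theta)
        points.append((wx, wy))
    return points
```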
- FIG. 9 also shows a specific configuration of the traveling control device 14 of the AGV 10.
- the AGV 10 in this example includes a travel control device 14, an LRF 15, four motors 16a to 16d, and a drive device 17.
- the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
- the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
- the LRF 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as measurement results to the microcomputer 14a, the positioning device 14e, and / or the memory 14b.
- the microcomputer 14a is a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
- the microcomputer 14a can operate as the arithmetic circuit 120 in FIG.
- the microcomputer 14a is a semiconductor integrated circuit.
- the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal to the driving device 17 to control the driving device 17 and adjust the current flowing through the motor. As a result, each of the motors 16a to 16d rotates at a desired rotation speed.
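- a minimal sketch of this kind of speed regulation follows, assuming a simple proportional controller that adjusts the PWM duty ratio from the speed error; the gain, limits, and function name are illustrative assumptions and do not describe the actual control law of the drive device.

```python
def update_pwm_duty(duty, target_rpm, measured_rpm, kp=0.0005):
    """One step of a simple proportional speed regulator: nudge the PWM duty
    ratio (0.0 to 1.0) sent to a motor drive circuit so the measured motor
    speed approaches the target.  Gain and limits are illustrative."""
    duty += kp * (target_rpm - measured_rpm)
    return min(max(duty, 0.0), 1.0)
```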
- the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
- the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
- the storage device 14c is a non-volatile semiconductor memory device that stores map data.
- the storage device 14c may instead be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk, together with a head device for writing and/or reading data on the recording medium and a control device for the head device. Details of the map data will be described later with reference to FIG.
- the storage device 14c corresponds to the storage device 108 in FIG.
- the communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Wi-Fi (registered trademark) standard.
- the positioning device 14e receives the sensor data from the LRF 15, and reads the map data stored in the storage device 14c.
- the timing at which the positioning device 14e receives sensor data from the LRF 15 does not have to coincide with the timing at which the LRF 15 outputs sensor data.
- the LRF 15 may output sensor data every 25 milliseconds, and the positioning device 14e may receive the sensor data every 100 milliseconds.
- the positioning device 14e performs a process of comparing the sensor data and the map data to identify the self position. The specific operation of the positioning device 14e will be described later.
- in FIG. 9, the microcomputer 14a and the positioning device 14e are separate components, but this is only an example; they may be realized by a single chip circuit or a semiconductor integrated circuit capable of independently performing the operations of the microcomputer 14a and the positioning device 14e.
- FIG. 9 shows a chip circuit 14g including the microcomputer 14a and the positioning device 14e.
- the microcomputer 14a, the positioning device 14e, and / or the chip circuit 14g may be referred to as a computer, an arithmetic circuit, or a processing circuit.
- hereinafter, an example in which the microcomputer 14a and the positioning device 14e are separately provided will be described.
- the four motors 16a to 16d are attached to the four wheels 11a to 11d, respectively, and rotate each wheel.
- the number of motors mounted on one AGV 10 is not limited to four. Further, the motor for traction does not have to be attached to all four wheels 11a to 11d, and is typically attached to two wheels.
- the AGV 10 may be provided with a wheel and a motor for steering, or a motor for other purposes.
- the drive device 17 has motor drive circuits 17a to 17d for adjusting the current flowing through each of the four motors 16a to 16d.
- the driving device 17 corresponds to the driving device 104 in FIG.
- Each of the motor drive circuits 17a to 17d is a so-called inverter circuit, and the current flowing to each motor is turned on or off by the PWM signal transmitted from the microcomputer 14a, thereby adjusting the current flowing to the motor.
- the map data in the present embodiment can be created by SLAM (Simultaneous Localization and Mapping) technology.
- the AGV 10 scans the surrounding space by operating the LRF 15 while actually traveling in a factory where the AGV 10 is used, and generates a map while estimating its own position.
- the AGV 10 may travel on a specific route while being controlled by the administrator, and generate a map from the sensor data acquired by the LRF 15.
- the map data may be created by "online processing" while the AGV 10 is moving, or it may be created by "offline processing" performed by a computer located outside the AGV 10, using a large amount of sensor data acquired while the AGV 10 was moving.
- FIG. 10 to 14 show the AGV 10 that generates a map while moving.
- FIG. 10 shows an AGV 10 that scans the surrounding space using the LRF 15. A laser beam is emitted at every predetermined angle, and scanning is performed.
- the positions of the reflection points of the laser beam are indicated by a plurality of black dots, such as the point 4 in FIG. 10.
- These plural points form point cloud data (Point Cloud Data).
- the positioning device 14e accumulates the positions of the black dots (such as the point 4) obtained as a result of traveling, for example, in the memory 14b.
- the map is gradually completed by continuously performing scanning while the AGV 10 travels.
- in FIG. 11 to FIG. 14, only the scan range is shown for the sake of simplicity.
- the scan range is also an example, and is different from the above-described example of 270 degrees in total.
- FIG. 15 schematically shows a part of the completed map 30.
- the positioning device 14e accumulates the data of the map 30 in the memory 14b or the storage device 14c.
- the number or density of black spots shown in the figure is an example.
- FIG. 16 to 18 schematically show a procedure of general position identification processing.
- it is assumed that the AGV has acquired in advance a map corresponding to the map 30 of FIG. 15 (hereinafter referred to as "reference map 30").
- the AGV acquires sensor data 32 shown in FIG. 16 at predetermined time intervals, and executes a process of identifying its own position on the reference map 30.
- the AGV sequentially sets various local maps (for example, local maps 34a, 34b, 34c) in which the position and angle of the AGV are changed on the reference map 30, and collates the plurality of reflection points included in each local map with the reflection points included in the sensor data 32. Such collation can be performed by the ICP matching described above, for example. In order to perform the matching efficiently, it is necessary to appropriately set the position and angle of the AGV on the reference map 30, that is, the pose. In the present embodiment, the microcomputer 14a in FIG. 9 sets the initial position by the method described with reference to FIG. 3 and gives it to the positioning device 14e.
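- as a rough, self-contained illustration of this collation step, the sketch below scores candidate poses by projecting the sensor data into the map frame with each candidate and counting how many projected points fall near a reference-map point; an actual implementation would use ICP matching or a particle filter as described above, and every name, value, and the brute-force search itself are assumptions made for illustration.

```python
import math

def score_pose(pose, scan, ref_points, cell=0.05):
    """Score one candidate pose: transform the scan into the map frame and
    count the points that land near a reference-map point.  ref_points is a
    set of map points quantized to a grid of size cell [m]."""
    x, y, theta = pose
    hits = 0
    for phi, dist in scan:
        wx = x + dist * math.cos(theta + phi)
        wy = y + dist * math.sin(theta + phi)
        if (round(wx / cell), round(wy / cell)) in ref_points:
            hits += 1
    return hits

def identify_position(initial_pose, scan, ref_points, span=0.5, step=0.1):
    """Search candidate poses around the given initial pose and return the
    best-scoring one -- a brute-force stand-in for the matching process."""
    x0, y0, th0 = initial_pose
    best, best_score = initial_pose, -1
    offsets = [i * step - span for i in range(int(2 * span / step) + 1)]
    for dx in offsets:
        for dy in offsets:
            for dth in (-0.1, 0.0, 0.1):  # small heading perturbations [rad]
                cand = (x0 + dx, y0 + dy, th0 + dth)
                s = score_pose(cand, scan, ref_points)
                if s > best_score:
                    best, best_score = cand, s
    return best
```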
- FIG. 17 schematically marks the points (for example, the point 5) determined to match as a result of the collation.
- when the local map 34d that minimizes the root mean square of the distances (errors) between corresponding points is selected, the position of the AGV corresponding to the local map 34d is determined as the position of the AGV estimated by the matching.
- the identified self-position 36 is represented by the symbol “X”.
- for the collation, the latest sensor data and the position of the AGV at the time the sensor data was acquired (to be precise, the pose) are required.
- as long as the AGV does not lose sight of its self-position, it is possible to estimate the current self-position with high accuracy by using the previously estimated position (identified position) as the initial position.
- Such an update of the estimated value can be realized, for example, with a period of 100 milliseconds.
- when the AGV loses sight of its own position, if the position of the AGV (the initial position) required to start the collation deviates greatly from the actual position of the AGV, the collation takes a long time. For this reason, the AGV conventionally needs to be stopped or decelerated for the collation.
- the initial position is appropriately predicted according to the moving distance and direction of the AGV and the processing time. For this reason, efficient collation processing is realized based on an appropriate initial position.
- FIG. 19 is a flowchart showing an example of position identification processing after losing sight of the self position. An example of position identification processing will be described with reference to FIGS. 9 and 19.
- the microcomputer 14a in FIG. 9 acquires odometry information in step S10 in FIG. 19. "After losing sight of the self-position" means after time t4 in FIG. 3.
- the odometry information can be acquired from a rotary encoder (not shown) or the like.
- in step S20, the microcomputer 14a corrects the previously identified position and calculates the initial position necessary for the collation (a minimal sketch of this correction follows the description of step S50 below). Specifically, the microcomputer 14a calculates the moving speed of the AGV based on the odometry information. The microcomputer 14a also determines the time that elapses until the time at which the latest sensor data is acquired (time t6 in FIG. 3), and calculates the moving distance of the AGV from this time and the moving speed. Furthermore, the microcomputer 14a predicts the position of the AGV at the time the latest sensor data is acquired (time t6 in FIG. 3) from the moving distance and the moving direction of the AGV.
- in step S30, the microcomputer 14a gives the predicted position to the positioning device 14e as the initial position for the collation.
- in step S40, the positioning device 14e acquires the latest sensor data from the LRF 15.
- in step S50, the positioning device 14e starts the collation for position identification using the initial position and the latest sensor data. After the collation is completed and the self-position is identified, the positioning device 14e outputs the self-position as a position estimation value.
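- the correction of steps S10 to S30 can be pictured with the following sketch: the moving speed and yaw rate are taken from odometry, the elapsed time runs from the last reliable estimate up to the time stamp of the latest sensor data, and the predicted pose is handed to the positioning device as the initial position. The constant-velocity assumption and all names are illustrative; the disclosure only requires that the distance and direction moved during this interval be reflected in the initial position.

```python
import math

def predict_initial_position(last_pose, last_time, v, omega, latest_scan_time):
    """Correct the previously identified pose by the motion that occurred up
    to the moment the latest sensor data was acquired (time t6 in FIG. 3).

    last_pose        : (x, y, theta) output by the positioning device at t3
    last_time        : time stamp of that estimate [s]
    v, omega         : moving speed [m/s] and yaw rate [rad/s] from odometry
    latest_scan_time : time stamp of the newest sensor data [s]
    Assumes roughly constant velocity over the interval (an illustrative
    simplification)."""
    x, y, theta = last_pose
    dt = latest_scan_time - last_time
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Example: last estimate x3 at t3 = 0.0 s, 5 m/s straight ahead, latest scan
# acquired at t6 = 0.3 s -> the predicted pose lies 1.5 m ahead of x3.
predicted = predict_initial_position((0.0, 0.0, 0.0), 0.0, 5.0, 0.0, 0.3)
```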
- FIG. 20 shows a hardware configuration of the travel management device 20.
- the travel management device 20 includes a CPU 21, a memory 22, a storage device 23, a communication circuit 24, and an image processing circuit 25.
- the CPU 21, the memory 22, the storage device 23, the communication circuit 24, and the image processing circuit 25 are connected by a communication bus 27 and can exchange data with each other.
- the CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20.
- the CPU 21 is a semiconductor integrated circuit.
- the memory 22 is a volatile storage device that stores a computer program executed by the CPU 21.
- the memory 22 can also be used as a work memory when the CPU 21 performs calculations.
- the storage device 23 stores map data created by the AGV 10.
- the storage device 23 may be a nonvolatile semiconductor memory, a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
- the storage device 23 may also store data necessary for functioning as the travel management device 20, for example position data indicating each position that can be a destination of the AGV 10.
- the position data can be represented by coordinates virtually set in the factory by an administrator, for example.
- the location data is determined by the administrator.
- the communication circuit 24 performs wired communication based on, for example, the Ethernet (registered trademark) standard.
- the communication circuit 24 is connected to the wireless access points 2a, 2b and the like by wire, and can communicate with the AGV 10 via the wireless access points 2a, 2b and the like.
- the communication circuit 24 receives, from the CPU 21 via the bus 27, data and a command indicating the position to which the AGV 10 should go, and transmits them to the AGV 10. This data and command are received by the communication circuit 14d of the AGV 10 shown in FIG. 9.
- the communication circuit 24 transmits data (for example, notification and position information) received from the communication circuit 14 d (FIG. 9) of the AGV 10 to the CPU 21 and / or the memory 22 via the bus 27.
- the AGV 10 periodically transmits the self-position information (position and angle) output from the positioning device 14 e to the communication circuit 24 of the travel management device 20. This period can be, for example, 100 milliseconds to 1 second.
- the image processing circuit 25 is a circuit that generates video data to be displayed on the external monitor 29.
- the image processing circuit 25 operates exclusively when the administrator operates the travel management device 20. In the present embodiment, further detailed explanation is omitted.
- the monitor 29 may be integrated with the travel management apparatus 20. Further, the CPU 21 may perform the processing of the image processing circuit 25.
- the technique of the present disclosure can be widely used for a moving body that performs a process of identifying a self-position.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present disclosure relates to a moving body comprising: an external sensor (106) that senses the space around the moving body and periodically outputs sensor data; a storage device (108) that stores map data; a positioning device (110) that, using the sensor data and the map data, performs a process of estimating the position of the moving body and sequentially outputs position estimation values; and an arithmetic circuit (120). When the positioning device is to start the process of estimating the position of the moving body at a second time after having output the position estimation value of the moving body at a first time, the arithmetic circuit corrects the position estimation value at the first time based on the distance and direction of the movement of the moving body between the first time and the second time, and provides the corrected estimation value to the positioning device as the initial position of the moving body at the second time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019508736A JPWO2018179960A1 (ja) | 2017-03-27 | 2018-02-15 | 移動体および自己位置推定装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-061670 | 2017-03-27 | ||
JP2017061670 | 2017-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018179960A1 true WO2018179960A1 (fr) | 2018-10-04 |
Family
ID=63674724
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/005266 WO2018179960A1 (fr) | 2017-03-27 | 2018-02-15 | Corps mobile et dispositif d'estimation de position locale |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2018179960A1 (fr) |
WO (1) | WO2018179960A1 (fr) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020137315A1 (fr) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | Dispositif de positionnement et corps mobile |
WO2020137312A1 (fr) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | Dispositif de positionnement et corps mobile |
WO2020137311A1 (fr) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | Dispositif de positionnement et objet mobile |
US20220066466A1 (en) * | 2020-09-03 | 2022-03-03 | Honda Motor Co., Ltd. | Self-position estimation method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009205465A (ja) * | 2008-02-28 | 2009-09-10 | Toyota Motor Corp | 自律移動体 |
JP2010117847A (ja) * | 2008-11-12 | 2010-05-27 | Toyota Motor Corp | 移動体、移動体制御システム及び移動体の制御方法 |
JP2016110576A (ja) * | 2014-12-10 | 2016-06-20 | 株式会社豊田中央研究所 | 自己位置推定装置及び自己位置推定装置を備えた移動体 |
JP2016224680A (ja) * | 2015-05-29 | 2016-12-28 | 株式会社豊田中央研究所 | 自己位置推定装置及び自己位置推定装置を備えた移動体 |
- 2018
- 2018-02-15 JP JP2019508736A patent/JPWO2018179960A1/ja active Pending
- 2018-02-15 WO PCT/JP2018/005266 patent/WO2018179960A1/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009205465A (ja) * | 2008-02-28 | 2009-09-10 | Toyota Motor Corp | 自律移動体 |
JP2010117847A (ja) * | 2008-11-12 | 2010-05-27 | Toyota Motor Corp | 移動体、移動体制御システム及び移動体の制御方法 |
JP2016110576A (ja) * | 2014-12-10 | 2016-06-20 | 株式会社豊田中央研究所 | 自己位置推定装置及び自己位置推定装置を備えた移動体 |
JP2016224680A (ja) * | 2015-05-29 | 2016-12-28 | 株式会社豊田中央研究所 | 自己位置推定装置及び自己位置推定装置を備えた移動体 |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020137315A1 (fr) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | Dispositif de positionnement et corps mobile |
WO2020137312A1 (fr) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | Dispositif de positionnement et corps mobile |
WO2020137311A1 (fr) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | Dispositif de positionnement et objet mobile |
JPWO2020137312A1 (fr) * | 2018-12-28 | 2020-07-02 | ||
JPWO2020137311A1 (fr) * | 2018-12-28 | 2020-07-02 | ||
JPWO2020137315A1 (ja) * | 2018-12-28 | 2021-11-11 | パナソニックIpマネジメント株式会社 | 測位装置及び移動体 |
JP7336752B2 (ja) | 2018-12-28 | 2023-09-01 | パナソニックIpマネジメント株式会社 | 測位装置及び移動体 |
JP7336753B2 (ja) | 2018-12-28 | 2023-09-01 | パナソニックIpマネジメント株式会社 | 測位装置及び移動体 |
JP7482453B2 (ja) | 2018-12-28 | 2024-05-14 | パナソニックIpマネジメント株式会社 | 測位装置及び移動体 |
US12203757B2 (en) | 2018-12-28 | 2025-01-21 | Panasonic intellectual property Management co., Ltd | Positioning apparatus capable of measuring position of moving body using image capturing apparatus |
US20220066466A1 (en) * | 2020-09-03 | 2022-03-03 | Honda Motor Co., Ltd. | Self-position estimation method |
US12164305B2 (en) * | 2020-09-03 | 2024-12-10 | Honda Motor Co., Ltd. | Self-position estimation method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2018179960A1 (ja) | 2020-02-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6769659B2 (ja) | 移動体の管理システム、方法、およびコンピュータプログラム | |
JP6825712B2 (ja) | 移動体、位置推定装置、およびコンピュータプログラム | |
EP4016230A1 (fr) | Procédé et système de localisation et d'étalonnage simultanés | |
US11537140B2 (en) | Mobile body, location estimation device, and computer program | |
JP6711138B2 (ja) | 自己位置推定装置、及び、自己位置推定方法 | |
WO2018179960A1 (fr) | Corps mobile et dispositif d'estimation de position locale | |
KR102341712B1 (ko) | 측위 정확도가 높은 이동로봇 및 이의 동작방법 | |
JP2009237851A (ja) | 移動体制御システム | |
JP2011141663A (ja) | 無人搬送車、および、その走行制御方法 | |
JP2020004342A (ja) | 移動体制御装置 | |
US20230333568A1 (en) | Transport vehicle system, transport vehicle, and control method | |
JP2000172337A (ja) | 自律移動ロボット | |
JP2019079171A (ja) | 移動体 | |
JP7396353B2 (ja) | 地図作成システム、信号処理回路、移動体および地図作成方法 | |
JP2021056764A (ja) | 移動体 | |
WO2018180175A1 (fr) | Corps mobile, dispositif de traitement de signal et programme informatique | |
US10990104B2 (en) | Systems and methods including motorized apparatus for calibrating sensors | |
JP7618378B2 (ja) | 自律移動体 | |
JP6751469B2 (ja) | 地図作成システム | |
EP3605263B1 (fr) | Système de gestion de corps mobile, corps mobile, dispositif de gestion de déplacement, et programme informatique | |
US20240142992A1 (en) | Map generation device and map generation system | |
US20240338024A1 (en) | Autonomous Driving Control Apparatus, System Including The Same, And Method Thereof | |
WO2021220331A1 (fr) | Système de corps mobile | |
WO2024101417A1 (fr) | Système de marquage automatique et procédé de marquage automatique | |
WO2019059299A1 (fr) | Dispositif de gestion opérationnelle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18774664 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019508736 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18774664 Country of ref document: EP Kind code of ref document: A1 |