
WO2019176667A1 - Detecting device and detecting method - Google Patents


Info

Publication number
WO2019176667A1
WO2019176667A1 (PCT/JP2019/008745)
Authority
WO
WIPO (PCT)
Prior art keywords
light
pixel
obstacle
waveform
distance
Prior art date
Application number
PCT/JP2019/008745
Other languages
French (fr)
Japanese (ja)
Inventor
Yukiko Yanagawa
Original Assignee
OMRON Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OMRON Corporation
Publication of WO2019176667A1 publication Critical patent/WO2019176667A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes

Definitions

  • The present disclosure relates to a detection device and a detection method for detecting a height difference existing on a road surface.
  • Patent Document 1 discloses a step detection device that detects a step existing on a road surface.
  • The step detection device includes a laser radar and a microcomputer that measures the height at a step detection position based on distance measurement data acquired in two positional relationships in which the distance from the laser radar to the step detection position differs.
  • When a reflector is present on the road surface, the step detection device determines that the reflector exists, excludes the height of the virtual image produced by the reflector, and then detects the step. This prevents a step from being erroneously detected.
  • Patent Document 2 discloses an obstacle detection device that detects an obstacle.
  • The obstacle detection device includes a transmission/reception unit that transmits a pulse wave and receives a reflected wave from an obstacle, and a microcomputer that detects the number of local maximum points in the amplitude of the received signal corresponding to the reflected wave.
  • The obstacle detection device determines whether the obstacle is higher than the transmission/reception unit based on the number of local maximum points. At this time, the obstacle detection device measures the distance to the obstacle based on the time difference between the start of transmission and the detection of the received signal, and determines whether the detected local maximum points are correct based on the measured distance.
  • In Patent Document 1, since the height is calculated from the distance measurement data, the height cannot be calculated accurately if the distance measurement data are not correct.
  • In Patent Document 2, since whether the detection of the local maximum points is correct is determined from the measured distance, the height of the obstacle cannot be determined accurately if the measured distance is not correct.
  • Conventional techniques such as Patent Document 1 and Patent Document 2 thus share the problem that an error occurs in the height determination when the distance cannot be measured accurately. For example, they cannot accurately detect the presence or absence of a height difference that is so small that its distance cannot be measured accurately.
  • An object of the present disclosure is to provide a detection device and a detection method that accurately detect the presence or absence of a height difference.
  • The detection device includes a light projecting unit that projects light, a light receiving unit that receives light incident from a predetermined angle region, and a control unit that generates a received light waveform indicating the amount of received light according to the elapsed time after light projection into the predetermined angle region and calculates a distance corresponding to the flight time of the light based on the received light waveform. The control unit detects the presence or absence of a height difference within the predetermined angle region based on distortion of the received light waveform.
  • The detection method includes a step of projecting light by a light projecting unit, a step of receiving light incident from a predetermined angle region by a light receiving unit, and a step of generating, by a control unit, a received light waveform indicating the amount of received light according to the elapsed time after light projection into the predetermined angle region, calculating a distance corresponding to the flight time of the light based on the received light waveform, and detecting the presence or absence of a height difference within the predetermined angle region based on distortion of the received light waveform.
  • With the detection device and the detection method according to the present disclosure, it is possible to accurately detect the presence or absence of a height difference. For example, it is possible to detect the presence or absence of a small obstacle that fits within the vertical angle of view of one pixel.
  • FIG. 1 is a diagram for explaining an application example of the detection device according to the present disclosure.
  • FIG. 2 is a block diagram illustrating the configuration of the detection device according to the first and second embodiments.
  • FIG. 3 is a diagram for explaining scanning by the sensor.
  • FIG. 4A shows an example of the distance image when an obstacle is on the road near the vehicle.
  • FIG. 4B shows an example of the distance image when an obstacle is on the road far from the vehicle.
  • FIG. 5A schematically shows an example of light projection from the sensor onto the road surface when no obstacle exists within the range of the vertical angle of view for one pixel.
  • FIG. 5B shows an example of the received light waveform for one pixel in FIG. 5A.
  • FIG. 6A schematically shows an example of light projection from the sensor onto an obstacle and the road surface when an obstacle exists within the range of the vertical angle of view for one pixel.
  • FIG. 6B shows an example of the received light waveform for one pixel in FIG. 6A.
  • FIG. 7 is a diagram for explaining adjacent pixels to be compared.
  • FIG. 8 is a diagram for explaining the comparison of the received light waveforms of pixels adjacent in the horizontal direction.
  • FIG. 9 is a flowchart showing an example of the distance image generation process by the detection device.
  • FIG. 10 is a flowchart showing an example of the waveform analysis process by the detection device.
  • FIG. 11 is a flowchart showing an example of the obstacle determination process by the detection device.
  • FIG. 12A schematically shows an example of pixels in which a difference equal to or greater than a threshold occurs with respect to an adjacent pixel.
  • FIG. 12B schematically shows an example of pixels in which the obstacle identified from the boundary pixels exists.
  • A diagram schematically showing the neural network according to the second embodiment.
  • A flowchart showing an example of the waveform analysis process according to the second embodiment.
  • FIG. 1 is a diagram for describing an application example of the detection apparatus 100 according to the present disclosure.
  • the detection device 100 is applicable to, for example, in-vehicle use.
  • the detection device 100 is mounted on a vehicle 200.
  • The detection device 100 according to the present embodiment is a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging) device.
  • the detection device 100 detects the distance and direction to the detection target in the traveling direction of the vehicle 200.
  • the detection target is, for example, an obstacle 300 that forms a road surface and a step on the road.
  • the obstacle 300 is, for example, a curb or a fallen object.
  • the detection device 100 projects light in the traveling direction of the vehicle 200 and receives reflected light reflected by the detection target.
  • the detection device 100 measures the distance to the detection target in the traveling direction of the vehicle based on the time difference from light projection to light reception, and generates a distance image (also referred to as a frame image) based on the measured distance.
  • the detection apparatus 100 outputs, for example, information (for example, a distance image) indicating the distance and direction to the obstacle 300 on the road to the vehicle driving apparatus 210.
  • the vehicle 200 is, for example, an automatic driving vehicle, and includes a vehicle driving device 210 for performing automatic driving.
  • the vehicle drive device 210 includes, for example, a steering mechanism that drives the vehicle 200 by setting the traveling direction while avoiding the obstacle 300 on the road. By detecting the obstacle 300 by the detection device 100, the vehicle drive device 210 can perform automatic driving while avoiding the obstacle 300.
  • the detection device 100 of the present disclosure aims to accurately detect an obstacle at a position far away from the vehicle 200. Specifically, the detection device 100 of the present disclosure detects the presence or absence of a small height difference that can be accommodated in one pixel of the distance image.
  • Embodiment 1 The configuration and operation of the detection apparatus 100 according to Embodiment 1 will be described below.
  • FIG. 2 is a block diagram illustrating the configuration of the detection apparatus 100.
  • FIG. 3 is a diagram for explaining scanning by the sensor 10.
  • the detection apparatus 100 includes a sensor 10, a control unit 20, and a storage unit 30.
  • the sensor 10 includes a light projecting unit 11 that projects light to the outside, a light receiving unit 12 that receives light from the outside, and a scanning unit 13.
  • the light projecting unit 11 emits a light beam to the outside under the control of the control unit 20.
  • the light projecting unit 11 includes, for example, a light source composed of one or more light source elements and a light source driving circuit that drives the light source in pulses.
  • the light source element is, for example, an LD (semiconductor laser) that emits laser light.
  • the light source element may be an LED or the like.
  • the light source elements are arranged in a line in the vertical direction Y shown in FIG. 3, and the light projecting unit 11 projects light toward the light projecting region R11.
  • The light receiving unit 12 includes a plurality of light receiving elements. When receiving light, each light receiving element generates a light reception signal corresponding to the amount of received light.
  • the plurality of light receiving elements are arranged in a line in the vertical direction Y.
  • Each light receiving element corresponds to, for example, one pixel of the distance image, and separately receives light incident from a range (an example of a predetermined angle region) corresponding to the vertical field angle of one pixel. That is, the light receiving unit 12 has a light receiving region corresponding to a plurality of pixels arranged in the vertical direction Y of the distance image, and generates a light receiving signal for each pixel.
  • Each light receiving element is composed of, for example, a SPAD (single-photon avalanche diode).
  • the light receiving element may be composed of a PD (photodiode) or an APD (avalanche photodiode).
  • the scanning unit 13 includes, for example, a mirror, a rotation mechanism that rotates the mirror around a rotation axis along the vertical direction Y, and a scanning drive circuit that drives the rotation mechanism.
  • The scanning drive circuit rotates the mirror under the control of the control unit 20. Accordingly, the scanning unit 13 changes the light projecting direction slightly at regular time intervals, gradually moving the path along which the light travels.
  • the scanning direction is the horizontal direction X as shown in FIG.
  • the scanning unit 13 shifts the light projection region R11 in the horizontal direction X.
  • the control unit 20 can be realized by a semiconductor element or the like.
  • The control unit 20 can be configured by, for example, a microcomputer, CPU, MPU, GPU, DSP, FPGA, or ASIC.
  • the function of the control unit 20 may be configured only by hardware, or may be realized by combining hardware and software.
  • the control unit 20 implements a predetermined function by reading out data and programs stored in the storage unit 30 and performing various arithmetic processes.
  • the control unit 20 includes a distance image generation unit 21 and an obstacle detection unit 22 as functional configurations.
  • the distance image generation unit 21 performs distance measurement while scanning the projection plane R1 corresponding to the angle of view of the distance image in the horizontal direction X, and generates a distance image.
  • The resolution of the distance image, that is, the angle of view per pixel, is, for example, 1.0 to 1.6 degrees in the horizontal direction X and 0.3 to 1.2 degrees in the vertical direction Y.
  • the distance image indicates the distance in the depth direction Z for each pixel arranged in the horizontal direction X and the vertical direction Y.
  • the distance image generation unit 21 outputs the generated distance image to the vehicle drive device 210, for example.
  • the distance image generation unit 21 controls the timing of light projection by the light projection unit 11.
  • the distance image generation unit 21 includes a light reception waveform generation unit 21a and a distance calculation unit 21b.
  • The light reception waveform generation unit 21a generates, for each pixel, received light waveform data indicating the amount of received light according to the elapsed time since light projection, based on the light projection timing and the light reception signal obtained from the light receiving unit 12.
  • The distance calculation unit 21b calculates a distance for each pixel based on the received light waveform. For example, based on the received light waveform, the distance calculation unit 21b measures the flight time of light from when it is projected by the light projecting unit 11 until the reflected light is received by the light receiving unit 12. The distance calculation unit 21b then calculates, from the measured flight time, the distance to the detection target (for example, a road surface or an obstacle) that reflected the light, and generates a distance image based on the distance measured for each pixel.
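The time-of-flight calculation above can be sketched as follows. This is an illustrative example, not the patent's implementation; the function name and the sample waveform values are assumptions:

```python
# Minimal sketch: the distance follows from the elapsed time at which the
# received light amount peaks, since the light travels to the target and back.

C = 299_792_458.0  # speed of light in m/s

def distance_from_waveform(times_s, amounts):
    """Return the distance corresponding to the peak of a received light waveform."""
    peak_index = max(range(len(amounts)), key=lambda i: amounts[i])
    t_flight = times_s[peak_index]  # elapsed time since light projection
    return C * t_flight / 2.0       # halve the round trip

# Example: a peak 200 ns after projection corresponds to roughly 30 m.
times = [50e-9, 100e-9, 150e-9, 200e-9, 250e-9]
amounts = [0.1, 0.3, 0.6, 1.0, 0.4]
print(round(distance_from_waveform(times, amounts), 2))  # 29.98
```

This corresponds to the "time at which the amount of received light is maximum" method mentioned later in the text; the patent notes other calculation methods also exist.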
  • the obstacle detection unit 22 detects the presence or absence of a height difference in each pixel, for example, the presence or absence of an obstacle, based on the distortion of the received light waveform.
  • the obstacle detection unit 22 includes a waveform analysis unit 22a and an obstacle determination unit 22b.
  • The waveform analysis unit 22a analyzes the received light waveform generated by the light reception waveform generation unit 21a for each pixel. Specifically, in the present embodiment, the waveform analysis unit 22a compares the difference between the received light waveforms of pixels adjacent in the horizontal direction X (an example of distortion of the received light waveform) with a threshold, and determines whether the adjacent pixels correspond to a boundary between an obstacle and the road surface.
  • the “difference in received light waveform” refers to a difference in received light amount according to time.
  • The obstacle determination unit 22b determines, based on the analysis result of the waveform analysis unit 22a, whether each pixel constituting the distance image includes an obstacle. For example, if the analysis result contains a pixel corresponding to a boundary, the obstacle determination unit 22b determines that an obstacle exists and specifies the pixels including the obstacle based on the boundary. When it determines that an obstacle exists, the obstacle determination unit 22b specifies the position or orientation of the obstacle based on the pixels including it, and outputs information indicating that position or orientation to the vehicle drive device 210.
  • the storage unit 30 is a storage medium that stores programs and data necessary for realizing the functions of the detection apparatus 100.
  • the storage unit 30 can be realized by, for example, a hard disk (HDD), SSD, RAM, DRAM, ferroelectric memory, flash memory, magnetic disk, or a combination thereof.
  • the storage unit 30 may temporarily store various information.
  • The storage unit 30 may be configured to function as a work area of the control unit 20, for example.
  • the detection apparatus 100 may further include a communication unit including a circuit that performs communication with an external device in accordance with a predetermined communication standard.
  • the predetermined communication standard includes, for example, LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), USB, and HDMI (registered trademark).
  • FIG. 4A shows an example of a distance image V1 when the obstacle 2 is on the road 1 near the sensor 10.
  • the size of the obstacle 2 shown in FIG. 4A in the vertical direction Y exceeds the length dy corresponding to the vertical field angle of one pixel.
  • Because of its height, the obstacle 2 is positioned closer to the sensor 10 than the road 1.
  • the flight time of light measured for the pixel P1 with the obstacle 2 is shorter than the flight time of light measured for the pixel P2 without the obstacle 2.
  • the distance to the obstacle 2 can be calculated by measuring the flight time of light at the pixel P1. Thereby, the obstacle 2 appears in the distance image V1.
  • FIG. 4B shows an example of the distance image V2 when the obstacle 3 is on the road 1 far from the sensor 10.
  • the size in the vertical direction Y of the obstacle 3 shown in FIG. 4B is smaller than the length dy corresponding to the vertical field angle of one pixel.
  • The obstacle 3 may not appear in the distance image V2 (the broken line in FIG. 4B indicates that the obstacle 3 is not shown in the distance image V2).
  • the flight time of light measured for the pixel P4 with the obstacle 3 may be the same as the flight time of light measured for the pixel P3 without the obstacle 3.
  • the distance to the road surface of the road 1 is measured based on the flight time of the light at the pixel P4, and the presence of the obstacle 3 is not detected.
  • With reference to FIG. 5A and FIG. 5B, the reason why the obstacle 3, whose size in the vertical direction Y is smaller than the length dy corresponding to the vertical angle of view of one pixel, does not appear in the distance image V2 will be described in detail.
  • FIG. 5A schematically shows an example of light projection from the sensor 10 onto the road surface when there is no obstacle within the range of the vertical angle of view θ for one pixel (an example of a predetermined angle region; here, the region corresponding to the pixel P3 in FIG. 4B).
  • FIG. 5B shows an example of a light reception waveform in the pixel P3 corresponding to FIG. 5A.
  • the horizontal axis in FIG. 5B is time, and the vertical axis is the amount of received light.
  • FIG. 5B shows the amount of light received according to the elapsed time since the light was projected onto the area corresponding to the pixel P3.
  • The amount of received light reaches its maximum at time t1, corresponding to the shortest distance dz1 from the sensor 10 to the road surface in the region corresponding to the pixel P3.
  • FIG. 6A schematically shows an example of light projection from the sensor 10 onto the obstacle and the road surface when the obstacle 3 is present within the range of the vertical angle of view θ for one pixel (an example of a predetermined angle region; here, the region corresponding to the pixel P4 in FIG. 4B).
  • FIG. 6B shows an example of a light reception waveform in the pixel P4 corresponding to FIG. 6A.
  • the horizontal axis in FIG. 6B is time, and the vertical axis is the amount of received light.
  • FIG. 6B shows the amount of light received according to the time elapsed since the light was projected onto the area corresponding to the pixel P4.
  • The pixel P4 shown in FIG. 6A is on the same horizontal line as the pixel P3 shown in FIG. 5A.
  • the obstacle 3 exists on the road surface.
  • the size of the obstacle 3 is smaller than the range of the vertical angle of view ⁇ for one pixel.
  • the amount of received light becomes maximum at the same time t1 as when there is no obstacle 3 (FIG. 5B). That is, the amount of received light reflected from the road surface at the shortest distance dz1 from the sensor 10 in the region corresponding to the pixel P4 is larger than the amount of received light reflected from the obstacle 3.
  • There are various methods for calculating the distance. For example, if the distance is calculated based on the time t1 at which the amount of received light is maximum, the same distance value is obtained as when there is no obstacle 3.
  • the distance value corresponding to the pixel P4 is a distance value from the sensor 10 to the road surface. Therefore, the obstacle 3 within the range of the pixel P4 is not detected, and the obstacle 3 does not appear in the distance image.
  • The region corresponding to the vertical angle of view θ for one pixel is narrower the closer it is to the sensor 10 and wider the farther it is from the sensor 10. An obstacle far from the sensor 10 is therefore more likely to fit entirely within the range of the vertical angle of view θ for one pixel, which makes distant obstacles difficult to detect.
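The growth of the per-pixel footprint with distance can be worked through numerically. The small function below is an illustration (its name and the sample angle are assumptions; 0.6 degrees lies in the 0.3 to 1.2 degree vertical angle of view range given above):

```python
import math

# Length spanned by a vertical angle of view theta at a given distance:
# footprint = 2 * d * tan(theta / 2). A small obstacle fits inside one pixel
# once the footprint exceeds the obstacle's height, which happens far away.

def pixel_footprint(distance_m, theta_deg):
    """Approximate length subtended by a vertical angle of view at a distance."""
    return 2.0 * distance_m * math.tan(math.radians(theta_deg) / 2.0)

print(round(pixel_footprint(10.0, 0.6), 3))  # ~0.105 m at 10 m
print(round(pixel_footprint(50.0, 0.6), 3))  # ~0.524 m at 50 m
```

So a 20 cm curb would span more than one pixel at 10 m but fit well inside a single pixel's footprint at 50 m, matching the observation that distant obstacles are hard to detect from the distance image alone.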
  • The detection device 100 of the present disclosure aims to detect such a small obstacle that fits within the range of the vertical angle of view θ for one pixel. As shown in FIGS. 5B and 6B, the shape of the received light waveform differs depending on whether or not the obstacle 3 is present. Therefore, the detection device 100 according to the present embodiment compares the received light waveforms of two pixels adjacent in the horizontal direction X.
  • FIG. 7 is a diagram for explaining adjacent pixels to be compared.
  • the detection device 100 compares the light reception waveforms of adjacent pixels in the horizontal direction X, such as the pixel P11 and the pixel P12, the pixel P12 and the pixel P13, the pixel P13 and the pixel P14, and so on.
  • FIG. 8A shows an example in which the light reception waveform W11 corresponding to the pixel P11 is compared with the light reception waveform W12 corresponding to the pixel P12 adjacent to the pixel P11.
  • FIG. 8B illustrates a waveform indicating the absolute value of the difference between the received light waveforms W11 and W12.
  • The shape of the received light waveform differs depending on whether or not the obstacle 3 is present. Therefore, in the present embodiment, whether a pixel includes the obstacle 3 is determined based on this difference in waveform shape.
  • As shown in FIG. 8B, when the absolute value of the difference between the received light waveforms is equal to or greater than a predetermined threshold, the adjacent pixels are determined to indicate a boundary portion. For example, the difference in the amount of received light is calculated at the same elapsed time after light projection, and it is determined whether a difference equal to or greater than the threshold exists.
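The comparison can be sketched as follows; the function name, threshold value, and sample waveforms are illustrative assumptions, not values from the patent:

```python
# Sketch of the boundary test: subtract the received light amounts of two
# horizontally adjacent pixels at the same elapsed times, and flag a boundary
# if the absolute difference reaches the threshold anywhere in the waveform.

def is_boundary(waveform_a, waveform_b, threshold):
    """Compare two received light waveforms sampled at the same elapsed times."""
    return any(abs(a - b) >= threshold for a, b in zip(waveform_a, waveform_b))

w11 = [0.1, 0.3, 1.0, 0.4, 0.1]  # road surface only (e.g. pixel P11)
w12 = [0.1, 0.7, 1.0, 0.4, 0.1]  # an obstacle adds an earlier reflection (e.g. pixel P12)
print(is_boundary(w11, w12, threshold=0.3))  # True
print(is_boundary(w11, w11, threshold=0.3))  # False
```

Note that both pixels may report the same peak time (and hence the same distance), yet the waveforms still differ enough to reveal the boundary, which is the core idea of the embodiment.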
  • The sign of the difference value changes according to the reflectance of light. Since reflectance varies with material, for example, if the obstacle 3 is white its reflectance is higher than that of the road 1, and if the obstacle 3 is black its reflectance is lower than that of the road 1. When the obstacle 3 includes a material with a higher reflectance than the road 1, the amount of received light reflected from the obstacle 3 is larger than the amount reflected from the road 1 at the same position in the horizontal direction X.
  • In this case, the difference value obtained by subtracting the received light amount of the pixel P11 (road 1 only) from the received light amount of the pixel P12 (including the obstacle 3) is positive, and the difference value obtained by subtracting the received light amount of the pixel P15 (including the obstacle 3) from the received light amount of the pixel P16 (road 1 only) is negative.
  • FIGS. 5B, 6B, and 8A illustrate a case where the obstacle 3 includes a material having a higher reflectance than the road 1.
  • When the obstacle 3 includes a material with a lower reflectance than the road 1, the amount of received light reflected from the obstacle 3 is smaller than the amount reflected from the road 1 at the same position in the horizontal direction X. In this case, the difference value obtained by subtracting the received light amount of the pixel P11 (road 1 only) from the received light amount of the pixel P12 (including the obstacle 3) is negative, and the difference value obtained by subtracting the received light amount of the pixel P15 (including the obstacle 3) from the received light amount of the pixel P16 (road 1 only) is positive.
  • the boundary between the obstacle 3 and the road 1 can be specified by comparing the absolute value of the difference between the received light waveforms with a predetermined threshold value.
  • FIG. 9 is a flowchart illustrating the operation of the distance image generation unit 21.
  • the distance image generating unit 21 causes the light projecting unit 11 to project light at a predetermined light projecting timing (S101).
  • the light reception waveform generation unit 21a acquires a light reception signal corresponding to the amount of received light from the light reception unit 12 (S102). Based on the light projection timing and the light reception signal, the light reception waveform generation unit 21a generates light reception waveform data indicating the amount of light reception based on the elapsed time since the light was projected (S103).
  • The distance calculation unit 21b calculates a distance corresponding to the flight time of light based on the received light waveform (S104). For example, the distance calculation unit 21b calculates the distance based on the time at which the amount of received light is largest. The generation of the received light waveform in step S103 and the calculation of the distance in step S104 are performed for each pixel. The distance image generation unit 21 determines whether the calculation of the distances of the pixels for one column has been completed (S105); if not, the generation of the received light waveform and the calculation of the distance are repeated. When the calculation for one column is completed, the distance image generation unit 21 controls the scanning unit 13 to shift the light projection region R11 in the horizontal direction X, which is the scanning direction (S106), and returns to step S101. When the distances for one frame have been calculated, the distance image generation unit 21 generates a distance image (frame image) for one frame based on the distance calculated for each pixel.
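The frame-generation loop of FIG. 9 (steps S101 through S106) can be sketched as below. The sensor interface (`project`, `waveform`, `shift_scan`) and the fake sensor are assumptions for illustration; the patent only fixes the order of operations:

```python
# Sketch of FIG. 9: project light, build a received light waveform per pixel,
# compute each pixel's distance from the waveform, then shift the projection
# region R11 in the scanning direction until a frame is complete.

C = 299_792_458.0  # speed of light, m/s

def generate_distance_image(sensor, num_columns, num_rows):
    frame = []
    for _ in range(num_columns):
        sensor.project()  # S101: pulse the light source
        column = []
        for row in range(num_rows):
            times, amounts = sensor.waveform(row)  # S102-S103: per-pixel waveform
            peak = max(range(len(amounts)), key=lambda i: amounts[i])
            column.append(C * times[peak] / 2.0)   # S104: peak time -> distance
        frame.append(column)
        sensor.shift_scan()  # S106: shift region R11 in the horizontal direction X
    return frame

class FakeSensor:
    """Stand-in sensor whose every pixel peaks 200 ns after projection."""
    def project(self):
        pass
    def waveform(self, row):
        return [100e-9, 200e-9, 300e-9], [0.2, 1.0, 0.3]
    def shift_scan(self):
        pass

frame = generate_distance_image(FakeSensor(), num_columns=2, num_rows=3)
print(len(frame), len(frame[0]))  # 2 3
```

Each column corresponds to one position of the scanned projection region, and each row to one light receiving element arranged along the vertical direction Y.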
  • FIG. 10 is a flowchart illustrating the operation of the waveform analysis unit 22a.
  • FIG. 11 is a flowchart illustrating the operation of the obstacle determination unit 22b.
  • FIG. 12A schematically illustrates an example of a pixel in which a difference greater than or equal to a threshold value occurs with respect to an adjacent pixel by hatching.
  • FIG. 12B schematically shows the area of the obstacle 3 identified based on the pixels P12 and P16 in which the difference is generated by hatching.
  • FIG. 10 shows the operation for one frame.
  • the waveform analysis unit 22a acquires received light waveform data from the received light waveform generation unit 21a (S201).
  • the waveform analysis unit 22a calculates the difference between the received light waveforms of pixels adjacent in the horizontal direction X (S202).
  • The waveform analysis unit 22a determines whether the absolute value of the difference is equal to or greater than a threshold (S203). If it is, the waveform analysis unit 22a determines that the adjacent pixels indicate a boundary between the road surface and an obstacle (S204). If the absolute value of the difference is less than the threshold, the waveform analysis unit 22a determines that the adjacent pixels are not a boundary portion (S205).
  • the waveform analysis unit 22a determines whether or not the analysis of all the pixels in the frame is completed (S206). If the analysis of all the pixels in the frame has not been completed, the waveform analysis unit 22a returns to step S201.
  • the operation shown in FIG. 11 is performed after the operation of the waveform analysis unit 22a shown in FIG.
  • the obstacle determination unit 22b determines whether or not there is a pixel determined as a boundary in the frame in the analysis result by the waveform analysis unit 22a (S301). If there is a pixel determined as a boundary, an obstacle region is specified based on the boundary (S302). If there is no pixel determined as the boundary, it is determined that there is no obstacle in the frame (S303).
  • the obstacle determination unit 22b outputs a determination result. For example, when there is an obstacle, based on the coordinates of the pixel identified as an obstacle, information indicating the position or orientation is output to the vehicle drive device 210.
  • the hatching of the pixel P12 illustrated in FIG. 12A indicates that the light reception waveform of the pixel P12 is different from the light reception waveform of the pixel P11.
  • since the difference between the received light waveforms of the adjacent pixels P11 and P12 is greater than or equal to the threshold value, the pixel P11 and the pixel P12 are determined to correspond to the boundary between the road surface and the obstacle, that is, to a pixel without an obstacle and a pixel with an obstacle, respectively.
  • the hatching of the pixel P16 illustrated in FIG. 12A indicates that a difference occurs in the light reception waveform of the pixel P16 with respect to the light reception waveform of the pixel P15.
  • since the difference between the received light waveforms of the adjacent pixels P15 and P16 is greater than or equal to the threshold value, in step S204 the pixel P15 and the pixel P16 are determined to correspond to the boundary between the obstacle and the road surface, that is, to a pixel with an obstacle and a pixel without an obstacle, respectively. In step S302, based on the pixels P12 and P16, where a difference from the adjacent pixel occurred, the region from the pixel P12 to the pixel P15 is specified as the region where the obstacle 3 exists, as shown in FIG. 12B.
  • the detection apparatus 100 includes a light projecting unit 11, a light receiving unit 12, and a scanning unit 13.
  • the light projecting unit 11 projects light.
  • the light receiving unit 12 receives light incident from a predetermined angle region.
  • the control unit 20 calculates a distance corresponding to the flight time of light based on a received light waveform indicating an amount of received light according to an elapsed time after light projection in a predetermined angle region. Further, the control unit 20 detects the presence / absence of a height difference within a predetermined angle region based on the distortion of the received light waveform.
  • the light receiving unit 12 separately receives light incident from a plurality of predetermined angle regions corresponding to a plurality of pixels.
  • the predetermined angle region is a range corresponding to the vertical angle of view of one pixel.
  • the control unit 20 calculates the distance for each pixel based on the light reception waveform for each pixel.
  • the control unit 20 detects whether there is a height difference for each pixel based on the distortion of the received light waveform.
  • the distortion of the light reception waveform is a difference between the light reception waveforms of pixels adjacent in the horizontal direction.
  • the detection apparatus 100 detects a pixel having a height difference, for example, a boundary portion between a road surface and an obstacle, based on the difference between the received light waveforms of the pixels. Therefore, even an obstacle small enough to fit within the vertical angle of view of one pixel can be detected.
  • the detection device 100 is mounted on the vehicle 200 and detects the presence or absence of an obstacle that forms a step on the road as the presence or absence of a height difference. Thereby, for example, even an obstacle far from the vehicle 200 can be detected with high accuracy.
  • the difference between the light reception waveforms of pixels located at the same position in two different frame images may be calculated.
  • the difference between the received light waveforms of the pixels at the same position (coordinates) in the previous frame image and the current frame image may be calculated.
  • when no obstacle enters the pixel, the received light waveform has the same shape in both frames. Therefore, by comparing the received light waveforms of pixels in different frame images, the presence or absence of an obstacle can be detected from the difference between the received light waveforms.
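The frame-to-frame variant can reuse the same difference measure, comparing the waveform of a pixel at the same coordinates in the previous and current frame. A minimal sketch (the frame layout as a dict keyed by pixel coordinates is an assumption for illustration):

```python
def changed_pixels(prev_frame, curr_frame, threshold):
    """Return coordinates whose received light waveform changed between two
    frames, suggesting an obstacle entered or left that pixel's angle region.
    Frames are dicts mapping (x, y) pixel coordinates to waveform lists."""
    changed = []
    for xy, wf_prev in prev_frame.items():
        wf_curr = curr_frame.get(xy)
        if wf_curr is None:
            continue
        diff = sum(abs(a - b) for a, b in zip(wf_prev, wf_curr))
        if diff >= threshold:
            changed.append(xy)
    return changed
```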
  • Embodiment 2: in the first embodiment, whether a pixel pair is a boundary portion is determined by comparing the difference between the received light waveforms with a threshold value. In the second embodiment, machine learning is used to determine whether or not the difference between the received light waveforms indicates a boundary portion. In the present embodiment, for example, deep learning is used.
  • FIG. 13 schematically shows an example of a neural network.
  • the boundary detection network that detects whether or not the difference between the received light waveforms indicates the boundary is constructed by a neural network having a multilayer structure used for deep learning.
  • the boundary detection network includes, in order from the input, a first fully connected layer L1 that is an input layer, a second fully connected layer L2 that is an intermediate layer, and an output layer L3.
  • the second fully connected layer L2 serving as the intermediate layer is a single layer here, but two or more intermediate layers may be provided.
  • each layer L1 to L3 includes a plurality of nodes. For example, the received light amounts corresponding to the elapsed times since light projection for one pixel are input to the nodes IN1, IN2, IN3, ... of the first fully connected layer L1, which is the input layer.
  • the number of nodes of the second fully connected layer L2 can be set appropriately according to the embodiment. The nodes OUT1 and OUT2 of the output layer L3 output a result indicating whether or not the input corresponds to a boundary portion between an obstacle and the road surface.
  • the boundary determination network is trained on waveform-difference data that indicates a boundary and waveform-difference data that does not indicate a boundary.
  • the learned boundary determination network is stored in the storage unit 30, for example.
  • the waveform analysis unit 22a uses this boundary determination network to determine whether or not the difference between the received light waveforms indicates the boundary.
  • FIG. 14 is a flowchart showing the operation of the waveform analysis unit 22a in the second embodiment.
  • steps S401, S402, and S404 are the same as steps S201, S202, and S206 of the first embodiment.
  • the waveform analysis unit 22a inputs the difference between the received light waveforms of adjacent pixels to the trained boundary determination network (neural network) and acquires a determination result as to whether or not the difference indicates a boundary portion (S403).
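One way to realize the forward pass of such a boundary determination network is sketched below in plain Python, without any learning framework. The layer sizes, activation choices, and weights are placeholder assumptions; a real network would be trained on labeled waveform-difference data as described above (S403 uses the trained network).

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer with ReLU activation."""
    return [max(0.0, sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def softmax(v):
    e = [math.exp(x) for x in v]
    s = sum(e)
    return [x / s for x in e]

def boundary_network(diff_waveform, w1, b1, w2, b2):
    """Forward pass of a FIG. 13-style network: input layer L1 -> one hidden
    fully connected layer L2 -> two output nodes OUT1/OUT2 (boundary / not
    boundary). Returns True when the 'boundary' output dominates."""
    hidden = dense(diff_waveform, w1, b1)       # second fully connected layer L2
    logits = [sum(w * h for w, h in zip(row, hidden)) + b
              for row, b in zip(w2, b2)]        # output layer L3 (no activation)
    p_boundary, p_not = softmax(logits)
    return p_boundary >= p_not
```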
  • instead of training the neural network on the difference between the received light waveforms, an obstacle determination network configured as a neural network such as that shown in FIG. 13 may be constructed, for example, by using the received light waveform of a pixel containing an obstacle and the received light waveform of a road-surface pixel.
  • the obstacle detection unit 22 may output information indicating the position or orientation of the obstacle to the vehicle drive device 210 based on the coordinates of the pixel determined to be an obstacle.
  • the control unit 20 may detect the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of a height difference such as an obstacle and a road surface.
  • in the first and second embodiments, the obstacle detection unit 22 determines whether an obstacle is included in a pixel based on the distortion of the received light waveform.
  • in the third embodiment, the distance value is additionally used to determine whether or not the object is an obstacle.
  • FIG. 15 is a block diagram illustrating the configuration of the detection apparatus 100 according to the third embodiment.
  • the obstacle determination unit 22b determines whether or not a pixel includes an obstacle based on both the boundary determination result from the waveform analysis unit 22a and the distance value calculated by the distance calculation unit 21b. For example, the distance values of a pixel determined to be a boundary portion and of its surrounding pixels may be referenced to verify whether the boundary determination is correct. As a result, the presence or absence of a small obstacle that fits within the vertical angle of view of one pixel can be detected more accurately.
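The distance-based cross-check can be sketched as follows. The consistency criterion used here (a boundary pixel's distance deviating from its horizontal neighbours by more than a tolerance) and the parameter names are assumptions for illustration only:

```python
def verify_boundary(distances, idx, tolerance):
    """Verify a boundary candidate at pixel idx using the distance image:
    a genuine road/obstacle boundary should show a distance value that
    deviates from its horizontal neighbours by more than `tolerance`.
    `distances` is one row of per-pixel measured distance values."""
    neighbours = [distances[i] for i in (idx - 1, idx + 1)
                  if 0 <= i < len(distances)]
    if not neighbours:
        return False
    mean_neighbour = sum(neighbours) / len(neighbours)
    return abs(distances[idx] - mean_neighbour) > tolerance
```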
  • the scanning unit 13 may include a mechanism that changes the light projection direction in the vertical direction Y, and may scan in the vertical direction Y.
  • the scanning unit 13 may be omitted.
  • if the light source elements of the light projecting unit 11 and the light receiving elements of the light receiving unit 12 are arranged in a two-dimensional array, distance measurement similar to that of the first embodiment can be performed without using the scanning unit 13.
  • in the embodiments above, the detection apparatus 100 scans the projection plane R1 corresponding to a plurality of pixels, calculates the distance of each pixel, and generates a distance image. However, light projection and light reception may be performed only in a region corresponding to a single pixel, without performing such scanning or generating a distance image.
  • the detection apparatus 100 may include, for example, one light source element and one light receiving element, and may determine the presence or absence of an obstacle in the pixel from the received light waveform of the region corresponding to that single pixel by using the obstacle determination network described above.
  • the detection device 100 detects an obstacle such as a curb that forms a step on the road, but the detection target is not limited to an obstacle on the road.
  • with the detection device 100 of the present disclosure, it is possible to detect an object having a height difference. For example, a level difference caused by a depression in the road can be detected.
  • the detection apparatus 100 according to the present disclosure is applicable not only to in-vehicle use but also to various other applications.
  • the detection apparatus 100 may be installed in a factory and used to measure the distance to a part. In this case, for example, it is possible to determine whether or not there is a scratch on the component by detecting the presence or absence of the height difference in the pixel by the detection device 100.
  • the distance calculation unit 21b is not limited to the distance image, and may generate information indicating the distance in various formats.
  • the distance calculation unit 21b may generate three-dimensional point group data.
  • the obstacle detection unit 22 may include the light reception waveform generation unit 21a.
  • the distance image generation unit 21 and the obstacle detection unit 22 may each include a light reception waveform generation unit 21a.
  • the detection apparatus 100 includes the distance calculation unit 21b, but the distance calculation unit 21b may not be provided.
  • the detection apparatus 100 may include the sensor 10, the light reception waveform generation unit 21a, and the obstacle detection unit 22, and may detect the presence or absence of a height difference in the pixel from the distortion of the light reception waveform.
  • the detection device includes a light projecting unit (11) that projects light, a light receiving unit (12) that receives light incident from a predetermined angle region, and a control unit (20) that calculates a distance corresponding to the flight time of light based on a received light waveform indicating the amount of received light according to the elapsed time since light projection in the predetermined angle region. The control unit (20) detects the presence or absence of a height difference in the predetermined angle region based on the distortion of the received light waveform.
  • the light receiving unit (12) separately receives light incident from a plurality of predetermined angle regions corresponding to a plurality of pixels. The control unit (20) calculates the distance for each pixel based on the received light waveform for each pixel, and detects the presence or absence of a height difference for each pixel based on the distortion of the received light waveform.
  • the distortion of the light reception waveform is a difference between the light reception waveforms of pixels adjacent in the horizontal direction.
  • control unit detects the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of a height difference.
  • control unit detects the presence or absence of a height difference for each pixel based on the distortion of the light reception waveform and the distance.
  • in the detection device according to the second aspect, the control unit generates a frame image including the plurality of pixels according to the distance for each pixel, and the distortion of the received light waveform is the difference between the received light waveforms of the pixels at the same position in two different frame images.
  • the detection device is mounted on a vehicle, and the presence or absence of the height difference is the presence or absence of a step or an obstacle on the road.
  • the detection method includes a step of projecting light by the light projecting unit (11) (S101), a step of receiving light incident from a predetermined angle region by the light receiving unit (12) (S102), a step of calculating, by the control unit (20), a distance corresponding to the flight time of light based on a received light waveform indicating the amount of received light according to the elapsed time since light projection in the predetermined angle region, and a step of detecting, by the control unit (20), the presence or absence of a height difference in the predetermined angle region based on the distortion of the received light waveform.
  • the detection device of the present disclosure can be applied to, for example, autonomous driving vehicles, self-propelled robots, and AGVs (Automated Guided Vehicles).


Abstract

A detecting device (100) is provided with a light emitting unit (11) which emits light, a light receiving unit (12) which receives light incident from a prescribed angular region, and a control unit (20) which calculates a distance corresponding to a time of flight of light, on the basis of a received light waveform indicating an amount of received light corresponding to an elapsed time from the emission of light in the prescribed angular region, wherein the control unit (20) detects the presence or absence of a height difference in the prescribed angular region on the basis of distortion of the received light waveform.

Description

Detection apparatus and detection method
 The present disclosure relates to a detection device and a detection method for detecting a height difference existing on a road surface.
 Patent Document 1 discloses a step detection device that detects a step existing on a road surface. The step detection device includes a laser radar and a microcomputer that measures the height of a step detection position based on distance measurement data acquired in two positional relationships in which the distance from the laser radar to the step detection position differs. When the difference between the measured heights is larger than a threshold value, the step detection device determines that a reflector is present, excludes the height of the virtual image caused by the reflector, and detects the step. This prevents a step from being erroneously detected when a reflector is present on the road surface.
 Patent Document 2 discloses an obstacle detection device that detects an obstacle. The obstacle detection device includes a transmission/reception unit that transmits a pulse wave and receives a reflected wave from an obstacle, and a microcomputer that detects the number of local maximum points in the amplitude of the received signal corresponding to the reflected wave. Based on the number of local maximum points, the obstacle detection device determines whether the obstacle is higher than the transmission/reception unit. At this time, the obstacle detection device measures the distance to the obstacle based on the time difference between the start of transmission and the detection of the received signal, and determines, based on the measured distance, whether the detected local maximum points are correct.
Japanese Patent Laid-Open No. 2018-21788
Japanese Patent Laid-Open No. 2015-166705
 In Patent Document 1, the height is calculated from distance measurement data, so the height cannot be calculated accurately if the distance measurement data is incorrect. In Patent Document 2, whether the detection of the local maximum points is correct is judged from the measured distance, so the height of the obstacle cannot be determined accurately if the measured distance is incorrect. Thus, conventional techniques such as Patent Documents 1 and 2 have the problem that an error occurs in the height determination when the distance cannot be measured accurately. Consequently, the conventional techniques cannot accurately detect, for example, the presence or absence of a height difference so small that its distance cannot be measured accurately.
 An object of the present disclosure is to provide a detection device and a detection method that accurately detect the presence or absence of a height difference.
 A detection device according to the present disclosure includes a light projecting unit that projects light, a light receiving unit that receives light incident from a predetermined angle region, and a control unit that calculates a distance corresponding to the flight time of light based on a received light waveform indicating the amount of received light according to the elapsed time since light projection in the predetermined angle region. The control unit detects the presence or absence of a height difference within the predetermined angle region based on the distortion of the received light waveform.
 A detection method according to the present disclosure includes a step of projecting light by a light projecting unit, a step of receiving light incident from a predetermined angle region by a light receiving unit, a step of calculating, by a control unit, a distance corresponding to the flight time of light based on a received light waveform indicating the amount of received light according to the elapsed time since light projection in the predetermined angle region, and a step of detecting, by the control unit, the presence or absence of a height difference within the predetermined angle region based on the distortion of the received light waveform.
 According to the detection device and the detection method of the present disclosure, the presence or absence of a height difference can be detected accurately. For example, it is possible to detect the presence or absence of a small obstacle that fits within the vertical angle of view of one pixel.
Brief description of the drawings:
  • A diagram for explaining an application example of the detection device according to the present disclosure
  • A block diagram illustrating the configuration of the detection apparatus according to Embodiments 1 and 2
  • A diagram for explaining scanning by the sensor
  • A diagram showing an example of a distance image when there is an obstacle on the road near the vehicle
  • A diagram showing an example of a distance image when there is an obstacle on the road far from the vehicle
  • A diagram schematically showing an example of light projection from the sensor onto the road surface when there is no obstacle within the vertical angle of view of one pixel
  • A diagram showing an example of the received light waveform for one pixel in FIG. 5A
  • A diagram schematically showing an example of light projection from the sensor onto an obstacle and the road surface when there is an obstacle within the vertical angle of view of one pixel
  • A diagram showing an example of the received light waveform for one pixel in FIG. 6A
  • A diagram for explaining the adjacent pixels to be compared
  • A diagram for explaining the comparison of the received light waveforms of horizontally adjacent pixels
  • A diagram for explaining the comparison of the waveform difference with the threshold value
  • A flowchart showing an example of distance image generation processing by the detection device
  • A flowchart showing an example of waveform analysis processing by the detection device
  • A flowchart showing an example of obstacle determination processing by the detection device
  • A diagram schematically showing an example of a pixel whose difference from the adjacent pixel is greater than or equal to the threshold value
  • A diagram schematically showing an example of the pixels, identified from the boundary pixels, in which an obstacle exists
  • A diagram schematically showing the neural network according to Embodiment 2
  • A flowchart showing an example of waveform analysis processing according to Embodiment 2
  • A block diagram illustrating the configuration of the detection apparatus according to Embodiment 3
 Hereinafter, embodiments of a detection apparatus and a detection method according to the present disclosure will be described with reference to the accompanying drawings. In the following embodiments, the same components are denoted by the same reference numerals.
(Application example)
 An example to which the detection apparatus according to the present disclosure can be applied will be described with reference to FIG. 1. FIG. 1 is a diagram for explaining an application example of the detection apparatus 100 according to the present disclosure.
 The detection device 100 according to the present disclosure is applicable, for example, to in-vehicle use. In the example illustrated in FIG. 1, the detection device 100 is mounted on a vehicle 200. The detection device 100 of the present embodiment is a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging) device. The detection device 100 detects, for example, the distance and direction to a detection target in the traveling direction of the vehicle 200. The detection targets are, for example, the road surface and an obstacle 300 that forms a step on the road, such as a curb or a fallen object. Specifically, the detection device 100 projects light in the traveling direction of the vehicle 200 and receives the light reflected by the detection target. Based on the time difference between light projection and light reception, the detection device 100 measures the distance to the detection target in the traveling direction of the vehicle and generates a distance image (also referred to as a frame image) based on the measured distances. The detection device 100 outputs, for example, information (such as a distance image) indicating the distance and direction to the obstacle 300 on the road to the vehicle drive device 210.
 The vehicle 200 is, for example, an autonomous driving vehicle, and includes a vehicle drive device 210 for performing automatic driving. The vehicle drive device 210 includes, for example, a steering mechanism that drives the vehicle 200 by setting a traveling direction that avoids the obstacle 300 on the road. By having the detection device 100 detect the obstacle 300, the vehicle drive device 210 can perform automatic driving while avoiding the obstacle 300.
 The detection device 100 of the present disclosure aims to accurately detect an obstacle located far away from the vehicle 200. Specifically, the detection device 100 of the present disclosure detects the presence or absence of a height difference small enough to fit within one pixel of the distance image.
(Configuration example)
 Hereinafter, embodiments as configuration examples of the detection apparatus 100 will be described.
(Embodiment 1)
 The configuration and operation of the detection apparatus 100 according to Embodiment 1 are described below.
1. Configuration
 The configuration of the detection apparatus 100 according to the present embodiment will be described with reference to FIGS. 2 and 3. FIG. 2 is a block diagram illustrating the configuration of the detection apparatus 100. FIG. 3 is a diagram for explaining scanning by the sensor 10.
 The detection apparatus 100 includes a sensor 10, a control unit 20, and a storage unit 30.
 The sensor 10 includes a light projecting unit 11 that projects light to the outside, a light receiving unit 12 that receives light from the outside, and a scanning unit 13.
 Under the control of the control unit 20, the light projecting unit 11 emits a light beam to the outside. The light projecting unit 11 includes, for example, a light source composed of one or more light source elements and a light source driving circuit that pulse-drives the light source. The light source element is, for example, an LD (semiconductor laser) that emits laser light, but may also be an LED or the like. The light source elements are arranged, for example, in a single-row array along the vertical direction Y shown in FIG. 3, and the light projecting unit 11 projects light toward the light projection region R11.
 The light receiving unit 12 includes a plurality of light receiving elements. On receiving light, each light receiving element generates a light reception signal corresponding to the amount of light received. The light receiving elements are arranged, for example, in a single-row array along the vertical direction Y. Each light receiving element corresponds to, for example, one pixel of the distance image, and separately receives light incident from the range corresponding to the vertical angle of view of one pixel (an example of a predetermined angle region). That is, the light receiving unit 12 has light receiving regions corresponding to a plurality of pixels arranged in the vertical direction Y of the distance image, and generates a light reception signal for each pixel. Each light receiving element is composed of, for example, a SPAD (single-photon avalanche photodiode), but may also be composed of a PD (photodiode) or an APD (avalanche photodiode).
 The scanning unit 13 includes, for example, a mirror, a rotation mechanism that rotates the mirror around a rotation axis along the vertical direction Y, and a scanning drive circuit that drives the rotation mechanism. The scanning drive circuit rotates the mirror under the control of the control unit 20. In this way, the scanning unit 13 changes the light projection direction little by little at regular time intervals, gradually shifting the optical path along which the light travels. In the present embodiment, the scanning direction is the horizontal direction X, as shown in FIG. 3, and the scanning unit 13 shifts the light projection region R11 in the horizontal direction X.
 The control unit 20 can be realized by a semiconductor device or the like, and can be configured by, for example, a microcomputer, CPU, MPU, GPU, DSP, FPGA, or ASIC. The functions of the control unit 20 may be implemented in hardware alone or by a combination of hardware and software. The control unit 20 implements predetermined functions by reading out data and programs stored in the storage unit 30 and performing various arithmetic operations.
 The control unit 20 includes a distance image generation unit 21 and an obstacle detection unit 22 as functional components.
 The distance image generation unit 21 performs distance measurement while scanning the projection plane R1 corresponding to the angle of view of the distance image in the horizontal direction X, and generates a distance image. The resolution of the distance image, that is, the angle of view per pixel, is, for example, 1.0 to 1.6 degrees in the horizontal direction X and 0.3 to 1.2 degrees in the vertical direction Y. The distance image indicates the distance in the depth direction Z for each pixel arranged in the horizontal direction X and the vertical direction Y. By repeatedly scanning the projection plane R1, distance images can be generated sequentially at a desired frame rate. The distance image generation unit 21 outputs the generated distance images to, for example, the vehicle drive device 210.
 The distance image generation unit 21 controls the timing of light projection by the light projecting unit 11, and includes a received-light waveform generation unit 21a and a distance calculation unit 21b. Based on the projection timing and the received-light signal obtained from the light receiving unit 12, the received-light waveform generation unit 21a generates, for each pixel, received-light waveform data indicating the amount of light received as a function of the time elapsed since the projection.
 The distance calculation unit 21b calculates a distance for each pixel based on the received-light waveform. For example, the distance calculation unit 21b measures, from the received-light waveform, the flight time of the light from when it is projected by the light projecting unit 11 until it is reflected and received by the light receiving unit 12. From the measured flight time, the distance calculation unit 21b calculates the distance to the detection target (for example, a road surface or an obstacle) that reflected the light. The distance calculation unit 21b then generates a distance image based on the distances measured for the individual pixels.
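For illustration only, the flight-time-to-distance conversion performed by the distance calculation unit 21b can be sketched as follows. The function name and the sampled representation of the waveform are assumptions for this sketch, not part of this publication; the peak time of the received-light waveform is taken as the round-trip flight time, and the one-way distance is c·t/2.

```python
C = 299_792_458.0  # speed of light [m/s]

def distance_from_waveform(times_s, amounts):
    """Return the distance [m] for one pixel from its received-light waveform.

    times_s: elapsed time since projection for each sample [s]
    amounts: amount of received light for each sample
    """
    # Take the flight time at the sample with the maximum received amount.
    peak_index = max(range(len(amounts)), key=lambda i: amounts[i])
    tof = times_s[peak_index]
    return C * tof / 2.0  # round trip -> one-way distance
```

A waveform peaking 100 ns after projection, for example, would yield a distance of roughly 15 m under this sketch.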
 The obstacle detection unit 22 detects, based on distortion of the received-light waveform, the presence or absence of a height difference within each pixel, for example the presence or absence of an obstacle. The obstacle detection unit 22 includes a waveform analysis unit 22a and an obstacle determination unit 22b.
 The waveform analysis unit 22a analyzes, pixel by pixel, the received-light waveforms generated by the received-light waveform generation unit 21a. Specifically, in the present embodiment, the waveform analysis unit 22a compares the difference between the received-light waveforms of pixels adjacent in the horizontal direction X (an example of distortion of the received-light waveform) with a threshold, and determines whether the difference indicates a boundary between an obstacle and the road surface. Here, the “difference between received-light waveforms” means the difference in the amount of received light as a function of time.
 Based on the analysis result of the waveform analysis unit 22a, the obstacle determination unit 22b determines whether each pixel constituting the distance image contains an obstacle. For example, if the analysis result of the waveform analysis unit 22a contains pixels corresponding to a boundary, the obstacle determination unit 22b determines that there is an obstacle and identifies the pixels containing the obstacle based on that boundary. When the obstacle determination unit 22b determines that there is an obstacle, it outputs information based on the determination result to the vehicle drive device 210. For example, when determining that there is an obstacle, the obstacle determination unit 22b identifies the position or azimuth of the obstacle based on the pixels containing the obstacle, and outputs information indicating that position or azimuth to the vehicle drive device 210.
 The storage unit 30 is a storage medium that stores the programs and data necessary for realizing the functions of the detection device 100. The storage unit 30 can be realized by, for example, a hard disk (HDD), SSD, RAM, DRAM, ferroelectric memory, flash memory, magnetic disk, or a combination thereof. The storage unit 30 may temporarily store various kinds of information and may be configured, for example, to function as a work area of the control unit 20.
 The detection device 100 may further include a communication unit including a circuit that communicates with external devices in accordance with a predetermined communication standard. The predetermined communication standard includes, for example, LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), USB, and HDMI (registered trademark).
2. Operation
 The operation of the detection device 100 configured as described above is described below.
2-1. Overview of Distance Measurement and Received-Light Waveform Comparison
 FIG. 4A shows an example of a distance image V1 when an obstacle 2 is on the road 1 at a position near the sensor 10. The size of the obstacle 2 shown in FIG. 4A in the vertical direction Y exceeds the length dy corresponding to the vertical angle of view of one pixel. In the range corresponding to the pixel P1, the obstacle 2 is closer to the sensor 10 than the road 1 by its height. Although it depends on the material and orientation of the obstacle 2, the flight time of the light measured for the pixel P1 containing the obstacle 2 is, for example, shorter than the flight time measured for the pixel P2 without the obstacle 2. In this case, the distance to the obstacle 2 can be calculated by measuring the flight time of the light at the pixel P1, so that the obstacle 2 appears in the distance image V1.
 FIG. 4B shows an example of a distance image V2 when an obstacle 3 is on the road 1 at a position far from the sensor 10. The size of the obstacle 3 shown in FIG. 4B in the vertical direction Y is smaller than the length dy corresponding to the vertical angle of view of one pixel. In this case, the obstacle 3 may not appear in the distance image V2 (the broken lines in FIG. 4B indicate that the obstacle 3 does not appear in the distance image V2). Specifically, for example, the flight time of the light measured for the pixel P4 containing the obstacle 3 may be the same as the flight time measured for the pixel P3 without the obstacle 3. In that case, the distance to the road surface of the road 1 is measured based on the flight time of the light at the pixel P4, and the presence of the obstacle 3 is not detected. With reference to FIGS. 5A and 5B, the reason why an obstacle 3 whose size in the vertical direction Y is smaller than the length dy corresponding to the vertical angle of view of one pixel does not appear in the distance image V2 is described in more detail.
 FIG. 5A schematically shows an example of light projection from the sensor 10 onto the road surface when there is no obstacle within the range of the vertical angle of view θ of one pixel (an example of the predetermined angle region), corresponding to the pixel P3 in FIG. 4B. FIG. 5B shows an example of the received-light waveform at the pixel P3 corresponding to FIG. 5A; the horizontal axis is time and the vertical axis is the amount of received light. FIG. 5B shows the amount of light received as a function of the time elapsed since light was projected onto the region corresponding to the pixel P3. Since no obstacle is present on the road surface in FIG. 5A, the amount of received light reaches its maximum at the shortest distance dz1 from the sensor 10 to the road surface within the region corresponding to the pixel P3 (time t1).
 FIG. 6A schematically shows an example of light projection from the sensor 10 onto the obstacle and the road surface when the obstacle 3 is within the range of the vertical angle of view θ of one pixel (an example of the predetermined angle region), corresponding to the pixel P4 in FIG. 4B. FIG. 6B shows an example of the received-light waveform at the pixel P4 corresponding to FIG. 6A; the horizontal axis is time and the vertical axis is the amount of received light. FIG. 6B shows the amount of light received as a function of the time elapsed since light was projected onto the region corresponding to the pixel P4. The pixel P4 shown in FIG. 6A is on the same horizontal line as the pixel P3 shown in FIG. 5A. In FIG. 6A, the obstacle 3 is present on the road surface, but its size is smaller than the range of the vertical angle of view θ of one pixel. In this case, in the example shown in FIG. 6B, the amount of received light reaches its maximum at the same time t1 as when the obstacle 3 is absent (FIG. 5B). That is, the amount of light reflected from the road surface at the shortest distance dz1 from the sensor 10 within the region corresponding to the pixel P4 is larger than the amount of light reflected from the obstacle 3. There are various methods for calculating the distance; if, for example, the distance is calculated based on the time t1 at which the amount of received light is maximum, the same distance value is obtained as when the obstacle 3 is absent. In other words, the distance value for the pixel P4 is the distance from the sensor 10 to the road surface. The obstacle 3 within the range of the pixel P4 is therefore not detected, and the obstacle 3 does not appear in the distance image. The region corresponding to the vertical angle of view θ of one pixel is narrower the closer it is to the sensor 10 and wider the farther it is from the sensor 10. The farther an obstacle is from the sensor 10, the more easily it fits within the range of the vertical angle of view θ of one pixel, so obstacles far from the sensor 10 are difficult to detect.
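The situation of FIGS. 5B and 6B can be illustrated numerically. The values below are made up for this sketch and do not come from this publication: both waveforms peak at the same sample, so a distance reading based on the peak time is identical for the two pixels, even though the waveform shapes differ.

```python
# Pixel P3: the road return at the shortest distance dz1 dominates.
road_only = [0, 2, 10, 4, 1, 0]
# Pixel P4: an additional, weaker return from the small obstacle arrives
# later, but the road return at dz1 still gives the maximum.
road_and_obstacle = [0, 2, 10, 4, 3, 2]

peak_p3 = road_only.index(max(road_only))
peak_p4 = road_and_obstacle.index(max(road_and_obstacle))

assert peak_p3 == peak_p4              # same peak time -> same distance value
assert road_only != road_and_obstacle  # but the waveform shapes still differ
```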
 The detection device 100 of the present disclosure aims to detect such small obstacles that fit within the range of the vertical angle of view θ of one pixel. As shown in FIGS. 5B and 6B, the shape of the received-light waveform differs depending on whether the obstacle 3 is present. The detection device 100 of the present embodiment therefore compares the received-light waveforms of two pixels adjacent in the horizontal direction X. FIG. 7 is a diagram for explaining which adjacent pixels are compared. The detection device 100 compares the received-light waveforms of pixels adjacent in the horizontal direction X, for example the pixels P11 and P12, the pixels P12 and P13, the pixels P13 and P14, and so on. More specifically, in comparing the received-light waveforms of the pixels P11 and P12, for example, the detection device 100 determines whether the received-light waveform of the pixel P12 differs from that of the pixel P11; if there is a difference, it determines that the pixels P11 and P12 correspond to a boundary between the road surface and an obstacle (a pixel without the obstacle and a pixel with it).
 FIG. 8A shows an example in which the received-light waveform W11 corresponding to the pixel P11 is compared with the received-light waveform W12 corresponding to the pixel P12 adjacent to the pixel P11. FIG. 8B illustrates a waveform indicating the absolute value of the difference between the received-light waveforms W11 and W12.
 As shown in FIGS. 5B, 6B, and 8A, even for a small obstacle 3 that fits within the range of the vertical angle of view θ of one pixel, the shape of the waveform differs depending on whether the obstacle 3 is present. In the present embodiment, therefore, whether a pixel contains the obstacle 3 is determined based on this difference in waveform shape. Specifically, as shown in FIG. 8B, when the absolute value of the difference between the received-light waveforms is equal to or greater than a predetermined threshold, the adjacent pixels are determined to indicate a boundary. For example, the difference in the amount of received light is calculated at each identical elapsed time since projection, and it is determined whether any difference is equal to or greater than the threshold.
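As a minimal sketch of this boundary test (function name and sample-aligned waveform representation are assumptions for illustration), the two adjacent pixels' waveforms are compared sample by sample at the same elapsed times, and a boundary is flagged when any absolute difference reaches the threshold:

```python
def is_boundary(waveform_a, waveform_b, threshold):
    """True if the two received-light waveforms differ by >= threshold
    at any identical elapsed time since projection."""
    return any(abs(a - b) >= threshold for a, b in zip(waveform_a, waveform_b))
```

The threshold value would in practice be tuned to the sensor's noise level; no specific value is given in this publication.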
 Here, the sign of the difference value changes according to the reflectance of the light. Since the reflectance varies with the material, the reflectance of the obstacle 3 is, for example, higher than that of the road 1 if the obstacle 3 is white, and lower than that of the road 1 if the obstacle 3 is black. When the obstacle 3 contains a material with a higher reflectance than the road 1, the amount of light received from the obstacle 3 is larger than the amount of light received from the road 1 at the same position in the horizontal direction X. In this case, the difference value obtained by subtracting the received-light amount of the pixel P11 (road 1 only) from that of the pixel P12 (containing the obstacle 3) is positive, and the difference value obtained by subtracting the received-light amount of the pixel P15 (containing the obstacle 3) from that of the pixel P16 (road 1 only) is negative. FIGS. 5B, 6B, and 8A illustrate the case where the obstacle 3 contains a material with a higher reflectance than the road 1. Conversely, when the obstacle 3 contains a material with a lower reflectance than the road 1, the amount of light received from the obstacle 3 is smaller than the amount of light received from the road 1 at the same position in the horizontal direction X. In that case, the difference value obtained by subtracting the received-light amount of the pixel P11 (road 1 only) from that of the pixel P12 (containing the obstacle 3) is negative, and the difference value obtained by subtracting the received-light amount of the pixel P15 (containing the obstacle 3) from that of the pixel P16 (road 1 only) is positive. As shown in FIG. 8B, the boundary between the obstacle 3 and the road 1 can therefore be identified in either case by comparing the absolute value of the difference between the received-light waveforms with the predetermined threshold.
2-2. Operation of the Distance Image Generation Unit
 The operation of the received-light waveform generation unit 21a and the distance calculation unit 21b of the distance image generation unit 21 is described below. FIG. 9 is a flowchart illustrating the operation of the distance image generation unit 21. The distance image generation unit 21 causes the light projecting unit 11 to project light at a predetermined projection timing (S101). The received-light waveform generation unit 21a acquires from the light receiving unit 12 a received-light signal corresponding to the amount of received light (S102). Based on the projection timing and the received-light signal, the received-light waveform generation unit 21a generates received-light waveform data indicating the amount of received light as a function of the time elapsed since projection (S103). The distance calculation unit 21b calculates the distance corresponding to the flight time of the light based on the received-light waveform (S104); for example, it calculates the distance based on the time at which the amount of received light is largest. The generation of the received-light waveform in step S103 and the distance calculation in step S104 are performed for each pixel. The distance image generation unit 21 determines whether the distance calculation has been completed for one column of pixels (S105); if not, it repeats the waveform generation and distance calculation. When the distance calculation for one column of pixels is complete, the distance image generation unit 21 controls the scanning unit 13 to shift the light projection region R11 in the horizontal direction X, the scanning direction (S106), and returns to step S101. When the distances for one frame have been calculated, the distance image generation unit 21 generates a distance image (frame image) for that frame based on the calculated per-pixel distances.
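Steps S101 to S106 can be restated as a nested loop. This is a hypothetical sketch: `project`, `receive`, `waveform`, `distance`, and `shift_scan` stand in for the hardware and processing stages and are not named in this publication.

```python
def capture_frame(num_columns, num_rows,
                  project, receive, waveform, distance, shift_scan):
    """Return one frame (a list of columns) of per-pixel distances."""
    frame = []
    for _ in range(num_columns):
        column = []
        for row in range(num_rows):
            project()                      # S101: project light
            signal = receive(row)          # S102: acquire received-light signal
            w = waveform(signal)           # S103: waveform vs. elapsed time
            column.append(distance(w))     # S104: distance from flight time
        frame.append(column)               # S105: one column of pixels done
        shift_scan()                       # S106: shift projection region in X
    return frame
```

One projection per pixel and one scan shift per column, matching the flow of FIG. 9.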
2-3. Operation of the Obstacle Detection Unit
 The operation of the waveform analysis unit 22a and the obstacle determination unit 22b of the obstacle detection unit 22 is described with reference to FIGS. 10 to 12B. FIG. 10 is a flowchart illustrating the operation of the waveform analysis unit 22a. FIG. 11 is a flowchart illustrating the operation of the obstacle determination unit 22b. FIG. 12A schematically shows, by hatching, an example of pixels whose difference from the adjacent pixel is equal to or greater than the threshold. FIG. 12B schematically shows, by hatching, the region of the obstacle 3 identified based on the pixels P12 and P16 where the difference occurred.
 FIG. 10 shows the operation for one frame. The waveform analysis unit 22a acquires received-light waveform data from the received-light waveform generation unit 21a (S201) and calculates the difference between the received-light waveforms of pixels adjacent in the horizontal direction X (S202). The waveform analysis unit 22a then determines whether the absolute value of the difference is equal to or greater than the threshold (S203). If it is, the waveform analysis unit 22a determines that the adjacent pixels form a boundary between the road surface and an obstacle (S204); if it is less than the threshold, the waveform analysis unit 22a determines that the adjacent pixels are not a boundary (S205). The waveform analysis unit 22a determines whether the analysis of all pixels in the frame is complete (S206); if not, it returns to step S201.
 The operation shown in FIG. 11 is performed after the operation of the waveform analysis unit 22a shown in FIG. 10. The obstacle determination unit 22b determines whether the analysis result of the waveform analysis unit 22a contains any pixels determined to be a boundary in the frame (S301). If there are pixels determined to be a boundary, the obstacle determination unit 22b identifies the region of the obstacle based on that boundary (S302). If no pixel is determined to be a boundary, it determines that there is no obstacle in the frame (S303). The obstacle determination unit 22b outputs the determination result; for example, when there is an obstacle, it outputs information indicating the position or azimuth of the obstacle to the vehicle drive device 210 based on the coordinates of the pixels identified as the obstacle.
 The hatching of the pixel P12 in FIG. 12A indicates that the received-light waveform of the pixel P12 differs from that of the pixel P11. In the example of FIGS. 7 and 12A, since the difference between the received-light waveforms of the adjacent pixels P11 and P12 is equal to or greater than the threshold, the pixels P11 and P12 are determined in step S204 to form a boundary between the road surface and the obstacle, that is, a pixel without the obstacle and a pixel with it. The hatching of the pixel P16 in FIG. 12A indicates that the received-light waveform of the pixel P16 differs from that of the pixel P15. Since the difference between the received-light waveforms of the adjacent pixels P15 and P16 is equal to or greater than the threshold, the pixels P15 and P16 are determined in step S204 to form a boundary between the obstacle and the road surface, that is, a pixel with the obstacle and a pixel without it. In step S302, based on the pixels P12 and P16 whose waveforms differ from their neighbors, the region from the pixel P12 to the pixel P15 is identified as the region where the obstacle 3 exists, as shown in FIG. 12B.
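The region identification of step S302 for one row of pixels can be sketched as follows. The indexing convention (a flag on each pixel whose waveform differs from the pixel to its left) is an assumption for this illustration.

```python
def obstacle_span(diff_flags):
    """Return (start, end) pixel indices of the obstacle region in one row,
    or None if fewer than two boundary flags were found.

    diff_flags[i] is True when pixel i's waveform differs from pixel i-1's
    by at least the threshold.
    """
    flagged = [i for i, f in enumerate(diff_flags) if f]
    if len(flagged) < 2:
        return None
    # First flag: first obstacle pixel (P12 in FIG. 12A). Second flag: first
    # road pixel after the obstacle (P16), so the region ends one before it.
    return flagged[0], flagged[1] - 1
```

With flags at the positions of P12 and P16, the returned span covers P12 through P15, matching FIG. 12B.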
3. Summary
 The detection device 100 according to the present embodiment includes a light projecting unit 11, a light receiving unit 12, and a scanning unit 13. The light projecting unit 11 projects light. The light receiving unit 12 receives light incident from a predetermined angle region. The control unit 20 calculates the distance corresponding to the flight time of the light based on a received-light waveform indicating the amount of light received as a function of the time elapsed since projection into the predetermined angle region. The control unit 20 also detects the presence or absence of a height difference within the predetermined angle region based on distortion of the received-light waveform. The light receiving unit 12 separately receives light incident from a plurality of predetermined angle regions corresponding to a plurality of pixels; each predetermined angle region is the range corresponding to the vertical angle of view θ of one pixel. The control unit 20 calculates the distance for each pixel based on that pixel's received-light waveform, and detects the presence or absence of a height difference for each pixel based on the distortion of the received-light waveform. In the present embodiment, the distortion of the received-light waveform is the difference between the received-light waveforms of pixels adjacent in the horizontal direction.
 As described above, when an obstacle with a height difference is small enough to fit within the vertical angle of view of one pixel, the distance to the obstacle may not be measured correctly. That is, an obstacle smaller than one pixel in the vertical direction Y may not appear in the distance image. The detection device 100 of the present embodiment, however, detects pixels with a height difference, for example the boundary between the road surface and an obstacle, based on the difference between the per-pixel received-light waveforms. It can therefore detect even obstacles small enough to fit within the range of the vertical angle of view θ of one pixel.
 In the present embodiment, the detection device 100 is mounted on a vehicle 200 and detects, as the presence or absence of a height difference, the presence or absence of an obstacle forming a step on the road. This makes it possible to accurately detect even obstacles far from the vehicle 200, for example.
(Modification of Embodiment 1)
 Instead of calculating the difference between the received-light waveforms of adjacent pixels, the difference between the received-light waveforms of pixels at the same position in two different frame images may be calculated. For example, the difference between the received-light waveforms of the pixels at the same position (coordinates) in the previous frame image and the current frame image may be calculated. For a road, for example, pixels at the same position have similarly shaped received-light waveforms even in different frame images. The presence or absence of an obstacle can therefore be detected from the waveform difference by comparing the received-light waveforms of pixels in different frame images.
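This modification can be sketched as below. The data layout (frames as mappings from pixel coordinates to that pixel's waveform) is an assumption for illustration; the threshold comparison on the resulting difference is the same as in the main embodiment.

```python
def temporal_difference(prev_frame, curr_frame, xy):
    """Per-sample difference of one pixel's received-light waveform between
    the previous frame and the current frame, at the same coordinates xy."""
    prev_w, curr_w = prev_frame[xy], curr_frame[xy]
    return [c - p for c, p in zip(curr_w, prev_w)]
```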
(Embodiment 2)
 In Embodiment 1, whether a pixel pair is a boundary is determined by comparing the difference between the received-light waveforms with a threshold. In Embodiment 2, machine learning is used to determine whether the difference between the received-light waveforms indicates a boundary. The present embodiment uses, for example, deep learning.
 FIG. 13 schematically shows an example of a neural network. In the present embodiment, a boundary detection network that detects whether the difference between the received-light waveforms indicates a boundary is constructed as a multilayer neural network of the kind used in deep learning. The boundary detection network includes, in order from the input, a first fully connected layer L1 as the input layer, a second fully connected layer L2 as an intermediate layer, and an output layer L3. In the example of FIG. 13, there is a single intermediate layer, the second fully connected layer L2, but two or more intermediate layers may be included.
 Each of the layers L1 to L3 includes a plurality of nodes. For example, the amounts of received light for one pixel, as a function of the time elapsed since projection, are input to the nodes IN1, IN2, IN3, ... of the first fully connected layer L1, the input layer. The number of nodes of the second fully connected layer L2 can be set as appropriate depending on the embodiment. The nodes OUT1 and OUT2 of the output layer L3 output the result of whether the input corresponds to a boundary between an obstacle and the road surface.
 The boundary determination network is trained with difference data of received-light waveforms that indicate a boundary and difference data of received-light waveforms that do not. The trained boundary determination network is stored, for example, in the storage unit 30. The waveform analysis unit 22a uses this boundary determination network to determine whether the difference between the received-light waveforms indicates a boundary.
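The forward pass of a network with the structure of FIG. 13 can be sketched as follows. This is an untrained, minimal illustration; the layer sizes, the ReLU activation, and the softmax over the two output nodes are assumptions not specified in this publication, and the weights would come from the training described above.

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer: out[j] = sum_i inputs[i]*weights[j][i] + biases[j]."""
    return [sum(i * w for i, w in zip(inputs, row)) + b
            for row, b in zip(weights, biases)]

def relu(v):
    return [max(0.0, x) for x in v]

def softmax(v):
    exps = [math.exp(x - max(v)) for x in v]
    total = sum(exps)
    return [e / total for e in exps]

def boundary_network(diff_waveform, w1, b1, w2, b2):
    """Scores for [boundary, not-boundary] (nodes OUT1, OUT2) from a
    waveform-difference input fed to nodes IN1, IN2, ..."""
    hidden = relu(dense(diff_waveform, w1, b1))   # layer L1 -> L2
    return softmax(dense(hidden, w2, b2))         # layer L2 -> output layer L3
```

The two output values sum to one, so either node can be read as the network's confidence that the input difference marks a boundary.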
 FIG. 14 is a flowchart showing the operation of the waveform analysis unit 22a in the second embodiment. In FIG. 14, steps S401, S402, and S404 are the same as steps S201, S202, and S206 of the first embodiment. In the present embodiment, the waveform analysis unit 22a inputs the difference between the received-light waveforms of adjacent pixels into the trained boundary determination network (neural network) and acquires a determination result indicating whether or not a boundary is present (S403).
 In the present embodiment, as in the first embodiment, it is possible to detect the presence or absence of a small obstacle that fits within the vertical angle of view of a single pixel.
(Modification of Embodiment 2)
 Instead of training the neural network on the difference between received-light waveforms, an obstacle determination network configured as the neural network shown in FIG. 13 may be constructed, for example, by training on the received-light waveforms of pixels containing an obstacle and the received-light waveforms of road-surface pixels. In this case, for example, by inputting the received-light waveform of each pixel acquired in step S201 into the obstacle determination network (neural network) one pixel at a time, a determination result indicating whether or not each pixel contains an obstacle can be acquired. The obstacle detection unit 22 may output information indicating the position or direction of the obstacle to the vehicle drive device 210 based on, for example, the coordinates of the pixels determined to contain an obstacle. In this way, the control unit 20 may detect the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of height differences, such as those between an obstacle and the road surface.
(Embodiment 3)
 In the first and second embodiments, the obstacle detection unit 22 determines whether a pixel contains an obstacle based on the distortion of the received-light waveform. In the third embodiment, the distance value is used in addition to the distortion of the received-light waveform to determine whether an obstacle is present.
 FIG. 15 is a block diagram illustrating the configuration of the detection device 100 according to the third embodiment. In the present embodiment, the obstacle determination unit 22b determines whether a pixel contains an obstacle based on the boundary determination result from the waveform analysis unit 22a and the distance value calculated by the distance calculation unit 21b. For example, the distance values of a pixel determined to be a boundary and of its surrounding pixels may be referenced to verify whether the boundary determination is correct. This makes it possible to detect the presence or absence of a small obstacle that fits within the vertical angle of view of a single pixel with higher accuracy.
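One way this verification step could look, as a minimal sketch under assumptions of our own (the 0.5 m threshold and the use of the two horizontal neighbours are illustrative, not taken from the disclosure):

```python
# Hypothetical sketch of the Embodiment-3 check: a pixel flagged as a
# boundary by the waveform analysis is accepted only if its distance
# value also jumps relative to a neighbouring pixel.
def verify_boundary(distances, idx, threshold=0.5):
    """distances: per-pixel distance values [m]; idx: pixel flagged as boundary."""
    neighbours = []
    if idx > 0:
        neighbours.append(distances[idx - 1])
    if idx < len(distances) - 1:
        neighbours.append(distances[idx + 1])
    # Accept the boundary judgement only if some neighbour's distance differs.
    return any(abs(distances[idx] - d) > threshold for d in neighbours)

# Pixel 2 is ~1.8 m closer than its neighbours -> judgement confirmed.
confirmed = verify_boundary([10.0, 10.1, 8.2, 10.0], 2)   # True
rejected = verify_boundary([10.0, 10.0, 10.0, 10.0], 2)   # False
```

The point of the combination is that the waveform distortion flags candidate pixels while the distance values filter out false positives.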
(Other Embodiments)
 In the embodiments above, an example of scanning in the horizontal direction X was described; however, the scanning unit 13 may include a mechanism that changes the light projection direction in the vertical direction Y and may scan in the vertical direction Y. Also, although the embodiments above include the scanning unit 13, the scanning unit 13 may be omitted as appropriate. For example, when the light source elements of the light projecting unit 11 and the light receiving elements of the light receiving unit 12 are arranged in a two-dimensional array, distance measurement similar to that of the first embodiment can be performed without the scanning unit 13.
 In the embodiments above, the detection device 100 scans the projection plane R1 corresponding to a plurality of pixels and calculates the distance of each pixel to generate a distance image, but the detection device 100 may instead project and receive light only in a region corresponding to a single pixel. The detection device 100 may, for example, include one light source element and one light receiving element and use the obstacle determination network described above to determine, from the received-light waveform of the region corresponding to that single pixel, whether an obstacle is present within the pixel.
 In the embodiments above, the detection device 100 detects obstacles such as curbs that form steps on a road, but the detection target is not limited to obstacles on a road. The detection device 100 of the present disclosure can detect anything that has a height difference; for example, it can also detect a step caused by a depression in the road. Furthermore, although the embodiments above describe the detection device 100 mounted on the vehicle 200, the detection device 100 according to the present disclosure is not limited to in-vehicle use and is applicable to various applications. For example, the detection device 100 may be installed in a factory and used to measure the distance to a part. In this case, by detecting the presence or absence of a height difference within a pixel, the detection device 100 can determine, for example, whether the part is scratched.
 In the embodiments above, an example was described in which the distance calculation unit 21b generates a distance image. However, the distance calculation unit 21b is not limited to generating a distance image and may generate information indicating distances in various formats, for example three-dimensional point cloud data.
 In the embodiments above, the distance image generation unit 21 includes the received-light waveform generation unit 21a, but the obstacle detection unit 22 may include the received-light waveform generation unit 21a instead, or the distance image generation unit 21 and the obstacle detection unit 22 may each include a received-light waveform generation unit 21a.
 In the embodiments above, the detection device 100 includes the distance calculation unit 21b, but the distance calculation unit 21b may be omitted. For example, the detection device 100 may include the sensor 10, the received-light waveform generation unit 21a, and the obstacle detection unit 22, and may detect the presence or absence of a height difference within a pixel from the distortion of the received-light waveform.
(Appendix)
 Various embodiments of the present disclosure have been described above, but the present disclosure is not limited to the above, and various modifications can be made within a scope in which the technical idea remains substantially the same. Various aspects according to the present disclosure are set out below.
 A detection device according to a first aspect of the present disclosure includes: a light projecting unit (11) that projects light; a light receiving unit (12) that receives light incident from a predetermined angular region; and a control unit (20) that calculates a distance corresponding to the time of flight of the light based on a received-light waveform indicating the amount of light received as a function of the elapsed time since light projection in the predetermined angular region, wherein the control unit (20) detects the presence or absence of a height difference within the predetermined angular region based on distortion of the received-light waveform.
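For illustration only (the aspect does not prescribe a particular algorithm), the time-of-flight distance calculation can be sketched as taking the elapsed time at the peak of the received-light waveform as the round-trip flight time and converting it to a one-way distance. The bin width and waveform values below are invented for the example.

```python
# Illustrative sketch: distance from the received-light waveform.
# The waveform is the received amount per elapsed-time bin since projection.
C = 299_792_458.0  # speed of light [m/s]

def distance_from_waveform(waveform, dt):
    """waveform: received-light amount per time bin; dt: bin width [s]."""
    t_flight = max(range(len(waveform)), key=waveform.__getitem__) * dt
    return C * t_flight / 2.0  # round-trip time -> one-way distance

# Peak in bin 101 with 1 ns bins -> about 15.14 m.
d = distance_from_waveform([0] * 100 + [5, 9, 4] + [0] * 50, 1e-9)
```

A waveform containing two peaks (e.g. obstacle and road surface within one pixel) is exactly the kind of distortion the control unit exploits to detect a height difference.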
 In a second aspect, in the detection device of the first aspect, the light receiving unit (12) separately receives light incident from a plurality of predetermined angular regions corresponding to a plurality of pixels, and the control unit (20) calculates the distance for each pixel based on the received-light waveform of each pixel and detects the presence or absence of a height difference for each pixel based on the distortion of the received-light waveform.
 In a third aspect, in the detection device of the second aspect, the distortion of the received-light waveform is the difference between the received-light waveforms of horizontally adjacent pixels.
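A minimal sketch of this distortion metric, under our own assumptions (the per-bin subtraction and the fixed threshold are illustrative; the disclosure leaves the comparison criterion to the embodiments):

```python
# Sketch of the third aspect: the distortion metric is the per-time-bin
# difference between the received-light waveforms of horizontally
# adjacent pixels; a large difference suggests a boundary pixel.
def waveform_difference(wf_a, wf_b):
    return [a - b for a, b in zip(wf_a, wf_b)]

def looks_like_boundary(wf_a, wf_b, threshold=3.0):
    return max(abs(d) for d in waveform_difference(wf_a, wf_b)) > threshold

road = [0, 1, 4, 9, 4, 1, 0]    # single-peak waveform (road surface only)
mixed = [0, 5, 2, 9, 4, 1, 0]   # extra early peak from a low obstacle
flag = looks_like_boundary(road, mixed)
```

Two adjacent road-surface pixels yield a near-zero difference, while an obstacle partially entering one pixel produces an early extra peak and hence a large difference.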
 In a fourth aspect, in the detection device of the second aspect, the control unit detects the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of height differences.
 In a fifth aspect, in the detection device of the second aspect, the control unit detects the presence or absence of a height difference for each pixel based on the distortion of the received-light waveform and the distance.
 In a sixth aspect, in the detection device of the second aspect, the control unit generates a frame image including the plurality of pixels according to the distance of each pixel, and the distortion of the received-light waveform is the difference between the received-light waveforms of pixels at the same position in two different frame images.
 In a seventh aspect, in the detection device of the first aspect, the detection device is mounted on a vehicle, and the presence or absence of a height difference is the presence or absence of a step or an obstacle on a road.
 A detection method according to an eighth aspect includes: a step (S101) of projecting light with a light projecting unit (11); a step (S102) of receiving, with a light receiving unit (12), light incident from a predetermined angular region; a step (S103, S104) of calculating, with a control unit (20), a distance corresponding to the time of flight of the light based on a received-light waveform indicating the amount of light received as a function of the elapsed time since light projection in the predetermined angular region; and a step (S201 to S205, S301 to S303) of detecting, with the control unit (20), the presence or absence of a height difference within the predetermined angular region based on distortion of the received-light waveform.
 The detection device of the present disclosure is applicable to, for example, autonomous vehicles, self-propelled robots, and AGVs (Automated Guided Vehicles).
DESCRIPTION OF REFERENCE NUMERALS
  10    sensor
  11    light projecting unit
  12    light receiving unit
  13    scanning unit
  20    control unit
  21    distance image generation unit
  21a   received-light waveform generation unit
  21b   distance calculation unit
  22    obstacle detection unit
  22a   waveform analysis unit
  22b   obstacle determination unit
  30    storage unit
  100   detection device
  200   vehicle
  210   vehicle drive device

Claims (8)

  1.  A detection device comprising:
      a light projecting unit that projects light;
      a light receiving unit that receives light incident from a predetermined angular region; and
      a control unit that calculates a distance corresponding to a time of flight of the light based on a received-light waveform indicating an amount of light received according to an elapsed time since light projection in the predetermined angular region,
      wherein the control unit detects the presence or absence of a height difference within the predetermined angular region based on distortion of the received-light waveform.

  2.  The detection device according to claim 1, wherein
      the light receiving unit separately receives light incident from a plurality of predetermined angular regions corresponding to a plurality of pixels,
      the control unit calculates the distance for each pixel based on the received-light waveform of each pixel, and
      the control unit detects the presence or absence of a height difference for each pixel based on the distortion of the received-light waveform.

  3.  The detection device according to claim 2, wherein the distortion of the received-light waveform is a difference between the received-light waveforms of horizontally adjacent pixels.

  4.  The detection device according to claim 2, wherein the control unit detects the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of height differences.

  5.  The detection device according to claim 2, wherein the control unit detects the presence or absence of a height difference for each pixel based on the distortion of the received-light waveform and the distance.

  6.  The detection device according to claim 2, wherein
      the control unit generates a frame image including the plurality of pixels according to the distance of each pixel, and
      the distortion of the received-light waveform is a difference between the received-light waveforms of pixels at the same position in two different frame images.

  7.  The detection device according to claim 1, wherein the detection device is mounted on a vehicle, and the presence or absence of a height difference is the presence or absence of a step or an obstacle on a road.

  8.  A detection method comprising:
      projecting light with a light projecting unit;
      receiving, with a light receiving unit, light incident from a predetermined angular region;
      calculating, with a control unit, a distance corresponding to a time of flight of the light based on a received-light waveform indicating an amount of light received according to an elapsed time since light projection in the predetermined angular region; and
      detecting, with the control unit, the presence or absence of a height difference within the predetermined angular region based on distortion of the received-light waveform.
PCT/JP2019/008745 2018-03-14 2019-03-06 Detecting device and detecting method WO2019176667A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-047291 2018-03-14
JP2018047291A JP6812997B2 (en) 2018-03-14 2018-03-14 Detection device and detection method

Publications (1)

Publication Number Publication Date
WO2019176667A1 true WO2019176667A1 (en) 2019-09-19

Family

ID=67908174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008745 WO2019176667A1 (en) 2018-03-14 2019-03-06 Detecting device and detecting method

Country Status (2)

Country Link
JP (1) JP6812997B2 (en)
WO (1) WO2019176667A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003042757A (en) * 2001-07-31 2003-02-13 Omron Corp Distance measuring instrument for vehicle
EP1286178A2 (en) * 2001-08-23 2003-02-26 IBEO Automobile Sensor GmbH Method for optical ground detection
JP2016044969A (en) * 2014-08-19 2016-04-04 株式会社デンソー On-vehicle radar system
JP2017032329A (en) * 2015-07-30 2017-02-09 シャープ株式会社 Obstacle determination device, mobile body, and obstacle determination method


Also Published As

Publication number Publication date
JP6812997B2 (en) 2021-01-13
JP2019158686A (en) 2019-09-19

Similar Documents

Publication Publication Date Title
CN111742241B (en) Optical distance measuring device
EP3187895B1 (en) Variable resolution light radar system
US9207074B2 (en) Distance measurement apparatus, and distance measurement method
US10698092B2 (en) Angle calibration in light detection and ranging system
US8217327B2 (en) Apparatus and method of obtaining depth image
US10509111B2 (en) LIDAR sensor device
US9134117B2 (en) Distance measuring system and distance measuring method
JP2015179078A (en) Parallax calculation system and distance measurement device
JP6464410B2 (en) Obstacle determination device and obstacle determination method
CN109444916A (en) The unmanned travelable area determining device of one kind and method
JP6186863B2 (en) Ranging device and program
JP4691701B2 (en) Number detection device and method
WO2016051861A1 (en) Obstacle determination device and obstacle determination method
WO2019176667A1 (en) Detecting device and detecting method
WO2024106214A1 (en) Object detection device and object detection method
CN110554398B (en) Laser radar and detection method
CN118786358A (en) System and method for solid-state LiDAR with adaptive vignette correction
US11921216B2 (en) Electronic apparatus and method for controlling thereof
JP2008180646A (en) Shape measuring device and shape measuring technique
US20240264286A1 (en) Control method and apparatus, lidar, and terminal device
US12146764B2 (en) Ranging method and range finder
US20220319025A1 (en) Output control device, distance measuring device comprising the same, output control method, and output control program
US20230066857A1 (en) Dynamic laser emission control in light detection and ranging (lidar) systems
US20240241233A1 (en) Method for operating a detection device, detection device, and vehicle comprising at least one detection device
JP6819161B2 (en) Coordinate calculation method and coordinate calculation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19767059

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19767059

Country of ref document: EP

Kind code of ref document: A1
