
WO2012005377A1 - Collision time calculation device and method, and program - Google Patents

Collision time calculation device and method, and program

Info

Publication number
WO2012005377A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
time
point
optical flow
Prior art date
Application number
PCT/JP2011/065830
Other languages
English (en)
Japanese (ja)
Inventor
秋田 時彦
Original Assignee
アイシン精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by アイシン精機株式会社
Publication of WO2012005377A1

Links

Images

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 - Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 - Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 - Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134 - Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30252 - Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 - Obstacle

Definitions

  • the present invention relates to a collision time calculation device, a collision time calculation method, and a program, and more particularly, to a collision time calculation device, a collision time calculation method, and a program for calculating a time until a collision with an approaching object.
  • a driving support system that detects an object, such as another vehicle, approaching the host vehicle based on images output from a camera mounted on the vehicle is being put into practical use.
  • this type of driving support system detects an object approaching the host vehicle based on the magnitude of a vector (optical flow) indicating the time-series position change of feature points included in an image (see, for example, Patent Documents 1 and 2).
  • the devices of Patent Documents 1 and 2 constantly monitor the average magnitude of the optical flow, which is correlated with the positional relationship between the host vehicle and an object approaching it. When the average magnitude of the optical flow exceeds a threshold, a warning is issued to the driver.
  • an optical flow represents the locus (vector) of a point (feature point) on an image corresponding to a characteristic part of an actually existing object. Depending on the positional relationship between the vehicle and the object, an error therefore arises between the trajectory of the feature point on the image and the trajectory of the characteristic part in the real world. Because of this error, the above-described devices may fail to detect an object approaching the host vehicle at an appropriate timing.
  • the present invention has been made under the above circumstances, and an object thereof is to appropriately predict the time until an object approaching the host vehicle collides with the host vehicle.
  • a collision time calculation device according to the first aspect of the present invention includes: an extraction unit that extracts a first feature point included in a first image captured by an imaging unit mounted on a vehicle, and a second feature point, included in a second image, that corresponds to the first feature point; and a prediction unit that calculates a first predicted time until an object in the field of view of the imaging unit collides with the vehicle, based on the magnitude of the optical flow having the first feature point as its start point and the second feature point as its end point, and on the time from when the first image was captured to when the second image was captured.
  • the prediction unit may calculate the first predicted time using a second predicted time until the object collides with a first surface that includes the optical center of the imaging unit and is perpendicular to the optical axis of the imaging unit.
  • the second predicted time may be calculated using a ratio between a distance from the vanishing point of the optical flow to a start point of the optical flow and the magnitude of the optical flow.
  • the first predicted time may be calculated using a distance between the first surface and a second surface that includes the foremost point of the vehicle and is orthogonal to the optical axis of the imaging unit.
  • the first predicted time may be calculated using a distance between the portion of the object that corresponds to the first or second feature point and the first surface.
  • where the time from when the first image was captured to when the second image was captured is Δt,
  • the distance from the vanishing point of the optical flow to the start point of the optical flow is x2,
  • and the distance from the vanishing point of the optical flow to the end point of the optical flow is x1,
  • the second predicted time TTCc may be calculated using the following equation.
  • the prediction unit may calculate the second predicted time TTCc using an approximate expression represented by the following expression, where the time from when the first image was captured to when the second image was captured is Δt, the X coordinate of the vanishing point in the XY coordinate system included in the first surface is xvp, the X coordinate of the start point of the optical flow is x1, the X coordinate of the end point of the optical flow is x2, and the focal length of the imaging unit is f.
  • where the second predicted time is TTCc,
  • the distance between the first surface and the second surface that includes the foremost point of the vehicle and is orthogonal to the optical axis of the imaging unit is L,
  • and the distance between the portion of the object corresponding to the first or second feature point and the first surface is Z1,
  • the first predicted time TTC may be calculated using the following equation.
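  • the equations referred to in these aspects are not reproduced in this text; under the standard pinhole-camera and constant relative-velocity assumptions, forms consistent with the quantities defined above (a hedged reconstruction, not a transcription of the published equations) are:

```latex
% d_s, d_e: distances from the vanishing point (FOE) to the start and end
% points of the optical flow; \Delta t: time between the two images.
TTCc \approx \Delta t \,\frac{d_s}{d_e - d_s},
\qquad
TTC \approx TTCc \,\frac{Z_1 - L}{Z_1}
```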
  • where the focal length of the imaging unit is f,
  • the arrangement interval of the pixels constituting the first or second image is δ,
  • the distance between the road surface on which the vehicle travels and the imaging unit is h,
  • and the distance between the vanishing point and the road surface in the first or second image is yb,
  • the distance Z1 may be calculated using the following equation.
  • where the width of the object is WV,
  • the width of the object in the second image is wv,
  • the width of the second image is wc
  • the viewing angle of the photographing unit corresponding to the second image is FOV.
  • the distance Z1 may be calculated using the following equation.
  • a collision time calculation method according to the second aspect of the present invention includes: a first step of extracting a first feature point included in a first image captured by imaging means mounted on a vehicle; a second step of extracting a second feature point, included in a second image captured by the imaging means, that corresponds to the first feature point; and a third step of calculating, based on the magnitude of the optical flow having the first feature point as its start point and the second feature point as its end point and on the time from when the first image was captured to when the second image was captured, a first predicted time until an object in the field of view of the imaging means collides with the vehicle.
  • a program according to the third aspect of the present invention causes a computer to execute: a first procedure of extracting a first feature point included in a first image captured by imaging means mounted on a vehicle; a second procedure of extracting a second feature point, included in a second image captured by the imaging means, that corresponds to the first feature point; and a third procedure of calculating, based on the magnitude of the optical flow having the first feature point as its start point and the second feature point as its end point and on the time from when the first image was captured to when the second image was captured, a first predicted time until an object in the field of view of the imaging means collides with the vehicle.
  • FIG. 4 is a first diagram illustrating an image captured by the imaging apparatus. FIG. 5 is a second diagram illustrating an image captured by the imaging apparatus.
  • FIG. 1 is a block diagram showing a schematic configuration of a collision time calculation system 10 according to the present embodiment.
  • the collision time calculation system 10 is a system that calculates the time until an object such as an approaching vehicle that is installed in a vehicle and approaches the vehicle collides with the vehicle.
  • the collision time calculation system 10 includes an imaging device 20 and a collision time calculation device 30.
  • the imaging device 20 is a device that includes, for example, a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, converts the image acquired by photographing a subject into an electrical signal, and outputs the electrical signal. Moreover, the imaging device 20 includes, for example, an RTC (Real Time Clock) or a timer, and outputs the time at which each image was captured, the time difference between the capture of the first image and the second image, and the frame difference between the first image and the second image.
  • the imaging device 20 is attached to the upper part of the front window of the vehicle 100, for example, as shown in FIG.
  • the photographing apparatus 20 photographs the front of the vehicle 100 and outputs information (for example, the image, the photographing time of the image, the frame number, and the number of pixels) about the image acquired by photographing to the collision time calculating apparatus 30.
  • FIG. 3 is a diagram showing a relative positional relationship between the vehicle 100 and the vehicle 101.
  • a vehicle 101 traveling in front of the vehicle 100 comes relatively close to the vehicle 100.
  • the vehicle 101 at the position indicated by the arrow a1 relatively moves to the position indicated by the arrow a2 after a predetermined time has elapsed.
  • the vehicle 101 at the position indicated by the arrow a1 is first photographed by the photographing device 20, and then the vehicle 101 at the position indicated by the arrow a2 is photographed.
  • FIG. 4 is a diagram showing an image PH1 obtained by photographing the vehicle 101 at the position indicated by the arrow a1.
  • FIG. 5 is a diagram showing an image PH2 obtained by photographing the vehicle 101 at the position indicated by the arrow a2.
  • when the imaging device 20 captures the images PH1 and PH2, it outputs information regarding the images PH1 and PH2 (for example, the images themselves, their capture times, their frame numbers, the frame difference between the images PH1 and PH2, and the time difference between the images PH1 and PH2) to the collision time calculation device 30.
  • in the images PH1 and PH2, an xy coordinate system is defined whose origin is the point Xc corresponding to the optical center of the imaging device 20 (FOE: Focus of Expansion).
  • the origin Xc of the xy coordinate system coincides with the centers of the images PH1 and PH2.
  • the collision time calculation device 30 is a device that calculates, based on the images (for example, the images PH1 and PH2) output from the imaging device 20, the time until an approaching vehicle (the vehicle 101) that is relatively approaching the host vehicle (the vehicle 100) collides with the host vehicle.
  • the collision time calculation device 30 includes a storage unit 31, a feature point extraction unit 32, a correlation value calculation unit 33, an optical flow definition unit 34, a grouping processing unit 35, and a collision prediction time calculation unit 36.
  • the storage unit 31 stores information on images sequentially output from the photographing apparatus 20 in frame units in time series.
  • the storage unit 31 sequentially stores information as processing results of the units 32 to 36.
  • the feature point extraction unit 32 calculates a feature amount for each pixel constituting the image stored in the storage unit 31, and extracts feature points included in the image based on this feature amount. For example, when the luminance at a pixel M(x, y) in the xy coordinate system defined on the image PH1 shown in FIG. 4 is expressed by a function I(x, y), the feature quantity f(x, y) of that pixel is expressed by the following formula (1).
  • Ixx, Iyy, and Ixy are respectively expressed by the following equations (2) to (4).
  • K is a constant.
  • the feature point extraction unit 32 first calculates the feature amount f(x, y) for each pixel M(x, y) constituting the image PH1 using formula (1). Next, the feature point extraction unit 32 calculates the average value AVG(x, y) of the luminance of the pixels around the pixel M(x, y) (for example, the surrounding 4, 8, or 24 pixels), and divides the feature amount f(x, y) by the fourth power of the luminance average value AVG(x, y).
  • when the resulting value satisfies a predetermined condition (for example, exceeds a threshold), the pixel M(x, y) is extracted as a feature point.
  • the reason the feature quantity f(x, y) is divided by the fourth power of the average luminance value AVG(x, y) is to normalize the feature quantity f(x, y) with respect to brightness.
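  • formulas (1) to (4) are not reproduced in this text. The sketch below is a hedged illustration of the step described above: it assumes a Harris-type response built from the gradient products Ixx, Iyy, and Ixy with the constant K, normalized by the fourth power of the local mean luminance AVG(x, y); the window size, K, and the threshold are illustrative values, not values from the patent.

```python
import numpy as np

def extract_feature_points(image, K=0.04, threshold=1e-3, win=2):
    """Hedged sketch of the feature-point extraction described above.

    Formulas (1)-(4) are not reproduced in the text; this assumes a Harris-type
    response built from the gradient products Ixx, Iyy, Ixy and the constant K,
    normalized by the fourth power of the local mean luminance AVG(x, y).
    """
    I = image.astype(np.float64)
    Iy, Ix = np.gradient(I)                          # luminance gradients

    def box_sum(a, w=win):
        # Sum over a (2w+1) x (2w+1) window around each pixel.
        k = 2 * w + 1
        pad = np.pad(a, w, mode="edge")
        out = np.zeros_like(a)
        for dy in range(k):
            for dx in range(k):
                out += pad[dy:dy + a.shape[0], dx:dx + a.shape[1]]
        return out

    Ixx = box_sum(Ix * Ix)                           # assumed meaning of
    Iyy = box_sum(Iy * Iy)                           # Ixx, Iyy, Ixy
    Ixy = box_sum(Ix * Iy)
    f = Ixx * Iyy - Ixy ** 2 - K * (Ixx + Iyy) ** 2  # assumed form of f(x, y)
    avg = box_sum(I) / (2 * win + 1) ** 2            # local mean luminance AVG(x, y)
    v = f / np.maximum(avg, 1e-6) ** 4               # brightness normalization
    ys, xs = np.nonzero(v > threshold)               # pixels kept as feature points
    return list(zip(xs.tolist(), ys.tolist()))
```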
  • FIG. 4 shows feature points P1, P2, P3, and P4 related to the vehicle 101 extracted from the image PH1.
  • FIG. 5 shows feature points Q1, Q2, Q3, and Q4 extracted from the image PH2.
  • specifically, the feature point extraction unit 32 extracts, as feature points, points at which the contour of the vehicle 101 appearing in the images PH1 and PH2 changes sharply and points at which the shape of a part constituting the vehicle 101 changes discontinuously.
  • when the feature point extraction unit 32 completes the extraction of feature points for the images PH1 and PH2, it outputs information about the extracted feature points to the storage unit 31 and notifies the correlation value calculation unit 33 that the feature point extraction has been completed. Although a case where four feature points are extracted is described here, in practice a large number of feature points (for example, 30 or 100) are extracted from one image.
  • the correlation value calculator 33 sequentially selects the feature points P1 to P4 of the image PH1. Then, the correlation value calculator 33 calculates correlation values between the selected feature points P1 to P4 and the feature points Q1 to Q4 of the image PH2.
  • the correlation value calculation unit 33 defines a template TF1 having a predetermined shape (for example, a rectangle, a square, or an ellipse) centered on a feature point of the image PH1 (here, the feature point P1).
  • the template TF1 is an image composed of pixels arranged in a matrix of M rows and N columns.
  • the coordinates of the template TF1 refer to the coordinates of the center of the template.
  • the correlation value calculator 33 sequentially calculates the correlation value R of the template TF1 with respect to the image PH2 while moving the template TF1 in the vicinity of the feature points Q1 to Q4 of the image PH2.
  • the correlation value R can be calculated using, for example, the following equation (5) indicating normalized cross-correlation.
  • T (i, j) is the luminance of the pixel located in the i-th row and the j-th column of the template TF1.
  • I (i, j) is the luminance of the pixel located in the i-th row and the j-th column of the partial image of the image PH2 overlapping the template TF1.
  • IAVG is an average value of luminances of pixels constituting the partial image.
  • TAVG is an average value of the luminance of the pixels constituting the template.
  • IAVG and TAVG are represented by the following formulas (6) and (7).
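  • a minimal sketch of the template matching described above, assuming the common zero-mean form of normalized cross-correlation for equations (5) to (7); the search radius is an illustrative value.

```python
import numpy as np

def ncc(template, patch):
    """Zero-mean normalized cross-correlation between a template and an
    equally sized patch (a common reading of the 'normalized cross-correlation'
    named in the text; equations (5)-(7) themselves are not reproduced here)."""
    t = template.astype(np.float64) - template.mean()   # T(i, j) - TAVG
    p = patch.astype(np.float64) - patch.mean()          # I(i, j) - IAVG
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    return float((t * p).sum() / denom) if denom > 0 else 0.0

def best_match(img2, template, center, search=15):
    """Slide the template around `center` = (x, y) in img2 and return the
    position where the correlation value R is largest, mirroring the search
    performed near the feature points Q1 to Q4."""
    m, n = template.shape
    cx, cy = center
    best_r, best_xy = -1.0, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y0, x0 = cy + dy - m // 2, cx + dx - n // 2
            if y0 < 0 or x0 < 0 or y0 + m > img2.shape[0] or x0 + n > img2.shape[1]:
                continue
            r = ncc(template, img2[y0:y0 + m, x0:x0 + n])
            if r > best_r:
                best_r, best_xy = r, (cx + dx, cy + dy)
    return best_xy, best_r
```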
  • when the correlation value calculation unit 33 calculates a correlation value R based on the above equation (5), it stores information on the correlation value R (for example, its numerical value and information on the feature points used for the calculation) in the storage unit 31 in association with the position coordinates of the template TF1 in the xy coordinate system.
  • the correlation value calculation unit 33 performs the same processing as described above for the feature points P2, P3, and P4 of the image PH1. After calculating the correlation value R for all the feature points P1 to P4, the correlation value calculating unit 33 notifies the optical flow defining unit 34 that the calculation of the correlation value has been completed.
  • the optical flow defining unit 34 defines optical flows that start from the feature points P1 to P4 of the image PH1 and end at the feature points Q1 to Q4 of the image PH2.
  • specifically, the optical flow defining unit 34 specifies the feature point of the image PH2 (one of the feature points Q1 to Q4) that is closest to the coordinates of the template TF1 at the moment the correlation value R calculated for the feature point P1 using the template TF1 is maximized.
  • the feature point P1 is a feature point corresponding to the right end portion of the rear bumper of the vehicle 101. For this reason, the correlation value R calculated using the template TF1 is maximized when the center of the template TF1 substantially matches the feature point Q1 of the image PH2. Therefore, here, the feature point Q1 is specified as the feature point corresponding to the feature point P1.
  • the optical flow defining unit 34 defines an optical flow OP1 having a feature point P1 as a start point and a feature point Q1 as an end point in the xy coordinate system.
  • the optical flow defining unit 34 similarly defines the optical flow OP2 having the feature point P2 as the start point and the feature point Q2 as the end point in the above-described procedure. Further, the optical flow defining unit 34 defines an optical flow OP3 having a feature point P3 as a start point and a feature point Q3 as an end point. Further, the optical flow defining unit 34 defines an optical flow OP4 having a feature point P4 as a start point and a feature point Q4 as an end point.
  • when the optical flow defining unit 34 has defined the optical flows OP1 to OP4 for all the feature points P1 to P4, it outputs information on the optical flows OP1 to OP4 (for example, their vector quantities and the coordinates of their start and end points) to the storage unit 31 and notifies the grouping processing unit 35 that the optical flows have been defined.
  • the grouping processing unit 35 groups the defined optical flows OP1 to OP4. As shown in FIG. 6, the present embodiment describes a case where there are four optical flows related to the vehicle 101; in practice, however, tens or hundreds of feature points can be extracted from an image of the vehicle 101, so dozens or hundreds of optical flows can be defined.
  • the grouping processing unit 35 excludes optical flows that contain many noise components from among these tens or hundreds of optical flows, and groups the remaining optical flows. For example, when the vehicle 101 is in purely linear motion, the straight lines that contain the individual optical flows (straight lines extended from the start point of each optical flow) intersect at the vanishing point VP. Therefore, the grouping processing unit 35 excludes an optical flow from the grouping when the straight line containing it passes significantly far from the vanishing point VP, and regards the remaining optical flows as optical flows related to the same moving object and groups them (a minimal sketch of this step is shown after the next item).
  • the optical flows OP1 to OP4 related to the vehicle 101 are grouped as the optical flows of the vehicle 101.
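  • a minimal sketch of the grouping step, assuming a simple point-to-line distance test against the vanishing point VP; the tolerance max_dist is an illustrative value, not a value from the patent.

```python
import numpy as np

def line_to_point_distance(vp, start, end):
    """Perpendicular distance from the vanishing point `vp` to the straight
    line containing the optical flow that runs from `start` to `end`."""
    p = np.asarray(vp, dtype=float)
    a = np.asarray(start, dtype=float)
    b = np.asarray(end, dtype=float)
    d = b - a
    norm = np.linalg.norm(d)
    if norm == 0.0:
        return float(np.linalg.norm(p - a))
    # 2-D cross product |d x (p - a)| divided by |d|.
    return float(abs(d[0] * (p[1] - a[1]) - d[1] * (p[0] - a[0])) / norm)

def group_optical_flows(flows, vp, max_dist=5.0):
    """Keep the flows whose supporting straight line passes close to the
    vanishing point VP and discard the rest as noise; the retained flows are
    treated as belonging to the same moving object.
    `flows` is a list of ((x1, y1), (x2, y2)) start/end pixel pairs."""
    return [(s, e) for (s, e) in flows if line_to_point_distance(vp, s, e) <= max_dist]
```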
  • when the optical flow grouping is completed, the grouping processing unit 35 outputs information on the grouped optical flows (for example, their vector quantities, the coordinates of their start and end points, and their names) to the storage unit 31 and notifies the collision prediction time calculation unit 36 that the grouping has been completed.
  • the collision prediction time calculation unit 36 uses the optical flows OP1 to OP4 to calculate the collision prediction time TTC until the vehicle 101 collides with the vehicle 100.
  • FIG. 7 is a diagram for explaining processing executed by the collision prediction time calculation unit 36.
  • a point RP1 in FIG. 7 indicates the right end portion of the rear bumper of the vehicle 101 at the position indicated by the arrow a1 in FIG. 3.
  • a point RQ1 in FIG. 7 indicates the right end portion of the rear bumper of the vehicle 101 at the position indicated by the arrow a2 in FIG. 3.
  • This point RP1 corresponds to the feature point P1
  • the point RQ1 corresponds to the feature point Q1.
  • the point RP1 and the point RQ1 are also referred to as a corresponding point RP1 and a corresponding point RQ1, respectively.
  • the right end portion of the rear bumper constituting the vehicle 101 is also referred to as an index point for convenience.
  • a straight line LN1 in FIG. 7 indicates a plane including the image plane IM of the imaging device 20.
  • a straight line LN2 indicates a collision surface that includes the portion of the vehicle 100 on the most +Z side.
  • the origin O of the XYZ coordinate system coincides with the optical center of the photographing apparatus 20. Therefore, the distance from the origin O of the XYZ coordinate system to the straight line LN1 is equal to the focal length f of the imaging device 20. The distance between the origin O and the straight line LN2 is assumed to be L.
  • the vehicle 101 located at the position indicated by the arrow a1 in FIG. 3 has moved to the position indicated by the arrow a2 in FIG. 3 when the image PH2 is photographed.
  • the vector MV0 in the XYZ coordinate system corresponding to the optical flow OP1, which has the feature point P1 as its start point and the feature point Q1 as its end point, has the corresponding point RP1 as its start point and the corresponding point RQ1 as its end point.
  • the optical flow OP1 indicates the movement locus (vector quantity) of the feature point in the image plane IM of the photographing apparatus 20.
  • a vector MV0 indicates the movement locus of the corresponding point in the XYZ coordinate system. In the present embodiment, since the vehicle 101 relatively moves in parallel with the Z axis, the vector MV0 is parallel to the Z axis.
  • the feature point P1 and the corresponding point RP1 are arranged on a straight line LN3 passing through the origin O in the XYZ coordinate system.
  • the feature point Q1 and the corresponding point RQ1 are arranged on a straight line LN4 passing through the origin O in the XYZ coordinate system.
  • the relationship between the magnitude of the vector MV2, which indicates the trajectory until the index point of the vehicle 101 coinciding with the corresponding point RQ1 reaches the point CP1 (whose X coordinate is X1) on the X axis, and the magnitude of the vector MV0 is expressed by the following equation (8).
  • the vector MV1 indicating the trajectory until the index point of the vehicle 101 reaches the point CP2 on the collision plane indicated by the straight line LN2 from the corresponding point RQ1 is parallel to the Z axis.
  • the relationship between the magnitude of the vector MV1 and the magnitude of the vector MV2 is expressed by the following equation (10), using the distance Z1 from the X axis to the corresponding point RQ1 and the distance L from the X axis to the point CP2.
  • L in the above equation (11) is a distance between the X axis and the collision surface indicated by the straight line LN2, and is a known value that is substantially equal to the distance between the mounting position of the imaging device 20 and the front end of the vehicle 100. Therefore, if the value of the distance Z1 between the X axis and the corresponding point RQ1 is known, the collision prediction time calculation unit 36 uses the above equation (11) to predict the collision prediction time TTC until the vehicle 101 collides with the vehicle 100. Can be calculated.
  • the collision prediction time calculation unit 36 calculates the distance Z1 using the following equation (12).
  • f is a focal length of the photographing apparatus 20.
  • δ is the arrangement interval, in the y-axis direction, of the pixels constituting the images PH1 and PH2.
  • h is a distance between the road surface on which the vehicle 100 travels and the photographing apparatus 20.
  • yb is a distance between the feature point Q1 and the road surface on which the vehicle 100 travels in the image PH2, as shown in FIG.
  • after calculating the distance Z1 using equation (12), the collision prediction time calculation unit 36 calculates the collision prediction time TTC until the vehicle 101 collides with the vehicle 100 by substituting the calculated distance Z1 into equation (11). Then, the collision prediction time calculation unit 36 outputs the collision prediction time TTC to an external device or the like.
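  • the equations referred to above are not reproduced in this text; the sketch below uses standard pinhole-camera and constant relative-velocity forms for TTCc, Z1, and the corrected TTC that are consistent with the quantities defined above (a hedged reconstruction, not a transcription of the published equations).

```python
def predicted_collision_time(x_start, x_end, dt, f, pixel_pitch, h, yb, L):
    """Hedged sketch of the TTC computation described above.

    The published equations are not reproduced in the text; the relations
    below are standard pinhole-camera / constant relative-velocity forms
    consistent with the quantities defined above, not verbatim transcriptions.

    x_start, x_end : distances (pixels) of the flow's start and end points
                     from the vanishing point (FOE)
    dt             : time between the two images [s]
    f, pixel_pitch : focal length [m] and pixel spacing delta [m/pixel]
    h              : camera height above the road surface [m]
    yb             : pixel distance from the vanishing point to the road
                     contact of the object in the second image
    L              : distance from the camera plane to the front end of the
                     host vehicle [m]
    """
    # Time until the object reaches the plane through the optical centre.
    ttcc = dt * x_start / (x_end - x_start)
    # Distance Z1 from the object to that plane (ground-plane geometry).
    z1 = f * h / (pixel_pitch * yb)
    # Correct TTCc for the offset L between the optical centre and the
    # collision surface at the front of the vehicle.
    return ttcc * (z1 - L) / z1
```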
  • the external device is, for example, a device that includes a speaker and issues an alarm to the driver.
  • the external device or the like to issue a warning for avoiding a collision to the driver when, for example, the collision prediction time TTC is equal to or less than a threshold value.
  • as described above, in the present embodiment, the time TTCc until the vehicle 101 reaches the surface that includes the optical center of the imaging device 20 mounted on the vehicle 100 is calculated using the optical flow. Based on this time TTCc, a predicted collision time TTC until the vehicle 101 collides with the collision surface of the vehicle 100 is calculated.
  • as a result, the predicted collision time TTC is calculated without being greatly affected by the error that arises between the magnitude of the vector MV0, which indicates the trajectory of the index point of the vehicle 101, and the magnitude of the optical flow OP1, even when the vehicle 101 is far from the vehicle 100. Therefore, by calculating the predicted collision time TTC before the vehicle 100 and the vehicle 101 collide, it is possible to issue an alarm for avoiding the collision to the driver at an appropriate timing.
  • the collision prediction time TTC is obtained by correcting the time TTCc, calculated using the optical flow, that the vehicle 101 requires to reach the surface including the optical center of the imaging device 20, taking into account the distance L from the optical center of the imaging device 20 to the collision surface. This correction uses the distance Z1, which contains a considerable detection error; however, since the error included in the distance Z1 becomes smaller as the vehicle 100 and the vehicle 101 get closer, the reliability of the collision prediction time TTC is sufficiently maintained.
  • in this way, the collision prediction time TTC until the object collides with the vehicle 100 is calculated based on the time until the object moving relative to the vehicle 100 reaches the surface that includes the optical center of the imaging device 20.
  • the predicted collision time TTC can be calculated without being greatly affected by the error even if the vehicle 100 and the object are separated from each other. Therefore, it is possible to issue a warning to the driver at an appropriate timing based on the predicted collision time TTC.
  • next, a case will be described in which the vehicle 100 travels in the +Z direction and the vehicle 101 travels in a direction crossing the Z axis (see FIG. 8).
  • in this case, in the XYZ coordinate system whose origin O is the optical center of the photographing apparatus 20, the vehicle 101 relatively moves in a direction obtained by combining the traveling direction of the vehicle 100 and the traveling direction of the vehicle 101.
  • the vehicle 101 at the position indicated by the arrow a1 in FIG. 8 moves relatively to the position indicated by the arrow a2 when a predetermined time has elapsed.
  • the vehicle 101 at the position indicated by the arrow a1 is first photographed by the photographing device 20, and then the vehicle 101 at the position indicated by the arrow a2 is photographed.
  • FIG. 9 shows optical flows OP1 to OP4 defined by the optical flow defining unit 34.
  • the vehicle 101 approaches the vehicle 100 by moving relative to the vehicle 100 in the direction intersecting the Y axis. For this reason, the vanishing point VP of the optical flows OP1 to OP4 does not coincide with the origin O of the xy coordinate system.
  • FIG. 10 is a diagram for explaining processing executed by the collision prediction time calculation unit 36.
  • the straight line LN5 is a straight line that passes through the origin and is orthogonal to the vector MV0.
  • Point CP3 is an intersection of a straight line passing through corresponding point RQ1 and point CP1 and straight line LN5.
  • the point CP3 and the point CP1 are illustrated with a certain distance therebetween.
  • the distance between the point CP3 and the point CP1 is significantly smaller than the distance L between the origin O and the collision surface indicated by the straight line LN2. Therefore, the vector MV3 that is parallel to the vector MV2 and that has the corresponding point RQ1 as the start point and the point CP3 as the end point may be handled as having the same size as the vector MV2.
  • the collision prediction time calculation unit 36 calculates the time TTCc0 as an approximate value of the time TTCc until the vehicle 101 collides with the surface indicated by the straight line LN5, based on the following equation (13).
  • Xvp is the X coordinate of the vanishing point VP.
  • the collision prediction time calculation unit 36 substitutes the time TTCc0 calculated by the above equation (13) into the above equation (11) as the time TTCc, and thereby calculates the collision prediction time TTC until the vehicle 101 collides with the vehicle 100. Then, the collision prediction time calculation unit 36 outputs the calculated collision prediction time TTC to an external device or the like.
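  • equation (13) is likewise not reproduced in this text. Measured in image (pixel) coordinates, where the focal length f cancels out of the ratio, an approximation consistent with the symbols defined for this case (Δt, xvp, the start-point X coordinate x1, and the end-point X coordinate x2) would be the following hedged reconstruction.

```latex
% Hedged reconstruction, not a transcription of the published equation (13):
% approximate time, measured from the second image, for the object to reach
% the plane LN5 when the FOE is offset to x_{vp}.
TTCc0 \approx \Delta t \cdot \frac{x_1 - x_{vp}}{x_2 - x_1}
```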
  • the external device or the like to issue a warning for avoiding a collision to the driver when, for example, the collision prediction time TTC is equal to or less than a threshold value.
  • the time TTCc0 until the vehicle 101 reaches the point CP3 on the straight line LN5 is calculated as an approximate value of the time TTCc.
  • a predicted collision time TTC until the vehicle 101 collides with the collision surface of the vehicle 100 is calculated.
  • as a result, the predicted collision time TTC is calculated without being greatly affected by the error that arises between the magnitude of the vector MV0, which indicates the trajectory of the index point of the vehicle 101, and the magnitude of the optical flow OP1, even when the vehicle 101 is far from the vehicle 100. Therefore, by calculating the predicted collision time TTC before the vehicle 100 and the vehicle 101 collide, it is possible to issue an alarm for avoiding the collision to the driver at an appropriate timing.
  • the above equation (13) also holds when the traveling direction of the vehicle 100 and the traveling direction of the vehicle 101 are parallel, that is, when the vanishing point VP coincides with the origin O. In that case the value of xvp is 0, and the time TTCc0 coincides with the time TTCc.
  • the collision time calculation system 10 according to this embodiment differs from the collision time calculation system 10 described above in that the collision time calculation device 30 is realized by a configuration similar to that of a general computer or a device such as a microcomputer.
  • FIG. 11 is a block diagram showing a physical configuration of the collision time calculation system 10. As shown in FIG. 11, the collision time calculation system 10 includes an imaging device 20 and a collision time calculation device 30 including a computer.
  • the collision time calculation device 30 includes a central processing unit (CPU) 30a, a main storage unit 30b, an auxiliary storage unit 30c, a display unit 30d, an input unit 30e, an interface unit 30f, and a system bus 30g that interconnects these units.
  • the CPU 30a executes processing to be described later on the image acquired by the imaging device 20 according to the program stored in the auxiliary storage unit 30c.
  • the main storage unit 30b includes a RAM (Random Access Memory) and the like, and is used as a work area of the CPU 30a.
  • the auxiliary storage unit 30c includes a non-volatile memory such as a ROM (Read Only Memory), a magnetic disk, and a semiconductor memory.
  • the auxiliary storage unit 30c stores programs executed by the CPU 30a, various parameters, and the like.
  • in the auxiliary storage unit 30c, information related to the images output from the imaging device 20 (for example, the image, its capture time, frame number, and number of pixels) and the processing results of the CPU 30a are sequentially stored.
  • the display unit 30d includes a CRT (Cathode Ray Tube) or LCD (Liquid Crystal Display), and displays the processing result of the CPU 30a.
  • the input unit 30e includes a key switch and a pointing device.
  • the operator's instruction is input via the input unit 30e and notified to the CPU 30a via the system bus 30g.
  • the interface unit 30f includes a serial interface or a LAN (Local Area Network) interface.
  • the imaging device 20 is connected to the system bus 30g via the interface unit 30f.
  • the flowchart of FIG. 12 corresponds to a series of processing algorithms of a program executed by the CPU 30a.
  • the process performed by the collision time calculation device 30 will be described with reference to FIG. Note that this process is executed with a trigger when the collision time calculation system 10 is activated and information regarding an image captured by the imaging device 20 is output.
  • the imaging device 20 sequentially outputs the image PH1 shown in FIG. 4 and the image PH2 shown in FIG. 5.
  • the CPU 30a calculates a feature amount for each pixel constituting the images PH1 and PH2 stored in the storage unit 31, and extracts feature points included in the image based on the feature amount. .
  • feature points P1, P2, P3, and P4 are extracted from the image PH1 here.
  • feature points Q1, Q2, Q3, and Q4 are extracted from the image PH2.
  • the CPU 30a sequentially selects the feature points P1 to P4 of the image PH1. Then, the CPU 30a calculates a correlation value between the selected feature points P1 to P4 and the feature points Q1 to Q4 of the image PH2. For example, first, the CPU 30a sequentially calculates the correlation value R of the template TF1 with respect to the image PH2 while moving the template TF1 centered on the feature point P1 of the image PH1 in the vicinity of the feature points Q1 to Q4 of the image PH2. The CPU 30a performs the above-described processing for the feature points P2 to P4.
  • next, the CPU 30a defines the optical flows OP1 to OP4 using the feature points P1 to P4 of the image PH1 as start points and the feature points Q1 to Q4 of the image PH2 as end points.
  • the CPU 30a then groups the defined optical flows OP1 to OP4. As shown in FIG. 6, in the present embodiment a case with four optical flows related to the vehicle 101 is described for convenience of explanation; in practice, however, tens or hundreds of feature points can be extracted from an image of the vehicle 101, so dozens or hundreds of optical flows can be defined.
  • the CPU 30a excludes optical flows including many noise components from the tens or hundreds of optical flows from the grouping, and groups the remaining optical flows. For example, when the vehicle 101 is in a complete linear motion, straight lines that coincide with the optical flows (straight lines extended from the start point of the optical flow) intersect at the vanishing point VP. Therefore, the CPU 30a excludes the optical flow from the grouping when the straight line that matches the optical flow is significantly away from the vanishing point VP, and regards the remaining optical flow as the optical flow related to the same moving object. Group.
  • the optical flows OP1 to OP4 related to the vehicle 101 are grouped as the optical flows of the vehicle 101.
  • the CPU 30a calculates a predicted collision time TTC until the vehicle 101 collides with the vehicle 100 using the optical flows OP1 to OP4.
  • the vehicle 101 located at the position indicated by the arrow a1 in FIG. 3 moves relatively to the position indicated by the arrow a2 in FIG. 3 when the image PH2 is taken.
  • the vector MV0 in the XYZ coordinate system corresponding to the optical flow OP1, which has the feature point P1 as its start point and the feature point Q1 as its end point, has the corresponding point RP1 as its start point and the corresponding point RQ1 as its end point.
  • the optical flow OP1 shows the movement locus of the feature points in the image plane IM of the photographing apparatus 20.
  • a vector MV0 indicates the movement locus of the corresponding point in the XYZ coordinate system.
  • the vector MV0 is parallel to the Z axis.
  • the feature point P1 and the corresponding point RP1 are arranged on a straight line LN3 passing through the origin O in the XYZ coordinate system.
  • the feature point Q1 and the corresponding point RQ1 are arranged on a straight line LN4 passing through the origin O in the XYZ coordinate system.
  • the relationship between the magnitude of the vector MV2, which indicates the trajectory until the index point of the vehicle 101 coinciding with the corresponding point RQ1 reaches the point CP1 (whose X coordinate is X1) on the X axis, and the magnitude of the vector MV0 is expressed by the above equation (8).
  • the above equation (9) is thereby derived.
  • the vector MV1 indicating the trajectory until the index point of the vehicle 101 reaches the point CP2 on the collision plane indicated by the straight line LN2 from the corresponding point RQ1 is parallel to the Z axis.
  • the relationship between the magnitude of the vector MV1 and the magnitude of the vector MV2 is expressed by the above equation (10), using the distance Z1 from the X axis to the corresponding point RQ1 and the distance L from the X axis to the point CP2.
  • the above equation (11) is thereby derived.
  • L in the above equation (11) is the distance between the X axis and the collision surface indicated by the straight line LN2, and is a known value substantially equal to the distance between the mounting position of the imaging device 20 and the front end of the vehicle 100. Therefore, if the value of the distance Z1 between the X axis and the corresponding point RQ1 is known, the collision prediction time TTC until the vehicle 101 collides with the vehicle 100 can be calculated using the above equation (11). The CPU 30a therefore calculates the distance Z1 using the above equation (12), calculates the collision prediction time TTC by substituting the calculated distance Z1 into the above equation (11), and then outputs the collision prediction time TTC to an external device or the like.
  • the external device or the like to issue a warning for avoiding a collision to the driver when, for example, the collision prediction time TTC is equal to or less than a threshold value.
  • as described above, in the present embodiment, the time TTCc until the vehicle 101 reaches the surface that includes the optical center of the imaging device 20 mounted on the vehicle 100 is calculated using the optical flow. Based on this time TTCc, a predicted collision time TTC until the vehicle 101 collides with the collision surface of the vehicle 100 is calculated.
  • as a result, the predicted collision time TTC is calculated without being greatly affected by the error that arises between the magnitude of the vector MV0, which indicates the trajectory of the index point of the vehicle 101, and the magnitude of the optical flow OP1, even when the vehicle 101 is far from the vehicle 100. Therefore, by calculating the predicted collision time TTC before the vehicle 100 and the vehicle 101 collide, it is possible to issue an alarm for avoiding the collision to the driver at an appropriate timing.
  • in the above embodiments, the distance Z1 is calculated using equation (12). However, the present invention is not limited to this, and the distance Z1 may be calculated using the following equation (14).
  • WV is the vehicle width of the vehicle 101.
  • wv is the vehicle width of the vehicle 101 in the image PH2.
  • Wc is the horizontal width of the image PH2.
  • FOV is a viewing angle of the photographing apparatus 20 corresponding to the image PH2.
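  • equation (14) is not reproduced in this text either; a form consistent with the quantities WV, wv, wc, and FOV under the pinhole model, in which the focal length expressed in pixels equals (wc/2)/tan(FOV/2), would be the following hedged reconstruction.

```latex
% Hedged reconstruction, not a transcription of the published equation (14):
% distance of the approaching vehicle from its apparent width in the image.
Z_1 \approx \frac{WV \cdot w_c}{2\, w_v \tan\left(\frac{FOV}{2}\right)}
```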
  • Japanese Patent Application Laid-Open No. 2008-97126 discloses grouping optical flows whose index C0, expressed by the following equation (15), is equal to one another within a certain allowable error (see FIG. 9 of that publication in particular).
  • since equation (15) is established from the relationship of geometric similarity, the following equation (16) is also established.
  • the index C1 represented by equation (16) can therefore be calculated in the same manner as the index C0 in the technique disclosed in Japanese Patent Laid-Open No. 2008-97126.
  • when the vanishing point VP coincides with the image center Xc (that is, when the optical axis direction of the imaging device 20 and the relative movement direction of the vehicle 101 with respect to the vehicle 100 are parallel), equation (16) can be transformed into the following equation (17).
  • the right side of equation (17) is equivalent to the denominator of the left side of equation (9). Therefore, when optical flow grouping is performed using the technique disclosed in Japanese Patent Application Laid-Open No. 2008-97126, a result already calculated in the grouping process can be reused to calculate the collision prediction time TTC, so the processing can be completed in a short time.
  • using the index C1 calculated in the grouping process while allowing a certain error means using an average value of the index C1 over the moving object, which is more resistant to noise than using the value of the index C1 at one specific point of the moving object.
  • in the above embodiments, the case has been described where the collision prediction time is calculated using the optical flow OP1 defined by the feature point P1 and the feature point Q1.
  • the present invention is not limited to this, and the collision prediction time may be calculated using an optical flow other than the optical flow OP1.
  • the average value of the collision prediction times calculated using the respective optical flows OP1 to OP4 may be output to an external device or the like as the final collision prediction time.
  • in the above embodiments, the case has been described where the photographing apparatus 20 is attached to the upper portion of the front window as shown in FIG. 2.
  • however, the imaging device 20 may instead have its camera arranged in the vicinity of the front bumper.
  • in this case, the distance L is substantially zero, so the time TTCc is equivalent to the collision prediction time TTC.
  • the feature amount f (x, y) is calculated using the equation (1), but the present invention is not limited to this.
  • the feature amount may be a so-called KLT feature amount min(λ1, λ2).
  • the comparison value V can be calculated by dividing the feature amount by the square of the average value AVG (x, y) described above.
  • λ1 and λ2 are represented by the following equations (18) and (19), respectively.
  • Ix and Iy indicate the gradients of the luminance I(x, y) at the position (x, y) on the image in the X-axis direction and the Y-axis direction, respectively; specifically, they are represented by the following formulas (21) and (22).
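  • equations (18) to (22) are not reproduced in this text; the sketch below assumes the conventional meaning of the KLT (Shi-Tomasi) feature amount, namely the smaller eigenvalue of the 2x2 structure tensor built from the gradients Ix and Iy over a small window (the window size is an illustrative value).

```python
import numpy as np

def klt_feature_amount(Ix, Iy, win=2):
    """Hedged sketch of the KLT (Shi-Tomasi) feature amount min(lambda1, lambda2),
    using the conventional structure-tensor eigenvalues built from the luminance
    gradients Ix and Iy over a small window."""
    def box_sum(a, w=win):
        k = 2 * w + 1
        pad = np.pad(np.asarray(a, dtype=np.float64), w, mode="edge")
        out = np.zeros(a.shape, dtype=np.float64)
        for dy in range(k):
            for dx in range(k):
                out += pad[dy:dy + a.shape[0], dx:dx + a.shape[1]]
        return out

    Sxx, Syy, Sxy = box_sum(Ix * Ix), box_sum(Iy * Iy), box_sum(Ix * Iy)
    # Eigenvalues of the symmetric 2x2 matrix [[Sxx, Sxy], [Sxy, Syy]].
    trace = Sxx + Syy
    det = Sxx * Syy - Sxy ** 2
    disc = np.sqrt(np.maximum(trace ** 2 / 4.0 - det, 0.0))
    lam1, lam2 = trace / 2.0 + disc, trace / 2.0 - disc
    return np.minimum(lam1, lam2)
```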
  • in the above embodiments, the case has been described where the predicted collision time TTC until an object approaching the host vehicle collides with the host vehicle is calculated based on the time from when the image PH1 is captured to when the image PH2 is captured.
  • the imaging device 20 can capture the images PH1 and PH2 used for calculating the predicted collision time TTC at arbitrary times. For example, among the images captured by the imaging device 20, the image captured when the proportion of the object's image to the entire image exceeds a predetermined ratio (10%, 20%, etc.) can be set as the image PH1, and an image captured a predetermined time (0.1 seconds, 0.5 seconds, etc.) after the image PH1 was captured can be set as the image PH2.
  • the time ⁇ t has been described as the time from the time when the image PH1 is captured to the time when the image PH2 is captured.
  • for example, the collision prediction time calculation unit 36 can calculate the collision prediction time TTC using the first frame image captured by the imaging device 20 as the image PH1 and the second frame image as the image PH2, then calculate it again using the second frame image as the image PH1 and the third frame image as the image PH2, and so on, so that the collision prediction time TTC is calculated one after another.
  • the collision prediction time calculation unit 36 can also calculate the collision prediction time TTC based on the time difference between two frames separated by a predetermined interval (for example, with the first frame image as the image PH1 and the fifth frame image as the image PH2).
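  • a minimal sketch of this frame pairing, assuming a hypothetical helper compute_ttc(frame_a, frame_b, dt) (not part of the patent) that wraps the feature extraction, optical flow definition, grouping, and TTC calculation steps described above:

```python
def ttc_stream(frames, frame_dt, compute_ttc, stride=1):
    """Yield one predicted collision time per frame pair.

    `compute_ttc(frame_a, frame_b, dt)` is a hypothetical wrapper around the
    steps described above.  stride=1 pairs consecutive frames; stride=4 pairs,
    for example, the first frame with the fifth."""
    for i in range(len(frames) - stride):
        yield compute_ttc(frames[i], frames[i + stride], stride * frame_dt)
```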
  • the function of the collision time calculation device 30 according to each of the above embodiments can be realized by dedicated hardware or by a normal computer system.
  • for example, the program stored in the auxiliary storage unit 30c of the collision time calculation device 30 may be stored and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read-Only Memory), a DVD (Digital Versatile Disc), or an MO (Magneto-Optical disk), and a device that executes the above-described processing may be configured by installing the program on a computer.
  • the program may be stored in a disk device or the like of a predetermined server device on a communication network such as the Internet, and may be downloaded onto a computer by being superimposed on a carrier wave, for example.
  • the collision time calculation device, collision time calculation method, and program of the present invention are suitable for calculating the collision time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The arrival time at which an approaching vehicle reaches a surface that includes the optical center of an imaging device mounted on a vehicle is calculated by means of an optical flow of the approaching vehicle while it approaches the other vehicle. The estimated collision time at which the approaching vehicle collides with the other vehicle is calculated on the basis of this arrival time. The estimated collision time can be calculated without being significantly influenced by the error produced between the magnitude of a vector indicating the movement locus of the approaching vehicle and the magnitude of the optical flow, even if the approaching vehicle is far from the other vehicle. In this way, by calculating the estimated collision time in advance, before the vehicle collides with an approaching vehicle, a warning can be issued at a time that allows the driver to avoid a collision.
PCT/JP2011/065830 2010-07-09 2011-07-11 Collision time calculation device and method, and program WO2012005377A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-157182 2010-07-09
JP2010157182A JP5418427B2 (ja) Collision time calculation device, collision time calculation method, and program

Publications (1)

Publication Number Publication Date
WO2012005377A1 true WO2012005377A1 (fr) 2012-01-12

Family

ID=45441346

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/065830 WO2012005377A1 (fr) 2010-07-09 2011-07-11 Collision time calculation device and method, and program

Country Status (2)

Country Link
JP (1) JP5418427B2 (fr)
WO (1) WO2012005377A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014028064A3 (fr) * 2012-08-13 2014-04-10 The Boeing Company Strike detection using video images
EP3855409A1 (fr) * 2020-01-21 2021-07-28 Thinkware Corporation Method, apparatus, electronic device, computer program, and computer-readable recording medium for measuring inter-vehicle distance based on vehicle image
CN115472005A (zh) * 2022-08-09 2022-12-13 东软睿驰汽车技术(上海)有限公司 Vehicle collision early-warning method, apparatus, device, and storage medium
US12031834B2 (en) 2020-01-21 2024-07-09 Thinkware Corporation Method, apparatus, electronic device, computer program, and computer readable recording medium for measuring inter-vehicle distance based on vehicle image

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5655038B2 (ja) * 2012-07-30 2015-01-14 株式会社デンソーアイティーラボラトリ Moving object recognition system, moving object recognition program, and moving object recognition method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11353565A (ja) * 1998-06-09 1999-12-24 Yazaki Corp 車両用衝突警報方法及び装置
JP2006099155A (ja) * 2004-09-28 2006-04-13 Nissan Motor Co Ltd 衝突判定装置、および方法
JP2006107422A (ja) * 2004-09-07 2006-04-20 Nissan Motor Co Ltd 衝突時間算出装置および障害物検出装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11353565A (ja) * 1998-06-09 1999-12-24 Yazaki Corp 車両用衝突警報方法及び装置
JP2006107422A (ja) * 2004-09-07 2006-04-20 Nissan Motor Co Ltd 衝突時間算出装置および障害物検出装置
JP2006099155A (ja) * 2004-09-28 2006-04-13 Nissan Motor Co Ltd 衝突判定装置、および方法

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014028064A3 (fr) * 2012-08-13 2014-04-10 The Boeing Company Strike detection using video images
CN104603839A (zh) * 2012-08-13 2015-05-06 波音公司 Strike detection using video images
US9047675B2 (en) 2012-08-13 2015-06-02 The Boeing Company Strike detection using video images
EP3855409A1 (fr) * 2020-01-21 2021-07-28 Thinkware Corporation Method, apparatus, electronic device, computer program, and computer-readable recording medium for measuring inter-vehicle distance based on vehicle image
US11680813B2 (en) 2020-01-21 2023-06-20 Thinkware Corporation Method, apparatus, electronic device, computer program, and computer readable recording medium for measuring inter-vehicle distance based on vehicle image
US12031834B2 (en) 2020-01-21 2024-07-09 Thinkware Corporation Method, apparatus, electronic device, computer program, and computer readable recording medium for measuring inter-vehicle distance based on vehicle image
JP7658745B2 (ja) 2020-01-21 2025-04-08 シンクウェア コーポレーション Method for measuring inter-vehicle distance based on vehicle image, inter-vehicle distance measurement device, electronic apparatus, computer program, and computer-readable recording medium
CN115472005A (zh) * 2022-08-09 2022-12-13 东软睿驰汽车技术(上海)有限公司 Vehicle collision early-warning method, apparatus, device, and storage medium
CN115472005B (zh) * 2022-08-09 2023-12-19 东软睿驰汽车技术(上海)有限公司 Vehicle collision early-warning method, apparatus, device, and storage medium

Also Published As

Publication number Publication date
JP5418427B2 (ja) 2014-02-19
JP2012018637A (ja) 2012-01-26

Similar Documents

Publication Publication Date Title
CN107431786B (zh) Image processing device, image processing system, and image processing method
JP6058256B2 (ja) In-vehicle camera attitude detection device and method
US9736460B2 (en) Distance measuring apparatus and distance measuring method
JP7038345B2 (ja) Camera parameter set calculation method, camera parameter set calculation program, and camera parameter set calculation device
JP5656567B2 (ja) Video processing device and method
JP6347211B2 (ja) Information processing system, information processing method, and program
JP2004354257A (ja) Calibration deviation correction device, and stereo camera and stereo camera system equipped with the device
JP2008113296A (ja) Vehicle periphery monitoring device
US9781336B2 (en) Optimum camera setting device and optimum camera setting method
JP6328327B2 (ja) Image processing device and image processing method
JP5146446B2 (ja) Moving object detection device, moving object detection program, and moving object detection method
JP2002366937A (ja) Vehicle exterior monitoring device
JP2008299458A (ja) Vehicle monitoring device and vehicle monitoring method
WO2012005377A1 (fr) Collision time calculation device and method, and program
JP2007263657A (ja) Three-dimensional coordinate acquisition device
CN105716567A (zh) Method for detecting the distance between an object and a motor vehicle by means of a monocular image acquisition device
JP4655242B2 (ja) Image processing device for vehicle
US10595003B2 (en) Stereo camera apparatus and vehicle comprising the same
CN111932590B (zh) Object tracking method and apparatus, electronic device, and readable storage medium
JP5624370B2 (ja) Moving object detection device and moving object detection method
JP2015186085A (ja) Movement amount derivation device and movement amount derivation method
WO2017010268A1 (fr) Estimation device and program
JP5539250B2 (ja) Approaching object detection device and approaching object detection method
CN112991401A (zh) Vehicle running trajectory tracking method and apparatus, electronic device, and storage medium
US20100027847A1 (en) Motion estimating device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11803708

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11803708

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载