WO2015092977A1 - Sight line detection device and sight line detection method - Google Patents
- Publication number
- WO2015092977A1 (PCT application PCT/JP2014/005903)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line
- sight
- gaze
- driver
- vehicle
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- The present disclosure relates to a technique for detecting the gaze position, that is, the point that a driver's line of sight reaches.
- A technique is known that detects the driver's gaze position by analyzing an image of the driver's face. Using this technique, it is possible to monitor the driver for inattentive (looking-aside) driving. Furthermore, if the detection accuracy of the gaze position is sufficiently high, various devices mounted on the vehicle (such as a car navigation system) can be operated by gaze (for example, Patent Document 1).
- If various devices can be operated by gaze, the driver no longer needs to reach out and operate them while driving, which should reduce the driver's burden. On the other hand, if the detection accuracy of the gaze position drops, an operation different from the driver's intention may be performed. A technique has therefore been proposed that, when a preceding vehicle is detected during driving, assumes the driver is gazing at it and calibrates the gaze position so that the detected gaze position becomes the position of the preceding vehicle (Patent Document 2).
- The inventors of the present application have found the following regarding the gaze detection device and the gaze detection method: with the proposed technique, even if the detection accuracy of the gaze position has dropped, calibration cannot be performed while no preceding vehicle is detected, so the detection accuracy of the gaze position may not be kept high.
- The present disclosure has been made in view of the above problem, and its object is to provide a technique capable of keeping the detection accuracy of the gaze position high.
- The gaze detection device and gaze detection method according to an aspect of the present disclosure detect the brightness around the vehicle and, when the amount of change in that brightness exceeds a predetermined amount, calibrate the gaze position detected by the gaze detection unit.
- The gaze detection device includes a gaze detection unit that detects the driver's gaze position, a calibration unit that calibrates the gaze position detected by the gaze detection unit, and a surrounding environment detection unit that detects the brightness around the vehicle.
- The gaze detection method includes a step of detecting the driver's gaze position, a step of calibrating the gaze position detected in that step, and a step of detecting the brightness around the vehicle.
- The detection accuracy of the gaze position is likely to decrease when the amount of change in brightness around the vehicle becomes large. Therefore, by detecting the brightness around the vehicle and calibrating the gaze position detected by the gaze detection unit when the amount of change exceeds a predetermined amount, calibration can be performed at the very timing when the detection accuracy has dropped, raising it again.
- As a result, the detection accuracy of the gaze position can be kept high.
- FIG. 1A is an explanatory diagram illustrating the configuration of the gaze detection device according to the present embodiment;
- FIG. 1B is a block diagram illustrating the functions of the CPU;
- FIG. 2 is a flowchart of the gaze position detection process performed by the gaze detection device of the present embodiment;
- FIG. 3A is an explanatory diagram illustrating the pupil and the corneal reflection image in an image of the driver's right eye;
- FIG. 3B is an explanatory diagram illustrating the pupil and the corneal reflection image in an image of the driver's right eye;
- FIG. 3C is an explanatory diagram illustrating the pupil and the corneal reflection image in an image of the driver's right eye;
- FIG. 4 is an explanatory diagram showing the method of detecting the pupil relative position;
- FIG. 5 is an explanatory diagram illustrating the gaze position determination table;
- FIG. 6 is a flowchart of the calibration process performed by the gaze detection device according to the present embodiment;
- FIG. 7 is an explanatory diagram illustrating a gaze image displayed to the driver;
- FIG. 8 is an explanatory diagram conceptually showing how the gaze position determination table is calibrated;
- FIG. 9 is an explanatory diagram showing another example of gaze images displayed to the driver;
- FIG. 10 is an explanatory diagram showing the relationship between the brightness around the vehicle and the detection accuracy of the gaze position in the present embodiment.
- The gaze detection device of the present embodiment is mounted on a vehicle and detects the gaze position, the point at which the line of sight of the vehicle's driver falls.
- FIG. 1A shows the configuration of the line-of-sight detection device 10 of the present embodiment.
- As shown in FIG. 1A, in the gaze detection device 10 the CPU 11 is connected, via a bus 14, to a flash memory 12, which stores the programs the CPU 11 executes, and to a RAM 13, which serves as the CPU 11's work area. The CPU 11 reads the various programs, and the data needed to execute them, from the flash memory 12 and performs the processing for detecting the gaze position.
- Various methods can be used to detect the gaze position; this embodiment adopts a method that detects it from the positional relationship between the pupil and the corneal reflection image captured in an image of the driver's eye.
- To support this method, a near-infrared camera 16, which captures images using near-infrared light, and a near-infrared LED 17 for illumination are connected to the bus 14 via a camera interface 15. Both are mounted in front of the driver's seat, facing the driver, so as to image the driver. By controlling them, the CPU 11 acquires an image of the driver's eyes captured under near-infrared illumination.
- An illuminance sensor 19 is also connected to the bus 14 via an illuminance sensor interface 18; the CPU 11 uses it to detect the brightness around the vehicle. In addition, a HUD (head-up display) device 21 is connected to the bus 14 via a HUD device interface 20, and can project a predetermined image onto the windshield of the vehicle based on instructions from the CPU 11.
- FIG. 1B is a block diagram showing the functions of the CPU 11, which are realized by the CPU 11 executing programs stored in the flash memory 12. As shown in FIG. 1B, the CPU 11 provides:
- a gaze position detection function 11a that performs the gaze position detection process;
- a calibration function 11b that calibrates the gaze position detected by that process (described in detail later);
- a surrounding environment detection function 11c that detects the brightness around the vehicle;
- an image display function 11d that displays a predetermined image to the driver.
- The gaze position detection function 11a of the present embodiment corresponds to the "gaze detection unit" of the present disclosure, the calibration function 11b to the "calibration unit", the surrounding environment detection function 11c to the "surrounding environment detection unit", and the image display function 11d to the "image display unit".
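To make the division of roles concrete, here is a minimal structural sketch of how the four functions could hang together; this is an illustration, not the patented implementation, and all class, method, and sensor-interface names (including read_lux and project) are hypothetical:

```python
class GazeDetectionDevice:
    """Structural sketch of functions 11a-11d; the sensor/actuator objects
    and their method names are hypothetical stand-ins."""

    def __init__(self, camera, nir_led, illuminance_sensor, hud):
        self.camera = camera                    # near-infrared camera 16
        self.nir_led = nir_led                  # near-infrared LED 17
        self.illuminance = illuminance_sensor   # illuminance sensor 19
        self.hud = hud                          # HUD device 21

    def detect_gaze_position(self):             # gaze detection unit (11a)
        ...  # S100-S106: image eyes, find pupil/glint, look up gaze position

    def calibrate(self):                        # calibration unit (11b)
        ...  # S200: recalibrate the table when ambient brightness jumps

    def ambient_brightness(self):               # surrounding environment detection unit (11c)
        return self.illuminance.read_lux()

    def display_gaze_image(self, position):     # image display unit (11d)
        self.hud.project(position)
```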
- FIG. 2 shows a flowchart of the eye gaze position detection process executed by the CPU 11 of this embodiment.
- In the gaze position detection process, the driver's gaze position is detected. The process starts when a predetermined condition is satisfied (for example, when the vehicle engine is started or when the driver performs a predetermined start operation).
- When the gaze position detection process starts, the CPU 11 first controls the near-infrared camera 16 and the near-infrared LED 17 to image the driver's eyes under near-infrared illumination, and stores the captured eye image at a predetermined address in the RAM 13 (S100).
- Next, the CPU 11 detects the center of the "pupil" and the center of the "corneal reflection image" in the eye image captured in S100, using a known method such as edge detection (S102). The "corneal reflection image" is the light emitted by the near-infrared LED 17 and reflected by the cornea (the so-called Purkinje image); it appears as a bright spot on the cornea.
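The patent specifies no image-processing details beyond "a known method such as edge detection"; the following is a minimal illustrative sketch of S102 using intensity thresholding with OpenCV instead. The threshold values and the function name are assumptions.

```python
import cv2

def detect_centers(eye_img_gray):
    """Sketch of S102: locate the pupil center (a dark blob under
    near-infrared illumination) and the corneal-reflection center
    (the bright Purkinje spot). Threshold values are illustrative."""
    # Pupil appears dark under NIR light -> inverse binary threshold
    _, pupil_mask = cv2.threshold(eye_img_gray, 50, 255, cv2.THRESH_BINARY_INV)
    # Corneal reflection appears as the brightest spot -> high threshold
    _, glint_mask = cv2.threshold(eye_img_gray, 220, 255, cv2.THRESH_BINARY)

    def centroid(mask):
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] == 0:
            return None  # feature not found (e.g., during a blink)
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    return centroid(pupil_mask), centroid(glint_mask)
```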
- FIG. 3A, FIG. 3B, and FIG. 3C exemplify “pupil” and “corneal reflection image” in an image obtained by imaging the right eye of the driver.
- FIG. 3A shows an image captured when the line of sight is directed to the front.
- In the illustrated example, the "corneal reflection image" appears directly below the "pupil". The CPU 11 detects the centers of the "pupil" and the "corneal reflection image" in the eye image in this way.
- Here, the positional relationship between the "pupil" and the "corneal reflection image" in the eye image changes according to the gaze position. Roughly speaking, even though the pupil sits "directly above" the corneal reflection image while the line of sight is directed straight ahead, as in FIG. 3A, the pupil moves to the "upper right" of the corneal reflection image when the line of sight turns to the left, as in FIG. 3B, and to the "upper left" when it turns to the right, as in FIG. 3C.
- Because the relationship between the "corneal reflection image" and the "pupil" changes with the gaze position in this way, the gaze position can be detected from "the position of the pupil center relative to the center of the corneal reflection image".
- Accordingly, having detected the centers of the "pupil" and the "corneal reflection image" in S102, the CPU 11 then detects "the position of the pupil center relative to the center of the corneal reflection image" (hereinafter, the "pupil relative position") (S104). As shown in FIG. 4, the pupil relative position is expressed as the horizontal (x-direction) and vertical (y-direction) distances, in pixels, from the center of the corneal reflection image to the center of the pupil.
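A one-function sketch of S104, continuing the hypothetical detect_centers helper above; treating image x as rightward and flipping image y so that "up" is positive is an assumption, not something the patent states.

```python
def pupil_relative_position(pupil_center, glint_center):
    """S104 sketch: pupil center relative to the corneal-reflection center,
    in pixels (ax, ay)."""
    ax = pupil_center[0] - glint_center[0]  # horizontal offset
    ay = glint_center[1] - pupil_center[1]  # image y grows downward, so flip
    return ax, ay
```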
- Once the pupil relative position has been detected (S104), the CPU 11 refers to the gaze position determination table and detects the driver's gaze position (S106). In this table, illustrated in FIG. 5, a driver gaze position (bx, by) is set for every value (ax, ay) that the pupil relative position can take. The CPU 11 looks up the gaze position corresponding to the pupil relative position obtained in S104.
- The gaze position determination table is generated as follows before the gaze position detection process starts (for example, after the vehicle engine is started) and is stored in the flash memory 12 in advance. First, the driver's eyes are imaged while the driver gazes at a predetermined position (for example, at an image the HUD device 21 displays at that position), and the pupil relative position is detected in the same manner as described above. The gazed position (gaze position) and the obtained pupil relative position are then stored in association with each other. This is repeated several times (for example, 5 to 10 times) while the position to be gazed at is changed. For positions (gaze positions) not gazed at, pupil relative positions are computed by known interpolation, extrapolation, or similar processing. This yields the correspondence between "pupil relative position" and "gaze position", that is, the gaze position determination table.
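As a sketch of how such a table could be generated and then consulted in S106, assuming a Python environment with NumPy and SciPy; the calibration pairs, grid ranges, and the use of nearest-neighbour fill as a stand-in for the extrapolation step are all placeholders, not values from the patent.

```python
import numpy as np
from scipy.interpolate import griddata

def build_table(pupil_pts, gaze_pts, ax_range=(90, 151), ay_range=(230, 271)):
    """Build a dense (ax, ay) -> (bx, by) lookup table from a handful of
    gazed calibration pairs; ungazed positions are filled by linear
    interpolation, with nearest-neighbour standing in for extrapolation."""
    ax_grid, ay_grid = np.mgrid[ax_range[0]:ax_range[1], ay_range[0]:ay_range[1]]
    table = np.empty(ax_grid.shape + (2,))
    for k in range(2):  # interpolate bx and by separately
        lin = griddata(pupil_pts, gaze_pts[:, k], (ax_grid, ay_grid), method="linear")
        near = griddata(pupil_pts, gaze_pts[:, k], (ax_grid, ay_grid), method="nearest")
        table[:, :, k] = np.where(np.isnan(lin), near, lin)
    return table

# Five gazed positions (placeholder values): pupil relative position -> gaze position
pupil_pts = np.array([(100.0, 240.0), (140, 240), (100, 260), (140, 260), (120, 250)])
gaze_pts  = np.array([(0.0, 0.0), (400, 0), (0, 200), (400, 200), (200, 100)])
table = build_table(pupil_pts, gaze_pts)

def lookup_gaze(ax, ay):
    """S106 sketch: read the gaze position (bx, by) for a pupil relative position."""
    return table[int(ax) - 90, int(ay) - 230]
```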
- In this way, the gaze detection device 10 of the present embodiment detects the gaze position based on the pupil relative position obtained from the eye image. The CPU 11 outputs the detected gaze position data to various driving support devices (not shown) on request, enabling various kinds of driving assistance that use the driver's gaze position.
- For example, when the driver's gaze position deviates greatly from the vehicle's direction of travel detected by a steering angle sensor (not shown), the driver is warned, preventing inattentive driving, as sketched below. Alternatively, a device such as a car navigation system can be operated by the movement of the driver's line of sight; this reduces the physical burden compared with operating by hand and allows the device to be operated even when the driver's hands are occupied.
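A toy sketch of the first use case, under assumed units (both the gaze position and the direction of travel reduced to a horizontal angle in degrees) and a hypothetical tolerance:

```python
INATTENTION_TOLERANCE_DEG = 20.0  # hypothetical threshold

def should_warn(gaze_azimuth_deg, heading_azimuth_deg):
    """Warn when the gaze deviates too far from the direction of travel
    reported by the steering angle sensor."""
    return abs(gaze_azimuth_deg - heading_azimuth_deg) > INATTENTION_TOLERANCE_DEG
```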
- After the series of gaze-detection steps (S100 to S106), the CPU 11 performs a calibration process (S200 in FIG. 2), in which the gaze position determination table is calibrated when the detection accuracy of the gaze position is estimated to have dropped. FIG. 6 shows a flowchart of this calibration process.
- When the calibration process starts, the CPU 11 detects the brightness around the vehicle using the illuminance sensor 19 (S202) and then determines whether the amount of change in that brightness since the previous calibration exceeds a predetermined amount (S204).
- As described later, the gaze detection device 10 stores the brightness around the vehicle in the RAM 13 each time the gaze position determination table is calibrated. In S204, therefore, the brightness stored in the RAM 13 at the previous calibration is compared with the brightness detected this time, and it is determined whether the difference between them exceeds the predetermined amount (S204).
- The predetermined amount may be, for example, several hundred to several thousand lx.
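A minimal sketch of the S202/S204 decision; the 1000 lx threshold is a placeholder inside the "several hundred to several thousand lx" range the text gives. Note that the comparison is against the brightness stored at the previous calibration rather than a per-unit-time delta, which is what lets the gradual-change case discussed later (times t5 to t8 in FIG. 10) be handled as well.

```python
CALIBRATION_THRESHOLD_LX = 1000.0  # placeholder within the stated lx range

class CalibrationTrigger:
    """Decide when to recalibrate, by comparing the current ambient
    illuminance with the value stored at the previous calibration
    (the analogue of the brightness kept in the RAM 13)."""

    def __init__(self, initial_lx):
        self.stored_lx = initial_lx

    def should_calibrate(self, current_lx):
        return abs(current_lx - self.stored_lx) > CALIBRATION_THRESHOLD_LX

    def mark_calibrated(self, current_lx):
        self.stored_lx = current_lx  # update the stored brightness
```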
- Here, when the brightness around the vehicle changes, the detection accuracy of the gaze position may decrease, for the following reason. When the ambient brightness changes, the brightness of the light entering the driver's eyes in the cabin also changes, so the pupil diameter of the driver's eyes changes. Then, even if the actual gaze position has not changed, the "pupil relative position" obtained from the eye image may change. Since each "pupil relative position" is associated with its own gaze position, a different pupil relative position naturally maps to a different gaze position. In this way, a change in the brightness around the vehicle can change the "pupil relative position" obtained from the eye image even when the actual gaze position is unchanged, so the gaze position detected from that "pupil relative position" may deviate from the actual gaze position.
- Thus, when the amount of change in brightness around the vehicle exceeds the predetermined amount, the gaze position determination table is calibrated as follows in order to restore the detection accuracy of the gaze position. First, the HUD device 21 displays a gaze image 30a to the driver (S208).
- FIG. 7 illustrates a state in which the gaze image 30a is displayed on the windshield in front of the driver.
- FIG. 7 shows a state when the front of the vehicle is viewed from the driver's seat, and a rectangular broken line represents an image display area on the windshield by the HUD device 21.
- In the illustrated example, the gaze image 30a is displayed in the lower right part of the image display area.
- The HUD device 21 displays the gaze image 30a in a manner that stands out from normally displayed images (for example, by blinking it or increasing its brightness). When the gaze image 30a is displayed in front of the driver in this way, a driver who is looking ahead while driving can be expected to gaze at it.
- The CPU 11 therefore detects the pupil relative position while the driver is gazing at the gaze image 30a, that is, the pupil relative position while the gaze image 30a is displayed (S210). The pupil relative position is detected by the same method as in the gaze position detection process described above with reference to FIG. 2: the driver's eyes are imaged under near-infrared illumination, and the pupil relative position is detected from the centers of the pupil and the corneal reflection image found in the eye image.
- The CPU 11 then calibrates the gaze position determination table based on the position where the gaze image 30a is displayed and the "pupil relative position" obtained while it was displayed (S212). That is, the "pupil relative position" obtained this time can be presumed to be the pupil relative position while the driver was gazing at the gaze image 30a. The gaze position associated with that "pupil relative position" in the gaze position determination table is therefore updated to the gaze position corresponding to the "position where the gaze image 30a is displayed", so that when this "pupil relative position" is detected, the "position where the gaze image 30a is displayed" is detected as the gaze position.
- Suppose, for example, that the value of the "pupil relative position" obtained this time is (120, 250). The gaze position corresponding to the pupil relative position (120, 250) is then updated to the gaze position (bx80, by80) corresponding to the "position where the gaze image 30a is displayed". The gaze positions corresponding to the other "pupil relative positions" are also updated, by performing known interpolation and extrapolation processing.
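One way to realize S212, continuing the hypothetical build_table sketch above: pin the freshly measured pupil relative position to the known display position of the gaze image 30a and rebuild the table, so the remaining entries follow by interpolation and extrapolation. This is an assumption about the mechanics, not the patent's prescribed implementation.

```python
def calibrate_table(pupil_pts, gaze_pts, measured_pupil_pos, gaze_image_pos):
    """S212 sketch: associate the measured pupil relative position
    (e.g., (120, 250)) with the gaze position of the displayed gaze image
    (e.g., (bx80, by80)), overriding any existing anchor at that point,
    then regenerate the whole lookup table."""
    keep = ~np.all(pupil_pts == np.asarray(measured_pupil_pos), axis=1)
    pupil_pts = np.vstack([pupil_pts[keep], measured_pupil_pos])
    gaze_pts = np.vstack([gaze_pts[keep], gaze_image_pos])
    return build_table(pupil_pts, gaze_pts)
```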
- In this way, the detection accuracy of the gaze position can be improved by calibrating the gaze position determination table. In the present embodiment, therefore, "calibrating the gaze position determination table" corresponds to "calibrating the gaze position" in the present disclosure.
- Alternatively, the gaze position determination table may be regenerated. That is, as shown in FIG. 9, the display of a gaze image described for S208 is repeated a plurality of times while the display position is changed.
- In the example of FIG. 9, gaze images 30b to 30f are displayed in sequence at five locations in the image display area of the HUD device 21 (the four corners and the center). Since the driver can be expected to gaze at each image in turn, the CPU 11 detects the pupil relative position from the eye image captured while each of the gaze images 30b to 30f is displayed.
- Each display position (gaze position) of the gaze images 30b to 30f is stored in association with the detected pupil relative position. For positions (gaze positions) where none of the gaze images 30b to 30f was displayed, pupil relative positions are computed by known interpolation, extrapolation, and similar processing.
- As described above, the gaze detection device 10 of the present embodiment displays the gaze image 30a to the driver when the gaze position determination table is calibrated (or regenerated).
- The gaze image 30a can be displayed to the driver at any time using the HUD device 21, and a driver who drives while looking ahead can be expected to look at the gaze image 30a displayed in front of the driver. For these reasons, displaying the gaze image 30a allows the driver's "actual gaze position" to be acquired at any desired timing.
- Accordingly, when the amount of change in brightness around the vehicle exceeds the predetermined amount, the gaze image 30a is displayed to the driver, and the gaze position determination table is calibrated based on the position where the gaze image 30a is displayed and the pupil relative position detected by the same method as the gaze position detection process. In this way, the driver's "actual gaze position" is acquired without missing the timing at which the change in brightness exceeds the predetermined amount, and the table can be calibrated so that the gaze position detection process detects that "actual gaze position". As a result, the detection accuracy of the gaze position can be reliably kept high.
- As described above, the gaze position determination table is calibrated based on the position where the gaze image 30a is displayed and the "pupil relative position" detected by the same method as the gaze position detection process. Since the "pupil relative position" corresponds to the "gaze position" detected by that process, in the present embodiment this operation corresponds to "calibrating the gaze position based on the position where the gaze image is displayed and the gaze position detected by the gaze detection unit" in the present disclosure.
- A driver who drives while looking ahead can be expected to look unconsciously at the gaze image 30a displayed in front of the driver, without moving the line of sight greatly. The gaze image 30a is displayed in this way in order to make the driver gaze at a predetermined position.
- Instead of a dedicated image such as the gaze image 30a, an image that the HUD device 21 normally displays may be used. For example, a warning image for the driver (such as one warning that an obstacle has been detected ahead, or one displaying a road sign to call attention) is shown in a display mode that attracts the driver's attention. A warning image can therefore make the driver gaze at a predetermined position in the same way as the gaze image 30a.
- Thus, the CPU 11 may calibrate the gaze position determination table based on the display position of a warning image when the amount of change in brightness around the vehicle exceeds the predetermined amount. If a normally displayed warning image is used instead of the dedicated gaze image 30a, nothing about what is shown to the driver changes, so the gaze position determination table can be calibrated more safely, without disturbing the driver's driving.
- As described above, in the gaze detection device 10 of the present embodiment, when the amount of change in brightness around the vehicle exceeds a predetermined amount, it is estimated that the detection accuracy of the gaze position has dropped, and the gaze position determination table is calibrated. This point is explained supplementarily below.
- FIG. 10 conceptually shows the relationship between “brightness around the vehicle” and “detection accuracy of the line-of-sight position” in this embodiment.
- The upper part of FIG. 10 shows the brightness around the vehicle, and the lower part shows the detection accuracy of the gaze position.
- For example, the brightness around the vehicle may drop sharply when the vehicle enters a tunnel. In that case, as described above, the pupil relative position obtained from the eye image changes even though the actual gaze position has not, and the detection accuracy of the gaze position may decrease. Therefore, when the CPU 11 determines that the amount of change in brightness around the vehicle has exceeded the predetermined amount (here, decreased by more than the predetermined amount), it calibrates the gaze position determination table. As shown at time t2 in FIG. 10, this restores the detection accuracy of the gaze position.
- Likewise, when the brightness increases sharply, for example on exiting the tunnel, the CPU 11 determines that the amount of change has exceeded the predetermined amount (here, increased by more than the predetermined amount) and calibrates the gaze position determination table. As shown at time t4 in FIG. 10, this again raises the detection accuracy of the gaze position.
- In this way, since the gaze position determination table is calibrated whenever the amount of change in brightness around the vehicle exceeds the predetermined amount, calibration is performed at precisely the timing when the detection accuracy of the gaze position has dropped, and the accuracy is raised again. As a result, the detection accuracy of the gaze position can be kept high.
- In addition, the brightness around the vehicle may decrease gradually, as shown from time t5 through times t6, t7, and t8 in FIG. 10. In such a case, it may not be possible to determine appropriately whether the amount of change in brightness has exceeded the predetermined amount if only the change per unit time is examined.
- For this reason, the gaze detection device 10 (CPU 11) of the present embodiment stores in the RAM 13 the brightness around the vehicle at the time the gaze position determination table is calibrated, and calibrates the table when the difference between the newly detected brightness around the vehicle and the brightness stored in the RAM 13 exceeds the predetermined amount. In other words, the stored brightness (the brightness at the previous calibration) is compared with the brightness detected this time, so whether the amount of change in brightness around the vehicle has exceeded the predetermined amount can be determined appropriately. Even when the brightness changes gradually, the detection accuracy of the gaze position can therefore be kept high (see times t6, t7, and t8 in FIG. 10).
- The RAM 13 of this embodiment corresponds to the "storage unit" of the present disclosure.
- Although the present embodiment detects the gaze position from the positional relationship between the corneal reflection image and the pupil, the method for detecting the gaze position is not limited to this. In any of the various gaze detection methods, the detection accuracy is likely to decrease when the amount of change in brightness around the vehicle becomes large. Therefore, whatever the detection method, if the processing performed to detect the gaze position is calibrated when the amount of change in brightness around the vehicle exceeds a predetermined amount, the detection accuracy can be kept high, just as in the embodiment described above.
- The calibration of the processing performed to detect the gaze position is carried out by a method suited to the detection method. For example, when the gaze position is detected using a predetermined relational expression instead of the gaze position determination table, the relational expression is calibrated.
- In the above, the gaze position determination table was taken as the example of calibrating the "processing performed when the gaze position is detected"; of course, the "gaze position" obtained as the result of detection may be calibrated instead. That is, in the present disclosure, "calibrating the gaze position" means performing calibration so that the detection accuracy of the gaze position improves as a result, regardless of whether the "processing performed when detecting the gaze position" or the detected "gaze position" itself is calibrated.
- In summary, the gaze detection device and gaze detection method of the present disclosure detect the brightness around the vehicle and calibrate the gaze position when the amount of change in that brightness exceeds a predetermined amount.
- The detection accuracy of the gaze position is likely to decrease when the amount of change in brightness around the vehicle becomes large. By detecting the brightness around the vehicle and calibrating the gaze position detected by the gaze detection unit when the amount of change exceeds a predetermined amount, calibration is performed at the timing when the detection accuracy has dropped, and the accuracy is raised again. As a result, the detection accuracy of the gaze position can be kept high.
- Furthermore, the brightness around the vehicle at the time the gaze position is calibrated is stored, and calibration of the gaze position is performed when the difference between the detected brightness around the vehicle and the stored brightness exceeds a predetermined amount.
- The brightness around the vehicle may change not only suddenly, as when entering or exiting a tunnel, but also gradually over time. Even in such cases, if the brightness at the time of the previous calibration is stored and calibration is performed whenever the brightness has changed from that value by the predetermined amount or more, the gaze position can be calibrated at the required timing. As a result, the detection accuracy of the gaze position can be kept high.
- In addition, a gaze image is displayed to the driver, and the gaze position is calibrated based on the position where the gaze image is displayed and the gaze position detected by the gaze detection unit.
- When the gaze image is displayed, the driver can be expected to move the line of sight to it, so displaying the gaze image allows the gaze position to be calibrated at the required timing. As a result, the detection accuracy of the gaze position can be reliably kept high.
- In the flowcharts referred to in this application, each step is denoted, for example, as S100. Each step can be divided into a plurality of sub-steps, and a plurality of steps can be combined into a single step. Each step configured in this way can also be referred to as a device, module, or means.
- Each of the steps described above, or a combination thereof, can be realized not only (i) as a software step combined with a hardware unit (for example, a computer) but also (ii) as a hardware step (for example, an integrated circuit or a wired logic circuit), with or without including the functions of related devices. The hardware steps can also be configured inside a microcomputer.
Abstract
Provided are a sight line detection device and a sight line detection method which are disposed upon a vehicle and whereby a sight line location whereto a driver's sight line reaches is detected. This sight line detection device comprises: a sight line detection unit (11a) which detects the sight line location of the driver; a correction unit (11b) which corrects the sight line location which is detected by the sight line detection unit (11a); and a peripheral environment detection unit (11c) which detects the brightness of the vehicle periphery. When a degree of change of the brightness of the vehicle periphery exceeds a prescribed degree, the correction unit (11b) corrects the sight line location.
Description
This application is based on Japanese Patent Application No. 2013-261878 filed on December 18, 2013, the contents of which are incorporated herein by reference.
The present disclosure relates to a technique for detecting the gaze position, that is, the point that a driver's line of sight reaches.
A technique is known that detects the driver's gaze position by analyzing an image of the driver's face. Using this technique, it is possible to monitor the driver for inattentive (looking-aside) driving. Furthermore, if the detection accuracy of the gaze position is sufficiently high, various devices mounted on the vehicle (such as a car navigation system) can be operated by gaze (for example, Patent Document 1).
If various devices can be operated by gaze, the driver no longer needs to reach out and operate them while driving, which should reduce the driver's burden. On the other hand, if the detection accuracy of the gaze position drops, an operation different from the driver's intention may be performed. A technique has therefore been proposed that, when a preceding vehicle is detected during driving, assumes the driver is gazing at it and calibrates the gaze position so that the detected gaze position becomes the position of the preceding vehicle (Patent Document 2).
The inventors of the present application have found the following regarding the gaze detection device and the gaze detection method: with the proposed technique, even if the detection accuracy of the gaze position has dropped, calibration cannot be performed while no preceding vehicle is detected, so the detection accuracy of the gaze position may not be kept high.
The present disclosure has been made in view of the above problem, and its object is to provide a technique capable of keeping the detection accuracy of the gaze position high.
The gaze detection device and gaze detection method according to an aspect of the present disclosure detect the brightness around the vehicle and, when the amount of change in that brightness exceeds a predetermined amount, calibrate the gaze position detected by the gaze detection unit. The gaze detection device includes a gaze detection unit that detects the driver's gaze position, a calibration unit that calibrates the gaze position detected by the gaze detection unit, and a surrounding environment detection unit that detects the brightness around the vehicle. The gaze detection method includes a step of detecting the driver's gaze position, a step of calibrating the gaze position detected in that step, and a step of detecting the brightness around the vehicle.
The detection accuracy of the gaze position is likely to decrease when the amount of change in brightness around the vehicle becomes large. Therefore, by detecting the brightness around the vehicle and calibrating the gaze position detected by the gaze detection unit when the amount of change exceeds a predetermined amount, calibration can be performed at the very timing when the detection accuracy has dropped, raising it again.
According to the gaze detection device and the gaze detection method of the present disclosure, the detection accuracy of the gaze position can be kept high.
The above and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings. In the drawings:
- FIG. 1A is an explanatory diagram illustrating the configuration of the gaze detection device according to the present embodiment;
- FIG. 1B is a block diagram illustrating the functions of the CPU;
- FIG. 2 is a flowchart of the gaze position detection process performed by the gaze detection device of the present embodiment;
- FIG. 3A is an explanatory diagram illustrating the pupil and the corneal reflection image in an image of the driver's right eye;
- FIG. 3B is an explanatory diagram illustrating the pupil and the corneal reflection image in an image of the driver's right eye;
- FIG. 3C is an explanatory diagram illustrating the pupil and the corneal reflection image in an image of the driver's right eye;
- FIG. 4 is an explanatory diagram showing the method of detecting the pupil relative position;
- FIG. 5 is an explanatory diagram illustrating the gaze position determination table;
- FIG. 6 is a flowchart of the calibration process performed by the gaze detection device according to the present embodiment;
- FIG. 7 is an explanatory diagram illustrating a gaze image displayed to the driver;
- FIG. 8 is an explanatory diagram conceptually showing how the gaze position determination table is calibrated;
- FIG. 9 is an explanatory diagram showing another example of gaze images displayed to the driver;
- FIG. 10 is an explanatory diagram showing the relationship between the brightness around the vehicle and the detection accuracy of the gaze position in the present embodiment.
Examples are described below to clarify the content of the present disclosure. The gaze detection device of the present embodiment is mounted on a vehicle and detects the gaze position, the point at which the line of sight of the vehicle's driver falls.

(Device configuration)
FIG. 1A shows the configuration of the gaze detection device 10 of the present embodiment. As illustrated, in the gaze detection device 10 the CPU 11 is connected, via a bus 14, to a flash memory 12, which stores the programs the CPU 11 executes, and to a RAM 13, which serves as the CPU 11's work area. The CPU 11 reads the various programs, and the data needed to execute them, from the flash memory 12 and performs the processing for detecting the gaze position.
Various methods can be used to detect the gaze position; this embodiment adopts a method that detects it from the positional relationship between the pupil and the corneal reflection image captured in an image of the driver's eye.
To support this method, a near-infrared camera 16, which captures images using near-infrared light, and a near-infrared LED 17 for illumination are connected to the bus 14 via a camera interface 15. Both are mounted in front of the driver's seat, facing the driver, so as to image the driver. By controlling them, the CPU 11 acquires an image of the driver's eyes captured under near-infrared illumination.
An illuminance sensor 19 is also connected to the bus 14 via an illuminance sensor interface 18; the CPU 11 uses it to detect the brightness around the vehicle. In addition, a HUD (head-up display) device 21 is connected to the bus 14 via a HUD device interface 20, and can project a predetermined image onto the windshield of the vehicle based on instructions from the CPU 11.
FIG. 1B is a block diagram showing the functions of the CPU 11, which are realized by the CPU 11 executing programs stored in the flash memory 12. As shown in FIG. 1B, the CPU 11 provides a gaze position detection function 11a that performs the gaze position detection process; a calibration function 11b that calibrates the gaze position detected by that process (described in detail later); a surrounding environment detection function 11c that detects the brightness around the vehicle; and an image display function 11d that displays a predetermined image to the driver.
The gaze position detection function 11a of the present embodiment corresponds to the "gaze detection unit" of the present disclosure, the calibration function 11b to the "calibration unit", the surrounding environment detection function 11c to the "surrounding environment detection unit", and the image display function 11d to the "image display unit".

(Gaze position detection processing)
FIG. 2 shows a flowchart of the gaze position detection process executed by the CPU 11 of this embodiment. In this process, the driver's gaze position is detected. The process starts when a predetermined condition is satisfied (for example, when the vehicle engine is started or when the driver performs a predetermined start operation).
When the gaze position detection process starts, the CPU 11 first controls the near-infrared camera 16 and the near-infrared LED 17 to image the driver's eyes under near-infrared illumination, and stores the captured eye image at a predetermined address in the RAM 13 (S100).
Next, the CPU 11 detects the center of the "pupil" and the center of the "corneal reflection image" in the eye image captured in S100, using a known method such as edge detection (S102). The "corneal reflection image" is the light emitted by the near-infrared LED 17 and reflected by the cornea (the so-called Purkinje image); it appears as a bright spot on the cornea.
FIGS. 3A, 3B, and 3C exemplify the "pupil" and the "corneal reflection image" in an image of the driver's right eye. FIG. 3A shows an image captured while the line of sight is directed straight ahead; in this example, the "corneal reflection image" appears directly below the "pupil". The CPU 11 detects the centers of the "pupil" and the "corneal reflection image" in the eye image in this way.
Here, the positional relationship between the "pupil" and the "corneal reflection image" in the eye image changes according to the gaze position. Roughly speaking, even though the pupil sits "directly above" the corneal reflection image while the line of sight is directed straight ahead, as in FIG. 3A, the pupil moves to the "upper right" of the corneal reflection image when the line of sight turns to the left, as in FIG. 3B, and to the "upper left" when it turns to the right, as in FIG. 3C.
Because the positional relationship between the "corneal reflection image" and the "pupil" changes with the gaze position in this way, the gaze position can be detected from "the position of the pupil center relative to the center of the corneal reflection image".
Accordingly, having detected the centers of the "pupil" and the "corneal reflection image" in S102, the CPU 11 then detects "the position of the pupil center relative to the center of the corneal reflection image" (hereinafter, the "pupil relative position") (S104). As shown in FIG. 4, the pupil relative position is expressed as the horizontal (x-direction) and vertical (y-direction) distances, in pixels, from the center of the corneal reflection image to the center of the pupil.
Once the pupil relative position has been detected (S104), the CPU 11 refers to the gaze position determination table and detects the driver's gaze position (S106). In this table, illustrated in FIG. 5, a driver gaze position (bx, by) is set for every value (ax, ay) that the pupil relative position can take. The CPU 11 looks up the gaze position corresponding to the pupil relative position obtained in S104.
The gaze position determination table is generated as follows before the gaze position detection process starts (for example, after the vehicle engine is started) and is stored in the flash memory 12 in advance. First, the driver's eyes are imaged while the driver gazes at a predetermined position (for example, at an image the HUD device 21 displays at that position), and the pupil relative position is detected in the same manner as described above. The gazed position (gaze position) and the obtained pupil relative position are then stored in association with each other. This is repeated several times (for example, 5 to 10 times) while the position to be gazed at is changed. For positions (gaze positions) not gazed at, pupil relative positions are computed by known interpolation, extrapolation, or similar processing. This yields the correspondence between "pupil relative position" and "gaze position", that is, the gaze position determination table.
In this way, the gaze detection device 10 of the present embodiment detects the gaze position based on the pupil relative position obtained from the eye image. The CPU 11 outputs the detected gaze position data to various driving support devices (not shown) on request, enabling various kinds of driving assistance that use the driver's gaze position.
For example, when the driver's gaze position deviates greatly from the vehicle's direction of travel detected by a steering angle sensor (not shown), the driver is warned, preventing inattentive driving. Alternatively, a device such as a car navigation system can be operated by the movement of the driver's line of sight; this reduces the physical burden compared with operating by hand and allows the device to be operated even when the driver's hands are occupied.
以上のように、視線位置を検出するための一連の処理を行ったら(S100~S106)、CPU11は、校正処理を行う(図2のS200)。この校正処理では、視線位置の検出精度が低下したと推定された場合に、次回からの視線位置の検出精度を高めるべく、上述した視線位置決定テーブルを校正する。
As described above, after performing the series of processes for detecting the line-of-sight position (S100 to S106), the CPU 11 performs a calibration process (S200 in FIG. 2). In this calibration process, when it is estimated that the detection accuracy of the line-of-sight position has decreased, the above-described line-of-sight position determination table is calibrated in order to increase the detection accuracy from the next detection onward.
(校正処理)
(Calibration process)
図6には、CPU11によって行われる校正処理のフローチャートが示されている。CPU11は、校正処理を開始すると、照度センサー19を用いて車両周辺の明るさを検出する(S202)。そして、前回校正を行った時からの車両周辺の明るさの変化量が所定量を超えたか否かを判断する(S204)。詳しくは後述するが、本実施例の視線検出装置10では、視線位置決定テーブルの校正を行うたびに、車両周辺の明るさをRAM13に記憶する。そこで、S204の処理では、前回校正を行ったときにRAM13に記憶された明るさと、今回検出された明るさとを比較し、これらの明るさの差が所定量より大きくなったか否かを判断する(S204)。尚、所定量としては、数100lx~数1000lxを例示することができる。
FIG. 6 shows a flowchart of the calibration process performed by the CPU 11. When the calibration process starts, the CPU 11 detects the brightness around the vehicle using the illuminance sensor 19 (S202). It then determines whether the amount of change in the brightness around the vehicle since the previous calibration exceeds a predetermined amount (S204). As will be described in detail later, the line-of-sight detection device 10 of this embodiment stores the brightness around the vehicle in the RAM 13 every time the line-of-sight position determination table is calibrated. In the process of S204, therefore, the brightness stored in the RAM 13 at the previous calibration is compared with the brightness detected this time, and it is determined whether the difference between them has become larger than the predetermined amount (S204). The predetermined amount may be, for example, several hundred lx to several thousand lx.
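The S202 to S206 decision can be condensed into a small sketch; the 500 lx threshold is one value picked from the several hundred to several thousand lx range given above, and the class and method names are assumptions.

```python
# Sketch of the calibration trigger: compare the current illuminance with
# the value stored in RAM 13 at the previous calibration (S204), and record
# the new value when a calibration is carried out (S206).

BRIGHTNESS_DELTA_LX = 500.0  # the "predetermined amount" (assumed value)

class CalibrationTrigger:
    def __init__(self, initial_lux):
        self.last_calibration_lux = initial_lux  # value kept in RAM 13

    def needs_calibration(self, current_lux):
        """S204: has the brightness changed by more than the threshold?"""
        return abs(current_lux - self.last_calibration_lux) > BRIGHTNESS_DELTA_LX

    def record_calibration(self, current_lux):
        """S206: store the brightness at the time of this calibration."""
        self.last_calibration_lux = current_lux

trigger = CalibrationTrigger(initial_lux=8000.0)  # daylight (assumed value)
print(trigger.needs_calibration(900.0))           # True: e.g. tunnel entry
```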
ここで、車両周辺の明るさが変化すると、次のような理由により、視線位置の検出精度が低下することがある。すなわち、車両周辺の明るさが変化すると、車室内の運転者の目に入る光の明るさも変化するので、運転者の目の瞳孔径の大きさが変化する。すると、実際の視線位置が変化していなくても、目の画像から得られる「瞳孔相対位置」が変化することがある。「瞳孔相対位置」には、該「瞳孔相対位置」に応じた視線位置が対応付けられているので、瞳孔相対位置が異なれば、当然ながら、対応する視線位置は異なったものとなる。このように、車両周辺の明るさが変化すると、実際の視線位置が変化していなくても目の画像から得られる「瞳孔相対位置」が変化し、ひいては該「瞳孔相対位置」に基づいて検出される視線位置が、実際の視線位置からずれてしまうことがある。
Here, when the brightness around the vehicle changes, the detection accuracy of the line-of-sight position may decrease for the following reason. When the brightness around the vehicle changes, the brightness of the light entering the eyes of the driver in the passenger compartment also changes, so the pupil diameter of the driver's eyes changes. Then, even if the actual line-of-sight position has not changed, the “pupil relative position” obtained from the eye image may change. Since each “pupil relative position” is associated with a corresponding line-of-sight position, a different pupil relative position naturally yields a different line-of-sight position. Thus, when the brightness around the vehicle changes, the “pupil relative position” obtained from the eye image changes even though the actual line-of-sight position has not, and consequently the line-of-sight position detected from that “pupil relative position” may deviate from the actual line-of-sight position.
したがって、上述したS204の判断処理で、車両周辺の明るさの変化量が所定量を超えたと判断された場合は(S204:yes)、視線位置の検出精度が低下したと推定することができる。そこで、CPU11は、車両周辺の明るさの変化量が所定量を超えたと判断したら(S204:yes)、S202の処理で検出した車両周辺の明るさを「今回校正時の明るさ」としてRAM13に記憶した後(S206)、視線位置の検出精度を高めるべく、次のようにして視線位置決定テーブルの校正を行う。
Therefore, if it is determined in the determination process of S204 described above that the amount of change in the brightness around the vehicle has exceeded the predetermined amount (S204: yes), it can be estimated that the detection accuracy of the line-of-sight position has decreased. In that case, the CPU 11 stores the brightness around the vehicle detected in S202 in the RAM 13 as the “brightness at the current calibration” (S206), and then calibrates the line-of-sight position determination table as follows in order to increase the detection accuracy of the line-of-sight position.
先ず、CPU11は、所定の制御信号をHUD装置21に出力することで、運転者の前方に「注視用画像30a」を表示する(S208)。図7には、運転者の前方のウィンドシールドに注視用画像30aが表示された様子が例示されている。尚、図7は、運転席から車両前方を見たときの様子を示しており、矩形形状の破線は、HUD装置21によるウィンドシールド上での画像表示領域を表している。そして、図示された例では、画像表示領域の右下部分に注視用画像30aが表示されている。HUD装置21は、CPU11からの制御信号に基づいて、普段表示する画像よりも目立つような態様で(たとえば、点滅させたり輝度を高くしたりして)注視用画像30aを表示する。こうして運転者の前方に注視用画像30aを表示すると、前方を見ながら運転する運転者は、その注視用画像30aを注視するものと考えられる。
First, the CPU 11 outputs a predetermined control signal to the HUD device 21 to display the “gaze image 30a” in front of the driver (S208). FIG. 7 illustrates the gaze image 30a displayed on the windshield in front of the driver. FIG. 7 shows the view toward the front of the vehicle from the driver's seat, and the rectangular broken line represents the image display area of the HUD device 21 on the windshield. In the illustrated example, the gaze image 30a is displayed in the lower right part of the image display area. Based on the control signal from the CPU 11, the HUD device 21 displays the gaze image 30a in a manner that makes it stand out from normally displayed images (for example, by blinking it or increasing its brightness). When the gaze image 30a is displayed in front of the driver in this way, a driver who drives while looking ahead can be expected to gaze at it.
そこで、CPU11は、運転者が注視用画像30aを注視している状態での瞳孔相対位置、すなわち注視用画像30aを表示したときの瞳孔相対位置を検出する(S210)。瞳孔相対位置の検出は、図2を用いて前述した視線位置検出処理と同様の方法で行う。すなわち、近赤外光を照射した状態で運転者の目を撮像し、目の画像から検出された瞳孔および角膜反射像の中心に基づいて瞳孔相対位置を検出する。
Therefore, the CPU 11 detects the pupil relative position while the driver is gazing at the gaze image 30a, that is, the pupil relative position at the time the gaze image 30a is displayed (S210). The pupil relative position is detected by the same method as in the line-of-sight position detection process described above with reference to FIG. 2: the driver's eyes are imaged while irradiated with near-infrared light, and the pupil relative position is detected based on the centers of the pupil and the corneal reflection image detected from the eye image.
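The pupil relative position itself reduces to an offset between two image coordinates. The sketch below assumes the pupil and corneal-reflection centres have already been extracted from the image; the centre-detection step is outside its scope, and the coordinate values are illustrative.

```python
# Pupil relative position: the offset of the pupil centre from the corneal
# reflection (glint) centre in the eye image, per the method described above.

def pupil_relative_position(pupil_center, glint_center):
    """Return (ax, ay): pupil centre relative to the corneal reflection."""
    return (pupil_center[0] - glint_center[0],
            pupil_center[1] - glint_center[1])

# Illustrative coordinates chosen so the result matches FIG. 8's (120, 250).
print(pupil_relative_position((352, 410), (232, 160)))  # -> (120, 250)
```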
こうして、注視用画像30aを表示したときの「瞳孔相対位置」を検出したら(S210)、CPU11は、注視用画像30aを表示した位置、および注視用画像30aを表示したときに得られた「瞳孔相対位置」に基づいて、視線位置決定テーブルを校正する(S212)。すなわち、今回得られた「瞳孔相対位置」は、運転者が注視用画像30aを注視している状態での「瞳孔相対位置」であると推定できるので、該「瞳孔相対位置」が検出された場合に、「注視用画像30aを表示した位置」が視線位置として検出されるように、視線位置決定テーブルにおける該「瞳孔相対位置」に対応する視線位置を、「注視用画像30aを表示した位置」に対応する視線位置に更新する。
When the “pupil relative position” at the time the gaze image 30a was displayed has been detected in this way (S210), the CPU 11 calibrates the line-of-sight position determination table based on the position where the gaze image 30a was displayed and the “pupil relative position” obtained at that time (S212). That is, the “pupil relative position” obtained this time can be presumed to be the “pupil relative position” while the driver was gazing at the gaze image 30a. The line-of-sight position associated with this “pupil relative position” in the line-of-sight position determination table is therefore updated to the line-of-sight position corresponding to the “position where the gaze image 30a was displayed”, so that whenever this “pupil relative position” is detected, the “position where the gaze image 30a was displayed” is detected as the line-of-sight position.
たとえば、図8に示すように、今回得られた「瞳孔相対位置」(言い換えると、注視用画像30aを表示したときに得られた「瞳孔相対位置」)の値が(120,250)であった場合、この瞳孔相対位置(120、250)に対応する視線位置を、「注視用画像30aを表示した位置」に対応する視線位置(bx80、by80)に更新する。
For example, as shown in FIG. 8, if the value of the “pupil relative position” obtained this time (in other words, the “pupil relative position” obtained when the gaze image 30a was displayed) is (120, 250), the line-of-sight position corresponding to this pupil relative position (120, 250) is updated to the line-of-sight position (bx80, by80) corresponding to the “position where the gaze image 30a was displayed”.
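The S212 update of a single table entry is then a one-line overwrite; the stale value and the numeric stand-in for (bx80, by80) below are invented for the example.

```python
# Sketch of the S212 update: the entry for the pupil relative position
# measured while the gaze image was shown is overwritten with the
# line-of-sight position at which the gaze image was displayed.

def calibrate_entry(table, measured_pupil_pos, displayed_gaze_pos):
    """Make the displayed position the detection result for this entry."""
    table[measured_pupil_pos] = displayed_gaze_pos

gaze_table = {(120, 250): (600.0, 340.0)}                # stale entry
calibrate_entry(gaze_table, (120, 250), (640.0, 360.0))  # (bx80, by80) stand-in
print(gaze_table[(120, 250)])                            # -> (640.0, 360.0)
```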
こうして、注視用画像30aを表示したときに得られた「瞳孔相対位置」に対応する視線位置を更新したら、これに連動して、その他の「瞳孔相対位置」に対応する視線位置も、周知の補間処理や補外処理等を行うことによって更新する。こうして視線位置決定テーブルを校正することで、視線位置検出処理で検出される視線位置と実際の視線位置とのズレが解消され、視線位置の検出精度が高められる。
When the line-of-sight position corresponding to the “pupil relative position” obtained at the time the gaze image 30a was displayed has been updated in this way, the line-of-sight positions corresponding to the other “pupil relative positions” are also updated in step with it, by known interpolation, extrapolation, or similar processing. Calibrating the line-of-sight position determination table in this manner eliminates the deviation between the line-of-sight position detected by the line-of-sight position detection process and the actual line-of-sight position, and improves the detection accuracy of the line-of-sight position.
尚、本実施例では、視線位置決定テーブルを校正することにより、視線位置の検出精度が高められる。したがって、本実施例において「視線位置決定テーブルを校正する」ことは、本開示における「視線位置を校正する」ことに相当する。
In this embodiment, the detection accuracy of the line-of-sight position is improved by calibrating the line-of-sight position determination table. Therefore, “calibrating the line-of-sight position determination table” in this embodiment corresponds to “calibrating the line-of-sight position” in the present disclosure.
尚、視線位置決定テーブルを上述のようにして校正するのではなく、視線位置決定テーブルを生成し直すこととしてもよい。すなわち、図9に示されるように、S208の処理で説明した注視用画像30aを表示する処理を、表示位置を変更しながら複数回繰り返す。図9に示す例では、HUD装置21の画像表示領域の5カ所(四隅、および中央の部分)に注視用画像30b~30fを順次表示する。運転者は、順次表示された注視用画像30b~30fを注視すると考えられるので、CPU11は、注視用画像30b~30fを表示したときに撮像した目の画像から、瞳孔相対位置を検出する。そして、注視用画像30b~30fの表示位置(視線位置)と、検出された瞳孔相対位置とを対応付けて記憶する。また、注視用画像30b~30fを表示していない位置(視線位置)については、周知の補間処理や補外処理等を行うことによって瞳孔相対位置を演算する。
Note that, instead of calibrating the line-of-sight position determination table as described above, the table may be regenerated. That is, as shown in FIG. 9, the process of displaying the gaze image described for S208 is repeated a plurality of times while changing the display position. In the example shown in FIG. 9, gaze images 30b to 30f are displayed one after another at five locations in the image display area of the HUD device 21 (the four corners and the center). Since the driver can be expected to gaze at each of the sequentially displayed gaze images 30b to 30f, the CPU 11 detects the pupil relative position from the eye image captured while each of the gaze images 30b to 30f is displayed. The display positions (line-of-sight positions) of the gaze images 30b to 30f and the detected pupil relative positions are then stored in association with each other. For positions (line-of-sight positions) where none of the gaze images 30b to 30f were displayed, the pupil relative positions are calculated by known interpolation, extrapolation, or similar processing.
こうして、視線位置決定テーブルを生成し直すと、視線位置検出処理で検出される視線位置と実際の視線位置とのズレが確実に解消され、視線位置の検出精度を確実に高めることができる。尚、このように視線位置決定テーブルを生成し直した場合は、「視線位置決定テーブルを生成し直す」ことが、本開示における「視線位置を校正する」ことに相当する。
When the line-of-sight position determination table is regenerated in this way, the deviation between the line-of-sight position detected by the line-of-sight position detection process and the actual line-of-sight position is reliably eliminated, and the detection accuracy of the line-of-sight position can be reliably increased. When the table is regenerated in this way, “regenerating the line-of-sight position determination table” corresponds to “calibrating the line-of-sight position” in the present disclosure.
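The regeneration variant can be sketched as a loop over the five display positions; display_gaze_image, measure_pupil_relative_position, and interpolate_table are hypothetical helpers standing in for the HUD output, the camera processing, and the interpolation step already sketched above.

```python
# Sketch of regenerating the table from scratch using gaze images 30b-30f.

DISPLAY_POSITIONS = [(0, 0), (1280, 0), (0, 720), (1280, 720), (640, 360)]

def regenerate_table(display_gaze_image, measure_pupil_relative_position,
                     interpolate_table):
    samples = []
    for pos in DISPLAY_POSITIONS:                  # four corners + centre
        display_gaze_image(pos)                    # S208 equivalent
        pupil = measure_pupil_relative_position()  # S210 equivalent
        samples.append((pupil, pos))
    # Fill unmeasured (line-of-sight) positions by interpolation/extrapolation.
    return interpolate_table(samples)
```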
以上に説明したように、本実施例の視線検出装置10は、視線位置決定テーブルの校正(あるいは生成し直し)を行うに際し、運転者に対して注視用画像30aを表示する。運転者に対して注視用画像30aを表示する処理は、HUD装置21を用いていつでも行うことができる。また、前方を見ながら運転する運転者は、運転者に対して表示された(運転者の前方に表示された)注視用画像30aを注視すると考えられる。これらのことから、運転者に対して注視用画像30aを表示する処理により、所望のタイミングで運転者の「実際の視線位置」を取得することができる。
As described above, the line-of-sight detection device 10 according to the present embodiment displays the gaze image 30a to the driver when the line-of-sight position determination table is calibrated (or regenerated). The process of displaying the gaze image 30a for the driver can be performed at any time using the HUD device 21. Further, it is considered that the driver who drives while looking forward looks at the gaze image 30a displayed to the driver (displayed in front of the driver). For these reasons, the driver's “actual line-of-sight position” can be acquired at a desired timing by the process of displaying the gaze image 30a to the driver.
そこで、本実施例の視線検出装置10では、前述したように、運転者に対して注視用画像30aを表示して、その注視用画像30aを表示した位置と、視線位置検出処理(と同様の方法)で検出した瞳孔相対位置とに基づいて、視線位置決定テーブルを校正する。こうすれば、車両周辺の明るさの変化量が所定量を超えたタイミングを逃すことなく、運転者の「実際の視線位置」を取得して、その「実際の視線位置」が視線位置検出処理によって検出されるように視線位置決定テーブルを校正することができる。その結果、視線位置の検出精度を確実に高く保つことができる。
Accordingly, as described above, the line-of-sight detection device 10 of this embodiment displays the gaze image 30a to the driver and calibrates the line-of-sight position determination table based on the position where the gaze image 30a was displayed and the pupil relative position detected by (the same method as) the line-of-sight position detection process. In this way, the driver's “actual line-of-sight position” can be acquired without missing the timing at which the amount of change in the brightness around the vehicle exceeds the predetermined amount, and the table can be calibrated so that this “actual line-of-sight position” is what the line-of-sight position detection process detects. As a result, the detection accuracy of the line-of-sight position can be reliably kept high.
尚、本実施例では、注視用画像30aを表示した位置と、視線位置検出処理(と同様の方法)で検出した「瞳孔相対位置」とに基づいて、視線位置決定テーブルを校正したが、この「瞳孔相対位置」は、視線位置検出処理によって検出される「視線位置」に対応するものである。したがって、本実施例における「注視用画像30aを表示した位置と、視線位置検出処理(と同様の方法)で検出した「瞳孔相対位置」とに基づいて、視線位置決定テーブルを校正する」ことは、本開示における「注視用画像を表示した位置と、視線検出部が検出した「視線位置」とに基づいて、視線位置を校正する」ことに相当する。
In this embodiment, the line-of-sight position determination table is calibrated based on the position where the gaze image 30a was displayed and the “pupil relative position” detected by (the same method as) the line-of-sight position detection process; this “pupil relative position” corresponds to the “line-of-sight position” detected by the line-of-sight position detection process. Therefore, “calibrating the line-of-sight position determination table based on the position where the gaze image 30a was displayed and the ‘pupil relative position’ detected by (the same method as) the line-of-sight position detection process” in this embodiment corresponds to “calibrating the line-of-sight position based on the position where the gaze image was displayed and the ‘line-of-sight position’ detected by the line-of-sight detection unit” in the present disclosure.
また、前方を見ながら運転する運転者は、運転者に対して表示された(運転者の前方に表示された)注視用画像30aを、視線を大きく動かすことなく無意識のうちに注視するものと考えられる。このため、運転者の運転を妨げることがないので、視線位置決定テーブルの校正を安全に行うことができる。
In addition, a driver who drives while looking ahead can be expected to gaze at the gaze image 30a displayed to him or her (displayed in front of the driver) unconsciously, without greatly moving the line of sight. The calibration of the line-of-sight position determination table can therefore be performed safely, without interfering with the driver's driving.
さらに、上述した説明では、運転者を所定の位置に注視させるために、注視用画像30aを表示することとしたが、注視用画像30aのように専用の画像を用いるのではなく、HUD装置21によって普段から表示される画像を用いることとしてもよい。なかでも、運転者に警告するための警告画像(たとえば、前方に障害物を検出したことを警告する警告画像や、道路標識を提示して注意を促す警告画像)は、一般的に、運転者の注意を引き付けるような表示態様が採用されている。このため、警告画像は、注視用画像30aと同様に、運転者を所定の位置に注視させることができる。
Further, in the description above, the gaze image 30a is displayed in order to make the driver gaze at a predetermined position; however, instead of a dedicated image such as the gaze image 30a, an image that the HUD device 21 normally displays may be used. In particular, a warning image for the driver (for example, one warning that an obstacle has been detected ahead, or one presenting a road sign to call for attention) generally adopts a display mode that attracts the driver's attention. A warning image can therefore make the driver gaze at a predetermined position in the same way as the gaze image 30a.
そこで、CPU11は、車両周辺の明るさの変化量が所定量を超えた場合、視線位置決定テーブルの校正を、警告画像の表示位置に基づいて行うこととしてもよい。このように、専用の画像(注視用画像30a)ではなく、普段から表示される警告画像を利用することとすれば、運転者に対して表示される画像は普段と何ら変わることがない。このため、運転者の運転を妨げることなく、視線位置決定テーブルの校正をより安全に行うことができる。
Therefore, when the amount of change in the brightness around the vehicle exceeds the predetermined amount, the CPU 11 may calibrate the line-of-sight position determination table based on the display position of a warning image. If a normally displayed warning image is used in this way instead of a dedicated image (the gaze image 30a), the images displayed to the driver are no different from usual. The line-of-sight position determination table can therefore be calibrated even more safely, without disturbing the driver's driving.
また、前述したように、本実施例の視線検出装置10では、車両周辺の明るさの変化量が所定量を超えた場合に視線位置の検出精度が低下したと推定して、視線位置決定テーブルの校正を行うこととした。以下では、この点について補足して説明する。
Further, as described above, the line-of-sight detection device 10 of this embodiment estimates that the detection accuracy of the line-of-sight position has decreased when the amount of change in the brightness around the vehicle exceeds the predetermined amount, and calibrates the line-of-sight position determination table accordingly. This point is explained in more detail below.
図10には、本実施例における「車両周辺の明るさ」と「視線位置の検出精度」との関係が概念的に示されている。図10中の上側には、車両周辺の明るさが示されており、図10中の下側には、視線位置の検出精度が示されている。
FIG. 10 conceptually shows the relationship between the “brightness around the vehicle” and the “detection accuracy of the line-of-sight position” in this embodiment. The upper part of FIG. 10 shows the brightness around the vehicle, and the lower part shows the detection accuracy of the line-of-sight position.
車両が日中に走行しているとき、図10中の時刻t1で示されるように、トンネル内に進入することで車両周辺の明るさが大きく低下することがある。これにより、車室内の運転者の目に入る光の明るさが低下すると、実際には視線位置が変化していなくても、目の画像から得られる瞳孔相対位置が変化してしまい、視線位置の検出精度が低下することがある。そこで、CPU11は、車両周辺の明るさの変化量が所定量を超えた(所定量を超えて低下した)と判断したら、視線位置決定テーブルの校正を行う。これにより、図10中の時刻t2で示されるように、視線位置の検出精度を高めることができる。
When the vehicle is traveling in the daytime, the brightness around the vehicle may drop sharply on entering a tunnel, as shown at time t1 in FIG. 10. When the brightness of the light entering the eyes of the driver in the passenger compartment decreases as a result, the pupil relative position obtained from the eye image may change even though the line-of-sight position has not actually changed, and the detection accuracy of the line-of-sight position may decrease. Therefore, when the CPU 11 determines that the amount of change in the brightness around the vehicle has exceeded the predetermined amount (has dropped by more than the predetermined amount), it calibrates the line-of-sight position determination table. The detection accuracy of the line-of-sight position can thereby be increased, as shown at time t2 in FIG. 10.
そして、図10中の時刻t3で示されるように、いずれ、車両がトンネルの外に出ると、車両周辺の明るさが元の明るさに戻り、運転者の目に入る光の明るさも元に戻る。すると、実際には視線位置が変化していなくても、瞳孔相対位置が変化してしまい、視線位置の検出精度が低下することがある。そこで、CPU11は、車両周辺の明るさの変化量が所定量を超えた(所定量を超えて上昇した)と判断したら、視線位置決定テーブルの校正を行う。これにより、図10中の時刻t4で示されるように、視線位置の検出精度を高めることができる。
Then, as shown at time t3 in FIG. 10, when the vehicle eventually leaves the tunnel, the brightness around the vehicle returns to its original level, and so does the brightness of the light entering the driver's eyes. Again, the pupil relative position may change even though the line-of-sight position has not actually changed, and the detection accuracy of the line-of-sight position may decrease. Therefore, when the CPU 11 determines that the amount of change in the brightness around the vehicle has exceeded the predetermined amount (has risen by more than the predetermined amount), it calibrates the line-of-sight position determination table. The detection accuracy of the line-of-sight position can thereby be increased, as shown at time t4 in FIG. 10.
このように、車両周辺の明るさの変化量が所定量を超えた場合に視線位置決定テーブルの校正を行うこととすれば、視線位置の検出精度が低下したタイミングで校正を行って検出精度を高めることができる。この結果、視線位置の検出精度を高く保つことができる。
In this way, if the line-of-sight position determination table is calibrated when the amount of change in the brightness around the vehicle exceeds the predetermined amount, calibration is performed at exactly the timing at which the detection accuracy of the line-of-sight position has decreased, restoring the accuracy. As a result, the detection accuracy of the line-of-sight position can be kept high.
また、車両が日没時に走行しているとき、図10中の時刻t5から、時刻t6,t7,t8にかけて示されているように、車両周辺の明るさが徐々に低下することがある。この場合、たとえば、単位時間当たりの明るさの変化量だけでは、車両周辺の明るさの変化量が所定量を超えたか否かを適切に判断することができない虞がある。
Further, when the vehicle is traveling around sunset, the brightness around the vehicle may decrease gradually, as shown from time t5 through times t6, t7, and t8 in FIG. 10. In such a case, the amount of change in brightness per unit time alone may not be enough to determine appropriately whether the amount of change in the brightness around the vehicle has exceeded the predetermined amount.
そこで、本実施例の視線検出装置10(CPU11)は、視線位置決定テーブルを校正したときの車両周辺の明るさをRAM13に記憶することとし、視線位置決定テーブルの校正は、検出された車両周辺の明るさと、RAM13に記憶されている明るさとの差が所定量より大きくなった場合に行うこととする。
Therefore, the line-of-sight detection device 10 (CPU 11) of this embodiment stores in the RAM 13 the brightness around the vehicle at the time the line-of-sight position determination table was calibrated, and calibrates the table when the difference between the detected brightness around the vehicle and the brightness stored in the RAM 13 becomes larger than the predetermined amount.
こうすれば、車両周辺の明るさが徐々に変化する場合であっても、記憶されている明るさ(すなわち、前回校正を行ったときの明るさ)と、今回検出された明るさとを比較することができるので、車両周辺の明るさの変化量が所定量を超えたか否かを適切に判断することができる。これにより、車両周辺の明るさが徐々に変化することで視線位置の検出精度が徐々に低下するような場合でも、検出精度が前回の校正により高められた状態から大きく低下する前に校正を行って、視線位置の検出精度を高く保つことができる(図10中の時刻t6,t7,t8を参照)。尚、本実施例におけるRAM13は、本開示における「記憶部」に相当する。
In this way, even when the brightness around the vehicle changes gradually, the stored brightness (that is, the brightness at the previous calibration) can be compared with the brightness detected this time, so it can be determined appropriately whether the amount of change in the brightness around the vehicle has exceeded the predetermined amount. Consequently, even when the detection accuracy of the line-of-sight position degrades gradually as the brightness around the vehicle changes gradually, calibration is performed before the accuracy falls far below the level restored by the previous calibration, and the detection accuracy of the line-of-sight position can be kept high (see times t6, t7, and t8 in FIG. 10). The RAM 13 in this embodiment corresponds to the “storage unit” in the present disclosure.
以上、本実施例では、視線位置を検出する方法として、運転者の目の画像に写った角膜反射像と瞳孔との位置関係を利用する方法を用いて説明した。しかし、視線位置を検出する方法としては、この方法に限られるものではない。すなわち、視線位置を検出する各種方法において、車両周辺の明るさの変化量が大きくなると、視線位置の検出精度が低下しやすくなると考えられる。そこで、視線位置を検出する各種方法において、車両周辺の明るさの変化量が所定量を超えた場合に、視線位置の検出で行われる処理を校正することとすれば、上述した実施例と同様の効果を得ることができる。尚、視線位置の検出で行われる処理の校正は、検出方法に対応した方法で行われる。たとえば、視線位置の検出を、視線位置決定テーブルではなく、所定の関係式に基づいて行っている場合、その関係式を校正する。
The above embodiment has been described using, as the method of detecting the line-of-sight position, the positional relationship between the corneal reflection image and the pupil in the image of the driver's eye. The method of detecting the line-of-sight position, however, is not limited to this. In any of the various methods for detecting the line-of-sight position, the detection accuracy can be expected to degrade when the amount of change in the brightness around the vehicle becomes large. Accordingly, with any such method, the same effect as in the above embodiment can be obtained by calibrating the processing performed in the detection of the line-of-sight position when the amount of change in the brightness around the vehicle exceeds the predetermined amount. The calibration of that processing is performed by a method suited to the detection method; for example, when the line-of-sight position is detected based on a predetermined relational expression rather than a line-of-sight position determination table, the relational expression is calibrated.
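For the relational-expression variant mentioned here, one conceivable form is an affine mapping whose offset is re-fitted from a single gaze-image measurement; the gain matrix and offset values below are pure assumptions, since the patent leaves the expression unspecified.

```python
import numpy as np

# Assumed affine model: (bx, by) = A @ (ax, ay) + c.
A = np.array([[10.0, 0.0],
              [0.0, 8.0]])       # assumed gain matrix
c = np.array([-560.0, -1640.0])  # assumed offset

def gaze_from_expression(pupil_rel):
    return A @ np.asarray(pupil_rel, dtype=float) + c

def recalibrate_offset(measured_pupil, displayed_pos):
    """Re-fit the offset so the displayed position is reproduced exactly."""
    global c
    c = (np.asarray(displayed_pos, dtype=float)
         - A @ np.asarray(measured_pupil, dtype=float))

recalibrate_offset((120, 250), (640.0, 360.0))
print(gaze_from_expression((120, 250)))  # -> [640. 360.]
```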
また、本実施例では、視線位置決定テーブルを例として、「視線位置を検出する際に行われる処理」を校正するものとして説明したが、もちろん、検出して得られた結果物である「視線位置」を校正することとしてもよい。すなわち、本開示における「視線位置を校正する」とは、「視線位置を検出する際に行われる処理」を校正するのか、検出して得られた結果物である「視線位置」を校正するのかに拘らず、結果的に視線位置の検出精度が高められるように校正することを表す。尚、検出して得られた結果物である「視線位置」を校正する方法としては、注視用画像30aを表示した位置(すなわち、実際の視線位置)と、検出された視線位置とに基づいて、該視線位置が「実際の視線位置」となるように、該視線位置をオフセットする方法を例示することができる。
In this embodiment, the description has taken the line-of-sight position determination table as an example of calibrating the “processing performed when detecting the line-of-sight position”, but the “line-of-sight position” obtained as the result of detection may of course be calibrated instead. That is, “calibrating the line-of-sight position” in the present disclosure means calibrating so that the detection accuracy of the line-of-sight position is ultimately improved, regardless of whether the “processing performed when detecting the line-of-sight position” or the resulting “line-of-sight position” itself is calibrated. As a method of calibrating the resulting “line-of-sight position”, one example is to offset the detected line-of-sight position, based on the position where the gaze image 30a was displayed (that is, the actual line-of-sight position) and the detected line-of-sight position, so that the detected position matches the “actual line-of-sight position”.
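The offset method named at the end of this paragraph can be sketched directly; the class below is an illustrative reading of that description, not the claimed implementation, and the coordinate values are invented.

```python
# Sketch of result-side calibration: instead of touching the table, every
# detected line-of-sight position is shifted by the offset between the
# gaze-image position (actual line of sight) and the position detected
# while the driver looked at it.

class GazeOffsetCalibrator:
    def __init__(self):
        self.offset = (0.0, 0.0)

    def calibrate(self, displayed_pos, detected_pos):
        self.offset = (displayed_pos[0] - detected_pos[0],
                       displayed_pos[1] - detected_pos[1])

    def correct(self, detected_pos):
        return (detected_pos[0] + self.offset[0],
                detected_pos[1] + self.offset[1])

cal = GazeOffsetCalibrator()
cal.calibrate(displayed_pos=(640.0, 360.0), detected_pos=(610.0, 345.0))
print(cal.correct((610.0, 345.0)))  # -> (640.0, 360.0)
```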
本開示の一態様によれば、視線検出装置および視線検出方法は、車両周辺の明るさを検出して、車両周辺の明るさの変化量が所定量を超えた場合に、視線検出部によって検出された視線位置を校正する。
According to one aspect of the present disclosure, the line-of-sight detection device and the line-of-sight detection method detect the brightness around the vehicle and calibrate the line-of-sight position detected by the line-of-sight detection unit when the amount of change in the brightness around the vehicle exceeds a predetermined amount.
車両周辺の明るさの変化量が大きくなると、視線位置の検出精度が低下しやすくなると考えられる。そこで、車両周辺の明るさを検出して、車両周辺の明るさの変化量が所定量を超えた場合に、視線検出部によって検出された視線位置を校正することとすれば、視線位置の検出精度が低下したタイミングで校正を行って検出精度を高めることができる。その結果、視線位置の検出精度を高く保つことができる。
When the amount of change in the brightness around the vehicle becomes large, the detection accuracy of the line-of-sight position can be expected to degrade. Therefore, by detecting the brightness around the vehicle and calibrating the line-of-sight position detected by the line-of-sight detection unit when the amount of change exceeds a predetermined amount, calibration can be performed at the timing at which the detection accuracy has decreased, restoring the accuracy. As a result, the detection accuracy of the line-of-sight position can be kept high.
また、本開示の視線検出装置においては、次のようにしてもよい。視線位置を校正した時の車両周辺の明るさを記憶することとし、視線位置の校正は、検出された車両周辺の明るさと、記憶されている明るさとの差が所定量より大きくなった場合に行う。
Further, the line-of-sight detection device of the present disclosure may be configured as follows: the brightness around the vehicle at the time the line-of-sight position is calibrated is stored, and the line-of-sight position is calibrated when the difference between the detected brightness around the vehicle and the stored brightness becomes larger than a predetermined amount.
車両周辺の明るさは、トンネルの出入り時のように急変する場合だけでなく、時間の経過とともに徐々に変化する場合がある。このような場合でも、視線位置を校正した時の車両周辺の明るさを記憶しておき、その時の明るさから、車両周辺の明るさが所定量以上変化した場合に校正することとすれば、必要となったタイミングで視線位置を校正することができる。この結果、視線位置の検出精度を高く保つことができる。
The brightness around the vehicle may change not only suddenly, as when entering or leaving a tunnel, but also gradually over time. Even in such a case, if the brightness around the vehicle at the time the line-of-sight position was calibrated is stored, and calibration is performed when the brightness around the vehicle has changed from that stored brightness by the predetermined amount or more, the line-of-sight position can be calibrated at the timing at which it becomes necessary. As a result, the detection accuracy of the line-of-sight position can be kept high.
また、本開示の視線検出装置においては、次のようにしてもよい。運転者に対して注視用画像を表示して、この注視用画像を表示した位置と、視線検出部が検出した視線位置とに基づいて、視線位置を校正する。
Further, in the gaze detection device of the present disclosure, the following may be performed. The gaze image is displayed to the driver, and the gaze position is calibrated based on the position where the gaze image is displayed and the gaze position detected by the gaze detection unit.
注視用画像を表示すれば、運転者はその注視用画像に視線を移動させると考えて良い。従って、注視用画像を表示すれば、必要となったタイミングで視線位置を校正することができる。その結果、視線位置の検出精度を確実に高く保つことができる。
If the gaze image is displayed, it can be considered that the driver moves the line of sight to the gaze image. Therefore, if the gaze image is displayed, the line-of-sight position can be calibrated at the required timing. As a result, the detection accuracy of the line-of-sight position can be reliably kept high.
ここで、この出願に記載されるフローチャート、あるいは、フローチャートの処理は、複数のステップ(あるいはセクションと言及される)から構成され、各ステップは、たとえば、S100と表現される。さらに、各ステップは、複数のサブステップに分割されることができる、一方、複数のステップが合わさって一つのステップにすることも可能である。さらに、このように構成される各ステップは、デバイス、モジュール、ミーンズとして言及されることができる。
Here, the flowcharts described in this application, and the processing of those flowcharts, consist of a plurality of steps (also referred to as sections), and each step is expressed as, for example, S100. Furthermore, each step can be divided into a plurality of sub-steps, while a plurality of steps can also be combined into a single step. Each step configured in this way can also be referred to as a device, module, or means.
また、上記の複数のステップの各々あるいは組合わさったものは、(i)ハードウエアユニット(例えば、コンピュータ)と組み合わさったソフトウエアのステップのみならず、(ii)ハードウエア(例えば、集積回路、配線論理回路)のステップとして、関連する装置の機能を含みあるいは含まずに実現できる。さらに、ハードウエアのステップは、マイクロコンピュータの内部に構成されることもできる。
In addition, each of the plurality of steps described above, or a combination of them, can be realized not only as (i) a software step combined with a hardware unit (e.g., a computer) but also as (ii) a hardware step (e.g., an integrated circuit or hard-wired logic circuit), with or without including the functions of related devices. Furthermore, the hardware steps can be configured inside a microcomputer.
以上、本開示の実施形態、構成、態様を例示したが、本開示に係わる実施形態、構成、態様は、上述した各実施形態、各構成、各態様に限定されるものではない。例えば、異なる実施形態、構成、態様にそれぞれ開示された技術的部を適宜組み合わせて得られる実施形態、構成、態様についても本開示に係わる実施形態、構成、態様の範囲に含まれる。
The embodiments, configurations, and aspects of the present disclosure have been illustrated above, but the embodiments, configurations, and aspects according to the present disclosure are not limited to the above-described embodiments, configurations, and aspects. For example, embodiments, configurations, and aspects obtained by appropriately combining technical units disclosed in different embodiments, configurations, and aspects are also included in the scope of the embodiments, configurations, and aspects according to the present disclosure.
Claims (4)
- 車両に設けられ、運転者の視線が到達する視線位置を検出する視線検出装置であって、
前記運転者の前記視線位置を検出する視線検出部(11a)と、
前記視線検出部(11a)によって検出された前記視線位置を校正する校正部(11b)と、
前記車両の周辺の明るさを検出する周辺環境検出部(11c)と
を備え、
前記校正部(11b)は、前記車両の周辺の明るさの変化量が所定量を超えた場合に、前記視線位置を校正する視線検出装置。
A line-of-sight detection device that is provided in a vehicle and detects a line-of-sight position that a driver's line of sight reaches, the device comprising:
A line-of-sight detector (11a) for detecting the line-of-sight position of the driver;
A calibration unit (11b) for calibrating the line-of-sight position detected by the line-of-sight detection unit (11a);
A surrounding environment detection unit (11c) for detecting brightness around the vehicle,
wherein the calibration unit (11b) calibrates the line-of-sight position when the amount of change in the brightness around the vehicle exceeds a predetermined amount.
- 請求項1に記載の視線検出装置であって、
前記視線位置を校正した時の前記車両の周辺の明るさを記憶する記憶部(13)を備え、
前記校正部(11b)は、前記周辺環境検出部(11c)で検出された前記車両の周辺の明るさと、前記記憶部(13)に記憶されている明るさとの差が前記所定量より大きくなった場合に、前記視線位置を校正する視線検出装置。
The line-of-sight detection device according to claim 1, further comprising:
A storage unit (13) for storing brightness around the vehicle when the line-of-sight position is calibrated;
wherein the calibration unit (11b) calibrates the line-of-sight position when the difference between the brightness around the vehicle detected by the surrounding environment detection unit (11c) and the brightness stored in the storage unit (13) becomes larger than the predetermined amount.
- 請求項1または請求項2に記載の視線検出装置であって、
前記運転者に対して注視用画像を表示する画像表示部(11d)を備え、
前記校正部(11b)は、前記画像表示部(11d)が前記注視用画像を表示した位置と、前記視線検出部(11a)が検出した前記視線位置とに基づいて、該視線位置を校正する視線検出装置。
The line-of-sight detection device according to claim 1 or 2, further comprising:
An image display unit (11d) for displaying a gaze image for the driver;
wherein the calibration unit (11b) calibrates the line-of-sight position based on the position where the image display unit (11d) displayed the gaze image and the line-of-sight position detected by the line-of-sight detection unit (11a).
- 車両に設けられ、運転者の視線が到達する視線位置を検出する視線検出方法であって、
前記運転者の前記視線位置を検出する工程(S106)と、
前記視線位置を検出する工程によって検出された前記視線位置を校正する工程(S212)と、
前記車両の周辺の明るさを検出する工程(S202)と
を備え、
前記視線位置を校正する工程(S106)は、前記車両の周辺の明るさの変化量が所定量を超えた場合に、前記視線位置を校正する視線検出方法。
A line-of-sight detection method, used in a vehicle, for detecting a line-of-sight position that the driver's line of sight reaches, the method comprising:
Detecting the driver's line-of-sight position (S106);
Calibrating the line-of-sight position detected by the step of detecting the line-of-sight position (S212);
Detecting the brightness around the vehicle (S202),
wherein, in the step of calibrating the line-of-sight position (S106), the line-of-sight position is calibrated when the amount of change in the brightness around the vehicle exceeds a predetermined amount.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-261878 | 2013-12-18 | ||
JP2013261878A JP6322991B2 (en) | 2013-12-18 | 2013-12-18 | Gaze detection device and gaze detection method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015092977A1 (en) | 2015-06-25 |
Family
ID=53402358
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/005903 WO2015092977A1 (en) | 2013-12-18 | 2014-11-26 | Sight line detection device and sight line detection method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6322991B2 (en) |
WO (1) | WO2015092977A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020071773A (en) * | 2018-11-01 | 2020-05-07 | アイシン精機株式会社 | Line-of-sight detection apparatus |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09212082A (en) * | 1996-01-30 | 1997-08-15 | Nissan Motor Co Ltd | Visual line input device |
- 2013-12-18: JP JP2013261878A patent/JP6322991B2/en active Active
- 2014-11-26: WO PCT/JP2014/005903 patent/WO2015092977A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0735543A (en) * | 1993-07-20 | 1995-02-07 | Nissan Motor Co Ltd | Eye point measuring device for vehicle |
US6927694B1 (en) * | 2001-08-20 | 2005-08-09 | Research Foundation Of The University Of Central Florida | Algorithm for monitoring head/eye motion for driver alertness with one camera |
JP2004012607A (en) * | 2002-06-04 | 2004-01-15 | Canon Inc | Optical equipment with function to detect line of sight and camera |
JP2004130940A (en) * | 2002-10-10 | 2004-04-30 | Nissan Motor Co Ltd | Gazing direction detection device |
JP2006025140A (en) * | 2004-07-07 | 2006-01-26 | Denso Corp | Cabin lighting device for vehicle |
JP2009015533A (en) * | 2007-07-03 | 2009-01-22 | Toyota Motor Corp | Gaze direction detection device |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108243332A (en) * | 2016-12-23 | 2018-07-03 | 深圳点石创新科技有限公司 | Vehicle-mounted head-up-display system image adjusting method and vehicle-mounted head-up-display system |
CN108243332B (en) * | 2016-12-23 | 2024-04-12 | 深圳点石创新科技有限公司 | Image adjusting method of vehicle-mounted head-up display system and vehicle-mounted head-up display system |
US10755671B2 (en) | 2017-12-08 | 2020-08-25 | Topcon Corporation | Device, method, and program for controlling displaying of survey image |
Also Published As
Publication number | Publication date |
---|---|
JP6322991B2 (en) | 2018-05-16 |
JP2015118579A (en) | 2015-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10795166B2 (en) | Head up display system and control method thereof | |
US10510276B1 (en) | Apparatus and method for controlling a display of a vehicle | |
CN102224443B (en) | Vehicle display device and display method | |
US10308283B2 (en) | Parking assist apparatus | |
JP5817843B2 (en) | Vehicle information transmission device | |
US9463743B2 (en) | Vehicle information display device and vehicle information display method | |
JP5092776B2 (en) | Gaze direction detection device and gaze direction detection method | |
TWI522257B (en) | Vehicle safety system and operating method thereof | |
US10706585B2 (en) | Eyeball information estimation device, eyeball information estimation method, and eyeball information estimation program | |
JP2010018201A (en) | Driver assistant device, driver assistant method, and driver assistant processing program | |
JP2016055801A (en) | On-vehicle display device | |
US10306154B2 (en) | Image display device | |
CN105711511A (en) | Method for operating a head-up display, presentation apparatus, vehicle | |
US9623819B2 (en) | Method for controlling a lighting brightness of a lit motor vehicle instrument as well as a motor vehicle with at least one dimmably lit motor vehicle instrument | |
JP7268526B2 (en) | VEHICLE DISPLAY CONTROL DEVICE AND VEHICLE DISPLAY SYSTEM | |
KR20200046613A (en) | Apparatus for calibrating camera monitoring driver and method thereof | |
US10997861B2 (en) | Luminance control device, luminance control system, and luminance control method | |
CN107010077A (en) | Method and the driver assistance system of adaptability for the driver that transmits information to motor vehicle | |
CN116761745A (en) | Camera mirror system display camera calibration | |
WO2015092977A1 (en) | Sight line detection device and sight line detection method | |
US10222613B2 (en) | Display apparatus for a vehicle | |
JP2009248812A (en) | Driving assistance device | |
JP2018022958A (en) | Vehicle display controller and vehicle monitor system | |
US20190156133A1 (en) | Vehicle driving support apparatus and vehicle driving support program | |
KR101825450B1 (en) | Vehicle control apparatus and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14872564; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 14872564; Country of ref document: EP; Kind code of ref document: A1 |