WO2012052061A1 - Method and system for calibrating a gaze detector system
- Publication number
- WO2012052061A1 (PCT/EP2010/065928)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- calibration
- gaze
- point
- calibrating
- user
- Prior art date
- 2010-10-22
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
Abstract
A method for calibrating a gaze detector is disclosed, comprising the steps of: a) starting a calibration phase of said gaze detector; b) detecting a measured gaze point (204) observed by a user (2) by means of said gaze detector, preferably by capturing (203) at least an image (4) of at least an eye of said user (2); c) comparing said measured gaze point (204) with a calibration point (206, 3), said calibration point (206, 3) being a point that the user is assumed to observe during gaze detection at step b); d) calibrating (212) said gaze detector (205) based on the comparison of step c). The calibration phase is automatically started as a calibration event (201) is detected by sensor means (3, 7), and the calibration point (206, 3) is determined based on the calibration event (201) and on the type (202) of calibration event (201). A corresponding system for gaze detection is also disclosed.
Description
METHOD AND SYSTEM FOR CALIBRATING A GAZE DETECTOR SYSTEM
DESCRIPTION

Technical Field
The present invention relates to a gaze detector, and particularly to a method for calibrating a gaze detector. In particular, the invention relates to a method for calibrating a gaze detector system according to the preamble of claim 1.

Background art
Gaze detectors are devices designed for recognizing the direction of the gaze of a human being, and thus the point he is looking at. Such gaze detectors are useful for several applications, especially belonging to human-machine interaction.
In particular, such devices are used to allow a physically impaired person to operate electronic equipment (e.g. computers). Moreover, gaze detectors can also be used for commercial purposes, e.g. for the precise identification of the advertisement being looked at by a user on a display. Gaze detectors are also frequently used in the automotive field, e.g. to monitor the degree of attention of a driver, and in particular to detect whether the driver is looking at the road or not.
Several implementations of gaze detectors are known today. Solutions based on the processing of images of the user are particularly advantageous, since they do not require the user to wear dedicated sensors. Gaze detectors based on image processing use one or more cameras (usually two) and one or more light sources (e.g. infrared LEDs and/or visible light) for illumination. The illuminating light is reflected towards the cameras by the user. These cameras provide images of the user's eyes, and by detecting the head position, the face position and the pupil position, the gaze detector can calculate the user's gaze direction. In certain applications, the pupil glint can also be considered as a factor.
In order to function properly, gaze detectors need an appropriate calibration to provide measurements with adequate accuracy.
Accuracy is inversely related to the observational error of the measure, defined as the spatial distance separating the point estimated by the gaze detector and the point the user is actually looking at.
In order to use gaze detectors for interacting with a computer graphical interface, e.g. for the selection of buttons and icons on a display, the calibration must be particularly accurate.
Also, a specific calibration is needed, to some extent, every time a different user uses the system or the environmental and illumination conditions undergo a significant change.
As disclosed by Patent Application EP1209553A1, for the calibration of a gaze detector the user looks at a predetermined point (i.e. generated and known by the detector) for a certain number of seconds. The gaze detector establishes a correlation between the gaze direction derived from the images of the user and the point the user is actually looking at. By repeating these steps for several predetermined points, calibration can be achieved.
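As an illustration of this classic procedure, the following minimal sketch (an assumption for clarity; EP1209553A1 does not prescribe any particular model, and the sample numbers are invented) fits an affine correction from the raw detector output to screen coordinates using several predetermined points:

```python
import numpy as np

# Predetermined on-screen targets the user is asked to look at (pixels).
targets = np.array([[100, 100], [860, 100], [480, 300],
                    [100, 500], [860, 500]], dtype=float)
# Raw gaze estimates produced by the uncalibrated detector at those instants.
raw = np.array([[130, 80], [820, 95], [470, 310],
                [140, 520], [835, 505]], dtype=float)

# Least-squares fit of an affine map: screen ~= [x, y, 1] @ params.
design = np.hstack([raw, np.ones((len(raw), 1))])
params, *_ = np.linalg.lstsq(design, targets, rcond=None)

def calibrated(x, y):
    """Apply the fitted affine correction to a raw gaze point."""
    return np.array([x, y, 1.0]) @ params

print(calibrated(470, 310))  # lands near the true target (480, 300)
```

With more target points the fit improves, at the cost of a longer procedure for the user.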
It is clear that the more predetermined points the user is required to look at, the better the calibration and the system accuracy.
Summary of the invention
Known calibration solutions suffer from a drawback: to be accurate, the calibration procedure can become cumbersome for the user. For example, before being able to use a computer application featuring gaze detection at start-up, the user must undergo a lengthy calibration process. The calibration process can require a long time and therefore become rather impractical and boring for the user.
It is an object of the present invention to overcome some of the above drawbacks of known gaze detectors.
In particular, it is an object of the present invention to provide a method and a system for calibrating a gaze detector that is easy to use and convenient for the user. It is also an object of the present invention to provide a method and a system for calibrating a gaze detector which improve the accuracy of the detector.
These and further objects of the present invention are achieved by a method and a system for calibrating a gaze detector incorporating the features of the annexed claims, which form an integral part of the present description. The method for calibrating a gaze detector comprises the steps of: starting a calibration phase; detecting a measured gaze point observed by a user, preferably by capturing images of the user; comparing the measured gaze point with a calibration point that the user is assumed to observe during the gaze detection; and calibrating the gaze detector based on the comparison. According to the method, the calibration phase is automatically started as a calibration event is detected by sensor means; the calibration point is also determined based on the calibration event and its type.
According to another aspect of the invention, the method comprises the further steps of determining a quality value associated with the probability of accuracy of the calibration point and weighting the calibration based on the quality value.
Preferably, according to the method, different preferred events can be interpreted as calibration events. Advantageously, such events can belong to categories of tasks spontaneously performed by the user during the use of a system. Therefore, the method of calibrating a gaze detector does not require a dedicated time for calibration, in which the user is completely absorbed by the calibration procedure. The calibration procedure thus results in an easier and lighter experience for the user.
Preferably, the method of calibration takes into account the nature of the calibration event triggering the calibration, since different events can have a different reliability for calibration purposes.
This feature is implemented by calculating a quality value for each specific calibration event, and by weighting the calibration based on this quality value. The quality value can be a function of the measured gaze point, of the calibration point and of a measure confidence interval, whenever the calibration process is repeated several times.
Advantageously, calibration events with a low quality value, caused by bad readings, can be discarded during the calibration process.
Brief description of the drawings
Further features and advantages of the present invention will become apparent in the detailed description of a preferred, non-exclusive, embodiment of a method and a system for calibrating a gaze detector according to the invention, which is described as a non-limiting example with the help of the annexed drawings, wherein:
Figures 1a, 1b, 1c and 1d show various stages of the functioning of the system according to the invention;
Figure 2 is a flow chart of a method according to the invention. These drawings illustrate different aspects and embodiments of the present invention and, where appropriate, like structures, components, materials and/or elements in different figures are indicated by the same reference numbers.
Detailed description of embodiments
Figures 1a through 1d represent an example of the system for calibrating a gaze detector, whose functioning is described in the following. The system implements the method according to the present invention.

The system of the example of Figures 1a to 1d preferably comprises:
- imaging means 1 to acquire videos or images of the user 2, while he is performing tasks;
- at least one computational unit 6 adapted to measure a gaze point by analysis of the acquired images;
- means 7, 3 to detect at least one calibration event;
- means 7, 3 to detect at least one calibration point.
These preferred elements will be thoroughly described in the following; for the sake of simplicity, they are represented schematically in the Figures.
Preferably, the imaging means to acquire images of the user 2 comprise at least one camera 1, pointed in such a direction that the face of the user 2 can be viewed at all times.
The camera 1 can advantageously be sensitive to infrared and/or ultraviolet light in addition to visible light, so that its imaging performance can be improved. For this purpose, special illumination of the user 2 can be provided (not represented in the Figures) so that his features can be more easily detected by the camera 1.
Such imaging means 1, which are commercially available, can be miniaturized and located wherever needed (in the frames of appliances, inside a car, on a wall, ...).
The camera 1 is connected to a computational unit 6 comprising means to analyze the images and measure the gaze point which the user is looking at. This measurement can be performed with different degrees of accuracy, depending on the calibration. The present invention provides means for improving such calibration.
As shown in Figure 1a, the user 2 is standing in front of the camera 1, performing an activity. This activity can be anything that might require, at a certain time, determining the user's gaze direction. Examples of this activity are interacting with a display using a mouse, driving, surfing the web, performing an ATM cash withdrawal, selecting items on a touchscreen, and so on.
At the time of Figure 1a, the user 2 is looking at a point "X"; the direction of his gaze at this time may or may not be detected by the system, depending on the particular task the user 2 is performing.
As shown in Figure 1b, at a certain instant a "calibration event" occurs and is detected by the system. Such a calibration event is an event belonging to a predetermined set of events (or category) that are used for calibrating the system 1. Such events can be initiated by the user performing a particular action, or can correspond to external inputs presented to the user and recognized by the system.
In the example of Figure 1b, the simplest example of a task is represented: the user 2 presses a button 3. It has to be noted that, according to the invention, the pressing of the button 3 can happen independently of the calibration method; in other words, the user 2 may decide spontaneously to press button 3, or may be prompted to do so. Moreover, the pressing of the button 3 can result in effects other than or additional to the calibration of the system, e.g. turning on a light, playing music, or initiating a computer procedure.
In fact, according to the present invention, the user may regard such "calibration events", depending on the situation, as something completely unrelated to the calibration process, since they are natural tasks that he would perform anyway. Advantageously, these natural tasks are used according to the present invention to provide a system calibration.
In fact, at the instant in which the user presses the button 3, it is reasonable to assume that he is actually looking at button 3, as depicted. The system, in response to the calibration event (i.e. pressing of the button 3), can identify a calibration point (i.e. the button 3 itself).
As shown in Figure 1c, at the instant the calibration event is detected by system 1, an image 4 of the gaze of the user 2 is acquired ("Click!"). Since the actual point looked at by the user 2 (i.e. the button 3) is known with sufficient accuracy, this image 4 can be used by the computational unit 6 for calibration purposes.
As shown in Figure 1c, the acquired image 4, the calibration point 3 and the kind of calibration event are passed to a computational unit 6 adapted to perform the calibration method 5, which will be described in more depth in the following and is here schematized as revolving gears.
Another example of a calibration event, useful for indirect calibration of the system as explained, can be the use of a GUI (Graphical User Interface) of a notebook. In this case, the user utilizes a mouse to point at and click icons shown on a display; when the user clicks on a particular icon, it is likely that he is looking exactly at the pointer of the mouse. Therefore, the mouse click represents a calibration event, and the position of the mouse pointer at the instant of the click represents the calibration point. Another example of a calibration event is the pressing of a certain combination of keys on a computer keyboard, such as the command CTRL+ALT+DEL. In people who write from left to right, the execution of this command is likely to cause the gaze to move from the bottom left corner of the keyboard (where the CTRL key is) rightwards towards the DEL key. It is clear that this calibration event can be less reliable than the former, and the method can take this difference into account, as will be described in the following.
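A minimal sketch of harvesting mouse clicks as calibration events is given below. It uses the pynput library as an assumption (the patent names no particular input-capture mechanism); a real system would pair each click with a simultaneously captured eye image.

```python
from pynput import mouse

calibration_points = []  # screen positions harvested from spontaneous clicks

def on_click(x, y, button, pressed):
    if pressed:
        # At the instant of the click the user is likely fixating the
        # pointer, so (x, y) serves as the calibration point (206).
        calibration_points.append((x, y))

listener = mouse.Listener(on_click=on_click)
listener.start()  # collects samples in the background while the user works
```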
In another embodiment, the invention can be implemented in a car. While driving, the driver often stares at the rear-view mirror or at each of the side view mirrors. These actions are rather frequent, and can be selected as calibration events, knowing the exact position of the calibration points on the respective mirrors.
Moreover, the external sensors of the car (if available) can be employed to find external calibration points, such as traffic lights. In this case, a calibration event is the switching to the green light, and the green lamp of the traffic light is the actual point looked at by the driver. In addition, further calibration events can be determined with the aid of a GPS navigation system, frequently installed in vehicles, by providing the gaze detector with information about the surroundings and the movement of the car.

Other calibration events for the invention implemented in a car can be the use of radio or air conditioning controls.
It is clear that system 1 may comprise several different means to detect calibration events. The calibration event detecting means trigger the computational unit via connecting means 7 when a specific event belonging to a predetermined category happens. The computational unit 6 recognizes the origin of the trigger signal and acquires information relative to the calibration event. Such information can be the time of the calibration event, the nature of the calibration event (i.e. the category or subcategory of the event), the referenced calibration point and information related to the user's measured gaze. These pieces of information can be retained and stored (fully or partially, in volatile or non-volatile form) in a memory of computational unit 6, in order to improve subsequent readings of other calibration events.
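A possible record for such stored calibration events is sketched below; the field names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CalibrationEvent:
    timestamp: float                        # time of the calibration event
    category: str                           # nature, e.g. "mouse_click"
    calibration_point: Tuple[float, float]  # point the user is assumed to watch
    measured_gaze: Optional[Tuple[float, float]] = None  # detector output

# Retained in memory to improve subsequent readings of other events.
event_log: list[CalibrationEvent] = []
```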
The gaze detector can determine the gaze direction of the user using different techniques, more or less invasive, featuring contact or non-contact sensors; all known gaze measuring techniques can be applied.
In a preferred embodiment, the system 1 is based on video or image analysis that outputs the position of the measured gaze point of the user as a function of time. As an example, the gaze measuring means can be based on:
- determination of the face position in the image;
- determination of the face tilt;
- determination of the eye position;
- determination of the pupil centre;
- determination of the pupil contour by a border detection algorithm based on the Laplace operator;
- determination of the pupil radius;
- determination of the iris contour by a border detection algorithm preferably based on the Laplace operator;
- determination of the iris radius;
- identification of pupil glints.
These and other measurements can be used in order to estimate the user's gaze direction.
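As a rough sketch of the Laplace-operator border detection listed above (an OpenCV-based assumption; the patent does not specify an implementation), the pupil contour could be located as follows:

```python
import cv2
import numpy as np

def pupil_contour(eye_gray: np.ndarray):
    """Locate the pupil contour in a grayscale eye image (sketch)."""
    blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)    # suppress sensor noise
    edges = cv2.Laplacian(blurred, cv2.CV_64F)         # second-derivative edges
    binary = (np.absolute(edges) > 15).astype(np.uint8) * 255
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Approximate the pupil as the most circular contour: the one that best
    # fills its own minimum enclosing circle.
    best = max(contours, key=lambda c: cv2.contourArea(c) /
               (np.pi * cv2.minEnclosingCircle(c)[1] ** 2 + 1e-9))
    (cx, cy), r = cv2.minEnclosingCircle(best)
    return (cx, cy), r  # pupil centre and radius
```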
Figure 2 shows a flowchart illustrating other aspects of the invention, describing the calibration method in detail. While the system is active, a calibration event 201 can be detected when an appropriate signal is triggered at the instant N, as previously described. The detection of the calibration event sets into motion an instance of the calibration of the gaze detector. This method can efficiently be performed while the gaze detector is active and being utilized by the user, or when the gaze detector is active even though the user is not currently using it directly.
When the instance N of the calibration event is detected, the system is able to identify the type 202 of calibration event, in order to properly respond to it. A calibration event belongs to a predetermined category as previously described, e.g. "click of mouse" or "press of a button", and so on.
While the calibration event category is predetermined, the calibration event itself may or may not be predetermined by the system, i.e. the time at which the event occurs may not be forecast by the system, being a consequence of the user's own initiative.
Simultaneously with the event type 202 recognition, the gaze detecting system acquires the necessary data for estimating the user's gaze direction. In the preferred embodiment, the system acquires one or more images 203 or videos of the user immediately after the calibration event 201, from one or more points of view. Other embodiments may consider a plurality of images from a plurality of points of view, belonging to different instants of time, i.e. several images while the user is moving his gaze along a pattern (e.g. as described above with reference to the pressing of CTRL+ALT+DEL on a keyboard). With the images 203, the system is therefore capable of estimating a measured gaze point 204 of the user. In fact, according to known techniques, it is possible to correlate the acquired data (i.e. the image 203) with the gaze point of the user. For this purpose, when the gaze detector 205 is adequately calibrated, the actual point looked at by the user can be identified. When the calibration is unreliable or can be further improved, the calibration method according to the invention is used.
In case of a calibration event 201, after its type 202 is identified, a calibration point 206 can be identified with more or less accuracy. As exemplified above, when the user clicks with a mouse, he is probably looking at the pointer on the screen. When he presses a point on a touchscreen, he is probably looking at the point itself. Therefore, one or more calibration points 206 can be identified, corresponding to a particular calibration event 201. These points 206 can advantageously be used for calibrating the system, according to the invention.
As pointed out above, the calibration point is, generally, a point that the user is probably looking at. Even in known solutions, there is no guarantee that the user is actually looking at the point as requested. Similarly, in this case, there are types of calibration points 206 that have a higher probability of being correctly identified by the system, and others that might have a smaller probability. For example, a click of a mouse is more likely to provide a reliable calibration point than the use of video objects moving on a display or physical objects moving outside of a car.
In order to assess this difference, a value related to the reliability of a particular calibration event is determined. In a preferred embodiment, a value 207 corresponding to the reliability of the event type is determined; preferably, this value can be a probability value between 0 and 1.
If a calibration event is suitable for providing a good calibration point, a higher value 207 will be associated with it. These criteria can be predetermined and embodied in the system, varying widely depending on the sought application.

According to a preferred embodiment, at any given time, and especially during a calibration event, an overall quality 208 can be associated with the calibration available to the system. In particular, the gaze detector 205 can have a known calibration status, i.e. a more or less reliable calibration. The procedure for making the overall quality 208 known will be described in the following.
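The predetermined criteria for the event-type reliability value 207 could be embodied as a simple lookup table, as in the sketch below (the categories and numbers are illustrative assumptions; a real system would tune them per application):

```python
# Prior reliability (value 207) per calibration event type, in [0, 1].
EVENT_TYPE_RELIABILITY = {
    "mouse_click": 0.90,        # user very likely fixates the pointer
    "touchscreen_press": 0.90,  # user very likely fixates the touched point
    "button_press": 0.80,
    "mirror_glance": 0.60,      # automotive: rear-view or side mirrors
    "ctrl_alt_del": 0.50,       # gaze path over the keyboard is less certain
    "traffic_light_green": 0.40,
}

def type_probability(category: str) -> float:
    """Return the predetermined reliability 207 for an event category."""
    return EVENT_TYPE_RELIABILITY.get(category, 0.30)  # conservative default
```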
An initial or partial calibration for the gaze detector 205 has to be available in order to measure the gaze point 204. This measure is therefore associated with a confidence interval 209 given at instance N-1 of calibration (i.e. the previous calibration instance); this confidence interval 209 is related to the expected observational error of the measure: the better the calibration, the higher the measure confidence 209.
A measure quality 210 can therefore be determined, to take into account that, even if a reliable calibration event 201 with a high probability 207 is provided, the system could be more or less capable of correctly using this information.
The measured gaze point 204 and the information on the actual position of the calibration point 206 are compared, to obtain the existing difference 211 between them. This is provided mainly for calibration purposes, since the calibration 212 of the gaze detector is based on updating a set of parameters related to the gaze point measure, in order to minimize the error between the measured gaze points and the actual calibration points, when available. The updated set of parameters is then passed to the gaze detector 205 to be used for subsequent measurements.
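Step 212 can be pictured as a weighted refit of the correction parameters over all stored samples, each weighted by its event quality 213, as in this sketch (again an affine model, an assumption consistent with the earlier example):

```python
import numpy as np

def recalibrate(raw_points, calibration_points, weights):
    """Weighted least-squares update of the affine correction (step 212).

    Minimizes the weighted error between corrected gaze points and the
    calibration points; the result is handed back to the detector (205).
    """
    X = np.hstack([np.asarray(raw_points, dtype=float),
                   np.ones((len(raw_points), 1))])
    Y = np.asarray(calibration_points, dtype=float)
    w = np.sqrt(np.asarray(weights, dtype=float))[:, None]  # row weights
    params, *_ = np.linalg.lstsq(w * X, w * Y, rcond=None)
    return params  # 3x2 matrix: screen ~= [x, y, 1] @ params
```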
In a preferred embodiment, the difference 211 can be employed to increase the precision of the measure quality 210, in order to discard calibration events whose readings may not be advantageous for the system calibration.
In a preferred embodiment, the measure quality 210 is obtained as a function of the difference 211 between the measured gaze point and the calibration point, and of the value of the measure confidence 209. More particularly, the measure quality 210 can be determined as a statistical correlation between the variance of the measure related to the measure confidence 209 and the difference 211. If the distance 211 is much smaller than its variance, the measure quality 210 value will be high; instead, if the distance 211 is much larger than its variance, the measure quality 210 of event N will be low.
In a preferred embodiment, the measure quality is expressed as a probability having a value between 0 and 1. The event's type probability 207 and the measure quality 210 are combined to determine the calibration event quality 213. Preferably, the event's type probability 207 and the measure quality 210 are simply multiplied together.
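One concrete choice (an assumption; the patent only requires that the quality grow as the difference 211 shrinks relative to the expected variance) is a Gaussian falloff for the measure quality 210, multiplied by the type probability 207 to give the event quality 213:

```python
import math

def measure_quality(difference: float, sigma: float) -> float:
    """Quality 210: near 1 when the error 211 is small relative to the
    standard deviation sigma implied by the measure confidence 209,
    near 0 when it is much larger."""
    return math.exp(-0.5 * (difference / sigma) ** 2)

def event_quality(type_prob: float, difference: float, sigma: float) -> float:
    """Quality 213: product of the type probability 207 and quality 210."""
    return type_prob * measure_quality(difference, sigma)
```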
The information on the calibration event quality 213 can be used to improve the calibration process. For example, the calibration of the parameters of the gaze detector can be weighted to give more importance to the measurements obtained from calibration events with good quality 213 (e.g. a probability above 0.8).

Moreover, the calibration event quality 213 at instance N can contribute to determining the overall quality of calibration 208, the latter being, for instance, the average of the qualities of all the calibration events 1,...,N considered.

For example, at the beginning of the calibration process, calibration events 201 having qualities 213 in a wide range (e.g. from 0.4 to 1.0) can be used for calibration, while after a certain overall calibration quality 208 has been achieved (e.g. above 0.9), all events with a quality 213 lower than the overall quality 208 can be discarded, so that the calibration can be steadily improved.
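This gating policy can be sketched as follows (the thresholds 0.4 and 0.9 are taken from the example above; the running-average overall quality 208 follows the description):

```python
qualities: list[float] = []  # qualities 213 of accepted events 1, ..., N

def accept_event(quality: float) -> bool:
    """Decide whether a calibration event should contribute to step 212."""
    overall = sum(qualities) / len(qualities) if qualities else 0.0  # 208
    # Early on, accept a wide range of events; once the overall quality
    # exceeds 0.9, discard anything below the current overall quality.
    threshold = overall if overall > 0.9 else 0.4
    if quality < threshold:
        return False
    qualities.append(quality)
    return True
```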
The present method collects calibration points for the calibration process automatically, without requiring any conscious contribution from the user. It is clear that the method allows the gaze detector to be used in a straightforward manner, without wasting time on an explicit calibration process.
Moreover, the method assesses the quality of the detected calibration events, in order to avoid events providing inaccurate data that might damage the calibration process.
Being iterative, the method continuously improves the calibration accuracy, as more and more calibration events are detected over time.
The method described above and the system implementing it provide an automatic and iterative calibration process for a gaze detector.
The present invention can find applications for the interaction of a user with graphical interfaces, such as computers or interactive displays in general.
For example, the invention is particularly useful whenever the user can interact with the interface using both his gaze and other input means (such as mouse, touchpad, touchscreen, keyboard, special devices for impaired users, and so on) so as to provide a plurality of possible calibration events.
Also, the present invention can find other applications in the automotive field, in which the gaze of a driver is monitored for safety or control purposes.
The system and the method according to the invention are susceptible of a number of changes and variants, within the inventive concept as defined by the appended claims. All the details can be replaced by other technically equivalent parts without departing from the scope of the present invention.
While the system and the method have been described with particular reference to a particular embodiment and the accompanying figures, the details given in the description are meant to provide a better intelligibility of the invention and shall not be intended to limit the claimed scope in any manner.
Claims
1. Method for calibrating a gaze detector, comprising the steps of:
a) starting a calibration phase of said gaze detector;
b) detecting a measured gaze point (204) observed by a user (2) by means of said gaze detector, preferably by capturing (203) at least an image (4) of at least an eye of said user (2);
c) comparing said measured gaze point (204) with a calibration point (206, 3), said calibration point (206, 3) being a point that the user is assumed to observe during gaze detection at step b);
d) calibrating (212) said gaze detector (205) based on the comparison of step c); the method being characterized in that said calibration phase is automatically started as a calibration event (201) is detected by sensor means (3, 7) and in that said calibration point (206, 3) is determined based on said calibration event (201) and on the type (202) of said calibration event (201).
2. Method for calibrating a gaze detector according to Claim 1, further comprising the steps of:
- determining a quality value (213) associated to the accuracy of said calibration event (201);
- weighting said calibration (212) based on said quality value (213).
3. Method for calibrating a gaze detector according to Claim 2, wherein said quality value (213) is determined as a function (207) of said type (202) of calibration event (201), in particular as a probability (207) of accuracy of said calibration point (206).
4. Method for calibrating a gaze detector according to Claim 3, wherein said quality value (213) is further determined as a function (210) of the accuracy of said measured gaze point (204), in particular as a probability (210) of accuracy of said measured gaze point (204).
5. Method for calibrating a gaze detector according to Claim 4, wherein said function (210) depends on the measure confidence (209) of said measured gaze point (204) related to the calibration accuracy of said gaze detector (205).
6. Method for calibrating a gaze detector according to Claim 5, wherein said measure confidence is based on said difference (211) between said measured gaze point (204) and said calibration point (206) and its variance.
7. Method for calibrating a gaze detector according to any one of Claims 5 or 6, wherein said measure confidence (209) is further a function of an overall quality (208) comprising said quality value (213) for one or more instances of said calibration method of said gaze detector (205) in an iterative process, comprising several calibration phases.
8. Method for calibrating a gaze detector according to any one of Claims 4 to 7, wherein said quality value (213) is determined as a probability, by multiplying said probability (210) of said accuracy of said measured gaze point (204) times said probability (207) of accuracy of said calibration point (206).
9. Method for calibrating a gaze detector according to any one of Claims 1 to 8, wherein said calibration point (206) is determined by sensor means (3, 7) as a function of the time of said calibration event (201) and as a function of spatial measurements related to said detected calibration event (201), in particular said function not being predetermined before the beginning of said calibration phase a).
10. Method for calibrating a gaze detector according to any one of Claims 1 to 9, wherein said type (202) of said calibration event (201) belongs to a predetermined group of types of events.
11. Method for calibrating a gaze detector according to any one of Claims 2 to 6, wherein a calibration event having said quality value (213) below a threshold is discarded and said calibration phase a) interrupted.
12. Method for calibrating a gaze detector according to Claim 11, wherein said threshold is a function of an overall quality (208) comprising said quality value (213) for one or more instances of said calibration method of said gaze detector (205) in an iterative process comprising several calibration phases.
13. System for gaze detection, comprising:
- sensor means (1) adapted to detect information on the gaze of a user (2);
- at least one computational unit (6) adapted to measure a gaze point by analysis of said detected information;
- means (7) to detect at least one calibration event;
- means (3) to detect at least one calibration point;
and characterized in that said system is adapted to implement the method for calibrating a gaze detector according to any one of Claims 1 to 12.
14. System for gaze detection according to Claim 13, wherein said sensor means comprise at least one camera (1) adapted to acquire video or images of the user (2) while he is performing tasks.
15. System for gaze detection according to Claim 13 or 14, wherein said means (7,3) to detect at least one calibration event (7) comprise a mouse or a keyboard or another input device of a computer, and said means to detect at least one calibration point (3) comprise a display and a graphical processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2010/065928 WO2012052061A1 (en) | 2010-10-22 | 2010-10-22 | Method and system for calibrating a gaze detector system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012052061A1 (en) | 2012-04-26 |
Family
ID=44201112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2010/065928 WO2012052061A1 (en) | 2010-10-22 | 2010-10-22 | Method and system for calibrating a gaze detector system |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2012052061A1 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014125380A3 (en) * | 2013-02-14 | 2014-11-06 | The Eye Tribe Aps | Systems and methods of eye tracking calibration |
WO2014197408A1 (en) * | 2013-06-06 | 2014-12-11 | Microsoft Corporation | Calibrating eye tracking system by touch input |
US20150131850A1 (en) * | 2013-11-12 | 2015-05-14 | Fuji Xerox Co., Ltd. | Identifying user activities using eye tracking data, mouse events, and keystrokes |
DE102013019117A1 (en) | 2013-11-15 | 2015-05-21 | Audi Ag | Method for calibrating a viewing direction detection device for a motor vehicle, calibration device and motor vehicle |
EP2940555A1 (en) * | 2014-04-22 | 2015-11-04 | Lenovo (Singapore) Pte. Ltd. | Automatic gaze calibration |
WO2016146488A1 (en) * | 2015-03-13 | 2016-09-22 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method for automatically identifying at least one user of an eye tracking device and eye tracking device |
CN106462733A (en) * | 2014-05-19 | 2017-02-22 | 微软技术许可有限责任公司 | Gaze detection calibration |
CN106662917A (en) * | 2014-04-11 | 2017-05-10 | 眼球控制技术有限公司 | Systems and methods of eye tracking calibration |
US9727135B2 (en) | 2014-04-30 | 2017-08-08 | Microsoft Technology Licensing, Llc | Gaze calibration |
CN107407977A (en) * | 2015-03-05 | 2017-11-28 | 索尼公司 | Message processing device, control method and program |
DE102016210288A1 (en) * | 2016-06-10 | 2017-12-14 | Volkswagen Aktiengesellschaft | Eyetracker unit operating device and method for calibrating an eyetracker unit of an operating device |
US9851791B2 (en) | 2014-11-14 | 2017-12-26 | Facebook, Inc. | Dynamic eye tracking calibration |
US9961258B2 (en) | 2015-02-23 | 2018-05-01 | Facebook, Inc. | Illumination system synchronized with image sensor |
US10067561B2 (en) | 2014-09-22 | 2018-09-04 | Facebook, Inc. | Display visibility based on eye convergence |
WO2019014756A1 (en) * | 2017-07-17 | 2019-01-24 | Thalmic Labs Inc. | Dynamic calibration systems and methods for wearable heads-up displays |
CN110502099A (en) * | 2018-05-16 | 2019-11-26 | 托比股份公司 | Reliably detect the associated method between watching attentively and stimulating |
CN112148112A (en) * | 2019-06-27 | 2020-12-29 | 北京七鑫易维科技有限公司 | Calibration method and device, nonvolatile storage medium and processor |
US11194161B2 (en) | 2018-02-09 | 2021-12-07 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11393251B2 (en) | 2018-02-09 | 2022-07-19 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11537202B2 (en) | 2019-01-16 | 2022-12-27 | Pupil Labs Gmbh | Methods for generating calibration data for head-wearable devices and eye tracking system |
US11556741B2 (en) | 2018-02-09 | 2023-01-17 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters using a neural network |
US11676422B2 (en) | 2019-06-05 | 2023-06-13 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
CN118860141A (en) * | 2024-07-01 | 2024-10-29 | 成都集思鸣智科技有限公司 | Eye tracking calibration method, device, system and storage medium |
US12140771B2 (en) | 2020-02-19 | 2024-11-12 | Pupil Labs Gmbh | Eye tracking module and head-wearable device |
- 2010-10-22: PCT application PCT/EP2010/065928 filed (published as WO2012052061A1); status: active, Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1209553A1 (en) | 1998-02-20 | 2002-05-29 | Thomas E. Hutchinson | Eye-gaze direction detector |
EP1906296A2 (en) * | 2006-09-27 | 2008-04-02 | Malvern Scientific Solutions Limited | Method of employing a gaze direction tracking system for control of a computer |
WO2008141460A1 (en) * | 2007-05-23 | 2008-11-27 | The University Of British Columbia | Methods and apparatus for estimating point-of-gaze in three dimensions |
WO2010071928A1 (en) * | 2008-12-22 | 2010-07-01 | Seeing Machines Limited | Automatic calibration of a gaze direction algorithm from user behaviour |
Cited By (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014125380A3 (en) * | 2013-02-14 | 2014-11-06 | The Eye Tribe Aps | Systems and methods of eye tracking calibration |
US9791927B2 (en) | 2013-02-14 | 2017-10-17 | Facebook, Inc. | Systems and methods of eye tracking calibration |
US9693684B2 (en) | 2013-02-14 | 2017-07-04 | Facebook, Inc. | Systems and methods of eye tracking calibration |
US9189095B2 (en) | 2013-06-06 | 2015-11-17 | Microsoft Technology Licensing, Llc | Calibrating eye tracking system by touch input |
CN105378595A (en) * | 2013-06-06 | 2016-03-02 | Microsoft Technology Licensing, LLC | Calibrating eye tracking system by touch input
CN105378595B (en) * | 2013-06-06 | 2018-12-07 | Microsoft Technology Licensing, LLC | Method for calibrating an eye tracking system by touch input
WO2014197408A1 (en) * | 2013-06-06 | 2014-12-11 | Microsoft Corporation | Calibrating eye tracking system by touch input |
US9256785B2 (en) * | 2013-11-12 | 2016-02-09 | Fuji Xerox Co., Ltd. | Identifying user activities using eye tracking data, mouse events, and keystrokes |
US20150131850A1 (en) * | 2013-11-12 | 2015-05-14 | Fuji Xerox Co., Ltd. | Identifying user activities using eye tracking data, mouse events, and keystrokes |
DE102013019117A1 (en) | 2013-11-15 | 2015-05-21 | Audi Ag | Method for calibrating a viewing direction detection device for a motor vehicle, calibration device and motor vehicle |
US9785233B2 (en) | 2014-04-11 | 2017-10-10 | Facebook, Inc. | Systems and methods of eye tracking calibration |
CN106662917A (en) * | 2014-04-11 | 2017-05-10 | 眼球控制技术有限公司 | Systems and methods of eye tracking calibration |
EP2940555A1 (en) * | 2014-04-22 | 2015-11-04 | Lenovo (Singapore) Pte. Ltd. | Automatic gaze calibration |
US9727135B2 (en) | 2014-04-30 | 2017-08-08 | Microsoft Technology Licensing, Llc | Gaze calibration |
US9727136B2 (en) * | 2014-05-19 | 2017-08-08 | Microsoft Technology Licensing, Llc | Gaze detection calibration |
CN106462733A (en) * | 2014-05-19 | 2017-02-22 | Microsoft Technology Licensing, LLC | Gaze detection calibration
US20170336867A1 (en) * | 2014-05-19 | 2017-11-23 | Microsoft Technology Licensing, Llc | Gaze detection calibration |
CN106462733B (en) * | 2014-05-19 | 2019-09-20 | Microsoft Technology Licensing, LLC | Method and computing device for gaze detection calibration
US10248199B2 (en) | 2014-05-19 | 2019-04-02 | Microsoft Technology Licensing, Llc | Gaze detection calibration |
CN110569750A (en) * | 2014-05-19 | 2019-12-13 | Microsoft Technology Licensing, LLC | Method and computing device for gaze detection calibration
US10067561B2 (en) | 2014-09-22 | 2018-09-04 | Facebook, Inc. | Display visibility based on eye convergence |
US9851791B2 (en) | 2014-11-14 | 2017-12-26 | Facebook, Inc. | Dynamic eye tracking calibration |
US10013056B2 (en) | 2014-11-14 | 2018-07-03 | Facebook, Inc. | Dynamic eye tracking calibration |
US9961258B2 (en) | 2015-02-23 | 2018-05-01 | Facebook, Inc. | Illumination system synchronized with image sensor |
CN107407977A (en) * | 2015-03-05 | 2017-11-28 | Sony Corporation | Information processing device, control method, and program
EP3267295A4 (en) * | 2015-03-05 | 2018-10-24 | Sony Corporation | Information processing device, control method, and program |
US11003245B2 (en) | 2015-03-13 | 2021-05-11 | Apple Inc. | Method for automatically identifying at least one user of an eye tracking device and eye tracking device |
WO2016146488A1 (en) * | 2015-03-13 | 2016-09-22 | SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH | Method for automatically identifying at least one user of an eye tracking device and eye tracking device |
US10521012B2 (en) | 2015-03-13 | 2019-12-31 | Apple Inc. | Method for automatically identifying at least one user of an eye tracking device and eye tracking device |
DE102016210288A1 (en) * | 2016-06-10 | 2017-12-14 | Volkswagen Aktiengesellschaft | Eyetracker unit operating device and method for calibrating an eyetracker unit of an operating device |
US10635170B2 (en) | 2016-06-10 | 2020-04-28 | Volkswagen Aktiengesellschaft | Operating device with eye tracker unit and method for calibrating an eye tracker unit of an operating device |
WO2019014756A1 (en) * | 2017-07-17 | 2019-01-24 | Thalmic Labs Inc. | Dynamic calibration systems and methods for wearable heads-up displays |
US11340461B2 (en) | 2018-02-09 | 2022-05-24 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11194161B2 (en) | 2018-02-09 | 2021-12-07 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11393251B2 (en) | 2018-02-09 | 2022-07-19 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US11556741B2 (en) | 2018-02-09 | 2023-01-17 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters using a neural network |
CN110502099A (en) * | 2018-05-16 | 2019-11-26 | Tobii AB | Method for reliably detecting a correlation between gaze and stimulus
CN110502099B (en) * | 2018-05-16 | 2023-06-23 | Tobii AB | Method for reliably detecting a correlation between gaze and stimulus
US11537202B2 (en) | 2019-01-16 | 2022-12-27 | Pupil Labs Gmbh | Methods for generating calibration data for head-wearable devices and eye tracking system |
US11676422B2 (en) | 2019-06-05 | 2023-06-13 | Pupil Labs Gmbh | Devices, systems and methods for predicting gaze-related parameters |
US12154383B2 (en) | 2019-06-05 | 2024-11-26 | Pupil Labs Gmbh | Methods, devices and systems for determining eye parameters |
CN112148112A (en) * | 2019-06-27 | 2020-12-29 | Beijing 7invensun Technology Co., Ltd. | Calibration method and device, nonvolatile storage medium and processor
CN112148112B (en) * | 2019-06-27 | 2024-02-06 | Beijing 7invensun Technology Co., Ltd. | Calibration method and device, nonvolatile storage medium and processor
US12140771B2 (en) | 2020-02-19 | 2024-11-12 | Pupil Labs Gmbh | Eye tracking module and head-wearable device |
CN118860141A (en) * | 2024-07-01 | 2024-10-29 | 成都集思鸣智科技有限公司 | Eye tracking calibration method, device, system and storage medium |
Similar Documents
Publication | Title |
---|---|
WO2012052061A1 (en) | Method and system for calibrating a gaze detector system | |
US10866677B2 (en) | Electronic device equipped with touch panel and update method of base line value | |
US11294470B2 (en) | Human-to-computer natural three-dimensional hand gesture based navigation method | |
US8760432B2 (en) | Finger pointing, gesture based human-machine interface for vehicles | |
US8982046B2 (en) | Automatic calibration of a gaze direction algorithm from user behavior | |
JP6420486B2 (en) | Techniques for distinguishing between intended and unintended gestures on a wearable touch-sensing fabric | |
US9423877B2 (en) | Navigation approaches for multi-dimensional input | |
US9952663B2 (en) | Method for gesture-based operation control | |
US20150301684A1 (en) | Apparatus and method for inputting information | |
EP2795450B1 (en) | User gesture recognition | |
US10019055B2 (en) | Proximity aware content switching user interface | |
US10366281B2 (en) | Gesture identification with natural images | |
TWI708178B (en) | Force-sensitive user input interface for an electronic device | |
US20180299996A1 (en) | Electronic Device Response to Force-Sensitive Interface | |
US10394370B2 (en) | Apparatus and method for recognizing touch input using interpolation | |
KR20140145579A (en) | Classifying the intent of user input | |
WO2014020323A1 (en) | Cursor movement device | |
WO2013115991A1 (en) | Latency measurement | |
TW201415291A (en) | Method and system for gesture identification based on object tracing | |
US20160357301A1 (en) | Method and system for performing an action based on number of hover events | |
US20230125410A1 (en) | Information processing apparatus, image capturing system, method, and non-transitory computer-readable storage medium | |
CN113885730A (en) | False touch prevention method and device | |
CN117608420A (en) | Method, device and system for preventing false touch of folding screen, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10774172; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 10774172; Country of ref document: EP; Kind code of ref document: A1 |