US20190101767A1 - Fresnel assembly for light redirection in eye tracking systems - Google Patents
- Publication number
- US20190101767A1 (application US 15/723,305)
- Authority
- US
- United States
- Prior art keywords
- light
- band
- hmd
- fresnel
- reflective surfaces
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/10—Beam splitting or combining systems
- G02B27/14—Beam splitting or combining systems operating by reflection only
- G02B27/141—Beam splitting or combining systems operating by reflection only using dichroic mirrors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/10—Beam splitting or combining systems
- G02B27/14—Beam splitting or combining systems operating by reflection only
- G02B27/143—Beam splitting or combining systems operating by reflection only using macroscopically faceted or segmented reflective surfaces
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/02—Simple or compound lenses with non-spherical faces
- G02B3/08—Simple or compound lenses with non-spherical faces with discontinuous faces, e.g. Fresnel lens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/011—Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/015—Head-up displays characterised by mechanical features involving arrangement aiming to get less bulky devices
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the present disclosure relates generally to light redirection, and specifically relates to a Fresnel assembly for light redirection in eye tracking systems.
- Eye tracking refers to the process of detecting the direction of a user's gaze, which may comprise detecting an orientation of an eye in 3-dimensional (3D) space. Eye tracking in the context of headsets used in, e.g., virtual reality and/or augmented reality applications can be an important feature.
- Conventional systems commonly use, e.g., a number of infrared light sources to illuminate the eye, and a camera is used to image a reflection of the light sources from the eye.
- eye tracking systems use beam splitters to redirect infrared light reflected from the eye to the camera.
- beam splitters are often large, cumbersome, and unsuitable for HMDs used in augmented reality (AR), mixed reality (MR), and virtual reality (VR) systems.
- a Fresnel assembly transmits (e.g., partially or fully) light in a first band (e.g., visible light) and redirects some or all of the light in a second band (e.g., infrared light) to one or more locations.
- the Fresnel assembly includes a plurality of surfaces that act to redirect light in the second band.
- the plurality of surfaces may be coated with a dichroic material that is transmissive in the first band and reflective in the second band.
- an immersion layer is overmolded onto the Fresnel assembly to form an immersed Fresnel assembly.
- the immersion layer may be index matched to the Fresnel assembly.
- one or more surfaces of the immersed Fresnel assembly may be shaped (e.g., concave, convex, asphere, freeform, etc.) to adjust optical power of the immersed Fresnel assembly.
- the Fresnel assembly may be integrated into a head-mounted display (HMD).
- an HMD includes a display element, an optics block, a Fresnel assembly, an illumination source, and a camera assembly.
- the display element outputs image light in a first band (e.g., visible light) of light through a display surface of the display element.
- the optics block directs light from the display element to an eyebox (a region in space occupied by an eye of the user).
- the Fresnel assembly transmits light in the first band and directs light in a second band (e.g., infrared light) different than the first band to a first position.
- the illumination source (e.g., part of a tracking system) illuminates the eyebox with light in the second band.
- the camera (e.g., part of an eye and/or face tracking system) captures light in the second band that is reflected from the eyebox.
- the Fresnel assembly may be non-wavelength sensitive (e.g., it could be a non-immersed, uncoated surface, or it could have a partially-reflective coating and be immersed). It transmits a portion of the light from the display and directs it to the eyebox. It also directs part of the display light to locations outside of the eyebox.
- the camera is positioned such that the light that comes from the eye is reflected off of the Fresnel assembly and is captured by the camera.
- the HMD includes a controller (e.g., part of the tracking system) that generates tracking information (e.g., gaze location and/or facial expressions).
- FIG. 1 is a diagram of a cross section of a portion of a display block of an HMD (not shown), in accordance with an embodiment.
- FIG. 2 is a diagram of a cross section of a portion of a display block of an HMD (not shown) with a canted Fresnel assembly, in accordance with an embodiment.
- FIG. 3 is a diagram of a cross section of a portion of a display block of an HMD (not shown) with an immersed Fresnel assembly, in accordance with an embodiment.
- FIG. 4 is a diagram of a cross section of a portion of an immersed Fresnel assembly 410 , in accordance with an embodiment.
- FIG. 5 is a diagram of a cross section of a portion of an immersed Fresnel assembly that captures multiple view angles, in accordance with an embodiment.
- FIG. 6 is a diagram of a cross section of a portion of an immersed Fresnel assembly configured to interact with multiple positions, in accordance with an embodiment.
- FIG. 7A is a diagram of an HMD, in one embodiment.
- FIG. 7B is a diagram of a cross-section of the HMD, in one embodiment.
- FIG. 8A is an example array of sub-pixels on an electronic display element, in accordance with an embodiment.
- FIG. 8B is an image of an example array of sub-pixels adjusted by an optical block, in accordance with an embodiment.
- FIG. 1 is a diagram of a cross section 100 of a portion of a display block of an HMD (not shown), in accordance with an embodiment.
- the display block includes a display element 110 , a Fresnel assembly 120 , and a tracking system 130 .
- the display block 100 may also include an optics block 140 .
- the display element 110 displays images to the user.
- the display element 110 may comprise a single electronic display panel or multiple electronic display panels (e.g., a display for each eye of a user).
- Examples of the display element 110 include: a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active-matrix organic light-emitting diode display (AMOLED), a quantum organic light emitting diode (QOLED) display, a quantum light emitting diode (QLED) display, a transparent organic light emitting diode (TOLED) display, some other display, or some combination thereof.
- the display element 110 is a waveguide display. The display element 110 emits content within a first band of light (e.g., in a visible band).
- the Fresnel assembly 120 transmits light in the first band and reflects light within a second band.
- the Fresnel assembly 120 may transmit light within a visible band (e.g., 400-700 nanometers (nm)), and may reflect light within an infrared (IR) band (e.g., above 780 nm).
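The wavelength-selective behavior described above can be sketched in a few lines. This is a toy model, not a coating specification: the band edges below are simply the example values from this disclosure (400-700 nm visible, above 780 nm IR), and real dichroic coatings have gradual transition regions.

```python
def route_light(wavelength_nm):
    """Toy model of the dichroic-coated Fresnel assembly 120: transmit
    the visible (first) band, reflect the infrared (second) band."""
    if 400 <= wavelength_nm <= 700:
        return "transmit"   # display light passes toward the eyebox
    if wavelength_nm > 780:
        return "reflect"    # tracking light is redirected to the camera
    return "transition"     # between bands: behavior is coating-dependent

print(route_light(550))  # green display light
print(route_light(850))  # IR tracking light
```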
- the Fresnel assembly 120 comprises a first surface that includes a plurality of reflective surfaces that form a Fresnel lens; a portion 145 of the plurality of reflective surfaces is illustrated in FIG. 1 .
- the plurality of reflective surfaces are coated with a dichroic film that is transmissive in the first band and reflective in the second band.
- the plurality of the reflective surfaces are coated with a partial reflective coating.
- the plurality of reflective surfaces can be configured to reflect light in the second band to different positions.
- the Fresnel assembly 120 can partially transmit light, and partially reflect light, for any light band of interest. For example, without coating the Fresnel assembly 120 can weakly reflect light by Fresnel reflection, and transmit the rest of the light. In contrast, with coatings, the transmission/reflection ratio can be adjusted. Alternatively, the Fresnel assembly 120 can transmit light with one polarization, and reflect light with an orthogonal polarization.
- the Fresnel assembly 120 is coupled directly to a display surface of the display element.
- the Fresnel assembly 120 is at least partially transmissive to light in the first band.
- the Fresnel assembly 120 may increase or decrease the optical power of light in the first band from the display element 110 . Additionally, for light in the first band, the Fresnel assembly 120 may reduce fixed pattern noise (i.e., the screen door effect).
- the Fresnel assembly 120 blurs light emitted from individual pixels in the display element 110 . This blurring allows for dark spots between individual colors to be masked.
- the optics block 140 directs light to a target region.
- the target region includes a portion of the user's face.
- the target region includes one or both eyes (e.g., the eye 155 ) of the user.
- the target region in FIG. 1 is represented as an eyebox 150 .
- the eyebox 150 is a region in space that is occupied by an eye 155 of a user of the HMD.
- the optics block 140 includes one or more optical elements and/or combinations of different optical elements.
- an optical element is an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, or any other suitable optical element that affects the image light emitted from the display element 110 .
- one or more of the optical elements in the optics block 140 may have one or more coatings, such as anti-reflective coatings.
- Magnification of the image light by the optics block 140 allows the display element 110 to be physically smaller, weigh less, and consume less power than larger displays. Additionally, magnification may increase a field of view of the displayed content. For example, the field of view of the displayed content is such that the displayed content is presented using almost all (e.g., 110 degrees diagonal), and in some cases all, of the user's field of view.
- the optics block 140 is designed so its effective focal length is larger than the spacing to the display element 110 , which magnifies the image light projected by the display element 110 . Additionally, in some embodiments, the amount of magnification is adjusted by adding or removing optical elements.
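The magnification described above follows from the thin-lens equation: a display placed inside the focal length of the optics block forms an enlarged virtual image. A minimal sketch (the focal length and spacing values below are illustrative assumptions, not taken from this disclosure):

```python
def lateral_magnification(f_mm, d_mm):
    """Thin-lens sketch: a display a distance d inside the focal length f
    of the optics block forms a magnified virtual image with lateral
    magnification M = f / (f - d)."""
    assert d_mm < f_mm, "display must sit inside the focal length"
    return f_mm / (f_mm - d_mm)

# Illustrative numbers: 40 mm focal length, display 35 mm from the optics.
M = lateral_magnification(40.0, 35.0)
print(M)  # 8.0 -> the display can be 8x smaller laterally
```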
- the optics block 140 is designed to correct one or more types of optical errors.
- optical errors include: two-dimensional optical errors, three-dimensional optical errors, or some combination thereof.
- Two-dimensional errors are optical aberrations that occur in two dimensions.
- Example types of two-dimensional errors include: barrel distortion, pincushion distortion, longitudinal chromatic aberration, transverse chromatic aberration, or any other type of two-dimensional optical error.
- Three-dimensional errors are optical errors that occur in three dimensions.
- Example types of three-dimensional errors include spherical aberration, comatic aberration, field curvature, astigmatism, or any other type of three-dimensional optical error.
- content provided to the display element 110 for display is pre-distorted, and the optics block 140 corrects the distortion when it receives image light from the electronic display element 110 generated based on the content.
- the tracking system 130 tracks movement of the target region. Some or all of the tracking system 130 may or may not be in a line of sight of a user wearing the HMD.
- the tracking system 130 is typically located off-axis to avoid obstructing the user's view of the display element 110 , although the tracking system 130 may alternately be placed elsewhere.
- in some embodiments, there is a separate tracking system 130 for each portion of the user's face being tracked, e.g., one tracking system for the user's left eye and a second tracking system for the user's right eye. In other embodiments, a single tracking system 130 may track multiple target regions.
- the tracking system 130 may include one or more illumination sources 160 , a camera assembly 165 , and a controller 170 .
- the tracking system 130 determines tracking information using data (e.g., images) captured by the camera assembly 165 of the target region (e.g., the eye 155 ).
- Tracking information describes a position of a portion of the user's face. Tracking information may include, e.g., facial expressions, gaze angle, eye orientation, inter-pupillary distance, vergence depth, some other metric associated with tracking an eye, some other metric associated with tracking a portion of the user's face, or some combination thereof.
- Some embodiments of the tracking unit have different components than those described in FIG. 1 .
- the illumination source 160 illuminates the target region (e.g., the eyebox 150 ) with light in the second band of light (i.e., source light 175 ) that is different from the first band of light associated with content from the display element 110 .
- the illumination source 160 may include: a laser (e.g., a tunable laser, a continuous wave laser, a pulse laser, or another suitable laser emitting infrared light), a light emitting diode (LED), a fiber light source, a light guide, any other suitable light source emitting light in the second band, or some combination thereof.
- the illumination source can be inside the field of view of the display, and is transparent to the display light.
- the illumination source can be a transparent light guide with small light extraction features on top.
- the illumination source 160 may also be configured to emit light in the first band.
- the tracking system 130 may include multiple illumination sources 160 for illuminating one or more portions of the target region.
- the light emitted from the one or more illumination sources 160 is a structured light pattern.
- Reflected light 180 (inclusive of scattered light) from the illuminated portion of the target region (e.g., the eye 155 ) is light in the second band that is reflected by the target region towards the Fresnel assembly 120 .
- the reflected light 180 is redirected by the Fresnel assembly 120 toward a first position, and this redirected light is referred to as redirected light 185 .
- Some or all of the redirected light 185 is captured by the camera assembly 165 that is located at the first position.
- the camera assembly 165 captures images of the target region (e.g., the eye 155 in the eyebox 150 ).
- the camera assembly 165 includes one or more cameras that are sensitive to light in at least the second band (i.e., light emitted by the illumination source 160 ).
- the one or more cameras may also be sensitive in other bands of light (e.g., the first band).
- a camera may be based on single-point detection (e.g., photodiode, balanced/matched photodiodes, or avalanche photodiode), or based on one or two-dimensional detector arrays (e.g., linear photodiode array, CCD array, or CMOS array).
- the sensor plane of the camera is tilted with regard to the camera's lens, following the Scheimpflug condition, such that the image can be in focus across the whole sensor plane.
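The Scheimpflug condition referenced above states that the object plane, the lens plane, and the tilted sensor plane all meet in a common line. The numerical sketch below (all distances are illustrative assumptions) images a tilted object line through an ideal thin lens in a 2D cross-section and verifies that the image line crosses the lens plane at the same point as the object line:

```python
# Scheimpflug sketch: object plane, lens plane (x = 0), and image plane,
# drawn as lines in a 2D cross-section, are concurrent for a thin lens.
f = 0.05           # focal length (m), illustrative
u0, k = 0.2, 0.3   # object line x = -(u0 + k*y), tilted w.r.t. lens plane

def image_point(y):
    a = u0 + k * y               # object distance for height y
    b = a * f / (a - f)          # thin lens: 1/a + 1/b = 1/f
    return (b, -y * b / a)       # image position and (inverted) height

p1, p2, p3 = image_point(-0.1), image_point(0.0), image_point(0.1)

# Thin-lens imaging maps lines to lines: the image points are collinear.
slope = (p2[1] - p1[1]) / (p2[0] - p1[0])
assert abs((p3[1] - p1[1]) / (p3[0] - p1[0]) - slope) < 1e-8

# The object line crosses the lens plane at y = -u0/k; the image line
# must pass through the same point (the Scheimpflug line).
y_hinge = -u0 / k
y_image_at_lens = p1[1] + slope * (0.0 - p1[0])
print(round(y_hinge, 6), round(y_image_at_lens, 6))
```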
- the camera assembly 165 may include multiple cameras to capture light reflected from the target region.
- the camera assembly 165 includes at least one camera located at the first position. Accordingly, the camera assembly 165 captures images of the target region (e.g., the eye 155 ) using the redirected light 185 .
- the controller 170 determines tracking information using images from the camera assembly 165 . For example, in some embodiments, the controller 170 identifies locations of reflections of light from the one or more illumination sources 160 in an image of the target region. The controller 170 determines a position and an orientation of the eye 155 and/or portions of the face in the target region based on the shape and/or locations of the identified reflections. In cases where the target region is illuminated with a structured light pattern, the controller 170 can detect distortions of the structured light pattern projected onto the target region, and can estimate a position and an orientation of the eye 155 and/or portions of the face based on the detected distortions.
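As a toy illustration of the glint-based approach described above, the sketch below implements a simplified pupil-center/corneal-reflection (PCCR) step with made-up image coordinates; it is an assumption-laden stand-in, not the controller 170's actual algorithm:

```python
import math

def gaze_vector_2d(pupil_center, glint_center):
    """Toy PCCR sketch: the offset of the pupil center from the glint
    (corneal reflection of an illumination source) indicates gaze
    direction in the image plane.  Returns a unit 2D vector."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    norm = math.hypot(dx, dy)
    if norm == 0:
        return (0.0, 0.0)        # gaze aligned with the light source
    return (dx / norm, dy / norm)

# Made-up pixel coordinates: pupil shifted right of the glint,
# suggesting a rightward gaze component.
print(gaze_vector_2d((330.0, 240.0), (320.0, 240.0)))  # (1.0, 0.0)
```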
- the controller 170 can also estimate a pupillary axis, a gaze angle (e.g., corresponding to a foveal axis), a translation of the eye, a torsion of the eye, and a current shape of the eye 155 based on the image of the illumination pattern captured by the camera assembly 165 .
- the reflected light 180 passes through the optics block 140 prior to incidence on the Fresnel assembly 120 .
- the Fresnel assembly 120 is closer to the eyebox 150 than the optics block 140 (i.e., a distance between the optics block 140 and the display element 110 is less than a distance between the Fresnel assembly 120 and the display element 110 ) and light in the second band reflected from the eye 155 does not pass through an optics block 140 before it is directed by the Fresnel assembly 120 to a first position.
- FIG. 2 is a diagram of a cross section 200 of a portion of a display block of an HMD (not shown) with a canted Fresnel assembly 210 , in accordance with an embodiment.
- the portion of the display block includes the display element 110 , the canted Fresnel assembly 210 , the optics block 140 , and the tracking system 130 .
- the canted Fresnel assembly 210 is substantially the same as the Fresnel assembly 120 , except that it is canted with respect to an optical axis 220 and is positioned at a first distance 230 from the display element 110 .
- the optical axis 220 passes through the display element 110 , the canted Fresnel assembly 210 , and the optics block 140 .
- the optical axis 220 bisects one or more of the display element 110 , the canted Fresnel assembly 210 , the optics block 140 , or some combination thereof.
- the canted Fresnel assembly 210 is positioned at a tilt angle, θ, with respect to the optical axis 220 .
- the tilt angle is determined by system-level trade-offs among stray light from the draft facets, how close to normal the camera's view of the eye needs to be, how compact the Fresnel assembly needs to be, etc. Reasonable tilt angles are smaller than 30 degrees.
- the canted Fresnel assembly 210 is substantially closer to the optics block 140 than the display element 110 .
- the canted Fresnel assembly 210 is located the first distance 230 from the display element 110 , and a second distance 240 from the optics block 140 .
- the first distance 230 is substantially larger than the second distance 240 . If the Fresnel assembly is canted, it is preferable to place it behind the optics block 140 , at a large distance from the display. If it is between the eye and the optics block 140 , it should be close to non-canted so that it does not increase the design eye relief.
- when the Fresnel assembly is placed behind the optics block 140 , it is usually better to put the Fresnel surfaces far away from the display, such that the eye cannot focus on the Fresnel grooves or on imperfections of the immersion. If the Fresnel prisms also function as screen-door-reduction films, they can potentially be placed very close to the display.
- FIG. 3 is a diagram of a cross section 300 of a portion of a display block of an HMD (not shown) with an immersed Fresnel assembly 310 , in accordance with an embodiment.
- the portion of the display block includes the display element 110 , the immersed Fresnel assembly 310 , and the tracking system 130 .
- the immersed Fresnel assembly 310 is the Fresnel assembly 120 overmolded with an immersion layer 320 .
- the immersed Fresnel assembly 310 is coupled to the display element 110 .
- the immersed Fresnel assembly 310 is directly coupled to a display surface of the display element 110 .
- the immersed Fresnel assembly 310 and the display element 110 are separated by one or more intermediate layers (e.g., optical films).
- the immersed Fresnel assembly 310 can be located in any location between the display 110 and the eye 155 .
- the immersion layer 320 is index matched to a material that makes up the Fresnel assembly 120 .
- the immersion layer 320 is index matched to the Fresnel assembly 120 , light in the first band from the display element 110 is not affected by the Fresnel assembly 120 .
- due to the substantially matched index, light in the first band is not refracted by the plurality of surfaces (e.g., surface 330 ) at the interface between the immersion layer 320 and the Fresnel assembly 120 .
- light in the first band interacts with the immersed Fresnel assembly 310 as a block of a material of a single index.
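The benefit of index matching can be quantified with the normal-incidence Fresnel reflectance formula, R = ((n1 - n2)/(n1 + n2))^2. The indices in the sketch below are illustrative assumptions, not values from this disclosure:

```python
def fresnel_reflectance(n1, n2):
    """Fraction of light reflected at normal incidence at an interface
    between refractive indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Index-matched immersion layer: essentially no reflection at the Fresnel
# surfaces, so first-band light sees a single block of material.
print(fresnel_reflectance(1.50, 1.50))           # 0.0
# Unmatched air/plastic interface for comparison: ~4% reflected.
print(round(fresnel_reflectance(1.00, 1.50), 4))  # 0.04
```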
- the immersion layer can be a glue or UV curable material that has a refractive index similar to that of the Fresnel structure.
- in some embodiments, two matching Fresnel structures are sandwiched together, with index-matching glue in between.
- in other embodiments, both matching Fresnel structures are made of the same material (such as a UV curable material), and no index-matching glue is needed in between.
- the immersed Fresnel assembly 310 includes an outer surface 340 (e.g., substrate).
- the outer surface 340 is flat.
- the outer surface 340 is shaped to provide some adjustment to optical power of light being transmitted by the immersed Fresnel assembly 310 .
- the outer surface 340 may be concave, convex, aspheric, spherical, freeform, flat, or some other shape.
- the Fresnel assembly 120 is coupled to a substrate that is not flat (e.g., curved, or some other shape).
- the plurality of surfaces of the Fresnel assembly 120 have a dichroic coating that is transmissive in the first band of light (e.g., visible light), but is reflective in the second band of light (e.g., IR light).
- the coatings may be sputtered onto one or more of the surfaces, laminated onto one or more of the surfaces, etc. Accordingly, as discussed below with regard to FIGS. 4-6 , light in the second band incident on the surfaces is reflected (e.g., toward the camera assembly 165 ).
- the coating may be a partial reflective coating.
- the coating may be a polarization beam splitter coating.
- one or more of the junctions between surfaces of the Fresnel assembly 120 may be softened (i.e., smoothed to have a radius of curvature versus a sharp edge formed by two intersecting surfaces). Softening of a junction can facilitate lamination of a coating onto the plurality of surfaces of the Fresnel structure.
- FIG. 4 is a diagram of a cross section 400 of a portion of an immersed Fresnel assembly 410 , in accordance with an embodiment.
- the portion of the immersed Fresnel assembly 410 includes a portion of the Fresnel assembly 120 and a portion of the immersion layer 320 that is coupled to the Fresnel assembly 120 .
- Light 430 in the second band is incident on a surface 420 of the plurality of surfaces of the Fresnel assembly 120 .
- the plurality of surfaces have a dichroic coating that is transmissive in the first band and reflective in the second band. Accordingly, the light 430 is reflected toward a first position (e.g., occupied by a camera of the camera assembly 165 ).
- Some or all of the surfaces of the Fresnel assembly 120 may be optimized to reduce stray light.
- the slope of a draft facet 440 may be shaped such that it aligns with the chief ray exiting toward the camera's entrance pupil.
- An alternative way to remove stray light is to deactivate the "unused" Fresnel draft facet, such that it is not reflective and does not generate stray light.
- in some embodiments, only the "useful" facet is coated to reflect light, and the draft facet is left uncoated (this can be done by directional coating techniques) or painted black. Note that, prior to arriving at a camera of the camera assembly 165 , in the immersed Fresnel assembly 410 , the light 430 is refracted at the outer surface 340 .
- in a high index material (e.g., 2.5), the light 430 can be bent at a greater angle relative to a low index material (e.g., 1.3). A greater bending of the light 430 may be useful in reducing a form factor of the HMD.
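The bending at the outer surface 340 can be checked with Snell's law. The sketch below compares a high-index and a low-index immersion material at the interface to air; the 20-degree internal ray angle is an illustrative assumption:

```python
import math

def exit_angle_deg(n_material, internal_angle_deg, n_outside=1.0):
    """Snell's law at the outer surface:
    n_material * sin(t1) = n_outside * sin(t2).
    Returns the refracted (exit) angle in degrees, or None on total
    internal reflection."""
    s = n_material * math.sin(math.radians(internal_angle_deg)) / n_outside
    if s > 1.0:
        return None  # total internal reflection, no transmitted ray
    return math.degrees(math.asin(s))

theta = 20.0  # illustrative internal ray angle from the surface normal
low = exit_angle_deg(1.3, theta)   # ~26.4 degrees in air
high = exit_angle_deg(2.5, theta)  # ~58.8 degrees in air
print(round(low, 1), round(high, 1))
```

With these numbers the high-index material deviates the ray by roughly 38.8 degrees from its internal direction versus roughly 6.4 degrees for the low-index one, consistent with the form-factor argument above.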
- Another artifact to avoid is Fresnel visibility to the viewer's eye. If the "unused" draft facet reflects a lot of visible light into the viewer's eye, the Fresnel facets may be visible. In some embodiments, an optimum co-design of the coating and the Fresnel draft angle mitigates this artifact. In alternative embodiments, the draft facet is uncoated and becomes invisible for the see-through transmission path. In other embodiments, the Fresnel pitch is small (smaller than 400 microns) such that it is difficult for the eye to resolve the individual facets.
- the first position is also occupied by the illumination source 160 .
- the illumination source 160 illuminates the Fresnel assembly 120 with light in the second band, and the Fresnel assembly 120 redirects the light toward an eyebox.
- crossed polarizers or other polarization diversity methods are used to block unwanted illumination light hitting the camera.
- FIG. 5 is a diagram of a cross section 500 of a portion of an immersed Fresnel assembly 505 that captures multiple view angles, in accordance with an embodiment.
- the portion of the immersed Fresnel assembly 505 includes a portion of a Fresnel assembly 510 and a portion of the immersion layer 320 that is coupled to the Fresnel assembly 510 .
- the Fresnel assembly 510 is substantially similar to the Fresnel assembly 120 , except that, e.g., the plurality of surfaces are configured to direct light in a second band from different angles to the camera assembly 165 .
- the camera assembly 165 is able to capture, using a single camera in a single image frame, multiple view angles of a target region (e.g., eye, portion of the face, or both).
- the camera assembly 165 may capture three dimensional (3D) information that describes a portion of the eye 155 .
- the Fresnel assembly 510 includes a plurality of surfaces.
- the plurality of surfaces have a dichroic coating that is transmissive in the first band and reflective in the second band.
- Light 515 in the second band is incident on the outer surface 340 of the immersion layer 320 at a first angle.
- the light 515 is refracted at the outer surface 340 toward a surface 520 of the plurality of surfaces of the Fresnel assembly 510 .
- the surface 520 reflects the light toward a surface 525 of the plurality of surfaces, and the surface 525 reflects the light 515 toward the outer surface 340 of the immersion layer 320 .
- the light 515 refracts at a refraction point 530 and propagates toward a first position that is occupied by a camera (e.g., a camera of the camera assembly 165 ).
- the light might not follow the same paths in alternative embodiments, but in those cases, light coming from different paths can also allow for multiple views of the eye 155 , or allow for paths where an illuminator can be inserted.
- light 535 in the second band is incident on the outer surface 340 of the immersion layer 320 at a second angle that is different than the first angle.
- the light 535 is refracted at the outer surface 340 toward a surface 540 of the plurality of surfaces of the Fresnel assembly 510 .
- the surface 540 reflects the light 535 back toward a portion of the outer surface 340 at an angle such that total internal reflection occurs at a TIR point 545 .
- the light 535 is reflected toward a surface 550 of the plurality of surfaces, and the surface 550 reflects the light 535 toward the outer surface 340 of the immersion layer 320 .
- the light 535 refracts at a refraction point 555 and propagates toward a first position that is occupied by a camera (e.g., a camera of the camera assembly 165 ).
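The total internal reflection at the TIR point 545 occurs only when the light 535 strikes the outer surface 340 beyond the critical angle of the immersion layer. A minimal sketch (the index value of 1.5 is assumed; the embodiment does not specify one):

```python
import math

def critical_angle_deg(n_inside, n_outside=1.0):
    """Incidence angle (measured inside the denser medium) beyond which
    light is totally internally reflected at the interface."""
    return math.degrees(math.asin(n_outside / n_inside))

def tir_occurs(theta_deg, n_inside, n_outside=1.0):
    """True when a ray at theta_deg undergoes TIR, as at the TIR point 545."""
    return theta_deg > critical_angle_deg(n_inside, n_outside)
```

For an assumed layer index of 1.5 against air, the critical angle is about 41.8 degrees; the surface 540 must therefore send the light 535 toward the outer surface 340 more steeply than that for TIR to occur.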
- the light 515 and the light 535 are representative of different view angles of a target area (e.g., the eye and/or a portion of the face) being imaged by the camera. Note that the camera is able to capture both view angles of the eye in a single image frame.
- the tracking system 130 may use the captured images for generating tracking information.
- a first set of surfaces is used for one view, and a second set of surfaces is used for a different view.
- the surfaces of the first set may be shaped to capture light specifically for a particular view
- the surfaces of the second set may be shaped to capture light specifically for a different view.
- the surfaces of the first set and the surfaces of the second set may be positioned in separate portions of the Fresnel assembly 510 , interlaced across the entire Fresnel assembly 510 , etc.
- the surfaces may be shaped and/or coated such that different polarizations of light and/or wavelengths of light correspond to different view angles that are captured by the camera.
- FIG. 6 is a diagram of a cross section 600 of a portion of an immersed Fresnel assembly 605 configured to interact with multiple positions, in accordance with an embodiment.
- the portion of the immersed Fresnel assembly 605 includes a portion of a Fresnel assembly 610 and a portion of the immersion layer 320 that is coupled to the Fresnel assembly 610 .
- the Fresnel assembly 610 is substantially similar to the Fresnel assembly 120 , except that, e.g., the plurality of surfaces include a first set of surfaces associated with a first position 615 and a second set of surfaces associated with a second position 620 that is different from the first position 615 .
- the first position 615 is on a first side of the optical axis 220
- the second position 620 is on a second side of the optical axis 220
- the first position 615 and the second position 620 are symmetric with respect to the optical axis 220 .
- the first position 615 is occupied by a first device 625 and the second position 620 is occupied by a second device 630 .
- the first device 625 and the second device 630 may be a camera (e.g., of the camera assembly 165 ), an illumination source 160 , a display panel, or some combination thereof.
- the illumination source or the display panel can be LEDs, microLEDs, MicroOLEDs, display panels that show an arbitrary pattern (e.g., dots or sinusoidal patterns), some other suitable illuminators, or some combination thereof.
- at least one of the first position 615 or the second position 620 includes a camera.
- as illustrated in FIG. 6 , the first device 625 is a camera
- the second device 630 may be a camera or an illumination source 160 .
- both the first position 615 and the second position 620 include cameras of the camera assembly 165 .
- the first position 615 includes a camera of the camera assembly 165 and the second position 620 includes an illumination source 160 .
- FIG. 7A shows a diagram of the HMD, as a near eye display 700 , in one embodiment.
- the HMD is a pair of augmented reality glasses.
- the HMD 700 presents computer-generated media to a user and augments views of a physical, real-world environment with the computer-generated media. Examples of computer-generated media presented by the HMD 700 include one or more images, video, audio, or some combination thereof.
- audio is presented via an external device (e.g. speakers and headphones) that receives audio information from the HMD 700 , a console (not shown), or both, and presents audio data based on audio information.
- the HMD 700 may be modified to also operate as a virtual reality (VR) HMD, a mixed reality (MR) HMD, or some combination thereof.
- the HMD 700 includes a frame 710 and a display assembly 720 .
- the frame 710 mounts the near eye display 700 to the user's head.
- the display 720 provides image light to the user.
- FIG. 7B shows a cross-section view 730 of the near eye display 700 .
- This view includes the frame 710 , the display assembly 720 , a display block 740 , and the eye 155 .
- the display assembly 720 supplies image light to the eye 155 .
- the display assembly 720 houses the display block 740 .
- FIG. 7B shows the cross section 730 associated with a single display block 740 and a single eye 155 , but in alternative embodiments not shown, another display block, separate from the display block 740 shown in FIG. 7B , provides image light to another eye of the user.
- the display block 740 is configured to combine light from a local area with light from a computer-generated image to form an augmented scene.
- the display block 740 is also configured to provide the augmented scene to the eyebox 150 corresponding to a location of a user's eye 155 .
- the eyebox 150 is a region of space that would contain a user's eye while the user is wearing the HMD 700 .
- the display block 740 may include, e.g., a waveguide display, a focusing assembly, a compensation assembly, or some combination thereof.
- the display block 740 is an embodiment of the display blocks discussed above with regard to FIGS. 1-3 .
- the display block 740 may include an immersed Fresnel assembly as discussed above with regard to FIGS. 4-6 .
- the HMD 700 may include one or more other optical elements between the display block 740 and the eye 155 .
- the optical elements may act to, e.g., correct aberrations in image light emitted from the display block 740 , magnify image light emitted from the display block 740 , some other optical adjustment of image light emitted from the display block 740 , or some combination thereof.
- examples of the optical elements include an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, a grating, a waveguide element, or any other suitable optical element that affects image light.
- the display block 740 may be composed of one or more materials (e.g., plastic, glass, etc.) with one or more refractive indices that effectively minimize the weight and widen a field of view of the HMD 700 .
- FIG. 8A is an example array 800 of sub-pixels on the display element 110 .
- the example array 800 shown in FIG. 8A includes red sub-pixels 810 , blue sub-pixels 820 , and green sub-pixels 830 .
- the array 800 is a portion of a PENTILE® display. In other embodiments, the array 800 may be in some other configuration (e.g., RGB).
- a dark space 840 separates each sub-pixel from one or more adjacent sub-pixels.
- the dark space 840 is a portion of the array 800 that does not emit light, and may become visible to a user under certain circumstances (e.g., magnification) causing the screen door effect that degrades image quality.
- the Fresnel assembly 120 can serve to reduce fixed pattern noise so the dark space 840 between the sub-pixels is not visible to the user (e.g., by blurring each sub-pixel, creating a blur spot associated with each sub-pixel in the image). The blur spots are large enough so adjacent blur spots mask the dark space 840 between adjacent full pixels.
- the largest physical pixel fill-ratio is 100%, which occurs if there is no gap at all between sub-pixels.
- with blurring, the effective pixel fill-ratio may be much greater (e.g., 300%), such that the sub-pixels of different colors overlap. This way, when only green sub-pixels are emitting light, for example, and the display is viewed with perfect viewing optics, there would be no gap between the sub-pixels. This is difficult to achieve with OLED and/or LCD display panels alone, but it is doable with a diffractive element such as the prism array redirection structure 125 .
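The effect of blurring on fill-ratio can be modeled with simple geometry. In this sketch (square sub-pixels and a uniform blur are assumed for simplicity; the dimensions are illustrative, in units of the pixel pitch), the effective fill-ratio is the area of the blurred spot relative to the pixel cell:

```python
def effective_fill_ratio(subpixel_width, pitch, blur_radius):
    """Fraction of a square pixel cell covered by one blurred sub-pixel.
    Values above 1.0 (i.e., above 100%) mean adjacent spots overlap."""
    blurred_width = subpixel_width + 2.0 * blur_radius  # blur widens the spot
    return (blurred_width / pitch) ** 2

# Assumed geometry: a sub-pixel half the pitch wide with no blur gives
# a 25% fill-ratio; with enough blur the effective ratio exceeds 100%.
unblurred = effective_fill_ratio(0.5, 1.0, 0.0)
blurred = effective_fill_ratio(0.5, 1.0, 0.4)
```

The same model shows why a diffractive element can push the effective fill-ratio past 100% even though the physical emitting area cannot.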
- FIG. 8B is an example illustrating adjustment of image data of the array 800 of FIG. 8A by the Fresnel assembly 120 .
- each of the sub-pixels has an associated blur spot.
- the red sub-pixels 810 have a corresponding red blur spot 860
- the blue sub-pixels 820 have a corresponding blue blur spot 870
- the green sub-pixels 830 have a corresponding green blur spot 880 .
- a blur spot is an area filled with an image of a blurred sub-pixel. So long as a blur spot does not overlap with a point of maximum intensity of an adjacent blur spot that is created by a sub-pixel of the same color, the two blur spots are resolvable as two adjacent pixels.
- where the three sub-pixel blur spots all overlap, they create a white pixel.
- the shape of the blur spot is not necessarily a circle, but is rather an area including the blurred image of a sub-pixel.
- the redirection structure 125 of FIG. 1A can be configured to blur each sub-pixel so the blur spots mask the dark space 840 between adjacent pixels.
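The two constraints described above, that blur spots must mask the dark space without a spot reaching the peak of an adjacent same-color spot, can be sketched as a simple design check (all dimensions below are assumed, in units of the same-color pixel pitch):

```python
def dark_space_masked(subpixel_width, pitch, blur_radius):
    """True when the blurred sub-pixel spans the full cell, so adjacent
    blur spots together cover the dark space between sub-pixels."""
    return subpixel_width + 2.0 * blur_radius >= pitch

def spots_resolvable(same_color_pitch, blur_radius):
    """True when a blur spot does not reach the point of maximum
    intensity of the adjacent same-color spot, so the two spots remain
    resolvable as separate pixels."""
    return blur_radius < same_color_pitch

# Assumed numbers: a blur radius of 0.3 masks the dark space while
# keeping same-color blur spots (pitch 1.0) resolvable.
ok = dark_space_masked(0.5, 1.0, 0.3) and spots_resolvable(1.0, 0.3)
```

A blur radius inside this window removes the screen door effect without degrading resolution.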
- a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
- Embodiments of the disclosure may also relate to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer.
- a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus.
- any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein.
- a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Abstract
Description
- The present disclosure relates generally to light redirection, and specifically relates to a Fresnel assembly for light redirection in eye tracking systems.
- Eye tracking refers to the process of detecting the direction of a user's gaze, which may comprise detecting an orientation of an eye in three-dimensional (3D) space. Eye tracking in the context of headsets used in, e.g., virtual reality and/or augmented reality applications can be an important feature. Conventional systems commonly use, e.g., a number of infrared light sources to illuminate the eye, and a camera is used to image a reflection of the light sources from the eye. Traditionally, eye tracking systems use beam splitters to redirect infrared light reflected from the eye to the camera. However, beam splitters are often large, cumbersome, and unsuitable for HMDs used in augmented reality (AR), mixed reality (MR), and virtual reality (VR) systems.
- A Fresnel assembly transmits (e.g., partially or fully) light in a first band (e.g., visible light) and redirects some or all of the light in a second band (e.g., infrared light) to one or more locations. The Fresnel assembly includes a plurality of surfaces that act to redirect light in the second band. The plurality of surfaces may be coated with a dichroic material that is transmissive in the first band and reflective in the second band. In some embodiments, an immersion layer is overmolded onto the Fresnel assembly to form an immersed Fresnel assembly. The immersion layer may be index matched to the Fresnel assembly. Additionally, in some embodiments, one or more surfaces of the immersed Fresnel assembly may be shaped (e.g., concave, convex, asphere, freeform, etc.) to adjust optical power of the immersed Fresnel assembly. The Fresnel assembly may be integrated into a head-mounted display (HMD).
- In some embodiments, an HMD includes a display element, an optics block, a Fresnel assembly, an illumination source, and a camera assembly. The display element outputs image light in a first band (e.g., visible light) of light through a display surface of the display element. The optics block directs light from the display element to an eyebox (a region in space occupied by an eye of the user). The Fresnel assembly transmits light in the first band and directs light in a second band (e.g., infrared light) different than the first band to a first position. The illumination source (e.g., part of a tracking system) illuminates a target region (eyes and/or portion of the face) with light in the second band. The camera (e.g., part of an eye and/or face tracking system) is located in the first position, and is configured to capture light in the second band corresponding to light reflected from the target region of the user and reflected by the Fresnel assembly.
- In some embodiments, the Fresnel assembly may be non-wavelength sensitive (e.g., it could be a non-immersed, uncoated surface, or it could have a partially reflective coating and be immersed). It transmits a portion of the light from the display and directs it to the eyebox. It also directs parts of the display light outside of the eyebox. The camera is positioned such that the light that comes from the eye is reflected off of the Fresnel surfaces and is captured by the camera. Additionally, in some embodiments, the HMD includes a controller (e.g., part of the tracking system) that generates tracking information (e.g., gaze location and/or facial expressions).
-
FIG. 1 is a diagram of a cross section of a portion of a display block of an HMD (not shown), in accordance with an embodiment. -
FIG. 2 is a diagram of a cross section of a portion of a display block of an HMD (not shown) with a canted Fresnel assembly, in accordance with an embodiment. -
FIG. 3 is a diagram of a cross section of a portion of a display block of an HMD (not shown) with an immersed Fresnel assembly, in accordance with an embodiment. -
FIG. 4 is a diagram of a cross section of a portion of an immersed Fresnel assembly 410, in accordance with an embodiment. -
FIG. 5 is a diagram of a cross section of a portion of an immersed Fresnel assembly that captures multiple view angles, in accordance with an embodiment. -
FIG. 6 is a diagram of a cross section of a portion of an immersed Fresnel assembly configured to interact with multiple positions, in accordance with an embodiment. -
FIG. 7A is a diagram of an HMD, in one embodiment. -
FIG. 7B is a diagram of a cross-section of the HMD, in one embodiment. -
FIG. 8A is an example array of sub-pixels on an electronic display element, in accordance with an embodiment. -
FIG. 8B is an image of an example array of sub-pixels adjusted by an optical block, in accordance with an embodiment. - The figures depict embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles, or benefits touted, of the disclosure described herein.
-
FIG. 1 is a diagram of a cross section 100 of a portion of a display block of an HMD (not shown), in accordance with an embodiment. As shown in FIG. 1 , the display block includes a display element 110, a Fresnel assembly 120, and a tracking system 130. In some embodiments, the display block 100 may also include an optics block 140. - The
display element 110 displays images to the user. In various embodiments, the display element 110 may comprise a single electronic display panel or multiple electronic display panels (e.g., a display for each eye of a user). Examples of the display element 110 include: a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active-matrix organic light-emitting diode display (AMOLED), a quantum organic light emitting diode (QOLED) display, a quantum light emitting diode (QLED) display, a transparent organic light emitting diode (TOLED) display, some other display, or some combination thereof. In some embodiments, the display element 110 is a waveguide display. The display element 110 emits content within a first band of light (e.g., in a visible band). - The Fresnel
assembly 120 transmits light in the first band and reflects light within a second band. For example, the Fresnel assembly 120 may transmit light within a visible band (e.g., 400-700 nanometers (nm)), and may reflect light within an infrared (IR) band (e.g., above 780 nm). The Fresnel assembly 120 comprises a first surface that includes a plurality of reflective surfaces that form a Fresnel lens; a portion 145 of the plurality of reflective surfaces is illustrated in FIG. 1 . In some embodiments, the plurality of reflective surfaces are coated with a dichroic film that is transmissive in the first band and reflective in the second band. In other embodiments, the plurality of reflective surfaces are coated with a partially reflective coating. As discussed in detail below with regard to, e.g., FIGS. 4-6 , the plurality of reflective surfaces can be configured to reflect light in the second band to different positions. The Fresnel assembly 120 can partially transmit light, and partially reflect light, for any light band of interest. For example, without a coating the Fresnel assembly 120 can weakly reflect light by Fresnel reflection, and transmit the rest of the light. In contrast, with coatings, the transmission/reflection ratio can be adjusted. Alternatively, the Fresnel assembly 120 can transmit light with one polarization, and reflect light with an orthogonal polarization. This can be especially useful for polarization-based viewing optics, where light is transmitted in a single polarization. Additionally, while not shown in FIG. 1 , in some embodiments, the Fresnel assembly 120 is coupled directly to a display surface of the display element. - The Fresnel
assembly 120 is at least partially transmissive to light in the first band. In some embodiments, the Fresnel assembly 120 may increase or decrease the optical power of light in the first band from the display element 110. Additionally, for light in the first band, the Fresnel assembly 120 may reduce fixed pattern noise (i.e., the screen door effect). As described with regard to FIGS. 8A and 8B , the Fresnel assembly 120 blurs light emitted from individual pixels in the display element 110. This blurring allows for dark spots between individual colors to be masked. - The optics block 140 directs light to a target region. The target region includes a portion of the user's face. In some embodiments, the target region includes one or both eyes (e.g., the eye 155) of the user. For ease of illustration, the target region in
FIG. 1 is represented as an eyebox 150. The eyebox 150 is a region in space that is occupied by an eye 155 of a user of the HMD. In an embodiment, the optics block 140 includes one or more optical elements and/or combinations of different optical elements. For example, an optical element is an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, or any other suitable optical element that affects the image light emitted from the display element 110. In some embodiments, one or more of the optical elements in the optics block 140 may have one or more coatings, such as anti-reflective coatings. - Magnification of the image light by the optics block 140 allows the
display element 110 to be physically smaller, weigh less, and consume less power than larger displays. Additionally, magnification may increase a field of view of the displayed content. For example, the field of view of the displayed content is such that the displayed content is presented using almost all (e.g., 110 degrees diagonal), and in some cases all, of the user's field of view. In some embodiments, the optics block 140 is designed so its effective focal length is larger than the spacing to the display element 110, which magnifies the image light projected by the display element 110. Additionally, in some embodiments, the amount of magnification is adjusted by adding or removing optical elements. - In some embodiments, the optics block 140 is designed to correct one or more types of optical errors. Examples of optical errors include: two-dimensional optical errors, three-dimensional optical errors, or some combination thereof. Two-dimensional errors are optical aberrations that occur in two dimensions. Example types of two-dimensional errors include: barrel distortion, pincushion distortion, longitudinal chromatic aberration, transverse chromatic aberration, or any other type of two-dimensional optical error. Three-dimensional errors are optical errors that occur in three dimensions. Example types of three-dimensional errors include spherical aberration, comatic aberration, field curvature, astigmatism, or any other type of three-dimensional optical error. In some embodiments, content provided to the
display element 110 for display is pre-distorted, and the optics block 140 corrects the distortion when it receives image light from the electronic display element 110 generated based on the content. - The
tracking system 130 tracks movement of the target region. Some or all of the tracking system 130 may or may not be in a line of sight of a user wearing the HMD. The tracking system 130 is typically located off-axis to avoid obstructing the user's view of the display element 110, although the tracking system 130 may alternately be placed elsewhere. Also, in some embodiments, there is at least one tracking system 130 for different portions of the user's face (e.g., a tracking system for the user's left eye and a second tracking system for the user's right eye). In some embodiments, only one tracking system 130 may track multiple target regions. - The
tracking system 130 may include one or more illumination sources 160, a camera assembly 165, and a controller 170. The tracking system 130 determines tracking information using data (e.g., images) captured by the camera assembly 165 of the target region (e.g., the eye 155). Tracking information describes a position of a portion of the user's face. Tracking information may include, e.g., facial expressions, gaze angle, eye orientation, inter-pupillary distance, vergence depth, some other metric associated with tracking an eye, some other metric associated with tracking a portion of the user's face, or some combination thereof. Some embodiments of the tracking unit have different components than those described in FIG. 1 . - The
illumination source 160 illuminates the target region (e.g., the eyebox 150) with light in the second band of light (i.e., source light 175) that is different from the first band of light associated with content from the display element 110. Examples of the illumination source 160 may include: a laser (e.g., a tunable laser, a continuous wave laser, a pulse laser, or another suitable laser emitting infrared light), a light emitting diode (LED), a fiber light source, a light guide, another suitable light source emitting light in the second band, or some combination thereof. In some embodiments, the illumination source can be inside the field of view of the display, and is transparent to the display light. For example, the illumination source can be a transparent light guide with small light extraction features on top. In various embodiments, the illumination source 160 may also be configured to emit light in the first band. In some embodiments, the tracking system 130 may include multiple illumination sources 160 for illuminating one or more portions of the target region. In some embodiments, the light emitted from the one or more illumination sources 160 is a structured light pattern. - Reflected light 180 (inclusive of scattered light) from the illuminated portion of the target region (e.g., the eye 155 ) is light in the second band that is reflected by the target region towards the
Fresnel assembly 120. The reflected light 162 is redirected by theFresnel assembly 120 toward a first position, and this redirected light is referred to as redirectedlight 185. Some or all of the redirectedlight 185 is captured by thecamera assembly 165 that is located at the first position. - The
camera assembly 165 captures images of the target region (e.g., the eye 155 in the eyebox 150). The camera assembly 165 includes one or more cameras that are sensitive to light in at least the second band (i.e., light emitted by the illumination source 160). In some embodiments, the one or more cameras may also be sensitive in other bands of light (e.g., the first band). In some embodiments, a camera may be based on single-point detection (e.g., photodiode, balanced/matched photodiodes, or avalanche photodiode), or based on one- or two-dimensional detector arrays (e.g., linear photodiode array, CCD array, or CMOS array). In some embodiments, the sensor plane of the camera is tilted with regard to the camera's lens, following the Scheimpflug condition, such that the image can be in focus across the whole sensor plane. In some embodiments, the camera assembly 165 may include multiple cameras to capture light reflected from the target region. The camera assembly 165 includes at least one camera located at the first position. Accordingly, the camera assembly 165 captures images of the target region (e.g., the eye 155) using the redirected light 185. - The
controller 170 determines tracking information using images from the camera assembly 165. For example, in some embodiments, the controller 170 identifies locations of reflections of light from the one or more illumination sources 160 in an image of the target region. The controller 170 determines a position and an orientation of the eye 155 and/or portions of the face in the target region based on the shape and/or locations of the identified reflections. In cases where the target region is illuminated with a structured light pattern, the controller 170 can detect distortions of the structured light pattern projected onto the target region, and can estimate a position and an orientation of the eye 155 and/or portions of the face based on the detected distortions. The controller 170 can also estimate a pupillary axis, a gaze angle (e.g., corresponding to a foveal axis), a translation of the eye, a torsion of the eye, and a current shape of the eye 155 based on the image of the illumination pattern captured by the camera assembly 165. - Note that in the illustrated embodiment, the reflected light 180 passes through the optics block 140 prior to incidence on the
Fresnel assembly 120. In alternate embodiments (not shown), theFresnel assembly 120 is closer to theeyebox 150 than the optics block 140 (i.e., a distance between the optics block 140 and thedisplay element 110 is less than a distance between theFresnel assembly 120 and the display element 110) and light in the second band reflected from theeye 155 does not pass through anoptics block 140 before it is directed by theFresnel assembly 120 to a first position. -
FIG. 2 is a diagram of a cross section 200 of a portion of a display block of an HMD (not shown) with a canted Fresnel assembly 210, in accordance with an embodiment. The portion of the display block includes the display element 110, the canted Fresnel assembly 210, the optics block 140, and the tracking system 130. The canted Fresnel assembly 210 is substantially the same as the Fresnel assembly 120, except that it is canted with respect to an optical axis 220 and is positioned at a first distance 230 from the display element 110. - The
optical axis 220 passes through the display element 110, the canted Fresnel assembly 210, and the optics block 140. In some embodiments, the optical axis 220 bisects one or more of the display element 110, the canted Fresnel assembly 210, the optics block 140, or some combination thereof. The canted Fresnel assembly 210 is positioned at a tilt angle, α, with respect to the optical axis 220. The tilt angle is determined by system-level trade-offs among stray light from the draft facets, how close to normal the camera's view of the eye needs to be, how compact the Fresnel assembly needs to be, etc. Reasonable tilt angles would be smaller than 30 degrees. - In this embodiment, the canted
Fresnel assembly 210 is substantially closer to the optics block 140 than to the display element 110. The canted Fresnel assembly 210 is located the first distance 230 from the display element 110, and a second distance 240 from the optics block 140. The first distance 230 is substantially larger than the second distance 240. If the Fresnel assembly is canted, it is preferable to place it behind the optics block 140, at a large distance from the display. If it is between the eye and the optics block 140, it should be close to non-canted so that it does not increase the design eye relief. When it is placed behind the optics block 140, it is usually better to put the Fresnel surfaces far away from the display, such that the eye cannot focus on the Fresnel grooves or imperfections of the immersion. If the Fresnels/prisms also function as screen-door-reduction films, then the assembly can potentially be placed very close to the display. -
FIG. 3 is a diagram of a cross section 300 of a portion of a display block of an HMD (not shown) with an immersed Fresnel assembly 310, in accordance with an embodiment. The portion of the display block includes the display element 110, the immersed Fresnel assembly 310, and the tracking system 130. - The immersed
Fresnel assembly 310 is the Fresnel assembly 120 overmolded with an immersion layer 320. The immersed Fresnel assembly 310 is coupled to the display element 110. In some embodiments, the immersed Fresnel assembly 310 is directly coupled to a display surface of the display element 110. In alternate embodiments, the immersed Fresnel assembly 310 and the display element 110 are separated by one or more intermediate layers (e.g., optical films). The immersed Fresnel assembly 310 can be located in any location between the display 110 and the eye 155. - The
immersion layer 320 is index matched to a material that makes up the Fresnel assembly 120. As the immersion layer 320 is index matched to the Fresnel assembly 120, light in the first band from the display element 110 is not affected by the Fresnel assembly 120. For example, due to the substantially matched index, light in the first band is not refracted by the plurality of surfaces (e.g., surface 330) at the interface between the immersion layer 320 and the Fresnel assembly 120. Instead, light in the first band interacts with the immersed Fresnel assembly 310 as a block of material of a single index. The immersion layer can be a glue or UV-curable material that is of similar refractive index to the Fresnels. Alternatively, in a different process, two matching Fresnels can be sandwiched together, with index-matching glue in between. In some embodiments, both matching Fresnels are the same material (such as a UV-curable material), and no index-matching glue is needed in between. - The immersed
Fresnel assembly 310 includes an outer surface 340 (e.g., a substrate). In this embodiment, the outer surface 340 is flat. In alternate embodiments, the outer surface 340 is shaped to provide some adjustment to the optical power of light being transmitted by the immersed Fresnel assembly 310. For example, the output surface may be concave, convex, an asphere, spherical, a freeform surface, flat, or some other shape. In some embodiments, the Fresnel assembly 120 is coupled to a substrate that is not flat (e.g., curved, or some other shape). - In this embodiment, the plurality of surfaces of the
Fresnel assembly 120 have a dichroic coating that is transmissive in the first band of light (e.g., visible light), but reflective in the second band of light (e.g., IR light). In some embodiments, the coatings may be sputtered onto one or more of the surfaces, laminated onto one or more of the surfaces, etc. Accordingly, as discussed below with regard to FIGS. 4-6, light in the second band incident on the surfaces is reflected (e.g., toward the camera assembly 165). In some embodiments, the coating may be a partially reflective coating. In other embodiments, the coating may be a polarization beam splitter coating. Moreover, in some embodiments, one or more of the junctions between surfaces of the Fresnel assembly 120 may be softened (i.e., smoothed to have a radius of curvature rather than the sharp edge formed by two intersecting surfaces). Softening a junction can facilitate lamination of a coating onto the plurality of surfaces of the Fresnel structure. -
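The two behaviors described above — index matching between the immersion layer and the Fresnel structure, and the band-selective dichroic coating — can be sketched numerically. This is a minimal illustration, not part of the disclosure; the index value 1.52 and the band edges are assumed for demonstration only:

```python
import math

def refraction_angle_deg(n1, n2, theta_deg):
    """Snell's law, n1*sin(t1) = n2*sin(t2); returns None on total internal reflection."""
    s = n1 * math.sin(math.radians(theta_deg)) / n2
    if s > 1.0:
        return None
    return math.degrees(math.asin(s))

def dichroic_action(wavelength_nm, visible=(380, 700), ir=(780, 1000)):
    """Idealized dichroic surface: transmit the first (visible) band, reflect the second (IR)."""
    if visible[0] <= wavelength_nm <= visible[1]:
        return "transmit"
    if ir[0] <= wavelength_nm <= ir[1]:
        return "reflect"
    return "other"

# Index-matched interface (hypothetical n = 1.52 on both sides): no bending, so
# visible display light crosses the Fresnel surfaces as through a solid block.
assert abs(refraction_angle_deg(1.52, 1.52, 30.0) - 30.0) < 1e-9
# The coating transmits visible display light and reflects IR tracking light.
assert dichroic_action(550) == "transmit"
assert dichroic_action(850) == "reflect"
```

With matched indices the refraction angle equals the incidence angle at every facet, which is why the first band sees the immersed assembly as a single uniform block.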
FIG. 4 is a diagram of a cross section 400 of a portion of an immersed Fresnel assembly 410, in accordance with an embodiment. The portion of the immersed Fresnel assembly 410 includes a portion of the Fresnel assembly 120 and a portion of the immersion layer 320 that is coupled to the Fresnel assembly 120. -
Light 430 in the second band is incident on a surface 420 of the plurality of surfaces of the Fresnel assembly 120. In this embodiment, the plurality of surfaces have a dichroic coating that is transmissive in the first band and reflective in the second band. Accordingly, the light 430 is reflected toward a first position (e.g., occupied by a camera of the camera assembly 165). Some or all of the surfaces of the Fresnel assembly 120 may be optimized to reduce stray light. For example, the slope of a draft facet 440 may be shaped so that it aligns with the chief ray angle of the ray exiting toward the camera's entrance pupil. Note that light might hit the wrong Fresnel facet in two scenarios: on its way to the Fresnel assembly, and after it is reflected by the Fresnel assembly. In some embodiments, the view of the eye 155 is close to normal, so rays hit the Fresnel assembly 120 at close to normal incidence and rarely hit the wrong Fresnel facet on the way in; in these cases it is better to align the chief ray so that it avoids hitting the wrong Fresnel facet on the way out toward the camera assembly 165. In other embodiments, tradeoffs may be made such that stray light is minimized considering both scenarios. An alternative way to remove stray light is to deactivate the "unused" Fresnel draft facet so that it is not reflective and does not produce stray light. In some embodiments, only the "useful" facet is coated to reflect light, and the draft facet is left uncoated (which can be done with directional coating techniques) or painted black. Note that prior to arriving at a camera of the camera assembly 165, in the immersed Fresnel assembly 410, the light 430 is refracted at the outer surface 340. Accordingly, by using a high-index material (e.g., 2.5) for the immersion layer 320 and the Fresnel assembly 120, the light 430 can be bent at a greater angle relative to a low-index material (e.g., 1.3). Greater bending of the light 430 may be useful in reducing the form factor of the HMD.
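The form-factor argument above follows directly from Snell's law at the outer surface 340: for the same internal ray angle, a higher-index immersion material refracts the exiting ray through a larger angle. A minimal sketch — the indices 2.5 and 1.3 are the ones mentioned above, while the 20-degree internal angle is an assumed illustration:

```python
import math

def exit_angle_deg(n_medium, theta_internal_deg, n_air=1.0):
    """Angle from the surface normal of a ray leaving the immersion layer into air."""
    s = n_medium * math.sin(math.radians(theta_internal_deg)) / n_air
    if s > 1.0:
        return None  # beyond the critical angle: the ray is totally internally reflected
    return math.degrees(math.asin(s))

# The same 20-degree internal ray exits far steeper from the high-index material,
# which can shorten the optical path to the camera and reduce the HMD form factor.
high_index = exit_angle_deg(2.5, 20.0)  # ~58.8 degrees
low_index = exit_angle_deg(1.3, 20.0)   # ~26.4 degrees
```

Note the tradeoff implied by the same equation: a higher index also lowers the critical angle, so steep internal rays in the high-index case are more easily trapped by total internal reflection.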
- Another artifact to avoid is Fresnel visibility to the viewer's eye. If the "unused" draft facet reflects a lot of visible light into the viewer's eye, the Fresnel facets may become visible. In some embodiments, co-design of the coating and the Fresnel draft angle mitigates this artifact. In alternative embodiments, the draft facet is uncoated and becomes invisible along the see-through transmission path. In other embodiments, the Fresnel pitch is small (e.g., smaller than 400 microns) such that it is difficult for the eye to resolve the individual facets.
- Note that in alternate embodiments, the first position is also occupied by the
illumination source 160. In these embodiments, the illumination source 160 illuminates the Fresnel assembly 120 with light in the second band, and the Fresnel assembly 120 redirects the light toward an eyebox. For further suppression of stray light into the camera, in some embodiments, crossed polarizers or other polarization-diversity methods are used to block unwanted illumination light from reaching the camera. -
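The crossed-polarizer suppression mentioned above follows Malus's law: transmitted intensity falls as the cosine squared of the angle between the light's polarization and the analyzer, so illumination polarized orthogonally to the camera's analyzer is largely blocked. A minimal sketch under idealized assumptions (perfect polarizers, no depolarization at the Fresnel surfaces):

```python
import math

def malus_transmission(theta_deg):
    """Malus's law: fraction of linearly polarized light passing an analyzer at angle theta."""
    return math.cos(math.radians(theta_deg)) ** 2

aligned = malus_transmission(0)    # 1.0: the wanted signal path is fully transmitted
crossed = malus_transmission(90)   # ~0.0: stray illumination light is blocked at the camera
```

In practice the extinction is finite (real polarizers leak, and reflections can rotate polarization), which is why the passage frames crossed polarizers as one of several polarization-diversity options.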
FIG. 5 is a diagram of a cross section 500 of a portion of an immersed Fresnel assembly 505 that captures multiple view angles, in accordance with an embodiment. The portion of the immersed Fresnel assembly 505 includes a portion of a Fresnel assembly 510 and a portion of the immersion layer 320 that is coupled to the Fresnel assembly 510. The Fresnel assembly 510 is substantially similar to the Fresnel assembly 120, except that, e.g., the plurality of surfaces are configured to direct light in a second band from different angles to the camera assembly 165. In this manner the camera assembly 165 is able to capture, using a single camera, multiple view angles of a target region (e.g., the eye, a portion of the face, or both) in a single image frame. In some embodiments, the camera assembly 165 may capture three-dimensional (3D) information that describes a portion of the eye 155. - In this embodiment, the
Fresnel assembly 510 includes a plurality of surfaces. The plurality of surfaces have a dichroic coating that is transmissive in the first band and reflective in the second band. Light 515 in the second band is incident on the outer surface 340 of the immersion layer 320 at a first angle. The light 515 is refracted at the outer surface 340 toward a surface 520 of the plurality of surfaces of the Fresnel assembly 510. The surface 520 reflects the light toward a surface 525 of the plurality of surfaces, and the surface 525 reflects the light 515 toward the outer surface 340 of the immersion layer 320. The light 515 refracts at a refraction point 530 and propagates toward a first position that is occupied by a camera (e.g., a camera of the camera assembly 165). Note that the light might not follow the same paths in alternative embodiments; in those cases, light coming from different paths can also allow for multiple views of the eye 155, or allow for paths where an illuminator can be inserted. - Concurrent with the light 515, light 535 in the second band is incident on the
outer surface 340 of the immersion layer 320 at a second angle that is different than the first angle. The light 535 is refracted at the outer surface 340 toward a surface 540 of the plurality of surfaces of the Fresnel assembly 510. The surface 540 reflects the light 535 back toward a portion of the outer surface 340 at an angle such that total internal reflection occurs at a TIR point 545. The light 535 is reflected toward a surface 550 of the plurality of surfaces, and the surface 550 reflects the light 535 toward the outer surface 340 of the immersion layer 320. The light 535 refracts at a refraction point 555 and propagates toward a first position that is occupied by a camera (e.g., a camera of the camera assembly 165). - Note that the light 515 and the light 535 are representative of different view angles of a target area (e.g., the eye and/or a portion of the face) being imaged by the camera, and the camera is able to capture both view angles of the eye in a single image frame. The
tracking system 130 may use the captured images for generating tracking information. - In alternate embodiments, a first set of surfaces is used for one view, and another set of surfaces is used for a different view. For example, the surfaces of the first set may be shaped to capture light specifically for a particular view, and the surfaces of the second set may be shaped to capture light specifically for a different view. In these embodiments, the surfaces of the first set and the surfaces of the second set may be positioned in separate portions of the
Fresnel assembly 510, interlaced across the entire Fresnel assembly 510, etc. Additionally, in some embodiments, the surfaces may be shaped and/or coated such that different polarizations of light and/or wavelengths of light correspond to different view angles that are captured by the camera. -
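The total internal reflection at TIR point 545 described above occurs because the ray reflected from surface 540 meets the outer surface 340 beyond the critical angle. A minimal sketch of that condition; the index 1.5 is an assumed example value, not one stated in this passage:

```python
import math

def critical_angle_deg(n_medium, n_outside=1.0):
    """Smallest internal incidence angle (from the normal) at which TIR occurs."""
    return math.degrees(math.asin(n_outside / n_medium))

# With an assumed immersion-layer index of 1.5, rays striking the outer surface 340
# at more than ~41.8 degrees from normal are reflected back into the layer (as at
# TIR point 545); shallower rays refract out (as at refraction points 530 and 555).
theta_c = critical_angle_deg(1.5)
```

Because the critical angle depends only on the index ratio, a higher-index immersion layer traps a wider cone of internal rays, which the surface geometry must be designed around.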
FIG. 6 is a diagram of a cross section 600 of a portion of an immersed Fresnel assembly 605 configured to interact with multiple positions, in accordance with an embodiment. The portion of the immersed Fresnel assembly 605 includes a portion of a Fresnel assembly 610 and a portion of the immersion layer 320 that is coupled to the Fresnel assembly 610. The Fresnel assembly 610 is substantially similar to the Fresnel assembly 120, except that, e.g., the plurality of surfaces include a first set of surfaces associated with a first position 615 and a second set of surfaces associated with a second position 620 that is different from the first position 615. The first position 615 is on a first side of the optical axis 220, and the second position 620 is on a second side of the optical axis 220. In alternate embodiments, the first position 615 and the second position 620 are symmetric with respect to the optical axis 220. - The
first position 615 is occupied by a first device 625 and the second position 620 is occupied by a second device 630. The first device 625 and the second device 630 may each be a camera (e.g., of the camera assembly 165), an illumination source 160, a display panel, or some combination thereof. The illumination source or the display panel can be LEDs, microLEDs, micro-OLEDs, display panels that show an arbitrary pattern (e.g., dots or sinusoidal patterns), some other suitable illuminators, or some combination thereof. However, at least one of the first position 615 or the second position 620 includes a camera. As illustrated in FIG. 6, the first device 625 is a camera, and the second device 630 may be a camera or an illumination source 160. In an alternate embodiment, both the first position 615 and the second position 620 include cameras of the camera assembly 165. In an alternate embodiment, the first position 615 includes a camera of the camera assembly 165 and the second position 620 includes an illumination source 160. -
FIG. 7A shows a diagram of the HMD, as a near-eye display 700, in one embodiment. In this embodiment, the HMD is a pair of augmented reality glasses. The HMD 700 presents computer-generated media to a user and augments views of a physical, real-world environment with the computer-generated media. Examples of computer-generated media presented by the HMD 700 include one or more images, video, audio, or some combination thereof. In some embodiments, audio is presented via an external device (e.g., speakers and headphones) that receives audio information from the HMD 700, a console (not shown), or both, and presents audio data based on the audio information. In some embodiments, the HMD 700 may be modified to also operate as a virtual reality (VR) HMD, a mixed reality (MR) HMD, or some combination thereof. The HMD 700 includes a frame 710 and a display assembly 720. In this embodiment, the frame 710 mounts the near-eye display 700 to the user's head. The display assembly 720 provides image light to the user. -
FIG. 7B shows a cross-section view 730 of the near-eye display 700. This view includes the frame 710, the display assembly 720, a display block 740, and the eye 155. The display assembly 720 supplies image light to the eye 155. The display assembly 720 houses the display block 740. For purposes of illustration, FIG. 7B shows the cross section 730 associated with a single display block 740 and a single eye 155, but in alternative embodiments not shown, another display block, separate from the display block 740 shown in FIG. 7B, provides image light to another eye of the user. - The
display block 740, as illustrated in FIG. 7B, is configured to combine light from a local area with light from a computer-generated image to form an augmented scene. The display block 740 is also configured to provide the augmented scene to the eyebox 150 corresponding to a location of a user's eye 155. The eyebox 150 is a region of space that would contain a user's eye while the user is wearing the HMD 700. The display block 740 may include, e.g., a waveguide display, a focusing assembly, a compensation assembly, or some combination thereof. The display block 740 is an embodiment of the display blocks discussed above with regard to FIGS. 1-3. Additionally, the display block 740 may include an immersed Fresnel assembly as discussed above with regard to FIGS. 4-6. - The
HMD 700 may include one or more other optical elements between the display block 740 and the eye 155. The optical elements may act to, e.g., correct aberrations in image light emitted from the display block 740, magnify image light emitted from the display block 740, perform some other optical adjustment of image light emitted from the display block 740, or some combination thereof. Examples of optical elements include an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, a grating, a waveguide element, or any other suitable optical element that affects image light. The display block 740 may be composed of one or more materials (e.g., plastic, glass, etc.) with one or more refractive indices that effectively minimize the weight and widen the field of view of the HMD 700. -
FIG. 8A is an example array 800 of sub-pixels on the display element 110. The example array 800 shown in FIG. 8A includes red sub-pixels 810, blue sub-pixels 820, and green sub-pixels 830. For example, the array 800 is a portion of a PENTILE® display. In other embodiments, the array 800 may be in some other configuration (e.g., RGB). - A
dark space 840 separates each sub-pixel from one or more adjacent sub-pixels. The dark space 840 is a portion of the array 800 that does not emit light, and may become visible to a user under certain circumstances (e.g., magnification), causing the screen-door effect that degrades image quality. As discussed above in conjunction with FIG. 1, the Fresnel assembly 120 can serve to reduce fixed pattern noise so that the dark space 840 between the sub-pixels is not visible to the user (e.g., by blurring each sub-pixel, creating a blur spot associated with each sub-pixel in the image). The blur spots are large enough that adjacent blur spots mask the dark space 840 between adjacent full pixels. In other words, for any display panel, the largest physical pixel fill ratio is 100%, achieved if there is no gap at all between sub-pixels. However, to completely eliminate the screen-door artifact on the panel side, the effective pixel fill ratio may need to be much greater (e.g., 300%), such that the sub-pixels of different colors overlap. This way, when, for example, only green sub-pixels are emitting light and the display is viewed with perfect viewing optics, there is no gap between the sub-pixels. This is difficult to achieve with OLED and/or LCD display panels, but it is doable with a diffractive element such as the prism array redirection structure 125. -
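The fill-ratio reasoning above can be made concrete with a one-dimensional sketch: a blur spot of radius r widens each sub-pixel image by 2r, and the dark space is hidden once the blurred images of adjacent sub-pixels overlap across it. All dimensions below are hypothetical illustration values, not figures from the disclosure:

```python
def screen_door_masked(subpixel_width_um, gap_um, blur_radius_um):
    """True when blurring closes the dark gap: the blurred sub-pixel image
    (widened by 2r) spans at least the full sub-pixel pitch."""
    blurred_width = subpixel_width_um + 2 * blur_radius_um
    pitch = subpixel_width_um + gap_um
    return blurred_width >= pitch

def effective_fill_ratio(subpixel_width_um, gap_um, blur_radius_um):
    """Fill ratio after blurring, relative to the sub-pixel pitch (can exceed 1.0,
    i.e. 100%, when blurred neighbors overlap)."""
    return (subpixel_width_um + 2 * blur_radius_um) / (subpixel_width_um + gap_um)

# Hypothetical geometry: 20 um emitters on a 30 um pitch (10 um dark space).
assert not screen_door_masked(20, 10, 2)   # a 2 um blur leaves the gap visible
assert screen_door_masked(20, 10, 6)       # a 6 um blur closes the 10 um gap
```

In these units, a 35 um blur radius on the same hypothetical geometry gives an effective fill ratio of 3.0, i.e. the 300% overlap regime described above, where differently colored sub-pixels overlap one another.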
FIG. 8B is an example illustrating adjustment of image data of the array 800 of FIG. 8A by the Fresnel assembly 120. As shown in FIG. 8B, each of the sub-pixels has an associated blur spot. Specifically, the red sub-pixels 810 have a corresponding red blur spot 860, the blue sub-pixels 820 have a corresponding blue blur spot 870, and the green sub-pixels 830 have a corresponding green blur spot 880. A blur spot is an area filled with the image of a blurred sub-pixel. So long as a blur spot does not overlap with the point of maximum intensity of an adjacent blur spot created by a sub-pixel of the same color, the two blur spots are resolvable as two adjacent pixels. In some embodiments, the blur spots of the three sub-pixels all overlap, creating a white pixel. The shape of a blur spot is not necessarily a circle; rather, it is an area including the blurred image of a sub-pixel. The redirection structure 125 of FIG. 1A can be configured to blur each sub-pixel so the blur spots mask the dark space 840 between adjacent pixels. - The foregoing description of the embodiments of the disclosure has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
- Some portions of this description describe the embodiments of the disclosure in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times, to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
- Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
- Embodiments of the disclosure may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
- Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/723,305 US10788677B2 (en) | 2017-10-03 | 2017-10-03 | Fresnel assembly for light redirection in eye tracking systems |
CN201811150298.9A CN109597201B (en) | 2017-10-03 | 2018-09-29 | Fresnel component for light redirection in eye tracking systems |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/723,305 US10788677B2 (en) | 2017-10-03 | 2017-10-03 | Fresnel assembly for light redirection in eye tracking systems |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190101767A1 true US20190101767A1 (en) | 2019-04-04 |
US10788677B2 US10788677B2 (en) | 2020-09-29 |
Family
ID=65897537
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/723,305 Active US10788677B2 (en) | 2017-10-03 | 2017-10-03 | Fresnel assembly for light redirection in eye tracking systems |
Country Status (2)
Country | Link |
---|---|
US (1) | US10788677B2 (en) |
CN (1) | CN109597201B (en) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10845594B1 (en) * | 2017-12-21 | 2020-11-24 | Facebook Technologies, Llc | Prism based light redirection system for eye tracking systems |
WO2021130739A1 (en) * | 2019-12-25 | 2021-07-01 | Lumus Ltd. | Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element |
WO2021167748A1 (en) * | 2020-02-20 | 2021-08-26 | Facebook Technologies, Llc | Dual wavelength eye imaging |
US11205069B1 (en) | 2020-02-20 | 2021-12-21 | Facebook Technologies, Llc | Hybrid cornea and pupil tracking |
US20220113537A1 (en) * | 2020-10-12 | 2022-04-14 | Dell Products L.P. | System and method of operating a display of an information handling system |
US11307420B2 (en) | 2017-07-03 | 2022-04-19 | Holovisions LLC | Augmented reality eyewear with “ghost buster” technology |
US11415808B1 (en) | 2019-02-08 | 2022-08-16 | Facebook Technologies, Llc | Illumination device with encapsulated lens |
US11442270B2 (en) * | 2017-02-27 | 2022-09-13 | Advanced New Technologies Co., Ltd. | Virtual reality head-mounted apparatus with a partial-reflection partial-transmission wedge |
US11592670B2 (en) * | 2019-08-02 | 2023-02-28 | Sharp Kabushiki Kaisha | Compact high field of view display |
US11617508B2 (en) | 2020-02-28 | 2023-04-04 | Meta Platforms Technologies, Llc | Eye-tracking fundus imaging system |
US11650403B2 (en) | 2019-02-08 | 2023-05-16 | Meta Platforms Technologies, Llc | Optical elements for beam-shaping and illumination |
US11754843B2 (en) | 2017-07-03 | 2023-09-12 | Holovisions LLC | Augmented reality eyewear with “ghost buster” technology |
US20230367041A1 (en) * | 2022-05-13 | 2023-11-16 | Meta Platforms Technologies, Llc | Reflective polarizer coated fresnel lens |
US11867900B2 (en) | 2020-02-28 | 2024-01-09 | Meta Platforms Technologies, Llc | Bright pupil eye-tracking system |
US20240020864A1 (en) * | 2021-05-31 | 2024-01-18 | Trinamix Gmbh | Auto calibration from epipolar line distance in projection pattern |
US12013538B2 (en) | 2017-07-03 | 2024-06-18 | Holovisions LLC | Augmented reality (AR) eyewear with a section of a fresnel reflector comprising individually-adjustable transmissive-reflective optical elements |
US12147056B2 (en) | 2021-08-11 | 2024-11-19 | Google Llc | Fresnel-reflection-based light pickoff element for laser-based systems |
US12205231B2 (en) | 2017-07-03 | 2025-01-21 | Holovisions | Holovisions™—adjustable and/or modular augmented reality (AR) eyewear with a movable transflective mirror and different viewing modes |
US12222500B2 (en) * | 2017-12-20 | 2025-02-11 | 3M Innovative Properties Company | Structured optical surface and optical imaging system |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11442272B2 (en) * | 2020-03-12 | 2022-09-13 | Facebook Technologies, Llc | High-resolution liquid crystal displays |
TWI858237B (en) | 2021-03-05 | 2024-10-11 | 大立光電股份有限公司 | Head-mounted device |
CN113359292B (en) * | 2021-06-30 | 2022-08-30 | 京东方科技集团股份有限公司 | Display module, manufacturing method thereof and head-mounted display device |
TWM621801U (en) | 2021-08-16 | 2022-01-01 | 大立光電股份有限公司 | Head-mounted device |
CN114442316B (en) * | 2022-02-22 | 2024-08-16 | 京东方科技集团股份有限公司 | A display module and a manufacturing method thereof, and a head-mounted display device |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030214710A1 (en) * | 2002-05-17 | 2003-11-20 | Susumu Takahashi | Three-dimensional observation apparatus |
US7656585B1 (en) * | 2008-08-19 | 2010-02-02 | Microvision, Inc. | Embedded relay lens for head-up displays or the like |
US20120002295A1 (en) * | 2009-02-25 | 2012-01-05 | Carl Zeiss Ag | Display device comprising multifunction glass, production method, and optical element having a fresnel structure |
US20150070481A1 (en) * | 2013-09-06 | 2015-03-12 | Arvind S. | Multiple Viewpoint Image Capture of a Display User |
US20160070103A1 (en) * | 2014-09-08 | 2016-03-10 | Oculus Vr, Llc | Corrective optics for reducing fixed pattern noise in a virtual reality headset |
US20160223819A1 (en) * | 2014-09-30 | 2016-08-04 | Omnivision Technologies, Inc. | Near-eye display device and methods with coaxial eye imaging |
US20170017842A1 (en) * | 2014-01-28 | 2017-01-19 | Beijing Irisking Co., Ltd | Mobile terminal iris recognition method and device having human-computer interaction mechanism |
US20170329398A1 (en) * | 2016-05-13 | 2017-11-16 | Google Inc. | Eye tracking systems and methods for virtual reality environments |
US20180113508A1 (en) * | 2016-10-21 | 2018-04-26 | Apple Inc. | Eye tracking system |
US20180203505A1 (en) * | 2017-01-17 | 2018-07-19 | Oculus Vr, Llc | Varifocal head-mounted display including modular air spaced optical assembly |
US20190101757A1 (en) * | 2017-10-02 | 2019-04-04 | Google Inc. | Eye tracking using light guide with faceted combiner |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7224526B2 (en) | 1999-12-08 | 2007-05-29 | Neurok Llc | Three-dimensional free space image projection employing Fresnel lenses |
US8585609B2 (en) | 2008-10-09 | 2013-11-19 | Neuro Kinetics, Inc. | Quantitative, non-invasive, clinical diagnosis of traumatic brain injury using simulated distance visual stimulus device for neurologic testing |
US8998414B2 (en) | 2011-09-26 | 2015-04-07 | Microsoft Technology Licensing, Llc | Integrated eye tracking and display system |
CN113568175B (en) | 2013-11-27 | 2023-06-27 | 奇跃公司 | Virtual and augmented reality systems and methods |
WO2015157016A1 (en) | 2014-04-09 | 2015-10-15 | 3M Innovative Properties Company | Head mounted display and low conspicuity pupil illuminator |
CN106797423B (en) | 2014-06-27 | 2019-12-06 | Fove股份有限公司 | Sight line detection device |
WO2016076153A1 (en) | 2014-11-11 | 2016-05-19 | シャープ株式会社 | Light guide plate and virtual image display device |
CN108352075B (en) | 2015-11-06 | 2020-01-21 | 脸谱科技有限责任公司 | Eye tracking using optical flow |
US10152121B2 (en) | 2016-01-06 | 2018-12-11 | Facebook Technologies, Llc | Eye tracking through illumination by head-mounted displays |
JP2017224942A (en) | 2016-06-14 | 2017-12-21 | フォーブ インコーポレーテッド | Head mounted display, eye gaze detection system |
CN114935863A (en) | 2017-02-23 | 2022-08-23 | 奇跃公司 | Display system with variable power reflector |
CN106932904A (en) | 2017-02-27 | 2017-07-07 | 阿里巴巴集团控股有限公司 | Virtual reality helmet |
IL303471B2 (en) | 2017-03-21 | 2024-08-01 | Magic Leap Inc | Eye-imaging apparatus using diffractive optical elements |
CN107065049A (en) | 2017-05-10 | 2017-08-18 | 杭州光粒科技有限公司 | The display prism and optical system of a kind of big angle of visual field augmented reality |
-
2017
- 2017-10-03 US US15/723,305 patent/US10788677B2/en active Active
-
2018
- 2018-09-29 CN CN201811150298.9A patent/CN109597201B/en active Active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030214710A1 (en) * | 2002-05-17 | 2003-11-20 | Susumu Takahashi | Three-dimensional observation apparatus |
US7656585B1 (en) * | 2008-08-19 | 2010-02-02 | Microvision, Inc. | Embedded relay lens for head-up displays or the like |
US20120002295A1 (en) * | 2009-02-25 | 2012-01-05 | Carl Zeiss Ag | Display device comprising multifunction glass, production method, and optical element having a fresnel structure |
US20150070481A1 (en) * | 2013-09-06 | 2015-03-12 | Arvind S. | Multiple Viewpoint Image Capture of a Display User |
US20170017842A1 (en) * | 2014-01-28 | 2017-01-19 | Beijing Irisking Co., Ltd | Mobile terminal iris recognition method and device having human-computer interaction mechanism |
US20160070103A1 (en) * | 2014-09-08 | 2016-03-10 | Oculus Vr, Llc | Corrective optics for reducing fixed pattern noise in a virtual reality headset |
US20160223819A1 (en) * | 2014-09-30 | 2016-08-04 | Omnivision Technologies, Inc. | Near-eye display device and methods with coaxial eye imaging |
US20170329398A1 (en) * | 2016-05-13 | 2017-11-16 | Google Inc. | Eye tracking systems and methods for virtual reality environments |
US20180113508A1 (en) * | 2016-10-21 | 2018-04-26 | Apple Inc. | Eye tracking system |
US20180203505A1 (en) * | 2017-01-17 | 2018-07-19 | Oculus Vr, Llc | Varifocal head-mounted display including modular air spaced optical assembly |
US20190101757A1 (en) * | 2017-10-02 | 2019-04-04 | Google Inc. | Eye tracking using light guide with faceted combiner |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11442270B2 (en) * | 2017-02-27 | 2022-09-13 | Advanced New Technologies Co., Ltd. | Virtual reality head-mounted apparatus with a partial-reflection partial-transmission wedge |
US11307420B2 (en) | 2017-07-03 | 2022-04-19 | Holovisions LLC | Augmented reality eyewear with “ghost buster” technology |
US11754843B2 (en) | 2017-07-03 | 2023-09-12 | Holovisions LLC | Augmented reality eyewear with “ghost buster” technology |
US12013538B2 (en) | 2017-07-03 | 2024-06-18 | Holovisions LLC | Augmented reality (AR) eyewear with a section of a fresnel reflector comprising individually-adjustable transmissive-reflective optical elements |
US12205231B2 (en) | 2017-07-03 | 2025-01-21 | Holovisions | Holovisions™—adjustable and/or modular augmented reality (AR) eyewear with a movable transflective mirror and different viewing modes |
US12222500B2 (en) * | 2017-12-20 | 2025-02-11 | 3M Innovative Properties Company | Structured optical surface and optical imaging system |
US10845594B1 (en) * | 2017-12-21 | 2020-11-24 | Facebook Technologies, Llc | Prism based light redirection system for eye tracking systems |
US11579451B1 | 2017-12-21 | 2023-02-14 | Meta Platforms Technologies, LLC | Prism based light redirection system for eye tracking systems |
US11415808B1 (en) | 2019-02-08 | 2022-08-16 | Facebook Technologies, Llc | Illumination device with encapsulated lens |
US11650403B2 (en) | 2019-02-08 | 2023-05-16 | Meta Platforms Technologies, Llc | Optical elements for beam-shaping and illumination |
US11592670B2 (en) * | 2019-08-02 | 2023-02-28 | Sharp Kabushiki Kaisha | Compact high field of view display |
WO2021130739A1 (en) * | 2019-12-25 | 2021-07-01 | Lumus Ltd. | Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element |
US12019249B2 (en) | 2019-12-25 | 2024-06-25 | Lumus Ltd. | Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element |
US11205069B1 (en) | 2020-02-20 | 2021-12-21 | Facebook Technologies, Llc | Hybrid cornea and pupil tracking |
US11108977B1 (en) | 2020-02-20 | 2021-08-31 | Facebook Technologies, Llc | Dual wavelength eye imaging |
WO2021167748A1 (en) * | 2020-02-20 | 2021-08-26 | Facebook Technologies, Llc | Dual wavelength eye imaging |
US11617508B2 (en) | 2020-02-28 | 2023-04-04 | Meta Platforms Technologies, Llc | Eye-tracking fundus imaging system |
US11867900B2 (en) | 2020-02-28 | 2024-01-09 | Meta Platforms Technologies, Llc | Bright pupil eye-tracking system |
US11598955B2 (en) * | 2020-10-12 | 2023-03-07 | Dell Products L.P. | System and method of operating a display of an information handling system |
US20220113537A1 (en) * | 2020-10-12 | 2022-04-14 | Dell Products L.P. | System and method of operating a display of an information handling system |
US20240020864A1 (en) * | 2021-05-31 | 2024-01-18 | Trinamix Gmbh | Auto calibration from epipolar line distance in projection pattern |
US12087003B2 (en) * | 2021-05-31 | 2024-09-10 | Trinamix Gmbh | Auto calibration from epipolar line distance in projection pattern |
US12147056B2 (en) | 2021-08-11 | 2024-11-19 | Google Llc | Fresnel-reflection-based light pickoff element for laser-based systems |
WO2023220378A1 (en) * | 2022-05-13 | 2023-11-16 | Meta Platforms Technologies, Llc | Reflective polarizer coated fresnel lens |
US20230367041A1 (en) * | 2022-05-13 | 2023-11-16 | Meta Platforms Technologies, Llc | Reflective polarizer coated fresnel lens |
Also Published As
Publication number | Publication date |
---|---|
US10788677B2 (en) | 2020-09-29 |
CN109597201A (en) | 2019-04-09 |
CN109597201B (en) | 2021-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10788677B2 (en) | Fresnel assembly for light redirection in eye tracking systems | |
US10598928B1 (en) | Light redirection structures for eye tracking systems | |
US11579451B1 (en) | Prism based light redirection system for eye tracking systems | |
US12099197B2 (en) | Head-mounted display eyeball tracker integrated system | |
US11327311B2 (en) | Field curvature corrected display | |
US11933975B2 (en) | Eye tracking based on waveguide imaging | |
US10437051B2 (en) | Apparatus for eye tracking | |
US10502963B1 (en) | Immersed fresnel structure with curable liquid polymer | |
CN102004317B (en) | Spectacles-type image display device | |
US11416071B1 (en) | Infrared transparent backlight device | |
KR101556839B1 (en) | Eyepiece for near-to-eye display with multi-reflectors | |
CN104423044B (en) | virtual image display device | |
US10690929B1 (en) | Beamsplitter assembly for eye tracking in head-mounted displays | |
US9915823B1 (en) | Lightguide optical combiner for head wearable display | |
US10474229B1 (en) | Folded viewing optics with high eye tracking contrast ratio | |
US20100290127A1 (en) | Head-mounted optical apparatus using an oled display | |
US10609364B2 (en) | Pupil swim corrected lens for head mounted display | |
US10488921B1 (en) | Pellicle beamsplitter for eye tracking | |
US10928635B1 (en) | Curved display assembly for artificial reality headset | |
US10877274B1 (en) | Composite optical element for eye tracking having beam splitter formed by coupling of junction surfaces | |
US10733439B1 (en) | Imaging retina in head-mounted displays | |
US10572731B1 (en) | Infrared transparent backlight device for eye tracking applications | |
US20230194872A1 (en) | Head-mounted display device | |
US12276796B2 (en) | Light field directional backlighting based three-dimensional (3D) pupil steering | |
US20240061246A1 (en) | Light field directional backlighting based three-dimensional (3d) pupil steering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: OCULUS VR, LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GENG, YING;SULAI, YUSUFU NJONI BAMAXAM;OUDERKIRK, ANDREW JOHN;AND OTHERS;SIGNING DATES FROM 20171108 TO 20171115;REEL/FRAME:044321/0850 |
|
AS | Assignment |
Owner name: FACEBOOK TECHNOLOGIES, LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:OCULUS VR, LLC;REEL/FRAME:047178/0616 Effective date: 20180903 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: META PLATFORMS TECHNOLOGIES, LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:FACEBOOK TECHNOLOGIES, LLC;REEL/FRAME:060315/0224 Effective date: 20220318 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |