US9077901B2 - Light field image capture device having 2D image capture mode - Google Patents
- Publication number
- US9077901B2 (application US14/480,240)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04N5/23245
- G02B27/0075—Optical systems or apparatus with means for altering, e.g. increasing, the depth of field or depth of focus
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements, for lenses
- G02B13/0015—Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras, characterised by the lens design
- G02B3/0006—Simple or compound lenses; Arrays
- G02B3/12—Fluid-filled or evacuated lenses
- G02B7/023—Mountings for lenses permitting adjustment
- G02B7/08—Mountings for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
- H04N13/0235
- H04N13/026
- H04N13/232—Image signal generators using stereoscopic image cameras using a single 2D image sensor using fly-eye lenses, e.g. arrangements of circular lenses
- H04N13/236—Image signal generators using stereoscopic image cameras using a single 2D image sensor using varifocal lenses or mirrors
- H04N13/261—Image signal generators with monoscopic-to-stereoscopic image conversion
- H04N13/289—Switching between monoscopic and stereoscopic modes
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- H04N23/957—Light-field or plenoptic cameras or camera modules
- H04N13/296—Synchronisation thereof; Control thereof
Definitions
- the present description relates to light field imaging devices and applications, and more particularly to mechanisms for facilitating both light field imaging and conventional 2D imaging within the same camera system.
- Light field capture devices are defined herein as any devices that are capable of capturing light field data, optionally processing light field data, optionally accepting and acting upon user input, and/or optionally displaying or otherwise outputting images and/or other types of data.
- Light field capture devices such as plenoptic cameras may capture light field data using any suitable method for doing so.
- a method includes, without limitation, using a microlens array (MLA) disposed between a main imaging lens and an image sensor (e.g., a CCD or CMOS sensor) as described in Ng et al., Light field photography with a hand-held plenoptic capture device, Technical Report CSTR 2005-02, Stanford Computer Science.
- MLA: microlens array
- the MLA is disposed within the optical path, in front of and close to (or in contact with) the image sensor.
- the main lens collects light from the scene and projects this onto the MLA and image sensor combination (also known as the light field sensor).
- Each lens of the MLA images a portion of the exit pupil image from the main lens onto the image sensor as a series of disk images.
- Each lens of the MLA records a slightly different portion of the exit pupil of the main lens. This difference in the projected MLA disk images can be used to compute the angular direction of the light rays at each pixel location.
- this system is equivalent to a sub-aperture array based plenoptic camera.
- the MLA is usually attached to the image sensor surface at a fixed distance that is optimized to achieve the best spatial and angular resolution for the application of the light field camera.
- the MLA can be designed with different microlens pitches and focal distances to enable higher spatial resolution or to enable higher angular resolution.
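The pitch/focal-distance tradeoff described above can be made concrete with a quick calculation. The sketch below is purely illustrative; the sensor dimensions, pixel pitch, and MLA pitch are assumed values, not taken from this disclosure:

```python
def plenoptic_resolution(sensor_px_w, sensor_px_h, pixel_pitch_um, mla_pitch_um):
    """Estimate spatial vs. angular sampling of an MLA-based plenoptic camera.

    Spatial resolution is limited by the number of microlenses covering the
    sensor; angular resolution by the number of sensor pixels behind each
    microlens (the diameter of each disk image, in pixels).
    """
    px_per_lens = mla_pitch_um / pixel_pitch_um           # pixels under one microlens
    spatial = (sensor_px_w / px_per_lens, sensor_px_h / px_per_lens)
    angular = px_per_lens ** 2                            # angular samples per microlens
    return spatial, angular

# Assumed figures: a 4000x3000 sensor, 1.25 um pixels, 12.5 um MLA pitch.
spatial, angular = plenoptic_resolution(4000, 3000, 1.25, 12.5)
print(spatial)   # (400.0, 300.0): effective spatial resolution in microlenses
print(angular)   # 100.0: 10x10 angular samples behind each microlens
```

Halving the MLA pitch would double the spatial sample count in each axis at the cost of a fourfold reduction in angular samples per microlens, which is the tradeoff the MLA designer navigates.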
- Various embodiments provide mechanisms to enable a dual-mode light field camera or plenoptic camera to function as a conventional 2D camera.
- the gap separation between the MLA and image sensor is fixed to achieve optimal spatial resolution and angular resolution to achieve refocusing and/or 3D imaging of the light field.
- Various techniques provide enhancements to such cameras to enable them to perform both 3D light field imaging and conventional high-resolution 2D imaging, depending on the selected mode.
- various embodiments include an active system that enables the microlenses to be optically or effectively turned on or turned off, allowing the camera to selectively operate as a 2D imaging camera or a 3D light field camera.
- FIG. 1 depicts an example of an architecture for a light field image data acquisition device, such as an MLA-based plenoptic camera.
- FIG. 2 depicts an example of a functional architecture for a light field image data acquisition device, such as a camera, according to one embodiment.
- FIG. 3 depicts an example of construction of a light field sensor assembly with a fixed spacer defining the gap separation between the MLA layer and the image sensor.
- FIG. 4 depicts an architecture for a light field camera suitable for mobile applications, wherein the MLA can be optically disabled to enable higher resolution 2D image capture, according to one embodiment.
- FIG. 5 depicts an example of a conventional non-MLA type camera module equipped with VCM actuation.
- FIG. 6 depicts an exemplary embodiment for VCM-based actuation of the MLA for two-position operation, according to one embodiment.
- FIGS. 7-1 and 7-2 depict exemplary simulation results of moving the MLA to various positions in front of the image sensor plane, according to one embodiment.
- FIG. 8 depicts exemplary results collected by an embodiment with zero separation between MLA and sensor.
- FIG. 9 depicts an example of a MEMS-based actuator for use in connection with one embodiment.
- FIG. 10 depicts an exemplary embodiment of the use of index matching fluid to selectively remove MLA lens focus ability.
- FIG. 11 depicts an exemplary embodiment of the use of index matching fluid in a cavity adjacent to the MLA, wherein a low-wetting coating is applied to the MLA and opposing cavity wall.
- FIG. 12 depicts an exemplary embodiment of the use of liquid crystal in a cavity adjacent to the MLA, wherein the liquid crystal has optical properties dependent on the orientation of the liquid crystal molecules.
- FIG. 13 depicts a default orientation of liquid crystal molecules relative to the MLA (with no electric field applied between the electrodes), according to one embodiment.
- FIG. 14 depicts re-orientation of liquid crystal molecules after application of an electric field between the electrodes, according to one embodiment.
- FIGS. 15 and 16 depict an embodiment in which a microlens array is defined in liquid crystal.
- an electric field has been applied between the two ITO layers to turn the liquid crystal off, so as to eliminate the index gradient.
- FIGS. 17 and 18 depict an embodiment wherein a microlens array can be deactivated using electro-optic material.
- no voltage is applied, so that the electro-optic material has a uniform refractive index and does not act like a lens.
- FIG. 19 is a flowchart depicting a method for switching between a 2D imaging mode and a 3D light field capture mode, according to one embodiment.
- FIG. 20 depicts an example of a camera wherein a secondary optical element is positioned in the optical path, according to one embodiment.
- FIGS. 21 and 22 depict two positions of a movable secondary optical element with relation to a deformable microlens array, according to one embodiment.
- the term “camera” is used herein to refer to an image capture device or other data acquisition device.
- a data acquisition device can be any device or system for acquiring, recording, measuring, estimating, determining and/or computing data representative of a scene, including but not limited to two-dimensional image data, three-dimensional image data, and/or light field data.
- Such a data acquisition device may include optics, sensors, and image processing electronics for acquiring data representative of a scene, using techniques that are well known in the art, are disclosed herein, or could be conceived by a person of skill in the art with the aid of the present disclosure.
- Various embodiments include an active system that enables the microlenses to be optically or effectively turned on or turned off, allowing the camera to selectively operate as a 2D imaging camera or a 3D light field camera.
- Referring now to FIG. 1, there is shown an example of an architecture for a light field image data acquisition device, such as an MLA-based plenoptic camera 100, according to one embodiment.
- FIG. 1 is not shown to scale.
- FIG. 1 shows, in conceptual form, the relationship between aperture 112 , main lens 113 , microlens array 102 , and sensor 103 , as such components interact to capture light field data for subject 201 .
- microlens array 102 is placed within the optical path of camera 100 , between main lens 113 and sensor 103 .
- Referring now to FIG. 2, there is shown a block diagram depicting a functional architecture for a light field image data acquisition device, such as a camera 100, according to one embodiment.
- Camera 100 may be constructed using the architecture depicted in FIG. 1 .
- One skilled in the art will recognize that the architectures of FIGS. 1 and 2 are merely exemplary, and that other architectures are possible.
- Some of the components depicted in FIGS. 1 and 2 are optional, and may be omitted or reconfigured.
- Other components as known in the art may additionally or alternatively be added.
- camera 100 may be a light field camera that includes light field image data acquisition device 109 having optics 101 , image sensor or sensor 103 (including a plurality of individual sensors for capturing pixels), and microlens array 102 .
- Optics 101 may include, for example, aperture 112 for allowing a selectable amount of light into camera 100 , and main lens 113 for focusing light toward microlens array 102 .
- microlens array 102 may be disposed and/or incorporated in the optical path of camera 100 (between main lens 113 and sensor 103 ) so as to facilitate acquisition, capture, sampling of, recording, and/or obtaining light field image data via sensor 103 .
- MLA 102 may be constructed using any suitable material, including for example a deformable material or a non-deformable material.
- An example of a deformable material is an optically transparent polymer.
- Examples of non-deformable materials include optically transparent polymers and optically transparent glass.
- any other suitable material can be used.
- optics 101 may also optionally include a secondary optical element 117 .
- This can be any element or component that transmits, blocks, or refracts light moving through it.
- Examples of secondary optical element 117 include, without limitation, a lens, an LCD, a flat or curved piece of polymer or glass, or the like.
- Element 117 can be affixed to or attached to any other component(s), such as for example MLA 102 , sensor 103 , main lens 113 , or the like, or it can be separate from such components.
- In some embodiments, element 117 is in the optical path of light entering camera 100. In other embodiments, element 117 can be omitted.
- Sensor 103 can be of any suitable type, such as for example a CMOS sensor.
- camera 100 may also include a user interface 105 , which may include any suitable input device for allowing a user to provide input for controlling the operation of camera 100 for capturing, acquiring, storing, and/or processing image data.
- the techniques described herein provide mechanisms for display of depth information in connection with user interface 105 . Such depth information can be displayed, for example, on display device 114 which may be a display screen on camera 100 .
- camera 100 may include memory 111 for storing image data, such as output by sensor 103 .
- the memory 111 can include external and/or internal memory.
- memory 111 can be provided at a separate device and/or location from camera 100 .
- camera 100 may store raw light field image data, as output by sensor 103 , and/or a representation thereof, such as a compressed image data file.
- memory 111 can also store data representing the characteristics, parameters, and/or configurations of camera 100 and/or its components.
- captured image data is provided to post-processing circuitry 104 .
- processing circuitry 104 may be disposed in or integrated into light field image data acquisition device 109 , as shown in FIG. 2 , or it may be in a separate post-processing device (not shown) external to light field image data acquisition device 109 .
- Such separate component may be local or remote with respect to light field image data acquisition device 109 .
- the post-processing circuitry 104 may include a processor of any known configuration, including microprocessors, ASICS, and the like.
- camera 100 includes MLA enabling/disabling mechanism 115 , which selectively enables or disables MLA 102 so as to provide dual modes wherein the camera alternatively functions as either a light field camera or a conventional 2D camera, as described in more detail below.
- Mechanism 115 can be coupled to any components within camera 100 , including for example MLA 102 and/or secondary optical element 117 (if included).
- camera 100 may also include control circuitry 110 for facilitating acquisition, sampling, recording, and/or obtaining light field image data.
- control circuitry 110 may manage and/or control (automatically or in response to user input) the acquisition timing, rate of acquisition, sampling, capturing, recording, and/or obtaining of light field image data.
- control circuitry 110 also sends control signals to MLA enabling/disabling mechanism 115 to cause mechanism 115 to switch modes, for example under the control of user interface 105 .
- camera 100 may optionally include scene analysis module 116 for analyzing a scene to automatically determine whether a 2D imaging mode or a light field imaging mode should be used.
- Referring now to FIG. 3, there is shown an example of construction of a light field sensor assembly 300 with a fixed spacer 305 defining the gap separation 307 between the MLA layer 102 and image sensor 103.
- Sensor 103 is affixed to printed circuit board 303 , which may contain the hardware circuitry for processing light field data and/or storing or relaying such data to other components.
- Wire bonds 304 are an example of a mechanism for holding sensor 103 in place with respect to printed circuit board 303 .
- Encapsulation 306 forms a casing around the assembly to protect and stabilize it.
- MLA layer 102 may be formed from substrate 302 , which may be made of glass or any other suitable material.
- spacer 305 is fixed and may be constructed of any suitable material. Spacer 305 can be affixed to sensor 103 and/or to encapsulation 306 and/or to substrate 302 .
- Referring now to FIG. 4, there is shown an example of an architecture for a light field camera 100 suitable for mobile applications (as well as other applications), wherein MLA 102 can be optically disabled to enable higher-resolution 2D image capture.
- MLA 102 is positioned in front of sensor 103 and can be constructed from substrate 302 .
- In a first configuration, MLA 102 is enabled, so that camera 100 functions as a light field image acquisition device.
- In a second configuration, MLA 102 has been optically disabled, so that camera 100 functions as a conventional 2D camera.
- camera 100 provides functionality for switching between a light field acquisition mode and a conventional image acquisition mode.
- camera 100 includes MLA enabling/disabling mechanism 115 , so as to provide such functionality.
- Referring now to FIG. 20, there is shown an alternative embodiment for a camera 100, wherein a secondary optical element 117 is positioned in the optical path between main lens 113 and MLA 102.
- In the method depicted in FIG. 19, camera 100 is initialized 1901 to a default mode, either a 2D imaging mode or a light field capture mode.
- This default mode can be preset, or it can be the same mode that was in effect the last time camera 100 was used, or it can be chosen by some other means. As described in more detail below, any of a number of techniques can be used to configure camera 100 to be in the default mode.
- MLA enabling/disabling mechanism 115 receives 1902 a signal to change to the other mode.
- Such signal can be triggered, for example, by a user command entered via user interface 105 , or automatically by detecting that one or the other mode is more suitable to the particular image subject at hand, or by some other means.
- MLA enabling/disabling mechanism 115 causes MLA 102 to be enabled or disabled 1903 accordingly. More particularly, if 2D imaging mode is desired, MLA 102 is optically or effectively disabled; conversely, if light field capture mode is desired, MLA 102 is optically or effectively enabled. Again, any suitable technique, including but not limited to those described below, can be used for optically or effectively enabling/disabling MLA 102 . If, in step 1903 , it is determined that MLA 102 is already in the desired enabled/disabled state, then no action need be taken.
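The initialization and mode-switch flow of steps 1901-1903 can be sketched as a small state machine. This is a hypothetical illustration: the class and method names are not from the disclosure, and the hardware action is reduced to a boolean flag.

```python
from enum import Enum

class Mode(Enum):
    IMAGE_2D = "2D imaging"
    LIGHT_FIELD = "light field capture"

class MlaMechanism:
    """Stand-in for MLA enabling/disabling mechanism 115."""
    def __init__(self):
        self.mla_enabled = True

    def set_enabled(self, enabled):
        # Step 1903: act only if the MLA is not already in the desired state.
        if self.mla_enabled != enabled:
            self.mla_enabled = enabled

class Camera:
    def __init__(self, default_mode=Mode.LIGHT_FIELD):
        self.mechanism = MlaMechanism()
        self.set_mode(default_mode)        # step 1901: initialize to default mode

    def set_mode(self, mode):              # step 1902: signal to change mode
        self.mode = mode
        # 2D imaging -> MLA disabled; light field capture -> MLA enabled.
        self.mechanism.set_enabled(mode is Mode.LIGHT_FIELD)

cam = Camera()
cam.set_mode(Mode.IMAGE_2D)
print(cam.mechanism.mla_enabled)   # False: MLA optically disabled for 2D capture
```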
- mechanism 115 can take any of a number of different forms; examples of techniques for optically or effectively enabling and disabling MLA 102 according to various embodiments are described below.
- mechanism 115 can take whatever form is suitable for performing the above-described operations so as to enable and disable MLA 102 .
- camera 100 can provide active feedback regarding the relative position of MLA 102 in any of the above-described embodiments.
- the enablement and disablement of MLA 102 in any of the above-described embodiments can be algorithmically determined based on collected parameters of the light field.
- scene analysis module 116 is included, which examines the depth information of each pixel and applies an algorithm to determine, based upon a predetermined or user-defined threshold, which mode would best represent the scene within the camera system's field of view.
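One plausible, purely illustrative form of such an algorithm compares the depth spread of the scene against the threshold. The function name, depth representation, and default threshold below are assumptions, not values from this disclosure:

```python
def choose_mode(pixel_depths, threshold=0.5):
    """Pick a capture mode from per-pixel depth estimates (arbitrary units).

    A scene spanning a wide depth range benefits from light field capture
    (refocusing, 3D imaging); a nearly flat scene is better served by
    high-resolution 2D capture.
    """
    depth_spread = max(pixel_depths) - min(pixel_depths)
    return "light_field" if depth_spread > threshold else "2d"

print(choose_mode([0.1, 0.3, 0.9]))   # light_field: large depth spread
print(choose_mode([0.40, 0.42]))      # 2d: nearly flat scene
```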
- angular resolution is traded off for higher spatial resolution by moving MLA 102 closer to the surface of the image sensor 103 .
- MLA 102 can be moved to a specific distance from image sensor 103 (or from secondary optical element 117 ) so that the effective optical properties of MLA 102 can be completely neutralized.
- a minor visual perturbation of the image at the microlens interstitial region can be introduced.
- This minor loss of information at the interstitial region can be corrected by various computational methods, for example by interpolation of pixels surrounding the interstitial regions.
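A minimal sketch of such neighbour interpolation, assuming the interstitial pixels have already been identified by a boolean mask (pure Python; 4-neighbour averaging is one illustrative choice among many):

```python
def fill_interstitial(image, mask):
    """Replace masked (interstitial) pixels with the mean of their valid
    4-neighbours. `image` and `mask` are equal-shaped 2D lists."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            neighbours = [image[ny][nx]
                          for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                          if 0 <= ny < h and 0 <= nx < w and not mask[ny][nx]]
            if neighbours:
                out[y][x] = sum(neighbours) / len(neighbours)
    return out

image = [[1.0, 1.0, 1.0],
         [1.0, 0.0, 1.0],     # centre pixel darkened by the interstitial region
         [1.0, 1.0, 1.0]]
mask = [[False, False, False],
        [False, True, False],
        [False, False, False]]
print(fill_interstitial(image, mask)[1][1])   # 1.0: restored from its neighbours
```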
- a calibration step is performed to characterize the interstitial pattern by imaging a diffuse white screen on image sensor 103 . This calibration data is called a modulation image.
- the inverse of the modulation image is multiplied with a captured image on a per-pixel basis. This process is called demodulation and it removes the intensity variations from the image due to MLA 102 .
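In code, demodulation amounts to a per-pixel division of the captured image by the modulation image (equivalently, multiplication by its inverse). The epsilon guard against near-zero modulation values is an implementation assumption:

```python
def demodulate(captured, modulation, eps=1e-6):
    """Remove MLA-induced intensity variation: divide each captured pixel
    by the corresponding modulation-image pixel. Both images are
    equal-shaped 2D lists of floats."""
    return [[c / max(m, eps) for c, m in zip(c_row, m_row)]
            for c_row, m_row in zip(captured, modulation)]

# A pixel attenuated by a modulation factor of 0.5 is restored to full scale.
print(demodulate([[0.5, 0.25]], [[0.5, 0.5]]))   # [[1.0, 0.5]]
```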
- Moving MLA 102 sufficiently close to image sensor 103 (or to secondary optical element 117 ) causes MLA 102 to become optically inactive and lose its ability to focus light; this occurs because the microlenses are so far away from the optimal focus position.
- This is similar in principle to moving a magnifying glass, initially held at its maximum-magnification position, closer to the object: when the magnifying glass moves spatially close enough to the object, it loses the lens effect, and the object appears as it does without the magnifying lens in place.
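A simple ray-optics bound illustrates why shrinking the gap neutralizes the microlenses. The 32 um and 4 um gaps echo the separations discussed with FIGS. 7-1 and 7-2; the microlens pitch and focal length below are assumed values for illustration:

```python
def max_ray_displacement_um(gap_um, mla_pitch_um, mla_focal_um):
    """Upper bound on the lateral shift a microlens imposes on a ray before
    it reaches the sensor: a marginal ray at the lens edge is deflected by
    roughly (pitch/2)/focal_length, then travels across the gap."""
    return gap_um * (mla_pitch_um / 2.0) / mla_focal_um

# Assumed 14 um microlens pitch and 28 um focal length.
print(max_ray_displacement_um(32.0, 14.0, 28.0))   # 8.0 um: spans several pixels,
                                                   # so distinct disk images form
print(max_ray_displacement_um(4.0, 14.0, 28.0))    # 1.0 um: below a typical pixel,
                                                   # so the MLA is effectively inert
```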
- VCM: voice coil motor
- AF: auto-focus
- Referring now to FIG. 5, there is shown an example of a conventional non-MLA type camera module 500 equipped with VCM actuation.
- VCM 501 is used to change the position of components such as main lens 113 .
- the right-hand side of FIG. 5 depicts further details of VCM 501 .
- Permanent magnets 504 are affixed to base 505 .
- Coil 503 introduces a magnetic field that causes the assembly to move when electrical current is applied. This causes lens holder 508 and main lens 113 to shift position accordingly.
- Spring plates 506 cause the components to return to their original positions when the current is switched off.
- Yoke 507 guides the motion of VCM 501 and provides structural support for the components.
- Referring now to FIG. 6, there is shown an exemplary embodiment for implementing VCM-based actuation of MLA 102 for two-position operation.
- the left-hand side of FIG. 6 shows MLA 102 in a position for light field imaging (i.e., with MLA 102 spaced apart from sensor 103 ), while the right-hand side shows MLA 102 in a position for high-resolution 2D imaging (i.e., with MLA 102 very close to sensor 103 ).
- Actuator 601 moves MLA 102 (along with substrate 302 ) from one position to the other in accordance with user commands or automated switching from one mode to the other.
- FIG. 6 thus depicts one embodiment for configuring actuator 601 to operate between a first stop position for light field imaging and a second position for high-resolution 2D imaging.
- One shortcoming of VCM actuation in conventional AF systems is that the tilt control of main lens 113 is not very good (approximately +/-0.2 degrees) when the VCM is not stopped against a mechanical limiting mechanism.
- VCM actuator 601 can over-drive MLA 102 into a precisely fabricated mechanical stop position with respect to the surface of image sensor 103 , and thereby mitigate any tilt performance shortcomings.
- a second VCM actuator can be incorporated for the AF actuation portion. This is shown in FIG. 6 by the presence of second VCM actuator 501 , which performs a function similar to that depicted in FIG. 5 .
- Second VCM actuator 501 is optional, but may be advantageous because it provides actuation capability for main lens 113 to adjust focus when MLA 102 is in the high-resolution 2D imaging position. Furthermore, when in the light field imaging position, the adjustability of main lens 113 position allows additional adjustment of the refocus range.
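The over-drive-into-stop behaviour amounts to clamping: commanding the actuator past the stop guarantees that the MLA rests against the precisely fabricated stop regardless of small VCM position or tilt errors. A trivial model (the 4 um stop gap is an assumed value for illustration):

```python
def final_gap_um(commanded_gap_um, stop_gap_um=4.0):
    """Model of two-position actuation against a mechanical stop: the stop
    clamps the MLA-to-sensor gap from below, so any over-driven command
    parks the MLA reproducibly at the stop position."""
    return max(commanded_gap_um, stop_gap_um)

print(final_gap_um(2.0))    # 4.0: over-driven command lands on the stop
print(final_gap_um(32.0))   # 32.0: light field position, clear of the stop
```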
- Referring now to FIG. 9, there is shown a MEMS-based actuator 901. Any suitable actuator can be used, whether or not it is VCM-controlled.
- Example of actuators include microelectromechanical systems (MEMS) actuators, shape memory alloys, piezo-based transducers, electroactive polymer based transducers and other micro-actuation devices.
- MEMS-based actuation of the MLA between the optically active and optically inactive positions in the assembly may be performed to an acceptable accuracy without the use of physical stops included in the VCM-based configuration depicted in FIG. 6 .
- Accuracy of MLA positioning may actually exceed that achieved in the configuration with stops, in cases where debris and/or stray particulate matter becomes lodged between one of the stops and the moving MLA sub-assembly.
- Positioning accuracy of the MLA sub-assembly is further improved through the use of real-time active feedback of image quality.
- shape memory alloy based actuators based on copper-aluminum-nickel or nickel-titanium alloy systems can be used; these may provide for faster switching between modes and for a smaller footprint than traditional VCM actuators.
- MLA 102 may be constructed using a deformable material, so that it can conform to the surface of sensor 103 (and/or secondary optical element 117 ) when pressed against it.
- the deformable MLA 102 can therefore provide even more direct pass-through of light to sensor 103 , and can provide higher 2D resolutions by improving the degree to which MLA 102 can be disabled.
- MLA 102 may be constructed of a non-deformable material.
- rather than (or in addition to) moving MLA 102 , secondary optical element 117 itself can be moved so that its position relative to MLA 102 causes MLA 102 to be enabled or disabled.
- Referring now to FIGS. 21 and 22, there are shown examples of two positions of a movable secondary optical element 117 with relation to a deformable microlens array 102 , according to one embodiment.
- secondary optical element 117 can be, for example and without limitation, a lens, an LCD, a flat or curved piece of polymer or glass, or the like.
- secondary optical element 117 is moved with relation to MLA 102 , so as to change the optical properties of MLA 102 .
- MLA 102 is enabled, since there is space between it and secondary optical element 117 .
- MLA 102 is disabled, since it is now in contact with secondary optical element 117 .
- MLA 102 is deformed as a result of contact with element 117 , which deformation improves the contact with element 117 .
- Referring now to FIGS. 7-1 and 7-2, there are shown exemplary simulation results of moving MLA 102 to various positions in front of the image sensor plane, according to one embodiment.
- the effect of the MLA pattern can be corrected using algorithmic interpolation with a modulation image.
- FIG. 7-1 shows an example image 700 resulting from a light field camera system such as camera 100 , with 32 um separation between MLA 102 and image sensor 103 ; this corresponds to light field imaging mode.
- FIG. 7-2 shows an example image 701 resulting from the same light field camera system, with 4 um separation between MLA 102 and image sensor 103 ; this corresponds to high-resolution 2D imaging mode.
- FIG. 7-2 also shows image 702 , which is the result of application of algorithmic interpolation to image 701 , to visually remove minor visual perturbation created by MLA 102 on image sensor 103 , for example by using demodulation as described above.
- Example image 800 results from the same light field camera system, with 0 um separation between MLA 102 and image sensor 103 ; this corresponds to high-resolution 2D imaging mode.
- Image 801 is the result of application of demodulation calibration to remove interstitial MLA effects.
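The demodulation calibration referred to above can be sketched as a flat-field correction: divide the captured frame by a normalized modulation image recorded under uniform illumination with the MLA in the same (disabled) position. The patent does not give the exact formula; the division-based scheme below is a standard assumption:

```python
import numpy as np

def demodulate(raw: np.ndarray, modulation: np.ndarray,
               eps: float = 1e-6) -> np.ndarray:
    """Remove the residual MLA pattern from a 2D-mode capture.

    `modulation` is a calibration frame captured under uniform
    illumination; dividing by its normalized version cancels the
    per-pixel attenuation imposed by the microlens pattern."""
    flat = modulation / modulation.mean()   # normalize to unit average gain
    return raw / np.maximum(flat, eps)      # eps guards against near-zero pixels
```

With a flat scene this recovers the original intensities exactly; on real captures it suppresses the minor visual perturbation created by MLA 102.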
- MLA 102 can be moved out of the optical path altogether, for example by moving it in a sideways direction (i.e. parallel to the plane of image sensor 103 ). In this embodiment, MLA 102 is rendered optically ineffective by moving it to a position where light rays no longer pass through MLA 102 on their way to image sensor 103 .
- Referring now to FIG. 10, there is shown an example of an embodiment wherein the effect of MLA 102 is selectively removed by using a pump, such as micro-pump 1001 , to move index-matching material, such as a fluid, between MLA 102 and image sensor 103 .
- the index of refraction of the fluid is matched to the index of the MLA polymer material, so that when MLA 102 is covered with the fluid, the microlenses no longer act as focusing lenses.
- in configuration 1000 , the fluid has been pumped into cavity 1003 adjacent to MLA 102 , removing the focusing ability of the MLA lenses.
- in configuration 1001 , the fluid has been pumped out of cavity 1003 and is stored in reservoir 1002 .
- Pump and valve assembly 1001 is used to selectively move the fluid between cavity 1003 and reservoir 1002 , thereby switching between high-resolution 2D imaging mode (as shown in configuration 1000 ) and light field imaging mode (as shown in configuration 1001 ). Any suitable type of pump and valve can be used in assembly 1001 .
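The index-matching principle can be illustrated with the thin-lens approximation for a plano-convex microlens immersed in a surrounding medium: the optical power vanishes as the medium's index approaches that of the lens polymer. The numeric indices and radius below are illustrative values, not taken from the patent:

```python
def microlens_power(n_lens: float, n_medium: float, radius_m: float) -> float:
    """Optical power (1/f, in diopters) of a plano-convex microlens with
    surface curvature `radius_m`, immersed in a medium of index n_medium.
    Thin-lens approximation: P = (n_lens / n_medium - 1) / R."""
    return (n_lens / n_medium - 1.0) / radius_m

# In air the microlens focuses strongly; in a matched fluid its power is zero.
in_air = microlens_power(1.56, 1.00, 50e-6)   # strong positive power
matched = microlens_power(1.56, 1.56, 50e-6)  # effectively zero power
```

This is why filling cavity 1003 with the index-matched fluid switches the system into high-resolution 2D imaging mode.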
- in at least one embodiment, (i) the fluid has low viscosity (e.g. less than approximately 1×10⁻³ Pa·s dynamic viscosity), and (ii) the fluid has low wetting to both MLA 102 and the opposing planar side of cavity 1003 .
- optically-transparent surface modification layers may be employed to improve non-wetting. Referring now to FIG. 11, there is shown such an embodiment, wherein low-wetting coating 1102 is applied to MLA 102 and opposing cavity wall 1101 .
- if the index-matching fluid is polar, a surface modifying agent resulting in a non-polar surface, such as polytetrafluoroethylene, can be used; conversely, if the fluid is non-polar, a surface coating resulting in a polar surface may be used. To ensure complete evacuation of cavity 1003 when the index-matching fluid is removed, the coating is preferably applied to the entire interior of cavity 1003 .
- Transparent electrodes 1203 , made of indium tin oxide (ITO) or any other appropriate material, are provided on MLA 102 and on cavity wall 1202 opposing MLA 102 . Additional metallization outside the active area may be performed as needed.
- the default orientation and pre-tilt of the liquid crystal is determined by the application of a textured coating over the top of the electrode surface, often called a command surface.
- the transparent electrode 1203 on MLA 102 may be placed between MLA 102 and MLA support substrate 302 .
- This can be useful, for example, if a polymer-on-glass MLA 102 is used, in which case the polymer of MLA 102 might become denatured during deposition and patterning of the ITO.
- the textured surface encouraging default orientation of the liquid crystals is positioned on the top surface of MLA 102 , in direct contact with the liquid crystal (as opposed to directly on top of the ITO electrode).
- Referring now to FIGS. 13 and 14, there are shown two orientations of liquid crystal molecules relative to MLA 102 , used to select between light field acquisition mode and high-resolution 2D image mode.
- the default orientation of the liquid crystal molecules 1301 in cavity 1201 relative to MLA 102 (with no electric field applied between electrodes 1203 ) is shown in FIG. 13 .
- the liquid crystal has a first effective refractive index to light propagating in a direction perpendicular to the MLA substrate.
- MLA 102 can be rendered optically disabled.
- MLA 102 can be rendered optically enabled. In this manner, two modes of operation are provided: one with a functional MLA 102 for capture of light field images, and one in which MLA 102 is disabled for capture of 2D images.
- Referring now to FIGS. 15 and 16, there is shown an embodiment 1500 in which a microlens array is defined in liquid crystal 1501 .
- the arrangement consists of an ITO-coated glass cell, wherein indium tin oxide (ITO) 1503 , or some other suitable material, is used to coat glass layers 1502 .
- one ITO layer 1503 B has a patterned alignment layer 1505 , while the other ITO layer 1503 A has a uniform alignment layer 1504 .
- a liquid crystal layer 1501 is situated between alignment layers 1504 and 1505 of ITO layers 1503 A and 1503 B.
- the patterned alignment layer 1505 defines the orientation of liquid crystal 1501 .
- a lensing effect can be produced by varying the orientation and pre-tilt angle of liquid crystal 1501 to generate a gradient in the refractive index of liquid crystal 1501 and therefore cause liquid crystal 1501 to act as a lens.
- the orientation change to liquid crystal 1501 thus alters the effective refractive index of liquid crystal 1501 by rotating the index ellipsoid.
- To turn liquid crystal 1501 off, an electric field is applied between the two ITO layers 1503 A, 1503 B, as shown in FIG. 16 .
- the molecules of liquid crystal 1501 reorient themselves so that they are aligned parallel to the electric field, eliminating any index gradient.
- Liquid crystal 1501 is thus uniformly aligned perpendicular to glass layers 1502 , so that there is no refractive index change for light normally incident to the device, and liquid crystal layer 1501 no longer acts like a lens.
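The switching mechanism above rests on the standard uniaxial-crystal relation for the extraordinary wave: the effective index depends on the angle between the propagation direction and the LC director. A sketch follows; the ordinary and extraordinary indices used are illustrative values, not figures from the patent:

```python
import math

def lc_effective_index(n_o: float, n_e: float, theta_deg: float) -> float:
    """Effective index for the extraordinary wave when the propagation
    direction makes angle theta with the LC director (optic axis):
      theta = 0  -> n_o (director along propagation, field applied)
      theta = 90 -> n_e (director in the substrate plane, field off)"""
    t = math.radians(theta_deg)
    return (n_o * n_e) / math.sqrt((n_e * math.cos(t)) ** 2
                                   + (n_o * math.sin(t)) ** 2)
```

Reorienting the director from in-plane to field-aligned thus sweeps the effective index from n_e down to n_o, erasing the index gradient that forms the lens.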
- an MLA defined by liquid crystal 1501 can be selectively enabled or disabled to implement two modes of operation: one with a functional MLA for capture of light field images, and one in which MLA is disabled for capture of 2D images.
- Electro-Optical Material Fresnel MLA
- Referring now to FIGS. 17 and 18, there is shown an embodiment 1700 wherein a microlens array can be selectively activated or deactivated using electro-optic material 1701 , such as an electro-optic polymer.
- Electro-optic material 1701 has a changeable refractive index; the change in index is proportional to an applied electric field.
- FIG. 17 shows an example of a Fresnel microlens array defined in such electro-optic material 1701 .
- the quality of a Fresnel lens is proportional to how many zones and phase levels it has.
- FIG. 17 depicts a lens with four zones and two phase levels (i.e., binary).
- higher quality lens arrays can be made using any number of zones with any number of phase levels, such as, for example, four, eight, or more phase levels.
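The trade-off between phase levels and lens quality can be quantified with the standard first-order diffraction efficiency of a quantized Fresnel (kinoform) profile, eta = sinc²(1/N); this formula is general diffractive-optics background rather than something stated in the patent:

```python
import math

def fresnel_efficiency(phase_levels: int) -> float:
    """First-order diffraction efficiency of an N-level quantized
    Fresnel lens: eta = (sin(pi/N) / (pi/N)) ** 2."""
    x = math.pi / phase_levels
    return (math.sin(x) / x) ** 2

# fresnel_efficiency(2) ≈ 0.405  (binary profile, as in FIG. 17)
# fresnel_efficiency(4) ≈ 0.811
# fresnel_efficiency(8) ≈ 0.950
```

This is why moving from a binary lens to four or eight phase levels yields markedly higher lens array quality.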
- a voltage corresponding to a pi phase change is applied between patterned ITO layer 1702 and uniform ITO layer 1703 .
- when a voltage is applied between patterned ITO layer 1702 and uniform ITO layer 1703 , a Fresnel lens is formed by electro-optic material 1701 .
- when no voltage is applied, electro-optic material 1701 has a uniform refractive index and does not act like a lens.
- ITO layers 1702 , 1703 may be coated at a thickness that results in a 2*pi phase shift in the light, so that layers 1702 , 1703 do not act as a Fresnel lens themselves.
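The voltage "corresponding to a pi phase change" follows from the optical path difference through the electro-optic layer: phase = 2·pi·Δn·t/λ. The required index contrast for a given phase step can be sketched as follows; the wavelength and thickness in the example are illustrative, not values from the patent:

```python
import math

def index_change_for_phase(phase_rad: float, wavelength_m: float,
                           thickness_m: float) -> float:
    """Index contrast dn such that light of the given wavelength gains
    `phase_rad` of extra phase crossing a layer of thickness t:
    phase = 2*pi*dn*t/lambda  =>  dn = phase*lambda / (2*pi*t)."""
    return phase_rad * wavelength_m / (2.0 * math.pi * thickness_m)
```

For a pi step through a hypothetical 5 um layer at 550 nm, this gives dn ≈ 0.055, which suggests why materials with large electro-optic coefficients are needed for thin switchable lens layers.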
- Other configurations and arrangements are possible.
- Some embodiments may include a system or a method for performing the above-described techniques, either singly or in any combination.
- Other embodiments may include a computer program product comprising a non-transitory computer-readable storage medium and computer program code, encoded on the medium, for causing a processor in a computing device or other electronic device to perform the above-described techniques.
- process steps and instructions described herein in the form of an algorithm can be embodied in software, firmware and/or hardware, and when embodied in software, can be downloaded to reside on and be operated from different platforms used by a variety of operating systems.
- This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computing device.
- a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, DVD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, flash memory, solid state drives, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- the computing devices referred to herein may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
- various embodiments may include software, hardware, and/or other elements for controlling a computer system, computing device, or other electronic device, or any combination or plurality thereof.
- an electronic device can include, for example, a processor, an input device (such as a keyboard, mouse, touchpad, track pad, joystick, trackball, microphone, and/or any combination thereof), an output device (such as a screen, speaker, and/or the like), memory, long-term storage (such as magnetic storage, optical storage, and/or the like), and/or network connectivity, according to techniques that are well known in the art.
- Such an electronic device may be portable or non-portable.
- Examples of electronic devices include: a mobile phone, personal digital assistant, smartphone, kiosk, server computer, enterprise computing device, desktop computer, laptop computer, tablet computer, consumer electronic device, or the like.
- An electronic device for implementing the system or method described herein may use any operating system such as, for example and without limitation: Linux; Microsoft Windows, available from Microsoft Corporation of Redmond, Wash.; Mac OS X, available from Apple Inc. of Cupertino, Calif.; iOS, available from Apple Inc. of Cupertino, Calif.; Android, available from Google, Inc. of Mountain View, Calif.; and/or any other operating system that is adapted for use on the device.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
- Moving MLA 102 from a light field imaging position to a position close to image sensor 103.
- Moving MLA 102 to a position where it is no longer in the optical path.
- Introducing an index-matching medium between image sensor 103 and MLA 102 to make the lenses of MLA 102 lose their refractive focusing power.
- Moving MLA 102 from the light field imaging position to a position where it is in contact with an index-matching material on the surface of image sensor 103.
- Using a known material medium with changeable refractive index properties to disable MLA 102. In at least one embodiment, a liquid crystal type medium can be placed between MLA 102 and image sensor 103. A known electric field can be applied to such medium to change the index of refraction of the medium and thus effectively change the overall refractive power of MLA 102.
Claims (28)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/480,240 US9077901B2 (en) | 2013-09-11 | 2014-09-08 | Light field image capture device having 2D image capture mode |
US14/716,055 US9411122B2 (en) | 2013-09-11 | 2015-05-19 | Light field image capture device having 2D image capture mode |
US15/054,030 US20160182786A1 (en) | 2013-09-11 | 2016-02-25 | Hybrid light-field camera |
US15/203,643 US10154197B2 (en) | 2013-09-11 | 2016-07-06 | Image capture device having light field image capture mode, 2D image capture mode, and intermediate capture mode |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361876377P | 2013-09-11 | 2013-09-11 | |
US14/480,240 US9077901B2 (en) | 2013-09-11 | 2014-09-08 | Light field image capture device having 2D image capture mode |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/716,055 Continuation US9411122B2 (en) | 2013-09-11 | 2015-05-19 | Light field image capture device having 2D image capture mode |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150070474A1 US20150070474A1 (en) | 2015-03-12 |
US9077901B2 true US9077901B2 (en) | 2015-07-07 |
Family
ID=52625216
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/480,240 Active US9077901B2 (en) | 2013-09-11 | 2014-09-08 | Light field image capture device having 2D image capture mode |
US14/716,055 Expired - Fee Related US9411122B2 (en) | 2013-09-11 | 2015-05-19 | Light field image capture device having 2D image capture mode |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/716,055 Expired - Fee Related US9411122B2 (en) | 2013-09-11 | 2015-05-19 | Light field image capture device having 2D image capture mode |
Country Status (1)
Country | Link |
---|---|
US (2) | US9077901B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10154197B2 (en) * | 2013-09-11 | 2018-12-11 | Google Llc | Image capture device having light field image capture mode, 2D image capture mode, and intermediate capture mode |
US10404965B2 (en) | 2015-12-04 | 2019-09-03 | Olympus Corporation | Microscope system |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5851320B2 (en) * | 2012-04-18 | 2016-02-03 | 株式会社東芝 | The camera module |
US9667846B2 (en) * | 2012-11-27 | 2017-05-30 | Nokia Technologies Oy | Plenoptic camera apparatus, a method and a computer program |
US20150116588A1 (en) * | 2013-10-25 | 2015-04-30 | Larview Technologies Corp. | Image capturing module and actuator structure thereof |
KR102487393B1 (en) | 2016-04-15 | 2023-01-12 | 에스케이하이닉스 주식회사 | Image Sensor Having a Light Field Mode and a Conventional Mode |
US10656407B2 (en) * | 2016-06-30 | 2020-05-19 | Intel Corporation | Electronically switching microlenses between optical states |
EP3264743A1 (en) * | 2016-06-30 | 2018-01-03 | Thomson Licensing | A digital imaging system for switching image capturing mode |
US10868945B2 (en) * | 2019-04-08 | 2020-12-15 | Omnivision Technologies, Inc. | Light-field camera and method using wafer-level integration process |
WO2021080515A1 (en) * | 2019-10-24 | 2021-04-29 | Nanyang Technological University | Method and apparatus for determining crystallographic orientation on crystalline surfaces |
KR20210124807A (en) | 2020-04-07 | 2021-10-15 | 에스케이하이닉스 주식회사 | Image Sensing Device |
CN115379090A (en) * | 2022-08-03 | 2022-11-22 | 奕目(上海)科技有限公司 | Imaging device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100141802A1 (en) * | 2008-12-08 | 2010-06-10 | Timothy Knight | Light Field Data Acquisition Devices, and Methods of Using and Manufacturing Same |
US8248515B2 (en) | 2006-02-07 | 2012-08-21 | The Board Of Trustees Of The Leland Stanford Junior University | Variable imaging arrangements and methods therefor |
US20130234935A1 (en) * | 2010-10-26 | 2013-09-12 | Bae Systems Plc | Display assembly |
US8593564B2 (en) | 2011-09-22 | 2013-11-26 | Apple Inc. | Digital camera including refocusable imaging mode adaptor |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6999238B2 (en) * | 2003-12-01 | 2006-02-14 | Fujitsu Limited | Tunable micro-lens array |
-
2014
- 2014-09-08 US US14/480,240 patent/US9077901B2/en active Active
-
2015
- 2015-05-19 US US14/716,055 patent/US9411122B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
US20150070474A1 (en) | 2015-03-12 |
US9411122B2 (en) | 2016-08-09 |
US20150247986A1 (en) | 2015-09-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9411122B2 (en) | Light field image capture device having 2D image capture mode | |
US10154197B2 (en) | Image capture device having light field image capture mode, 2D image capture mode, and intermediate capture mode | |
US9456141B2 (en) | Light-field based autofocus | |
Zhou et al. | Computational cameras: convergence of optics and processing | |
US7627236B2 (en) | Hydraulic optical focusing-stabilizer | |
CN102739945B (en) | Optical field imaging device and method | |
Kuthirummal et al. | Flexible depth of field photography | |
US9712738B2 (en) | Systems, devices, and methods for managing camera focus | |
US8687040B2 (en) | Optical device with electrically variable extended depth of field | |
CA2857714C (en) | Apparatus and method for image super-resolution using integral shifting optics | |
US20160182786A1 (en) | Hybrid light-field camera | |
US11509835B2 (en) | Imaging system and method for producing images using means for adjusting optical focus | |
US11575821B2 (en) | Camera device having first and second cameras, and method of operating same | |
JP5809390B2 (en) | Ranging / photometric device and imaging device | |
AU2013306138A1 (en) | Dynamically curved sensor for optical zoom lens | |
WO2015062215A1 (en) | Device and method for acquiring image | |
US20120315952A1 (en) | Image capture systems with focusing capabilities | |
US9176263B2 (en) | Optical micro-sensor | |
CN214675328U (en) | Camera module and electronic equipment | |
Koppal et al. | Wide-angle micro sensors for vision on a tight budget | |
TWI862672B (en) | Tof camera | |
US11039117B2 (en) | Dual lens imaging module and capturing method thereof | |
KR101550307B1 (en) | The optical lens system for a camera | |
WO2024200292A1 (en) | Apparatuses and methods for polarization based surface normal imaging | |
KR20200108736A (en) | Camera module |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: LYTRO, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PITTS, COLVIN;NG, YI-REN;SIGNING DATES FROM 20141113 TO 20141117;REEL/FRAME:034427/0201 Owner name: LYTRO, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BHAT, JEROME CHANDRA;CLARKE, BRANDON ELLIOTT MERLE;MYHRE, GRAHAM BUTLER;AND OTHERS;SIGNING DATES FROM 20140916 TO 20141105;REEL/FRAME:034427/0232 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: TRIPLEPOINT CAPITAL LLC (GRANTEE), CALIFORNIA Free format text: SECURITY INTEREST;ASSIGNOR:LYTRO, INC. (GRANTOR);REEL/FRAME:036167/0081 Effective date: 20150407 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LYTRO, INC.;REEL/FRAME:050009/0829 Effective date: 20180325 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |