US20070030543A1 - Depth and lateral size control of three-dimensional images in projection integral imaging - Google Patents
Depth and lateral size control of three-dimensional images in projection integral imaging
- Publication number
- US20070030543A1 US20070030543A1 US11/498,666 US49866606A US2007030543A1 US 20070030543 A1 US20070030543 A1 US 20070030543A1 US 49866606 A US49866606 A US 49866606A US 2007030543 A1 US2007030543 A1 US 2007030543A1
- Authority
- US
- United States
- Prior art keywords
- images
- displaying
- planar
- micro
- depth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/144—Processing image signals for flicker reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/305—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/002—Eyestrain reduction by processing stereoscopic signals or controlling stereoscopic devices
Definitions
- Most 3-D display techniques developed to date are stereoscopic.
- a stereoscopic system may be realized that displays large images with high resolution; however, stereoscopic techniques may require supplementary glasses to evoke 3-D visual effects. Additionally, stereoscopic techniques may provide observers with only horizontal parallax and a small number of viewpoints. Observation of stereoscopic images may also cause visual fatigue due to convergence-accommodation conflict.
- Convergence-accommodation conflict may be avoided by a true 3-D image formation in space with full parallax and continuous viewing points.
- Holography is one way to form 3-D images in space, but recording full-color holograms for an outdoor scene may be difficult. For example, when computer-generated holograms are prepared, a large amount of computation time and capacity may be required to obtain proper gratings. Because coherent light is often used in holography, speckle may also occur.
- II integral imaging
- 3-D images may be formed by crossing the rays coming from 2-D elemental images using a lenslet array.
- Each microlens in a lenslet array may act as a directional pixel in a pinhole fashion.
- the pinholes create directional views which, when viewed with two eyes for example, appear as a 3-D image in space. II may provide observers with true 3-D images with full parallax and continuous viewing points. However, the viewing angle, depth-of-focus, and resolution of 3-D images may be limited.
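- The directional-pixel picture above can be made concrete with a short numerical sketch (Python; the gap and pixel offset below are hypothetical values, not taken from the patent). Each pixel of an elemental image launches a ray through the center of its lenslet, treated as a pinhole; crossing many such rays from neighboring lenslets forms the 3-D image in space.

```python
import math

def ray_direction(pixel_x, pixel_y, lens_x, lens_y, gap):
    """Pinhole model of a single lenslet: a display pixel at (pixel_x, pixel_y)
    in the elemental-image plane launches a ray through the lenslet center at
    (lens_x, lens_y), a distance `gap` away. Returns the unit direction vector."""
    dx, dy, dz = lens_x - pixel_x, lens_y - pixel_y, gap
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / n, dy / n, dz / n)

# Hypothetical numbers (not from the patent): 3 mm gap, pixel 0.25 mm off the lenslet axis.
d = ray_direction(0.25e-3, 0.0, 0.0, 0.0, 3e-3)
print(d)  # ray tilted by about arctan(0.25/3) ≈ 4.8 degrees from the optical axis
```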
- 3-D images produced by direct camera pickup in II are pseudoscopic (depth-reversed) images, which may make II systems more complex and less practical.
- Advancements in the art are needed to increase viewing angles and improve image quality. Also needed are ways to display images of large objects that are far from the pickup device. Additionally needed is the ability to project 3-D images onto a large display screen.
- a method disclosed herein relates to displaying three-dimensional images.
- the method comprising: projecting integral images to a display device, and displaying three-dimensional images with the display device.
- the method comprising: magnifying elemental images during pickup, projecting the magnified elemental images via an optics relay to a display device, and displaying 3-D images within the depth-of-focus of the display device while maintaining lateral image sizes.
- the method comprising: positioning an optical path-length-equalizing (OPLE) lens adjacent to a planar lenslet array, projecting 3-D images via an optics relay to a planar display device, and displaying 3-D images within the depth-of-focus of the display device.
- OPLE optical path-length-equalizing
- the method comprising: generating elemental images with a micro-lenslet array, increasing the disparity of the elemental images with an optical path-length-equalizing (OPLE) lens, and recording the elemental images on an imaging sensor of a recording device.
- the method further comprising: projecting 3-D images through an optical relay to a display device, and displaying the 3-D images within the depth-of-focus of the display device.
- OPLE optical path-length-equalizing
- the apparatus comprising: a projector for projecting integral images, and a micro-convex-mirror array for displaying the projected images.
- FIGS. 1a, 1b, and 1c are side views of integral imaging (II) arrangements using planar devices;
- FIGS. 2a, 2b, and 2c are side views of projection integral imaging (PII) arrangements using planar devices;
- PII projection integral imaging
- FIGS. 3a, 3b, 3c, 3d, 3e, and 3f are side views of non-linear depth control arrangements using curved devices;
- FIGS. 4a and 4b are side views of modified pickup systems;
- FIGS. 5a, 5b, and 5c show divergent projection methods;
- FIG. 6a shows examples of objects to be imaged;
- FIG. 6b shows a modified pickup lens system attached to a digital camera;
- FIG. 7 shows a top view of the optical setup for 3-D image display, which includes a micro-convex-mirror array;
- FIGS. 8a, 8b, 8c, and 8d show center parts of elemental images;
- FIG. 9 shows reconstructed orthoscopic virtual 3-D images when an optical path-length-equalizing (OPLE) lens was not used.
- FIG. 10 shows reconstructed orthoscopic virtual 3-D images when an OPLE lens was used.
- curved pickup devices (i.e., a curved 2-D image sensor and a curved lenslet array), curved display devices, or both may be used for this purpose.
- if the lenslets in the curved array have a zooming capability, a linear depth control is additionally possible.
- planar devices may be used (lenslet array, sensor, and display).
- An additional large-aperture negative lens, also referred to herein as an optical path-length-equalizing (OPLE) lens, is placed in contact with the pickup lenslet array.
- OPLE optical path-length-equalizing
- planar lenslet arrays with positive focal lengths have been used as depicted in FIG. 1 .
- a set of elemental images 1 of a 3-D object 2 may be obtained by use of a lenslet array 3 and a 2-D image sensor 4, such as a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) image sensor.
- CCD charge-coupled device
- CMOS complementary metal oxide semiconductor
- as depicted in FIG. 1(b), to reconstruct a 3-D image 7 of the object 2, the set of 2-D elemental images 1 may be displayed in front of a lenslet array 3 using a 2-D display panel 5, such as a liquid crystal display (LCD) panel.
- LCD liquid crystal display
- the gap distance g between the display panel 5 and the lenslet array 3 should be g = L_i f/(L_i − f) (from the thin-lens relation 1/L_i + 1/g = 1/f), where it may be assumed that 3-D real images 7 are formed around z = L_i. The rays coming from the elemental images converge through the lenslet array 3 to form a 3-D real image. The reconstructed 3-D image may be a pseudoscopic (depth-reversed) real image 7 of the object 2.
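- As a worked illustration of the gap setting above, the minimal sketch below evaluates the standard thin-lens relation 1/L_i + 1/g = 1/f on which the expression is based; the numerical values are illustrative and are not from the experiments described later.

```python
def display_gap_real(L_i, f):
    """Gap between the display panel and the lenslet array that places a real
    reconstructed image at distance L_i, from the thin-lens law 1/L_i + 1/g = 1/f."""
    return L_i * f / (L_i - f)

# Illustrative values: 3 mm focal-length lenslets, real image reconstructed 50 mm away.
print(display_gap_real(L_i=50.0, f=3.0))  # ≈ 3.19 mm, slightly larger than f
```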
- Projection integral imaging (PII) is the novel subject of this invention; the inventors were the first to develop PII.
- elemental images may be projected through relay optics 10 onto a lenslet array 3 as depicted in FIGS. 2 ( a ) and ( b ).
- a micro-convex/concave-mirror array 11 , 21 as a projection screen may be used, as depicted in FIGS. 2 ( c ) and ( d ).
- 3-D orthoscopic virtual images 8 may be displayed without the P/O conversion.
- for display of the virtual images, the gap distance g becomes L_i f/(L_i + f).
- PII allows for the following because of the use of a micro-convex-mirror array as a projection screen:
- the full viewing angle of the reconstructed images is limited and determined approximately by 2 arctan[0.5/(f/#)], where f/# is the f-number of the lenslet, when the fill factor of the lenslet array is close to 1 (a numerical check of this expression appears below).
- the P/O conversion is unnecessary, even if a positive lenslet array is used for direct camera pickup.
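- As a quick numerical check of the viewing-angle expression given above, the sketch below evaluates 2 arctan[0.5/(f/#)] for a few generic lenslet f-numbers (illustrative values only).

```python
import math

def full_viewing_angle_deg(f_number):
    """Approximate full viewing angle of an II display whose lenslet array
    has a fill factor close to 1: 2 * arctan(0.5 / (f/#))."""
    return math.degrees(2.0 * math.atan(0.5 / f_number))

for fno in (1.0, 2.0, 4.0):
    print(f"f/{fno:g}: {full_viewing_angle_deg(fno):.1f} degrees")
# f/1: 53.1, f/2: 28.1, f/4: 14.3 degrees
```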
- 3-D images reconstructed in II systems may have limited depth-of-focus δ. It has been shown that δ cannot be larger than 1/(λβ²), where λ is the display wavelength and β is the resolution of reconstructed 3-D images. β is defined as the inverse of the reconstructed image spot size. In PII, 3-D images with high resolution can be reconstructed only near the projection screen of micro-convex-mirror arrays (or the display lenslet array). Thus the depth-of-focus δ should be measured from the projection screen.
- if the focal length of the pickup lenslet array f_p is longer than that of the display micro-convex-mirror array f_d, the longitudinal scale of the reconstructed image space is reduced linearly by a factor of f_d/f_p (≡ r), while the lateral scale does not change. So if (z_o + T)r < δ, the 3-D reconstructed image is well focused.
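- The focus criterion above can be illustrated numerically. The sketch below combines the depth-of-focus bound δ ≤ 1/(λβ²) with the linear longitudinal scaling r = f_d/f_p; all numbers are illustrative assumptions rather than measured values, and z_o and T are read here as the object distance and the depth extent of the object.

```python
def depth_of_focus(wavelength, beta):
    """Upper bound on the depth-of-focus: delta <= 1 / (wavelength * beta**2),
    where beta is the inverse of the reconstructed image spot size."""
    return 1.0 / (wavelength * beta ** 2)

def is_well_focused(z_o, T, f_pickup, f_display, delta):
    """Linear depth control: the longitudinal scale shrinks by r = f_display / f_pickup,
    so the reconstructed image is well focused when (z_o + T) * r < delta."""
    r = f_display / f_pickup
    return (z_o + T) * r < delta, r

# Illustrative assumptions: 550 nm light, 1 mm spot size (beta = 1000 per meter),
# f_p = 3 mm pickup lenslets, f_d = 1 mm display micro-mirrors,
# nearest object at 0.2 m with 0.05 m depth extent.
delta = depth_of_focus(550e-9, 1.0e3)               # ≈ 1.82 m for these toy numbers
focused, r = is_well_focused(0.2, 0.05, 3e-3, 1e-3, delta)
print(round(delta, 2), round(r, 3), focused)         # 1.82 0.333 True
```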
- Digital zoom-in may degrade the resolution of elemental images.
- a nonlinear depth control method may be used.
- curved pickup devices (e.g., a curved lenslet array 17 and a curved 2-D image sensor 18) with a radius of curvature R may be used, and then 3-D images may be reconstructed using planar display devices, as depicted in FIGS. 3(a) and (b), respectively.
- planar pickup devices (e.g., a planar image sensor 14 and a planar lenslet array 16)
- curved display devices (e.g., a curved display panel 19 and a curved lenslet array 20)
- R>0 when the center of the curvature is positioned at the same side of the object (observer 6 ) in the pickup (display) process
- R < 0 when it is positioned at the opposite side.
- the effect of depth and size reduction using the negatively curved pickup lenslet array can be analyzed by introducing a hypothetical thin lens with a negative focal length −R_p, which is in contact with the planar pickup lenslet array 16, as depicted in FIG. 3(e). This is because the ray propagation behaviors for the two setups in FIGS. 3(a) and 3(e), and those in FIGS. 3(d) and 3(f), are the same, respectively.
- this lens is referred to as an optical path-length-equalizing (OPLE) lens 15.
- OPLE optical path-length-equalizing
- R_p >> f_p, and thus f_p^e ≈ f_p.
- a planar lenslet array 16 with a focal length f_p^e may be used, along with a flat image sensor 14 and the pickup OPLE lens 15 with a focal length −R_p.
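- Because the OPLE lens 15 is modeled as a thin negative lens of focal length −R_p in contact with the pickup lenslet array, the ordinary rule that the powers of thin elements in contact add gives a feel for why f_p^e ≈ f_p when R_p >> f_p. The sketch below is an assumption-based illustration of that power addition, not a derivation taken from the patent; the focal lengths used are hypothetical.

```python
def combined_focal_length(f_lenslet, R_p):
    """Two thin elements in contact: optical powers add, so a lenslet of focal
    length f_lenslet paired with an OPLE lens of focal length -R_p gives
    1/f_eff = 1/f_lenslet - 1/R_p."""
    return 1.0 / (1.0 / f_lenslet - 1.0 / R_p)

# Illustrative numbers: 3 mm lenslet focal length, OPLE lens with R_p = 200 mm.
print(combined_focal_length(3.0, 200.0))  # ≈ 3.05 mm, i.e. close to f_lenslet since R_p >> f
```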
- the OPLE lens 15 first produces images of objects, and then the images are actually picked up by the planar pickup devices 14 , 16 to produce elemental images with increased disparity.
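- The depth squeezing introduced by the OPLE lens can be sketched with the ordinary thin-lens equation: a diverging lens of focal length −R_p forms virtual intermediate images that all lie within a distance R_p of the lens, pulling far objects in strongly while moving near objects only slightly. The value of R_p used below is a hypothetical choice; the object distances echo the cacti and building of the experiments described later.

```python
def ople_image_distance(z_object, R_p):
    """Imaging by a diverging OPLE lens of focal length -R_p, using
    1/d_o + 1/d_i = 1/f. Returns the distance of the virtual image in front of
    the lens; it never exceeds R_p, which compresses the depth of distant scenes
    before pickup by the planar lenslet array."""
    return R_p * z_object / (R_p + z_object)

# Hypothetical R_p = 0.5 m; objects at 0.2 m (cacti) and 70 m (building).
for z in (0.2, 70.0):
    print(z, "->", round(ople_image_distance(z, 0.5), 3), "m")
# 0.2 -> 0.143 m, 70.0 -> 0.496 m: the 70 m object is pulled to within R_p
```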
- the effect of depth and size reduction can also be achieved by use of negatively curved display devices 19 , 20 .
- curved display devices 19, 20 with a radius of curvature −R_d are used, while elemental images are obtained by use of planar pickup devices 14, 16.
- both linear and nonlinear depth control methods may be used together.
- the position of the reconstructed image can be predicted from the equivalent planar pickup 14 , 16 and display 22 , 16 devices with OPLE lenses.
- elemental images with increased disparity are obtained and then they are digitally zoomed-in.
- a modified pickup system is usually used as depicted in FIG. 4 ( a ).
- elemental images formed by a planar lenslet array 3 are detected through a camera lens 25 with a large f/#.
- the use of such a camera lens 25 and the planar pickup lenslet array 3 produces the effect of a negatively curved pickup lenslet array, because the disparity of the elemental images increases. This effect is taken into account by considering the modified pickup system as a curved pickup system with a curved lenslet array whose radius of curvature is −R_c.
- R c equals approximately the distance between the planar pickup lenslet array and the camera lens.
- the projection beam angle θ (e.g., in the azimuthal direction) may not be negligible.
- the effect of negatively curved display devices naturally exists even if planar display devices are used.
- the horizontal size of the overall projected elemental images on the screen is S.
- the planar display devices may be regarded as curved display devices with a radius of curvature −R_s ≈ −S/θ if the aperture size of the relay optics is much smaller than S.
- R s is approximately equal to the distance between the planar projection screen and the relay optics.
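- A rough numerical illustration of R_s ≈ S/θ, using values in the range of the experiments described below (a projection beam divergence of about 6 degrees, and an assumed overall elemental-image width S of about 5 cm, which is not stated explicitly in this section).

```python
import math

def equivalent_screen_radius(S, theta_deg):
    """Equivalent radius of curvature of the planar projection screen under a
    diverging projection beam: R_s ≈ S / theta (theta in radians)."""
    return S / math.radians(theta_deg)

print(equivalent_screen_radius(S=0.05, theta_deg=6.0))  # ≈ 0.48 m
```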
- the object to be imaged is composed of small cacti 35 and a large building 36, as shown in FIG. 6(a).
- the distance between the pickup lenslet array and the cacti 35 is approximately 20 cm and that between the pickup lenslet array and the building is approximately 70 m.
- elemental images were obtained by use of a planar 2-D image sensor and a planar lenslet array in contact with a large-aperture negative lens as an OPLE lens.
- the planar pickup lenslet array used is made from acrylic and has 53 × 53 plano-convex lenslets.
- Each lenslet element is square-shaped and has a uniform base size of 1.09 mm × 1.09 mm, with less than 7.6 μm separating the lenslet elements.
- a total of 48 × 36 elemental images are used in the experiments.
- a digital camera 37 with 4500 × 3000 CMOS pixels was used for the 2-D image sensor.
- the camera pickup system 37 is shown in FIG. 6 ( b ).
- the linear depth reduction method was also used in combination with the nonlinear method. To avoid resolution degradation caused by digital zoom-in, the resolution of the zoomed-in elemental images was kept higher than that of the LCD projector.
- a planar micro-convex-mirror array for the projection screen was obtained by coating the convex surface of a lenslet array that is identical to the pickup lenslet array. Light intensity reflectance of the screen is more than 90%.
- The setup for 3-D image reconstruction is depicted in FIG. 7.
- a color LCD projector 40 that has 3 (red, green, and blue) panels was used for elemental image projection. Each panel has 1024 × 768 square pixels with a pixel pitch of 18 μm. Each elemental image has approximately 21 × 21 pixels on average.
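- These numbers are mutually consistent: 48 × 36 elemental images of roughly 21 × 21 pixels each fit within the 1024 × 768 projector panel, as the short check below shows.

```python
panel = (1024, 768)        # projector panel resolution (width, height)
grid = (48, 36)            # number of elemental images used in the experiments
per_elem = (panel[0] // grid[0], panel[1] // grid[1])
print(per_elem)            # (21, 21): about 21 x 21 pixels per elemental image
print(grid[0] * per_elem[0], grid[1] * per_elem[1])  # 1008 756, within 1024 x 768
```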
- Magnification of the relay optics 41 is 2.9.
- the diverging angle of the projection beam θ is approximately 6 degrees in the azimuthal direction, so a slight curved-display-device effect exists.
- z_oc = 20 cm (distance from the pickup lenslet array to the cacti)
- z_ob = 70 m (distance from the pickup lenslet array to the building)
- center parts of elemental images that were obtained without the OPLE lens and those obtained with the OPLE lens are shown in FIGS. 8(a) and 8(b), respectively.
- the OPLE lens increases disparity between neighboring elemental images.
- left, center, and right views of reconstructed 3-D images for different depth control parameters are illustrated in FIGS. 9 and 10.
- the observed positions of the reconstructed images agree qualitatively with the estimated positions given in Table 1. Comparing the images shown in FIGS. 9 and 10, one can see that smaller 3-D images are reconstructed for shorter R_p^e.
- reconstructed 3-D images squeeze further in the longitudinal direction, and thus the disparity between left and right views is reduced.
- the lateral size of reconstructed 3-D images is independent of r.
- Reconstructed 3-D images at deeper positions are more blurred because the depth-of-focus of the PII system is limited; it is estimated to be approximately 5 cm.
- Binocular parallax is the most effective depth cue for viewing medium distances.
- our depth control method degrades the solidity of reconstructed 3-D images because, for distant objects, it squeezes their longitudinal depth much more than their lateral size.
- human vision also uses other depth cues, however, and binocular parallax may not be so effective for viewing long distances. Therefore, our nonlinear position control method can be used efficiently for large-scale 3-D display systems with limited depth-of-focus. Nevertheless, efforts to enhance the depth-of-focus of II systems should be pursued.
- Some embodiments of the invention have the following advantages: imaging is performed with direct pickup to create true 3-D image formation, with full parallax and continuous viewing points, using incoherent light and two-dimensional display devices, resulting in orthoscopic images with wide viewing angles, large depth-of-focus, and high resolution. Additional advantages include the ability to project 3-D images onto a large display screen.
- a computer or other client or server device can be deployed as part of a computer network, or in a distributed computing environment.
- the methods and apparatus described above and/or claimed herein pertain to any computer system having any number of memory or storage units, and any number of applications and processes occurring across any number of storage units or volumes, which may be used in connection with the methods and apparatus described above and/or claimed herein.
- the same may apply to an environment with server computers and client computers deployed in a network environment or distributed computing environment, having remote or local storage.
- the methods and apparatus described above and/or claimed herein may also be applied to standalone computing devices, having programming language functionality, interpretation and execution capabilities for generating, receiving and transmitting information in connection with remote or local services.
- the methods and apparatus described above and/or claimed herein are operational with numerous other general purpose or special purpose computing system environments or configurations.
- Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the methods and apparatus described above and/or claimed herein include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputers, mainframe computers, and distributed computing environments that include any of the above systems or devices.
- the methods described above and/or claimed herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
- Program modules typically include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- the methods and apparatus described above and/or claimed herein may also be practiced in distributed computing environments such as between different units where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium.
- program modules and routines or data may be located in both local and remote computer storage media including memory storage devices.
- Distributed computing facilitates sharing of computer resources and services by direct exchange between computing devices and systems. These resources and services may include the exchange of information, cache storage, and disk storage for files.
- Distributed computing takes advantage of network connectivity, allowing clients to leverage their collective power to benefit the entire enterprise.
- a variety of devices may have applications, objects or resources that may utilize the methods and apparatus described above and/or claimed herein.
- Computer programs implementing the method described above will commonly be distributed to users on a distribution medium such as a CD-ROM.
- the program could be copied to a hard disk or a similar intermediate storage medium.
- When the programs are to be run, they will be loaded either from their distribution medium or their intermediate storage medium into the execution memory of the computer, thus configuring the computer to act in accordance with the methods and apparatus described above.
- computer-readable medium encompasses all distribution and storage media, memory of a computer, and any other medium or device capable of storing for reading by a computer a computer program implementing the method described above.
- the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both.
- the methods and apparatus described above and/or claimed herein, or certain aspects or portions thereof, may take the form of program code or instructions embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the methods and apparatus described above and/or claimed herein.
- the computing device will generally include a processor, a storage medium readable by the processor, which may include volatile and non-volatile memory and/or storage elements, at least one input device, and at least one output device.
- One or more programs that may utilize the techniques of the methods and apparatus described above and/or claimed herein, e.g., through the use of data processing, may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system.
- the program(s) can be implemented in assembly or machine language, if desired.
- the language may be a compiled or interpreted language, and combined with hardware implementations.
- the methods and apparatus described above and/or claimed herein may also be practiced via communications embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received, loaded into, and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, or a receiving machine having the signal processing capabilities as described in exemplary embodiments above, that machine becomes an apparatus for practicing the method described above and/or claimed herein.
- PLD programmable logic device
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/498,666 US20070030543A1 (en) | 2005-08-08 | 2006-08-03 | Depth and lateral size control of three-dimensional images in projection integral imaging |
US12/939,647 US8264772B2 (en) | 2005-08-08 | 2010-11-04 | Depth and lateral size control of three-dimensional images in projection integral imaging |
US13/596,715 US20120320161A1 (en) | 2005-08-08 | 2012-08-28 | Depth and lateral size control of three-dimensional images in projection integral imaging |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US70628105P | 2005-08-08 | 2005-08-08 | |
US11/498,666 US20070030543A1 (en) | 2005-08-08 | 2006-08-03 | Depth and lateral size control of three-dimensional images in projection integral imaging |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/939,647 Division US8264772B2 (en) | 2005-08-08 | 2010-11-04 | Depth and lateral size control of three-dimensional images in projection integral imaging |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070030543A1 true US20070030543A1 (en) | 2007-02-08 |
Family
ID=37727940
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/498,666 Abandoned US20070030543A1 (en) | 2005-08-08 | 2006-08-03 | Depth and lateral size control of three-dimensional images in projection integral imaging |
US12/939,647 Active US8264772B2 (en) | 2005-08-08 | 2010-11-04 | Depth and lateral size control of three-dimensional images in projection integral imaging |
US13/596,715 Abandoned US20120320161A1 (en) | 2005-08-08 | 2012-08-28 | Depth and lateral size control of three-dimensional images in projection integral imaging |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/939,647 Active US8264772B2 (en) | 2005-08-08 | 2010-11-04 | Depth and lateral size control of three-dimensional images in projection integral imaging |
US13/596,715 Abandoned US20120320161A1 (en) | 2005-08-08 | 2012-08-28 | Depth and lateral size control of three-dimensional images in projection integral imaging |
Country Status (4)
Country | Link |
---|---|
US (3) | US20070030543A1 (fr) |
CN (1) | CN101278565A (fr) |
DE (1) | DE112006002095T5 (fr) |
WO (1) | WO2007019347A2 (fr) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070126967A1 (en) * | 2005-12-02 | 2007-06-07 | Choi Kyung H | Two-dimensional and three-dimensional image selectable display device |
US20080169997A1 (en) * | 2007-01-16 | 2008-07-17 | Kyung-Ho Choi | Multi-dimensional image selectable display device |
US20130193679A1 (en) * | 2010-07-21 | 2013-08-01 | Giesecke & Devrient | Optically variable security element with tilt image |
KR20140121529A (ko) * | 2013-04-05 | 2014-10-16 | 삼성전자주식회사 | 광 필드 영상을 생성하는 방법 및 장치 |
JP2015212795A (ja) * | 2014-05-07 | 2015-11-26 | 日本放送協会 | 立体映像表示装置 |
JP2017003688A (ja) * | 2015-06-08 | 2017-01-05 | 日本放送協会 | 光線制御素子および立体表示装置 |
CN107430277A (zh) * | 2015-01-21 | 2017-12-01 | 特塞兰德有限责任公司 | 用于沉浸式虚拟现实的高级折射光学器件 |
WO2018031963A1 (fr) * | 2016-08-12 | 2018-02-15 | Avegant Corp. | Système d'affichage proche de l'oeil comprenant un empilement de modulation |
US10057488B2 (en) | 2016-08-12 | 2018-08-21 | Avegant Corp. | Image capture with digital light path length modulation |
US10185153B2 (en) | 2016-08-12 | 2019-01-22 | Avegant Corp. | Orthogonal optical path length extender |
US10296098B2 (en) * | 2014-09-30 | 2019-05-21 | Mirama Service Inc. | Input/output device, input/output program, and input/output method |
US10379388B2 (en) | 2016-08-12 | 2019-08-13 | Avegant Corp. | Digital light path length modulation systems |
US10401639B2 (en) | 2016-08-12 | 2019-09-03 | Avegant Corp. | Method and apparatus for an optical path length extender |
US10516879B2 (en) | 2016-08-12 | 2019-12-24 | Avegant Corp. | Binocular display with digital light path length modulation |
US10809546B2 (en) | 2016-08-12 | 2020-10-20 | Avegant Corp. | Digital light path length modulation |
JP2021152661A (ja) * | 2010-06-16 | 2021-09-30 | 株式会社ニコン | 画像表示装置 |
US11303878B2 (en) * | 2017-06-27 | 2022-04-12 | Boe Technology Group Co., Ltd. | Three-dimensional display panel, display method thereof, and display device |
Families Citing this family (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2562924T3 (es) | 2008-10-10 | 2016-03-09 | Koninklijke Philips N.V. | Un procedimiento de procesamiento de información de paralaje comprendida en una señal |
EP2471268B1 (fr) * | 2009-08-25 | 2014-10-08 | Dolby Laboratories Licensing Corporation | Système d'affichage tridimensionnel |
US20110075257A1 (en) | 2009-09-14 | 2011-03-31 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | 3-Dimensional electro-optical see-through displays |
US9244277B2 (en) | 2010-04-30 | 2016-01-26 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Wide angle and high resolution tiled head-mounted display device |
DE102012212088A1 (de) * | 2012-07-11 | 2014-05-22 | Bayerische Motoren Werke Aktiengesellschaft | Bilderzeugende Einheit |
US9341853B2 (en) * | 2012-08-14 | 2016-05-17 | Young Optics Inc. | Stereo image displaying system and stereo image capturing system |
KR101984701B1 (ko) * | 2012-11-13 | 2019-05-31 | 삼성전자주식회사 | 전기습윤 렌즈 어레이를 포함하는 3차원 영상 디스플레이 장치 및 3차원 영상 획득 장치 |
KR101294261B1 (ko) * | 2013-01-08 | 2013-08-06 | 동서대학교산학협력단 | 마스크와 시간다중화 방식을 이용한 3차원 집적 영상 표시방법 |
US8699868B1 (en) | 2013-03-14 | 2014-04-15 | Microsoft Corporation | Anti-shake correction system for curved optical sensor |
KR20150066901A (ko) | 2013-12-09 | 2015-06-17 | 삼성전자주식회사 | 디스플레이 패널의 구동 장치 및 구동 방법 |
US9182605B2 (en) * | 2014-01-29 | 2015-11-10 | Emine Goulanian | Front-projection autostereoscopic 3D display system |
KR101617514B1 (ko) * | 2014-04-16 | 2016-05-13 | 광운대학교 산학협력단 | 멀티 프로젝션형 집적영상방법 |
CN105025284B (zh) | 2014-04-18 | 2019-02-05 | 北京三星通信技术研究有限公司 | 标定集成成像显示设备的显示误差的方法和设备 |
CN104407442A (zh) * | 2014-05-31 | 2015-03-11 | 福州大学 | 一种集成成像3d显示微透镜阵列及其3d制作方法 |
CN104113750B (zh) * | 2014-07-04 | 2015-11-11 | 四川大学 | 一种无深度反转的集成成像3d投影显示装置 |
DE102016113669A1 (de) * | 2016-07-25 | 2018-01-25 | Osram Opto Semiconductors Gmbh | Verfahren zur autostereoskopischen Bildgebung und autostereoskopische Beleuchtungseinheit |
DE102016224162A1 (de) | 2016-12-05 | 2018-06-07 | Continental Automotive Gmbh | Head-Up-Display |
IL269043B2 (en) | 2017-03-09 | 2024-02-01 | Univ Arizona | A complex light field display in the head with integral imaging and a waveguide prism |
JP7185303B2 (ja) | 2017-03-09 | 2022-12-07 | アリゾナ ボード オブ リージェンツ オン ビハーフ オブ ザ ユニバーシティ オブ アリゾナ | インテグラルイメージングおよびリレー光学部品を用いたヘッドマウント・ライトフィールド・ディスプレイ |
CN107301620B (zh) * | 2017-06-02 | 2019-08-13 | 西安电子科技大学 | 基于相机阵列的全景成像方法 |
JP7185331B2 (ja) | 2018-03-22 | 2022-12-07 | アリゾナ ボード オブ リージェンツ オン ビハーフ オブ ザ ユニバーシティ オブ アリゾナ | インテグラルイメージング方式のライトフィールドディスプレイ用にライトフィールド画像をレンダリングする方法 |
KR20210127744A (ko) | 2019-02-18 | 2021-10-22 | 알앤브이테크 엘티디 | 고 해상도 3d 디스플레이 |
CN114973935B (zh) * | 2021-02-23 | 2023-08-18 | 苏州佳世达电通有限公司 | 影像显示装置 |
EP4305487A1 (fr) * | 2021-03-09 | 2024-01-17 | Arizona Board of Regents on behalf of The University of Arizona | Dispositifs et procédés pour améliorer la performance d'affichages de champ lumineux basés sur une imagerie intégrale à l'aide de schémas de multiplexage temporel |
KR20240126940A (ko) * | 2023-02-14 | 2024-08-22 | 삼성디스플레이 주식회사 | 다시점 휘도 측정기 및 다시점 휘도 측정 방법 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020114077A1 (en) * | 2001-01-23 | 2002-08-22 | Bahram Javidi | Integral three-dimensional imaging with digital reconstruction |
US20030020809A1 (en) * | 2000-03-15 | 2003-01-30 | Gibbon Michael A | Methods and apparatuses for superimposition of images |
US20040061934A1 (en) * | 2000-12-18 | 2004-04-01 | Byoungho Lee | Reflecting three-dimensional display system |
US20040184145A1 (en) * | 2002-01-23 | 2004-09-23 | Sergey Fridman | Autostereoscopic display and method |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3921061A1 (de) * | 1989-06-23 | 1991-01-03 | Hertz Inst Heinrich | Wiedergabeeinrichtung fuer dreidimensionale wahrnehmung von bildern |
JP2005070255A (ja) * | 2003-08-22 | 2005-03-17 | Denso Corp | 虚像表示装置 |
US7261417B2 (en) * | 2004-02-13 | 2007-08-28 | Angstrom, Inc. | Three-dimensional integral imaging and display system using variable focal length lens |
-
2006
- 2006-08-03 CN CNA2006800313194A patent/CN101278565A/zh active Pending
- 2006-08-03 DE DE112006002095T patent/DE112006002095T5/de not_active Withdrawn
- 2006-08-03 WO PCT/US2006/030513 patent/WO2007019347A2/fr active Application Filing
- 2006-08-03 US US11/498,666 patent/US20070030543A1/en not_active Abandoned
-
2010
- 2010-11-04 US US12/939,647 patent/US8264772B2/en active Active
-
2012
- 2012-08-28 US US13/596,715 patent/US20120320161A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030020809A1 (en) * | 2000-03-15 | 2003-01-30 | Gibbon Michael A | Methods and apparatuses for superimposition of images |
US20040061934A1 (en) * | 2000-12-18 | 2004-04-01 | Byoungho Lee | Reflecting three-dimensional display system |
US20020114077A1 (en) * | 2001-01-23 | 2002-08-22 | Bahram Javidi | Integral three-dimensional imaging with digital reconstruction |
US20040184145A1 (en) * | 2002-01-23 | 2004-09-23 | Sergey Fridman | Autostereoscopic display and method |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8174464B2 (en) | 2005-12-02 | 2012-05-08 | Samsung Mobile Display Co., Ltd. | Two-dimensional and three-dimensional image selectable display device |
US20070126967A1 (en) * | 2005-12-02 | 2007-06-07 | Choi Kyung H | Two-dimensional and three-dimensional image selectable display device |
US20080169997A1 (en) * | 2007-01-16 | 2008-07-17 | Kyung-Ho Choi | Multi-dimensional image selectable display device |
JP2021152661A (ja) * | 2010-06-16 | 2021-09-30 | 株式会社ニコン | 画像表示装置 |
US9987875B2 (en) * | 2010-07-21 | 2018-06-05 | Giesecke+Devrient Mobile Security Gmbh | Optically variable security element with tilt image |
US20130193679A1 (en) * | 2010-07-21 | 2013-08-01 | Giesecke & Devrient | Optically variable security element with tilt image |
KR20140121529A (ko) * | 2013-04-05 | 2014-10-16 | 삼성전자주식회사 | 광 필드 영상을 생성하는 방법 및 장치 |
JP2014203462A (ja) * | 2013-04-05 | 2014-10-27 | 三星電子株式会社Samsung Electronics Co.,Ltd. | 光フィールド映像を生成する方法及び装置 |
EP2787734A3 (fr) * | 2013-04-05 | 2014-11-05 | Samsung Electronics Co., Ltd. | Appareil et procédé de formation d'image de champ lumineux |
US9536347B2 (en) | 2013-04-05 | 2017-01-03 | Samsung Electronics Co., Ltd. | Apparatus and method for forming light field image |
KR102049456B1 (ko) | 2013-04-05 | 2019-11-27 | 삼성전자주식회사 | 광 필드 영상을 생성하는 방법 및 장치 |
JP2015212795A (ja) * | 2014-05-07 | 2015-11-26 | 日本放送協会 | 立体映像表示装置 |
US10296098B2 (en) * | 2014-09-30 | 2019-05-21 | Mirama Service Inc. | Input/output device, input/output program, and input/output method |
CN107430277A (zh) * | 2015-01-21 | 2017-12-01 | 特塞兰德有限责任公司 | 用于沉浸式虚拟现实的高级折射光学器件 |
US10663626B2 (en) * | 2015-01-21 | 2020-05-26 | Tesseland, Llc | Advanced refractive optics for immersive virtual reality |
JP2017003688A (ja) * | 2015-06-08 | 2017-01-05 | 日本放送協会 | 光線制御素子および立体表示装置 |
US10944904B2 (en) | 2016-08-12 | 2021-03-09 | Avegant Corp. | Image capture with digital light path length modulation |
US11025893B2 (en) | 2016-08-12 | 2021-06-01 | Avegant Corp. | Near-eye display system including a modulation stack |
US10401639B2 (en) | 2016-08-12 | 2019-09-03 | Avegant Corp. | Method and apparatus for an optical path length extender |
WO2018031963A1 (fr) * | 2016-08-12 | 2018-02-15 | Avegant Corp. | Système d'affichage proche de l'oeil comprenant un empilement de modulation |
US10516879B2 (en) | 2016-08-12 | 2019-12-24 | Avegant Corp. | Binocular display with digital light path length modulation |
US10185153B2 (en) | 2016-08-12 | 2019-01-22 | Avegant Corp. | Orthogonal optical path length extender |
US10809546B2 (en) | 2016-08-12 | 2020-10-20 | Avegant Corp. | Digital light path length modulation |
US10057488B2 (en) | 2016-08-12 | 2018-08-21 | Avegant Corp. | Image capture with digital light path length modulation |
US11016307B2 (en) | 2016-08-12 | 2021-05-25 | Avegant Corp. | Method and apparatus for a shaped optical path length extender |
US10379388B2 (en) | 2016-08-12 | 2019-08-13 | Avegant Corp. | Digital light path length modulation systems |
US11042048B2 (en) | 2016-08-12 | 2021-06-22 | Avegant Corp. | Digital light path length modulation systems |
US10187634B2 (en) | 2016-08-12 | 2019-01-22 | Avegant Corp. | Near-eye display system including a modulation stack |
US12025811B2 (en) | 2016-08-12 | 2024-07-02 | Avegant Corp. | Polarized light direction system |
US11480784B2 (en) | 2016-08-12 | 2022-10-25 | Avegant Corp. | Binocular display with digital light path length modulation |
US11852890B2 (en) | 2016-08-12 | 2023-12-26 | Avegant Corp. | Near-eye display system |
US11852839B2 (en) | 2016-08-12 | 2023-12-26 | Avegant Corp. | Optical path length extender |
US11303878B2 (en) * | 2017-06-27 | 2022-04-12 | Boe Technology Group Co., Ltd. | Three-dimensional display panel, display method thereof, and display device |
Also Published As
Publication number | Publication date |
---|---|
US20120320161A1 (en) | 2012-12-20 |
DE112006002095T5 (de) | 2008-10-30 |
WO2007019347A2 (fr) | 2007-02-15 |
US8264772B2 (en) | 2012-09-11 |
WO2007019347A3 (fr) | 2007-06-07 |
CN101278565A (zh) | 2008-10-01 |
US20110043611A1 (en) | 2011-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8264772B2 (en) | Depth and lateral size control of three-dimensional images in projection integral imaging | |
JP5752823B2 (ja) | 粗インテグラルホログラフィックディスプレイ | |
Min et al. | Three-dimensional display system based on computer-generated integral photography | |
Lipton et al. | New autostereoscopic display technology: the synthaGram | |
US20050180019A1 (en) | Three-dimensional integral imaging and display system using variable focal length lens | |
US20020114077A1 (en) | Integral three-dimensional imaging with digital reconstruction | |
CN108803053B (zh) | 三维光场显示系统 | |
US9740015B2 (en) | Three-dimensional imaging system based on stereo hologram having nine-to-one microlens-to-prism arrangement | |
US9197877B2 (en) | Smart pseudoscopic-to-orthoscopic conversion (SPOC) protocol for three-dimensional (3D) display | |
CN107092096A (zh) | 一种裸眼3d地面沙盘显示系统及方法 | |
US20060256436A1 (en) | Integral three-dimensional imaging with digital reconstruction | |
RU2625815C2 (ru) | Устройство отображения | |
Javidi et al. | Breakthroughs in photonics 2014: recent advances in 3-D integral imaging sensing and display | |
TWI489149B (zh) | 立體顯示裝置及儲存媒體 | |
Jang et al. | Depth and lateral size control of three-dimensional images in projection integral imaging | |
Miyazaki et al. | Floating three-dimensional display viewable from 360 degrees | |
JP4741395B2 (ja) | 立体映像表示装置 | |
CN1598690A (zh) | 分屏式立体摄影、投影仪 | |
Arai | Three-dimensional television system based on spatial imaging method using integral photography | |
Javidi et al. | Enhanced 3D color integral imaging using multiple display devices | |
Kim et al. | Integral imaging with reduced color moiré pattern by using a slanted lens array | |
JP2012177756A (ja) | 立体画像取得装置 | |
Okaichi et al. | Integral three-dimensional display with high image quality using multiple flat-panel displays | |
Yang et al. | Projection-type integral imaging using a pico-projector | |
Moon et al. | Compensation of image distortion in Fresnel lens-based 3D projection display system using a curved screen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: THE UNIVERSITY OF CONNECTICUT, CONNECTICUT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAVIDI, BAHRAM;JANG (DEC'D) BY HYUNJU HA, WIFE AND LEGAL REPRESENTATIVE, JU-SEOG;REEL/FRAME:018452/0239;SIGNING DATES FROM 20061020 TO 20061023 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |