US20150116458A1 - Method and apparatus for generating enhanced 3d-effects for real-time and offline applications - Google Patents
Method and apparatus for generating enhanced 3d-effects for real-time and offline applications
- Publication number
- US20150116458A1 US20150116458A1 US14/522,278 US201414522278A US2015116458A1 US 20150116458 A1 US20150116458 A1 US 20150116458A1 US 201414522278 A US201414522278 A US 201414522278A US 2015116458 A1 US2015116458 A1 US 2015116458A1
- Authority
- US
- United States
- Prior art keywords
- image
- depth
- objects
- depth map
- video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H04N13/0018—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H04N13/0022—
-
- H04N13/0029—
-
- H04N13/0059—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/261—Image signal generators with monoscopic-to-stereoscopic image conversion
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Abstract
A method for adjusting and generating enhanced 3D-effects for 2D to 3D image and video conversion applications includes controlling a depth location of a zero parallax plane within a depth field of an image scene to adjust parallax of objects in the image scene, controlling a depth volume of objects in the image scene to one of either exaggerate or reduce 3D-effect of the image scene, controlling a depth location of a segmentation plane within the depth field of the image scene, dividing the objects in the image scene into a foreground group and a background group, selectively increasing or decreasing depth volume of objects in the foreground group, selectively increasing or decreasing depth separation of objects in the foreground group relative to the objects in the background group, and generating an updated depth map file for a 2D-image.
Description
- This application claims benefit of U.S. provisional patent application Ser. No. 61/897,787, filed Oct. 30, 2013, which is herein incorporated by reference.
- Embodiments here relate generally to the field of 2D to 3D video and image conversion performed either in real time or offline. More particularly, the embodiments relate to a method and apparatus for enhancing and/or exaggerating depth and negative parallax and adjusting the zero-parallax plane, also referred to as the screen plane, for 3D-image rendering on different 3D display technologies and formats.
- With the rising sales of 3D-enabled TVs and personal devices in the consumer segment, the need to release new and old movies in 3D is increasing. In the commercial application space, the use of large screen electronic billboards which can display attention-grabbing 3D-images for advertising or informational purposes has increased. Because of the increasing demand for creating 3D-content, the demand for tools that automatically or semi-automatically convert existing 2D content to 3D content increases. Enhancing the 3D-experience of consumers and viewers can produce further growth of the 3D entertainment and advertisement market. A demand exists for tools and services to generate stunning 3D-image effects.
- Traditionally, converting 2D videos to 3D for professional application starts with generating a depth map of the image for each video frame using a very labor intensive manual process of roto-scoping, where objects in each frame are manually and painstakingly traced by the artist and depth information for each object is painted by hand. For consumer applications such as built-in automated 2D to 3D function in 3D-TV or game consoles, the converted 3D-image suffers from extremely poor depth and pop-out effects. Moreover, there is no automated control to modify the zero-parallax plane position and artificially exaggerate pop-out or depth of selective objects for enhanced special-effects.
- Numerous research publications exist on methods of automatically generating a depth map from a mono-ocular 2D-image for the purpose of converting the 2D-image to a 3D-image. The methods range from very simplistic heuristics to very complicated and compute-intensive image analysis. Simple heuristics may be suitable for real-time conversion applications but provide poor 3D quality. On the other hand, complex mathematical analysis may provide good 3D-image quality but may not be suitable for real-time application and hardware implementation.
- A greyscale image represents the depth map of an image in which each pixel is assigned a value between and including 0 and 255. A value of 255 (100% white level) indicates the pixel is front-most and a value of 0 indicates the pixel is back-most. The depth value of a pixel is used to calculate the horizontal (x-axis) offset of the pixel for the left and right eye view images. In particular, if the calculated offset is w for a pixel at position (x, y) in the original image, then this pixel is placed at position (x+w, y) in the left image and (x−w, y) in the right image. If the value of the offset w for a pixel is positive, it creates a negative parallax where the pixel appears to pop out of the screen. Alternatively, if the value of the offset w for a pixel is negative, it creates a positive parallax where the pixel appears to be behind the screen plane. If the offset w is zero, the pixel appears on the screen plane. The larger the offset, the greater the disparity between the left and right eye views and hence the larger the depth inside the screen or pop-out of the screen. Hence, given a depth map for a 2D, or monocular, image, by selectively manipulating the offsets of the pixels for 3D rendering, it is possible to artificially enhance or exaggerate 3D effects in a scene, and these transformations can be done in real time or offline.
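The placement rule above (offset w sends the pixel at (x, y) to (x+w, y) in the left view and (x−w, y) in the right view) can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it ignores the occlusion handling and hole filling a production renderer needs, and colliding writes are resolved last-write-wins.

```python
import numpy as np

def render_stereo(image, offsets):
    """Scatter each source pixel to (x+w, y) in the left view and
    (x-w, y) in the right view, per the offset rule described above.

    image:   H x W x 3 uint8 array
    offsets: H x W integer array of per-pixel offsets w
    """
    h, w = offsets.shape
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    xs = np.arange(w)
    for y in range(h):
        # Clip so shifted pixels stay inside the frame.
        xl = np.clip(xs + offsets[y], 0, w - 1)
        xr = np.clip(xs - offsets[y], 0, w - 1)
        left[y, xl] = image[y, xs]
        right[y, xr] = image[y, xs]
    return left, right
```

A real renderer would process pixels in depth order (painter's algorithm) so that nearer pixels win collisions, and would fill the holes the shifts expose.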
-
FIG. 1 shows an exemplary block diagram of the system, according to one embodiment of the invention. -
FIG. 2 illustrates an exemplary transformation from a depth value for a pixel in the 2D image to calculate its offset for placement in the left and right eye view images. -
FIG. 3 illustrates four settings of an exemplary graphical user interface (GUI) where the user can move the location of the screen plane (also known as the zero plane) in the scene, according to one software embodiment of the invention. -
FIG. 4 illustrates a graphical user interface (GUI) for the user to control depth volume, according to one embodiment of the invention. -
FIG. 5 illustrates an exemplary method for exaggerating depth by adding a step offset for all depths equal to or greater than a user-defined value, according to one embodiment of the invention. In another embodiment, the slope of the depth to offset function is modified to exaggerate the 3D-effect. -
FIG. 6 illustrates an exemplary method for exaggerating depth by adding a step offset and scaling the slope of the depth to offset function for all depths equal to or greater than a user-defined value, according to one embodiment of the invention. -
FIG. 7 illustrates yet another exemplary method for exaggerating depth by using an exponential transfer function for depth to offset, according to one embodiment of the invention. -
FIG. 8 illustrates an exemplary flow chart for rendering an exaggerated 3D image, given a 2D image source and its depth map, according to one embodiment of the invention. - Embodiments here relate to a method, apparatus, system, and computer program for modifying, enhancing or exaggerating a 3D-image rendered from a mono-ocular (2D) image source and its depth map. In an interactive mode, a user can control and change the attributes and quality of the 3D-rendition of a 2D-image using a graphical user interface (GUI). Optionally, such control settings can be presented to the 3D-render engine as commands stored in a file and read by the 3D-rendering application or routine. These attributes and quality of the 3D image are not specific to a particular 3D-format but can be used for all 3D formats including but not limited to various stereo-3D formats and glasses-free multi-view auto-stereo formats. The embodiments can take advantage of the computing power of a general purpose CPU, GPU or dedicated FPGA or ASIC chip to process sequences of images from video frames of a streaming 2D-video to generate 3D video frames. Depending on the available processing capabilities of the processing unit and the complexity of the desired transformations, the conversion of 2D video frames to 3D can be done in real time. - In one embodiment, the enhanced 3D-experience may be implemented as a software application running on a computing device such as a personal computer, tablet computer or smart-phone. A user receives a streaming 2D-video from the internet or from a file stored on a local storage device. The user then uses the application GUI to adjust the quality and attributes of the 3D-video in an automatic 2D video to 3D conversion and displays it on the attached 3D display in real time. In one embodiment, the converted enhanced 3D-video can be stored back on the local or network storage device. - In one embodiment, the 2D to 3D conversion process is implemented as a software application running on a computing device such as a personal computer, tablet computer or smart-phone. A user loads a video from a file stored on a local or network attached storage device and uses the application to automatically, or in an interactive mode, convert the 2D video to 3D and store it back offline on the local or network attached disk. In one embodiment, the user settings for 3D attributes can be stored in a file using a pre-defined syntax such as XML and can be read in by the 2D to 3D conversion application and applied during the rendering of the 3D-video.
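The patent only specifies "a pre-defined syntax such as XML" for the stored settings; one way such a control file might look and be parsed is sketched below. The element names (`zeroPlane`, `maxDisparity`, `segmentationPlane`) are illustrative assumptions, not the patent's schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical settings file for the 3D-render engine.
SETTINGS = """\
<settings>
  <zeroPlane>127</zeroPlane>
  <maxDisparity>16</maxDisparity>
  <segmentationPlane>170</segmentationPlane>
</settings>"""

def load_settings(xml_text):
    """Parse the control settings into a plain dict of integers."""
    root = ET.fromstring(xml_text)
    return {child.tag: int(child.text) for child in root}
```

The conversion application would read such a file once per clip (or per frame, for animated settings) and hand the resulting dict to the render routine in place of interactive GUI input.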
- In one embodiment, the enhanced 3D render method is implemented in dedicated hardware such as an FPGA or a custom ASIC chip as an independent 3D-render application. In one embodiment, the enhanced 3D render method is implemented in dedicated hardware such as an FPGA or a custom ASIC chip as part of a larger 2D to 3D conversion application. In one embodiment, the enhanced 3D-render video conversion system is implemented as a stand-alone converter box. In one embodiment, the entire 2D to 3D video conversion system is implemented as a circuit board or a daughter card. In one embodiment, a stand-alone implementation of the conversion system can be attached to the output of a streaming video receiver, broadcast TV receiver, satellite-TV receiver or cable-TV receiver, and the output of the standalone converter box can be connected to 3D-displays.
- In one embodiment, the enhanced 3D render method is implemented as a software application utilizing the graphics processing unit (GPU) of a computing device such as a personal computer, tablet computer or smart-phone to enhance performance.
- In one embodiment, the system receives a 2D image and its depth map either separately but in a synchronized fashion, or together in a single frame, usually referred to as 2D+D format, and the software or hardware implementation of the enhanced 3D-render method uses them to produce the enhanced 3D-image.
-
FIG. 1 shows an exemplary block diagram of a 2D to 3D conversion process, according to one embodiment. In one embodiment, the process comprises receiving a single image frame or a sequence of image frames. The depth map generator block 102 generates the depth map 112 from the 2D-source image. In one embodiment, the depth map 112 is used by the enhanced 3D-render block 106, which generates a transformed depth map used to calculate new pixel displacements by the render engine. -
FIG. 2 illustrates one embodiment of the transformation from pixel depth to pixel offset in the 3D-image. Lines 101 and 102 are the linear transformations from depth to offset for the right and left eye view images. 103 represents a plane in the depth field where both the left and right eye view offsets are equal and zero. All objects with depths, and hence offsets, to the right of this plane will have negative parallax, meaning the objects will appear to pop out of the screen. All objects with depths, and hence offsets, to the left of this plane will have positive parallax, meaning the objects will appear to be behind the screen. -
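A minimal sketch of the linear depth-to-offset transfer of FIG. 2, with the zero-plane position of FIG. 3 as a parameter: depths at the zero plane map to offset 0, nearer pixels (larger depth values) get positive offsets (negative parallax, pop-out), farther pixels get negative offsets (positive parallax). The parameter names and default values are illustrative, not from the patent.

```python
def depth_to_offset(depth, zero_plane=127, max_disparity=16):
    """Linear depth-to-offset transfer for one pixel.

    depth:         greyscale depth value, 0 (back-most) .. 255 (front-most)
    zero_plane:    depth value that maps to the screen plane (offset 0)
    max_disparity: scale factor; the full 0..255 depth range spans
                   roughly -max_disparity .. +max_disparity pixels
    """
    return round((depth - zero_plane) * max_disparity / 255)
```

Moving `zero_plane` toward 255, as with GUI settings 204/206/208 in FIG. 3, pushes more of the scene behind the screen; moving it toward 0 makes more of the scene pop out.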
FIG. 3 illustrates one embodiment of a graphical user interface (GUI) 202 to enable the user to adjust the location of the zero plane, which is the point in the graph 201 where the two lines meet. The GUI 202 shows the offset of the zero plane to be zero. Different situations of this GUI are shown with different adjustments represented by the lines above them. GUI 204 shows an offset in which the zero plane position is 127 on the GUI, and the graphical representation is shown as 203. Similarly, GUI 206 shows the offset of 170, with the zero plane moving to the right as shown at 205, and GUI 208 shows the offset of 255, with the zero plane at the farthest right position. -
FIG. 4 illustrates one embodiment of a graphical user interface (GUI) 302 to enable the user to adjust the amount of depth in the 3D-image by adjusting the amount of disparity produced between the left and right eye views. GUI 304 shows a lower value for disparity. As shown by comparing 301 and 303, the lower values result in less depth. -
FIG. 5 illustrates two embodiments of a graphical user interface (GUI) consisting of controls 402, 404 and 406 that enable the user to artificially separate selected objects from background objects and pop them out. A step offset value 403 is used in one embodiment. A scaled slope 405 is used in another embodiment. The depth location where the offset or slope scaling is applied is indicated by 401 and is controlled by the GUI control 402. -
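The segmented transfer of FIGS. 5 and 6 can be sketched as a piecewise function: depths at or beyond a user-chosen segmentation plane get an extra step offset and, optionally, a steeper slope, artificially separating foreground objects from the background. All parameter names and values here are illustrative, and zero-plane handling is omitted for brevity.

```python
def depth_to_offset_segmented(depth, seg_plane=170, step=4,
                              slope_scale=2.0, base_slope=16 / 255):
    """Piecewise depth-to-offset transfer (cf. FIGS. 5-6).

    Depths below seg_plane follow the base linear transfer; depths at
    or above it get a step offset plus a slope scaled by slope_scale.
    """
    if depth < seg_plane:
        # background: plain linear transfer
        return depth * base_slope
    # foreground: jump by `step`, then climb with the scaled slope
    return (seg_plane * base_slope + step
            + (depth - seg_plane) * base_slope * slope_scale)
```

The discontinuity at `seg_plane` is exactly the artificial separation the GUI controls expose: objects just in front of the plane pop out noticeably more than objects just behind it.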
FIG. 6 illustrates one embodiment of a graphical user interface (GUI) where both the step and the slope scale are applied simultaneously. The GUI 502 with the values shown results in the representation shown as 503. -
FIG. 7 illustrates one embodiment where the depth to offset transformation is exponential. This creates an effect where all the background objects are squished flat, while the objects in the foreground have increasingly exaggerated depth and/or pop-out. In general, the exponential function can be replaced by any nonlinear, monotonic function to create special 3D-effects. -
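An exponential transfer such as FIG. 7 describes might be sketched as below: normalized to hit 0 at the back and the maximum disparity at the front, so that background depths compress toward flatness while foreground depths are increasingly exaggerated. The shape parameter `k` and defaults are illustrative assumptions.

```python
import math

def depth_to_offset_exp(depth, max_disparity=16, k=5.0):
    """Exponential depth-to-offset transfer (cf. FIG. 7).

    depth: 0 (back-most) .. 255 (front-most). Larger k squishes the
    background flatter and exaggerates the foreground more.
    """
    t = depth / 255.0
    # Normalize exp(k*t) so offset(0) == 0 and offset(255) == max_disparity.
    return max_disparity * (math.exp(k * t) - 1) / (math.exp(k) - 1)
```

As the text notes, any nonlinear monotonic function would do; monotonicity is what keeps the depth ordering of objects intact while the spacing between them is redistributed.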
FIG. 8 illustrates one embodiment of a flowchart for enhanced 3D-render method. At 800, the process obtains the control data needed for the further processing. This data may include maximum disparity, zero plane position, and the segmentation type, amount and location. At 802, the process calculates the offset for the right and left eye views using the pixel depth from the depth map and the control data. At 804, the process renders the right and left eye view using the offsets for each pixel. - Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments. The various appearances of “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. If the specification states a component, feature, structure, or characteristic “may,” “might,” or “could” be included, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
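The middle step of the flowchart, computing per-pixel offsets from the depth map plus the control data gathered at 800, might be vectorized as follows. The control-key names are assumptions for illustration, not the patent's terminology, and only a linear transfer with an optional segmentation step is shown.

```python
import numpy as np

def compute_offsets(depth_map, controls):
    """Step 802: per-pixel offsets from a greyscale depth map (0..255)
    and a dict of control data (zero plane, max disparity, optional
    segmentation plane and step)."""
    z0 = controls.get("zero_plane", 127)
    dmax = controls.get("max_disparity", 16)
    seg = controls.get("seg_plane")   # optional segmentation plane
    step = controls.get("seg_step", 0)
    off = (depth_map.astype(float) - z0) * dmax / 255.0
    if seg is not None:
        # pop the foreground group out by an extra step offset
        off = np.where(depth_map >= seg, off + step, off)
    return np.rint(off).astype(int)
```

Step 804 would then scatter each pixel to (x+w, y) in the left view and (x−w, y) in the right view using this offset array.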
- While the invention has been described in conjunction with specific embodiments thereof, many alternatives, modifications and variations of such embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description.
Claims (16)
1. A method for adjusting and generating enhanced 3D-effects for real time and offline 2D to 3D image and video conversion applications consisting of:
controlling a depth location of a zero parallax plane within a depth field of an image scene to adjust parallax of objects in the image scene;
controlling a depth volume of objects in the image scene to one of either exaggerate or reduce 3D-effect of the image scene;
controlling a depth location of a segmentation plane within the depth field of the image scene, dividing the objects in the image scene into a foreground group and a background group based on a location of the objects relative to the segmentation plane;
selectively increasing or decreasing depth volume of objects in the foreground group;
selectively increasing or decreasing depth separation of objects in the foreground group relative to the objects in the background group; and
generating an updated depth map file for a 2D-image based upon the controlling, and increasing and decreasing.
2. The method of claim 1, further comprising rendering an enhanced 3D-image using the updated depth map.
3. The method of claim 1, wherein the method further comprises a software application running on a computing device.
4. The method of claim 3, wherein the computing device comprises one of a server computer, personal computer, tablet computer or smart-phone, graphics processor unit.
5. The method of claim 1, further comprising receiving a 2D-still image or a streaming 2D-video from a network with an associated depth map.
6. The method of claim 1, further comprising reading a 2D-still image or a 2D-video from a file stored on a local or remote storage device with the associated depth map image.
7. The method of claim 1, further comprising generating a depth map for each 2D-still image or a sequence of depth maps for each frame in a 2D-video.
8. The method of claim 1, further comprising reading meta-instructions for depth map enhancement for the 2D-image or video from a file stored on a local or remote storage device.
9. The method of claim 1, further comprising enabling a user to enhance the depth map through one of a set of graphical user interfaces (GUI), command line instructions, and custom input devices.
10. The method of claim 2, wherein rendering a 3D image comprises one of rendering an anaglyph, stereo-3D or auto-stereo 3D using the enhanced depth map.
11. The method of claim 2, further comprising one of displaying the generated 3D image or video on an attached 3D display in real time, and storing the 3D image on local or remote storage device(s) for offline viewing.
12. The method of claim 1, further comprising storing the generated enhanced depth map as grey scale images on a storage device.
13. The method of claim 1, further comprising storing user modifications of the depth map as a sequence of instructions associated with each image in a control file using a pre-defined syntax.
14. The method of claim 1, wherein the method is executed by a dedicated hardware device.
15. The method of claim 1, wherein the method is executed by hardware contained in a stand-alone converter box.
16. The method of claim 1, wherein the method is implemented as one of a circuit board, a daughter card or any other plug-in card or module.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/522,278 US20150116458A1 (en) | 2013-10-30 | 2014-10-23 | Method and apparatus for generating enhanced 3d-effects for real-time and offline appplications |
US15/855,756 US10250864B2 (en) | 2013-10-30 | 2017-12-27 | Method and apparatus for generating enhanced 3D-effects for real-time and offline applications |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361897787P | 2013-10-30 | 2013-10-30 | |
US14/522,278 US20150116458A1 (en) | 2013-10-30 | 2014-10-23 | Method and apparatus for generating enhanced 3d-effects for real-time and offline appplications |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/855,756 Continuation US10250864B2 (en) | 2013-10-30 | 2017-12-27 | Method and apparatus for generating enhanced 3D-effects for real-time and offline applications |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150116458A1 true US20150116458A1 (en) | 2015-04-30 |
Family
ID=52994936
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/522,278 Abandoned US20150116458A1 (en) | 2013-10-30 | 2014-10-23 | Method and apparatus for generating enhanced 3d-effects for real-time and offline appplications |
US15/855,756 Active US10250864B2 (en) | 2013-10-30 | 2017-12-27 | Method and apparatus for generating enhanced 3D-effects for real-time and offline applications |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/855,756 Active US10250864B2 (en) | 2013-10-30 | 2017-12-27 | Method and apparatus for generating enhanced 3D-effects for real-time and offline applications |
Country Status (1)
Country | Link |
---|---|
US (2) | US20150116458A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140078262A1 (en) * | 2012-09-18 | 2014-03-20 | Lg Innotek Co., Ltd. | Image processing apparatus and camera module using the same |
US20160165205A1 (en) * | 2014-12-03 | 2016-06-09 | Shenzhen Estar Technology Group Co.,Ltd | Holographic displaying method and device based on human eyes tracking |
CN106713889A (en) * | 2015-11-13 | 2017-05-24 | 中国电信股份有限公司 | 3D frame rendering method and system and mobile terminal |
US20170155886A1 (en) * | 2015-06-24 | 2017-06-01 | Derek John Hartling | Colour-Z: Low-D Loading to High-D Processing |
US20170178298A1 (en) * | 2015-12-18 | 2017-06-22 | Canon Kabushiki Kaisha | System and method for adjusting perceived depth of an image |
CN108419446A (en) * | 2015-08-24 | 2018-08-17 | 高通股份有限公司 | Systems and methods for laser depth map sampling |
US10158847B2 (en) | 2014-06-19 | 2018-12-18 | Vefxi Corporation | Real—time stereo 3D and autostereoscopic 3D video and image editing |
CN109327694A (en) * | 2018-11-19 | 2019-02-12 | 威创集团股份有限公司 | 3D control room scene-switching method, apparatus, device and storage medium |
EP3398329A4 (en) * | 2015-12-30 | 2019-08-07 | Creative Technology Ltd. | A method for creating a stereoscopic image sequence |
CN110290374A (en) * | 2019-06-28 | 2019-09-27 | 宝琳创展国际文化科技发展(北京)有限公司 | Implementation method for naked-eye 3D |
CN112272295A (en) * | 2020-10-26 | 2021-01-26 | 腾讯科技(深圳)有限公司 | Method for generating a video with a three-dimensional effect, method for playing the video, apparatus, and device |
WO2024043435A1 (en) * | 2022-08-23 | 2024-02-29 | 삼성전자 주식회사 | Electronic device and method for generating an image in which the depth perceived by the viewer is enhanced |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115993739A (en) * | 2018-05-09 | 2023-04-21 | 群创光电股份有限公司 | Light emitting module |
CN114331828A (en) | 2020-09-30 | 2022-04-12 | 脸萌有限公司 | Method, apparatus, device and storage medium for converting a picture to video |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100080448A1 (en) * | 2007-04-03 | 2010-04-01 | Wa James Tam | Method and graphical user interface for modifying depth maps |
US20120120063A1 (en) * | 2010-11-11 | 2012-05-17 | Sony Corporation | Image processing device, image processing method, and program |
US20140132726A1 (en) * | 2012-11-13 | 2014-05-15 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
Family Cites Families (291)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5946376B2 (en) | 1977-03-11 | 1984-11-12 | 富士写真フイルム株式会社 | Stereoscopic photography device |
US6219015B1 (en) | 1992-04-28 | 2001-04-17 | The Board Of Directors Of The Leland Stanford, Junior University | Method and apparatus for using an array of grating light valves to produce multicolor optical images |
GB2272555A (en) | 1992-11-11 | 1994-05-18 | Sharp Kk | Stereoscopic display using a light modulator |
US6188518B1 (en) | 1993-01-22 | 2001-02-13 | Donald Lewis Maunsell Martin | Method and apparatus for use in producing three-dimensional imagery |
US5650876A (en) | 1993-03-25 | 1997-07-22 | De Montfort University | Lens system with intermediate optical transmission microlens screen |
JP4262305B2 (en) | 1993-05-05 | 2009-05-13 | アリオ ピエール | Autostereoscopic video device |
US5896225A (en) | 1993-05-24 | 1999-04-20 | Deutsche Thomson Brandt Gmbh | Device for stereoscopic image observation within an increased observation area |
EP0627644B1 (en) | 1993-06-01 | 2001-11-28 | Sharp Kabushiki Kaisha | Image display device with backlighting |
JPH07104276A (en) | 1993-10-08 | 1995-04-21 | Olympus Optical Co Ltd | Liquid crystal display device |
US6040807A (en) | 1993-11-15 | 2000-03-21 | Sanyo Electric Co., Ltd. | Three-dimensional display |
EP1209508B1 (en) | 1993-12-01 | 2004-10-27 | Sharp Kabushiki Kaisha | Display for 3D images |
JP2951202B2 (en) | 1994-02-23 | 1999-09-20 | 三洋電機株式会社 | 3D display without glasses |
DE19500315C1 (en) | 1995-01-07 | 1995-10-26 | Siegbert Prof Dr Ing Hentschke | Personal autostereoscopic viewing screen for TV or displays |
US5731853A (en) | 1995-02-24 | 1998-03-24 | Matsushita Electric Industrial Co., Ltd. | Display device |
US5825552A (en) | 1995-03-24 | 1998-10-20 | Eastman Kodak Company | Beamsplitter/staggerer for multi-beam laser printers |
JP2951264B2 (en) | 1995-05-24 | 1999-09-20 | 三洋電機株式会社 | 2D / 3D video compatible video display |
US5841579A (en) | 1995-06-07 | 1998-11-24 | Silicon Light Machines | Flat diffraction grating light valve |
JPH0918897A (en) | 1995-07-03 | 1997-01-17 | Canon Inc | Stereoscopic image display device |
US6377230B1 (en) | 1995-10-05 | 2002-04-23 | Semiconductor Energy Laboratory Co., Ltd. | Three dimensional display unit and display method |
GB2307058A (en) | 1995-11-13 | 1997-05-14 | Thomson Multimedia Sa | Stereoscopic display with lens,prism and barrier arrays |
EP0788008B1 (en) | 1996-01-31 | 2006-04-26 | Canon Kabushiki Kaisha | Stereoscopic image display apparatus whose observation area is widened |
US6064424A (en) | 1996-02-23 | 2000-05-16 | U.S. Philips Corporation | Autostereoscopic display apparatus |
JPH09289655A (en) | 1996-04-22 | 1997-11-04 | Fujitsu Ltd | Stereoscopic image display method, multi-view image input method, multi-view image processing method, stereo image display device, multi-view image input device, and multi-view image processing device |
US6020931A (en) | 1996-04-25 | 2000-02-01 | George S. Sheng | Video composition and position system and media signal communication system |
JP3443272B2 (en) | 1996-05-10 | 2003-09-02 | 三洋電機株式会社 | 3D image display device |
US5757545A (en) | 1996-05-24 | 1998-05-26 | Image Technology International, Inc. | Lenticular and barrier strip pictures with changeable scenes |
US6304263B1 (en) | 1996-06-05 | 2001-10-16 | Hyper3D Corp. | Three-dimensional display system: apparatus and method |
US6259450B1 (en) | 1996-06-05 | 2001-07-10 | Hyper3D Corp. | Three-dimensional display system apparatus and method |
US5914805A (en) | 1996-06-27 | 1999-06-22 | Xerox Corporation | Gyricon display with interstitially packed particles |
JP4083829B2 (en) | 1996-07-15 | 2008-04-30 | 富士通株式会社 | Stereoscopic image display device |
JP2846856B2 (en) | 1996-07-19 | 1999-01-13 | 三洋電機株式会社 | 3D image display device |
GB2317524A (en) | 1996-09-19 | 1998-03-25 | Sharp Kk | Three dimensional stereoscopic projection display |
GB2317710A (en) | 1996-09-27 | 1998-04-01 | Sharp Kk | Spatial light modulator and directional display |
WO1998027451A1 (en) | 1996-12-18 | 1998-06-25 | Technische Universität Dresden | Method and device for the three-dimensional representation of information |
US5822125A (en) | 1996-12-20 | 1998-10-13 | Eastman Kodak Company | Lenslet array system |
US5731899A (en) | 1996-12-20 | 1998-03-24 | Eastman Kodak Company | Lenslet array system incorporating an integral field lens/reimager lenslet array |
JPH10221643A (en) | 1997-01-31 | 1998-08-21 | Canon Inc | Stereoscopic picture display device |
US6157402A (en) | 1997-02-13 | 2000-12-05 | Torgeson; W. Lee | Autostereoscopic image presentation system using a screen assembly |
JP3595645B2 (en) | 1997-02-18 | 2004-12-02 | キヤノン株式会社 | 3D image display device |
US5781229A (en) | 1997-02-18 | 1998-07-14 | Mcdonnell Douglas Corporation | Multi-viewer three dimensional (3-D) virtual display system and operating method therefor |
JP3630906B2 (en) | 1997-02-18 | 2005-03-23 | キヤノン株式会社 | Stereoscopic image display device |
US5982553A (en) | 1997-03-20 | 1999-11-09 | Silicon Light Machines | Display device incorporating one-dimensional grating light-valve array |
ATE273598T1 (en) | 1997-03-27 | 2004-08-15 | Litton Systems Inc | AUTOSTEREOSCOPIC PROJECTION SYSTEM |
US5993003A (en) | 1997-03-27 | 1999-11-30 | Litton Systems, Inc. | Autostereo projection system |
US5900981A (en) | 1997-04-15 | 1999-05-04 | Scitex Corporation Ltd. | Optical system for illuminating a spatial light modulator |
US6266106B1 (en) | 1997-09-24 | 2001-07-24 | Sanyo Electric Co., Ltd. | Liquid crystal shutter glasses having inclined liquid crystal glasses |
US6088102A (en) | 1997-10-31 | 2000-07-11 | Silicon Light Machines | Display apparatus including grating light-valve array and interferometric optical system |
US6381072B1 (en) | 1998-01-23 | 2002-04-30 | Proxemics | Lenslet array systems and methods |
JPH11234703A (en) | 1998-02-09 | 1999-08-27 | Toshiba Corp | Stereoscopic display device |
US6226907B1 (en) | 1998-04-29 | 2001-05-08 | Eastman Chemical Company | Display having combination of visually moveable and stationary elements and process for making the same |
US6227669B1 (en) | 1998-05-26 | 2001-05-08 | Industrial Technology Research Institute | Illumination device and image projection apparatus comprising the device |
US6271808B1 (en) | 1998-06-05 | 2001-08-07 | Silicon Light Machines | Stereo head mounted display using a single display device |
US6048081A (en) | 1998-06-15 | 2000-04-11 | Richardson; Brian Edward | Beam divergence and shape controlling module for projected light |
US6130770A (en) | 1998-06-23 | 2000-10-10 | Silicon Light Machines | Electron gun activated grating light valve |
US6101036A (en) | 1998-06-23 | 2000-08-08 | Silicon Light Machines | Embossed diffraction grating alone and in combination with changeable image display |
US6215579B1 (en) | 1998-06-24 | 2001-04-10 | Silicon Light Machines | Method and apparatus for modulating an incident light beam for forming a two-dimensional image |
US6303986B1 (en) | 1998-07-29 | 2001-10-16 | Silicon Light Machines | Method of and apparatus for sealing an hermetic lid to a semiconductor die |
JP2000098299A (en) | 1998-09-18 | 2000-04-07 | Sanyo Electric Co Ltd | Stereoscopic video display device |
US6097554A (en) | 1999-01-05 | 2000-08-01 | Raytheon Company | Multiple dove prism assembly |
US6533420B1 (en) | 1999-01-22 | 2003-03-18 | Dimension Technologies, Inc. | Apparatus and method for generating and projecting autostereoscopic images |
KR20010009720A (en) | 1999-07-13 | 2001-02-05 | 박호군 | 3-Dimentional imaging screen for multi-viewer and fabrication method thereof |
WO2001014924A1 (en) | 1999-08-26 | 2001-03-01 | The Ohio State University | Device for producing optically-controlled incremental time delays |
US6388815B1 (en) | 2000-08-24 | 2002-05-14 | The Ohio State University | Device and method for producing optically-controlled incremental time delays |
EP1083755A3 (en) | 1999-09-07 | 2003-11-12 | Canon Kabushiki Kaisha | Image input apparatus and image display apparatus |
ATE278298T1 (en) | 1999-11-26 | 2004-10-15 | Sanyo Electric Co | METHOD FOR 2D/3D VIDEO CONVERSION |
HU0000752D0 (en) | 2000-02-21 | 2000-04-28 | | Pixel element for three-dimensional screen |
US6714173B2 (en) | 2000-06-16 | 2004-03-30 | Tdk Corporation | Three dimensional screen display |
US6775048B1 (en) | 2000-10-31 | 2004-08-10 | Microsoft Corporation | Microelectrical mechanical structure (MEMS) optical modulator and optical display system |
US6697042B1 (en) | 2000-11-27 | 2004-02-24 | Rainbow Displays, Inc. | Backlight assembly for collimated illumination |
US6795250B2 (en) | 2000-12-29 | 2004-09-21 | Lenticlear Lenticular Lens, Inc. | Lenticular lens array |
US6707591B2 (en) | 2001-04-10 | 2004-03-16 | Silicon Light Machines | Angled illumination for a single order light modulator based projection system |
TW476002B (en) | 2001-05-31 | 2002-02-11 | Ind Tech Res Inst | Vertical parallax barrier naked-eye three-dimensional display device |
US6747781B2 (en) | 2001-06-25 | 2004-06-08 | Silicon Light Machines, Inc. | Method, apparatus, and diffuser for reducing laser speckle |
US6782205B2 (en) | 2001-06-25 | 2004-08-24 | Silicon Light Machines | Method and apparatus for dynamic equalization in wavelength division multiplexing |
US6829092B2 (en) | 2001-08-15 | 2004-12-07 | Silicon Light Machines, Inc. | Blazed grating light valve |
US6574047B2 (en) | 2001-08-15 | 2003-06-03 | Eastman Kodak Company | Backlit display for selectively illuminating lenticular images |
DE60238691D1 (en) | 2001-08-21 | 2011-02-03 | Koninkl Philips Electronics Nv | AUTOSTEREOSCOPIC IMAGE DISPLAY DEVICE WITH USER SUCCESSION SYSTEM |
US6547628B1 (en) | 2001-10-03 | 2003-04-15 | Hasbro, Inc. | Electronic learning toy |
US20030067421A1 (en) | 2001-10-10 | 2003-04-10 | Alan Sullivan | Variable focusing projection system |
US6800238B1 (en) | 2002-01-15 | 2004-10-05 | Silicon Light Machines, Inc. | Method for domain patterning in low coercive field ferroelectrics |
US6760140B1 (en) | 2002-03-01 | 2004-07-06 | The Ohio State University Research Foundation | Binary optical interconnection |
US6724951B1 (en) | 2002-03-26 | 2004-04-20 | The Ohio State University | Using fibers as shifting elements in optical interconnection devices based on the white cell |
US6674939B1 (en) | 2002-03-26 | 2004-01-06 | The Ohio State University | Using fibers as delay elements in optical true-time delay devices based on the white cell |
US6766073B1 (en) | 2002-05-17 | 2004-07-20 | The Ohio State University | Optical circulator with large number of ports and no polarization-based components |
US6767751B2 (en) | 2002-05-28 | 2004-07-27 | Silicon Light Machines, Inc. | Integrated driver process flow |
US6728023B1 (en) | 2002-05-28 | 2004-04-27 | Silicon Light Machines | Optical device arrays with optimized image resolution |
US6822797B1 (en) | 2002-05-31 | 2004-11-23 | Silicon Light Machines, Inc. | Light modulator structure for producing high-contrast operation using zero-order light |
US6829258B1 (en) | 2002-06-26 | 2004-12-07 | Silicon Light Machines, Inc. | Rapidly tunable external cavity laser |
US6813059B2 (en) | 2002-06-28 | 2004-11-02 | Silicon Light Machines, Inc. | Reduced formation of asperities in contact micro-structures |
US6801354B1 (en) | 2002-08-20 | 2004-10-05 | Silicon Light Machines, Inc. | 2-D diffraction grating for substantially eliminating polarization dependent losses |
US6712480B1 (en) | 2002-09-27 | 2004-03-30 | Silicon Light Machines | Controlled curvature of stressed micro-structures |
WO2004047239A2 (en) | 2002-11-20 | 2004-06-03 | Mems Optical, Inc. | Laser diode bar integrator/reimager |
KR100490416B1 (en) | 2002-11-23 | 2005-05-17 | 삼성전자주식회사 | Apparatus capable of displaying selectively 2D image and 3D image |
US7236238B1 (en) | 2002-12-02 | 2007-06-26 | The Ohio State University | Method and apparatus for monitoring the quality of optical links |
US6958861B1 (en) | 2002-12-02 | 2005-10-25 | The Ohio State University | Method and apparatus for combining optical beams |
EP1437898A1 (en) | 2002-12-30 | 2004-07-14 | Koninklijke Philips Electronics N.V. | Video filtering for stereo images |
US7417782B2 (en) | 2005-02-23 | 2008-08-26 | Pixtronix, Incorporated | Methods and apparatus for spatial light modulation |
JP2004258163A (en) | 2003-02-25 | 2004-09-16 | Nec Corp | Stereoscopic image display device and stereoscopic image display method |
US6829077B1 (en) | 2003-02-28 | 2004-12-07 | Silicon Light Machines, Inc. | Diffractive light modulator with dynamically rotatable diffraction plane |
US6806997B1 (en) | 2003-02-28 | 2004-10-19 | Silicon Light Machines, Inc. | Patterned diffractive light modulator ribbon for PDL reduction |
US6877882B1 (en) | 2003-03-12 | 2005-04-12 | Delta Electronics, Inc. | Illumination system for a projection system |
GB2399653A (en) | 2003-03-21 | 2004-09-22 | Sharp Kk | Parallax barrier for multiple view display |
US8118674B2 (en) | 2003-03-27 | 2012-02-21 | Wms Gaming Inc. | Gaming machine having a 3D display |
US7518663B2 (en) | 2003-03-31 | 2009-04-14 | Koninklike Philips Electronics N.V. | Display device with multi-grooved light direction element and first and second alternating illuminated light sources simultaneously switched for 2D display and synchronously switched for 3D display |
KR100728204B1 (en) | 2003-06-02 | 2007-06-13 | 삼성에스디아이 주식회사 | Display device for both 2D and 3D video |
DE10339076B4 (en) | 2003-08-26 | 2007-10-31 | Seereal Technologies Gmbh | Autostereoscopic multi-user display |
GB2405519A (en) | 2003-08-30 | 2005-03-02 | Sharp Kk | A multiple-view directional display |
DE10340089B4 (en) | 2003-08-30 | 2005-12-22 | Seereal Technologies Gmbh | Sweet-spot beam splitter for image separation |
US20050083400A1 (en) | 2003-09-04 | 2005-04-21 | Yuzo Hirayama | Three-dimensional image display device, three-dimensional image display method and three-dimensional display image data generating method |
US7857700B2 (en) | 2003-09-12 | 2010-12-28 | Igt | Three-dimensional autostereoscopic image display for a gaming apparatus |
GB0400373D0 (en) | 2004-01-09 | 2004-02-11 | Koninkl Philips Electronics Nv | A three-dimensional display |
KR100759392B1 (en) | 2004-02-26 | 2007-09-19 | 삼성에스디아이 주식회사 | Stereoscopic display |
US7432878B1 (en) | 2004-04-19 | 2008-10-07 | The Trustees Of Columbia University In The City Of New York | Methods and systems for displaying three-dimensional images |
US7286280B2 (en) | 2004-05-07 | 2007-10-23 | The University Of British Columbia | Brightness enhancement film for backlit image displays |
GB0410551D0 (en) | 2004-05-12 | 2004-06-16 | Ller Christian M | 3d autostereoscopic display |
US20070222954A1 (en) | 2004-05-28 | 2007-09-27 | Sea Phone Co., Ltd. | Image Display Unit |
WO2005122596A1 (en) | 2004-06-08 | 2005-12-22 | Actuality Systems, Inc. | Optical scanning assembly |
US20060034567A1 (en) | 2004-07-16 | 2006-02-16 | Anderson Betty L | Optical beam combiner |
US7430347B2 (en) | 2004-07-16 | 2008-09-30 | The Ohio State University | Methods, systems, and apparatuses for optically generating time delays in signals |
US7633670B2 (en) | 2004-07-16 | 2009-12-15 | The Ohio State University | Methods, systems, and devices for steering optical beams |
US20060023065A1 (en) | 2004-07-31 | 2006-02-02 | Alden Ray M | Multiple program and 3D display with high resolution display and recording applications |
US7541060B2 (en) | 2004-08-17 | 2009-06-02 | Xerox Corporation | Bichromal balls |
US20070268590A1 (en) | 2004-08-19 | 2007-11-22 | Seereal Technologies Gmbh | Lenticule and Prism Unit |
US7311607B2 (en) | 2004-09-08 | 2007-12-25 | Igt | Three dimensional image display systems and methods for gaming machines |
FR2876804B1 (en) | 2004-10-18 | 2007-01-05 | Imagine Optic Sa | DEVICE AND METHOD FOR AUTOSTEREOSCOPIC VISUALIZATION BASED ON LENTICULAR, AND METHOD FOR SYNTHESIZING AUTOSTEREOSCOPIC IMAGES |
WO2006047487A2 (en) | 2004-10-25 | 2006-05-04 | The Trustees Of Columbia University In The City Of New York | Systems and methods for displaying three-dimensional images |
JP4708042B2 (en) | 2005-02-04 | 2011-06-22 | 株式会社 日立ディスプレイズ | 3D image display device |
US7616368B2 (en) | 2005-02-23 | 2009-11-10 | Pixtronix, Inc. | Light concentrating reflective display methods and apparatus |
US7405852B2 (en) | 2005-02-23 | 2008-07-29 | Pixtronix, Inc. | Display apparatus and methods for manufacture thereof |
US9229222B2 (en) | 2005-02-23 | 2016-01-05 | Pixtronix, Inc. | Alignment methods in fluid-filled MEMS displays |
US8310442B2 (en) | 2005-02-23 | 2012-11-13 | Pixtronix, Inc. | Circuits for controlling display apparatus |
US7502159B2 (en) | 2005-02-23 | 2009-03-10 | Pixtronix, Inc. | Methods and apparatus for actuating displays |
US8519945B2 (en) | 2006-01-06 | 2013-08-27 | Pixtronix, Inc. | Circuits for controlling display apparatus |
US7755582B2 (en) | 2005-02-23 | 2010-07-13 | Pixtronix, Incorporated | Display methods and apparatus |
US7746529B2 (en) | 2005-02-23 | 2010-06-29 | Pixtronix, Inc. | MEMS display apparatus |
US8482496B2 (en) | 2006-01-06 | 2013-07-09 | Pixtronix, Inc. | Circuits for controlling MEMS display apparatus on a transparent substrate |
US7304785B2 (en) | 2005-02-23 | 2007-12-04 | Pixtronix, Inc. | Display methods and apparatus |
US7675665B2 (en) | 2005-02-23 | 2010-03-09 | Pixtronix, Incorporated | Methods and apparatus for actuating displays |
US7271945B2 (en) | 2005-02-23 | 2007-09-18 | Pixtronix, Inc. | Methods and apparatus for actuating displays |
US8159428B2 (en) | 2005-02-23 | 2012-04-17 | Pixtronix, Inc. | Display methods and apparatus |
US9082353B2 (en) | 2010-01-05 | 2015-07-14 | Pixtronix, Inc. | Circuits for controlling display apparatus |
US7742016B2 (en) | 2005-02-23 | 2010-06-22 | Pixtronix, Incorporated | Display methods and apparatus |
US7304786B2 (en) | 2005-02-23 | 2007-12-04 | Pixtronix, Inc. | Methods and apparatus for bi-stable actuation of displays |
US9261694B2 (en) | 2005-02-23 | 2016-02-16 | Pixtronix, Inc. | Display apparatus and methods for manufacture thereof |
US9158106B2 (en) | 2005-02-23 | 2015-10-13 | Pixtronix, Inc. | Display methods and apparatus |
US20070205969A1 (en) | 2005-02-23 | 2007-09-06 | Pixtronix, Incorporated | Direct-view MEMS display devices and methods for generating images thereon |
US8675125B2 (en) | 2005-04-27 | 2014-03-18 | Parellel Consulting Limited Liability Company | Minimized-thickness angular scanner of electromagnetic radiation |
JP5294845B2 (en) | 2005-04-29 | 2013-09-18 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 3D display device |
US7651282B2 (en) | 2005-05-04 | 2010-01-26 | The Trustees Of Columbia University In The City Of New York | Devices and methods for electronically controlling imaging |
WO2007024313A1 (en) | 2005-05-27 | 2007-03-01 | Imax Corporation | Equipment and methods for the synchronization of stereoscopic projection displays |
TWI446004B (en) | 2005-06-14 | 2014-07-21 | Koninkl Philips Electronics Nv | Combined single/multiple view-display |
KR101170120B1 (en) | 2005-07-27 | 2012-07-31 | 삼성전자주식회사 | Stereoscopic display apparatus |
ATE551839T1 (en) | 2005-08-19 | 2012-04-15 | Koninkl Philips Electronics Nv | STEREOSCOPIC DISPLAY DEVICE |
EP1949171B1 (en) | 2005-11-09 | 2009-09-16 | Koninklijke Philips Electronics N.V. | Display device with homogenising attentuating filter |
CN101310294A (en) | 2005-11-15 | 2008-11-19 | 伯纳黛特·加纳 | Neural Network Training Method |
EP1964414A2 (en) | 2005-12-14 | 2008-09-03 | Koninklijke Philips Electronics N.V. | Autostereoscopic display device |
EP1969861A2 (en) | 2005-12-15 | 2008-09-17 | Michael Mehrle | Stereoscopic imaging apparatus incorporating a parallax barrier |
US8330881B2 (en) | 2005-12-20 | 2012-12-11 | Koninklijke Philips Electronics N.V. | Autostereoscopic display device |
US8526096B2 (en) | 2006-02-23 | 2013-09-03 | Pixtronix, Inc. | Mechanical light modulators with stressed beams |
US20070229778A1 (en) | 2006-03-28 | 2007-10-04 | Soohyun Cha | Time-multiplexed 3D display system with seamless multiple projection |
EP2014101B1 (en) | 2006-04-19 | 2016-02-24 | Setred AS | Bandwidth improvement for 3d display |
US20070255139A1 (en) | 2006-04-27 | 2007-11-01 | General Electric Company | User interface for automatic multi-plane imaging ultrasound system |
US7630598B2 (en) | 2006-05-10 | 2009-12-08 | The Ohio State University | Apparatus and method for providing an optical cross-connect |
US7911671B2 (en) | 2006-05-10 | 2011-03-22 | The Ohio State University | Apparatus and method for providing true time delay in optical signals using a Fourier cell |
US7876489B2 (en) | 2006-06-05 | 2011-01-25 | Pixtronix, Inc. | Display apparatus with optical cavities |
US8736557B2 (en) | 2006-09-11 | 2014-05-27 | Apple Inc. | Electronic device with image based browsers |
JP4197716B2 (en) | 2006-10-03 | 2008-12-17 | 株式会社東芝 | 3D image display device |
KR101255275B1 (en) | 2006-10-13 | 2013-04-15 | 엘지디스플레이 주식회사 | Steroscopic Liquid Crystal Display Device, method for Manufacturing the same and Bonding Apparatus for the same |
WO2008051362A1 (en) | 2006-10-20 | 2008-05-02 | Pixtronix, Inc. | Light guides and backlight systems incorporating light redirectors at varying densities |
US7586681B2 (en) | 2006-11-29 | 2009-09-08 | Honeywell International Inc. | Directional display |
US8736675B1 (en) | 2006-12-01 | 2014-05-27 | Zebra Imaging, Inc. | Multi-core processor architecture for active autostereoscopic emissive displays |
JP2010518417A (en) | 2007-01-03 | 2010-05-27 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Display device |
US9176318B2 (en) | 2007-05-18 | 2015-11-03 | Pixtronix, Inc. | Methods for manufacturing fluid-filled MEMS displays |
US7852546B2 (en) | 2007-10-19 | 2010-12-14 | Pixtronix, Inc. | Spacers for maintaining display apparatus alignment |
US8269822B2 (en) | 2007-04-03 | 2012-09-18 | Sony Computer Entertainment America, LLC | Display viewing system and methods for optimizing display view based on active tracking |
US7817045B2 (en) | 2007-05-30 | 2010-10-19 | Onderko John C | Handling system for exception RFID labels |
US20080316303A1 (en) | 2007-06-08 | 2008-12-25 | Joseph Chiu | Display Device |
US9088792B2 (en) | 2007-06-08 | 2015-07-21 | Reald Inc. | Stereoscopic flat panel display with synchronized backlight, polarization control panel, and liquid crystal display |
GB0716776D0 (en) | 2007-08-29 | 2007-10-10 | Setred As | Rendering improvement for 3D display |
CN101904120A (en) | 2007-12-21 | 2010-12-01 | 罗姆股份有限公司 | Information exchange device |
US8112722B2 (en) | 2008-02-21 | 2012-02-07 | Honeywell International Inc. | Method and system of controlling a cursor in a three-dimensional graphical environment |
US7750982B2 (en) | 2008-03-19 | 2010-07-06 | 3M Innovative Properties Company | Autostereoscopic display with fresnel lens element and double sided prism film adjacent a backlight having a light transmission surface with left and right eye light sources at opposing ends modulated at a rate of at least 90 hz |
US8248560B2 (en) | 2008-04-18 | 2012-08-21 | Pixtronix, Inc. | Light guides and backlight systems incorporating prismatic structures and light redirectors |
US8558961B2 (en) | 2008-04-22 | 2013-10-15 | Samsung Display Co., Ltd. | Display device and lenticular sheet of the display device |
KR101451933B1 (en) | 2008-04-22 | 2014-10-16 | 삼성디스플레이 주식회사 | Display apparatus and lenticular sheet included therein |
KR100939214B1 (en) | 2008-06-12 | 2010-01-28 | 엘지디스플레이 주식회사 | Alignment System and Method of Stereoscopic Display |
JP2009300815A (en) | 2008-06-16 | 2009-12-24 | Seiko Epson Corp | Display device |
US20100033813A1 (en) | 2008-08-05 | 2010-02-11 | Rogoff Gerald L | 3-D Display Requiring No Special Eyewear |
JP2012501465A (en) | 2008-08-27 | 2012-01-19 | ピュアデプス リミテッド | Improvements in and relating to electronic visual displays |
US8068271B2 (en) | 2008-10-22 | 2011-11-29 | Cospheric Llc | Rotating element transmissive displays |
US8169679B2 (en) | 2008-10-27 | 2012-05-01 | Pixtronix, Inc. | MEMS anchors |
TW201019018A (en) | 2008-11-04 | 2010-05-16 | Chunghwa Picture Tubes Ltd | Stereoscopic display device |
US8363100B2 (en) | 2008-11-19 | 2013-01-29 | Honeywell International Inc. | Three dimensional display systems and methods for producing three dimensional images |
KR101531391B1 (en) | 2008-12-18 | 2015-06-25 | 코닌클리케 필립스 엔.브이. | Autostereoscopic display device |
US8217910B2 (en) | 2008-12-19 | 2012-07-10 | Verizon Patent And Licensing Inc. | Morphing touch screen layout |
KR101547151B1 (en) | 2008-12-26 | 2015-08-25 | 삼성전자주식회사 | Image processing method and apparatus |
US7889425B1 (en) | 2008-12-30 | 2011-02-15 | Holovisions LLC | Device with array of spinning microlenses to display three-dimensional images |
WO2010095440A1 (en) | 2009-02-20 | 2010-08-26 | パナソニック株式会社 | Recording medium, reproduction device, and integrated circuit |
US20120057229A1 (en) | 2009-04-21 | 2012-03-08 | Ryo Kikuchi | Display apparatus |
US20100309290A1 (en) | 2009-06-08 | 2010-12-09 | Stephen Brooks Myers | System for capture and display of stereoscopic content |
US7978407B1 (en) | 2009-06-27 | 2011-07-12 | Holovisions LLC | Holovision (TM) 3D imaging with rotating light-emitting members |
US8731373B2 (en) | 2009-06-30 | 2014-05-20 | Rovi Technologies Corporation | Managing and editing stored media assets |
WO2011007757A1 (en) | 2009-07-13 | 2011-01-20 | Yoshida Kenji | Parallax barrier for autostereoscopic display, autostereoscopic display, and method for designing parallax barrier for autostereoscopic display |
US20110013258A1 (en) | 2009-07-14 | 2011-01-20 | Samsung Electro-Mechanics Co., Ltd. | Manufacturing method of electronic paper display device and electronic paper display device manufactured therefrom |
EP2282231A3 (en) | 2009-08-07 | 2011-05-04 | JDS Uniphase Corporation | Multi-segment optical retarder for creating 3d images |
DE102010021343A1 (en) | 2009-09-04 | 2011-03-10 | Volkswagen Ag | Method and device for providing information in a vehicle |
KR101596963B1 (en) | 2009-09-29 | 2016-02-23 | 엘지디스플레이 주식회사 | Stereoscopic image display device |
TWI417574B (en) | 2009-10-09 | 2013-12-01 | Chunghwa Picture Tubes Ltd | Zoom lens array and switchable two and three dimensional display |
EP2337362A3 (en) | 2009-12-21 | 2013-07-17 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
US8659830B2 (en) | 2009-12-21 | 2014-02-25 | 3M Innovative Properties Company | Optical films enabling autostereoscopy |
KR20120132680A (en) | 2010-02-02 | 2012-12-07 | 픽스트로닉스 인코포레이티드 | Methods for manufacturing cold seal fluid-filled display apparatus |
BR112012019383A2 (en) | 2010-02-02 | 2017-09-12 | Pixtronix Inc | CIRCUITS TO CONTROL DISPLAY APPARATUS |
US8472746B2 (en) | 2010-02-04 | 2013-06-25 | Sony Corporation | Fast depth map generation for 2D to 3D conversion |
KR20110096494A (en) | 2010-02-22 | 2011-08-30 | 엘지전자 주식회사 | Electronic device and stereoscopic image playback method |
US8587498B2 (en) | 2010-03-01 | 2013-11-19 | Holovisions LLC | 3D image display with binocular disparity and motion parallax |
US20110234605A1 (en) | 2010-03-26 | 2011-09-29 | Nathan James Smith | Display having split sub-pixels for multiple image display functions |
US8826184B2 (en) | 2010-04-05 | 2014-09-02 | Lg Electronics Inc. | Mobile terminal and image display controlling method thereof |
US8860672B2 (en) | 2010-05-26 | 2014-10-14 | T-Mobile Usa, Inc. | User interface with z-axis interaction |
US9030536B2 (en) | 2010-06-04 | 2015-05-12 | At&T Intellectual Property I, Lp | Apparatus and method for presenting media content |
US8402502B2 (en) | 2010-06-16 | 2013-03-19 | At&T Intellectual Property I, L.P. | Method and apparatus for presenting media content |
DE102010017458B4 (en) * | 2010-06-18 | 2012-02-16 | Phoenix Contact Gmbh & Co. Kg | electric plug |
US8508347B2 (en) | 2010-06-24 | 2013-08-13 | Nokia Corporation | Apparatus and method for proximity based input |
US8593574B2 (en) | 2010-06-30 | 2013-11-26 | At&T Intellectual Property I, L.P. | Apparatus and method for providing dimensional media content based on detected display capability |
US8640182B2 (en) | 2010-06-30 | 2014-01-28 | At&T Intellectual Property I, L.P. | Method for detecting a viewing apparatus |
US8918831B2 (en) | 2010-07-06 | 2014-12-23 | At&T Intellectual Property I, Lp | Method and apparatus for managing a presentation of media content |
US9049426B2 (en) | 2010-07-07 | 2015-06-02 | At&T Intellectual Property I, Lp | Apparatus and method for distributing three dimensional media content |
US8349059B2 (en) * | 2010-07-13 | 2013-01-08 | Peerless Manufacturing Co. | Pocketed cyclonic separator |
TWI439730B (en) | 2010-07-16 | 2014-06-01 | Au Optronics Corp | Parallax barrier and application thereof |
US9232274B2 (en) | 2010-07-20 | 2016-01-05 | At&T Intellectual Property I, L.P. | Apparatus for adapting a presentation of media content to a requesting device |
US9032470B2 (en) | 2010-07-20 | 2015-05-12 | At&T Intellectual Property I, Lp | Apparatus for adapting a presentation of media content according to a position of a viewing apparatus |
KR101704738B1 (en) | 2010-07-26 | 2017-02-08 | 한국전자통신연구원 | Holographic display with high resolution |
TW201219835A (en) | 2010-07-28 | 2012-05-16 | Unipixel Displays Inc | Two and three-dimensional image display with optical emission frequency control |
CN102346311B (en) | 2010-08-02 | 2014-11-05 | 群康科技(深圳)有限公司 | Display device and phase delay film |
US8994716B2 (en) | 2010-08-02 | 2015-03-31 | At&T Intellectual Property I, Lp | Apparatus and method for providing media content |
US8438502B2 (en) | 2010-08-25 | 2013-05-07 | At&T Intellectual Property I, L.P. | Apparatus for controlling three-dimensional images |
KR101685982B1 (en) | 2010-09-01 | 2016-12-13 | 엘지전자 주식회사 | Mobile terminal and Method for controlling 3 dimention display thereof |
US20120057006A1 (en) | 2010-09-08 | 2012-03-08 | Disney Enterprises, Inc. | Autostereoscopic display system and method |
US9207859B2 (en) | 2010-09-14 | 2015-12-08 | Lg Electronics Inc. | Method and mobile terminal for displaying fixed objects independent of shifting background images on a touchscreen |
US8786685B1 (en) | 2010-09-15 | 2014-07-22 | Rockwell Collins, Inc. | Full-resolution single-LCD stereoscopic display |
US8896664B2 (en) | 2010-09-19 | 2014-11-25 | Lg Electronics Inc. | Method and apparatus for processing a broadcast signal for 3D broadcast service |
US8610708B2 (en) | 2010-09-22 | 2013-12-17 | Raytheon Company | Method and apparatus for three-dimensional image reconstruction |
US8947511B2 (en) | 2010-10-01 | 2015-02-03 | At&T Intellectual Property I, L.P. | Apparatus and method for presenting three-dimensional media content |
JP6111197B2 (en) | 2010-10-01 | 2017-04-05 | サムスン エレクトロニクス カンパニー リミテッド | 3D display device using barrier and driving method thereof |
KR101728725B1 (en) | 2010-10-04 | 2017-04-20 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
US9043732B2 (en) | 2010-10-21 | 2015-05-26 | Nokia Corporation | Apparatus and method for user input for controlling displayed information |
US10146426B2 (en) | 2010-11-09 | 2018-12-04 | Nokia Technologies Oy | Apparatus and method for user input for controlling displayed information |
US9014462B2 (en) | 2010-11-10 | 2015-04-21 | Panasonic Intellectual Property Management Co., Ltd. | Depth information generating device, depth information generating method, and stereo image converter |
US9519153B2 (en) | 2010-11-19 | 2016-12-13 | Reald Inc. | Directional flat illuminators |
US9250448B2 (en) | 2010-11-19 | 2016-02-02 | Reald Inc. | Segmented directional backlight and related methods of backlight illumination |
US8651726B2 (en) | 2010-11-19 | 2014-02-18 | Reald Inc. | Efficient polarized directional backlight |
US20120154559A1 (en) | 2010-12-21 | 2012-06-21 | Voss Shane D | Generate Media |
US20120202187A1 (en) | 2011-02-03 | 2012-08-09 | Shadowbox Comics, Llc | Method for distribution and display of sequential graphic art |
EP2495602A1 (en) | 2011-03-01 | 2012-09-05 | Thomson Licensing | Autostereoscopic display and method for operating the same |
KR101852428B1 (en) | 2011-03-09 | 2018-04-26 | 엘지전자 주식회사 | Mobile terminal and 3D object control method thereof |
US20120229718A1 (en) | 2011-03-09 | 2012-09-13 | Yinkuei Huang | Direct-view adjustable lenticular 3D device and manufacturing process |
US8824821B2 (en) | 2011-03-28 | 2014-09-02 | Sony Corporation | Method and apparatus for performing user inspired visual effects rendering on an image |
EP2508920B1 (en) | 2011-04-08 | 2016-12-14 | Optosys SA | Method and device for monitoring moving objects |
KR101748668B1 (en) | 2011-04-14 | 2017-06-19 | 엘지전자 주식회사 | Mobile terminal and 3D image controlling method thereof |
US20120274626A1 (en) | 2011-04-29 | 2012-11-01 | Himax Media Solutions, Inc. | Stereoscopic Image Generating Apparatus and Method |
US8937767B2 (en) | 2011-05-11 | 2015-01-20 | Academia Sinica | Auto-stereoscopic display and three-dimensional imaging double-sided mirror array |
TWI404893B (en) | 2011-05-13 | 2013-08-11 | 南臺科技大學 | An illuminating device without a light guide board |
US9024927B2 (en) | 2011-06-15 | 2015-05-05 | Semiconductor Energy Laboratory Co., Ltd. | Display device and method for driving the same |
US9030522B2 (en) | 2011-06-24 | 2015-05-12 | At&T Intellectual Property I, Lp | Apparatus and method for providing media content |
US8947497B2 (en) | 2011-06-24 | 2015-02-03 | At&T Intellectual Property I, Lp | Apparatus and method for managing telepresence sessions |
US8359556B1 (en) * | 2011-06-29 | 2013-01-22 | International Business Machines Corporation | Resolving double patterning conflicts |
US8587635B2 (en) | 2011-07-15 | 2013-11-19 | At&T Intellectual Property I, L.P. | Apparatus and method for providing media services with telepresence |
KR101888672B1 (en) | 2011-07-27 | 2018-08-16 | 엘지디스플레이 주식회사 | Streoscopic image display device and method for driving thereof |
WO2013028944A1 (en) | 2011-08-24 | 2013-02-28 | Reald Inc. | Autostereoscopic display with a passive cycloidal diffractive waveplate |
KR101287786B1 (en) * | 2011-09-22 | 2013-07-18 | 엘지전자 주식회사 | Method for displaying stereoscopic image and display apparatus thereof |
US9363498B2 (en) * | 2011-11-11 | 2016-06-07 | Texas Instruments Incorporated | Method, system and computer program product for adjusting a convergence plane of a stereoscopic image |
US8897542B2 (en) | 2011-12-15 | 2014-11-25 | Sony Corporation | Depth map generation based on soft classification |
WO2013109252A1 (en) | 2012-01-17 | 2013-07-25 | Thomson Licensing | Generating an image for another view |
US9143754B2 (en) | 2012-02-02 | 2015-09-22 | Cyberlink Corp. | Systems and methods for modifying stereoscopic images |
US20140309865A1 (en) | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Facial recognition database created from social networking sites |
US9188731B2 (en) | 2012-05-18 | 2015-11-17 | Reald Inc. | Directional backlight |
US9235057B2 (en) | 2012-05-18 | 2016-01-12 | Reald Inc. | Polarization recovery in a directional display device |
US20140013272A1 (en) * | 2012-07-06 | 2014-01-09 | Navico Holding As | Page Editing |
US8947385B2 (en) | 2012-07-06 | 2015-02-03 | Google Technology Holdings LLC | Method and device for interactive stereoscopic display |
US8917441B2 (en) | 2012-07-23 | 2014-12-23 | Reald Inc. | Observer tracking autostereoscopic display |
KR101470693B1 (en) | 2012-07-31 | 2014-12-08 | 엘지디스플레이 주식회사 | Image data processing method and stereoscopic image display using the same |
TWI467237B (en) | 2012-08-03 | 2015-01-01 | Au Optronics Corp | Stereoscopic display and stereoscopic display device |
WO2014055695A1 (en) | 2012-10-02 | 2014-04-10 | Reald Inc. | Temporally multiplexed display with landscape and portrait operation modes |
US9134552B2 (en) | 2013-03-13 | 2015-09-15 | Pixtronix, Inc. | Display apparatus with narrow gap electrostatic actuators |
US10379278B2 (en) | 2013-03-15 | 2019-08-13 | Ideal Industries Lighting Llc | Outdoor and/or enclosed structure LED luminaire having outward illumination |
US9261641B2 (en) | 2013-03-25 | 2016-02-16 | 3M Innovative Properties Company | Dual-sided film with compound prisms |
US8988343B2 (en) | 2013-03-29 | 2015-03-24 | Panasonic Intellectual Property Management Co., Ltd. | Method of automatically forming one three-dimensional space with multiple screens |
KR102111407B1 (en) | 2013-08-19 | 2020-05-15 | 엘지전자 주식회사 | Display apparatus and method for operating the same |
US10108258B2 (en) | 2013-09-06 | 2018-10-23 | Intel Corporation | Multiple viewpoint image capture of a display user |
US20150185957A1 (en) | 2013-12-30 | 2015-07-02 | Hannstar Display Corporation | Touch naked eyes stereoscopic display |
CN103838034A (en) | 2014-02-07 | 2014-06-04 | 京东方科技集团股份有限公司 | Backlight module and dual-view display device |
TWI514006B (en) | 2014-03-11 | 2015-12-21 | Au Optronics Corp | Multi-view display |
- 2014-10-23 US US14/522,278 patent/US20150116458A1/en not_active Abandoned
- 2017-12-27 US US15/855,756 patent/US10250864B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100080448A1 (en) * | 2007-04-03 | 2010-04-01 | Wa James Tam | Method and graphical user interface for modifying depth maps |
US20120120063A1 (en) * | 2010-11-11 | 2012-05-17 | Sony Corporation | Image processing device, image processing method, and program |
US20140132726A1 (en) * | 2012-11-13 | 2014-05-15 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140078262A1 (en) * | 2012-09-18 | 2014-03-20 | Lg Innotek Co., Ltd. | Image processing apparatus and camera module using the same |
US9736453B2 (en) * | 2012-09-18 | 2017-08-15 | Lg Innotek Co., Ltd. | Method for encoding a stereoscopic image |
US10158847B2 (en) | 2014-06-19 | 2018-12-18 | Vefxi Corporation | Real-time stereo 3D and autostereoscopic 3D video and image editing |
US20160165205A1 (en) * | 2014-12-03 | 2016-06-09 | Shenzhen Estar Technology Group Co.,Ltd | Holographic displaying method and device based on human eyes tracking |
US20170155886A1 (en) * | 2015-06-24 | 2017-06-01 | Derek John Hartling | Colour-Z: Low-D Loading to High-D Processing |
US11915502B2 (en) | 2015-08-24 | 2024-02-27 | Qualcomm Incorporated | Systems and methods for depth map sampling |
CN108419446A (en) * | 2015-08-24 | 2018-08-17 | 高通股份有限公司 | Systems and methods for laser depth map sampling |
CN106713889A (en) * | 2015-11-13 | 2017-05-24 | 中国电信股份有限公司 | 3D frame rendering method and system and mobile terminal |
US10198794B2 (en) * | 2015-12-18 | 2019-02-05 | Canon Kabushiki Kaisha | System and method for adjusting perceived depth of an image |
US20170178298A1 (en) * | 2015-12-18 | 2017-06-22 | Canon Kabushiki Kaisha | System and method for adjusting perceived depth of an image |
EP3398329A4 (en) * | 2015-12-30 | 2019-08-07 | Creative Technology Ltd. | A method for creating a stereoscopic image sequence |
US10602123B2 (en) | 2015-12-30 | 2020-03-24 | Creative Technology Ltd | Method for creating a stereoscopic image sequence |
CN109327694A (en) * | 2018-11-19 | 2019-02-12 | 威创集团股份有限公司 | Scene switching method, apparatus, device and storage medium for a 3D control room |
CN110290374A (en) * | 2019-06-28 | 2019-09-27 | 宝琳创展国际文化科技发展(北京)有限公司 | Implementation method for naked-eye 3D |
CN112272295A (en) * | 2020-10-26 | 2021-01-26 | 腾讯科技(深圳)有限公司 | Method for generating video with three-dimensional effect, method for playing video, device and equipment |
WO2022089168A1 (en) * | 2020-10-26 | 2022-05-05 | 腾讯科技(深圳)有限公司 | Generation method and apparatus and playback method and apparatus for video having three-dimensional effect, and device |
US12243561B2 (en) | 2020-10-26 | 2025-03-04 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for generating video with 3D effect, method and apparatus for playing video with 3D effect, and device |
WO2024043435A1 (en) * | 2022-08-23 | 2024-02-29 | 삼성전자 주식회사 | Electronic device and method for generating image in which depth recognized by viewer is reinforced |
Also Published As
Publication number | Publication date |
---|---|
US20180139432A1 (en) | 2018-05-17 |
US10250864B2 (en) | 2019-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10250864B2 (en) | Method and apparatus for generating enhanced 3D-effects for real-time and offline applications | |
JP6563453B2 (en) | Generation of a depth map for an input image using an exemplary approximate depth map associated with an exemplary similar image | |
US9445072B2 (en) | Synthesizing views based on image domain warping | |
US10095953B2 (en) | Depth modification for display applications | |
US20130141550A1 (en) | Method, apparatus and computer program for selecting a stereoscopic imaging viewpoint pair | |
US20120075291A1 (en) | Display apparatus and method for processing image applied to the same | |
US10127714B1 (en) | Spherical three-dimensional video rendering for virtual reality | |
CN103369353A (en) | Integrated Stereo Converter for Internet-Based Networks | |
US20140363100A1 (en) | Method and apparatus for real-time conversion of 2-dimensional content to 3-dimensional content | |
US20130321409A1 (en) | Method and system for rendering a stereoscopic view | |
Smolic et al. | Disparity-aware stereo 3d production tools | |
CN102026012B (en) | Generation method and device of depth map through three-dimensional conversion to planar video | |
KR102459850B1 (en) | Method and apparatus for processing 3-dimension image, and graphic processing unit | |
US20120121163A1 (en) | 3d display apparatus and method for extracting depth of 3d image thereof | |
US10152818B2 (en) | Techniques for stereo three dimensional image mapping | |
US9967546B2 (en) | Method and apparatus for converting 2D-images and videos to 3D for consumer, commercial and professional applications | |
Yao et al. | A real-time full HD 2D-to-3D video conversion system based on FPGA | |
US9924150B2 (en) | Techniques for stereo three dimensional video processing | |
CN103108201A (en) | Stereoscopic image display device and dynamic depth image generation method | |
KR20130081569A (en) | Apparatus and method for outputting 3d image | |
CN102419707B (en) | Presenting two-dimensional elements in 3D stereo applications | |
KR101369006B1 (en) | Apparatus and method for displaying multiview image | |
Huang et al. | P-8.13: Low-cost multi-view image synthesis method for autostereoscopic display | |
CN119135868A (en) | A method and device for generating binocular images based on monocular images | |
CN102724527A (en) | Depth evaluation method capable of configuring multi-scenario model and system using same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: BARKATECH CONSULTING, LLC, OREGON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: BARKATULLAH, JAVED SABIR; REEL/FRAME: 034022/0754. Effective date: 20141023 |
| AS | Assignment | Owner name: VEFXI CORPORATION, OREGON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: BARKATECH CONSULTING, LLC; REEL/FRAME: 036443/0019. Effective date: 20150811 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |