US20110249757A1 - Method and device for overlaying 3d graphics over 3d video - Google Patents
- Publication number
- US20110249757A1 (application US 13/139,925)
- Authority
- US
- United States
- Prior art keywords
- information
- video
- frame
- overlay
- video information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N 13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N 13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N 13/156—Mixing image signals
- H04N 13/161—Encoding, multiplexing or demultiplexing different image signal components
- H04N 13/172—Processing image signals comprising non-image signal components, e.g. headers or format information
- H04N 13/183—On-screen display [OSD] information, e.g. subtitles or menus
- H04N 13/189—Recording image signals; Reproducing recorded image signals
- H04N 19/597—Predictive coding specially adapted for multi-view video sequence encoding
- G11B 20/00007—Time or data compression or expansion
- G11B 2020/00072—Time or data compression or expansion, the compressed signal including a video signal
- G11B 20/10527—Audio or video recording; Data buffering arrangements
- G11B 2020/1062—Data buffering arrangements, e.g. recording or playback buffers
- G11B 27/036—Insert-editing
Definitions
- The 3D display can either be a display device that makes use of controllable glasses to control the images displayed to the left and right eye respectively, or, in a preferred embodiment, a so-called autostereoscopic display.
- A number of autostereoscopic devices that are able to switch between 2D and 3D display are known, one of them being described in U.S. Pat. No. 6,069,650, in which the display device comprises an LCD display with an actively switchable liquid-crystal lenticular lens.
- Processing inside a rendering unit 16 converts the decoded video information received via the interface 12 from the player device 10 into multiple views and maps these onto the sub-pixels of the display panel 17. Alternatively, the rendering unit 16 may reside inside the player device 10, in which case the multiple views are sent over the interface.
- The front end unit may be adapted to read the video stream from an optical disc or another storage medium such as flash memory, or to receive the video information via a wired or wireless network, such as an internet connection.
- A known example of a Blu-ray™ player is the PlayStation™ 3, as sold by Sony Corporation.
- BD systems also provide a fully programmable application environment with network connectivity, thereby enabling the content provider to create interactive content. This mode is based on the Java™ platform and is known as "BD-J". BD-J defines a subset of the Digital Video Broadcasting (DVB) Multimedia Home Platform (MHP) Specification 1.0, publicly available as ETSI TS 101 812.
- FIG. 2 illustrates the graphics processing unit (part of the processing unit 13) of a known 2D video player, namely a Blu-ray player.
- The graphics processing unit is equipped with two read buffers (1304 and 1305), two preloading buffers (1302 and 1303) and two switches (1306 and 1307).
- The second read buffer (1305) enables the supply of an Out-of-Mux audio stream to the decoder even while the main MPEG stream is being decoded.
- The preloading buffers cache Text subtitles, Interactive Graphics and sound effects (which are presented at Button selection or activation).
- The preloading buffer 1303 stores data before movie playback begins and supplies data for presentation even while the main MPEG stream is being decoded.
- The switch 1301 between the data input and the buffers selects the appropriate buffer to receive packet data from any one of the read buffers or preloading buffers: effect sounds data (if it exists), text subtitle data (if it exists) and Interactive Graphics (if preloaded Interactive Graphics exist) are routed through this switch to their buffers.
- The main MPEG stream is sent to the primary read buffer (1304) and the Out-of-Mux stream is sent to the secondary read buffer (1305) by the switch 1301.
- FIG. 3 shows schematically the composition of video planes in known Blu-Ray (BD) systems.
- Two independent full graphics planes (32, 33), which are composited on the video plane (31), are present.
- One graphics plane (32) is assigned to subtitling applications (Presentation Graphics or Text Subtitles) and the other plane (33) is assigned to interactive applications (HDMV or BD-J mode interactivity graphics).
- The main video plane (1310), the presentation plane (1309) and the graphics plane (1308) are supplied by the corresponding decoders, and the three planes are overlayed by an overlayer 1311 and outputted.
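The plane composition described above amounts to alpha-blending the presentation plane and then the interactive-graphics plane over the video plane. The following Python fragment is an illustrative sketch only; the pixel representation and function names are assumptions, not part of any BD specification:

```python
# Sketch of the FIG. 3 plane composition: each pixel is an (r, g, b, a)
# tuple with alpha in [0, 1]; planes are flat lists of pixels.

def blend(dst, src):
    """Alpha-blend one source pixel over one destination pixel."""
    r, g, b, a = src
    dr, dg, db, _ = dst
    return (r * a + dr * (1 - a),
            g * a + dg * (1 - a),
            b * a + db * (1 - a),
            1.0)

def composite_planes(video, presentation, graphics):
    """Overlay the presentation and graphics planes onto the video plane."""
    out = []
    for v, p, g in zip(video, presentation, graphics):
        px = blend(v, p)   # subtitles (presentation plane) first
        px = blend(px, g)  # interactive graphics on top
        out.append(px)
    return out
```

A fully transparent graphics pixel (alpha 0) leaves the underlying result untouched, mirroring how an empty graphics plane does not obscure the video.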
- FIG. 4 illustrates schematically a graphics processing unit (13) according to the invention.
- This specific example constitutes an improvement of the known graphics processing unit in BD systems, but the concepts described herein are directly applicable to all graphics processing units in video players, as the decoder models of the various types of video players are similar.
- Autostereoscopic displays require a different interface format: the 2D+depth video format. Besides the 2D video, an additional video stream is used to send depth information. The display combines the video streams in the rendering stage and calculates the resulting 3D picture.
- A possible interface format is sending the frames from both videos time-interleaved to the display. This means that at time T a frame from the first video stream (left or 2D) is sent, and at time T+1 a frame from the second video stream (right or depth) is sent.
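The time-interleaved interface format can be sketched as follows; the generator and the stream labels are illustrative assumptions, not taken from the patent:

```python
# Sketch of the time-sequential output order: frames from the main
# stream (left or 2D) and the additional stream (right or depth) are
# sent alternately, so the frame at time T is a main frame and the
# frame at time T+1 is the accompanying additional frame.

def interleave(main_frames, additional_frames):
    """Yield (frame_type, frame) pairs in time-interleaved order."""
    for main, additional in zip(main_frames, additional_frames):
        yield ("main", main)              # time T
        yield ("additional", additional)  # time T+1
```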
- Overlay graphics are for example used to display subtitles or to create a selection menu.
- Blu-ray overlay graphics are read from disc (presentation graphics and interactive graphics) or generated in real time (BD-J graphics, OSD displays and text based subtitles).
- Outputting the video in a time-sequential interface format greatly affects the performance requirements of the drawing routines for the real-time generated overlay graphics, in particular those of BD-J graphics. This is because the graphics plane can no longer simply be composited with the video output, since the video output switches between the two different video streams each frame. As an example, at time T the video plane could contain the 2D view, and at time T+1 the video plane contains the accompanying depth information for the frame at time T.
- The BD-J graphics that need to be composited with the video at time T greatly differ from the BD-J graphics that need to be composited with the video at time T+1 (the depth graphics).
- A graphics processing unit, in particular the BD-J drawing, is not fast enough to frame-accurately update its graphics plane with these different graphics every frame.
- The solution according to the invention is to implement two buffers in the graphics unit. Each buffer is assigned to one of the output video streams. For example, for 2D+depth drawing, one buffer could be assigned for the graphics overlay over the 2D frame and one buffer for the graphics overlay over the depth frame. For L+R, similarly, one buffer could be used for the graphics overlay over the L frame and one buffer for the overlay over the R frame.
- The advantage of this solution is that the slow graphics drawing is decoupled from the frame-accurate overlaying engine, so that the processing requirements are significantly reduced.
- A Java application 41 running on a Java Virtual Machine generates overlay information and sends it to the graphics processing unit (via an API). It is noted that the source of the overlay information is not important; overlay information for a graphics plane could also be other graphics from disc or OSD (On Screen Display) information.
- The graphics processing unit comprises two buffers 42 and 43. Each buffer communicates with a controller (45), the controller preferably comprising a frame accurate area copier. Timing information is sent from the drawing application (41) and from the video decoder (47) to the graphics processing unit.
- The frame accurate area copier can then composite the correct buffer onto the graphics output plane, according to which video frame is currently being decoded onto the video output plane (this is known from the time info from the video source).
- The frame accurate area copier ensures that the mixer composites the correct BD-J graphics over the video frame that is currently outputted. For 2D+depth this means that the 2D graphics buffer is copied onto the graphics plane when a 2D video frame is decoded, and the depth (DOT) graphics buffer is copied onto the graphics plane when a depth frame is decoded.
- For L+R graphics this ensures that the L real-time graphics are overlayed over the L frame and the R real-time graphics over the R frame.
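The two-buffer scheme of FIG. 4 might be sketched as below; the class and method names are illustrative assumptions, not taken from the patent or any BD specification. The drawing application updates the buffers at its own (slow) pace, while at frame frequency the controller merely selects the buffer matching the type of the frame being output:

```python
# Sketch of the dual-buffer overlay unit: one buffer per output video
# stream, selected per frame by the determined frame type.

class DualBufferOverlay:
    def __init__(self):
        # One overlay buffer per stream: main (2D/L) and additional (depth/R).
        self.buffers = {"main": None, "additional": None}

    def draw(self, frame_type, overlay):
        """Called by the slow drawing application (e.g. BD-J graphics)."""
        self.buffers[frame_type] = overlay

    def compose(self, frame_type, video_frame):
        """Called at frame frequency: mix the matching buffer over the frame."""
        overlay = self.buffers[frame_type]
        if overlay is None:
            return video_frame
        return (video_frame, overlay)  # stand-in for the mixer output
```

Because `compose` only selects and mixes, the per-frame work stays constant regardless of how long `draw` takes, which is the decoupling the invention describes.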
- The invention may be implemented in hardware and/or software, using programmable components.
- A method for implementing the invention has the processing steps corresponding to the rendering system elucidated with reference to FIG. 1.
- Although the invention has been mainly explained by embodiments using optical record carriers or the internet, the invention is also suitable for any image processing environment, like authoring software or broadcasting equipment. Further applications include a 3D personal computer [PC] user interface or 3D media center PC, a 3D mobile player and a 3D mobile phone.
Abstract
A method of decoding and outputting video information suitable for three-dimensional [3D] display, the video information comprising encoded main video information suitable for displaying on a 2D display and encoded additional video information for enabling three-dimensional [3D] display, the method comprising: receiving or generating three-dimensional [3D] overlay information to be overlayed over the video information; buffering a first part of the overlay information, to be overlayed over the main video information, in a first buffer; buffering a second part of the overlay information, to be overlayed over the additional video information, in a second buffer; decoding the main video information and the additional video information and generating a series of time-interleaved video frames, each outputted video frame being either a main video frame or an additional video frame; determining the type of a video frame to be outputted, being either a main video frame or an additional video frame; overlaying either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined frame type; and outputting the video frames and the overlayed information.
Description
- The invention relates to a method of decoding and outputting video information suitable for three-dimensional [3D] display, the video information comprising encoded main video information suitable for displaying on a 2D display and encoded additional video information for enabling three-dimensional [3D] display, 3D overlay information being overlayed onto the video information.
- The invention further relates to a device for decoding and outputting video information suitable for three-dimensional [3D] display, the video information comprising encoded main video information suitable for displaying on a 2D display and encoded additional video information for enabling three-dimensional [3D] display, the device adapted to overlay 3D overlay information onto the video information.
- The invention relates to the field of playback of 3D video information and 3D overlay information by a playback device, the information to be displayed on a 3D enabled display.
- Devices for rendering video data are well known, for example video players like DVD players, BD players or set top boxes for rendering digital video signals. The rendering device is commonly used as a source device to be coupled to a display device like a TV set. Image data is transferred from the source device via a suitable interface like HDMI.
- With respect to the coded video information stream, this may for example be in the format known as stereoscopic, where left and right (L+R) images are encoded. Alternatively, the coded video information stream may comprise a 2D picture and an additional picture (L+D), a so-called depth map, as described in Oliver Sheer-"3D Video Communication", Wiley, 2005, pages 29-34. The depth map conveys information about the depth of objects in the 2D image. The grey scale values in the depth map indicate the depth of the associated pixel in the 2D image. A stereo display can calculate the additional view required for stereo by using the depth value from the depth map and by calculating the required pixel transformation. The 2D video+depth map may be extended by adding occlusion and transparency information (DOT).
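As a rough illustration of the pixel transformation mentioned above, the toy fragment below shifts each pixel of one scanline by a disparity derived from its depth-map grey value. This is a hedged sketch under assumed names and a linear depth-to-disparity mapping; real renderers also fill the resulting holes and handle occlusion, which this sketch ignores:

```python
# Toy view synthesis from 2D+depth for a single scanline: nearer pixels
# (higher grey value, 0..255) are shifted further horizontally.

def synthesize_view(row, depth_row, max_disparity=4):
    """Return a shifted copy of one scanline; unfilled positions are None."""
    width = len(row)
    out = [None] * width  # None marks holes a real renderer would fill
    for x, (pixel, depth) in enumerate(zip(row, depth_row)):
        disparity = (depth * max_disparity) // 255  # linear mapping, assumed
        target = x + disparity
        if 0 <= target < width:
            out[target] = pixel
    return out
```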
- Currently in 3D systems, a known solution for the output video data to be transferred via the HDMI interface to the 3D display is time interleaving, wherein frames corresponding to Left or 2D information are interleaved with Right or DOT frames.
- It is known that, for 2D video systems, application formats for the distribution of video content and playback devices support overlay of real-time generated graphics on top of the video. Overlay graphics are for example internally generated by the player device for on screen display (OSD) menus, or received, such as subtitles or other graphics.
- However, extending the known overlay models to 3D systems creates the problem that the performance requirements of the drawing routines for the real-time generated overlay graphics are increased.
- It is an object of the invention to provide a method for decoding and outputting video information and overlay information which is suitable for 3D systems.
- For this purpose, according to a first aspect of the invention, in the method as described in the opening paragraph, the method further comprises receiving or generating three-dimensional [3D] overlay information to be overlayed over the video information; buffering a first part of the overlay information, to be overlayed over the main video information, in a first buffer; buffering a second part of the overlay information, to be overlayed over the additional video information, in a second buffer; decoding the main video information and the additional video information and generating a series of time-interleaved video frames, each outputted video frame being either a main video frame or an additional video frame; determining the type of a video frame to be outputted, being either a main video frame or an additional video frame; overlaying either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined frame type; and outputting the video frames and the overlayed information.
- For this purpose, according to a second aspect of the invention, the device described in the opening paragraph comprises input means for receiving three-dimensional [3D] overlay information to be overlayed over the video information, or generation means for generating such overlay information; a decoder for decoding the main video information and the additional video information, the decoder further adapted to generate a series of time-interleaved video frames, each outputted video frame being either a main video frame or an additional video frame; a graphics processing unit comprising a first buffer for buffering a first part of the overlay information, to be overlayed over the main video information, and a second buffer for buffering a second part of the overlay information, to be overlayed over the additional video information; the graphics processing unit further comprising a controller for determining the type of a video frame to be outputted, being either a main video frame or an additional video frame; a mixer for overlaying either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined frame type; and output means for outputting the video frames and the overlayed information.
- The invention is also based on the following recognition. 3D overlay graphics can no longer simply be composited with the 3D video output in systems outputting frames corresponding to Left or 2D information interleaved with Right or DOT frames, since the 3D video output switches between the two different video streams each frame. As an example, at time T the video output could contain the 2D frame, and at time T+1 the video output contains the accompanying depth information for the frame at time T. The graphics that need to be composited with the video at time T (the 2D graphics) greatly differ from the graphics that need to be composited with the video at time T+1 (the depth graphics or the R graphics). The graphics unit present in 2D video player devices is not fast enough to frame-accurately update its graphics plane with these different graphics every frame. The solution according to the invention is to implement two buffers in the graphics unit. Each buffer is assigned to one of the output video streams. For example, for 2D+depth drawing, one buffer could be assigned for the graphics overlay over the 2D frame and one buffer for the graphics overlay over the depth frame. For L+R, similarly, one buffer could be used for the graphics overlay over the L frame and one buffer for the overlay over the R frame. The advantage of this solution is that the slow graphics drawing is decoupled from the frame-accurate overlaying engine, so that the processing requirements are significantly reduced.
- Advantageously, the graphics control unit further comprises a controller adapted to copy parts of a first overlay frame in the first buffer, or parts of a second overlay frame in the second buffer, at frame frequency for generating an overlay frame. When the player device handles 2D+DOT depth streams, this enables fast generation of occlusion data, by copying the relevant areas from the buffered frames.
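The frame-frequency area copying could be sketched as follows; the rectangle-based representation, the function name and the row-of-lists plane layout are assumptions for illustration only:

```python
# Sketch of a frame-accurate area copier: at frame frequency, only the
# relevant rectangles of the buffered overlay frame for the current
# frame type are copied onto the active graphics plane, keeping the
# per-frame work proportional to the copied area, not the whole plane.

def copy_areas(plane, overlay_buffer, rects):
    """Copy each (x, y, w, h) rectangle from the buffer onto the plane."""
    for x, y, w, h in rects:
        for row in range(y, y + h):
            plane[row][x:x + w] = overlay_buffer[row][x:x + w]
    return plane
```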
- These and other aspects of the invention will be apparent from and elucidated further with reference to the embodiments described by way of example in the following description and with reference to the accompanying drawings, in which
-
FIG. 1 shows schematically a system for receiving and displaying 3D video information, in parts of which the invention may be practiced;
FIG. 2 shows schematically a graphics processing unit of a known 2D video player;
FIG. 3 shows schematically the composition of video planes in known Blu-ray (BD) systems;
FIG. 4 illustrates schematically a graphics processing unit according to the invention.
- In the Figures, elements which correspond to elements already described have the same reference numerals.
- A system 1 for playback of 3D video information wherein the invention may be practiced is shown in FIG. 1. The system comprises a player device 10 and a display device 11 communicating via an interface 12. The player device 10 comprises a front end unit 12 responsible for receiving and pre-processing the coded video information stream to be displayed, and a processing unit for decoding, processing and generating a video stream to be supplied to the output 14. The display device comprises a rendering unit for rendering 3D views from the received video stream. - With respect to the coded video information stream, this may for example be under the format known as stereoscopic, where left and right (L+R) images are encoded. Alternatively, the coded video information stream may comprise a 2D picture and an additional picture (L+D), a so-called depth map, as described in Oliver Schreer, “3D Videocommunication”, Wiley, 2005, pages 29-34. The depth map conveys information about the depth of objects in the 2D image. The grey scale values in the depth map indicate the depth of the associated pixel in the 2D image. A stereo display can calculate the additional view required for stereo by using the depth value from the depth map and by calculating the required pixel transformation. The 2D video+depth map may be extended by adding occlusion and transparency information (DOT). In a preferred embodiment, a flexible data format comprising stereo information and depth map, adding occlusion and transparency, as described in EP 08305420.5 (Attorney docket PH010082), incorporated herein by reference, is used.
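The view calculation described above (depth values driving a horizontal pixel shift, i.e. a disparity) can be sketched for a single scan line. This is a simplified illustration under assumed scaling and hole-filling rules; the function name, the linear depth-to-disparity mapping and the nearest-neighbour fill are not from the patent.

```python
def render_right_view(row, depth_row, max_disparity=3, depth_max=255):
    """Compute a second view of one scan line from 2D pixels + depth values.

    Each pixel is shifted left proportionally to its grey-scale depth value;
    holes uncovered by the shift (occlusions) are filled from the nearest
    already-written neighbour.
    """
    width = len(row)
    right = [None] * width
    for x in range(width):
        d = depth_row[x] * max_disparity // depth_max  # depth -> disparity
        if 0 <= x - d < width:
            right[x - d] = row[x]
    # Fill occlusion holes with the nearest known pixel to the left.
    for x in range(width):
        if right[x] is None:
            right[x] = right[x - 1] if x > 0 and right[x - 1] is not None else 0
    return right
```

The DOT extension mentioned in the text exists precisely because such hole-filling is a guess: explicit occlusion and transparency data lets the display fill uncovered areas with real content instead.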
- With respect to the display device 11, this can be either a display device that makes use of controllable glasses to control the images displayed to the left and right eye respectively, or, in a preferred embodiment, a so-called autostereoscopic display. A number of auto-stereoscopic devices that are able to switch between 2D and 3D displays are known, one of them being described in U.S. Pat. No. 6,069,650. The display device comprises an LCD display comprising an actively switchable Liquid Crystal lenticular lens. In auto-stereoscopic displays, processing inside a rendering unit 16 converts the decoded video information received via the interface 12 from the player device 10 to multiple views and maps these onto the sub-pixels of the display panel 17. It is noted that the rendering unit 16 may alternatively reside inside the player device 10, in which case the multiple views are sent via the interface. - With respect to the
player device 10, this may be adapted to read the video stream from an optical disc, another storage media such as flash, or receive the video information via wired or wireless network, such as an internet connection. A known example of a Blu-Ray™ player is the PlayStation™ 3, as sold by Sony Corporation. - In case of BD systems, further details can be found in the publicly available technical white papers “Blu-ray Disc Format General August 2004” and “Blu-ray Disc 1.C Physical Format Specifications for BD-ROM November, 2005”, published by the Blu-Ray Disc association (http://www.bluraydisc.com).
- In the following, when referring to the BD application format, we refer specifically to the application formats as disclosed in US application No. 2006-0110111 (Attorney docket NL021359) and in the white paper “Blu-ray Disc Format 2.B Audio Visual Application Format Specifications for BD-ROM, March 2005” as published by the Blu-ray Disc Association.
- It is known that BD systems also provide a fully programmable application environment with network connectivity, thereby enabling the Content Provider to create interactive content. This mode is based on the Java™ platform and is known as “BD-J”. BD-J defines a subset of the Digital Video Broadcasting (DVB) Multimedia Home Platform (MHP) Specification 1.0, publicly available as ETSI TS 101 812.
-
FIG. 2 illustrates a graphics processing unit (part of the processing unit 13) of a known 2D video player, namely a Blu-ray player. The graphics processing unit is equipped with two read buffers (1304 and 1305), two preloading buffers (1302 and 1303) and two switches (1306 and 1307). The second read buffer (1305) enables the supply of an Out-of-Mux audio stream to the decoder even while the main MPEG stream is being decoded. The preloading buffers cache Text subtitles, Interactive Graphics and sound effects (which are presented at Button selection or activation). The preloading buffer 1303 stores data before movie playback begins and supplies data for presentation even while the main MPEG stream is being decoded. - This
switch 1301, between the data input and the buffers, selects the appropriate buffer to receive packet data from any one of the read buffers or preloading buffers. Before starting the main movie presentation, effect sound data (if it exists), text subtitle data (if it exists) and Interactive Graphics (if preloaded Interactive Graphics exist) are preloaded and sent to the respective buffers through the switch. The main MPEG stream is sent to the primary read buffer (1304) and the Out-of-Mux stream is sent to the secondary read buffer (1305) by the switch 1301. -
FIG. 3 shows schematically the composition of video planes in known Blu-ray (BD) systems. - As shown, two independent full graphics planes (32, 33), which are composited onto the video plane (31), are present. One graphics plane (32) is assigned to subtitling applications (Presentation Graphics or Text Subtitles) and the other plane (33) is assigned to interactive applications (HDMV or BD-J mode interactivity graphics).
- Returning to FIG. 2, the main video plane (1310), the presentation plane (1309) and the graphics plane (1308) are supplied by the corresponding decoders, and the three planes are overlayed by an overlayer 1311 and outputted. -
FIG. 4 illustrates schematically a graphics processing unit (13) according to the invention. This specific example constitutes an improvement of the known graphics processing unit in BD systems, but the concepts described herein are directly applicable to all graphics processing units in video players, as the decoder models for various types of video players are similar. - For clarity, the overlaying of one graphics plane over the main video plane will be discussed, but the concept is directly applicable to overlaying more than one graphics plane.
- For 3D video, extra information is needed besides the 2D video that is stored and sent to the display in normal Blu-ray movies. For stereoscopic 3D, it is necessary to send both the left view and the right view to the stereoscopic display. The display then uses a certain technique to make sure only the left eye of the viewer sees the left picture and only the right eye sees the right picture. Common techniques to achieve this are shutter glasses or polarized glasses.
- Autostereoscopic displays require a different interface format: the 2D+depth video format. Besides the 2D video, an additional video stream is used to send depth information. The display combines the video streams in the rendering stage and calculates the resulting 3D picture.
- For both 3D techniques it is necessary to send the two video streams to the display in a certain interface format, which depends on the display type. A possible interface format is sending the frames from both videos time-interleaved to the display. This means that at time T a frame from the first video stream (left or 2D) is sent, and at time T+1 a frame from the second video stream (right or depth) is sent.
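The time-interleaved interface format described above can be sketched in a few lines. This is an illustrative sketch only; the function name and the string frame labels are assumptions for the example.

```python
def interleave(first_stream, second_stream):
    """Alternate frames on one output: first (L or 2D), then second (R or depth)."""
    out = []
    for a, b in zip(first_stream, second_stream):
        out.append(a)  # time T: frame from the first video stream
        out.append(b)  # time T+1: frame from the second video stream
    return out


frames = interleave(["L0", "L1"], ["R0", "R1"])
# frames is ["L0", "R0", "L1", "R1"]
```

The output thus switches streams every frame, which is exactly why a single graphics plane composited onto the output would have to change content at frame rate.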
- Application formats, like the Blu-ray format mentioned above, support overlay graphics on top of the video. Overlay graphics are for example used to display subtitles or to create a selection menu. Blu-ray overlay graphics are read from disc (presentation graphics and interactive graphics) or generated in real time (BD-J graphics, OSD displays and text-based subtitles).
- Outputting the video in a time-sequential interface format greatly affects the performance requirements of the drawing routines for the real-time generated overlay graphics, in particular those of BD-J graphics. This is because the graphics plane can no longer simply be composited with the video output, since the video output switches between the two different video streams each frame. As an example, at time T the video plane could contain the 2D view, and at time T+1 the video plane contains the accompanying depth information for the frame at time T. The BD-J graphics that need to be composited with the video at time T (the 2D graphics) greatly differ from the BD-J graphics that need to be composited with the video at time T+1 (the depth graphics).
- A graphics processing unit, in particular the BD-J drawing routines, is not fast enough to frame-accurately update its graphics plane with these different graphics every frame. The solution according to the invention is to implement two buffers in the graphics unit, each buffer assigned to one of the output video streams. For example, for 2D+depth drawing, one buffer could be assigned for the graphics overlay over the 2D frame and one buffer for the graphics overlay over the depth frame. For L+R, similarly, one buffer could be used for the graphics overlay over the L frame and one buffer for the overlay over the R frame. The advantage of this solution is that the slow graphics are decoupled from the frame-accurate overlaying engine, so that the processing requirements are significantly reduced.
- In FIG. 4, a Java application 41 running on a Java Virtual Machine generates overlay information and sends it to the graphics processing unit (via an API). It is noted that the source of the overlay information is not important; such overlay information for a graphics plane could also be other graphics from disc or OSD (On Screen Display) information. The graphics processing unit comprises two buffers. - It is to be noted that the invention may be implemented in hardware and/or software, using programmable components. A method for implementing the invention has the processing steps corresponding to the rendering system elucidated with reference to
FIG. 1. Although the invention has been mainly explained by embodiments using optical record carriers or the internet, the invention is also suitable for any image processing environment, like authoring software or broadcasting equipment. Further applications include a 3D personal computer [PC] user interface or 3D media center PC, a 3D mobile player and a 3D mobile phone. - It is noted that in this document the word ‘comprising’ does not exclude the presence of other elements or steps than those listed, and the word ‘a’ or ‘an’ preceding an element does not exclude the presence of a plurality of such elements; that any reference signs do not limit the scope of the claims; that the invention may be implemented by means of both hardware and software; and that several ‘means’ or ‘units’ may be represented by the same item of hardware or software, and a processor may fulfil the function of one or more units, possibly in cooperation with hardware elements. Further, the invention is not limited to the embodiments, and lies in each and every novel feature or combination of features described above.
Claims (15)
1. A method of decoding and outputting video information suitable for three-dimensional [3D] display, the video information comprising encoded main video information suitable for displaying on a 2D display and encoded additional video information for enabling three-dimensional [3D] display,
the method comprising:
receiving or generating three-dimensional [3D] overlay information to be overlayed over the video information;
buffering a first part of the overlay information to be overlayed over the main video information in a first buffer;
buffering a second part of the overlay information to be overlayed over the additional video information in a second buffer;
decoding the main video information and the additional video information and generating a series of time-interleaved video frames, each outputted video frame being either a main video frame or an additional video frame;
determining a type of a video frame to be outputted, being either a main video frame or an additional video frame;
overlaying either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined type of frame; and
outputting the video frames and the overlayed information.
2. A method according to claim 1 wherein the main video information is a left video frame and the additional video information is a right video frame.
3. A method according to claim 2 wherein the overlay information is real time graphics.
4. A method according to claim 3, wherein the real time graphics is generated by a Java application running on a Java Virtual Machine.
5. A method according to claim 3, wherein timing information is used to control the overlaying of either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined type of frame.
6. A method according to claim 1 wherein the additional video information comprises depth information with respect to the video information.
7. A method according to claim 1 wherein the additional video information further comprises depth and occlusion information.
8. A device for decoding and outputting video information suitable for three-dimensional [3D] display, the video information comprising encoded main video information suitable for displaying on a 2D display and encoded additional video information for enabling three-dimensional [3D] display,
the device comprising
input means for receiving three-dimensional [3D] overlay information to be overlayed over the video information, or generation means for generating three-dimensional [3D] overlay information to be overlayed over the video information;
a decoder for decoding the main video information and the additional video information, the decoder further adapted to generate a series of time-interleaved video frames, each outputted video frame being either a main video frame or an additional video frame;
means for receiving or generating three-dimensional [3D] overlay information to be overlayed over the video information;
a graphics processing unit comprising a first buffer for buffering a first part of the overlay information to be overlayed over the main video information and a second buffer for buffering a second part of overlay information to be overlayed over the additional video information;
the graphics processing unit further comprising a controller for determining a type of a video frame to be outputted, being either a main video frame or an additional video frame;
a mixer for overlaying either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined type of frame; and
output means for outputting the video frames and the overlayed information.
9. A device according to claim 8 wherein the main video information is a left video frame and the additional video information is a right video frame.
10. A device according to claim 9 wherein the overlay information is real time graphics.
11. A device according to claim 10, wherein the real time graphics is generated by a Java application running on a Java Virtual Machine.
12. A device according to claim 11, wherein timing information is used to control the overlaying of either the first or the second part of the overlay information on a video frame to be outputted in agreement with the determined type of frame.
13. A device according to claim 8 wherein the additional video information comprises depth information with respect to the video information.
14. A device according to claim 9 wherein the additional video information further comprises depth and occlusion information.
15. A device according to claim 8 wherein the controller is adapted to copy parts of a first overlay frame in the first buffer or parts of a second overlay frame in the second buffer at frame frequency for generating an overlay frame.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP08172411 | 2008-12-19 | ||
EP08172411.4 | 2008-12-19 | ||
PCT/IB2009/055726 WO2010070567A1 (en) | 2008-12-19 | 2009-12-14 | Method and device for overlaying 3d graphics over 3d video |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2009/055726 A-371-Of-International WO2010070567A1 (en) | 2008-12-19 | 2009-12-14 | Method and device for overlaying 3d graphics over 3d video |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/236,553 Continuation US9918069B2 (en) | 2008-12-19 | 2016-08-15 | Method and device for overlaying 3D graphics over 3D video |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110249757A1 true US20110249757A1 (en) | 2011-10-13 |
Family
ID=41718987
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/139,925 Abandoned US20110249757A1 (en) | 2008-12-19 | 2009-12-14 | Method and device for overlaying 3d graphics over 3d video |
US15/236,553 Active US9918069B2 (en) | 2008-12-19 | 2016-08-15 | Method and device for overlaying 3D graphics over 3D video |
US15/895,421 Active US10158841B2 (en) | 2008-12-19 | 2018-02-13 | Method and device for overlaying 3D graphics over 3D video |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/236,553 Active US9918069B2 (en) | 2008-12-19 | 2016-08-15 | Method and device for overlaying 3D graphics over 3D video |
US15/895,421 Active US10158841B2 (en) | 2008-12-19 | 2018-02-13 | Method and device for overlaying 3D graphics over 3D video |
Country Status (14)
Country | Link |
---|---|
US (3) | US20110249757A1 (en) |
EP (1) | EP2380357B2 (en) |
JP (4) | JP2012513056A (en) |
KR (1) | KR20110106371A (en) |
CN (1) | CN102257825B (en) |
AU (1) | AU2009329113B2 (en) |
BR (1) | BRPI0917764B1 (en) |
CA (1) | CA2747106C (en) |
ES (1) | ES2640869T3 (en) |
MX (1) | MX2011006496A (en) |
MY (1) | MY158823A (en) |
RU (1) | RU2537800C2 (en) |
TW (1) | TWI520566B (en) |
WO (1) | WO2010070567A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120076197A1 (en) * | 2010-09-23 | 2012-03-29 | Vmware, Inc. | System and Method for Transmitting Video and User Interface Elements |
US20120082309A1 (en) * | 2010-10-03 | 2012-04-05 | Shang-Chieh Wen | Method and apparatus of processing three-dimensional video content |
US20120188335A1 (en) * | 2011-01-26 | 2012-07-26 | Samsung Electronics Co., Ltd. | Apparatus and method for processing 3d video |
US20130307942A1 (en) * | 2011-01-19 | 2013-11-21 | S.I.Sv.El.Societa Italiana Per Lo Sviluppo Dell'elettronica S.P.A. | Video Stream Composed of Combined Video Frames and Methods and Systems for its Generation, Transmission, Reception and Reproduction |
WO2014108741A1 (en) * | 2013-01-09 | 2014-07-17 | Freescale Semiconductor, Inc. | A method and apparatus for adaptive graphics compression and display buffer switching |
WO2015142649A1 (en) * | 2014-03-19 | 2015-09-24 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US9786097B2 (en) | 2012-06-22 | 2017-10-10 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US9918069B2 (en) | 2008-12-19 | 2018-03-13 | Koninklijke Philips N.V. | Method and device for overlaying 3D graphics over 3D video |
US20180288353A1 (en) * | 2015-06-03 | 2018-10-04 | Intel Corporation | Low power video composition using a stream out buffer |
US10127722B2 (en) | 2015-06-30 | 2018-11-13 | Matterport, Inc. | Mobile capture visualization incorporating three-dimensional and two-dimensional imagery |
US10139985B2 (en) | 2012-06-22 | 2018-11-27 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US10447990B2 (en) | 2012-02-28 | 2019-10-15 | Qualcomm Incorporated | Network abstraction layer (NAL) unit header design for three-dimensional video coding |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW201119353A (en) | 2009-06-24 | 2011-06-01 | Dolby Lab Licensing Corp | Perceptual depth placement for 3D objects |
US9215435B2 (en) | 2009-06-24 | 2015-12-15 | Dolby Laboratories Licensing Corp. | Method for embedding subtitles and/or graphic overlays in a 3D or multi-view video data |
US9426441B2 (en) | 2010-03-08 | 2016-08-23 | Dolby Laboratories Licensing Corporation | Methods for carrying and transmitting 3D z-norm attributes in digital TV closed captioning |
WO2012145191A1 (en) | 2011-04-15 | 2012-10-26 | Dolby Laboratories Licensing Corporation | Systems and methods for rendering 3d images independent of display size and viewing distance |
TWI543116B (en) * | 2011-04-26 | 2016-07-21 | 國立成功大學 | Method for merging the regions in the image/video |
EP2936805A4 (en) | 2012-12-24 | 2016-07-20 | Thomson Licensing | APPARATUS AND METHOD FOR DISPLAYING STEREOSCOPIC IMAGES |
CN109996034B (en) | 2013-05-31 | 2021-06-08 | 佳能株式会社 | Client apparatus, control method thereof, and recording medium |
TWI512678B (en) * | 2013-10-02 | 2015-12-11 | Univ Nat Cheng Kung | Non-transitory storage medium |
RU2597462C1 (en) * | 2015-07-17 | 2016-09-10 | Виталий Витальевич Аверьянов | Method of displaying object on spatial model |
US11461953B2 (en) | 2019-12-27 | 2022-10-04 | Wipro Limited | Method and device for rendering object detection graphics on image frames |
CN117201834B (en) * | 2023-09-11 | 2024-06-21 | 南京天创电子技术有限公司 | Real-time double-spectrum fusion video stream display method and system based on target detection |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100118119A1 (en) * | 2006-10-11 | 2010-05-13 | Koninklijke Philips Electronics N.V. | Creating three dimensional graphics data |
US20120170917A1 (en) * | 2008-07-24 | 2012-07-05 | Panasonic Corporation | Play back apparatus, playback method and program for playing back 3d video |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06335029A (en) * | 1993-05-25 | 1994-12-02 | Sony Corp | Stereoscopic video display device and method for adjusting congestion angle of stereoscopic vision |
DE19545356C2 (en) * | 1995-12-05 | 1998-04-16 | Deutsche Telekom Ag | Device for displaying stereo video images |
JP3231618B2 (en) * | 1996-04-23 | 2001-11-26 | 日本電気株式会社 | 3D image encoding / decoding system |
US6144701A (en) | 1996-10-11 | 2000-11-07 | Sarnoff Corporation | Stereoscopic video coding and decoding apparatus and method |
GB9623682D0 (en) | 1996-11-14 | 1997-01-08 | Philips Electronics Nv | Autostereoscopic display apparatus |
EP2259585B1 (en) | 1996-12-04 | 2013-10-16 | Panasonic Corporation | Optical disk for high resolution and three dimensional video recording, optical disk reproduction apparatus, and optical disk recording apparatus |
RU2157056C2 (en) * | 1998-02-03 | 2000-09-27 | Логутко Альберт Леонидович | Method for three-dimensional tv recording |
US6549201B1 (en) * | 1999-11-23 | 2003-04-15 | Center For Advanced Science And Technology Incubation, Ltd. | Method for constructing a 3D polygonal surface from a 2D silhouette by using computer, apparatus thereof and storage medium |
JP2001186515A (en) * | 1999-12-22 | 2001-07-06 | Sanyo Electric Co Ltd | Decoder for channel signals |
US20020009137A1 (en) * | 2000-02-01 | 2002-01-24 | Nelson John E. | Three-dimensional video broadcasting system |
US20030103062A1 (en) | 2001-11-30 | 2003-06-05 | Ruen-Rone Lee | Apparatus and method for controlling a stereo 3D display using overlay mechanism |
KR100454194B1 (en) | 2001-12-28 | 2004-10-26 | 한국전자통신연구원 | Stereoscopic Video Encoder and Decoder Supporting Multi-Display Mode and Method Thereof |
KR100488804B1 (en) * | 2002-10-07 | 2005-05-12 | 한국전자통신연구원 | System for data processing of 2-view 3dimention moving picture being based on MPEG-4 and method thereof |
WO2004053875A2 (en) | 2002-12-10 | 2004-06-24 | Koninklijke Philips Electronics N.V. | Editing of real time information on a record carrier |
CN101702750B (en) * | 2003-04-28 | 2013-03-06 | 松下电器产业株式会社 | Recording medium, reproduction apparatus, recording method, reproducing method, program and integrated circuit |
WO2004107765A1 (en) | 2003-05-28 | 2004-12-09 | Sanyo Electric Co., Ltd. | 3-dimensional video display device, text data processing device, program, and storage medium |
JP2004357156A (en) * | 2003-05-30 | 2004-12-16 | Sharp Corp | Video reception apparatus and video playback apparatus |
JP2005039398A (en) * | 2003-07-17 | 2005-02-10 | Funai Electric Co Ltd | Television receiver |
US7529467B2 (en) | 2004-02-28 | 2009-05-05 | Samsung Electronics Co., Ltd. | Storage medium recording text-based subtitle stream, reproducing apparatus and reproducing method for reproducing text-based subtitle stream recorded on the storage medium |
JP2008517343A (en) * | 2004-10-19 | 2008-05-22 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Animation judder compensation |
US7773864B2 (en) * | 2005-08-29 | 2010-08-10 | Sony Corporation | Slicing interactive graphic data in disc authoring |
JP4717728B2 (en) * | 2005-08-29 | 2011-07-06 | キヤノン株式会社 | Stereo display device and control method thereof |
CN101416520B (en) * | 2006-03-31 | 2011-12-14 | 皇家飞利浦电子股份有限公司 | Efficient encoding of multiple views |
KR20080047673A (en) * | 2006-11-27 | 2008-05-30 | (주)플렛디스 | 3D image conversion device and method |
CN101653011A (en) * | 2007-03-16 | 2010-02-17 | 汤姆森许可贸易公司 | System and method for combining text with three-dimensional content |
CN101415126A (en) * | 2007-10-18 | 2009-04-22 | 深圳Tcl新技术有限公司 | Method for generating three-dimensional image effect and digital video apparatus |
KR20100002032A (en) * | 2008-06-24 | 2010-01-06 | 삼성전자주식회사 | Image generating method, image processing method, and apparatus thereof |
BRPI0904965A2 (en) † | 2008-09-30 | 2015-06-30 | Panasonic Corp | Recording medium for recording 3D video, playback device for 3D video playback and lsi system |
WO2010070567A1 (en) | 2008-12-19 | 2010-06-24 | Koninklijke Philips Electronics N.V. | Method and device for overlaying 3d graphics over 3d video |
JP2011086515A (en) | 2009-10-16 | 2011-04-28 | Konica Minolta Opto Inc | Led lighting device and reflector |
KR20130098844A (en) * | 2010-09-03 | 2013-09-05 | 파나소닉 주식회사 | Video processing device, video processing method, computer program and delivery method |
-
2009
- 2009-12-14 WO PCT/IB2009/055726 patent/WO2010070567A1/en active Application Filing
- 2009-12-14 AU AU2009329113A patent/AU2009329113B2/en active Active
- 2009-12-14 CN CN200980151022.5A patent/CN102257825B/en active Active
- 2009-12-14 JP JP2011541682A patent/JP2012513056A/en active Pending
- 2009-12-14 US US13/139,925 patent/US20110249757A1/en not_active Abandoned
- 2009-12-14 MX MX2011006496A patent/MX2011006496A/en active IP Right Grant
- 2009-12-14 KR KR1020117016477A patent/KR20110106371A/en not_active Application Discontinuation
- 2009-12-14 BR BRPI0917764-7A patent/BRPI0917764B1/en active IP Right Grant
- 2009-12-14 EP EP09796088.4A patent/EP2380357B2/en active Active
- 2009-12-14 CA CA2747106A patent/CA2747106C/en active Active
- 2009-12-14 RU RU2011129788/08A patent/RU2537800C2/en active
- 2009-12-14 MY MYPI2011002810A patent/MY158823A/en unknown
- 2009-12-14 ES ES09796088.4T patent/ES2640869T3/en active Active
- 2009-12-17 TW TW098143436A patent/TWI520566B/en active
-
2014
- 2014-12-15 JP JP2014252631A patent/JP2015111833A/en not_active Withdrawn
-
2016
- 2016-08-03 JP JP2016152588A patent/JP6846130B2/en active Active
- 2016-08-15 US US15/236,553 patent/US9918069B2/en active Active
-
2018
- 2018-02-13 US US15/895,421 patent/US10158841B2/en active Active
-
2019
- 2019-11-29 JP JP2019216414A patent/JP2020099045A/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100118119A1 (en) * | 2006-10-11 | 2010-05-13 | Koninklijke Philips Electronics N.V. | Creating three dimensional graphics data |
US20120170917A1 (en) * | 2008-07-24 | 2012-07-05 | Panasonic Corporation | Play back apparatus, playback method and program for playing back 3d video |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10158841B2 (en) | 2008-12-19 | 2018-12-18 | Koninklijke Philips N.V. | Method and device for overlaying 3D graphics over 3D video |
US9918069B2 (en) | 2008-12-19 | 2018-03-13 | Koninklijke Philips N.V. | Method and device for overlaying 3D graphics over 3D video |
US20120076197A1 (en) * | 2010-09-23 | 2012-03-29 | Vmware, Inc. | System and Method for Transmitting Video and User Interface Elements |
US8724696B2 (en) * | 2010-09-23 | 2014-05-13 | Vmware, Inc. | System and method for transmitting video and user interface elements |
US20120082309A1 (en) * | 2010-10-03 | 2012-04-05 | Shang-Chieh Wen | Method and apparatus of processing three-dimensional video content |
US8693687B2 (en) * | 2010-10-03 | 2014-04-08 | Himax Media Solutions, Inc. | Method and apparatus of processing three-dimensional video content |
US9843760B2 (en) * | 2011-01-19 | 2017-12-12 | S.I.Sv.El Societa Italiana Per Lo Sviluppo Dell'elettronica S.P.A. | Video stream composed of combined video frames and methods and systems for its generation, transmission, reception and reproduction |
US20130307942A1 (en) * | 2011-01-19 | 2013-11-21 | S.I.Sv.El.Societa Italiana Per Lo Sviluppo Dell'elettronica S.P.A. | Video Stream Composed of Combined Video Frames and Methods and Systems for its Generation, Transmission, Reception and Reproduction |
US9723291B2 (en) * | 2011-01-26 | 2017-08-01 | Samsung Electronics Co., Ltd | Apparatus and method for generating 3D video data |
US20120188335A1 (en) * | 2011-01-26 | 2012-07-26 | Samsung Electronics Co., Ltd. | Apparatus and method for processing 3d video |
US10447990B2 (en) | 2012-02-28 | 2019-10-15 | Qualcomm Incorporated | Network abstraction layer (NAL) unit header design for three-dimensional video coding |
US10775959B2 (en) | 2012-06-22 | 2020-09-15 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US12086376B2 (en) | 2012-06-22 | 2024-09-10 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US11551410B2 (en) | 2012-06-22 | 2023-01-10 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US10139985B2 (en) | 2012-06-22 | 2018-11-27 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US11422671B2 (en) | 2012-06-22 | 2022-08-23 | Matterport, Inc. | Defining, displaying and interacting with tags in a three-dimensional model |
US10304240B2 (en) | 2012-06-22 | 2019-05-28 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US9786097B2 (en) | 2012-06-22 | 2017-10-10 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US11062509B2 (en) | 2012-06-22 | 2021-07-13 | Matterport, Inc. | Multi-modal method for interacting with 3D models |
US10102828B2 (en) | 2013-01-09 | 2018-10-16 | Nxp Usa, Inc. | Method and apparatus for adaptive graphics compression and display buffer switching |
WO2014108741A1 (en) * | 2013-01-09 | 2014-07-17 | Freescale Semiconductor, Inc. | A method and apparatus for adaptive graphics compression and display buffer switching |
WO2015142649A1 (en) * | 2014-03-19 | 2015-09-24 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US10909758B2 (en) | 2014-03-19 | 2021-02-02 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US10163261B2 (en) | 2014-03-19 | 2018-12-25 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US11600046B2 (en) | 2014-03-19 | 2023-03-07 | Matterport, Inc. | Selecting two-dimensional imagery data for display within a three-dimensional model |
US10484640B2 (en) * | 2015-06-03 | 2019-11-19 | Intel Corporation | Low power video composition using a stream out buffer |
US20180288353A1 (en) * | 2015-06-03 | 2018-10-04 | Intel Corporation | Low power video composition using a stream out buffer |
US10127722B2 (en) | 2015-06-30 | 2018-11-13 | Matterport, Inc. | Mobile capture visualization incorporating three-dimensional and two-dimensional imagery |
Also Published As
Publication number | Publication date |
---|---|
RU2011129788A (en) | 2013-01-27 |
MY158823A (en) | 2016-11-15 |
CN102257825B (en) | 2016-11-16 |
US20180176537A1 (en) | 2018-06-21 |
KR20110106371A (en) | 2011-09-28 |
US20160353081A1 (en) | 2016-12-01 |
EP2380357B1 (en) | 2017-07-26 |
CA2747106C (en) | 2017-07-04 |
JP2015111833A (en) | 2015-06-18 |
CN102257825A (en) | 2011-11-23 |
EP2380357B2 (en) | 2020-03-18 |
US10158841B2 (en) | 2018-12-18 |
TW201043000A (en) | 2010-12-01 |
RU2537800C2 (en) | 2015-01-10 |
US9918069B2 (en) | 2018-03-13 |
TWI520566B (en) | 2016-02-01 |
MX2011006496A (en) | 2011-07-13 |
EP2380357A1 (en) | 2011-10-26 |
AU2009329113A1 (en) | 2011-08-11 |
JP6846130B2 (en) | 2021-03-24 |
JP2017022714A (en) | 2017-01-26 |
CA2747106A1 (en) | 2010-06-24 |
ES2640869T3 (en) | 2017-11-07 |
BRPI0917764A2 (en) | 2016-02-23 |
JP2012513056A (en) | 2012-06-07 |
JP2020099045A (en) | 2020-06-25 |
BRPI0917764B1 (en) | 2021-03-16 |
AU2009329113B2 (en) | 2015-01-22 |
WO2010070567A1 (en) | 2010-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10158841B2 (en) | | Method and device for overlaying 3D graphics over 3D video |
US11277600B2 (en) | | Switching between 3D video and 2D video |
US20110293240A1 (en) | | Method and system for transmitting over a video interface and for compositing 3d video and 3d overlays |
AU2011202552B2 (en) | | 3D display handling of subtitles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: KONINKLIJKE PHILIPS ELECTRONICS N.V., NETHERLANDS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NEWTON, PHILIP STEVEN;KURVERS, MARKUS JOZEF MARIA;BOLIO, DENNIS DANIEL ROBERT JOZEF;SIGNING DATES FROM 20091215 TO 20100111;REEL/FRAME:026477/0706 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |