WO1995019093A1 - Viewing of imaged objects from selected points of view - Google Patents
Viewing of imaged objects from selected points of view
- Publication number
- WO1995019093A1 WO1995019093A1 PCT/US1995/000303 US9500303W WO9519093A1 WO 1995019093 A1 WO1995019093 A1 WO 1995019093A1 US 9500303 W US9500303 W US 9500303W WO 9519093 A1 WO9519093 A1 WO 9519093A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- view
- taken
- image
- imaging devices
- Prior art date
Links
- 238000003384 imaging method Methods 0.000 claims description 88
- 238000000034 method Methods 0.000 claims description 25
- 230000002452 interceptive effect Effects 0.000 claims description 10
- 230000001360 synchronised effect Effects 0.000 claims description 9
- 238000001454 recorded image Methods 0.000 claims description 4
- 241000251468 Actinopterygii Species 0.000 claims description 3
- 238000003860 storage Methods 0.000 description 10
- 238000010586 diagram Methods 0.000 description 6
- 230000033001 locomotion Effects 0.000 description 6
- 238000009434 installation Methods 0.000 description 5
- 238000004519 manufacturing process Methods 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 3
- 230000005540 biological transmission Effects 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 238000010276 construction Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000001934 delay Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000000873 masking effect Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000000737 periodic effect Effects 0.000 description 1
- 230000010287 polarization Effects 0.000 description 1
- 230000008707 rearrangement Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/189—Recording image signals; Reproducing recorded image signals
Definitions
- the present invention relates to the art of recording and displaying images of objects and, in particular, to the manipulation of such images for presentation from selected points of view. Description of Related Art
- the viewer of objects has traditionally been restricted to viewing sequences of images of those objects from a limited number of points of view established at the time of imaging.
- objects for example, articles, events, scenes or locations
- the viewer is commonly limited to only a single point of view as selected by another person (e.g., a director) when that object was imaged.
- another person e.g., a director
- the viewer is often frustrated by the fact that the point of view desired by the viewer was not selected and/or that a plurality of different views of the imaged object are not available for consideration.
- the viewer may continue to be frustrated by the fact that a certain other point of view with respect to the object was not selected and made available for viewing.
- a plurality of imaging devices are positioned at spaced apart locations for recording sequences of images of an object from a plurality of different points of view.
- a laser tracking (direction and distance) device determines the relative locations of the imaging devices, and the relative location determination is synchronized with the recorded sequences of images.
- a processor presents the recorded images as taken by the imaging devices to the user as is, if desired, to display the imaged object from one of the taken points of view.
- the processor manipulates the taken sequences of images, in accordance with viewer input provided through an interface in conjunction with the determined relative locations of the imaging devices, to construct a sequence of images or a freeze frame image of the object for display as if taken from a viewer selected point of view.
- the processor further processes the images under user control to facilitate zoom in and zoom out on the imaged object, and removal of unwanted portions of the imaged object for display.
- FIGURE 1 is a block diagram of the image processing system of the present invention.
- FIGURE 2 illustrates the exemplary installation of a plurality of imaging devices around and about an object to be imaged in accordance with the teachings of the present invention
- FIGURE 3 shows a moveable structure for mounting and supporting the plurality of imaging devices around and about an object to be imaged
- FIGURE 4 is a flow diagram of a first embodiment of the method of the present invention.
- FIGURE 5 is a flow diagram of a second embodiment of the method of the present invention.
- the system includes a plurality of imaging devices 12 (such as digitizing optical cameras, high definition film cameras and/or three-dimensional imaging cameras) connected to a storage device 14.
- the sequences of images recorded by the imaging devices 12 are separately stored by the storage device 14 and synchronized in accordance with a time recording for future retrieval and processing in a manner to be described.
- the sequences of images are stored on film, optical or magnetic tape, optical or magnetic disk, or other suitable storage media as desired or necessitated. It will, of course, be understood that the storage device 14 used to store the sequences of images need not also be used in the retrieval of the images.
- the functions of storing the image data on and subsequently retrieving the image data from the media may occur at different locations.
- the sequences of images may be simultaneously, separately and contemporaneously transmitted (also synchronized in accordance with a time recording) over lines 15 from the imaging devices 12 using digitization, polarization, different bandwidth or sequential or simultaneous transmission techniques for further processing in a manner to be described. Transmission of the images over lines 15 is especially useful in "live" processing and display of images, as will be further described herein.
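The patent itself contains no code, but the synchronization idea above — separate sequences per imaging device 12 tied together by a shared time recording — can be illustrated with a minimal sketch. Everything here (the class and method names, the in-memory dictionary) is hypothetical and merely stands in for the storage device 14 or for live transmission over lines 15.

```python
import numpy as np

class SynchronizedStore:
    """Minimal sketch of a store holding one frame sequence per camera,
    keyed by a shared time code so that frames from all viewpoints can be
    retrieved together (hypothetical API, not the patent's storage device 14)."""

    def __init__(self):
        # camera_id -> {time_code: frame (H x W x 3 array)}
        self._frames = {}

    def record(self, camera_id, time_code, frame):
        self._frames.setdefault(camera_id, {})[time_code] = frame

    def frames_at(self, time_code):
        """Return the frames taken at the same instant, one per camera."""
        return {cam: seq[time_code]
                for cam, seq in self._frames.items()
                if time_code in seq}

# Usage: two cameras recording the same instants.
store = SynchronizedStore()
for t in range(3):
    store.record("cam_A", t, np.zeros((480, 640, 3), dtype=np.uint8))
    store.record("cam_B", t, np.full((480, 640, 3), 255, dtype=np.uint8))
print(sorted(store.frames_at(1)))   # ['cam_A', 'cam_B']
```

The only property that matters for the later viewpoint construction is that frames recorded at the same instant can be pulled back as a group.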
- Referring now to FIGURE 2, there is shown an illustration of an exemplary installation of the plurality of imaging devices 12 at a plurality of selected spaced apart locations 16 around and about an object 18.
- the plurality of imaging devices 12 are preferably positioned at the locations 16 to simultaneously record a sequence of images of the object 18 from a plurality of different points of view
- object 18 is used generally to describe not only a single article, person or thing, but also a collection of articles, persons or things forming a scene, and thus further includes a particular geographic location or an event or activity occurring at that location.
- Imaged objects 18 thus include, without limitation, scenes for motion picture or television productions as well as "live" events such as televised concerts or sporting events.
- In FIGURE 2, five imaging devices 12 are positioned, one at each of the corners of an imaginary square 20 and one directly above the center of the square. It will, of course, be understood that other installation configurations not necessarily completely surrounding and/or with more or fewer imaging devices 12 are possible (including a spherical configuration) and that the use of the square 20 configuration is by way of example only. Thus, the imaging devices 12 may be positioned in accordance with the present invention in any manner desired provided that the imaging devices are spaced apart from each other at locations 16 that provide a plurality of different taken points of view 17 with respect to the object 18 being imaged.
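As a rough illustration of the example layout just described, the sketch below computes positions for four imaging devices at the corners of an imaginary square plus one directly above its center, all aimed at an object at the origin. The side length, height and coordinate convention are assumptions for illustration only; the patent allows any spaced apart arrangement.

```python
import numpy as np

def square_plus_apex_layout(side=10.0, height=6.0):
    """Hypothetical helper: positions for four cameras at the corners of an
    imaginary square (side length `side`) centered on the object, plus one
    camera directly above the center, each aimed at the object at the origin."""
    half = side / 2.0
    positions = np.array([
        [ half,  half, 0.0],     # corner cameras
        [ half, -half, 0.0],
        [-half, -half, 0.0],
        [-half,  half, 0.0],
        [ 0.0,   0.0,  height],  # camera above the center of the square
    ])
    # Unit vectors from each camera toward the object at the origin.
    directions = -positions / np.linalg.norm(positions, axis=1, keepdims=True)
    return positions, directions

pos, dirs = square_plus_apex_layout()
print(pos.shape, dirs.shape)   # (5, 3) (5, 3)
```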
- the plurality of sequences of images of the object 18 recorded by the imaging devices 12 and stored in the storage device 14 are simultaneously retrievable by a processor 22 from the storage device 14, or receivable over lines 15 by the processor from the imaging devices in a "live" operating mode.
- An interactive device 24 (such as a graphical user interface) is connected to the processor 22 and operated by the user to choose a point of view from which the user desires to view the imaged object.
- This chosen point of view may comprise one of the plurality of different taken points of view 17 provided by the imaging devices 12 at the locations 16.
- the processor 22 will acquire the sequence of images taken by the imaging device at the chosen location 16 and display those images on a display 26 comprising either a monitor 28 or a virtual reality type display helmet 30.
- the user may select one image in any of the taken sequences for viewing in a "freeze frame" mode.
- the chosen point of view may alternatively comprise a point of view defined by a selected location different from (for example, somewhere between) any of the locations 16 associated with the placement of the imaging devices 12. Representative selected locations are illustrated in FIGURE 2 at each reference numeral 32. Responsive to such a selection by the user through the interactive device 24 of such a point of view with respect to the object 18 (hereinafter referred to as the "selected point of view" 33), the processor 22 will acquire the sequences of images recorded by the imaging devices 12 at each of the locations 16, and through the use of fuzzy logic processing 34 construct a sequence of images or a single freeze frame image from the recorded sequences of images having a selected point of view 33 corresponding to the selected location 32.
- This constructed sequence of images or freeze frame image will also be presented to the user on the display 26.
- the present invention allows a user to select for viewing an individual image or sequence of images of the object 18 from a selected point of view 33 (at one of the locations 32) not provided by the placement of the imaging devices 12 at the locations 16 for the taken points of view 17.
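The patent attributes the construction of images for a selected point of view 33 to fuzzy logic processing 34 but does not spell out the algorithm. The sketch below is a deliberately simplified stand-in — a distance-weighted blend of synchronized frames from the cameras nearest the selected location 32 — meant only to show how a view no camera occupies could be approximated from the taken sequences; a faithful implementation would warp pixels using scene geometry rather than blend whole frames.

```python
import numpy as np

def blend_for_selected_view(frames, camera_positions, selected_position):
    """Very rough stand-in for the patent's fuzzy-logic construction: blend the
    synchronized frames with weights that fall off with distance between each
    camera location 16 and the selected location 32."""
    cams = sorted(frames)
    stack = np.stack([frames[c].astype(np.float64) for c in cams])
    dists = np.array([np.linalg.norm(camera_positions[c] - selected_position)
                      for c in cams])
    weights = 1.0 / (dists + 1e-6)          # nearer cameras dominate
    weights /= weights.sum()
    blended = np.tensordot(weights, stack, axes=1)
    return blended.astype(np.uint8)

# Usage with two toy cameras and a viewpoint halfway between them.
frames = {"cam_A": np.zeros((4, 4, 3), np.uint8),
          "cam_B": np.full((4, 4, 3), 200, np.uint8)}
positions = {"cam_A": np.array([0.0, 0.0, 0.0]),
             "cam_B": np.array([10.0, 0.0, 0.0])}
mid = blend_for_selected_view(frames, positions, np.array([5.0, 0.0, 0.0]))
print(mid[0, 0])   # roughly [100 100 100]
```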
- the user may "move" around and about, and zoom in and zoom out on the object 18 for viewing the object at any of a number of selected angles (including phantom angles defined by the selected points of view 33) that are supported by the images taken by the plurality of imaging devices 12.
- by "supported" it is meant that sufficient and suitable image data necessary for the processor 22 to construct the sequence of images or freeze frame image at the selected location 32 and distance must be obtained by the imaging devices 12. It is thus preferred that high definition film or digital cameras be used to obtain sufficient data.
- the processor 22 may limit user selection of such points of view or distances, or construct the images with the best resolution and accuracy possible from the existing image data.
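One way to picture this "supported" test and the fallback of limiting selection is a crude resolution budget: zooming closer than the nearest camera effectively magnifies its pixels, so the processor could estimate the resolution available at the requested distance and refuse (or degrade) the view when it falls below a threshold. The function and the numbers below are assumptions, not anything specified in the patent.

```python
def view_is_supported(selected_distance, nearest_camera_distance,
                      camera_resolution_px, required_px=720):
    """Hypothetical sufficiency test: estimate the effective resolution left
    after zooming to the selected distance and compare it with a minimum."""
    magnification = nearest_camera_distance / max(selected_distance, 1e-6)
    effective_px = camera_resolution_px / max(magnification, 1.0)
    return effective_px >= required_px

# A viewpoint twice as close as the nearest high-definition camera:
print(view_is_supported(5.0, 10.0, camera_resolution_px=1920))  # True  (960 px left)
print(view_is_supported(2.0, 10.0, camera_resolution_px=1920))  # False (384 px left)
```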
- the image data provided by the imaging device 12' is cut out by the processor 22 and pasted around the sequences of images or freeze frame images provided by the conventional imaging devices 12.
- the curvature of the image associated with fish eye lenses is reduced or eliminated by the processor 22 using conventional image processing techniques.
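The patent only says that conventional image processing reduces the fish eye curvature. A common technique of that kind is inverse-mapped radial undistortion; the sketch below applies a single-coefficient polynomial model as a placeholder for a real lens calibration.

```python
import numpy as np

def undistort_fisheye(image, k=0.35):
    """Sketch of a simple radial "defishing" step: for every output pixel,
    look up the source pixel at a radius expanded by (1 + k*r^2), which
    counteracts the barrel curvature of a fish eye lens. The coefficient k
    is a placeholder for a real calibration."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.indices((h, w), dtype=np.float64)
    # Normalized coordinates in [-1, 1] relative to the image center.
    nx, ny = (xs - cx) / cx, (ys - cy) / cy
    r2 = nx * nx + ny * ny
    sx = np.clip((nx * (1 + k * r2)) * cx + cx, 0, w - 1).astype(int)
    sy = np.clip((ny * (1 + k * r2)) * cy + cy, 0, h - 1).astype(int)
    return image[sy, sx]

frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)
print(undistort_fisheye(frame).shape)   # (240, 320, 3)
```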
- the image of the object 18 taken from one point of view 17, or constructed for a selected point of view 33 may include another one of the imaging devices at another location 16, or some other unwanted object or item.
- the sequence of images of the object 18 taken by the imaging device 12 at one corner of the square 20 may include therein the imaging device at the diagonally opposite corner of the square.
- the presence of another imaging device 12 in the taken image of the object will not be distracting.
- the processor 22, under the control of the user through the interactive device 24, further utilizes image subtraction logic processing 36 to remove unwanted items from either the taken or constructed sequences of images or freeze frame images.
- unwanted items may be color coded and removed through conventional blue masking techniques.
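Blue masking of color-coded unwanted items (such as another imaging device 12 visible in frame) can be sketched as a simple chroma test: pixels whose blue channel clearly dominates are replaced from a clean plate. The threshold and the source of the replacement pixels are assumptions for illustration only.

```python
import numpy as np

def remove_blue_coded_items(frame, background, blue_margin=60):
    """Sketch of the blue-masking idea: pixels whose blue channel strongly
    dominates red and green are treated as belonging to a color-coded unwanted
    item and are replaced with pixels from a clean background image (which in
    practice could come from another camera or an earlier frame)."""
    r = frame[..., 0].astype(np.int16)
    g = frame[..., 1].astype(np.int16)
    b = frame[..., 2].astype(np.int16)
    mask = (b - np.maximum(r, g)) > blue_margin      # True where the item is
    out = frame.copy()
    out[mask] = background[mask]
    return out

frame = np.zeros((2, 2, 3), np.uint8)
frame[0, 0] = (10, 10, 200)                          # a blue-coded pixel
clean = np.full((2, 2, 3), 128, np.uint8)
print(remove_blue_coded_items(frame, clean)[0, 0])   # [128 128 128]
```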
- the processor 22 further includes an image buffer 23 for sequentially storing the individual images, as constructed and/or output from the processor, for a predetermined time delay prior to display. Storage of the images in the buffer 23 during the time delay allows for more computationally intense subsequent images to catch up with previously generated and output images.
- the images are presented by the display devices 26 in an uninterrupted, sequential, continuous manner for viewing from the selected point of view 33. Any sound associated with the images is likewise delayed in the buffer 23 for synchronized output.
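A minimal sketch of the image buffer 23: frames (with their sound) enter a fixed-length queue and are released only after a set delay, so frames that were slower to construct can catch up and the display 26 receives an uninterrupted stream. The delay length is an assumed tuning parameter, not a value from the patent.

```python
from collections import deque

class DelayBuffer:
    """Sketch of the image buffer 23: hold frames (and their sound) for a
    fixed number of slots before release so playback stays uninterrupted."""

    def __init__(self, delay_slots=30):
        self._queue = deque()
        self._delay = delay_slots

    def push(self, frame, sound=None):
        """Add a newly constructed frame; return the frame (and sound) that is
        now old enough to display, or None while the buffer is still filling."""
        self._queue.append((frame, sound))
        if len(self._queue) > self._delay:
            return self._queue.popleft()
        return None

buf = DelayBuffer(delay_slots=2)
for i in range(4):
    released = buf.push(f"frame-{i}", f"audio-{i}")
    print(i, released)   # frames start coming out two slots late
```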
- the present invention further includes a laser tracking device 31 positioned at a known location which may comprise the location of one of the imaging devices 12.
- the laser tracking device 31 emits a laser beam 35 directed at and reflected by each imaging device 12 (two such beams are shown in FIGURE 2).
- the reflected beam is processed by the device 31 to determine distance and direction to (i.e., relative location of) each imaging device.
- the relative locations of the imaging devices 12 are transmitted to the storage device 14 for storage with the recorded images or, alternatively in a "live" operating mode, are transmitted directly to the processor 22 for use in the image construction, zooming and image subtraction processes.
- the relative location information need only be calculated once and transmitted to the processor 22 prior to recording images of the object 18.
- the relative location information is updated by the device 31 on a periodic basis and transmitted to the storage device 14 or processor 22 synchronized with the recorded images in accordance with the time recording.
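The laser tracking device 31 reports distance and direction to each imaging device 12; turning one such measurement into a Cartesian relative location, tagged with the shared time code, might look like the sketch below. The azimuth/elevation convention is an assumption, since the patent only states that distance and direction are determined.

```python
import math

def relative_location(distance, azimuth_deg, elevation_deg):
    """Sketch: convert one laser range-and-direction measurement into a
    Cartesian position relative to the tracking device 31 (angle conventions
    are assumed for illustration)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)

# Periodic updates, each tagged with the shared time code so they stay
# synchronized with the recorded images.
track_log = {}
for time_code, (dist, az, el) in enumerate([(12.0, 45.0, 10.0),
                                            (12.1, 46.0, 10.0)]):
    track_log[("cam_B", time_code)] = relative_location(dist, az, el)
print(track_log[("cam_B", 0)])
```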
- Referring now to FIGURE 3, there is shown a structure 39 for mounting and supporting the plurality of imaging devices 12 at the locations 16.
- the structure 39 includes a plurality of telescoping arms 41.
- a universal joint 43 is provided at each end of each one of the arms 41.
- a distal end one of the universal joints 43 on each arm 41 supports mounting of an imaging device 12.
- a proximal end one of the universal joints 43 is mounted to a support platform 45.
- the structure 39 is typically fixed in one location during the imaging of stationary objects 18. With non-stationary objects 18, however, the structure 39 is mounted to a vehicle 43 (schematically shown in phantom, for example comprising a track cart, truck, or helicopter) for movement along with the object to be imaged.
- the use of telescoping arms 41 and universal joints 43 facilitates the positioning of the plurality of imaging devices 12 in accordance with the present invention in any one of a number of positions at spaced apart locations 16 providing a plurality of points of view with respect to the object 18.
- the positions of the arms 41 and joints 43 for the structure 39 are lockable to maintain a substantially constant relative location between the plurality of imaging devices 12 even during object 18 and/or vehicle 43 movement.
- Referring now to FIGURE 4, there is shown a flow diagram of a first embodiment of the method of the present invention.
- Sequences of images of the object 18 are first captured by a plurality of imaging devices 12 having known relative locations (step 38).
- the captured image data is processed for phantom angle and zooming capability (step 40).
- the processed data is then either recorded on media (step 42) or transmitted live (step 44) for subsequent processing to display the sequences of images from user chosen and selected points of view 17 and 33, respectively.
- Such display comprises the steps of inputting from the user preferred viewing information such as a selected point of view and distance (step 46).
- the processed image data is then read responsive to the user input (step 48), and a sequence of images or freeze frame image is constructed from the data in accordance with the user input (step 49) and the known relative locations of the imaging devices 12 and transmitted to a monitor for viewing (step 50).
- Referring now to FIGURE 5, there is shown a flow diagram of a second embodiment of the method of the present invention. Sequences of images of the object 18 are first captured by a plurality of imaging devices 12 having known relative locations (step 52). The images are then recorded on appropriate media (step 54). In this connection, it will be understood that the recorded sequences of images may then be packaged for subsequent sale or lease.
- the image data is simultaneously accessed by the processor 22 (step 56), and processed in accordance with the user preferred viewing information input through the use of an interactive viewing device and the known relative locations of the imaging devices 12 (step 58) to generate or construct sequences of images or freeze frame images for viewing (step 60) or, alternatively, the generated sequences of images or freeze frame images are recorded (step 62) and the recording is subsequently provided to the viewer for later access (step 64).
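The two method embodiments of FIGURES 4 and 5 differ mainly in whether the constructed views are displayed immediately or recorded for later access. A schematic sketch of that shared flow, with every callable a hypothetical placeholder, is given below.

```python
def run_pipeline(capture, construct, display, record=None, live=True):
    """Sketch of the two method flows: frames from all imaging devices are
    captured in step order, a view is constructed for the user's chosen or
    selected point of view, and the result is either shown immediately
    ("live" mode, FIGURE 4) or recorded for later access (FIGURE 5)."""
    for time_code, frames in capture():          # step: capture synchronized frames
        view = construct(frames, time_code)      # step: build chosen/selected view
        if live:
            display(view)                        # step: transmit to monitor
        elif record is not None:
            record(time_code, view)              # step: record for later viewing

# Usage with trivial stand-ins.
def fake_capture():
    for t in range(3):
        yield t, {"cam_A": f"A{t}", "cam_B": f"B{t}"}

run_pipeline(fake_capture,
             construct=lambda frames, t: f"view@{t} from {sorted(frames)}",
             display=print)
```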
- the present invention is not intended to replace or render obsolete the traditional role of a director in the production of movies, television shows and the like.
- the taken points of view 17 of scenes and events selected by the director will be provided along with other taken points of view 17 provided by the imaging devices 12 for user selection.
- the selected points of view 33 from phantom angles will also be made available to the viewer through processing the sequences of images in accordance with the present invention.
- if the viewer is content with the point of view chosen by the director, those images will be displayed. Otherwise, the viewer is free with the present method and apparatus to select any other point of view and distance (using either taken or constructed images) for viewing that is supported by the image data.
- Such viewing would include the viewing of freeze frame images as well as sequences of images.
- One immediate use of the present invention would be for a director to record a scene from multiple taken points of view 17.
- the director may select the optimum angles and imaging device positions for making the final recording.
- multiple imaging devices can be used in final filming to produce different versions of the production for general distribution.
- the processor of the present invention may be made available at home or at a theater to allow viewers to select other points of view 33 relating to phantom angles during a showing.
- the freeze frame feature of the present invention further facilitates careful consideration by the viewer of a recorded scene such as would be necessary in participating in an interactive or role playing game (e.g., a mystery/detective or combat game).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Studio Circuits (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Processing (AREA)
Abstract
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP7518668A JPH09507620A (ja) | 1994-01-10 | 1995-01-09 | 撮影された物体の選択された視点からの観察 |
EP95907364A EP0742987A4 (fr) | 1994-01-10 | 1995-01-09 | Visualisation d'objects mis en images a partir de points de vue selectionnes |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17938394A | 1994-01-10 | 1994-01-10 | |
US36689094A | 1994-12-30 | 1994-12-30 | |
US08/179,383 | 1994-12-30 | ||
US08/366,890 | 1994-12-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1995019093A1 true WO1995019093A1 (fr) | 1995-07-13 |
Family
ID=26875277
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US1995/000303 WO1995019093A1 (fr) | 1994-01-10 | 1995-01-09 | Visualisation d'objects mis en images a partir de points de vue selectionnes |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP0742987A4 (fr) |
JP (1) | JPH09507620A (fr) |
CA (1) | CA2179809A1 (fr) |
WO (1) | WO1995019093A1 (fr) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0793392A1 (fr) * | 1996-02-29 | 1997-09-03 | Matsushita Electric Industrial Co., Ltd. | Méthode et appareil pour la transmission et la réception de signaux télévision à trois dimensions d'images stéréoscopiques |
EP0903695A1 (fr) * | 1997-09-16 | 1999-03-24 | Canon Kabushiki Kaisha | Appareil de traitement d'images |
EP0930585A1 (fr) * | 1998-01-14 | 1999-07-21 | Canon Kabushiki Kaisha | Appareil pour la traitement d'images. |
GB2378341A (en) * | 2001-07-31 | 2003-02-05 | Hewlett Packard Co | Altering the viewpoint of part of an image to simulate a different viewing angle |
US20100007735A1 (en) * | 2005-12-22 | 2010-01-14 | Marco Jacobs | Arrangement for video surveillance |
DE19825302B4 (de) * | 1997-06-09 | 2014-09-25 | Evans & Sutherland Computer Corp. | System zur Einrichtung einer dreidimensionalen Abfallmatte, welche eine vereinfachte Einstellung räumlicher Beziehungen zwischen realen und virtuellen Szeneelementen ermöglicht |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4499872A1 (fr) | 2022-03-29 | 2025-02-05 | Illumina, Inc. | Systèmes et procédés de séquençage de polynucléotides |
WO2025006464A1 (fr) | 2023-06-30 | 2025-01-02 | Illumina, Inc. | Systèmes et procédés de séquençage de polynucléotides avec des diagrammes de dispersion alternatifs |
WO2025006466A1 (fr) | 2023-06-30 | 2025-01-02 | Illumina, Inc. | Systèmes et procédés de séquençage de polynucléotides avec quatre nucléotides marqués |
WO2025006460A1 (fr) | 2023-06-30 | 2025-01-02 | Illumina, Inc. | Systèmes et procédés de séquençage de polynucléotides à bases modifiées |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5200818A (en) * | 1991-03-22 | 1993-04-06 | Inbal Neta | Video imaging system with interactive windowing capability |
US5285397A (en) * | 1989-12-13 | 1994-02-08 | Carl-Zeiss-Stiftung | Coordinate-measuring machine for non-contact measurement of objects |
US5315313A (en) * | 1991-07-24 | 1994-05-24 | Matsushita Electric Industrial Co., Ltd. | Device for electing a figure from among figures depicted on a display device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4797942A (en) * | 1987-03-02 | 1989-01-10 | General Electric | Pyramid processor for building large-area, high-resolution image by parts |
US5187571A (en) * | 1991-02-01 | 1993-02-16 | Bell Communications Research, Inc. | Television system for displaying multiple views of a remote location |
-
1995
- 1995-01-09 CA CA002179809A patent/CA2179809A1/fr not_active Abandoned
- 1995-01-09 JP JP7518668A patent/JPH09507620A/ja active Pending
- 1995-01-09 EP EP95907364A patent/EP0742987A4/fr not_active Withdrawn
- 1995-01-09 WO PCT/US1995/000303 patent/WO1995019093A1/fr not_active Application Discontinuation
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5285397A (en) * | 1989-12-13 | 1994-02-08 | Carl-Zeiss-Stiftung | Coordinate-measuring machine for non-contact measurement of objects |
US5200818A (en) * | 1991-03-22 | 1993-04-06 | Inbal Neta | Video imaging system with interactive windowing capability |
US5315313A (en) * | 1991-07-24 | 1994-05-24 | Matsushita Electric Industrial Co., Ltd. | Device for electing a figure from among figures depicted on a display device |
Non-Patent Citations (1)
Title |
---|
See also references of EP0742987A4 * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0793392A1 (fr) * | 1996-02-29 | 1997-09-03 | Matsushita Electric Industrial Co., Ltd. | Méthode et appareil pour la transmission et la réception de signaux télévision à trois dimensions d'images stéréoscopiques |
US6104425A (en) * | 1996-02-29 | 2000-08-15 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for transmitting television signals, method and apparatus for receiving television signals, and method and apparatus for transmitting/receiving television signals |
DE19825302B4 (de) * | 1997-06-09 | 2014-09-25 | Evans & Sutherland Computer Corp. | System zur Einrichtung einer dreidimensionalen Abfallmatte, welche eine vereinfachte Einstellung räumlicher Beziehungen zwischen realen und virtuellen Szeneelementen ermöglicht |
EP0903695A1 (fr) * | 1997-09-16 | 1999-03-24 | Canon Kabushiki Kaisha | Appareil de traitement d'images |
US6421459B1 (en) | 1997-09-16 | 2002-07-16 | Canon Kabushiki Kaisha | Image processing apparatus |
EP0930585A1 (fr) * | 1998-01-14 | 1999-07-21 | Canon Kabushiki Kaisha | Appareil pour la traitement d'images. |
US6914599B1 (en) | 1998-01-14 | 2005-07-05 | Canon Kabushiki Kaisha | Image processing apparatus |
GB2378341A (en) * | 2001-07-31 | 2003-02-05 | Hewlett Packard Co | Altering the viewpoint of part of an image to simulate a different viewing angle |
GB2378341B (en) | 2001-07-31 | 2005-08-24 | Hewlett Packard Co | Improvements in and relating to displaying digital images |
US7432930B2 (en) | 2001-07-31 | 2008-10-07 | Hewlett-Packard Development Company, L.P. | Displaying digital images |
US20100007735A1 (en) * | 2005-12-22 | 2010-01-14 | Marco Jacobs | Arrangement for video surveillance |
US9241140B2 (en) * | 2005-12-22 | 2016-01-19 | Robert Bosch Gmbh | Arrangement for video surveillance |
Also Published As
Publication number | Publication date |
---|---|
CA2179809A1 (fr) | 1996-08-20 |
EP0742987A1 (fr) | 1996-11-20 |
JPH09507620A (ja) | 1997-07-29 |
EP0742987A4 (fr) | 1998-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6522325B1 (en) | Navigable telepresence method and system utilizing an array of cameras | |
AU761950B2 (en) | A navigable telepresence method and system utilizing an array of cameras | |
US6741250B1 (en) | Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path | |
EP2161925B1 (fr) | Procédé et système pour fusionner des flux vidéo | |
US20020190991A1 (en) | 3-D instant replay system and method | |
US20040027451A1 (en) | Immersive imaging system | |
KR20010023596A (ko) | 영상처리 방법 및 장치 | |
WO2002011431A1 (fr) | Systeme video et procede de commande associe | |
WO2001028309A2 (fr) | Procede et systeme permettant de comparer plusieurs images au moyen d'un reseau de cameras navigable | |
CA2794928A1 (fr) | Systeme et procede de capture et d'affichage d'images panoramiques de qualite cinematographique | |
US20220019280A1 (en) | System and Method for Interactive 360 Video Playback Based on User Location | |
EP0742987A1 (fr) | Visualisation d'objects mis en images a partir de points de vue selectionnes | |
US7106335B2 (en) | Method for displaying an object in a panorama window | |
WO2002087218A2 (fr) | Ensemble de cameras maniables et indicateur associe | |
US6525765B1 (en) | Image processing | |
JP2000182058A (ja) | 三次元運動入力方法及び三次元運動入力システム | |
JPH07105400A (ja) | 動画再生装置 | |
WO1998044723A1 (fr) | Studio virtuel | |
GB2317299A (en) | Processing digital image data derived from cinematographic film |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CA JP |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2179809 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1995907364 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1995907364 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1995907364 Country of ref document: EP |