WO2004068865A1 - Stereoscopic panoramic image capture device - Google Patents
Stereoscopic panoramic image capture device
- Publication number
- WO2004068865A1 (PCT/US2003/002285)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- combined
- panoramic
- imaging system
- images
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/06—Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/221—Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/286—Image signal generators having separate monoscopic and stereoscopic modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0088—Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image
Definitions
- Embodiments of the invention relate in general to a panoramic image capture device and, more specifically, to a panoramic image capture device for producing a stereoscopic panoramic image.
- Panoramic cameras are known in the art. Such cameras often use a single, rotatable camera. Although such devices are suitable for stationary images, such devices typically produce blurred or distorted images when used to capture non-stationary objects. It is also known in the art to utilize an image capture system having a plurality of image capture devices. In this manner, a plurality of images can be captured, substantially simultaneously, and stitched together using processes known in the art. Although such systems substantially eliminate the problem associated with capturing objects in motion, such systems do not provide means for producing a stereoscopic image. It is also known in the art to use a "fish-eye" lens to capture a panoramic image. Such images, however, import a large amount of distortion into the resulting image, and capture images of relatively low quality.
- an image capture system produces a stereoscopic pair of panoramic images.
- this invention provides an image capture system for producing a seamless panoramic image.
- this invention provides an image capture system for producing a stereoscopic panoramic image in motion.
- this invention provides a stereoscopic panoramic image of minimal distortion.
- this invention provides an imaging system for full-motion, real time, panoramic stereoscopic imaging.
- an imaging system comprising a first image capture device, a second image capture device, and a third image capture device.
- Means are also provided for combining at least a first portion of a first image captured using the first image capture device with a portion of a second image captured using the second image capture device to produce a first combined image.
- Means are also provided for combining at least a second portion of the first image with at least a portion of a third image captured using the third image capture device, to produce a second combined image.
- a plurality of image capture devices are utilized to produce a plurality of images, a portion of each of which is combined with a portion of adjacent images to produce a first combined panoramic image. Similarly, a second portion of each image is combined with separate portions of adjacent images to produce a second combined panoramic image.
- the first combined panoramic image, and second combined panoramic image are displayed in a stereoscopic orientation to produce a stereoscopic panoramic image.
- the imaging system of the present invention may be utilized to capture a plurality of images, to produce a full-motion stereoscopic panoramic image.
- Fig. 1 illustrates a front elevation of the imaging system of the present invention
- Fig. 2 illustrates a graphic depiction of image capture regions of adjacent image capture devices of the imaging system of the present invention
- Fig. 3 illustrates a bottom perspective view of the final panoramic stereoscopic image displayed on a screen, and the polarized glasses used to view the image;
- Figs. 4A-4B illustrate a left panoramic image and right panoramic image
- Figs. 5A-5B illustrate images associated with the first image buffer and the second image buffer
- Figs. 6A-6C are a flowchart of the transformation process utilized in association with the imaging system of the present device, to transform the plurality of images captured with the imaging devices into a first panoramic image and a second panoramic image to produce a stereoscopic panoramic image;
- Fig. 7 illustrates a perspective view of a 360 degree unobstructed panoramic stereoscopic image, created using the camera of Fig. 1.
- a camera (10) is shown having a body (12) constructed of plastic or other similar lightweight material.
- the body (12) is substantially spherical, having a diameter preferably between about 0.001 and about 500 centimeters, and more preferably, between about 10 and about 50 centimeters.
- a plurality of lenses (14) are provided substantially equally spaced across the surface of the body (12).
- the lenses (14) are preferably circular lenses having a diameter of preferably between about 5 angstroms and about 10 centimeters, and more preferably, between about 0.5 and about 5 centimeters. In the preferred embodiment, the lenses are model number BCL38C 3.8 millimeter micro lenses, manufactured by CBC America, located at 55 Mall Drive, Commack, NY 11725. As shown in Fig. 2, the lenses (14) are each associated with a charge coupled device (CCD) assembly (16), such as those well known in the art. Although in the preferred embodiment the CCD assembly (16) is a GP-CX171/LM CCD color board camera, manufactured by CBC America, located at 55 Mall Drive, Commack, NY 11725, any suitable CCD assembly known in the art may be used.
- the lenses (14) and/or CCD assemblies (16) can also be described as "image capture units" (26), (40), and (54).
- all of the image capture units (26), (40), and (54) are operationally coupled to a processing unit (CPU)(22), which can thereby receive images from the image capture devices (14) and/or (16).
- the CPU (22) is a 900 MHz, Pentium®4 class personal computer provided with an Oxygen GVX210 graphics card manufactured by 3Dlabs of 480 Pontrero, Sunnyvale, CA 94086.
- the CPU may be of any type known in the art, it is preferably capable of quad buffering and utilizing page flipping software in a manner such as that known in the art.
- the CPU (22) may be coupled to a head mounted display (24) which, in the preferred embodiment, is a VFX3D, manufactured by Interactive Imaging Systems, Inc., located at 2166 Brighton Henrietta Townline Road, Rochester, NY 14623. As shown in Fig. 2, the lenses (14) are divergent from one another, and offset about twenty degrees from one another along the substantially arcuate path or line defined by the body (12).
- the divergent orientation of the lenses (14) allows each lens (14) and/or image capture unit (26), (40), (54) to have a substantially similar focal point.
- Each lens (14) has about a fifty-three degree field of view, which overlaps the field of view of a laterally adjoining lens by between about ten and about ninety percent, and, preferably, between about fifty and about sixty-five percent.
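These figures are mutually consistent. As a back-of-the-envelope check (not part of the patent), assuming the roughly twenty-degree offset between adjacent optical axes stated above, the shared portion of two adjoining fifty-three degree fields of view is:

```python
# Back-of-the-envelope check of the stated overlap, assuming a 53-degree
# field of view per lens and a 20-degree offset between adjacent optical
# axes (both figures taken from the text above).
fov_deg = 53.0
offset_deg = 20.0

overlap_deg = fov_deg - offset_deg        # angular width shared with the neighbor
overlap_fraction = overlap_deg / fov_deg  # fraction of each lens's field of view

print(round(overlap_fraction * 100, 1))   # ~62.3 percent, inside the preferred 50-65 range
```

The result falls within the "about fifty and about sixty-five percent" preferred range given above.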
- a first image capture unit (26) is associated with an optical axis (28), bisecting the image bordered on one side by a left side plane (30) and on the other side by a right side plane (32).
- the lens (14) of the image capture unit (26) is focused on a defined image plane (34), divided into a left image plane (36) and a right image plane (38).
- a second image capture unit (40) is also provided with an optical axis (42), a left side plane (44), a right side plane (46), a defined image plane (48), a left image plane (50), and a right image plane (52).
- a third image capture unit (54), to the right of the first image capture unit (26), is provided with optical axis (56), a left side plane (58), a right side plane (60), a defined image plane (62), a left image plane (64), and a right image plane (66).
- every point associated with a final panoramic image is within the defined image plane of at least two adjacent image capture units (26), (40) and/or (54).
- the defined image planes of adjacent image capture units overlap vertically, preferably about 1-20 percent, more preferably about 5-10 percent, and about 7 percent in the preferred embodiment.
- the defined image planes of adjacent image capture units overlap horizontally, preferably about 20 percent, more preferably about 5-10 percent, and about 6 percent in the preferred embodiment.
- a first panoramic image (72) and second panoramic image (74) are created.
- Each of the panoramic images (72), (74) may display about 90 degrees of a scene (e.g., about half of a hemisphere).
- an image (76), associated with the left image plane (36) of the first image capture unit (26) is combined with an image (78), associated with the left image plane (50) of the second image capture unit (40) and an image (80), associated with the left image plane (64) of the third image capture unit (54).
- the associated image planes (36), (50) and (64), preferably overlap by about 0.5-30 percent, more preferably by about 10-20 percent, and about 13 percent in the preferred embodiment, but are not parallel to one another, and are not necessarily tangent to a curve defining the first panoramic image (72).
- the images (76), (78), (80), (82), (84) and (86), associated with planes (36), (50) and (64) must be transformed, to remove distortion associated with their non-parallel orientation, before they can be stitched together as described below to form the final panoramic image (68).
- Figs. 2, 3, 4A-4B and 5A-5B are examples of the images (76), (78), (80), (82), (84) and (86), associated with planes (36), (50) and (64)
- the images may be transmitted via hardwired, wireless, or any desired connection to the CPU (22).
- the CPU (22) then may operate to transform the images in accordance with the process described in Figs. 6A-6C.
- source images, which in the preferred embodiment are substantially rectilinear images but which may, of course, be any type of image, are obtained or received from the image capture units (26), (40) and (54) by the CPU (22).
- (Figs. 2, 4A, 4B and 6A-6C) As shown in block (94), the CPU (22) may then define registration pixel pairs for the untransformed source images.
- the CPU (22) creates an input file.
- the input file includes the height and width of the final panoramic image (68), the source information, and registration point information.
- the source information includes the file name and path of the source image, the height and width of the source image in pixels, and the yaw, pitch and roll angles of the source of the associated image capture unit.
- the horizontal field of view of the image capture unit which is preferably between about 1 and about 80 degrees, more preferably between about 30 and about 60 degrees, and about 53 degrees in the preferred embodiment, is defined by the associated left side plane and right side plane, the X Shift, Y Shift and zoom values of the source image.
- the information associated with the registration points includes information regarding the source image associated with the first pixel position of the registration point, a horizontal and vertical pixel position in the first source image, the list index of information regarding the source image associated with the second pixel position, and a horizontal and vertical pixel position in the second source image.
- the images (76-86) associated with the image planes (36, 38, 50, 52, 64 and 66) are substantially rectilinear, normal flat-field images, and the panoramic images (72 and 74) are at least partially equirectangular, representing pixel mapping on a spherical surface in a manner such as that shown in Figs. 4A-4B. Accordingly, once the CPU (22) creates the input file, as shown in block (98), the registration pixel pairs can be transformed to locate their position in the final panoramic image (68).
- a vector is defined that represents a first pixel of a given registration pixel pair in three-dimensional space, locating it on the final panoramic image (68). This is accomplished by applying the following matrix transformation to each pixel:
- segment x Alter the horizontal position of the pixel in the source image so that it is relative to the image's center. Then compensate for the X Shift and zoom variables of the source image.
- segment y Alter the vertical position of the pixel in the source image so that it is relative to the image's center. Then compensate for the Y Shift and zoom variables of the source image.
- segment z Using various source image variables, determine the z segment that corresponds to the scale provided by the image's size in pixels.
- the pixel vector represents the global globX, globY, and globZ positions of that point in three-dimensional space.
- the CPU (22) then converts these positions into spherical coordinates and applies them directly to the final panoramic coordinates.
- the vector's yaw angle represents its horizontal panorama position, newX, and its pitch angle represents its vertical panorama position, newY.
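The per-pixel mapping described above can be sketched as follows. This is an illustrative reconstruction, not the patent's exact matrix: the x and y segments are taken relative to the image center, the `xshift`, `yshift` and `zoom` parameters are hypothetical stand-ins for the X Shift, Y Shift and zoom variables, the z segment is derived from the image's pixel width and field of view, and the vector's yaw and pitch give the panorama position (newX, newY).

```python
import math

def pixel_to_panorama(px, py, width, height, fov_deg,
                      xshift=0.0, yshift=0.0, zoom=1.0):
    """Map a source-image pixel to equirectangular panorama angles.

    Illustrative sketch of the transformation described in the text:
    build a 3-D vector for the pixel, then read off its yaw (horizontal
    panorama position, newX) and pitch (vertical panorama position, newY).
    """
    # segment x: horizontal position relative to the image center,
    # compensated for X Shift and zoom
    x = (px - width / 2.0 + xshift) / zoom
    # segment y: vertical position relative to the image center,
    # compensated for Y Shift and zoom
    y = (py - height / 2.0 + yshift) / zoom
    # segment z: depth scale implied by the image's width in pixels
    # and its horizontal field of view
    z = (width / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

    # Convert the vector to spherical angles: yaw -> newX, pitch -> newY
    new_x = math.degrees(math.atan2(x, z))
    new_y = math.degrees(math.atan2(y, math.hypot(x, z)))
    return new_x, new_y
```

Under this sketch the image center maps to (0, 0) and a pixel at the right edge maps to half the horizontal field of view, as expected.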
- the CPU (22) calculates the distance between the registration pixel pairs. If the average distance of the registration pixel pairs for a given source image are not yet minimized, as would be the case upon the initial transformation, shown in block (102), the yaw of the source image is altered slightly, whereafter the process returns to block (98) and the registration pixel pairs are again transformed to pixel points in the final panoramic image (68). This process continues, altering the yaw, until the average distance of the source image registration pixel pairs is minimized.
- the average distance of all source image registration pixel pairs is calculated and, if they are not yet minimized, the yaw, pitch, roll, XShift, YShift and zoom of the source images are altered as shown in block (108), and the process returns to block (98), where the process continues, until the distance between the registration pixel pairs is minimized across all of the source images.
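The iterative adjustment in the blocks above can be sketched as a shrinking-step search over one parameter. Here `avg_error` is a hypothetical callback standing in for re-transforming the registration pixel pairs (blocks 98-100) and returning their average distance; the same search could be repeated for pitch, roll, X Shift, Y Shift and zoom.

```python
def refine_yaw(yaw, avg_error, step=0.1, min_step=1e-4):
    """Minimize the average registration-pair distance over one parameter.

    Sketch of the text's loop: alter the yaw slightly, keep the change
    if the average registration-pair distance decreases, and stop when
    no further alteration helps at the finest step size.
    """
    best = avg_error(yaw)
    while step > min_step:
        for candidate in (yaw - step, yaw + step):
            err = avg_error(candidate)
            if err < best:          # keep any slight alteration that helps
                yaw, best = candidate, err
                break
        else:
            step /= 2.0             # no improvement: refine the step size
    return yaw, best
```

For example, with a quadratic error centered at 3.0 degrees, `refine_yaw(0.0, lambda y: (y - 3.0) ** 2)` converges to a yaw near 3.0.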
- an output file is created, identifying the height and width of the first panoramic image (72), the yaw, pitch, roll, XShift, YShift and zoom transformation image information relative to each particular source image.
- a vector is defined representing the particular pixel's position in three dimensional space, using the vector transformation described above.
- the vector is transformed to reflect the yaw, pitch, roll, XShift, YShift and zoom information associated with the source image as defined in the output file.
- the transformed vector is associated with a pixel in the final panoramic image (68). As shown in block (118), this process is repeated until all of the pixels in a particular source image have been transformed into vectors, and their position located on the final panoramic image (68)
- two image buffers (90) and (92) are created, each having a height and width approximately equal to that of the final panoramic image (68).
- (Figs. 3, 5A-5B and 6B) As shown in block (122), once the image buffers (90) and (92) have been created, vector transformation information associated with a quadrilateral of four adjacent pixels of a particular source image is utilized to draw the quadrilateral of pixels onto the appropriate image buffer (90) or (92). (Figs. 5A-5B and 6C). If the pixel is in the left image planes (36), (50) or (64), the pixel is written to the left image buffer (90).
- if the pixel is in the right image planes (38), (52) or (66), the pixel is written to the right image buffer (92).
- (Figs. 2 and 5A-5B) Since the transformation is likely to spread out the quadrilateral of pixels on the image buffer, there will likely be gaps between pixels as they are converted from their rectilinear location to their equirectangular location in the associated image buffer (90) or (92).
- (Figs. 5A-5B and 6C) When the quadrilateral of pixels is located on the associated image buffer (90) or (92), the gaps thereby created are filled as smoothly as possible, perhaps using a linear gradient of the corner pixel colors, in a manner such as that known in the art.
- Known linear gradient fill techniques may also be used to diminish visible seams between images.
- when additional source image information is applied to the image buffers (90) and (92), for areas of the image buffers (90) and (92) that have already been filled with pixel data, the alpha transparency of the new overlapping pixels is linearly degraded, smoothing the resulting seam as described below.
- the CPU (22) may eliminate the gaps by interpolating the internal pixel values, using any method known in the art, which may include comparing the gap to the adjacent pixels, or "white boxing" the gap by utilizing the immediately preceding frame and immediately succeeding frame in a motion capture system, to extrapolate the most appropriate pixel value for the gap.
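One simple way to realize the "linear gradient of the corner pixel colors" mentioned above is bilinear interpolation across the quadrilateral. The sketch below is illustrative, not the patent's method; the (r, g, b) tuple representation and the (u, v) parameterization are assumptions.

```python
def fill_quad_gap(c00, c10, c01, c11, u, v):
    """Fill a gap pixel inside a quadrilateral of four corner colors.

    Minimal sketch of gap filling by a linear gradient of the corner
    pixel colors: bilinear interpolation, where (u, v) in [0,1] x [0,1]
    is the gap pixel's position within the quadrilateral and each color
    is an (r, g, b) tuple.
    """
    def lerp(a, b, t):
        # Blend two colors channel by channel
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    top = lerp(c00, c10, u)      # blend along the top edge
    bottom = lerp(c01, c11, u)   # blend along the bottom edge
    return lerp(top, bottom, v)  # blend between the two edges
```

A gap pixel at the center of a quadrilateral thus receives the average of the four corner colors.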
- blocks (122) and (124) are repeated until all of the source image pixels have been mapped to the appropriate image buffer (90) or (92). Once all of the pixels have been mapped, as shown in block (128), the first image buffer pixels are compared to the first panoramic image pixels.
- the CPU (22) sets the pixel in the first image buffer (90) to maximum visibility. If the first panoramic image (72) already has a pixel associated with a pixel in the first image buffer (90), the existing pixel is compared to the corresponding pixel in the first image buffer (90). If the pixel in the first image buffer (90) has a shorter distance to the center of its respective source image than does the existing pixel, the pixel in the first image buffer (90) is set to maximum visibility.
- the pixel in the first image buffer (90) is set to minimum visibility.
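The visibility rule of the preceding paragraphs can be sketched as a small selection function. This is an illustrative reconstruction: the (color, distance-to-center) tuple representation is an assumption, not the patent's data layout.

```python
def choose_pixel(existing, candidate):
    """Decide which overlapping pixel remains visible in the panorama.

    Sketch of the rule in the text: each pixel carries the distance
    from the center of its respective source image; the pixel closer
    to its own image center is kept at maximum visibility, the other
    is effectively hidden. Pixels are (color, dist_to_center) tuples;
    `existing` is None when the panorama spot is still empty.
    """
    if existing is None:
        return candidate   # empty spot: new pixel gets maximum visibility
    # Shorter distance to its source-image center wins
    return candidate if candidate[1] < existing[1] else existing
```

Preferring pixels near their source-image center keeps the least-distorted samples in the overlap regions.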
- the process is repeated to merge pixels of the second image buffer (92) into the second panoramic image (74).
- the overlapping edges of the image in the image buffers (90) and (92) are feathered by degrading visibility of the pixels over the area of overlap. This feathering smoothes the overlapping areas of the image once it is merged from the image buffers (90) and (92) into the panoramic images (72) and (74).
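The feathering described above amounts to degrading pixel visibility linearly across the overlap region. A minimal sketch, assuming a one-dimensional overlap between `overlap_start` and `overlap_end` (hypothetical coordinates, not the patent's variables):

```python
def feather_alpha(x, overlap_start, overlap_end):
    """Alpha (visibility) for a pixel at position x in an overlap region.

    Sketch of the feathering in the text: visibility is degraded
    linearly across the overlap, from fully visible at overlap_start
    to fully transparent at overlap_end, so the seam between merged
    images is smoothed.
    """
    if x <= overlap_start:
        return 1.0               # outside the overlap: fully visible
    if x >= overlap_end:
        return 0.0               # past the overlap: fully transparent
    return 1.0 - (x - overlap_start) / (overlap_end - overlap_start)
```

The midpoint of the overlap then blends both contributing images at half visibility each.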
- blocks (122) through (132) are repeated until all of the source images have been merged into the first and second panoramic images (72) and (74).
- the images (76), (78) and (80) associated with the left image planes (36), (50) and (64) of the image capture units (26), (40) and (54) are used to create the first panoramic image (72), and the images (82), (84) and (86) associated with the right image planes (38), (52) and (66) of the image capture units (26), (40) and (54) are used to create the second panoramic image (74).
- the panoramic images (72) and (74) can be displayed together as the final panoramic image (68). (Figs. 3, 4A and 4B).
- the panoramic images (72) and (74) may be reverse polarized, shown on a standard panoramic screen (140), such as that shown in Fig. 3, and viewed using glasses (142) having lenses of reversed polarity.
- the panoramic images (72) and (74) may be conveyed to the head mounted display (24) shown in Fig. 1.
- the CPU (22) can send multiple panoramic images to the head mounted display (24) using known "page-flipping" techniques to send and separately display the images in the left and right displays of the head mounted display (24), as appropriate. This process can be used to animate the display as a full 24 frame per second motion picture, or to display multiple visual images to each display.
- an imaging system (see Fig. 2) may comprise a first image capture unit (26) to capture a first image (36), a second image capture unit (40) to capture a second image (50), and a third image capture unit (54) to capture a third image (64).
- the first portion of the first image can be any amount up to the entire first image, or may be between about 20 percent and about 80 percent of the first image.
- the portion of the second image can be any amount up to the entire second image, or may be between about 20 percent and about 80 percent of the second image.
- the first and second combined images can be repeatedly produced and displayed to convey stereoscopic motion.
- An imaging system may also comprise an image capture unit (26) for providing an image, a means (22) for using a first portion of said image to provide a first stereoscopic image, and means (22) for using a second portion of said image to provide a second stereoscopic image.
- the head mounted display unit (24) may also be provided with an orientation sensor (144), such as those well known in the art, to change the image provided to the head mounted display (24) as the sensor (144) moves.
- an orientation sensor 144
- a user may look up, down, and in either direction, to see that portion of the final panoramic image (68) associated with the vector of the user's line of sight, and have the sensation of actually looking around a three-dimensional space.
- the camera (10) may be provided with a plurality of image capture pairs having substantially rectilinear capture systems oriented substantially parallel to one another.
- the pairs may be offset by a predetermined factor to obtain a desired stereoscopic image.
- the transformation process associated with this embodiment is identical to that described above, albeit instead of dividing the images in half, and sending pixels from each half to a separate image buffer, all of the pixels associated with the images from the "left" image capture devices of each image capture device pair are sent to one image buffer, and all of the pixels associated with images from the "right" image capture devices are sent to another image buffer.
- the camera (10) and CPU (22) may be utilized to capture twenty-four or more frames per second, and display the final stereoscopic, panoramic image (68), in real time, as a motion picture.
- computer generated graphical information, produced in a manner such as that well known in the art, may be combined with the final panoramic images (72) and (74) in the CPU (22) to provide a seamless integration of actual images captured with the camera (10) and digitized virtual reality images (146). (Fig. 3) This combination produces a seamless display of real and virtual panoramic stereographic images.
- the images captured by the camera (10) may be transformed, utilizing the above transformation procedures, to produce a seamless 360-degree panoramic monographic image.
- the camera (10) is provided with a support post (148) and a transportation unit such as a remote control carriage (150), similar to or identical to those used in association with remote control cars and the like. Images associated with the left image planes of the image capture units may be used to produce a combined image and the images associated with the right image planes of the image capture units may be used to overwrite and fill the combined image to hide the support post (148), carriage (150) and any other camera equipment otherwise visible in the combined image.
- the foregoing transformation procedures may be utilized for such overwriting and filling.
- a final panoramic image (152), which may be displayed on a spherical screen (154) or on the head mounted display (24). (Figs. 1 and 7).
- the interpolation and feathering procedures detailed above may be utilized to approximate the image lying beneath the carriage (150), to produce the appearance of a completely unobstructed 360-degree image.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Image Processing (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNA038261855A CN1771740A (zh) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
CA002513874A CA2513874A1 (fr) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
EP03710742A EP1586204A1 (fr) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
PCT/US2003/002285 WO2004068865A1 (fr) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
JP2004567640A JP2006515128A (ja) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
AU2003214899A AU2003214899A1 (en) | 2003-01-24 | 2003-01-24 | Stereoscopic Panoramic Image Capture Device |
IL169827A IL169827A0 (en) | 2003-01-24 | 2005-07-21 | Stereoscopic panoramic image capture device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2003/002285 WO2004068865A1 (fr) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004068865A1 true WO2004068865A1 (fr) | 2004-08-12 |
Family
ID=32823170
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2003/002285 WO2004068865A1 (fr) | 2003-01-24 | 2003-01-24 | Stereoscopic panoramic image capture device |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP1586204A1 (fr) |
JP (1) | JP2006515128A (fr) |
CN (1) | CN1771740A (fr) |
AU (1) | AU2003214899A1 (fr) |
CA (1) | CA2513874A1 (fr) |
IL (1) | IL169827A0 (fr) |
WO (1) | WO2004068865A1 (fr) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008005066A1 (fr) * | 2006-06-30 | 2008-01-10 | Microsoft Corporation | Parametric calibration for panoramic camera systems |
WO2012039669A1 (fr) * | 2010-09-20 | 2012-03-29 | Scalado Ab | Method for forming images |
US20120243746A1 (en) * | 2011-03-22 | 2012-09-27 | Sony Corporation | Image processor, image processing method, and program |
GB2525170A (en) * | 2014-04-07 | 2015-10-21 | Nokia Technologies Oy | Stereo viewing |
US9196069B2 (en) | 2010-02-15 | 2015-11-24 | Mobile Imaging In Sweden Ab | Digital image manipulation |
US9344642B2 (en) | 2011-05-31 | 2016-05-17 | Mobile Imaging In Sweden Ab | Method and apparatus for capturing a first image using a first configuration of a camera and capturing a second image using a second configuration of a camera |
US9432583B2 (en) | 2011-07-15 | 2016-08-30 | Mobile Imaging In Sweden Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
EP3198865A4 (fr) * | 2014-09-22 | 2017-09-27 | Samsung Electronics Co., Ltd. | Camera system for three-dimensional video |
US9792012B2 (en) | 2009-10-01 | 2017-10-17 | Mobile Imaging In Sweden Ab | Method relating to digital images |
US11049218B2 (en) | 2017-08-11 | 2021-06-29 | Samsung Electronics Company, Ltd. | Seamless image stitching |
GB2591278A (en) * | 2020-01-24 | 2021-07-28 | Bombardier Transp Gmbh | A monitoring system of a rail vehicle, a method for monitoring and a rail vehicle |
US11205305B2 (en) | 2014-09-22 | 2021-12-21 | Samsung Electronics Company, Ltd. | Presentation of three-dimensional video |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011082919A (ja) * | 2009-10-09 | 2011-04-21 | Sony Corp | Image processing apparatus and method, and program |
WO2011162227A1 (fr) * | 2010-06-24 | 2011-12-29 | Fujifilm Corporation | Stereoscopic panoramic image synthesis device, image capture device, stereoscopic panoramic image synthesis method, recording medium and computer program |
TWI506595B (zh) * | 2011-01-11 | 2015-11-01 | Altek Corp | Panoramic image generation method and device |
JP5754312B2 (ja) * | 2011-09-08 | 2015-07-29 | Casio Computer Co., Ltd. | Image processing apparatus, image processing method, and program |
KR101804199B1 (ko) * | 2011-10-12 | 2017-12-05 | Samsung Electronics Co., Ltd. | Apparatus and method for generating a stereoscopic panoramic image |
CN103096101B (zh) * | 2011-11-07 | 2016-03-30 | Lenovo (Beijing) Co., Ltd. | Video synthesis method, device and electronic equipment |
KR20150068298A (ko) * | 2013-12-09 | 2015-06-19 | CJ CGV Co., Ltd. | Method and system for generating multi-projection images |
US10334227B2 (en) * | 2014-03-28 | 2019-06-25 | Intuitive Surgical Operations, Inc. | Quantitative three-dimensional imaging of surgical scenes from multiport perspectives |
US9934615B2 (en) * | 2016-04-06 | 2018-04-03 | Facebook, Inc. | Transition between binocular and monocular views |
CN106851240A (zh) * | 2016-12-26 | 2017-06-13 | NetEase (Hangzhou) Network Co., Ltd. | Image data processing method and device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5023725A (en) * | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US6141034A (en) * | 1995-12-15 | 2000-10-31 | Immersive Media Co. | Immersive imaging method and apparatus |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69324224T2 (de) * | 1992-12-29 | 1999-10-28 | Koninklijke Philips Electronics N.V., Eindhoven | Image processing method and apparatus for generating an image from multiple adjacent images |
JPH0962861A (ja) * | 1995-08-21 | 1997-03-07 | Matsushita Electric Ind Co Ltd | Panoramic video device |
US7015954B1 (en) * | 1999-08-09 | 2006-03-21 | Fuji Xerox Co., Ltd. | Automatic video system using multiple cameras |
JP2002094849A (ja) * | 2000-09-12 | 2002-03-29 | Sanyo Electric Co Ltd | Wide-field image capture device |
JP2002159019A (ja) * | 2000-11-16 | 2002-05-31 | Canon Inc | Display control device, shooting position estimation device, display system, shooting system, image positioning method, shooting position estimation method, and recording medium storing a processing program |
JP4590754B2 (ja) * | 2001-02-28 | 2010-12-01 | Sony Corp | Image input processing device |
JP2003141562A (ja) * | 2001-10-29 | 2003-05-16 | Sony Corp | Image processing apparatus and image processing method for non-planar images, storage medium, and computer program |
- 2003
- 2003-01-24 CN CNA038261855A patent/CN1771740A/zh active Pending
- 2003-01-24 EP EP03710742A patent/EP1586204A1/fr not_active Withdrawn
- 2003-01-24 JP JP2004567640A patent/JP2006515128A/ja active Pending
- 2003-01-24 WO PCT/US2003/002285 patent/WO2004068865A1/fr active Application Filing
- 2003-01-24 CA CA002513874A patent/CA2513874A1/fr not_active Abandoned
- 2003-01-24 AU AU2003214899A patent/AU2003214899A1/en not_active Abandoned
- 2005
- 2005-07-21 IL IL169827A patent/IL169827A0/en unknown
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101330373B1 (ko) | 2006-06-30 | 2013-11-15 | Microsoft Corporation | Parameter calibration for panoramic camera systems |
WO2008005066A1 (fr) * | 2006-06-30 | 2008-01-10 | Microsoft Corporation | Parametric calibration for panoramic camera systems |
US9792012B2 (en) | 2009-10-01 | 2017-10-17 | Mobile Imaging In Sweden Ab | Method relating to digital images |
US9196069B2 (en) | 2010-02-15 | 2015-11-24 | Mobile Imaging In Sweden Ab | Digital image manipulation |
US9396569B2 (en) | 2010-02-15 | 2016-07-19 | Mobile Imaging In Sweden Ab | Digital image manipulation |
WO2012039669A1 (fr) * | 2010-09-20 | 2012-03-29 | Scalado Ab | Method for forming images |
US9544498B2 (en) | 2010-09-20 | 2017-01-10 | Mobile Imaging In Sweden Ab | Method for forming images |
US20120243746A1 (en) * | 2011-03-22 | 2012-09-27 | Sony Corporation | Image processor, image processing method, and program |
US9071751B2 (en) * | 2011-03-22 | 2015-06-30 | Sony Corporation | Image processor method and program for correcting distance distortion in panorama images |
US9344642B2 (en) | 2011-05-31 | 2016-05-17 | Mobile Imaging In Sweden Ab | Method and apparatus for capturing a first image using a first configuration of a camera and capturing a second image using a second configuration of a camera |
US9432583B2 (en) | 2011-07-15 | 2016-08-30 | Mobile Imaging In Sweden Ab | Method of providing an adjusted digital image representation of a view, and an apparatus |
GB2525170A (en) * | 2014-04-07 | 2015-10-21 | Nokia Technologies Oy | Stereo viewing |
US10455221B2 (en) | 2014-04-07 | 2019-10-22 | Nokia Technologies Oy | Stereo viewing |
US10645369B2 (en) | 2014-04-07 | 2020-05-05 | Nokia Technologies Oy | Stereo viewing |
US11575876B2 (en) | 2014-04-07 | 2023-02-07 | Nokia Technologies Oy | Stereo viewing |
EP3198865A4 (fr) * | 2014-09-22 | 2017-09-27 | Samsung Electronics Co., Ltd. | Camera system for three-dimensional video |
US10257494B2 (en) | 2014-09-22 | 2019-04-09 | Samsung Electronics Co., Ltd. | Reconstruction of three-dimensional video |
US10313656B2 (en) | 2014-09-22 | 2019-06-04 | Samsung Electronics Company Ltd. | Image stitching for three-dimensional video |
US10547825B2 (en) | 2014-09-22 | 2020-01-28 | Samsung Electronics Company, Ltd. | Transmission of three-dimensional video |
US10750153B2 (en) | 2014-09-22 | 2020-08-18 | Samsung Electronics Company, Ltd. | Camera system for three-dimensional video |
US11205305B2 (en) | 2014-09-22 | 2021-12-21 | Samsung Electronics Company, Ltd. | Presentation of three-dimensional video |
US11049218B2 (en) | 2017-08-11 | 2021-06-29 | Samsung Electronics Company, Ltd. | Seamless image stitching |
GB2591278A (en) * | 2020-01-24 | 2021-07-28 | Bombardier Transp Gmbh | A monitoring system of a rail vehicle, a method for monitoring and a rail vehicle |
Also Published As
Publication number | Publication date |
---|---|
AU2003214899A1 (en) | 2004-08-23 |
CA2513874A1 (fr) | 2004-08-12 |
IL169827A0 (en) | 2007-07-04 |
JP2006515128A (ja) | 2006-05-18 |
CN1771740A (zh) | 2006-05-10 |
EP1586204A1 (fr) | 2005-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6947059B2 (en) | Stereoscopic panoramic image capture device | |
US11528468B2 (en) | System and method for creating a navigable, three-dimensional virtual reality environment having ultra-wide field of view | |
WO2004068865A1 (fr) | Stereoscopic panoramic image capture device | |
US7429997B2 (en) | System and method for spherical stereoscopic photographing | |
JP4642723B2 (ja) | Image generation device and image generation method | |
EP3057066B1 (fr) | Generation of three-dimensional imagery from a two-dimensional image using a depth map | |
US20170363949A1 (en) | Multi-tier camera rig for stereoscopic image capture | |
WO2012166593A2 (fr) | System and method for creating a navigable, panoramic three-dimensional virtual reality environment having an ultra-wide field of view | |
CN101938599A (zh) | Method for generating interactive dynamic panoramic images | |
US11812009B2 (en) | Generating virtual reality content via light fields | |
US6836286B1 (en) | Method and apparatus for producing images in a virtual space, and image pickup system for use therein | |
WO2018035347A1 (fr) | Multi-tier camera apparatus with stereoscopic image capture | |
JP4406824B2 (ja) | Image display device, pixel data acquisition method, and program for executing the method | |
CN109769111 (zh) | Image display method, device, system, storage medium and processor | |
US8019180B2 (en) | Constructing arbitrary-plane and multi-arbitrary-plane mosaic composite images from a multi-imager | |
CN115174805 (zh) | Panoramic stereoscopic image generation method, device, and electronic device | |
CN114581297B (zh) | Image processing method and device for panoramic images | |
US20040061775A1 (en) | Apparatus for placing primary image in registration with lenticular lens in system for using binocular fusing to produce secondary 3D image from primary image | |
KR20060015460A (ko) | Stereoscopic panoramic image capture device | |
CN109272445 (zh) | Panoramic video stitching method based on a spherical model | |
JPH11220758A (ja) | Stereo image display method and apparatus | |
CN108122283B (zh) | Method for editing VR images using coordinate transformation | |
Zaitseva et al. | The development of mobile applications for the capturing and visualization of stereo and spherical panoramas | |
Coleshill | Spherical panoramic video: the space ball | |
JP2005077808 (ja) | Image display device, image display method, and image display system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2513874 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: 169827 Country of ref document: IL |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020057013716 Country of ref document: KR Ref document number: 2004567640 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003710742 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003214899 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1496/KOLNP/2005 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 2005126718 Country of ref document: RU Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 20038261855 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2003710742 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057013716 Country of ref document: KR |