WO2018136709A1 - Method for epipolar time of flight imaging - Google Patents
Method for epipolar time of flight imaging (original title: Procédé d'imagerie de temps de vol épipolaire)
- Publication number
- WO2018136709A1 (PCT/US2018/014369)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- epipolar
- illuminated
- depth
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/491—Details of non-pulse systems
- G01S7/4912—Receivers
- G01S7/4913—Circuits for detection, sampling, integration or read-out
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/491—Details of non-pulse systems
- G01S7/4912—Receivers
- G01S7/4915—Time delay measurement, e.g. operational details for pixel components; Phase measurement
Definitions
- Time-of-flight (ToF) depth sensors have become the technology of choice in diverse applications, from automotive and aviation to robotics, gaming and consumer electronics. These sensors come in two general flavors: LIDAR-based systems that rely on extremely brief pulses of light to sense depth, and continuous-wave (CW) systems that emit a modulated light signal over a much longer duration.
- LIDAR-based systems can acquire centimeter-accurate depth maps up to a kilometer away in broad daylight, but have low measurement rates. Additionally, their cost per pixel is orders of magnitude higher than that of CW systems, whose range, outdoor operation and robustness are extremely limited.
- Continuous-wave time-of-flight (CW-ToF) sensors continue to dominate the consumer electronics and low-end robotics space despite their shortcomings. Further, consumer-grade time-of-flight depth cameras like Kinect and PMD are cheap, compact and produce video-rate depth maps in short-range applications.
- the present invention significantly reduces the shortcomings of CW-ToF through the use of energy-efficient epipolar imaging.
- a continuously-modulated sheet of laser light is projected along a sequence of carefully chosen epipolar planes that collectively span the field of view. For each projected sheet, only a strip of CW-ToF pixels corresponding to each epipolar plane is exposed.
- FIG. 2 a prototype implementation of the invention couples a specially built projection system to a CW-ToF sensor that has a controllable region of interest.
- an off-the-shelf CW-ToF sensor may be used.
- the off-the-shelf sensor may output live 320 x 240 3D video at 7.5 frames per second, with the frame rate limited only by the sensor's API.
- Epipolar imaging was first proposed for acquiring live direct-only or global-only video with a conventional (non-ToF) video sensor.
- the approach has been extended to the ToF domain, but its energy efficiency is very low and it involves capturing more than 500 images to calculate a single "direct-only" ToF image.
- In triangulation-based 3D imaging, significant improvements in energy efficiency and robustness can be achieved with a 2D scanning-laser projector and a rolling shutter camera.
- the present invention extends this idea to the ToF domain. As such, it inherits all the advantages of non-ToF energy-efficient epipolar imaging while also addressing challenges that are specific to CW-ToF.
- the primary difficulty is that the range of CW-ToF sensors is severely limited by power consumption and eye safety considerations. Although most CW-ToF sensors electronically subtract the DC component of incident light, photon noise from strong ambient sources such as sunlight can easily overwhelm the CW-ToF signal at distances of more than a few meters outdoors at typical frame rates. By concentrating the energy of the light source into a single sheet, epipolar ToF boosts this range to 10m and acquires a useful, albeit noisier, depth signal at over 15m outdoors.
- A secondary difficulty is that the depth accuracy of CW-ToF sensors is strongly affected by global illumination effects, such as inter-reflections and other forms of global light transport. These effects produce longer light paths and show up as a source of structured additive noise. These effects cannot be cancelled a posteriori without imposing strong assumptions on the scene's geometry and reflectance properties, yet they are extremely common indoors (e.g., corners between walls, shiny surfaces of tables and floors, mirrors, etc.).
- the present invention demonstrates significant robustness to all forms of global transport, and to specular inter-reflections in particular, a form of global illumination transport that has never been possible to handle in live CW-ToF.
- CW-ToF sensors must acquire two or more frames with a different phase of emitted light to compute a single depth map. This makes them highly sensitive to camera shake: unlike conventional cameras, where shaking merely blurs the image, camera shake in CW-ToF causes the static-scene assumption to be violated, leading to depth maps that are both blurry and corrupted by motion artifacts.
- Epipolar ToF makes it possible to address both problems: motion blur is minimized by relying on very short exposures for each epipolar plane; motion artifacts and depth errors are minimized by acquiring multiple phase measurements per epipolar plane rather than per frame; and rolling-shutter-like distortions due to the sequential nature of epipolar-plane ToF are reduced by scheduling the sequence of epipolar planes so that post-acquisition distortion correction becomes easier.
- FIGS. 1(a) - 1(d) show a comparison of various scenes scanned using a regular (ToF) system versus the epipolar ToF system of the present invention.
- FIG. 2 is a schematic view of a system for performing epipolar ToF imaging.
- FIGS. 3(a) - 3(f) show several possible epipolar plane sampling schemes and row exposures in ToF imaging.
- FIG. 4(a) is a depiction of the prototype of the present application.
- FIG. 4(b) depicts the laser light source components.
- FIG. 5 shows timing diagrams for camera exposure, readout and mirror position for a particular sequence of the rows.
- FIGS. 6(a) - 6(d) show the results of imaging a white planar target at a range of distances from the sensor in cloudy weather and bright sunshine.
- FIGS. 7(a) - 7(b) are graphs showing (a) the standard deviation in depth measurements obtained using regular and epipolar ToF imaging; and (b) the working range of the same simulated camera at different levels of acceptable range accuracy.
- FIG. 8 shows that epipolar ToF imaging provides accurate depth returns from the surface of the light bulbs even when they are turned on.
- FIG. 9 compares depth maps with epipolar and regular ToF imaging in the presence of global light transport.
- The term "microcontroller" may mean a dedicated hardware device, circuitry, an ASIC, an FPGA, a microprocessor running software, or any other means known in the art. It is further understood that the microcontroller will include connections to both the sensor and the laser light projector for sending control signals and for receiving data. The invention is not intended to be limited to one method of implementing the functions of the controller.
- h_x(t) represents a pixel's transient response to the active light source and A_x is the light received due to ambient light and the DC component of the active light source.
- A_x drops out of the integral. In practice, the correlation measurement is obtained by integrating the incoming light into two different storage sites (called taps), depending on whether the demodulation signal is positive or negative, and then taking the difference between the stored values, so the ambient light still adds to the measurement shot noise.
- The pixel depth z(x) can be computed from the path length l(x) using the geometric calibration parameters of the light source and sensor.
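- As an illustration of this measurement model, the following Python sketch (not part of the patent; the symbol names, the square-wave demodulation and the sinusoidal return are assumptions) numerically integrates the two taps, shows that the ambient term cancels in the tap difference, and recovers path length and depth from two phase-shifted measurements.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def two_tap_measurement(path_len_m, freq_hz, alpha=1.0, ambient=5.0,
                        theta=0.0, n_samples=4096):
    """One differential CW-ToF correlation measurement (illustrative).

    The returned light is ambient + 0.5*alpha*(1 + cos(w*t - phi)), where
    phi encodes the travel time over the full path.  Light is accumulated
    into tap0 while a reference square wave shifted by theta is positive
    and into tap1 while it is negative; the tap difference cancels the
    ambient/DC level (though its shot noise remains) and is proportional
    to cos(phi - theta).
    """
    w = 2.0 * np.pi * freq_hz
    phi = (w * path_len_m / C) % (2.0 * np.pi)
    t = np.linspace(0.0, 1.0 / freq_hz, n_samples, endpoint=False)
    incoming = ambient + 0.5 * alpha * (1.0 + np.cos(w * t - phi))
    reference = np.cos(w * t - theta) >= 0.0      # demodulation signal
    tap0 = incoming[reference].sum() / n_samples
    tap1 = incoming[~reference].sum() / n_samples
    return tap0 - tap1

freq = 15e6                     # illustrative modulation frequency
true_path = 8.0                 # source -> scene -> sensor, meters
m0 = two_tap_measurement(true_path, freq, theta=0.0)
m90 = two_tap_measurement(true_path, freq, theta=np.pi / 2)
phase = np.arctan2(m90, m0) % (2.0 * np.pi)
path = C * phase / (2.0 * np.pi * freq)   # recovered path length (~8 m)
depth = path / 2.0                        # ~4 m for a co-located source/sensor
```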
- FIG. 2 is a schematic view of a system for performing epipolar ToF imaging.
- a projector that generates a steerable sheet of modulated laser light is combined with a ToF sensor whose rows can be exposed one at a time.
- the projector and sensor are placed in a rectified stereo configuration so that the light sheet always lies in an epipolar plane between the projector and the camera. At any given instant, only the row of camera pixels in the epipolar plane are exposed to light.
- A line laser source with a 1D scanning mirror that projects a steerable light sheet out onto the scene is used, as shown in FIG. 4(b).
- No current CW-ToF sensor provides controllable exposure coding across the 2D pixel array. Taking into account available off-the-shelf hardware, there are three ways to restrict exposure to pixels on an epipolar plane:
- The third option, using a sensor with a controllable region of interest, is chosen because it is more light-efficient than using a DMD (digital micro-mirror device) mask, and it leads to a simpler design.
- the ROI is set to one row tall to match the requirements of epipolar ToF.
- CW-ToF requires at least two images to recover depth.
- the active epipolar plane must be swept across the field-of- view. This offers flexibility to choose the order by which epipolar planes are sampled.
- FIG. 3 illustrates several such ordering schemes.
- FIG. 3(a) shows conventional prior art ToF, wherein all epipolar planes are illuminated simultaneously and all camera rows are exposed at the same time. This requires long exposures and leads to severe artifacts due to motion, ambient light, global light transport and interference between devices.
- FIG. 3(b) shows that sending a very brief, high-intensity pulse of light for CW-ToF confers resistance to ambient light, but it is still prone to artifacts due to global light transport and motion.
- FIG. 3(c) shows an ordering of the epipolar ToF planes that produces an effect similar to a rolling-shutter camera, where one complete image is acquired for each modulation phase. This results in robustness to ambient light, global illumination and motion blur. Sensitivity to motion remains, however, because of the significant delay between the multiple phase measurements acquired for each row. This scheme is undesirable because, if the scene or camera moves while acquiring these images, the recovered depth map will contain hard-to-correct errors.
- Another embodiment in FIG. 3(d) shows that interleaving measurements plane by plane minimizes such artifacts.
- the ordering strategy shown in FIG 3(d) loops through the set of modulation phases one epipolar plane at a time. Because the exposure time of each row is very short, all phases required for a single row can be acquired quickly enough to minimize depth and motion blur artifacts due to camera/scene motion.
- Each row is captured at a slightly different time. Although this induces a rolling-shutter-like effect in the acquired depth map, the individual depth values will be blur- and artifact-free and can be combined into a consistent model by post-processing.
- epipolar planes are ordered in a sawtooth pattern, as shown in FIG. 3(e).
- the entire field of view is scanned twice within the same total exposure time, yielding a higher temporal sampling of the scene and making consistent merging of individual depth map rows easier.
- This essentially provides full field-of-view depth maps at twice the frame rate but half the vertical resolution, making depth correction easier for fast camera shake and/or scene motions.
- FIG. 3(f) shows that, for certain applications, scanning different portions of the field of view with different temporal sampling rates can be beneficial.
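- The orderings of FIGS. 3(c)-3(e) can be stated compactly in code. The sketch below is illustrative only: the row count and phase set are assumptions, and the FIG. 3(e) sawtooth is rendered here as an even-rows-down / odd-rows-up pass.

```python
def rolling_shutter_order(num_rows, phases):
    """FIG. 3(c)-style: one full sweep over all rows per modulation phase."""
    return [(row, ph) for ph in phases for row in range(num_rows)]

def interleaved_order(num_rows, phases):
    """FIG. 3(d)-style: all phases captured back-to-back for each row,
    so a row's phase measurements are close together in time."""
    return [(row, ph) for row in range(num_rows) for ph in phases]

def sawtooth_order(num_rows, phases):
    """FIG. 3(e)-style: even rows top-to-bottom, then odd rows bottom-to-top,
    covering the field of view twice per frame at half vertical resolution."""
    down = [(r, ph) for r in range(0, num_rows, 2) for ph in phases]
    up = [(r, ph) for r in range(num_rows - 1, 0, -2) for ph in phases]
    return down + up

schedule = sawtooth_order(num_rows=240, phases=(0, 90, 180, 270))
```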
- In operation, the projector generates a sheet of modulated laser light and sequentially illuminates epipolar planes defined between the laser projector and the sensor.
- the planes may be illuminated in any order, but, in a preferred embodiment, are illuminated from top-to-bottom and then bottom-to-top. The actual order in which the planes are illuminated may be dependent upon the particular environment in which the platform is being used or the application for which the depth map is being created. Also, any number of planes may be defined within the field-of-view, limited only by the capabilities of the laser and the sensor, and the desired frame rate. In a preferred embodiment, there are 240 planes defined in the field-of-view, with each plane being 320 x 240 pixels.
- the region of interest of the sensor can be set to any portion of the field- of-view and, in operation, a microcontroller synchronizes the laser projector and the sensor such that the ROI of the sensor is set to sense a row of pixels within the currently illuminated epipolar plane.
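- A minimal sketch of that synchronization loop is shown below. The device interface (set_angle, set_roi, set_modulation_phase, trigger_capture, read_row) is hypothetical, standing in for whatever the microcontroller, galvomirror driver and sensor actually expose.

```python
def scan_frame(sensor, galvo, row_schedule, angle_for_row,
               phases=(0, 90, 180, 270)):
    """Sweep the modulated light sheet through the scheduled epipolar planes.

    For each plane, the galvomirror is steered so the sheet falls on exactly
    one camera row, the sensor ROI is restricted to that row, and one
    correlation measurement is captured per modulation phase before moving on.
    """
    frame = {}
    for row in row_schedule:                  # e.g. top-to-bottom, then back up
        galvo.set_angle(angle_for_row(row))   # illuminate this epipolar plane
        sensor.set_roi(start_row=row, num_rows=1)
        frame[row] = []
        for phase in phases:
            sensor.set_modulation_phase(phase)
            sensor.trigger_capture()
            frame[row].append(sensor.read_row())
    return frame
```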
- Phase is estimated using two images. In general, the sensor uses 4 measurements, correlating the incoming signal with input signals shifted by 0, 90, 180 and 270 degrees. Either 2 or 4 of these images can be used for phase estimation; using 4 images gives more accuracy but takes longer to capture and reduces the frame rate. If phase unwrapping is necessary, the phase estimation process must be performed at different modulation frequencies, and, as such, 4 images instead of 2 will be required for phase unwrapping.
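- For concreteness, the sketch below shows the estimators this describes: phase from either 2 or 4 differential correlation measurements, and a brute-force dual-frequency unwrapping step. The formulas are the usual 4-bucket CW-ToF relations, assumed here rather than quoted from the patent.

```python
import numpy as np

C = 3e8  # m/s

def phase_from_2(m0, m90):
    """Phase from measurements at 0 and 90 degrees (faster, less accurate)."""
    return np.arctan2(m90, m0) % (2 * np.pi)

def phase_from_4(m0, m90, m180, m270):
    """Phase from all four measurements; differencing opposite phases
    cancels fixed offsets between the taps."""
    return np.arctan2(m90 - m270, m0 - m180) % (2 * np.pi)

def unwrap_two_freq(phi1, f1, phi2, f2, max_depth_m=30.0):
    """Choose the depth whose predicted wrapped phases match both frequencies."""
    best_d, best_err = None, np.inf
    ambiguity1 = C / (2 * f1)                  # unambiguous range at f1
    for k in range(int(max_depth_m // ambiguity1) + 1):
        d = C * (phi1 + 2 * np.pi * k) / (4 * np.pi * f1)
        phi2_pred = (4 * np.pi * f2 * d / C) % (2 * np.pi)
        err = abs(np.angle(np.exp(1j * (phi2_pred - phi2))))
        if err < best_err:
            best_d, best_err = d, err
    return best_d

# e.g. a target at 17.3 m observed at 15 MHz and 24 MHz:
d = 17.3
phi_15 = (4 * np.pi * 15e6 * d / C) % (2 * np.pi)
phi_24 = (4 * np.pi * 24e6 * d / C) % (2 * np.pi)
print(unwrap_two_freq(phi_15, 15e6, phi_24, 24e6))   # ~17.3
```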
- An inertial measurement unit (IMU) may be attached to the sensor and is used to compensate for motion of the platform.
- A prototype device for epipolar ToF imaging, shown in FIG. 4(a), was constructed using a galvomirror-based light sheet projector for illumination and a ToF sensor with an adjustable region of interest for imaging.
- FIG. 4 is a depiction of the prototype.
- a DME660 camera with fast ROI control to capture arbitrary rows of pixels, and a custom-built, steerable light sheet projector as the light source were used in the prototype. It should be realized by one of skill in the art that the prototype described herein is only an exemplar of one particular embodiment of the invention, and that other embodiments utilizing different equipment and operational parameters fall within the scope of the invention.
- the ToF sensor used is the EPC660 (from Espros Photonics) which has a resolution of 320x240 and the pixels implement ambient saturation prevention.
- the sensor is fitted with an 8mm F1.6 low-distortion lens and an optical bandpass filter (650nm center wavelength, 20nm bandwidth).
- the sensor allows the ROI to be changed with every sensor readout and this feature is used to select different rows to image.
- the sensor development kit (DME660) from the manufacturer is utilized. It should be realized that the invention is not limited to the use of the described ToF sensor, but that any ToF sensor might be used.
- the line projector utilized for the prototype uses a 638nm laser diode with a peak power of 700mW as its light source.
- Light from the diode is collimated and passed through a Powell lens that stretches the beam cross-section into a diverging, almost uniformly illuminated straight line with a 45 degree fanout angle.
- the laser light is directed at a 1D scanning galvomirror that can be rotated to deflect the sheet.
- the rotational range of the mirror gives the projector a 40 degree vertical field of view.
- the projector's effective center of projection moves as the mirror rotates, but because the distance between the fanout point and the galvomirror is very small compared to depths in the scene, this effect can be ignored.
- a microcontroller is used to synchronize the sensor and light source.
- the microcontroller may communicate with the sensor over an I2C bus to set the exposure time, modulation frequency/phase, region of interest and row, and to trigger each capture.
- the microcontroller may also actuate the projector's galvomirror.
- the microcontroller can read the camera's rotational velocity using a MEMS inertial measurement unit (IMU) that is attached to the sensor.
- a frequency generator circuit allows the selection of a modulation frequency (between 11 MHz and 24 MHz in steps of 1 MHz).
- the projector and camera are aligned side-by-side in a rectified stereo configuration, as required for epipolar imaging.
- the projected light sheet illuminates a single row of pixels in the camera, and this row is independent of depth.
- a mirror calibration is performed to determine the mapping between the galvomirror angle and the illuminated camera row.
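- One simple way to perform that calibration, sketched below, is to step the mirror through a set of command angles, find the camera row that lights up at each angle, and fit a low-order polynomial mapping row index to mirror angle. The polynomial form and the brightest-row detector are assumptions for illustration, not the patent's procedure.

```python
import numpy as np

def brightest_row(image):
    """Index of the row with the largest summed intensity (the lit row)."""
    return int(np.argmax(image.sum(axis=1)))

def fit_row_to_angle(command_angles_deg, lit_rows, degree=3):
    """Fit the mapping from camera row index to galvomirror angle."""
    return np.poly1d(np.polyfit(lit_rows, command_angles_deg, degree))

# usage sketch: capture one image per sampled mirror angle, then
#   angle_for_row = fit_row_to_angle(angles, [brightest_row(im) for im in imgs])
# and use angle_for_row(row) when scheduling epipolar planes.
```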
- the measurements read out from the sensor, as observed, do not match their expected values. There are a number of reasons for this discrepancy, including fixed pattern noise, unequal sensitivity and crosstalk between taps and variations in the phase of the actual exposure modulation function at each pixel.
- The relation between the expected sensor measurements and the observed measurements is modelled using a projective correction at each pixel.
- The path length l_k(x) at a pixel can be computed and, from it, the expected phase 2π f l_k(x)/c (modulo 2π).
- The correction that best explains the sensor measurements can be computed by finding the per-pixel correction that minimizes the least-squares error between the corrected measurements and the expected phase.
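- The sketch below shows one way such a per-pixel least-squares correction could be set up; it fits a 2x2 linear correction that rotates/scales the observed (cos-like, sin-like) measurement pair toward the expected phase direction. This is an illustrative simplification, not necessarily the projective form used in the patent.

```python
import numpy as np

def fit_pixel_correction(observed, expected_phase):
    """Least-squares fit of a 2x2 correction matrix for one pixel.

    observed:       (K, 2) measured (cos-like, sin-like) tap differences
                    for K calibration captures at known distances.
    expected_phase: (K,) phases predicted from those distances and the
                    geometric calibration.
    The target vectors point along the expected phase, with the measured
    amplitude retained, so the fit only has to explain the phase error.
    """
    obs = np.asarray(observed, dtype=float)
    amp = np.linalg.norm(obs, axis=1)
    target = amp[:, None] * np.stack(
        [np.cos(expected_phase), np.sin(expected_phase)], axis=1)
    sol, *_ = np.linalg.lstsq(obs, target, rcond=None)   # obs @ sol ≈ target
    return sol.T                                          # so M @ m ≈ target

def corrected_phase(M, m):
    """Apply a fitted correction and return the corrected phase."""
    v = M @ np.asarray(m, dtype=float)
    return np.arctan2(v[1], v[0]) % (2 * np.pi)
```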
- The time needed to image a row (and, by extension, the frame rate) with the prototype is a function of n, the number of readouts per row; the exposure time t_exp; the readout time for a row, t_read; and t_mirror, the time taken by the galvomirror to move to the next row position in the sampling sequence.
- FIG. 5 shows timing diagrams for camera exposure, readout and mirror position for a particular sequence of the rows.
- The scanning mirror is moved to the new active row and takes time t_mirror to settle into position.
- Once the previous row readout is complete (which takes time t_read) and the mirror is in position, the camera is triggered.
- t_mirror > t_read, so the speed of the mirror is a bottleneck for the capture rate.
- Each exposure lasts for time t_exp and, at the end of each exposure, the row is read out.
- FIG. 5 shows a timing example.
- t_row is 175 μs and t_exp is set to 100 μs.
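- Rolled up over all rows, the same quantities give a rough frame-time estimate. The serialization assumed below (mirror settle or previous readout, whichever is longer, followed by n exposure-plus-readout cycles per row) is a simplification of the timing in FIG. 5, and the example numbers are placeholders rather than the prototype's exact values.

```python
def frame_time_s(num_rows, n_readouts, t_exp, t_read, t_mirror):
    """Rough frame time for row-sequential epipolar ToF (simplified model)."""
    t_row = max(t_mirror, t_read) + n_readouts * (t_exp + t_read)
    return num_rows * t_row

# placeholder numbers, for illustration only:
fps = 1.0 / frame_time_s(num_rows=240, n_readouts=4,
                         t_exp=100e-6, t_read=50e-6, t_mirror=200e-6)
```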
- Embodiments of the present invention need data from only one row of the sensor per readout, but the smallest region of interest the EPC660 sensor supports is 4 rows tall, so reading 4 rows is forced when, in actuality, only one row is used.
- the development kit limits the sensor data bus to 20 MHz, but the sensor itself supports bus rates up to 80 MHz.
- the minimum value of t exp depends on the peak power of the light source and desired range.
- the described prototype of the present invention has a source with a peak power of 700 mW, while most other experimental time-of-flight systems have a peak light source power in the 3 W to 10 W range. With a brighter light source, a shorter exposure time could be used without loss of range.
- The low-cost galvomirror could be replaced with a faster 1D MEMS mirror. With these improvements, a system based on the described prototype would operate at video frame rates.
- the sensor used in the described prototype supports a maximum modulation frequency of only 24 MHz, whereas most other time-of-flight sensors can run in the 50 MHz to 100 MHz range. This limits the ability of the prototype to accurately scan smaller objects or to be used for transient imaging.
- The EPC660 datasheet specifies that the sensor ADC returns 12-bit values, but the version of the sensor that was used returns only 10 bits, which affects range and makes the output depth maps noisier.
- To emulate regular ToF imaging for comparison, the entire sensor is exposed at once instead of using a small ROI, and the sensor is left exposed until the sheet projector has finished a sweep across the field of view.
- Alternatively, the sheet projector can be replaced with a diffuse source.
- FIG. 7(a) shows the results of a simulation of the standard deviation in depth measurements obtained using regular and epipolar ToF imaging (15 MHz modulation frequency) for a target 10m from the camera as a function of ambient light level.
- The peak light source power is 2W and the total exposure time is the same (7.2 ms per image), but epipolar ToF is more robust to ambient light because it concentrates light source power and uses a short exposure for each row (30 μs).
- FIG. 7(b) shows the working range of the same simulated camera at different levels of acceptable range accuracy. Note that the simulated camera's parameters differ from the prototype.
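- The advantage quantified in FIG. 7 follows from a simple shot-noise argument: epipolar ToF puts the whole source power into one of R rows and exposes that row for only 1/R of the frame budget, so the source signal per row is unchanged while far less ambient light is integrated. The sketch below makes that argument numerically; the photon rates and the noise model are illustrative assumptions, not the simulation behind FIG. 7.

```python
import numpy as np

def tof_snr(source_rate, ambient_rate, exposure_s):
    """Shot-noise-limited SNR of a differential CW-ToF measurement.

    The ambient/DC level cancels in the tap difference, but its shot noise
    does not, so the noise grows with *all* collected photons.
    """
    signal = source_rate * exposure_s
    noise = np.sqrt((source_rate + ambient_rate) * exposure_s)
    return signal / noise

R = 240                    # rows (epipolar planes) in the field of view
src, amb = 1e6, 5e7        # photons/s per pixel, full-field case (illustrative)
t_total = 7.2e-3           # exposure budget per correlation image

snr_regular = tof_snr(src, amb, t_total)            # full-field exposure
snr_epipolar = tof_snr(src * R, amb, t_total / R)   # concentrated, short row exposure
print(snr_regular, snr_epipolar)                    # epipolar is markedly higher
```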
- FIG. 6 quantitatively compares the described prototype in regular ToF and epipolar ToF modes in cloudy and sunny conditions.
- Regular ToF mode fails in bright sunlight, while epipolar ToF is considerably more robust.
- FIG. 1(a) shows live 3D CW-ToF imaging in sunlight (70 klx) with 15m range (people walking on stairs; phase wraps around the distant building).
- FIG. 8 shows an example scene with both strong ambient light and global illumination effects. Reflections from the table surface cause errors with regular ToF, but these are suppressed with epipolar imaging.
- FIG. 6(d) shows standard deviation in depth measurements versus distance to target (slower rising curves are better).
- the prototype of the present invention has depth error of around 3% at 10m in bright sunlight.
- FIG. 9 demonstrates the ability of epipolar imaging to suppress the effects of global illumination in a few common indoor environments. These results are generated using a single modulation frequency (24 MHz). At the corner of the room, diffuse inter-reflections between the walls and ceiling cause depths to be overestimated with regular ToF imaging.
- the ghosting on the wall due to reflections from the mirror is suppressed by epipolar imaging.
- The water fountain is particularly challenging because the direct return from its metallic surface is very weak, but the surface reflects a lot of indirect light back to the sensor.
- With epipolar imaging, 3 exposures are combined to try to recover a usable direct signal. Longer exposures do not help regular imaging because the inter-reflections cause the sensor to saturate.
- With epipolar imaging, each pair of cameras has only a sparse set of points at which they interfere with each other.
- When a set of epipolar ToF cameras is running at different modulation frequencies, the contribution of each camera to shot noise in the other cameras is greatly reduced.
- FIG. 1(c) shows the result of operating two CW-ToF cameras simultaneously at the same frequency with regular and epipolar imaging.
- Epipolar imaging shows the lack of interference between ToF devices operating at the same frequency. There are observable errors (on the wall and chair, for example) with regular ToF.
- With regular ToF imaging, each captured ToF measurement has motion blur and strong artefacts at depth discontinuities because the measurements are not aligned to each other. In theory, these could be corrected using a spatially varying deconvolution, but this is computationally expensive and does a poor job of recovering high-frequency components.
- With epipolar ToF imaging, motion blur has essentially no effect and a depth map with a rolling-shutter-like effect is acquired. This can be corrected with a simple image warp computed from the rotation. FIG. 1(d) shows an example from a rapidly panning camera, demonstrating that non-distorted depth maps can be obtained even in the presence of severe camera shake during scene exposure (hard-to-remove ghosting errors can be observed in regular ToF).
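- A minimal sketch of that rotation-based correction is shown below, assuming a constant pan rate from the IMU, a known capture time per row, and a simple horizontal pixel shift per row; a real implementation would use the full angular velocity, the camera intrinsics and sub-pixel resampling.

```python
import numpy as np

def unroll_depth_map(depth_rows, row_times_s, pan_rate_rad_s, focal_px):
    """Undo the rolling-shutter-like skew of a row-sequential depth map.

    Under a constant pan, the view has rotated by pan_rate * (t - t0) by the
    time row i is captured, which appears as a horizontal shift of roughly
    focal_px * pan_rate * (t - t0) pixels.  Each row is shifted back by that
    amount (integer pixels here; np.roll wraps at the borders, which is
    acceptable only for a sketch).
    """
    out = np.empty_like(depth_rows)
    t0 = row_times_s[0]
    for i, (row, t) in enumerate(zip(depth_rows, row_times_s)):
        shift = int(round(focal_px * pan_rate_rad_s * (t - t0)))
        out[i] = np.roll(row, -shift)
    return out
```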
- As noted above, the sensor may be equipped with an IMU, which is used to compensate for motion of the platform.
- Epipolar imaging for time-of-flight depth cameras mitigates many of the problems commonly encountered with depth cameras, such as poor performance in brightly lit conditions, systematic errors due to global illumination, inter-device interference and errors due to camera motion. Compared to depth cameras, systems like scanning LIDAR that illuminate and image a single point at a time are very robust to all these effects but have a low measurement rate. Epipolar imaging can be thought of as a compromise between these two extremes of full-field capture and point-by-point capture. Because epipolar imaging illuminates and captures a single line at a time, it allows a depth camera to have most of the robustness of point scanning while still having a high measurement rate.
- the scanning mirror follows a sawtooth pattern and captures rows in an orderly sequence.
- pseudo random row sampling strategies could be implemented that might allow epipolar imaging to be used in conjunction with compressed sensing or similar techniques to recover temporally super-resolved depth maps of fast moving scenes.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CA3051102A CA3051102A1 (fr) | 2017-01-20 | 2018-01-19 | Procede d'imagerie de temps de vol epipolaire |
| JP2019539163A JP7244013B2 (ja) | 2017-01-20 | 2018-01-19 | エピポーラ飛行時間撮像のための方法 |
| EP18742111.0A EP3571467A4 (fr) | 2017-01-20 | 2018-01-19 | Procédé d'imagerie de temps de vol épipolaire |
| US16/468,617 US11425357B2 (en) | 2015-02-13 | 2018-01-19 | Method for epipolar time of flight imaging |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762499193P | 2017-01-20 | 2017-01-20 | |
| US62/499,193 | 2017-01-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018136709A1 (fr) | 2018-07-26 |
Family
ID=62909043
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2018/014369 Ceased WO2018136709A1 (fr) | 2015-02-13 | 2018-01-19 | Procédé d'imagerie de temps de vol épipolaire |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP3571467A4 (fr) |
| JP (1) | JP7244013B2 (fr) |
| CA (1) | CA3051102A1 (fr) |
| WO (1) | WO2018136709A1 (fr) |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3620821A1 (fr) * | 2018-09-05 | 2020-03-11 | Infineon Technologies AG | Caméra de mesure du temps de vol et procédé d'étalonnage d'une caméra de mesure du temps de vol |
| CN111077538A (zh) * | 2019-12-29 | 2020-04-28 | 中国科学院西安光学精密机械研究所 | 一种海洋复杂环境动态高精度光学联合成像方法及系统 |
| CN111751807A (zh) * | 2019-03-27 | 2020-10-09 | 先进科技新加坡有限公司 | 用于校准或测试成像装置的设备和方法 |
| WO2021061216A1 (fr) * | 2019-09-23 | 2021-04-01 | Microsoft Technology Licensing, Llc | Partage de fréquence en modes multiples pour caméra à temps de vol |
| WO2022016314A1 (fr) * | 2020-07-20 | 2022-01-27 | 无锡盈达聚力科技有限公司 | Système de balayage et procédé de commande de lumière de visée |
| JP2022530349A (ja) * | 2019-04-17 | 2022-06-29 | カーネギー メロン ユニバーシティ | 三角測量ライトカーテンを使用したアジャイルな深度感知 |
| US11747135B2 (en) | 2015-02-13 | 2023-09-05 | Carnegie Mellon University | Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor |
| US11972586B2 (en) | 2015-02-13 | 2024-04-30 | Carnegie Mellon University | Agile depth sensing using triangulation light curtains |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021177045A1 (fr) * | 2020-03-04 | 2021-09-10 | ソニーグループ株式会社 | Dispositif de traitement de signal, procédé de traitement de signal et module de télémétrie |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150281671A1 (en) * | 2014-03-26 | 2015-10-01 | Alces Technology, Inc. | Compact 3D depth capture systems |
| WO2016131036A1 (fr) * | 2015-02-13 | 2016-08-18 | Carnegie Mellon University | Système d'imagerie avec commande dynamique synchronisée de source lumineuse à faisceau orientable et photo-capteur masqué de façon reconfigurable |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002039716A (ja) * | 2000-07-25 | 2002-02-06 | Olympus Optical Co Ltd | 距離画像入力装置 |
| EP1971820B1 (fr) * | 2006-01-08 | 2012-10-17 | Hermann Tropf | Etablissement d'une image de distance |
| JP2012168049A (ja) * | 2011-02-15 | 2012-09-06 | Stanley Electric Co Ltd | 距離画像生成装置および距離画像生成方法 |
| JP5858688B2 (ja) * | 2011-08-30 | 2016-02-10 | スタンレー電気株式会社 | 距離画像生成装置 |
| JP6309459B2 (ja) * | 2012-02-15 | 2018-04-11 | ヘプタゴン・マイクロ・オプティクス・ピーティーイー・エルティーディーHeptagon Micro Optics Pte.Ltd. | ストライプ照明の飛行時間型カメラ |
| US9476695B2 (en) * | 2013-07-03 | 2016-10-25 | Faro Technologies, Inc. | Laser tracker that cooperates with a remote camera bar and coordinate measurement device |
| JP6693880B2 (ja) * | 2014-01-29 | 2020-05-13 | エルジー イノテック カンパニー リミテッド | 深さ情報抽出装置および方法 |
-
2018
- 2018-01-19 WO PCT/US2018/014369 patent/WO2018136709A1/fr not_active Ceased
- 2018-01-19 CA CA3051102A patent/CA3051102A1/fr active Pending
- 2018-01-19 JP JP2019539163A patent/JP7244013B2/ja active Active
- 2018-01-19 EP EP18742111.0A patent/EP3571467A4/fr active Pending
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150281671A1 (en) * | 2014-03-26 | 2015-10-01 | Alces Technology, Inc. | Compact 3D depth capture systems |
| WO2016131036A1 (fr) * | 2015-02-13 | 2016-08-18 | Carnegie Mellon University | Système d'imagerie avec commande dynamique synchronisée de source lumineuse à faisceau orientable et photo-capteur masqué de façon reconfigurable |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3571467A4 * |
Cited By (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11747135B2 (en) | 2015-02-13 | 2023-09-05 | Carnegie Mellon University | Energy optimized imaging system with synchronized dynamic control of directable beam light source and reconfigurably masked photo-sensor |
| US11972586B2 (en) | 2015-02-13 | 2024-04-30 | Carnegie Mellon University | Agile depth sensing using triangulation light curtains |
| EP3620821A1 (fr) * | 2018-09-05 | 2020-03-11 | Infineon Technologies AG | Caméra de mesure du temps de vol et procédé d'étalonnage d'une caméra de mesure du temps de vol |
| CN111751807A (zh) * | 2019-03-27 | 2020-10-09 | 先进科技新加坡有限公司 | 用于校准或测试成像装置的设备和方法 |
| CN111751807B (zh) * | 2019-03-27 | 2023-05-23 | 先进科技新加坡有限公司 | 用于校准或测试成像装置的设备和方法 |
| JP2022530349A (ja) * | 2019-04-17 | 2022-06-29 | カーネギー メロン ユニバーシティ | 三角測量ライトカーテンを使用したアジャイルな深度感知 |
| JP7403860B2 (ja) | 2019-04-17 | 2023-12-25 | カーネギー メロン ユニバーシティ | 三角測量ライトカーテンを使用したアジャイルな深度感知 |
| WO2021061216A1 (fr) * | 2019-09-23 | 2021-04-01 | Microsoft Technology Licensing, Llc | Partage de fréquence en modes multiples pour caméra à temps de vol |
| US11619723B2 (en) | 2019-09-23 | 2023-04-04 | Microsoft Technology Licensing, Llc | Multiple-mode frequency sharing for time-of-flight camera |
| CN111077538A (zh) * | 2019-12-29 | 2020-04-28 | 中国科学院西安光学精密机械研究所 | 一种海洋复杂环境动态高精度光学联合成像方法及系统 |
| WO2022016314A1 (fr) * | 2020-07-20 | 2022-01-27 | 无锡盈达聚力科技有限公司 | Système de balayage et procédé de commande de lumière de visée |
| US12283078B2 (en) | 2020-07-20 | 2025-04-22 | Wuxi Idata Technoloty Company Ltd. | Scanning system and method for controlling aiming light source |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3571467A4 (fr) | 2020-08-12 |
| CA3051102A1 (fr) | 2018-07-26 |
| JP7244013B2 (ja) | 2023-03-22 |
| JP2020504310A (ja) | 2020-02-06 |
| EP3571467A1 (fr) | 2019-11-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11425357B2 (en) | Method for epipolar time of flight imaging | |
| JP7244013B2 (ja) | エピポーラ飛行時間撮像のための方法 | |
| JP7371443B2 (ja) | 三次元計測装置 | |
| US10764517B2 (en) | Stereo assist with rolling shutters | |
| US11002856B2 (en) | Doppler time-of-flight imaging | |
| US10830579B2 (en) | Three-dimensional triangulational scanner having high dynamic range and fast response | |
| CN106454287B (zh) | 组合摄像系统、移动终端及图像处理方法 | |
| US9578313B2 (en) | Method of enhanced depth image acquisition | |
| US11375165B2 (en) | Image calibration for projected images | |
| JP2023026503A (ja) | 車両の周囲を特徴付けるためのシステム | |
| CN103477644A (zh) | 记录图像和从图像获得3d信息的方法、相机系统 | |
| CN112655022A (zh) | 图像处理装置和图像处理方法 | |
| JP7147729B2 (ja) | 移動量推定装置、移動量推定方法、移動量推定プログラム、及び移動量推定システム | |
| CN101000234A (zh) | 用于三维光学测量的装置和方法 | |
| JP6369897B2 (ja) | 自己位置算出装置及び自己位置算出方法 | |
| Langmann et al. | Real-time image stabilization for ToF cameras on mobile platforms | |
| Maas | Close range photogrammetry sensors | |
| Lickfold et al. | Simultaneous phase and frequency stepping in time-of-flight range imaging | |
| CN115150545B (zh) | 获取三维测量点的测量系统 | |
| WO2024154528A1 (fr) | Dispositif de mesure tridimensionnelle | |
| Langmann | PMD imaging | |
| Chiabrando et al. | New sensors for cultural heritage metric survey: the ToF cameras | |
| JP2010281733A (ja) | 3d光学作像システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18742111; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 3051102; Country of ref document: CA. Ref document number: 2019539163; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2018742111; Country of ref document: EP; Effective date: 20190820 |