US9430700B2 - System and method for optimizing tracker system
- Publication number
- US9430700B2 (application US14/342,526; US201214342526A)
- Authority
- US
- United States
- Prior art keywords
- pose
- optimizing
- camera
- data
- tracker
- Prior art date
- 2012-02-22
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06K9/00362
- G06T7/0042
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- The present invention relates to the field of computation and simulation and covers methods to optimize fiducial marker positions in optical object tracking systems by simulating their visibility.
- In a typical helmet-mounted tracker system, the pilot wears a helmet with patterns (fiducial markers), and at least one tracker camera determines the helmet's position and orientation using geometric calculations based on these patterns.
- Determining the position and orientation of an entity using fiducials is called the pose estimation problem, and it can be stated as follows: given a set of N feature correspondences between three-dimensional (3D) points of an object and the two-dimensional (2D) projections of those points onto the image plane, find the rotation and translation of the object with respect to the reference system of the camera. The objective is to find the rotation and translation between the camera and the 3D object so that the object's 3D location and orientation are known.
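- As a brief illustration of this step, the sketch below solves the pose estimation problem with OpenCV's solvePnP. The solver choice, the fiducial coordinates, and the camera intrinsics are all assumptions for the example; the patent does not prescribe a particular algorithm.

```python
import numpy as np
import cv2

# Hypothetical 3D fiducial positions in the object's reference frame (metres).
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.1, 0.0, 0.0],
                          [0.0, 0.1, 0.0],
                          [0.1, 0.1, 0.0]])

# Their 2D projections on the image plane (pixels), here consistent with a
# camera one metre in front of the object.
image_points = np.array([[320.0, 240.0],
                         [400.0, 240.0],
                         [320.0, 320.0],
                         [400.0, 320.0]])

# Assumed pinhole intrinsics: 800 px focal length, principal point (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# rvec and tvec are the rotation and translation of the object with respect
# to the camera reference system, i.e. the solution of the pose problem.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
```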
- The object's reference system is generally based on the respective pattern of the object under consideration. Since the position and orientation of the tracker camera with respect to the other coordinate systems is known (or can be calculated or measured) in a tracker system, it is also possible to compute the helmet's spatial relation first with the tracker camera's sensor and then with the other coordinate systems.
- In this context, "tracked object" means an object having a tracking pattern (fiducial marker) and being tracked by a tracker system. It may be a helmet, as in a helmet-mounted tracker system, or any other object.
- The patterns used in camera-based tracker systems are either graphical (generally black-and-white) patterns (passive markers) tracked by visible-light cameras, or arrays of light sources such as light-emitting diodes (LEDs) (active markers). These light sources can be chosen in the infrared range of the electromagnetic spectrum with a suitable selection of camera sensor and filter set. Other arrangements are also possible, but the most convenient among them is the one with infrared LEDs, since such systems can work under unfavorable lighting conditions.
- The positions (locations) of these LEDs on the tracked object should be determined carefully to make sure that a small pose error is attained and pose coverage is high. There are some currently used methods to determine and optimize the positions of fiducial markers.
- In one such method, the number of visible fiducials and their relative angle with respect to the optical sensor are used as constraints to determine optimal fiducial positions.
- This method is intended to be used in large areas with fiducial marks and cannot be applied to optimize fiducial locations on a moving tracked object captured by a stationary camera.
- The motion trend of the pilot should also be considered when calculating fiducial visibility.
- Furthermore, the pose estimation parameters used by the pose estimation algorithm, which directly affect the output accuracy of the system, are not considered in the current methods.
- In short, the current methods do not offer an effective way of simulating a tracker system's camera and fiducial positions to optimize the system's pose estimation accuracy altogether.
- Therefore, a new methodology should be introduced which uses further steps to determine the fiducial positions on a tracked object moving in front of a stationary camera.
- An objective of the present invention is to simulate a tracker system's camera and fiducial positions and pose estimation algorithm parameters to optimize the system.
- FIG. 1 is the schematic view of the preferred embodiment system.
- FIG. 2 shows the graph of a mesh of possible fiducial positions on the object.
- FIG. 3 shows the graph of the result of the optimization routine.
- FIG. 4 is the flowchart of the preferred method of the present invention.
- A method for optimizing a tracker system (100) fundamentally comprises the steps described below.
- First, the possible positions of active fiducials on the tracked object are mathematically modeled to convert the problem into a discrete optimization problem.
- To express such a model mathematically, there are various currently known methods; a very basic example is representing each position on the object with a three-dimensional coordinate. It is important to note that these coordinates on the mesh (model) are determined with respect to a common coordinate system and should be relatable to the possible camera locations.
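- As a minimal sketch of such a discrete model (the sphere geometry and all names below are illustrative assumptions, not the patent's data), each candidate marker position can be stored as a mesh node carrying a 3D coordinate and an outward surface normal in a common object coordinate system:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 500

# Hypothetical candidate nodes sampled on a unit sphere standing in for the
# tracked object's surface (e.g. a helmet shell), in object coordinates.
nodes = rng.normal(size=(n_nodes, 3))
nodes /= np.linalg.norm(nodes, axis=1, keepdims=True)

# For a sphere, the outward normal at a node equals its position vector.
normals = nodes.copy()
```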
- Pose data representing possible poses of the tracked object under working conditions should also be introduced. In a preferred configuration of the present invention, these data are acquired using inertial measurement units (IMUs) placed on the real object under real working conditions, and the movements of the object are recorded to be used as the pose data.
- Another option is using an optic-based pose estimation setup with many high-resolution cameras, capable of generating accurate and complete pose data, together with active markers mounted on the tracked object.
- This data acquisition system is different from (and more capable than) the actual one, and hence is expected to acquire the pose of the object under real working conditions more accurately and completely.
- These data are again discrete and represent many possible poses of the object under various motion scenarios. As indicated, these data should be considered in the calculations to effectively simulate real working conditions. As an example, they may represent the head motion of a pilot in an aircraft that uses a head tracking system.
- The cockpit (or room) in which the head tracker will operate is generally known, and camera(s) can virtually be located inside the cockpit's 3D model with respect to the nominal head position of the person of interest (i.e., the pilot). This can be used to generate the pose database that the optimization will utilize. Even though there are mechanical and system-related limits on the camera locations, there is a margin within which the cameras can be placed, and the optimum location will be explored using the outputs of the proposed optimization algorithm.
- An occlusion model is used to estimate the visibility of 3D model points (active markers) given the pose (Davis et al., 2004). It is based on the ray-tracing technique developed in computer graphics.
- The visibility computation is based on the LED's normal with respect to the object coordinate system, the LED's illumination cone angle, and the known pose of the head.
- The angle between the LED's normal and the camera plane normal in the camera coordinate system defines how directly the LED points towards the camera (the LED direction angle).
- The LED's illumination cone angle defines a minimum LED direction angle threshold for the LED to be visible to the camera.
- The LED direction angle can thus be computed for each LED to determine its visibility.
- The same scheme, using the marker's normal with respect to the object coordinate system, the marker's illumination cone angle, and the known pose of the object, can equivalently be applied to any active marker tracking system.
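- A minimal sketch of this per-LED visibility test is given below. The camera convention (optical axis along +z) and all names are assumptions for illustration, and the ray-traced occlusion test mentioned above is omitted; the patent only fixes the inputs: the LED normal, the illumination cone angle, and the known pose.

```python
import numpy as np

def led_visible(led_normal_obj, R_obj_to_cam, cone_half_angle_deg):
    """Return True if the LED faces the camera within its illumination cone.

    led_normal_obj: unit LED normal in the object coordinate system.
    R_obj_to_cam: 3x3 rotation of the known pose (object -> camera frame).
    """
    n_cam = R_obj_to_cam @ led_normal_obj       # LED normal in the camera frame
    to_camera = np.array([0.0, 0.0, -1.0])      # direction back towards the camera
    cos_a = np.clip(n_cam @ to_camera, -1.0, 1.0)
    direction_angle = np.degrees(np.arccos(cos_a))  # the "LED direction angle"
    return direction_angle <= cone_half_angle_deg
```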
- The node with the highest visibility count is selected as an active marker placement node in a step (103).
- Specifically, the node with the highest visibility count that was not previously determined to be a marker node is selected, since it is visible for most of the determined camera position(s) and pose data. This node will represent a marker placement position.
- Nodes that are closer to the selected node than a non-maxima suppression threshold are removed in a step (104) to prevent selection of a very close node in the following iteration.
- The algorithm requires a predetermined percentage (say X) of the poses to have at least a predetermined number (say K) of placed markers visible.
- The total number of markers (selected and placed so far) that are visible for each pose is computed to inspect which poses reach the K-marker limit.
- Placement continues until X percent of the poses have at least K visible markers.
- Once a pose reaches this limit, it should be excluded from the pose database. This ensures that, in the mesh marker visibility analysis, marker location sorting is executed for the poses that still have an insufficient number of visible markers, and prevents unnecessary placement of markers for poses that already have enough visible markers. Therefore the pose(s) having the predetermined number of selected nodes are removed in step (105), and marker placement is stopped if a sufficient percentage of all poses has enough selected nodes (106).
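- The greedy placement loop of steps (103) through (106) can be sketched as follows, assuming a precomputed boolean visibility table (e.g. built with a test like led_visible above for every node, camera, and pose); function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def place_markers(nodes, visibility, nms_radius, K, X_percent):
    """nodes: (N, 3) candidate positions; visibility: (N, P) boolean table,
    visibility[n, p] = node n is visible under pose p. Returns marker nodes."""
    n_nodes, n_poses = visibility.shape
    available = np.ones(n_nodes, dtype=bool)  # nodes not yet suppressed
    per_pose = np.zeros(n_poses, dtype=int)   # placed markers visible per pose
    selected = []
    # Step 106: stop once X percent of poses have at least K visible markers.
    while np.mean(per_pose < K) > 1.0 - X_percent / 100.0:
        needy = per_pose < K                  # poses still under-covered (105)
        counts = visibility[:, needy].sum(axis=1)
        counts[~available] = -1
        best = int(np.argmax(counts))         # step 103: highest visibility count
        if counts[best] <= 0:
            break                             # no usable node left (cf. step 110)
        selected.append(best)
        # Step 104: non-maxima suppression around the chosen node.
        dist = np.linalg.norm(nodes - nodes[best], axis=1)
        available &= dist > nms_radius
        per_pose += visibility[best].astype(int)
    return selected
```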
- The 3D coordinates of the selected nodes are then passed to the optical head tracking algorithm. Markers are projected onto the image plane for each pose, as if they were being viewed from each camera, in step (107). These 2D marker images are passed to the pose estimation algorithm that will be used in the simulated system. In the case of LED markers, they are represented as points on the 2D plane, and in a preferred configuration a certain amount of noise is added to generate a synthetic LED centroid image. The pose estimation algorithm estimates the pose for the given marker images (or LED coordinates and centroid image) independently for each pose.
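- A minimal sketch of the projection in step (107), assuming a pinhole camera model and Gaussian centroid noise (both assumptions; the text only states that the markers are projected and a certain amount of noise is added):

```python
import numpy as np

def project_markers(markers_obj, R, t, K_intr, noise_px=0.5, rng=None):
    """markers_obj: (M, 3) marker coordinates in the object frame.
    R, t: pose of the object in the camera frame; K_intr: 3x3 intrinsics.
    Returns (M, 2) synthetic noisy LED centroids in pixels."""
    rng = rng or np.random.default_rng()
    pts_cam = markers_obj @ R.T + t             # object frame -> camera frame
    pix = pts_cam @ K_intr.T
    pix = pix[:, :2] / pix[:, 2:3]              # perspective divide
    return pix + rng.normal(scale=noise_px, size=pix.shape)
```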
- In step (108), the accuracy and coverage (the percentage of poses that can be estimated by the pose estimation algorithm) for the camera location / marker tree / pose estimation algorithm configuration are tested. The pose error is computed, and the poses that the algorithm is unable to compute are counted as hidden poses. Since the real poses are known when the input mesh and camera location data are first fed to the simulation, they can be used as ground truth data to determine the pose error by comparison with the pose estimation algorithm's results.
- A good marker tree configuration is expected to have a small pose error and a small percentage of hidden poses (the inverse of pose coverage) with a minimal number of markers.
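- The accuracy and coverage bookkeeping of step (108) can be sketched as below. The geodesic rotation angle is an assumed error metric; the patent does not fix how the pose error is measured.

```python
import numpy as np

def pose_error(R_est, t_est, R_gt, t_gt):
    """Rotation error (degrees) and translation error against ground truth."""
    dR = R_est @ R_gt.T
    ang = np.degrees(np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)))
    return ang, float(np.linalg.norm(t_est - t_gt))

def coverage(estimates):
    """estimates: one entry per pose, None where the algorithm failed
    (a hidden pose). Returns (pose coverage in percent, hidden-pose count)."""
    hidden = sum(e is None for e in estimates)
    return 100.0 * (1.0 - hidden / len(estimates)), hidden
```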
- This optimization and simulation environment can be used to find the optimum camera location, the marker tree (marker positions on the tracked object), and the pose estimation algorithm parameters.
- The parameters are systematically varied in a preferred configuration, and three outputs are recorded in step (108): pose error, percent hidden poses, and number of placed markers.
- The whole method can be run with different input data and pose estimation algorithms to compare their results.
- The positions of the markers for each parameter set are output and recorded with their respective marker counts in a step (109). Pose accuracy and coverage results are also recorded in this step.
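- A self-contained sketch of this sweep is shown below; the parameter ranges are invented and the stub evaluation merely stands in for a full simulation run (steps 102 through 108).

```python
import itertools
import random

def evaluate(params):
    """Stub standing in for one full simulation run over all poses."""
    random.seed(hash(params))                 # deterministic dummy outputs
    return {"pose_error": random.uniform(0.03, 0.2),
            "hidden_pct": random.uniform(0.0, 8.0),
            "n_markers": random.randint(10, 30)}

results = []
for nms_radius, K in itertools.product((0.02, 0.04, 0.08), (4, 5, 6)):
    record = evaluate((nms_radius, K))
    record.update(nms_radius=nms_radius, K=K)
    results.append(record)                    # step 109: recorded per parameter set
```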
- The optimization routine can be re-executed to fine-tune the camera location, find the marker tree, and adjust the pose estimation algorithm. If there is a new data set to be compared, the method starts over with the new data from step (101), following the check (110).
- These data include optimization parameter sets, pose estimation parameter sets, camera location sets, or 3D object meshes to be processed.
- In FIG. 3, a sample output of parameter configurations is plotted.
- The X and Y axes are pose error and percent hidden poses, while the number of markers is encoded in the dot size and also written as text on the plot.
- The output of the simulation thus provides systems engineers with choices for selecting configurations according to the constraints on the acceptable active marker number and pose estimation accuracy (FIG. 3).
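- A FIG. 3-style plot can be produced as below; the plotted numbers are dummy values for illustration only, not results from the patent.

```python
import matplotlib.pyplot as plt

pose_err = [0.12, 0.08, 0.05, 0.04]   # pose error per configuration (dummy)
hidden = [6.0, 4.5, 2.0, 1.5]         # percent hidden poses (dummy)
markers = [12, 16, 22, 30]            # marker count, encoded in dot size

plt.scatter(pose_err, hidden, s=[12 * m for m in markers])
for x, y, m in zip(pose_err, hidden, markers):
    plt.annotate(str(m), (x, y))      # marker count also written as text
plt.xlabel("pose error")
plt.ylabel("percent hidden poses")
plt.show()
```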
- A camera location set, 3D mesh data, a marker tree, an optimization parameter set, and a pose estimation algorithm parameter set satisfying at least one constraint are chosen as the optimized pose estimation system parameters in step (111).
- The optimization parameters are basically the non-maxima suppression distance threshold, the predetermined number of selected nodes, and the predetermined percentage of all poses having a predetermined number of selected nodes.
- The mentioned constraint can be one of the following: choosing the parameter set with the minimum pose error, the system with the minimum hidden poses, or the system with the minimum marker number.
- Alternatively, the constraint can be a combination of the above. For example, when there is no constraint on available marker positions, the system with the minimum pose error and percent hidden poses can be selected; when there is no constraint on hidden poses, the system with the minimum number of markers and minimum pose error is selected.
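- Continuing with the results list from the sweep sketch above (an assumed data layout), the constraint-based selection of step (111) reduces to simple queries:

```python
# Single constraints named in the text.
best_accuracy = min(results, key=lambda r: r["pose_error"])
best_coverage = min(results, key=lambda r: r["hidden_pct"])
fewest_markers = min(results, key=lambda r: r["n_markers"])

# A combined constraint, e.g. no limit on marker positions: trade off pose
# error against hidden poses with an (assumed) fixed weighting.
combined = min(results, key=lambda r: r["pose_error"] + 0.01 * r["hidden_pct"])
```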
- In a preferred configuration, the step of removing the pose(s) with enough selected nodes (105) is not executed before a predetermined number of markers has been placed on the initial mesh. This ensures the elimination of at least one pose in step (105).
- The process is routed to step (110) if there is no available empty node to select as an active marker placement node in step (103). This situation means that the current mesh is not appropriate for placing enough markers for the given parameters and configuration. The method continues by receiving a new data set, if available.
- A system for optimizing a tracker system (1) fundamentally comprises a processing unit (2), an input/output device (3), and a memory unit (4).
- The processing unit (2) is configured to receive pose, mesh, and camera position data from the input/output device (3).
- The method (100) is applied on a sequence of pose data (representing possible orientations and locations of the tracked object) and a single set of mesh data (representing the possible LED positions on the tracked object) for each data set.
- Alternatively, the processing unit (2) is configured to receive pose, mesh, and camera position data from the memory unit (4). This way, it is possible to analyze and simulate previously recorded data without the need for new data set acquisition.
- The input/output device (3) is preferably configured to receive at least one pose data of the tracked object under consideration by any known means, such as optical, magnetic, or laser tracking. This pose data is preferably relative to a predetermined reference frame, which is in turn relative to at least one camera position in the actual tracking system under consideration.
- Alternatively, the input/output device (3) is configured to receive at least one pose data of the tracked object using an inertial measurement unit (IMU) mounted on that object, or an optic-based, high-resolution, multi-camera pose estimation data acquisition system capable of generating accurate and complete pose data.
- The camera position data and mesh data are generated by a user through the input/output device (3) using currently known computer-aided modeling tools.
- The input/output device (3) is any interface device known in the art, such as a monitor, keyboard, mouse, camera, or a combination of these.
- The memory unit (4) is any volatile or non-volatile memory device known in the art, such as a RAM (random access memory), ROM (read-only memory), flash memory, or a hard disk. These are used to store input, output, or intermediate data related to the said method (100), temporarily or permanently.
- In conclusion, the method (100), together with the system (1), can effectively simulate a tracker system's camera and fiducial positions and pose estimation algorithm parameters to optimize the system.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2012/050801 WO2013124706A1 (en) | 2012-02-22 | 2012-02-22 | System and method for optimizing tracker system |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140212000A1 (en) | 2014-07-31 |
US9430700B2 (en) | 2016-08-30 |
Family
ID=45894602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/342,526 Expired - Fee Related US9430700B2 (en) | 2012-02-22 | 2012-02-22 | System and method for optimizing tracker system |
Country Status (7)
Country | Link |
---|---|
US (1) | US9430700B2 (en) |
EP (1) | EP2734977B1 (en) |
JP (1) | JP5912191B2 (ja) |
KR (1) | KR101850048B1 (ko) |
ES (1) | ES2547321T3 (es) |
PL (1) | PL2734977T3 (pl) |
WO (1) | WO2013124706A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10963054B2 (en) | 2016-12-15 | 2021-03-30 | Sony Interactive Entertainment Inc. | Information processing system, vibration control method and program |
US10963055B2 (en) | 2016-12-15 | 2021-03-30 | Sony Interactive Entertainment Inc. | Vibration device and control system for presenting corrected vibration data |
US10969867B2 (en) | 2016-12-15 | 2021-04-06 | Sony Interactive Entertainment Inc. | Information processing system, controller device, controller device control method and program |
US10981053B2 (en) | 2017-04-18 | 2021-04-20 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
US11013990B2 (en) | 2017-04-19 | 2021-05-25 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
US11145172B2 (en) | 2017-04-18 | 2021-10-12 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
US11195293B2 (en) | 2017-07-20 | 2021-12-07 | Sony Interactive Entertainment Inc. | Information processing device and positional information obtaining method |
US11458389B2 (en) | 2017-04-26 | 2022-10-04 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TR201401751T1 (tr) | 2013-04-12 | 2017-08-21 | Aselsan Elektronik Sanayi Ve Ticaret As | A system and method for optimizing fiducial marker and camera positions/orientations.
US10110881B2 (en) * | 2014-10-30 | 2018-10-23 | Microsoft Technology Licensing, Llc | Model fitting from raw time-of-flight images |
EP3876198A4 (en) * | 2018-10-30 | 2022-06-29 | Alt Limited Liability Company | Method and system for the inside-out optical tracking of a movable object |
US10909715B1 (en) * | 2019-01-31 | 2021-02-02 | Rockwell Collins, Inc. | High-integrity optical pose estimation using coded features |
KR102634916B1 (ko) * | 2019-08-29 | 2024-02-06 | LG Energy Solution, Ltd. | Method and apparatus for determining a temperature estimation model, and battery management system applying the temperature estimation model
US11275453B1 (en) | 2019-09-30 | 2022-03-15 | Snap Inc. | Smart ring for manipulating virtual objects displayed by a wearable device |
US11423573B2 (en) * | 2020-01-22 | 2022-08-23 | Uatc, Llc | System and methods for calibrating cameras with a fixed focal point |
US11277597B1 (en) | 2020-03-31 | 2022-03-15 | Snap Inc. | Marker-based guided AR experience |
US11798429B1 (en) | 2020-05-04 | 2023-10-24 | Snap Inc. | Virtual tutorials for musical instruments with finger tracking in augmented reality |
US11520399B2 (en) | 2020-05-26 | 2022-12-06 | Snap Inc. | Interactive augmented reality experiences using positional tracking |
KR102298098B1 (ko) * | 2020-05-29 | 2021-09-03 | Ewha University - Industry Collaboration Foundation | Method and apparatus for generating a 3D model through tracking of an RGB-D camera
US11925863B2 (en) | 2020-09-18 | 2024-03-12 | Snap Inc. | Tracking hand gestures for interactive game control in augmented reality |
US20220160291A1 (en) * | 2020-11-23 | 2022-05-26 | Mocxa Health Private Limited | System for recording of seizures |
CN116724285A (zh) | 2020-12-29 | 2023-09-08 | Snap Inc. | Micro hand gestures for controlling virtual and graphical elements
KR20230124077A (ko) | 2020-12-30 | 2023-08-24 | Snap Inc. | Augmented reality precision tracking and display
US11740313B2 (en) * | 2020-12-30 | 2023-08-29 | Snap Inc. | Augmented reality precision tracking and display |
US11531402B1 (en) | 2021-02-25 | 2022-12-20 | Snap Inc. | Bimanual gestures for controlling virtual and graphical elements |
CN117178247A (zh) | 2021-04-19 | 2023-12-05 | Snap Inc. | Hand gestures for animating and controlling virtual and graphical elements
US12198380B2 (en) | 2022-01-11 | 2025-01-14 | Rockwell Collins, Inc. | Vision-based navigation system incorporating high-confidence error overbounding of multiple optical poses |
US12136234B2 (en) | 2022-01-11 | 2024-11-05 | Rockwell Collins, Inc. | Vision-based navigation system incorporating model-based correspondence determination with high-confidence ambiguity identification |
US11995228B2 (en) | 2022-01-11 | 2024-05-28 | Rockwell Collins, Inc. | Head tracking system with extended kalman filter combining absolute and relative navigation |
CN114563203B (zh) * | 2022-03-11 | 2023-08-15 | Taiyuan Research Institute of China Coal Technology & Engineering Group | Method for simulating a low-visibility underground mine environment
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020024675A1 (en) * | 2000-01-28 | 2002-02-28 | Eric Foxlin | Self-referenced tracking |
US20040239756A1 (en) * | 2003-05-30 | 2004-12-02 | Aliaga Daniel G. | Method and apparatus for computing error-bounded position and orientation of panoramic cameras in real-world environments |
US20140368664A1 (en) * | 2012-01-17 | 2014-12-18 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | System and method for measuring tracker system accuracy |
US20160063708A1 (en) * | 2013-04-12 | 2016-03-03 | Aselsan Elektronik Sanayi Ve Ticaret Anonim Sirketi | A system and method for optimizing fiducial marker and camera positions/orientations |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5594944B2 (ja) * | 2008-08-05 | 2014-09-24 | Shimadzu Corporation | Motion tracker device |
2012
- 2012-02-22 JP JP2014558220A patent/JP5912191B2/ja not_active Expired - Fee Related
- 2012-02-22 EP EP12711001.3A patent/EP2734977B1/en active Active
- 2012-02-22 PL PL12711001T patent/PL2734977T3/pl unknown
- 2012-02-22 ES ES12711001.3T patent/ES2547321T3/es active Active
- 2012-02-22 US US14/342,526 patent/US9430700B2/en not_active Expired - Fee Related
- 2012-02-22 WO PCT/IB2012/050801 patent/WO2013124706A1/en active Application Filing
- 2012-02-22 KR KR1020147007367A patent/KR101850048B1/ko not_active Expired - Fee Related
Non-Patent Citations (2)
Title |
---|
Davis L et al.: "A Method for Designing Marker-Based Tracking Probes", Third IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, USA, IEEE, Nov. 2, 2004, pp. 120-129. XP10769635, DOI: 10.1109/ISMAR.2004.5, ISBN: 978-0-7695-2191-6; the whole document. |
Jansen et al.: "Performance Improvement for Optical Tracking by Adapting Marker Arrangements", VR Workshop on Trends and Issues in Tracking for Virtual Environments, Jan. 1, 2007, pp. 28-33. XP002679938, Retrieved from the Internet: URL: http://viscg.uni-muenster.de/publications/2007/JSHVS07/ [retrieved on Jul. 16, 2012]; the whole document. |
Also Published As
Publication number | Publication date |
---|---|
EP2734977B1 (en) | 2015-06-24 |
PL2734977T3 (pl) | 2016-02-29 |
EP2734977A1 (en) | 2014-05-28 |
WO2013124706A1 (en) | 2013-08-29 |
JP2015513143A (ja) | 2015-04-30 |
ES2547321T3 (es) | 2015-10-05 |
JP5912191B2 (ja) | 2016-04-27 |
KR20140130096A (ko) | 2014-11-07 |
US20140212000A1 (en) | 2014-07-31 |
KR101850048B1 (ko) | 2018-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9430700B2 (en) | System and method for optimizing tracker system | |
US9607387B2 (en) | System and method for optimizing fiducial marker and camera positions/orientations | |
KR102296236B1 (ko) | System and method for improved scoring of 3D poses and spurious point removal in 3D image data | |
EP3262439B1 (en) | Using intensity variations in a light pattern for depth mapping of objects in a volume | |
CN106524922A (zh) | Ranging calibration method and apparatus, and electronic device | |
US20200057831A1 (en) | Real-time generation of synthetic data from multi-shot structured light sensors for three-dimensional object pose estimation | |
KR101896301B1 (ko) | Depth image processing apparatus and method | |
WO2012081687A1 (en) | Information processing apparatus, information processing method, and program | |
JP2016103230A (ja) | Image processing apparatus, image processing method, and program | |
EP3622481B1 (en) | Method and system for calibrating a velocimetry system | |
US20210148694A1 (en) | System and method for 3d profile determination using model-based peak selection | |
KR102223484B1 (ko) | System and method for generating a 3D model of a cut slope with vegetation removed | |
US10410068B2 (en) | Determining the position of an object in a scene | |
KR20190050819A (ko) | Method for performing calibration using measured data without an assumed calibration model, and 3D scanner calibration system performing the method | |
KR102242327B1 (ko) | Apparatus and method for acquiring illumination space information, and method for evaluating the illumination environment of a target space | |
Schlette et al. | A new benchmark for pose estimation with ground truth from virtual reality | |
CN115222799B (zh) | Method and apparatus for acquiring the gravity direction of an image, electronic device, and storage medium | |
US20230206493A1 (en) | Processing images of objects and object portions, including multi-object arrangements and deformed objects | |
WO2023002978A1 (ja) | Image generation processing apparatus, three-dimensional shape reconstruction system, image generation processing method, and program | |
CN207622767U (zh) | Object positioning system | |
KR20170053509A (ko) | Method for performing calibration using measured data without an assumed calibration model, and 3D scanner calibration system performing the method | |
Eissa | Image Space Coverage Model for Deployment of Multi-camera Networks | |
CN116385537A (zh) | Positioning method and apparatus for augmented reality | |
KR20240055102A (ko) | Characterization and improvement of image processing | |
Pérez et al. | Exploring 3D Reconstruction Techniques within Autonomous Underwater Manipulation Tasks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ASELSAN ELEKTRONIK SANAYI VE TICARET ANONIM SIRKETI Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YAGCIOGLU, MUSTAFA;YAVUZ, ERKAN;YILMAZ, OZGUR;AND OTHERS;SIGNING DATES FROM 20140127 TO 20140226;REEL/FRAME:032340/0745 |
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240830 |