
WO2018134866A1 - Camera calibration device - Google Patents

Camera calibration device

Info

Publication number
WO2018134866A1
Authority
WO
WIPO (PCT)
Prior art keywords
movement
marker
sensor
camera
fixed camera
Prior art date
Application number
PCT/JP2017/001337
Other languages
English (en)
Japanese (ja)
Inventor
秀行 粂
三好 雅則
媛 李
Original Assignee
株式会社日立製作所
Priority date
Filing date
Publication date
Application filed by 株式会社日立製作所
Priority to PCT/JP2017/001337
Publication of WO2018134866A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • the present invention relates to a camera calibration device, and more particularly to a camera calibration device that automatically creates a movement plan for a calibration jig during calibration work.
  • this information processing apparatus can obtain the effect of “the mutual orientation and absolute orientation processes in camera calibration can be integrated” described in paragraph 0031 of the same document.
  • A camera calibration apparatus according to one aspect is connected to a fixed camera and a sensor-equipped movement marker, and includes: a movement marker calibration unit that estimates the position/posture of the movement marker from the measured values of the sensor of the sensor-equipped movement marker; a fixed camera calibration unit that estimates the position/orientation of the fixed camera from the captured image of the fixed camera and the estimated position/orientation of the movement marker; a movement planning unit that creates a movement plan including a target movement position of the movement marker according to the estimated position/posture of the fixed camera; a movement instruction unit that instructs movement based on the movement plan; and an output unit that outputs the estimated position and posture of the fixed camera.
  • the positions and postures of a plurality of fixed cameras can be calibrated with high accuracy in a short time regardless of the skill level of the calibration operator.
  • A diagram showing the block configuration of the camera calibration apparatus 100; a diagram showing an example of the fixed camera 200 and the sensor-equipped movement marker 300.
  • A flowchart showing the processing performed by the fixed camera calibration unit 102; a diagram showing an example of the image 210 and the feature points 311 detected from the image 210.
  • A flowchart showing the processing performed by the movement planning unit 103; a diagram showing an example of the three-dimensional histogram 220.
  • A diagram showing an example of the two-dimensional histogram 230; a diagram showing an example of a bin 231 of the two-dimensional histogram 230.
  • A diagram showing an example of the three-dimensional map displayed by the movement instruction unit 104; a diagram showing an example of the two-dimensional map output by the output unit 105; a diagram showing the block configuration of the camera calibration apparatus 400; a flowchart showing the processing performed by the stop determination unit 401.
  • FIG. 1 is a diagram illustrating a block configuration of a camera calibration apparatus 100 according to the present embodiment.
  • the camera calibration apparatus 100 is connected to a plurality of fixed cameras 200 and a movement marker 300 with a sensor, and calibrates the position and orientation of the fixed camera 200.
  • Let i be the identifier of each fixed camera 200 and N_i the total number of fixed cameras 200. That is, N_i fixed cameras 200 are assumed to be connected to the camera calibration apparatus 100.
  • the fixed camera 200 is a camera whose position and posture are fixed, such as a surveillance camera installed on the ceiling.
  • The sensor-equipped movement marker 300 includes, for example, a sensor such as a camera that can measure its own movement and a marker such as a checkerboard pattern that can be easily detected from a captured image of the fixed camera 200, and it moves in the environment where the fixed camera 200 is installed.
  • The camera calibration apparatus 100 includes a movement marker calibration unit 101, a fixed camera calibration unit 102, a movement planning unit 103, a movement instruction unit 104, and an output unit 105. Note that these can be realized, for example, by operating an arithmetic device such as a CPU in accordance with a program stored in a storage device such as a semiconductor memory in the camera calibration apparatus 100, and none of them necessarily needs to be provided as dedicated hardware.
  • the moving marker calibration unit 101 estimates the position / posture of the sensor-equipped moving marker 300 in the world coordinate system from the measurement values measured by the sensor of the sensor-equipped moving marker 300.
  • The fixed camera calibration unit 102 estimates the position and orientation of the fixed camera 200 in the world coordinate system, using the image including the sensor-equipped movement marker 300 captured by the fixed camera 200 and the position/posture in the world coordinate system of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101.
  • the movement planning unit 103 plans the movement of the sensor-equipped moving marker 300 according to the position / orientation of each fixed camera 200 estimated by the fixed camera calibration unit 102.
  • the movement instruction unit 104 issues an instruction for causing the movement marker with sensor 300 to execute the movement plan planned by the movement planning unit 103.
  • the output unit 105 outputs the position / posture of the fixed camera 200 estimated by the fixed camera calibration unit 102.
  • The fixed camera 200, the sensor-equipped movement marker 300, and the camera calibration apparatus 100 may be connected by a wired connection such as USB or Ethernet (registered trademark), or wirelessly via a wireless network. Further, data recorded in a recording medium in the fixed camera 200 or the sensor-equipped movement marker 300 may be input to the camera calibration apparatus 100 via a storage medium such as an SD card.
  • FIG. 2 is a diagram illustrating an example of the fixed camera 200 and the movement marker 300 with a sensor.
  • the fixed camera 200 fixed to the ceiling or the like captures images at a predetermined cycle and outputs captured images to the camera calibration device 100.
  • the sensor-equipped moving marker 300 can move in the environment where the fixed camera 200 is installed.
  • The sensor-equipped movement marker 300 is composed of a marker 310, which facilitates detection from the captured image of the fixed camera 200, and a moving camera 320 fixed on the marker 310.
  • In this embodiment, the sensor-equipped movement marker 300 is mounted on a mobile robot 350 that moves in the environment, but the calibration operator may instead move the sensor-equipped movement marker 300 by hand, or may move a cart or tripod on which the sensor-equipped movement marker 300 is placed.
  • The fixed position/orientation of the moving camera 320 in the coordinate system attached to the marker 310 (hereinafter referred to as the "marker coordinate system") is assumed to be known to the camera calibration apparatus 100.
  • a checkerboard pattern is used as the marker 310, but another pattern such as a circular pattern that can be easily detected from an image may be used.
  • the marker 310 is not limited to a planar pattern, and may be a three-dimensional object such as a cube or a sphere, or a pattern that exists in advance in the mobile robot 350 may be used as the marker 310.
  • the moving camera 320 is used as a sensor, but other sensors that can measure its own motion, such as an IMU (Inertial Measurement Unit), a wheel encoder, a steering angle meter, GPS, and a laser range finder, are used. Also good.
  • The sensor of the sensor-equipped movement marker 300, typified by the moving camera 320, measures at a predetermined cycle and outputs the captured image as a measurement result to the camera calibration apparatus 100.
  • the fixed camera 200 and the moving camera 320 are time-synchronized, and photographing and measurement are performed at the same time.
  • the camera calibration device 100 executes the process of the fixed camera calibration unit 102 every time an image is input from the fixed camera 200 or a certain number of images are input. Further, the process of the movement marker calibration unit 101 is performed every time a measurement result is input from the movement marker 300 with sensor or a certain number of measurement results are input.
  • The movement marker calibration unit 101 estimates the position/orientation in the world coordinate system of the sensor-equipped movement marker 300 at each measurement time of the moving camera 320. Since the position/posture of the moving camera 320 in the marker coordinate system is fixed and known, the position/posture of the marker 310 in the world coordinate system can be estimated by estimating the position/posture of the moving camera 320 in the world coordinate system. As shown in FIG. 2, when the moving camera 320 is used as the sensor of the sensor-equipped movement marker 300, a Structure from Motion method, which estimates the position/orientation of the moving camera 320 at the capture time of each image from a plurality of captured images, or a visual Simultaneous Localization and Mapping (vSLAM) method can be used. For example, the method of G. Klein and D. Murray, "Parallel Tracking and Mapping for Small AR Workspaces," Proc. IEEE and ACM Int. Symp. on Mixed and Augmented Reality, pp. 225-234, 2007, can be used.
  • When an IMU is used as the sensor, the position/orientation can be estimated by integrating the acceleration and angular velocity measured by the IMU.
  • When a wheel encoder and a steering angle meter are used as the sensors, the position/posture can be estimated by dead reckoning.
  • a plurality of sensors such as a camera, an IMU, and a wheel encoder may be used as the sensors, and the position / posture may be estimated by using the measured values of the sensors together.
  • The fixed camera calibration unit 102 estimates the position and posture of the fixed camera 200, using the image including the sensor-equipped movement marker 300 captured by the fixed camera 200 and the position/posture of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101 for the corresponding shooting time. Details of the processing will be described later. Note that, as described above, since the fixed camera 200 and the moving camera 320 perform time-synchronized shooting, the estimation process in the movement marker calibration unit 101 and the estimation process in the fixed camera calibration unit 102 use captured images from the same time.
  • the movement planning unit 103 creates a movement plan for the sensor-equipped movement marker 300 according to the position / orientation of the fixed camera 200 estimated by the fixed camera calibration unit 102. Details of the processing will be described later.
  • the movement instruction unit 104 issues an instruction for causing the movement marker 300 with a sensor to execute the movement plan created by the movement planning unit 103. Details of the processing will be described later.
  • the output unit 105 outputs the position / posture of the fixed camera 200 estimated by the fixed camera calibration unit 102. Details of the processing will be described later. (Operation of Fixed Camera Calibration Unit 102) Next, details of the processing of the fixed camera calibration unit 102 will be described with reference to FIGS. 3 and 4.
  • FIG. 3 is a flowchart showing a process that the fixed camera calibration unit 102 repeats during the operation of the camera calibration apparatus 100.
  • FIG. 4 is an example of an image 210 taken by the fixed camera 200.
  • In step S500, the marker 310 and the feature points 311 on the marker 310 are detected from the image 210 taken by the fixed camera 200.
  • A feature point 311 is a point whose position is easy to detect from the image, such as a corner of a square in the checkerboard pattern in the image 210 illustrated in FIG. 4.
  • In FIG. 4, only some of the feature points 311 are shown for convenience of illustration. Since detection of a checkerboard pattern and of the feature points on the pattern from an image is a known technique, a detailed description is omitted.
  • Let j be the identifier of the images 210 arranged in time series.
  • Let N_ij be the total number of images taken by the i-th fixed camera 200 at the time the fixed camera calibration unit 102 is executed.
  • Let k be the identifier of a feature point 311.
  • Let N_ijk be the total number of feature points 311 detected in the j-th image 210 photographed by the i-th fixed camera 200.
  • Let (u′_ijk, v′_ijk)^T be the two-dimensional position, in the image coordinate system of the image 210, of the k-th feature point 311 detected in the j-th image 210 photographed by the i-th fixed camera 200.
  • In step S510, the three-dimensional position in the world coordinate system of each feature point 311 on the marker 310 detected in step S500 is calculated using the position/orientation of the marker 310 estimated by the movement marker calibration unit 101, and the process proceeds to step S520.
  • The three-dimensional position p_ijk^M of the feature point 311 in the marker coordinate system is known from the specification of the checkerboard pattern. From the position/posture of the marker 310 at the shooting time of the j-th image 210 captured by the i-th fixed camera 200, estimated by the movement marker calibration unit 101, the three-dimensional position p_ijk^W of the feature point in the world coordinate system is calculated by (Equation 1).
  • R_ij^MW and t_ij^MW are the rotation matrix and translation vector from the marker coordinate system to the world coordinate system.
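The text of (Equation 1) is not reproduced here; assuming it has the standard rigid-transform form implied by the definitions of R_ij^MW and t_ij^MW, namely p_ijk^W = R_ij^MW · p_ijk^M + t_ij^MW, the marker-to-world conversion can be sketched in plain Python:

```python
import math

def marker_to_world(R_mw, t_mw, p_m):
    """Transform a 3D point from marker to world coordinates:
    p_w = R_mw @ p_m + t_mw (assumed form of Equation 1)."""
    return [sum(R_mw[r][c] * p_m[c] for c in range(3)) + t_mw[r]
            for r in range(3)]

# Example: marker frame rotated 90 degrees about z and translated
theta = math.pi / 2
R_mw = [[math.cos(theta), -math.sin(theta), 0.0],
        [math.sin(theta),  math.cos(theta), 0.0],
        [0.0, 0.0, 1.0]]
t_mw = [1.0, 2.0, 0.5]
p_w = marker_to_world(R_mw, t_mw, [1.0, 0.0, 0.0])
```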
  • In step S520, the position/orientation of the fixed camera 200 in the world coordinate system is estimated from the two-dimensional positions in the image coordinate system of the feature points 311 on the marker 310 detected in step S500 and the three-dimensional positions in the world coordinate system of the feature points 311 calculated in step S510.
  • The position and orientation of the i-th fixed camera 200 in the world coordinate system are estimated by solving (Equation 2) with a known nonlinear least squares method such as the Levenberg-Marquardt method or the Gauss-Newton method.
  • R_i^WCi and t_i^WCi are the rotation matrix and translation vector from the world coordinate system to the fixed camera coordinate system of the i-th fixed camera 200, and their estimates are R′_i^WCi and t′_i^WCi.
  • E_i is the total reprojection error.
  • The reprojection error is the distance between the projection position obtained by projecting the three-dimensional position of a feature point 311 onto the image 210 using the position/posture of the fixed camera 200 and camera internal parameters such as the focal length and lens distortion, and the detection position of that feature point 311 in the image 210.
  • The total reprojection error E_i is calculated by (Equation 3).
  • (x_ijk, y_ijk)^T is the position in the normalized image coordinate system of the k-th feature point 311 in the j-th image 210 photographed by the i-th fixed camera 200.
  • (x′_ijk, y′_ijk)^T is the coordinate obtained by projecting the position in the world coordinate system of the k-th feature point 311 in the j-th image 210 onto the i-th fixed camera 200.
  • The position (x_ijk, y_ijk)^T in the normalized image coordinate system is calculated by (Equation 4) when, for example, a perspective projection model is used as the camera model.
  • the camera model is not limited to the perspective projection model, and other camera models such as a camera model for an omnidirectional camera may be used.
  • (c_ix, c_iy)^T is the position of the optical center of the i-th fixed camera 200.
  • (f_ix, f_iy)^T is the focal length of the i-th fixed camera 200.
  • (u_ijk, v_ijk)^T is the position of the k-th feature point 311 in the j-th image 210 photographed by the i-th fixed camera 200, in image coordinates from which lens distortion has been removed.
  • When radial lens distortion is modeled, it is calculated according to (Equation 5).
  • The lens distortion model is not limited to radial distortion; other lens models, such as tangential distortion orthogonal to the radial direction, may be used.
  • κ_1 and κ_2 are the lens distortion parameters.
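Equations 3 to 5 are referenced but not reproduced in this text. Under common assumptions (pinhole normalization for Equation 4, a two-parameter radial model for Equation 5, and a sum of squared distances for Equation 3), the projection pipeline can be sketched as below; the function names and the fixed-point undistortion scheme are illustrative, not taken from the patent:

```python
def normalize(u, v, cx, cy, fx, fy):
    # Pinhole normalization (assumed form of Equation 4) with optical
    # center (cx, cy) and focal lengths (fx, fy)
    return (u - cx) / fx, (v - cy) / fy

def undistort_radial(xd, yd, k1, k2, iters=20):
    # Invert a two-parameter radial model x_d = x_u * (1 + k1*r^2 + k2*r^4)
    # by fixed-point iteration; the exact form of Equation 5 may differ.
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        xu, yu = xd / s, yd / s
    return xu, yu

def total_reprojection_error(detected, projected):
    # Assumed form of Equation 3: sum of squared distances between the
    # detected and projected normalized feature-point positions
    return sum((xd - xp) ** 2 + (yd - yp) ** 2
               for (xd, yd), (xp, yp) in zip(detected, projected))
```

Minimizing `total_reprojection_error` over the camera pose is the role of (Equation 2), typically with Levenberg-Marquardt as the text notes.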
  • FIG. 5 is a flowchart showing a process executed by the movement planning unit 103 during the operation of the camera calibration apparatus 100.
  • In this process, a fixed camera 200 having a high calibration priority is selected, and a target movement position for the sensor-equipped movement marker 300 is calculated.
  • In step S600, the movable area of the sensor-equipped movement marker 300 is calculated.
  • The movable area is obtained in advance because, if there is an area where the sensor-equipped movement marker 300 cannot be placed due to an obstacle or the like, the movement plan must be created excluding that area.
  • First, the three-dimensional positions in the world coordinate system of the feature points 311 on the marker 310 at each time are calculated by the same calculation as in step S510 of FIG. 3, and a plane is fitted to the calculated three-dimensional positions. Since fitting a plane to three-dimensional positions is a known technique, the details are omitted; for example, the least squares method or principal component analysis can be used.
  • Next, the signed distance between each three-dimensional position and the fitted plane is calculated.
  • The distances are rearranged in ascending order, and the q-quantiles are calculated as w_1 and w_2, respectively.
  • q is a preset value with 0 < q < 1.
  • The area sandwiched between the two planes obtained by moving the fitted plane by w_1 and w_2 in its normal direction is defined as the movable area of the sensor-equipped movement marker 300.
  • Note that w_1 and w_2 may also simply be set to 0.
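The movable-area computation above can be sketched as: fit a plane z = a·x + b·y + c by least squares (one of the options named in the text), compute signed point-plane distances, then take quantiles. The nearest-rank quantile and the z-form of the plane are simplifying assumptions:

```python
import math

def fit_plane(points):
    # Least-squares fit of z = a*x + b*y + c via the normal equations,
    # solved with Cramer's rule (adequate unless the plane is near-vertical)
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points)
    syz = sum(p[1] * p[2] for p in points)
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    def solve(i):
        m = [row[:] for row in A]
        for r in range(3):
            m[r][i] = rhs[r]
        return det(m) / d
    return solve(0), solve(1), solve(2)  # a, b, c

def signed_distances(points, a, b, c):
    # Signed distance of each point to the plane a*x + b*y - z + c = 0
    norm = math.sqrt(a * a + b * b + 1.0)
    return [(a * p[0] + b * p[1] - p[2] + c) / norm for p in points]

def quantile(values, q):
    # Nearest-rank quantile of a list of distances
    s = sorted(values)
    return s[min(int(q * len(s)), len(s) - 1)]
```

The region between the plane offset by `quantile(ds, q)` and by `quantile(ds, 1 - q)` along the normal then plays the role of the movable area.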
  • Next, the calibration priority is calculated for each fixed camera 200.
  • In step S610, the distance d_i between the i-th fixed camera 200 and the sensor-equipped movement marker 300 is calculated.
  • d_i is calculated by (Equation 7.1) and (Equation 7.2), using the latest rotation matrix R_ij^MW and translation vector t_ij^MW from the marker coordinate system to the world coordinate system, estimated by the movement marker calibration unit 101, and the rotation matrix R′_i^WCi and translation vector t′_i^WCi from the world coordinate system to the fixed camera coordinate system of the i-th fixed camera 200, estimated by the fixed camera calibration unit 102.
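Equations 7.1 and 7.2 are not quoted; one plausible reading, given the world-to-camera convention of R′_i^WCi and t′_i^WCi, is the Euclidean distance between the camera center and the marker origin, both expressed in world coordinates. A sketch under that assumption:

```python
import math

def camera_center_world(R_wc, t_wc):
    # With p_cam = R_wc @ p_world + t_wc, the camera center C satisfies
    # 0 = R_wc @ C + t_wc, so C = -R_wc^T @ t_wc
    return [-sum(R_wc[r][c] * t_wc[r] for r in range(3)) for c in range(3)]

def marker_camera_distance(R_wc, t_wc, t_mw):
    # Euclidean distance between the camera center and the marker origin
    # t_mw (the translation of the marker-to-world transform)
    C = camera_center_world(R_wc, t_wc)
    return math.sqrt(sum((t_mw[i] - C[i]) ** 2 for i in range(3)))
```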
  • In step S620, the three-dimensional distribution a_i^3D in the world coordinate system of the feature points 311 on the marker 310 photographed by the i-th fixed camera 200 is calculated.
  • Specifically, a three-dimensional histogram 220 of the number of feature points 311 is created using the positions of the feature points 311 on the marker 310 in the image 210 and their depths from the i-th fixed camera 200.
  • FIG. 6 is a diagram illustrating an example of the three-dimensional histogram 220.
  • the three-dimensional histogram 220 is a histogram of the number of feature points 311 in a space composed of the position of the feature point 311 on the marker 310 in the image 210 and the depth from the i-th fixed camera 200.
  • A bin 221 is the region defined by the image x-coordinate minimum x_min and maximum x_max, the image y-coordinate minimum y_min and maximum y_max, and the depth minimum z_min and maximum z_max; each bin holds the number of feature points 311, and the bins together constitute the three-dimensional histogram 220.
  • FIG. 7 is a diagram illustrating an example of the bin 221 of the three-dimensional histogram 220 in the three-dimensional space.
  • For example, dividing the 640x480-pixel image 210 into 4x4 areas yields regions of 160x120 pixels, and each region is further divided in the depth direction to define the bins 221.
  • For example, dividing the depth range from 0 m to 10 m into 10 equal parts defines 10 bins 221, each 1 m deep.
  • the size of the bin 221 shown here is merely an example, and the bin 221 having an arbitrary size may be set according to the environment. Then, by counting the number of feature points 311 present in each bin, the three-dimensional histogram 220 illustrated in FIG. 6 is created.
  • The feature points 311 detected in all the images 210 taken by the i-th fixed camera 200 are counted; that is, N_ij images are used. The depth Z′_ijk^Ci of a feature point 311 is calculated by (Equation 8).
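The binning described above (4x4 image regions, depth 0-10 m in 1 m steps) can be sketched as follows; the dictionary-based histogram is an implementation choice, not taken from the patent:

```python
def bin_index(u, v, depth, img_w=640, img_h=480, nx=4, ny=4,
              z_max=10.0, nz=10):
    # Map an image position and depth to an (ix, iy, iz) bin of the
    # three-dimensional histogram; sizes follow the 640x480 / 4x4 / 0-10 m
    # example in the text.
    if not (0 <= u < img_w and 0 <= v < img_h and 0 <= depth < z_max):
        return None
    ix = int(u * nx / img_w)
    iy = int(v * ny / img_h)
    iz = int(depth * nz / z_max)
    return ix, iy, iz

def build_histogram(points):
    # points: iterable of (u, v, depth) feature-point observations
    hist = {}
    for u, v, d in points:
        idx = bin_index(u, v, d)
        if idx is not None:
            hist[idx] = hist.get(idx, 0) + 1
    return hist
```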
  • A bin 221 that includes the movable area of the sensor-equipped movement marker 300 calculated in step S600 is treated as a bin that the sensor-equipped movement marker 300 can reach, and a bin that does not include the movable area as a bin that it cannot reach.
  • When a sensor that can measure surrounding three-dimensional information is used, the measured three-dimensional information is used to determine whether the sensor-equipped movement marker 300 can reach each bin. Specifically, when the number of three-dimensional measurement points in a bin 221 is equal to or greater than a preset threshold, an obstacle is considered to be present and the bin is determined to be unreachable. In addition, the measured three-dimensional information is used to determine whether the fixed camera 200 can photograph each bin 221 of the three-dimensional histogram 220.
  • A bin 221 at a greater depth than a bin 221 determined to be unreachable due to an obstacle is determined to be a bin 221 that cannot be photographed by the fixed camera 200 because of the obstacle, and is likewise treated as a bin that the sensor-equipped movement marker 300 cannot reach.
  • The three-dimensional distribution a_i^3D is then calculated from the number N_i3eb of bins 221 that the sensor-equipped movement marker 300 can reach and the number of reachable bins 221 in which the number of feature points 311 is greater than the preset threshold th_3D.
  • In step S630, the two-dimensional distribution a_i^2D in the image coordinate system of the feature points 311 on the marker 310 photographed by the i-th fixed camera 200 is calculated.
  • FIG. 8 is a diagram illustrating an example of the two-dimensional histogram 230.
  • the two-dimensional histogram 230 is a histogram of the number of feature points 311 in the space formed by the positions of the feature points 311 on the marker 310 in the image 210.
  • A bin 231 is the region defined by the image x-coordinate minimum x_min and maximum x_max and the image y-coordinate minimum y_min and maximum y_max; each bin holds the number of feature points 311, and the bins together constitute the two-dimensional histogram 230.
  • The two-dimensional histogram 230 is created by using the three-dimensional histogram 220 created in step S620 and, for each region on the image, summing the numbers of feature points 311 contained in all depth bins 221.
  • For each bin 231 of the two-dimensional histogram 230, it is determined whether the sensor-equipped movement marker 300 can reach it.
  • Using the three-dimensional histogram 220 created in step S620, a bin 231 is determined to be reachable if at least one reachable bin 221 of the three-dimensional histogram 220 exists for the same image area; if no reachable bin 221 exists, the bin 231 is determined to be unreachable.
  • The two-dimensional distribution a_i^2D is then calculated from the number N_i2eb of bins 231 that the sensor-equipped movement marker 300 can reach and the number of reachable bins 231 in which the number of feature points 311 is greater than the preset threshold th_2D.
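Summing the depth bins of the three-dimensional histogram for each image region, and scoring coverage over reachable bins, can be sketched as follows. The ratio form of the score is an assumption; the text defines a_i^3D and a_i^2D only in terms of bin counts and the thresholds th_3D and th_2D:

```python
def collapse_to_2d(hist3d):
    # Sum feature counts over all depth bins for each image region
    # (the construction described for the two-dimensional histogram 230)
    hist2d = {}
    for (ix, iy, iz), count in hist3d.items():
        hist2d[(ix, iy)] = hist2d.get((ix, iy), 0) + count
    return hist2d

def coverage(hist, reachable_bins, th):
    # One plausible form of the distribution score: the fraction of
    # reachable bins whose feature count exceeds the threshold th
    filled = sum(1 for b in reachable_bins if hist.get(b, 0) > th)
    return filled / len(reachable_bins) if reachable_bins else 0.0
```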
  • In step S640, the calibration priority a_i of each fixed camera is calculated by (Equation 9) from the distance d_i, the three-dimensional distribution a_i^3D, and the two-dimensional distribution a_i^2D obtained in steps S610 to S630.
  • ω_d, ω_3D, and ω_2D are preset weights for the distance d_i from the sensor-equipped movement marker 300, the three-dimensional distribution a_i^3D, and the two-dimensional distribution a_i^2D, respectively.
  • In step S650, the fixed camera 200 having the highest priority a_i is selected from all the fixed cameras 200.
  • In step S660, following step S650, the target movement position of the sensor-equipped movement marker 300 required to calibrate the highest-priority fixed camera 200 selected in step S650 is calculated. Details of this calculation are described with reference to the corresponding flowchart.
  • Steps S661 to S664 are processed for each bin 221 of the three-dimensional histogram 220 created in Step S620 for the fixed camera 200 selected in Step S650.
  • In step S661, the distance d_b between the center position of a bin 221 of the three-dimensional histogram 220 and the sensor-equipped movement marker 300 is calculated.
  • The distance d_b between the center position p_b^Ci of the bin 221 and the sensor-equipped movement marker 300 is calculated by (Equation 11.1) and (Equation 11.2).
  • In step S662, the three-dimensional sufficiency b_b^3D of the bin 221 of the three-dimensional histogram 220 is calculated.
  • In step S663, the two-dimensional sufficiency b_b^2D of the bin 221 of the three-dimensional histogram 220 is calculated.
  • The two-dimensional sufficiency b_b^2D of a bin 221 is set to 1 when the number of feature points 311 in the bin 231 of the two-dimensional histogram 230 created in step S630 that has the same image area as the bin 221 is equal to or greater than the threshold th_2D, and to 0 when it is less than th_2D.
  • In step S664, the priority b_b of the bin 221 of the three-dimensional histogram 220 is calculated from the distance d_b from the sensor-equipped movement marker 300, the three-dimensional sufficiency b_b^3D, and the two-dimensional sufficiency b_b^2D.
  • The priority b_b of the bin 221 is calculated by (Equation 12).
  • ω′_d, ω′_3D, and ω′_2D are preset weights for the distance d_b from the sensor-equipped movement marker 300, the three-dimensional sufficiency b_b^3D, and the two-dimensional sufficiency b_b^2D.
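The exact form of (Equation 12) is not reproduced here. One plausible scoring rule, assuming nearer bins and bins whose sufficiency flags are still 0 should be visited first, is the weighted sum below; the 1/(1 + d_b) distance falloff is illustrative, not from the patent:

```python
def bin_priority(d_b, b_3d, b_2d, w_d=1.0, w_3d=1.0, w_2d=1.0):
    # Higher score for nearby bins (small d_b) and for bins whose
    # three- and two-dimensional sufficiencies are still 0
    return w_d / (1.0 + d_b) + w_3d * (1 - b_3d) + w_2d * (1 - b_2d)

def pick_target(bins):
    # bins: list of (bin_id, d_b, b_3d, b_2d) for reachable bins only;
    # returns the bin id with the highest priority (the role of step S665)
    return max(bins, key=lambda b: bin_priority(b[1], b[2], b[3]))[0]
```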
  • In step S665, using the reachability of the sensor-equipped movement marker 300 for each bin 221 of the three-dimensional histogram 220 calculated in step S620, the center position p_b^W in the world coordinate system of the bin 221 having the highest priority b_b among the reachable bins 221 is output as the target movement position.
  • The sensor-equipped movement marker 300 may be unable to move to the target movement position output by the movement planning unit 103.
  • In that case, a signal indicating that movement is impossible is received through the interface of the movement instruction unit 104, which instructs the sensor-equipped movement marker 300 of the target movement position output by the movement planning unit 103.
  • The movement planning unit 103 then excludes the highest-priority bin 221 of the three-dimensional histogram 220 from the selection candidates in step S665 and outputs the center position of the bin 221 with the next highest priority as the target movement position.
  • If a signal indicating that movement is impossible is received more than a preset number of times consecutively for the same fixed camera 200, the fixed camera 200 with the highest priority is excluded from the selection candidates in step S650, and the process of step S660 is executed for the fixed camera 200 with the next highest priority. (Operation of the movement instruction unit 104) Next, the operation of the movement instruction unit 104 will be described with reference to FIG. 11.
  • the movement instruction unit 104 issues an instruction for moving the sensor-equipped movement marker 300 to the target movement position output by the movement planning unit 103.
  • When the mobile robot 350 moves the sensor-equipped movement marker 300, the movement instruction unit 104 outputs a control signal that moves the mobile robot 350 to the target movement position.
  • FIG. 11 is a diagram illustrating an example of a three-dimensional map created by the movement instruction unit 104.
  • The current position/posture of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101 and its trajectory 360 are displayed by computer graphics.
  • The position/orientation of each fixed camera 200 estimated by the fixed camera calibration unit 102 is also displayed by computer graphics, and the fixed camera 200 on the right, selected as the calibration target in step S650, is colored and emphasized.
  • The measured three-dimensional information may be displayed three-dimensionally, or it may be displayed on a two-dimensional map with computer graphics.
  • the movement instruction unit 104 may instruct movement based on the amount of movement from the current position / posture of the movement marker 300 with sensor.
  • The coordinate p_b^M of the target movement position in the marker coordinate system of the sensor-equipped movement marker 300 is calculated by (Equation 13).
  • p_b^M corresponds to the amount of movement from the current position/posture of the sensor-equipped movement marker 300, for example, 10 m forward, 5 m to the right, and 0.5 m upward.
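Converting the world-frame target p_b^W into the marker frame (Equation 13, whose exact form is not quoted) amounts to inverting the marker-to-world transform; a sketch under that assumption:

```python
def world_to_marker(R_mw, t_mw, p_w):
    # Invert p_w = R_mw @ p_m + t_mw:  p_m = R_mw^T @ (p_w - t_mw)
    d = [p_w[i] - t_mw[i] for i in range(3)]
    return [sum(R_mw[r][c] * d[r] for r in range(3)) for c in range(3)]

# With the marker frame axis-aligned at (1, 1, 0), a world-frame target at
# (11, 6, 0.5) becomes a relative move of 10 m, 5 m, and 0.5 m.
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
p_m = world_to_marker(I3, [1.0, 1.0, 0.0], [11.0, 6.0, 0.5])
```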
  • When the mobile robot 350 moves the sensor-equipped movement marker 300, the mobile robot 350 is moved by outputting a control signal for moving it by this movement amount.
  • When the calibration operator moves the sensor-equipped movement marker 300, the movement amount is indicated, for example, by sound output from a speaker or by screen output on a display. (Operation of output unit 105)
  • Next, the operation of the output unit 105 will be described with reference to FIG. 12.
  • the output unit 105 outputs the position / posture of the fixed camera 200 estimated by the fixed camera calibration unit 102 to a RAM in the fixed camera 200, a management server of the fixed camera 200, and the like.
  • the output unit 105 may display the position / orientation of the fixed camera 200 estimated by the fixed camera calibration unit 102 using a three-dimensional or two-dimensional map.
  • FIG. 12 is a diagram illustrating an example of a two-dimensional map output by the output unit 105.
  • FIG. 12 is drawn from the normal direction of the plane, fitted in step S600 of the movement planning unit 103, that represents the movable region of the sensor-equipped movement marker 300.
  • The position/posture of the fixed camera 200 estimated by the fixed camera calibration unit 102 is displayed together with the shootable range 240 in computer graphics.
  • A trajectory 360 indicates the position of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101.
  • The shootable range 240 is the range that the fixed camera 200 can capture, and can be calculated from the focal length and the angle of view of the fixed camera 200.
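As a rough illustration of how such a shootable range could be evaluated, the sketch below tests whether a world point lies inside a camera's angle of view and within a maximum range. The cone-plus-range model, function name, and numbers are assumptions for illustration; the patent only states that the range is derived from the focal length and angle of view of the fixed camera 200.

```python
import numpy as np

def in_shootable_range(point_w, cam_pos, cam_dir, fov_deg, max_range):
    """Rough visibility test for a shootable range such as 240.

    A point counts as shootable when it lies inside the cone defined by
    the camera's angle of view around the optical axis and within
    max_range of the camera. Illustrative only; the patent does not
    give this formula.
    """
    v = np.asarray(point_w, float) - np.asarray(cam_pos, float)
    dist = np.linalg.norm(v)
    if dist == 0.0 or dist > max_range:
        return False
    axis = np.asarray(cam_dir, float)
    axis = axis / np.linalg.norm(axis)
    # Angle between the point direction and the optical axis, in degrees.
    angle = np.degrees(np.arccos(np.clip(v @ axis / dist, -1.0, 1.0)))
    return angle <= fov_deg / 2.0

# Camera at the origin looking along +x with a 60-degree angle of view.
visible = in_shootable_range([5, 1, 0], [0, 0, 0], [1, 0, 0], 60, 10)
```

Drawing this test over a grid of map cells would produce a wedge-shaped region like the shootable range 240 in FIG. 12.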
  • A point 370 is a three-dimensional measurement point, judged to be an obstacle, measured when an external sensor capable of measuring surrounding three-dimensional information, such as a camera or a laser range finder, is used as the sensor of the sensor-equipped movement marker 300.
  • The display method for the measurement results of the external sensor is not limited to points.
  • For example, a plurality of planes may be fitted to the three-dimensional measurement points measured by the external sensor, and the fitted planes may be displayed.

(Effects) According to the first embodiment described above, the following effects are obtained.
  • The movement planning unit 103 plans the target movement position of the sensor-equipped movement marker 300 based on the position/posture of the fixed camera in the world coordinate system estimated by the fixed camera calibration unit 102, and the movement instruction unit 104 instructs the sensor-equipped movement marker 300 to move to that target movement position. Therefore, regardless of the skill level of the calibration operator, the positions/postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
  • The movement planning unit 103 calculates the distance between the sensor-equipped movement marker 300 and the fixed camera 200 from the position/posture of the sensor-equipped movement marker 300 in the world coordinate system estimated by the movement marker calibration unit 101 and the position/posture of the fixed camera 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and preferentially selects a fixed camera 200 at a short distance as the calibration target (steps S610 and S650 in FIG. 5). Further, the distance between the sensor-equipped movement marker 300 and the center position of each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200 is calculated, and the center position of a bin 221 at a short distance is set as the target movement position (steps S661 and S665 in FIG. 10). Therefore, by selecting a target movement position at a short distance from the sensor-equipped movement marker 300, the time required to move to the target movement position is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
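The distance-first selection described above amounts to a nearest-neighbor choice. The helper below is illustrative only (names and coordinates are invented) and stands in for both the camera selection of steps S610/S650 and the bin-center selection of steps S661/S665.

```python
import math

def nearest_index(marker_pos, positions):
    """Return the index of the candidate position closest to the marker.

    marker_pos : current 3D position of the sensor-equipped movement marker
    positions  : candidate 3D positions (fixed cameras, or bin centers)
    """
    dists = [math.dist(marker_pos, p) for p in positions]
    return dists.index(min(dists))

# Three fixed cameras; the marker sits at the origin.
cameras = [(10.0, 0.0, 3.0), (2.0, 1.0, 3.0), (-8.0, 4.0, 3.0)]
target = nearest_index((0.0, 0.0, 0.0), cameras)
```

Here the middle camera is chosen first, which is the behavior that keeps travel time short.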
  • The movement planning unit 103 calculates the three-dimensional distribution degree and the two-dimensional distribution degree of the feature points 311 on the marker 310 based on the position/posture of the sensor-equipped movement marker 300 in the world coordinate system estimated by the movement marker calibration unit 101 and the result of the fixed camera calibration unit 102, and preferentially selects a fixed camera 200 with a small distribution degree as the calibration target (steps S620 to S650 in FIG. 5). Further, for each bin 221 of the three-dimensional histogram 220 of the selected fixed camera 200, the three-dimensional sufficiency degree and the two-dimensional sufficiency degree of the feature points 311 on the marker 310 are calculated, and the center position of a bin 221 with a small sufficiency degree is set as the target movement position (steps S662 to S665 in FIG. 10). Therefore, by using information from various positions in the three-dimensional space and the two-dimensional space for calibration of the fixed camera 200, the position/posture of the fixed camera 200 can be calibrated with high accuracy.
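A minimal sketch of the sufficiency-based choice, assuming the sufficiency degree of a bin is simply the count of feature-point observations that fell into it (the patent does not define the measure this precisely; the names and data are invented):

```python
from collections import Counter

def least_filled_bin(observations, bins):
    """Pick the bin of the histogram whose sufficiency degree
    (number of feature-point observations falling into it) is smallest.

    observations : iterable of bin ids already hit by feature points 311
    bins         : candidate bin ids (e.g. the reachable bins only)
    """
    counts = Counter(observations)
    # Bins never observed have count 0 and are preferred automatically.
    return min(bins, key=lambda b: counts[b])

bins = [0, 1, 2, 3]
hits = [0, 0, 1, 2, 2, 2, 3]   # bin 1 and bin 3 are least filled
next_bin = least_filled_bin(hits, bins)
```

Steering the marker toward under-filled bins is what spreads the calibration observations over the whole image and depth range.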
  • The movement planning unit 103 selects the fixed camera 200 with the highest priority as the calibration target by using the distance between the sensor-equipped movement marker 300 and the fixed camera 200 together with the three-dimensional distribution degree and the two-dimensional distribution degree of the feature points 311 on the marker 310 (steps S610 to S650 in FIG. 5). Similarly, the distance between the sensor-equipped movement marker 300 and the center position of each bin 221 is used together with the three-dimensional sufficiency degree and the two-dimensional sufficiency degree of the feature points 311 on the marker 310, and the center position of the bin 221 of the three-dimensional histogram 220 with the highest priority is set as the target movement position (steps S661 to S665 in FIG. 10). Therefore, by selecting a target movement position at a short distance from the sensor-equipped movement marker 300 and using information from various positions in the three-dimensional space and the two-dimensional space for calibration of the fixed camera 200, the positions/postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
  • The movement planning unit 103 determines, by calculating the movable region of the sensor-equipped movement marker 300, whether the sensor-equipped movement marker 300 can reach each bin 221 of the three-dimensional histogram 220 and each bin 231 of the two-dimensional histogram 230, and selects the fixed camera 200 and calculates the target movement position using only the reachable bins (step S600 and steps S620 to S650 in FIG. 5, step S665 in FIG. 10). Therefore, the sensor-equipped movement marker 300 is prevented from being directed to an unreachable target movement position, and the plurality of fixed cameras 200 can be calibrated in a short time.
  • The movement planning unit 103 determines, using the three-dimensional information of each bin 221 of the three-dimensional histogram 220 and each bin 231 of the two-dimensional histogram 230, whether the sensor-equipped movement marker 300 can reach the bin and whether the fixed camera 200 can photograph it, and selects the fixed camera 200 and calculates the target movement position using only the bins that are both reachable and photographable (step S600 and steps S620 to S650 in FIG. 5, step S665 in FIG. 10). Therefore, target movement positions that the sensor-equipped movement marker 300 cannot reach or that the fixed camera 200 cannot photograph are avoided, and the plurality of fixed cameras 200 can be calibrated in a short time.
  • When the mobile robot 350 moves the sensor-equipped movement marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal that causes it to move to the target movement position. For this reason, the plurality of fixed cameras 200 can be calibrated automatically.
  • The movement instruction unit 104 instructs movement by displaying, on a two-dimensional or three-dimensional map, the position/posture of the sensor-equipped movement marker 300 in the world coordinate system estimated by the movement marker calibration unit 101, the position/posture of the fixed camera 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and the target movement position output by the movement planning unit 103 (FIG. 11). Therefore, since the calibration operator who moves the sensor-equipped movement marker 300 can easily grasp the target movement position, the time required to move to the target movement position is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
  • The output unit 105 displays, on a two-dimensional or three-dimensional map, the position/posture of the sensor-equipped movement marker 300 in the world coordinate system estimated by the movement marker calibration unit 101, the position/posture of the fixed camera 200 in the world coordinate system estimated by the fixed camera calibration unit 102, and the shootable range of the fixed camera 200 calculated from its focal length and angle of view (FIG. 12). Therefore, the calibration operator can easily confirm the calibration result of the fixed camera 200.
  • In the first embodiment, the movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and sets the center position of the bin 221 with the highest priority as the target movement position (step S665 in FIG. 10).
  • However, the output of the movement planning unit 103 is not limited to this.
  • For example, the movement planning unit 103 may output, as the target movement position, the entire region of the highest-priority bin 221, defined by the minimum value x_min and maximum value x_max of the image x coordinate, the minimum value y_min and maximum value y_max of the image y coordinate, and the minimum value z_min and maximum value z_max of the depth.
  • In this case, the movement instruction unit 104 instructs movement by displaying, on a two-dimensional or three-dimensional map, the current position/posture of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101, the past positions of the sensor-equipped movement marker 300, the position/posture of the fixed camera 200 estimated by the fixed camera calibration unit 102, and the entire bin region output by the movement planning unit 103.
  • That is, the movement planning unit 103 outputs the entire region of the predetermined bin 221 as the target movement position, and the movement instruction unit 104 displays that region as the target movement position. Therefore, since the target movement position can be easily grasped, the time required to move to the target movement position is shortened, and the plurality of fixed cameras 200 can be calibrated in a short time.
  • In the first embodiment, the movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and outputs the center position or the entire region of the bin 221 with the highest priority as the target movement position (step S665 in FIG. 10).
  • However, the output of the movement planning unit 103 is not limited to this.
  • For example, the movement planning unit 103 may output, as target movement positions, the center positions or entire regions of all bins 221 whose priority is equal to or higher than a preset threshold, such as the second- and third-highest priorities.
  • Alternatively, the center positions or regions of a preset number of bins 221 with the highest priorities may be output as target movement positions.
  • In these cases, the movement instruction unit 104 instructs movement by displaying all the target movement positions output by the movement planning unit 103 on a two-dimensional or three-dimensional map.
  • The movement planning unit 103 may also output the center positions or entire regions of all the bins 221 together with the bin priorities.
  • In that case, the movement instruction unit 104 instructs movement by displaying the center positions or regions of all the bins 221 output by the movement planning unit 103 on the two-dimensional or three-dimensional map, color-coded according to bin priority.
  • That is, the movement planning unit 103 outputs a plurality of high-priority target movement positions, and the movement instruction unit 104 displays them. Therefore, the calibration operator moving the sensor-equipped movement marker 300 can move so as to pass efficiently through the plurality of high-priority target movement positions, and the positions/postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
  • In the first embodiment, the movement planning unit 103 calculates the priority of each bin 221 of the three-dimensional histogram 220 and outputs the center position or region of the bin 221 with the highest priority as the target movement position (step S665 in FIG. 10).
  • However, the output of the movement planning unit 103 is not limited to this.
  • For example, the movement planning unit 103 may calculate, based on the priority of each bin 221 of the three-dimensional histogram 220, a route along which high-priority bins 221 can be visited efficiently, and output that route as the target movement position. Specifically, taking the bin 221 in which the sensor-equipped movement marker 300 currently exists as the start position and counting a move to a bin 221 adjacent in image coordinates or depth as one step, the sum of the priorities of the reachable bins 221 on the route is calculated for every route of a preset number of steps, and the route with the largest priority sum is output as the target movement position.
  • The route calculation method is not limited to this, and other known route planning methods can be used.
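The exhaustive route search described above can be sketched as follows. The adjacency structure, priorities, and step count are invented for illustration; the recursion simply enumerates every route of the preset length and keeps the one with the largest priority sum.

```python
def best_path(start, priority, adjacency, steps):
    """Enumerate all routes of `steps` moves between adjacent bins and
    return the route that maximizes the sum of bin priorities.

    start     : bin id where the marker currently is
    priority  : dict bin id -> priority value
    adjacency : dict bin id -> list of adjacent (reachable) bin ids
    steps     : preset number of moves
    """
    best, best_score = None, float("-inf")

    def walk(path, score):
        nonlocal best, best_score
        if len(path) - 1 == steps:
            if score > best_score:
                best, best_score = list(path), score
            return
        for nxt in adjacency[path[-1]]:
            walk(path + [nxt], score + priority[nxt])

    walk([start], priority[start])
    return best

# Toy 1-D row of bins 0..3 with adjacency to immediate neighbors.
prio = {0: 1, 1: 5, 2: 2, 3: 9}
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
route = best_path(0, prio, adj, 2)
```

Exhaustive enumeration grows quickly with the step count, which is why the text notes that other known route planning methods can be substituted.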
  • When the mobile robot 350 moves the sensor-equipped movement marker 300, the movement instruction unit 104 moves the mobile robot 350 by outputting a control signal so that it passes along the route output as the target movement position by the movement planning unit 103. On the other hand, when the calibration operator moves the sensor-equipped movement marker 300, movement is instructed by displaying that route on a two-dimensional or three-dimensional map.
  • That is, the movement planning unit 103 outputs, as the target movement position, a route for moving efficiently through a plurality of high-priority movement targets, and the movement instruction unit 104 instructs movement along that route. Therefore, the marker can be moved so as to pass through the plurality of movement targets efficiently, and the positions/postures of the plurality of fixed cameras 200 can be calibrated with high accuracy in a short time.
  • In the second embodiment, the camera calibration device 400 will be described with reference to FIGS. 13 and 14 for the case where the measurement times of the sensors of the fixed camera 200 and the sensor-equipped movement marker 300 are not synchronized.
  • The same components as those in the first embodiment are denoted by the same reference numerals, and redundant description is omitted; the differences are mainly described.
  • FIG. 13 is a diagram illustrating the block configuration of the camera calibration device 400.
  • The camera calibration device 400 calibrates the position/posture of the connected fixed cameras 200.
  • Its units can be realized, for example, by operating an arithmetic device such as a CPU in accordance with a program stored in a storage device such as a semiconductor memory in the camera calibration device 400, and each unit need not necessarily be provided as hardware.
  • The stop determination unit 401 combines the image coordinates and the three-dimensional coordinates of the feature points 311 on the marker 310 obtained while the sensor-equipped movement marker 300 is stopped, and outputs the combination to the fixed camera calibration unit 102.
  • The stop planning unit 402 plans stops of the sensor-equipped movement marker 300 according to the results of the stop determination unit 401, the movement marker calibration unit 101, and the movement planning unit 103.
  • The stop instruction unit 403 instructs the sensor-equipped movement marker 300 to stop.
  • In the second embodiment, the measurement times of the sensors of the fixed camera 200 and the sensor-equipped movement marker 300 are not synchronized.
  • The sensor-equipped movement marker 300 repeats movement and stopping according to the instructions of the stop instruction unit 403.
  • The sensor-equipped movement marker 300 is mounted on the mobile robot 350, a carriage, a tripod, or the like.

(Operation of stop determination unit) Details of the processing of the stop determination unit 401 will be described with reference to FIG. 14. FIG. 14 is a flowchart illustrating the processing executed by the stop determination unit 401 during the operation of the camera calibration device 400.
  • Steps S800 to S802 are processed for each fixed camera 200.
  • In step S800, the feature points 311 on the marker 310 of the sensor-equipped movement marker 300 are detected from the image 210 captured by the fixed camera 200, and the process proceeds to step S801.
  • The process of step S800 is the same as step S500 of the fixed camera calibration unit 102.
  • In step S801, if the marker 310 was detected in step S800, the process proceeds to step S802; if not, the process proceeds to the next fixed camera 200.
  • In step S802, it is determined from the image coordinates of the feature points 311 detected in step S800 whether the sensor-equipped movement marker 300 is stopped. Specifically, for each feature point 311, the distance between its position in the latest image and its position in the previous image is calculated. The average of these distances over all feature points 311 is calculated, and when the average distance is smaller than a preset threshold, the sensor-equipped movement marker 300 is determined to be stopped.
  • In step S803, it is determined from the position/posture of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101 whether the sensor-equipped movement marker 300 is stopped, and the process proceeds to step S804.
  • Specifically, the distance between the latest position of the sensor-equipped movement marker 300 and its position calculated from the previous sensor measurement is calculated; if the distance is smaller than a preset threshold, the sensor-equipped movement marker 300 is determined to be stopped.
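Combining the two tests of steps S802 and S803, a minimal sketch might look like the following. The thresholds and data are illustrative; the patent only calls them "preset".

```python
import math

def marker_stopped(prev_pts, cur_pts, prev_pos, cur_pos,
                   img_thresh=1.0, pos_thresh=0.01):
    """Sketch of the two stop tests.

    prev_pts / cur_pts : feature point 311 image coordinates in the
                         previous and latest image (step S802)
    prev_pos / cur_pos : marker positions from the previous and latest
                         sensor measurement (step S803)
    """
    # Step S802: average image-plane displacement of the feature points.
    mean_move = sum(math.dist(a, b)
                    for a, b in zip(prev_pts, cur_pts)) / len(prev_pts)
    image_stopped = mean_move < img_thresh
    # Step S803: displacement of the sensor-estimated marker position.
    sensor_stopped = math.dist(prev_pos, cur_pos) < pos_thresh
    return image_stopped and sensor_stopped

stopped = marker_stopped([(100, 50), (200, 80)],
                         [(100.2, 50.1), (200.1, 80.0)],
                         (1.0, 2.0, 0.0), (1.0, 2.0, 0.001))
```

Requiring both conditions mirrors the flowchart: only observations judged stationary in both the image and the sensor stream are paired in step S804.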
  • In step S804, the image coordinates and the three-dimensional coordinates of the feature points 311 are synchronized. Specifically, after a stop instruction is issued by the stop instruction unit 403, the image coordinates of the feature points 311 for which the sensor-equipped movement marker 300 was first determined to be stopped in step S802 and the three-dimensional coordinates of the feature points 311 for which it was first determined to be stopped in step S803 are regarded as coordinates obtained while the sensor-equipped movement marker 300 was stopped at the same position, and the combination of the image coordinates and the three-dimensional coordinates of the feature points 311 is output to the fixed camera calibration unit 102.

(Operation of stop planning unit) Next, details of the processing of the stop planning unit 402 will be described.
  • The stop planning unit 402 issues instructions to stop or move the sensor-equipped movement marker 300 based on the detection result of the marker 310 in the image 210 captured by the fixed camera 200 obtained by the stop determination unit 401, the position of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101, and the target movement position output by the movement planning unit 103.
  • Specifically, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403 when the sensor-equipped movement marker 300 is first detected by each fixed camera 200.
  • The stop planning unit 402 also instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403 when the distance between the position of the sensor-equipped movement marker 300 estimated by the movement marker calibration unit 101 and the target movement position output by the movement planning unit 103 becomes equal to or less than a preset threshold.
  • When step S804 of the stop determination unit 401 is completed, the stop planning unit 402 cancels the stop of the sensor-equipped movement marker 300 via the stop instruction unit 403 and instructs it to move.

(Operation of stop instruction unit) Next, details of the processing of the stop instruction unit 403 will be described.
  • When the mobile robot 350 moves the sensor-equipped movement marker 300, the stop instruction unit 403 stops or moves the mobile robot 350 by outputting a control signal to it.
  • When the calibration operator moves the sensor-equipped movement marker 300, the stop or movement is instructed by sound output from a speaker or screen output from a display.
  • The same interface, such as a speaker or display, as that of the movement instruction unit 104 may be used.

(Effects) According to the second embodiment, the following effects are obtained. That is, in the camera calibration device 400, the stop planning unit 402 instructs the sensor-equipped movement marker 300 to stop via the stop instruction unit 403, and the stop determination unit 401 combines the image coordinates and the three-dimensional coordinates of the feature points 311 on the marker 310 obtained while the sensor-equipped movement marker 300 is stopped at the same position/posture, and outputs the combination to the fixed camera calibration unit 102. Therefore, the fixed camera 200 can be calibrated even when the measurement times of the sensors of the fixed camera 200 and the sensor-equipped movement marker 300 are not synchronized.
  • The present invention is not limited to the above-described embodiments, and various modifications are included.
  • The above embodiments have been described in detail for easy understanding of the present invention, and the invention is not necessarily limited to those having all the configurations described.
  • Other embodiments conceivable within the scope of the technical idea of the present invention are also included in the scope of the present invention.
  • A part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • Each of the above-described configurations, functions, processing units, processing means, and the like may be realized in hardware by designing a part or all of them as, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized in software by a processor interpreting and executing a program that realizes each function.
  • Information such as programs, tables, and files for realizing each function can be stored in a recording device such as a memory, hard disk, or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, or DVD.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

When a calibration jig is moved exhaustively to improve calibration accuracy, the time required for measurement becomes long, and when the calibration jig is moved in a limited manner to shorten the measurement time, calibration accuracy decreases. To address this problem, a camera calibration device according to the present invention comprises: a moving-marker calibration unit to which a fixed camera and a sensor-equipped moving marker are connected, and which estimates the position and attitude of the moving marker from measured values of the sensor of the sensor-equipped moving marker; a fixed-camera calibration unit which estimates the position and attitude of the fixed camera from a captured image of the fixed camera and the estimated position and attitude of the moving marker; a movement planning unit which creates a movement plan including a target movement position of the moving marker according to the estimated position and attitude of the fixed camera; a movement instruction unit which gives movement instructions based on the movement plan; and an output unit which outputs the estimated position and attitude of the fixed camera.
PCT/JP2017/001337 2017-01-17 2017-01-17 Dispositif d'étalonnage de caméra WO2018134866A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/001337 WO2018134866A1 (fr) 2017-01-17 2017-01-17 Dispositif d'étalonnage de caméra

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/001337 WO2018134866A1 (fr) 2017-01-17 2017-01-17 Dispositif d'étalonnage de caméra

Publications (1)

Publication Number Publication Date
WO2018134866A1 true WO2018134866A1 (fr) 2018-07-26

Family

ID=62908484

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/001337 WO2018134866A1 (fr) 2017-01-17 2017-01-17 Dispositif d'étalonnage de caméra

Country Status (1)

Country Link
WO (1) WO2018134866A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2019193859A1 (ja) * 2018-04-04 2021-05-13 コニカミノルタ株式会社 Camera calibration method, camera calibration device, camera calibration system, and camera calibration program
JP7173133B2 (ja) 2018-04-04 2022-11-16 コニカミノルタ株式会社 Camera calibration method, camera calibration device, camera calibration system, and camera calibration program
CN113066134A (zh) * 2021-04-23 2021-07-02 深圳市商汤科技有限公司 Calibration method and apparatus for a visual sensor, electronic device, and storage medium
DE102021204363A1 (de) 2021-04-30 2022-11-03 Robert Bosch Gesellschaft mit beschränkter Haftung Method for calibrating a sensor by means of a vehicle
WO2024217572A1 (fr) * 2023-04-21 2024-10-24 北京极智嘉科技股份有限公司 Device adjustment method and apparatus based on recognition identifiers, and computing device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003050107A (ja) * 2001-08-07 2003-02-21 Matsushita Electric Ind Co Ltd Camera calibration device
JP2010172986A (ja) * 2009-01-28 2010-08-12 Fuji Electric Holdings Co Ltd Robot vision system and automatic calibration method
JP2010276603A (ja) * 2009-05-29 2010-12-09 Mori Seiki Co Ltd Calibration method and calibration device

Similar Documents

Publication Publication Date Title
CN112258567B (zh) Visual positioning method and apparatus for object grasping points, storage medium, and electronic device
US10825198B2 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images
US9953461B2 Navigation system applying augmented reality
JP6658001B2 (ja) Position estimation device, program, and position estimation method
JP5746477B2 (ja) Model generation device, three-dimensional measurement device, and control methods and programs therefor
JP5624394B2 (ja) Position/orientation measurement device, measurement processing method therefor, and program
JP5992184B2 (ja) Image data processing device, image data processing method, and program for image data processing
US20170337701A1 Method and system for 3d capture based on structure from motion with simplified pose detection
JP6503906B2 (ja) Image processing device, image processing method, and image processing program
JP6589636B2 (ja) Three-dimensional shape measurement device, three-dimensional shape measurement method, and three-dimensional shape measurement program
JP6321202B2 (ja) Method, device, and system for determining the motion of a mobile platform
WO2020195875A1 (fr) Information processing device and method, and program
WO2018134866A1 (fr) Camera calibration device
JP4227037B2 (ja) Imaging system and calibration method
US11758100B2 Portable projection mapping device and projection mapping system
WO2019186677A1 (fr) Robot position/posture estimation and 3D measurement device
JP2003006618A (ja) Method and device for generating a three-dimensional model, and computer program
EP3392748B1 (fr) System and method for position tracking in a virtual reality system
JPWO2021111613A1 (ja) Three-dimensional map creation device, three-dimensional map creation method, and three-dimensional map creation program
KR102555269B1 (ko) Method and system for fused pose estimation using an omnidirectional image sensor and an inertial measurement sensor
JP7588977B2 (ja) Site video management system and site video management method
EP4292777A1 (fr) Assistance system, image processing device, assistance method, and program
KR20190070235A (ko) Method and apparatus for estimating 6-DOF relative displacement using a vision-based position estimation technique
US20250095204A1 Sensor calibration system
KR20230130024A (ko) Method and system for determining the state of a camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17893037

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17893037

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP
