
WO2005025237A1 - Procede d'auto-etalonnage d'un systeme photographique - Google Patents

Procede d'auto-etalonnage d'un systeme photographique

Info

Publication number
WO2005025237A1
WO2005025237A1 (PCT/DE2004/001814)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
parameters
rotation
linear
determined
Prior art date
Application number
PCT/DE2004/001814
Other languages
German (de)
English (en)
Inventor
Reinhard Koch
Jan-Michael Frahm
Original Assignee
Christian-Albrechts-Universität zu Kiel
Priority date
Filing date
Publication date
Application filed by Christian-Albrechts-Universität zu Kiel
Publication of WO2005025237A1

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • The invention relates to a method for the self-calibration of a camera system, in particular one whose image data are stored electronically, according to the preamble of the main claim.
  • The determination of the camera parameters is usually based on an image sequence of a calibration pattern recorded by the camera from different perspectives.
  • An extension is calibration based on an arbitrary rigid scene relative to which the camera moves during recording.
  • The degrees of freedom of rotation and translation of the camera form the so-called external camera parameters, which are often likewise not all known.
  • The automatic calculation of the internal and external parameters from the image sequence is referred to as self-calibration.
  • Constraints are often formulated for the camera movement and/or the scene; these provide additional information and/or fix certain parameters from the outset.
  • One of the most common constraints on the camera itself is that it may only rotate about its optical center. In this case, no depth information about the scene can be extracted from the image sequence.
  • This camera movement occurs in many applications, for example video conferencing systems.
  • Panoramas are also created with this camera movement (de Agapito, L.; Hartley, R. I.; Hayman, E.: Linear Self-Calibration of a Rotating and Zooming Camera, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1999, Volume 1, 23-25 June 1999, pages 15-21).
  • A proposal for camera calibration for essentially arbitrary camera movements and fixed camera parameters is known from GB 2 261 566 A; there the camera must be both displaced and rotated.
  • This method uses the Kruppa equations for the iterative calculation of the camera parameters and makes no demands on the rigid scene.
  • The algorithm is complex, and the parameters obtained are in many cases unsuitable or only partially suitable for scene reconstruction.
  • These data are used in the numerical reconstruction of the imaged scene, but not for camera calibration, because no suitably formulated method is known for this purpose.
  • A first matrix is determined from corresponding points in images selected in pairs; a second matrix, representing the rotation, is then determined from the relative rotation of the at least one camera between the same selected images. Both matrices are inserted into an equation that yields a system of linear equations in the calibration parameters. Further matrices are determined from further image pairs and added until the linear system of equations can be solved uniquely; all parameters of the camera system are then determined by solving this system computationally.
  • The camera system can consist of more than one camera whose fixed positions and orientations relative to one another are known, or of one or more freely movable and freely rotatable cameras.
  • At least one of the cameras is connected to a rotation sensor for detecting the relative rotation of the camera between two time-separated exposures.
  • For error compensation of the orientation measurement system, the orientation measurement can be stabilized by setting up a statistical error function from the rotation measurement data, the recorded images, the calculated camera parameters and the preconditions, and optimizing it.
  • FIG. 1 shows a sketch explaining the camera movements;
  • FIG. 2 shows a diagram of the mathematical pinhole camera model;
  • FIG. 7a shows the mean focal length, calculated by self-calibration from the noisy images, as a function of the degree of noise in pixels and of the rotation angle;
  • FIG. 7b shows the variance of the focal length belonging to FIG. 7a;
  • FIG. 7d shows the variance of the pixel aspect ratio belonging to FIG. 7c;
  • FIGS. 10a and 10b show the variance of f with a) linear calibration and b) statistical calibration.
  • First, the camera model used below is explained.
  • The simplest model for this is that of a pinhole camera.
  • The camera position is determined by the position of the camera center C and the rotation R of the optical camera axis relative to the coordinate axes, as shown in FIG. 1.
  • Both can change arbitrarily over time τ.
  • C is a three-dimensional column vector in the world coordinate system.
  • R is an orthogonal 3x3 matrix that contains the coordinate axes (optical axis, axes of the image plane) of the rotated camera coordinate system expressed in the world coordinate system; it is thus a rotation matrix with determinant one.
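As an illustration of this camera model (not part of the patent text), a rotation matrix with the stated properties can be built from an axis and an angle via Rodrigues' formula; a minimal numpy sketch:

```python
import numpy as np

def rotation_matrix(axis, angle_rad):
    """Rodrigues' formula: rotation about the unit vector `axis` by `angle_rad`."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    # Skew-symmetric cross-product matrix [a]_x
    A = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(angle_rad) * A + (1.0 - np.cos(angle_rad)) * (A @ A)

R = rotation_matrix([0.0, 1.0, 0.0], np.deg2rad(6.0))  # 6 degrees about the y-axis
# R is orthogonal with determinant one, as required of a rotation matrix
assert np.allclose(R.T @ R, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```

The 6° angle matches the largest rotation used in the simulations described later.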
  • The camera center is identified with the focal point, which in this model is ideal (a single point) and can lie either in front of or behind the image plane. It is common today to assume an image plane in front of the focal point, as can be seen in FIG. 2.
  • The image plane is now imagined to be covered with light-sensitive, rectangular pixels with edge lengths dx and dy.
  • The origin of the pixel coordinate system need not coincide with the point where the optical axis pierces the image plane; this point, denoted c (vectors in pixel coordinates are written in lower-case letters), is called the principal point of the camera.
  • A three-dimensional scene point M in world coordinates is now mapped to a pixel m in camera-fixed pixel coordinates via a linear mapping.
  • In homogeneous coordinates, M = (X, Y, Z, 1)^T and m = (x, y, 1)^T. One can then write m = P M. (1)
  • The real 3x3 matrix K is called the camera calibration matrix; it is upper triangular.
  • The calibration matrix represents the properties of the recording sensor or system, i.e. usually a CCD chip. It contains the five internal camera parameters:
  • f is the focal length of the camera in pixels. If dx is the width of a pixel in mm, then f·dx gives the focal length of the camera in mm.
  • c = (c_x, c_y) describes the principal point of the camera with two parameters.
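The five internal parameters can be collected in an upper triangular calibration matrix. The sketch below uses one common parametrization (the variable names and layout are illustrative choices, not quoted from the patent) to build K and a projection matrix and to map a scene point to a pixel:

```python
import numpy as np

def calibration_matrix(f, aspect, shear, cx, cy):
    """Upper triangular K with the five internal parameters: focal length f
    (in pixels), pixel aspect ratio, pixel shear, and principal point (cx, cy)."""
    return np.array([[f, shear, cx],
                     [0.0, aspect * f, cy],
                     [0.0, 0.0, 1.0]])

def projection_matrix(K, R, C):
    """P = K R [I | -C]: maps homogeneous world points to homogeneous pixels."""
    C = np.asarray(C, dtype=float).reshape(3, 1)
    return K @ R @ np.hstack([np.eye(3), -C])

K = calibration_matrix(f=500.0, aspect=1.0, shear=0.0, cx=256.0, cy=256.0)
P = projection_matrix(K, np.eye(3), [0.0, 0.0, 0.0])   # camera at the origin
M = np.array([1.0, 2.0, 10.0, 1.0])                    # homogeneous scene point
m = P @ M
m = m / m[2]                                           # pixel coordinates (x, y, 1)
assert np.allclose(m, [306.0, 356.0, 1.0])
```

With the principal point at (256, 256), a 512 x 512 image as in the later simulations would have the principal point at its center.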
  • The mapping described by the projection matrix P is not uniquely invertible. Rather, the central projection of the camera maps all object points on a viewing ray onto the same image point. Therefore, no distances to the object points can be determined from a single image. For that, at least two images of the same object from different perspectives (locations or times τ1, τ2) are required for triangulation.
  • FIG. 3 schematically shows the same camera at times τ1 and τ2 with mutually displaced and rotated image planes. This is equivalent to recording with two different cameras at the two positions, which is why only recording at different times is discussed below.
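The triangulation mentioned above can be sketched with the standard linear (DLT) construction. This is an illustrative implementation under assumed camera matrices, not the patent's own algorithm:

```python
import numpy as np

def triangulate(P1, P2, m1, m2):
    """Linear (DLT) triangulation: each pixel (x, y) contributes two equations,
    x * (P @ M)[2] = (P @ M)[0] and y * (P @ M)[2] = (P @ M)[1]."""
    A = np.array([
        m1[0] * P1[2] - P1[0],
        m1[1] * P1[2] - P1[1],
        m2[0] * P2[2] - P2[0],
        m2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    M = Vt[-1]                 # null vector of A
    return M / M[3]            # homogeneous scene point (X, Y, Z, 1)

K = np.array([[500.0, 0.0, 256.0], [0.0, 500.0, 256.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # camera at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # center shifted along x
M_true = np.array([0.5, -0.2, 8.0, 1.0])
m1 = P1 @ M_true; m1 = m1 / m1[2]
m2 = P2 @ M_true; m2 = m2 / m2[2]
assert np.allclose(triangulate(P1, P2, m1, m2), M_true)
```

With noise-free correspondences the null vector of A recovers the scene point exactly; with noisy pixels the smallest singular vector gives a least-squares solution.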
  • The position of an object point M can be calculated from its pixels m1 and m2 in the two image planes. If parameters are unknown, the first step of calibration is the search for point correspondences. For a given pixel m1, one knows only that the object point M lies on the ray L; a priori, any point on the image of L in the second image plane could serve as the corresponding pixel m2.
  • Each object point M lies in one plane of the depicted pencil of planes, each of which must pass through both camera centers.
  • The pencil of planes is imaged in each image plane as a pencil of lines, and each line in the first image can be uniquely assigned to a line in the second. If M has an image in the first image plane, then it also has one in the second.
  • The pencils of lines intersect in each image at exactly one point (e1 or e2), namely the image of the other camera center, which is called the epipole.
  • This so-called fundamental matrix F contains information about the rotation of the cameras relative to one another (or the rotation of one camera over time) and about the displacement of the cameras (or the translation of one camera), the latter in the form of a vector product with an epipole.
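For illustration, the fundamental matrix can be composed from K, R and a translation t via the standard identity F = K^(-T) [t]_x R K^(-1), which matches the structure just described (vector product with an epipole) but is an assumption of this sketch, not a formula quoted from the patent:

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x, so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def fundamental_matrix(K, R, t):
    """F = K^{-T} [t]_x R K^{-1} for two views with fixed internal parameters K,
    relative rotation R and relative translation t (x2 = R @ x1 + t)."""
    Kinv = np.linalg.inv(K)
    return Kinv.T @ skew(t) @ R @ Kinv

K = np.array([[500.0, 0.0, 256.0], [0.0, 500.0, 256.0], [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([1.0, 0.0, 0.0])
F = fundamental_matrix(K, R, t)

# Any corresponding pixel pair satisfies the epipolar constraint m2^T F m1 = 0
M = np.array([0.3, 0.7, 5.0])             # scene point in camera-1 coordinates
m1 = K @ M; m1 = m1 / m1[2]
m2 = K @ (R @ M + t); m2 = m2 / m2[2]
assert abs(m2 @ F @ m1) < 1e-8
```

The epipolar constraint holds for every correspondence, which is exactly why corresponding points constrain F linearly.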
  • Here ρ denotes the unknown scaling factor, which cannot be determined when the fundamental matrices are set up.
  • The first matrix on the right-hand side of (5) describes the vector product with the epipole in image i, i.e. the target image of the mapping, analogous to the definition of R_j,i.
  • Pixel aspect ratio and focal length can be calculated from a single fundamental matrix together with the rotation information.
  • A special case is that of a rotating camera with a fixed center, i.e. without translation between the camera centers. No epipoles are defined for images that result from a pure rotation of the camera, so equation (5) cannot be used.
  • The image points mi and mj of an object point M can be uniquely mapped onto one another in the mutually rotated images i and j, as shown in FIG. 5.
  • H_j,i is a 3x3 matrix, and the vectors m are, as before, the three-dimensional homogeneous representations of the pixels m in pixel coordinates.
  • H_j,i can be determined from four independent point correspondences up to a scaling factor.
  • The scale factor ρ_j must be carried as an unknown if the homographies determined from point correspondences are used in (8).
  • (8) is again linear in the camera parameters, and the analogy to (5) is obvious.
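A minimal numpy sketch of this linear step for the rotating camera: with a known relative rotation R and a measured homography H, the relation H = s·K·R·inv(K) is linear in the entries of K once the scale s is fixed via the trace (the trace is invariant under the similarity K R inv(K)). The parametrization and variable names here are illustrative assumptions, not the patent's notation:

```python
import numpy as np

def calibrate_from_rotation(H, R):
    """Solve H @ K = s * K @ R linearly for the five internal parameters,
    with s = trace(H) / trace(R)."""
    s = np.trace(H) / np.trace(R)
    # K = [[f, sh, cx], [0, g, cy], [0, 0, 1]]; unknowns in this order
    slots = [(0, 0), (0, 1), (1, 1), (0, 2), (1, 2)]
    A = np.zeros((9, 5))
    for i, (r, c) in enumerate(slots):
        E = np.zeros((3, 3))
        E[r, c] = 1.0
        A[:, i] = (H @ E - s * E @ R).ravel()   # coefficient of unknown i
    Kc = np.zeros((3, 3))
    Kc[2, 2] = 1.0                              # the fixed entry K[2,2] = 1
    b = -(H @ Kc - s * Kc @ R).ravel()
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    f, sh, g, cx, cy = x
    return np.array([[f, sh, cx], [0.0, g, cy], [0.0, 0.0, 1.0]])

def rodrigues(axis, angle):
    a = np.asarray(axis, float) / np.linalg.norm(axis)
    S = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle) * S + (1 - np.cos(angle)) * (S @ S)

# Synthetic check: a rotation about a general (non-coordinate) axis keeps
# all five parameters observable from a single homography
K_true = np.array([[520.0, 1.0, 250.0], [0.0, 540.0, 260.0], [0.0, 0.0, 1.0]])
R = rodrigues([1.0, 2.0, 0.5], np.deg2rad(5.0))
H = 1.7 * K_true @ R @ np.linalg.inv(K_true)    # homography, known only up to scale
K_est = calibrate_from_rotation(H, R)
assert np.allclose(K_est, K_true, atol=1e-4)
```

In practice H would be estimated from at least four point correspondences and R taken from the rotation sensor; further image pairs then simply stack more rows onto the same linear system.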
  • A statistical calibration can be appended to the above calibration procedures. The aim is to estimate the most likely camera parameters given the measurements that have occurred (images, rotation measurements) and the formulated prior knowledge. For this purpose, an error is minimized that is determined solely by the prior knowledge, the images and the rotation data.
  • The error consists of a term evaluating the camera parameters with regard to the images (maximum likelihood), a term evaluating the rotation data, and several terms evaluating compliance with the preconditions.
  • The camera parameters are evaluated against the image data using formulas (6) and (4).
  • The term evaluating the improved rotation uses the error model of the rotation measuring system to decide how likely the currently estimated rotation is given the measurement that has occurred.
  • The other terms, which formulate prior knowledge, can be used to stabilize parameters that are difficult to determine, such as the principal point.
  • This error function is then minimized by a non-linear minimization process in order to determine the most likely camera parameters and rotation data.
  • The camera parameters determined with the above linear methods and the measured rotation data serve as starting values.
  • The statistical calibration thus yields a complete calibration and corrected rotation information.
  • This corrected rotation information is used to compensate for systematic and measurement errors of the sensor.
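The structure of such a combined error function can be sketched as follows. The parametrization (square focal pixels, a single rotation-correction angle) and the weights are illustrative simplifications of my own, not the patent's formulation; they only show how the image term, the rotation term and a principal-point prior add up:

```python
import numpy as np

def transfer(K, R, m):
    """Map homogeneous pixels of image i into image j with H = K R inv(K)
    (pure camera rotation, no translation)."""
    H = K @ R @ np.linalg.inv(K)
    p = (H @ m.T).T
    return p / p[:, 2:3]

def error_function(params, m_i, m_j, R_meas, c_prior, w_rot=1.0, w_pp=1e-4):
    f, cx, cy, dtheta = params
    K = np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])
    # Estimated rotation = measured rotation plus a small correction about y
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = R_meas @ np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    resid = transfer(K, R, m_i)[:, :2] - m_j[:, :2]
    e_img = np.sum(resid ** 2)                      # image (maximum-likelihood) term
    e_rot = w_rot * dtheta ** 2                     # distance from the sensor reading
    e_pp = w_pp * np.sum((np.array([cx, cy]) - c_prior) ** 2)  # principal-point prior
    return e_img + e_rot + e_pp

# Synthetic data: the true parameters must score better than perturbed ones
K_true = np.array([[500.0, 0.0, 256.0], [0.0, 500.0, 256.0], [0.0, 0.0, 1.0]])
ang = np.deg2rad(4.0)
R_true = np.array([[np.cos(ang), 0, np.sin(ang)], [0, 1, 0], [-np.sin(ang), 0, np.cos(ang)]])
m_i = np.array([[100.0, 120.0, 1.0], [300.0, 80.0, 1.0], [256.0, 300.0, 1.0]])
m_j = transfer(K_true, R_true, m_i)
center = np.array([256.0, 256.0])
good = error_function([500.0, 256.0, 256.0, 0.0], m_i, m_j, R_true, center)
bad = error_function([550.0, 256.0, 256.0, 0.0], m_i, m_j, R_true, center)
assert good < bad
```

Minimizing this scalar with any standard non-linear optimizer, starting from the linear solution and the measured rotation, is the refinement step the passage describes.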
  • For the simulations, a virtual camera with its center at the coordinate origin was rotated about the x and y axes by up to 6°.
  • The camera observed uniformly distributed scene points arranged in a cube, from which computed images of 512 x 512 pixels were generated.
  • The positions of the pixels were perturbed by uniform noise of up to n pixels.
  • FIG. 7a shows the mean focal length, calculated by self-calibration from the noisy images, as a function of the degree of noise in pixels and of the rotation angle.
  • FIG. 7b shows the variance of the focal length belonging to FIG. 7a.
  • FIG. 7c shows the mean pixel aspect ratio, and
  • FIG. 7d shows the variance of the pixel aspect ratio belonging to FIG. 7c.
  • FIG. 8 shows the calculated results analogously to FIG. 7.
  • The same descriptions of the individual panels apply. FIGS. 7 and 8 show that the calculated calibration parameters are very robust for rotation angles of at least 1° and pixel errors n of at most 1 pixel. The dependence on the angle error is consistently stronger than that on the pixel error. Comparing FIGS. 7 and 8 directly, in particular 7b and 8b, one also sees that the influence of pixel errors is greater for the freely moving camera than for the rotating one.
  • An error function is set up, which must be minimized.
  • The error function contains a summand that tends toward zero the closer the estimated values for the principal point are to the respective image center, as required by the chosen prior assumptions. If the numerical minimization of this error function (with standard programs) is appended to the linear parameter determination, improved estimates for the camera parameters are obtained. This is what is meant by statistical calibration. In many practical applications, however, this last optimization step can be omitted, for example if only the focal length needs to be updated.
  • The statistical calibration determined the principal point and the pixel shear very robustly even from noisy images, as can be seen by way of example from FIG. 9b compared to FIG. 9a. At the same time, the already robust results for focal length and pixel aspect ratio were further stabilized. This is clearly visible, for example, in a direct comparison of the variance of f in FIG. 10a (result of the linear calibration) and FIG. 10b (result of the statistical calibration).

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a method for the self-calibration of a camera system consisting of at least one camera, using a plurality of images of a scene taken from different positions and/or viewing directions of the camera system, together with measurement data on the relative rotation of at least one camera between the individual exposures, the camera parameters being determined by means of a system of linear equations. According to the method, a first linear transformation, describable by a matrix, is determined from corresponding points in images selected in pairs; a second linear transformation representing the rotation, describable by a matrix, is determined from the relative rotation of said camera between the same selected images; both matrices are inserted into an equation that specifies a system of linear equations in the calibration parameters; further linear transformations, describable as matrices, are determined from further image pairs and used until the system of linear equations can be solved uniquely; and the system of equations is solved computationally for all parameters of the camera system, at least one camera being connected to a rotation sensor for detecting the relative rotation of the camera between two time-separated exposures.
PCT/DE2004/001814 2003-08-28 2004-08-16 Procede d'auto-etalonnage d'un systeme photographique WO2005025237A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE10340023A DE10340023B3 (de) 2003-08-28 2003-08-28 Verfahren zur Selbstkalibrierung eines Kamerasystems
DE10340023.0 2003-08-28

Publications (1)

Publication Number Publication Date
WO2005025237A1 true WO2005025237A1 (fr) 2005-03-17

Family

ID=34089270

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2004/001814 WO2005025237A1 (fr) 2003-08-28 2004-08-16 Procede d'auto-etalonnage d'un systeme photographique

Country Status (2)

Country Link
DE (1) DE10340023B3 (fr)
WO (1) WO2005025237A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005061931B4 (de) * 2005-12-23 2011-04-14 Bremer Institut für angewandte Strahltechnik GmbH Verfahren und Vorrichtung zur Kalibrierung einer optischen Einrichtung
DE102011100628B4 (de) 2011-05-05 2013-04-25 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren und Vorrichtung zur Bestimmung mindestens eines Kameraparameters
DE102016222319A1 (de) 2016-11-14 2018-05-17 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. 3d-referenzierung

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FRAHM J-M ET AL: "Camera Calibration with Known Rotation", PROCEEDINGS OF THE EIGHT IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION. (ICCV). NICE, FRANCE, OCT. 13 - 16, 2003, INTERNATIONAL CONFERENCE ON COMPUTER VISION, LOS ALAMITOS, CA : IEEE COMP. SOC, US, vol. VOL. 2 OF 2. CONF. 9, 13 October 2003 (2003-10-13), pages 1418 - 1425, XP010662558, ISBN: 0-7695-1950-4 *
HORAUD R ET AL: "Euclidean reconstruction and affine camera calibration using controlled robot motions", INTELLIGENT ROBOTS AND SYSTEMS, 1997. IROS '97., PROCEEDINGS OF THE 1997 IEEE/RSJ INTERNATIONAL CONFERENCE ON GRENOBLE, FRANCE 7-11 SEPT. 1997, NEW YORK, NY, USA,IEEE, US, vol. 3, 7 September 1997 (1997-09-07), pages 1575 - 1582, XP010264851, ISBN: 0-7803-4119-8 *
OKATANI T ET AL: "Robust estimation of camera translation between two images using a camera with a 3D orientation sensor", PATTERN RECOGNITION, 2002. PROCEEDINGS. 16TH INTERNATIONAL CONFERENCE ON QUEBEC CITY, QUE., CANADA 11-15 AUG. 2002, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, vol. 1, 11 August 2002 (2002-08-11), pages 275 - 278, XP010613327, ISBN: 0-7695-1695-X *
PAJDLA T ET AL: "Camera calibration and Euclidean reconstruction from known observer translations", COMPUTER VISION AND PATTERN RECOGNITION, 1998. PROCEEDINGS. 1998 IEEE COMPUTER SOCIETY CONFERENCE ON SANTA BARBARA, CA, USA 23-25 JUNE 1998, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 23 June 1998 (1998-06-23), pages 421 - 426, XP010291638, ISBN: 0-8186-8497-6 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007004938A1 (fr) 2005-07-01 2007-01-11 Telefonaktiebolaget Lm Ericsson (Publ) Interception de services multimedia
WO2008005066A1 (fr) * 2006-06-30 2008-01-10 Microsoft Corporation étalonnage paramétrique pour des systèmes de caméras panoramiques
KR101330373B1 (ko) 2006-06-30 2013-11-15 마이크로소프트 코포레이션 파노라마 카메라 시스템을 위한 파라미터 보정
US9412164B2 (en) 2010-05-25 2016-08-09 Hewlett-Packard Development Company, L.P. Apparatus and methods for imaging system calibration
WO2016001908A1 (fr) * 2014-07-03 2016-01-07 Imagine Mobile Augmented Reality Ltd Réalité augmentée ancrée en trois dimensions

Also Published As

Publication number Publication date
DE10340023B3 (de) 2005-02-24

Similar Documents

Publication Publication Date Title
DE102015011914B4 (de) Konturlinienmessvorrichtung und Robotersystem
DE102018200154A1 (de) Kalibrationsvorrichtung, Kalibrationsverfahren und Programm für einen visuellen Sensor
DE69800084T2 (de) Verfahren zur linearen Bestimmung der drei-dimensionalen Lage mit affiner Kamerakorrektur
DE102006055758B4 (de) Verfahren zur Kalibrierung von Kameras und Projektoren
WO2017206999A1 (fr) Procédé d'évaluation de données image d'une caméra de véhicule
DE102015015194A1 (de) Bildverarbeitungsvorrichtung und -verfahren und Programm
EP2880853B1 (fr) Dispositif et procédé destinés à déterminer la situation d'une caméra de prise de vue
DE102014201271A1 (de) Verfahren und Steuergerät zum Erkennen einer Veränderung eines relativen Gierwinkels innerhalb eines Stereo-Video-Systems für ein Fahrzeug
WO2011023657A1 (fr) Procédé et dispositif pour assembler plusieurs images individuelles numériques en une image d'ensemble
EP2901414B1 (fr) Procédé et installation de traitement d'images pour déterminer des paramètres d'une caméra
DE102012009577A1 (de) Verfahren zur Kalibrierung und Verfahren zur Justierung von Einzelbildkameras einer Kameraanordnung
EP3104330A1 (fr) Procede de suivi d'au moins un objet et procede de remplacement d'au moins un objet par un objet virtuel dans un signal d'image animee enregistre par une camera
EP3867796A1 (fr) Procédé et dispositif de détermination d'une carte des alentours
DE102017126495B4 (de) Kalibrierung eines stationären Kamerasystems zur Positionserfassung eines mobilen Roboters
DE112014006493T5 (de) Bestimmen eines Massstabs dreidimensonaler Informationen
DE10340023B3 (de) Verfahren zur Selbstkalibrierung eines Kamerasystems
DE19953063A1 (de) Verfahren zur dreidimensionalen optischen Vermessung von Objektoberflächen
DE102015220031A1 (de) Verfahren zur Konfidenzabschätzung für optisch-visuelle Posenbestimmung
DE102016109153A1 (de) Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung
DE112014002943T5 (de) Verfahren zur Registrierung von Daten unter Verwendung eines Satzes von Grundelementen
DE102020204677B4 (de) Trackingsystem und Computerprogramm zur Kompensation von Sichtschatten bei der Nachverfolgung von Messobjekten
DE102014219418B4 (de) Verfahren zur Stereorektifizierung von Stereokamerabildern und Fahrerassistenzsystem
DE10115149B4 (de) Reintegration einer digitalen Kamera in ein Bildverarbeitungssystem, welche aus einer kalibrierten Position in eine unkalibrierte Position versetzt worden ist
WO2020229352A1 (fr) Procédé pour fournir une fonction de poursuite d'objet
DE102015209284A1 (de) Verfahren zum Erzeugen einer Ansicht einer Fahrzeugumgebung

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase