
WO2018173243A1 - Stereo camera device - Google Patents

Stereo camera device

Info

Publication number
WO2018173243A1
WO2018173243A1 (PCT/JP2017/011909)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
stereo
camera
video input
image
Prior art date
Application number
PCT/JP2017/011909
Other languages
French (fr)
Japanese (ja)
Inventor
媛 李
三好 雅則
秀行 粂
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 (Hitachi, Ltd.)
Priority to PCT/JP2017/011909 priority Critical patent/WO2018173243A1/en
Publication of WO2018173243A1 publication Critical patent/WO2018173243A1/en


Landscapes

  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The purpose of the present invention is to provide a highly reliable stereo camera device that takes a blind-spot region into consideration. To solve this problem, the stereo camera device comprises: a plurality of real video input units 10 for photographing an object or a region; a camera control unit 12 for changing setting values used when the real video input units 10 photograph; an image acquisition unit 11 for converting electronic signals input from the real video input units 10 into images; a monitoring unit 13 for detecting abnormality and danger on the basis of the image information acquired by the image acquisition unit 11; a stereo control unit 14 for calculating, on the basis of the information detected by the monitoring unit 13, a control value 20 used when the real video input units 10 are made stereo; and a distance calculation unit 15 for calculating the distance from the real video input units 10 to the object or region on the basis of the information acquired by the image acquisition unit 11. The device is characterized in that the camera control unit 12 changes the setting values used when the real video input units 10 photograph on the basis of the control value 20 calculated by the stereo control unit 14.

Description

Stereo camera device
The present invention relates to a stereo camera device having a monocular camera or a plurality of cameras.
Stereo measurement apparatuses that measure an object or an environment using a plurality of cameras have conventionally been in practical use. For example, vehicle-mounted stereo cameras are used to control an automobile through various measurements for purposes such as intruder detection and danger prevention. There are also examples in which, in automatic driving, the vehicle is controlled by measuring the area ahead of and behind it with cameras. In recent years stereo cameras have also been applied in the surveillance and marketing fields. In stereo three-dimensional measurement, two cameras photograph the same object at the same time, distortion correction and rectification are applied to the obtained images, and the difference between the images, i.e. the parallax, is calculated. The shape of a three-dimensional object and the like can then be measured using the calculated parallax. A precondition here is that the measurement areas of the two cameras are substantially the same. In that case the measurement range is fixed, depending on specifications such as the camera parameters, angle of view, and parallax depth. For example, in the case of an in-vehicle stereo camera the measurement range is defined as 2 m to 40 m in front of the camera. In other words, there is a problem that distances of less than 2 m or more than 40 m in front of the camera cannot be measured.
As a means with a high ability to detect the surrounding situation, Patent Document 1 discloses an image recognition apparatus including optical flow means for detecting the moving state of an object, stereo measurement means for detecting the distance of the object, and moving object state detection means for detecting the state of a moving object from the results detected by these means. Patent Document 2 discloses searching for high-precision parallax using images taken from a plurality of different viewpoints, thereby realizing highly accurate three-dimensional measurement even over a wide range.
Patent Document 1: Japanese Patent Laid-Open No. 10-222665. Patent Document 2: Japanese Unexamined Patent Publication No. 2016-65744.
For the blind-spot problem of areas that cannot be measured in stereo, the approach of Patent Document 1 could conceivably be extended by monitoring with a separate sensor or an additional camera. In that case, however, integration with the other sensor and an increase in cost are required. Moreover, since the information from a separate sensor or an additional camera does not include three-dimensional information, it is not easy to maintain monitoring accuracy. In addition, changing the camera setting values for the shooting direction and zoom of the camera based on the detection result is not disclosed.
Patent Document 2 improves the measurement accuracy of three-dimensional measurement within a fixed range, but it is not easy to dynamically adjust the measurement range according to a danger or abnormality that has occurred on site.
Accordingly, an object of the present invention is to provide a highly reliable stereo camera device that takes the blind-spot area into account.
In order to solve the above problems, a stereo camera device according to the present invention comprises: a plurality of real video input units that photograph an object or a region; a camera control unit that changes setting values used when the real video input units photograph; an image acquisition unit that converts electronic signals input from the real video input units into images; a monitoring unit that detects abnormality or danger based on the image information acquired by the image acquisition unit; a stereo control unit that calculates, based on the information detected by the monitoring unit, a control value for making the real video input units stereo; and a distance calculation unit that calculates the distance from the real video input units to the object or region based on the information acquired by the image acquisition unit. The device is characterized in that the camera control unit changes the setting values used when the real video input units photograph based on the control value calculated by the stereo control unit.
According to the present invention, it is possible to provide a highly reliable stereo camera device that takes the blind-spot area into account.
FIG. 1 is a block diagram showing the system configuration of one embodiment of the present invention.
FIG. 2 is a diagram showing a configuration example of the camera control unit.
FIG. 3 is a diagram showing a configuration example of the stereo control unit.
FIG. 4 is a diagram showing another configuration example of the stereo control unit.
FIG. 5 is a diagram showing a configuration example of the distance calculation unit.
FIG. 6 is a diagram showing another configuration example of the distance calculation unit.
FIG. 7 is a diagram showing a second embodiment of the present invention.
FIG. 8 is a diagram showing a third embodiment of the present invention.
FIG. 9 is a diagram showing a fourth embodiment of the present invention.
FIG. 10 is a diagram showing a fifth embodiment of the present invention.
FIG. 11 is a diagram showing a sixth embodiment of the present invention.
Embodiments of the present invention will be described below with reference to the drawings. The following are merely examples and are not intended to limit the embodiments of the present invention to the specific contents described below.
Here, a stereo camera having two cameras is described as an example. FIG. 1 is a diagram showing the configuration of a stereo camera device according to Embodiment 1 of the present invention.
The real video input unit (stereo camera) 10 basically consists of an imaging unit and a lens, and the stereo camera receives external video through the lenses of the left and right cameras. The imaging unit is a mechanism that includes an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) element. The lens is a zoomable lens; by zooming it, not only a nearby area but also a distant (telephoto) area can be imaged. Embodiment 1 does not specify the scene to be photographed, but for a stereo camera used for monitoring, a moving object or a surveillance area can be considered as the photographing target, and for an in-vehicle stereo camera, the road or a vehicle ahead can be considered as the photographing scene. Both complicated and simple scenes can be photographed, and the photographed video is input. The cameras of Embodiment 1 may also be cameras whose focal length, rotation angle, inter-camera distance, and so on can be changed.
The image acquisition unit 11 converts the electronic signals input from the real video input unit 10 into images. Various image formats are possible; the format, such as BMP, JPEG, or PNG, is determined by the system.
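As a minimal illustration of such an image acquisition step, the sketch below grabs one frame with OpenCV and stores it in a format chosen by the file extension. The device index and output path are illustrative, not part of the patent.

```python
import cv2  # OpenCV, used here only for illustration


def acquire_image(device_index: int = 0, out_path: str = "frame.png"):
    """Grab one frame from a camera and store it in the format implied by
    the file extension (PNG here; BMP or JPEG work the same way)."""
    cap = cv2.VideoCapture(device_index)   # open the video input
    ok, frame = cap.read()                 # electronic signal -> image array
    cap.release()
    if not ok:
        raise RuntimeError("no frame received from the camera")
    cv2.imwrite(out_path, frame)           # format chosen by the extension
    return frame
```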
The camera control unit 12 controls the real video input unit 10. It can adjust camera parameters such as the angle of view and orientation of the cameras and the distance between them. FIG. 2 shows a configuration example of the camera control unit 12. The camera control unit 12 consists of a zoom control unit 21, a tilt control unit 22, a pan control unit 23, and a position control unit 24. The tilt control unit 22 controls the angle of view and the horizontal orientation of the camera, and the pan control unit 23 controls the angle of view and the vertical orientation of the camera. The zoom control unit 21 controls the lens mechanism; by moving the lens it changes the focal length and thus the angle of view, which is the imaging range of the camera. The position control unit 24 adjusts the distance between the cameras, which in a stereo camera is called the baseline. The tilt control unit 22 and the pan control unit 23 can change the attitude of the camera by rotating it. These adjustments can be made independently or simultaneously, and can be tuned to the system specifications and the needs of the scene to be photographed.
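As a concrete illustration of these setting values and their independent or simultaneous adjustment, the following is a minimal Python sketch. The class names, field names, and default values are assumptions made for illustration, not the patent's interface.

```python
from dataclasses import dataclass


@dataclass
class CameraSettings:
    """Hypothetical setting values handled by the camera control unit 12."""
    pan_deg: float = 0.0        # pan rotation (pan control unit 23)
    tilt_deg: float = 0.0       # tilt rotation (tilt control unit 22)
    zoom_focal_mm: float = 6.0  # focal length set by the zoom control unit 21
    baseline_mm: float = 120.0  # inter-camera distance (position control unit 24)


class CameraControlUnit:
    """Adjusts one camera's parameters, independently or all at once."""

    def __init__(self, settings: CameraSettings):
        self.settings = settings

    def apply(self, **changes: float) -> CameraSettings:
        for name, value in changes.items():
            if not hasattr(self.settings, name):
                raise ValueError(f"unknown parameter: {name}")
            setattr(self.settings, name, value)
        # a real implementation would now send the values to the camera hardware
        return self.settings


# e.g. simultaneous adjustment of two parameters:
# left_ctrl.apply(pan_deg=12.0, zoom_focal_mm=25.0)
```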
Returning to FIG. 1, Embodiment 1 has the real video input unit 10, the image acquisition unit 11, and the camera control unit 12 described above for each of the right and left cameras. That is, as shown in FIG. 1, it is composed of a right real video input unit, a left real video input unit, a right image acquisition unit, a left image acquisition unit, a right camera control unit, and a left camera control unit.
The monitoring unit 13 monitors and detects abnormalities and dangers using the image information input from the image acquisition unit 11. What constitutes an abnormality or danger differs depending on the shooting scene. If Embodiment 1 is an in-vehicle camera, it corresponds, for example, to a person or vehicle suddenly appearing ahead of the vehicle or to a suspicious dangerous object intruding. Various methods have been studied for intruder detection. For example, with the method of "Matsuyama et al., Background subtraction robust to illumination changes", a background model can be reconstructed and an intruder detected for each monitored scene. In recent years, methods of detecting a specific person or an abnormal object from an image by machine learning have also been proposed. A highly versatile approach is to build a classifier by combining features such as SIFT (Scale-Invariant Feature Transform) or HOG (Histograms of Oriented Gradients) with a classifier such as AdaBoost or an SVM (support vector machine) and to detect abnormal objects with it. More recently, the development of deep learning techniques and their application to anomaly detection have also begun to be studied. Using methods such as these, dangers and abnormalities in the image can be detected and an alarm issued. Besides judging from images, the monitoring unit 13 can also detect abnormalities such as intrusion with a laser sensor.
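As one concrete instance of the HOG-plus-SVM approach mentioned above, the sketch below uses OpenCV's built-in HOG descriptor with its pretrained pedestrian SVM. Treating any detected person as an alarm is a simplification for illustration only.

```python
import cv2

# Built-in HOG descriptor with OpenCV's pretrained pedestrian SVM.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())


def detect_danger(frame):
    """Return bounding boxes of persons found in the frame; a non-empty
    result would be treated as an alarm by the monitoring unit 13."""
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
    return [tuple(box) for box in boxes]
```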
The stereo control unit 14 receives the alarm output from the monitoring unit 13 and calculates, based on the detected abnormality or danger information, a control value for making the cameras stereo. Based on the calculated control value, the camera control unit 12 adjusts the parameters of the right and left cameras, thereby making the cameras stereo.
FIG. 3 shows a configuration example of the stereo control unit 14. The alarm observation unit 31 determines whether there is an alarm 30 from the monitoring unit 13. When an alarm 30 is detected, the alarm observation unit 31 outputs the alarm information to the alarm analysis unit 32. The alarm information may be, for example, distance information detected by a laser sensor used in the monitoring unit 13. The alarm analysis unit 32 identifies the camera to be controlled based on the alarm information. For example, if a detection is received from the right camera, it is judged that the observation area of the right camera needs to be further measured with both cameras in stereo, and it can be decided to control the left camera. The camera current-state acquisition unit 33 then acquires the current setting values of the left and right cameras; the setting values mean the parameter values of the zoom, rotation, and inter-camera distance of the cameras. The control calculation unit 34 calculates the control value 20 for stereoization based on the camera setting values acquired by the camera current-state acquisition unit 33 and the result judged by the alarm analysis unit 32, and based on this value the right and left camera control units control the cameras and make them stereo. Because this operation makes it possible to control the stereo camera intelligently, the angle of view and orientation of the stereo camera can be changed separately, and a stereo camera device that can be made stereo dynamically according to abnormalities and dangers can be provided. Making two different cameras stereo also makes it possible to use camera resources to the maximum, and by making them stereo only when necessary a reduction in camera cost can be expected.
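The flow in FIG. 3 can be illustrated with a minimal sketch, reusing the CameraSettings container from the earlier sketch. The dictionary fields and the simple decision rule (copy the detecting camera's view onto the other camera) are assumptions for illustration, not the patent's algorithm.

```python
def compute_stereo_control(alarm, left_settings, right_settings):
    """Minimal sketch of the stereo control flow in FIG. 3: decide which
    camera to move and return a control value that copies the detecting
    camera's current pan/tilt/zoom onto the other one."""
    if alarm is None:                      # alarm observation unit 31
        return None
    if alarm["source"] == "right":         # alarm analysis unit 32
        target, reference = "left", right_settings
    else:
        target, reference = "right", left_settings
    # control calculation unit 34: steer the target camera to the same view
    return {
        "camera": target,
        "pan_deg": reference.pan_deg,
        "tilt_deg": reference.tilt_deg,
        "zoom_focal_mm": reference.zoom_focal_mm,
    }
```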
FIG. 4 shows another configuration example of the stereo control unit 14. Here the alarm is not based on information detected by a sensor; this is an example in which an alarm is detected using images and the control value 20 is estimated. The alarm information may be, for example, a position on the image obtained from the right or left camera. The alarm observation unit is the same as in FIG. 3. The image analysis unit 40 takes the left and right camera images 41 as input and analyzes the danger information on the images; it can analyze the extent of the region in which the danger has occurred, the place where it has occurred, and so on. For example, when a danger is detected from the image of the right camera, the left camera is controlled so that the left and right cameras become stereo. The control estimation unit 42 changes the parameters of the left camera while comparing the images of the left and right cameras, estimates the control value until the cameras can be made stereo, and outputs the estimated control value.
Returning to FIG. 1, when an abnormality or danger occurs in the area covered by either the left or the right camera, the camera control unit 12 promptly controls the cameras based on the stereoization control value obtained by the stereo control unit 14. By making the pair stereo, the abnormal or dangerous state can be captured in detail as left and right camera images and camera information. The distance calculation unit 15 then calculates the distance using the acquired left and right camera images and camera information.

FIG. 5 shows a configuration example of the distance calculation unit. The camera calibration unit estimates the current camera parameters from the camera setting values 50. Camera parameters are camera information indicating the focal length, orientation, and so on of the camera, and can be broadly divided into internal (intrinsic) parameters and external (extrinsic) parameters:

$$K = \begin{pmatrix} f & s & u_c \\ 0 & a f & v_c \\ 0 & 0 & 1 \end{pmatrix}, \qquad D = \begin{pmatrix} r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & r_{22} & r_{23} & t_y \\ r_{31} & r_{32} & r_{33} & t_z \end{pmatrix}$$

Here K is the internal parameter matrix, f is the focal length, a is the aspect ratio, s is the skew, and (u_c, v_c) is the center of the image coordinates. D is the external parameter matrix; (r11, r12, ..., r33) indicate the orientation of the camera and (tx, ty, tz) the world coordinates of the camera installation position. Using these two parameter matrices K and D and a scale factor λ, the image coordinates (u, v) and the world coordinates (X, Y, Z) are related by

$$\lambda \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K\, D \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}.$$

The camera orientation (r11, r12, ..., r33) of the external parameters, when defined by Euler angles, is expressed by the three installation angles pan θ, tilt φ, and roll ψ. Therefore, the number of camera parameters needed to relate image coordinates to world coordinates is 11 in total: 5 internal parameters and 6 external parameters. The distortion correction unit 54 corrects the distortion of the camera images using the camera parameters, and the parallelization unit 55 rectifies the left and right images. After rectification, the three-dimensional position of a measured point is obtained as

$$Z = \frac{f B}{d}, \qquad X = \frac{B\, x_l}{d}, \qquad Y = \frac{B\, y}{d}, \qquad d = x_l - x_r,$$

where (x_l, y_l) and (x_r, y_r) are the coordinates of the corresponding pixels in the left and right camera images (after rectification y_l = y_r = y), f is the focal length, B is the baseline, i.e. the distance between the cameras, and d is the difference between the projections of the same three-dimensional point onto the two images, called the parallax (disparity). Conversely, the world coordinates and the image coordinates of the rectified pair are related by

$$x_l = \frac{f X}{Z}, \qquad y = \frac{f Y}{Z}, \qquad x_r = \frac{f (X - B)}{Z}.$$
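The triangulation above can be written as a short routine. The following is a minimal sketch, assuming rectified images and pixel coordinates measured from the principal point; the function name and units are illustrative.

```python
import numpy as np


def triangulate(xl, xr, y, f, B):
    """3D position of one rectified correspondence, following the relations
    above: d = xl - xr, Z = f*B/d, X = B*xl/d, Y = B*y/d. Coordinates are in
    pixels relative to the principal point, f is the focal length in pixels,
    and B is the baseline in metres."""
    d = xl - xr                      # disparity
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    Z = f * B / d
    X = B * xl / d
    Y = B * y / d
    return np.array([X, Y, Z])


# e.g. triangulate(xl=130.0, xr=110.0, y=40.0, f=800.0, B=0.12) gives a depth of 4.8 m
```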
FIG. 6 shows another configuration example of the distance calculation unit 15. The camera automatic calibration unit automatically estimates the camera parameters using the input images. Several estimation methods have been developed; one is the automatic calibration method described in "Development of self-calibration technology for variable-parameter stereo cameras, Li, 22nd Image Sensing Symposium, IS1-04". After the parameters are estimated, the distance can be obtained by the same principle as in FIG. 5.
According to Embodiment 1 described above, the parameters of the stereo camera can be adjusted intelligently. In other words, by quickly controlling the cameras and making them stereo, dangers and abnormalities can be detected and the current situation can be measured correctly. With such a stereo camera device, even more secure and safe control becomes possible. For example, when one camera monitors a nearby area and the other monitors a distant area, if an abnormality or danger occurs in either area, the cameras are quickly controlled and made stereo, so that the abnormality or danger can be measured in detail.
An embodiment of the stereo camera device proposed in Embodiment 1 will be described using FIG. 7. In a stereo camera mounted on a conventional automobile 70, when measuring the nearby area the measurement ranges of the left and right cameras are 71 and 72, corresponding to 73 and 74 on the images. When measuring the distant area, the measurement ranges of the left and right cameras are 75 and 76, corresponding to 77 and 78 on the images.
If both the left and right cameras are set to the distant area, a blind-spot area 79 arises in the nearby area, and a danger or abnormality 712 can occur there. In that case the danger cannot be detected and its distance cannot be measured. When actually driving a car, such dangers and abnormalities hinder driving, and the blind-spot area also needs to be measured.
In this embodiment, when the stereo camera device starts up, the left and right cameras initially measure separate areas, the nearby area and the distant area; specifically, the left camera measures the nearby area and the right camera measures the distant area. When the danger 712 is detected by the left camera, stereoization control is performed immediately, and the right camera is controlled so that the area where the danger occurred can be captured. That area is then measured with the left and right cameras in a stereo state. As a result, the blind-spot area in the nearby region of the in-vehicle stereo camera can be eliminated, and the danger can be predicted and measured.
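The start-up split and the switch to stereo on detection described above can be summarized in a short control loop. The sketch below is illustrative only: the camera, monitor, control, and distance-calculation objects and their methods are assumptions, not part of the patent.

```python
def run_stereo_camera_device(left_cam, right_cam, monitor, stereo_ctrl, distance_calc):
    """Illustrative main loop: the left camera starts on the nearby area, the
    right camera on the distant area, and on an alarm both are steered to the
    area where the danger occurred so that it can be measured in stereo."""
    left_cam.point_at("near")    # left camera covers the nearby area
    right_cam.point_at("far")    # right camera covers the distant area
    while True:
        frames = {"left": left_cam.grab(), "right": right_cam.grab()}
        alarm = monitor.check(frames)             # monitoring unit
        if alarm is not None:
            control = stereo_ctrl.compute(alarm)  # stereoization control value
            left_cam.apply(control)               # steer both cameras to the
            right_cam.apply(control)              # area where the danger occurred
            return distance_calc(left_cam.grab(), right_cam.grab())
```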
Another embodiment of the stereo camera device shown in FIG. 7 will be described using FIG. 8. In this embodiment too, when the stereo camera device starts up, the left and right cameras measure separate areas, a nearby area and a distant area; specifically, the left camera measures the nearby area 71 and the right camera measures the distant area 76. The image corresponding to the left camera at that time is 73, and the image corresponding to the right camera is 74. When the right camera detects the danger 80, stereoization control is performed immediately and the left camera is controlled. The area where the danger 80 occurred is then measured with the left and right cameras in a stereo state. As a result, the blind-spot area in the distant region of the in-vehicle stereo camera can be eliminated, and the danger can be predicted and measured.
In this embodiment, when the danger 80 occurs, the dangerous area is in the distance. Within the image area 73 of the camera covering region 71 there is therefore a region 75 that is assumed to be distant, so virtual stereo processing can also be performed using region 75 and region 74. This allows simple stereo three-dimensional measurement without having to control the cameras. In this case, however, the measurement resolution is expected to decrease, so it is desirable to apply a super-resolution technique or the like.
Next, Embodiment 4 will be described using FIG. 9. In Embodiment 4, the functions of the components 10 to 12 and 15 shown in FIG. 1 are the same as in Embodiment 1, so their description is omitted. Embodiment 4 has a monitoring unit 90 that monitors dangers and abnormalities, and a stereo control unit 91 that instructs stereo control of the cameras according to the danger information acquired by the monitoring unit 90. The left and right camera control units 12 are connected to the real video input units 10, and the camera control units 12 adjust the cameras according to the control information from the stereo control unit 91. The image acquisition units 11 convert the electronic signals input from the real video input units 10 into images.
In this embodiment the monitoring unit 90 and the stereo control unit 91 are connected to the camera control units 12, the real video input units 10, and the image acquisition units 11 via the Internet. Thus, for example, when the monitoring unit 90 detects a danger or abnormality, an instruction from the stereo control unit 91 to stereo-control the cameras according to that danger information acts on the camera control units 12 via the Internet line, and camera control is performed. The distance calculation unit 15 calculates the distance using the acquired left and right camera images and camera information. Since the distance calculation unit 15 is connected to the real video input units 10 and the image acquisition units 11 via the Internet line, it can acquire the camera images and camera information needed for distance calculation via the Internet.
By connecting the components of the stereo camera device via an Internet line in this way, the cameras can be controlled at high speed and made stereo even if the components are located remotely. In this embodiment the components are connected to the Internet as described above, but the Internet connection points may be changed as appropriate.
Next, Embodiment 5 will be described using FIG. 10. Embodiment 5 assumes a place where surveillance cameras are installed, such as a railway platform. The stereo camera 100 described in this embodiment monitors either the nearby area or the distant area, with the left and right cameras covering the nearby and distant areas separately as proposed in Embodiments 2 and 3. Here, for example, when a danger is detected in the nearby area, such as a child 101 going beyond the yellow line of the platform, the cameras of the stereo camera 100 can be made stereo toward the nearby area to monitor it, and the danger can be prevented.
Also, in a congested situation 102 where many people have gathered on the platform, making the cameras stereo allows the imaging area of the stereo camera 100 to be controlled and the details of the congested situation 102 to be grasped. This makes it possible to determine, for example, whether it is simply crowded or whether there has been an accident.
Also, when there is a person in a wheelchair 103 on the platform, making the stereo camera 100 stereo toward the distant area so as to observe it allows the system to judge whether the wheelchair user is in trouble, and services can then be provided to the wheelchair user.
By appropriately performing stereoization with a single stereo camera as in this embodiment, the places considered necessary can be imaged, and as a result a wide range can be monitored. The stereo camera device of the present invention can photograph and measure for multiple purposes and can provide an even more secure and safe service.
Next, Embodiment 6 will be described using FIG. 11. Embodiment 6 implements a stereo camera device using commercially available PTZ cameras. A PTZ camera is a camera whose pan, tilt, and zoom can be controlled; its specifications and control methods differ from manufacturer to manufacturer. In this embodiment, two or more PTZ cameras 110 are connected to a camera control unit 113 and an image acquisition unit 114 via a network 112, cables, and the like, and camera control and distance measurement are performed. The camera control unit 113 controls the PTZ cameras 110 and the baseline stage 111. The image acquisition unit 114 acquires images from the PTZ cameras via the network 112. The monitoring unit 116 monitors abnormalities and dangers on the images using the camera images acquired by the image acquisition unit 114. The stereo control unit 115 instructs the camera control unit 113 to make the cameras stereo based on the dangers or abnormalities detected by the monitoring unit 116, and the camera control unit 113 makes the PTZ cameras 110 stereo based on that instruction. The distance calculation unit 15 calculates the distance using the camera images and camera information acquired by the PTZ cameras, as in Embodiment 1.
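For illustration, driving such networked PTZ cameras might look like the sketch below. The HTTP endpoint and parameter names are hypothetical, since each manufacturer exposes its own protocol (for example an ONVIF PTZ service); only the general idea of sending both cameras the same pan/tilt/zoom to make them stereo is taken from the text above.

```python
import requests  # plain HTTP used here purely for illustration


def set_ptz(camera_host: str, pan: float, tilt: float, zoom: float) -> None:
    """Send one absolute PTZ move to a networked camera via a hypothetical
    HTTP interface; a real PTZ camera would be driven through its
    manufacturer's own protocol."""
    resp = requests.get(
        f"http://{camera_host}/ptz/absolute",          # hypothetical endpoint
        params={"pan": pan, "tilt": tilt, "zoom": zoom},
        timeout=2.0,
    )
    resp.raise_for_status()


# Making the pair stereo then amounts to sending both cameras the same view:
# set_ptz("192.0.2.10", 15.0, -5.0, 2.0)
# set_ptz("192.0.2.11", 15.0, -5.0, 2.0)
```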
This embodiment makes it possible to adjust the parameters of commercially available PTZ cameras intelligently, to detect dangers and abnormalities, and to measure the current situation correctly by quickly performing stereoization control. With such a stereo camera device, even more secure and safe control becomes possible. For example, when one PTZ camera monitors a nearby area and the other monitors a distant area, if an abnormality or danger occurs in either area, the cameras are quickly controlled and made stereo, so the abnormality or danger can be measured in detail. Since commercially available PTZ cameras can be used in this embodiment, camera control can be further simplified and development costs can be kept down. Furthermore, by using the network 112 to connect the PTZ cameras to the other components, the measurement range of the cameras can be expanded further and the device can adapt to various environments.
 Although the components are connected via the network as described above in this embodiment, the points at which the network connections are made may be changed as necessary.
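 The following sketch is an illustration only: the publication defines no software API, so every class, function, and parameter name below is a hypothetical stand-in. It mimics how the units of this embodiment (monitoring unit 116, stereo control unit 115, camera control unit 113, image acquisition unit 114, and distance calculation unit 15) could cooperate once a danger is spotted by one of the PTZ cameras.

```python
from dataclasses import dataclass

@dataclass
class PtzSetting:
    pan_deg: float        # horizontal rotation angle
    tilt_deg: float       # vertical rotation angle
    focal_px: float       # zoom expressed as focal length in pixels
    baseline_m: float     # camera spacing requested from the baseline stage 111

def monitoring_unit(frame):
    """Flag a frame as dangerous; a real system would run detection here."""
    return max(max(row) for row in frame) > 200   # toy brightness threshold

def stereo_control_unit(watched_area: PtzSetting) -> PtzSetting:
    """Compute the setting that re-aims the second camera at the watched area
    so the two PTZ cameras form a stereo pair over that area."""
    return watched_area   # real geometry depends on the installation

def camera_control_unit(camera_id: int, setting: PtzSetting) -> None:
    """Send the new setting to a PTZ camera and the baseline stage (stubbed;
    a real implementation would speak the vendor protocol over network 112)."""
    print(f"camera {camera_id} -> {setting}")

def distance_calculation_unit(setting: PtzSetting, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d."""
    return setting.focal_px * setting.baseline_m / disparity_px

# One monitoring cycle: camera 0 watches the near range, camera 1 the far range.
near_frame = [[0, 50], [230, 10]]                      # toy image from camera 0
near_setting = PtzSetting(0.0, -10.0, 1400.0, 0.3)
if monitoring_unit(near_frame):                        # danger in the near range
    camera_control_unit(1, stereo_control_unit(near_setting))
    print(f"{distance_calculation_unit(near_setting, disparity_px=10.5):.1f} m")
```

 In a real deployment the stubbed bodies would be replaced by the vendor's PTZ control protocol and an actual detector, but the ordering of calls — monitor, plan the stereo configuration, re-aim the cameras, then measure — follows the description above.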
10: Real video input unit
11: Image acquisition unit
12: Camera control unit
13: Monitoring unit
14: Stereo control unit
15: Distance calculation unit
20: Control value
21: Zoom control unit
22: Tilt control unit
23: Control pan unit
24: Pan control unit
30: Alarm
31: Alarm observation unit
32: Alarm analysis unit
33: Camera current-state acquisition unit
34: Control calculation unit
35: Right camera installation unit
36: Left camera installation unit
40: Image analysis unit
41: Images from the left and right cameras
42: Control estimation unit
50: Camera setting value
51: Camera calibration unit
52: Camera parameters
53: Image
54: Distortion correction unit
55: Parallelization processing unit
56: Distance calculation unit
60: Image
61: Automatic camera calibration unit
62: Camera parameters
70: Automobile
71, 72, 75, 76: Measurement areas
73, 74, 77, 78: Acquired images
79: Blind spot area
712: Danger or abnormality
80: Danger or abnormality
90: Camera position
91: Monitoring unit
92: Stereo control unit
100: Stereo camera
101: Child
102: Crowd of people
103: Wheelchair
110: PTZ camera
111: Baseline stage
112: Network
113: Camera control unit
114: Image acquisition unit
115: Stereo control unit
116: Monitoring unit

Claims (7)

  1.  A stereo camera device comprising:
      a plurality of real video input units that photograph an object or a region;
      a camera control unit that changes setting values used when the real video input units take images;
      an image acquisition unit that converts electronic signals input from the real video input units into images;
      a monitoring unit that detects an abnormality or a danger on the basis of image information acquired by the image acquisition unit;
      a stereo control unit that calculates, on the basis of information detected by the monitoring unit, control values for placing the real video input units into a stereo configuration; and
      a distance calculation unit that calculates a distance from the real video input units to the object or region on the basis of information acquired by the image acquisition unit,
      wherein the camera control unit changes the setting values used when the real video input units take images on the basis of the control values calculated by the stereo control unit.
  2.  The stereo camera device according to claim 1, wherein the camera control unit is capable of changing any one or more of a focal length, a rotation angle, an inter-camera distance, and a baseline of the real video input units.
  3.  The stereo camera device according to claim 1 or 2, wherein at least one of the plurality of real video input units photographs a region farther away than a nearby region photographed by another of the real video input units.
  4.  The stereo camera device according to claim 3, wherein the real video input unit photographing the distant region is placed into a stereo configuration, over the nearby region, with the real video input unit photographing the nearby region.
  5.  The stereo camera device according to claim 3, wherein the real video input unit photographing the nearby region is placed into a stereo configuration, over the distant region, with the real video input unit photographing the distant region.
  6.  The stereo camera device according to any one of claims 1 to 5, wherein the monitoring unit and the stereo control unit are connected to the real video input units and the image acquisition unit via the Internet.
  7.  The stereo camera device according to any one of claims 1 to 6, wherein the real video input units are PTZ cameras.
PCT/JP2017/011909 2017-03-24 2017-03-24 Stereo camera device WO2018173243A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/011909 WO2018173243A1 (en) 2017-03-24 2017-03-24 Stereo camera device

Publications (1)

Publication Number Publication Date
WO2018173243A1 (en) 2018-09-27

Family

ID=63584296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/011909 WO2018173243A1 (en) 2017-03-24 2017-03-24 Stereo camera device

Country Status (1)

Country Link
WO (1) WO2018173243A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003284053A (en) * 2002-03-27 2003-10-03 Minolta Co Ltd Monitoring camera system and monitoring camera control device
JP2004364212A (en) * 2003-06-09 2004-12-24 Fujitsu Ltd Object photographing apparatus, object photographing method, and object photographing program
JP2005176143A (en) * 2003-12-12 2005-06-30 Sony Corp Monitoring apparatus
JP2007235655A (en) * 2006-03-02 2007-09-13 Victor Co Of Japan Ltd Video imaging control system

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17902516; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17902516; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)