
US20160307311A1 - Image measurement apparatus, image measurement method, information processing apparatus, information processing method, and program - Google Patents


Info

Publication number
US20160307311A1
Authority
US
United States
Prior art keywords
image
image capturing
captured
measurement
captured image
Prior art date
Legal status
Granted
Application number
US15/096,623
Other versions
US10001368B2 (en)
Inventor
Shohei Udo
Current Assignee
Mitutoyo Corp
Original Assignee
Mitutoyo Corp
Priority date
Filing date
Publication date
Application filed by Mitutoyo Corp filed Critical Mitutoyo Corp
Assigned to MITUTOYO CORPORATION (assignment of assignors interest; see document for details). Assignors: UDO, SHOHEI
Publication of US20160307311A1
Application granted
Publication of US10001368B2
Status: Active
Adjusted expiration


Classifications

    • G06T7/004
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates: coordinate measuring machines
    • G01B21/045 Correction of measurements

Definitions

  • the present invention relates to an image measurement apparatus, an image measurement method, an information processing apparatus, an information processing method, and a program.
  • Patent Document 1 discloses a shape measurement apparatus including a shape measurement device having an image capturing unit that image-captures a stage and a work on the stage, and a computer that measures a shape of the work from the image captured by the image capturing unit (see paragraph [0009] etc. of the specification of Patent Document 1).
  • vibration of the shape measurement device is detected while the shape of the work is measured with the image capturing unit moving.
  • a moving amount of the image capturing unit is controlled such that the detected vibration is canceled. In this manner, deviation of the image capturing unit relative to the stage caused by the vibration is suppressed, and measurement accuracy is improved (see paragraph [0019] etc. of the specification of Patent Document 1).
  • in an image measurement apparatus that measures a shape or the like from the image captured by the image capturing unit, a technology that improves measurement accuracy is needed.
  • the present invention aims at providing an image measurement apparatus, an image measurement method, an information processing apparatus, an information processing method, and a program capable of measuring an object to be measured with high accuracy.
  • an image measurement apparatus includes an image capturing part, a movement mechanism and a calculation part.
  • the image capturing part can image-capture an object.
  • the movement mechanism can change an image capturing position of the image capturing part relative to the object.
  • the calculation part can calculate a correction value from a first captured image group acquired by placing the image capturing part static at each of a plurality of image capturing positions and a second captured image group acquired by relatively moving the image capturing part so as to pass each of a plurality of the image capturing positions.
  • the first captured image group and the second captured image group are captured image groups of images captured at a plurality of the predetermined image capturing positions by the image capturing part.
  • a correction value is calculated from the first captured image group acquired by image-capturing the object with the image capturing part placed static at each of a plurality of image capturing positions, and a second captured image group acquired by image-capturing the object with the image capturing part relatively moving.
  • the object may be an object to be measured.
  • the image measurement apparatus may further comprise a correction part that corrects a measurement result based on the second captured image group from the correction value calculated.
  • the image measurement apparatus may further comprise a memory part that stores the correction value calculated.
  • the correction part may correct the measurement result from the correction value stored.
  • an image measurement method includes acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions.
  • a second captured image group is acquired by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions.
  • Acquiring the first captured image group may be executed on the object to be measured.
  • acquiring the second captured image group may be executed on each of the object to be measured and a plurality of other objects to be measured of the same type.
  • the image measurement method may include correcting a measurement result of each of the plurality of other objects to be measured, based on the second captured image group, from the correction value calculated for the object to be measured.
  • an information processing apparatus includes a movement control part and a calculation part.
  • the movement control part can control an image capturing position of an image capturing part that image-captures an object.
  • an information processing method executed by a computer includes acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions.
  • a second captured image group is acquired by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions.
  • FIG. 1 is a schematic diagram of an image measurement apparatus according to an embodiment of the present invention;
  • FIG. 2 is a functional block diagram showing a configuration example of a measurement control part shown in FIG. 1;
  • FIGS. 3A and 3B are diagrams showing an image captured in a state where an image capturing unit is static at an image capturing position;
  • FIGS. 4A and 4B are diagrams showing an image captured in a state where an image capturing unit is accelerated at an image capturing position;
  • FIGS. 5A and 5B are diagrams for illustrating the timing of image acquisition during movement;
  • FIG. 6 is a flow chart showing an operation example of static measurement;
  • FIG. 7 is a flow chart showing an operation example of movement measurement;
  • FIG. 8 is a flow chart showing an example of calculating a correction value by an image coordinate finite difference calculation part;
  • FIG. 9 is a flow chart showing an operation example of movement measurement using a correction value.
  • FIG. 1 is a schematic diagram of an image measurement apparatus according to an embodiment of the present invention.
  • the image measurement apparatus 100 has a non-contact type image measurement device 10 and a PC (Personal Computer) 20 as an information processing apparatus.
  • the image measurement device 10 includes a stage 11 , a movement mechanism 12 , and an image capturing unit (image capturing part) 13 .
  • a work W that is an object to be measured is placed at a predetermined position of the stage 11 .
  • the work W also corresponds to the object image-captured by the image capturing unit 13.
  • the movement mechanism 12 can change an image capturing position of the image capturing unit 13 relative to the work W in the three-dimensional directions x, y, and z.
  • the image capturing position is a relative position of the image capturing unit 13 to the work W when an image is captured. Accordingly, by relatively moving the image capturing unit 13 and the work W, it is possible to change the image capturing position.
  • the movement mechanism 12 includes an x movement mechanism 14 , a y movement mechanism 15 , and a z movement mechanism 16 .
  • the z movement mechanism 16 moves the image capturing unit 13 along a z direction.
  • the x movement mechanism 14 moves the image capturing unit 13 and the z movement mechanism 16 integrally along an x direction.
  • the y movement mechanism 15 moves the stage 11 along a y direction.
  • a specific configuration of each movement mechanism is not limited, and may be designed freely.
  • an axis displacement sensor 17 (for example, a linear scale) is placed at each of the x, y, and z movement mechanisms. From detection values of an x axis displacement sensor 17x and a z axis displacement sensor 17z, the x and z coordinates of the image capturing unit 13 are calculated. In addition, from a detection value of a y axis displacement sensor 17y, the y coordinate of the stage 11 is calculated.
  • in the image capturing unit 13, a digital camera including a video camera having an objective lens 18 (see FIG. 3A) and an image-capturing device (not shown) is mounted.
  • Light reflected by the work W is incident on the image-capturing device via an objective lens 18 , thereby generating a digital image of the work W.
  • as the image-capturing device, a CMOS (Complementary Metal-Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor is used, for example.
  • the PC 20 is connected to the image measurement device 10 by any connection form.
  • the PC 20 has hardware necessary for the configuration of the computer such as a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a HDD (Hard Disk Drive) (all are not shown) or the like.
  • in the ROM and the HDD, programs executed by the CPU and a variety of data such as shape data are stored. Also, the RAM is used by the CPU as a temporary work area and as an area for temporarily saving data.
  • Information processing by the PC 20 is realized by a cooperation of software stored in the ROM or the like and hardware resources in the PC 20 .
  • a measurement control part 21 is configured as a functional block. Note that dedicated hardware may be used to constitute the measurement control part 21 .
  • the program is installed to the PC 20 via a variety of recording media, for example.
  • the program may be installed on the PC 20 via the Internet, or the like.
  • FIG. 2 is a functional block diagram showing a configuration example of the measurement control part 21 .
  • the measurement control part 21 includes an image acquisition coordinate memory part 22 , an image acquisition coordinate collation part 23 , an image acquisition signal output part 24 , a camera image acquisition part 25 , a coordinate detecting part 26 , and an axis movement control part 27 .
  • the measurement control part 21 includes a coordinate memory part 28 , a coordinate and image memory part 29 , an image coordinate detecting part 30 , a static measurement image coordinate memory part 31 , a movement measurement image coordinate memory part 32 , an image coordinate finite difference calculation part 33 , and an image coordinate finite difference memory part 34 .
  • the measurement control part 21 includes an image coordinate correction part 35 , and a detection coordinate output part 36 .
  • the image acquisition coordinate memory part 22 stores a coordinate value at an image capturing position (hereinafter referred to as “image capturing position coordinate”).
  • as the image capturing position coordinate, the x and z coordinates of the image capturing unit 13 at which an image is captured, and the y coordinate of the stage 11, are stored in advance.
  • the coordinate detecting part 26 detects a coordinate at a current measurement position (hereinafter referred to as “measurement position coordinate”) from the detection value by each of the xyz axis displacement sensors 17 .
  • the measurement position coordinate includes the current x and z coordinates of the image capturing unit 13 and the current y coordinate of the stage 11.
  • the image acquisition coordinate collation part 23 collates the measurement position coordinate detected by the coordinate detecting part 26 with the image capturing position coordinate stored in the image acquisition coordinate memory part 22. When both coordinates match, the image acquisition coordinate collation part 23 instructs the image acquisition signal output part 24 to output an image acquisition signal.
  • the image acquisition signal output part 24 outputs the image acquisition signal to the digital camera of the image capturing unit 13. In response to the signal, an image is captured by the image capturing unit 13.
  • the camera image acquisition part 25 acquires an image captured by the image capturing unit 13 .
  • the axis movement control part 27 controls the movement mechanism 12 , and moves the image capturing unit 13 and the stage 11 .
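The collation processing described above can be illustrated with a short sketch. The following Python is purely illustrative and not part of the embodiment: the function name, the tuple representation of coordinates, and the matching tolerance are all assumptions of this sketch (the specification only states that the two coordinates are "matched").

```python
# Hypothetical sketch of the image acquisition coordinate collation part:
# the current measurement position coordinate is compared against the
# stored image capturing position coordinates, and a match triggers the
# image acquisition signal output part.
TOLERANCE = 1e-6  # assumed matching tolerance; not specified in the text


def collate(measurement_pos, capture_positions, tolerance=TOLERANCE):
    """Return the index of the matching image capturing position, or None."""
    for n, target in enumerate(capture_positions):
        if all(abs(m - t) <= tolerance for m, t in zip(measurement_pos, target)):
            return n  # would instruct the image acquisition signal output part
    return None
```

In the apparatus, this comparison runs continuously against the detection values of the xyz axis displacement sensors 17.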
  • the coordinate memory part 28 stores the measurement position coordinate when the measurement position coordinate detected by the coordinate detecting part 26 matches the image capturing position coordinate.
  • the coordinate and image memory part 29 stores the image captured by the image capturing unit 13 , and the measurement position coordinate when the image is captured (i.e., the image capturing position coordinate).
  • the image coordinate detecting part 30 detects a coordinate relating to an external shape and feature points of the work W in the three-dimensional directions x, y, and z (hereinafter referred to as "measurement coordinate") from the captured image and the measurement position coordinate stored in the coordinate and image memory part 29.
  • the measurement coordinates at respective points of the work W may be detected.
  • the image coordinate correction part 35 corrects the measurement coordinate detected by the image coordinate detecting part 30 .
  • the image is captured by relatively moving the image capturing unit 13 so as to pass each of a plurality of the capturing positions. From a group of the captured images, the measurement coordinate of the work W is detected. The image coordinate correction part 35 corrects the measurement coordinate detected.
  • the detection coordinate output part 36 outputs the measurement coordinate corrected by the image coordinate correction part 35 .
  • the static measurement image coordinate memory part 31, the movement measurement image coordinate memory part 32, the image coordinate finite difference calculation part 33 and the image coordinate finite difference memory part 34 are blocks for calculating the correction value used for the correction by the image coordinate correction part 35. Each of these blocks will be described in detail later.
  • the measurement is possible by relatively moving the image capturing unit 13 so as to pass a plurality of the capturing positions without stopping the image capturing unit 13 .
  • the image capturing position is automatically changed (the image capturing unit 13 is relatively moved) along a predetermined route based on a part program stored on a ROM.
  • the measurement is referred to as “movement measurement”.
  • FIGS. 3A, 3B, 4A, 4B, 5A and 5B are diagrams for illustrating possible problems occurring in the movement measurement.
  • FIGS. 3A, 4A and 5A are front views of the image measurement device 10 viewed from the y direction;
  • FIGS. 3B, 4B and 5B show images captured by the image capturing unit 13.
  • FIGS. 3A and 3B each is a diagram showing that an image is captured in a state that the image capturing unit 13 is static at an image capturing position P.
  • after the image capturing unit 13 and the stage 11 are moved to the image capturing position coordinate, the work W directly below is image-captured.
  • as the work W, a plate-like member having a white area 41 and a black area 42 is used.
  • the image capturing position P is set such that a boundary 43 between the white area 41 and the black area 42 overlaps a center point 55 of the captured image 50.
  • FIGS. 4A and 4B each is a diagram showing that an image is captured in a state that the image capturing unit 13 is accelerated at the image capturing position P.
  • if the image capturing unit 13 is accelerated, tilt or distortion may occur due to inertia caused by the weights of the moving components, as shown in FIG. 4A. If the stage 11 is accelerated, the work W may be distorted.
  • as a result, the boundary 43 of the work W is undesirably deviated from the center point 55 of the captured image 50.
  • if the measurement coordinate is calculated from the captured image 50, this deviation of the image produces an error corresponding to the amount of the deviation, as compared with the measurement coordinate calculated from the captured image 50 in the static measurement shown in FIG. 3B.
  • the deviation of the captured image 50 may also be generated during deceleration, or by a centrifugal force caused by a simultaneous arc movement of two or more axes.
  • FIGS. 5A and 5B each is a diagram for illustrating a timing of an image acquisition during the movement.
  • the measurement position coordinate is collated with the capturing position coordinate by the image acquisition coordinate collation part 23 shown in FIG. 2 .
  • some time is necessary until the image acquisition signal is output from the image acquisition signal output part 24 to the image capturing unit 13 after the two coordinates match.
  • as a result, the image is captured by the image capturing unit 13 at a timing delayed from when the image capturing unit 13 reaches the image capturing position P (a delayed image capturing position P 1 in FIG. 5A).
  • the boundary 43 of the work W is therefore undesirably deviated from the center point 55 of the captured image 50, as shown in FIG. 5B. If the measurement coordinate is calculated from the captured image 50, the deviation of the image produces an error corresponding to the amount of the deviation.
  • the present inventor focused on the following: if the movement measurement is executed along a predetermined route by the same part program, for example, the distortion of the movement mechanism 12 and the work W and the deviation of the timing of the image acquisition caused by the acceleration, the deceleration, the centrifugal force or the like at each image capturing position P are generated under almost the same conditions each time. In other words, the deviation amount in the captured image 50 at each image capturing position is almost the same each time.
  • in view of this, the image capturing unit 13 is first made static at each of the plurality of image capturing positions P to execute measurement (hereinafter referred to as "static measurement").
  • the measurement result is, for example, a measurement coordinate detected from each captured image 50. A finite difference between the measurement result of the static measurement and the measurement result of the movement measurement is calculated as a correction value. Specific operation examples will be described below.
  • FIG. 6 is a flow chart showing an operation example of the static measurement.
  • the image capturing position coordinate stored on the image acquisition coordinate memory part 22 is read out (Step 101 ).
  • the axis movement control part 27 moves the image capturing unit 13 and the stage 11 to the image capturing position coordinate read out (Step 102 ).
  • the image acquisition coordinate collation part 23 determines whether or not the measurement position coordinate matches the image capturing position coordinate (Step 103). If the coordinates match (Yes in Step 103), the axis movement control part 27 is instructed to stop the movement (Step 104).
  • the axis movement control part 27 determines whether or not the state is static (Step 105). If it is identified that the state is static (Yes), a signal indicating the identification is output to the image acquisition coordinate collation part 23.
  • the image acquisition coordinate collation part 23 receives the signal and instructs the image acquisition signal output part 24 to output an image acquisition signal. In this manner, the work W is image-captured by the image capturing unit 13 to acquire the captured image (Step 106).
  • the coordinate and image memory part 29 stores the captured image and the measurement position coordinate (image capturing position coordinate) associated with each other (Step 107 ).
  • the image coordinate detecting part 30 detects a measurement coordinate (X, Y, Z) in each point of the work W from the captured image and the measurement position coordinate stored (Step 108 ).
  • the measurement coordinate (X, Y, Z) detected is stored on the static measurement image coordinate memory part 31 shown in FIG. 2 as the measurement coordinate at the static measurement (Step 109 ).
  • the processing described above is executed to all of a plurality of the predetermined image capturing positions (Step 110 ).
  • the static measurement image coordinate memory part 31 stores the measurement coordinate (X S1 , Y S1 , Z S1 )-(X Sn , Y Sn , Z Sn ) in all image capturing positions.
  • the suffix “n” is a sequence number of the image capturing position coordinates.
  • a group of the images captured at the respective image capturing positions corresponds to a first captured image group according to this embodiment. Accordingly, the measurement coordinate (X S1 , Y S1 , Z S1 )-(X Sn , Y Sn , Z Sn ) in all image capturing positions corresponds to measurement results based on the first captured image group.
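The static-measurement flow of FIG. 6 (Steps 101-110) can be sketched as follows. This is an illustrative Python sketch only; the callables passed in are hypothetical stand-ins for the hardware and the image coordinate detecting part, and none of the names come from the specification.

```python
def static_measurement(capture_positions, move_to, wait_until_static,
                       capture_image, detect_coordinate):
    """Illustrative sketch of the static measurement of FIG. 6.

    capture_positions : image capturing position coordinates read out (Step 101)
    move_to           : moves unit and stage to a coordinate (Step 102)
    wait_until_static : stops movement and confirms a static state (Steps 104-105)
    capture_image     : acquires the captured image (Step 106)
    detect_coordinate : detects the measurement coordinate (Steps 108-109)
    """
    static_results = []  # (X_Sn, Y_Sn, Z_Sn) per image capturing position
    for pos in capture_positions:
        move_to(pos)
        wait_until_static()
        image = capture_image()
        static_results.append(detect_coordinate(image, pos))
    return static_results  # measurement results based on the first captured image group
```

The returned list corresponds to the measurement coordinates stored in the static measurement image coordinate memory part 31.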
  • FIG. 7 is a flow chart showing an operation example of the movement measurement.
  • the movement measurement is executed in a state that the same work W is mounted on the stage 11 without change after the static measurement is executed.
  • in Steps 201 to 203, the image capturing position coordinate read out is collated with the measurement position coordinate.
  • the image capturing position coordinate read out is the same as the image capturing position coordinate in the static measurement.
  • the image acquisition signal output part 24 outputs the image acquisition signal to the image capturing unit 13 to acquire the captured image (Step 204 ).
  • the captured image acquired and the measurement position coordinate are stored (Step 205 ).
  • the measurement coordinate (X, Y, Z) at each point of the work W is detected (Step 206 ).
  • the measurement coordinate (X, Y, Z) detected is stored on the movement measurement image coordinate memory part 32 shown in FIG. 2 as the measurement coordinate at the movement (Step 207 ).
  • the above-described processing is executed to all of a plurality of the image capturing positions (Step 208 ).
  • the movement measurement image coordinate memory part 32 stores the measurement coordinate (X M1 , Y M1 , Z M1 )-(X Mn , Y Mn , Z Mn ) in all image capturing positions.
  • the suffix “n” is a sequence number of the measurement result at the static measurement. In other words, the respective measurement results (X Sn , Y Sn , Z Sn ) and (X Mn , Y Mn , Z Mn ) having the same number are measured at the same image capturing position.
  • the group of the images captured at the respective image capturing positions corresponds to a second captured image group according to this embodiment. Accordingly, the measurement coordinate (X M1 , Y M1 , Z M1 )-(X Mn , Y Mn , Z Mn ) in all image capturing positions corresponds to the measurement result based on the second captured image group.
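The movement-measurement flow of FIG. 7 (Steps 201-208) differs from the static measurement in that images are acquired without stopping the unit. The sketch below is illustrative only: the busy-wait loop, the callables, and the tolerance are assumptions of this sketch, not details from the specification.

```python
def movement_measurement(capture_positions, current_position, capture_image,
                         detect_coordinate, tolerance=1e-6):
    """Illustrative sketch of the movement measurement of FIG. 7.

    The unit keeps moving; each time the current measurement position
    coordinate matches a stored image capturing position coordinate
    (Steps 201-203), an image is acquired on the fly (Step 204) and the
    measurement coordinate is detected and stored (Steps 205-207).
    """
    movement_results = []  # (X_Mn, Y_Mn, Z_Mn) per image capturing position
    for target in capture_positions:
        # wait, while moving, until the collation part reports a match
        while any(abs(c - t) > tolerance
                  for c, t in zip(current_position(), target)):
            pass
        image = capture_image()  # acquired without stopping the movement
        movement_results.append(detect_coordinate(image, target))
    return movement_results  # measurement results based on the second captured image group
```

Because the unit never stops, these results include the distortion and signal-delay deviations discussed above.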
  • FIG. 8 is a flow chart showing an example of calculating a correction value executed by the image coordinate finite difference calculation part 33 shown in FIG. 2 .
  • a measurement coordinate at the static measurement (X Sn , Y Sn , Z Sn ) is read out (Step 301 ).
  • a measurement coordinate at the movement measurement (X Mn , Y Mn , Z Mn ) is read out (Step 302 ).
  • a finite difference ( ΔX n , ΔY n , ΔZ n ) between the measurement coordinate at the static measurement and the measurement coordinate at the movement measurement is calculated (Step 303).
  • the finite difference ( ΔX n , ΔY n , ΔZ n ) calculated is stored in the image coordinate finite difference memory part 34 shown in FIG. 2 (Step 304).
  • the processing is repeated to the final image capturing position coordinate (Step 305 ).
  • the finite difference ( ΔX n , ΔY n , ΔZ n ) corresponds to the correction value calculated from the first captured image group and the second captured image group in this embodiment.
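The correction-value calculation of FIG. 8 (Steps 301-305) can be sketched as below. Note the sign convention (static minus movement) is an assumption of this sketch, chosen so that the difference can later be applied as an additive correction; the specification itself only calls the value a finite difference.

```python
def correction_values(static_coords, movement_coords):
    """Illustrative sketch of FIG. 8: for each image capturing position n,
    the finite difference (dX_n, dY_n, dZ_n) between the static-measurement
    coordinate (X_Sn, Y_Sn, Z_Sn) and the movement-measurement coordinate
    (X_Mn, Y_Mn, Z_Mn), assumed here as static minus movement.
    """
    return [tuple(s - m for s, m in zip(stat, mov))
            for stat, mov in zip(static_coords, movement_coords)]
```

The list returned corresponds to the values stored in the image coordinate finite difference memory part 34.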
  • FIG. 9 is a flow chart showing an operation example of the movement measurement using the correction value.
  • a work W of the same type, having the same shape as the work W used for calculating the correction value, is mounted.
  • in Step 401 to Step 406, the same processing as the movement measurement shown in FIG. 7 is executed.
  • the measurement coordinate (X n , Y n , Z n ) calculated in Step 406 for each image capturing position coordinate is output to the image coordinate correction part 35 shown in FIG. 2.
  • the image coordinate correction part 35 reads out the finite difference ( ΔX n , ΔY n , ΔZ n ) having the same number n as the accepted measurement coordinate (X n , Y n , Z n ) from the image coordinate finite difference memory part 34 (Step 407). Then, the correction coordinate (X Cn , Y Cn , Z Cn ) is calculated from the measurement coordinate and the finite difference (Step 408).
  • the suffix "n" corresponds to the number of the measurement results in the static measurement and the movement measurement.
  • the calculated correction coordinate (X Cn , Y Cn , Z Cn ) is output from the detection coordinate output part 36 (Step 409).
  • the above-described processing is repeated to the final image capturing position (Step 410 ).
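The correction step of FIG. 9 (Steps 407-409) can be sketched as follows. The specification refers to correction equations that are not reproduced in this text, so the additive form below is an assumption of this sketch, consistent with a finite difference defined as static minus movement: adding the difference to a movement-measurement coordinate then yields a static-equivalent result.

```python
def corrected_coordinates(movement_coords, deltas):
    """Illustrative sketch of Steps 407-408 of FIG. 9: each measurement
    coordinate (X_n, Y_n, Z_n) from the movement measurement is corrected
    by the stored finite difference with the same number n, giving the
    correction coordinate (X_Cn, Y_Cn, Z_Cn).

    Assumed form: X_Cn = X_n + dX_n (and likewise for Y, Z).
    """
    return [tuple(x + d for x, d in zip(coord, delta))
            for coord, delta in zip(movement_coords, deltas)]
```

Under this assumed convention, correcting the representative work's own movement-measurement coordinates with its deltas reproduces its static-measurement coordinates.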
  • the finite difference ( ΔX n , ΔY n , ΔZ n ) is calculated from the first captured image group acquired by image-capturing with the image capturing unit 13 placed static at each of the plurality of predetermined image capturing positions, and the second captured image group acquired by image-capturing with the image capturing unit 13 relatively moving.
  • by correcting with the finite difference ( ΔX n , ΔY n , ΔZ n ), the influence of the distortion of the movement mechanism 12 and the work W and of the deviation of the timing of the image acquisition caused by the acceleration, the deceleration, the centrifugal force or the like at each image capturing position is eliminated, and a measurement result almost equivalent to that obtained in the static state can be acquired. That is to say, the work W can be measured with high precision while relatively moving the image capturing unit 13.
  • Rigidity of the members constituting the movement mechanism 12 and the like need not be increased to the utmost limit in order to prevent the deviation of the image in the movement measurement, thereby reducing costs.
  • any one of a plurality of the works W to be measured is selected as a representative one, and the correction value is calculated from the static measurement and the movement measurement.
  • the correction value is stored. When another work W of the same type is subjected to the movement measurement, the correction is performed using the stored correction value. In this manner, a plurality of works W of the same type can be efficiently measured with high precision.
  • the correction value is easily calculated, thereby executing the movement measurement with high precision without complex processing.
  • the work to be measured is used to calculate the correction value.
  • an object for proof is used to calculate the correction value.
  • the object for proof is not limited, but includes a plate-like member described as the work W in FIG. 3A , for example.
  • the object for proof may be used as the object according to the embodiment.
  • the object for proof is used to calculate and store the correction value.
  • the movement measurement is executed using the correction value. In this manner, movement measurement with high precision is easily possible. It should be appreciated that a user can select whether to use the correction value calculated using the object for proof or to calculate the correction value using the work on site.
  • in the embodiment described above, the image capturing unit is moved in the x and z directions in order to change the image capturing position of the image capturing unit, and the stage is moved in the y direction.
  • the image capturing unit at an image capturing side may be moved in three, i.e., xyz, directions, or the stage at an object to be measured side may be moved in three, i.e., xyz, directions.
  • those at the image capturing side and the object to be measured side may be moved in the same direction.
  • the movement for changing the image capturing position of the image capturing unit corresponds to the relative movement of the image capturing unit.
  • the image measurement device and the PC are constituted separately.
  • the image measurement device and the PC may be integrally constituted to achieve the image measurement apparatus according to the present invention. That is to say, an information processing unit including a CPU and the like may be provided in the image measurement device, and the information processing unit may constitute measurement control part.
  • The type of image measurement apparatus to which the image measurement method according to the present invention is applied is not limited.
  • The present invention is applicable to any apparatus that executes measurement and observation using an object image acquired by image-capturing the work. Examples include a CNC image measurement device, a CNC three-dimensional measurement device and a hardness tester. The present invention is also applicable to a digital microscope that image-captures, with a digital camera, an enlarged image provided by an optical microscope.


Abstract

Provided is an image measurement apparatus including an image capturing part that image-captures an object; a movement mechanism that changes an image capturing position of the image capturing part relative to the object; and a calculation part that calculates a correction value from a first captured image group acquired by placing the image capturing part static at each of a plurality of image capturing positions and a second captured image group acquired by relatively moving the image capturing part so as to pass each of the plurality of image capturing positions. The first and second captured image groups are groups of images captured by the image capturing part at the plurality of predetermined image capturing positions.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of Japanese Priority Patent Application JP 2015-082239 filed Apr. 14, 2015, the entire contents of which are incorporated herein by reference.
  • BACKGROUND
  • The present invention relates to an image measurement apparatus, an image measurement method, an information processing apparatus, an information processing method, and a program.
  • In the related art, an image measurement apparatus that can measure a shape etc. from a captured image of an object to be measured is known. For example, Japanese Patent Application Laid-open No. 2014-228529 (hereinafter, referred to as “Patent Document 1”) discloses a shape measurement apparatus including a shape measurement device having an image capturing unit that image-captures a stage and a work on the stage, and a computer that measures a shape of the work from the image captured by the image capturing unit (see paragraph [0009] etc. in the specification of Patent Document 1).
  • In the shape measurement apparatus, vibration of the shape measurement device is detected while the shape of the work is measured with the image capturing unit moving. A moving amount of the image capturing unit is controlled such that the detected vibration is canceled. In this manner, deviation of the image capturing unit relative to the stage caused by the vibration is inhibited, and measurement accuracy is improved (see paragraph [0019] etc. in the specification of Patent Document 1).
  • SUMMARY
  • As described above, in an image measurement apparatus that measures a shape, etc. from the image captured by the image capturing unit, there is a need for a technology that improves measurement accuracy.
  • In view of the circumstances as described above, the present invention aims at providing an image measurement apparatus, an image measurement method, an information processing apparatus, an information processing method, and a program capable of measuring an object to be measured with high accuracy.
  • To attain the object described above, according to an embodiment of the present invention, there is provided an image measurement apparatus including an image capturing part, a movement mechanism and a calculation part.
  • The image capturing part can image-capture an object.
  • The movement mechanism can change an image capturing position of the image capturing part relative to the object.
  • The calculation part can calculate a correction value from a first captured image group acquired by placing the image capturing part static at each of a plurality of image capturing positions and a second captured image group acquired by relatively moving the image capturing part so as to pass each of a plurality of the image capturing positions. The first captured image group and the second captured image group are captured image groups of images captured at a plurality of the predetermined image capturing positions by the image capturing part.
  • In the image measurement apparatus, a correction value is calculated from the first captured image group acquired by image-capturing the object with the image capturing part placed static at each of a plurality of image capturing positions, and the second captured image group acquired by image-capturing the object with the image capturing part relatively moving. By using the correction value, it becomes possible to measure the object to be measured with high accuracy while the image capturing unit is moved.
  • The object may be an object to be measured. In this case, the image measurement apparatus may further comprise a correction part that corrects a measurement result based on the second captured image group from the correction value calculated.
  • In this way, it is possible to measure the object to be measured with high accuracy.
  • The image measurement apparatus may further comprise a memory part that stores the correction value calculated.
  • In this case, the correction part may correct the measurement result from the correction value stored.
  • In this way, it is possible to measure the object to be measured with high accuracy.
  • According to an embodiment of the present invention, there is provided an image measurement method including acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions.
  • A second captured image group is acquired by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions.
  • From the first and second captured image groups acquired, a correction value is calculated.
  • By using the correction value calculated, it becomes possible to measure the object to be measured with high accuracy while the image capturing unit is moved.
  • Acquiring the first captured image group may be executed on the object to be measured. In this case, acquiring the second captured image group may be executed on each of the object to be measured and a plurality of other objects to be measured of the same type. The image measurement method may include correcting a measurement result of each of the plurality of other objects to be measured based on the second captured image group, from the correction value calculated for the object to be measured.
  • In this manner, it is possible to efficiently measure the other objects to be measured of the same type.
  • According to an embodiment of the present invention, there is provided an information processing apparatus includes a movement control part and a calculation part.
  • The movement control part can control an image capturing position of an image capturing part that image-captures an object.
  • According to an embodiment of the present invention, there is provided an information processing method executed by a computer, including acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions.
  • A second captured image group is acquired by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions.
  • From the first and second captured image groups acquired, a correction value is calculated.
  • A program according to an embodiment of the present invention causes a computer to execute the information processing method.
  • As described above, according to the present invention, it becomes possible to measure the object to be measured with high accuracy. It should be noted that the effects described herein are not necessarily limited, and any of the effects described in the present disclosure may be obtained.
  • These and other objects, features and advantages of the present invention will become more apparent in light of the following detailed description of best mode embodiments thereof, as illustrated in the accompanying drawings.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a schematic diagram of an image measurement apparatus according to an embodiment of the present invention;
  • FIG. 2 is a functional block diagram showing a configuration example of a measurement control part shown in FIG. 1;
  • FIGS. 3A and 3B are diagrams showing that an image is captured in a state where the image capturing unit is static at an image capturing position;
  • FIGS. 4A and 4B are diagrams showing that an image is captured in a state where the image capturing unit is accelerated at an image capturing position;
  • FIGS. 5A and 5B are diagrams for illustrating a timing of an image acquisition during the movement;
  • FIG. 6 is a flow chart showing an operation example of a static measurement;
  • FIG. 7 is a flow chart showing an operation example of a movement measurement;
  • FIG. 8 is a flow chart showing an example of calculating a correction value by an image coordinate difference calculation part; and
  • FIG. 9 is a flow chart showing an operation example of a movement measurement using a correction value.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
  • [Configuration of Image Measurement Apparatus]
  • FIG. 1 is a schematic diagram of an image measurement apparatus according to an embodiment of the present invention. The image measurement apparatus 100 has a non-contact type image measurement device 10 and a PC (Personal Computer) 20 as an information processing apparatus. As the information processing apparatus according to the present technology, other computers may also be used.
  • The image measurement device 10 includes a stage 11, a movement mechanism 12, and an image capturing unit (image capturing part) 13. At a predetermined position of the stage 11, a work W that is an object to be measured is placed. The work W also serves as the object image-captured by the image capturing unit 13.
  • The movement mechanism 12 can change the image capturing position of the image capturing unit 13 relative to the work W in the three xyz directions. The image capturing position is a relative position of the image capturing unit 13 to the work W when an image is captured. Accordingly, by relatively moving the image capturing unit 13 and the work W, it is possible to change the image capturing position.
  • As shown in FIG. 1, the movement mechanism 12 includes an x movement mechanism 14, a y movement mechanism 15, and a z movement mechanism 16. The z movement mechanism 16 moves the image capturing unit 13 along a z direction. The x movement mechanism 14 moves the image capturing unit 13 and the z movement mechanism 16 integrally along an x direction. The y movement mechanism 15 moves the stage 11 along a y direction. A specific configuration of each movement mechanism is not limited, and may be designed freely.
  • On each of the xyz movement mechanisms, an axis displacement sensor 17, for example a linear scale, is placed. From detection values of an x axis displacement sensor 17x and a z axis displacement sensor 17z, the x and z coordinates of the image capturing unit 13 are calculated. In addition, from a detection value of a y axis displacement sensor 17y, the y coordinate of the stage 11 is calculated.
  • On the image capturing unit 13, a digital camera (including a video camera) having an objective lens 18 (see FIG. 3A) and an image-capturing device (not shown) is mounted. Light reflected by the work W is incident on the image-capturing device via the objective lens 18, thereby generating a digital image of the work W. As the image-capturing device, a CMOS (Complementary Metal-Oxide Semiconductor) sensor, a CCD (Charge Coupled Device) sensor or the like is used, for example.
  • The PC 20 is connected to the image measurement device 10 by any connection form. The PC 20 has hardware necessary for the configuration of a computer, such as a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and a HDD (Hard Disk Drive) (none of which is shown).
  • In the ROM and HDD, programs executed by the CPU and a variety of data such as shape data are stored. The RAM is used by the CPU as a temporary work area and an area for temporarily saving data.
  • Information processing by the PC 20 is realized by a cooperation of software stored in the ROM or the like and hardware resources in the PC 20. In this embodiment, as shown in FIG. 1, by executing a predetermined program by the CPU, a measurement control part 21 is configured as a functional block. Note that dedicated hardware may be used to constitute the measurement control part 21.
  • The program is installed on the PC 20 via a variety of recording media, for example. Alternatively, the program may be installed on the PC 20 via the Internet or the like.
  • FIG. 2 is a functional block diagram showing a configuration example of the measurement control part 21. The measurement control part 21 includes an image acquisition coordinate memory part 22, an image acquisition coordinate collation part 23, an image acquisition signal output part 24, a camera image acquisition part 25, a coordinate detecting part 26, and an axis movement control part 27. Also, the measurement control part 21 includes a coordinate memory part 28, a coordinate and image memory part 29, an image coordinate detecting part 30, a static measurement image coordinate memory part 31, a movement measurement image coordinate memory part 32, an image coordinate finite difference calculation part 33, and an image coordinate finite difference memory part 34. Furthermore, the measurement control part 21 includes an image coordinate correction part 35, and a detection coordinate output part 36.
  • The image acquisition coordinate memory part 22 stores a coordinate value at an image capturing position (hereinafter referred to as “image capturing position coordinate”). In this embodiment, as the image capturing position coordinate, x and z coordinates of the image capturing unit 13 where an image is captured, and a y coordinate of the stage 11 are stored in advance.
  • The coordinate detecting part 26 detects a coordinate at the current measurement position (hereinafter referred to as “measurement position coordinate”) from the detection value of each of the xyz axis displacement sensors 17. The measurement position coordinate includes the current x and z coordinates of the image capturing unit 13 and the current y coordinate of the stage 11.
  • The image acquisition coordinate collation part 23 collates the measurement position coordinate detected by the coordinate detecting part 26 with the image capturing position coordinate stored in the image acquisition coordinate memory part 22. When both coordinates match, the image acquisition coordinate collation part 23 instructs the image acquisition signal output part 24 to output an image acquisition signal.
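As a rough sketch (not the actual implementation), the collation step might look like the following; the function name and the tolerance are illustrative assumptions:

```python
# Minimal sketch of the collation step: when the detected measurement
# position coordinate matches a stored image capturing position coordinate,
# image acquisition is triggered. Names and tolerance are illustrative.

def collate(measured, capture_positions, tol=1e-6):
    """Return the index of the capture position matching `measured`, or None."""
    for i, target in enumerate(capture_positions):
        if all(abs(m - t) <= tol for m, t in zip(measured, target)):
            return i
    return None

# Stored image capturing position coordinates (x of unit, y of stage, z of unit)
capture_positions = [(0.0, 0.0, 10.0), (5.0, 0.0, 10.0)]
match = collate((5.0, 0.0, 10.0), capture_positions)  # a match triggers capture
```

When `collate` returns an index, the corresponding image acquisition signal would be output; when it returns `None`, movement simply continues.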
  • The image acquisition signal output part 24 outputs the image acquisition signal to a digital camera of the image capturing unit 13. By outputting the signal, an image is captured by the image capturing unit 13. The camera image acquisition part 25 acquires an image captured by the image capturing unit 13.
  • The axis movement control part 27 controls the movement mechanism 12, and moves the image capturing unit 13 and the stage 11. The coordinate memory part 28 stores the measurement position coordinate, when the measurement position coordinate detected by the coordinate detecting part 26 is matched with the image capturing position coordinate.
  • The coordinate and image memory part 29 stores the image captured by the image capturing unit 13, and the measurement position coordinate when the image is captured (i.e., the image capturing position coordinate).
  • The image coordinate detecting part 30 detects coordinates of the external shape and feature points of the work W in the three xyz directions (hereinafter referred to as “measurement coordinate”), from the captured image and the measurement position coordinate stored in the coordinate and image memory part 29. For example, using a known image analysis technique such as edge detection, the measurement coordinate of respective points of the work W may be detected from positions on the captured image and the measurement position coordinate.
  • The image coordinate correction part 35 corrects the measurement coordinate detected by the image coordinate detecting part 30. The details will be described later. In this embodiment, the image is captured by relatively moving the image capturing unit 13 so as to pass each of a plurality of the capturing positions. From a group of the captured images, the measurement coordinate of the work W is detected. The image coordinate correction part 35 corrects the measurement coordinate detected.
  • The detection coordinate output part 36 outputs the measurement coordinate corrected by the image coordinate correction part 35.
  • The static measurement image coordinate memory part 31, the movement measurement image coordinate memory part 32, the image coordinate finite difference calculation part 33 and the image coordinate finite difference memory part 34 are blocks for calculating the correction value used for the correction by the image coordinate correction part 35. Each of these blocks will be described in detail later.
  • [Operation of Image Measurement Apparatus]
  • As described above, in the image measurement apparatus 100 according to this embodiment, measurement is possible by relatively moving the image capturing unit 13 so as to pass a plurality of the image capturing positions without stopping the image capturing unit 13. For example, the image capturing position is automatically changed (the image capturing unit 13 is relatively moved) along a predetermined route based on a part program stored in the ROM. When the measurement position coordinate matches the image capturing position coordinate during this movement, an image is automatically captured. Hereinafter, this measurement is referred to as “movement measurement”.
  • FIGS. 3A, 3B, 4A, 4B, 5A and 5B are diagrams for illustrating problems that may occur during the movement measurement. FIGS. 3A, 4A and 5A are front views of the image measurement device 10 viewed from the y direction, and FIGS. 3B, 4B and 5B are images captured by the image capturing unit 13.
  • FIGS. 3A and 3B show that an image is captured in a state where the image capturing unit 13 is static at an image capturing position P. In this case, the image capturing unit 13 and the stage 11 are moved to the image capturing position coordinate, and the work W directly below is image-captured. Here, as the work W, a plate-like member having a white area 41 and a black area 42 is used. As shown in FIG. 3B, the image capturing position P is set such that a boundary 43 between the white area 41 and the black area 42 overlaps a center point 55 of the captured image 50.
  • FIGS. 4A and 4B show that an image is captured in a state where the image capturing unit 13 is accelerated at the image capturing position P. In this case, as acceleration acts on the image capturing unit 13 and the z movement mechanism 16, tilt or distortion may occur due to the inertia of their weights, as shown in FIG. 4A. If the stage 11 is accelerated, the work W may be distorted.
  • As shown in FIG. 4B, in the captured image 50 captured by the image capturing unit 13, the boundary 43 of the work W is undesirably deviated from the center point 55 of the captured image 50. If the measurement coordinate is calculated from this captured image 50, the deviation of the image produces an error of corresponding magnitude as compared with the measurement coordinate calculated from the captured image 50 at the static measurement shown in FIG. 3B. The deviation of the captured image 50 may also be generated during deceleration, or by a centrifugal force during a simultaneous arc movement of two or more axes.
  • FIGS. 5A and 5B are diagrams for illustrating the timing of the image acquisition during the movement. The measurement position coordinate is collated with the image capturing position coordinate by the image acquisition coordinate collation part 23 shown in FIG. 2. Some time is required from when both coordinates match until the image acquisition signal is output from the image acquisition signal output part 24 to the image capturing unit 13. As a result, the image is actually captured with a delay after the image capturing unit 13 reaches the image capturing position P (a delayed image capturing position P1 in FIG. 5A).
  • If the timing of the image acquisition is delayed, the boundary 43 of the work W is undesirably deviated from the center point 55 of the captured image 50, as shown in FIG. 5B. If the measurement coordinate is calculated from this captured image 50, the deviation of the image produces an error of corresponding magnitude.
  • The present inventor focused on the following: if the movement measurement is executed along a predetermined route by the same part program, for example, the distortion of the movement mechanism 12 and the work W and the deviation of the timing of the image acquisition caused by the acceleration, the deceleration, the centrifugal force or the like at each image capturing position P are generated under almost the same conditions each time. In other words, the deviation amount in the captured image 50 at each image capturing position is almost the same each time.
  • The inventor therefore conceived of making the image capturing unit 13 static at each of the plurality of image capturing positions P to execute measurement (hereinafter referred to as “static measurement”), acquiring the measurement result (i.e., the measurement coordinate detected from each captured image 50), and calculating the finite difference between the measurement result of the static measurement and the measurement result of the movement measurement as a correction value. Specific operation examples will be described below.
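The underlying assumption can be illustrated numerically: if the deviation introduced at each image capturing position is reproducible between runs, the once-measured static/moving difference removes it from later runs. A toy sketch with invented values:

```python
# Toy illustration of the repeatability assumption: the deviation added
# during moving measurement at each position is (assumed) the same every
# run, so the static/moving difference measured once corrects later runs.
# All values below are invented for illustration.

true_positions = [10.0, 30.0, 50.0]   # static measurement results
deviation      = [0.2, -0.1, 0.3]     # reproducible per-position error

run_reference = [t + d for t, d in zip(true_positions, deviation)]  # first work
run_other     = [t + d for t, d in zip(true_positions, deviation)]  # same-type work

correction = [m - s for m, s in zip(run_reference, true_positions)]
corrected  = [m - c for m, c in zip(run_other, correction)]
# `corrected` now agrees with the static results
```

If the deviation were not reproducible between runs, the subtraction would leave a residual error, which is why the scheme presumes the same part program and route each time.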
  • FIG. 6 is a flow chart showing an operation example of the static measurement. The image capturing position coordinate stored on the image acquisition coordinate memory part 22 is read out (Step 101). The axis movement control part 27 moves the image capturing unit 13 and the stage 11 to the image capturing position coordinate read out (Step 102). The image acquisition coordinate collation part 23 determines whether or not the measurement position coordinate is matched with the image capturing position coordinate (Step 103). If the coordinates are matched (Yes in Step 103), a stop of the movement is designated to the axis movement control part 27 (Step 104).
  • The axis movement control part 27 determines whether or not the state is static (Step 105). If it is determined that the state is static (Yes), a signal indicating the determination is output to the image acquisition coordinate collation part 23. The image acquisition coordinate collation part 23 receives the signal, and instructs the image acquisition signal output part 24 to output an image acquisition signal. In this manner, the work W is image-captured by the image capturing unit 13 to acquire the captured image (Step 106).
  • Once the captured image is acquired, the coordinate and image memory part 29 stores the captured image and the measurement position coordinate (image capturing position coordinate) associated with each other (Step 107). The image coordinate detecting part 30 detects a measurement coordinate (X, Y, Z) in each point of the work W from the captured image and the measurement position coordinate stored (Step 108).
  • The measurement coordinate (X, Y, Z) detected is stored on the static measurement image coordinate memory part 31 shown in FIG. 2 as the measurement coordinate at the static measurement (Step 109). The processing described above is executed to all of a plurality of the predetermined image capturing positions (Step 110).
  • The static measurement image coordinate memory part 31 stores the measurement coordinate (XS1, YS1, ZS1)-(XSn, YSn, ZSn) in all image capturing positions. The suffix “n” is a sequence number of the image capturing position coordinates.
  • In the static measurement, a group of the images captured at the respective image capturing positions corresponds to a first captured image group according to this embodiment. Accordingly, the measurement coordinate (XS1, YS1, ZS1)-(XSn, YSn, ZSn) in all image capturing positions corresponds to measurement results based on the first captured image group.
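As a rough sketch, the static measurement loop above (Steps 101-110) can be written as follows; the helper callables stand in for hardware control and image analysis and are assumptions for illustration:

```python
# Sketch of the static measurement loop (Steps 101-110). The helpers
# `move_to`, `wait_until_static`, `capture_image` and `detect_coordinate`
# stand in for the actual hardware and image analysis.

def static_measurement(capture_positions, move_to, wait_until_static,
                       capture_image, detect_coordinate):
    static_results = []
    for pos in capture_positions:
        move_to(pos)             # Step 102: move unit and stage
        wait_until_static()      # Steps 104-105: stop and settle
        image = capture_image()  # Step 106: acquire captured image
        # Steps 107-109: detect and store the measurement coordinate (XSn, YSn, ZSn)
        static_results.append(detect_coordinate(image, pos))
    return static_results

# Dummy stand-ins so the sketch runs end to end
positions = [(0.0, 0.0, 10.0), (5.0, 0.0, 10.0)]
results = static_measurement(
    positions,
    move_to=lambda p: None,
    wait_until_static=lambda: None,
    capture_image=lambda: "image",
    detect_coordinate=lambda img, pos: pos,  # dummy: detected == nominal
)
```

The returned list corresponds to the measurement results (XS1, YS1, ZS1)-(XSn, YSn, ZSn) based on the first captured image group.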
  • FIG. 7 is a flow chart showing an operation example of the movement measurement. The movement measurement is executed with the same work W left mounted on the stage 11 after the static measurement is executed.
  • In Steps 201 to 203, the image capturing position coordinate read out is collated with the measurement position coordinate. The image capturing position coordinate read out is the same as the image capturing position coordinate at the static measurement.
  • If the image capturing position coordinate is matched with the measurement position coordinate (YES in Step 203), the image acquisition signal output part 24 outputs the image acquisition signal to the image capturing unit 13 to acquire the captured image (Step 204). The captured image acquired and the measurement position coordinate are stored (Step 205). Based thereon, the measurement coordinate (X, Y, Z) at each point of the work W is detected (Step 206).
  • The measurement coordinate (X, Y, Z) detected is stored on the movement measurement image coordinate memory part 32 shown in FIG. 2 as the measurement coordinate at the movement (Step 207). The above-described processing is executed to all of a plurality of the image capturing positions (Step 208).
  • The movement measurement image coordinate memory part 32 stores the measurement coordinate (XM1, YM1, ZM1)-(XMn, YMn, ZMn) in all image capturing positions. The suffix “n” corresponds to the sequence number of the measurement result at the static measurement. In other words, the respective measurement results (XSn, YSn, ZSn) and (XMn, YMn, ZMn) having the same number are measured at the same image capturing position.
  • In the movement measurement, the group of the images captured at the respective image capturing positions corresponds to a second captured image group according to this embodiment. Accordingly, the measurement coordinate (XM1, YM1, ZM1)-(XMn, YMn, ZMn) in all image capturing positions corresponds to the measurement result based on the second captured image group.
  • FIG. 8 is a flow chart showing an example of calculating a correction value executed by the image coordinate finite difference calculation part 33 shown in FIG. 2. From the static measurement image coordinate memory part 31, a measurement coordinate at the static measurement (XSn, YSn, ZSn) is read out (Step 301). From the movement measurement image coordinate memory part 32, a measurement coordinate at the movement measurement (XMn, YMn, ZMn) is read out (Step 302).
  • By the following equations, a finite difference (ΔXn, ΔYn, ΔZn) is calculated (Step 303).

  • ΔXn = XMn − XSn

  • ΔYn = YMn − YSn

  • ΔZn = ZMn − ZSn
  • The finite difference (ΔXn, ΔYn, ΔZn) calculated is stored in the image coordinate finite difference memory part 34 shown in FIG. 2 (Step 304). The processing is repeated to the final image capturing position coordinate (Step 305). The finite difference (ΔXn, ΔYn, ΔZn) corresponds to the correction value calculated from the first captured image group and the second captured image group in this embodiment.
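A minimal sketch of this finite difference calculation, with illustrative names (the memory parts are represented by plain lists):

```python
# Finite differences of Step 303 computed for every image capturing
# position n: (dXn, dYn, dZn) = (XMn - XSn, YMn - YSn, ZMn - ZSn).
# Function and variable names are illustrative assumptions.

def correction_values(static_coords, moving_coords):
    """Per-position differences between moving and static measurement."""
    return [tuple(m - s for s, m in zip(st, mv))
            for st, mv in zip(static_coords, moving_coords)]

static_coords = [(10.0, 20.0, 5.0), (30.0, 20.0, 5.0)]    # (XSn, YSn, ZSn)
moving_coords = [(10.2, 20.1, 5.0), (30.3, 20.1, 5.0)]    # (XMn, YMn, ZMn)
deltas = correction_values(static_coords, moving_coords)   # (dXn, dYn, dZn)
```

Each tuple in `deltas` would be stored, indexed by the same sequence number n as the two measurement results it was derived from.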
  • FIG. 9 is a flow chart showing an operation example of the movement measurement using the correction value. On the stage 11, a work W of the same type having the same shape as the work W mounted for calculating the correction value is mounted. From Step 401 to Step 406, the same processing as the movement measurement shown in FIG. 7 is executed. The measurement coordinate (Xn, Yn, Zn) calculated in Step 406 per image capturing position coordinate is output to the image coordinate correction part 35 shown in FIG. 2.
  • The image coordinate correction part 35 reads out a finite difference (ΔXn, ΔYn, ΔZn) having the same number n corresponding to the accepted measurement coordinate (Xn, Yn, Zn) from the image coordinate finite difference memory part 34 (Step 407). Then, the correction coordinate (XCn, YCn, ZCn) is calculated by the following equations (Step 408).

  • XCn = Xn − ΔXn

  • YCn = Yn − ΔYn

  • ZCn = Zn − ΔZn
  • The suffix “n” matches the numbers of the measurement results at the static measurement and the movement measurement.
  • The calculated correction coordinate (XCn, YCn, ZCn) is output from the detection coordinate output part 36 (Step 409). The above-described processing is repeated up to the final image capturing position (Step 410).
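The correction of Steps 407-408 can be sketched as follows; names and values are illustrative assumptions:

```python
# Applying the stored correction per position n (Step 408):
# (XCn, YCn, ZCn) = (Xn - dXn, Yn - dYn, Zn - dZn).

def apply_correction(measured, deltas):
    """Subtract the stored per-position finite differences."""
    return [tuple(x - d for x, d in zip(m, dl))
            for m, dl in zip(measured, deltas)]

measured = [(10.25, 20.1, 5.0)]   # moving measurement of a same-type work
deltas   = [(0.2, 0.1, 0.0)]      # stored finite differences (dXn, dYn, dZn)
corrected = apply_correction(measured, deltas)
```

The corrected coordinates approximate what the static measurement would have produced, which is the point of the scheme: the repeatable deviation of the moving run is subtracted out.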
  • According to the image measurement apparatus 100 in this embodiment, the finite difference (ΔXn, ΔYn, ΔZn) is calculated from the first captured image group acquired by image-capturing with the image capturing unit 13 placed static at the plurality of predetermined image capturing positions and the second captured image group acquired by image-capturing with the image capturing unit 13 relatively moving. By using the finite difference (ΔXn, ΔYn, ΔZn), the influence of the distortion of the movement mechanism 12 and the work W and of the deviation of the image acquisition timing caused by the acceleration, the deceleration, the centrifugal force or the like at each image capturing position is eliminated, and a measurement result almost equivalent to that provided at the static measurement can be acquired. That is to say, the work W can be measured with high precision while relatively moving the image capturing unit 13.
  • In addition, the rigidity of the members constituting the movement mechanism 12 etc. does not have to be increased to the utmost limit in order to prevent the deviation of the image at the movement measurement, thereby reducing costs.
  • Any one of a plurality of the works W to be measured is selected as a representative, and the correction value is calculated from the static measurement and the movement measurement and then stored. When another work W of the same type is subjected to the movement measurement, the correction is made using the stored correction value. In this manner, a plurality of works W of the same type can be efficiently measured with high precision. For example, by using the first work W, the correction value is easily calculated, so that the movement measurement can be executed with high precision without complex processing.
  • Other Embodiments
  • The present invention is not limited to the above-described embodiments, and other various embodiments may be made.
  • In the above description, the work to be measured is used to calculate the correction value. Alternatively, an object for proof may be used to calculate the correction value. The object for proof is not limited, but may be, for example, a plate-like member such as the one described as the work W in FIG. 3A.
  • The object for proof may be used as the object according to the embodiment when the work to be measured is fixed, when the method of moving the image capturing unit or the stage at the movement measurement is fixed, when the movement mode is simple (e.g., movement at a constant speed along one axis), or when extremely high precision is not required.
  • For example, at the time of factory shipment, the object for proof is used to calculate and store the correction value. Upon the measurement of an actual work, the movement measurement is executed using the stored correction value. In this manner, the movement measurement with high precision is easily achieved. It should be appreciated that a user may select whether to use the correction value calculated using the object for proof or to calculate the correction value using the work on site.
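The selectable behavior just described could be sketched as a small dispatch function. This is purely illustrative; the function name, the string keys, and the argument layout are assumptions, not part of the patent.

```python
def get_correction(source, stored_factory_deltas=None,
                   static_coords=None, moving_coords=None):
    """Select correction values: 'factory' returns the values stored at
    shipment (measured on the object for proof); 'onsite' recalculates
    them from a static and a movement measurement of the actual work."""
    if source == "factory":
        if stored_factory_deltas is None:
            raise ValueError("no factory correction values stored")
        return stored_factory_deltas
    if source == "onsite":
        # Same finite-difference rule as for the representative work.
        return [
            (xs - xm, ys - ym, zs - zm)
            for (xs, ys, zs), (xm, ym, zm) in zip(static_coords, moving_coords)
        ]
    raise ValueError(f"unknown correction source: {source!r}")
```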
  • In the above description, in order to change the image capturing position of the image capturing unit, the image capturing unit is moved in the x and y directions and the stage is moved in the y direction. However, the present invention is not limited thereto. The image capturing unit at the image capturing side may be moved in the three directions x, y, and z, or the stage at the object-to-be-measured side may be moved in the three directions x, y, and z.
  • Alternatively, the image capturing side and the object-to-be-measured side may both be moved in the same direction. In any case, any movement that changes the image capturing position of the image capturing unit corresponds to the relative movement of the image capturing unit.
  • In the above description, the image measurement device and the PC are constituted separately. However, the image measurement device and the PC may be integrally constituted to achieve the image measurement apparatus according to the present invention. That is to say, an information processing unit including a CPU and the like may be provided in the image measurement device, and the information processing unit may constitute the measurement control part.
  • The type of image measurement apparatus to which the image measurement method according to the present invention is applied is not limited. The present invention is applicable to any apparatus that executes measurement or observation using an object image acquired by image-capturing the work, for example, a CNC image measurement device, a CNC three-dimensional measurement device, or a hardness tester. The present invention is also applicable to a digital microscope that captures, with a digital camera, an enlarged image provided by an optical microscope.
  • At least two of the features of the respective embodiments described above may be combined. In addition, the various effects described above are merely examples and are not limitative; other effects may be exerted.

Claims (8)

What is claimed is:
1. An image measurement apparatus, comprising:
an image capturing part that image-captures an object;
a movement mechanism that changes an image capturing position of the image capturing part to the object; and
a calculation part that calculates a correction value from a first captured image group acquired by placing the image capturing part static at each of a plurality of image capturing positions and a second captured image group acquired by relatively moving the image capturing part so as to pass each of a plurality of the image capturing positions, the first captured image group and the second captured image group being captured image groups of images captured at a plurality of the predetermined image capturing positions by the image capturing part.
2. The image measurement apparatus according to claim 1, wherein
the object is an object to be measured, and
the image measurement apparatus further comprises a correction part that corrects a measurement result based on the second captured image group from the correction value calculated.
3. The image measurement apparatus according to claim 2, further comprising:
a memory part that stores the correction value calculated, wherein the correction part corrects the measurement result from the correction value stored.
4. An image measurement method, comprising:
acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions;
acquiring a second captured image group by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions; and
calculating a correction value from the first and second captured image groups acquired.
5. The image measurement method according to claim 4, wherein
acquiring the first captured image group is executed to the object to be measured,
acquiring the second captured image group is executed to each of the object to be measured and a plurality of other objects to be measured in the same type; and
the image measurement method comprises correcting a measurement result of each of a plurality of the other objects to be measured based on the second captured image group from the correction value calculated to the object to be measured.
6. An information processing apparatus, comprising:
a movement control part that controls an image capturing position of an image capturing part that image captures an object; and
a calculation part that calculates a correction value from a first captured image group acquired by placing the image capturing part static at each of a plurality of image capturing positions and a second captured image group acquired by relatively moving the image capturing part so as to pass each of a plurality of the image capturing positions, the first captured image group and the second captured image group being captured image groups of images captured at a plurality of the predetermined image capturing positions by the image capturing part.
7. An information processing method executed by a computer, comprising:
acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions;
acquiring a second captured image group by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions; and
calculating a correction value from the first and second captured image groups acquired.
8. A program causing a computer to execute a method, comprising:
acquiring a first captured image group by image-capturing an object with an image capturing part placed static at each of a plurality of image capturing positions;
acquiring a second captured image group by image-capturing the object with the image capturing part relatively moving so as to pass each of a plurality of the image capturing positions; and
calculating a correction value from the first and second captured image groups acquired.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015082239A JP6475552B2 (en) 2015-04-14 2015-04-14 Image measuring apparatus, image measuring method, information processing apparatus, information processing method, and program
JP2015-082239 2015-04-14

Publications (2)

Publication Number Publication Date
US20160307311A1 (en) 2016-10-20
US10001368B2 US10001368B2 (en) 2018-06-19

Family

ID=57128487

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/096,623 Active 2037-02-21 US10001368B2 (en) 2015-04-14 2016-04-12 Image measurement apparatus, image measurement method, information processing apparatus, information processing method, and program

Country Status (3)

Country Link
US (1) US10001368B2 (en)
JP (1) JP6475552B2 (en)
DE (1) DE102016206264A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6956611B2 (en) * 2017-11-30 2021-11-02 株式会社ミツトヨ Measuring method, correction amount setting method of measuring device and measuring device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7085673B2 (en) * 2004-08-31 2006-08-01 Hewlett-Packard Development Company, L.P. Displacement estimation system and method
US20070028677A1 (en) * 2005-04-13 2007-02-08 Renishaw Plc Method of error correction

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040223053A1 (en) * 2003-05-07 2004-11-11 Mitutoyo Corporation Machine vision inspection system and method having improved operations for increased precision inspection throughput
DE102010017604B4 (en) * 2009-09-01 2016-03-10 Werth Messtechnik Gmbh Method for optically measuring structures of an object
DE102009049534A1 (en) * 2009-10-06 2011-04-07 Carl Zeiss Industrielle Messtechnik Gmbh Coordinate measuring machine with position change sensors
JP2012098265A (en) * 2010-11-02 2012-05-24 Beru Techno:Kk Measuring device of weight, shape, and other property
JP2012220339A (en) * 2011-04-08 2012-11-12 Nikon Corp Shape measuring device, shape measuring method, and program therefor
JP5893278B2 (en) * 2011-07-26 2016-03-23 キヤノン株式会社 Image measuring apparatus, image measuring method, program, and recording medium
JP5796507B2 (en) * 2012-02-10 2015-10-21 新日鐵住金株式会社 Die inner diameter measurement method
JP2014228529A (en) 2013-05-27 2014-12-08 株式会社ミツトヨ Shape measurement device
JP6136850B2 (en) 2013-10-23 2017-05-31 富士電機株式会社 vending machine


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180340763A1 (en) * 2017-05-29 2018-11-29 Mitutoyo Corporation Operation method of position measuring device
US10690474B2 (en) * 2017-05-29 2020-06-23 Mitutoyo Corporation Operation method of position measuring device
US20220028117A1 (en) * 2020-07-22 2022-01-27 Canon Kabushiki Kaisha System, information processing method, method of manufacturing product, and recording medium
US11741632B2 (en) * 2020-07-22 2023-08-29 Canon Kabushiki Kaisha System, information processing method, method of manufacturing product, and recording medium with images of object that moves relative to cameras being captured at predetermined intervals and having different image capture times
CN113014816A (en) * 2021-03-03 2021-06-22 常州微亿智造科技有限公司 Method and device for determining trigger point of flying shooting
JP7131716B1 (en) * 2021-05-17 2022-09-06 三菱電機株式会社 Substrate measurement device and substrate measurement method

Also Published As

Publication number Publication date
US10001368B2 (en) 2018-06-19
JP6475552B2 (en) 2019-02-27
JP2016200551A (en) 2016-12-01
DE102016206264A1 (en) 2016-11-03

Similar Documents

Publication Publication Date Title
US10001368B2 (en) Image measurement apparatus, image measurement method, information processing apparatus, information processing method, and program
JP4821934B1 (en) Three-dimensional shape measuring apparatus and robot system
US9621793B2 (en) Information processing apparatus, method therefor, and measurement apparatus
EP3460715B1 (en) Template creation apparatus, object recognition processing apparatus, template creation method, and program
JP6536550B2 (en) Bolt axial force measuring device and bolt axial force measuring method
KR20160095911A (en) Optical image stabilizer for camera module and gain calbration method thereof
JP2017134177A5 (en)
US20130027546A1 (en) Image measurement apparatus, image measurement method, program and recording medium
US20180033121A1 (en) Image processing apparatus, image processing method, and storage medium
JP2008102063A (en) Vehicle wheel center position measuring device by image processing
JP6816454B2 (en) Imaging device and mobile system
JP2008298589A (en) Device and method for detecting positions
US20210314473A1 (en) Signal processing device, imaging device, and signal processing method
WO2018123639A1 (en) Imaging device and imaging method
JP6699902B2 (en) Image processing apparatus and image processing method
JP2015220623A (en) Mobile body imaging system
JP6452414B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP7131716B1 (en) Substrate measurement device and substrate measurement method
JP2005136743A (en) Image sensor position adjustment apparatus and position adjustment method
US20060109552A1 (en) Image blur compensation device
WO2021029206A1 (en) Image processing device
JPWO2017042995A1 (en) In-vehicle stereo camera device and correction method thereof
WO2018230560A1 (en) Position detecting device and position detecting method
JP5761061B2 (en) Imaging apparatus, microscope, and program used therefor
JP4654693B2 (en) Inspection image imaging device

Legal Events

Date Code Title Description
AS Assignment

Owner name: MITUTOYO CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:UDO, SHOHEI;REEL/FRAME:038255/0644

Effective date: 20160304

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4
