WO2013035847A1 - Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and computer-readable recording medium
- Publication number
- WO2013035847A1 (PCT/JP2012/072913)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, a shape measuring program, and a computer-readable recording medium.
- a pattern projection type shape measuring apparatus using a phase shift method is known (for example, see Patent Document 1).
- a grating pattern having a sinusoidal intensity distribution is projected onto a measurement object, and the measurement object is repeatedly imaged while the phase of the grating pattern is shifted at a constant pitch.
- the phase distribution (phase image) of the grating pattern, deformed according to the surface shape of the measurement object, is obtained; the phase image is unwrapped (phase connection) and then converted into a height distribution (height image) of the measurement object.
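As a hedged illustration of the phase shift principle described above (the patent gives no code; the array names and the choice of the classic 4-bucket formula are assumptions made here), the wrapped phase can be recovered from four fringe images whose initial phases differ by 90 degrees:

```python
import numpy as np

def wrapped_phase(i0, i90, i180, i270):
    """Recover the wrapped phase of a sinusoidal grating from four
    captured images whose initial phases differ by 90 degrees
    (the classic 4-bucket phase shift formula)."""
    return np.arctan2(i270 - i90, i0 - i180)

# Synthetic example: a fringe pattern deformed by a surface "height".
x = np.linspace(0, 4 * np.pi, 256)
phi = 0.5 * np.sin(x)                      # stands in for the surface-induced phase
imgs = [np.cos(x + phi + k * np.pi / 2) for k in range(4)]
recovered = wrapped_phase(*imgs)
# The recovered phase equals (x + phi) wrapped into (-pi, pi].
err = np.angle(np.exp(1j * (recovered - (x + phi))))
print(np.max(np.abs(err)) < 1e-9)
```

Unwrapping (phase connection) of the recovered phase and conversion to a height image would follow as separate steps.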
- the present invention has been made in view of such a situation, and provides a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, and a shape measuring program that determine whether or not blur is present in a captured image used for measuring the shape of a measurement target.
- the present invention provides a shape measuring apparatus comprising: an imaging unit that generates a captured image of a measurement target; an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction, so that the image captured by the imaging unit is an image in which a grating pattern is projected on the measurement target; a feature amount calculation unit that calculates, from the captured image, a feature amount indicating the degree of blur present in the captured image; and a determination unit that determines, based on the feature amount, whether or not blur is present in the captured image.
- the present invention is also a structure manufacturing system including: a design apparatus that produces design information on the shape of a structure; a molding apparatus that produces the structure based on the design information; the above-described shape measuring device, which measures the shape of the produced structure based on a captured image; and an inspection device that compares the shape information obtained by the measurement with the design information.
- the present invention is also a shape measuring method for a shape measuring device comprising an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction, so that the image captured by the imaging unit is an image in which a grating pattern is projected on the measurement target, the method comprising: calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image; and determining, based on the feature amount, whether or not blur is present in the captured image.
- the present invention is also a structure manufacturing method including: creating design information on the shape of a structure; creating the structure based on the design information; measuring the shape of the created structure based on a captured image using the shape measuring method described above; and comparing the shape information obtained by the measurement with the design information.
- the present invention is also a shape measuring program that causes the computer of a shape measuring apparatus, comprising an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction so that the image captured by the imaging unit is an image in which a grating pattern is projected on the measurement target, to execute: a step of calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image; and a step of determining, based on the feature amount, whether or not blur is present in the captured image.
- FIG. 1 is a block diagram showing a configuration of a shape measuring apparatus 10 according to the first embodiment of the present invention.
- the shape measuring apparatus 10 includes an operation input unit 11, a display unit 12, an imaging unit 13, an irradiation unit 14, a storage unit 15, a feature amount calculation unit 16, a determination unit 17, and a point group calculation unit 18, and is a computer terminal that measures the three-dimensional shape of the measurement object A by the phase shift method.
- the shape measuring apparatus 10 detects whether or not there is a blur in the captured image when capturing a plurality of captured images whose phases are shifted by the phase shift method.
- causes of blurring include, for example, movement of the measurement object during imaging, and camera shake when the user holds the portable shape measuring apparatus 10 by hand rather than mounting it on a tripod or the like.
- blurring that affects the measurement results includes blurring in which the imaging range differs between multiple captured images, and blurring caused by movement of the measurement object or camera shake during the exposure time of a single captured image. These types of blur may also occur simultaneously.
- an image obtained by projecting a plurality of grating patterns having different initial phases based on the N bucket method onto a measurement object is captured, and the shape of the measurement object is measured based on the luminance value of the same pixel in each image.
- normally, the shape measurement is performed from fringe images having different initial phases; in this embodiment, however, a grating pattern having the same initial phase as at least one of those initial phases is projected again, and the captured images of grating patterns having the same initial phase are compared. In this way, the similarity between two or more captured images, captured at different timings but with the same initial-phase grating pattern, is calculated.
- if the similarity is equal to or greater than a threshold, it can be determined that no blur occurred between the imaging timings; otherwise, it can be determined that blur occurred between the imaging timings.
- the shape measuring apparatus 10 determines whether or not such a blur has occurred during imaging, and issues a warning if a blur has occurred. Thereby, for example, when blurring occurs, it is possible to perform imaging again.
- the operation input unit 11 receives an operation input from a user.
- the operation input unit 11 includes operation members such as a power button for switching on and off of the main power supply and a release button for receiving an instruction to start an imaging process.
- the operation input part 11 can also receive the input by a touchscreen.
- the display unit 12 is a display that displays various types of information. For example, when the determination unit 17 described later determines that there is a blur in the captured image, the display unit 12 displays a warning that there is a blur. Further, the display unit 12 displays, for example, point cloud data indicating the three-dimensional shape of the measurement target calculated by the point cloud calculation unit 18.
- the imaging unit 13 generates a captured image obtained by imaging the measurement target, and performs an imaging process for storing the generated captured image in the storage unit 15.
- the imaging unit 13 operates in conjunction with the irradiation unit 14 and performs an imaging process in accordance with the timing at which the illumination light is projected onto the measurement target by the irradiation unit 14.
- the imaging unit 13 generates a plurality of captured images obtained by capturing, for each initial phase, an image in which a plurality of lattice patterns having different initial phases based on the N bucket method are projected onto the measurement target by the irradiation unit 14.
- when the determination unit 17 determines that there is a possibility of blurring in the captured image, the user is warned of the possibility of blurring.
- the user can then cause the shape measuring apparatus 10 to again acquire images in which a plurality of grating patterns with different initial phases, based on the N-bucket method, are projected onto the measurement object. If imaging is repeated until no blur warning is issued, shape measurement with reduced blur can be performed.
- the shape measuring apparatus 10 may continue to automatically capture an image in which a plurality of lattice patterns having different initial phases based on the N bucket method are projected on the measurement target until blur determination is eliminated. In this way, the measurement object irradiated with the illumination light is imaged again to generate a captured image.
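The capture-until-no-blur behaviour described above can be sketched as a simple retry loop (a minimal sketch; `capture_bucket_images`, `has_blur`, and the retry limit are hypothetical names, not taken from the patent):

```python
def measure_without_blur(capture_bucket_images, has_blur, max_attempts=5):
    """Repeat the N-bucket capture until no blur is detected, as the
    embodiment describes: re-image while blur is still present."""
    for attempt in range(max_attempts):
        images = capture_bucket_images()
        if not has_blur(images):
            return images          # blur-free set: proceed to point cloud calc
    raise RuntimeError("blur persisted after retries")

# Hypothetical usage: a capture that becomes blur-free on the third attempt.
calls = {"n": 0}
def fake_capture():
    calls["n"] += 1
    return ["img"] * 5
imgs = measure_without_blur(fake_capture, lambda im: calls["n"] < 3)
print(calls["n"])  # 3
```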
- when the imaging unit 13 images the measurement target, the irradiation unit 14 irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction of the imaging unit 13, so that an image in which the grating pattern is projected onto the measurement target is captured.
- the irradiation unit 14 emits illumination light so that images in which a plurality of grating patterns, having a spatial frequency of a certain period and initial phases differing by 90 degrees, are projected onto the measurement object based on the N-bucket method can be captured sequentially. For example, as shown in FIG. 2,
- the irradiation unit 14 has a light source 1, a collimator lens 2 and a cylindrical lens 3 that convert the light from the light source 1 into a linear intensity distribution whose longitudinal direction is orthogonal to the light irradiation direction, and a scanning mirror 4 (a MEMS (Micro Electro Mechanical Systems) mirror) that scans the resulting line beam across the measurement object in the direction perpendicular to the longitudinal direction of the beam.
- the light source 1 is provided with a light source control unit 5 that controls the intensity of the light emitted from the light source 1; the light source control unit 5 modulates the intensity of the laser light while the scanning mirror 4 scans.
- the intensity distribution of the laser light emitted from the light source 1 is shaped so as to have a linear light intensity distribution in one direction perpendicular to the optical axis, and this line beam is scanned, while its intensity is varied, in the direction perpendicular to its longitudinal direction.
- pattern light of a striped pattern (sine lattice) having a sinusoidal luminance change in one direction is formed.
- the light of the light source is shifted in the direction perpendicular to the optical axis direction while changing the intensity in a sinusoidal shape by a mirror using the MEMS technology.
- the irradiation light of the lattice pattern is projected on the measurement target.
- although laser light is projected here using MEMS technology, the illumination light can also be projected by a liquid crystal projector or the like.
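The phase-shifted sinusoidal patterns a projector would display can be sketched as follows (an illustrative sketch; the function name, resolution, and period are assumptions, not values from the patent):

```python
import numpy as np

def sine_gratings(width, height, period_px, n_buckets=4):
    """Generate n_buckets sinusoidal fringe patterns whose initial
    phases are shifted by 360/n_buckets degrees, intensity in [0, 1]."""
    x = np.arange(width)
    patterns = []
    for k in range(n_buckets):
        phase0 = 2 * np.pi * k / n_buckets
        row = 0.5 + 0.5 * np.cos(2 * np.pi * x / period_px + phase0)
        patterns.append(np.tile(row, (height, 1)))
    return patterns

a, b, c, d = sine_gratings(640, 480, period_px=32)
# Patterns 180 degrees apart sum to a uniform image of intensity 1.
print(np.allclose(a + c, 1.0))
```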
- FIG. 3 is a diagram illustrating an example of a measurement target onto which the irradiation unit 14 projects the irradiation light while shifting the initial phase by 90 degrees.
- A has an initial phase of 0 degrees
- B has an initial phase shifted 90 degrees from A
- C has an initial phase shifted 180 degrees from A
- D has an initial phase shifted 270 degrees from A.
- in the 5-bucket method, five captured images A to E with the initial phase shifted by a constant angle are generated; in the 7-bucket method, seven captured images A to G with the initial phase shifted by a constant angle are generated.
- the order of imaging does not necessarily have to be in the order of A, B, C, D, and E.
- images can be captured in the order of A, E, B, C, and D.
- the imaging process is performed while sequentially shifting the initial phase.
- between a plurality of imaging timings at which images of grating patterns having the same initial phase (for example, A and E) projected onto the measurement target are captured, the imaging unit 13 may capture images in which grating patterns of other initial phases (for example, B, C, and D) are projected onto the measurement object.
- the storage unit 15 stores the captured image generated by the imaging unit 13, the point cloud data calculated by the point cloud calculation unit 18, and the like.
- the feature amount calculation unit 16 calculates a feature amount indicating the degree of blur existing in the captured image from the captured image generated by the imaging unit 13.
- the feature amount calculation unit 16 calculates, as a feature amount, the similarity between a plurality of captured images obtained by capturing an image in which a lattice pattern having the same initial phase is projected onto the measurement target.
- the similarity can be calculated by, for example, the following formula (1).
- the feature amount calculation unit 16 may calculate, as the feature amount, the similarity between one pair of captured images in which a grating pattern of the same initial phase is projected onto the measurement target, or the similarity between several such pairs. For example, when nine captured images are generated by the 9-bucket method with the initial phase shifted by 90 degrees, only the similarity between the first captured image (initial phase 0 degrees) and the fifth captured image (initial phase 360 degrees, i.e. 0 degrees) may be calculated, or, as shown in FIG., the similarity may be calculated for a plurality of pairs of captured images whose initial phases are the same angle. By calculating the similarity for a plurality of pairs and then determining blur, blur can be detected more accurately.
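The patent's formula (1) is not reproduced in this text, so as a stand-in the sketch below uses zero-mean normalized cross-correlation as the similarity measure; the threshold value is likewise a hypothetical choice:

```python
import numpy as np

def similarity(img_a, img_b):
    """Similarity between two same-initial-phase captures. The patent's
    formula (1) is not shown here; zero-mean normalized
    cross-correlation is used as an assumed stand-in."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(0)
frame = rng.random((480, 640))
shifted = np.roll(frame, 5, axis=1)        # simulates movement between timings
THRESHOLD = 0.99                           # hypothetical similarity threshold
print(similarity(frame, frame) >= THRESHOLD)   # identical frames: no blur
print(similarity(frame, shifted) >= THRESHOLD) # displaced frame: blur detected
```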
- the determination unit 17 determines whether there is blur in the captured image based on the feature amount calculated by the feature amount calculation unit 16. For example, the determination unit 17 stores a predetermined similarity threshold in its own storage area and compares the similarity calculated by the feature amount calculation unit 16 with this threshold. If the calculated similarity is equal to or greater than the threshold, the determination unit 17 determines that there is no blur; otherwise, it determines that there is blur. The point group calculation unit 18 performs point group calculation processing such as phase calculation and phase unwrapping based on the plurality of captured images generated by the imaging unit 13, calculates point group data, and stores it in the storage unit 15.
- FIG. 4 is a flowchart for explaining an operation example in which the shape measuring apparatus 10 performs the shape measuring process.
- the imaging unit 13 starts imaging processing of the measurement target, and the irradiation unit 14 starts projection processing of irradiation light onto the measurement target.
- the imaging unit 13 stores, for example, five captured images captured at the time of fringe projection with an initial phase of 0 degrees, 90 degrees, 180 degrees, 270 degrees, and 360 degrees in the storage unit 15 (step S3).
- the feature amount calculation unit 16 reads out, from among the plurality of captured images stored in the storage unit 15, a plurality of captured images having the same initial phase (for example, 0 degrees and 360 degrees), and calculates the similarity between the read captured images (step S4).
- the determination unit 17 compares the similarity calculated by the feature amount calculation unit 16 with a predetermined threshold (step S5). When the determination unit 17 determines that the similarity is not equal to or greater than the threshold (step S5: NO), the display unit 12 displays a warning that the captured image is blurred (step S6). Since camera shake may have occurred, a selection input is accepted from the user as to whether or not to continue the point cloud data calculation process. If an instruction not to continue is input to the operation input unit 11 (step S7: NO), the process returns to step S1; if an instruction to continue is input (step S7: YES), the process proceeds to step S8.
- when the determination unit 17 determines in step S5 that the similarity calculated by the feature amount calculation unit 16 is equal to or greater than the predetermined threshold (step S5: YES), the point group calculation unit 18 calculates point cloud data based on the captured images stored in the storage unit 15 and stores it in the storage unit 15 (step S8). The display unit 12 then displays the calculated point cloud data (step S9).
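The S3 to S9 flow can be sketched end to end as follows (a sketch only; all function names are hypothetical, and the similarity of formula (1) is abstracted as a callable):

```python
def shape_measurement_flow(capture_five_images, similarity, point_cloud,
                           threshold, ask_user_to_continue):
    """Steps S3-S9: capture the 0/90/180/270/360-degree fringe images,
    compare the 0- and 360-degree captures, and only run the heavy point
    cloud calculation when no blur is detected (or the user insists)."""
    images = capture_five_images()                 # S3: phases 0..360 deg
    s = similarity(images[0], images[4])           # S4: same-initial-phase pair
    if s < threshold:                              # S5: blur suspected
        if not ask_user_to_continue():             # S6-S7: warn, ask user
            return None                            # re-measure from S1
    return point_cloud(images)                     # S8-S9: calculate, display

result = shape_measurement_flow(
    capture_five_images=lambda: ["A", "B", "C", "D", "E"],
    similarity=lambda a, b: 1.0,
    point_cloud=lambda imgs: f"points({len(imgs)})",
    threshold=0.99,
    ask_user_to_continue=lambda: False,
)
print(result)  # points(5)
```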
- as described above, the similarity between captured images having the same initial phase is calculated among the plurality of captured images with different initial phases captured based on the N-bucket method, and whether or not blur is present can be determined based on that similarity.
- the point cloud calculation process performed by the point cloud calculation unit 18 based on the captured images is relatively heavy, and may take several seconds (for example, 5 seconds) to complete. Therefore, in this embodiment, the similarity calculation and blur determination are performed after the imaging process by the imaging unit 13 and before the point cloud calculation process by the point cloud calculation unit 18 is started, and a warning is issued if blur exists.
- the user can thus learn that the captured image is blurred before the high-load point cloud calculation process is performed, and can input a re-imaging instruction. Compared with the case where blur first becomes apparent only when the point cloud data is displayed on the display unit 12 after the point cloud calculation has finished, the occurrence of blur can be known at an early stage, before the calculation is performed. This reduces unnecessary waiting time for re-measuring the shape of the measurement target and enables efficient measurement processing without wasting power.
- the shape measuring apparatus 10 according to this embodiment has the same configuration as that of the first embodiment shown in FIG.
- whereas the first embodiment detects blur in which the imaging range differs between a plurality of captured images, the present embodiment detects blur caused by movement of the measurement target or camera shake during the exposure time of a single captured image.
- the feature amount calculation unit 16 in the present embodiment calculates the sharpness (blurring degree) of the captured image generated by the imaging unit 13 as the feature amount.
- the determination unit 17 stores in advance a sharpness threshold for determining whether blurring has occurred, and compares the sharpness calculated by the feature amount calculation unit 16 with this threshold. If the calculated sharpness is equal to or greater than the threshold, the determination unit 17 determines that there is no blur in the captured image; otherwise, it determines that there is blur in the captured image.
- the sharpness of the image can be calculated by a predetermined calculation formula, for example one based on the spatial frequency distribution.
- for example, as shown in FIGS. 5 and 6, consider the frequency distribution curve A1 of the spatial frequency in image data in which the same measurement object is captured without blur, and the frequency distribution curve B1 of the spatial frequency in image data in which blur occurs. Comparing the two, the center of gravity of the region enclosed by frequency distribution curve A1 lies on the higher spatial frequency side.
- the frequency corresponding to the position of the dotted line is the spatial frequency of the projected fringes. Therefore, to determine whether blur has occurred for a given measurement target, the spatial frequency corresponding to the center of gravity of the region enclosed by the frequency distribution curve is detected and compared with a preset spatial frequency threshold. Attention may be paid to the spatial frequency of the fringe pattern captured in the image data, or, if there is a fine pattern on the surface of the measurement object, to that fine pattern; by detecting whether the spatial frequency of the pattern has become low, it can be determined whether the captured image is blurred.
- specifically, the feature amount calculation unit 16 Fourier-transforms the captured image generated by the imaging unit 13 and stored in the storage unit 15, and uses the spatial frequency corresponding to the center of gravity obtained as described above as the sharpness (feature amount).
- the determination unit 17 determines whether or not blur occurs in the captured image by determining whether the feature amount calculated by the feature amount calculation unit 16 is equal to or greater than a predetermined threshold relative to the reference sharpness.
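A hedged sketch of this sharpness measure follows: the centroid (center of gravity) of the radial spatial-frequency distribution of the 2-D Fourier spectrum, which drops when blur attenuates high frequencies. The exact formula the patent intends is not given, so this is one plausible realization:

```python
import numpy as np

def sharpness(img):
    """Sharpness as the centroid (center of gravity) of the radial
    spatial-frequency distribution of the image's Fourier spectrum.
    Blur attenuates high frequencies and lowers this centroid."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = img.shape
    fy, fx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(h)),
                         np.fft.fftshift(np.fft.fftfreq(w)), indexing="ij")
    radius = np.hypot(fy, fx)
    return float((radius * spec).sum() / spec.sum())

rng = np.random.default_rng(0)
sharp = rng.random((128, 128))
# Crude motion blur: average the image with horizontally shifted copies.
blurred = np.mean([np.roll(sharp, s, axis=1) for s in range(8)], axis=0)
print(sharpness(sharp) > sharpness(blurred))
```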
- by performing both the per-image blur determination of the present embodiment and the similarity-based blur determination between a plurality of captured images shown in the first embodiment, blur can be detected more efficiently and accurately. For example, every time the imaging unit 13 generates a captured image with a different initial phase, blur determination based on sharpness can be performed, and a warning can be given when the sharpness is determined not to be equal to or greater than the threshold. This prevents unnecessary imaging processing and unnecessary point cloud data calculation processing.
- FIG. 7 is a flowchart showing an operation example of such a shape measuring apparatus 10.
- the imaging unit 13 starts imaging processing of the measurement target, and the irradiation unit 14 starts projection processing of irradiation light onto the measurement target.
- the imaging unit 13 causes the storage unit 15 to store a captured image generated each time a lattice pattern having a different initial phase is projected onto the measurement target by the irradiation unit 14 (step S12).
- the feature amount calculation unit 16 reads the captured image stored in the storage unit 15, obtains a spatial frequency distribution by Fourier transforming the read captured image, and acquires sharpness (step S13).
- the determination unit 17 compares the sharpness calculated by the feature amount calculation unit 16 with a predetermined sharpness threshold (step S14). When the determination unit 17 determines that the sharpness is not equal to or greater than the threshold (step S14: NO), the display unit 12 displays a warning that the captured image is blurred (step S15). If an instruction not to continue the process is input to the operation input unit 11 (step S16: NO), the process returns to step S10; if an instruction to continue is input (step S16: YES), the process proceeds to step S17.
- when the determination unit 17 determines in step S14 that the sharpness calculated by the feature amount calculation unit 16 is equal to or greater than the predetermined threshold relative to the reference sharpness (step S14: YES), the determination unit 17 determines whether or not the specified number or more of captured images have been generated (step S17).
- the prescribed number of images is, for example, 5 when imaging by the 5-bucket method and 7 when imaging by the 7-bucket method. If the determination unit 17 determines that the prescribed number of captured images has not yet been generated (step S17: NO), the process returns to step S11.
- if the determination unit 17 determines in step S17 that the prescribed number or more of captured images have been generated (step S17: YES), the process proceeds to step S18, and the same processing as steps S4 to S9 described in the first embodiment is performed (steps S18 to S23).
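Steps S18 to S23 derive point cloud data from the stored phase-shifted captures. The patent does not reproduce the arithmetic, but the standard N-bucket relation recovers the fringe phase at each pixel from its N intensity samples; a minimal sketch for the 5-bucket case:

```python
import math

def recover_phase(intensities):
    """Fringe phase from N samples I_n = A + B*cos(phi + 2*pi*n/N), N >= 3."""
    n_steps = len(intensities)
    s = sum(i * math.sin(2 * math.pi * n / n_steps) for n, i in enumerate(intensities))
    c = sum(i * math.cos(2 * math.pi * n / n_steps) for n, i in enumerate(intensities))
    return math.atan2(-s, c)

# Synthetic 5-bucket samples for one pixel (background A, amplitude B are arbitrary).
true_phase = 0.7
samples = [10.0 + 4.0 * math.cos(true_phase + 2 * math.pi * n / 5) for n in range(5)]
estimate = recover_phase(samples)
```

The recovered per-pixel phase, after unwrapping, is what the later steps convert into height and then into point cloud coordinates.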
- in this way, the feature amount calculation unit 16 calculates the sharpness for each captured image, and the determination unit 17 performs the blur determination by checking that the sharpness of every one of the plurality of captured images is equal to or greater than the threshold.
- the feature amount calculation unit 16 calculates the similarity between a plurality of captured images.
- the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the irradiation unit 14 of the present embodiment performs the same irradiation process as in the first embodiment and, in addition, while the imaging unit 13 is imaging, irradiates illumination light having a uniform intensity distribution so that an image of the measurement target under uniform illumination is also captured.
- the feature amount calculation unit 16 calculates, as a feature amount, the similarity between a composite image obtained by synthesizing a plurality of captured images with different initial phases, captured in the same manner as in the first embodiment, and a captured image of the measurement target irradiated with the uniform illumination light. Based on this similarity, it is determined whether or not the captured images contain blur.
- an image X in FIG. 8 is an image obtained by combining four captured images.
- an image Y in FIG. 8 is an image taken when the measurement target is illuminated with uniform illumination.
- the presence / absence of blur may be detected by detecting the similarity between an image obtained by combining four captured images and an image obtained by illuminating the measurement target without performing intensity modulation.
- the feature amount calculation unit 16 calculates the spatial frequency distribution of the composite image of the plurality of captured images with different initial phases and the spatial frequency distribution of the image captured under the uniform intensity distribution; the similarity can then be calculated by comparing the two distributions.
- the determination unit 17 determines that there is no blur if the calculated similarity is equal to or greater than a certain level, and determines that there is blur if it is below that level.
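This determination exploits the fact that evenly phase-shifted sinusoidal fringes cancel when averaged: the composite of the four captures should resemble the uniformly illuminated capture, while motion during any one exposure leaves residual fringes. The pattern parameters and the inverse-RMS similarity below are assumptions, since the patent does not fix a similarity measure:

```python
import math

PERIOD = 16
WIDTH = 64
A, B = 10.0, 4.0  # assumed background level and fringe amplitude

def fringe_image(initial_phase_deg, shift=0):
    """One row of a sinusoidal lattice pattern with the given initial phase."""
    ph0 = math.radians(initial_phase_deg)
    return [A + B * math.cos(2 * math.pi * (x + shift) / PERIOD + ph0)
            for x in range(WIDTH)]

def composite(images):
    """Pixel-wise average of the phase-shifted captures."""
    return [sum(col) / len(images) for col in zip(*images)]

def similarity(img_a, img_b):
    """Assumed measure: 1 for identical images, falling with RMS difference."""
    rms = math.sqrt(sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a))
    return 1.0 / (1.0 + rms)

uniform = [A] * WIDTH  # ideal capture under uniform illumination

steady = composite([fringe_image(p) for p in (0, 90, 180, 270)])
# Simulate blur: the 0-degree exposure is displaced by 3 pixels mid-sequence.
shaken = composite([fringe_image(0, shift=3)] +
                   [fringe_image(p) for p in (90, 180, 270)])

sim_ok = similarity(steady, uniform)
sim_ng = similarity(shaken, uniform)
```

The blur-free composite matches the uniform image almost exactly, while the displaced exposure leaves residual fringes that lower the score.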
- the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the storage unit 15 of the present embodiment stores in advance template images, which are images captured with the lattice pattern projected onto the measurement target.
- for example, the storage unit 15 stores template images A′ (initial phase 0 degrees), B′ (initial phase 90 degrees), C′ (initial phase 180 degrees), and D′ (initial phase 270 degrees), in which the measurement target was imaged in advance for each initial phase.
- the feature amount calculation unit 16 reads the template image stored in the storage unit 15 and calculates the similarity between the read template image and the captured image generated by the imaging unit 13 for each corresponding initial phase.
- the determination unit 17 determines that there is no blur if the similarity calculated by the feature amount calculation unit 16 is equal to or greater than a predetermined threshold, and determines that there is blur if the similarity is not greater than the threshold.
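One way to realize this per-phase comparison, assuming a normalized correlation as the similarity measure (the patent does not name one), is sketched below; the score is insensitive to overall gain and offset changes between the template and the new capture, but drops when the fringes shift because of motion:

```python
import math

def pearson(xs, ys):
    """Normalized (Pearson) correlation between two equal-length pixel rows."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

PERIOD = 16
template = [math.cos(2 * math.pi * x / PERIOD) for x in range(64)]  # stored A' row
captured_ok = [0.97 * v + 0.01 for v in template]                   # gain/offset only
captured_ng = [math.cos(2 * math.pi * (x + 4) / PERIOD)             # quarter-period
               for x in range(64)]                                  # shift from motion

r_ok = pearson(template, captured_ok)
r_ng = pearson(template, captured_ng)
```

In the device, each new capture would be correlated against the template of the matching initial phase and compared with the threshold of the determination unit 17.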
- the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the feature amount calculation unit 16 of the present embodiment Fourier-transforms each of the plurality of captured images captured by the imaging unit 13, and calculates the spatial frequency distribution as a feature amount.
- the determination unit 17 determines that there is blur between the captured images when the spatial frequency distributions of the plurality of captured images differ, and that there is no blur between the captured images when the spatial frequency distributions are substantially the same.
- if blur occurs while only some of the images are being captured, the spatial frequency distribution of those captured images is expected to differ from that of the other captured images. Therefore, when the spatial frequency distributions fail to match among the plurality of captured images beyond a predetermined condition, it is determined that blurring exists.
- the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the irradiation unit 14 of the present embodiment irradiates the measurement target with a lattice pattern based on the spatial code method, and the feature amount calculation unit 16 calculates a spatial code for each of the plurality of captured images as a feature amount.
- the determination unit 17 determines that there is blur in the captured images if the coordinate positions based on the spatial codes calculated for the plurality of captured images differ, and that there is no blur if the coordinate positions are substantially the same.
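A common realization of the spatial code method, assumed here since the patent does not detail the code, is Gray-code stripe indexing: each projected binary pattern contributes one bit, and the decoded bit sequence gives the stripe coordinate of a pixel. A bit misread during a blurred exposure yields a different coordinate, which is exactly the mismatch the determination unit 17 looks for:

```python
def gray_encode(n):
    """Binary stripe index -> Gray code."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Gray code -> binary stripe index."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

NUM_BITS = 4  # four projected patterns address 16 stripes (small toy example)

def pattern_bits(stripe_index):
    """Bits a pixel in the given stripe observes across the projected patterns."""
    g = gray_encode(stripe_index)
    return [(g >> (NUM_BITS - 1 - b)) & 1 for b in range(NUM_BITS)]

def decode_bits(bits):
    """Observed bit sequence -> stripe coordinate."""
    g = 0
    for b in bits:
        g = (g << 1) | b
    return gray_decode(g)

stripe = 9
clean = decode_bits(pattern_bits(stripe))  # blur-free: recovers the stripe index
corrupted = pattern_bits(stripe)
corrupted[1] ^= 1                          # blur during one pattern flips a bit
shifted = decode_bits(corrupted)           # coordinate no longer matches
```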
- At least two determination processes can be combined to perform a determination process with higher accuracy.
- the user may select in advance at least one of the determination processes according to the first to sixth embodiments described above, and the determination process (or combination of determination processes) based on that selection may be executed.
- the display unit 12 displays a warning when the determination unit 17 determines that the captured image is blurred.
- alternatively, the shape measuring device 10 may include a speaker that outputs sound, and a warning sound may be output.
- a message may also be transmitted to a terminal connected to the shape measuring apparatus 10 via a wired or wireless line, notifying the terminal that the determination unit 17 has determined that the captured image is blurred.
- FIG. 10 is a diagram showing a configuration of the structure manufacturing system 100 of the present embodiment.
- the structure manufacturing system 100 of this embodiment includes the shape measuring device 10, the design device 20, the molding device 30, the control device (inspection device) 40, and the repair device 50 as described in the above embodiment.
- the control device 40 includes a coordinate storage unit 41 and an inspection unit 42.
- the design device 20 creates design information related to the shape of the structure, and transmits the created design information to the molding device 30.
- the design apparatus 20 stores the created design information in the coordinate storage unit 41 of the control apparatus 40.
- the design information includes information indicating the coordinates of each position of the structure.
- the molding apparatus 30 produces the above-described structure based on the design information input from the design apparatus 20.
- the molding performed by the molding apparatus 30 includes, for example, casting, forging, and cutting.
- the shape measuring device 10 measures the coordinates of the manufactured structure (measurement object) and transmits information (shape information) indicating the measured coordinates to the control device 40.
- the coordinate storage unit 41 of the control device 40 stores design information.
- the inspection unit 42 of the control device 40 reads design information from the coordinate storage unit 41.
- the inspection unit 42 compares the information (shape information) indicating the coordinates received from the shape measuring apparatus 10 with the design information read from the coordinate storage unit 41.
- the inspection unit 42 determines whether or not the structure has been molded according to the design information based on the comparison result. In other words, the inspection unit 42 determines whether or not the manufactured structure is a non-defective product.
- the inspection unit 42 determines whether the structure can be repaired when the structure is not molded according to the design information.
- the inspection unit 42 calculates a defective portion and a repair amount based on the comparison result, and transmits information indicating the defective portion and information indicating the repair amount to the repair device 50.
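The comparison and the repair-amount calculation can be sketched as a per-point deviation check. The tolerance, the repair limit, and the one-to-one pairing of measured and design points below are assumptions, since the patent leaves the concrete criterion open:

```python
import math

TOLERANCE = 0.05     # mm; assumed acceptance tolerance
REPAIR_LIMIT = 0.5   # mm; assumed largest repairable deviation

def inspect(measured, design, tol=TOLERANCE, limit=REPAIR_LIMIT):
    """Compare measured coordinates against design coordinates point by point."""
    defects = []
    repairable = True
    for idx, (m, d) in enumerate(zip(measured, design)):
        deviation = math.dist(m, d)
        if deviation > tol:
            defects.append((idx, round(deviation, 6)))  # defective point, repair amount
            if deviation > limit:
                repairable = False
    return {"good": not defects, "repairable": repairable, "defects": defects}

design_pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
measured_pts = [(0.0, 0.0, 0.01), (1.0, 0.0, 0.2), (1.0, 1.0, 0.0)]
report = inspect(measured_pts, design_pts)
```

A rejected point whose deviation stays under the repair limit would be forwarded to the repair device 50 together with its repair amount.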
- the repair device 50 processes the defective portion of the structure based on the information indicating the defective portion received from the control device 40 and the information indicating the repair amount.
- FIG. 11 is a flowchart showing the structure manufacturing method of the present embodiment.
- each process of the structure manufacturing method illustrated in FIG. 11 is executed by each unit of the structure manufacturing system 100.
- the design apparatus 20 creates design information related to the shape of the structure (step S31).
- molding apparatus 30 produces the said structure based on design information (step S32).
- the shape measuring apparatus 10 measures the shape of the manufactured structure (step S33).
- the inspection unit 42 of the control device 40 compares the shape information obtained by the shape measuring device 10 with the above-described design information to inspect whether or not the structure has been manufactured according to the design information (step S34).
- the inspection unit 42 of the control device 40 determines whether or not the manufactured structure is a good product (step S35).
- when the inspection unit 42 determines in step S35 that the manufactured structure is a non-defective product (step S35: YES), the structure manufacturing system 100 ends the process.
- when the inspection unit 42 determines that the manufactured structure is not a non-defective product (step S35: NO), it determines whether the manufactured structure can be repaired (step S36).
- in step S36, when the inspection unit 42 determines that the manufactured structure can be repaired (step S36: YES), the repair device 50 reworks the structure (step S37), after which the process returns.
- when the inspection unit 42 determines that the manufactured structure cannot be repaired (step S36: NO), the structure manufacturing system 100 ends the process.
- as described above, since the shape measuring apparatus 10 in the first to sixth embodiments can accurately measure the coordinates of the structure, it can be determined whether or not the manufactured structure is a non-defective product. Moreover, the structure manufacturing system 100 can rework and repair the structure when the structure is not a non-defective product.
- the repair process performed by the repair apparatus 50 in this embodiment may be replaced with a process in which the molding apparatus 30 re-executes the molding process.
- in that case, the molding apparatus 30 cuts, for example, a portion of the structure that should originally have been cut but was not. Thereby, the structure manufacturing system 100 can manufacture the structure accurately.
- the blur determination processing may be performed by recording a program for realizing the functions of the processing units in the present invention on a computer-readable recording medium, and causing a computer system to read and execute the program recorded on that recording medium.
- the “computer system” includes an OS and hardware such as peripheral devices.
- the “computer system” includes a WWW system having a homepage providing environment (or display environment).
- the “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, and a storage device such as a hard disk incorporated in a computer system.
- the “computer-readable recording medium” also includes a medium that holds the program for a certain period of time, such as a volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
- the program may be transmitted from a computer system storing the program in a storage device or the like to another computer system via a transmission medium or by a transmission wave in the transmission medium.
- the “transmission medium” for transmitting the program refers to a medium having a function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
- the program may be one that realizes a part of the functions described above. Furthermore, the program may realize the above-described functions in combination with a program already recorded in the computer system, that is, a so-called difference file (difference program).
- the blur determination is not limited to comparing images or comparing the spatial frequency distributions of images to calculate a similarity; any method may be used as long as it detects whether or not the measurement target or the shape measuring device moves relatively while the images necessary for the calculation are being acquired.
- the present invention can be applied to a structure manufacturing system that can determine whether or not a manufactured structure is a non-defective product. Thereby, the inspection accuracy of the manufactured structure can be improved, and the manufacturing efficiency of the structure can be improved.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Quality & Reliability (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
An objective of the present invention is to evaluate the presence or absence of blur in a captured image for the purpose of measuring the shape of an object to be measured. According to the invention, a shape measuring device comprises: an image capture unit that generates a captured image of the object to be measured; an illumination unit that illuminates the object to be measured with illumination light having a prescribed intensity distribution, from a direction different from the direction in which the image capture unit captures the image, such that the image captured by the image capture unit is captured as an image in which a grid pattern is projected onto the object to be measured; a feature value calculation unit that calculates, from the captured image, a feature value that reveals the degree of blur present in the captured image; and an evaluation unit that evaluates the presence or absence of blur in the captured image on the basis of the feature value.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011196865 | 2011-09-09 | ||
JP2011-196865 | 2011-09-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013035847A1 (fr) | 2013-03-14 |
Family
ID=47832286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/072913 WO2013035847A1 (fr) | 2011-09-09 | 2012-09-07 | Dispositif de mesure de forme, système de fabrication de structure, procédé de mesure de forme, procédé de fabrication de structure, programme de mesure de forme et support d'enregistrement lisible par ordinateur |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2013035847A1 (fr) |
TW (1) | TW201329419A (fr) |
WO (1) | WO2013035847A1 (fr) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105027159A (zh) * | 2013-03-26 | 2015-11-04 | 凸版印刷株式会社 | 图像处理装置、图像处理系统、图像处理方法及图像处理程序 |
WO2016075978A1 (fr) * | 2014-11-12 | 2016-05-19 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations, et programme |
US11313676B2 (en) * | 2018-02-07 | 2022-04-26 | Omron Corporation | Three-dimensional measurement apparatus, three-dimensional measurement method, and three-dimensional measurement non-transitory computer readable medium |
CN118096592A (zh) * | 2024-04-23 | 2024-05-28 | 荣耀终端有限公司 | 图像处理方法、电子设备、存储介质及芯片系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004070421A (ja) * | 2002-08-01 | 2004-03-04 | Fuji Photo Film Co Ltd | 画像処理システム、画像処理方法、撮像装置、及び画像処理装置 |
US20060058972A1 (en) * | 2004-09-15 | 2006-03-16 | Asml Netherlands B.V. | Method and apparatus for vibration detection, method and apparatus for vibration analysis, lithographic apparatus, device manufacturing method, and computer program |
US20060120618A1 (en) * | 2004-12-06 | 2006-06-08 | Canon Kabushiki Kaisha | Image processing apparatus, method of controlling thereof, and program |
JP2007158488A (ja) * | 2005-12-01 | 2007-06-21 | Sharp Corp | 手ぶれ検出装置 |
JP2007278951A (ja) * | 2006-04-10 | 2007-10-25 | Alpine Electronics Inc | 車体挙動測定装置 |
JP2008190962A (ja) * | 2007-02-02 | 2008-08-21 | Aidin System Kk | 3次元計測装置 |
US20100158490A1 (en) * | 2008-12-19 | 2010-06-24 | Sirona Dental Systems Gmbh | Method and device for optical scanning of three-dimensional objects by means of a dental 3d camera using a triangulation method |
- 2012-09-07 TW TW101132906A patent/TW201329419A/zh unknown
- 2012-09-07 WO PCT/JP2012/072913 patent/WO2013035847A1/fr active Application Filing
- 2012-09-07 JP JP2013532674A patent/JPWO2013035847A1/ja active Pending
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105027159A (zh) * | 2013-03-26 | 2015-11-04 | 凸版印刷株式会社 | 图像处理装置、图像处理系统、图像处理方法及图像处理程序 |
EP2980752A4 (fr) * | 2013-03-26 | 2016-11-09 | Toppan Printing Co Ltd | Dispositif de traitement d'images, système de traitement d'images, procédé de traitement d'images et programme de traitement d'images |
JPWO2014157348A1 (ja) * | 2013-03-26 | 2017-02-16 | 凸版印刷株式会社 | 画像処理装置、画像処理システム、画像処理方法及び画像処理プログラム |
US10068339B2 (en) | 2013-03-26 | 2018-09-04 | Toppan Printing Co., Ltd. | Image processing device, image processing system, image processing method and image processing program |
WO2016075978A1 (fr) * | 2014-11-12 | 2016-05-19 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations, et programme |
KR20170085494A (ko) * | 2014-11-12 | 2017-07-24 | 소니 주식회사 | 정보 처리 장치, 정보 처리 방법 및 프로그램 |
US11189024B2 (en) | 2014-11-12 | 2021-11-30 | Sony Corporation | Information processing apparatus and information processing method |
KR102503872B1 (ko) * | 2014-11-12 | 2023-02-27 | 소니그룹주식회사 | 정보 처리 장치, 정보 처리 방법 및 프로그램 |
US11313676B2 (en) * | 2018-02-07 | 2022-04-26 | Omron Corporation | Three-dimensional measurement apparatus, three-dimensional measurement method, and three-dimensional measurement non-transitory computer readable medium |
CN118096592A (zh) * | 2024-04-23 | 2024-05-28 | 荣耀终端有限公司 | 图像处理方法、电子设备、存储介质及芯片系统 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013035847A1 (ja) | 2015-03-23 |
TW201329419A (zh) | 2013-07-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5395507B2 (ja) | 三次元形状測定装置、三次元形状測定方法及びコンピュータプログラム | |
KR101639227B1 (ko) | 3차원 형상 측정장치 | |
DK2198780T3 (en) | Method and Device for Optical Scanning of Three-Dimensional Objects Using a 3D Dental Camera Using Triangulation | |
JP5994787B2 (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、形状測定プログラム | |
WO2012057284A1 (fr) | Dispositif de mesure de forme en trois dimensions, procédé de mesure de forme en trois dimensions, procédé de fabrication de structure et système de fabrication de structure | |
JP2009036589A (ja) | 校正用ターゲット、校正支援装置、校正支援方法、および校正支援プログラム | |
JP5385703B2 (ja) | 検査装置、検査方法および検査プログラム | |
JP5595211B2 (ja) | 三次元形状測定装置、三次元形状測定方法及びコンピュータプログラム | |
KR100752758B1 (ko) | 영상 측정 장치 및 그 방법 | |
US8970674B2 (en) | Three-dimensional measurement apparatus, three-dimensional measurement method and storage medium | |
WO2013035847A1 (fr) | Dispositif de mesure de forme, système de fabrication de structure, procédé de mesure de forme, procédé de fabrication de structure, programme de mesure de forme et support d'enregistrement lisible par ordinateur | |
JP2010181247A (ja) | 形状測定装置及び形状測定方法 | |
KR101465996B1 (ko) | 선택적 큰 주기를 이용한 고속 3차원 형상 측정 방법 | |
JP2006084286A (ja) | 3次元計測方法とその計測装置 | |
JP2012093234A (ja) | 三次元形状測定装置、三次元形状測定方法、構造物の製造方法および構造物製造システム | |
JP2014035198A (ja) | 形状測定装置、構造物製造システム、形状測定方法、構造物製造方法、及び形状測定プログラム | |
JP2016008837A (ja) | 形状測定方法、形状測定装置、構造物製造システム、構造物製造方法、及び形状測定プログラム | |
JP2017125707A (ja) | 計測方法および計測装置 | |
JP2008170282A (ja) | 形状測定装置 | |
JP2010210410A (ja) | 三次元形状測定装置 | |
KR102025498B1 (ko) | 3차원 형상 측정 현미경에서 모아레 패턴 노이즈 제거 방법 | |
JP4930834B2 (ja) | 形状測定方法 | |
JP2010210507A (ja) | 測定装置及び測定方法 | |
JP6460944B2 (ja) | 被測定物の位置姿勢測定方法 | |
JP2008216076A (ja) | 形状測定方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12829855 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013532674 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12829855 Country of ref document: EP Kind code of ref document: A1 |