
WO2018185893A1 - Unmeasured region extraction device, work guidance device, and work guidance method - Google Patents

Unmeasured region extraction device, work guidance device, and work guidance method

Info

Publication number
WO2018185893A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
measured
image
unit
measurement
Prior art date
Application number
PCT/JP2017/014280
Other languages
English (en)
Japanese (ja)
Inventor
昌也 松平
Original Assignee
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニコン
Priority to PCT/JP2017/014280 (WO2018185893A1)
Publication of WO2018185893A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B05 SPRAYING OR ATOMISING IN GENERAL; APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C APPARATUS FOR APPLYING FLUENT MATERIALS TO SURFACES, IN GENERAL
    • B05C11/00 Component parts, details or accessories not specifically provided for in groups B05C1/00 - B05C9/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
    • G01B21/08 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness for measuring thickness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/20 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring contours or curvatures, e.g. determining profile

Definitions

  • the present invention relates to an unmeasured region extraction device, a work guidance device, and a work guidance method.
  • A technique for measuring the film thickness of a coating film by pressing a measuring element against a painted surface is known (Patent Document 1).
  • The unmeasured region extraction device includes a position measurement unit that measures position information for each part of the measurement object, an image acquisition unit that acquires image information of the measured parts of the measurement object measured by the position measurement unit, an overall image generation unit that generates overall image information of the measurement object from the image information for each measured part, and an unmeasured part extraction unit that extracts, based on the overall image information, the parts of the measurement object for which position information has not been measured.
  • FIG. 1 is a diagram for explaining a painting work system according to the first embodiment.
  • FIG. 2 schematically shows the painting work performed using the painting work system according to the first embodiment. In FIGS. 1 and 2, an aircraft 11 is illustrated as an example of a painting object to which the painting work system according to the first embodiment is applied.
  • FIG. 1 shows a hangar 2 in which the object to be painted is stored.
  • the hangar 2 is a facility where the aircraft 11 is painted and maintained.
  • The hangar 2 is provided with a gondola 3 on which an operator 4 rides to perform painting work on the object to be painted, a measuring device 10, and a marker 20.
  • FIG. 2 shows the gondola 3.
  • the gondola 3 is equipped with a measuring device 10, and is provided with a painting work guidance control unit (work guidance device) 1, a display device 100, a display device 111 (not shown), and the like.
  • The measuring apparatus 10 measures the distance to the painting object (aircraft 11) and its three-dimensional position, and analyzes the state of the coating film formed on the painting object.
  • the painting work guidance control unit 1 has a function of instructing the worker 4 of a suitable painting work, and extracts an unmeasured area that has not been measured by the measuring apparatus 10 in order to execute the function. Further, the painting work guidance control unit 1 generates information on the unmeasured area of the painting object, and outputs information on the unmeasured area and its painting state to the display device 111.
  • the display device 100 displays an image generated regarding the film thickness distribution information and the painting work information of the coating film applied to the painting object.
  • the worker 4 can execute re-measurement and re-painting of the coating state of the object to be painted by confirming the contents displayed on the display devices 100 and 111.
  • the painting work guidance control unit 1 and the display devices 100 and 111 are separate, but the display devices 100 and 111 may be integrated. Further, the present invention includes one in which the display devices 100 and 111 are integrated with the painting work guidance control unit 1.
  • the painting work guidance control unit 1, the measurement device 10, the display device 100, and the display device 111 are arranged on the gondola 3, and the marker 20 is arranged on the beam of the hangar 2, The position where these are arranged is not limited to this. Further, when the worker 4 performs the painting work without using the gondola 3, for example, when the painting work is performed while standing on the ground at the work site, the measuring device 10 can be applied.
  • the gondola 3 is configured to move to a desired position by supplying power to a drive unit (not shown) by an operation by the operator 4 or the like.
  • the worker 4 appropriately moves the gondola 3 along the painting target surface of the aircraft 11 that is the painting target, and operates the painting device 5 to perform the painting work.
  • only one gondola 3 is shown to simplify the drawing, but a plurality of gondolas may be provided according to the size of the aircraft 11 or the like.
  • the coating device 5 is, for example, a spray device (painting gun) that ejects paint, and a nozzle is attached to the tip.
  • The painting device 5 is connected to a paint supply device (not shown) (a tank, a paint supply pump, and the like) via a hose 6, and when the trigger provided on the painting device 5 is operated, the paint supplied from the paint supply device is discharged (sprayed).
  • The nozzle is exchangeable, and the paint application pattern (paint ejection pattern) can be changed by exchanging it for a nozzle whose paint ejection opening has a different shape.
  • the measuring device 10 measures the distance from the measuring device 10 to the painting object in a non-contact manner with respect to the painting object by irradiating the painting object with light and receiving the light reflected from the painting object.
  • the measuring device 10 is a laser radar device that measures distance by the Time Of Flight method, and irradiates the aircraft 11 that is the object to be coated with laser light that is frequency-modulated so that the frequency changes with time.
  • the measuring apparatus 10 includes an azimuth angle changing unit and an elevation angle changing unit so that the direction in which laser light is emitted can be freely set in a three-dimensional space.
  • the measuring device 10 calculates the distance between the measuring device 10 and the measurement point of the aircraft 11 based on the frequency difference between the laser beam reflected from the object to be coated and the reference laser beam. Moreover, the measuring apparatus 10 can generate three-dimensional position information of the position reflected from the azimuth angle and the elevation angle.
  • The measurement apparatus 10 may instead calculate the distance between the measurement apparatus 10 and the measurement point based on the phase difference of the amplitude modulation between the reflected laser beam and the reference laser beam. Further, when there is an obstacle between the measuring device 10 and the measurement target position on the painting object, a mirror may be arranged so that the measurement light is directed to the measurement target position. In this way, the laser beam from the measuring apparatus 10 can be irradiated onto the object to be painted via the mirror, and the reflected light from the object can also be detected by the measuring apparatus 10 via the mirror.
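
As a rough numerical illustration of the frequency-difference ranging described above, the following Python sketch converts a beat frequency into a distance. It assumes a linear (sawtooth) frequency sweep and a stationary target; the sweep bandwidth, sweep period, and beat frequency in the example are illustrative values, not parameters taken from this publication.

```python
# Minimal FMCW ranging sketch (illustrative values, not from this publication).
# With a linear frequency sweep and a stationary target, the beat frequency
# between the reflected beam and the reference beam is proportional to the
# round-trip delay.

C = 299_792_458.0  # speed of light [m/s]

def fmcw_distance(beat_freq_hz: float,
                  sweep_bandwidth_hz: float,
                  sweep_period_s: float) -> float:
    """Return the target distance in metres for a linear FMCW sweep."""
    round_trip_delay = beat_freq_hz * sweep_period_s / sweep_bandwidth_hz
    return C * round_trip_delay / 2.0

if __name__ == "__main__":
    # Example: 1.5 GHz sweep over 1 ms with a 100 kHz beat frequency -> ~10 m.
    print(fmcw_distance(beat_freq_hz=100e3,
                        sweep_bandwidth_hz=1.5e9,
                        sweep_period_s=1e-3))
```
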
  • The marker 20 is arranged at a known position in the hangar 2.
  • the position of the marker 20 is a reference position for specifying the positions of the measuring device 10 and the aircraft 11 in the hangar 2.
  • each measuring device 10 measures the position of the marker 20, thereby obtaining the relative position and angle of the measuring device 10 with respect to the marker 20.
  • the three-dimensional spatial position in the hangar 2 of the measuring apparatus 10 is obtained.
  • As shown in FIG. 1, by installing markers 20 at a plurality of locations in the hangar 2 and using the position of each marker 20 in the hangar 2 as a reference, the positions of the measuring device 10 and of measurement points over a wide range in the hangar 2 can be obtained.
  • By using the horizontal angle (azimuth angle) and the vertical angle (elevation or depression angle) of the irradiated laser beam together with the three-dimensional spatial position information of the measuring device 10 itself, the measuring device 10 can calculate the three-dimensional spatial position in the hangar 2 for every position on the painting target surface.
  • As a coordinate system for representing the three-dimensional spatial position, an orthogonal coordinate system or a polar coordinate system is used.
  • the measurement apparatus 10 performs measurement along the surface of the aircraft 11 by sequentially changing the irradiation angles of the laser light in the horizontal direction and the vertical direction (azimuth angle and elevation angle or depression angle). That is, the measuring apparatus 10 acquires point cloud data representing the three-dimensional spatial position of each measurement point of the aircraft 11 by scanning (scanning) the laser light to be irradiated while changing the azimuth angle, the elevation angle, or the depression angle.
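
The point cloud acquisition described above can be pictured with the following sketch, which converts (azimuth, elevation, distance) samples into 3D points. The axis convention and the handling of the scanner origin are illustrative assumptions, not details specified in this publication.

```python
# Convert scanner readings (azimuth, elevation, distance) into 3D points in a
# common hangar frame. The axis convention (x forward, y left, z up) and the
# fixed scanner origin are illustrative assumptions.
import math

def polar_to_point(azimuth_rad, elevation_rad, distance_m, scanner_origin=(0.0, 0.0, 0.0)):
    """Return (x, y, z) of one measurement point in the hangar frame."""
    ox, oy, oz = scanner_origin
    horizontal = distance_m * math.cos(elevation_rad)
    x = ox + horizontal * math.cos(azimuth_rad)
    y = oy + horizontal * math.sin(azimuth_rad)
    z = oz + distance_m * math.sin(elevation_rad)
    return (x, y, z)

def scan_to_point_cloud(samples, scanner_origin=(0.0, 0.0, 0.0)):
    """samples: iterable of (azimuth_rad, elevation_rad, distance_m) tuples."""
    return [polar_to_point(az, el, d, scanner_origin) for az, el, d in samples]

if __name__ == "__main__":
    cloud = scan_to_point_cloud([(0.0, 0.0, 10.0),
                                 (math.radians(10), math.radians(5), 10.2)])
    print(cloud)
```
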
  • When the measurement is performed via the mirror, correct position information can be calculated by correcting the point cloud data based on information indicating the mirror mounting position and the normal direction of the reflecting surface of the mirror.
  • the measuring apparatus 10 generates shape data representing the shape of the aircraft 11 based on the obtained plurality of point cloud data.
  • Although one measuring device 10 is shown to simplify the drawing, a plurality of measuring devices 10 may actually be arranged around the aircraft 11 in order to measure the entire surface of the aircraft 11.
  • the measuring device 10 can be disposed, for example, in the vicinity of the gondola 3, a movable carriage, a fixed base, and the like.
  • the measuring device 10 may be disposed above or below the aircraft 11 or on a self-propelled rail.
  • The measuring device 10 may also be attached to the operator 4.
  • The measuring device 10 measures the distance both before painting and after painting, and acquires three-dimensional position information as spatial position information of the measurement point in the hangar 2, including the distance between the measuring device 10 and the measurement point.
  • the measuring apparatus 10 transmits the acquired three-dimensional position information to the painting work guidance control unit 1 by wireless communication or the like.
  • The painting work guidance control unit 1 acquires the three-dimensional position information of the measurement points acquired by the measurement device 10 before painting and the three-dimensional position information of the measurement points acquired by the measurement device 10 after painting.
  • the measuring device 10 is programmed in advance so as to measure a plurality of positions within the surface to be painted. For example, the measuring apparatus 10 performs measurement at a predetermined pitch along the route information set on the painting target surface.
  • the measuring device 10 is rotatably provided with a mirror whose tilt angle can be controlled. By irradiating the object to be coated with the laser light through this mirror, the azimuth angle and elevation angle of the laser light are changed.
  • the irradiation direction of the laser beam can be set so that the laser beam is irradiated to a desired position on the surface to be coated.
  • the measuring apparatus 10 controls the irradiation direction of the laser light and the moving condition of the gondola 3 so that the measurement is performed at a predetermined pitch on the surface to be coated.
  • The difference in the three-dimensional position information before and after painting corresponds to the thickness of the coating film (or the thickness of the paint) formed at the measurement point of the painting object. Therefore, the film thickness of the coating film formed at each measurement point can be calculated from the difference between the three-dimensional position information before painting and the three-dimensional position information after painting for the measurement points included in each painting target area.
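
A minimal sketch of the film-thickness calculation from the before/after measurements described above. It assumes that the same laser directions are revisited before and after painting and that outward surface normals are available; the function and variable names are placeholders for illustration.

```python
# Film-thickness sketch: for measurement points taken along the same laser
# direction before and after painting, the coating shortens the measured range
# by the thickness along the ray. Projecting onto the surface normal gives the
# thickness perpendicular to the surface. Per-ray correspondence and known
# normals are simplifying assumptions.
import numpy as np

def film_thickness(dist_before_m, dist_after_m, ray_dirs, normals):
    """
    dist_before_m, dist_after_m: (N,) ranges per ray before/after painting.
    ray_dirs, normals: (N, 3) unit vectors (laser direction, surface normal).
    Returns the per-point thickness in metres, measured along the surface normal.
    """
    along_ray = np.asarray(dist_before_m) - np.asarray(dist_after_m)
    # cosine of the angle between the laser ray and the surface normal
    cos_theta = np.abs(np.einsum("ij,ij->i", ray_dirs, normals))
    return along_ray * cos_theta

if __name__ == "__main__":
    rays = np.array([[0.0, 0.0, 1.0]])
    normals = np.array([[0.0, 0.0, 1.0]])
    # 100 micrometres of paint: the after-painting range is 0.0001 m shorter.
    print(film_thickness([10.0000], [9.9999], rays, normals))  # ~1e-4 m
```
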
  • the measuring device 10 can acquire three-dimensional position information at each timing before painting, during painting, and after painting by measuring the vicinity of the area where the worker 4 is performing the painting work as needed.
  • the coating operation guidance control unit 1 can appropriately acquire the film thickness of the coating film formed by coating based on the three-dimensional position information during the coating operation.
  • The painting work guidance control unit 1 may estimate the shape change of the painting object based on the temperature difference before and after painting in the painting target region and correct the acquired three-dimensional position information before painting accordingly.
  • In this case, the painting work guidance control unit 1 calculates the film thickness of the coating film based on the corrected three-dimensional position information before painting and the three-dimensional position information after painting.
  • Alternatively, the measuring apparatus 10 may measure the distance only in the range about to be painted immediately before painting and measure the distance again immediately after that range has been painted, and the painting work guidance control unit 1 may calculate the film thickness of the coating film based on those measurement results.
  • the measuring device 10 measures the three-dimensional position of the coating device 5 and the posture of the coating device 5 in addition to measuring the three-dimensional position of the measurement point in the coating target region. Thereby, information related to the position and posture of the coating apparatus 5 is acquired.
  • The information related to the posture of the coating apparatus 5 is, for example, information related to the orientation of the nozzle of the coating apparatus 5, and this information can be acquired by measuring the distance from the measuring apparatus 10 to a plurality of locations on the coating apparatus 5.
  • the measuring apparatus 10 transmits information related to the acquired position and orientation of the painting apparatus 5 to the painting work guidance control unit 1 by wireless communication or the like.
  • a distance sensor and an inclination sensor may be provided in the coating apparatus 5 instead of performing measurement related to the position and orientation of the coating apparatus 5 by the measuring apparatus 10.
  • the distance sensor measures the distance from the coating device 5 to the measurement point, and the inclination sensor measures the posture of the coating device 5.
  • the measuring device 10 includes an imaging device 102 (see FIG. 3).
  • the measuring device 10 acquires a captured image of the painting target surface including the measurement point by the imaging device 102, and generates image information including color information of the painting target surface.
  • the optical axis of the optical system that measures the distance from the measurement apparatus 10 to the measurement point and the optical system that captures the image of the surface to be coated are shared, and the distance measurement and imaging are performed simultaneously.
  • The image data including the captured color information of the vicinity region including the measurement point is associated with each measurement point and with the measured distance from the measurement apparatus 10. That is, the measuring apparatus 10 generates the three-dimensional position information of the painting target surface and the image data near the position corresponding to the three-dimensional position information.
  • These pieces of information are transmitted to the painting work guidance control unit 1 by wireless communication or the like and stored in the storage unit 105.
  • the imaging device 102 may be provided at a position different from the measurement device 10 instead of being provided in the measurement device 10.
  • the painting work guidance control unit 1 has, for example, an arithmetic processing circuit such as a CPU and a memory such as a ROM and a RAM, and executes a predetermined program to execute its function. Further, the painting work guidance control unit 1 acquires painting apparatus information related to the painting apparatus 5 by an input operation or the like by the operator 4.
  • the coating device information is, for example, discharge information that is information on the type of nozzle of the coating device 5, the discharge amount and discharge distribution of the paint discharged (spouted) from the nozzle, and the like.
  • the painting work guidance control unit 1 generates work information related to the painting work to be performed by the worker 4 based on the coating apparatus information and the target film thickness information which is information on the film thickness of the coating film to be formed.
  • When the painting device 5 is moved by a robot arm, control information such as the moving direction and moving speed at which the robot arm moves the painting device 5 along the surface to be painted and the discharge amount of paint from the painting device 5 is generated.
  • the painting work guidance control unit 1 can issue a warning when the painting work or the measurement work is not performed even though the painting work is to be performed.
  • The painting work guidance control unit 1 generates image data for displaying a film thickness distribution image, which is an image representing the film thickness distribution information of the coating film, and a work instruction image, which is an image instructing the work. Further, when painting is started in an area where no coating film has been formed, the painting work guidance control unit 1 may generate work information or control information to be supplied to an automatic painting device based on the painting apparatus information and the target film thickness information. Such information is transmitted to the display device 100 or the display device 111 by wireless communication or the like.
  • the display device 111 is, for example, a projector that projects and displays an image, and displays the image based on the image data transmitted from the painting work guidance control unit 1.
  • the display device 111 projects and displays an image on the painting target surface based on the image data output by the painting work guidance control unit 1.
  • the operator 4 can perform a painting operation or a measurement operation according to the operation instruction image displayed by the display device 111.
  • Although one display device 111 is shown to simplify the drawing, a plurality of display devices 111 may be arranged around the aircraft 11 in order to project images onto the entire surface of the aircraft 11.
  • the display device 111 is disposed at a position near the gondola 3, a movable carriage, a fixed base, a position near the beam or column of the hangar 2, and the like.
  • the display device 111 may be provided in the measurement device 10.
  • the work instruction image relating to the painting work is projected onto the painting target surface by the display device 111, whereby the worker 4 is instructed to perform the painting work.
  • a CRT, a liquid crystal display device, or the like may be used as the display device 111.
  • The worker 4 may wear a head-mounted display (HMD) as the display device 111 so that a work instruction image related to the painting work is presented to the worker 4. A tablet terminal or the like that has the functions of the painting work guidance control unit 1 and the display device 111 may also be provided.
  • FIG. 3 is a block diagram for explaining an example of the configuration of the measuring apparatus 10, the painting work guidance control unit 1, the display device 100, and the display device 111 according to the first embodiment.
  • the measurement device 10 includes a position measurement unit 101 and an imaging device 102.
  • The painting work guidance control unit 1 includes an unmeasured region extraction unit 25, a work information generation unit 60, a painting work instruction image generation unit 70, a coating film information acquisition unit 108, an alignment unit 109, and an image generation unit 110.
  • The unmeasured region extraction unit 25 includes an image acquisition unit 103, a storage unit 105, an entire image generation unit 106, and an unmeasured part extraction unit 107.
  • The unmeasured region extraction unit 25 may be included in the measurement apparatus 10.
  • The position measuring unit 101 acquires relative position information of the measuring device 10 with respect to the marker 20 at the time of measurement. If the position measuring unit 101 can directly output the distance information from the measuring device 10 to the measurement point, the azimuth angle and elevation angle at the time of measuring the measurement point and the distance information may be generated as the three-dimensional position information in correspondence with the captured image information.
  • The image acquisition unit 103 performs cut-out processing based on the measurement pitch information from the measurement device 10 so that the captured image information falls within an image range corresponding to the measurement pitch.
  • The measuring apparatus 10 outputs, to the painting work guidance control unit 1, time information about when the position measurement unit 101 generated the three-dimensional position information of the surface to be painted and time information about when the image data near the position corresponding to that three-dimensional position information was acquired.
  • the measuring device 10 is fixed to the gondola 3 and arranged.
  • Since the gondola 3 is stopped during the painting operation and the position information of the painting target area is acquired before and after the painting operation, the position information acquired by the measuring apparatus 10 at these two points in time necessarily relates to the same painting target area.
  • the imaging device 102 captures an image including each measurement point in the painting target range and generates image information when the position measurement unit 101 measures the three-dimensional position of each measurement point of the aircraft 11 (coating object).
  • the imaging unit 102 includes an imaging optical system and an imaging element that outputs a signal according to the light intensity distribution of an image formed by the imaging optical system.
  • The measuring device 10 acquires position information and image information for the same measurement point by means of the position measurement unit 101 and the imaging device 102. That is, the image information generated by the imaging device 102 is image information of a range including the measurement points whose three-dimensional positions are measured by the position measurement unit 101.
  • Image information captured and generated by the imaging device 102 is hereinafter referred to as captured image information.
  • the imaging device 102 generates angle-of-view information related to the angle of view of the captured image information in addition to the captured image information.
  • the captured image information and the angle-of-view information are transmitted from the measuring device 10 to the painting work guidance control unit 1.
  • the painting work guidance control unit 1 acquires the captured image information and the angle-of-view information generated by the imaging unit 102 from the measurement device 10. Further, the painting work guidance control unit 1 acquires from the measurement device 10 position information generated by the position measurement unit 101 and information indicating the position of the measurement device 10 at the time of position measurement.
  • The information indicating the position of the measuring apparatus 10 is information indicating the three-dimensional position of the measuring apparatus 10 when the measuring apparatus 10 performs position measurement while sequentially moving to a plurality of positions, alternating with the painting operation. For example, in a state where the measuring device 10 is fixed to the gondola 3 and the gondola 3 is moved sequentially to perform painting work and position measurement, the information indicating the position of the measuring device 10 corresponds to the position of the gondola 3 along its traveling direction.
  • the three-dimensional position information of each measurement point and the captured image information and the angle-of-view information respectively corresponding to each measurement point transmitted from the measurement apparatus 10 to the painting work guidance control unit 1 are stored in the storage unit 105.
  • The image acquisition unit 103 performs cut-out processing based on the measurement pitch information from the measurement device 10 so that the captured image information falls within an image range corresponding to the measurement pitch. That is, when the image acquisition unit 103 detects that position information relating to three or more of the measurement points on the surface to be painted is stored in the storage unit 105, the image acquisition unit 103 calculates a cut-out range from the stored captured image information. The shooting range on the surface to be painted changes depending on the distance from a given measurement point to the measuring device 10. The shooting range also changes as the angle between the measurement direction of the position measuring unit 101 and the normal direction of the surface to be painted at the measurement point deviates from the parallel state.
  • The image acquisition unit 103 therefore decides the image generation range to be extracted from the captured image information as the image information corresponding to the measurement pitch, based on the distance from the measurement point to the measurement apparatus 10 or on the three-dimensional position information of the measurement points around that measurement point.
  • The image acquisition unit 103 extracts image information corresponding to the determined image generation range from the captured image information.
  • The image information generated in this way by the image acquisition unit 103 is image information corresponding to a range including the measurement points measured by the position measurement unit 101. A specific example will be described with reference to FIG. 4.
  • FIG. 4 is a diagram for explaining an example of processing by the image acquisition unit 103 of the painting work guidance control unit 1 according to the first embodiment.
  • a range 80a is a range surrounded by the point ABCD, and indicates a range of captured image information acquired by the imaging device 102 when measuring the three-dimensional position of the measurement point 81a.
  • a range 80b is a range surrounded by the point EFGH, and indicates a range of captured image information acquired by the imaging apparatus 102 when measuring the three-dimensional position of the measurement point 81b that is the measurement point next to the measurement point 81a.
  • each measurement point is located at the center.
  • FIG. 4 shows a measurement pitch 83 on the measurement target surface.
  • The image acquisition unit 103 calculates the shooting range corresponding to the set measurement pitch 83 when the measurement target surface is a flat surface and the measurement direction of the measurement apparatus 10 is perpendicular to the measurement target surface. For example, when the focal length of the imaging optical system of the imaging device 102 is f, the distance between adjacent measurement points is P, and the distance from the measurement device 10 to the measurement point acquired by the position measurement unit 101 is d, the length of the shooting range in the measurement pitch direction can be calculated as d × P / f. Based on the shooting range and the information indicating the measurement pitch, image generation ranges 82a and 82b corresponding to the measurement pitch are set.
  • the image generation range 82a is a range surrounded by the points abcd
  • the image generation range 82b is a range surrounded by the points efgh.
  • the image acquisition unit 103 extracts image information corresponding to the image generation range 82a from the captured image information 80a, and extracts image information corresponding to the image generation range 82b from the captured image information 80b.
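
The cut-out of an image range corresponding to the measurement pitch can be sketched as follows for the flat, perpendicular case. Under a pinhole-camera assumption, a length equal to the measurement pitch on the surface maps to focal_length × pitch / distance on the sensor, and dividing by the pixel size gives the crop size in pixels; the pixel size and the square, centre-anchored crop are assumptions made for illustration, not parameters given in this publication.

```python
# Sketch of the cut-out step: with the measurement point at the image centre
# and the surface perpendicular to the measurement direction, a length P on
# the surface maps to f * P / d on the sensor under a pinhole model. Dividing
# by the pixel size gives the crop size in pixels.
import numpy as np

def crop_for_pitch(image, focal_length_mm, distance_mm, pitch_mm, pixel_size_mm):
    """Return the square crop of `image` that corresponds to one measurement pitch."""
    side_px = int(round((focal_length_mm * pitch_mm / distance_mm) / pixel_size_mm))
    h, w = image.shape[:2]
    cy, cx = h // 2, w // 2
    half = max(side_px // 2, 1)
    return image[max(cy - half, 0):cy + half, max(cx - half, 0):cx + half]

if __name__ == "__main__":
    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
    # f = 50 mm, d = 5000 mm, pitch = 100 mm, pixel = 0.005 mm -> 200 px crop
    print(crop_for_pitch(frame, 50.0, 5000.0, 100.0, 0.005).shape)
```
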
  • FIG. 5 is a diagram for explaining another example of processing performed by the image acquisition unit 103 according to the first embodiment.
  • FIG. 5 shows how the image acquisition unit 103 determines whether the measurement target surface is tilted with respect to the measurement direction of the position measurement unit 101, based on the measurement points 81a and 81b and the three-dimensional position information in the vicinity of these measurement points.
  • When the measurement target surface 84 is tilted, that is, when the angle between the measurement direction of the position measurement unit 101 and the normal direction of the painting target surface that is the measurement target surface is equal to or greater than a predetermined angle, the range for cutting out the captured image information is changed according to that angle.
  • When the surface is not tilted, the range for extracting the captured image information corresponding to the measurement point 81a is set to C1, and the range for extracting the captured image information corresponding to the next measurement point 81b is set to the same range.
  • When the surface is tilted, the range for extracting the captured image information corresponding to the measurement point 81a is set to C21, and the range for extracting the captured image information corresponding to the next measurement point 81b' is set to C22, which differs from C21.
  • The measuring apparatus 10 performs three-dimensional position measurement at a plurality of measurement points while changing the azimuth angle and the elevation angle as described above. Therefore, by changing the elevation angle pitch and the azimuth angle pitch for each measured position in accordance with the shape of the aircraft that is the object to be painted, a plurality of measurement points are measured at a predetermined measurement pitch on the object to be painted.
  • However, the aircraft that is the object to be painted may have a shape different from the assumed shape, or the relative positional relationship between the airframe and the measuring device may differ from the assumed relationship. In such cases, the normal direction at the measurement point with respect to the measuring apparatus 10 may differ from the assumed direction, measurement may not be performed at the required measurement pitch, and the coating film thickness may fail to be measured at some points.
  • the image acquisition unit 103 outputs a signal for changing the scanning angle amount of the measurement device 10 to the measurement device 10. Then, the position measuring unit 101 may change the scanning range of the measuring apparatus 10 to perform three-dimensional position measurement.
  • The image acquisition unit 103 calculates the angle formed between the normal direction of the measurement surface 84 and the measurement direction of the measurement apparatus 10, resets the setting information of the azimuth angle or the elevation angle set by the position measurement unit 101 at the time of measurement based on the calculated angle, and generates information indicating the movement amount of the measuring device 10 based on that setting information.
  • the information indicating the movement amount of the measurement device 10 generated in this way is output to the measurement device 10, and the movement amount of the measurement device 10 is changed. For example, the distance between the position of the measurement device 10 when measuring the distance of the measurement point 81a and the position of the measurement device 10 when measuring the distance of the measurement point 81b is reduced, and the number of measurement points is increased.
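
The pitch adjustment for a tilted surface can be pictured with the simplified model below: if the surface normal makes an angle theta with the measurement direction, a fixed scanner step covers roughly step / cos(theta) along the surface, so scaling the commanded step by cos(theta) keeps the spacing of measurement points close to the required pitch. This geometric simplification is an assumption for illustration, not the control law defined in this publication.

```python
# Simplified sketch of the pitch correction: when the surface is tilted by an
# angle theta away from the measurement direction, a fixed step of the scanner
# covers roughly step / cos(theta) along the surface. Scaling the commanded
# step by cos(theta) keeps the spacing of measurement points on the surface
# close to the required pitch.
import math

def corrected_step(nominal_step_m: float, tilt_rad: float,
                   max_tilt_rad: float = math.radians(80)) -> float:
    """Return the scanner movement step that roughly preserves the surface pitch."""
    tilt = min(abs(tilt_rad), max_tilt_rad)  # avoid blow-up near grazing angles
    return nominal_step_m * math.cos(tilt)

if __name__ == "__main__":
    # A 100 mm nominal step on a surface tilted 60 degrees shrinks to ~50 mm.
    print(corrected_step(0.100, math.radians(60)))
```
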
  • The unmeasured region extraction unit 25 adds the three-dimensional position information read from the storage unit 105 to the image information cut out by the image acquisition unit 103.
  • the image information provided with the position information is stored in the storage unit 105.
  • The entire image generation unit 106 synthesizes the image information for each measurement point cut out by the image acquisition unit 103 based on the position information given to the image information, and generates the entire image information of the measured region of the painting target. Specifically, the entire image generation unit 106 generates the entire image information of the measured region of the painting target by connecting the plurality of pieces of image information cut out by the image acquisition unit 103.
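
A minimal sketch of the whole-image generation: per-point image tiles are pasted onto a canvas at grid positions derived from the attached position information. A near-planar target region, a regular measurement grid, and equally sized tiles are simplifying assumptions; the returned coverage mask also shows how missing tiles would later surface as unmeasured areas.

```python
# Whole-image generation sketch: tiles cut out per measurement point are
# pasted onto a canvas at positions derived from the attached 3D position
# information (here reduced to grid indices for simplicity).
import numpy as np

def stitch_tiles(tiles, grid_indices, tile_shape):
    """
    tiles: list of HxWx3 uint8 arrays (one per measurement point).
    grid_indices: list of (row, col) derived from each point's 3D position.
    tile_shape: (H, W) of a single tile.
    """
    rows = max(r for r, _ in grid_indices) + 1
    cols = max(c for _, c in grid_indices) + 1
    th, tw = tile_shape
    canvas = np.zeros((rows * th, cols * tw, 3), dtype=np.uint8)
    covered = np.zeros((rows, cols), dtype=bool)  # which grid cells were measured
    for tile, (r, c) in zip(tiles, grid_indices):
        canvas[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tile
        covered[r, c] = True
    return canvas, covered

if __name__ == "__main__":
    t = [np.full((10, 10, 3), 255, np.uint8)] * 3
    canvas, covered = stitch_tiles(t, [(0, 0), (0, 1), (1, 0)], (10, 10))
    print(canvas.shape, covered)  # cell (1, 1) stays uncovered -> unmeasured
```
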
  • The unmeasured part extraction unit 107 extracts, based on the entire image information, three-dimensional regions for which position information has not been acquired. Among the extracted three-dimensional regions, for example, a three-dimensional region corresponding to the conditions shown in the following (1) and (2) is extracted as an unmeasured region.
  • (1) When shape model data of the object to be painted exists, the shape model data is compared with the entire image information, and a region where the shape model data exists but the corresponding entire image information does not exist is extracted. In particular, when painting target area information exists, that information is also referred to.
  • the unmeasured part extraction unit 107 stores the extracted unmeasured area in the storage unit 105.
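
Condition (1) can be sketched as a grid comparison: cells where the shape model indicates that the painting target surface exists but no measurement was recorded are reported as unmeasured. Representing both the model and the measurements as boolean occupancy grids over a common parameterisation of the surface is a simplification for illustration, not the method as claimed.

```python
# Grid-based sketch of condition (1): report cells that the shape model says
# should exist but for which no measurement was recorded.
import numpy as np

def extract_unmeasured(model_occupancy: np.ndarray, measured_occupancy: np.ndarray):
    """Return the (row, col) indices of cells that exist in the model but were not measured."""
    unmeasured = model_occupancy & ~measured_occupancy
    return np.argwhere(unmeasured)

if __name__ == "__main__":
    model = np.ones((2, 2), dtype=bool)          # the model covers all 4 cells
    measured = np.array([[True, True],
                         [True, False]])          # one cell was never measured
    print(extract_unmeasured(model, measured))    # -> [[1 1]]
```
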
  • the measurement device 10, the unmeasured region extraction unit 25 in the work guidance control unit 1, and the display device 111 are configured to be used independently as an unmeasured region extraction device.
  • The coating film information acquisition unit 108 calculates the thickness of the coating film formed at each measurement point by obtaining the difference between the three-dimensional position information before the painting work and the three-dimensional position information after the painting work for the same measurement point, acquired by the position measurement unit 101 of the measuring apparatus 10.
  • The coating film information acquisition unit 108 generates film thickness distribution information of the painting target surface from the calculated film thicknesses.
  • Alternatively, the coating film information acquisition unit 108 may acquire the film thickness distribution information by calculating the thickness of the coating film formed on the painting object from the difference between the position information of the painting object after the painting work and the shape model data of the painting object.
  • the alignment unit 109 performs alignment processing between the model data of the painting object and the entire image information of the painting object.
  • the shape model data of the painting object is, for example, design data (CAD data) of the aircraft 11 that is the painting object.
  • The alignment unit 109 performs alignment processing between the shape model data of the painting object and the entire image of the painting object, using the position information given to the image information cut out by the image acquisition unit 103, which is used when generating the entire image information.
  • the alignment unit 109 performs alignment processing between the shape model data of the coating object and the film thickness distribution information.
  • The image generation unit 110 generates image data for displaying an image representing the shape model data of the painting target and the entire image information of the painting target based on the processing result of the alignment unit 109. Further, for example, the image generation unit 110 generates image data for displaying an image representing the shape model data and the film thickness distribution information of the painting target based on the processing result of the alignment unit 109. In addition, for example, the image generation unit 110 generates image data for displaying an image representing the parts for which position information has not been acquired, based on the information about those parts generated by the unmeasured part extraction unit 107. The image generation unit 110 outputs the generated image data to the display device 100 and/or 111. The film thickness distribution information may also be output superimposed on the entire image information, instead of being represented together with the shape model data of the object to be painted.
  • The display devices 100 and 111 are, for example, CRTs or liquid crystal display devices, and display images based on the image data transmitted from the painting work guidance control unit 1.
  • Each of the display devices 100 and 111 is disposed in the gondola 3 or the like.
  • the worker 4 can check the images displayed by the display devices 100 and 111 and perform the painting work.
  • The display device 100 and/or 111 displays the area in which shape measurement of the surface to be painted has not been performed, so that the operator 4 can be prompted to perform remeasurement or repainting work.
  • The worker 4 may wear a head-mounted display (HMD) as the display device 100 and/or 111 so that the image is presented to the worker 4. A tablet terminal or the like that has the functions of the display devices 100 and/or 111 may also be provided.
  • The operator 4 measures the film thickness of the coating film, confirms the state of the coating film, and then moves to the next painting position. However, the operator 4 may move to the next place without measuring the film thickness because of a mistake by the operator 4 or the like.
  • Although this embodiment has described the case where there is one gondola 3, a plurality of gondolas may be arranged around the object to be painted and the painting work may be started from different positions all at once. In such a case, an area that is not included in the movement range of any gondola may be left unpainted.
  • the film thickness of the coating film is measured by the measuring device 10 when the gondola moves to the painting target position and the painting operation is completed.
  • Since the painting target position to which the gondola has moved is acquired in association with the three-dimensional position information and the film thickness information of the coating film, the unpainted area or the poorly painted area can be checked while the object to be painted is being painted. That is, according to the present embodiment, the painting work guidance control unit 1 extracts an unmeasured area where shape measurement by the measuring apparatus 10 has not been performed, and displays an image indicating the position of the unmeasured area on the display device 100 and/or 111. Thereby, the worker 4 can be prompted to perform remeasurement or repainting work. As a result, additional painting work on unpainted areas is reduced and the overall painting work time is shortened. Moreover, the burden on the worker 4 can be reduced.
  • The coating film information acquisition unit 108 determines whether each of the measurement points in the painting target area belongs to a painted area or an unpainted area based on the film thickness distribution information acquired by the measuring apparatus 10. Further, the coating film information acquisition unit 108 determines whether the thickness of the coating film is within a predetermined range for the painted area. Thereby, the coating film information acquisition unit 108 generates, from the film thickness distribution information of the painting target surface, insufficient film thickness distribution information, which is information on the regions where the coating film thickness is insufficient and on the amount by which the thickness falls short.
  • the storage unit 105 stores the film thickness distribution information input to the coating information acquisition unit 108.
  • the storage unit 105 stores coating apparatus information regarding the coating apparatus 5 and information regarding the film thickness of the coating film to be formed (target film thickness information and the like) by an input operation or the like by the operator 4.
  • the storage unit 105 stores discharge information for a plurality of nozzles as the coating apparatus information.
  • the storage unit 105 includes a semiconductor memory such as a RAM and a storage medium such as a hard disk device.
  • the work information generation unit 60 generates work information that is information related to the painting work to be performed on the painting target based on the target film thickness information, the film thickness distribution information, the coating apparatus information, and the like.
  • the work information is, for example, information related to the target position of the paint sprayed by the coating apparatus 5, information related to the position of the coating apparatus 5 and the direction of the nozzle, information related to the speed (speed and direction) of moving the coating apparatus 5.
  • the work information generation unit 60 includes a position calculation unit 61 and a transition calculation unit 62.
  • The position calculation unit 61 calculates the target position at which the coating apparatus 5 sprays the paint onto the object to be painted. If the coating apparatus 5 is a brush, the position calculation unit 61 calculates the target contact position of the brush on the painting object. If the coating apparatus 5 is an electrodeposition coating apparatus, the position calculation unit 61 calculates the position at which the painting object is immersed in the electrodeposition coating liquid. In the following description, the case where the coating apparatus 5 is a spray gun will be described.
  • the position calculation unit 61 calculates a target position where the coating apparatus 5 performs coating and a target posture of the coating apparatus 5 at that time based on information on the discharge amount and the discharge distribution included in the coating apparatus information. Specifically, the position calculation unit 61 calculates the spray target position with respect to the coating target and the position and orientation of the coating apparatus 5 using information on insufficient film thickness distribution information, discharge amount and discharge distribution, and the like. The position calculation unit 61 may calculate the spray target position based on the shape data generated by the measurement apparatus 10. Further, the position calculation unit 61 may adjust the spray target position according to the shape and size of the painting target.
  • the transition calculation unit 62 calculates the spray target position for each work time in a series of steps of the paint spraying operation based on, for example, the painting apparatus information. That is, the transition calculation unit 62 calculates the temporal transition of the position of the coating apparatus 5 when performing the painting work.
  • The transition calculation unit 62 calculates the speed at which the coating apparatus 5 is moved based on information on the discharge amount of the paint discharged from the nozzle used in the coating apparatus 5, information on its discharge distribution, and information on the difference between the film thickness calculated by the analysis unit 15 and the film thickness at the time of design (target film thickness).
  • the speed at which the coating apparatus 5 is moved is, for example, the moving distance per unit time of the coating apparatus 5 with respect to the coating target.
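
A heavily simplified mass-balance sketch of how a travel speed could follow from the discharge amount and the thickness deficit: if the gun deposits paint at a volumetric rate Q over a stripe of width w, moving at speed v lays down a film of roughly Q × eta / (w × v), so solving for v gives the speed for one pass. The transfer efficiency eta, the uniform stripe, and single-pass deposition are assumptions for illustration, not the formulation used by the transition calculation unit 62.

```python
# Simplified speed estimate: volumetric discharge rate Q spread over a stripe
# of width w while travelling at speed v gives a film of thickness
# Q * eta / (w * v). Solving for v yields the single-pass travel speed.

def gun_speed(discharge_m3_per_s: float,
              stripe_width_m: float,
              thickness_deficit_m: float,
              transfer_efficiency: float = 0.7) -> float:
    """Return the gun travel speed [m/s] to deposit the missing film thickness."""
    if thickness_deficit_m <= 0:
        raise ValueError("target thickness already reached")
    return transfer_efficiency * discharge_m3_per_s / (stripe_width_m * thickness_deficit_m)

if __name__ == "__main__":
    # 200 mL/min discharge, 0.25 m stripe, 50 micrometres missing -> ~0.19 m/s
    print(gun_speed(200e-6 / 60.0, 0.25, 50e-6))
```
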
  • the work information generation unit 60 determines the paint spray target position, the position of the coating apparatus 5, the moving speed, and the like in accordance with the state of the film thickness of the coating film formed on the painting target surface by the painting work. Generate work information on the painting work to be performed.
  • the work information generation unit 60 may generate the work information in consideration of the ambient temperature and humidity of the object to be painted, the characteristics of the paint, the overall work time, and the like.
  • the work information generated by the work information generating unit 60 is output to the painting work instruction image generating unit 70.
  • the painting work instruction video generation unit 70 generates image data for displaying a film thickness distribution image and a work instruction image. For example, the painting work instruction image generation unit 70 displays the film thickness distribution image and the work instruction image superimposed on the painting target surface based on the film thickness distribution information, the work information, and the shape data of the painting target. Image data is generated.
  • the image data generated by the painting work instruction video generation unit 70 is generated based on the position and orientation of the display device 111 with respect to the painting target surface. For example, information regarding the position and orientation of the display device 111 is input to the painting work instruction image generation unit 70, and the coating operation instruction image generation unit 70 displays the film thickness distribution based on the information regarding the position and orientation of the display device 111. Image data such as images and work instruction images are generated.
  • the film thickness distribution image and the work instruction image can be appropriately superimposed and displayed on the painting target.
  • the image data generated by the painting work instruction video generation unit 70 is output to the display device 111 by wireless communication or the like. Note that only one of the film thickness distribution image and the work instruction image may be displayed without being superimposed.
  • The display device 111 can display various images based on the image data generated by the painting work instruction image generation unit 70. For example, the display device 111 displays a film thickness distribution image in which the film thickness of the coating film is classified step by step and color-coded. The painting work guidance control unit 1 may also cause the display device 111 to display the film thickness value of the coating film for every location. Further, the painting work guidance control unit 1 may determine an optimum nozzle from among the replaceable nozzles, generate an image guiding replacement of the nozzle of the painting device 5, and cause the display device 111 to display that image.
  • FIG. 6 is a diagram illustrating an example of a display image displayed on the display device 111 in the first embodiment.
  • the film thickness distribution image and the work instruction image are superimposed and displayed on the painting target surface of the aircraft 11 as the painting target. These images are projected and displayed in alignment with the painting target surface of the aircraft 11.
  • the colors are displayed according to the film thickness of the coating film.
  • Regions 121 and 122 are regions in which the film thickness is within a predetermined range from the target film thickness.
  • the region 122 is a region whose film thickness is thinner than the film thickness range of the region 121.
  • the region 123 is a region whose film thickness is thinner than the film thickness range of the region 122 and is below a predetermined range from the target film thickness.
  • the color difference is expressed using dots and hatching.
  • a pointer 90 shown in FIG. 6 indicates a target position at which painting spraying is started.
  • the pointer 90 moves in the direction indicated by the arrow 91.
  • The movement indicated by the arrow 91 is performed in a direction and at a speed suitable for forming the desired film thickness.
  • By moving the coating device 5 following the movement of the arrow, the operator 4 can perform an appropriate painting operation on areas where painting has not been performed or where the film thickness of the coating film is insufficient.
  • When the discharge amount of the paint discharged from the coating apparatus 5 is set to be constant, the film thickness of the coating film formed by the painting operation can be adjusted by adjusting the moving speed of the pointer 90.
  • The operator 4 can also be instructed in advance to spray paint onto the object to be painted from the coating device 5 while the pointer 90 is displayed and to stop spraying when the pointer 90 disappears. As a result, unnecessary spraying of paint onto the object to be painted is prevented, which reduces the cost of the painting work.
  • a work instruction image can be projected onto the painting target surface of the aircraft 11 which is the painting target. Further, by displaying such a work instruction image on a see-through type head mounted display, the worker 4 can also perform a painting operation while confirming the work instruction image superimposed on the painting work surface. As a result, the burden on the operator can be greatly reduced.
  • FIG. 7 is a flowchart showing a flow of processing by the painting work system according to the first embodiment.
  • In step S201, the painting work guidance control unit 1 controls the movement of the gondola so as to move the gondola to the painting position of the painting object, and the process proceeds to step S203.
  • In step S203, the painting work guidance control unit 1 obtains, from the measuring device 10, the three-dimensional position information of the painting target surface measured by the position measurement unit 101 and the image of the painting target surface obtained by the imaging device 102, and the process proceeds to step S205.
  • In step S205, the image acquisition unit 103 cuts out the image of the painting target surface acquired by the imaging device 102 so that it falls within an image range corresponding to the measurement pitch, and the process proceeds to step S207.
  • In step S207, it is determined whether or not the painting work has been completed. If an affirmative determination is made, that is, if it is determined that the painting work has been completed, the process proceeds to step S209. On the other hand, if a negative determination is made, that is, if it is determined that the painting work is not completed, the process returns to step S201.
  • In step S209, the entire image generation unit 106 connects the images cut out by the image acquisition unit 103 in step S205 to generate an entire image of the painting target, and the process proceeds to step S211.
  • In step S211, the unmeasured part extraction unit 107 determines whether there is a part for which measurement position information has not been acquired, that is, whether there is an unmeasured area.
  • If an affirmative determination is made in step S211, that is, if there is an unmeasured area, the process proceeds to step S213. On the other hand, if a negative determination is made in step S211, that is, if it is determined that there is no unmeasured area, there is no area where the painting work has not been completed, and the painting work guidance control unit 1 ends this routine.
  • In step S213, the painting work guidance control unit 1 moves the gondola to the area determined to be an unmeasured area in step S211. After the painting operation is performed there, the film thickness of the coating film is measured by the same procedure as from step S201, and the process proceeds to step S215.
  • In step S215, the coating film information acquisition unit 108 generates film thickness distribution information, and the painting work guidance control unit 1 determines whether or not the coating film thickness has reached a predetermined value. If an affirmative determination is made in step S215, that is, if it is determined that the film thickness of the coating film has reached the predetermined value, the painting work guidance control unit 1 ends this routine. On the other hand, if a negative determination is made in step S215, that is, if it is determined that the film thickness of the coating film has not reached the predetermined value, the process returns to step S213.
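
The flow of FIG. 7 can be summarised in the following control-loop sketch. Every function called on the `system` object is a placeholder standing in for the corresponding unit described above; none of these names are APIs defined in this publication.

```python
# Control-flow sketch mirroring the flowchart of FIG. 7. All methods on
# `system` are placeholders for the units described in the text.

def painting_guidance_loop(system):
    # S201-S207: move, measure, and cut out images until the painting work is done
    while not system.painting_finished():
        system.move_gondola_to_next_painting_position()               # S201
        positions, images = system.measure_positions_and_images()      # S203
        system.cut_out_images_to_measurement_pitch(positions, images)  # S205

    # S209: stitch the per-point images into a whole image of the target
    whole_image = system.generate_whole_image()

    # S211: look for areas with no measured position information
    unmeasured = system.extract_unmeasured_areas(whole_image)
    if not unmeasured:
        return  # nothing left to repaint or remeasure

    # S213-S215: repaint and remeasure each unmeasured area until the film
    # thickness reaches the predetermined value
    for area in unmeasured:
        while True:
            system.move_gondola_to(area)
            system.paint_and_measure(area)        # as in steps S201-S205
            if system.film_thickness_ok(area):    # S215
                break
```
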
  • The painting work guidance control unit 1 acquires, from the measuring apparatus 10, position information for each part of the measurement object and image information of the measured parts of the measurement object.
  • The painting work guidance control unit 1 includes the entire image generation unit 106, which generates the entire image information of the object to be measured from the image information for each measured part, and the unmeasured part extraction unit 107, which extracts, based on the entire image information, the parts of the object to be measured for which position information has not been measured.
  • With such a configuration, it is possible to grasp the regions of the painting target where distance measurement has been performed and the regions where it has not. Therefore, information indicating an unmeasured area can be presented to the worker, prompting the worker to perform remeasurement or repainting work. Further, by performing the painting operation based on the unmeasured area, painting can be reliably applied to the area to be painted, and the time required for the painting work can be shortened.
  • mesh data based on position information measured by the position measurement unit 101 may be used as the image information.
  • the points of the point cloud data acquired by the position measuring unit 101 are connected to generate mesh data composed of a plurality of meshes (for example, triangular meshes).
  • the unmeasured part extraction unit 107 extracts an area where no mesh is generated as an unmeasured area for distance measurement.
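
A sketch of the mesh-based variant described above: the measured points are triangulated and triangles whose longest edge is much larger than the measurement pitch are treated as spanning an unmeasured gap. Projecting a near-planar patch onto two axes, using scipy's Delaunay triangulation, and the edge-length threshold are illustrative choices, not details given in this publication.

```python
# Mesh-based gap detection sketch: triangulate the measured points and flag
# triangles whose longest edge exceeds a multiple of the measurement pitch as
# straddling an unmeasured area.
import numpy as np
from scipy.spatial import Delaunay

def gap_triangles(points_2d: np.ndarray, measurement_pitch: float, factor: float = 1.5):
    """Return triangles (index triples) whose longest edge exceeds factor * pitch."""
    tri = Delaunay(points_2d)
    gaps = []
    for simplex in tri.simplices:
        p = points_2d[simplex]
        edges = [np.linalg.norm(p[i] - p[(i + 1) % 3]) for i in range(3)]
        if max(edges) > factor * measurement_pitch:
            gaps.append(simplex)  # this triangle spans an unmeasured gap
    return gaps

if __name__ == "__main__":
    # A regular 0.1 m grid with one missing interior point leaves oversized triangles.
    xs, ys = np.meshgrid(np.arange(0, 0.5, 0.1), np.arange(0, 0.5, 0.1))
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    pts = np.delete(pts, 12, axis=0)  # simulate a measurement omission
    print(len(gap_triangles(pts, measurement_pitch=0.1)))
```
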
  • the image generation unit 110 may generate a two-dimensional image based on the three-dimensional data acquired by the measurement apparatus 10 and use the two-dimensional image data as image information.
  • Modification 2 In the embodiment and the modification described above, an example in which an aircraft is used as a painting target has been described.
  • the painting target may be an automobile or a ship, and is not particularly limited.
  • the present invention can be applied to the analysis of the coating state of various coating objects.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to an unmeasured region extraction device provided with a position measurement unit for measuring position information for each part of an object to be measured, an image acquisition unit for acquiring image information for the measured parts of the object to be measured that have been measured by the position measurement unit, an overall image generation unit for generating overall image information for the object being measured from the image information for each measured part, and an unmeasured part extraction unit for extracting, on the basis of the overall image information, a part for which position information for the object to be measured has not been measured.
PCT/JP2017/014280 2017-04-05 2017-04-05 Unmeasured region extraction device, work guidance device, and work guidance method WO2018185893A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014280 WO2018185893A1 (fr) 2017-04-05 2017-04-05 Unmeasured region extraction device, work guidance device, and work guidance method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/014280 WO2018185893A1 (fr) 2017-04-05 2017-04-05 Unmeasured region extraction device, work guidance device, and work guidance method

Publications (1)

Publication Number Publication Date
WO2018185893A1 true WO2018185893A1 (fr) 2018-10-11

Family

ID=63713407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014280 WO2018185893A1 (fr) 2017-04-05 2017-04-05 Unmeasured region extraction device, work guidance device, and work guidance method

Country Status (1)

Country Link
WO (1) WO2018185893A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1047935A (ja) * 1996-08-06 1998-02-20 Topcon Corp 画像重ね合せ式の測定方法
JP2002074347A (ja) * 2000-08-31 2002-03-15 Minolta Co Ltd 情報取得システム
JP2013092456A (ja) * 2011-10-26 2013-05-16 Topcon Corp 画像測定装置
US20130260016A1 (en) * 2012-04-02 2013-10-03 The Boeing Company Sealant Analysis System
US20150317070A1 (en) * 2014-04-11 2015-11-05 Ikegps Group Limited Mobile handheld instruments and methods
JP2016151423A (ja) * 2015-02-16 2016-08-22 株式会社トプコン 姿勢検出装置及びデータ取得装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020105311A1 (fr) * 2018-11-21 2020-05-28 三菱重工業株式会社 Système de mesure de position et procédé de mesure de position
US12025422B2 (en) 2018-11-21 2024-07-02 Mitsubishi Heavy Industries, Ltd. Position measurement system and position measurement method

Similar Documents

Publication Publication Date Title
KR101572917B1 (ko) 그래픽 적용 시스템
JP5824054B2 (ja) 面スパッタ装置
US9914150B2 (en) Graphical application system
US20120130566A1 (en) Method and system for facilitating autonomous landing of aerial vehicles on a surface
CN111192189A (zh) 一种汽车外形的三维自动检测方法及系统
US11865852B2 (en) Robotic livery printing system
US10532561B2 (en) Metrology-based path planning for inkjet printing along a contoured surface
WO2018185891A1 (fr) Dispositif d'analyse d'état de film de revêtement et dispositif de mesure d'épaisseur de film
US12183034B2 (en) System and method for utilization of displacement sensor during placement of vehicle service fixture
WO2018185893A1 (fr) Dispositif d'extraction de zone non mesurée, dispositif de guidage de travail et procédé de guidage de travail
WO2018185890A1 (fr) Dispositif d'aide au revêtement, dispositif de revêtement, procédé d'aide au travail de revêtement, procédé de production d'article revêtu et programme d'aide au revêtement
WO2018185892A1 (fr) Dispositif de détection de position et système d'aide à la peinture
KR102741610B1 (ko) 광 발사체를 장착한 스프레이 건 및 이를 이용한 도막 두께 예측을 위한 분석 장치와 도막 두께 예측 시스템
KR102807106B1 (ko) 레이저 광을 조사하는 스프레이 건 및 이를 이용한 도막 두께 예측 시스템
KR20250019667A (ko) 대화형 훈련 시나리오를 디스플레이하고 훈련 범위 내 관련 객체의 위치를 결정하기 위한 시스템과 이 시스템의 셋업 및 교정 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17904852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17904852

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP
