
US20190141236A1 - Inspection workflow using object recognition and other techniques - Google Patents


Info

Publication number
US20190141236A1
US20190141236A1 (application US16/180,873)
Authority
US
United States
Prior art keywords
equipment
inspection
user
data
piece
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/180,873
Inventor
Peter A. Bergstrom
Brian Knight
Jamie Rhead
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fluke Corp
Original Assignee
Fluke Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fluke Corp filed Critical Fluke Corp
Priority to US16/180,873
Assigned to FLUKE CORPORATION reassignment FLUKE CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BERGSTROM, PETER A., KNIGHT, BRIAN, RHEAD, JAMIE
Publication of US20190141236A1
Priority to US17/348,755 (US12088910B2)

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/20Administration of product repair or maintenance
    • H04N5/23222
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • G06K9/00671
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Definitions

  • Aspects of this disclosure are directed toward systems and methods for collecting data during a workflow routine.
  • Various methods can include receiving information regarding an environment and outputting information regarding pieces of equipment in the environment, such as the pieces of equipment within the environment available for inspection, directions to a location of a selected piece of equipment, one or more parameters associated with a selected piece of equipment, a workflow routine for acquiring inspection data, and/or a reference image representing a selected piece of equipment.
  • Methods can include acquiring inspection data representative of at least one parameter associated with the identified piece of equipment.
  • Inspection data can include image data, such as acoustic image data, infrared image data, and/or visible light image data, for example.
  • Exemplary systems can include an inspection tool, a user interface, memory, and a processor.
  • The processor can be configured to provide instructions to a user to perform a workflow routine to collect inspection data via the inspection tool, acquire the inspection data, and save the acquired inspection data to memory.
  • Providing instructions can include displaying a list of equipment that is part of the workflow routine and available for inspection. Additionally or alternatively, providing instructions can include displaying a list of steps to perform during the workflow routine.
  • FIG. 1 shows an environment including a plurality of pieces of equipment suitable for inspection and/or maintenance.
  • FIG. 2 shows an exemplary workflow routine for performing inspection and/or maintenance within an environment.
  • FIG. 3A shows an illustration of a user traveling along an inspection path within an environment.
  • FIG. 3B shows an exemplary interface illustrating to a user that a piece of equipment is available for inspection.
  • FIG. 4A shows an illustration of the user of FIG. 3A at a different location along the inspection path within the environment.
  • FIG. 4B shows an exemplary presentation of a list of equipment within a predetermined proximity of the user in FIG. 4A .
  • FIG. 5A shows an illustration of the user of FIGS. 3A and 4A at a different location along the inspection path within the environment.
  • FIG. 5B shows an exemplary presentation of a list of equipment within a predetermined proximity of the user in FIG. 5A .
  • FIG. 6A shows an illustration of a user at a location along an inspection path within an environment.
  • FIG. 6B shows an exemplary interface assisting a user in capturing an appropriate image of a piece of equipment.
  • FIG. 6C shows an exemplary interface instructing a user to acquire measurement data representative of parameters associated with a piece of equipment.
  • FIG. 7 is a process flow diagram illustrating a variety of possible processes for collecting data during a workflow routine and saving and/or uploading the results.
  • FIG. 8 is a process flow diagram illustrating a variety of possible processes for guiding a user through a workflow routine, collecting data during the workflow routine, and saving and/or uploading the results.
  • Such measuring devices can include one or more imaging tools capable of generating image data representative of a target scene and/or one or more test and measurement tools capable of generating measurement data representative of one or more parameters of an object under test.
  • Exemplary imaging tools can include electromagnetic imaging tools, and can be configured to generate image data representative of electromagnetic radiation from a target scene, such as infrared image data, visible light image data, ultraviolet image data, millimeter wave image data, and the like. Combinations of one or more electromagnetic spectrums may also be used, for example, as described in U.S. Pat. No. 7,538,326, entitled “VISIBLE LIGHT AND IR COMBINED IMAGE CAMERA WITH A LASER POINTER,” which is assigned to the assignee of the instant application, and is hereby incorporated by reference in its entirety.
  • An imaging tool can include an acoustic imaging tool including one or more acoustic sensor elements used to generate an acoustic image of a target scene.
  • Exemplary acoustic imaging tools, and combinations of acoustic imaging tools and electromagnetic imaging tools, are described in U.S. patent application Ser. No. 15/802,153, filed Nov. 2, 2017, and entitled “FOCUS AND/OR PARALLAX ADJUSTMENT IN ACOUSTIC IMAGING USING DISTANCE INFORMATION,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • Electromagnetic imaging tools and/or acoustic imaging tools may be combined or otherwise in communication with one another and/or with other test and measurement tools, for example, as described in U.S. patent application Ser. No. 14/855,884, filed Sep. 16, 2015, and entitled “TEST AND MEASUREMENT SYSTEM WITH REMOVABLE IMAGING TOOL,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • Imaging and/or test and measurement functionality may be incorporated into a user's external device (e.g., smartphone, tablet, etc.), such as described in U.S. patent application Ser. No. 14/855,864, filed Sep. 17, 2015, and entitled “MOBILE DEVICE USED WITH ISOLATED TEST AND MEASUREMENT INPUT BLOCK,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • Inspection and/or maintenance data may be analyzed individually or collectively and used for predictive maintenance or fault prediction.
  • Maintenance and/or inspection processes can be complex and/or lengthy, making it difficult to perform inspections consistently while gathering a complete set of proper data. Additionally or alternatively, such processes may be performed by an inexperienced worker and/or a worker that is unfamiliar with one or more inspection processes and/or a particular environment in which the maintenance and/or inspection process is being performed.
  • Aiding techniques and data processing techniques can be used to guide and assist an individual in performing a maintenance and/or inspection process, for example, by assisting a system user in a data collection workflow process. Such aiding can result in faster, easier, and more reliable/consistent data collection.
  • These aiding techniques can support or provide an inspection and/or maintenance workflow procedure.
  • Such techniques provide guidance to the user during the workflow and may involve manual inputs from the user and/or automatic means of acquiring and analyzing measurements.
  • Useful information for performing various tasks in a maintenance and/or inspection procedure may be provided to the user on an ongoing basis throughout the process, and can be provided on-demand or automatically by a processing/analysis system.
  • Such information may include the locations of equipment that is to be inspected, how the measurements should be taken, and whether or not measurements that are obtained are taken appropriately.
  • This information may be provided to the user in the form of text messages, as graphical/text indicators superimposed on live imagery, as sound cues, as light indicators, or by other means.
  • Determining which indicators should be presented to a user can be performed in a variety of ways, including, for example, location detection, processing live imagery to determine the identity of an object under test, or other live data (e.g., proximity detection relative to an object) collected from other sensor devices.
  • A computerized maintenance management system (CMMS) can include database entries covering a variety of maintenance and/or inspection information, including past results, instructions for performing such processes, possible errors that can be observed during maintenance/inspection, and the like.
  • The aiding and processing techniques and results described herein therefore provide useful data which improves the effectiveness of such a maintenance management system.
  • Such additional data, along with the increased reliability of measurements due to aiding, results in better maintenance of equipment and more reliable fault predictions.
  • Inspection and/or maintenance tools and/or activities may be part of an overall CMMS system.
  • Data acquired by one or more tools carried by a user (e.g., a test and measurement tool, an imaging tool, or the like) may be communicated to a separate device, such as a computer workstation, an external device such as a smartphone or a tablet, or the like.
  • Such a software platform may involve a licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted, and may be referred to as Software as a Service, or SAAS.
  • Such a system may be made accessible to users using a client via a web browser or other means.
  • Measurement data, including imagery over time for a particular piece of equipment, as well as analysis results of such data and/or signals sent directly from the equipment itself, may be provided to and made available from the SAAS.
  • Data can include results from a variety of sensor devices including images from an IR, VL, acoustic, or other imaging system.
  • Data can additionally or alternatively include metrics/analysis/trends obtained by analysis from such measurements and imagery.
  • Data from the SAAS, such as imagery, measurement data, and other data for one or multiple pieces of equipment, may be automatically associated with that equipment and may be provided to a user of the SAAS, for example, to assist with future maintenance and/or inspection processes.
  • Such imagery, measurement data, and other data may be provided to a user in an on-demand fashion, or automatically via an alarm/notification system.
  • Data may be downloaded from the SAAS and stored in memory on board one or more tools carried by a user, and/or on a user's personal device, such as a smartphone or tablet.
  • A user may access such data in real time from a remote location, such as a hosted server providing access to a user, e.g., via a tool and/or a personal device.
  • A user may receive data (e.g., using a tool and/or personal device) that can provide information representative of previous and/or expected measurement information, steps for performing one or more maintenance and/or inspection processes, or other equipment information.
  • Trend analysis and/or generating a CMMS or SAAS for use with guided inspections can include building a statistical database of typical equipment operation, for example, as described in U.S. patent application Ser. No. 15/190,792, filed Jun. 23, 2016, and entitled “THERMAL ANOMALY DETECTION,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • A user may be provided (e.g., via on-board memory, network access, etc.) a workflow routine instructing the user how to perform one or more maintenance and/or inspection processes.
  • A workflow routine may be documented in the form of a procedure, which may be brief or quite detailed.
  • A detailed procedure may include, for example, a list of equipment to be inspected and/or maintained, the measuring devices (sensors) to use for each piece of equipment to be inspected, and/or the methods and/or settings in which the measuring devices are to be used at each inspection step.
  • One or more measuring devices used during workflows may include an interface that allows for access/viewing of a workflow procedure, stored as an electronic document or instruction set, which the user may review at will during the inspection.
  • The electronic document or instruction set may reside on one of, and may be shared between, the multiple sensing devices used during the workflow.
  • The electronic document may reside on a separate device (PC, smartphone, or tablet) that the user carries during the inspection process, or it may reside remotely and be communicated to the measuring equipment or other device that the user carries (e.g., from a data cloud or a central hub that is used for data collection and processing).
  • An exemplary workflow procedure may include the physical route of the inspections/maintenance, the equipment to be inspected/maintained, the measurement devices (e.g., imaging tools, test and measurement tools, etc.) to be used in inspecting each piece of equipment at each step, measurement device settings, connection diagrams for electrical and other contact inspections, required viewing angles and perspectives for image inspections, and/or previously acquired reference images that indicate the appropriate image appearance for imagery at each step of an image-based inspection.
  • Such imagery inspections may involve IR, VL, mm wave, acoustic, or other imaging devices.
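A detailed workflow procedure of the kind described above could be represented as a simple data structure. The Python sketch below is illustrative only; the class names, field names, and sample values are assumptions for this example, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class InspectionStep:
    """One step of a workflow procedure (illustrative fields only)."""
    equipment_id: str                     # piece of equipment to inspect
    measuring_device: str                 # e.g., "IR imager", "acoustic imager"
    device_settings: dict = field(default_factory=dict)   # e.g., emissivity
    viewing_angle_deg: Optional[float] = None             # required perspective
    reference_image: Optional[str] = None                 # previously acquired image

@dataclass
class WorkflowProcedure:
    """Physical route plus the ordered inspection steps."""
    route: list
    steps: list = field(default_factory=list)

procedure = WorkflowProcedure(
    route=["Equipment A", "Equipment B", "Equipment C"],
    steps=[
        InspectionStep("Equipment A", "IR imager",
                       device_settings={"emissivity": 0.95},
                       reference_image="ref/equip_a.png"),
        InspectionStep("Equipment B", "acoustic imager"),
    ],
)
print([s.equipment_id for s in procedure.steps])  # ['Equipment A', 'Equipment B']
```

Such a structure could be serialized as an electronic document and shared between sensing devices, as described above.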
  • The user may have access to the electronic workflow procedure. Additionally or alternatively, the user may be able to manually record their progress and/or measurement results during the inspections and measurements of the workflow, for example, saving data to a SAAS and/or CMMS. In some examples, the progress through the workflow and/or the measurements themselves may be recorded automatically.
  • The physical real-time location of a user and/or of the measuring device(s) may be automatically tracked during the workflow or may be manually entered by the user.
  • Automatic methods may include GPS, inertial tracking methods, triangulation by use of external devices, proximity or RFID sensors placed at various locations, or other means.
  • The physical location of equipment to inspect may also be known to the processing system.
  • The real-time location data of the operator and sensors may be used to infer which pieces of equipment can be inspected (e.g., are near the user, such as within a predetermined proximity of the user) at a given time. These determinations can be made either inside a measuring device (e.g., imaging tools, test and measurement tools, etc.), in a separate device that the user is carrying (e.g., a PC, smartphone, or tablet), or at a separate processing hub which is in communication with one or more such devices.
  • Guidance can be provided to a user as to the proper actions/measurements to take for a given one or more pieces of equipment that are accessible for inspection at a given point. For example, based on known locations of a user and equipment available for inspection, should the user wish to take a measurement or collect an image at a known location, a set of candidate equipment for inspection at that location may be indicated to the user. Thus, the user may select an item from a candidate list of known items. Such a selection may trigger execution of further guidance for performing maintenance and/or inspection of the selected equipment, and/or may pre-load a variety of available data representative of the equipment, eliminating the need for manual entry of some such details, such as the full description of the equipment. As the user verifies the specific identity of the equipment of interest, the instructions for taking required measurements may be indicated, and any subsequent measurements may be automatically associated to the specific equipment for future reference (e.g., in a CMMS and/or SAAS).
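Building the candidate list from location data can be sketched as a simple distance filter. The Python example below is an illustrative assumption (planar coordinates in meters and a Euclidean distance); a fielded system might instead use GPS fixes, RFID proximity sensors, or triangulation, as described above.

```python
import math

def equipment_within_proximity(user_pos, equipment_positions, radius_m):
    """Return ids of equipment within radius_m of the user's (x, y) position.

    Positions are planar coordinates in meters, an assumption for
    illustration only.
    """
    candidates = []
    for equip_id, (ex, ey) in equipment_positions.items():
        # Euclidean distance between the user and this piece of equipment.
        if math.hypot(ex - user_pos[0], ey - user_pos[1]) <= radius_m:
            candidates.append(equip_id)
    return sorted(candidates)

positions = {
    "Equipment A": (2.0, 1.0),
    "Equipment B": (8.0, 0.0),
    "Equipment C": (30.0, 5.0),
}
print(equipment_within_proximity((0.0, 0.0), positions, radius_m=10.0))
# → ['Equipment A', 'Equipment B']
```

The returned list corresponds to the candidate list from which the user may select an item, as described above.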
  • A specific piece of equipment near the user can be identified automatically and in real time.
  • Identification can be achieved as a result of the known physical location of the inspection device(s) and equipment, or by an identification signal (active or passive) transmitted from the equipment to the inspection device, or by an external triangulation system.
  • Identification of the equipment might be achieved by object recognition image processing techniques where the equipment is identified within the imagery in real time.
  • Object recognition image processing techniques can include, for example, correlation methods and blob analysis.
  • Identification of a specific piece of equipment might be achieved by combining live data obtained from a number of different sensors such as acoustic, mm-wave, visual imaging, and IR imaging.
  • A user may or may not be prompted to manually select the equipment, for example, from a candidate list, or otherwise confirm the identity of the automatically identified equipment. For instance, in some examples, identification of the equipment to be inspected may be indicated to the user, and useful reference information regarding the specific equipment can be provided automatically to the user. Association of the specific equipment identity to subsequently obtained imager data or other sensor data measurements may also be automatic.
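As a rough sketch of the correlation methods mentioned above, normalized cross-correlation can score how well a stored reference template matches each location in a live image. The brute-force NumPy implementation below is illustrative only; a real inspection system would use an optimized library routine.

```python
import numpy as np

def match_template(scene, template):
    """Brute-force normalized cross-correlation template matching.

    Returns the (row, col) of the best match and its score in [-1, 1].
    """
    th, tw = template.shape
    t = template - template.mean()
    best_score, best_pos = -2.0, (0, 0)
    for r in range(scene.shape[0] - th + 1):
        for c in range(scene.shape[1] - tw + 1):
            patch = scene[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            # Correlation is undefined for a flat patch or template.
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# A synthetic 20x20 "scene" containing a patterned 4x4 piece of "equipment".
scene = np.zeros((20, 20))
pattern = np.arange(16, dtype=float).reshape(4, 4)
scene[5:9, 7:11] = pattern
pos, score = match_template(scene, pattern)
print(pos, round(score, 3))  # (5, 7) 1.0
```

A high score at a known equipment location could serve as the automatic identification step described above, optionally followed by a user confirmation prompt.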
  • The viewing perspective (orientation, position, and measuring distance) of an imaging device may be automatically determined or manually entered by the user. Automatic determination might be achieved using sensors within the imager (e.g., orientation sensing via accelerometers or the like, position sensing such as GPS or the like, etc.), by externally placed sensors, or might be determined from the imagery itself using image processing techniques such as object recognition.
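Orientation sensing via accelerometers, as mentioned above, can be sketched by recovering pitch and roll from a static gravity vector. The function below is a simplified illustration only: it assumes a stationary device, ignores sensor noise, and cannot recover yaw (a real imager would typically fuse gyroscope and/or magnetometer data as well).

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a static accelerometer reading.

    Assumes the accelerometer reads only the gravity vector (device at rest).
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat (gravity entirely on z): pitch and roll are ~0 degrees.
flat = pitch_roll_from_accel(0.0, 0.0, 9.81)
# Device pointed straight down along x: pitch is ~90 degrees.
nose_down = pitch_roll_from_accel(-9.81, 0.0, 0.0)
```

Comparing such estimates against a stored required perspective is one way a system could tell the user to adjust the viewing angle.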
  • One or more procedure steps (e.g., in a workflow routine) may be associated with, and therefore imply, a particular viewing location and perspective for a specific piece of equipment.
  • The system may direct a user to such a predetermined location for capturing image data.
  • A location for capturing image data may be associated with an image previously captured and associated with the procedure step.
  • Directing a user to the location can include a rephotography process in order to reproduce the capture point of the previously captured image. Exemplary such processes are described in U.S. patent application Ser. No. 13/331,633, filed Dec. 20, 2011, and entitled, “THERMAL IMAGING CAMERA FOR INFRARED REPHOTOGRAPHY,” U.S. patent application Ser. No. 13/331,644, filed Dec.
  • Location/perspective specific reference imagery and other data may be stored prior to performing the workflow, for each piece of equipment that is of interest for imaging or data collection.
  • This data may be stored in the measuring device, a separate device that the user carries, or in a central processing hub. This data may be provided to the user automatically or on demand throughout the workflow for reference.
  • This reference imagery may be used by the user as a guide that indicates how the appropriate view should appear for the imager as measurements are acquired.
  • The reference imagery can be displayed along with live scene imagery, and other data may be processed by the system in order to provide useful guidance and cues to the user.
  • The known real-time imaging perspective and current physical location data may be used to determine what objects or equipment are to be expected in the current imagery at a given time.
  • Analysis of current location information may be used (e.g., via a processor) to determine that the tool is near a location from which previous image data (or other data) was captured.
  • Such a location or equipment located at such a location can be presented to the user as a possible inspection candidate.
  • A notification or description of such potential equipment can be presented to the user.
  • These indications might include a display of the previously acquired reference image of the equipment.
  • Object recognition image processing techniques such as correlation or blob analysis methods may be used to search for, indicate, and track objects in the imagery which are candidates for known pieces of equipment needing inspection. Such candidates may be presented to the user with an option to confirm the identity of a piece of equipment.
  • A user may or may not be prompted to select or confirm the identity of the equipment.
  • Any related messaging to the user or association of the equipment identity to the image and/or other sensor data may then be performed automatically.
  • Object recognition and other image processing techniques can be used to determine automatically and in real time when pieces of equipment that are required for inspection are present in the image scene. These techniques may also be used to determine when the objects are present but are not being viewed appropriately for imaging measurements. Image processing and/or other techniques can determine errors in the imaging process, and can provide guidance to the user (e.g., to refocus, change position, change viewing perspective, etc.). In addition, a system may automatically determine whether the correct settings for obtaining imagery of the equipment are being applied to the imager. If not, the system may automatically provide appropriate guidance for changing these settings to the user.
  • The system may automatically apply the appropriate control settings to the imager as a known piece of equipment is inspected or imaged. If the imager settings and the imagery itself are determined to be appropriate for a required measurement for a piece of equipment, a message may be provided to indicate this status to the user so that the user knows that it is appropriate to obtain a manual image measurement.
  • Image data may be captured automatically for a desired piece of equipment in the event that the system determines that the imager settings and live image content are appropriate for the inspection of that equipment (e.g., if an image is sufficiently reproduced, or if relevant portions of the equipment are recognized to be within the imaged scene).
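The automatic-capture decision described above can be sketched as a simple predicate combining a scene-similarity score with a check of the imager settings. The threshold value and setting names below are illustrative assumptions, not values from the disclosure.

```python
def should_auto_capture(similarity, settings, required_settings, threshold=0.9):
    """Return True when an image should be captured automatically.

    similarity: score in [0, 1] for how well the live scene reproduces
    the reference image. Threshold and setting names are assumptions.
    """
    # Every required imager setting must match exactly...
    settings_ok = all(settings.get(k) == v for k, v in required_settings.items())
    # ...and the live scene must sufficiently reproduce the reference.
    return settings_ok and similarity >= threshold

required = {"emissivity": 0.95, "range": "auto"}
print(should_auto_capture(0.93, {"emissivity": 0.95, "range": "auto"}, required))  # True
print(should_auto_capture(0.93, {"emissivity": 0.80, "range": "auto"}, required))  # False
```

When the predicate is false, the system could instead surface the guidance messages described above (e.g., prompting the user to change a setting or reposition).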
  • The system may take one or more corrective actions. For example, in some embodiments, a system may signal a user to alert the user of an error. In some examples, if an image is being incorrectly viewed, the system may provide signals to a positioning device capable of physically moving an imaging tool to the appropriate viewing position and/or angle for the measurement. The measurements may then be acquired automatically by the system, or the system may prompt the user to acquire one or more desired measurements.
  • Various examples of system operation are described below with reference to FIGS. 1-8 .
  • FIG. 1 shows an environment 10 including a plurality of pieces of Equipment A, B, C, D, and E suitable for inspection and/or maintenance.
  • A path 20 extends through the environment 10 and moves past Equipment A, B, C, D, and E.
  • A system (e.g., a measurement device, a user's mobile device, a workstation, a remote server, etc.) can include a map of environment 10 , for example, showing path 20 for performing an inspection and/or maintenance routine.
  • Such a map may be viewed by a user for determining an appropriate route for performing a given workflow.
  • A textual or other description may be used to guide a workflow.
  • FIG. 2 shows an exemplary workflow routine for performing inspection and/or maintenance within environment 10 .
  • A workflow can include steps such as analyzing various equipment, such as Equipment A-E.
  • A graphical interface showing a workflow such as that shown in FIG. 2 can be displayed to a user as an overview of a workflow prior to performing the workflow and/or a checklist of steps to be viewed during the workflow.
  • A user may select a step from the graphical workflow representation to view additional information about the step, such as various analysis and/or other steps to perform.
  • The exemplary workflow shown in FIG. 2 further includes sample images associated with each step. Such images may be stored in memory (e.g., as part of a CMMS and/or SAAS), and may be used as a visual aid for identifying equipment for inspection and/or as a template or guide for reproducing like images during inspection.
  • A user may be presented with the workflow for environment 10 shown in FIG. 2 without access to a map such as that shown in FIG. 1 .
  • The provided workflow informs the user which equipment within the space should be analyzed (e.g., inspected).
  • A user may enter the environment 10 to begin the inspection process without explicit knowledge of the location of each of the prescribed pieces of equipment to analyze, but may be provided with images (e.g., as shown) indicating which equipment should be analyzed.
  • A user may be alerted as to which equipment may be currently available for inspection, for example, due to being within a certain proximity of such equipment.
  • FIG. 3A shows a user 30 along inspection path 20 within environment 10 .
  • A user may be guided explicitly down path 20 for performing a workflow routine, such as via GPS or other real-time location monitoring technology.
  • Path 20 may be the only practical path through environment 10 .
  • A user may travel through environment 10 via an arbitrary path (e.g., path 20 ).
  • Equipment within a predetermined proximity 40 of the user becomes available and/or recommended for inspection.
  • Proximity may be determined, for example, via wireless communication between one or more measurement devices carried by the user (e.g., a test and measurement tool, an imaging tool, etc.) and the equipment.
  • A tool carried by the user may determine a distance from one or more pieces of equipment, and identify the equipment within a predetermined proximity (e.g., 40 ), such as a programmed proximity within which a user should be able to identify the equipment for analysis.
  • A predetermined proximity may be adjustable, for example, via a user interface or a remote server.
  • FIG. 3B shows an exemplary interface illustrating to a user that Equipment A is available for inspection.
  • Such an interface may be presented via a tool (e.g., a test and measurement tool, an imaging tool) and/or an external device (e.g., a smartphone, tablet, etc.).
  • Equipment A is listed as the equipment available for inspection, since Equipment A is within the proximity 40 of user 30 .
  • User 30 may select Equipment A on the interface, and receive subsequent instruction for performing maintenance and/or inspection of Equipment A.
  • A user may receive additional data representative of Equipment A, such as a representative image, typical operating parameters, and the like.
  • If a tool (e.g., an imaging tool, a test and measurement tool, a remote device interfacing with a tool, etc.) determines that only one piece of equipment (e.g., Equipment A) is within range, that equipment may be automatically selected for inspection.
  • A reference image, an inspection process, typical operating parameters, and/or other information related to the equipment may be automatically presented.
  • FIG. 4A shows user 30 at a different location along inspection path 20 within environment 10 .
  • The user 30 is within a predetermined proximity 40 of Equipment A, B, and C.
  • A user 30 may be presented with a list of equipment within a predetermined proximity 40 of the user 30 .
  • FIG. 4B shows an exemplary presentation of a list of equipment within the predetermined proximity 40 of the user 30 in FIG. 4A .
  • the list of available equipment includes representative images of the available equipment.
  • Equipment A, B, and C are considered available for inspection given the location of the user 30 .
  • the user may select a piece of equipment from the list of available equipment in order to receive additional information regarding the equipment and/or inspection processes related thereto.
  • a system can determine when prescribed maintenance for a given piece of equipment has been performed (e.g., due to automatic data acquisition, receiving manual data entry, receiving an input from the user indicating inspection is complete, etc.), and can update the interface to indicate which equipment has been analyzed according to the workflow routine and which equipment has yet to be analyzed. For instance, if a user 30 inspects Equipment A when available at the location shown in FIG. 3A , when the user arrives at the location in FIG. 4A , Equipment A may be excluded from the list of available equipment for inspection or may otherwise be presented differently from equipment for which inspection data has not yet been acquired.
  • equipment for which inspection data has been captured will be displayed in a different color, or may be grayed out and/or not selectable by a user.
  • a user may select the equipment for which data has already been captured in order to review the captured data, or to capture new or additional data.
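A status-aware equipment list of the kind described above might be built as follows; the status strings and data shapes are hypothetical.

```python
def build_equipment_list(available, inspected):
    """Mark equipment already analyzed so the interface can gray it out
    or exclude it, while keeping it selectable for data review."""
    entries = []
    for name in available:
        status = "inspected (tap to review data)" if name in inspected else "available"
        entries.append((name, status))
    return entries

print(build_equipment_list(["Equipment A", "Equipment B", "Equipment C"], {"Equipment A"}))
```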
  • FIG. 5A shows user 30 at yet another different location along inspection path 20 within environment 10 .
  • the user 30 is within a predetermined proximity 40 of Equipment C, D, and E.
  • a user 30 may be presented with a list of equipment within a predetermined proximity 40 of the user 30 .
  • FIG. 5B shows an exemplary presentation of a list of equipment within the predetermined proximity 40 of the user 30 in FIG. 5A .
  • the list of available equipment includes representative images of the available equipment.
  • Equipment C, D, and E are considered available for inspection given the location of the user 30 .
  • the user may select a piece of equipment from the list of available equipment in order to receive additional information regarding the equipment and/or inspection processes related thereto.
  • equipment for which inspection data has already been acquired may be presented differently from equipment for which data has yet to be acquired. For example, if a user performed an inspection of Equipment C while at the location shown in FIG. 4A , Equipment C may be excluded from or otherwise presented differently than Equipment D and E in the list of available equipment shown in FIG. 5B .
  • FIG. 6A shows the user 30 at a location along path 20 in environment 10 similar to the location shown in FIG. 4A .
  • Equipment C is in a field of view 50 of a tool (e.g., an imaging tool) carried by the user 30 .
  • an inspection system (e.g., an imaging tool) may detect Equipment C within the field of view 50, for example, via image recognition.
  • FIG. 6B shows an exemplary interface assisting a user in capturing an appropriate image of Equipment C.
  • a template image 52 (e.g., associated with the prescribed workflow routine) and a live image 54 can be presented alongside one another (e.g., with one or both images being partially transparent) to assist a user in positioning an imaging tool to capture an image similar to the one associated with the workflow routine.
  • other instructions/guidance can be provided for recapturing a new image corresponding to the reference image, such as described in U.S. patent application Ser. Nos. 13/331,633, 13/331,644, and 13/336,607, each of which is incorporated by reference.
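One simple way to judge whether a live frame matches the reference image, as in the recapture guidance above, is a mean pixel difference with a tolerance. The tiny grayscale grids and tolerance value below are illustrative assumptions; a real system would likely use more robust alignment metrics.

```python
def frame_similarity(template, live):
    """Mean absolute pixel difference between a reference (template) image
    and the live frame; 0.0 means a perfect match."""
    n = 0
    total = 0.0
    for t_row, l_row in zip(template, live):
        for t, l in zip(t_row, l_row):
            total += abs(t - l)
            n += 1
    return total / n

def recapture_hint(template, live, tolerance=10.0):
    """Tell the user whether the live view is close enough to the reference."""
    return "capture" if frame_similarity(template, live) <= tolerance else "keep adjusting"

ref = [[10, 10], [10, 10]]
print(recapture_hint(ref, [[12, 9], [10, 11]]))   # within tolerance
print(recapture_hint(ref, [[90, 80], [70, 60]]))  # far from the reference view
```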
  • FIG. 6C shows an exemplary interface instructing a user to acquire measurement data representative of parameters associated with Equipment C (Parameter X, Parameter Y, Parameter Z).
  • Such parameters can include a variety of different parameters, for example, that can be analyzed/acquired using a test and measurement tool that may be carried by a user.
  • a user may select from the list of parameters in order to view instructions on how to measure such a parameter.
  • the user may be presented with detailed instructions for acquiring measurement data representative of that parameter and/or an interface on a measurement tool suitable for performing a measurement data acquisition.
  • a user may be presented with an image capture interface (e.g., as shown in FIG. 6B ), a measurement data acquisition interface (e.g., as shown in FIG. 6C ), or both.
  • an “acquire image data” or similar step may be presented in a list of steps to perform during a workflow routine.
  • a system may present workflow routine steps to a user individually and sequentially in order to guide the user through the workflow process.
  • a user may be presented with a list of steps, from which a user may select a step in the process to perform. Upon such a selection, the system may assist the user in performing the selected step.
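Presenting steps individually and sequentially, as described above, can be sketched as returning the first incomplete step; the step names are hypothetical.

```python
WORKFLOW_STEPS = [
    "Locate Equipment C",
    "Acquire image data (match the reference image)",
    "Measure Parameter X",
    "Measure Parameter Y",
]

def next_step(completed):
    """Guide the user through the workflow: return the first step not yet
    marked complete, or None when the routine is done."""
    for step in WORKFLOW_STEPS:
        if step not in completed:
            return step
    return None

print(next_step({"Locate Equipment C"}))  # the image-acquisition step comes next
```

In the alternative presentation described above, the full `WORKFLOW_STEPS` list would instead be shown at once, with the user free to select any step.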
  • FIG. 7 is a process flow diagram illustrating a variety of possible processes for collecting data during a workflow routine ( 130 ) and saving and/or uploading the results ( 140 ).
  • a user can receive instruction to perform an inspection process of a given piece of equipment ( 100 ), and can receive a representative image or description of the equipment to assist in the inspection ( 102 ). Upon receiving such assistance information, the user may locate the equipment ( 104 ) for inspection.
  • the user may receive an indication that equipment is available for inspection ( 110 ), and select equipment for inspection, for example, from a list of available equipment ( 112 ).
  • the user may receive a representative image or description of the selected equipment ( 102 ), or may locate the equipment ( 104 ) based on, for example, information from the provided list of available equipment.
  • the user may select equipment within an environment for analysis ( 120 ) and enter information representative of the equipment into an inspection system (e.g., via an interface in an imaging tool, a test and measurement tool, an external device, etc.).
  • the user can capture an image of the selected equipment, input a type or location of such equipment, or the like.
  • An inspection system may be programmed with instructions to identify the equipment for inspection based on the information input by the user, for example, via image recognition or the like.
  • the system may present information regarding the equipment the system believes is to be inspected, which can be confirmed by the user ( 124 ).
  • the user may collect data according to a workflow routine or otherwise confirm data captured automatically is satisfactory for performing the routine ( 130 ).
  • the results (e.g., inspection results) may be saved and/or uploaded ( 140 ).
  • FIG. 8 is a process flow diagram illustrating a variety of possible processes for guiding a user through a workflow routine ( 225 ), collecting data during the workflow routine (e.g., an inspection process) ( 230 ) and saving and/or uploading the results ( 240 ).
  • the processes in FIG. 8 may be performed by a maintenance/inspection system, for example, via one or more tools carried by a user providing guidance/instruction to the user.
  • a system may provide instruction to a user to perform inspection of a piece of equipment ( 200 ), and may provide a representative image or description of the equipment ( 202 ) to assist the user in finding the equipment.
  • the system may provide an indication to a user that equipment is available for inspection ( 210 ), for example, by way of a list of one or more available pieces of equipment.
  • the system may receive a selection of equipment, for example, from such a list ( 212 ).
  • the system may provide a representative image or description of the equipment ( 202 ) to assist the user in finding the equipment.
  • the system may receive information regarding equipment for inspection from the user ( 220 ). Such information may include an acquired image or other identification information that the system may use to lookup the equipment, for example, via a lookup table, image recognition, or the like in order to determine the equipment that is to be inspected based on the received information ( 222 ). In some examples, the system may confirm that the determined equipment (e.g., from step 222 ) is correct ( 224 ), for example, by indicating to the user the equipment the system identified based on the received information.
  • the system may provide guidance for performing a workflow routine with respect to the equipment, such as an inspection process ( 225 ).
  • the system may collect (e.g., automatically) and/or receive (e.g., via a user interface) data from the workflow routine, such as image data, measurement data, or the like ( 230 ) and save the results internally and/or upload the results to a separate location, such as a remote server ( 240 ).
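The FIG. 8 flow can be sketched end to end as follows. All of the callables, equipment names, and the stand-in measurement value are hypothetical; the sketch only mirrors the numbered steps ( 222 , 224 , 225 , 230 , 240 ).

```python
def run_inspection(system, user_input):
    """One pass through the FIG. 8 flow: determine equipment from the
    user-provided identification info ( 222 ), confirm it ( 224 ), guide the
    routine and collect data ( 225 / 230 ), and save the results ( 240 )."""
    equipment = system["lookup"](user_input)
    if not system["confirm"](equipment):
        return None
    data = {step: system["collect"](step) for step in system["routine"]}
    system["store"].append((equipment, data))
    return equipment

store = []
system = {
    "lookup": lambda info: "Equipment C" if "C" in info else "unknown",
    "confirm": lambda eq: eq != "unknown",
    "routine": ["Parameter X", "Parameter Y"],
    "collect": lambda step: 42.0,  # stand-in for a real measurement
    "store": store,
}
print(run_inspection(system, "image-of-C"))  # the identified equipment
```

In practice `store` would be on-board memory and/or an upload to a remote server, and `lookup` would use a lookup table or image recognition, per the description above.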
  • various processes described herein may be carried out by one or more processors distributed among one or more system components, such as tools carried by the user (e.g., imaging tools, test and measurement tools, external devices, etc.), remote servers, and the like.
  • such processors may be implemented as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic circuitry, or the like, either alone or in any suitable combination.


Abstract

Systems for guiding a user through a workflow routine can include an inspection tool, a user interface, memory, and a processor. The processor can provide instructions via the user interface to perform a workflow routine using the inspection tool and save acquired inspection data to memory. Instructions can direct a user to which equipment to inspect and/or how to collect inspection data associated with one or more pieces of equipment. Systems can determine which equipment is available for inspection by the user, such as via image recognition or proximity detection, and instruct the user to acquire inspection data associated with such equipment. Workflow routine instructions can be provided to the user via various devices, such as an inspection tool, a smartphone, or a tablet.

Description

    CROSS-REFERENCES
  • This application claims priority to U.S. Provisional Application No. 62/582,137, filed Nov. 6, 2017, the content of which is hereby incorporated by reference in its entirety.
  • BACKGROUND
  • Various inspection procedures can be performed using a variety of inspection tools in order to monitor equipment for proper operation or to detect abnormal operating conditions. However, certain environments may include several pieces of equipment for inspection, and one or more such pieces of equipment can have several inspection processes associated therewith, which can contribute to complex and/or lengthy inspection processes. This can result in errors in an inspection process, such as missed data and/or undesirably long inspection times, which can result in excessive downtime or otherwise interfere with typical equipment operation. Such difficulties can be exacerbated when an operator is inexperienced and/or unfamiliar with the environment in which the inspection takes place.
  • SUMMARY
  • Aspects of this disclosure are directed toward systems and methods for collecting data during a workflow routine. Various methods can include receiving information regarding an environment and outputting information regarding pieces of equipment in the environment, such as the pieces of equipment within the environment available for inspection, directions to a location of a selected piece of equipment, one or more parameters associated with a selected piece of equipment, a workflow routine for acquiring inspection data, and/or a reference image representing a selected piece of equipment. Methods can include acquiring inspection data representative of at least one parameter associated with the identified piece of equipment. Inspection data can include image data, such as acoustic image data, infrared image data, and/or visible light image data, for example.
  • Exemplary systems can include an inspection tool, a user interface, memory, and a processor. The processor can be configured to provide instructions to a user to perform a workflow routine using the inspection tool to collect inspection data via the inspection tool, acquire the inspection data, and save the acquired inspection data to memory. Providing instructions can include displaying a list of equipment that is part of the workflow routine and available for inspection. Additionally or alternatively, providing instructions can include displaying a list of steps to perform during the workflow routine.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an environment including a plurality of pieces of equipment suitable for inspection and/or maintenance.
  • FIG. 2 shows an exemplary workflow routine for performing inspection and/or maintenance within an environment.
  • FIG. 3A shows an illustration of a user traveling along an inspection path within an environment.
  • FIG. 3B shows an exemplary interface illustrating to a user that a piece of equipment is available for inspection.
  • FIG. 4A shows an illustration of the user of FIG. 3A at a different location along the inspection path within the environment.
  • FIG. 4B shows an exemplary presentation of a list of equipment within a predetermined proximity of the user in FIG. 4A.
  • FIG. 5A shows an illustration of the user of FIGS. 3A and 4A at a different location along the inspection path within the environment.
  • FIG. 5B shows an exemplary presentation of a list of equipment within a predetermined proximity of the user in FIG. 5A.
  • FIG. 6A shows an illustration of a user at a location along an inspection path within an environment.
  • FIG. 6B shows an exemplary interface assisting a user in capturing an appropriate image of a piece of equipment.
  • FIG. 6C shows an exemplary interface instructing a user to acquire measurement data representative of parameters associated with a piece of equipment.
  • FIG. 7 is a process flow diagram illustrating a variety of possible processes for collecting data during a workflow routine and saving and/or uploading the results.
  • FIG. 8 is a process flow diagram illustrating a variety of possible processes for guiding a user through a workflow routine, collecting data during the workflow routine, and saving and/or uploading the results.
  • DETAILED DESCRIPTION
  • Workers who perform inspections and/or maintenance routines for various types of equipment (e.g., pumps, motors, transformers, electrical panels, etc.) typically carry measuring devices from location to location, take measurements of various pieces of equipment in specified ways, and often do so repeatedly. Such measuring devices can include one or more imaging tools capable of generating image data representative of a target scene and/or one or more test and measurement tools capable of generating measurement data representative of one or more parameters of an object under test.
  • Exemplary imaging tools can include electromagnetic imaging tools, and can be configured to generate image data representative of electromagnetic radiation from a target scene, such as infrared image data, visible light image data, ultraviolet image data, millimeter wave image data, and the like. Combinations of one or more electromagnetic spectrums may also be used, for example, as described in U.S. Pat. No. 7,538,326, entitled “VISIBLE LIGHT AND IR COMBINED IMAGE CAMERA WITH A LASER POINTER,” which is assigned to the assignee of the instant application, and is hereby incorporated by reference in its entirety.
  • Additionally or alternatively, an imaging tool can include an acoustic imaging tool including one or more acoustic sensor elements used to generate an acoustic image of a target scene. Exemplary acoustic imaging tools, and combinations of acoustic imaging tool and electromagnetic imaging tools, are described in U.S. patent application Ser. No. 15/802,153, filed Nov. 2, 2017, and entitled “FOCUS AND/OR PARALLAX ADJUSTMENT IN ACOUSTIC IMAGING USING DISTANCE INFORMATION,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • Electromagnetic imaging tools and/or acoustic imaging tools may be combined or otherwise in communication with one another and/or with other test and measurement tools, for example, as described in U.S. patent application Ser. No. 14/855,884, filed Sep. 16, 2015, and entitled “TEST AND MEASUREMENT SYSTEM WITH REMOVABLE IMAGING TOOL,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety. In some examples, imaging and/or test and measurement functionally may be incorporated into a user's external device (e.g., smartphone, tablet, etc.), such as described in U.S. patent application Ser. No. 14/855,864, filed Sep. 17, 2015, and entitled “MOBILE DEVICE USED WITH ISOLATED TEST AND MEASUREMENT INPUT BLOCK,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • Maintenance activities may also be performed and recorded for future reference and analysis. Inspection and/or maintenance data may be analyzed individually or collectively and used for predictive maintenance or fault prediction.
  • In some cases, maintenance and/or inspection processes can be complex and/or lengthy, making it difficult to perform inspections consistently while gathering a complete set of proper data. Additionally or alternatively, such processes may be performed by a worker who is inexperienced and/or unfamiliar with one or more inspection processes and/or with the particular environment in which the maintenance and/or inspection process is being performed.
  • Aiding techniques and data processing techniques can be used to guide and assist an individual in performing a maintenance and/or inspection process, for example, by assisting a system user in a data collection workflow process. Such aiding can result in faster, easier, and more reliable/consistent data collection. In various examples, these aiding techniques can support or provide an inspection and/or maintenance workflow procedure. For example, in some embodiments, such techniques provide guidance to the user during the workflow and may involve manual inputs from the user and/or automatic means of acquiring and analyzing measurements.
  • In various examples, useful information for performing various tasks in a maintenance and/or inspection procedure may be provided to the user on an ongoing basis throughout the process, and can be provided on-demand or automatically by a processing/analysis system. Such information may include the locations of equipment that is to be inspected, how the measurements should be taken, and whether or not measurements that are obtained are taken appropriately. This information may be provided to the user in the form of text messages, as graphical/text indicators superimposed on live imagery, as sound cues, as light indicators, or by other means. In various embodiments, determining which indicators should be presented to a user can be performed in a variety of ways, including, for example, location detection, processing live imagery to determine the identity of an object under test, or other live data (e.g., proximity detection relative to an object) collected from other sensor devices.
  • As or after maintenance and/or inspection(s) are performed, various data can be recorded, such as, for example, a record of the inspection/maintenance activities performed, entries made by the user, aiding data that is provided to the user during inspections, inspection results (measurements), or combinations thereof. One or more such recorded data elements may be made available to a computerized maintenance management system (CMMS), including a computer database of one or more maintenance operations. Such database entries can include a variety of maintenance and/or inspection information, including past results, instructions for performing such processes, possible errors that can be observed during maintenance/inspection, and the like. The aiding and processing techniques and results described herein therefore provide useful data which improves the effectiveness of such a maintenance management system. Such additional data, along with the increased reliability of measurements due to aiding, result in better maintenance of equipment and more reliable fault predictions.
  • Inspection and/or maintenance tools and/or activities may be part of an overall CMMS system. For example, in some embodiments, one or more tools (e.g., a test and measurement tool, imaging tool, etc.) carried by a user can be configured to provide inputs from the inspection and/or maintenance activities to the CMMS system. Additionally or alternatively, such data may be entered to a CMMS system by a software platform that is accessed by a separate device, such as a computer workstation, an external device such as a smartphone or a tablet, or the like. For instance, in some examples, data acquired by a tool carried by a user (e.g., an imaging tool, a test and measurement tool, or the like) can be communicated to an external device such as described in U.S. patent application Ser. No. 14/855,989, filed Sep. 17, 2015, and entitled “DISPLAY OF IMAGES FROM AN IMAGING TOOL EMBEDDED OR ATTACHED TO A TEST AND MEASUREMENT TOOL,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • In some examples, such a software platform may involve a licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted, and may be referred to as Software as a Service, or SAAS. Such a system may be made accessible to users via a web browser or other client.
  • Measurement data, including imagery, over time, for example, for a particular piece of equipment, as well as analysis results of such data and/or signals sent directly from the equipment itself, may be provided to and made available from the SAAS. Such data can include results from a variety of sensor devices, including images from an IR, VL, acoustic, or other imaging system. Data can additionally or alternatively include metrics/analysis/trends obtained by analysis of such measurements and imagery. In various examples, data from the SAAS, such as imagery, measurement data, and other data for a given piece of equipment, may be automatically associated with that equipment and may be provided to a user of the SAAS, for example, to assist with future maintenance and/or inspection processes.
  • Such imagery, measurement data, and other data, such as analysis results and trend data, may be provided to a user in an on-demand fashion, or automatically via an alarm/notification system. For example, such data may be downloaded from the SAAS and stored in memory on board one or more tools carried by a user, and/or on a user's personal device, such as a smartphone or tablet. Additionally or alternatively, a user may access such data in real time from a remote location, such as a hosted server providing access to the user, e.g., via a tool and/or a personal device. Thus, a user may receive data (e.g., using a tool and/or personal device) that provides information representative of previous and/or expected measurement information, steps for performing one or more maintenance and/or inspection processes, or other equipment information.
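Serving equipment data from on-board memory when available and otherwise fetching it from a hosted server, as described above, might look like the following sketch; the `fetch_remote` callable, equipment IDs, and data fields are hypothetical stand-ins for a real SAAS client.

```python
def get_equipment_data(equipment_id, local_cache, fetch_remote):
    """Serve reference data from on-board memory when previously
    downloaded; otherwise pull it from the hosted server and cache it."""
    if equipment_id in local_cache:
        return local_cache[equipment_id]
    data = fetch_remote(equipment_id)  # hypothetical remote call
    local_cache[equipment_id] = data
    return data

cache = {"pump-7": {"typical_temp_C": 55}}
remote = lambda eid: {"typical_temp_C": 60}
print(get_equipment_data("pump-7", cache, remote))    # served from cache
print(get_equipment_data("motor-2", cache, remote))   # fetched, then cached
```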
  • In some embodiments, trend analysis and/or generating a CMMS or SAAS for use with guided inspections can include building a statistical database of typical equipment operation, for example, as described in U.S. patent application Ser. No. 15/190,792, filed Jun. 23, 2016, and entitled “THERMAL ANOMALY DETECTION,” which is assigned to the assignee of the instant application and is hereby incorporated by reference in its entirety.
  • In addition or alternatively to equipment information, a user may be provided (e.g., via on-board memory, network access, etc.) a workflow routine instructing the user how to perform one or more maintenance and/or inspection processes. In some examples, a workflow routine may be documented in the form of a procedure, which may be brief or quite detailed. A detailed procedure may include, for example, a list of equipment to be inspected and/or maintained, the measuring devices (sensors) to use for each piece of equipment to be inspected, and/or the methods and/or settings in which the measuring devices are to be used at each inspection step.
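A detailed procedure of the kind described above could be represented as a simple data structure; the class names, fields, and example settings below are illustrative assumptions rather than a disclosed schema.

```python
from dataclasses import dataclass, field

@dataclass
class InspectionStep:
    equipment: str
    measuring_device: str          # e.g., "IR imager", "clamp meter"
    settings: dict = field(default_factory=dict)

@dataclass
class WorkflowProcedure:
    name: str
    steps: list

# Hypothetical detailed procedure: equipment list, device per step, settings
procedure = WorkflowProcedure(
    name="Monthly motor room route",
    steps=[
        InspectionStep("Equipment A", "IR imager", {"emissivity": 0.95}),
        InspectionStep("Equipment A", "clamp meter", {"range": "auto"}),
    ],
)
print(len(procedure.steps), procedure.steps[0].measuring_device)
```

Stored as an electronic document (e.g., serialized to JSON), such a structure could reside on a sensing device, a separate carried device, or a central hub, per the text above.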
  • One or more measuring devices (e.g., imaging tools, test and measurement tools, etc.) used during workflows may include an interface that allows for access/viewing of a workflow procedure, stored as an electronic document or instruction set, which the user may review at will during the inspection. The electronic document or instruction set may reside on one of, and may be shared between, the multiple sensing devices used during the workflow. The electronic document may reside on a separate device (e.g., a PC, smartphone, or tablet) that the user carries during the inspection process, or it may reside remotely and be communicated to the measuring equipment or other device that the user carries (e.g., from a data cloud or a central hub that is used for data collection and processing).
  • An exemplary workflow procedure may include the physical route of the inspections/maintenance, the equipment to be inspected/maintained, the measurement devices (e.g., imaging tools, test and measurement tools, etc.) to be used in inspecting each piece of equipment at each step, measurement device settings, connection diagrams for electrical and other contact inspections, required viewing angles and perspectives for image inspections, and/or previously acquired reference images that indicate the appropriate image appearance for imagery at each step of an image-based inspection. Such imagery inspections may involve IR, VL, mm wave, acoustic, or other imaging devices.
  • The measuring devices (imagers and other sensors) or a separate device (e.g., a PC, smartphone, or tablet) that the user is carrying may have access to the electronic workflow procedure. Additionally or alternatively, the user may be able to manually record their progress and/or measurement results during the inspections and measurements of the workflow, for example, saving data to a SAAS and/or CMMS. In some examples, the progress through the workflow and/or the measurements themselves may be recorded automatically.
  • In some embodiments, the physical real-time location of a user and/or of the measuring device(s) may be automatically tracked during the workflow or may be manually entered by the user. Automatic methods may include GPS, inertial tracking methods, triangulation by use of external devices, proximity or RFID sensors placed at various locations, or other means. The physical location of equipment to inspect may also be known to the processing system. The real-time location data of the operator and sensors may be used to infer which pieces of equipment can be inspected (e.g., are near the user, such as within a predetermined proximity of the user) at a given time. These determinations can be made inside a measuring device (e.g., imaging tools, test and measurement tools, etc.), in a separate device that the user is carrying (e.g., a PC, smartphone, or tablet), or at a separate processing hub which is in communication with one or more such devices.
  • In some examples, guidance can be provided to a user as to the proper actions/measurements to take for a given one or more pieces of equipment that are accessible for inspection at a given point. For example, based on known locations of a user and equipment available for inspection, should the user wish to take a measurement or collect an image at a known location, a set of candidate equipment for inspection at that location may be indicated to the user. Thus, the user may select an item from a candidate list of known items. Such a selection may trigger execution of further guidance for performing maintenance and/or inspection of the selected equipment, and/or may pre-load a variety of available data representative of the equipment, eliminating the need for manual entry of some such details, such as the full description of the equipment. As the user verifies the specific identity of the equipment of interest, the instructions for taking required measurements may be indicated, and any subsequent measurements may be automatically associated to the specific equipment for future reference (e.g., in a CMMS and/or SAAS).
  • In some applications, a specific piece of equipment near the user can be identified automatically and in real time. In various examples, such identification can be achieved as a result of the known physical location of the inspection device(s) and equipment, or by an identification signal (active or passive) transmitted from the equipment to the inspection device, or by an external triangulation system.
  • Additionally or alternatively, when a user is carrying an imaging tool (e.g., an imaging system), specific identification of the equipment might be achieved by object recognition image processing techniques where the equipment is identified within the imagery in real time. Such techniques can include, for example, correlation methods and blob analysis. In some embodiments, identification of a specific piece of equipment might be achieved by combining live data obtained from a number of different sensors such as acoustic, mm-wave, visual imaging, and IR imaging.
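Correlation-based identification, one of the techniques named above, can be sketched as a normalized cross-correlation over flattened image patches. The templates and patch values are hypothetical; a production system would operate on real imagery and might use blob analysis or other methods instead.

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length flattened patches."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def identify(patch, templates):
    """Pick the stored equipment template best correlated with the live patch."""
    return max(templates, key=lambda name: ncc(patch, templates[name]))

# Hypothetical 2x2 grayscale templates, flattened
templates = {
    "Equipment C": [10, 50, 10, 50],
    "Equipment D": [50, 10, 50, 10],
}
print(identify([12, 48, 11, 49], templates))  # correlates best with Equipment C
```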
  • In various embodiments, if equipment of interest is identified automatically (e.g., via location determination and/or image recognition), a user may or may not be prompted to manually select the equipment, for example, from a candidate list, or otherwise confirm the identity of the automatically identified equipment. For instance, in some examples, identification of the equipment to be inspected may be indicated to the user, and useful reference information regarding the specific equipment can be provided automatically to the user. Association of the specific equipment identity to subsequently obtained imager data or other sensor data measurements may also be automatic.
  • In some examples, the viewing perspective (orientation, position, and measuring distance) of an imaging device (IR, VL, mm wave, acoustic) may be automatically determined or manually entered by the user. Automatic determination might be achieved using sensors within the imager (e.g., orientation sensing via accelerometers or the like, position sensing such as GPS or the like, etc.), by externally placed sensors, or from the imagery itself using image processing techniques such as object recognition. In some examples, a procedure step (e.g., in a workflow routine) may be indicated by the user to the system, where one or more procedure steps are associated with, and therefore imply, a particular viewing location and perspective for a specific piece of equipment.
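Orientation sensing via accelerometers, mentioned above, commonly derives pitch and roll from a static gravity reading; this is a generic sketch (axis conventions vary between devices) rather than the disclosed method.

```python
import math

def pitch_roll_deg(ax, ay, az):
    """Estimate imager pitch and roll (degrees) from a static
    accelerometer reading of the gravity vector (in g units)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return round(pitch, 1), round(roll, 1)

print(pitch_roll_deg(0.0, 0.0, 1.0))  # device held level: (0.0, 0.0)
```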
  • In some such examples, the system (e.g., via an imaging tool) may direct a user to a location for capturing image data from such a predetermined location. Such an image capturing location may be associated with an image previously captured and associated with the procedure step. Thus, directing a user to the location can include a rephotography process in order to reproduce the capture point of the previously captured image. Exemplary such processes are described in U.S. patent application Ser. No. 13/331,633, filed Dec. 20, 2011, and entitled, “THERMAL IMAGING CAMERA FOR INFRARED REPHOTOGRAPHY,” U.S. patent application Ser. No. 13/331,644, filed Dec. 20, 2011, and entitled, “THERMAL IMAGING CAMERA FOR INFRARED REPHOTOGRAPHY,” and U.S. patent application Ser. No. 13/336,607, filed Dec. 23, 2011, and entitled, “THERMAL IMAGING CAMERA FOR INFRARED REPHOTOGRAPHY,” each of which is assigned to the assignee of the instant application and is incorporated by reference in its entirety.
  • Location/perspective specific reference imagery and other data may be stored prior to performing the workflow, for each piece of equipment that is of interest for imaging or data collection. This data may be stored in the measuring device, a separate device that the user carries, or in a central processing hub. This data may be provided to the user automatically or on demand throughout the workflow for reference. In the case of imaging tasks such as infrared, visible light, or acoustic measurements, this reference imagery may be used by the user as a guide that indicates how the appropriate view should appear for the imager as measurements are acquired. The reference imagery can be displayed along with live scene imagery, and other data may be processed by the system in order to provide useful guidance and cues to the user.
  • Additionally or alternatively, the known real-time imaging perspective and current physical location data, along with previously acquired reference imagery/data, and other data, may be used to determine what objects or equipment are to be expected in the current imagery at a given time. For instance, in an exemplary embodiment, rather than use current location information to guide a user to reposition an imaging tool to a previous position, analysis of current location information may be used (e.g., via a processor) to determine that the tool is near a location from which previous image data (or other data) was captured. Such a location or equipment located at such a location can be presented to the user as a possible inspection candidate.
  • In some examples, a notification or description of such potential equipment can be presented to the user. These indications might include a display of the previously acquired reference image of the equipment. Object recognition image processing techniques such as correlation or blob analysis methods may be used to search for, indicate, and track objects in the imagery which are candidates for known pieces of equipment needing inspection. Such candidates may be presented to the user with an option to confirm the identity of a piece of equipment. In the case where a specific piece of equipment is identified automatically by the system, in various embodiments, a user may or may not be prompted to select or confirm the identity of the equipment. In some such examples, any related messaging to the user or association of the equipment identity to the image and/or other sensor data may then be performed automatically.
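The blob analysis mentioned above can be sketched as connected-component labeling over above-threshold pixels, with each blob's bounding box reported as a candidate region that may correspond to a known piece of equipment awaiting user confirmation. The frame values and threshold are illustrative assumptions.

```python
from collections import deque

def find_blobs(grid, threshold=0):
    """Label 4-connected blobs of above-threshold pixels and return their
    bounding boxes as (row0, col0, row1, col1), in scan order."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                # Breadth-first flood fill over this blob.
                q = deque([(r, c)])
                seen[r][c] = True
                r0 = r1 = r
                c0 = c1 = c
                while q:
                    y, x = q.popleft()
                    r0, r1 = min(r0, y), max(r1, y)
                    c0, c1 = min(c0, x), max(c1, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                boxes.append((r0, c0, r1, c1))
    return boxes

# Two distinct "hot" regions become two inspection candidates.
frame = [
    [0, 0, 0, 0, 0],
    [0, 3, 3, 0, 0],
    [0, 3, 3, 0, 2],
    [0, 0, 0, 0, 2],
]
candidates = find_blobs(frame)
```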
  • In some embodiments, object recognition and other image processing techniques can be used to determine automatically and in real time when pieces of equipment that are required for inspection are present in the image scene. These techniques may also be used to determine when the objects are present but are not being viewed appropriately for imaging measurements. Image processing and/or other techniques can determine errors in the imaging process, and can provide guidance to the user (e.g., to refocus, change position, change viewing perspective, etc.). In addition, a system may automatically determine whether or not the correct settings for obtaining imagery of the equipment are being applied to the imager. If not, the system may automatically provide the user with appropriate guidance for changing these settings.
  • In some embodiments, where a system has an awareness of the image appearance of one or more pieces of equipment to be imaged, and has an awareness of the correct settings for the imager for each measurement, the system may automatically apply the appropriate control settings to the imager as a known piece of equipment is inspected or imaged. If the imager settings and the imagery itself are determined to be appropriate for a required measurement for a piece of equipment, a message may be provided to indicate this status to the user so that the user knows that it is appropriate to obtain a manual image measurement. In some applications, image data may be captured automatically for a desired piece of equipment in the event that the system determines that the imager settings and live image content are appropriate for the inspection of that equipment (e.g., if an image is sufficiently reproduced, or if relevant portions of the equipment are recognized to be within the imaged scene).
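The decision logic described above (capture automatically when everything is appropriate, otherwise guide the user) can be condensed into a small rule check. The inputs, thresholds, and message wording below are illustrative assumptions, not details from the application.

```python
def capture_decision(similarity, in_focus, settings_ok, sim_threshold=0.9):
    """Return (auto_capture, guidance).

    auto_capture is True only when the imager settings, focus, and scene
    similarity to the reference view are all appropriate; otherwise
    guidance lists corrective cues for the user.
    """
    guidance = []
    if not settings_ok:
        guidance.append("adjust imager settings for this equipment")
    if not in_focus:
        guidance.append("refocus")
    if similarity < sim_threshold:
        guidance.append("change position or viewing perspective")
    return (not guidance, guidance)
```

For example, `capture_decision(0.95, True, True)` permits an automatic capture, while a blurry, poorly framed view returns cues instead.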
  • In some applications, if the combination of results from image processing and other sensor data indicates that a measurement is not being taken correctly, such as a desired object not being viewed from the correct perspective, angle, or position, the system may take one or more corrective actions. For example, in some embodiments, a system may signal a user to alert the user of an error. In some examples, if an image is being incorrectly viewed, the system may provide signals to a positioning device capable of physically moving an imaging tool to the appropriate viewing position and/or angle for the measurement. The system may then acquire the measurements automatically, or prompt the user to acquire one or more desired measurements.
  • Various examples of system operation are described below with reference to FIGS. 1-8.
  • FIG. 1 shows an environment 10 including a plurality of pieces of Equipment A, B, C, D, and E suitable for inspection and/or maintenance. A path 20 extends through the environment 10 and moves past Equipment A, B, C, D, and E. In some embodiments, a system (e.g., a measurement device, a user's mobile device, a workstation, a remote server, etc.) can include a map of environment 10, for example, showing path 20 for performing an inspection and/or maintenance routine. Such a map may be viewed by a user for determining an appropriate route for performing a given workflow. Additionally or alternatively, a textual or other description may be used to guide a workflow.
  • FIG. 2 shows an exemplary workflow routine for performing inspection and/or maintenance within environment 10. Such a workflow can include steps such as analyzing various equipment, such as Equipment A-E. In some embodiments, a graphical interface showing a workflow such as that shown in FIG. 2 can be displayed to a user as an overview of a workflow prior to performing the workflow and/or a checklist of steps to be viewed during the workflow. In some examples, a user may select a step from the graphical workflow representation to view additional information about the step, such as various analysis and/or other steps to perform. The exemplary workflow shown in FIG. 2 further includes sample images associated with each step. Such images may be stored in memory (e.g., as part of a CMMS and/or SAAS), and may be used as a visual aid for identifying equipment for inspection and/or as a template or guide for reproducing like images during inspection.
  • In an exemplary embodiment, a user may be presented with the workflow for environment 10 shown in FIG. 2 without access to a map such as that shown in FIG. 1. The provided workflow informs the user which equipment within the space should be analyzed (e.g., inspected). A user may enter the environment 10 to begin the inspection process without explicit knowledge of the location of each of the prescribed pieces of equipment to analyze, but may be provided with images (e.g., as shown) indicating which equipment should be analyzed. Additionally or alternatively, as mentioned elsewhere herein, a user may be alerted as to which equipment may be currently available for inspection, for example, due to being within a certain proximity of such equipment.
  • FIG. 3A shows a user 30 along inspection path 20 within environment 10. In various embodiments, a user may be guided explicitly down path 20 for performing a workflow routine, such as via GPS or other real-time location monitoring technology. In other examples, path 20 may be the only practical path through environment 10. In still further examples, a user may travel through environment 10 via an arbitrary path (e.g., path 20).
  • In the illustrated example, equipment within a predetermined proximity 40 of the user becomes available and/or recommended for inspection. In some such embodiments, wireless communication between one or more measurement devices carried by the user (e.g., a test and measurement tool, an imaging tool, etc.) and the equipment functions within proximity 40. Additionally or alternatively, in some embodiments, a tool carried by the user may determine a distance from one or more pieces of equipment, and identify the equipment within a predetermined proximity (e.g., 40), such as a programmed proximity within which a user should be able to identify the equipment for analysis. In some embodiments, such a predetermined proximity may be adjustable, for example, via a user interface or a remote server.
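The proximity filtering described above can be sketched as a distance check against known equipment positions. The coordinates, proximity radius, and the `inspected` exclusion (equipment whose data has already been captured) are illustrative assumptions.

```python
from math import hypot

# Hypothetical map coordinates for the equipment in environment 10.
EQUIPMENT_LOCATIONS = {
    "Equipment A": (2.0, 1.0),
    "Equipment B": (5.0, 4.0),
    "Equipment C": (6.0, 1.5),
    "Equipment D": (9.0, 4.0),
    "Equipment E": (11.0, 1.0),
}

def available_for_inspection(user_pos, proximity=3.0, inspected=()):
    """Return equipment within `proximity` of the user, excluding pieces
    already inspected, sorted for stable presentation in a list."""
    ux, uy = user_pos
    return sorted(
        name for name, (x, y) in EQUIPMENT_LOCATIONS.items()
        if name not in inspected and hypot(x - ux, y - uy) <= proximity
    )
```

With these coordinates, a user standing at (2.0, 1.0) sees only Equipment A, while a user at (4.5, 2.0) sees Equipment A, B, and C; passing `inspected=("Equipment A",)` drops the already-inspected piece from the list.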
  • In the illustrated example of FIG. 3A, Equipment A is within proximity 40 of user 30. FIG. 3B shows an exemplary interface illustrating to a user that Equipment A is available for inspection. Such an interface may be provided to the user via a tool (e.g., a test and measurement tool, an imaging tool), an external device (e.g., a smartphone, tablet, etc.), or the like. In the example of FIG. 3B, Equipment A is listed as the equipment available for inspection, since Equipment A is within the proximity 40 of user 30. In an exemplary embodiment, user 30 may select Equipment A on the interface, and receive subsequent instruction for performing maintenance and/or inspection of Equipment A. In some examples, a user may receive additional data representative of Equipment A, such as a representative image, typical operating parameters, and the like. In some embodiments, if a tool (e.g., an imaging tool, a test and measurement tool, a remote device interfacing with the tool, etc.) determines that only one piece of equipment (e.g., Equipment A) is within range, that equipment may be automatically selected for inspection. For example, a reference image, an inspection process, typical operating parameters, and/or other information related to the equipment may be automatically presented.
  • FIG. 4A shows user 30 at a different location along inspection path 20 within environment 10. In the illustrated example of FIG. 4A, the user 30 is within a predetermined proximity 40 of Equipment A, B, and C. As described, in some embodiments, a user 30 may be presented with a list of equipment within a predetermined proximity 40 of the user 30. FIG. 4B shows an exemplary presentation of a list of equipment within the predetermined proximity 40 of the user 30 in FIG. 4A. In some examples, the list of available equipment includes representative images of the available equipment.
  • As shown, Equipment A, B, and C are considered available for inspection given the location of the user 30. In an exemplary embodiment, the user may select a piece of equipment from the list of available equipment in order to receive additional information regarding the equipment and/or inspection processes related thereto.
  • In some embodiments, a system can determine when prescribed maintenance for a given piece of equipment has been performed (e.g., due to automatic data acquisition, receiving manual data entry, receiving an input from the user indicating inspection is complete, etc.), and can update the interface to indicate which equipment has been analyzed according to the workflow routine and which equipment has yet to be analyzed. For instance, if a user 30 inspects Equipment A when available at the location shown in FIG. 3A, when the user arrives at the location in FIG. 4A, Equipment A may be excluded from the list of available equipment for inspection or may otherwise be presented differently from equipment for which inspection data has not yet been acquired. In some examples, equipment for which inspection data has been captured will be displayed in a different color, or may be grayed out and/or not selectable by a user. In some embodiments, a user may select the equipment for which data has already been captured in order to review the captured data, or to capture new or additional data.
  • FIG. 5A shows user 30 at yet another different location along inspection path 20 within environment 10. In the illustrated example of FIG. 5A, the user 30 is within a predetermined proximity 40 of Equipment C, D, and E. As described, in some embodiments, a user 30 may be presented with a list of equipment within a predetermined proximity 40 of the user 30. FIG. 5B shows an exemplary presentation of a list of equipment within the predetermined proximity 40 of the user 30 in FIG. 5A. In some examples, the list of available equipment includes representative images of the available equipment.
  • As shown, Equipment C, D, and E are considered available for inspection given the location of the user 30. In an exemplary embodiment, the user may select a piece of equipment from the list of available equipment in order to receive additional information regarding the equipment and/or inspection processes related thereto. As described above, in some examples, equipment for which inspection data has already been acquired may be presented differently from equipment for which data has yet to be acquired. For example, if a user performed an inspection of Equipment C while at the location shown in FIG. 4A, Equipment C may be excluded from or otherwise presented differently than Equipment D and E in the list of available equipment shown in FIG. 5B.
  • FIG. 6A shows the user 30 at a location along path 20 in environment 10 similar to the location shown in FIG. 4A. In the illustrated example of FIG. 6A, Equipment C is in a field of view 50 of a tool (e.g., an imaging tool) carried by the user 30. As described elsewhere herein, in some examples, an inspection system (e.g., an imaging tool) can assist a user in capturing an image of equipment. FIG. 6B shows an exemplary interface assisting a user in capturing an appropriate image of Equipment C. In the illustrated example, a template image 52 (e.g., associated with the prescribed workflow routine) is displayed on an interface associated with Equipment C. A live image 54 can be presented alongside the template image (e.g., with one or both images being partially transparent) to assist a user in positioning an imaging tool to capture an image similar to the one associated with the workflow routine. In addition or alternatively to providing a template image, other instructions/guidance can be provided for recapturing a new image corresponding to the reference image, such as described in U.S. patent application Ser. Nos. 13/331,633, 13/331,644, and 13/336,607, each of which is incorporated by reference.
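The template/live alignment assistance described above can be sketched with phase correlation: estimate the integer pixel shift of the live view relative to the reference image, then turn that shift into a repositioning cue. The cue wording is an illustrative assumption; the referenced applications describe more complete rephotography guidance.

```python
import numpy as np

def alignment_cue(reference, live):
    """Estimate the (dy, dx) shift of `live` relative to `reference` via
    phase correlation, and derive a simple repositioning cue."""
    f = np.conj(np.fft.fft2(reference)) * np.fft.fft2(live)
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = reference.shape
    if dy > h // 2:          # map wrap-around peaks to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    cues = []
    if dy:
        cues.append("tilt down" if dy > 0 else "tilt up")
    if dx:
        cues.append("pan right" if dx > 0 else "pan left")
    return (int(dy), int(dx)), (cues or ["aligned"])

# Live view is the reference scene shifted down 1 px and right 2 px.
reference = np.zeros((8, 8))
reference[2:4, 2:4] = 1.0
live = np.roll(reference, shift=(1, 2), axis=(0, 1))
shift, cues = alignment_cue(reference, live)
```

Phase correlation only recovers a global translation; matching a full viewing perspective would additionally require rotation, scale, and parallax handling.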
  • In addition or alternatively to capturing image data, various other parameters may be captured during a workflow routine, for example, measurement data that can be captured via a test and measurement tool. FIG. 6C shows an exemplary interface instructing a user to acquire measurement data representative of parameters associated with Equipment C (Parameter X, Parameter Y, Parameter Z). Such parameters can include a variety of different parameters, for example, that can be analyzed/acquired using a test and measurement tool that may be carried by a user. In some embodiments, a user may select from the list of parameters in order to view instructions on how to measure such a parameter. Upon selection of a parameter, the user may be presented with detailed instructions for acquiring measurement data representative of that parameter and/or an interface on a measurement tool suitable for performing a measurement data acquisition.
  • In various embodiments, only image data, only measurement data, or both image data and measurement data can be required during a workflow. Thus, in various embodiments, a user may be presented with an image capture interface (e.g., as shown in FIG. 6B), a measurement data acquisition interface (e.g., as shown in FIG. 6C), or both. In some examples, an “acquire image data” or similar step may be presented in a list of steps to perform during a workflow routine. In various examples, a system may present workflow routine steps to a user individually and sequentially in order to guide the user through the workflow process. Alternatively, a user may be presented with a list of steps, from which a user may select a step in the process to perform. Upon such a selection, the system may assist the user in performing the selected step.
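The two presentation modes described above, sequential guidance through steps versus free selection from a list, can be sketched with a minimal routine tracker. The step names are illustrative.

```python
class WorkflowRoutine:
    """Track completion of workflow steps, supporting both sequential
    guidance (next_step) and selection from a list (remaining)."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.completed = set()

    def next_step(self):
        """Sequential mode: the first step not yet completed, or None."""
        for step in self.steps:
            if step not in self.completed:
                return step
        return None

    def remaining(self):
        """List mode: every step still awaiting completion."""
        return [s for s in self.steps if s not in self.completed]

    def complete(self, step):
        self.completed.add(step)

routine = WorkflowRoutine([
    "capture image of Equipment C",
    "measure Parameter X",
    "measure Parameter Y",
])
routine.complete("capture image of Equipment C")
```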
  • FIG. 7 is a process flow diagram illustrating a variety of possible processes for collecting data during a workflow routine (130) and saving and/or uploading the results (140). In an exemplary process, a user can receive instruction to perform an inspection process of a given piece of equipment (100), and can receive a representative image or description of the equipment to assist in the inspection (102). Upon receiving such assistance information, the user may locate the equipment (104) for inspection.
  • In another exemplary process, the user may receive an indication that equipment is available for inspection (110), and select equipment for inspection, for example, from a list of available equipment (112). In some examples, upon selection, the user may receive a representative image or description of the selected equipment (102), or may locate the equipment (104) based on, for example, information from the provided list of available equipment.
  • In yet another exemplary process, the user may select equipment within an environment for analysis (120) and enter information representative of the equipment into an inspection system (e.g., via an interface in an imaging tool, a test and measurement tool, an external device, etc.). In various examples, the user can capture an image of the selected equipment, input a type or location of such equipment, or the like. An inspection system may be programmed with instructions to identify the equipment for inspection based on the information input by the user, for example, via image recognition or the like. In some examples, the system may present information regarding the equipment the system believes is to be inspected, which can be confirmed by the user (124).
  • After the equipment is located (104) and/or confirmed (124), the user may collect data according to a workflow routine or otherwise confirm data captured automatically is satisfactory for performing the routine (130). The results (e.g., inspection results) can then be saved locally or uploaded to a server (140).
  • FIG. 8 is a process flow diagram illustrating a variety of possible processes for guiding a user through a workflow routine (225), collecting data during the workflow routine (e.g., an inspection process) (230) and saving and/or uploading the results (240). The processes in FIG. 8 may be performed by a maintenance/inspection system, for example, via one or more tools carried by a user providing guidance/instruction to the user.
  • In one example, a system may provide instruction to a user to perform inspection of a piece of equipment (200), and may provide a representative image or description of the equipment (202) to assist the user in finding the equipment.
  • In another exemplary process, the system may provide an indication to a user that equipment is available for inspection (210), for example, by way of a list of one or more available pieces of equipment. The system may receive a selection of equipment, for example, from such a list (212). In some examples, upon receiving the selection, the system may provide a representative image or description of the equipment (202) to assist the user in finding the equipment.
  • In still another exemplary process, the system may receive information regarding equipment for inspection from the user (220). Such information may include an acquired image or other identification information that the system may use to look up the equipment, for example, via a lookup table, image recognition, or the like in order to determine the equipment that is to be inspected based on the received information (222). In some examples, the system may confirm that the determined equipment (e.g., from step 222) is correct (224), for example, by indicating to the user the equipment the system identified based on the received information.
  • Once the equipment is selected and/or confirmed, the system may provide guidance for performing a workflow routine with respect to the equipment, such as an inspection process (225). The system may collect (e.g., automatically) and/or receive (e.g., via a user interface) data from the workflow routine, such as image data, measurement data, or the like (230) and save the results internally and/or upload the results to a separate location, such as a remote server (240).
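The FIG. 8 flow (steps 220-240) can be condensed into a single pipeline: look up the equipment from user-supplied information, optionally confirm it, collect workflow data, and return a result ready to save or upload. The catalog, identifier, and collect callable below are hypothetical stand-ins, not details from the application.

```python
def run_inspection(info, catalog, collect, confirm=lambda equipment: True):
    """Identify equipment from `info` (step 222), confirm it (step 224),
    collect workflow data (step 230), and return results to be saved or
    uploaded (step 240). Returns None if identification/confirmation fails.
    """
    equipment = catalog.get(info)                 # step 222: identify
    if equipment is None or not confirm(equipment):
        return None                               # nothing to save
    data = collect(equipment)                     # step 230: collect data
    return {"equipment": equipment, "data": data}  # step 240: results

result = run_inspection(
    "qr-code-1138",                               # hypothetical identifier
    catalog={"qr-code-1138": "Equipment C"},
    collect=lambda eq: {"Parameter X": 1.2},
)
```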
  • In various examples, such system processes can be performed by one or more processors distributed among one or more system components, such as tools carried by the user (e.g., imaging tools, test and measurement tools, external devices, etc.), remote servers, and the like. Components described as processors may be implemented as one or more processors, such as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic circuitry, or the like, either alone or in any suitable combination.
  • Various embodiments have been described. Such examples are non-limiting, and do not define or limit the scope of the invention in any way. Rather, these and other examples are within the scope of the following claims.

Claims (26)

1. A system comprising:
an inspection tool configured to acquire inspection data;
a user interface;
memory; and
a processor in communication with the inspection tool, the user interface, and the memory, the processor being configured to:
provide instructions via the user interface to perform a workflow routine using the inspection tool, the workflow routine including at least collecting inspection data associated with a piece of equipment;
acquire the inspection data associated with the equipment; and
save the acquired inspection data to the memory.
2. The system of claim 1, wherein the user interface, the memory, and the processor are included in the inspection tool.
3. The system of claim 1, wherein the workflow routine comprises a list including a plurality of pieces of equipment for which to collect inspection data and one or more steps for collecting inspection data from each of the plurality of pieces of equipment.
4. The system of claim 3, wherein the workflow routine includes a list of one or more devices necessary for completing the routine, the one or more devices including the inspection tool and the method and/or settings for operating at least one of the one or more devices.
5. The system of claim 1, wherein the workflow routine is presented as an electronic document, and wherein the user interface comprises a computer, smartphone, or tablet.
6. The system of claim 1, wherein the inspection tool comprises an imaging tool, and wherein the workflow routine comprises instructions for acquiring image data, including at least a viewing perspective or a reference image.
7. The system of claim 6, wherein the processor is configured to:
receive image data from the imaging tool; and
recognize an object within the received image data; and
wherein the workflow routine comprises instructions for collecting inspection data associated with the recognized object.
8. The system of claim 1, further comprising a position sensor in communication with the processor, and wherein the processor is configured to provide instructions via the user interface to perform a workflow routine based on received position information from the position sensor.
9. The system of claim 8, wherein the position information comprises proximity information representative of proximity between the user and a piece of equipment.
10. The system of claim 1, wherein:
providing instructions via the user interface to perform a workflow routine comprises displaying, on the user interface, a list of equipment that is part of the workflow routine and available for inspection; and
the processor is configured to:
receive a selection of a piece of equipment via the user interface; and
provide instructions, via the user interface, for collecting inspection data associated with the selected piece of equipment.
11. The system of claim 1, wherein:
providing instructions via the user interface to perform a workflow routine comprises displaying, on the user interface, a list of steps to perform during the workflow routine, and wherein steps that have been completed are shown in a distinguishing way from steps that have not been completed.
12. The system of claim 11, wherein the processor is configured to determine automatically when one or more steps in the workflow routine are completed.
13. The system of claim 1, wherein the processor is located remotely from the inspection tool and the user interface, and wherein the processor communicates with the inspection tool and the user interface via a network.
14. A method of collecting data during a workflow routine, comprising:
receiving information regarding an environment;
providing information regarding pieces of equipment in the environment;
identifying a piece of equipment for inspection;
acquiring inspection data representative of at least one parameter associated with the identified piece of equipment; and
storing the inspection data.
15. The method of claim 14, further comprising determining one or more pieces of equipment available for inspection; and
wherein providing information regarding pieces of equipment in the environment comprises notifying a user of at least one piece of equipment currently available for inspection; and
identifying a piece of equipment for inspection comprises receiving a selection from a user designating the identified piece of equipment.
16. The method of claim 15, further comprising the step of determining a first piece of equipment is within a predetermined proximity, and notifying the user that the first piece of equipment is available for inspection upon determining that the first piece of equipment is within the predetermined proximity.
17. The method of claim 15, wherein determining one or more pieces of equipment available for inspection comprises recognizing one or more pieces of equipment within a field of view of an imaging tool.
18. The method of claim 15, wherein determining one or more pieces of equipment available for inspection comprises determining one or more pieces of equipment for which inspection data has not yet been acquired.
19. The method of claim 15, wherein notifying a user of at least one piece of equipment currently available for inspection comprises presenting a user with a list including equipment currently available for inspection.
20. The method of claim 19, wherein the list including equipment currently available for inspection presents equipment for which inspection data has already been received in a visually distinguishing manner from equipment for which inspection data has not yet been received.
21. The method of claim 20, wherein equipment for which inspection data has already been received are presented in the list in a different color than equipment for which inspection data has not yet been received.
22. The method of claim 19, wherein receiving the selection of a piece of equipment comprises receiving a selection from the list including equipment currently available for inspection.
23. The method of claim 15, further comprising outputting information related to the identified piece of equipment, the outputting information comprising:
outputting directions to a location of the selected piece of equipment;
outputting one or more parameters associated with the selected piece of equipment;
outputting a workflow routine for acquiring the inspection data;
and/or presenting a reference image representative of the selected equipment.
24. The method of claim 14, wherein the inspection data comprises acoustic image data, infrared image data, and/or visible light image data.
25. The method of claim 14, wherein receiving information regarding the environment comprises determining one or more pieces of equipment in the environment.
26. The method of claim 25, wherein determining one or more pieces of equipment in the environment comprises determining location information representative of a current location;
receiving an input indicative of the environment;
and/or receiving proximity information representative of a proximity to one or more pieces of equipment.
US16/180,873 2017-11-06 2018-11-05 Inspection workflow using object recognition and other techniques Abandoned US20190141236A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/180,873 US20190141236A1 (en) 2017-11-06 2018-11-05 Inspection workflow using object recognition and other techniques
US17/348,755 US12088910B2 (en) 2017-11-06 2021-06-15 Inspection workflow using object recognition and other techniques

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762582137P 2017-11-06 2017-11-06
US16/180,873 US20190141236A1 (en) 2017-11-06 2018-11-05 Inspection workflow using object recognition and other techniques

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/348,755 Continuation US12088910B2 (en) 2017-11-06 2021-06-15 Inspection workflow using object recognition and other techniques

Publications (1)

Publication Number Publication Date
US20190141236A1 true US20190141236A1 (en) 2019-05-09


Family Applications (2)

Application Number Title Priority Date Filing Date
US16/180,873 Abandoned US20190141236A1 (en) 2017-11-06 2018-11-05 Inspection workflow using object recognition and other techniques
US17/348,755 Active 2039-05-09 US12088910B2 (en) 2017-11-06 2021-06-15 Inspection workflow using object recognition and other techniques


Country Status (3)

Country Link
US (2) US20190141236A1 (en)
EP (1) EP3480752A1 (en)
CN (1) CN109754148A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180348143A1 (en) * 2017-05-31 2018-12-06 Flir Systems Ab Inspection routing systems and methods
US20220057269A1 (en) * 2020-08-21 2022-02-24 Analog Devices, Inc. Multi-sensor using a thermal camera
US20220377219A1 (en) * 2021-05-18 2022-11-24 Magna Electronics Inc. Vehicular driver monitoring system with camera view optimization
WO2024073746A1 (en) * 2022-09-30 2024-04-04 Flir Systems Ab Camera alignment using reference image for asset inspection systems and methods
US20240420477A1 (en) * 2009-06-04 2024-12-19 Optics Innovation Llc Method and apparatus for a wearable computer
US12335655B2 (en) 2020-03-31 2025-06-17 Flir Systems Ab Thermal imaging asset inspection systems and methods

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114004819B (en) * 2021-11-04 2025-05-30 中国联合网络通信集团有限公司 Fabric inspection and processing method, device, equipment, system and medium

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5386117B1 (en) 1993-06-07 1997-06-10 Computational Systems Inc Infrared thermography system including mobile unit
US5637871A (en) 1993-06-07 1997-06-10 Computational Systems, Inc. Portable digital infrared thermography system
US5724242A (en) 1995-03-21 1998-03-03 Caterpillar Inc. Method for producing production control software for a natural gas engine controller
US5860066A (en) 1996-06-27 1999-01-12 Payment Systems For Credit Unions Inc. Imaging and workflow system
US5949418A (en) 1997-05-06 1999-09-07 Microsoft Corporation Operating system for handheld computing device having graphical window minimization/enlargement functionality
US6255650B1 (en) 1998-12-11 2001-07-03 Flir Systems, Inc. Extreme temperature radiometry and imaging apparatus
US6542824B1 (en) 1999-01-29 2003-04-01 International Business Machines Corporation Method and system for determining position information utilizing a portable electronic device lacking global positioning system (GPS) reception capability
US6845913B2 (en) 1999-02-11 2005-01-25 Flir Systems, Inc. Method and apparatus for barcode selection of themographic survey images
US7640007B2 (en) * 1999-02-12 2009-12-29 Fisher-Rosemount Systems, Inc. Wireless handheld communicator in a process control environment
SE0201529D0 (en) 2002-05-21 2002-05-21 Flir Systems Ab Method and apparatus for IR camera inspections
US10546253B2 (en) * 2013-01-22 2020-01-28 General Electric Company Realtime inspection management
US6974373B2 (en) 2002-08-02 2005-12-13 Geissler Technologies, Llc Apparatus and methods for the volumetric and dimensional measurement of livestock
US20040143602A1 (en) 2002-10-18 2004-07-22 Antonio Ruiz Apparatus, system and method for automated and adaptive digital image/video surveillance for events and configurations using a rich multimedia relational database
US6937164B2 (en) 2003-02-17 2005-08-30 The Boeing Company Methods and apparatus for transportation vehicle security monitoring
US20070174152A1 (en) 2003-12-08 2007-07-26 Bjornberg David B Handheld system for information acquisition, verification, recording, processing, display and communication
US7454050B2 (en) 2004-06-18 2008-11-18 Csi Technology, Inc. Method of automating a thermographic inspection process
US7535002B2 (en) 2004-12-03 2009-05-19 Fluke Corporation Camera with visible light and infrared image blending
WO2006060746A2 (en) 2004-12-03 2006-06-08 Infrared Solutions, Inc. Visible light and ir combined image camera with a laser pointer
US20060244837A1 (en) 2005-04-28 2006-11-02 Lucas Ekeroth Thermal imaging device
US7332716B2 (en) 2005-06-06 2008-02-19 Flir Systems Ab IR camera
US7528372B2 (en) 2005-10-19 2009-05-05 Csi Technology, Inc. Apparatus and method for infrared imaging with performance algorithm
US7475275B2 (en) 2005-10-27 2009-01-06 International Business Machines Corporation Method for fault handling in a co-operative workflow environment
US7732768B1 (en) 2006-03-02 2010-06-08 Thermoteknix Systems Ltd. Image alignment and trend analysis features for an infrared imaging system
US20080183049A1 (en) 2007-01-31 2008-07-31 Microsoft Corporation Remote management of captured image sequence
EP2174291B1 (en) 2007-07-09 2017-04-26 Flir Systems AB Method of processing an infrared image, infrared image capturing system and computer readable medium
TW200915853A (en) 2007-09-28 2009-04-01 Altek Corp Method and system for stabilizing vibration of camera
US20090106684A1 (en) * 2007-10-22 2009-04-23 Al Chakra System and Method to Facilitate Progress Forking
US7970911B2 (en) * 2008-01-04 2011-06-28 Mitel Networks Corporation Method, apparatus and system for modulating an application based on proximity
DE102008004785B4 (en) 2008-01-17 2012-06-21 Dräger Safety AG & Co. KGaA System for protecting and / or guiding people in dangerous situations
US8749635B2 (en) 2009-06-03 2014-06-10 Flir Systems, Inc. Infrared camera systems and methods for dual sensor applications
US8645854B2 (en) * 2010-01-19 2014-02-04 Verizon Patent And Licensing Inc. Provisioning workflow management methods and systems
US8976931B2 (en) 2010-04-13 2015-03-10 Carestream Health, Inc. Mobile radiography imaging apparatus using prior related images before current image exposure and methods for same
US9615147B2 (en) 2010-05-17 2017-04-04 Flir Systems, Inc. Multisensory meter system
US20120039537A1 (en) 2010-08-10 2012-02-16 Keys Gregory C Method, apparatus, and system for workflow participation of an imaging device
WO2012101272A1 (en) 2011-01-28 2012-08-02 Flir Systems Ab A method for managing ir image data
WO2012101275A1 (en) 2011-01-28 2012-08-02 Flir Systems Ab Dynamic annotation in user information system of ir camera
EP2678842A1 (en) 2011-02-22 2014-01-01 Flir System, Inc. Infrared sensor systems and methods
CN103493472B (en) 2011-02-25 2017-07-04 菲力尔系统公司 Modular infrared camera system and method
US9176990B2 (en) 2011-03-04 2015-11-03 Fluke Corporation Visual image annotation, tagging of infrared images, and infrared image linking
KR101066068B1 (en) 2011-03-22 2011-09-20 (주)유디피 Video surveillance device and method using dual camera
US9058520B2 (en) * 2011-09-22 2015-06-16 Siemens Corporation Systems and methods for hands free inspection
US20130090946A1 (en) 2011-10-05 2013-04-11 Thomas Kwok-Fah Foo Systems and methods for imaging workflow
US20130155249A1 (en) 2011-12-20 2013-06-20 Fluke Corporation Thermal imaging camera for infrared rephotography
US20130155248A1 (en) 2011-12-20 2013-06-20 Fluke Corporation Thermal imaging camera for infrared rephotography
US20130162835A1 (en) 2011-12-23 2013-06-27 Fluke Corporation Thermal imaging camera for infrared rephotography
US20130176202A1 (en) * 2012-01-11 2013-07-11 Qualcomm Incorporated Menu selection using tangible interaction with mobile devices
US9479388B2 (en) * 2012-08-29 2016-10-25 Maintenance Assistant Inc Computer system and method for maintenance management including collaboration across clients
US10337962B2 (en) 2013-03-15 2019-07-02 Fluke Corporation Visible audiovisual annotation of infrared images using a separate wireless mobile device
US10031490B2 (en) * 2013-03-15 2018-07-24 Fisher-Rosemount Systems, Inc. Mobile analysis of physical phenomena in a process plant
US8887993B2 (en) * 2013-04-23 2014-11-18 The Boeing Company Barcode access to electronic resources for complex system parts
US20160080666A1 (en) 2014-09-17 2016-03-17 Fluke Corporation Test and measurement system with removable imaging tool
US9568368B2 (en) 2014-09-17 2017-02-14 Fluke Corporation Mobile device used with isolated test and measurement input block
US20160076937A1 (en) 2014-09-17 2016-03-17 Fluke Corporation Display of images from an imaging tool embedded or attached to a test and measurement tool
US10116885B2 (en) 2015-04-05 2018-10-30 Hema Imaging Llc Systems and approaches for repeated thermal imaging determinations
US10152836B2 (en) * 2016-04-19 2018-12-11 Mitchell International, Inc. Systems and methods for use of diagnostic scan tool in automotive collision repair
US10789577B2 (en) * 2016-04-21 2020-09-29 Continental Tide Defense Systems Inc. Workflow, assessment, verification, and evaluation (WAVE) system and method
US10375325B2 (en) 2016-06-23 2019-08-06 Fluke Corporation Thermal anomaly detection
WO2019005951A1 (en) * 2017-06-29 2019-01-03 Walmart Apollo, Llc Systems and methods for performing and tracking asset inspections
US11099075B2 (en) 2017-11-02 2021-08-24 Fluke Corporation Focus and/or parallax adjustment in acoustic imaging using distance information

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240420477A1 (en) * 2009-06-04 2024-12-19 Optics Innovation Llc Method and apparatus for a wearable computer
US12340590B2 (en) * 2009-06-04 2025-06-24 Optics Innovation Llc Method and apparatus for a wearable computer
US20180348143A1 (en) * 2017-05-31 2018-12-06 Flir Systems Ab Inspection routing systems and methods
US10732123B2 (en) * 2017-05-31 2020-08-04 Flir Systems Ab Inspection routing systems and methods
US12335655B2 (en) 2020-03-31 2025-06-17 Flir Systems Ab Thermal imaging asset inspection systems and methods
US20220057269A1 (en) * 2020-08-21 2022-02-24 Analog Devices, Inc. Multi-sensor using a thermal camera
US20220377219A1 (en) * 2021-05-18 2022-11-24 Magna Electronics Inc. Vehicular driver monitoring system with camera view optimization
US11930264B2 (en) * 2021-05-18 2024-03-12 Magna Electronics Inc. Vehicular driver monitoring system with camera view optimization
US20240214666A1 (en) * 2021-05-18 2024-06-27 Magna Electronics Inc. Vehicular driver monitoring system with camera view optimization
US12225278B2 (en) * 2021-05-18 2025-02-11 Magna Electronics Inc. Vehicular driver monitoring system with camera view optimization
WO2024073746A1 (en) * 2022-09-30 2024-04-04 Flir Systems Ab Camera alignment using reference image for asset inspection systems and methods

Also Published As

Publication number Publication date
US12088910B2 (en) 2024-09-10
EP3480752A1 (en) 2019-05-08
CN109754148A (en) 2019-05-14
US20210344833A1 (en) 2021-11-04

Similar Documents

Publication Publication Date Title
US12088910B2 (en) Inspection workflow using object recognition and other techniques
CN111604888B (en) Inspection robot control method, inspection system, storage medium and electronic device
US10809148B2 (en) Combined gas leakage detection and quantification
US10878607B2 (en) Collection and validation of data from visual displays
RU2707681C2 (en) Method and device for remote inspection of aircraft engine condition
US10839717B2 (en) Weld training systems to synchronize weld data for presentation
US11127211B2 (en) Plant management system, plant management method, plant management apparatus, and plant management program
US8902302B2 (en) Method and apparatus for surveying with a feature location
CN111417962A (en) Equipment management system
US10732123B2 (en) Inspection routing systems and methods
CN107643125A (en) The determination method and apparatus of equipment fault
CN114508646B (en) Intelligent detection method and system for overhauling pipeline by utilizing pipeline robot
WO2018038149A1 (en) Gas detection information display system and gas detection information display program
US12335655B2 (en) Thermal imaging asset inspection systems and methods
KR102227031B1 (en) Method for monitoring cracks on surface of structure by tracking of markers in image data
JP7335899B2 (en) Measuring system, measuring device, measuring method, and program
JP7193913B2 (en) Inspection work management system and inspection work management method
KR101912611B1 (en) Portable measurement and analysis device using location based information service
JP7432078B2 (en) Portable display device with virtual information overlay
KR101497396B1 (en) A system for measuring target location and method for measuring target location using the same
KR102039902B1 (en) Remote device precision inspection system and method
US20250071251A1 (en) System and method for locating and visualizing camera images in relation to a large-scale manufacturing product
US20240328789A1 (en) Method for recording inspection data
Induti et al. Magnetic wireless crawler for welds Visual Testing, based on 3D profilometry and 2D image processing
CN119359631A (en) Finished product quality inspection method, finished garment quality inspection method, system, equipment, storage medium and program product

Legal Events

Date Code Title Description
AS Assignment

Owner name: FLUKE CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BERGSTROM, PETER A.;KNIGHT, BRIAN;RHEAD, JAMIE;SIGNING DATES FROM 20171117 TO 20180110;REEL/FRAME:047477/0950

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

AS Assignment

Owner name: FLUKE CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BERGSTROM, PETER A;KNIGHT, BRIAN;RHEAD, JAMIE;SIGNING DATES FROM 20171117 TO 20180110;REEL/FRAME:056721/0221

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION
