US20030164200A1 - Assembly line fluid filling system and method - Google Patents
Assembly line fluid filling system and method
- Publication number
- US20030164200A1 (application Ser. No. 09/810,778)
- Authority
- US
- United States
- Prior art keywords
- fuel
- stem
- fuel stem
- end effector
- robotic arm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D65/00—Designing, manufacturing, e.g. assembling, facilitating disassembly, or structurally modifying motor vehicles or trailers, not otherwise provided for
- B62D65/02—Joining sub-units or components to, or positioning sub-units or components with respect to, body shell or other sub-units or components
- B62D65/18—Transportation, conveyor or haulage systems specially adapted for motor vehicle or trailer assembly lines
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B67—OPENING, CLOSING OR CLEANING BOTTLES, JARS OR SIMILAR CONTAINERS; LIQUID HANDLING
- B67D—DISPENSING, DELIVERING OR TRANSFERRING LIQUIDS, NOT OTHERWISE PROVIDED FOR
- B67D7/00—Apparatus or devices for transferring liquids from bulk storage containers or reservoirs into vehicles or into portable containers, e.g. for retail sale purposes
- B67D7/04—Apparatus or devices for transferring liquids from bulk storage containers or reservoirs into vehicles or into portable containers, e.g. for retail sale purposes for transferring fuels, lubricants or mixed fuels and lubricants
- B67D7/0401—Apparatus or devices for transferring liquids from bulk storage containers or reservoirs into vehicles or into portable containers, e.g. for retail sale purposes for transferring fuels, lubricants or mixed fuels and lubricants arrangements for automatically fuelling vehicles, i.e. without human intervention
- B67D2007/0444—Sensors
- B67D2007/0455—Sensors recognising the position
- B67D2007/0467—Sensors recognising the position of the fuel tank flap and/or fuel tank opening
- B67D2007/0473—Sensors recognising the position of the fuel tank flap and/or fuel tank opening optically
Description
- a computer program listing appendix is contained on a compact disc submitted herewith and is hereby incorporated by reference.
- One compact disc and one duplicate are submitted according to 37 C.F.R. § 1.52(e), and each contains the following files (file name, size in bytes, date of creation): Cal.rsp, 1 KB, May 21, 1995; Cal_eval.c, 12 KB, Jul. 15, 1995; Cal_main.c, 72 KB, Sep. 12, 1995; Cal_main.h, 10 KB, Jul. 15, 1995; Cal_tran.c, 14 KB, Oct. 22, 1995; Cal_util.c, 8 KB, May 14, 1995; Cc_cd.dat, 9 KB, Oct. 28, 1995; Cc_cpcc.dat, 1 KB, Oct. 28, 1995; Ccal.c, 4 KB, Apr.
- This invention relates to a robotic assembly line filling system.
- a known system 10 for fueling an automotive vehicle 12 is shown.
- the vehicle 12 is moved along an assembly line via a conveyer line 14 .
- an operator 16 inserts a fuel nozzle 18 into a fuel stem (not shown) to fuel the vehicle 12 .
- a gantry 20 and a carriage 22 are utilized to move the fuel nozzle 18 along with the vehicle 12 .
- a predetermined amount of fuel is pumped into the fuel stem of the vehicle 12 .
- the operator 16 may remove the nozzle 18 from the fuel stem or the nozzle 18 may be automatically removed from the fuel stem.
- the known fueling system 10 has a drawback in that the operator 16 must manually insert the fuel nozzle 18 into the fuel stem. Labor and associated manufacturing costs are increased.
- U.S. Pat. No. 4,708,175 issued to Janashak et al. discloses a robot that fills a container mounted on a vehicle with a fluid.
- the vehicle is mounted on a conveyer line that moves into a work cell where the robot is located.
- the robot utilizes a vision system to determine the position of the inlet of the container.
- the robot then moves a robotic arm to the position of the inlet to fill the container with fluid, using gauge holes as visual target points.
- while Janashak et al. appear to disclose that the system may be used to fill a moving container provided the robot is capable of tracking the moving vehicle, no description of how this tracking can be accomplished is provided.
- One advantage of the present invention is that the vehicles on the assembly line need not be stopped to have a fluid container, such as a fuel tank, filled.
- the present invention broadly provides a system and method for filling a container with a fluid in a vehicle moving along an assembly line.
- the fluid can be, for example only, fuel, coolant, windshield washer fluid, and the like.
- the method includes the steps of determining a position of an inlet of the container (as the vehicle is moving) using a machine vision system. The next step involves moving a fluid fill outlet to the container inlet using a robot. Finally, the container is filled via the outlet while the vehicle continues to move.
- a fueling system in accordance with the present invention includes a gantry having a carriage configured to move generally parallel to a first axis (i.e., the conveyor line on which the vehicle moves).
- the fueling system further includes a robotic arm attached to the carriage that moves with the carriage.
- the robotic arm has an end effector configured to mate with a fuel stem on the vehicle and to supply fuel to the fuel stem.
- the fueling system further includes a vision system including first and second cameras for iteratively determining a three-dimensional position of the fuel stem relative to a predetermined coordinate system. The position is preferably a center point of the fuel stem lying on a plane defined by the sealing surface (typically an outer edge) of the fuel stem.
- the fueling system includes a robot controller configured to command the carriage to move proximate to the position of the fuel stem and to move the end effector to the three-dimensional position to mate with the fuel stem.
- the robot controller is further configured to move the robotic arm relative to the first axis at a speed substantially equal to a speed of the conveyer line.
- a method for fueling an automotive vehicle on a conveyer line utilizes a robotic arm with an end effector for fueling a fuel stem of the vehicle.
- the method includes determining a first position of the fuel stem along a first axis.
- the method further includes moving the robotic arm to position the end effector proximate to the first position.
- the method further includes determining a three-dimensional position of the fuel stem utilizing a vision system.
- the method further includes moving the end effector proximate to the three-dimensional position to enable the end effector to mate with the fuel stem and moving the robotic arm relative to the first axis at a speed substantially equal to a speed of the conveyer line.
- the method includes fueling the fuel stem with the end effector.
- a further method for providing position data to a controller is also provided.
- the method provides position data to the controller to enable a robotic arm controlled by the controller to move to a position of an object.
- the method utilizes first and second cameras disposed at first and second camera coordinate systems, respectively.
- the method in a preferred embodiment includes simultaneously acquiring a first digital image and a second digital image of a workspace including the object utilizing the first camera and the second camera, respectively.
- the method further includes searching the first digital image with a first fuel stem template to determine a location in the first image where a first correlation score between that portion of the image being searched and the template is greater than a predetermined threshold.
- the method further includes searching the second digital image with a second fuel stem template to determine a location in the second image where a second correlation score between that portion of the second image being searched and the template is greater than a predetermined threshold.
- the method further includes calculating a three-dimensional position of the object with respect to a predetermined coordinate system. The position is calculated responsive to the locations in the first and second digital images where the correlation scores were above the threshold.
- the method further includes calculating a triangulation error of the position.
- the method further includes transferring position data indicative of the calculated position to the controller when the fuel stem matches are found in both images and the triangulation error is less than a threshold error value.
- the fueling system for fueling an automotive vehicle on a conveyer line and the method related thereto represent a significant improvement over conventional fueling systems and methods.
- the inventive fueling system allows the automatic fueling of a fuel stem without the need for an operator, resulting in labor savings.
- the inventive fueling system can track the position of the fuel stem on a moving vehicle (on a conveyer line) and move an end effector to mate with the fuel stem.
- the vehicle can be fueled without stopping the conveyer line resulting in decreased manufacturing costs and increased manufacturing efficiency.
- FIG. 1 is a schematic of a known fueling system for an automotive vehicle on a conveyer line.
- FIG. 2 is a schematic of a fueling system in accordance with the present invention.
- FIG. 3 is a schematic of a gantry and a robotic arm shown in FIG. 2.
- FIG. 4 is a side view of the gantry and the robotic arm shown in FIG. 3.
- FIG. 5 is an enlarged perspective view of the robotic arm shown in FIG. 3.
- FIG. 6 is a side view of the robotic arm in its kinematic zero position (all joint variables equal to zero).
- FIG. 7 is a schematic showing the coordinate systems utilized to control the position of the robotic arm.
- FIG. 8 is an exploded schematic of an end effector of the robotic arm shown in FIG. 5.
- FIG. 9 is block diagram of a robotic control system utilized by the inventive fueling system.
- FIGS. 10 A- 10 C are schematics illustrating the tracking variables utilized for positioning the robotic arm.
- FIG. 10D is a tracking equation utilized to position the robotic arm along the J1 axis.
- FIG. 11 is a schematic illustrating a triangulation technique utilized by the vision system to determine the three dimensional position of the fuel stem.
- FIG. 12 is a schematic illustrating a two-dimensional digital image coordinate system.
- FIG. 13 is a schematic illustrating a two-dimensional camera sensor coordinate system.
- FIG. 14 is a schematic illustrating a three-dimensional camera coordinate system.
- FIG. 15 is a schematic illustrating equations that define the intrinsic camera model.
- FIG. 16 is a schematic illustrating a force vector applied to the end effector.
- FIG. 17 is a schematic illustrating a Jacobian Transpose relationship utilized for force feedback control of the robotic arm.
- FIG. 18 is a schematic illustrating an inverse kinematics model utilized by the robot control system to position and orient an end effector.
- FIG. 19 is a schematic of a fuel stem.
- FIG. 20 is a schematic of a docking path of an end effector with a fuel stem.
- FIGS. 21 A- 21 G are flowcharts illustrating the modes of operation for the fueling system in accordance with the present invention.
- FIG. 2 illustrates a fueling system 24 for fueling an automotive vehicle 26 on a conveyer line 28 .
- the vehicle 26 is on a carriage 30 moving on the conveyer line 28 .
- the inventive fueling system 24 may be configured to operate with any known type of conveyer line, including for example, overhead conveyer lines (see FIG. 1) and floor mounted conveyer lines.
- An advantage of the fueling system 24 is that the vehicle 26 may be fueled without stopping the vehicle 26 on the conveyer line 28 .
- the inventive fueling system 24 includes a gantry 32 , a robotic arm 34 , and a robot control system 36 .
- the gantry 32 is provided to move the robotic arm 34 substantially parallel to the conveyer line 28 .
- the gantry 32 includes a frame 38 , a carriage 40 , and a motor 42 .
- the frame 38 may be supported by legs (not shown) or may be mounted to ceiling supports.
- the motor 42 is mounted on the gantry 32 and is operatively connected to a drive belt (not shown) that is connected to the carriage 40 .
- rotational movement of a rotor (not shown) of the motor 42 causes the drive belt to move the carriage 40 along the J1 axis in either a forward direction (to the right in FIG. 2) or a backward direction (to the left in FIG. 2).
- the motor 42 is electrically connected to a motor driver 44 (see FIG. 9) that is controlled by the robot controller 46 .
- the controller 46 can selectively control the position of the carriage 40 and the robotic arm 34 along a J1 axis.
- the motor 42 further includes an internal encoder 48 (not shown in FIG. 1).
- the encoder 48 generates a position value J1_VAL indicative of the position of the carriage 40 that is received by the robot controller 46 .
- the robotic arm 34 is provided to fuel the automotive vehicle 26 .
- the robotic arm 34 is connected to the carriage 40 and moves with the carriage 40 along the J1 axis.
- prior to fueling the vehicle 26 , the robotic arm 34 is rotated to a desired angle about the J2 axis which matches the orientation of a fuel stem 90 of the vehicle 26 in the plane of J2 (see FIG. 10A).
- the robotic arm 34 is automatically rotated to the desired J2 angle.
- the J2 angle could be manually fixed.
- the robotic arm 34 includes a manipulator arm 50 , a pitch arm 52 , an end effector 54 , a camera arm 56 , and a light 58 .
- the manipulator arm 50 is provided to support the pitch arm 52 and the end effector 54 .
- the manipulator arm 50 includes a frame 60 , a motor 62 , a linear actuator 64 , and a joint 66 .
- the frame 60 is connected to a first end of the pitch arm 52 via the joint 66 .
- the motor 62 is provided to drive the linear actuator 64 to thereby move the pitch arm 52 about the J3 axis.
- the linear actuator 64 converts rotary movement of a rotor (not shown) of the motor 62 into linear movement of the upper end of a rod 68 .
- the lower end of the rod 68 is connected to the pitch arm 52 via a pin joint 70 .
- linear actuation of the upper end of the rod 68 causes rotational motion of the pitch arm 52 about the J3 axis.
- the motor 62 is electrically connected to a motor driver 72 that is controlled by the robot controller 46 .
- the controller 46 can selectively control the position of the linear actuator 64 to move the pitch arm 52 to a desired rotational angle about the J3 axis.
- an encoder 74 is operatively mounted on the joint 66 .
- the encoder 74 generates a position value θ3_VAL indicative of a rotational angle of the pitch arm 52 about the J3 axis that is received by the robot controller 46 .
- an encoder 76 is operatively mounted between the carriage 40 and the manipulator arm 50 .
- the encoder 76 generates a position value θ2_VAL indicative of a rotational angle of the manipulator arm 50 about the J2 axis that is received by the robot controller 46 .
- the pitch arm 52 is provided to support the end effector 54 .
- the pitch arm 52 includes an L-shaped frame 78 , a pneumatic cylinder 80 , and a joint 82 . As illustrated, a second end of the pitch arm 52 is connected to the end effector 54 via the joint 82 .
- the pneumatic cylinder 80 is mounted on the pitch arm 52 and has a cylinder rod 84 connected to the end effector 54 . Further, the cylinder 80 is connected to a pneumatic control valve 86 (see FIG. 9) which is selectively controlled by the robot controller 46 .
- the valve 86 may be a dual solenoid three-position open or closed center valve having an extend solenoid (not shown) and a retract solenoid (not shown).
- the cylinder 80 When the extend solenoid is energized (and the retract solenoid is de-energized), the cylinder 80 extends the rod 84 to rotate the end effector 54 to a desired rotational angle about the J4 axis as defined by an adjustable hard stop.
- the hard stop connected to the cylinder rod is adjusted prior to moving the actuator (which only makes full motion movements). This may be done manually, but is preferably done automatically.
- This method of controlling the joint position takes advantage of the speed, power, and safety (i.e., in proximity to gasoline) of a pneumatic actuator while avoiding the difficulty in mid-positioning pneumatic actuators, and particularly avoiding the lack of stiffness exhibited by a mid-positioned pneumatic actuator.
- the hard stop is preferably moved by a low power electric or pneumatic motor using a self-locking mechanism such as a worm gear drive or a self-locking lead screw so that the power required to move the stop is less than the power required to move the joint.
- the end effector 54 is provided to mate with the fuel stem 90 of the automotive vehicle 26 .
- the end effector 54 includes a rodless cylinder 92 , a mounting bracket 94 , a guide 96 , a manifold 98 , a fuel hose 100 , a boot 102 , a front housing 104 , load cells 106 , 108 , 110 , 112 , pneumatic cylinder 114 , fuel valves 116 , 118 , elbows 120 , 122 , 124 , 126 , 128 , 130 , pipe tees 132 , 134 , 136 , 138 , check valves 140 , 142 , vapor fittings 144 , 146 , 148 , 150 , and a position potentiometer 152 .
- the rodless cylinder 92 is provided to move the mounting bracket 94 and the remaining components of the end effector 54 along the J5 axis.
- the cylinder 92 is provided to move the boot 102 against the fuel stem 90 .
- the cylinder 92 has a slidable plate 154 that may be extended or retracted along the J5 axis. Referring to FIGS. 8 and 9, the cylinder 92 may be operatively connected to a pneumatic control valve 156 that is controlled by the robot controller 46 . As shown, the plate 154 is attached to the mounting bracket 94 .
- the mounting bracket 94 is provided to support the remaining components of the end effector 54 (excluding the rodless cylinder 92 ). As shown, the guide 96 is attached to bracket 94 to allow axial movement of the manifold 98 and the fuel hose 100 relative to the bracket 94 and the boot 102 .
- the manifold 98 is provided to direct fuel from either fuel valve 116 or fuel valve 118 through the fuel hose 100 . Further, the pneumatic cylinder 114 is provided to selectively move the manifold 98 (discussed in greater detail below) along the guide 96 relative to the bracket 94 .
- the fuel valve 116 may receive fuel (e.g., gasoline) from the elbow 122 which is connected to a first fuel line (not shown). When the valve 116 is open, fuel is supplied to the manifold 98 and to the fuel hose 100 . When the fuel valve 116 is closed, the fuel may be recirculated through elbow 120 for cooling by cooling equipment (not shown). The valve 116 is selectively controlled by the robot controller 46 .
- the fuel valve 118 may receive fuel (e.g., diesel fuel) from the elbow 126 which is connected to a second fuel line (not shown). When the valve 118 is open, fuel is supplied to the manifold 98 and to the fuel hose 100 . When the fuel valve 118 is closed, no fuel is supplied to the manifold 98 .
- the valve 118 is also selectively controlled by the robot controller 46 .
- the boot 102 is provided to mate with the fuel stem 90 during fueling of the stem 90 .
- the boot 102 may be constructed of a resilient material such as rubber or plastic and includes a hollow body portion 103 and a boot seal 101 .
- the boot seal 101 is configured to seal against a top edge of the fuel stem 90 to prevent fuel and fuel vapor from escaping from the stem 90 during fueling.
- the boot 102 is attached to the front housing 104 .
- the boot is also electrically connected to the front housing, and provides a conductive path which grounds the robot to the vehicle upon insertion.
- the fuel hose 100 is provided to be inserted into the fuel stem 90 beneath the no-lead insert (not shown) after the boot 102 has mated with the stem 90 .
- the fuel hose 100 is constructed of a flexible plastic and extends from an outlet (not shown) on the manifold 98 and through the housing 104 and the boot 102 .
- the pneumatic cylinder 114 is provided to move the manifold 98 and the fuel hose 100 along the J5 axis to insert the hose 100 into the stem 90 .
- the cylinder 114 is operatively connected to a pneumatic control valve 158 that is selectively controlled by the robot controller 46 .
- the check valves 140 , 142 are provided to purge manifold 98 and fuel hose 100 of fuel. When both fuel valves 116 , 118 are closed, an air supply (not shown) may apply air through both of the check valves 140 , 142 to force any fuel remaining in the manifold 98 and the fuel hose 100 into the fuel stem 90 .
- vapor recovery fittings 144 , 146 , 148 , 150 , pipe tee 138 , and elbow 130 may be connected together for fuel vapor recovery as known by those skilled in the art.
- the load cells 106 , 108 , 110 , 112 are provided to measure a force applied to the end effector 54 by the fuel stem 90 during fueling.
- the load cells 106 , 108 , 110 , 112 are conventional in the art and are mounted between the front housing 104 and the mounting bracket 94 .
- the front housing 104 transmits force exerted on the boot 102 to the load cells 106 , 108 , 110 , 112 .
- the load cells 106 , 108 , 110 , 112 are electrically connected to the robot controller 46 .
- the controller 46 may receive signals generated by the load cells 106 , 108 , 110 , 112 and calculate a force vector for force feedback control of the robotic arm 34 during fueling.
- the end effector 54 is rotated about the joint 82 to a calculated rotation angle θ4 .
- the rodless cylinder 92 then extends the remaining components of the end effector 54 along the J5 axis to mate the boot 102 with the fuel stem 90 .
- the pneumatic cylinder 114 extends the fuel hose 100 along the J5 axis to enter the fuel stem 90 beneath the no-lead insert (not shown).
- the camera arm 56 is provided to house cameras 182 , 184 . As shown, the camera arm 56 is attached to a side of the manipulator arm 50 .
- the light 58 is provided to illuminate the fuel stem 90 on the vehicle 26 to allow the vision system 164 to recognize the fuel stem 90 regardless of ambient lighting conditions.
- the light 58 may comprise a fluorescent metal halide fixture (e.g., a Class I Div. 2 rated fixture in a constructed embodiment).
- a coordinate system CS 0 represents a home position of the carriage 40 .
- the origin of the coordinate system CS 0 lies on the J1 axis.
- a coordinate system CS 1 is located on the carriage 40 and accordingly moves with the carriage 40 .
- the coordinate system CS 1 is utilized extensively for tracking the fuel stem 90 along the J1 axis, as discussed in greater detail below.
- the coordinate system CS 2 is utilized for positioning the robotic arm 34 about the J2 axis.
- Coordinate system CS 2 is also the reference coordinate system for the vision system, which uses it to locate and orient the camera coordinate systems and to report object position and orientation vectors.
- the coordinate system CS 3 is utilized for positioning the pitch arm 52 about the J3 axis.
- the coordinate system CS 4 is utilized for positioning the end effector 54 about the J4 axis.
- the coordinate system CS 5 is utilized for positioning the boot 102 along the J5 axis.
- the robot control system 36 includes a light sensor 160 , a conveyer line encoder 162 , a robot controller 46 , a vision system 164 , encoders 48 , 74 , 76 , 88 , the position potentiometer 152 , load cells 106 , 108 , 110 , 112 , motor drivers 44 , 72 , control valves 86 , 156 , fuel valves 116 , 118 , and the fuel hose extend valve 158 .
- the light sensor 160 is provided to detect when the vehicle 26 enters a vehicle fueling area on an assembly line.
- the light sensor 160 is conventional in the art and includes a light transmitter and a light receiver in package 166 to be used with a corresponding reflector 168 .
- the package 166 is oriented on a first side of the conveyer line 28 to project a light beam, in one embodiment, across the conveyer line 28 .
- the reflector 168 is oriented and positioned on an opposite side of the conveyer line 28 to receive the light beam.
- the vehicle 26 blocks the light beam from being reflected by the reflector 168 .
- in response, the receiver in package 166 generates a trigger signal V TRIG that is received by the robot controller 46 .
- the position of the light sensor 160 is designated as the light sensor trigger position hereinafter (see FIG. 10A).
- V TRIG could be generated by any other sufficiently repeatable mechanism (e.g., mechanical switch, ultrasonic sensor, bar code reader, etc.) to reliably indicate the presence of the next vehicle to the robot controller.
- the conveyer line encoder 162 is provided to generate an encoder count CL_VAL indicative of a current encoder count with respect to the conveyer line 28 .
- the robot controller 46 utilizes two encoder counts to determine a gross displacement distance along the axis 170 of the vehicle 26 (and the fuel stem 90 ) from the light sensor trigger position.
- the robot controller 46 is provided to control the carriage 40 and the robotic arm 34 for fueling the automotive vehicle 26 .
- the robot controller 46 has a communication bus 172 for communicating with the vision controller 174 .
- robot controller 46 also has a communication bus 176 for communicating with a supervisory PLC 178 which may control the operation of the fuel pumping and metering equipment and various safety devices.
- the PLC 178 is not a component of the robot control system 36 and is shown only for illustrative purposes.
- the robot controller 46 is electrically connected to the conveyer line encoder 162 , the light sensor 160 , the load cells 106 , 108 , 110 , 112 , and the vision controller 174 .
- the robot controller 46 receives encoder counts (from the conveyer line encoder 162 ), and the signals V TRIG , V L1 , V L2 , V L3 , V L4 , and a three-dimensional position P e of the fuel stem 90 from the vision controller 174 —to control the position of the robotic arm 34 .
- the method for determining the position of the fuel stem 90 will be explained in greater detail below.
- the controller 46 is also electrically connected to encoders 48 , 74 , 76 , 88 and the position potentiometer 152 .
- the controller 46 receives the joint position values J1_VAL, θ2_VAL, θ3_VAL, θ4_VAL, J5_VAL and is able to calculate a current position of the end effector 54 .
- the controller 46 is also electrically connected to motor drivers 44 , 72 , control valves 86 , 156 , 158 , fuel valves 116 , 118 , and generates control signals to control the foregoing devices.
- the controller 46 further includes a programmable memory for implementing the various modes of operation of the fueling system 24 which will be described in greater detail below.
- the robot controller 46 utilizes an inverse kinematic model of the robotic arm 34 to position the arm 34 .
- the inverse kinematic model is a set of equations that allow the joint variables (i.e., θ3 and J5 ) of the robotic arm 34 to be calculated in order to place the end effector 54 in a desired position and orientation.
- Inverse kinematic models are utilized extensively in robotic applications and can readily be determined by one skilled in the art. Accordingly, the underlying inverse kinematic model equations will not be discussed in any further detail.
- one set of inputs x, y, z represent the desired position P e of the end of boot 102 with respect to a predetermined coordinate system, such as coordinate system CS 1 .
- the inputs x, y, z (i.e., point P e ) are determined by the vision system 164 while tracking the fuel stem 90 and may be stored in the variables CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ in the robot controller 46 .
- the input a x , a y , a z is a unit vector representing the orientation of the fuel stem 90 (see FIG. 20) with respect to the coordinate system CS 1 .
- the orientation of the fuel stem 90 may be a stored value.
- unit vector a x , a y , a z may be stored as VID data in non-volatile memory (e.g., hard disk) of the vision controller and is readily determined by one skilled in the art for a specific fuel stem 90 .
- the DH constants are geometric dimensions relating to the robotic arm 34 and are stored in the non-volatile memory (e.g., hard disk) of the robot controller 46 .
- angles θ2 and θ4 represent desired positions of the robotic arm 34 about the J2 axis and J4 axis, respectively, for fueling the fuel stem 90 .
- the angles θ2 and θ4 may also be stored in the non-volatile memory (e.g., hard disk) of the robot controller 46 or may be calculated from the unit vector a x , a y , a z .
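- For illustration only, the interface to such an inverse kinematic model could be organized as sketched below in C (the language of the program listing appendix). The structure layout and the solver name are assumptions; the patent does not publish the equations of FIG. 18, so the solver is shown only as a declaration rather than an implementation.

```c
/* Hedged sketch of the inverse kinematic model interface.  The structure
 * layout and solve_inverse_kinematics() are hypothetical names.             */
typedef struct {
    double x, y, z;        /* desired boot-tip position Pe w.r.t. CS1
                              (CARRIAGE_TO_STEMX/Y/Z)                        */
    double ax, ay, az;     /* unit vector giving the fuel stem orientation   */
    double theta2, theta4; /* desired J2 and J4 angles (stored, or derived
                              from the orientation vector)                   */
    double g, h, k;        /* geometric DH constants of the robotic arm      */
} IkInput;

typedef struct {
    double theta3;         /* commanded pitch-arm angle about the J3 axis    */
    double j5;             /* commanded extension along the J5 axis          */
} IkOutput;

/* Returns 0 on success, nonzero if the target lies outside the workspace.   */
int solve_inverse_kinematics(const IkInput *in, IkOutput *out);
```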
- the vision system 164 is provided to generate a three-dimensional position value (i.e., point P e ) of a fuel stem 90 with respect to a predetermined coordinate system (e.g., CS 1 ).
- the vision system 164 includes the vision controller 174 , the frame grabber 180 , inner camera 182 , and outer camera 184 .
- the vision controller 174 is provided to calculate a three-dimensional position P e of the fuel stem 90 .
- the robot controller 46 may request a position of the fuel stem 90 from the controller 174 via the bus 172 .
- the controller 174 may calculate the position P e and return the position P e to the controller 46 .
- the vision controller 174 may calculate the position of the fuel stem 90 responsive to two digital images generated by the cameras 182 , 184 —which will be explained in greater detail below.
- the robot controller 46 may also request vehicle identification data (VID) from the vision controller 174 .
- the VID data may be stored in non-volatile memory (e.g., hard disk) of the vision controller 174 .
- a user or an assembly line controller may input a VID number identifying the particular vehicle type, via an input device or a serial bus (not shown), to the robot controller 46 .
- the robot controller 46 may transmit the VID number to the vision controller 174 , which retrieves a VID record from its hard disk.
- the VID record may be transmitted to the robot controller 46 and maintained in a random access memory (RAM) of the vision controller 174 .
- the VID record contains vehicle dependent information used by the robot controller 46 to track the position of the fuel stem 90 .
- the VID record may include the following information:
- a x , a y , a z unit vector defining orientation of the fuel stem 90 with respect to the coordinate system CS 1 ;
- HOME_TO_STEM distance from a home position of the carriage 40 to a position P 0 on the J1 axis (the projection of the fuel stem center onto z0) when the vehicle 26 crosses the light sensor trigger position (see FIG. 10A);
- DESIRED_CARRIAGE_TO_STEMY desired distance measured with respect to y1 from the carriage 40 to a position P 1 on the J1 axis for the end effector 54 to mate with the fuel stem 90 ;
- DIGITAL_IMAGE_TEMPLATE1 digital image template of the fuel stem 90 for the inner camera 182 ;
- DIGITAL_IMAGE_TEMPLATE2 digital image template of the fuel stem 90 for the outer camera 184 ;
- APPROACH_ANGLE the angular offset in the vertical plane between z5 and the stem axis 243 .
- the information contained in the VID record may be determined by one skilled in the art.
- the information is measured and stored by the vision controller 174 and the robot controller 46 .
- the information could be determined using conventional survey equipment.
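- As an illustration of the record just listed, a possible C layout is sketched below. The field types and the template representation are assumptions; only the field names and their meanings come from the text.

```c
/* Illustrative layout for the VID record fields listed above.               */
typedef struct {
    double ax, ay, az;                 /* fuel stem orientation w.r.t. CS1   */
    double home_to_stem;               /* HOME_TO_STEM distance along J1     */
    double desired_carriage_to_stem_y; /* DESIRED_CARRIAGE_TO_STEMY          */
    const unsigned char *template1;    /* DIGITAL_IMAGE_TEMPLATE1, inner cam */
    const unsigned char *template2;    /* DIGITAL_IMAGE_TEMPLATE2, outer cam */
    double approach_angle;             /* APPROACH_ANGLE between z5 and the
                                          stem axis 243                      */
} VidRecord;
```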
- the vision controller 174 may utilize commercially available vision software for template matching to determine if cameras 182 and 184 are viewing the fuel stem 90 .
- the commercially available vision software comprised Matrox Imaging Library, Version 6.0 sold by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada.
- the conventional software searches the first digital image of the workspace, acquired by camera 182 and frame grabber 180 , for a location in which the correlation score with the first digital image template (i.e., DIGITAL_IMAGE_TEMPLATE1) is higher than a pre-determined acceptance level. If such an image location is found then there is a fuel stem match in the first digital image.
- the conventional software searches the second digital image of the workspace, acquired by camera 184 and frame grabber 180 , for a location in which the correlation score with the second digital image template (i.e., DIGITAL_IMAGE_TEMPLATE2) is higher than a pre-determined acceptance level. If such an image location is found then there is a fuel stem match in the second digital image. If there is a fuel stem match in both digital images, i.e., if both cameras 182 , 184 are viewing the fuel stem 90 , then the vision controller 174 further proceeds to calculate a three-dimensional point P e corresponding to the center of the fuel stem.
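- The searches above reduce to finding where a correlation score between a stored template and an image region exceeds an acceptance level. The sketch below is a plain normalized cross-correlation search written for illustration only; it is not the commercial library identified above, and the function names are hypothetical.

```c
#include <math.h>

/* Toy normalized cross-correlation over row-major grayscale images.         */
static double ncc_at(const unsigned char *img, int iw,
                     const unsigned char *tpl, int tw, int th,
                     int u, int v)
{
    double mean_i = 0.0, mean_t = 0.0;
    int n = tw * th;
    for (int y = 0; y < th; y++)
        for (int x = 0; x < tw; x++) {
            mean_i += img[(v + y) * iw + (u + x)];
            mean_t += tpl[y * tw + x];
        }
    mean_i /= n;  mean_t /= n;

    double num = 0.0, di = 0.0, dt = 0.0;
    for (int y = 0; y < th; y++)
        for (int x = 0; x < tw; x++) {
            double a = img[(v + y) * iw + (u + x)] - mean_i;
            double b = tpl[y * tw + x] - mean_t;
            num += a * b;  di += a * a;  dt += b * b;
        }
    return (di > 0.0 && dt > 0.0) ? num / sqrt(di * dt) : 0.0;
}

/* Scans the image, returns the best correlation score and its location; the
 * caller compares the score against the predetermined acceptance level.     */
double find_best_match(const unsigned char *img, int iw, int ih,
                       const unsigned char *tpl, int tw, int th,
                       int *best_x, int *best_y)
{
    double best = -1.0;
    *best_x = 0;  *best_y = 0;
    for (int v = 0; v + th <= ih; v++)
        for (int u = 0; u + tw <= iw; u++) {
            double s = ncc_at(img, iw, tpl, tw, th, u, v);
            if (s > best) { best = s; *best_x = u; *best_y = v; }
        }
    return best;
}
```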
- the software methodology for calculating the point P e will be explained in greater detail below.
- the frame grabber 180 is provided to simultaneously trigger the cameras 182 , 184 and to simultaneously digitize the first and second analog images acquired by the cameras 182 , 184 , respectively, and to transmit the first and second digital images to the vision controller 174 .
- the frame grabber 180 is conventional in the art and may comprise a multi-channel frame grabber capable of simultaneously triggering and digitizing images coming from at least two monochrome progressive scan analog cameras.
- frame grabber 180 comprises a Matrox Meteor-II/MC Frame Grabber manufactured by Matrox Electronics Ltd. of Dorval, Quebec, Canada.
- the frame grabber 180 may receive a trigger signal V TR from the vision controller 174 .
- the frame grabber 180 may then generate trigger signals V TRI and V TRO to simultaneously trigger the cameras 182 , 184 to acquire the first and second analog images, respectively, of the workspace 186 .
- the cameras 182 , 184 may simultaneously begin to transfer the first and second analog images to the frame grabber 180 , which may simultaneously digitize them and may further transfer the first and the second digital images to the vision controller 174 .
- the first and second cameras 182 , 184 are used to acquire the first and the second analog images, respectively, of the workspace 186 .
- the cameras 182 , 184 are conventional in the art and may comprise monochrome CCD cameras having a progressive scan capability and a trigger shutter mode capability.
- each of cameras 182 , 184 comprised an XC-55 Progressive Scan Camera Module manufactured by Sony Electronics Inc. of Itasca, Ill.
- the cameras 182 , 184 may acquire first and second analog images of the workspace responsive to the trigger signals V TRI and V TRO , respectively.
- the cameras 182 , 184 are mounted within the camera arm 56 .
- the camera arm 56 is provided to serve a number of functions. It acts as an enclosure which protects the cameras 182 , 184 and their lenses (not shown) from dust, drips, and tampering.
- the camera arm is mounted to the manipulator arm 50 using precision shoulder screws (not shown) to guarantee precise and repeatable alignment. This allows camera arms 56 to be factory calibrated but interchangeable in the field.
- the camera arm is provided to position the cameras 182 , 184 such that their fields of view cover the workspace 186 (see FIG. 4).
- the workspace 186 is defined to be the volume of space containing the various possible positions in which the robotic manipulator 34 and the end effector 54 are capable of servicing the fuel stem 90 .
- the vision controller 174 is able to determine a three-dimensional position P e of the fuel stem 90 .
- the camera model comprises two parts: an intrinsic camera model and an extrinsic camera model.
- the intrinsic camera model specifies the relationship between a point (i.e., point (x i ,y i )) with respect to a digital image coordinate system and a direction in space (i.e., unit vector D i ) with respect to a camera coordinate system CS ci .
- the intrinsic camera model for camera 182 depends on the internal geometric, electronic and optical characteristics of camera 182 and on the electronic characteristics of the frame grabber 180 which will be discussed in greater detail below.
- the extrinsic camera model of each of the cameras 182 , 184 specifies the position and orientation of the camera coordinate system CS c (i.e., coordinate system CS ci for the camera 182 and coordinate system CS co for the camera 184 ) with respect to a predetermined coordinate system that is fixed with respect to the cameras, such as CS 2 .
- the extrinsic camera model parameters are measured in the factory and are stored in non-volatile memory (e.g., hard disk) of the vision controller with respect to the camera arm 56 mounting frame.
- because each extrinsic camera model (a coordinate system transformation matrix) may be readily determined by one skilled in the art, only the intrinsic camera model will be discussed.
- the intrinsic camera model for camera 182 comprises several equations (discussed hereinafter) that are used to determine a direction vector D i pointing in space to the center point P i of the fuel stem 90 .
- the intrinsic camera model utilizes three coordinate systems: a digital image coordinate system, a camera sensor coordinate system, and a camera coordinate system.
- a digital image coordinate system defined by the axes X i and Y i is illustrated.
- the digital image coordinate system is defined by a plurality of rows and columns of pixels 185 .
- the Principal Point (C x , C y ) is the point projected onto the digital image coordinate system that corresponds to the optical axis of a camera lens (not shown) of the camera 182 . Further, the image point (x i , y i ) corresponds to the center of the fuel stem 90 and is generated by the commercially available vision software previously discussed.
- the CCD camera 182 has a sensor plane 188 comprising a plurality of rows and columns of CCD sensor cells 187 .
- the center of the fuel stem 90 is imaged on the sensor point (x s , y s ) (coordinates with respect to the camera sensor coordinate system) and after digitization by the frame grabber 180 , this point corresponds to the image point (x i , y i ) (coordinates with respect to the digital image coordinate system).
- the three-dimensional camera coordinate system CS ci is partially illustrated in two dimensions.
- the camera coordinate system CS ci is defined by the axes X ci , Y ci , Z ci and has an origin O ci that corresponds to the center of a lens (not shown) of the camera 182 .
- the origin O ci is also located on the optical axis of the camera 182 .
- the unit direction vector D i is determined with respect to the camera coordinate system CS ci .
- the parameters d x , d y , N cx , N cy may be supplied by the manufacturer of the camera 182 , and the parameter N ix is a known parameter of the frame grabber 180 .
- the parameters f, S x , C x , C y , K 1 may be readily determined by one skilled in the art utilizing an algorithm set forth in the following publication: R. Y. Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses", IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, which is incorporated by reference in its entirety.
- camera model equations 1-8 are used to map the image point (x i , y i ) with respect to a digital image coordinate system to a sensor point (x s , y s ) with respect to a camera sensor coordinate system and to further calculate an undistorted sensor point (x u , y u ) with respect to the camera sensor coordinate system, from which is calculated a direction vector D i with respect to the camera coordinate system CS ci .
- equations (1), (2), (7), and (8) are used to map the image point (x i , y i ) to the sensor point (x s , y s )—where light from a point P i of the fuel stem 90 is projected.
- sensor point (x s , y s ) is shown on the sensor plane 188 .
- the equation (3) is utilized to calculate a radial distortion factor k for the camera 182 to compensate for the radial distortion of camera 182 .
- equations (4) and (5) are utilized to calculate the undistorted sensor point (x u , y u ) (coordinates with respect to the camera sensor coordinate system) utilizing the radial distortion factor k.
- the point (x u , y u ) is shown on the sensor plane 188 .
- the preferred embodiment uses an improved version of Tsai's algorithm that was developed by Reg Willson and that is freely available on the Internet at the following URL: http://www.cs.cmu.edu/afs/cs.cmu.edu/user/rgw/www/TsaiCode.html, as also shown in the computer program listing appendix on a compact disc submitted herewith.
- the equation ( 6 ) is used to calculate the unit vector D i which is centered at the origin O ci and points toward the center point P i of the fuel stem 90 .
- the numerator of the equation (6) represents a vector pointing toward the center point P i .
- the denominator of equation (6) represents the magnitude of the vector D i .
- the vector D i represents a unit vector.
- the foregoing camera model equations are valid if the following criteria regarding the camera 182 , the lens (not shown) and the frame grabber 180 are true.
- the optical axis of the camera 182 is presumed to be perpendicular to the sensor plane 188 .
- the lens is focused so that an object in the camera workspace 186 is in focus.
- the focus of the lens is fixed. In other words, a zoom lens is not used.
- the digitization of a digital image by the frame grabber 180 is accurate.
- the intrinsic camera model for camera 184 may also be defined by the equations (1)-(8) utilizing the ten parameters f, S x , C x , C y , K 1 , d x , d y , N cx , N cy , N ix determined for the camera 184 .
- the camera 184 may have an origin O co with a coordinate system CS co .
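- Pulling the intrinsic model together, the sketch below follows Tsai's published formulation using the ten parameters listed above. The patent's exact equations (1)-(8) appear only in FIG. 15, so details such as the sign convention of the distortion correction are assumptions rather than a reproduction of that figure.

```c
#include <math.h>

/* Hedged sketch of a Tsai-type intrinsic camera model.                      */
typedef struct {
    double f;          /* effective focal length                             */
    double sx;         /* horizontal scale factor                            */
    double cx, cy;     /* principal point in the digital image (pixels)      */
    double k1;         /* first-order radial distortion coefficient          */
    double dx, dy;     /* center-to-center spacing of the CCD sensor cells   */
    double ncx, ncy;   /* number of sensor cells per row / per column        */
    double nix;        /* pixels per image row sampled by the frame grabber  */
} IntrinsicModel;

/* Maps an image point (xi, yi) to a unit direction vector d[3], expressed in
 * the camera coordinate system and pointing toward the observed point Pi.   */
void image_point_to_direction(const IntrinsicModel *m,
                              double xi, double yi, double d[3])
{
    /* image point -> sensor point (the role of equations (1),(2),(7),(8))   */
    double dpx = m->dx * m->ncx / m->nix;       /* effective cell width      */
    double xs  = dpx * (xi - m->cx) / m->sx;
    double ys  = m->dy * (yi - m->cy);

    /* radial distortion compensation (the role of equations (3)-(5))        */
    double r2 = xs * xs + ys * ys;
    double k  = 1.0 + m->k1 * r2;
    double xu = xs * k;
    double yu = ys * k;

    /* unit direction vector (the role of equation (6))                      */
    double mag = sqrt(xu * xu + yu * yu + m->f * m->f);
    d[0] = xu / mag;
    d[1] = yu / mag;
    d[2] = m->f / mag;
}
```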
- an observer first establishes a stereo match of the object and then uses triangulation to determine a distance to the object with respect to the observer.
- the vision system 164 works in a similar manner.
- the conventional vision software within the vision controller 174 performs image template matching and generates first and second image match locations.
- hereinafter, the first image match location is referred to as the inner image location and the second image match location as the outer image location.
- the conventional vision software determines an inner image point (i.e., x i , y i ) and an outer image point (not shown) that are representative of the center point of the fuel stem 90 in the first and second digital images, respectively.
- the inner image point is utilized to calculate the direction vector D i , and the outer image point is utilized to calculate the direction vector D o .
- both direction vectors may be calculated utilizing the equations (1)-(8) discussed above.
- the two direction vectors D i and D o are utilized in conjunction with the extrinsic camera models of the cameras 182 , 184 to triangulate an estimated position P e of the fuel stem 90 with respect to the coordinate system CS 2 .
- This position with respect to CS 2 is transmitted by the vision controller 174 to the robot controller 46 which transforms it with respect to CS 1 .
- the vision controller 174 calculates an inner triangulation point P i along the line ⁇ O ci , D i ⁇ that is the closest point to the line ⁇ O co , D o ⁇ . The controller 174 then calculates an outer triangulation point P o that is the closest point on the line ⁇ O co , D o ⁇ to the inner triangulation point P i . Finally, the controller 174 calculates the midpoint between the inner triangulation point P i and the outer triangulation point P o , which is the triangulated point P e . As previously discussed, the triangulated point P e is the estimated position of a center point of the fuel stem 90 with respect to the coordinate system CS 2 .
- the vision controller 174 calculates a triangulation error, which is the distance between points P e and P i . If the triangulation error is less than a threshold error value, the vision controller 174 transmits the position P e to the robot controller 46 . In particular, the position P e may be transmitted to the controller 46 utilizing the following variables: CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ.
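- The midpoint construction described above amounts to finding the mutually closest points of two rays. A self-contained sketch of that calculation, assuming D i and D o are unit vectors as produced by the intrinsic camera model, is shown below; the helper names are illustrative only.

```c
#include <math.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 v_add(Vec3 a, Vec3 b) { Vec3 r = {a.x+b.x, a.y+b.y, a.z+b.z}; return r; }
static Vec3 v_sub(Vec3 a, Vec3 b) { Vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }
static Vec3 v_scale(Vec3 a, double s) { Vec3 r = {a.x*s, a.y*s, a.z*s}; return r; }
static double v_dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static double v_dist(Vec3 a, Vec3 b) { Vec3 d = v_sub(a, b); return sqrt(v_dot(d, d)); }

/* Pi is the point on line {Oi, Di} closest to line {Oo, Do}; Po is the point
 * on line {Oo, Do} closest to Pi; Pe is their midpoint and *err receives the
 * triangulation error |Pe - Pi|.  Di and Do must be unit vectors.           */
Vec3 triangulate(Vec3 Oi, Vec3 Di, Vec3 Oo, Vec3 Do, double *err)
{
    Vec3 w = v_sub(Oi, Oo);
    double b = v_dot(Di, Do);
    double d = v_dot(Di, w);
    double e = v_dot(Do, w);
    double denom = 1.0 - b * b;             /* zero only if the rays are parallel */

    double t = (denom != 0.0) ? (b * e - d) / denom : 0.0;
    Vec3 Pi = v_add(Oi, v_scale(Di, t));    /* inner triangulation point          */

    double s = v_dot(Do, v_sub(Pi, Oo));
    Vec3 Po = v_add(Oo, v_scale(Do, s));    /* outer triangulation point          */

    Vec3 Pe = v_scale(v_add(Pi, Po), 0.5);  /* estimated fuel stem center         */
    *err = v_dist(Pe, Pi);                  /* triangulation error                */
    return Pe;
}
```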
- force feedback control may be utilized to correct the position of the robotic arm 34 .
- the end effector 54 may be moved to reduce the force vector to a desired magnitude.
- the desired magnitude may comprise a zero value or a non-zero value depending upon the amount of force needed to seal the end effector 54 against the fuel stem 90 .
- a boot 102 of the end effector 54 is illustrated. Further, the coordinate system CS 5 is shown on the boot 102 which corresponds to a contact point of a fuel stem 90 with the boot 102 .
- a force vector FV is applied to the boot 102 with force components f x5 , f y5 and f z5 along the X5, Y5 and Z5 axes, respectively.
- the Y5 axis extends outwardly from the page.
- the undesirable force f y5 results from an error in position for the robotic arm 34 along the J1 axis.
- the undesirable force f x5 results from an error in position of the robotic arm 34 about the J3 axis.
- the load cells 106 , 108 , 110 , 112 may be utilized to determine a force vector applied to the front housing 104 . Thereafter, the force vector FV and the component forces f x5 and f y5 may be readily determined utilizing simple force vector equations well known to those skilled in the art.
- the forces f x5 , f y5 and f z5 applied to the boot 102 create (i) an undesirable force f 1 on the carriage 40 along the J1 axis and (ii) an undesirable torque τ3 on the pitch arm 52 about the J3 axis.
- the force f 1 and the torque τ3 may be calculated utilizing a Jacobian Transpose equation.
- the Jacobian Transpose equation utilizes (i) the geometric DH constants g, h, k of the robotic arm 34 (see FIG.
- the force f 1 and the torque τ3 are utilized to calculate displacement error values for the carriage 40 along the J1 axis and the pitch arm 52 about the J3 axis. Because the boot 102 is compliant, it can be modeled as a spring utilizing the spring equation F = k·d, which relates an error in force or torque (i.e., f 1 and τ3 ) to an error in displacement (i.e., displacement d).
- the force f 1 is input into a first PID controller (implemented in software) that calculates a displacement error value ΔJ1 for the carriage 40 on the J1 axis.
- the torque τ3 is input into a second PID controller that calculates an angular displacement error value Δθ3 for the pitch arm 52 about the J3 axis.
- the values ΔJ1 and Δθ3 may be utilized by the robot controller 46 to correct the position of the carriage 40 and the pitch arm 52 , respectively, to reduce the undesirable forces applied to the end effector 54 during fueling of the fuel stem 90 .
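- The force-feedback step can be summarized in code as follows. The force f 1 and torque τ3 are assumed to have been obtained already from the load cell readings via the Jacobian Transpose of FIG. 17, which is not reproduced in the text; the PID form and gains below are a generic textbook formulation, not the patent's tuned controller.

```c
/* Generic PID state and update step used for illustration only.             */
typedef struct {
    double kp, ki, kd;      /* proportional, integral, derivative gains      */
    double integral;        /* accumulated error                             */
    double prev_err;        /* error from the previous control cycle         */
} Pid;

static double pid_step(Pid *c, double error, double dt)
{
    c->integral += error * dt;
    double deriv = (error - c->prev_err) / dt;
    c->prev_err = error;
    return c->kp * error + c->ki * c->integral + c->kd * deriv;
}

/* One control cycle of the force feedback loop: drive the measured force and
 * torque toward the desired (sealing) values by commanding small corrections
 * dJ1 along the J1 axis and dTheta3 about the J3 axis.                       */
void force_feedback_correction(Pid *pid_j1, Pid *pid_j3,
                               double f1, double tau3,
                               double f1_desired, double tau3_desired,
                               double dt,
                               double *dJ1, double *dTheta3)
{
    *dJ1     = pid_step(pid_j1, f1   - f1_desired,   dt); /* FORCE_FEEDBACK_ERROR */
    *dTheta3 = pid_step(pid_j3, tau3 - tau3_desired, dt); /* correction about J3  */
}
```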
- the robot controller 46 utilizes three types of sensor feedback to position the robotic arm 34 along the J1 axis.
- the types of sensor feedback include a line encoder feedback, a vision system feedback, and a force feedback.
- the line encoder feedback includes encoder counts from the conveyer line encoder 162 that are utilized to determine a gross position of the fuel stem 90 .
- the encoder counts are also indicative of the speed of the conveyer line 28 .
- the vision system feedback includes a three-dimensional position P e of the fuel stem 90 with respect to the coordinate system CS 1 .
- the force feedback includes a measured force exerted on the end effector 54 by the fuel stem 90 during fueling of the fuel stem 90 .
- the robot controller 46 moves the robotic arm 34 along the J1 axis using the tracking equation (9).
- the equation (9) utilizes a STEM_DISPLACEMENT variable that corresponds to the distance that the fuel stem 90 (and the vehicle 26 ) have traveled along conveyer axis 170 since the vehicle 26 passed the light sensor trigger position.
- the STEM_DISPLACEMENT variable is calculated using the following equation:
- STEM_DISPLACEMENT = (current encoder count − first encoder count) * CF, wherein:
- first encoder count = the encoder count from the conveyer line encoder 162 when the V TRIG signal is generated; and
- CF = a conversion factor for converting an encoder count to a distance along the conveyer line axis 170 .
- the STEM_DISPLACEMENT variable is updated to a new value whenever the current encoder count is updated by the conveyer line encoder 162 .
- the equation (9) also utilizes the DESIRED_CARRIAGE_TO_STEMY constant obtained from the VID record.
- the DESIRED_CARRIAGE_TO_STEMY constant represents the desired distance along the J1 axis from the origin of coordinate system CS 1 (on the carriage 40 ) to a point P 1 directly across from the fuel stem 90 —to allow the end effector 54 to mate with the fuel stem 90 .
- the equation (9) also utilizes the HOME_TO_STEM constant obtained from the VID record.
- the HOME_TO_STEM constant represents the distance along the J1 axis from the origin of coordinate system CS 0 to a point P 0 directly across from the fuel stem 90 —when the vehicle 26 passes the light sensor trigger position.
- the equation (9) also utilizes a VISION_TRACKING_ERROR variable.
- the vision system 164 calculates a three-dimensional position P e of the fuel stem 90 with respect to a coordinate system CS 1 . Further, the vision system 164 transfers the position P e to the robot controller 46 using the following variables: CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ. Because the CARRIAGE_TO_STEMY variable represents a current distance from the carriage 40 (and coordinate system CS 1 ) to the fuel stem 90 along the J1 axis, the variable can be used to calculate the VISION_TRACKING_ERROR. In particular, the VISION_TRACKING_ERROR variable may be calculated using the following equation:
- VISION_TRACKING_ERROR = (CARRIAGE_TO_STEMY − DESIRED_CARRIAGE_TO_STEMY)
- when the carriage 40 (and coordinate system CS 1 ) is perfectly positioned along the J1 axis to allow engagement of the end effector 54 and the fuel stem 90 , the CARRIAGE_TO_STEMY distance returned by the vision system 164 is equal to the DESIRED_CARRIAGE_TO_STEMY and the VISION_TRACKING_ERROR value equals a zero value.
- when the carriage 40 (and coordinate system CS 1 ) are positioned too far in front of the fuel stem 90 along the J1 axis to allow engagement of the end effector 54 and the fuel stem 90 , the VISION_TRACKING_ERROR is equal to a negative number (i.e., a negative value along the Y1 axis) which decreases the COMMANDED_J1_POSITION value. In response, the carriage 40 and the robotic arm 34 are moved to a position along the J1 axis corresponding to the new COMMANDED_J1_POSITION.
- when the carriage 40 (and coordinate system CS 1 ) are positioned too far behind a desired position on the J1 axis to allow engagement of the end effector 54 and the fuel stem 90 , the VISION_TRACKING_ERROR is equal to a positive number (i.e., a positive value along the Y1 axis) which increases the COMMANDED_J1_POSITION variable. In response, the carriage 40 and the robotic arm 34 are moved to a position along the J1 axis corresponding to the new COMMANDED_J1_POSITION.
- the equation (9) also utilizes a FORCE_FEEDBACK_ERROR variable.
- the robot controller 46 monitors a force exerted on the robotic arm 34 by the fuel stem 90 during the fueling of the stem 90 . Further, the controller 46 calculates a displacement error value ΔJ1 to correct the position of the carriage 40 along the J1 axis responsive to the force. Note that the force control also compensates for misalignments in the vertical direction using the J3 axis.
- during force feedback control, the FORCE_FEEDBACK_ERROR variable is set equal to the calculated displacement error value ΔJ1 .
- otherwise, the FORCE_FEEDBACK_ERROR is set equal to a zero value.
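- Because FIG. 10D (the tracking equation itself) is not reproduced in this text, the C sketch below only assumes an additive form that is consistent with the terms and sign conventions described above; all names are hypothetical.

```c
/* Assumed (not verbatim) form of tracking equation (9), combining the terms
 * described above. vision_tracking_error is zero until the vision system 164
 * returns a position, and force_feedback_error is zero outside of fueling. */
double commanded_j1_position(double home_to_stem,
                             double stem_displacement,
                             double desired_carriage_to_stem_y,
                             double vision_tracking_error,
                             double force_feedback_error)
{
    return home_to_stem + stem_displacement - desired_carriage_to_stem_y
           + vision_tracking_error + force_feedback_error;
}
```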
- the modes of operation include a power up mode 190 , an auto mode 192 , and a manual mode 194 .
- the various modes of operation are implemented in software that is stored in the ROM of the robot controller 46 .
- the robot controller 46 and the vision controller 174 establish communication with each other via the communication bus 172 . Further, power is applied to the motor drivers 44 , 72 .
- the robot controller 46 advances to a step 191 that prompts the user of the fueling system 24 to select between the different modes of system operation.
- the modes of operation include auto mode 192 or manual mode 194 .
- the robot controller 46 also provides for the option to shut down the fueling system 24 .
- the modes of operation may be chosen using a GUI and a pointing device (not shown), a push-button based operator panel (not shown), or similar controls on the supervisory PLC 178 .
- the software modules comprising the auto mode 192 are executed to implement the method for fueling a vehicle 26 in accordance with the present invention.
- the auto mode 192 includes a stow module 196 , an idle module 198 , a gross position module 200 , a track fuel stem module 202 , an insert module 204 , a fuel module 206 , a purge fuel module 208 , and an extract module 210 .
- the stow module 196 performs steps to move the carriage 40 to a home position (i.e., origin of the coordinate system CS 0 ) along the J1 axis. Further, the joints 66 , 82 are moved to predetermined stowed positions.
- the idle module 198 includes a step 212 of determining if a vehicle trigger signal V TRIG was received.
- a vehicle trigger signal V TRIG was received.
- the module 198 advances to a step 214 which stores a first encoder count from the conveyer line encoder 162 . Thereafter, the module 198 is exited and the auto mode 192 advances to the gross position module 200 . Alternately, if the signal V TRIG is not received, the module 198 iteratively performs the step 212 until a vehicle 26 is detected by the light sensor 160 .
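- A minimal sketch of the idle module 198 is shown below; vtrig_received() and read_conveyer_encoder() are hypothetical platform hooks standing in for the light sensor 160 and conveyer line encoder 162 interfaces, and are not taken from the patent text or the computer program listing appendix.

```c
/* Hypothetical platform hooks (not part of the patent text). */
extern int  vtrig_received(void);         /* nonzero once V TRIG is seen        */
extern long read_conveyer_encoder(void);  /* current count from the encoder 162 */

/* Sketch of the idle module 198: iterate step 212 until a vehicle 26 is
 * detected, then perform step 214 by latching the first encoder count. */
long idle_module(void)
{
    while (!vtrig_received())
        ;                                 /* step 212 repeats                   */
    return read_conveyer_encoder();       /* step 214: first encoder count      */
}
```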
- the gross position module 200 includes a step 216 of requesting a VID record from the vision controller 174 .
- the VID record contains vehicle dependent information for tracking the fuel stem 90 .
- unit vector a x , a y , a z is obtained from the VID record, and is then used to calculate the optimum J2 and J4 angles.
- the J2 angle and the J4 hard stop are adjusted during the gross position module 200 .
- in an alternate embodiment, the J2 and J4 angles are automatically calculated but are manually adjusted off line prior to automatic operation.
- the module 200 further includes a step 218 which iteratively calculates a gross position of the fuel stem 90 with respect to the J1 axis.
- the step 218 utilizes the tracking equation (9) to calculate the COMMANDED_J1_POSITION which also represents the gross position of the fuel stem 90 .
- both the VISION_TRACKING_ERROR and the FORCE_FEEDBACK_ERROR are equal to a zero value during the step 218 .
- the module 200 further includes a step 220 of moving the robotic arm 34 along the J1 axis to position the end effector 54 proximate to the gross position of the fuel stem 90 . Further, the module 200 moves the robotic arm 34 (and the end effector 54 ) at a speed substantially equal to the speed of the conveyer line 28 . During the step 220 , the cameras 182 , 184 on the robotic arm 34 should be positioned along J1 such that the field of view of the cameras 182 , 184 covers the fuel stem 90 .
- the module 200 further includes a step 222 of triggering the frame grabber 180 .
- the robot controller 46 generates a signal V TR that causes the frame grabber 180 to transfer first and second digital images of the fuel stem 90 from the cameras 182 , 184 to the vision controller 174 .
- the module 200 further includes a step 224 of requesting a three-dimensional position P e of the fuel stem 90 from the vision controller 174 .
- the vision controller 174 performs template matching on the first and second digital images and calculates first and second correlation scores, respectively. Further, the controller 174 returns the position P e if the first and second correlation scores are above a threshold correlation score and a triangulation error of the position P e is below a predetermined triangulation error.
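- The acceptance test described above may be summarized by the following sketch; the parameter names are hypothetical, and the correlation scores and triangulation error are assumed to have already been computed by the template matching and triangulation steps.

```c
#include <stdbool.h>

/* The position Pe is returned only when both template matches are strong
 * enough and the triangulated point is geometrically consistent. */
bool position_is_valid(double first_correlation_score,
                       double second_correlation_score,
                       double triangulation_error,
                       double threshold_correlation_score,
                       double predetermined_triangulation_error)
{
    return first_correlation_score  > threshold_correlation_score &&
           second_correlation_score > threshold_correlation_score &&
           triangulation_error      < predetermined_triangulation_error;
}
```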
- the module 200 further includes a step 226 which determines whether a three-dimensional position P e was received from the vision controller 174 . If the position P e was received, the module 200 is exited and the auto mode 192 advances to the track fuel stem module 202 . Alternately, if the position P e was not received, the module 200 returns to step 222 .
- the auto mode 192 advances to the track fuel stem module 202 after module 200 .
- the module 202 includes a step 228 of triggering the frame grabber 180 to obtain first and second digital images of the fuel stem 90 generated by the cameras 182 , 184 , respectively.
- the module 202 further includes a step 230 that requests a three-dimensional position P e of the fuel stem 90 from the vision controller 174 .
- the module 202 further includes a step 232 which determines whether a three-dimensional position P e of the fuel stem 90 was received by the robot controller 46 . If the position P e was received, the module 202 advances to the step 234 . Otherwise, the module 202 returns to the step 228 .
- the module 202 further includes a step 234 of calculating the VISION_TRACKING_ERROR.
- the VISION_TRACKING_ERROR is utilized in the equation (9) to more accurately position the robotic arm 34 along the J1 axis relative to the fuel stem 90 . It should be understood that the VISION_TRACKING_ERROR is iteratively calculated in the track fuel stem module 202 and the insert module 204 so long as the fuel stem 90 is viewed by both cameras 182 , 184 .
- the module 202 further includes a step 236 which moves the robotic arm 34 along the J1 axis proximate the fuel stem 90 responsive to the COMMANDED_J1_POSITION.
- the auto mode 192 advances to the insert module 204 after the module 202 .
- the module 204 includes the steps 238 , 240 , 242 , 244 , 246 .
- the step 238 calculates joint variables ⁇ 3 and J5 using the inverse kinematic model previously discussed.
- the step 240 moves the pitch arm 52 about the J3 axis to the calculated angle ⁇ 3 .
- the step 242 moves the end effector 54 about the J4 axis to a predetermined angle ⁇ 4 .
- the step 244 moves the boot 102 along the J5 axis a distance J5 to allow the boot 102 to mate with the fuel stem 90 .
- the end effector 54 is positioned to allow the step 244 to move the end effector 54 along a docking line toward the fuel stem 90 .
- the docking line extends from a point P b on the nozzle axis 245 to the fuel stem point P e . Further, the docking line forms an approach angle θ D with respect to the stem axis 243 .
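- For reference, the approach angle θ D could be computed from the docking line direction and the stem axis unit vector a x , a y , a z as sketched below; this is a generic dot product calculation under the stated assumptions, not code taken from the computer program listing appendix.

```c
#include <math.h>

typedef struct { double x, y, z; } vec3;

/* Angle (radians) between the docking line Pb->Pe and the stem axis 243,
 * where a is the unit vector (ax, ay, az) from the VID record. Assumes a
 * has unit length and that Pb differs from Pe. */
double approach_angle(vec3 pb, vec3 pe, vec3 a)
{
    double dx = pe.x - pb.x, dy = pe.y - pb.y, dz = pe.z - pb.z;
    double len = sqrt(dx * dx + dy * dy + dz * dz);
    return acos((dx * a.x + dy * a.y + dz * a.z) / len);
}
```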
- the step 246 moves the fuel hose 100 into the fuel stem 90 .
- the auto mode 192 advances to the fuel module 206 after the module 204 .
- the fuel module 206 includes the steps 248 , 250 , 252 .
- the step 248 calculates the FORCE_FEEDBACK_ERROR which is utilized by the controller 46 to calculate the COMMANDED_J1_POSITION of the carriage 40 (and the robotic arm 34 ) along the J1 axis.
- the step 250 opens a fuel valve 116 in the end effector 54 to supply fuel to the fuel stem 90 .
- the step 252 closes the fuel valve 116 after a predetermined amount of fuel is pumped into the fuel stem 90 .
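- A minimal sketch of the dispensing steps 250 and 252 is shown below; the valve and metering functions are hypothetical hooks, since the fuel pumping and metering equipment is controlled through the supervisory PLC 178 and its interface is not detailed here.

```c
/* Hypothetical I/O hooks standing in for the fuel valve 116 and the fuel
 * metering equipment controlled through the supervisory PLC 178. */
extern void   fuel_valve_open(void);
extern void   fuel_valve_close(void);
extern double metered_fuel_volume(void);    /* volume dispensed so far */

/* Sketch of the fuel module 206: open the fuel valve 116 (step 250), then
 * close it once the predetermined amount has been pumped (step 252). */
void dispense_fuel(double predetermined_amount)
{
    fuel_valve_open();                          /* step 250                 */
    while (metered_fuel_volume() < predetermined_amount)
        ;                                       /* wait for metered amount  */
    fuel_valve_close();                         /* step 252                 */
}
```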
- the auto mode 192 advances to the purge fuel module 208 after the module 206 .
- the purge fuel module 208 performs steps to apply air pressure to the check valves 140 , 142 to force any residual fuel in the manifold 98 and the fuel hose 100 into the fuel stem 90 .
- the auto mode 192 advances to the extract module 210 after the module 208 which extracts the fuel hose 100 from the fuel stem 90 and moves the robotic arm 34 to a predetermined retract position.
- the auto mode 192 after exiting module 210 advances to a step 254 which determines if another vehicle 26 has passed the light sensor trigger position. If another vehicle 26 is detected, the auto mode 192 advances to the gross position module 200 . Otherwise, the auto mode 192 advances to the stow module 196 .
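- The overall sequencing of the auto mode 192 described above might be summarized as in the following sketch; the function names are hypothetical and each module body is elided.

```c
/* Hypothetical module entry points; bodies elided. */
void run_stow(void);                  /* module 196                        */
void run_idle(void);                  /* module 198: wait for V TRIG       */
void run_gross_position(void);        /* module 200                        */
void run_track_fuel_stem(void);       /* module 202                        */
void run_insert(void);                /* module 204                        */
void run_fuel(void);                  /* module 206                        */
void run_purge_fuel(void);            /* module 208                        */
void run_extract(void);               /* module 210                        */
int  another_vehicle_detected(void);  /* step 254 check                    */

/* Sketch of the auto mode 192 sequencing (FIGS. 21A-21G). */
void auto_mode(void)
{
    for (;;) {
        run_stow();
        run_idle();
        do {
            run_gross_position();
            run_track_fuel_stem();
            run_insert();
            run_fuel();
            run_purge_fuel();
            run_extract();
        } while (another_vehicle_detected());   /* step 254: next vehicle?  */
    }
}
```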
- the robot controller 46 places the robotic arm 34 such that the two robot tip markers TM 0 and TM 1 as shown in FIG. 20 are within the field of view of the cameras 182 , 184 .
- the vision controller 174 acquires a first digital image and a second digital image from the cameras 182 , 184 , and the frame grabber 180 , and locates the first and second image locations of the tip marker 0 “TM 0 ” and of the tip marker 1 “TM 1 .”
- Pneumatic cylinder 114 is provided to extend the filler hose 100 through the boot 102 in order to pump fuel into the filler neck from a point below the no-lead insert.
- the cylinder 114 is prevented from reaching its full stroke because the tip of the filler hose 100 is too large to fit through the boot 102 . Should the tip of the hose 100 be lost, however, the cylinder 114 can be fully retracted.
- This configuration allows the robot to sense if the hose 100 has been lost or damaged by simply using two limit switches on the cylinder 114 retract stroke, one of which is triggered when the hose 100 is retracted far enough to reach the boot, and one of which is triggered when the cylinder 114 is fully retracted.
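- A sketch of this two-limit-switch check is given below; the switch-reading functions are hypothetical digital inputs on the retract stroke of the cylinder 114 .

```c
#include <stdbool.h>

/* Hypothetical digital inputs on the retract stroke of the cylinder 114. */
extern bool boot_position_switch(void);  /* hose retracted as far as the boot 102 */
extern bool full_retract_switch(void);   /* cylinder 114 at its full retract stop */

/* An intact hose tip cannot pass through the boot 102, so the full-retract
 * switch can only be reached if the tip of the hose 100 is lost or damaged. */
bool hose_lost_or_damaged(void)
{
    return full_retract_switch();
}
```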
- The inventive fueling system 24 and the method related thereto represent a significant improvement over conventional fueling systems and methods.
- The inventive fueling system 24 allows the automatic fueling of an automotive vehicle 26 without the need for an operator.
- The inventive fueling system 24 and method therefore result in labor savings.
- Further, the fueling system 24 can move the end effector 54 to mate with the fuel stem 90 while the vehicle 26 is moving.
- Thus, the vehicle 26 can be fueled without stopping the conveyer line 28 , resulting in manufacturing cost savings and increased assembly line efficiency.
Abstract
A method for fueling an automotive vehicle 26 on a conveyer line 28 is provided. The conveyer line 28 moves generally along a first axis 170 and the method utilizes a robotic arm 34 with an end effector 54 for fueling a fuel stem 90 of the vehicle 26. The method includes determining a first position of the fuel stem 90 along the first axis 170. The method further includes moving the robotic arm 34 to position the end effector 54 proximate to the first position. The method further includes determining a three-dimensional position of the fuel stem 90 utilizing a vision system 164. The method further includes moving the end effector 54 proximate to the three-dimensional position to enable the end effector 54 to mate with the fuel stem 90 and moving said robotic arm 34 relative to said first axis 170 at a speed substantially equal to a speed of said conveyer line 28. Finally, the method includes fueling the fuel stem 90 with the end effector 54.
Description
- A computer program listing appendix is contained on a compact disc submitted herewith and is hereby incorporated by reference. One compact disc and one duplicate are submitted according to 37 C.F.R. §1.52(e), and each contains the following files:
FILE NAME       SIZE IN BYTES   DATE OF CREATION
Cal.rsp         1 KB            May 21, 1995
Cal_eval.c      12 KB           Jul. 15, 1995
Cal_main.c      72 KB           Sep. 12, 1995
Cal_main.h      10 KB           Jul. 15, 1995
Cal_tran.c      14 KB           Oct. 22, 1995
Cal_util.c      8 KB            May 14, 1995
Cc_cd.dat       9 KB            Oct. 28, 1995
Cc_cpcc.dat     1 KB            Oct. 28, 1995
Ccal.c          4 KB            Apr. 1, 1995
Ccal.log        4 KB            Oct. 28, 1995
Ccal.run        1 KB            Apr. 2, 1995
Ccal_fo.c       4 KB            Apr. 1, 1995
Changes.txt     6 KB            Oct. 28, 1995
Csyn.c          7 KB            May 21, 1995
Dpmpar.c        7 KB            Apr. 1, 1995
Dpmpar.f        6 KB            Feb. 15, 1995
Ecal.c          4 KB            May 17, 1995
Ecal.log        3 KB            Oct. 28, 1995
Ecal.run        1 KB            Oct. 28, 1995
Ecalmain.c      23 KB           Oct. 15, 1995
Ecccpcc.dat     1 KB            Oct. 28, 1995
Encccpcc.dat    1 KB            Oct. 28, 1995
Enorm.c         4 KB            Apr. 1, 1995
Enorm.f         4 KB            Mar. 25, 1994
F2c.h           5 KB            Feb. 25, 1992
F2c.ps          138 KB          Oct. 20, 1995
Faq.txt         12 KB           Oct. 28, 1995
Fdjac2.c        5 KB            Apr. 1, 1995
Fdjac2.f        4 KB            Mar. 25, 1994
Gasdev.c        2 KB            Jul. 17, 1995
Ic2wc.c         3 KB            Apr. 1, 1995
Index.txt       3 KB            Oct. 28, 1995
Lmdif.c         17 KB           Apr. 1, 1995
Lmdif.f         16 KB           Mar. 25, 1994
Lmpar.c         10 KB           Apr. 1, 1995
Lmpar.f         9 KB            Mar. 25, 1994
Makefile.bor    3 KB            Oct. 23, 1995
Makefile.unx    3 KB            Oct. 20, 1995
Matrix.c        12 KB           Jul. 15, 1995
Matrix.h        1 KB            Jul. 15, 1995
Minpack.rsp     1 KB            May 14, 1995
Ncc_cd.dat      27 KB           Oct. 28, 1995
Ncc_cpcc.dat    1 KB            Oct. 28, 1995
Nccal.c         4 KB            May 20, 1995
Nccal.log       4 KB            Oct. 28, 1995
Nccal.run       1 KB            Apr. 2, 1995
Nccal_fo.c      4 KB            May 20, 1995
Ncsyn.c         7 KB            May 21, 1995
Notes.txt       6 KB            Oct. 28, 1995
Qrfac.c         7 KB            Apr. 1, 1995
Qrfac.f         6 KB            Mar. 25, 1994
Qrsolv.c        8 KB            Apr. 1, 1995
Qrsolv.f        7 KB            Mar. 25, 1994
Wc2ic.c         3 KB            Apr. 1, 1995
Xfd2xfu.c       4 KB            Jul. 15, 1995
- 1. Technical Field
- This invention relates to a robotic assembly line filling system.
- 2. Description of the Related Art
- Referring to FIG. 1, a known system 10 for fueling an automotive vehicle 12 is shown. The vehicle 12 is moved along an assembly line via a conveyer line 14 . When the vehicle 12 progresses into a vehicle fueling area, an operator 16 inserts a fuel nozzle 18 into a fuel stem (not shown) to fuel the vehicle 12 . Because the conveyer line 14 and the vehicle 12 are moving, a gantry 20 and a carriage 22 are utilized to move the fuel nozzle 18 along with the vehicle 12 . After insertion of the nozzle 18 , a predetermined amount of fuel is pumped into the fuel stem of the vehicle 12 . Thereafter, the operator 16 may remove the nozzle 18 from the fuel stem or the nozzle 18 may be automatically removed from the fuel stem. The known fueling system 10 , however, has a drawback in that the operator 16 must manually insert the fuel nozzle 18 into the fuel stem. Labor and associated manufacturing costs are increased.
- U.S. Pat. No. 4,708,175 issued to Janashak et al. discloses a robot that fills a container mounted on a vehicle with a fluid. The vehicle is mounted on a conveyer line that moves into a work cell where the robot is located. The robot utilizes a vision system to determine the position of the inlet of the container. The robot then moves a robotic arm to the position of the inlet to fill the container with fluid, using gauge holes as visual target points. While Janashak et al. appear to disclose that the system may be used to fill a moving container provided the robot is capable of tracking the moving vehicle, no description of how this can be accomplished is provided. In particular, Janashak et al. do not disclose how to compensate for motions of the vehicle relative to the assembly line conveyor that cannot be detected by an encoder connected to the assembly line conveyor, do not disclose how to compensate for vehicle-to-vehicle variations in the location of the fuel stem relative to other parts of the vehicle, nor do Janashak et al. disclose how to accomplish such filling of a container while the vehicle moves without first touching the vehicle or attaching anything to the vehicle in order to facilitate detection of the container inlet or, finally, how to coordinate such motion to fill the container without constraining the motion of the vehicle in an uncharacteristic fashion (i.e., without stopping or stabilizing the vehicles where such constraints are only a requirement of the fueling system and not typical of the conveyor itself). Applicants therefore assume that Janashak et al. teach no more than that the conveyor line must stop in order to fill the container. Stopping a conveyer line results in an increased time to complete the vehicle build, which results in increased manufacturing and labor costs.
- There is thus a need for a fluid filling system and method that reduces and/or minimizes one or more of the above-identified deficiencies.
- One advantage of the present invention is that the vehicles on the assembly line need not be stopped to have a fluid container, such as a fuel tank, filled.
- The present invention, broadly, provides a system and method for filling a container with a fluid in a vehicle moving along an assembly line. The fluid can be, for example only, fuel, coolant, windshield washer fluid, and the like. The method includes the steps of determining a position of an inlet of the container (as the vehicle is moving) using a machine vision system. The next step involves moving a fluid fill outlet to the container inlet using a robot. Finally, the container is filled via the outlet while the vehicle continues to move.
- In a preferred embodiment, a fueling system in accordance with the present invention includes a gantry having a carriage configured to move generally parallel to a first axis (i.e., the conveyor line on which the vehicle moves). The fueling system further includes a robotic arm attached to the carriage that moves with the carriage. The robotic arm has an end effector configured to mate with a fuel stem on the vehicle and to supply fuel to the fuel stem. The fueling system further includes a vision system including first and second cameras for iteratively determining a three-dimensional position of the fuel stem relative to a predetermined coordinate system. The position is preferably a center point of the fuel stem lying on a plane defined by the sealing surface (typically an outer edge) of the fuel stem. Finally, the fueling system includes a robot controller configured to command the carriage to move proximate to the position of the fuel stem and to move the end effector to the three-dimensional position to mate with the fuel stem. The robot controller is further configured to move the robotic arm relative to the first axis at a speed substantially equal to a speed of the conveyer line.
- A method is also provided for fueling an automotive vehicle on a conveyer line. The method utilizes a robotic arm with an end effector for fueling a fuel stem of the vehicle. The method includes determining a first position of the fuel stem along a first axis. The method further includes moving the robotic arm to position the end effector proximate to the first position. The method further includes determining a three-dimensional position of the fuel stem utilizing a vision system. The method further includes moving the end effector proximate to the three-dimensional position to enable the end effector to mate with the fuel stem and moving the robotic arm relative to the first axis at a speed substantially equal to a speed of the conveyer line. Finally, the method includes fueling the fuel stem with the end effector.
- A further method for providing position data to a controller is also provided. The method provides position data to the controller to enable a robotic arm controlled by the controller to move to a position of an object. The method utilizes first and second cameras disposed at first and second camera coordinate systems, respectively. The method in a preferred embodiment includes simultaneously acquiring a first digital image and a second digital image of a workspace including the object utilizing the first camera and the second camera, respectively. The method further includes searching the first digital image with a first fuel stem template to determine a location in the first image where a first correlation score between that portion of the image being searched and the template is greater than a predetermined threshold. The method further includes searching the second digital image with a second fuel stem template to determine a location in the second image where a second correlation score between that portion of the second image being searched and the template is greater than a predetermined threshold. The method further includes calculating a three-dimensional position of the object with respect to a predetermined coordinate system. The position is calculated responsive to the locations in the first and second digital images where the correlation scores were above the threshold. The method further includes calculating a triangulation error of the position. In a more preferred embodiment, the method further includes transferring position data indicative of the calculated position to the controller when the fuel stem matches are found in both images and the triangulation error is less than a threshold error value.
- The fueling system for fueling an automotive vehicle on a conveyer line and the method related thereto represent a significant improvement over conventional fueling systems and methods. In particular, the inventive fueling system allows the automatic fueling of a fuel stem without the need for an operator, resulting in labor savings. Further, the inventive fueling system can track the position of the fuel stem on a moving vehicle (on a conveyer line) and move an end effector to mate with the fuel stem. Thus, the vehicle can be fueled without stopping the conveyer line resulting in decreased manufacturing costs and increased manufacturing efficiency.
- These and other features and advantages of this invention will become apparent to one skilled in the art from the following detailed description and the accompanying drawings illustrating features of this invention by way of example.
- FIG. 1 is a schematic of a known fueling system for an automotive vehicle on a conveyer line.
- FIG. 2 is a schematic of a fueling system in accordance with the present invention.
- FIG. 3 is a schematic of a gantry and a robotic arm shown in FIG. 2.
- FIG. 4 is a side view of the gantry and the robotic arm shown in FIG. 3.
- FIG. 5 is an enlarged perspective view of the robotic arm shown in FIG. 3.
- FIG. 6 is a side view of the robotic arm in its kinematic zero position (all joint variables equal to zero).
- FIG. 7 is a schematic showing the coordinate systems utilized to control the position of the robotic arm.
- FIG. 8 is an exploded schematic of an end effector of the robotic arm shown in FIG. 5.
- FIG. 9 is block diagram of a robotic control system utilized by the inventive fueling system.
- FIGS.10A-10C are schematics illustrating the tracking variables utilized for positioning the robotic arm.
- FIG. 10D is a tracking equation utilized to position the robotic arm along the J1 axis.
- FIG. 11 is a schematic illustrating a triangulation technique utilized by the vision system to determine the three dimensional position of the fuel stem.
- FIG. 12 is a schematic illustrating a two-dimensional digital image coordinate system.
- FIG. 13 is a schematic illustrating a two-dimensional camera sensor coordinate system.
- FIG. 14 is a schematic illustrating a three-dimensional camera coordinate system.
- FIG. 15 is a schematic illustrating equations that define the intrinsic camera model.
- FIG. 16 is a schematic illustrating a force vector applied to the end effector.
- FIG. 17 is a schematic illustrating a Jacobian Transpose relationship utilized for force feedback control of the robotic arm.
- FIG. 18 is a schematic illustrating an inverse kinematics model utilized by the robot control system to position and orient an end effector.
- FIG. 19 is a schematic of a fuel stem.
- FIG. 20 is a schematic of a docking path of an end effector with a fuel stem.
- FIGS.21A-21G are flowcharts illustrating the modes of operation for the fueling system in accordance with the present invention.
- Referring now to the drawings wherein like reference numerals are used to identify identical components in the various views, FIG. 2 illustrates a fueling
system 24 for fueling anautomotive vehicle 26 on aconveyer line 28. As shown, thevehicle 26 is on acarriage 30 moving on theconveyer line 28. It should be understood, however, that theinventive fueling system 24 may be configured to operate with any known type of conveyer line, including for example, overhead conveyer lines (see FIG. 1) and floor mounted conveyer lines. An advantage of the fuelingsystem 24 is that thevehicle 26 may be fueled without stopping thevehicle 26 on theconveyer line 28. Theinventive fueling system 24 includes agantry 32, arobotic arm 34, and arobot control system 36. - Referring to FIG. 3, the
gantry 32 is provided to move therobotic arm 34 substantially parallel to theconveyer line 28. Thegantry 32 includes aframe 38, acarriage 40, and amotor 42. Theframe 38 may be supported by legs (not shown) or may be mounted to ceiling supports. Themotor 42 is mounted on thegantry 32 and is operatively connected to a drive belt (not shown) that is connected to thecarriage 40. Thus, rotational movement of a rotor (not shown) of themotor 42 causes the drive belt to move thecarriage 40 along the J1 axis in either a forward direction (to the right in FIG. 2) or a backward direction (to the left in FIG. 2). Themotor 42 is electrically connected to a motor driver 44 (see FIG. 9) that is controlled by therobot controller 46. Thus, thecontroller 46 can selectively control the position of thecarriage 40 and therobotic arm 34 along a J1 axis. Themotor 42 further includes an internal encoder 48 (not shown in FIG. 1). Theencoder 48 generates a position value J1_VAL indicative of the position of thecarriage 40 that is received by therobot controller 46. - Referring to FIG. 3, the
robotic arm 34 is provided to fuel theautomotive vehicle 26. As shown, therobotic arm 34 is connected to thecarriage 40 and moves with thecarriage 40 along the J1 axis. Prior to fueling thevehicle 26, therobotic arm 34 is rotated to a desired angle about the J2 axis which matches the orientation of afuel stem 90 in the plane of J2 (see FIG. 10A) of thevehicle 26. In a preferred embodiment, therobotic arm 34 is automatically rotated to the desired J2 angle. In an alternate embodiment, the J2 angle could be manually fixed. Referring to FIG. 5, therobotic arm 34 includes amanipulator arm 50, apitch arm 52, anend effector 54, acamera arm 56, and a light 58. - The
manipulator arm 50 is provided to support thepitch arm 52 and theend effector 54. Referring to FIGS. 4 and 5, themanipulator arm 50 includes aframe 60, amotor 62, alinear actuator 64, and a joint 66. As shown, theframe 60 is connected to a first end of thepitch arm 52 via the joint 66. Themotor 62 is provided to drive thelinear actuator 64 to thereby move thepitch arm 52 about the J3 axis. In particular, thelinear actuator 64 converts rotary movement of a rotor (not shown) of themotor 62 into linear movement of the upper end of arod 68. The lower end of therod 68 is connected to thepitch arm 52 via a pin joint 70. Thus, linear actuation of the upper end of therod 68 causes rotational motion of thepitch arm 52 about the J3 axis. - Referring to FIGS. 5 and 9, the
motor 62 is electrically connected to amotor driver 72 that is controlled by therobot controller 46. Thus, thecontroller 46 can selectively control the position of thelinear actuator 64 to move thepitch arm 52 to a desired rotational angle about the J3 axis. Further, anencoder 74 is operatively mounted on the joint 66. Theencoder 74 generates a position value θ3_VAL indicative of a rotational angle of thepitch arm 52 about the J3 axis that is received by therobot controller 46. Still further, anencoder 76 is operatively mounted between thecarriage 40 and themanipulator arm 50. Theencoder 76 generates a position value θ2_VAL indicative of a rotational angle of themanipulator arm 50 about the J2 axis that is received by therobot controller 46. - Referring to FIG. 4, the
pitch arm 52 is provided to support theend effector 54. Thepitch arm 52 includes an L-shapedframe 78, apneumatic cylinder 80, and a joint 82. As illustrated, a second end of thepitch arm 52 is connected to theend effector 54 via the joint 82. Thepneumatic cylinder 80 is mounted on thepitch arm 52 and has acylinder rod 84 connected to theend effector 54. Further, thecylinder 80 is connected to a pneumatic control valve 86 (see FIG. 9) which is selectively controlled by therobot controller 46. Thevalve 86 may be a dual solenoid three-position open or closed center valve having an extend solenoid (not shown) and a retract solenoid (not shown). When the extend solenoid is energized (and the retract solenoid is de-energized), thecylinder 80 extends therod 84 to rotate theend effector 54 to a desired rotational angle about the J4 axis as defined by an adjustable hard stop. The hard stop connected to the cylinder rod is adjusted prior to moving the actuator (which only makes full motion movements). This may be done manually, but is preferably done automatically. This method of controlling the joint position takes advantage of the speed, power, and safety (i.e., in proximity to gasoline) of a pneumatic actuator while avoiding the difficulty in mid-positioning pneumatic actuators, and particularly avoiding the lack of stiffness exhibited by a mid-positioned pneumatic actuator. The hard stop is preferably moved by a low power electric or pneumatic motor using a self-locking mechanism such as a worm gear drive or a self-locking lead screw so that the power required to move the stop is less than the power required to move the joint. When the retract solenoid is energized (and the extend solenoid is de-energized) thecylinder 80 retracts therod 84 to rotate theend effector 54 back to a home position about the J4 axis. Referring to FIGS. 5 and 9, anencoder 88 is operatively mounted on the joint 82. Theencoder 88 generates a position value θ4_VAL indicative of a rotational angle of thepitch arm 52 about the J4 axis that is received by therobot controller 46. - The
end effector 54 is provided to mate with thefuel stem 90 of theautomotive vehicle 26. Referring to FIG. 8, theend effector 54 includes arodless cylinder 92, a mountingbracket 94, aguide 96, a manifold 98, afuel hose 100, aboot 102, afront housing 104,load cells pneumatic cylinder 114,fuel valves elbows pipe tees check valves vapor fittings position potentiometer 152. - The
rodless cylinder 92 is provided to move the mountingbracket 94 and the remaining components of theend effector 54 along the J5 axis. In particular, thecylinder 92 is provided to move theboot 102 against thefuel stem 90. Thecylinder 92 has aslidable plate 154 that may be extended or retracted along the J5 axis. Referring to FIGS. 8 and 9, thecylinder 92 may be operatively connected to apneumatic control valve 156 that is controlled by therobot controller 46. As shown, theplate 154 is attached to the mountingbracket 94. - Referring to FIG. 8, the mounting
bracket 94 is provided to support the remaining components of the end effector 54 (excluding the rodless cylinder 92). As shown, theguide 96 is attached tobracket 94 to allow axial movement of the manifold 98 and thefuel hose 100 relative to thebracket 94 and theboot 102. - The
manifold 98 is provided to direct fuel from eitherfuel valve 116 orfuel valve 118 through thefuel hose 100. Further, thepneumatic cylinder 114 is provided to selectively move the manifold 98 (discussed in greater detail below) along theguide 96 relative to thebracket 94. - The
fuel valve 116 may receive fuel (e.g., gasoline) from theelbow 122 which is connected to a first fuel line (not shown). When thevalve 116 is open, fuel is supplied to the manifold 98 and to thefuel hose 100. When thefuel valve 116 is closed, the fuel may be recirculated throughelbow 120 for cooling by cooling equipment (not shown). Thevalve 116 is selectively controlled by therobot controller 46. - The
fuel valve 118 may receive fuel (e.g., diesel fuel) from theelbow 126 which is connected to a second fuel line (not shown). When thevalve 118 is open, fuel is supplied to the manifold 98 and to thefuel hose 100. When thefuel valve 118 is closed, no fuel is supplied to themanifold 98. Thevalve 118 is also selectively controlled by therobot controller 46. - The
boot 102 is provided to mate with thefuel stem 90 during fueling of thestem 90. Theboot 102 may be constructed of a resilient material such as rubber or plastic and includes ahollow body portion 103 and aboot seal 101. Theboot seal 101 is configured to seal against a top edge of thefuel stem 90 to prevent fuel and fuel vapor from escaping from thestem 90 during fueling. As shown, theboot 102 is attached to thefront housing 104. The boot is also electrically connected to the front housing, and provides a conductive path which grounds the robot to the vehicle upon insertion. - The
fuel hose 100 is provided to be inserted into thefuel stem 90 beneath the no-lead insert (not shown) after theboot 102 has mated with thestem 90. Thefuel hose 100 is constructed of a flexible plastic and extends from an outlet (not shown) on the manifold 98 and through thehousing 104 and theboot 102. As shown, thepneumatic cylinder 114 is provided to move the manifold 98 and thefuel hose 100 along the J5 axis to insert thehose 100 into thestem 90. Referring to FIGS. 8 and 9, thecylinder 114 is operatively connected to apneumatic control valve 158 that is selectively controlled by therobot controller 46. - The
check valves manifold 98 andfuel hose 100 of fuel. When bothfuel valves check valves fuel hose 100 into thefuel stem 90. - The
vapor recovery fittings pipe tee 138, andelbow 130 may be connected together for fuel vapor recovery as known by those skilled in the art. - The
load cells end effector 54 by thefuel stem 90 during fueling. Theload cells front housing 104 and the mountingbracket 94. Thus, thefront housing 104 transmits force exerted on theboot 102 to theload cells load cells robot controller 46. Thecontroller 46 may receive signals generated by theload cells robotic arm 34 during fueling. - Referring to FIGS. 4 and 8, during the process of mating the
boot 102 with thefuel stem 90, theend effector 54 is rotated about the joint 82 to a calculated rotation angle θ4. Therodless cylinder 92 then extends the remaining components of theend effector 54 along the J5 axis to mate theboot 102 with thefuel stem 90. Finally, thepneumatic cylinder 114 extends thefuel hose 100 along the J5 axis to enter thefuel stem 90 beneath the no-lead insert (not shown). - Referring to FIG. 5, the
camera arm 56 is provided to housecameras camera arm 56 is attached to a side of themanipulator arm 50. - The light58 is provided to illuminate the
fuel stem 90 on thevehicle 26 to allow thevision system 164 to recognize thefuel stem 90 regardless of ambient lighting conditions. The light 58 may comprise a fluorescent metal halide fixture (e.g., a Class I Div. 2 rated fixture in a constructed embodiment). Referring to FIG. 7, the coordinate systems utilized by therobot controller 46 to position the robotic arm 34 (and the end effector 54) are shown. In particular, a coordinate system CS0 represents a home position of thecarriage 40. The origin of the coordinate system CS0 lies on the J1 axis. A coordinate system CS1 is located on thecarriage 40 and accordingly moves with thecarriage 40. The coordinate system CS1 is utilized extensively for tracking thefuel stem 90 along the J1 axis, as discussed in greater detail below. The coordinate system CS2 is utilized for positioning therobotic arm 34 about the J2 axis. Coordinate system CS2 is also the reference coordinate system for the vision system, which uses it to locate and orient the camera coordinate systems and to report object position and orientation vectors. The coordinate system CS3 is utilized for positioning thepitch arm 52 about the J3 axis. Further, the coordinate system CS4 is utilized for positioning theend effector 54 about the J4 axis. Finally, the coordinate system CS5 is utilized for positioning theboot 102 along the J5 axis. - Referring to FIG. 9, a
robot control system 36 is provided. Therobot control system 36 includes alight sensor 160, aconveyer line encoder 162, arobot controller 46, avision system 164,encoders position potentiometer 52,load cells motor drivers control valves fuel valves valve 158. - Referring to FIG. 2, the
light sensor 160 is provided to detect when thevehicle 26 enters a vehicle fueling area on an assembly line. Thelight sensor 160 is conventional in the art and includes a light transmitter and a light receiver inpackage 166 to be used with acorresponding reflector 168. Thepackage 166 is oriented on a first side of theconveyer line 28 to project a light beam, in one embodiment, across theconveyer line 28. In the illustrated embodiment, thereflector 168 is oriented and positioned on an opposite side of theconveyer line 28 to receive the light beam. When thevehicle 26 enters the fueling area, thevehicle 26 blocks the light beam from being reflected by thereflector 168. In response, the receiver inpackage 168 generates a trigger signal VTRIG that is received by therobot controller 46. In the case where the beam is reflected by the vehicle, it is the presence (not absence) of a reflected beam that generates the trigger signal. In any event, the position of thelight sensor 160 is designated as the light sensor trigger position hereinafter (see FIG. 10A). - In an alternate embodiment VTRIG could be generated by any other sufficiently repeatable mechanism (e.g., mechanical switch, ultrasonic sensor, bar code reader, etc.) to reliably indicate the presence of the next vehicle to the robot controller.
- Referring to FIGS. 2 and 9, the
conveyer line encoder 162 is provided to generate an encoder count CL_VAL indicative of a current encoder count with respect to theconveyer line 28. Therobot controller 46 utilizes two encoder counts to determine a gross displacement distance along theaxis 170 of the vehicle 26 (and the fuel stem 90) from the light sensor trigger position. - Referring to FIG. 9, the
robot controller 46 is provided to control thecarriage 40 and therobotic arm 34 for fueling theautomotive vehicle 26. Therobot controller 46 has acommunication bus 172 for communicating with thevision controller 174. Further,robot controller 46 also has acommunication bus 176 for communicating with asupervisory PLC 178 which may control the operation of the fuel pumping and metering equipment and various safety devices. ThePLC 178 is not a component of therobot control system 36 and is shown only for illustrative purposes. As shown, therobot controller 46 is electrically connected to theconveyer line encoder 162, thelight sensor 160, theload cells vision controller 174. Therobot controller 46 receives encoder counts (from the conveyer line encoder 162), and the signals VTRIG, VL1, VL2, VL3, VL4, and a three-dimensional position Pe of thefuel stem 90 from thevision controller 174—to control the position of therobotic arm 34. The method for determining the position of thefuel stem 90 will be explained in greater detail below. Thecontroller 46 is also electrically connected to encoders 48, 74, 76, 88 and theposition potentiometer 52. Thus, thecontroller 46 receives the joint position values J1_VAL, θ2_VAL, θ3_VAL, θ4_VAL, J5_VAL and is able to calculate a current position of theend effector 54. Thecontroller 46 is also electrically connected tomotor drivers control valves fuel valves controller 46 further includes a programmable memory for implementing the various modes of operation of the fuelingsystem 24 which will be described in greater detail below. - Referring to FIG. 18, the
robot controller 46 utilizes an inverse kinematic model of therobotic arm 34 to position thearm 34. In particular, the inverse kinematic model is a set of equations that allow the joint variables (i.e., θ3 and J5 of therobotic arm 34 to be calculated in order to place theend effector 54 in a desired position and orientation. Inverse kinematic models are utilized extensively in robotic applications and can readily be determined by one skilled in the art. Accordingly, the underlying inverse kinematic model equations will not be discussed in any further detail. As shown, one set of inputs x, y, z represent the desired position Pe of the end ofboot 102 with respect to a predetermined coordinate system, such as coordinate system CS1. The inputs x, y, z (i.e., point Pe) are determined by thevision system 164 while tracking thefuel stem 90 and may be stored in the variables CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ in therobot controller 46. The input ax, ay, az, is a unit vector representing the orientation of the fuel stem 90 (see FIG. 20) with respect to the coordinatesystem CS 1. Because the orientation of thefuel stem 90 does not vary a large amount when thevehicle 26 moves on theconveyer line 28, the orientation of thefuel stem 90 may be a stored value. In particular, unit vector ax, ay, az, may be stored as VID data in non-volatile memory (e.g., hard disk) of the vision controller and is readily determined by one skilled in the art for aspecific fuel stem 90. The DH constants (see FIGS. 6 and 7) are geometric dimensions relating to therobotic arm 34 and are stored in the non-volatile memory (e.g., hard disk) of therobot controller 46. Finally, the angles θ2 and θ4 represent desired positions of therobotic arm 34 about the J2 axis and J4 axis, respectively, for fueling thefuel stem 90. The angles θ2 and θ4 may also be stored in the non-volatile memory (e.g., hard disk) of therobot controller 46 or may be calculated from the unit vector ax, ay, az. - Referring to FIG. 9, the
vision system 164 is provided to generate a three-dimensional position value (i.e., point Pe) of afuel stem 90 with respect to a predetermined coordinate system (e.g., CS1). Thevision system 164 includes thevision controller 174, theframe grabber 180,inner camera 182, andouter camera 184. - The
vision controller 174 is provided to calculate a three-dimensional position Pe of thefuel stem 90. In particular, therobot controller 46 may request a position of thefuel stem 90 from thecontroller 174 via thebus 172. In response, thecontroller 174 may calculate the position Pe and return the position Pe to thecontroller 46. Thevision controller 174 may calculate the position of thefuel stem 90 responsive to two digital images generated by thecameras - The
robot controller 46 may also request vehicle identification data (VID) from thevision controller 174. The VID data may be stored in non-volatile memory (e.g., hard disk) of thevision controller 174. In particular, a user or an assembly line controller may input a VID number identifying the particular vehicle type, via an input device or a serial bus (not shown), to therobot controller 46. Therobot controller 46 may transmit the VID number to thevision controller 174, which retrieves a VID record from its hard disk. The VID record may be transmitted to therobot controller 46 and maintained in a random access memory (RAM) of thevision controller 174. The VID record contains vehicle dependent information used by therobot controller 46 to track the position of thefuel stem 90. In particular, the VID record may include the following information: - ax, ay, az, unit vector defining orientation of the
fuel stem 90 with respect to the coordinate system CS1; - HOME_TO_STEM=distance from a home position of the
carriage 40 to a position P0 on the J1 axis projection of fuel stem center onto z0) when thevehicle 26 crosses the light sensor trigger position (see FIG. 10A); - DESIRED_CARRIAGE_TO_STEMY=desired distance measured with respect to y1 from the
carriage 40 to a position P1 on the J1 axis for theend effector 54 to mate with thefuel stem 90; - DIGITAL_IMAGE_TEMPLATE1=digital image template of the
fuel stem 90 for theinner camera 182; - DIGITAL_IMAGE_TEMPLATE2=digital image template of the
fuel stem 90 for theouter camera 184; - APPROACH_ANGLE=the angular offset in the vertical plane between z5 and the
stem axis 243. - The information contained in the VID record may be determined by one skilled in the art. In a preferred embodiment, the information is measured and stored by the
vision controller 174 and therobot controller 46. In an alternate embodiment, the information could be determined using conventional survey equipment. - The
vision controller 174 may utilize commercially available vision software for template matching to determine ifcameras fuel stem 90. In a constructed embodiment, the commercially available vision software comprised Matrox Imaging Library, Version 6.0 sold by Matrox Electronic Systems Ltd. of Dorval, Quebec, Canada. In particular, the conventional software searches the first digital image of the workspace, acquired bycamera 182 andframe grabber 180, for a location in which the correlation score with the first digital image template (i.e., DIGITAL_IMAGE_TEMPLATE1) is higher than a pre-determined acceptance level. If such an image location is found then there is a fuel stem match in the first digital image. The conventional software searches the second digital image of the workspace, acquired bycamera 184 andframe grabber 180, for a location in which the correlation score with the second digital image template (i.e., DIGITAL_IMAGE_TEMPLATE2) is higher than a pre-determined acceptance level. If such an image location is found then there is a fuel stem match in the second digital image. If there is a fuel stem match in both digital images, i.e., if bothcameras fuel stem 90, then thevision controller 174 further proceeds to calculate a three-dimensional point Pe corresponding to the center of the fuel stem. The software methodology for calculating the point Pe will be explained in greater detail below. - The
frame grabber 180 is provided to simultaneously trigger thecameras cameras vision controller 174. Theframe grabber 180 is conventional in the art and may comprise a multi-channel frame grabber capable of simultaneously triggering and digitizing images coming from at least two monochrome progressive scan analog cameras. In a constructed embodiment,frame grabber 180 comprises a Matrox Meteor-II/MC Frame Grabber manufactured by Matrox Electronics Ltd. of Dorval, Quebec, Canada. Theframe grabber 180 may receive a retrieve signal VTR fromvision controller 174. In response, theframe grabber 180 may generate retrieve signals VTRI and VTRO to simultaneously trigger thecameras workspace 186. Thecameras frame grabber 180, which may simultaneously digitize them and may further transfer the first and the second digital images to thevision controller 174. - The first and
second cameras workspace 186. Thecameras cameras cameras - Referring to FIG. 5, the
cameras camera arm 56. Thecamera arm 56 is provided to serve a number of functions. It acts as an enclosure which protects thecameras manipulator arm 34 using precision shoulder screws (not shown) to guarantee precise and repeatable alignment. This allowscamera arms 56 to be factory calibrated but interchangeable in the field. Finally, the camera arm is provided to position thecameras workspace 186 is defined to be the volume of space containing the various possible positions in which therobotic manipulator 34 and theend effector 50 are capable of servicing thefuel stem 90. Thus, when thecameras fuel stem 90, such that thefuel stem 90 lies within theworkspace 186, thevision controller 174 is able to determine a three-dimensional position Pe of thefuel stem 90. - Before explaining the various modes of operation of the fueling
system 24, the methodology and mathematics for determining a three-dimensional position Pe of thefuel stem 90 will now be explained. In order to determine the position Pe, a mathematical camera model of each of thecameras cameras camera arm 56 and the aperture and focus of their lenses have been adjusted. For the purposes of simplicity and clarity, only the mathematical model forcamera 182 will be further described hereinbelow. - The camera model comprises two parts: an intrinsic camera model and an extrinsic camera model. Referring to FIGS. 12 and 14, the intrinsic camera model specifies the relationship between a point (i.e., point (xi,yi)) with respect to a digital image coordinate system and a direction in space (i.e., unit vector Di) with respect to a camera coordinate system CSci. The intrinsic camera model for
camera 182 depends on the internal geometric, electronic and optical characteristics ofcamera 182 and on the electronic characteristics of theframe grabber 180 which will be discussed in greater detail below. - During run time, the extrinsic camera model of each the
cameras camera 182 and coordinate system CSco for the camera 184) with respect to a predetermined coordinate system that is fixed with respect to the cameras, such as CS2. In a constructed embodiment, the extrinsic camera model parameters are measured in the factory and are stored in non-volatile memory (e.g., hard disk) of the vision controller with respect to thecamera arm 56 mounting frame. During run time thevision controller 174 always supplies position and orientation vectors with respect to CS2 but therobot controller 46 always converts and uses these vectors with respect to CS1. Because each extrinsic camera model—which is a coordinate system transformation matrix—may be readily determined by one skilled in the art, only the intrinsic camera model will be discussed. - The intrinsic camera model for
camera 182 will now be discussed. The intrinsic camera model comprises several equations (discussed hereinafter) that are used to determine a direction vector Di pointing in space to the center point Pi of thefuel stem 90. Referring to FIGS. 12, 13, and 14, the intrinsic camera model utilizes three coordinate systems: a digital image coordinate system, a camera sensor coordinate system, and a camera coordinate system. Referring to FIG. 12, the two-dimensional digital image coordinate system defined by the axes Xi and Yi is illustrated. The digital image coordinate system is defined by a plurality of rows and columns of pixels 185. Further, the Principal Point (Cx, Cy) is the point projected onto the digital image coordinate system that corresponds to the optical axis of a camera lens (not shown) of thecamera 182. Further, the image point (xi, yi) corresponds to the center of thefuel stem 90 and is generated by the commercially available vision software previously discussed. - Referring to FIG. 13, the two-dimensional camera sensor coordinate system defined by the axes Xs and Ys is illustrated. The
CCD camera 182 has asensor plane 188 comprising a plurality of rows and columns of CCD sensor cells 187. The center of thefuel stem 90 is imaged on the sensor point (xs, ys) (coordinates with respect to the camera sensor coordinate system) and after digitization by theframe grabber 180, this point corresponds to the image point (xi, yi) (coordinates with respect to the digital image coordinate system). - Referring to FIG. 14, the three-dimensional camera sensor coordinate system CSci is partially illustrated in two dimensions. The camera coordinate system CSci is defined by the axes Xci, Yci, Zci has an origin Oci that corresponds to the center of a lens (not shown) of the
camera 182. The origin Oci is also located on the optical axis of thecamera 182. As shown, the unit direction vector Di is determined with respect to the camera coordinate system CSci. - Before discussing the intrinsic camera model equations, the parameters used in the equations will be set forth. The parameters include:
f = effective focal length of the camera lens (meters), Sx = camera aspect ratio, Cx = x image coordinate (pixel) of the Principal Point, Cy = y image coordinate (pixel) of the Principal Point, K1 = second degree radial distortion coefficient of the camera lens, dx = width of a sensor cell (meters), dy = height of a sensor cell (meters), Ncx = number of sensor cells on a row of the camera Sensor, Ncy = number of sensor cells on a column of the camera Sensor, N1x = number of pixels on an image row. - The parameters dx, dy, Ncx, Ncy, Nix may be supplied by the manufacturer of the
camera 182, and the parameter Nix is a known parameter of theframe grabber 180. The parameters f, Sx, Cx, Cy, K1, may be readily determined by one skilled in the art utilizing an algorithm set forth in the following publication: “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses”, IEEE Journal of Robotics and Automation Vol RA-3, No 4, August 1987—which is incorporated by reference in its entirety. - Referring to FIG. 15, camera model equations 1-8 are used to map the image point (xi, yi) with respect to a digital image coordinate system to a sensor point (xs, ys) with respect to a camera sensor coordinate system and to further calculate an undistorted sensor point (xu, yu) with respect to the camera sensor coordinate system, from which is calculated a direction vector Di with respect to the camera coordinate system CSci. In particular, equations (1), (2), (7), and (8) are used to map the image point (xi, yi) to the sensor point (xs, ys)—where light from a point Pi of the
fuel stem 90 is projected. Referring to FIG. 14, sensor point (xs, ys) is shown on thesensor plane 188. It should be understood, however, that camera 182 (and all conventional cameras) radially distorts projected light beams. Accordingly, the equation (3) is utilized to calculate a radial distortion factor k for thecamera 182 to compensate for the radial distortion ofcamera 182. Next, equations (4) and (5) are utilized to calculate the undistorted sensor point (xu, yu) (coordinates with respect to the camera sensor coordinate system) utilizing the radial distortion factor k. Referring to FIG. 14, the point (xu, yu) is shown on thesensor plane 188. The preferred embodiment uses an improved version of Tsai's algorithm that was developed by Reg Willson and that is freely available on Internet at the following URL: http://www.cs.cmu.edu/afs/cs.cmu.edu/user/rgw/www/TsaiCode.html, as also shown in the computer program listing Appendix on a compact disc submitted herewith. - Referring to FIG. 15, the equation (6) is used to calculate the unit vector Di which is centered at the origin Oci and points toward the center point Pi of the
fuel stem 90. The numerator of the equation (6) represents a vector pointing toward the center point Pi. The denominator of equation (6) represents the magnitude of the vector Di. Thus, the vector Di represents a unit vector. - The foregoing camera model equations are valid if the following criteria regarding the
camera 182, the lens (not shown) and theframe grabber 180 are true. First, the optical axis of thecamera 182 is presumed to be perpendicular to thesensor plane 188. Second, the lens is focused so that an object in thecamera workspace 186 is in focus. Third, the focus of the lens is fixed. In other words, a zoom lens is not used. Fourth, the digitization of a digital image by theframe grabber 180 is accurate. - It should be understood that the intrinsic camera model for
camera 184 may also be defined by the equations (1)-(8) utilizing the ten parameters f, Sx, Cx, Cy, K1, dx, dy, Ncx, Ncy, Nix determined for thecamera 184. Referring to FIG. 11, thecamera 184 may have an origin Oco with a coordinate system CSco. Generally, to determine a position of an object, an observer first establishes a stereo match of the object and then uses triangulation to determine a distance to the object with respect to the observer. Thevision system 164 works in a similar manner. When thecameras frame grabber 180 generate first and second digital images, respectively, of theworkspace 186 the conventional vision software (discussed above), within thevision controller 174, performs image template matching and generates first and second image match locations. We will hereafter refer to the first image match location as the inner image location and to the second image match location as the outer image location. - The conventional vision software determines an inner image point (i.e., xi, yi) and an outer image point (not shown) that are representative of the center point of the
fuel stem 90 in the first and second digital images, respectively. The inner image point (i.e., xi, yi) is utilized to calculate the direction vector Di as explained above. Similarly, the outer image point is utilized to calculate the direction vector Do. It should be understood that the direction vector Do may be calculated utilizing the equations (1)-(8) discussed above. The two direction vectors Di and Do are utilized in conjunction with the extrinsic camera model of thecameras fuel stem 90 with respect to the coordinate system CS2. This position with respect to CS2 is transmitted by thevision controller 174 to therobot controller 46 which transforms it with respect to CS1. - Referring to FIG. 11, to triangulate the position of the estimated triangulation point Pe with respect to the coordinate system CS2, the position and orientation of the inner and outer camera coordinate systems CSco and CSci with respect to coordinate system CS2, i.e., the extrinsic camera models of
the cameras 182, 184, must be known. The vision controller 174 calculates an inner triangulation point Pi along the line {Oci, Di} that is the closest point to the line {Oco, Do}. The controller 174 then calculates an outer triangulation point Po that is the closest point on the line {Oco, Do} to the inner triangulation point Pi. Finally, the controller 174 calculates the midpoint between the inner triangulation point Pi and the outer triangulation point Po, which is the triangulated point Pe. As previously discussed, the triangulated point Pe is the estimated position of a center point of the fuel stem 90 with respect to the coordinate system CS2. Further, the vision controller 174 calculates a triangulation error, which is the distance between points Pe and Pi. If the triangulation error is less than a threshold error value, the vision controller 174 transmits the position Pe to the robot controller 46. In particular, the position Pe may be transmitted to the robot controller 46 utilizing the following variables: CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ.
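The following sketch illustrates the triangulation steps just described; the function and variable names are illustrative only, and the ray origins and unit direction vectors are assumed to already be expressed in coordinate system CS2.

```python
import numpy as np

def triangulate_stem_point(o_i, d_i, o_o, d_o):
    # o_i, d_i: origin Oci and unit direction Di of the inner camera ray.
    # o_o, d_o: origin Oco and unit direction Do of the outer camera ray.
    # Returns the estimated point Pe and the triangulation error |Pe - Pi|.
    w = o_i - o_o
    b = np.dot(d_i, d_o)
    d = np.dot(d_i, w)
    e = np.dot(d_o, w)
    denom = 1.0 - b * b            # both directions are unit vectors
    if abs(denom) < 1e-12:         # rays are (nearly) parallel
        raise ValueError("cannot triangulate parallel rays")
    t = (b * e - d) / denom        # parameter of Pi along the inner ray
    p_i = o_i + t * d_i            # inner triangulation point Pi
    p_o = o_o + np.dot(p_i - o_o, d_o) * d_o   # closest point Po on the outer ray
    p_e = 0.5 * (p_i + p_o)        # midpoint = triangulated point Pe
    return p_e, np.linalg.norm(p_e - p_i)      # Pe and the triangulation error
```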
During fueling of the fuel stem 90, force feedback control may be utilized to correct the position of the robotic arm 34. In particular, if a force vector applied to the end effector 54 exceeds a threshold magnitude, the end effector 54 may be moved to reduce the force vector to a desired magnitude. The desired magnitude may comprise a zero value or a non-zero value depending upon the amount of force needed to seal the end effector 54 against the fuel stem 90. - Referring to FIG. 16, a
boot 102 of the end effector 54 is illustrated. Further, the coordinate system CS5 is shown on the boot 102, which corresponds to a contact point of a fuel stem 90 with the boot 102. When the boot 102 mates with the fuel stem 90, a force vector FV is applied to the boot 102 with force components fx5, fy5 and fz5 along the X5, Y5 and Z5 axes, respectively. It should be noted that the Y5 axis extends outwardly from the page. The undesirable force fy5 results from an error in position of the robotic arm 34 along the J1 axis. The undesirable force fx5 results from an error in position of the robotic arm 34 about the J3 axis. As shown, the load cells are disposed at the front housing 104. Thereafter, the force vector FV and the component forces fx5 and fy5 may be readily determined utilizing simple force vector equations well known to those skilled in the art. - The forces fx5, fy5 and fz5 applied to the
boot 102 create (i) an undesirable force f1 on the carriage 40 along the J1 axis and (ii) an undesirable torque τ3 on the pitch arm 52 about the J3 axis. Referring to FIG. 17, the force f1 and the torque τ3 may be calculated utilizing a Jacobian Transpose equation. In particular, the Jacobian Transpose equation utilizes (i) the geometric DH constants g, h, k of the robotic arm 34 (see FIG. 7), (ii) the current joint values θ2, θ3, θ4, J5, and (iii) the forces fx5, fy5, fz5—to calculate the force f1 and the torque τ3. The Jacobian Transpose equation may be readily determined by one skilled in the art. - The force f1 and the torque τ3 are utilized to calculate displacement error values for the
carriage 40 along the J1 axis and the pitch arm 52 about the J3 axis. Because the boot 102 is compliant, it can be modeled as a spring utilizing the spring equation: - Force = k*d, wherein
- d = displacement of the boot 102, and
- k = the spring constant.
- Thus, an error in force or torque (i.e., f1 and τ3) is proportional to an error in displacement (i.e., displacement d) of the
robotic arm 34 along the J1 and J3 axes. In a constructed embodiment, the force f1 is input into a first PID controller (implemented in software) that calculates a displacement error value ΔJ1 for the carriage 40 on the J1 axis. Similarly, the torque τ3 is input into a second PID controller that calculates an angular displacement error value Δθ3 for the pitch arm 52 about the J3 axis. As previously discussed, the values ΔJ1 and Δθ3 may be utilized by the robot controller 46 to correct the position of the carriage 40 and the pitch arm 52, respectively, to reduce the undesirable forces applied to the end effector 54 during fueling of the fuel stem 90.
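As an illustration of this force-feedback path, the sketch below maps the measured tip forces into the joint loads (f1, τ3) with a Jacobian transpose and feeds each into a simple PID loop; the Jacobian entries, gains and names are assumptions, since the specification gives only the general form.

```python
import numpy as np

class PID:
    # Minimal PID controller; the gains are illustrative, not from the patent.
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def joint_loads_from_tip_force(jacobian, f_tip):
    # Jacobian transpose relation: joint loads = J^T * tip force.
    # jacobian is assumed to be a 3x2 matrix built from the DH constants
    # g, h, k and the current joint values (not reproduced here).
    return jacobian.T @ f_tip

pid_j1 = PID(kp=0.002, ki=0.0, kd=0.0001, dt=0.01)   # placeholder gains
pid_j3 = PID(kp=0.005, ki=0.0, kd=0.0002, dt=0.01)

def force_feedback_corrections(jacobian, f_tip):
    # Returns (delta_J1, delta_theta3) corrections from the measured force.
    f1, tau3 = joint_loads_from_tip_force(jacobian, np.asarray(f_tip))
    return pid_j1.update(f1), pid_j3.update(tau3)
```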
The robot controller 46 utilizes three types of sensor feedback to position the robotic arm 34 along the J1 axis. The types of sensor feedback include a line encoder feedback, a vision system feedback, and a force feedback. Referring to FIG. 2, the line encoder feedback includes encoder counts from the conveyer line encoder 162 that are utilized to determine a gross position of the fuel stem 90. The encoder counts are also indicative of the speed of the conveyer line 28. Referring to FIG. 11, the vision system feedback includes a three-dimensional position Pe of the fuel stem 90 with respect to the coordinate system CS1. Finally, the force feedback includes a measured force exerted on the end effector 54 by the fuel stem 90 during fueling of the fuel stem 90. - Referring to FIGS. 10A and 10D, the
robot controller 46 moves the robotic arm 34 along the J1 axis using the tracking equation (9). The equation (9) utilizes a STEM_DISPLACEMENT variable that corresponds to the distance that the fuel stem 90 (and the vehicle 26) has traveled along the conveyer axis 170 since the vehicle 26 passed the light sensor trigger position. The STEM_DISPLACEMENT variable is calculated using the following equation: - STEM_DISPLACEMENT = (current encoder count − first encoder count) * CF, wherein:
- first encoder count=encoder count from
conveyer line encoder 162 when the VTRIG signal is generated; - CF=conversion factor for converting an encoder count to a distance along the
conveyer line axis 170. - Accordingly, the STEM_DISPLACEMENT variable is updated to a new value whenever the current encoder count is updated by the
conveyer line encoder 162. - The equation (9) also utilizes the DESIRED_CARRIAGE_TO_STEMY constant obtained from the VID record. The DESIRED_CARRIAGE_TO_STEMY constant represents the desired distance along the J1 axis from the origin of coordinate system CS1 (on the carriage 40) to a point P1 directly across from the
fuel stem 90—to allow the end effector 54 to mate with the fuel stem 90. - The equation (9) also utilizes the HOME_TO_STEM constant obtained from the VID record. The HOME_TO_STEM constant represents the distance along the J1 axis from the origin of coordinate system CS0 to a point P0 directly across from the
fuel stem 90—when the vehicle 26 passes the light sensor trigger position. - The equation (9) also utilizes a VISION_TRACKING_ERROR variable. As previously discussed, the
vision system 164 calculates a three-dimensional position Pe of the fuel stem 90 with respect to a coordinate system CS1. Further, the vision system 164 transfers the position Pe to the robot controller 46 using the following variables: CARRIAGE_TO_STEMX, CARRIAGE_TO_STEMY, CARRIAGE_TO_STEMZ. Because the CARRIAGE_TO_STEMY variable represents a current distance from the carriage 40 (and coordinate system CS1) to the fuel stem 90 along the J1 axis, the variable can be used to calculate the VISION_TRACKING_ERROR. In particular, the VISION_TRACKING_ERROR variable may be calculated using the following equation: - VISION_TRACKING_ERROR = (CARRIAGE_TO_STEMY − DESIRED_CARRIAGE_TO_STEMY)
- Referring to FIG. 10A, the carriage 40 (and coordinate system CS1) is perfectly positioned along the J1 axis to allow engagement of the
end effector 54 and the fuel stem 90. As shown, the CARRIAGE_TO_STEMY distance returned by the vision system 164 is equal to the DESIRED_CARRIAGE_TO_STEMY. Accordingly, the VISION_TRACKING_ERROR value equals a zero value. - Referring to FIG. 10B, the carriage 40 (and coordinate system CS1) is positioned too far in front of the
fuel stem 90 along the J1 axis to allow engagement of the end effector 54 and the fuel stem 90. Accordingly, the VISION_TRACKING_ERROR is equal to a negative number (i.e., a negative value along the Y1 axis), which decreases the COMMANDED_J1_POSITION value. In response, the carriage 40 and the robotic arm 34 are moved to a position along the J1 axis corresponding to the new COMMANDED_J1_POSITION. - Referring to FIG. 10C, the carriage 40 (and coordinate system CS1) is positioned too far behind a desired position on the J1 axis to allow engagement of the
end effector 54 and the fuel stem 90. Accordingly, the VISION_TRACKING_ERROR is equal to a positive number (i.e., a positive value along the Y1 axis), which increases the COMMANDED_J1_POSITION variable. In response, the carriage 40 and the robotic arm 34 are moved to a position along the J1 axis corresponding to the new COMMANDED_J1_POSITION. - The equation (9) also utilizes a FORCE_FEEDBACK_ERROR variable. As previously discussed, the
robot controller 46 monitors a force exerted on the robotic arm 34 by the fuel stem 90 during the fueling of the stem 90. Further, the controller 46 calculates a displacement error value ΔJ1 to correct the position of the carriage 40 along the J1 axis responsive to the force. Note that the force control also compensates for misalignments in the vertical direction using the J3 axis. When the boot 102 is mated with the fuel stem 90, the FORCE_FEEDBACK_ERROR variable is set equal to the calculated displacement error value ΔJ1. When the boot 102 is not mated with the fuel stem 90, the FORCE_FEEDBACK_ERROR is set equal to a zero value.
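For illustration, the sketch below assembles these terms into one plausible form of the tracking law; the actual equation (9) is defined in the specification and may combine the terms differently, so the composition and names here are assumptions.

```python
def stem_displacement(current_count, first_count, cf):
    # Distance the fuel stem 90 has traveled along the conveyer axis 170
    # since the vehicle passed the light sensor trigger position.
    return (current_count - first_count) * cf

def vision_tracking_error(carriage_to_stem_y, desired_carriage_to_stem_y):
    # Zero when the carriage is perfectly placed, negative when it is too
    # far in front of the stem, positive when it is too far behind it.
    return carriage_to_stem_y - desired_carriage_to_stem_y

def commanded_j1_position(home_to_stem, stem_disp, desired_carriage_to_stem_y,
                          vision_error, force_error):
    # One plausible composition: follow the stem from its triggered
    # position, offset by the desired stand-off, then trim with the vision
    # and force feedback corrections.
    return (home_to_stem + stem_disp - desired_carriage_to_stem_y
            + vision_error + force_error)
```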
The modes of operation of the fueling system 24 in accordance with the present invention will now be discussed. Referring to FIG. 21A, the modes of operation include a power up mode 190, an auto mode 192, and a manual mode 194. The various modes of operation are implemented in software that is stored in the ROM of the robot controller 46. - During the power up
mode 190, the robot controller 46 and the vision controller 174 establish communication with each other via the communication bus 172. Further, power is applied to the motor drivers. - Next, the
robot controller 46 advances to a step 191 that prompts the user of the fueling system 24 to select between the different modes of system operation. As discussed, the modes of operation include the auto mode 192 or the manual mode 194. The robot controller 46 also provides the option to shut down the fueling system 24. The modes of operation may be chosen using a GUI and a pointing device (not shown), a push-button based operator panel (not shown), or similar controls on the supervisory PLC 178. - When the user selects the
auto mode 192 of operation, the software modules comprising the auto mode 192 are executed to implement the method for fueling a vehicle 26 in accordance with the present invention. Referring to FIG. 21B, the auto mode 192 includes a stow module 196, an idle module 198, a gross position module 200, a track fuel stem module 202, an insert module 204, a fuel module 206, a purge fuel module 208, and an extract module 210. - Referring to FIGS. 2 and 4, the
stow module 196 performs steps to move the carriage 40 to a home position (i.e., the origin of the coordinate system CS0) along the J1 axis. Further, the joints of the robotic arm 34 are moved to a predetermined stow configuration. - Referring to FIG. 21C, the
idle module 198 includes a step 212 of determining if a vehicle trigger signal VTRIG was received. Referring to FIG. 10A, when the vehicle 26 passes the light sensor trigger position, a continuously emitted light beam is not detected by the light sensor 160. In response, the light sensor 160 generates the signal VTRIG, which is received by the robot controller 46. Referring to FIG. 21C, if the signal VTRIG is received, the module 198 advances to a step 214 which stores a first encoder count from the conveyer line encoder 162. Thereafter, the module 198 is exited and the auto mode 192 advances to the gross position module 200. Alternately, if the signal VTRIG is not received, the module 198 iteratively performs the step 212 until a vehicle 26 is detected by the light sensor 160. - Referring to FIG. 21D, the
gross position module 200 includes a step 216 of requesting a VID record from the vision controller 174. As previously discussed, the VID record contains vehicle dependent information for tracking the fuel stem 90. In a preferred embodiment, the unit vector ax, ay, az is obtained from the VID record and is then used to calculate the optimum J2 and J4 angles. The J2 angle and the J4 hard stop are adjusted during the gross position module 200. In an alternate embodiment, the J2 and J4 angles are automatically calculated but are manually adjusted off line prior to automatic operation. - The
module 200 further includes a step 218 which iteratively calculates a gross position of the fuel stem 90 with respect to the J1 axis. Referring to FIG. 10, the step 218 utilizes the tracking equation (9) to calculate the COMMANDED_J1_POSITION, which also represents the gross position of the fuel stem 90. Further, both the VISION_TRACKING_ERROR and the FORCE_FEEDBACK_ERROR are equal to a zero value during the step 218. - Referring to FIG. 21D, the
module 200 further includes a step 220 of moving the robotic arm 34 along the J1 axis to position the end effector 54 proximate to the gross position of the fuel stem 90. Further, the module 200 moves the robotic arm 34 (and the end effector 54) at a speed substantially equal to the speed of the conveyer line 28. During the step 220, the robotic arm 34 should be positioned along the J1 axis such that the field of view of the cameras 182, 184 includes the fuel stem 90. - The
module 200 further includes a step 222 of triggering the frame grabber 180. In particular, the robot controller 46 generates a signal VTR that causes the frame grabber 180 to transfer first and second digital images of the fuel stem 90 from the cameras 182, 184 to the vision controller 174. - The
module 200 further includes a step 224 of requesting a three-dimensional position Pe of the fuel stem 90 from the vision controller 174. As previously discussed, the vision controller 174 performs template matching on the first and second digital images and calculates first and second correlation scores, respectively. Further, the controller 174 returns the position Pe only if the first and second correlation scores are above a threshold correlation score and the triangulation error of the position Pe is below a predetermined triangulation error.
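A minimal sketch of this acceptance test follows; the threshold values and names are placeholders rather than values from the specification.

```python
def accept_vision_result(score_inner, score_outer, triangulation_error,
                         min_score=0.8, max_error=5.0):
    # Accept the triangulated position only when both template-match
    # correlation scores clear the threshold and the triangulation error
    # is small enough.
    return (score_inner > min_score and score_outer > min_score
            and triangulation_error < max_error)
```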
The module 200 further includes a step 226 which determines whether a three-dimensional position Pe was received from the vision controller 174. If the position Pe was received, the module 200 is exited and the auto mode 192 advances to the track fuel stem module 202. Alternately, if the position Pe was not received, the module 200 returns to the step 222. - Referring to FIG. 21B, the
auto mode 192 advances to the track fuel stem module 202 after the module 200. Referring to FIG. 21E, the module 202 includes a step 228 of triggering the frame grabber 180 to obtain first and second digital images of the fuel stem 90 generated by the cameras 182, 184. - The
module 202 further includes a step 230 that requests a three-dimensional position Pe of the fuel stem 90 from the vision controller 174. - The
module 202 further includes a step 232 which determines whether a three-dimensional position Pe of the fuel stem 90 was received by the robot controller 46. If the position Pe was received, the module 202 advances to the step 234. Otherwise, the module 202 returns to the step 228. - The
module 202 further includes a step 234 of calculating the VISION_TRACKING_ERROR. As previously discussed, the VISION_TRACKING_ERROR is utilized in the equation (9) to more accurately position the robotic arm 34 along the J1 axis relative to the fuel stem 90. It should be understood that the VISION_TRACKING_ERROR is iteratively calculated in the track fuel stem module 202 and the insert module 204 so long as the fuel stem 90 is viewed by both cameras 182, 184. - The
module 202 further includes a step 236 which moves the robotic arm 34 along the J1 axis proximate the fuel stem 90 responsive to the COMMANDED_J1_POSITION. - Referring to FIG. 21B, the
auto mode 192 advances to the insert module 204 after the module 202. Referring to FIG. 21F, the module 204 includes the steps 238, 240, 242, 244 and 246. The step 238 calculates joint variables θ3 and J5 using the inverse kinematic model previously discussed. The step 240 moves the pitch arm 52 about the J3 axis to the calculated angle θ3. The step 242 moves the end effector 54 about the J4 axis to a predetermined angle θ4. - The
step 244 moves the boot 102 along the J5 axis a distance J5 to allow the boot 102 to mate with the fuel stem 90. Referring to FIG. 20, during the steps 238, 240 and 242, the end effector 54 is positioned to allow the step 244 to move the end effector 54 along a docking line toward the fuel stem 90. As shown, the docking line extends from a point Pb on the nozzle axis 245 to the fuel stem point Pe. Further, the docking line forms an approach angle θD with respect to the stem axis 243. Finally, the step 246 moves the fuel hose 100 into the fuel stem 90.
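The docking-line geometry can be illustrated as follows; the point and axis values passed in are placeholders for the quantities shown in FIG. 20.

```python
import numpy as np

def docking_line(p_b, p_e, stem_axis):
    # Unit vector along the docking line from the nozzle-axis point Pb to
    # the fuel stem point Pe, and the approach angle (in degrees) that the
    # line forms with the stem axis 243.
    line = (p_e - p_b) / np.linalg.norm(p_e - p_b)
    axis = stem_axis / np.linalg.norm(stem_axis)
    cos_theta = np.clip(np.dot(line, axis), -1.0, 1.0)
    return line, np.degrees(np.arccos(cos_theta))
```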
Referring to FIG. 21B, the auto mode 192 advances to the fuel module 206 after the module 204. Referring to FIG. 21G, the fuel module 206 includes the steps 248, 250 and 252. The step 248 calculates the FORCE_FEEDBACK_ERROR, which is utilized by the controller 46 to calculate the COMMANDED_J1_POSITION of the carriage 40 (and the robotic arm 34) along the J1 axis. The step 250 opens a fuel valve 116 in the end effector 54 to supply fuel to the fuel stem 90. The step 252 closes the fuel valve 116 after a predetermined amount of fuel is pumped into the fuel stem 90. - Referring to FIG. 21B, the
auto mode 192 advances to the purge fuel module 208 after the module 206. Referring to FIG. 8, the purge fuel module 208 performs steps to apply air pressure to the check valves to purge residual fuel from the fuel hose 100 into the fuel stem 90. - The
auto mode 192 advances to the extract module 210 after the module 208, which extracts the fuel hose 100 from the fuel stem 90 and moves the robotic arm 34 to a predetermined retract position. - Referring to FIG. 21B, the
auto mode 192, after exiting the module 210, advances to a step 254 which determines if another vehicle 26 has passed the light sensor trigger position. If another vehicle 26 is detected, the auto mode 192 advances to the gross position module 200. Otherwise, the auto mode 192 advances to the stow module 196.
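The auto-mode sequencing described above can be summarized as a simple state machine; the sketch below is illustrative only and elides the work performed inside each module.

```python
from enum import Enum, auto

class AutoMode(Enum):
    STOW = auto()
    IDLE = auto()
    GROSS_POSITION = auto()
    TRACK_FUEL_STEM = auto()
    INSERT = auto()
    FUEL = auto()
    PURGE_FUEL = auto()
    EXTRACT = auto()

def next_state(state, vehicle_detected, position_received):
    # Sequencing of the auto mode 192: stow, wait for VTRIG, locate the
    # stem, track it, insert, fuel, purge, extract, then loop or stow.
    if state is AutoMode.STOW:
        return AutoMode.IDLE
    if state is AutoMode.IDLE:
        return AutoMode.GROSS_POSITION if vehicle_detected else AutoMode.IDLE
    if state is AutoMode.GROSS_POSITION:
        return AutoMode.TRACK_FUEL_STEM if position_received else AutoMode.GROSS_POSITION
    if state is AutoMode.TRACK_FUEL_STEM:
        return AutoMode.INSERT
    if state is AutoMode.INSERT:
        return AutoMode.FUEL
    if state is AutoMode.FUEL:
        return AutoMode.PURGE_FUEL
    if state is AutoMode.PURGE_FUEL:
        return AutoMode.EXTRACT
    # After extraction, service the next vehicle or stow the arm.
    return AutoMode.GROSS_POSITION if vehicle_detected else AutoMode.STOW
```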
There is a self-test operating mode during which the robot controller 46 places the robotic arm 34 such that the two robot tip markers TM0 and TM1, as shown in FIG. 20, are within the field of view of the cameras 182, 184. The vision controller 174 then acquires a first digital image and a second digital image from the cameras 182, 184 via the frame grabber 180, and locates the first and second image locations of the tip marker 0 "TM0" and of the tip marker 1 "TM1." Using the triangulation method already described for the localization of the fuel stem point, it calculates the 3D positions of both tip markers with respect to CS2. This information may be used by the robot controller 46 to periodically update the kinematic parameters which vary due to dimensional instability of the elastomeric boot hose 103.
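As an illustration of how the tip-marker measurements might be folded back into the kinematic model, the sketch below computes a simple translational correction from the difference between the triangulated and expected marker positions; the specification does not spell out this calculation, so the approach and names are assumptions.

```python
import numpy as np

def kinematic_correction(measured_tm0, measured_tm1, expected_tm0, expected_tm1):
    # Average offset between where the vision system 164 sees the tip
    # markers TM0/TM1 and where the nominal kinematic model predicts them;
    # this could be applied as a tool-offset update.
    offsets = [measured_tm0 - expected_tm0, measured_tm1 - expected_tm1]
    return np.mean(offsets, axis=0)
```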
Pneumatic cylinder 114 is provided to extend the filler hose 100 through the boot 102 in order to pump fuel into the filler neck from a point below the no-lead insert. When the filler hose 100 is retracted, the cylinder 114 is prevented from reaching its full stroke because the tip of the filler hose is too large to fit through the boot 102. Should the tip of the hose 100 be lost, the cylinder 114 can be fully retracted. This configuration allows the robot to sense if the hose 100 has been lost or damaged by simply using two limit switches on the cylinder 114 retract stroke, one of which is triggered when the hose 100 is retracted far enough to reach the boot, and one of which is triggered when the cylinder 114 is fully retracted.
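The two-switch detection logic reduces to a few lines; the signal names below are illustrative.

```python
def hose_status(boot_switch_on, full_retract_switch_on):
    # boot_switch_on: limit switch reached when the hose tip bottoms in the boot.
    # full_retract_switch_on: limit switch reached only on a full cylinder
    # stroke, which is possible only if the hose tip is missing or damaged.
    if full_retract_switch_on:
        return "hose lost or damaged"
    if boot_switch_on:
        return "hose retracted normally"
    return "hose extended"
```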
The inventive fueling system 24 and the method related thereto represent a significant improvement over conventional fueling systems and methods. In particular, the inventive fueling system 24 allows the automatic fueling of an automotive vehicle 26 without the need for an operator. Thus, the inventive fueling system 24 and method result in labor savings. Further, the fueling system 24 can move an end effector 54 to mate with the fuel stem 90 while the vehicle 26 is moving. Thus, the vehicle 26 can be fueled without stopping the conveyer line 28, resulting in manufacturing cost savings and increased assembly line efficiency. - While the invention has been particularly shown and described with reference to the preferred embodiments thereof, it is well understood by those skilled in the art that various changes and modifications can be made in the invention without departing from the spirit and the scope of the invention.
Claims (27)
1. A method for assembly line fluid fill of a container in a vehicle moving along the assembly line comprising the steps of:
(A) determining a position of an inlet of the container using a machine vision system while the vehicle is moving along the assembly line;
(B) positioning a fluid fill delivery outlet to the inlet using a robot; and
(C) filling the container via the outlet while the vehicle continues to move.
2. The method of claim 1 further comprising the step of moving the delivery outlet in substantial synchronism with said inlet.
3. The method of claim 2 wherein said moving step includes the substep of updating the position of the container inlet.
4. A method for fueling an automotive vehicle on a conveyer line, said conveyer line moving generally along a first axis, said method utilizing a robotic arm with an end effector for fueling a fuel stem of said vehicle, said robotic arm having a workspace within which the fuel stem must lie in order to be fueled, said method comprising:
determining a first position of said fuel stem along said first axis;
moving said robotic arm to position said end effector proximate to said first position such that said fuel stem lies within said workspace;
determining a three-dimensional position of said fuel stem utilizing a vision system;
moving said end effector proximate to said three-dimensional position to enable said end effector to mate with said fuel stem and moving said robotic arm relative to said first axis at a speed substantially equal to a speed of said conveyer line; and,
fueling said fuel stem with said end effector.
5. The method of claim 4 wherein said step of determining a first position of said fuel stem includes:
storing a first encoder position value indicative of a position of said vehicle when said vehicle passes a predetermined position on said conveyer line;
storing a second encoder position value indicative of a position of said vehicle after said vehicle has passed said predetermined position; and,
calculating said first position of said fuel stem responsive to said first and second encoder position values.
6. The method of claim 5 wherein the step of storing the second position includes detecting the presence of the vehicle.
7. The method of claim 6 wherein said step of storing said second position value includes:
monitoring a light beam being projected across said conveyer line; and,
determining when said light beam is interrupted by said vehicle.
8. The method of claim 4 wherein said vision system includes first and second cameras and said step of determining said three-dimensional position of said fuel stem includes:
generating a first digital image of said workspace including said fuel stem utilizing said first camera;
simultaneously generating a second digital image of said workspace including said fuel stem utilizing said second camera; and,
calculating said three-dimensional position of said fuel stem with respect to a predetermined coordinate system responsive to said first and second digital images.
9. The method of claim 4 wherein said three-dimensional position is a center point of said fuel stem with respect to a predetermined coordinate system, said center point lying on a plane defined by the sealing surface (typically an outer edge) of said fuel stem.
10. The method of claim 4 wherein said step of moving said end effector to said second position includes the steps of:
monitoring said speed of said conveyer line while said end effector is mated with said fuel stem; and
matching the speed and position of the end effector with that of the fuel stem.
11. The method of claim 4 wherein said step of fueling further includes:
monitoring a force exerted on said end effector by said fuel stem; and
adjusting said position of said end effector responsive to said force.
12. The method of claim 4 wherein said step of fueling said fuel stem with said end effector includes:
opening a fuel valve in said end effector to supply fuel to said fuel stem; and,
closing said fuel valve after a predetermined amount of fuel is pumped into said fuel stem.
13. The method of claim 4 further including retracting said end effector from said fuel stem after a predetermined amount of fuel is pumped into said fuel stem.
14. The method of claim 4 further including illuminating said fuel stem.
15. A method for providing position data to a controller to enable a robotic arm controlled by said controller to move to a position of an object, said object lying within the workspace of said robotic arm, said method utilizing first and second cameras disposed at first and second coordinate systems, respectively, said method comprising:
generating a first digital image of said workspace including said object utilizing said first camera;
searching said first digital image with a first image template to determine a location in said first image where a first correlation score between that portion of said image being searched and said template is greater than a predetermined threshold;
simultaneously generating a second digital image of said workspace including said object utilizing said second camera;
searching said second digital image with a second image template to determine a location in said second image where a second correlation score between that portion of said image being searched and said template is greater than a predetermined threshold;
calculating a three-dimensional position of said object with respect to a predetermined coordinate system responsive to said first and second digital images when said first and second correlation scores are both greater than a threshold correlation score;
calculating a triangulation error of said position; and,
transferring position data indicative of said three-dimensional position to said controller when said triangulation error is less than a threshold error value.
16. The method of claim 15 wherein said object is a fuel stem and said three-dimensional position is a center point of said fuel stem.
17. The method of claim 15 wherein said step of calculating said three-dimensional position includes:
calculating a first direction vector from an origin of said first coordinate system responsive to said first digital image, said first vector pointing towards an estimated first center point of said object;
calculating a position of said first coordinate system and an orientation of said first direction vector relative to said predetermined coordinate system;
calculating a second direction vector from an origin of said second coordinate system responsive to said second digital image, said second direction vector pointing towards an estimated second center point of said object;
calculating a position of said second coordinate system and an orientation of said second direction vector relative to said predetermined coordinate system;
determining a first point along said first direction vector that is closest to said second direction vector;
determining a second point along said second direction vector that is closest to said first point; and,
calculating a midpoint between said first and second points to obtain said three-dimensional position of said object.
18. The method of claim 15 further including illuminating said object.
19. The method of claim 15 further including moving said robotic arm to said three-dimensional position.
20. A fueling system for fueling an automotive vehicle on a conveyer line, said conveyer line moving generally along a first axis, comprising:
a gantry having a carriage configured to move generally parallel to said first axis;
a robotic arm attached to said carriage that moves with said carriage, said robotic arm having an end effector configured to mate with a fuel stem on said vehicle and to supply fuel to said fuel stem through a fuel hose;
a vision system including first and second cameras for iteratively determining a three-dimensional position of said fuel stem relative to a predetermined coordinate system;
a robot controller configured to command said carriage to move proximate said three-dimensional position and to move said end effector to said position to mate with said fuel stem, said robot controller being further configured to move said robotic arm relative to said first axis at a speed substantially equal to a speed of said conveyer line.
21. The fueling system of claim 20 further including a position encoder operatively connected to said conveyer line.
22. The fueling system of claim 20 further including a light sensor for detecting when said vehicle passes a predetermined location on said conveyer line.
23. The fueling system of claim 20 further including a light for illuminating said fuel stem.
24. The fueling system of claim 20 wherein said vision system further includes a vision controller and a frame grabber, said frame grabber retrieving digital images generated by said first and second cameras, said vision controller configured to calculate said three-dimensional position of said fuel stem responsive to said digital images.
25. The fueling system of claim 20 wherein the robotic controller includes means for positioning a joint of said robotic arm;
means for calculating a desired joint position;
means for automatically positioning a stop corresponding to said desired joint position using a low powered motor and a self-locking mechanism; and
means for driving said joint against said stop using a pneumatic actuator.
26. The fueling system of claim 20 further comprising:
means for detecting loss of or damage to said fuel hose;
means for providing said fuel hose with a tip having a size configured to impair retraction through a boot of said end effector;
means for actuating said fuel hose using a pneumatic cylinder;
means for arranging a stroke of said pneumatic cylinder such that said stroke remains when said tip has bottomed in said boot of said end effector;
means for providing a first limit switch which energizes when said pneumatic cylinder has retracted to a first position corresponding to said hose tip being bottomed in said boot;
means for providing a second limit switch which energizes when said pneumatic cylinder is fully retracted to a second position beyond said first position; and
means for providing said robot controller with logic to sense that said fuel hose has been lost or damaged when said second limit switch is energized.
27. A method for providing updated kinematic parameters to a controller to enable a robotic arm controlled by said controller to compensate for dimensionally unstable portions of said robotic arm, said method comprising the steps of:
providing one or more markers on an end effector of said robotic arm;
positioning the markers within a workspace of said robotic arm;
using a vision system associated with said robotic arm to determine respective three-dimensional locations of said markers; and
calculating updated kinematic parameters by comparing the determined locations of said markers returned by said vision system to respective expected locations of said markers.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/810,778 US20030164200A1 (en) | 2001-03-16 | 2001-03-16 | Assembly line fluid filling system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/810,778 US20030164200A1 (en) | 2001-03-16 | 2001-03-16 | Assembly line fluid filling system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20030164200A1 true US20030164200A1 (en) | 2003-09-04 |
Family
ID=27805603
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/810,778 Abandoned US20030164200A1 (en) | 2001-03-16 | 2001-03-16 | Assembly line fluid filling system and method |
Country Status (1)
Country | Link |
---|---|
US (1) | US20030164200A1 (en) |
Cited By (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6907318B2 (en) * | 2003-01-09 | 2005-06-14 | Advanced Tubing Technology, Inc. | Multi-station robotic welding assembly |
US20040138782A1 (en) * | 2003-01-09 | 2004-07-15 | Passmore Michael L. | Multi-station robotic welding assembly |
US8000837B2 (en) | 2004-10-05 | 2011-08-16 | J&L Group International, Llc | Programmable load forming system, components thereof, and methods of use |
WO2007059781A1 (en) * | 2005-11-28 | 2007-05-31 | Niels Jakob Jacques Svendstorp | System for an infrastructure for hydrogen refuelling of moving vehicles |
WO2007098918A2 (en) * | 2006-03-01 | 2007-09-07 | Khs Ag | Device for introducing an inspection liquid and/or control liquid into bottles or similar receptacles |
WO2007098918A3 (en) * | 2006-03-01 | 2007-10-25 | Khs Ag | Device for introducing an inspection liquid and/or control liquid into bottles or similar receptacles |
WO2008058077A3 (en) * | 2006-11-03 | 2008-07-31 | Univ Southern California | Gantry robotics system and related material transport for contour crafting |
US20100025349A1 (en) * | 2006-11-03 | 2010-02-04 | University Of Southern California | Gantry Robotics System and Related Material Transport for Contour Crafting |
US8029710B2 (en) | 2006-11-03 | 2011-10-04 | University Of Southern California | Gantry robotics system and related material transport for contour crafting |
US8657550B2 (en) * | 2007-04-12 | 2014-02-25 | Leco Corporation | Crucible shuttle assembly with linearly moving carriage |
US20110150609A1 (en) * | 2007-04-12 | 2011-06-23 | Leco Corporation | Crucible shuttle assembly and method of operation |
US9470701B2 (en) | 2007-04-12 | 2016-10-18 | Leco Corporation | Crucible shuttle assembly |
US10901941B2 (en) | 2008-03-04 | 2021-01-26 | Vanrx Pharmasystems Inc. | Robotic filling systems and methods |
US20090223592A1 (en) * | 2008-03-04 | 2009-09-10 | Vanrx Pharmaceuticals, Inc. | Robotic filling systems and methods |
US12164465B2 (en) | 2008-03-04 | 2024-12-10 | Vanrx Pharmasystems Inc. | Robotic filling systems and methods |
US11630801B2 (en) | 2008-03-04 | 2023-04-18 | V Anrx Pharmasystems Inc. | Robotic filling systems and methods |
US10261940B2 (en) | 2008-03-04 | 2019-04-16 | Vanrx Pharmasystems, Inc. | Robotic filling systems and methods |
US9789986B2 (en) | 2009-02-26 | 2017-10-17 | Vanrx Pharmasystems Inc. | Robotic filling systems and methods |
EP2335885A1 (en) * | 2009-12-17 | 2011-06-22 | Miguel Matos Pinto Afonso Duraes | Computer vision device to automatically synchronize the speed and position of manipulator with the moving assembly line in the automotive industry |
US20130076902A1 (en) * | 2011-09-26 | 2013-03-28 | Universite Laval | Robotically operated vehicle charging station |
US9266440B2 (en) * | 2011-09-26 | 2016-02-23 | GM Global Technology Operations LLC | Robotically operated vehicle charging station |
WO2013156021A1 (en) * | 2012-04-20 | 2013-10-24 | Dürr Somac GmbH | Device for handling hose assemblies on filling systems of assembly lines |
DE102012008378B4 (en) * | 2012-04-20 | 2019-08-29 | Dürr Somac GmbH | Device for handling hose packages at filling plants |
US10343892B2 (en) * | 2014-03-29 | 2019-07-09 | Dürr Somac GmbH | Filling adapter (aeration line) |
US20170129765A1 (en) * | 2014-03-29 | 2017-05-11 | Dürr Somac GmbH | Filling adapter (aeration line) |
US10546167B2 (en) * | 2014-11-10 | 2020-01-28 | Faro Technologies, Inc. | System and method of operating a manufacturing cell |
WO2017105239A1 (en) * | 2015-12-18 | 2017-06-22 | Wasmunt B.V. | Petrol station and method for refuelling vehicles |
US20180361999A1 (en) * | 2015-12-18 | 2018-12-20 | Wasmunt B.V. | Petrol Station and Method for Refuelling Vehicles |
US11137500B2 (en) | 2016-03-24 | 2021-10-05 | Focal Point Positioning Limited | Method, apparatus, computer program, chip set, or data structure for correlating a digital signal and a correlation code |
US20170279486A1 (en) * | 2016-03-24 | 2017-09-28 | Focal Point Positioning Ltd. | Method, apparatus, computer program, chip set, or data structure for correlating a digital signal and a correlation code |
US12135380B2 (en) | 2016-03-24 | 2024-11-05 | Focal Point Positioning Limited | Method and system for calibrating a system parameter |
US12270920B2 (en) | 2016-03-24 | 2025-04-08 | Focal Point Positioning Limited | Method and system for calibrating a system parameter |
US9780829B1 (en) * | 2016-03-24 | 2017-10-03 | Focal Point Positioning Ltd. | Method, apparatus, computer program, chip set, or data structure for correlating a digital signal and a correlation code |
US10604404B2 (en) * | 2016-06-10 | 2020-03-31 | Jin Sung Eng Co., Ltd | Liquid injection system for vehicle and method for controlling the same |
US20170355590A1 (en) * | 2016-06-10 | 2017-12-14 | Jin Sung Eng Co., Ltd | Liquid injection system for vehicle and method for controlling the same |
US10882731B2 (en) * | 2016-08-19 | 2021-01-05 | Dürr Somac GmbH | Handling system for a filling system for filling containers and circuits of vehicles with different operating materials on assembly lines of the automobile industry |
US20190210863A1 (en) * | 2016-08-19 | 2019-07-11 | Dürr Somac GmbH | Handling system for a filling system for filling containers and circuits of vehicles with different operating materials on assembly lines of the automobile industry |
WO2018033167A1 (en) | 2016-08-19 | 2018-02-22 | Dürr Somac GmbH | Handling system for a filling system for filling containers and circuits of vehicles with different operating materials on assembly lines of the automobile industry |
DE102016010179A1 (en) | 2016-08-19 | 2018-02-22 | Dürr Somac GmbH | Handling system for a filling system |
DE102016010179B4 (en) | 2016-08-19 | 2023-07-06 | Dürr Somac GmbH | Handling system for a filling system |
WO2018177450A1 (en) | 2017-03-28 | 2018-10-04 | Dürr Somac GmbH | Apparatus for filling vehicles on assembly lines in the automobile industry |
DE202017001707U1 (en) | 2017-03-28 | 2018-07-02 | Dürr Somac GmbH | Device for fully automatic filling of vehicle systems on assembly lines of the automotive industry |
US10981775B2 (en) | 2017-03-28 | 2021-04-20 | Dürr Somac GmbH | Apparatus for filling vehicles on assembly lines in the automobile industry |
US20190016582A1 (en) * | 2017-07-11 | 2019-01-17 | Hyundai Motor Company | Automatic liquid injection system |
US11982753B2 (en) | 2017-09-26 | 2024-05-14 | Focal Point Positioning Limited | Method and system for calibrating a system parameter |
CN108128747A (en) * | 2017-12-22 | 2018-06-08 | 河南理工大学 | Oiling robot and gas station |
US11865694B2 (en) | 2019-02-25 | 2024-01-09 | Dürr Somac GmbH | Rotary driving tool for handling closure elements |
DE102019001407A1 (en) * | 2019-02-25 | 2020-08-27 | Dürr Somac GmbH | Screwing tool for handling closure elements on vehicle containers for holding operating materials |
WO2020173518A1 (en) | 2019-02-25 | 2020-09-03 | Dürr Somac GmbH | Screwing tool for handling closure elements on vehicle tanks for holding fuels |
DE102019206491A1 (en) * | 2019-05-06 | 2020-11-12 | Volkswagen Aktiengesellschaft | Manufacturing plant for motor vehicles and manufacturing cell for this manufacturing plant |
DE102019206491B4 (en) * | 2019-05-06 | 2024-11-14 | Volkswagen Aktiengesellschaft | Manufacturing plant for motor vehicles and manufacturing cell for this manufacturing plant |
US11608261B2 (en) * | 2019-07-29 | 2023-03-21 | RPM Industries, LLC | Robotic servicing system |
US20230278853A1 (en) * | 2019-07-29 | 2023-09-07 | RPM Industries, LLC | Robotic servicing system |
WO2022083814A1 (en) | 2020-10-20 | 2022-04-28 | Dürr Somach Gmbh | Quick-change device for a filling adapter |
DE102020006487A1 (en) | 2020-10-20 | 2022-04-21 | Dürr Somac GmbH | Quick-change device for a filling adapter |
US11660646B2 (en) * | 2020-11-03 | 2023-05-30 | Oilmen's Truck Tanks, Inc. | Fluid delivery system and method |
US20220135394A1 (en) * | 2020-11-03 | 2022-05-05 | Oilmen's Truck Tanks, Inc. | Fluid delivery system and method |
WO2022207662A1 (en) * | 2021-03-30 | 2022-10-06 | Autofuel Aps | Fuel dispenser adaptor for automatic refuelling |
WO2023117129A1 (en) | 2021-12-22 | 2023-06-29 | Dürr Somac GmbH | Apparatus and method for filling vehicles with operating fluids on assembly lines of the automotive industry |
DE102021006518A1 (en) | 2021-12-22 | 2023-06-22 | Dürr Somac GmbH | Device and method for filling vehicles with operating materials on assembly lines in the automotive industry |
WO2024114937A1 (en) | 2022-11-29 | 2024-06-06 | Dürr Somac GmbH | Structural unit for configuring an interface for a filling adapter for filling vehicles |
DE102022004464A1 (en) | 2022-11-29 | 2024-05-29 | Dürr Somac GmbH | Assembly unit for designing an interface for a filling adapter for filling vehicles |
US20240190595A1 (en) * | 2022-12-08 | 2024-06-13 | as Strömungstechnik GmbH | Dispensing System and Method of Operating a Dispensing System |
EP4442638A1 (en) | 2023-04-04 | 2024-10-09 | Dürr Somac GmbH | Method and device for filling containers and circuits of vehicles with liquid operating substances |
DE102023001333A1 (en) | 2023-04-04 | 2024-10-10 | Dürr Somac GmbH | Method and device for filling containers and circuits of vehicles with liquid operating materials |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20030164200A1 (en) | Assembly line fluid filling system and method | |
US11597091B2 (en) | Robotic target alignment for vehicle sensor calibration | |
EP3549874B1 (en) | Mobile visual-inspection system | |
US10596700B2 (en) | System and calibration, registration, and training methods | |
US10814480B2 (en) | Stabilization of tool-carrying end of extended-reach arm of automated apparatus | |
US11835646B2 (en) | Target alignment for vehicle sensor calibration | |
US10625427B2 (en) | Method for controlling location of end effector of robot using location alignment feedback | |
JP6855492B2 (en) | Robot system, robot system control device, and robot system control method | |
US8274648B2 (en) | Device and a method for checking an attitude of a vehicle | |
KR101023275B1 (en) | Method and apparatus for calibrating a vehicle camera system, a method for determining angular misalignment of a vehicle camera system, and an electronic control unit performing the same | |
CN115702322A (en) | Apparatus and method for calibrating and aligning automotive sensors | |
EP1076221A2 (en) | A robot with gauging system for determining three-dimensional measurement data | |
CN109764805B (en) | Mechanical arm positioning device and method based on laser scanning | |
CN104703762A (en) | System and method for camera-based auto-alignment | |
CN111502364B (en) | Device and method for automatically lifting vehicle | |
CN108767933A (en) | A kind of control method and its device, storage medium and charging equipment for charging | |
WO2020141455A1 (en) | Robotic target alignment for vehicle sensor calibration | |
AU2021328456A1 (en) | Vehicular floor target alignment for sensor calibration | |
US20200096316A1 (en) | Three-dimensional measuring device and method | |
CN115112018A (en) | Three-coordinate machine intelligent composite joint inspection system and method based on stereoscopic vision | |
D'Orazio et al. | Mobile robot position determination using visual landmarks | |
Nygards et al. | Docking to pallets with feedback from a sheet-of-light range camera | |
Bucher et al. | Development and evaluation of an automatic connection device for electric cars with four DOFs and a control scheme based on infrared markers | |
JP2002144266A (en) | Working system | |
WO2022070162A1 (en) | Target alignment for vehicle sensor calibration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AMERICAN CONTROLS, INC., MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DALE, JAMES;CORKILL, DEAN;BESLER, DAVID;AND OTHERS;REEL/FRAME:011683/0404;SIGNING DATES FROM 20010307 TO 20010313 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |