WO2025007230A1 - 机器人位置处理的方法及装置、电子设备和存储介质 - Google Patents
- Publication number
- WO2025007230A1 (PCT/CN2023/105472)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- controlled end
- target
- contact force
- force
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
Definitions
- the present disclosure relates to the field of data processing technology, and in particular to a method and device for robot position processing, an electronic device, and a storage medium.
- Robot-assisted ultrasound examination refers to ultrasound examination performed remotely by a robot.
- the robot remote ultrasound diagnosis system includes two subsystems: the doctor side and the patient side.
- Professional ultrasound doctors can remotely control the patient side ultrasound robot by operating the doctor side device to perform ultrasound examinations on patients.
- the robot remote ultrasound diagnosis system has gradually become popular.
- when using the robot remote ultrasound diagnosis system to examine a person, the movement of its ultrasound probe lags as the probe moves over the human body.
- the present disclosure provides a method and device for robot position processing, an electronic device and a storage medium.
- the main purpose is to make the ultrasonic probe of the robot remote ultrasonic diagnosis system move more smoothly and fit the human body curve when moving on the human body, thereby improving the safety of ultrasonic examination based on the robot remote ultrasonic diagnosis system.
- a method for processing a robot position comprising:
- the moving distance and direction of the controlled end are controlled according to the position change.
- the displacement information and the target contact force are processed based on the first controller to obtain the position change.
- the processing of the displacement information and the target contact force based on the first controller to obtain the position change comprises:
- a first component of the position vector is obtained, where the first component is a vector perpendicular to the direction of the normal vector;
- the position change of the controlled end is calculated based on the first controller, the direction of the position change is the same as the direction of the first component, and the magnitude of the position change is the same as the magnitude of the position vector.
- controlling a sensor based on the first controller to obtain the target contact force between the controlled end and the target object includes:
- the target contact force between the controlled end and the target object is obtained, the target contact force being a force of preset magnitude and direction obtained by the preset force sensor.
- the obtaining of a normal vector of a contact surface between the controlled end and the target object by calculating based on the first controller using the end posture and the target contact force comprises:
- the target contact force vector is normalized based on the first controller to obtain a normal vector of the contact surface between the controlled end and the target object.
- the calculating based on the first controller using the normal vector and the position vector to obtain the first component of the position vector includes:
- the first controller performs a vector subtraction operation using the position vector and the second component to obtain a first component.
- the actual contact force is defined as a force of equal magnitude and opposite direction to the target contact force on the sensor z-axis;
- when increasing or decreasing the pressure of the controlled end on the target object, the method further includes:
- the pressure of the controlled end on the target object is increased or decreased according to the change amount.
- the analyzing of the first target deviation based on the second controller to obtain the change in the pressure increase or decrease of the controlled end on the target object includes:
- the second controller uses the differential gain and the target deviation change to perform a product calculation to obtain a second calculation result, wherein the target deviation change is the difference between the first target deviation and the target deviation at a previous moment, and the target deviation is the difference between the reference force and the actual contact force;
- the first calculation result and the second calculation result are summed up based on the second controller to obtain the change amount.
- a method for processing a robot position comprising:
- a control instruction is sent to the controlled terminal to control the controlled terminal to start the detection task, and receive the response information and image information of the controlled terminal;
- the displacement information is sent to the controlled end so that the controlled end moves according to the displacement information, and the response information and the image information of the controlled end are received in real time.
- the method further includes:
- the target object is the detection object of the controlled end.
- control instructions also include ultrasound control instructions and camera control instructions.
- a device for processing a robot position comprising:
- a receiving unit used to receive the displacement information sent by the master end, and collect the target contact force between the controlled end and the target object, wherein the target object is the detection object of the controlled end;
- a processing unit used for processing the displacement information and the target contact force to obtain a position change of the controlled end
- the first control unit is used to control the moving distance and direction of the controlled end according to the position change.
- processing unit is further used for:
- the displacement information and the target contact force are processed based on the first controller to obtain the position change.
- the processing unit includes:
- an acquisition module configured to respond to a control instruction of a master end, define the end posture of the controlled end according to the control instruction based on the first controller, and acquire the target contact force between the controlled end and a target object by controlling a sensor based on the first controller;
- a first calculation module configured to calculate based on the first controller using the end posture and the target contact force to obtain a normal vector of a contact surface between the controlled end and the target object;
- a generating module configured to generate a position vector in the contact surface of the target object based on the displacement information based on the first controller, and to calculate based on the first controller using the normal vector and the position vector to obtain a first component of the position vector, wherein the first component is a vector perpendicular to a direction of the normal vector;
- the second calculation module is used to calculate the position change of the controlled end based on the first controller by setting the modulus of the first component to be the same as the modulus of the position vector, the direction of the position change is the same as the direction of the first component, and the magnitude of the position change is the same as the magnitude of the position vector.
- the first calculation module is further used for:
- the target contact force between the controlled end and the target object is obtained, the target contact force being a force of preset magnitude and direction obtained by the preset force sensor.
- the generating module is further used for:
- the target contact force vector is normalized based on the first controller to obtain a normal vector of the contact surface between the controlled end and the target object.
- the generating module is further used for:
- the first controller performs a vector subtraction operation using the position vector and the second component to obtain a first component.
- the device further comprises:
- a definition unit used to obtain the target contact force of the sensor z-axis, and define the actual contact force as a force with the same magnitude and opposite direction as the target contact force of the sensor z-axis;
- a comparison unit configured to receive the reference force sent by the main control end, and compare the actual contact force with the reference force
- an increasing unit configured to increase the pressure of the controlled end on the target object when it is determined that the actual contact force is less than the reference force
- a reducing unit is used to reduce the pressure of the controlled end on the target object when it is determined that the actual contact force is greater than the reference force.
- the device further comprises:
- a comparison unit configured to receive the reference force sent by the main control end, compare the actual contact force with the reference force, and obtain a first target deviation
- an analyzing unit configured to analyze the first target deviation based on the second controller to obtain a change in the pressure increase or decrease of the controlled end on the target object
- the second control unit is used to control the increase or decrease of the pressure of the controlled end on the target object according to the change amount.
- the analysis unit includes:
- a setting module configured to set a proportional gain and a differential gain, and obtain a first calculation result by performing a product calculation based on the second controller using the proportional gain and the first target deviation;
- a first calculation module for performing a product calculation based on the second controller using the differential gain and the target deviation change to obtain a second calculation result, wherein the target deviation change is the difference between the first target deviation and the target deviation at the previous moment, and the target deviation is the difference between the reference force and the actual contact force;
- the second calculation module is used to sum the first calculation result and the second calculation result based on the second controller to obtain the change amount.
- a device for processing a robot position comprising:
- a transceiver unit configured to send a control instruction to the controlled end based on the connection between the master end and the controlled end, control the controlled end to start a detection task, and receive response information and image information from the controlled end;
- the displacement information is sent to the controlled end so that the controlled end moves according to the displacement information, and the response information and the image information of the controlled end are received in real time.
- the transceiver unit is further used for:
- the reference force is sent to the controlled end so that the controlled end controls the actual contact force between its end and the target object according to the reference force, and receives feedback information from the controlled end, wherein the target object is the detection object of the controlled end.
- control instructions also include ultrasound control instructions and camera control instructions.
- an electronic device including:
- the memory stores instructions that can be executed by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method described in the first aspect or the second aspect.
- a non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are used to enable the computer to execute the method described in the first aspect or the second aspect.
- a computer program product comprising a computer program, wherein when the computer program is executed by a processor, the computer program implements the method as described in the first aspect or the second aspect.
- the present disclosure provides a method, device, electronic device and storage medium for robot position processing.
- the main technical solutions include: receiving displacement information sent by the main control end, and collecting the target contact force between the controlled end and the target object, the target object being the detection object of the controlled end; processing the displacement information and the target contact force to obtain the position change of the controlled end; and controlling the distance and direction of movement of the controlled end according to the position change.
- the position change of the controlled end is obtained after processing the displacement information and the target contact force.
- the controlled end is controlled to move along the contact surface between it and the target object based on the position change, and its displacement distance is controlled based on the magnitude of the position change.
- FIG. 1 is a flow chart of a method for processing a robot position provided by an embodiment of the present disclosure.
- FIG2 is a schematic flow chart of another method for processing robot positions provided by an embodiment of the present disclosure.
- FIG3 is a schematic diagram of a controlled terminal structure provided by an embodiment of the present disclosure.
- FIG4 is a schematic diagram of a hardware connection structure of a controlled terminal provided by an embodiment of the present disclosure.
- FIG. 5 is a schematic diagram of the forces acting on a controlled terminal and a target object provided by an embodiment of the present disclosure.
- FIG6 is a schematic diagram of a robot force control process provided by an embodiment of the present disclosure.
- FIG7 is a schematic diagram of a controlled terminal moving on a target object based on a method for robot position processing provided by an embodiment of the present disclosure
- FIG8 is a schematic diagram of a main control terminal structure provided by an embodiment of the present disclosure.
- FIG9 is a schematic diagram of a hardware connection structure of a master control end provided in an embodiment of the present disclosure.
- FIG10 is a schematic flow chart of another method for robot position processing provided by an embodiment of the present disclosure.
- FIG11 is a schematic diagram of a hardware connection between a master terminal and a controlled terminal provided by an embodiment of the present disclosure
- FIG12 is a schematic diagram of the structure of a device for processing a robot position provided by an embodiment of the present disclosure
- FIG13 is a schematic diagram of the structure of another device for processing the position of a robot provided by an embodiment of the present disclosure
- FIG14 is a schematic diagram of the structure of another device for processing the position of a robot provided in an embodiment of the present disclosure
- FIG. 15 is a schematic block diagram of an example electronic device 500 provided according to an embodiment of the present disclosure.
- FIG1 is a schematic flow chart of a method for robot position processing provided in an embodiment of the present disclosure.
- the method is applied to the controlled end, as shown in FIG1 , and the method comprises the following steps:
- Step 101 receiving displacement information sent by the master end, and collecting the target contact force between the controlled end and the target object, wherein the target object is the detection object of the controlled end.
- the target contact force is the force between the controlled end and the target object when the two are in contact.
- the displacement information is the control information sent by the main control end to control the movement of the controlled end.
- the target object is the detection object of the controlled end. For example, when the controlled end performs an ultrasonic examination on a person to be examined, the person to be examined is the target object.
- Step 102 Process the displacement information and the target contact force to obtain a position change of the controlled end.
- the position change of the controlled end is obtained by processing the displacement information and the target contact force.
- the position change includes size and direction.
- Step 103 Control the moving distance and direction of the controlled terminal according to the position change.
- the distance and direction of movement of the controlled end are controlled according to the position change, that is, the magnitude of the position change is the distance moved by the controlled end, and the direction of the position change is the direction of movement of the controlled end.
- the method for robot position processing mainly includes: receiving displacement information sent by the master control end, and collecting the target contact force between the controlled end and the target object, the target object being the detection object of the controlled end; processing the displacement information and the target contact force to obtain the position change of the controlled end; and controlling the distance and direction of movement of the controlled end according to the position change.
- the position change of the controlled end is obtained, so that the controlled end can be controlled to move along the contact surface between it and the target object based on the position change, and its displacement distance can be controlled based on the magnitude of the position change. By moving along the contact surface, the controlled end moves smoothly on the target object and follows the target object's curve more closely, thereby improving safety when the controlled end inspects the target object.
- when performing step 102, the processing of the displacement information and the target contact force can be implemented in, but is not limited to, the following manner: based on the first controller, the displacement information and the target contact force are processed to obtain the position change.
- the displacement information and the target contact force are processed by the first controller to obtain the position change to control the movement of the controlled end.
- FIG2 is a flow chart of another method for processing the position of a robot provided by an embodiment of the present disclosure. As shown in FIG2 , the method comprises the following steps:
- Step 201 in response to a control instruction from a master end, the first controller defines the terminal posture of the controlled end according to the control instruction, and the first controller controls a sensor to obtain the target contact force between the controlled end and a target object.
- the end posture of the controlled end is defined, and the end posture is a posture matrix. It should be understood that the present disclosure does not limit the end posture to the form of a posture matrix.
- the target contact force between the controlled end and the target object is obtained based on a force sensor.
- the force sensor can be but is not limited to a six-axis force sensor. In order to more clearly illustrate the above-mentioned target contact force acquisition scenario, an exemplary explanation is provided here.
- the six-axis force sensor mounted above the ultrasonic probe detects the target contact force generated during ultrasonic detection.
- the target contact force includes forces of different axes obtained by the six-axis force sensor, for example, x-axis force, y-axis force, and z-axis force.
- Step 202 based on the first controller, using the end posture and the target contact force to perform calculations to obtain a normal vector of a contact surface between the controlled end and the target object;
- the calculation is performed based on the end posture and the target contact force, that is, the calculation is performed based on the end posture and the force of each axis obtained by the force sensor to obtain the normal vector of the contact surface between the controlled end and the target object, and the normal vector of the contact surface of the target object is used to analyze subsequent parameters.
- the force of each axis of the force sensor is mentioned in the exemplary description of step 201 above, that is, the x-axis force, y-axis force, and z-axis force obtained by the six-axis force sensor.
- Step 203 generating a position vector in the contact surface of the target object based on the first controller according to the displacement information, and performing calculation based on the first controller using the normal vector and the position vector to obtain a first component of the position vector, where the first component is a vector perpendicular to the direction of the normal vector;
- the normal vector and the position vector are used to perform calculation to obtain the first component.
- the position vector is generated from the displacement information sent by the master control end, for example, the displacement information at a first moment and the displacement information at a second moment. In this embodiment, the time interval between the second moment and the first moment is 8 ms, that is, the displacement information is collected with a period of 8 ms; in an actual scenario, this embodiment does not limit the interval to 8 ms.
- Step 204 by setting the modulus of the first component to be the same as the modulus of the position vector, the position change of the controlled end is calculated based on the first controller, the direction of the position change is the same as the direction of the first component, and the magnitude of the position change is the same as the magnitude of the position vector.
- the modulus of the first component is set to be the same as the modulus of the position vector, and the position change of the controlled end is calculated under the above conditions.
- the position change is the size and direction of the displacement of the controlled end, wherein the direction of the position change is the same as the direction of the first component, and its size is the same as the size of the position vector.
- when executing step 201 to obtain the target contact force between the controlled end and the target object based on the first controller controlling the sensor, the following implementation may be adopted, but is not limited to: based on the first controller, a preset force sensor is controlled to obtain the target contact force between the controlled end and the target object, the target contact force being a force of preset magnitude and direction obtained by the preset force sensor.
- when performing step 202 to calculate based on the first controller using the end posture and the target contact force to obtain the normal vector of the contact surface between the controlled end and the target object, the following implementation may be adopted, but is not limited to: based on the first controller, the end posture and the target contact force are matrix-calculated to obtain a target contact force vector; the target contact force vector is then normalized based on the first controller to obtain the normal vector of the contact surface between the controlled end and the target object.
- FIG5 is a schematic diagram of the forces acting on a controlled end and a target object provided by an embodiment of the present disclosure, as shown in FIG5 , which includes: a position vector deltaP, and a resultant force F obtained by a preset force sensor.
- the information of the force sensor is used to estimate the degree of inclination of the surface of the target object.
- the target contact force includes forces in three different directions, namely Fx_contact, Fy_contact, and Fz_contact.
- the target contact force vector is V_F, where V_F is the resultant vector of the target contact force components along the three unit column vectors Vx, Vy, and Vz.
- V_F = Fx_contact·Vx + Fy_contact·Vy + Fz_contact·Vz    formula (1)
- the target contact force vector V_F is normalized to obtain the normal vector of the contact surface between the controlled end and the target object, wherein the normalization process is shown in formula (2):
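The normal-vector computation of formulas (1) and (2) can be sketched as follows. This is a minimal NumPy illustration, not part of the disclosure; it assumes the end posture is a 3×3 rotation matrix whose columns are the unit vectors Vx, Vy, Vz.

```python
import numpy as np

def contact_normal(force_xyz, R):
    """Estimate the contact-surface normal vector from the force sensor.

    force_xyz -- (Fx_contact, Fy_contact, Fz_contact) from the force sensor
    R         -- 3x3 end-posture matrix; its columns are assumed to be the
                 unit column vectors Vx, Vy, Vz expressed in the base frame
    """
    Fx, Fy, Fz = force_xyz
    Vx, Vy, Vz = R[:, 0], R[:, 1], R[:, 2]
    V_F = Fx * Vx + Fy * Vy + Fz * Vz      # formula (1): resultant force vector
    return V_F / np.linalg.norm(V_F)       # formula (2): normalize to a unit normal
```

With an identity posture matrix, a pure z-axis contact force of -5 N yields the unit normal (0, 0, -1), pointing along the direction of the measured force.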
- the following implementation methods may be used but are not limited to, for example: based on the first controller, the normal vector and the position vector are used to perform dot product calculations to obtain the second component of the position vector, where the second component is a vector in the direction of the normal vector; based on the first controller, the position vector and the second component are used to perform vector subtraction operations to obtain the first component.
- the position vector is calculated from the displacement information at different times.
- the position vector can be obtained by the following formula, for example:
- deltaP.x, deltaP.y, deltaP.z are the coordinates of the position vector deltaP in the x-axis, y-axis, and z-axis of the coordinate system.
- x1 , y1 , z1 are the coordinates of the displacement information in the x-axis, y-axis, and z-axis of the coordinate system at the first moment
- x2 , y2 , z2 are the coordinates of the displacement information in the x-axis, y-axis, and z-axis of the coordinate system at the second moment.
- deltaP1 = deltaP − deltaP2    formula (5)
- deltaP(0), deltaP(1), deltaP(2) respectively represent the components of the position vector deltaP on each coordinate axis
- deltaP1(0), deltaP1(1), deltaP1(2) respectively represent the components of the first component deltaP1 on each coordinate axis.
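The decomposition in formulas (3) to (5) and the scaling of step 204 can be sketched as below. This is an illustrative NumPy version, not the disclosure's implementation; it assumes, as step 204 states, that the position change moveP keeps the direction of deltaP1 while taking the magnitude of deltaP.

```python
import numpy as np

def position_change(p1, p2, normal):
    """Project the commanded displacement onto the contact surface.

    p1, p2 -- displacement information at the first and second moments
    normal -- unit normal vector of the contact surface
    """
    n = np.asarray(normal, float)
    deltaP = np.asarray(p2, float) - np.asarray(p1, float)  # formulas (3)-(4)
    deltaP2 = np.dot(deltaP, n) * n      # second component, along the normal
    deltaP1 = deltaP - deltaP2           # formula (5): first (tangential) component
    # moveP: same direction as deltaP1, same magnitude as deltaP
    return deltaP1 * (np.linalg.norm(deltaP) / np.linalg.norm(deltaP1))
```

For a commanded diagonal displacement (1, 0, 1) against a surface with normal (0, 0, 1), the tangential direction is (1, 0, 0) and the result has magnitude √2, so the probe slides along the surface instead of pressing into it.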
- the method can also adopt but is not limited to the following implementation methods, for example: obtaining the target contact force of the sensor z-axis, defining the actual contact force as a force with the same magnitude and opposite direction as the target contact force of the sensor z-axis; receiving the reference force sent by the main control end, and comparing the actual contact force with the reference force; when it is determined that the actual contact force is less than the reference force, increasing the pressure of the controlled end on the target object; when it is determined that the actual contact force is greater than the reference force, reducing the pressure of the controlled end on the target object.
- the actual contact force is compared with the reference force sent by the master end to achieve force control of the controlled end.
- the reference force is reference information of the force between the controlled end and the target object.
- the strength of the controlled end's contact with the target object is controlled based on the reference force. That is, when the force between the controlled end and the target object is too large, the force is reduced by controlling the controlled end to move away from the target object; when the force is too small, the controlled end is too far from the target object, so it is controlled to move closer to the target object to ensure detection accuracy.
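The comparison logic above reduces to a simple decision rule. The sketch below is illustrative only; the `tol` dead band is an assumption for stability around the reference force and does not appear in the disclosure.

```python
def adjust_pressure(actual_force, reference_force, tol=0.0):
    """Decide how the controlled end's pressure on the target should change.

    Returns "increase" when the actual contact force is below the reference
    force, "decrease" when above it, and "hold" otherwise.  `tol` is an
    assumed dead band, not part of the disclosure.
    """
    if actual_force < reference_force - tol:
        return "increase"
    if actual_force > reference_force + tol:
        return "decrease"
    return "hold"
```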
- when increasing or decreasing the pressure of the controlled end on the target object, the method can also adopt, but is not limited to, the following implementation: receiving the reference force sent by the main control end, comparing the actual contact force with the reference force, and obtaining a first target deviation; analyzing the first target deviation based on the second controller to obtain the change in the increase or decrease of the pressure of the controlled end on the target object; and controlling the increase or decrease of the pressure of the controlled end on the target object according to the change.
- the second controller analyzes the first target deviation to obtain the change, and then controls the increase or decrease in the pressure of the controlled end on the target object according to the change.
- the method can also adopt but is not limited to the following implementation methods, for example: setting a proportional gain and a differential gain, and obtaining a first calculation result by multiplying the proportional gain and the first target deviation based on the second controller; and obtaining a second calculation result by multiplying the differential gain and the target deviation change based on the second controller, wherein the target deviation change is the difference between the first target deviation and the target deviation at the previous moment, and the target deviation is the difference between the reference force and the actual contact force; and obtaining the change by summing the first calculation result and the second calculation result based on the second controller.
- Ef is the error
- ΔEf is the error differential
- Ef = F_desire − F_true
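The proportional-differential computation above (first calculation result Kp·Ef, second calculation result Kd·ΔEf, summed to give the change amount) can be sketched as follows; the gain values in the usage note are placeholders, not taken from the disclosure.

```python
class PDForceController:
    """Proportional-differential (second) controller for contact force.

    update() returns the change amount Kp * Ef + Kd * dEf, where
    Ef = F_desire - F_true and dEf is the difference between the
    current error and the error at the previous moment.
    """

    def __init__(self, Kp, Kd):
        self.Kp = Kp                 # proportional gain
        self.Kd = Kd                 # differential gain
        self.prev_error = 0.0

    def update(self, F_desire, F_true):
        Ef = F_desire - F_true       # error between reference and actual force
        dEf = Ef - self.prev_error   # error differential between two moments
        self.prev_error = Ef
        return self.Kp * Ef + self.Kd * dEf
```

With placeholder gains Kp = 1.0 and Kd = 0.5, a constant 1 N error yields a change amount of 1.5 on the first step (when the differential term is active) and 1.0 on subsequent steps.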
- FIG6 is a schematic diagram of a robot force control process provided in an embodiment of the present disclosure; as shown in FIG6, it mainly includes a proportional-differential controller, a robot system, and an acquisition unit for force control analysis.
- FIG7 is a schematic diagram of a controlled end moving on a target object based on a method for robot position processing provided by an embodiment of the present disclosure. As shown in FIG7 , the controlled end moves closely along the surface curve of the target object.
- FIG8 is a schematic diagram of a main control terminal structure provided by an embodiment of the present disclosure
- FIG9 is a schematic diagram of a hardware connection structure of the main control terminal provided by an embodiment of the present disclosure
- the main control terminal is composed of a host, a robot console, dual displays, an ultrasound control panel, a camera, a voice pickup, a speaker, a camera control joystick and other modules.
- the doctor can manipulate the robot console to control the remote manipulator.
- the console consists of a contouring probe, a posture sensor, a position sensor, and a pressure sensor, and has six degrees of freedom in total: the posture sensor provides 3 rotational degrees of freedom, the position sensor provides 2 degrees of freedom of horizontal motion, and the "UP key" together with the pressure sensor corresponds to 1 degree of freedom of up-and-down motion.
- FIG10 is a flow chart of another method for processing the position of a robot provided by an embodiment of the present disclosure. As shown in FIG10 , the method is applied to a main control end, and includes:
- Step 301: based on the connection between the master terminal and the controlled terminal, send a control instruction to the controlled terminal to control the controlled terminal to start a detection task, and receive response information and image information of the controlled terminal.
- the master control end and the controlled end are connected, and based on the connection relationship, the master control end sends a control instruction to the controlled end to control the controlled end to start a detection task, and the detection task includes but is not limited to ultrasonic inspection.
- after receiving the control instruction sent by the master control end, the controlled end responds to the control instruction and replies with a response message to the master control end. After collecting the image information of the ultrasonic detection, the controlled end transmits the image information to the master control end; that is, the master control end receives the image information.
- the above description of data interaction between the master control end and the controlled end is merely exemplary, and this embodiment does not limit the content and method of data interaction.
- Step 302: send displacement information to the controlled end so that the controlled end moves according to the displacement information, and receive the response information and the image information of the controlled end in real time.
- displacement information is sent to the controlled end, the controlled end moves according to the displacement information, and the response information and the image information of the controlled end are received in real time. In a typical scenario, the console of the main control end is operated to control the movement of the robotic arm of the controlled end to perform ultrasonic detection, and the controlled end transmits the ultrasound images detected while the robotic arm moves to the main control end in real time.
- the method applied to the master control end also includes but is not limited to the following contents, for example: sending a reference force to the controlled end so that the controlled end controls the actual contact force between its end and the target object according to the reference force, and receives feedback information from the controlled end, wherein the target object is the detection object of the controlled end.
- in order to adjust the contact force between the controlled end and the target object, the main control end sends a reference force to the controlled end so that the controlled end controls the actual contact force between its end and the target object according to the reference force, and the controlled end feeds back the pressure information between it and the target object to the main control end in real time, the pressure information being part of the feedback information.
- This embodiment does not limit the content of the feedback information.
- control instructions also include ultrasound control instructions and camera control instructions.
- the ultrasonic detection of the controlled end is controlled by the ultrasonic control instruction, and the camera of the controlled end is controlled by the camera control instruction.
- FIG11 is a schematic diagram of the hardware connection between the master end and the controlled end provided in this embodiment.
- the master end and the controlled end both include cameras and voice pickups, and the audio and video can be collected by the pickups and the main/secondary cameras.
- the controlled end host can also collect ultrasonic images of the ultrasonic instrument through a video capture card. Through audio and video transmission technology, the collected audio and video and ultrasonic images will be sent to the opposite end through the network, so that remote audio and video communication can be realized.
- the master end can collect robot control instructions through the operator system, collect ultrasonic control instructions through the ultrasonic control panel, and collect camera control instructions through the camera control rocker.
- the controlled end host can receive robot control instructions and camera control instructions from the master end through the network.
- the controlled end processor processes the robot control instructions sent by the master end and sends them to the robotic arm system, which can perform real-time control of position, posture and force.
- Ultrasonic control instructions and main camera control instructions will also be sent to the ultrasonic host and main camera in real time through the controlled end processor to achieve remote control of ultrasound and the camera.
- this embodiment can achieve the following effects:
- the displacement distance of the controlled end terminal is controlled based on the size of the position change, and the controlled end terminal is controlled to move along the contact surface between the controlled end terminal and the target object, so as to achieve smooth movement of the controlled end terminal on the target object, and make the movement of the controlled end terminal more closely fit the curve of the target object, thereby improving the safety of the controlled end terminal when inspecting the target object.
- the position change of the controlled end is calculated based on the first component, and the direction of the position change is the same as the direction of the first component, so that when the controlled end moves, its end is controlled, based on the position change, to move along the contact surface between it and the target object.
- the present disclosure also proposes a device for processing robot position. Since the device embodiment of the present disclosure corresponds to the above-mentioned method embodiment, details not disclosed in the device embodiment can be referred to the above-mentioned method embodiment, and will not be repeated in the present disclosure.
- FIG. 12 is a schematic diagram of the structure of a device for processing a robot position provided by an embodiment of the present disclosure. As shown in FIG. 12 , the device is applied to a controlled end and includes:
- the receiving unit 41 is used to receive the displacement information sent by the master end, and collect the target contact force between the controlled end and the target object, wherein the target object is the detection object of the controlled end;
- a processing unit 42 configured to process the displacement information and the target contact force to obtain a position change of the controlled end
- the first control unit 43 is used to control the moving distance and direction of the controlled end according to the position change amount.
- the present disclosure provides a device for robot position processing, the main technical scheme of which includes: receiving displacement information sent by the main control end, and collecting the target contact force between the controlled end and the target object, the target object being the detection object of the controlled end; processing the displacement information and the target contact force to obtain the position change of the controlled end; and controlling the distance and direction of movement of the controlled end according to the position change.
- the position change of the controlled end is obtained, so that the end of the controlled end is controlled, based on the position change, to move along its contact surface with the target object, and the displacement distance of the end is controlled based on the magnitude of the position change. By controlling the end of the controlled end to move along its contact surface with the target object, smooth movement of the end on the target object is achieved, its movement fits the curve of the target object more closely, and the safety of the end of the controlled end when inspecting the target object is improved.
- FIG. 13 is a schematic diagram of the structure of another device for processing the position of a robot provided in an embodiment of the present disclosure. As shown in FIG. 13 , the device is applied to a controlled end, and the processing unit 42 is further used for:
- the displacement information and the target contact force are processed based on the first controller to obtain the position change.
- the processing unit 42 includes:
- the acquisition module 421 is used to respond to the control instruction of the master end, define the end posture of the controlled end according to the control instruction based on the first controller, and control the sensor based on the first controller to acquire the target contact force between the controlled end and the target object;
- a first calculation module 422, configured to calculate based on the first controller using the end posture and the target contact force to obtain a normal vector of a contact surface between the controlled end and the target object;
- a generating module 423 configured to generate a position vector in the contact surface of the target object based on the displacement information based on the first controller, and to calculate based on the first controller using the normal vector and the position vector to obtain a first component of the position vector, wherein the first component is a vector perpendicular to the direction of the normal vector;
- the second calculation module 424 is used to calculate the position change of the controlled end based on the first controller by setting the modulus of the first component to be the same as the modulus of the position vector; the direction of the position change is the same as the direction of the first component, and the magnitude of the position change is the same as the magnitude of the position vector.
- the first calculation module 422 is further configured to:
- the target contact force between the controlled end and the target object is obtained, and the target contact force is a force of a preset amount and direction obtained based on the preset force sensor.
- the generating module 423 is further configured to:
- the end posture and the target contact force are subjected to a matrix calculation based on the first controller to obtain a target contact force vector; the target contact force vector is then normalized based on the first controller to obtain a normal vector of the contact surface between the controlled end and the target object.
- the generating module 423 is further configured to:
- the first controller performs a dot product calculation using the normal vector and the position vector to obtain a second component of the position vector, the second component being a vector in the direction of the normal vector; the first controller then performs a vector subtraction using the position vector and the second component to obtain the first component.
- the device further includes:
- a definition unit 44 is used to obtain a target contact force of the sensor z-axis, and define the actual contact force as a force having the same magnitude and opposite direction as the target contact force of the sensor z-axis;
- the comparison unit 45 is used to receive the reference force sent by the main control end and compare the actual contact force with the reference force;
- an increasing unit 46 configured to increase the pressure of the controlled end on the target object when it is determined that the actual contact force is less than the reference force
- the reducing unit 47 is used to reduce the pressure of the controlled end on the target object when it is determined that the actual contact force is greater than the reference force.
- the device further includes:
- a comparison unit 48 configured to receive the reference force sent by the main control end, compare the actual contact force with the reference force, and obtain a first target deviation
- An analysis unit 49 is used to analyze the first target deviation based on the second controller to obtain a change in the pressure increase or decrease of the controlled end terminal on the target object;
- the second control unit 410 is used to control the increase or decrease of the pressure of the controlled terminal on the target object according to the change amount.
- the analysis unit 49 includes:
- a setting module 491 is used to set a proportional gain and a differential gain, and obtain a first calculation result by multiplying the proportional gain and the first target deviation by the second controller;
- a first calculation module 492 is used to perform a product calculation based on the second controller using the differential gain and the target deviation change to obtain a second calculation result, wherein the target deviation change is the difference between the first target deviation and the target deviation at a previous moment, and the target deviation is the difference between the reference force and the actual contact force;
- the second calculation module 493 is used to sum the first calculation result and the second calculation result based on the second controller to obtain the change amount.
- FIG. 14 is a schematic diagram of the structure of another device for processing the position of a robot provided in an embodiment of the present disclosure. As shown in FIG. 14 , the device is applied to a main control end and includes:
- the transceiver unit 51 is used to send a control instruction to the controlled end based on the connection between the master end and the controlled end, control the controlled end to start the detection task, and receive the response information and image information of the controlled end;
- the displacement information is sent to the controlled end so that the controlled end moves according to the displacement information, and the response information and the image information of the controlled end are received in real time.
- the transceiver unit 51 is further configured to:
- the target object is the detection object of the controlled end.
- control instruction also includes an ultrasound control instruction and a camera control instruction.
- the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
- FIG. 15 shows a schematic block diagram of an example electronic device 600 that can be used to implement an embodiment of the present disclosure.
- the electronic device is intended to represent various forms of digital computers, such as laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
- the electronic device can also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices.
- the components shown herein, their connections and relationships, and their functions are merely examples and are not intended to limit the implementation of the present disclosure described and/or required herein.
- the device 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a ROM (Read-Only Memory) 602 or a computer program loaded from a storage unit 608 to a RAM (Random Access Memory) 603.
- in the RAM 603, various programs and data required for the operation of the device 600 can also be stored.
- the computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604.
- An I/O (Input/Output) interface 605 is also connected to the bus 604.
- a number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606, such as a keyboard, a mouse, etc.; an output unit 607, such as various types of displays, speakers, etc.; a storage unit 608, such as a disk, an optical disk, etc.; and a communication unit 609, such as a network card, a modem, a wireless communication transceiver, etc.
- the communication unit 609 allows the device 600 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
- the computing unit 601 may be various general and/or special processing components with processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, CPU (Central Processing Unit), GPU (Graphic Processing Units), various dedicated AI (Artificial Intelligence) computing chips, various computing units that run machine learning model algorithms, DSP (Digital Signal Processor), and any appropriate processor, controller, microcontroller, etc.
- the computing unit 601 performs the various methods and processes described above, such as the method of robot position processing.
- the method of robot position processing may be implemented as a computer software program, which is tangibly contained in a machine-readable medium, such as a storage unit 608.
- part or all of the computer program may be loaded and/or installed on the device 600 via the ROM 602 and/or the communication unit 609.
- the computer program When the computer program is loaded into the RAM 603 and executed by the computing unit 601, the method described above may be executed.
- the computing unit 601 may be configured to execute the aforementioned robot position processing method in any other appropriate manner (for example, by means of firmware).
- Various embodiments of the systems and techniques described above herein may be implemented in digital electronic circuit systems, integrated circuit systems, FPGAs (Field Programmable Gate Array), ASICs (Application-Specific Integrated Circuit), ASSPs (Application Specific Standard Product), SOCs (System On Chip), CPLDs (Complex Programmable Logic Device), computer hardware, firmware, software, and/or combinations thereof.
- These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpreted on a programmable system including at least one programmable processor that may be a special purpose or general purpose programmable processor that may receive data and instructions from a storage system, at least one input device, and at least one output device, and transmit data and instructions to the storage system, the at least one input device, and the at least one output device.
- the program code for implementing the method of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing device, so that the program code, when executed by the processor or controller, enables the functions/operations specified in the flow chart and/or block diagram to be implemented.
- the program code may be executed entirely on the machine, partially on the machine, partially on the machine and partially on a remote machine as a stand-alone software package, or entirely on a remote machine or server.
- a machine-readable medium may be a tangible medium that may contain or store a program for use by or in conjunction with an instruction execution system, apparatus, or device.
- a machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
- a machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, RAM, ROM, EPROM (Electrically Programmable Read-Only-Memory) or flash memory, optical fiber, CD-ROM (Compact Disc Read-Only Memory), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
- the systems and techniques described herein can be implemented on a computer having: a display device (e.g., a CRT (Cathode-Ray Tube) or LCD (Liquid Crystal Display) monitor) for displaying information to the user; and a keyboard and pointing device (e.g., a mouse or trackball) through which the user can provide input to the computer.
- Other types of devices can also be used to provide interaction with the user; for example, the feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form (including acoustic input, voice input, or tactile input).
- the systems and techniques described herein can be implemented in a computing system that includes a back-end component (e.g., as a data server), a computing system that includes a middleware component (e.g., an application server), a computing system that includes a front-end component (e.g., a user computer with a graphical user interface or a web browser through which a user can interact with embodiments of the systems and techniques described herein), or any combination of such back-end, middleware, or front-end components.
- the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: LAN (Local Area Network), WAN (Wide Area Network), the Internet, and blockchain networks.
- a computer system may include a client and a server.
- the client and the server are generally remote from each other and usually interact through a communication network.
- the relationship between the client and the server is generated by computer programs running on the corresponding computers and having a client-server relationship with each other.
- the server may be a cloud server, also known as a cloud computing server or a cloud host, which is a host product in the cloud computing service system to solve the defects of difficult management and weak business scalability in traditional physical hosts and VPS services ("Virtual Private Server", or "VPS" for short).
- the server may also be a server for a distributed system, or a server combined with a blockchain.
- artificial intelligence is a discipline that studies how computers can simulate certain human thought processes and intelligent behaviors (such as learning, reasoning, thinking, planning, etc.), and includes both hardware-level and software-level technologies.
- Artificial intelligence hardware technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, and big data processing; artificial intelligence software technologies mainly include computer vision technology, speech recognition technology, natural language processing technology, as well as machine learning/deep learning, big data processing technology, knowledge graph technology, and other major directions.
Abstract
A method and device for robot position processing, an electronic device, and a storage medium, relating to the field of data processing technology. The main technical solution includes: receiving displacement information sent by a master control end, and collecting a target contact force between the controlled end and a target object, the target object being the detection object of the controlled end (101); processing the displacement information and the target contact force to obtain a position change of the controlled end (102); and controlling the distance and direction of movement of the controlled end according to the position change (103). The end of the controlled end is controlled, based on the position change, to move along its contact surface with the target object, and its displacement distance is controlled based on the magnitude of the position change, thereby achieving smooth movement of the end of the controlled end on the target object and making its movement fit the curve of the target object more closely.
Description
The present disclosure relates to the field of data processing technology, and in particular to a method and device for robot position processing, an electronic device, and a storage medium.

Robot-assisted ultrasound examination refers to remote robotic ultrasound examination. A remote robotic ultrasound diagnosis system comprises two subsystems, a doctor end and a patient end: a professional sonographer can remotely control the patient-end ultrasound robot by operating the doctor-end device to perform ultrasound examinations on patients. With the advance of technology, remote robotic ultrasound diagnosis systems have gradually become widespread. However, when such a system is used to examine a subject, the movement of its ultrasound probe lags as the probe moves over the human body.
SUMMARY

The present disclosure provides a method and device for robot position processing, an electronic device, and a storage medium. Its main purpose is to make the ultrasound probe of the remote robotic ultrasound diagnosis system move more compliantly over the human body and fit the body's curves more closely, thereby improving the safety of ultrasound examinations performed with the system.

According to a first aspect of the present disclosure, a method for robot position processing is provided. The method is applied to a controlled end and includes:

receiving displacement information sent by a master control end, and collecting a target contact force between the controlled end and a target object, the target object being the detection object of the controlled end;

processing the displacement information and the target contact force to obtain a position change of the controlled end;

controlling the distance and direction of movement of the controlled end according to the position change.
Optionally, processing the displacement information and the target contact force to obtain the position change of the controlled end includes:

processing the displacement information and the target contact force based on a first controller to obtain the position change.

Optionally, processing the displacement information and the target contact force based on the first controller to obtain the position change includes:

in response to a control instruction from the master control end, defining an end posture of the controlled end according to the control instruction based on the first controller, and controlling a sensor based on the first controller to acquire the target contact force between the controlled end and the target object;

calculating, based on the first controller, using the end posture and the target contact force to obtain a normal vector of the contact surface between the controlled end and the target object;

generating, based on the first controller, a position vector in the contact surface of the target object according to the displacement information, and calculating, based on the first controller, using the normal vector and the position vector to obtain a first component of the position vector, the first component being a vector perpendicular to the direction of the normal vector;

by setting the modulus of the first component to be the same as the modulus of the position vector, calculating the position change of the controlled end based on the first controller, the direction of the position change being the same as the direction of the first component, and the magnitude of the position change being the same as the magnitude of the position vector.

Optionally, controlling the sensor based on the first controller to acquire the target contact force between the controlled end and the target object includes:

controlling a preset force sensor based on the first controller to acquire the target contact force between the controlled end and the target object, the target contact force being forces of a preset number and directions acquired by the preset force sensor.

Optionally, calculating, based on the first controller, using the end posture and the target contact force to obtain the normal vector of the contact surface between the controlled end and the target object includes:

performing a matrix calculation based on the first controller using the end posture and the target contact force to obtain a target contact force vector;

normalizing the target contact force vector based on the first controller to obtain the normal vector of the contact surface between the controlled end and the target object.

Optionally, calculating, based on the first controller, using the normal vector and the position vector to obtain the first component of the position vector includes:

performing a dot product calculation based on the first controller using the normal vector and the position vector to obtain a second component of the position vector, the second component being a vector in the direction of the normal vector;

performing a vector subtraction based on the first controller using the position vector and the second component to obtain the first component.
Optionally, when an actual contact force between the controlled end and the target object is greater than or less than a reference force, the method includes:

acquiring the target contact force on the z-axis of the sensor, and defining the actual contact force as a force having the same magnitude as, and the opposite direction to, the target contact force on the z-axis of the sensor;

receiving the reference force sent by the master control end, and comparing the actual contact force with the reference force;

when it is determined that the actual contact force is less than the reference force, increasing the pressure of the end of the controlled end on the target object;

when it is determined that the actual contact force is greater than the reference force, reducing the pressure of the end of the controlled end on the target object.

Optionally, when increasing or reducing the pressure of the end of the controlled end on the target object, the method further includes:

receiving the reference force sent by the master control end, and comparing the actual contact force with the reference force to obtain a first target deviation;

analyzing the first target deviation based on a second controller to obtain a change amount by which the pressure of the end of the controlled end on the target object is increased or reduced;

controlling the increase or reduction of the pressure of the end of the controlled end on the target object according to the change amount.

Optionally, analyzing the first target deviation based on the second controller to obtain the change amount by which the pressure of the end of the controlled end on the target object is increased or reduced includes:

setting a proportional gain and a differential gain, and performing a product calculation based on the second controller using the proportional gain and the first target deviation to obtain a first calculation result; and

performing a product calculation based on the second controller using the differential gain and a target deviation change to obtain a second calculation result, the target deviation change being the difference between the first target deviation and the target deviation at a previous moment, and the target deviation being the difference between the reference force and the actual contact force;

summing the first calculation result and the second calculation result based on the second controller to obtain the change amount.
According to a second aspect of the present disclosure, a method for robot position processing is provided. The method is applied to a master control end and includes:

based on a connection between the master control end and a controlled end, sending a control instruction to the controlled end, controlling the controlled end to start a detection task, and receiving response information and image information of the controlled end;

sending displacement information to the controlled end so that the controlled end moves according to the displacement information, and receiving the response information and the image information of the controlled end in real time.

Optionally, the method further includes:

sending a reference force to the controlled end so that the controlled end controls an actual contact force between its end and a target object according to the reference force, and receiving feedback information from the controlled end, the target object being the detection object of the controlled end.

Optionally, the control instruction further includes an ultrasound control instruction and a camera control instruction.
According to a third aspect of the present disclosure, a device for robot position processing is provided. The device is applied to a controlled end and includes:

a receiving unit, configured to receive displacement information sent by a master control end and collect a target contact force between the controlled end and a target object, the target object being the detection object of the controlled end;

a processing unit, configured to process the displacement information and the target contact force to obtain a position change of the controlled end;

a first control unit, configured to control the distance and direction of movement of the controlled end according to the position change.

Optionally, the processing unit is further configured to:

process the displacement information and the target contact force based on a first controller to obtain the position change.

Optionally, the processing unit includes:

an acquisition module, configured to respond to a control instruction from the master control end, define an end posture of the controlled end according to the control instruction based on the first controller, and control a sensor based on the first controller to acquire the target contact force between the controlled end and the target object;

a first calculation module, configured to calculate, based on the first controller, using the end posture and the target contact force to obtain a normal vector of the contact surface between the controlled end and the target object;

a generating module, configured to generate, based on the first controller, a position vector in the contact surface of the target object according to the displacement information, and to calculate, based on the first controller, using the normal vector and the position vector to obtain a first component of the position vector, the first component being a vector perpendicular to the direction of the normal vector;

a second calculation module, configured to calculate the position change of the controlled end based on the first controller by setting the modulus of the first component to be the same as the modulus of the position vector, the direction of the position change being the same as the direction of the first component, and the magnitude of the position change being the same as the magnitude of the position vector.

Optionally, the first calculation module is further configured to:

control a preset force sensor based on the first controller to acquire the target contact force between the controlled end and the target object, the target contact force being forces of a preset number and directions acquired by the preset force sensor.

Optionally, the generating module is further configured to:

perform a matrix calculation based on the first controller using the end posture and the target contact force to obtain a target contact force vector;

normalize the target contact force vector based on the first controller to obtain the normal vector of the contact surface between the controlled end and the target object.

Optionally, the generating module is further configured to:

perform a dot product calculation based on the first controller using the normal vector and the position vector to obtain a second component of the position vector, the second component being a vector in the direction of the normal vector;

perform a vector subtraction based on the first controller using the position vector and the second component to obtain the first component.
Optionally, the device further includes:

a definition unit, configured to acquire the target contact force on the z-axis of the sensor, and define an actual contact force as a force having the same magnitude as, and the opposite direction to, the target contact force on the z-axis of the sensor;

a comparison unit, configured to receive a reference force sent by the master control end, and compare the actual contact force with the reference force;

an increasing unit, configured to increase the pressure of the end of the controlled end on the target object when it is determined that the actual contact force is less than the reference force;

a reducing unit, configured to reduce the pressure of the end of the controlled end on the target object when it is determined that the actual contact force is greater than the reference force.

Optionally, the device further includes:

a comparing unit, configured to receive the reference force sent by the master control end, and compare the actual contact force with the reference force to obtain a first target deviation;

an analysis unit, configured to analyze the first target deviation based on a second controller to obtain a change amount by which the pressure of the end of the controlled end on the target object is increased or reduced;

a second control unit, configured to control the increase or reduction of the pressure of the end of the controlled end on the target object according to the change amount.

Optionally, the analysis unit includes:

a setting module, configured to set a proportional gain and a differential gain, and perform a product calculation based on the second controller using the proportional gain and the first target deviation to obtain a first calculation result; and

a first calculation module, configured to perform a product calculation based on the second controller using the differential gain and a target deviation change to obtain a second calculation result, the target deviation change being the difference between the first target deviation and the target deviation at a previous moment, and the target deviation being the difference between the reference force and the actual contact force;

a second calculation module, configured to sum the first calculation result and the second calculation result based on the second controller to obtain the change amount.
According to a fourth aspect of the present disclosure, a device for robot position processing is provided. The device is applied to a master control end and includes:

a transceiver unit, configured to send a control instruction to a controlled end based on a connection between the master control end and the controlled end, control the controlled end to start a detection task, and receive response information and image information of the controlled end; and

to send displacement information to the controlled end so that the controlled end moves according to the displacement information, and receive the response information and the image information of the controlled end in real time.

Optionally, the transceiver unit is further configured to:

send a reference force to the controlled end so that the controlled end controls an actual contact force between its end and a target object according to the reference force, and receive feedback information from the controlled end, the target object being the detection object of the controlled end.

Optionally, the control instruction further includes an ultrasound control instruction and a camera control instruction.
According to a fifth aspect of the present disclosure, an electronic device is provided, including:

at least one processor; and

a memory communicatively connected to the at least one processor; wherein

the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can perform the method described in the first aspect or the second aspect.

According to a sixth aspect of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions is provided, wherein the computer instructions are used to cause the computer to perform the method described in the first aspect or the second aspect.

According to a seventh aspect of the present disclosure, a computer program product is provided, including a computer program which, when executed by a processor, implements the method described in the first aspect or the second aspect.
The method and device for robot position processing, the electronic device, and the storage medium provided by the present disclosure mainly include: receiving displacement information sent by a master control end, and collecting a target contact force between the controlled end and a target object, the target object being the detection object of the controlled end; processing the displacement information and the target contact force to obtain a position change of the controlled end; and controlling the distance and direction of movement of the controlled end according to the position change. Compared with the related art, by receiving the displacement information sent by the master control end, collecting the target contact force, and processing the displacement information and the target contact force, the position change of the controlled end is obtained, so that the end of the controlled end is controlled, based on the position change, to move along its contact surface with the target object, and the displacement distance of the end is controlled based on the magnitude of the position change. Controlling the end of the controlled end to move along its contact surface with the target object achieves smooth movement of the end on the target object, makes its movement fit the curve of the target object more closely, and improves the safety of the end of the controlled end when examining the target object.

It should be understood that the content described in this section is not intended to identify key or important features of the embodiments of the present application, nor to limit the scope of the present application. Other features of the present application will become easy to understand through the following description.
The accompanying drawings are used for a better understanding of the solution and do not constitute a limitation of the present disclosure. In the drawings:

FIG. 1 is a schematic flow chart of a method for robot position processing provided by an embodiment of the present disclosure;

FIG. 2 is a schematic flow chart of another method for robot position processing provided by an embodiment of the present disclosure;

FIG. 3 is a schematic diagram of the structure of a controlled end provided by an embodiment of the present disclosure;

FIG. 4 is a schematic diagram of the hardware connection structure of a controlled end provided by an embodiment of the present disclosure;

FIG. 5 is a schematic diagram of the forces between a controlled end and a target object provided by an embodiment of the present disclosure;

FIG. 6 is a schematic diagram of a robot force control process provided by an embodiment of the present disclosure;

FIG. 7 is a schematic diagram of the end of a controlled end moving on a target object based on the method for robot position processing provided by an embodiment of the present disclosure;

FIG. 8 is a schematic diagram of the structure of a master control end provided by an embodiment of the present disclosure;

FIG. 9 is a schematic diagram of the hardware connection structure of a master control end provided by an embodiment of the present disclosure;

FIG. 10 is a schematic flow chart of another method for robot position processing provided by an embodiment of the present disclosure;

FIG. 11 is a schematic diagram of the hardware connection between a master control end and a controlled end provided by an embodiment of the present disclosure;

FIG. 12 is a schematic diagram of the structure of a device for robot position processing provided by an embodiment of the present disclosure;

FIG. 13 is a schematic diagram of the structure of another device for robot position processing provided by an embodiment of the present disclosure;

FIG. 14 is a schematic diagram of the structure of another device for robot position processing provided by an embodiment of the present disclosure;

FIG. 15 is a schematic block diagram of an example electronic device 600 provided by an embodiment of the present disclosure.
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, including various details of the embodiments to aid understanding, which should be regarded as merely exemplary. Therefore, those of ordinary skill in the art should recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, for clarity and conciseness, descriptions of well-known functions and structures are omitted from the following description.

The method and device for robot position processing, the electronic device, and the storage medium according to embodiments of the present disclosure are described below with reference to the accompanying drawings.
FIG. 1 is a schematic flow chart of a method for robot position processing provided by an embodiment of the present disclosure.

The method is applied to a controlled end. As shown in FIG. 1, the method includes the following steps:

Step 101: receive displacement information sent by a master control end, and collect a target contact force between the controlled end and a target object, the target object being the detection object of the controlled end.

As a refinement of step 101, in order to control the controlled end, the displacement information sent by the master control end needs to be received, and the target contact force between the controlled end and the target object needs to be collected. The target contact force is the interaction force between the controlled end and the target object when they are in contact; the displacement information is control information sent by the master control end to control the movement of the controlled end; and the target object is the detection object of the controlled end. For example, when an ultrasound examination is performed on a subject by means of the controlled end, the subject is the target object.

Step 102: process the displacement information and the target contact force to obtain a position change of the controlled end.

As a refinement of step 102, in order to make the controlled end move more compliantly and fit the surface of the target object during motion, the displacement information and the target contact force are processed to obtain the position change of the controlled end. The position change includes a magnitude and a direction.

Step 103: control the distance and direction of movement of the controlled end according to the position change.

As a refinement of step 103, in order to control the distance and direction of movement of the controlled end, the distance and direction are controlled according to the position change; that is, the magnitude of the position change is the distance the controlled end moves, and the direction of the position change is the direction in which the controlled end moves.

The method for robot position processing provided by the present disclosure mainly includes: receiving displacement information sent by a master control end, and collecting a target contact force between the controlled end and a target object, the target object being the detection object of the controlled end; processing the displacement information and the target contact force to obtain a position change of the controlled end; and controlling the distance and direction of movement of the controlled end according to the position change. Compared with the related art, by receiving the displacement information sent by the master control end, collecting the target contact force, and processing the displacement information and the target contact force, the position change of the controlled end is obtained, so that the end of the controlled end is controlled, based on the position change, to move along its contact surface with the target object, and the displacement distance of the end is controlled based on the magnitude of the position change. This achieves smooth movement of the end of the controlled end on the target object, makes its movement fit the curve of the target object more closely, and improves the safety of the end of the controlled end when examining the target object.
As a refinement of the embodiment of the present disclosure, when performing the processing of the displacement information and the target contact force in step 102 to obtain the position change of the controlled end, the method may adopt, but is not limited to, the following implementation: processing the displacement information and the target contact force based on a first controller to obtain the position change.

The first controller processes the displacement information and the target contact force to obtain the position change, so as to control the movement of the controlled end.
图2为本公开实施例所提供的另一种机器人位置处理的方法的流程示意图。如图2所示,该方法包含以下步骤:
步骤201,响应于主控端的控制指令,基于所述第一控制器根据所述控制指令定义被控端的末端姿态,并基于所述第一控制器控制传感器获取所述被控端与目标对象的所述目标接触力。
作为上述步骤201的细化,为了描述所述被控端末端的姿态,定义所述被控端末端的末端姿态,所述末端姿态为姿态矩阵。应当明白,本公开并不限定所述末端姿态为姿态矩阵的形式。基于力传感器获取所述被控端与所述目标对象的目标接触力,所述力传感器可以但不限于为六维力传感器。为了更清楚地说明上述目标接触力的获取场景,在此提供示例性说明,例如:被控端的末端为超声探头,所述目标对象为超声检测对象时,即在使用所述超声探头对所述超声检测对象进行检测时,基于所述超声探头上方的六维力传感器检测所述超声探头与所述超声检测对象的目标接触力,所述目标接触力包括六维力传感器获取的不同轴的力,例如x轴的力、y轴的力、z轴的力。
步骤202,基于所述第一控制器使用所述末端姿态和所述目标接触力进行计算,得到所述被控端与所述目标对象接触面的法向量;
作为上述步骤202的细化,所述基于所述末端姿态和所述目标接触力进行计算即基于所述末端姿态与力传感器获取的每个轴的力进行计算,得到所述被控端与目标对象接触面的法向量,使用所述目标对象接触面的法向量对后续的参量进行解析,所述力传感器每个轴的力在上述步骤201的示例性说明中有提及,即所述六维力传感器获取的x轴的力,y轴的力,z轴的力。
步骤203,基于所述第一控制器根据所述位移信息生成在所述目标对象接触面中的位置向量,并基于所述第一控制器使用所述法向量和所述位置向量进行计算,得到所述位置向量的第一分量,所述第一分量为垂直于所述法向量方向的向量;
作为上述步骤203的细化,出于对所述位置向量进行分解的目的,以得到垂直于所述法向量方向的分量,使用所述法向量和所述位置向量进行计算,得到所述第一分
量,其中,所述位置向量由来自于所述主控端的位移信息生成,例如:主控端发送的第一时刻下的位移信息,第二时刻下的位移信息,在本实施例中所述第二时刻与所述第一时刻的时间间隔为8ms,即位移信息采集的周期为8ms,在实际场景中,本实施例并不限定所述第二时刻与所述第一时刻的时间间隔为8ms。基于所述第一时刻下的位移信息和所述第二时刻下的位移信息计算得到8ms内主控端仿形探头变化的位移信息,基于变化的位移信息得到所述位置向量,所述位移信息为主控端的位置传感器采集的所述仿形探头的移动信息,应当明白上述位移信息采集的方式仅为示例性说明,本实施例并不限定主控端对位移信息采集的方式。
步骤204,通过设置所述第一分量的模与所述位置向量的模相同,基于所述第一控制器计算得到所述被控端的位置变化量,所述位置变化量的方向与所述第一分量的方向相同,所述位置变化量的大小与所述位置向量的大小相同。
作为上述实施例的细化,为了使所述被控端末端沿所述第一分量的方向移动的距离大小与所述位置向量的大小相等,则设置所述第一分量的模与所述位置向量的模相同,在前述条件下计算得到所述被控端的位置变化量,所述位置变化量即为所述被控端末端位移的大小和方向,其中,所述位置变化量的方向与所述第一分量的方向相同,其大小与所述位置向量的大小相同。
为了便于对本公开方案的理解,图3为本公开实施例提供的一种被控端结构示意图,图4为本公开实施例提供的一种被控端硬件连接结构的示意图,如图3所示,所述被控端包括:机械臂、六维力传感器、主摄像头、副摄像头、显示器、扬声器以及主机。
作为本公开实施例的细化,在执行步骤201所述基于所述第一控制器控制传感器获取所述被控端与目标对象的所述目标接触力时,可以采用但不限于以下实现方式,例如:基于所述第一控制器控制预设的力传感器获取所述被控端与所述目标对象的所述目标接触力,所述目标接触力为基于所述预设的力传感器获取的预设数量及方向的力。
通过预设的力传感器获取的值经过低通滤波和重力补偿处理,得到所述目标接触力,上述预设的力传感器可以为六维力传感器,基于所述六维力传感器获取其x轴的力,y轴的力和z轴的力,上述说明仅仅为示例性的,本实施例对所述预设的力传感器的种类不进行限定,且对采集的力的数量及方向也不进行限定。
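上文提到力传感器的读数需经过低通滤波和重力补偿处理。下面给出一段示意性的Python代码(假设性示例:滤波系数alpha与工具自重tool_gravity均为示例取值,并非本公开给出的参数):

```python
def low_pass(prev, raw, alpha=0.2):
    """一阶低通滤波:输出 = alpha*当前测量 + (1-alpha)*上次输出。"""
    return [alpha * r + (1.0 - alpha) * p for p, r in zip(prev, raw)]


def gravity_compensate(force, tool_gravity):
    """重力补偿:从滤波后的测量力中扣除探头/工具自重分量(tool_gravity为假设的标定值)。"""
    return [f - g for f, g in zip(force, tool_gravity)]


raw = [0.5, -0.2, -12.0]                     # 传感器x、y、z轴原始读数(构造数据)
filtered = low_pass([0.0, 0.0, -10.0], raw)  # 上一周期滤波输出假设为[0, 0, -10]
contact = gravity_compensate(filtered, [0.0, 0.0, -2.0])
```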
作为上述实施例的细化,在执行步骤202所述基于所述第一控制器使用所述末端姿态和所述目标接触力进行计算,得到所述被控端与所述目标对象接触面的法向量时,可以采用但不限于以下实现方式,例如:基于所述第一控制器使用所述末端姿态和所
述目标接触力进行矩阵计算,得到目标接触力向量;基于所述第一控制器对所述目标接触力向量进行归一化处理,得到所述被控端与所述目标对象接触面的法向量。
图5为本公开实施例提供的一种被控端与目标对象的受力示意图,如图5所示,其中,包括:位置向量deltaP,预设的力传感器获取的力的合力F。为了实现所述被控端末端在所述目标对象上的柔性运动,需要实时估计所述目标对象表面的倾斜角度,然后将位移的方向改为沿人体表面的方向。本实施例中,使用力传感器的信息来估计所述目标对象表面的倾斜程度。
探头和人体接触的过程中,如果忽略掉摩擦力,可以认为探头与人体的接触力的合力方向是垂直于接触平面的。
为了更直观的展示所述末端姿态和所述目标接触力的计算过程,在此依据公式进行说明,例如:
定义姿态矩阵为R=[Vx,Vy,Vz],其中Vx,Vy,Vz分别表示三个坐标轴方向的单位列向量,所述目标接触力包括三个不同方向的力分别为Fxcontact,Fycontact,Fzcontact,所述目标接触力向量为VF,其中,所述目标接触力向量即为所述目标接触力分别在Vx,Vy,Vz三个单位列向量方向上力的合力向量,构建所述目标接触力向量求解公式,如公式(1)所示:
VF=Fxcontact*Vx+Fycontact*Vy+Fzcontact*Vz公式(1)
对所述目标接触力向量VF进行归一化处理,得到所述被控端与所述目标对象接触面的法向量Vnorm,其中,归一化处理的过程由公式(2)所示:
Vnorm = VF / ||VF|| = VF / sqrt(VF(0)² + VF(1)² + VF(2)²)  公式(2)
上述公式(2)中所示出的Vnorm即为所述被控端与所述目标对象接触面的法向量,VF(0)、VF(1)、VF(2)表示所述目标接触力向量VF的三个分量。
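公式(1)与公式(2)的计算过程可用如下Python代码示意(假设性示例,函数与变量命名均为示例;姿态矩阵的三个列向量与各轴接触力均为构造数据):

```python
import math


def contact_force_vector(Vx, Vy, Vz, Fx, Fy, Fz):
    """公式(1):VF = Fx*Vx + Fy*Vy + Fz*Vz,即各轴力在姿态列向量方向上的合力向量。"""
    return [Fx * Vx[i] + Fy * Vy[i] + Fz * Vz[i] for i in range(3)]


def normalize(v):
    """公式(2):对目标接触力向量VF归一化,得到接触面法向量。"""
    norm = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return [x / norm for x in v]


# 示例:末端姿态为单位阵,接触力只沿传感器z轴负方向
VF = contact_force_vector([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
                          0.0, 0.0, -5.0)
normal = normalize(VF)
```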
作为上述实施例的细化,在执行步骤203所述基于所述第一控制器使用所述法向量和所述位置向量进行计算,得到所述位置向量的第一分量时,可以采用但不限于以下实现方式,例如:基于所述第一控制器使用所述法向量和所述位置向量进行点积计算,得到所述位置向量的第二分量,所述第二分量为所述法向量方向上的向量;基于所述第一控制器使用所述位置向量和所述第二分量进行向量的减法运算,得到第一分量。
为了更直观的展示所述第一分量和所述第二分量的计算过程,在此依据公式进行
说明,例如:
在上述细化说明中已知所述法向量Vnorm,所述位置向量由不同时刻的位移信息进行计算得到。所述位置向量可由下述公式求得,例如:
记第一时刻下的位移信息为Position=[x1,y1,z1],第二时刻下的位移信息为Positionpre=[x2,y2,z2],其中,所述第一时刻下的位移信息和第二时刻下的位移信息为主控端发送的向量信息,则位置向量deltaP的求解,如公式(3)所示:
deltaP = Position - Positionpre,即 deltaP.x = x1 - x2,deltaP.y = y1 - y2,deltaP.z = z1 - z2  公式(3)
其中,deltaP.x,deltaP.y,deltaP.z为位置向量deltaP在坐标系x轴,y轴,z轴的坐标,同理,x1,y1,z1为第一时刻下位移信息在坐标系x轴,y轴,z轴的坐标;x2,y2,z2为第二时刻下位移信息在坐标系x轴,y轴,z轴的坐标。
已知所述法向量Vnorm和所述位置向量deltaP,构建所述第二分量deltaP2的求解公式,如公式(4)所示:
deltaP2 = (deltaP · Vnorm) * Vnorm  公式(4)
构建所述第一分量deltaP1的求解公式,如公式(5)所示:
deltaP1=deltaP-deltaP2公式(5)
作为上述实施例的细化,通过设置所述第一分量deltaP1的模与所述位置向量deltaP的模相同,计算得到所述被控端的位置变化量moveP,则位置变化量moveP的求解公式,如公式(6)所示:
moveP = ||deltaP|| * deltaP1 / ||deltaP1||  公式(6)
其中,moveP(0)、moveP(1)、moveP(2)分别表示位置变化量moveP在各个坐标轴的分量,deltaP1(0)、deltaP1(1)、deltaP1(2)分别表示第一分量deltaP1在各个坐标轴的分量。
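公式(3)至公式(6)描述的分解与求模过程可用如下Python代码示意(假设性示例,函数与变量命名均为示例):

```python
import math


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def move_delta(position, position_pre, normal):
    """按公式(3)-(6)计算位置变化量moveP:
    方向取切向分量deltaP1的方向,模长取位置向量deltaP的模长。"""
    # 公式(3):deltaP = Position - Positionpre
    deltaP = [a - b for a, b in zip(position, position_pre)]
    # 公式(4):deltaP2 = (deltaP · 法向量) * 法向量,即法向分量
    k = dot(deltaP, normal)
    deltaP2 = [k * n for n in normal]
    # 公式(5):deltaP1 = deltaP - deltaP2,即垂直于法向量的分量
    deltaP1 = [a - b for a, b in zip(deltaP, deltaP2)]
    # 公式(6):moveP = |deltaP| * deltaP1 / |deltaP1|
    scale = math.sqrt(dot(deltaP, deltaP)) / math.sqrt(dot(deltaP1, deltaP1))
    return [scale * x for x in deltaP1]


# 示例:法向量沿z轴,主控端位移位于x-z平面
moveP = move_delta([3.0, 0.0, 4.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
# deltaP=[3,0,4],deltaP1=[3,0,0],|deltaP|=5,故moveP=[5,0,0]
```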
作为上述实施例的细化,当所述被控端与所述目标对象的实际接触力大于或小于参照力时,所述方法还可以采用但不限于以下实现方式,例如:获取所述传感器z轴的目标接触力,定义所述实际接触力为与所述传感器z轴的目标接触力大小相同方向相反的力;接收主控端发送的所述参照力,将所述实际接触力与所述参照力进行比较;在确定所述实际接触力小于所述参照力时,增大所述被控端末端对所述目标对象的压力;在确定所述实际接触力大于所述参照力时,减小所述被控端末端对所述目标对象的压力。
作为上述实施例的细化,为了实现所述被控端的力控作用,通过将所述实际接触
力与所述主控端发送的所述参照力进行比对,以实现对所述被控端的力控,所述参照力为所述被控端与所述目标对象之间作用力的参照信息,基于所述参照力以控制所述被控端对所述目标对象的力度,即当所述被控端与所述目标对象之间的作用力过大时,则通过控制所述被控端远离所述目标检测对象以减小所述作用力,当所述被控端与所述目标对象之间的作用力过小时,则说明所述被控端距离所述目标对象过远,从而控制所述被控端贴近所述目标对象,以保证检测的精确度。
作为上述实施例的细化,在增大或减小所述被控端末端对所述目标对象的压力时,所述方法还可以采用但不限于以下实现方式,例如:接收所述主控端发送的所述参照力,将所述实际接触力与所述参照力进行比较,得到第一目标偏差;基于第二控制器对所述第一目标偏差进行分析,得到所述被控端末端对所述目标对象的压力增大或减小的变化量;根据所述变化量控制增大或减小所述被控端末端对所述目标对象的压力。
作为上述实施例的细化,为了得到所述被控端末端对所述目标对象的压力增大或减小的变化量,通过所述第二控制器对所述第一目标偏差进行分析得到所述变化量,之后根据所述变化量控制增大或减小所述被控端末端对所述目标对象的压力。
作为上述实施例的细化,在基于第二控制器对所述第一目标偏差进行分析,得到所述被控端末端对所述目标对象的压力增大或减小的变化量时,所述方法还可以采用但不限于以下实现方式,例如:设置比例增益和微分增益,基于所述第二控制器使用所述比例增益与所述第一目标偏差进行乘积计算,得到第一计算结果;以及基于所述第二控制器使用所述微分增益与目标偏差变化量进行乘积计算,得到第二计算结果,所述目标偏差变化量为所述第一目标偏差与前一时刻的目标偏差之差,所述目标偏差为所述参照力与所述实际接触力之差;基于所述第二控制器对所述第一计算结果和所述第二计算结果进行求和计算,得到所述变化量。
为了更清楚的说明上述实施例的步骤,下述部分结合公式对上述步骤进行说明,具体说明如下所示:
获取所述传感器z轴的力Fzcontact,定义实际接触力Ftrue=-Fzcontact,接收主控端发送的所述参照力Fdesire,当Ftrue小于Fdesire时,机械臂沿靠近目标对象的方向运动,以增大接触力;当Ftrue大于Fdesire时,机械臂沿远离目标对象的方向运动,以减小接触力。假设每次机械臂的姿态方向位置的控制输出量为Pcontrol,则有:
Pcontrol=Kp*Ef+Kd*ΔEf公式(7)
其中,Kp和Kd分别为比例增益和微分增益,Ef为误差,ΔEf为误差微分,
Ef=Fdesire-Ftrue,ΔEf=Ef-Ef_previous,Ef_previous为前一时刻的误差。
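公式(7)所述的比例微分(PD)力控可用如下Python代码示意(假设性示例:增益Kp、Kd的数值为示例取值,并非本公开给出的参数):

```python
def pd_force_step(f_desire, f_true, ef_previous, kp=0.001, kd=0.0005):
    """公式(7):Pcontrol = Kp*Ef + Kd*ΔEf,
    其中 Ef = Fdesire - Ftrue,ΔEf = Ef - Ef_previous。"""
    ef = f_desire - f_true
    delta_ef = ef - ef_previous
    p_control = kp * ef + kd * delta_ef
    return p_control, ef  # 返回控制输出量与本周期误差,误差留作下一周期计算ΔEf


p_control, ef = pd_force_step(f_desire=5.0, f_true=3.0, ef_previous=1.0)
# Ef=2.0,ΔEf=1.0,故 p_control = 0.001*2.0 + 0.0005*1.0 = 0.0025
```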
与上述机械臂的姿态方向位置的控制相对应,图6为本公开实施例提供的一种机器人力控流程示意图,如图6所示,其主要包括比例微分控制器、机器人系统及采集单元,以进行力控的分析。
图7为本公开实施例提供的一种被控端末端基于机器人位置处理的方法在目标对象上运动的示意图,如图7所示,所述被控端的末端紧贴所述目标对象的表面曲线移动。
为了便于对本公开方案的理解,图8为本公开实施例提供的一种主控端结构示意图,图9为本公开实施例提供的一种主控端硬件连接结构的示意图,如图8所示,所述主控端包括:主机、机器人控制台、双显示器、超声控制面板、摄像头、语音拾音器、扬声器、摄像机控制摇杆等模块。医生可以操纵机器人控制台来控制远程机械臂。控制台由仿形探头、位置传感器和压力传感器组成。控制台有六个自由度:姿态传感器有3个旋转自由度,位置传感器有2个水平面运动自由度,"UP键"和压力传感器对应1个上下运动自由度。
图10为本公开实施例提供的另一种机器人位置处理的方法的流程示意图,如图10所示,所述方法应用于主控端,包括:
步骤301,基于主控端与被控端之间的连接,发送控制指令至所述被控端,控制所述被控端启动检测任务,并接收所述被控端的应答信息及图像信息。
作为上述步骤301的细化,为了实现所述主控端与所述被控端之间的通信,将所述主控端与所述被控端进行连接,基于连接关系所述主控端发送控制指令至所述被控端,控制所述被控端启动检测任务,所述检测任务包括但不限于超声检查,所述被控端在接收到所述主控端发送的控制指令后,对所述控制指令进行应答后,回复应答信息至所述主控端,且所述被控端在采集到超声检测的图像信息后,将图像信息传输至所述主控端,即所述主控端接收所述图像信息,上述主控端与被控端之间数据交互的说明仅仅为示例性的,本实施例并不限定数据交互的内容及方式。
步骤302,发送位移信息至所述被控端,以便所述被控端根据所述位移信息进行移动,并实时接收所述被控端的所述应答信息及所述图像信息。
作为上述步骤302的细化,发送位移信息至所述被控端,被控端根据所述位移信息进行移动,并实时接收所述被控端的所述应答信息及所述图像信息,场景性的,即操作主控端的控制台以控制所述被控端的机械臂移动以进行超声检测检查,且所述被控端将所述机械臂移动时检测到的超声图像实时传输至所述主控端。
作为上述实施例的细化,应用于所述主控端的方法,还包括但不限于以下内容,例如:发送参照力至所述被控端,以便所述被控端根据所述参照力控制其末端与目标对象的实际接触力,并接收所述被控端的反馈信息,所述目标对象为所述被控端的检测对象。
作为上述实施例的细化,为了调节所述被控端与所述目标对象之间的接触力,由主控端发送参照力至所述被控端,以便所述被控端根据所述参照力控制其末端与目标对象的实际接触力,被控端实时反馈其与所述目标对象之间的压力信息至所述主控端,所述压力信息为所述反馈信息的一部分,本实施例对所述反馈信息包含的内容不进行限定。
作为上述实施例的细化,所述控制指令还包括超声控制指令及摄像头控制指令。
通过所述超声控制指令实现所述被控端超声检测的控制,通过所述摄像头控制指令实现所述被控端摄像头的控制。
为了更清楚地展示所述被控端与所述主控端之间的连接关系,图11为本实施例提供的一种主控端与被控端硬件连接的示意图,如图11所示,所述主控端与被控端都包括摄像头和语音拾音器,可通过拾音器和主/副摄像头采集音视频。被控端主机还可以通过视频采集卡采集超声仪的超声图像,通过音视频传输技术,采集到的音视频和超声图像会通过网络发送给对端,便可以实现远程音视频通信。同时,主控端可通过操作手系统采集机器人控制指令,通过超声控制面板采集超声控制指令,通过摄像头控制摇杆采集摄像头控制指令,这些指令都可通过网络发送至被控端。被控端主机可通过网络接收主控端传来的机器人控制指令、摄像头控制指令。被控端处理器将主控端发送过来的机器人控制指令处理后发送给机械臂系统,可进行位置、姿态与力的实时控制。超声控制指令和主摄像头控制指令也会通过被控端处理器实时发送给超声主机和主摄像头,实现超声和摄像头的远程控制。
综上所述,本实施例能达到以下效果:
1.基于所述位置变化量的大小控制所述被控端末端的位移距离,通过控制所述被控端末端沿其与所述目标对象的接触面移动,从而实现所述被控端末端在所述目标对象上的平滑移动,且使所述被控端末端的移动更加贴合所述目标对象的曲线,提高所述被控端末端对所述目标对象进行检查时的安全性。
2.通过计算所述被控端与目标对象接触面中位置向量的第一分量,基于第一分量计算得到所述被控端的位置变化量,所述位置变化量的方向与所述第一分量的方向相同,从而实现当所述被控端末端发生移动时,基于所述位置变化量控制所述被控端末
端沿其与所述目标对象的接触面移动。
与上述的机器人位置处理的方法相对应,本公开还提出一种机器人位置处理的装置。由于本公开的装置实施例与上述的方法实施例相对应,对于装置实施例中未披露的细节可参照上述的方法实施例,本公开中不再进行赘述。
图12为本公开实施例提供的一种机器人位置处理的装置的结构示意图,如图12所示,所述装置应用于被控端,包括:
接收单元41,用于接收主控端发送的位移信息,并采集所述被控端与目标对象之间的目标接触力,所述目标对象为所述被控端的检测对象;
处理单元42,用于对所述位移信息和所述目标接触力进行处理,得到所述被控端的位置变化量;
第一控制单元43,用于根据所述位置变化量控制所述被控端移动的距离和方向。
本公开提供的机器人位置处理的装置,主要技术方案包括:接收主控端发送的位移信息,并采集所述被控端与目标对象之间的目标接触力,所述目标对象为所述被控端的检测对象;对所述位移信息和所述目标接触力进行处理,得到所述被控端的位置变化量;根据所述位置变化量控制所述被控端移动的距离和方向。与相关技术相比,通过接收所述主控端发送的所述位移信息,并采集所述目标接触力,对所述位移信息和所述目标接触力进行处理后,得到所述被控端的位置变化量,从而实现基于所述位置变化量控制所述被控端末端沿其与所述目标对象的接触面移动,并基于所述位置变化量的大小控制所述被控端末端的位移距离,通过控制所述被控端末端沿其与所述目标对象的接触面移动,从而实现所述被控端末端在所述目标对象上的平滑移动,且使所述被控端末端的移动更加贴合所述目标对象的曲线,提高所述被控端末端对所述目标对象进行检查时的安全性。
图13为本公开实施例提供的另一种机器人位置处理的装置的结构示意图,如图13所示,所述装置应用于被控端,所述处理单元42还用于:
基于第一控制器对所述位移信息和所述目标接触力进行处理,得到所述位置变化量。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述处理单元42包括:
获取模块421,用于响应于主控端的控制指令,基于所述第一控制器根据所述控制指令定义被控端的末端姿态,并基于所述第一控制器控制传感器获取所述被控端与
目标对象的所述目标接触力;
第一计算模块422,用于基于所述第一控制器使用所述末端姿态和所述目标接触力进行计算,得到所述被控端与所述目标对象接触面的法向量;
生成模块423,用于基于所述第一控制器根据所述位移信息生成在所述目标对象接触面中的位置向量,并基于所述第一控制器使用所述法向量和所述位置向量进行计算,得到所述位置向量的第一分量,所述第一分量为垂直于所述法向量方向的向量;
第二计算模块424,用于通过设置所述第一分量的模与所述位置向量的模相同,基于所述第一控制器计算得到所述被控端的位置变化量,所述位置变化量的方向与所述第一分量的方向相同,所述位置变化量的大小与所述位置向量的大小相同。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述第一计算模块422还用于:
基于所述第一控制器控制预设的力传感器获取所述被控端与所述目标对象的所述目标接触力,所述目标接触力为基于所述预设的力传感器获取的预设数量及方向的力。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述生成模块423还用于:
基于所述第一控制器使用所述末端姿态和所述目标接触力进行矩阵计算,得到目标接触力向量;
基于所述第一控制器对所述目标接触力向量进行归一化处理,得到所述被控端与所述目标对象接触面的法向量。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述生成模块423还用于:
基于所述第一控制器使用所述法向量和所述位置向量进行点积计算,得到所述位置向量的第二分量,所述第二分量为所述法向量方向上的向量;
基于所述第一控制器使用所述位置向量和所述第二分量进行向量的减法运算,得到第一分量。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述装置还包括:
定义单元44,用于获取所述传感器z轴的目标接触力,定义所述实际接触力为与所述传感器z轴的目标接触力大小相同方向相反的力;
比较单元45,用于接收主控端发送的所述参照力,将所述实际接触力与所述参照
力进行比较;
增大单元46,用于在确定所述实际接触力小于所述参照力时,增大所述被控端末端对所述目标对象的压力;
减小单元47,用于在确定所述实际接触力大于所述参照力时,减小所述被控端末端对所述目标对象的压力。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述装置还包括:
比对单元48,用于接收所述主控端发送的所述参照力,将所述实际接触力与所述参照力进行比较,得到第一目标偏差;
分析单元49,用于基于第二控制器对所述第一目标偏差进行分析,得到所述被控端末端对所述目标对象的压力增大或减小的变化量;
第二控制单元410,用于根据所述变化量控制增大或减小所述被控端末端对所述目标对象的压力。
进一步地,在本实施例一种可能的实现方式中,如图13所示,所述分析单元49包括:
设置模块491,用于设置比例增益和微分增益,基于所述第二控制器使用所述比例增益与所述第一目标偏差进行乘积计算,得到第一计算结果;以及
第一计算模块492,用于基于所述第二控制器使用所述微分增益与目标偏差变化量进行乘积计算,得到第二计算结果,所述目标偏差变化量为所述第一目标偏差与前一时刻的目标偏差之差,所述目标偏差为所述参照力与所述实际接触力之差;
第二计算模块493,用于基于所述第二控制器对所述第一计算结果和所述第二计算结果进行求和计算,得到所述变化量。
图14为本公开实施例提供的另一种机器人位置处理的装置的结构示意图,如图14所示,所述装置应用于主控端,包括:
收发单元51,用于基于主控端与被控端之间的连接,发送控制指令至所述被控端,控制所述被控端启动检测任务,并接收所述被控端的应答信息及图像信息;以及
发送位移信息至所述被控端,以便所述被控端根据所述位移信息进行移动,并实时接收所述被控端的所述应答信息及所述图像信息。
进一步地,在本实施例一种可能的实现方式中,如图14所示,所述收发单元51还用于:
发送参照力至所述被控端,以便所述被控端根据所述参照力控制其末端与目标对
象的实际接触力,并接收所述被控端的反馈信息,所述目标对象为所述被控端的检测对象。
进一步地,在本实施例一种可能的实现方式中,所述控制指令还包括超声控制指令及摄像头控制指令。
需要说明的是,前述对方法实施例的解释说明,也适用于本实施例的装置,原理相同,本实施例中不再限定。
根据本公开的实施例,本公开还提供了一种电子设备、一种可读存储介质和一种计算机程序产品。
图15示出了可以用来实施本公开的实施例的示例电子设备600的示意性框图。电子设备旨在表示各种形式的数字计算机,诸如,膝上型计算机、台式计算机、工作台、个人数字助理、服务器、刀片式服务器、大型计算机、和其它适合的计算机。电子设备还可以表示各种形式的移动装置,诸如,个人数字处理、蜂窝电话、智能电话、可穿戴设备和其它类似的计算装置。本文所示的部件、它们的连接和关系、以及它们的功能仅仅作为示例,并且不意在限制本文中描述的和/或者要求的本公开的实现。
如图15所示,设备600包括计算单元601,其可以根据存储在ROM(Read-Only Memory,只读存储器)602中的计算机程序或者从存储单元608加载到RAM(Random Access Memory,随机访问/存取存储器)603中的计算机程序,来执行各种适当的动作和处理。在RAM 603中,还可存储设备600操作所需的各种程序和数据。计算单元601、ROM 602以及RAM 603通过总线604彼此相连。I/O(Input/Output,输入/输出)接口605也连接至总线604。
设备600中的多个部件连接至I/O接口605,包括:输入单元606,例如键盘、鼠标等;输出单元607,例如各种类型的显示器、扬声器等;存储单元608,例如磁盘、光盘等;以及通信单元609,例如网卡、调制解调器、无线通信收发机等。通信单元609允许设备600通过诸如因特网的计算机网络和/或各种电信网络与其他设备交换信息/数据。
计算单元601可以是各种具有处理和计算能力的通用和/或专用处理组件。计算单元601的一些示例包括但不限于CPU(Central Processing Unit,中央处理单元)、GPU(Graphic Processing Units,图形处理单元)、各种专用的AI(Artificial Intelligence,人工智能)计算芯片、各种运行机器学习模型算法的计算单元、DSP(Digital Signal Processor,数字信号处理器)、以及任何适当的处理器、控制器、微控制器等。计算单元601执行上文所描述的各个方法和处理,例如机器人位置处理的方法。例如,在一些实施例中,机器人位置处理的方法可被实现为计算机软件程序,其被有形地包含于机器可读介质,例如存储单元608。在一些实施例中,计算机程序的部分或者全部可以经由ROM 602和/或通信单元609而被载入和/或安装到设备600上。当计算机程序加载到RAM 603并由计算单元601执行时,可以执行上文描述的方法的
一个或多个步骤。备选地,在其他实施例中,计算单元601可以通过其他任何适当的方式(例如,借助于固件)而被配置为执行前述机器人位置处理的方法。
本文中以上描述的系统和技术的各种实施方式可以在数字电子电路系统、集成电路系统、FPGA(Field Programmable Gate Array,现场可编程门阵列)、ASIC(Application-Specific Integrated Circuit,专用集成电路)、ASSP(Application Specific Standard Product,专用标准产品)、SOC(System On Chip,片上系统)、CPLD(Complex Programmable Logic Device,复杂可编程逻辑设备)、计算机硬件、固件、软件、和/或它们的组合中实现。这些各种实施方式可以包括:实施在一个或者多个计算机程序中,该一个或者多个计算机程序可在包括至少一个可编程处理器的可编程系统上执行和/或解释,该可编程处理器可以是专用或者通用可编程处理器,可以从存储系统、至少一个输入装置、和至少一个输出装置接收数据和指令,并且将数据和指令传输至该存储系统、该至少一个输入装置、和该至少一个输出装置。
用于实施本公开的方法的程序代码可以采用一个或多个编程语言的任何组合来编写。这些程序代码可以提供给通用计算机、专用计算机或其他可编程数据处理装置的处理器或控制器,使得程序代码当由处理器或控制器执行时使流程图和/或框图中所规定的功能/操作被实施。程序代码可以完全在机器上执行、部分地在机器上执行,作为独立软件包部分地在机器上执行且部分地在远程机器上执行或完全在远程机器或服务器上执行。
在本公开的上下文中,机器可读介质可以是有形的介质,其可以包含或存储以供指令执行系统、装置或设备使用或与指令执行系统、装置或设备结合地使用的程序。机器可读介质可以是机器可读信号介质或机器可读储存介质。机器可读介质可以包括但不限于电子的、磁性的、光学的、电磁的、红外的、或半导体系统、装置或设备,或者上述内容的任何合适组合。机器可读存储介质的更具体示例会包括基于一个或多个线的电气连接、便携式计算机盘、硬盘、RAM、ROM、EPROM(Electrically Programmable Read-Only-Memory,可擦除可编程只读存储器)或快闪存储器、光纤、CD-ROM(Compact Disc Read-Only Memory,便捷式紧凑盘只读存储器)、光学储存设备、磁储存设备、或上述内容的任何合适组合。
为了提供与用户的交互,可以在计算机上实施此处描述的系统和技术,该计算机具有:用于向用户显示信息的显示装置(例如,CRT(Cathode-Ray Tube,阴极射线管)或者LCD(Liquid Crystal Display,液晶显示器)监视器);以及键盘和指向装置(例如,鼠标或者轨迹球),用户可以通过该键盘和该指向装置来将输入提供给计算机。其它种类的装置还可以用于提供与用户的交互;例如,提供给用户的反馈可以是任何形式的传感反馈(例如,视觉反馈、听觉反馈、或者触觉反馈);并且可以用任何形式(包括声输入、语音输入或者触觉输入)来接收来自用户的输入。
可以将此处描述的系统和技术实施在包括后台部件的计算系统(例如,作为数据服务器)、或者包括中间件部件的计算系统(例如,应用服务器)、或者包括前端部
件的计算系统(例如,具有图形用户界面或者网络浏览器的用户计算机,用户可以通过该图形用户界面或者该网络浏览器来与此处描述的系统和技术的实施方式交互)、或者包括这种后台部件、中间件部件、或者前端部件的任何组合的计算系统中。可以通过任何形式或者介质的数字数据通信(例如,通信网络)来将系统的部件相互连接。通信网络的示例包括:LAN(Local Area Network,局域网)、WAN(Wide Area Network,广域网)、互联网和区块链网络。
计算机系统可以包括客户端和服务器。客户端和服务器一般远离彼此并且通常通过通信网络进行交互。通过在相应的计算机上运行并且彼此具有客户端-服务器关系的计算机程序来产生客户端和服务器的关系。服务器可以是云服务器,又称为云计算服务器或云主机,是云计算服务体系中的一项主机产品,以解决了传统物理主机与VPS服务("Virtual Private Server",或简称"VPS")中,存在的管理难度大,业务扩展性弱的缺陷。服务器也可以为分布式系统的服务器,或者是结合了区块链的服务器。
其中,需要说明的是,人工智能是研究使计算机来模拟人的某些思维过程和智能行为(如学习、推理、思考、规划等)的学科,既有硬件层面的技术也有软件层面的技术。人工智能硬件技术一般包括如传感器、专用人工智能芯片、云计算、分布式存储、大数据处理等技术;人工智能软件技术主要包括计算机视觉技术、语音识别技术、自然语言处理技术以及机器学习/深度学习、大数据处理技术、知识图谱技术等几大方向。
应该理解,可以使用上面所示的各种形式的流程,重新排序、增加或删除步骤。例如,本公开中记载的各步骤可以并行地执行,也可以顺序地执行,也可以以不同的次序执行,只要能够实现本公开的技术方案所期望的结果,本文在此不进行限制。
上述具体实施方式,并不构成对本公开保护范围的限制。本领域技术人员应该明白的是,根据设计要求和其他因素,可以进行各种修改、组合、子组合和替代。任何在本公开的精神和原则之内所作的修改、等同替换和改进等,均应包含在本公开保护范围之内。
Claims (17)
- 一种机器人位置处理的方法,其特征在于,所述方法应用于被控端,包括:接收主控端发送的位移信息,并采集所述被控端与目标对象之间的目标接触力,所述目标对象为所述被控端的检测对象;对所述位移信息和所述目标接触力进行处理,得到所述被控端的位置变化量;根据所述位置变化量控制所述被控端移动的距离和方向。
- 根据权利要求1所述的方法,其特征在于,所述对所述位移信息和所述目标接触力进行处理,得到所述被控端的位置变化量包括:基于第一控制器对所述位移信息和所述目标接触力进行处理,得到所述位置变化量。
- 根据权利要求2所述的方法,其特征在于,所述基于第一控制器对所述位移信息和所述目标接触力进行处理,得到所述位置变化量包括:响应于主控端的控制指令,基于所述第一控制器根据所述控制指令定义被控端的末端姿态,并基于所述第一控制器控制传感器获取所述被控端与目标对象的所述目标接触力;基于所述第一控制器使用所述末端姿态和所述目标接触力进行计算,得到所述被控端与所述目标对象接触面的法向量;基于所述第一控制器根据所述位移信息生成在所述目标对象接触面中的位置向量,并基于所述第一控制器使用所述法向量和所述位置向量进行计算,得到所述位置向量的第一分量,所述第一分量为垂直于所述法向量方向的向量;通过设置所述第一分量的模与所述位置向量的模相同,基于所述第一控制器计算得到所述被控端的位置变化量,所述位置变化量的方向与所述第一分量的方向相同,所述位置变化量的大小与所述位置向量的大小相同。
- 根据权利要求3所述的方法,其特征在于,所述基于所述第一控制器控制传感器获取所述被控端与目标对象的所述目标接触力,包括:基于所述第一控制器控制预设的力传感器获取所述被控端与所述目标对象的所述目标接触力,所述目标接触力为基于所述预设的力传感器获取的预设数量及方向的力。
- 根据权利要求3所述的方法,其特征在于,所述基于所述第一控制器使用所述末端姿态和所述目标接触力进行计算,得到所述被控端与所述目标对象接触面的法向量包括:基于所述第一控制器使用所述末端姿态和所述目标接触力进行矩阵计 算,得到目标接触力向量;基于所述第一控制器对所述目标接触力向量进行归一化处理,得到所述被控端与所述目标对象接触面的法向量。
- 根据权利要求3所述的方法,其特征在于,所述基于所述第一控制器使用所述法向量和所述位置向量进行计算,得到所述位置向量的第一分量包括:基于所述第一控制器使用所述法向量和所述位置向量进行点积计算,得到所述位置向量的第二分量,所述第二分量为所述法向量方向上的向量;基于所述第一控制器使用所述位置向量和所述第二分量进行向量的减法运算,得到第一分量。
- 根据权利要求1所述的方法,其特征在于,当所述被控端与所述目标对象的实际接触力大于或小于参照力时,所述方法包括:获取所述传感器z轴的目标接触力,定义所述实际接触力为与所述传感器z轴的目标接触力大小相同方向相反的力;接收主控端发送的所述参照力,将所述实际接触力与所述参照力进行比较;在确定所述实际接触力小于所述参照力时,增大所述被控端末端对所述目标对象的压力;在确定所述实际接触力大于所述参照力时,减小所述被控端末端对所述目标对象的压力。
- 根据权利要求7所述的方法,其特征在于,在增大或减小所述被控端末端对所述目标对象的压力时,所述方法还包括:接收所述主控端发送的所述参照力,将所述实际接触力与所述参照力进行比较,得到第一目标偏差;基于第二控制器对所述第一目标偏差进行分析,得到所述被控端末端对所述目标对象的压力增大或减小的变化量;根据所述变化量控制增大或减小所述被控端末端对所述目标对象的压力。
- 根据权利要求8所述的方法,其特征在于,所述基于第二控制器对所述第一目标偏差进行分析,得到所述被控端末端对所述目标对象的压力增大或减小的变化量包括:设置比例增益和微分增益,基于所述第二控制器使用所述比例增益与所述第一目标偏差进行乘积计算,得到第一计算结果;以及基于所述第二控制器使用所述微分增益与目标偏差变化量进行乘积计算,得到第二计算结果,所述目标偏差变化量为所述第一目标偏差与前一时刻的目标偏差之差,所述目标偏差为所述参照力与所述实际接触力之差;基于所述第二控制器对所述第一计算结果和所述第二计算结果进行求和计算,得到所述变化量。
- 一种机器人位置处理的方法,其特征在于,包括:基于主控端与被控端之间的连接,发送控制指令至所述被控端,控制所述被控端启动检测任务,并接收所述被控端的应答信息及图像信息;发送位移信息至所述被控端,以便所述被控端根据所述位移信息进行移动,并实时接收所述被控端的所述应答信息及所述图像信息。
- 根据权利要求10所述的方法,其特征在于,所述方法还包括:发送参照力至所述被控端,以便所述被控端根据所述参照力控制其末端与目标对象的实际接触力,并接收所述被控端的反馈信息,所述目标对象为所述被控端的检测对象。
- 根据权利要求10所述的方法,其特征在于,所述控制指令还包括超声控制指令及摄像头控制指令。
- 一种机器人位置处理的装置,其特征在于,所述装置应用于被控端,包括:接收单元,用于接收主控端发送的位移信息,并采集所述被控端与目标对象之间的目标接触力,所述目标对象为所述被控端的检测对象;处理单元,用于对所述位移信息和所述目标接触力进行处理,得到所述被控端的位置变化量;第一控制单元,用于根据所述位置变化量控制所述被控端移动的距离和方向。
- 一种机器人位置处理的装置,其特征在于,所述装置应用于主控端,包括:收发单元,用于基于主控端与被控端之间的连接,发送控制指令至所述被控端,控制所述被控端启动检测任务,并接收所述被控端的应答信息及图像信息;以及发送位移信息至所述被控端,以便所述被控端根据所述位移信息进行移动,并实时接收所述被控端的所述应答信息及所述图像信息。
- 一种电子设备,其特征在于,包括:至少一个处理器;以及与所述至少一个处理器通信连接的存储器;其中,所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器能够执行权利要求1-9或10-12中任一项所述的方法。
- 一种存储有计算机指令的非瞬时计算机可读存储介质,其特征在于,所述计算机指令用于使所述计算机执行根据权利要求1-9或10-12中任一项 所述的方法。
- 一种计算机程序产品,其特征在于,包括计算机程序,所述计算机程序在被处理器执行时实现根据权利要求1-9或10-12中任一项所述的方法。
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2023/105472 WO2025007230A1 (zh) | 2023-07-03 | 2023-07-03 | 机器人位置处理的方法及装置、电子设备和存储介质 |
| CN202380093912.5A CN120676908A (zh) | 2023-07-03 | 2023-07-03 | 机器人位置处理的方法及装置、电子设备和存储介质 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025007230A1 true WO2025007230A1 (zh) | 2025-01-09 |
Family
ID=94171138
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2023/105472 Pending WO2025007230A1 (zh) | 2023-07-03 | 2023-07-03 | 机器人位置处理的方法及装置、电子设备和存储介质 |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN120676908A (zh) |
| WO (1) | WO2025007230A1 (zh) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH03190687A (ja) * | 1989-12-15 | 1991-08-20 | Fujitsu Ltd | ロボットの倣い速度制御方式 |
| US5646493A (en) * | 1994-05-20 | 1997-07-08 | Fanuc Ltd. | Robot profile control method |
| CN112223283A (zh) * | 2020-09-29 | 2021-01-15 | 腾讯科技(深圳)有限公司 | 机械臂、机器人、机械臂的控制方法、处理设备及介质 |
| CN112936278A (zh) * | 2021-02-07 | 2021-06-11 | 深圳市优必选科技股份有限公司 | 机器人的人机协作控制方法、装置和机器人 |
| US20220079556A1 (en) * | 2019-01-29 | 2022-03-17 | Kunshan Imagene Medical Co., Ltd. | Ultrasound scanning control method, ultrasound scanning device, and storage medium |
| CN114694825A (zh) * | 2020-12-29 | 2022-07-01 | 无锡祥生医疗科技股份有限公司 | 超声探头扫查方法、装置及存储介质 |
| CN115179279A (zh) * | 2022-06-21 | 2022-10-14 | 深圳瀚维智能医疗科技有限公司 | 机械臂的控制方法、装置、机械臂以及可读存储介质 |
| CN115946120A (zh) * | 2023-01-09 | 2023-04-11 | 上海艾利特机器人有限公司 | 机械臂控制方法、装置、设备和介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN120676908A (zh) | 2025-09-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR102014351B1 (ko) | 수술정보 구축 방법 및 장치 | |
| Mathiassen et al. | An ultrasound robotic system using the commercial robot UR5 | |
| CN111309144B (zh) | 三维空间内注视行为的识别方法、装置及存储介质 | |
| WO2022062464A1 (zh) | 基于计算机视觉的手眼标定方法及装置、存储介质 | |
| Akbari et al. | Robotic ultrasound scanning with real-time image-based force adjustment: Quick response for enabling physical distancing during the COVID-19 pandemic | |
| US20240173018A1 (en) | System and apparatus for remote interaction with an object | |
| CN114495273B (zh) | 一种机器人手势遥操作方法及相关装置 | |
| CN113505694A (zh) | 一种基于视线追踪的人机交互方法、装置及计算机设备 | |
| CN109543644B (zh) | 一种多模态手势的识别方法 | |
| CN119074239B (zh) | 腔镜手术机器人的主从控制方法、装置、控制台及存储介质 | |
| WO2022217667A1 (zh) | 人体生理样本采集方法、装置、电子设备和存储介质 | |
| CN118466805A (zh) | 机器视觉和手势识别的非接触式三维模型人机交互方法 | |
| CN105005381A (zh) | 一种虚拟机械臂交互的抖动消除方法 | |
| CN119311122A (zh) | 基于自适应多模态融合的云游戏虚拟现实手势交互方法、设备及存储介质 | |
| CN115317136A (zh) | 手术机器人的控制方法、控制装置及机器人 | |
| WO2025007230A1 (zh) | 机器人位置处理的方法及装置、电子设备和存储介质 | |
| CN111290577B (zh) | 一种非接触式输入方法及装置 | |
| CN118809596A (zh) | 一种机器人控制指令确定方法、装置、设备、介质及产品 | |
| CN113561172A (zh) | 一种基于双目视觉采集的灵巧手控制方法及装置 | |
| CN116185205B (zh) | 非接触手势交互方法和装置 | |
| CN118578381A (zh) | 机器人的物体抓取方法、装置、电子设备及存储介质 | |
| CN117100390A (zh) | 一种腔镜手术机器人力感知反馈方法、装置及存储介质 | |
| WO2025007231A1 (zh) | 机器人姿态处理的方法及装置、电子设备和存储介质 | |
| CN118802992A (zh) | 一种机器人远程控制方法及系统 | |
| CN115861920A (zh) | 基于方舱医院的病人异常识别方法、装置、服务器及存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23943967 Country of ref document: EP Kind code of ref document: A1 |