CN105252532A - Method of cooperative flexible attitude control for motion capture robot - Google Patents
- Publication number
- CN105252532A (application CN201510824988.8A); also published as CN105252532B
- Authority
- CN
- China
- Prior art keywords
- angle
- joint
- robot
- human body
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Abstract
Description
Technical Field
The present invention relates to a method of cooperative flexible attitude control for a motion capture robot, and belongs to the fields of human-computer interaction and artificial intelligence.
Background Art
Human-Computer Interaction (HCI) is the discipline concerned with designing, evaluating, and implementing interactive computer systems for human use, and with studying the phenomena that arise from such use. HCI technology chiefly studies the exchange of information between humans and computers, which comprises two directions: human-to-computer and computer-to-human. In the former, people convey information to the computer through devices such as keyboards, mice, joysticks, position trackers, and data gloves, using the hands, feet, voice, posture and body movement, gaze, and even brain waves; in the latter, the computer provides information to people through output or display devices such as printers, plotters, monitors, head-mounted displays, and loudspeakers.
Kinect is the Xbox 360 somatosensory peripheral officially announced by Microsoft at the E3 expo on June 2, 2009. Kinect moved beyond single-mode game control and gave fuller expression to the idea of human-computer interaction. It is a 3D motion-sensing camera that integrates real-time motion capture, image recognition, microphone input, speech recognition, and social interaction. Somatosensation, i.e. somatic sensation, is the collective term for touch, pressure, temperature, pain, and proprioception (the sense of muscle and joint position and movement, body posture and movement, and facial expression). Using Light Coding and light-source calibration, Kinect emits laser light from an infrared projector, records each speckle in the scene with an infrared CMOS camera, and, by comparison with the reference speckle pattern, computes a 3D depth image on its chip, which is then passed to a skeleton tracking system; this allows Kinect to be applied in many fields.
NAO is a 58 cm tall programmable humanoid robot developed by Aldebaran Robotics. Its body has 25 degrees of freedom, driven mainly by electric motors and actuators. Its motion module is based on generalized inverse kinematics and handles Cartesian coordinates, joint control, balance, redundancy, and task prioritization. The head of NAO embeds an Intel Atom 1.6 GHz processor running a Linux kernel with the NAOqi operating system, an open software platform that can be programmed in C++ or Python. Owing to these characteristics, the NAO robot can reproduce motion tracking most faithfully and is an important component of the present invention.
Summary of the Invention
In view of the deficiencies of the prior art, the present invention provides a method of cooperative flexible attitude control for a motion capture robot. The purpose of the present invention is to propose a new mode of human-computer interaction.
The technical scheme of the present invention is as follows:
A method of cooperative flexible attitude control for a motion capture robot comprises the following steps:
(1) Using the somatosensory device Kinect, capture in real time the displacement data of the target joints of the target human body at different moments. The target human body stands within the viewing range of the Kinect; to ensure the best recognition effect, the target human body should be 1.2-3.5 m in front of the Kinect lens and within a horizontal viewing angle of ±57°.
(2) Send the target joint displacement data captured by the Kinect to the PC.
(3) The PC receives all the target joint displacement data and draws the human skeleton frame in real time from it.
(4) The PC dynamically displays the human skeleton frame on the screen and provides error feedback.
(5) The PC processes all the received target joint displacement data, including relative-threshold distance comparison filtering and space vector calculation, to obtain the attitude control data of the humanoid robot NAO, i.e. the joint angle data. Because of environmental factors and jitter of the sensor itself, the raw data contain interference; it is therefore necessary to apply relative-threshold distance comparison filtering to the raw data so that the robot's motion tracking is more accurate and reliable.
(6) Call the JointControl function in the NAOqi operating system of the humanoid robot NAO and, according to the transmitted joint angle data, control the servos of the NAO so that it tracks the target human body's motion in real time.
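As an illustration of step (6), a minimal sketch is given below. It assumes the NAOqi Python SDK and uses the general ALMotion proxy rather than the JointControl wrapper named above; the robot address, joint names, and angle values are placeholders.

```python
# Sketch only: assumes the NAOqi Python SDK is installed and the robot is reachable.
from naoqi import ALProxy

ROBOT_IP, ROBOT_PORT = "192.168.1.10", 9559           # placeholder address

motion = ALProxy("ALMotion", ROBOT_IP, ROBOT_PORT)
motion.setStiffnesses("Body", 1.0)                     # power the servos before sending targets

# Joint angles computed on the PC side (radians), e.g. for the left arm:
names  = ["LShoulderPitch", "LShoulderRoll", "LElbowYaw", "LElbowRoll"]
angles = [0.5, 0.3, -1.2, -0.8]

# Non-blocking command at 20 % of maximum speed; new targets can be streamed every frame
# so that the robot keeps tracking the captured human posture.
motion.setAngles(names, angles, 0.2)
```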
Preferably according to the present invention, step (4) further comprises: the PC dynamically displays the human skeleton frame; if the skeleton frame displayed on the PC is consistent with the motion of the target human body, the joint angle data are sent to the humanoid robot NAO; if the skeleton frame displayed on the PC does not match the motion of the target human body, the program is re-initialized to ensure that the NAO receives reliable attitude control data.
Preferably according to the present invention, in step (5), the relative-threshold distance comparison filtering comprises:
By observing the change in the spatial coordinates of a displaced joint point, compute the fluctuation vector formed by the same joint at the start and end of a fixed time interval (for example 0.1 s-0.5 s), and examine the magnitude of this vector as well as its fluctuation along each axis of the spatial coordinate system; joint fluctuation values are screened by setting fluctuation thresholds. It can be seen that jitter in the recognized joint positions is mainly rapid jitter along the coordinate axes, and that the magnitude of the fluctuation vector increases sharply when jitter occurs. Joint points with large fluctuations are therefore handled accordingly, joint points with small fluctuations keep their previous state, and different thresholds are used for different joints, ensuring that the result after each filtering step is the optimal solution and that the robot's attitude changes remain continuous.
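A minimal sketch of this filtering step is shown below. The window length, the threshold values, and the exact treatment of large fluctuations (rejected here as sensor jitter) are illustrative assumptions, not values prescribed by the invention.

```python
import numpy as np

# Illustrative thresholds (metres). The invention only states that different joints use
# different thresholds and that the window is 0.1-0.5 s, so these numbers are assumptions.
SMALL, LARGE = 0.005, 0.08

def filter_joint(prev_accepted, window_start, window_end, small=SMALL, large=LARGE):
    """Relative-threshold distance comparison filter for a single joint.

    window_start / window_end: 3D joint positions at the start and end of the fixed
    interval. Returns the joint position that is passed on to the robot.
    """
    fluctuation = np.asarray(window_end) - np.asarray(window_start)  # fluctuation vector
    magnitude = float(np.linalg.norm(fluctuation))                   # its modulus

    if magnitude < small:
        return prev_accepted        # tiny fluctuation: keep the previous state unchanged
    if magnitude > large:
        return prev_accepted        # sharp spike along an axis: assumed sensor jitter, rejected
    return np.asarray(window_end)   # plausible genuine movement: accept the new position
```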
Preferably according to the present invention, in step (5), the space vector calculation comprises:
The spatial coordinate system used by Kinect differs from the usual one: the zero points of its x axis and y axis are the same as in a conventional coordinate system, but the zero point of its z axis is the Kinect sensor itself, with the positive direction pointing straight ahead of the Kinect. The Kinect spatial coordinate system is shown in Fig. 6.
In the Kinect coordinate system, for any two non-coincident points P1(x1, y1, z1) and P2(x2, y2, z2), the vector P1P2 they form is:
If there is another point P3 that does not lie on the line through P1 and P2, the following relation holds:
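Both formulas appear only as images in the original publication; the standard reconstruction implied by the surrounding text is:

```latex
\vec{P_1P_2} = (x_2 - x_1,\; y_2 - y_1,\; z_2 - z_1)

\cos\angle P_2P_1P_3 =
  \frac{\vec{P_1P_2}\cdot\vec{P_1P_3}}{\bigl|\vec{P_1P_2}\bigr|\,\bigl|\vec{P_1P_3}\bigr|}
```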
Based on the above properties, the calculation of human joint angles is reduced to the calculation of angles between space vectors. The calculation methods for the upper-limb joint angles are explained below.
1) Joint angle: LeftElbowRoll
As shown in Fig. 7, to compute the LeftElbowRoll angle it suffices to construct a pair of vectors along the two sides of the spatial angle,
and apply the joint angle formula given above:
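The formula itself is an image in the original; assuming A, B, and C in Fig. 7 denote the left shoulder, left elbow, and left wrist, the reconstruction is:

```latex
\theta_{\text{LeftElbowRoll}} =
  \arccos\frac{\vec{BA}\cdot\vec{BC}}{\bigl|\vec{BA}\bigr|\,\bigl|\vec{BC}\bigr|}
```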
2) Joint angle: LeftElbowYaw
As shown in Fig. 8:
The angle of the LeftElbowYaw joint is the angle produced when the elbow rotates about the upper arm. It is normally the dihedral angle between the two intersecting planes ABC and BCD, i.e. the angle between planes S1 and S2 in the figure. Using the formula for the angle between spatial planes, the LeftElbowYaw angle is computed as follows.
First, the formula for the normal vector of the plane containing two non-collinear vectors is needed:
The normal vectors of planes S1 and S2 are therefore expressed as:
The angle of the LeftElbowYaw joint then equals the angle between the two normal vectors M1 and M2:
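These formulas are images in the original; since the normal of the plane spanned by two non-collinear vectors is their cross product, a reconstruction with A, B, C, D the points of Fig. 8 is:

```latex
\vec{M_1} = \vec{BA}\times\vec{BC}, \qquad
\vec{M_2} = \vec{CB}\times\vec{CD}, \qquad
\theta_{\text{LeftElbowYaw}} =
  \arccos\frac{\vec{M_1}\cdot\vec{M_2}}{\bigl|\vec{M_1}\bigr|\,\bigl|\vec{M_2}\bigr|}
```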
3) Joint angle: LeftShoulder.Roll
As shown in Fig. 9:
4) Joint angle: LeftShoulderPitch
As shown in Fig. 10:
The angle of the LeftShoulderPitch joint is the angle between the plane formed by the upper arm and the shoulder axis and the plane formed by the shoulder axis and the spine point. It is normally the dihedral angle between the two intersecting planes ABC and BCD. By analogy with the LeftElbowYaw joint angle and using the formula for the angle between spatial planes, the LeftShoulderPitch angle is computed as follows:
Normal vector of plane ABC:
Normal vector of plane BCD:
Calculation of the joint angle LeftShoulderPitch:
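The formulas are images in the original; the reconstruction follows the same pattern as for LeftElbowYaw, with A, B, C, D the points of Fig. 10:

```latex
\vec{N_1} = \vec{BA}\times\vec{BC}, \qquad
\vec{N_2} = \vec{CB}\times\vec{CD}, \qquad
\theta_{\text{LeftShoulderPitch}} =
  \arccos\frac{\vec{N_1}\cdot\vec{N_2}}{\bigl|\vec{N_1}\bigr|\,\bigl|\vec{N_2}\bigr|}
```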
5) Hand opening angle: LeftHand
Because Kinect cannot read hand information down to the state of the individual fingers, the hand opening angle is estimated from the distance between the LeftHand and LeftWrist points, as follows:
As shown in Fig. 11:
At this point the angles of all joints of the left arm have been calculated; the joint angles of the right arm can be calculated in the same way.
6) Joint angle: HeadPitch
Of the two head joints, HeadPitch is the angle associated with lowering and raising the head.
As shown in Fig. 12:
The angle of the head joint HeadPitch is taken as the angle between the vector from the shoulder center to the head and the normal vector of the plane formed by the two shoulders and the spine. The specific formula is as follows:
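The formula is an image in the original; writing SH for the vector from the shoulder center to the head and N for the normal of the plane through the two shoulders and the spine point, the reconstruction is:

```latex
\theta_{\text{HeadPitch}} =
  \arccos\frac{\vec{SH}\cdot\vec{N}}{\bigl|\vec{SH}\bigr|\,\bigl|\vec{N}\bigr|}
```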
7) Joint angle: HeadYaw
As shown in Fig. 13:
The other important head joint angle, HeadYaw, governs the left-right rotation of the head. Because the Kinect skeleton recognition result provides no data on face orientation, the HeadYaw angle cannot be computed directly. However, the skeleton animation shows that the spatial position of the head lies clearly in front of the shoulder center (i.e. when facing the Kinect frontally, the Head point and the ShoulderCenter point are separated by a certain distance along the z axis), so this joint angle is converted into the calculation of an angle between planes:
Normal vector of plane ABC:
Normal vector of plane BCD:
Joint angle:
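The formulas are images in the original; as with LeftElbowYaw, the reconstruction is the angle between the two plane normals, with A, B, C, D the points of Fig. 13:

```latex
\vec{K_1} = \vec{BA}\times\vec{BC}, \qquad
\vec{K_2} = \vec{CB}\times\vec{CD}, \qquad
\theta_{\text{HeadYaw}} =
  \arccos\frac{\vec{K_1}\cdot\vec{K_2}}{\bigl|\vec{K_1}\bigr|\,\bigl|\vec{K_2}\bigr|}
```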
During the robot's motion, lower-limb joint activity directly affects the overall stability of the robot. To simplify control, the relative position method is used for the lower limbs: the position of the lower-limb extremity in a relative coordinate system, combined with the height of the human body's center of mass, is used to control the robot's motion.
As shown in Fig. 14:
The HipCenter point of the skeleton recognition result is projected vertically onto the ground and taken as the origin of a new coordinate system; the coordinates of the left and right ankle points, LeftAnkle and RightAnkle, in the new coordinate system are taken as the robot's control data.
The coordinates of points B and C in the O coordinate system are:

LeftFoot = (A_z - B_z, A_x - B_x)    (3-17)

RightFoot = (A_z - C_z, A_x - C_x)    (3-18)
To compensate for the absolute distance error caused by differences in height between people, the coordinate values are divided by the hip width of the human body, where the hip width is calculated as follows:
The position of the lower-limb extremity in the new coordinate system is therefore:
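The hip-width formula and the normalized positions appear only as images in the original; a plausible reconstruction, taking HipLeft and HipRight to be the hip joint points returned by the skeleton tracker (an assumption), is:

```latex
W_{\text{hip}} = \bigl|\,\overrightarrow{\text{HipLeft}\,\text{HipRight}}\,\bigr|, \qquad
\text{LeftFoot}' = \frac{\text{LeftFoot}}{W_{\text{hip}}}, \qquad
\text{RightFoot}' = \frac{\text{RightFoot}}{W_{\text{hip}}}
```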
The features of the present invention are:
1. The present invention uses advanced somatosensory equipment and state-of-the-art robot technology to design a new human-computer interaction system.
2. The present invention controls a humanoid robot to imitate human motion in real time, with high reliability, strong flexibility, and good robustness under different complex environmental conditions.
3. The present invention uses a relative-threshold distance filtering method, which reduces the influence of environmental factors and of the sensor's own jitter and effectively improves data reliability and the stability of system operation.
4. The present invention applies a skeleton tracking algorithm to human posture recognition. Exploiting the fact that NAO is a programmable humanoid robot with 25 degrees of freedom, it controls the humanoid robot to imitate human motion in real time, with high reliability and good robustness under different complex environmental conditions. The invention is based on Kinect skeleton frame tracking and the NAO robot hardware platform. On the Kinect side, the program is first initialized; this process includes checking the hardware connection and driver, instantiating the sensor object, obtaining depth-stream access, and registering event listeners. Skeleton frame recognition then begins; this process includes acquiring the depth image, recognizing the human joint points in the depth image through the skeleton recognition algorithm library and extracting their spatial coordinates, and, after filtering and space vector calculation, providing attitude control data for the NAO robot.
Brief Description of the Drawings
Fig. 1 is a flow chart of the method of the present invention;
Fig. 2 is a flow chart of re-initializing the program when the human skeleton frame displayed on the PC does not match the motion of the target human body;
Fig. 3 is a flow chart of human skeleton frame recognition;
Fig. 4 is a schematic diagram of the human skeleton frame dynamically displayed on the screen;
Fig. 5 is a flow chart of the relative-threshold distance comparison filtering algorithm of the present invention;
Fig. 6 shows the Kinect spatial coordinate system;
Fig. 7 is a schematic diagram of the calculation of the joint angle LeftElbowRoll;
Fig. 8 is a schematic diagram of the calculation of the joint angle LeftElbowYaw;
Fig. 9 is a schematic diagram of the calculation of the joint angle LeftShoulder.Roll;
Fig. 10 is a schematic diagram of the calculation of the joint angle LeftShoulderPitch;
Fig. 11 is a schematic diagram of the hand opening angle LeftHand;
Fig. 12 is a schematic diagram of the calculation of the joint angle HeadPitch;
Fig. 13 is a schematic diagram of the calculation of the joint angle HeadYaw;
Fig. 14 is a schematic diagram of the calculation in which the HipCenter point of the skeleton recognition result is projected vertically onto the ground as the origin of a new coordinate system and the coordinates of the left and right ankle points, LeftAnkle and RightAnkle, in the new coordinate system are taken as the robot's control data.
Detailed Description
The present invention is described in detail below with reference to the embodiment and the accompanying drawings, but is not limited thereto.
As shown in Figs. 1-5.
Embodiment

A method of cooperative flexible attitude control for a motion capture robot comprises the following steps:

(1) Using the somatosensory device Kinect, capture in real time the displacement data of the target joints of the target human body at different moments. The target human body stands within the viewing range of the Kinect; to ensure the best recognition effect, the target human body should be 1.2-3.5 m in front of the Kinect lens and within a horizontal viewing angle of ±57°. The Kinect has color and depth-sensing lenses, a horizontal field of view of 57°, a vertical field of view of 43°, and a depth-sensor range of 1.2 m-3.5 m. The service program is run on the PC (see Fig. 2); the Kinect is first initialized, a process that includes checking the hardware connection and driver, instantiating the sensor object, obtaining depth-stream access, registering events, and so on.

(2) Send the target joint displacement data captured by the Kinect to the PC. The target human body stands in front of the Kinect depth-sensing lens and may change the posture of the head, arms, fingers, and so on; the Kinect captures all the target joint displacement data.

(3) The PC receives all the target joint displacement data and draws the human skeleton frame in real time from it; the procedure for drawing the human skeleton frame is shown in Fig. 3.

(4) The PC dynamically displays the human skeleton frame on the screen and provides error feedback. Fig. 4 shows four representative states of the running service program: the part labeled 1 is the human skeleton frame built from the data first captured by the Kinect after program initialization, with the target human body within the viewing range, indicating that the program has started normally and begins to capture the posture changes of the target human body in real time; the part labeled 2 is the human skeleton frame built from the data captured by the Kinect at some moment after the program has started, reflecting the posture of the target human body at that moment; the part labeled 3 shows a distorted human skeleton frame caused by a program error or by the target human body pressing too close to the Kinect lens; the part labeled 4 arises when the target human body is outside the viewing range, so the Kinect cannot capture it. The parts labeled 3 and 4 are error feedback from the program to the user; in these cases the program must be re-initialized to ensure that the robot receives reliable attitude control data.

(5) The PC processes all the received target joint displacement data, including relative-threshold distance comparison filtering and space vector calculation, to obtain the attitude control data of the humanoid robot NAO, i.e. the joint angle data. Because of environmental factors and jitter of the sensor itself, the raw data contain interference; it is therefore necessary to apply relative-threshold distance comparison filtering to the raw data so that the robot's motion tracking is more accurate and reliable.

(6) Call the JointControl function in the NAOqi operating system of the humanoid robot NAO and, according to the transmitted joint angle data, control the servos of the NAO so that it tracks the target human body's motion in real time.
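To make the data flow of the embodiment concrete, a compressed sketch of the PC-side loop is shown below. It is not the patented implementation: `get_skeleton_frame()` is a hypothetical wrapper around whichever Kinect SDK binding delivers the joint coordinates, `filter_joint()` is the filter sketched earlier, and only the left elbow angle is computed.

```python
# Sketch under stated assumptions; the mapping of human angles onto NAO's joint
# conventions (signs, offsets, limits) is omitted for brevity.
import time
import numpy as np
from naoqi import ALProxy

motion = ALProxy("ALMotion", "192.168.1.10", 9559)     # placeholder robot address
motion.setStiffnesses("Body", 1.0)

def angle(a, b, c):
    """Angle at point b formed by points a and c (radians)."""
    u, v = np.subtract(a, b), np.subtract(c, b)
    return float(np.arccos(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))))

prev = {}                                              # last accepted position per joint
while True:
    frame = get_skeleton_frame()                       # hypothetical Kinect wrapper: {name: (x, y, z)}
    for name, pos in frame.items():                    # relative-threshold distance comparison filtering
        accepted = filter_joint(prev.get(name, pos), prev.get(name, pos), pos)
        prev[name] = accepted
        frame[name] = accepted

    # Space vector calculation (only LeftElbowRoll is shown here).
    elbow = angle(frame["ShoulderLeft"], frame["ElbowLeft"], frame["WristLeft"])
    motion.setAngles(["LElbowRoll"], [elbow], 0.2)      # stream the target to the robot
    time.sleep(0.1)                                     # lower bound of the filter window
```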
Preferably according to the present invention, step (4) further comprises: the PC dynamically displays the human skeleton frame; if the skeleton frame displayed on the PC is consistent with the motion of the target human body, the joint angle data are sent to the humanoid robot NAO; if the skeleton frame displayed on the PC does not match the motion of the target human body, the program is re-initialized to ensure that the NAO receives reliable attitude control data.

Preferably according to the present invention, in step (5), the relative-threshold distance comparison filtering comprises:

By observing the change in the spatial coordinates of a displaced joint point, compute the fluctuation vector formed by the same joint at the start and end of a fixed time interval (for example 0.1 s-0.5 s), and examine the magnitude of this vector as well as its fluctuation along each axis of the spatial coordinate system; joint fluctuation values are screened by setting fluctuation thresholds. It can be seen that jitter in the recognized joint positions is mainly rapid jitter along the coordinate axes, and that the magnitude of the fluctuation vector increases sharply when jitter occurs. Joint points with large fluctuations are therefore handled accordingly, joint points with small fluctuations keep their previous state, and different thresholds are used for different joints, ensuring that the result after each filtering step is the optimal solution and that the robot's attitude changes remain continuous.

Preferably according to the present invention, in step (5), the space vector calculation comprises:
The spatial coordinate system used by Kinect differs from the usual one: the zero points of its x axis and y axis are the same as in a conventional coordinate system, but the zero point of its z axis is the Kinect sensor itself, with the positive direction pointing straight ahead of the Kinect. The Kinect spatial coordinate system is shown in Fig. 6.

In the Kinect coordinate system, for any two non-coincident points P1(x1, y1, z1) and P2(x2, y2, z2), the vector P1P2 they form is:

If there is another point P3 that does not lie on the line through P1 and P2, the following relation holds:

Based on the above properties, the calculation of human joint angles is reduced to the calculation of angles between space vectors. The calculation methods for the upper-limb joint angles are explained below.
1) Joint angle: LeftElbowRoll

As shown in Fig. 7, to compute the LeftElbowRoll angle it suffices to construct a pair of vectors along the two sides of the spatial angle,

and apply the joint angle formula given above:

2) Joint angle: LeftElbowYaw

As shown in Fig. 8:

The angle of the LeftElbowYaw joint is the angle produced when the elbow rotates about the upper arm. It is normally the dihedral angle between the two intersecting planes ABC and BCD, i.e. the angle between planes S1 and S2 in the figure. Using the formula for the angle between spatial planes, the LeftElbowYaw angle is computed as follows.

First, the formula for the normal vector of the plane containing two non-collinear vectors is needed:

The normal vectors of planes S1 and S2 are therefore expressed as:

The angle of the LeftElbowYaw joint then equals the angle between the two normal vectors M1 and M2:

3) Joint angle: LeftShoulder.Roll

As shown in Fig. 9:

4) Joint angle: LeftShoulderPitch

As shown in Fig. 10:

The angle of the LeftShoulderPitch joint is the angle between the plane formed by the upper arm and the shoulder axis and the plane formed by the shoulder axis and the spine point. It is normally the dihedral angle between the two intersecting planes ABC and BCD. By analogy with the LeftElbowYaw joint angle and using the formula for the angle between spatial planes, the LeftShoulderPitch angle is computed as follows:

Normal vector of plane ABC:

Normal vector of plane BCD:

Calculation of the joint angle LeftShoulderPitch:

5) Hand opening angle: LeftHand

Because Kinect cannot read hand information down to the state of the individual fingers, the hand opening angle is estimated from the distance between the LeftHand and LeftWrist points, as follows:

As shown in Fig. 11:

At this point the angles of all joints of the left arm have been calculated; the joint angles of the right arm can be calculated in the same way.
6) Joint angle: HeadPitch

Of the two head joints, HeadPitch is the angle associated with lowering and raising the head.

As shown in Fig. 12:

The angle of the head joint HeadPitch is taken as the angle between the vector from the shoulder center to the head and the normal vector of the plane formed by the two shoulders and the spine. The specific formula is as follows:

7) Joint angle: HeadYaw

As shown in Fig. 13:

The other important head joint angle, HeadYaw, governs the left-right rotation of the head. Because the Kinect skeleton recognition result provides no data on face orientation, the HeadYaw angle cannot be computed directly. However, the skeleton animation shows that the spatial position of the head lies clearly in front of the shoulder center (i.e. when facing the Kinect frontally, the Head point and the ShoulderCenter point are separated by a certain distance along the z axis), so this joint angle is converted into the calculation of an angle between planes:

Normal vector of plane ABC:

Normal vector of plane BCD:

Joint angle:
During the robot's motion, lower-limb joint activity directly affects the overall stability of the robot. To simplify control, the relative position method is used for the lower limbs: the position of the lower-limb extremity in a relative coordinate system, combined with the height of the human body's center of mass, is used to control the robot's motion.

As shown in Fig. 14:

The HipCenter point of the skeleton recognition result is projected vertically onto the ground and taken as the origin of a new coordinate system; the coordinates of the left and right ankle points, LeftAnkle and RightAnkle, in the new coordinate system are taken as the robot's control data.

The coordinates of points B and C in the O coordinate system are:

LeftFoot = (A_z - B_z, A_x - B_x)    (3-17)

RightFoot = (A_z - C_z, A_x - C_x)    (3-18)

To compensate for the absolute distance error caused by differences in height between people, the coordinate values are divided by the hip width of the human body, where the hip width is calculated as follows:

The position of the lower-limb extremity in the new coordinate system is therefore:
Claims (4)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510824988.8A CN105252532B (en) | 2015-11-24 | 2015-11-24 | The method of the flexible gesture stability of motion capture robot collaboration |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510824988.8A CN105252532B (en) | 2015-11-24 | 2015-11-24 | The method of the flexible gesture stability of motion capture robot collaboration |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN105252532A true CN105252532A (en) | 2016-01-20 |
| CN105252532B CN105252532B (en) | 2017-07-04 |
Family
ID=55092618
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201510824988.8A Active CN105252532B (en) | 2015-11-24 | 2015-11-24 | The method of the flexible gesture stability of motion capture robot collaboration |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN105252532B (en) |
Cited By (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105945947A (en) * | 2016-05-20 | 2016-09-21 | 西华大学 | Robot writing system based on gesture control and control method of robot writing system |
| CN105999670A (en) * | 2016-05-31 | 2016-10-12 | 山东科技大学 | Shadow-boxing movement judging and guiding system based on kinect and guiding method adopted by same |
| CN106313072A (en) * | 2016-10-12 | 2017-01-11 | 南昌大学 | A Humanoid Robot Based on Kinect Somatosensory Control |
| CN106564055A (en) * | 2016-10-31 | 2017-04-19 | 金阳娃 | Stable motion planning method of simulation humanoid robot and control device thereof |
| CN106607910A (en) * | 2015-10-22 | 2017-05-03 | 中国科学院深圳先进技术研究院 | Robot real-time simulation method |
| CN106648116A (en) * | 2017-01-22 | 2017-05-10 | 隋文涛 | Virtual reality integrated system based on action capture |
| CN106667493A (en) * | 2017-01-22 | 2017-05-17 | 河北大学 | Human body balance assessment system and assessment method |
| CN107225573A (en) * | 2017-07-05 | 2017-10-03 | 上海未来伙伴机器人有限公司 | The method of controlling operation and device of robot |
| CN107272882A (en) * | 2017-05-03 | 2017-10-20 | 江苏大学 | The holographic long-range presentation implementation method of one species |
| CN108518368A (en) * | 2018-05-04 | 2018-09-11 | 贵阳海之力液压有限公司 | A kind of valve control Hydraulic Power Transmission System applied to exoskeleton robot |
| CN108621164A (en) * | 2018-05-10 | 2018-10-09 | 山东大学 | Taiji push hands machine people based on depth camera |
| CN108762495A (en) * | 2018-05-18 | 2018-11-06 | 深圳大学 | The virtual reality driving method and virtual reality system captured based on arm action |
| CN109064487A (en) * | 2018-07-02 | 2018-12-21 | 中北大学 | A kind of human posture's comparative approach based on the tracking of Kinect bone node location |
| CN109079794A (en) * | 2018-09-18 | 2018-12-25 | 广东省智能制造研究所 | It is a kind of followed based on human body attitude robot control and teaching method |
| CN109591013A (en) * | 2018-12-12 | 2019-04-09 | 山东大学 | A kind of flexible assembly analogue system and its implementation |
| CN110135332A (en) * | 2019-05-14 | 2019-08-16 | 吉林大学 | A method for monitoring the efficiency of a bearing assembly production line |
| CN110450145A (en) * | 2019-08-13 | 2019-11-15 | 广东工业大学 | A kind of biomimetic manipulator based on skeleton identification |
| CN110598647A (en) * | 2019-09-17 | 2019-12-20 | 四川爱目视光智能科技有限公司 | Head posture recognition method based on image recognition |
| CN110815215A (en) * | 2019-10-24 | 2020-02-21 | 上海航天控制技术研究所 | Multi-mode fused rotating target approaching and stopping capture ground test system and method |
| CN110853099A (en) * | 2019-11-19 | 2020-02-28 | 福州大学 | A human-computer interaction method and system based on dual Kinect cameras |
| CN110978064A (en) * | 2019-12-11 | 2020-04-10 | 山东大学 | Human safety assessment method and system in human-machine collaboration |
| CN111273783A (en) * | 2020-03-25 | 2020-06-12 | 北京百度网讯科技有限公司 | Digital human control method and device |
| CN111360819A (en) * | 2020-02-13 | 2020-07-03 | 平安科技(深圳)有限公司 | Robot control method and device, computer device and storage medium |
| CN112090076A (en) * | 2020-08-14 | 2020-12-18 | 深圳中清龙图网络技术有限公司 | Game character action control method, device, equipment and medium |
| CN112936342A (en) * | 2021-02-02 | 2021-06-11 | 福建天晴数码有限公司 | System and method for evaluating actions of entity robot based on human body posture recognition algorithm |
| CN112975993A (en) * | 2021-02-22 | 2021-06-18 | 北京国腾联信科技有限公司 | Robot teaching method, device, storage medium and equipment |
| CN113077493A (en) * | 2021-05-11 | 2021-07-06 | 德鲁动力科技(成都)有限公司 | Method and system for following target of mobile robot |
| CN113146634A (en) * | 2021-04-25 | 2021-07-23 | 达闼机器人有限公司 | Robot attitude control method, robot and storage medium |
| CN113318424A (en) * | 2020-12-23 | 2021-08-31 | 广州富港万嘉智能科技有限公司 | Novel game device and control method |
| CN113318426A (en) * | 2020-12-23 | 2021-08-31 | 广州富港万嘉智能科技有限公司 | Novel game system |
| CN113318425A (en) * | 2020-12-23 | 2021-08-31 | 广州富港万嘉智能科技有限公司 | Novel game device and control method |
| CN115256468A (en) * | 2022-08-26 | 2022-11-01 | 北京理工大学 | State detection and standing planning method for humanoid robot after falling |
| CN116197899A (en) * | 2023-01-10 | 2023-06-02 | 北京航空航天大学 | Active robot teleoperation system based on VR |
| CN117340914A (en) * | 2023-10-24 | 2024-01-05 | 哈尔滨工程大学 | Humanoid robot human body feeling control method and control system |
| CN119260709A (en) * | 2024-09-27 | 2025-01-07 | 泰志达智能科技(苏州)有限公司 | A method for controlling a robot |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103170973A (en) * | 2013-03-28 | 2013-06-26 | 上海理工大学 | Man-machine cooperation device and method based on Kinect video camera |
| CN203973551U (en) * | 2014-06-13 | 2014-12-03 | 济南翼菲自动化科技有限公司 | A kind of remote control robot of controlling by body gesture |
| CN104440926A (en) * | 2014-12-09 | 2015-03-25 | 重庆邮电大学 | Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect |
| CN104589356A (en) * | 2014-11-27 | 2015-05-06 | 北京工业大学 | Dexterous hand teleoperation control method based on Kinect human hand motion capturing |
| US20150154467A1 (en) * | 2013-12-04 | 2015-06-04 | Mitsubishi Electric Research Laboratories, Inc. | Method for Extracting Planes from 3D Point Cloud Sensor Data |
| CN105058396A (en) * | 2015-07-31 | 2015-11-18 | 深圳先进技术研究院 | Robot teaching system and control method thereof |
- 2015-11-24: Application CN201510824988.8A filed in China; granted as patent CN105252532B (active)
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103170973A (en) * | 2013-03-28 | 2013-06-26 | 上海理工大学 | Man-machine cooperation device and method based on Kinect video camera |
| US20150154467A1 (en) * | 2013-12-04 | 2015-06-04 | Mitsubishi Electric Research Laboratories, Inc. | Method for Extracting Planes from 3D Point Cloud Sensor Data |
| CN203973551U (en) * | 2014-06-13 | 2014-12-03 | 济南翼菲自动化科技有限公司 | A kind of remote control robot of controlling by body gesture |
| CN104589356A (en) * | 2014-11-27 | 2015-05-06 | 北京工业大学 | Dexterous hand teleoperation control method based on Kinect human hand motion capturing |
| CN104440926A (en) * | 2014-12-09 | 2015-03-25 | 重庆邮电大学 | Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect |
| CN105058396A (en) * | 2015-07-31 | 2015-11-18 | 深圳先进技术研究院 | Robot teaching system and control method thereof |
Cited By (51)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106607910A (en) * | 2015-10-22 | 2017-05-03 | 中国科学院深圳先进技术研究院 | Robot real-time simulation method |
| CN106607910B (en) * | 2015-10-22 | 2019-03-22 | 中国科学院深圳先进技术研究院 | A kind of robot imitates method in real time |
| CN105945947A (en) * | 2016-05-20 | 2016-09-21 | 西华大学 | Robot writing system based on gesture control and control method of robot writing system |
| CN105999670A (en) * | 2016-05-31 | 2016-10-12 | 山东科技大学 | Shadow-boxing movement judging and guiding system based on kinect and guiding method adopted by same |
| CN106313072A (en) * | 2016-10-12 | 2017-01-11 | 南昌大学 | A Humanoid Robot Based on Kinect Somatosensory Control |
| CN106564055B (en) * | 2016-10-31 | 2019-08-27 | 金阳娃 | Human simulation robot stabilization motion planning method and control device |
| CN106564055A (en) * | 2016-10-31 | 2017-04-19 | 金阳娃 | Stable motion planning method of simulation humanoid robot and control device thereof |
| CN106667493A (en) * | 2017-01-22 | 2017-05-17 | 河北大学 | Human body balance assessment system and assessment method |
| CN106648116B (en) * | 2017-01-22 | 2023-06-20 | 隋文涛 | Virtual reality integrated system based on motion capture |
| CN106648116A (en) * | 2017-01-22 | 2017-05-10 | 隋文涛 | Virtual reality integrated system based on action capture |
| CN107272882A (en) * | 2017-05-03 | 2017-10-20 | 江苏大学 | The holographic long-range presentation implementation method of one species |
| CN107225573A (en) * | 2017-07-05 | 2017-10-03 | 上海未来伙伴机器人有限公司 | The method of controlling operation and device of robot |
| CN108518368A (en) * | 2018-05-04 | 2018-09-11 | 贵阳海之力液压有限公司 | A kind of valve control Hydraulic Power Transmission System applied to exoskeleton robot |
| CN108518368B (en) * | 2018-05-04 | 2023-09-19 | 贵阳海之力液压有限公司 | Valve control hydraulic transmission system applied to exoskeleton robot |
| CN108621164A (en) * | 2018-05-10 | 2018-10-09 | 山东大学 | Taiji push hands machine people based on depth camera |
| CN108762495A (en) * | 2018-05-18 | 2018-11-06 | 深圳大学 | The virtual reality driving method and virtual reality system captured based on arm action |
| WO2019218457A1 (en) * | 2018-05-18 | 2019-11-21 | 深圳大学 | Virtual reality driving method based on arm motion capture, and virtual reality system |
| CN109064487A (en) * | 2018-07-02 | 2018-12-21 | 中北大学 | A kind of human posture's comparative approach based on the tracking of Kinect bone node location |
| CN109064487B (en) * | 2018-07-02 | 2021-08-06 | 中北大学 | A Human Pose Comparison Method Based on Kinect Skeletal Node Position Tracking |
| CN109079794A (en) * | 2018-09-18 | 2018-12-25 | 广东省智能制造研究所 | It is a kind of followed based on human body attitude robot control and teaching method |
| CN109079794B (en) * | 2018-09-18 | 2020-12-22 | 广东省智能制造研究所 | A robot control and teaching method based on human posture following |
| CN109591013A (en) * | 2018-12-12 | 2019-04-09 | 山东大学 | A kind of flexible assembly analogue system and its implementation |
| CN110135332A (en) * | 2019-05-14 | 2019-08-16 | 吉林大学 | A method for monitoring the efficiency of a bearing assembly production line |
| CN110135332B (en) * | 2019-05-14 | 2022-05-31 | 吉林大学 | A method for monitoring the efficiency of a bearing assembly line |
| CN110450145A (en) * | 2019-08-13 | 2019-11-15 | 广东工业大学 | A kind of biomimetic manipulator based on skeleton identification |
| CN110598647B (en) * | 2019-09-17 | 2022-04-22 | 四川爱目视光智能科技有限公司 | Head posture recognition method based on image recognition |
| CN110598647A (en) * | 2019-09-17 | 2019-12-20 | 四川爱目视光智能科技有限公司 | Head posture recognition method based on image recognition |
| CN110815215A (en) * | 2019-10-24 | 2020-02-21 | 上海航天控制技术研究所 | Multi-mode fused rotating target approaching and stopping capture ground test system and method |
| CN110853099B (en) * | 2019-11-19 | 2023-04-14 | 福州大学 | A kind of human-computer interaction method and system based on dual Kinect cameras |
| CN110853099A (en) * | 2019-11-19 | 2020-02-28 | 福州大学 | A human-computer interaction method and system based on dual Kinect cameras |
| CN110978064A (en) * | 2019-12-11 | 2020-04-10 | 山东大学 | Human safety assessment method and system in human-machine collaboration |
| CN110978064B (en) * | 2019-12-11 | 2022-06-24 | 山东大学 | Human safety assessment method and system in human-machine collaboration |
| CN111360819A (en) * | 2020-02-13 | 2020-07-03 | 平安科技(深圳)有限公司 | Robot control method and device, computer device and storage medium |
| CN111273783A (en) * | 2020-03-25 | 2020-06-12 | 北京百度网讯科技有限公司 | Digital human control method and device |
| CN111273783B (en) * | 2020-03-25 | 2023-01-31 | 北京百度网讯科技有限公司 | Digital human control method and device |
| CN112090076A (en) * | 2020-08-14 | 2020-12-18 | 深圳中清龙图网络技术有限公司 | Game character action control method, device, equipment and medium |
| CN113318426A (en) * | 2020-12-23 | 2021-08-31 | 广州富港万嘉智能科技有限公司 | Novel game system |
| CN113318425A (en) * | 2020-12-23 | 2021-08-31 | 广州富港万嘉智能科技有限公司 | Novel game device and control method |
| CN113318424A (en) * | 2020-12-23 | 2021-08-31 | 广州富港万嘉智能科技有限公司 | Novel game device and control method |
| CN113318424B (en) * | 2020-12-23 | 2023-07-21 | 广州富港生活智能科技有限公司 | Novel game device and control method |
| CN113318426B (en) * | 2020-12-23 | 2023-05-26 | 广州富港生活智能科技有限公司 | Novel game system |
| CN112936342B (en) * | 2021-02-02 | 2023-04-28 | 福建天晴数码有限公司 | Physical robot action evaluation system and method based on human body gesture recognition algorithm |
| CN112936342A (en) * | 2021-02-02 | 2021-06-11 | 福建天晴数码有限公司 | System and method for evaluating actions of entity robot based on human body posture recognition algorithm |
| CN112975993A (en) * | 2021-02-22 | 2021-06-18 | 北京国腾联信科技有限公司 | Robot teaching method, device, storage medium and equipment |
| CN113146634A (en) * | 2021-04-25 | 2021-07-23 | 达闼机器人有限公司 | Robot attitude control method, robot and storage medium |
| CN113077493A (en) * | 2021-05-11 | 2021-07-06 | 德鲁动力科技(成都)有限公司 | Method and system for following target of mobile robot |
| CN115256468A (en) * | 2022-08-26 | 2022-11-01 | 北京理工大学 | State detection and standing planning method for humanoid robot after falling |
| CN116197899A (en) * | 2023-01-10 | 2023-06-02 | 北京航空航天大学 | Active robot teleoperation system based on VR |
| CN117340914A (en) * | 2023-10-24 | 2024-01-05 | 哈尔滨工程大学 | Humanoid robot human body feeling control method and control system |
| CN117340914B (en) * | 2023-10-24 | 2024-05-14 | 哈尔滨工程大学 | A humanoid robot somatosensory control method and control system |
| CN119260709A (en) * | 2024-09-27 | 2025-01-07 | 泰志达智能科技(苏州)有限公司 | A method for controlling a robot |
Also Published As
| Publication number | Publication date |
|---|---|
| CN105252532B (en) | 2017-07-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN105252532A (en) | Method of cooperative flexible attitude control for motion capture robot | |
| Liu et al. | High-fidelity grasping in virtual reality using a glove-based system | |
| CN108762495B (en) | Virtual reality driving method and virtual reality system based on arm motion capture | |
| Riley et al. | Enabling real-time full-body imitation: a natural way of transferring human movement to humanoids | |
| US20130285903A1 (en) | Virtual desktop coordinate transformation | |
| KR101830793B1 (en) | virtual training system using upper body interaction | |
| CN104570731A (en) | Uncalibrated human-computer interaction control system and method based on Kinect | |
| CN109243575B (en) | Virtual acupuncture method and system based on mobile interaction and augmented reality | |
| CN106873787A (en) | A kind of gesture interaction system and method for virtual teach-in teaching | |
| Li et al. | Real-time hand gesture tracking for human–computer interface based on multi-sensor data fusion | |
| Yun et al. | Animation fidelity in self-avatars: Impact on user performance and sense of agency | |
| CN113505694A (en) | Human-computer interaction method and device based on sight tracking and computer equipment | |
| Maycock et al. | Robust tracking of human hand postures for robot teaching | |
| Zhang et al. | A real-time upper-body robot imitation system | |
| Placidi et al. | Data integration by two-sensors in a LEAP-based Virtual Glove for human-system interaction | |
| CN112276914B (en) | Industrial robot based on AR technology and man-machine interaction method thereof | |
| Leoncini et al. | Multiple NUI device approach to full body tracking for collaborative virtual environments | |
| Scherfgen et al. | Estimating the pose of a medical manikin for haptic augmentation of a virtual patient in mixed reality training | |
| CN108734762B (en) | Motion trajectory simulation method and system | |
| Sreejith et al. | Real-time hands-free immersive image navigation system using Microsoft Kinect 2.0 and Leap Motion Controller | |
| CN105225270B (en) | A kind of information processing method and electronic equipment | |
| Wang et al. | Design and implementation of humanoid robot behavior imitation system based on skeleton tracking | |
| CN110815210A (en) | Novel remote control method based on natural human-computer interface and augmented reality | |
| Bai et al. | Kinect-based hand tracking for first-person-perspective robotic arm teleoperation | |
| Cho et al. | Full-Body Pose Estimation of Humanoid Robots Using Head-Worn Cameras for Digital Human-Augmented Robotic Telepresence |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant | ||
| TR01 | Transfer of patent right | Effective date of registration: 2019-05-17. Patentee after: Shandong Muke Space Information Technology Co., Ltd., Unit 4, 5th Floor, Building 15, Qilu Cultural Creative Base, 101 Shunfeng Road, High-tech Zone, Jinan, Shandong 250101. Patentee before: Shandong University, 27 Shanda South Road, Jinan, Shandong |