
Posture processing method and intelligent wearable system

Info

Publication number: CN120335619B
Authority: CN (China)
Prior art keywords: fiber, joint, human body, posture, angle
Legal status: Active
Application number: CN202510829406.9A
Other languages: Chinese (zh)
Other versions: CN120335619A
Inventors: 姜龙, 张家铭, 于新亮, 王军委, 赵成
Current assignee: Goertek Inc (original assignee: Goertek Inc)
Application filed by Goertek Inc; priority to CN202510829406.9A
Publication of application: CN120335619A
Application granted; publication of grant: CN120335619B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

This application discloses a posture processing method and smart wearable system, relating to the technical field of smart wearable devices. The posture processing method is applied to smart clothing, wherein the smart clothing includes multiple fiber-based stretch sensors woven into areas of the smart clothing corresponding to human joints. The posture processing method comprises: obtaining stretch detection values from the fiber-based stretch sensors; determining corresponding angle values based on the stretch detection values; and mapping the angle values to the coordinate axes of a three-dimensional coordinate system for each of the human joints to obtain a three-dimensional posture. This application addresses the technical problem of low accuracy in user posture recognition using existing posture recognition schemes.

Description

Posture processing method and smart wearable system
Technical Field
This application relates to the technical field of smart wearable devices, and in particular to a posture processing method and a smart wearable system.
Background
Human body posture estimation is attracting growing research and application interest in fields such as human-computer interaction, virtual reality, health monitoring, and sports science.
Existing posture recognition schemes typically rely on inertial or optical motion capture. Inertial motion capture embeds inertial measurement units at key positions of a motion-capture suit (such as the joints and torso) and computes the posture angles of each body part by integrating the sensor data of the inertial measurement units; after long-term use, however, integration errors accumulate and the posture angles drift. Optical motion capture relies on computer vision applied to captured images, and is easily affected by occlusion or lighting interference. Both approaches therefore yield relatively low accuracy in the recognized user posture.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present application and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The main purpose of this application is to provide a posture processing method and a smart wearable system, so as to solve the technical problem that the accuracy of the user posture recognized by existing posture recognition schemes is low.
To achieve the above object, the present application provides a posture processing method applied to a smart garment, wherein the smart garment comprises a plurality of fiber-based stretch sensors woven into the areas of the smart garment corresponding to each human joint.
The posture processing method comprises the following steps:
obtaining a stretch detection value of the fiber-based stretch sensor;
determining a corresponding angle value according to the stretch detection value;
and mapping the angle value onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional posture.
In one embodiment, the step of determining the corresponding angle value according to the stretch detection value includes:
obtaining an initial stretch value of the fiber-based stretch sensor, wherein the initial stretch value is the stretch value of the fiber-based stretch sensor in an initial standard state;
calculating a stretch change value between the stretch detection value and the initial stretch value;
and obtaining the angle value corresponding to the stretch detection value according to the stretch change value and the angle mapping relationship of the fiber-based stretch sensor, wherein the angle mapping relationship describes the mapping between the sensor's stretch change value and the angle value.
In one embodiment, the step of mapping the angle value onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional posture includes:
determining, according to the placement position of the fiber-based stretch sensor, the coordinate axis of the three-dimensional coordinate system of the human joint corresponding to that sensor;
mapping the angle value of the fiber-based stretch sensor onto the corresponding coordinate axis to obtain a local joint angle;
and concatenating the local joint angles according to the transformation relationships between the three-dimensional coordinate systems of the human joints to obtain a global human body posture as the three-dimensional posture.
In one embodiment, the step of concatenating the local joint angles according to the transformation relationships between the three-dimensional coordinate systems of the human joints to obtain the global human body posture includes:
acquiring a designated human body region, and determining the local joint angles of each human joint in the designated human body region;
and concatenating the local joint angles of the human joints in the designated human body region according to the transformation relationships between the three-dimensional coordinate systems of the joints to obtain the global human body posture.
In one embodiment, the human joints in the designated human body region include an ankle joint, a knee joint, and a hip joint;
after the step of acquiring the designated human body region and determining the local joint angles of each human joint in the designated human body region, the method includes:
responding to a jump event, acquiring a landing acceleration, and calculating a ground reaction force based on the landing acceleration, wherein the landing acceleration is determined according to the vertical acceleration during the jump event;
determining stress thresholds of the ankle joint, the knee joint, and the hip joint according to their local joint angles;
performing a reverse recursion calculation starting from the ankle joint according to the ground reaction force, to obtain an ankle joint force, a knee joint force, and a hip joint force;
and comparing the ankle joint force, the knee joint force, and the hip joint force with their corresponding stress thresholds, and outputting injury warning information once any of them exceeds its corresponding stress threshold.
In one embodiment, the posture processing method further includes:
responding to a motion analysis instruction, and acquiring the joint chain and the standard action angle sequence involved in the motion to be analyzed;
generating a current action angle sequence from the local joint angles in the joint chain;
aligning the current action angle sequence with the standard action angle sequence to obtain a motion coordination deviation;
and generating posture correction prompt information according to the motion coordination deviation.
In one embodiment, the step of aligning the current action angle sequence with the standard action angle sequence to obtain a motion coordination deviation includes:
mapping the current action angle sequence and the standard action angle sequence onto the same time axis, and calculating the angle difference and the timing difference between the key phase points in the standard action angle sequence and the corresponding mapped phase points in the current action angle sequence;
and taking the angle difference and the timing difference as the motion coordination deviation.
In one embodiment, after the step of mapping the angle value onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional posture, the posture processing method further includes:
importing the three-dimensional posture into a standard human body model, and determining the real-time center-of-gravity position under the three-dimensional posture;
and outputting fall warning information once the horizontal movement speed of the real-time center-of-gravity position is greater than a preset speed threshold, or the relative distance between the real-time center-of-gravity position and a preset human body support point is greater than a preset distance threshold.
In addition, to achieve the above purpose, the application also provides a smart wearable system, which comprises a smart garment and a control terminal communicatively connected to the smart garment;
the smart garment comprises a base fabric layer and a plurality of fiber-based stretch sensors, wherein the fiber-based stretch sensors are woven into the areas of the base fabric layer corresponding to each human joint;
and the control terminal is configured to implement the steps of the posture processing method described above.
In one embodiment, the smart garment further comprises a data acquisition module;
the data acquisition module is electrically connected to each fiber-based stretch sensor through wires, and is used to acquire the sensor signals of each fiber-based stretch sensor and send them to the control terminal.
In addition, to achieve the above object, the present application also proposes a storage medium, which is a computer-readable storage medium storing a computer program that, when executed by a processor, implements the steps of the posture processing method described above.
Furthermore, to achieve the above object, the present application also provides a computer program product comprising a computer program that, when executed by a processor, implements the steps of the posture processing method described above.
The one or more technical solutions provided by the application have at least the following technical effects:
The method is applied to a smart garment comprising a plurality of fiber-based stretch sensors woven into the areas of the garment corresponding to each human joint. When a human joint is at different angles, the fibers in the corresponding area of the garment are stretched to different degrees, so the corresponding angle value can be determined from the stretch detection value obtained from the fiber-based stretch sensor. The angle values can then be mapped onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional posture. The application thus accurately identifies, at the mechanical stretch level, the angle value of each human joint in three-dimensional space by means of the fiber-based stretch sensors, and maps these values into the three-dimensional coordinate system of each joint to form a three-dimensional posture. Compared with existing inertial and optical motion capture, this posture processing approach adapts better to different scenes, and because the angles are recognized from mechanical stretch there is no integration-drift problem, which effectively improves the accuracy of user posture recognition. In addition, inertial and optical motion capture must fit many degrees of freedom (positions and postures) of the whole body from inertial or image information, which involves a large amount of data and high processing difficulty; by contrast, this application derives joint angles directly from stretch detection values, so the amount of data to be processed is small and the processing difficulty is low.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required to be used in the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic flow chart of a posture processing method according to the first embodiment of the present application;
FIG. 2 is a schematic structural diagram of a smart garment according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the three-dimensional coordinate system of each human joint according to an embodiment of the present application;
FIG. 4 is a schematic diagram of the coordinate axis limits in a three-dimensional coordinate system according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an exemplary arrangement of fiber-based stretch sensors according to an embodiment of the present application;
FIG. 6 is a schematic representation of a standard human body model according to an embodiment of the application;
FIG. 7 is a schematic flow chart of a posture processing method according to the second embodiment of the present application;
FIG. 8 is a schematic flow chart of a posture processing method according to the third embodiment of the present application;
FIG. 9 is a system architecture diagram of a smart wearable system according to an embodiment of the application;
FIG. 10 is an example diagram of a smart garment according to an embodiment of the present application;
Fig. 11 is another example diagram of a smart garment according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the technical solution of the present application and are not intended to limit the present application.
For a better understanding of the technical solution of the present application, the following detailed description will be given with reference to the drawings and the specific embodiments.
The main solution of the embodiments of the present application is as follows: the method is applied to a smart garment comprising a plurality of fiber-based stretch sensors woven into the areas of the garment corresponding to each human joint; the stretch detection values of the fiber-based stretch sensors are obtained; the corresponding angle values are determined according to the stretch detection values; and the angle values are mapped onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional posture.
Existing posture recognition schemes typically rely on inertial or optical motion capture. Inertial motion capture embeds inertial measurement units at key positions of a motion-capture suit (such as the joints and torso) and computes the posture angles of each body part by integrating the sensor data; after long-term use, however, integration errors accumulate and the posture angles drift. Optical motion capture relies on computer vision applied to captured images, and is easily affected by occlusion or lighting interference. Both approaches therefore yield relatively low accuracy in the recognized user posture.
The application provides a solution: by means of fiber-based stretch sensors, the angle value of each human joint in three-dimensional space is accurately identified at the mechanical stretch level and mapped into the three-dimensional coordinate system of each joint to form a three-dimensional posture. In addition, inertial and optical motion capture must fit many degrees of freedom (positions and postures) of the whole body from inertial or image information, which involves a large amount of data and high processing difficulty; by contrast, this solution derives joint angles directly from stretch values, so the amount of computation is small.
Based on this, an embodiment of the present application provides a posture processing method. Referring to FIG. 1, FIG. 1 is a schematic flow chart of the first embodiment of the posture processing method of the present application.
In this embodiment, the posture processing method is applied to a smart garment, where the smart garment includes a plurality of fiber-based stretch sensors woven into the areas of the garment corresponding to each human joint.
The posture processing method comprises the following steps S10 to S30:
Step S10, obtaining a stretch detection value of the fiber-based stretch sensor;
As shown in FIG. 2, the smart garment may take the form of a jacket, trousers, gloves, sleeves, and the like. The base fabric layer of the smart garment (the black area in FIG. 2) is made of elastic fibers (spandex, polyester, nylon, etc., or blended elastic fibers) so that it fits closely to the user's body surface. The elastic fibers may additionally receive functional treatments (such as an antibacterial treatment, a wash-resistant coating, or a fatigue-resistant coating). The smart garment includes a plurality of fiber-based stretch sensors woven into the areas of the base fabric layer corresponding to each human joint (the green areas in FIG. 2), so as to detect the stretch amplitude of the area corresponding to each joint and thereby characterize the corresponding angle value. It will be appreciated that a fiber-based stretch sensor is provided at least in each region where stretch is produced when the human joint moves. Illustratively, the fiber-based stretch sensors may be attached to the base fabric layer by weaving, embroidering, or adhering.
The fiber-based stretch sensor is a fiber-form sensor whose output voltage changes as the fiber is stretched, so this embodiment can determine the fiber stretch amplitude of the region where the sensor is located from the voltage change.
This embodiment can establish a communication connection with each fiber-based stretch sensor by wireless or wired communication, receive the sensor signal of each fiber-based stretch sensor, and thereby obtain the stretch detection value of the fiber-based stretch sensor.
Step S20, determining a corresponding angle value according to the stretch detection value;
The stretch detection value characterizes the length of the fiber; it may be the voltage value directly output by the fiber-based stretch sensor, or a length value converted from that voltage value.
The degree of fiber stretch differs when a human joint is at different angles. Taking the elbow as an example: when the elbow flexes, the fibers in the area corresponding to the dorsal side of the elbow are stretched, and the deeper the flexion (i.e., the smaller the angle between the humerus and the ulna or radius), the greater the degree of stretch. A fiber-based stretch sensor placed in the area of the smart garment corresponding to the dorsal side of the elbow therefore detects a fiber stretch amplitude that has a mapping relationship with the elbow's angle in the flexion direction. Thus, from the stretch detection value, the fiber stretch amplitude of the garment in the area corresponding to each human joint can be determined, and then, from the mapping between fiber stretch amplitude and angle value, the angle value of the corresponding human joint can be obtained.
In a possible embodiment, step S20 may include steps S21 to S23:
Step S21, obtaining an initial stretch value of the fiber-based stretch sensor, wherein the initial stretch value is the stretch value of the sensor in an initial standard state;
Step S22, calculating a stretch change value between the stretch detection value and the initial stretch value;
Step S23, obtaining the angle value corresponding to the stretch detection value according to the stretch change value and the angle mapping relationship of the fiber-based stretch sensor, wherein the angle mapping relationship describes the mapping between the sensor's stretch change value and the angle value.
The initial stretch value is the stretch value of the fiber-based stretch sensor in an initial standard state, i.e., the state of the sensor under a specified standard posture (such as standing naturally, or standing with arms spread). For example, the smart garment may be put on a standard mannequin posed in the specified standard posture, and the sensor's stretch detection value then recorded as the initial stretch value. Alternatively, the user may be guided to adopt the specified standard posture after putting on the smart garment, and the stretch detection value at that moment taken as the initial stretch value.
The stretch change value is the difference between the stretch detection value and the initial stretch value. From the stretch change value and the angle mapping relationship of the sensor, which may be described as a lookup table, a fitted function, or the like, the angle value corresponding to the stretch detection value is obtained. The divided-voltage value is the output voltage of the fiber-based stretch sensor after stretching; since the sensor's direct output is generally an analog signal, this voltage can be converted into a numeric value (e.g., by analog-to-digital conversion) to obtain the current stretch value. The difference between the current stretch value and the initial stretch value gives the stretch change value, and the angle value corresponding to the stretch detection value is then computed from the stretch change value and the fitted function. Illustratively, the fiber-based stretch sensor on the X-axis of the left shoulder has the fitted function f(x) = 11.924x; with a current stretch value of 464 and an initial stretch value of 460, the stretch change value is 464 − 460 = 4, giving an angle value f(4) = 11.924 × 4 = 47.70°. The sensor on the Y-axis of the left shoulder has the fitted function f(x) = −1.8407x + 90; with a current stretch value of 985 and an initial stretch value of 953, the stretch change value is 985 − 953 = 32, giving f(32) = −1.8407 × 32 + 90 = 31.10°. The sensor on the Z-axis of the left shoulder has the fitted function f(x) = −1.8058x; with a current stretch value of 340 and an initial stretch value of 359, the stretch change value is 340 − 359 = −19, giving f(−19) = −1.8058 × (−19) = 34.31°. Further, to reduce error, this embodiment may place several fiber-based stretch sensors along each coordinate-axis direction of a joint's three-dimensional coordinate system, and fuse their detection values to obtain the stretch detection value for that axis.
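For concreteness, the following Python sketch implements the stretch-to-angle conversion just described. The linear fitting coefficients are the ones from the worked left-shoulder example above; the sensor naming and the dictionary layout are illustrative assumptions, not part of the application.

```python
# A minimal sketch of the stretch-to-angle conversion described above.
# The fitting coefficients come from the worked example in the text; the
# (joint, axis) keys and overall layout are assumptions for illustration.

FIT_FUNCTIONS = {
    ("left_shoulder", "x"): lambda d: 11.924 * d,
    ("left_shoulder", "y"): lambda d: -1.8407 * d + 90.0,
    ("left_shoulder", "z"): lambda d: -1.8058 * d,
}

def angle_from_stretch(joint_axis, current_stretch, initial_stretch):
    """Return the angle value (degrees) for one fiber-based stretch sensor.

    current_stretch -- stretch detection value after A/D conversion
    initial_stretch -- stretch value recorded in the initial standard state
    """
    delta = current_stretch - initial_stretch     # stretch change value
    return FIT_FUNCTIONS[joint_axis](delta)       # angle mapping relationship

# Reproduces the numbers in the example above:
print(angle_from_stretch(("left_shoulder", "x"), 464, 460))  # ~47.70
print(angle_from_stretch(("left_shoulder", "y"), 985, 953))  # ~31.10
print(angle_from_stretch(("left_shoulder", "z"), 340, 359))  # ~34.31
```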
Step S30, mapping the angle value onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional posture.
It should be noted that the three-dimensional coordinate system of a human joint is a preset coordinate system whose origin is the joint itself. As shown in FIG. 3, the positive X-axis points along the bone connected to one end of the joint, the positive Y-axis points in a specified direction perpendicular to the X-axis (e.g., toward the front of the body), and the positive Z-axis is perpendicular to the plane formed by the X- and Y-axes. In addition, as shown in FIG. 4, because of the constraints of human anatomy, the value range of each coordinate axis in the coordinate system of a given joint is determined by that joint's range of motion.
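One possible representation of such a per-joint coordinate frame with axis limits is sketched below. The parent field anticipates the joint chaining described later; the concrete 0° to 150° knee flexion range is an illustrative assumption, not a value given in the application.

```python
# A sketch of one way to represent each joint's coordinate frame and its
# axis limits; the concrete range below is an illustrative assumption.
from dataclasses import dataclass
from typing import Optional, Dict, Tuple

@dataclass
class JointFrame:
    name: str
    parent: Optional[str]                        # parent joint in the kinematic chain
    axis_limits: Dict[str, Tuple[float, float]]  # per-axis (min, max) angle, degrees

# The knee is modeled with a single degree of freedom (Z-axis only), matching
# the sensor layout described for FIG. 5; the 0-150 degree range is assumed.
RIGHT_KNEE = JointFrame("right_knee", parent="right_hip",
                        axis_limits={"z": (0.0, 150.0)})

def clamp_to_limits(frame: JointFrame, axis: str, angle: float) -> float:
    """Clip a mapped angle value to the joint's anatomical range of motion."""
    lo, hi = frame.axis_limits[axis]
    return max(lo, min(hi, angle))
```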
It should be noted that the three-dimensional posture may be the whole-body posture of the human body, or the limb posture of some of the joints, for example the posture of the hand region or of the upper body.
Because the angle value of a fiber-based stretch sensor describes an angle on a single degree of freedom (i.e., one coordinate axis), this embodiment maps the angle value of each sensor onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain the three-dimensional posture. Illustratively, the coordinate axis of the joint coordinate system corresponding to a given sensor is determined from the sensor's placement position. The sensor's angle value is then mapped onto that axis to obtain a local joint angle, which is a three-dimensional vector of the joint in its coordinate system. Since each coordinate system is established from the joint position and the bone direction at one end, and the joints have fixed relative positional relationships, the local joint angles can be concatenated according to the transformation relationships between the joints' coordinate systems to obtain the global human body posture as the three-dimensional posture. After importing the three-dimensional posture into a human body model, this embodiment achieves real-time mapping and visualization of the user's movements. The resulting three-dimensional posture can further be used in scenarios such as motion analysis, rehabilitation, human-computer interaction, and virtual reality.
In a possible implementation, step S30 may include steps S31 to S33:
Step S31, determining, according to the placement position of the fiber-based stretch sensor, the coordinate axis of the three-dimensional coordinate system of the human joint corresponding to that sensor;
Step S32, mapping the angle value of the fiber-based stretch sensor onto the corresponding coordinate axis to obtain a local joint angle;
Step S33, concatenating the local joint angles according to the transformation relationships between the three-dimensional coordinate systems of the human joints, to obtain a global human body posture as the three-dimensional posture.
Each fiber-based stretch sensor is placed within the range of a human joint, so as to detect that joint's fiber stretch amplitude along each coordinate-axis direction of its three-dimensional coordinate system.
As shown in FIG. 5 (the red line segments are fiber-based stretch sensors): for the neck, sensors 7 and 13 detect the fiber stretch amplitude of the right trapezius in the X-axis direction, and sensors 8 and 14 that of the left trapezius in the X-axis direction; sensor 1 detects the right trapezius in the Y-axis direction, and sensor 2 the left trapezius; sensors 5 and 11 detect the right trapezius in the Z-axis direction, and sensors 6 and 12 the left trapezius. For the shoulders, sensor 21 detects the right shoulder in the X-axis direction and sensor 22 the left shoulder; sensor 3 detects the right shoulder in the Y-axis direction and sensor 4 the left shoulder; sensor 9 detects the right shoulder in the Z-axis direction and sensor 10 the left shoulder. For the chest, sensor 15 detects the fiber stretch amplitude in the Z-axis direction. For the waist, sensors 16 and 17 detect the X-axis direction, sensors 18 and 19 the Y-axis direction, and sensor 20 the Z-axis direction.
For the elbows, sensor 23 detects the right elbow in the X-axis direction and sensor 24 the left elbow; sensor 25 detects the right elbow in the Z-axis direction and sensor 26 the left elbow. Because of anatomical constraints the elbow has mobility in only two degrees of freedom, so no sensor need be provided for the elbow's Y-axis direction. For the hip joints, sensors 31 and 32 detect the X-axis direction, sensors 29 and 30 the Y-axis direction, and sensors 27 and 28 the Z-axis direction. For the knees, sensor 33 detects the right knee in the Z-axis direction and sensor 34 the left knee; since the knee has mobility in only one degree of freedom, no sensors need be provided for the knee's X- and Y-axis directions. Thus, from each sensor's placement position, the coordinate axis of the joint coordinate system corresponding to that sensor can be determined. Mapping the sensor's angle value onto that axis yields the joint's three-dimensional vector (local joint angle) in its coordinate system, i.e., the joint's angle in three-dimensional space. Since each coordinate system takes its joint position as origin, the positions and connection relationships between the joints serve as the transformation relationships between their coordinate systems. Accordingly, the local joint angles can be concatenated according to these transformation relationships to obtain the global human body posture of the whole body as the three-dimensional posture.
Regarding the manner of concatenation, this embodiment can determine the parent-child relationships of the human joints, e.g., root joint: waist; torso chain: waist, chest, neck; left lower-limb chain: waist, left hip, left knee, left ankle; and so on. Starting from the root joint of the parent-child hierarchy, each local joint angle is multiplied step by step by the transformation matrix describing the corresponding transformation relationship, giving the global joint angle of each human joint and forming the global human body posture. Of course, this embodiment may also concatenate only the local joint angles of part of the joints, according to the transformation relationships between their coordinate systems, to obtain the global posture of a partial human body region as the three-dimensional posture.
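The chaining just described is ordinary forward kinematics. The sketch below illustrates it in Python; representing each local joint angle as XYZ Euler angles and composing rotations with SciPy is an assumption for illustration, and the joint names are shorthand for the chains above.

```python
# A minimal forward-kinematics sketch of "concatenating local joint angles"
# through the joint parent-child relationships described above. XYZ Euler
# angles and SciPy rotations are assumptions made for illustration.
from scipy.spatial.transform import Rotation as R

# parent-child relations (root joint: waist; torso and left-leg chains)
PARENT = {"waist": None, "chest": "waist", "neck": "chest",
          "left_hip": "waist", "left_knee": "left_hip",
          "left_ankle": "left_knee"}

def global_joint_angles(local_angles_deg):
    """local_angles_deg: {joint: (x, y, z) local joint angle in degrees},
    with an entry for every joint in PARENT. Returns each joint's global
    angles, chained step by step from the root joint."""
    global_rot = {}

    def resolve(joint):
        if joint in global_rot:
            return global_rot[joint]
        local = R.from_euler("xyz", local_angles_deg[joint], degrees=True)
        parent = PARENT[joint]
        # multiply by the parent's accumulated transformation, if any
        rot = local if parent is None else resolve(parent) * local
        global_rot[joint] = rot
        return rot

    for j in PARENT:
        resolve(j)
    return {j: r.as_euler("xyz", degrees=True) for j, r in global_rot.items()}
```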
In one possible embodiment, step S33 may include steps A10 to A20:
Step A10, acquiring a designated human body region, and determining the local joint angles of each human joint in the designated human body region;
Step A20, concatenating the local joint angles of the human joints in the designated human body region according to the transformation relationships between the three-dimensional coordinate systems of the joints, to obtain the global human body posture.
The designated human body region is a region of the human body for which a global posture is to be constructed, such as the torso region, the upper-body region, or the lower-body region.
This embodiment acquires the designated human body region for which posture construction is desired, determines the human joints contained in it and their local joint angles, and then concatenates those local joint angles according to the transformation relationships between the joints' coordinate systems to obtain the global human body posture.
This differs from inertial motion-capture schemes, which recognize posture by identifying the positions of key points under a global coordinate system (such as a ground coordinate system) from inertial or optical information. Even when only the global posture of a designated region is required, such schemes suffer from excessive recognition error in local regions because of interference from whole-body posture changes under the global coordinate system. For example, when a global posture is built only for the hand and elbow region, the key points of that region in the global coordinate system can hardly avoid interference from whole-body movement: when the upper body rotates, the inertial information of the hand and elbow changes even though the hand and elbow themselves have not moved, so the constructed posture cannot in practice be decoupled from the whole-body posture. The present scheme, by contrast, measures each joint angle locally, so accurate posture description of the designated human body region is achieved.
In a possible implementation, step S30 may be followed by steps S40 to S50:
Step S40, importing the three-dimensional posture into a standard human body model, and determining the real-time center-of-gravity position under the three-dimensional posture;
Step S50, outputting fall warning information once the horizontal movement speed of the real-time center-of-gravity position is greater than a preset speed threshold, or the relative distance between the real-time center-of-gravity position and a preset human body support point is greater than a preset distance threshold.
As shown in FIG. 6, the standard human body model is a three-dimensional model of a human body with a specified standard body shape.
Existing inertial and optical motion capture require a large amount of data processing on inertial or optical information, so the real-time output of a three-dimensional posture is hard to guarantee. In this embodiment, the three-dimensional posture is imported into the standard human body model so that each global joint angle in the posture is bound to the corresponding model joint, and the model presents the three-dimensional posture. The real-time center-of-gravity position under the posture is then computed from the model. Illustratively, this embodiment may take the mass ratios of the body parts of the standard model, e.g., 50% for the torso, 8% for the head and neck, 5% for each arm, and 16% for each leg. Next, the center-of-gravity position of each body part is determined in the model carrying the three-dimensional posture, e.g., near the geometric center of the part or at a corresponding proportional position: the thigh's center of gravity lies between hip and knee, somewhat closer to the hip, and the upper arm's lies at a proximal proportional position near the shoulder, such as 43%. The mass ratio of each part is multiplied by its center-of-gravity position, the products are summed, and the sum is divided by the total mass to give the real-time center-of-gravity position under the three-dimensional posture. The horizontal movement speed of the center of gravity is obtained from the horizontal distance between the previous and current center-of-gravity positions and the time difference between them. This embodiment then judges whether the horizontal movement speed exceeds the preset speed threshold, and whether the relative distance between the center of gravity and the preset human body support point exceeds the preset distance threshold, where the support point is the center-of-gravity position of the body in a preset stable posture. If the horizontal movement speed exceeds the speed threshold, or the distance to the support point exceeds the distance threshold, the center of gravity is changing rapidly or the body is no longer in a stable posture; the user is then at risk of falling, and fall warning information is output. The fall warning information warns of the fall risk and may be output as text, images, speech, and the like.
If the horizontal movement speed of the real-time center-of-gravity position does not exceed the preset speed threshold, and the distance between the center of gravity and the preset support point does not exceed the preset distance threshold, it can be judged that there is no fall risk. By virtue of high-accuracy joint angles and high real-time performance, this embodiment achieves timely early warning of fall risk.
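The center-of-gravity and fall-warning logic above can be sketched as follows. The segment mass ratios are those from the example in the text; the speed and distance thresholds, the vertical Z-axis convention, and the way segment centers are read off the model are illustrative assumptions.

```python
# A sketch of the real-time center-of-gravity and fall-warning logic above.
# Mass ratios follow the example in the text; thresholds and the assumption
# that Z is vertical (XY horizontal) are illustrative.
import numpy as np

MASS_RATIO = {"torso": 0.50, "head_neck": 0.08,
              "left_arm": 0.05, "right_arm": 0.05,
              "left_leg": 0.16, "right_leg": 0.16}

def body_center_of_gravity(segment_cog):
    """segment_cog: {segment: np.array([x, y, z])}, the center of each body
    part read off the standard human body model after the 3D posture is
    applied. Returns the mass-weighted real-time center of gravity."""
    total = sum(MASS_RATIO.values())
    return sum(MASS_RATIO[s] * p for s, p in segment_cog.items()) / total

def fall_warning(cog_prev, cog_now, dt, support_point,
                 speed_thresh=0.8, dist_thresh=0.3):  # assumed (m/s, m)
    horiz_speed = np.linalg.norm((cog_now - cog_prev)[:2]) / dt  # XY plane
    offset = np.linalg.norm(cog_now - support_point)
    return horiz_speed > speed_thresh or offset > dist_thresh
```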
The first embodiment of the application provides a posture processing method applied to a smart garment, where the smart garment comprises a plurality of fiber-based stretch sensors woven into the areas of the garment corresponding to each human joint. Because the fibers in the area corresponding to a joint stretch to different degrees when the joint is at different angles, this embodiment obtains the stretch detection value of a fiber-based stretch sensor and determines the corresponding angle value from it. Each human joint is given an independent three-dimensional coordinate system, and since a single fiber-based stretch sensor identifies the stretch amplitude of a joint in a single direction across different angles, the angle values can be mapped onto the coordinate axes of each joint's coordinate system to obtain a three-dimensional posture. This embodiment thus accurately identifies, at the mechanical stretch level, the angle value of each joint in three-dimensional space by means of the fiber-based stretch sensors, and maps it into the joint's coordinate system to form the three-dimensional posture. Compared with existing inertial and optical motion capture, this posture processing approach adapts better to different scenes, and because angle recognition is performed at the mechanical stretch level there is no integration-drift problem, so the accuracy of user posture recognition is effectively improved. In addition, inertial and optical motion capture must fit many degrees of freedom (positions and postures) of the whole body from inertial or image information, involving a large amount of data and high processing difficulty; by contrast, this embodiment computes joint angles directly from stretch values, keeping the data volume and processing difficulty low.
In the second embodiment of the present application, content that is the same as or similar to the first embodiment may be referred to above and is not repeated. On this basis, referring to FIG. 7, the human joints in the designated human body region include an ankle joint, a knee joint, and a hip joint;
after step A10, the method further includes steps B10 to B40:
Step B10, responding to a jump event, acquiring a landing acceleration, and calculating a ground reaction force based on the landing acceleration, wherein the landing acceleration is determined according to the vertical acceleration during the jump event;
Step B20, determining stress thresholds of the ankle joint, the knee joint, and the hip joint according to their local joint angles;
Step B30, performing a reverse recursion calculation starting from the ankle joint according to the ground reaction force, to obtain an ankle joint force, a knee joint force, and a hip joint force;
Step B40, comparing the ankle joint force, the knee joint force, and the hip joint force with their corresponding stress thresholds, and outputting injury warning information once any of them exceeds its corresponding stress threshold.
In jump scenarios (such as basketball shooting or the long jump), the supporting force that the ankle, knee, and hip can provide on landing differs with their angles. For example, the supporting force of the ankle is greatest when its local joint angle is 0°, i.e., when the lower leg is perpendicular to the ground, and becomes smaller the further the local joint angle deviates from 0°. This embodiment therefore sets a corresponding stress threshold for each joint's local joint angle, the stress threshold being the supporting force the joint can bear at that angle. A jump event is an event in which the user performs a jumping action.
This embodiment can determine that a jump event exists by acquiring the user's vertical acceleration (perpendicular to the ground): a profile that first falls (the pre-squat phase) and then rises sharply (the take-off moment) indicates a jump. In response to the jump event, a landing acceleration is acquired, determined from the vertical acceleration during the event; it may be the sum of the absolute values of the gravitational acceleration and the vertical acceleration. Air resistance during the jump and fall may also be considered, in which case the landing acceleration is a corrected value, i.e., that sum multiplied by a predetermined correction coefficient. The ground reaction force is then obtained from the landing acceleration by Newton's second law: it is the product of the landing acceleration and the body mass. From the local joint angles of the ankle, knee, and hip, the mapping between joint angle and stress threshold is queried to obtain each joint's stress threshold at its current local joint angle. Then, starting from the ankle and using the ground reaction force, a reverse recursion yields the ankle joint force, knee joint force, and hip joint force at the landing moment. Illustratively, the ankle joint force is F_ankle = F_GRF − m_foot · g, where m_foot is the mass of the foot, g is the gravitational acceleration, and F_GRF is the ground reaction force. After the ankle force F_ankle is transferred to the knee, the knee joint force is F_knee = F_ankle + m_shank · g, where m_shank is the mass of the lower leg; after the knee force F_knee is transferred to the hip, the hip joint force is F_hip = F_knee + m_thigh · g, where m_thigh is the mass of the thigh. The ankle, knee, and hip joint forces are then compared with their respective stress thresholds, and once any of them exceeds its threshold, injury warning information is output. The injury warning information warns of the injury risk and may be output as text, images, speech, and the like.
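A compact sketch of the ground-reaction-force estimate and the ankle-to-hip reverse recursion follows. The segment masses, correction coefficient, and thresholds are illustrative assumptions, and the sign convention is chosen so that the joint forces come out positive.

```python
# A sketch of the reverse recursion from ground reaction force to joint
# forces, following the formulas in this embodiment. Masses, the correction
# coefficient, and thresholds are illustrative assumptions.
G = 9.81  # gravitational acceleration, m/s^2

def ground_reaction_force(body_mass, vertical_acc, correction=1.0):
    # landing acceleration = |g| + |vertical acceleration|, optionally
    # multiplied by a predetermined correction coefficient for air resistance
    landing_acc = correction * (G + abs(vertical_acc))
    return body_mass * landing_acc            # Newton's second law

def joint_forces(f_grf, m_foot, m_shank, m_thigh):
    # sign convention assumed so that transmitted forces are positive
    f_ankle = f_grf - m_foot * G    # force transmitted through the ankle
    f_knee = f_ankle + m_shank * G  # ankle force transferred to the knee
    f_hip = f_knee + m_thigh * G    # knee force transferred to the hip
    return f_ankle, f_knee, f_hip

def injury_warning(forces, thresholds):
    # thresholds come from the joint-angle -> stress-threshold mapping
    return any(f > t for f, t in zip(forces, thresholds))
```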
In the second embodiment of the application, a landing acceleration is acquired in response to a jump event and a ground reaction force is calculated from it, the landing acceleration being determined from the vertical acceleration during the event; stress thresholds of the ankle, knee, and hip joints are determined from their local joint angles; the ankle, knee, and hip joint forces are obtained by reverse recursion starting from the ankle according to the ground reaction force; each joint force is compared with its corresponding stress threshold, and injury warning information is output once any joint force exceeds its threshold. By estimating the landing acceleration from the vertical acceleration to compute the ground reaction force at the landing moment, performing the joint-force calculation by reverse recursion starting from the ankle, precisely decomposing the transmission path of the impact force along the joint chain (ankle-knee-hip), and determining the stress threshold matched to the current posture from the real-time local joint angles, this embodiment outputs injury warning information immediately when any joint force exceeds its limit, since risks such as joint sprain or dislocation may then exist, thereby monitoring and warning of injuries that may occur after a jump event.
In the third embodiment of the present application, content that is the same as or similar to the first embodiment may be referred to above and is not repeated. On this basis, referring to FIG. 8, the posture processing method further includes steps C10 to C40:
Step C10, responding to a motion analysis instruction, and acquiring the joint chain and the standard action angle sequence involved in the motion to be analyzed;
Step C20, generating a current action angle sequence from the local joint angles in the joint chain;
Step C30, aligning the current action angle sequence with the standard action angle sequence to obtain a motion coordination deviation;
Step C40, generating posture correction prompt information according to the motion coordination deviation.
The motion analysis instruction is an instruction to analyze a motion activity, for example push-ups, yoga poses, or sit-ups. The joint chain involved in the motion to be analyzed is the chain formed by the human joints that the motion involves. The standard action angle sequence is a time series of the local joint angles of each joint in the chain under the standard form of the motion to be analyzed.
In this embodiment, in response to the motion analysis instruction, the joint chain and standard action angle sequence of the motion to be analyzed are acquired, and the local joint angles in the chain are arranged in time order to generate the current action angle sequence. The current and standard action angle sequences are then aligned on a time axis, and the angle difference and timing difference between the key phase points in the standard sequence and the corresponding mapped phase points in the current sequence are calculated. A key phase point is a phase point of a marker node of the motion to be analyzed in the standard sequence; for a sit-up, for example, the marker nodes are the four nodes of lying back, lying down, sitting up, and sitting down. The mapped phase points are the points in the current sequence that correspond to the key phase points in the standard sequence. The difference information between key phase points and mapped phase points (such as the angle difference and timing difference) is computed as the motion coordination deviation, from which posture correction prompt information is generated: for example, when the angle difference is large, prompting which joints' local joint angles need correction and by what angle; when the timing difference is large, prompting that the coordination between joints needs correction and how the timing of the key phase points should be adjusted (e.g., advancing or delaying the execution time of a marker node in the motion to be analyzed).
In some embodiments, step C30 may be followed by steps D10-D20:
Step D10, mapping the current action angle sequence and the standard action angle sequence onto the same time axis, and calculating the angle difference and the timing difference between each key phase point in the standard action angle sequence and the corresponding mapped phase point in the current action angle sequence;
and step D20, taking the angle difference and the timing difference as the motion coordination deviation.
In this embodiment, the current action angle sequence and the standard action angle sequence may be mapped onto the same time axis, and each key phase point in the standard action angle sequence may be matched against the current action angle sequence to obtain its corresponding mapped phase point. The angle difference and timing difference between each key phase point and its mapped phase point are then calculated and taken together as the motion coordination deviation, so that how closely the motion matches the standard action is judged from the angle differences, and how well the motion is coordinated is judged from the timing differences.
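One possible realization of this alignment is classical dynamic time warping (DTW), sketched below; the embodiment does not prescribe a specific alignment algorithm, so the use of DTW and the sampling period dt are assumptions for illustration.

import numpy as np

def dtw_path(current, standard):
    # Dynamic-time-warping alignment between two 1-D angle sequences;
    # returns the warping path as (current_index, standard_index) pairs.
    n, m = len(current), len(standard)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(current[i - 1] - standard[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def coordination_deviation(current, standard, key_phase_indices, dt=0.02):
    # Angle and timing differences at each key phase point, where the key
    # phase points are given as indices into the standard sequence and dt
    # is the assumed sampling period in seconds.
    mapping = {}
    for ci, sj in dtw_path(current, standard):
        mapping.setdefault(sj, ci)  # first current index aligned to each standard index
    deviations = []
    for k in key_phase_indices:
        ci = mapping.get(k, k)
        deviations.append({
            "phase": k,
            "angle_diff": current[ci] - standard[k],
            "timing_diff": (ci - k) * dt,
        })
    return deviations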
In the third embodiment of the present application, in response to a motion analysis instruction, the joint chain and standard action angle sequence of the motion to be analyzed are acquired, and the local joint angles in the joint chain are used to generate the current action angle sequence. The current action angle sequence is aligned with the standard action angle sequence to obtain the motion coordination deviation, and posture correction prompt information is generated from that deviation. Because the two sequences are aligned nonlinearly, multi-joint collaborative analysis on the time axis is achieved and the motion coordination deviation is located, so that deviations in both the angles and the coordination of the motion to be analyzed can be identified and posture correction prompts issued.
The application provides an intelligent wearing system, as shown in fig. 9, which comprises an intelligent garment 201 and a control terminal 202 in communication connection with the intelligent garment 201;
The smart garment 201 comprises a base fabric layer and a plurality of fiber-based stretch sensors woven into the base fabric layer in areas corresponding to respective human joints;
the control terminal 202 is configured to implement the steps of the gesture processing method of the above-described embodiment.
The smart garment 201 may take the form of a coat, trousers, gloves, sleeves, etc. The base fabric layer of the smart garment 201 (the black area in fig. 9) is made of elastic fibers (spandex, polyester, nylon, etc., or blended elastic fibers) so that it fits the user's body surface. The elastic fibers may additionally receive functional treatments (such as an antibacterial treatment, a wash-resistant coating, or a fatigue-resistant coating). The smart garment 201 includes a plurality of fiber-based stretch sensors woven into the regions of the base fabric layer corresponding to the human joints (the green regions in fig. 9), so as to detect the stretch magnitude of each joint region and thereby characterize the corresponding angle value. It will be appreciated that a fiber-based stretch sensor is provided at least in each region where stretch is produced when the corresponding human joint moves. Illustratively, the fiber-based stretch sensors may be attached to the base fabric layer by weaving, embroidering, or adhering.
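For illustration, the correspondence between sensors, joints, and coordinate axes might be recorded in a layout table of the following form; the joint names, axis assignments, and channel indices here are hypothetical, not a layout disclosed by the application.

# joint name -> { axis of the joint's local frame : sensor channel index }
SENSOR_LAYOUT = {
    "left_knee":      {"x": 0},                    # flexion/extension only
    "right_knee":     {"x": 1},
    "left_elbow":     {"x": 2},
    "right_elbow":    {"x": 3},
    "left_shoulder":  {"x": 4, "y": 5, "z": 6},    # three axes, three sensors
    "right_shoulder": {"x": 7, "y": 8, "z": 9},
}

def joint_for_channel(channel):
    # Reverse lookup: which joint and axis a given sensor channel measures.
    for joint, axes in SENSOR_LAYOUT.items():
        for axis, ch in axes.items():
            if ch == channel:
                return joint, axis
    raise KeyError(channel)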
The control terminal 202 may be a terminal device independent of the smart garment, or a control unit integrated into the smart garment. The control terminal 202 may include at least one processor and a memory communicatively coupled to the at least one processor, wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the gesture processing method of the first embodiment. The control terminal of the smart wearable system in the embodiments of the present application may include, but is not limited to, terminal devices such as a smart phone, a smart watch, a head-mounted display device, a notebook computer, a PDA (Personal Digital Assistant), a PAD (tablet computer), and a desktop computer.
In some embodiments, smart garment 201 further includes a data acquisition module;
The data acquisition module is electrically connected with each fiber-based stretching sensor through a wire, and is used for acquiring sensor signals of each fiber-based stretching sensor and sending the sensor signals to the control terminal 202.
As shown in fig. 10, taking clothing as an example of the smart garment, the thick black line segments in fig. 10 are fiber-based stretch sensors and the thin light-red lines are wires. As shown in fig. 11, taking a glove as an example of the smart garment, the thick black line segments in fig. 11 are fiber-based stretch sensors and the thin light-red lines are wires. In each case, the data acquisition module is electrically connected to the fiber-based stretch sensors through these wires, collects their sensor signals, and transmits them to the control terminal 202.
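A minimal sketch of the data acquisition module's polling loop is given below, assuming a fixed channel count, a fixed sampling rate, and a plain TCP socket as the transport to the control terminal 202; all of these, and the read_channel placeholder, are illustrative assumptions rather than details disclosed by the application.

import socket
import struct
import time

NUM_CHANNELS = 16   # assumed number of stretch-sensor channels
SAMPLE_HZ = 100     # assumed sampling rate

def read_channel(channel):
    # Placeholder for the ADC read of one fiber-based stretch sensor; on
    # real hardware this would sample the sensor's strain-dependent value.
    raise NotImplementedError

def acquisition_loop(host="192.168.0.10", port=9000):
    # Stream one fixed-size binary frame of all channel readings per
    # sample period to the control terminal.
    with socket.create_connection((host, port)) as conn:
        while True:
            frame = [read_channel(ch) for ch in range(NUM_CHANNELS)]
            conn.sendall(struct.pack(f"<{NUM_CHANNELS}f", *frame))
            time.sleep(1.0 / SAMPLE_HZ)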
The smart wearable system illustrated in fig. 9 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present application.
The intelligent wearing system provided by the application can solve the technical problem of low accuracy of the user gesture recognized by the existing gesture recognition scheme by adopting the gesture processing method in the embodiment. Compared with the prior art, the beneficial effects of the intelligent wearing system provided by the application are the same as those of the gesture processing method provided by the embodiment, and other technical features in the intelligent wearing system are the same as those disclosed by the method of the previous embodiment, and are not repeated here.
It is to be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the description of the above embodiments, particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The present application provides a computer-readable storage medium having computer-readable program instructions (i.e., a computer program) stored thereon for performing the gesture processing method in the above-described embodiments.
The computer readable storage medium provided by the present application may be, for example, a USB flash disk, and more generally, without limitation, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system or device, or any combination of the foregoing. More specific examples of a computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this embodiment, the computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to electrical wiring, fiber optic cable, radio frequency (RF), and the like, or any suitable combination of the foregoing.
The computer readable storage medium may be included in the control terminal or may exist alone without being incorporated in the control terminal.
The computer readable storage medium carries one or more programs which, when executed by a control terminal, cause the control terminal to: acquire the stretch detection values of the fiber-based stretch sensors; determine the corresponding angle values according to the stretch detection values; and map the angle values onto the coordinate axes of the three-dimensional coordinate system of each human joint to obtain a three-dimensional gesture.
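The pipeline these program instructions carry out can be sketched end to end as follows, assuming a linear form for the angle mapping relationship of claim 2 and Euler-angle composition along a kinematic chain; the calibration constants and the chain structure are illustrative assumptions, not parameters specified by the application.

from scipy.spatial.transform import Rotation as R

def stretch_to_angle(stretch_value, initial_stretch, degrees_per_unit):
    # Assumed linear calibration from the stretch change to an angle value.
    return (stretch_value - initial_stretch) * degrees_per_unit

def local_joint_rotation(axis_angles_deg):
    # Compose the per-axis angle values of one joint into its local rotation.
    return R.from_euler("xyz", [axis_angles_deg.get(a, 0.0) for a in "xyz"],
                        degrees=True)

def global_pose(chain, local_rotations):
    # Concatenate local joint rotations along the chain (e.g. hip -> knee ->
    # ankle) to obtain each joint's orientation in the global frame.
    pose = {}
    acc = R.identity()
    for joint in chain:
        acc = acc * local_rotations[joint]
        pose[joint] = acc
    return pose

# Example: one leg, flexion angles derived from three sensor channels.
angles = {
    "hip":   {"x": stretch_to_angle(1.25, 1.00, 120.0)},   # 30 degrees
    "knee":  {"x": stretch_to_angle(1.40, 1.00, 120.0)},   # 48 degrees
    "ankle": {"x": stretch_to_angle(1.10, 1.00, 120.0)},   # 12 degrees
}
local = {j: local_joint_rotation(a) for j, a in angles.items()}
pose = global_pose(["hip", "knee", "ankle"], local)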
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules involved in the embodiments of the present application may be implemented in software or in hardware. The name of a module does not, in some cases, constitute a limitation of the module itself.
The readable storage medium provided by the application is a computer readable storage medium, and the computer readable storage medium stores computer readable program instructions (namely computer programs) for executing the gesture processing method, so that the technical problem that the accuracy of the user gesture identified by the existing gesture identification scheme is low can be solved. Compared with the prior art, the beneficial effects of the computer readable storage medium provided by the application are the same as those of the gesture processing method provided by the above embodiment, and are not described herein.
The application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the gesture processing method as described above.
The computer program product provided by the application can solve the technical problem that the accuracy of the user gesture recognized by the existing gesture recognition scheme is low. Compared with the prior art, the beneficial effects of the computer program product provided by the application are the same as those of the gesture processing method provided by the above embodiment, and are not described herein.
The foregoing description is only a partial embodiment of the present application, and is not intended to limit the scope of the present application, and all the equivalent structural changes made by the description and the accompanying drawings under the technical concept of the present application, or the direct/indirect application in other related technical fields are included in the scope of the present application.

Claims (9)

1. A posture processing method, applied to smart clothing, the smart clothing comprising a plurality of fiber-based stretch sensors woven into areas of the smart clothing corresponding to respective human joints, the posture processing method comprising:
obtaining a stretch detection value of each fiber-based stretch sensor;
determining a corresponding angle value according to the stretch detection value;
determining, according to the arrangement position of the fiber-based stretch sensor, the coordinate axis of the three-dimensional coordinate system of the human joint to which the fiber-based stretch sensor corresponds;
mapping the angle value of the fiber-based stretch sensor onto the coordinate axis of the corresponding three-dimensional coordinate system to obtain a local joint angle; and
concatenating the local joint angles according to the transformation relationships between the three-dimensional coordinate systems of the human joints to obtain a global human posture as the three-dimensional posture.

2. The posture processing method according to claim 1, wherein the step of determining the corresponding angle value according to the stretch detection value comprises:
obtaining an initial stretch value of the fiber-based stretch sensor, wherein the initial stretch value is the stretch value of the fiber-based stretch sensor in an initial standard state;
calculating a stretch change value between the stretch detection value and the initial stretch value; and
obtaining the angle value corresponding to the stretch detection value according to the stretch change value and the angle mapping relationship of the fiber-based stretch sensor, wherein the angle mapping relationship describes the mapping between the stretch change value and the angle value of the fiber-based stretch sensor.

3. The posture processing method according to claim 1, wherein the step of concatenating the local joint angles according to the transformation relationships between the three-dimensional coordinate systems of the human joints to obtain the global human posture comprises:
acquiring a designated human body region, and determining the local joint angle of each human joint in the designated human body region; and
concatenating the local joint angles of the human joints in the designated human body region according to the transformation relationships between the three-dimensional coordinate systems of the human joints to obtain the global human posture.

4. The posture processing method according to claim 3, wherein the human joints in the designated human body region include an ankle joint, a knee joint and a hip joint, and wherein after the step of acquiring the designated human body region and determining the local joint angle of each human joint in the designated human body region, the method comprises:
in response to a jump event, acquiring a landing acceleration, and calculating a ground reaction force based on the landing acceleration, wherein the landing acceleration is determined according to the vertical acceleration during the jump event;
determining stress thresholds of the ankle joint, the knee joint and the hip joint according to the local joint angles of the ankle joint, the knee joint and the hip joint;
performing, according to the ground reaction force, a reverse recursive calculation starting from the ankle joint to obtain an ankle joint force, a knee joint force and a hip joint force; and
comparing the ankle joint force, the knee joint force and the hip joint force with the corresponding stress thresholds respectively, and outputting injury early-warning information when any one of the ankle joint force, the knee joint force and the hip joint force exceeds its corresponding stress threshold.

5. The posture processing method according to claim 1, further comprising:
in response to a motion analysis instruction, acquiring the joint chain and the standard action angle sequence related to the motion to be analyzed;
generating a current action angle sequence from the local joint angles in the joint chain;
aligning the current action angle sequence with the standard action angle sequence to obtain a motion coordination deviation; and
generating posture correction prompt information according to the motion coordination deviation.

6. The posture processing method according to claim 5, wherein the step of aligning the current action angle sequence with the standard action angle sequence to obtain the motion coordination deviation comprises:
mapping the current action angle sequence and the standard action angle sequence onto the same time axis, and calculating the angle difference and the timing difference between each key phase point in the standard action angle sequence and the corresponding mapped phase point in the current action angle sequence; and
taking the angle difference and the timing difference as the motion coordination deviation.

7. The posture processing method according to any one of claims 1 to 6, wherein after the step of mapping the angle values onto the coordinate axes of the three-dimensional coordinate systems of the human joints to obtain the three-dimensional posture, the posture processing method further comprises:
importing the three-dimensional posture into a standard human body model, and determining the real-time center-of-gravity position under the three-dimensional posture; and
outputting fall warning information when the horizontal moving speed of the real-time center-of-gravity position is greater than a predetermined speed threshold, or when the relative distance between the real-time center-of-gravity position and a predetermined human body support point is greater than a predetermined distance threshold.

8. A smart wearable system, comprising smart clothing and a control terminal communicatively connected to the smart clothing, wherein:
the smart clothing comprises a base fabric layer and a plurality of fiber-based stretch sensors woven into areas of the base fabric layer corresponding to respective human joints; and
the control terminal is configured to implement the steps of the posture processing method according to any one of claims 1 to 7.

9. The smart wearable system according to claim 8, wherein the smart clothing further comprises a data acquisition module, the data acquisition module being electrically connected to each fiber-based stretch sensor through wires, and being configured to collect the sensor signals of the fiber-based stretch sensors and send them to the control terminal.
CN202510829406.9A 2025-06-20 2025-06-20 Posture processing method and intelligent wearable system Active CN120335619B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510829406.9A CN120335619B (en) 2025-06-20 2025-06-20 Posture processing method and intelligent wearable system


Publications (2)

Publication Number Publication Date
CN120335619A CN120335619A (en) 2025-07-18
CN120335619B true CN120335619B (en) 2025-09-23

Family

ID=96364003

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510829406.9A Active CN120335619B (en) 2025-06-20 2025-06-20 Posture processing method and intelligent wearable system

Country Status (1)

Country Link
CN (1) CN120335619B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104720821A (en) * 2015-04-01 2015-06-24 深圳柔微传感科技有限公司 Method and smart clothing for achieving real-time posture monitoring
CN114788693A (en) * 2021-01-25 2022-07-26 苏州润裳姿式智能科技有限公司 Joint angle monitoring device and method and readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015198771A (en) * 2014-04-08 2015-11-12 富士通株式会社 Posture state detection apparatus, posture state detection program, and posture state detection method
CN104720787A (en) * 2015-04-01 2015-06-24 深圳柔微传感科技有限公司 Method and smart clothing for achieving real-time fatigue monitoring
CN113268141B (en) * 2021-05-17 2022-09-13 西南大学 A motion capture method and device based on inertial sensors and fabric electronics
CN113288123A (en) * 2021-05-29 2021-08-24 南京摩盛科技有限公司 Flexible stretchable wearable device integrating joint angle measurement and motion posture measurement
US20250102327A1 (en) * 2022-09-15 2025-03-27 Yale University Stretchable fabric sensor, wearable electronic device including the same, and method of making the same
CN118670246A (en) * 2024-05-17 2024-09-20 武汉纺织大学 High-performance fiber-based stretching sensor and preparation method thereof
CN119645223B (en) * 2024-11-08 2025-08-08 华南理工大学 A whole-body posture estimation method and system based on pressure and inertial sensors


Also Published As

Publication number Publication date
CN120335619A (en) 2025-07-18

Similar Documents

Publication Publication Date Title
Slade et al. An open-source and wearable system for measuring 3D human motion in real-time
CN107049324B (en) A kind of judgment method and device of limb motion posture
TWI487505B (en) Mechanomyographic signal input device, human-machine operating system and identification method thereof
CN107330967B (en) Rider motion posture capturing and three-dimensional reconstruction system based on inertial sensing technology
WO2018196227A1 (en) Evaluation method, device, and system for human motor capacity
Wei et al. Real-time 3D arm motion tracking using the 6-axis IMU sensor of a smartwatch
JP7107264B2 (en) Human Body Motion Estimation System
Gauthier et al. Human movement quantification using Kinect for in-home physical exercise monitoring
JP2022043264A (en) Exercise evaluation system
Yahya et al. Accurate shoulder joint angle estimation using single RGB camera for rehabilitation
CN114053679A (en) Exercise training method and system
CN110609621A (en) Posture calibration method and human motion capture system based on micro-sensor
CN120335619B (en) Posture processing method and intelligent wearable system
CN106644090B (en) Job hunting instrument state testing method and system based on kinect
WO2010082157A1 (en) Method for determining the rotation axis of a joint and device for monitoring the movements of at least one body part
US11020024B2 (en) System and method for evaluating range of motion of a subject
CN105843388B (en) A kind of data glove system
CN116304544A (en) Motion data calibration method and system
CN210302240U (en) An Augmented Reality AR Wrist Rehabilitation Evaluation and Training System
Lee et al. Motion tracking smart work suit with a modular joint angle sensor using screw routing
Dinh et al. Design and implementation of a wireless wearable band for gait analysis
Park et al. Development of a dance rehabilitation system using kinect and a vibration feedback glove
JP2021099666A (en) Method for generating learning model
US20210255694A1 (en) Methods of and systems for estimating a topography of at least two parts of a body
Redhouse Joint Angle Estimation Method for Wearable Human Motion Capture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant