WO2019111977A1 - Posture determination device - Google Patents
Posture determination device
- Publication number
- WO2019111977A1 (PCT/JP2018/044804)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- posture
- unit
- waveform
- determination unit
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61G—TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
- A61G7/00—Beds specially adapted for nursing; Devices for lifting patients or disabled persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61G—TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
- A61G7/00—Beds specially adapted for nursing; Devices for lifting patients or disabled persons
- A61G7/05—Parts, details or accessories of beds
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1115—Monitoring leaving of a patient support, e.g. a bed or a wheelchair
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1118—Determining activity level
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/113—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb occurring during breathing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4538—Evaluating a particular part of the musculoskeletal system or a particular medical condition
- A61B5/4561—Evaluating static posture, e.g. undesirable back curvature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6891—Furniture
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61G—TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
- A61G7/00—Beds specially adapted for nursing; Devices for lifting patients or disabled persons
- A61G7/05—Parts, details or accessories of beds
- A61G7/057—Arrangements for preventing bed-sores or for supporting patients with burns, e.g. mattresses specially adapted therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61G—TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
- A61G2203/00—General characteristics of devices
- A61G2203/30—General characteristics of devices characterised by sensor means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61G—TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
- A61G2203/00—General characteristics of devices
- A61G2203/30—General characteristics of devices characterised by sensor means
- A61G2203/32—General characteristics of devices characterised by sensor means for force
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61G—TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
- A61G2203/00—General characteristics of devices
- A61G2203/30—General characteristics of devices characterised by sensor means
- A61G2203/34—General characteristics of devices characterised by sensor means for pressure
Definitions
- the present invention relates to a posture determination device.
- the detection device disclosed in Patent Document 1 detects a vibration on a bed and extracts a heartbeat vibration signal derived from a heartbeat of a living body. Then, the detection device estimates the posture of the living body on the bed based on the extracted heartbeat vibration signal.
- the notification device disclosed in Patent Document 2 acquires a user state including at least one of the posture of the user and the position of the user on or outside the bed. Then, the notification device issues a notification in accordance with the acquired user state.
- An object of the present invention is to provide a posture determination device capable of appropriately determining the posture of a user from a waveform based on the vibration of the user.
- the posture determination apparatus includes at least two vibration sensors capable of detecting vibration of the user during bed rest, and a control unit that calculates a plurality of waveforms from the plurality of detected vibrations and determines the posture of the user based on features of the plurality of waveforms.
- according to the posture determination apparatus of the present invention, it is possible to appropriately determine the posture of the user at bed rest, as one of the states of the user, based on the plurality of waveforms calculated from the vibration.
- a conventional apparatus detects whether the user has left the bed (whether the user is present in the bed). However, such an apparatus did not detect what state the user is in while the user is in the bed.
- alternatively, the user or others may install many devices (for example, sensors) over a wide area of the bed so that the device can discriminate between a lying position and a sitting position.
- even so, such a device could not determine the user's orientation (for example, supine position, prone position, or lateral position) while the user was sleeping in the bed.
- in order for a device to determine the posture of the user, a camera device or the like had to be provided separately, and medical staff, facility staff, family members, and so on (hereinafter, "staff etc.") had to monitor the user, or the device had to analyze images captured by the camera device. In this case, the staff etc. had to monitor the user constantly, which imposed a large burden on them. Moreover, monitoring the user required the camera device to photograph the user at all times, so there were also cases where the user objected to being photographed, from the viewpoint of privacy.
- according to the posture determination device of the present embodiment, the posture of the user can be determined merely by detecting vibration while the user is sleeping on the bed.
- a user means a person using a bed (or a mattress). The term is not limited to people receiving medical treatment for a disease; it also applies to people receiving care at a facility and, more generally, to anyone who goes to bed in a bed.
- FIG. 1 is a view for explaining the overall outline of a system 1 to which the posture determination device of the present invention is applied.
- the system 1 includes a detection device 3 placed between the bed frame of the bed 10 and the mattress 20, and a processing device 5 for processing the values output from the detection device 3.
- the system 1 determines the posture of the user by the detection device 3 and the processing device 5.
- the detection device 3 detects body vibration (vibration generated from the human body) as a biological signal of the user P who is the user. Further, the processing device 5 calculates the biological information value of the user P based on the vibration detected by the detection device 3.
- the processing device 5 may output and display the calculated biological information value (for example, the respiration rate, the heart rate, and the activity amount) as the biological information value of the user P.
- the detection device 3 may be integrally formed with the processing device 5 by providing the detection device 3 with a storage unit, a display unit, and the like.
- the processing device 5 may be a general-purpose device: not only an information processing device such as a computer but also a device such as a tablet or a smartphone. Further, when the detection device 3 has a communication function, the detection device 3 may connect (communicate) with a server device instead of the processing device 5.
- the user may be a person being treated for illness, or may need care.
- the user may also be a healthy person who does not need care, an elderly person, a child, or a person with a disability, and may even be an animal.
- the detection device 3 is, for example, a thin sheet-like device. Thereby, even if the detection device 3 is placed between the bed 10 and the mattress 20, the user P can use the detection device 3 without feeling discomfort. Therefore, the detection device 3 can detect the state of the user on the bed for a long time.
- the detection device 3 need only be able to detect the vibration of the user P.
- the detection device 3 may instead use a strain-gauge load cell that measures a load, disposed on an actuator, a leg of the bed 10, or the like.
- the detection device 3 may also use an acceleration sensor built into a smartphone, tablet, or the like placed on the bed 10.
- in FIG. 1, the direction toward the head side of the bed 10 is denoted H, the direction toward the foot side F, the direction toward the left of the user P L, and the direction toward the right of the user P R.
- the configuration of the system 1 will be described with reference to FIGS. 2 to 4.
- the system 1 in the present embodiment includes the detection device 3 and the processing device 5. Each functional unit other than the detection unit 110 may be included in either the detection device 3 or the processing device 5; one of the two devices implements each such functional unit. In combination, the detection device 3 and the processing device 5 function as the posture determination device.
- the system 1 includes a control unit 100, a detection unit 110, a first calculation unit 120, a second calculation unit 130, a third calculation unit 135, a determination unit 140, a storage unit 150, an input unit 160, and an output unit 170.
- the control unit 100 controls the operation of the system 1.
- the control unit 100 is, for example, a control device such as a CPU (Central Processing Unit).
- the control unit 100 realizes various processes by reading out and executing various programs stored in the storage unit 150.
- here, one control unit 100 is provided for the system as a whole, but a control unit may instead be provided in each of the detection device 3 and the processing device 5, as shown in FIG. 3 described later.
- the detection unit 110 detects vibration on the detection device 3 and acquires vibration data.
- the detection unit 110 of the present embodiment detects a vibration (body vibration) based on a user's movement or the like using a sensor that detects a pressure change.
- the detection unit 110 also acquires vibration data based on the detected vibration.
- the detection unit 110 outputs the vibration data to the first calculation unit 120, the second calculation unit 130, and the third calculation unit 135.
- the vibration data may be analog vibration data or digital vibration data.
- the detection unit 110 may acquire vibration data by detecting the vibration of the user with a pressure sensor, for example.
- the detection unit 110 may include a microphone instead of a pressure sensor.
- the detection unit 110 may acquire a biological signal based on the sound picked up by the microphone, and may acquire vibration data from the biological signal.
- the detection unit 110 may also acquire vibration data from the output values of an acceleration sensor, a capacitance sensor, or a load sensor. As described above, the detection unit 110 only needs to be able to acquire a biological signal (body vibration indicating a user's body movement) using any method.
- the first calculation unit 120 acquires the biological signal of the user from the vibration data, and calculates a biological information value (respiration rate, heart rate, activity amount, etc.).
- the respiration component and the heartbeat component are extracted from the vibration (body vibration) data acquired from the detection unit 110.
- the first calculation unit 120 may obtain the respiration rate and the heart rate from the extracted respiration component and heartbeat component based on the respiration interval and the heartbeat interval. Further, the first calculation unit 120 may analyze the periodicity of the vibration data (Fourier transform or the like), and calculate the respiration rate and the heart rate from the peak frequency.
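The peak-frequency approach described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the respiration band limits (0.1 to 0.5 Hz, i.e. 6 to 30 breaths per minute) are assumed typical adult values, not figures from the patent.

```python
import numpy as np

def respiration_rate_from_peak(waveform, fs):
    """Breaths per minute from the dominant spectral peak.

    Searches a typical adult respiration band (0.1-0.5 Hz); the band
    limits are assumptions for this sketch, not values stated in the
    patent.
    """
    # Remove the DC offset, then take the one-sided magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(waveform - np.mean(waveform)))
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # cycles per second -> cycles per minute

# A 0.25 Hz breathing-like waveform corresponds to 15 breaths per minute.
fs = 10.0
t = np.arange(0, 60, 1.0 / fs)
rate = respiration_rate_from_peak(np.sin(2 * np.pi * 0.25 * t), fs)
```

The same peak search in a heartbeat band (around 0.8 to 3 Hz) would yield the heart rate.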
- JP-A-2015-12948, title of the invention: "Sleep evaluation device, sleep evaluation method, and sleep evaluation program", filing date: July 4, 2013
- This patent application is incorporated in its entirety by reference.
- the second calculation unit 130 converts analog vibration data input from the detection unit 110 into digital voltage signals at predetermined sampling intervals, and calculates waveform data indicating a waveform of the body vibration (vibration waveform).
- the third calculator 135 calculates the frequency distribution from the waveform data. For example, the third calculation unit 135 calculates a frequency distribution from the frequency components obtained by fast Fourier transform (FFT) of the waveform data calculated by the second calculation unit 130. The third calculator 135 may directly calculate the frequency distribution based on the vibration data output from the detector 110.
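As a rough sketch of this frequency-distribution calculation: the patent specifies an FFT-based frequency distribution, while the normalization to a unit-sum distribution below is an assumption made for illustration.

```python
import numpy as np

def frequency_distribution(waveform, fs):
    """Frequency distribution of a vibration waveform via FFT.

    Returns (frequencies, weights), where the FFT magnitudes are
    normalized to sum to 1 so they can be read as a distribution over
    frequency bins.
    """
    mags = np.abs(np.fft.rfft(waveform - np.mean(waveform)))
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
    total = np.sum(mags)
    return freqs, mags / total if total > 0 else mags

# A 2 Hz sine sampled at 20 Hz concentrates its weight at 2 Hz.
fs = 20.0
t = np.arange(0, 10, 1.0 / fs)
freqs, dist = frequency_distribution(np.sin(2 * np.pi * 2.0 * t), fs)
peak = freqs[np.argmax(dist)]
```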
- the determination unit 140 determines the state of the user. For example, the determination unit 140 determines the state of the user using at least one of the vibration data acquired by the detection unit 110, the biological information value calculated by the first calculation unit 120, a value acquired from a load sensor separately provided in the bed 10, the waveform data calculated by the second calculation unit 130, the frequency distribution calculated by the third calculation unit 135, and the like. The determination unit 140 may also determine the state of the user by combining a plurality of such data and values.
- the determination unit 140 determines, as the state of the user, the posture of the user on the bed (whether the user is in the supine, prone, or lateral position). In addition, the determination unit 140 may determine other postures such as the end sitting position, or states other than posture (for example, having left the bed, being in bed, etc.).
- the storage unit 150 stores various data and programs for the system 1 to operate.
- the control unit 100 implements these various functions by reading these programs and executing the programs.
- the storage unit 150 is configured of a semiconductor memory (for example, a solid state drive (SSD) or an SD card (registered trademark)), a magnetic disk drive (for example, a hard disk drive (HDD)), or the like.
- the storage unit 150 may be a built-in storage device, or may be a removable external storage device.
- the storage unit 150 may be a storage area of an external server such as a cloud.
- the storage unit 150 includes a vibration data storage area 152 and a waveform data storage area 154.
- the vibration data storage area 152 and the waveform data storage area 154 are assigned to the respective areas in the storage unit 150.
- the vibration data storage area 152 stores the vibration data output from the detection unit 110.
- vibration data is stored at predetermined time intervals.
- vibration data may be stored at relatively short intervals, such as every one second or every five seconds, or at relatively long intervals, such as every 30 seconds, one minute, or five minutes.
- the waveform data storage area 154 stores the waveform data (waveform) of the vibration calculated by the second calculation unit 130 based on the vibration data output from the detection unit 110 or the vibration data stored in the vibration data storage area 152.
- the second calculation unit 130 may calculate the waveform data as needed.
- the waveform data may be temporarily stored in the waveform data storage area 154 or may be stored in a cumulative manner.
- the input unit 160 receives operations from the user. For example, the input unit 160 receives an operation indicating that acquisition of the user's vibration should start, or an operation adjusting the sensitivity of the detection unit 110.
- the input unit 160 is, for example, a keyboard, a mouse, a touch panel, or the like.
- the output unit 170 outputs various information.
- the output unit 170 is, for example, a display device such as a liquid crystal display, a light-emitting member such as an LED, a speaker for outputting voice or sound, or an interface for outputting data to another recording medium.
- the input unit 160 and the output unit 170 may be integrated.
- one touch panel may double as the input unit 160 and the output unit 170.
- the first calculation unit 120, the second calculation unit 130, the third calculation unit 135, and the determination unit 140 are mainly realized by software (program).
- the control unit 100 reads the software stored in the storage unit 150, and the control unit 100 executes the software. As a result, the control unit 100 realizes the function of each configuration.
- the control unit 100 reads and executes a program realizing the first calculation unit 120, the second calculation unit 130, the third calculation unit 135, and the determination unit 140, whereby the control unit 100 takes on the function of each of these components.
- FIG. 2 illustrates the configuration of the posture determination device of the system 1.
- these configurations may be realized, for example, by one device capable of detecting vibration, or, as shown in FIG. 1, the posture determination device may be divided into the detection device 3 and the processing device 5. Further, the posture determination device may be realized using an external server capable of providing the same service instead of the processing device 5.
- the detection device 3 includes a control unit 300, a detection unit 320 which is a sensor, a storage unit 330, and a communication unit 390.
- control unit 300 functions as the first calculation unit 310 by executing software (program) stored in the storage unit 330.
- the detection unit 320 outputs vibration data based on the detected vibration.
- the first calculator 310 calculates a biological information value based on the vibration data. Then, the detection device 3 stores the biological information value in the biological information value data 340 or transmits the biological information value to the processing device 5 through the communication unit 390. In addition, the detection device 3 also transmits the vibration data detected by the detection unit 320 to the processing device 5 via the communication unit 390.
- the timing of transmitting vibration data from the detection device 3 to the processing device 5, and the timing of storing the biological information value (biological information) in the biological information value data 340, may be real-time or at predetermined intervals.
- the detection unit 320 is the detection unit 110 in FIG. 2. Here, the detection unit 320 will be described with reference to FIG. 4.
- FIG. 4 is a top view of the bed 10 (mattress 20).
- in FIG. 4, the upward direction is the direction H in FIG. 1, the downward direction is the direction F, the direction toward the right is the direction L, and the direction toward the left is the direction R.
- the detection device 3 is placed between the bed 10 and the mattress 20 or on the mattress 20.
- the detection device 3 is preferably placed in the vicinity of the user's back, that is, closer to the H side (the head side of the user) than the center of the bed 10 (mattress 20).
- the detection device 3 incorporates a sensor (detection unit 110/320).
- the sensor is, for example, a vibration sensor, and can detect the user's vibration (body vibration).
- at least two sensors are provided in the detection device 3.
- in this embodiment, two sensors, a vibration sensor 320a and a vibration sensor 320b, are provided on the left and right of the detection device 3.
- the vibration sensor 320a is provided apart from the vibration sensor 320b.
- the distance between the vibration sensor 320a and the vibration sensor 320b may be, for example, such that the user lies over both sensors; preferably, the distance between the two sensors is about 15 to 60 cm.
- the first calculator 310 in FIG. 3 is the first calculator 120 in FIG.
- the communication unit 390 is, for example, an interface that can be connected (communicated) to a network (for example, a LAN / WAN).
- the processing device 5 includes a control unit 500, a storage unit 530, an input unit 540, an output unit 550, and a communication unit 590.
- the processing device 5 receives vibration data from the detection device 3 via the communication unit 590.
- the processing device 5 stores the received vibration data in the vibration data storage area 532.
- the control unit 500 executes software (program) stored in the storage unit 530, whereby the control unit 500 functions as a second calculation unit 502, a third calculation unit 504, and a determination unit 506.
- the waveform data calculated by the second calculation unit 502 is stored in the waveform data storage area 534.
- the second calculator 502 is the second calculator 130 in FIG.
- the third calculator 504 is the third calculator 135 of FIG.
- the determination unit 506 is the determination unit 140 in FIG.
- the input unit 540 is the input unit 160 of FIG.
- the output unit 550 is the output unit 170 of FIG.
- the storage unit 530 is the storage unit 150 of FIG.
- Posture determination processing is a process performed by the control unit 100 (determination unit 140).
- the control unit 100 acquires vibration data (step S102). Specifically, the determination unit 140 reads out vibration data from the vibration data storage area 152 to acquire vibration data, or receives and acquires vibration data from the detection unit 110.
- the determination unit 140 determines the posture of the user based on the vibration data. At this time, the determination unit 140 determines the posture of the user based on the correlation between sensors or the correlation in the sensors.
- the correlation between sensors means the correlation of a plurality of data output by a plurality of sensors.
- the correlation in a sensor means the correlation of a plurality of data output from one sensor.
- the second calculation unit 130 calculates a waveform from vibration data and outputs it as waveform data (step S104).
- the second calculation unit 130 outputs waveform data for each vibration sensor.
- in this embodiment, since two vibration sensors are provided, the second calculation unit 130 outputs two sets of waveform data.
- the second calculation unit 130 may store the waveform data in the waveform data storage area 154 or may output the waveform data to the output unit 170.
- when the output unit 170 is a display device, the output unit 170 displays the waveform data.
- the determination unit 140 determines whether there is a correlation between sensors from each waveform data (step S106). For example, the determination unit 140 determines whether the waveform data output for each of the two sensors (for example, the vibration sensor 320a and the vibration sensor 320b in FIG. 4) has similarity as the correlation between the sensors.
- the determination unit 140 obtains the correlation between the sensors by using a cross correlation function for the two waveform data.
- the determination unit 140 can output a value normalized to “0” to “1” based on the similarity between two waveform data by using the cross correlation function.
- the value output using this cross correlation function changes in accordance with the degree of similarity between the two waveform data. For example, when the value of the cross correlation function is “1”, the two waveform data completely match and the similarity is maximum. Also, when the value of the cross correlation function is “0”, the two waveforms do not match at all and the similarity is the smallest.
- the determining unit 140 determines whether the output value of the cross correlation function exceeds a threshold. For example, when the threshold value is “0.7”, the determination unit 140 determines that the two waveform data are not correlated if the output value of the cross correlation function is “0.7” or less. If the output value of the cross correlation function exceeds “0.7”, the determination unit 140 determines that the two waveform data have a correlation. That is, when there is a correlation between two waveform data, the determination unit 140 determines that there is a correlation between sensors.
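The inter-sensor correlation step above (a 0-to-1 normalized output compared against a 0.7 threshold) can be sketched in Python. The zero-mean energy normalization below is an assumption; the patent only states that the cross correlation function's output is normalized to "0" to "1".

```python
import numpy as np

THRESHOLD = 0.7  # threshold value used in the patent's worked example

def cross_sensor_similarity(wave_a, wave_b):
    """Peak of the normalized cross-correlation of two sensor waveforms.

    Returns a value in [0, 1]: near 1 when the waveforms are similar
    (suggesting supine/prone), near 0 when dissimilar (suggesting a
    lateral position).
    """
    a = wave_a - np.mean(wave_a)
    b = wave_b - np.mean(wave_b)
    denom = np.sqrt(np.sum(a * a) * np.sum(b * b))
    if denom == 0:
        return 0.0
    # Scan all lags so a small phase offset between sensors still matches.
    corr = np.correlate(a, b, mode="full") / denom
    return float(min(1.0, np.max(np.abs(corr))))

def sensors_correlated(wave_a, wave_b, threshold=THRESHOLD):
    """True when the inter-sensor correlation exceeds the threshold."""
    return cross_sensor_similarity(wave_a, wave_b) > threshold

# Two similar breathing-like waveforms, and one waveform against noise.
fs = 10.0
t = np.arange(0, 30, 1.0 / fs)
left = np.sin(2 * np.pi * 0.25 * t)
right = np.sin(2 * np.pi * 0.25 * t + 0.3)  # same rhythm, slight phase lag
noise = np.random.default_rng(0).normal(size=len(t))
```

Scanning all lags rather than comparing at zero lag only is a design choice: the two sensors sit at different distances from the vibration source, so the same body vibration may arrive with a small time offset.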
- when there is a correlation between the sensors, the determination unit 140 determines that the posture of the user is the "supine position or prone position" (steps S116 to S120). Since the time a user usually spends sleeping in the prone position is extremely short, the determination unit 140 may simply determine that the user's posture is the "supine position or prone position". However, the prone position carries a high risk of asphyxia, and an association with sudden infant death syndrome has also been reported; there are also applications in which automatic operation of an electric bed should be prohibited while the user is prone. Therefore, in order to distinguish the supine position from the prone position, the determination unit 140 may further determine whether the user's posture is the "supine position" or the "prone position".
- the determination unit 140 determines whether there is a correlation in the sensor (step S106; Yes ⁇ step S116).
- the determination unit 140 evaluates the strength of the periodicity in the waveform data and determines whether or not there is a correlation in the sensor.
- the determination unit 140 determines whether there is a correlation by using an autocorrelation function with respect to waveform data of one sensor.
- the autocorrelation function outputs a value normalized to “0” to “1” based on the strength of the periodicity of the waveform in the same sensor.
- when the value of the autocorrelation function is "1", the determination unit 140 determines that the waveform data is completely periodic and that there is complete correlation in the sensor. Conversely, when the value of the autocorrelation function is "0", the determination unit 140 determines that there is no correlation in the sensor.
- alternatively, the determination unit 140 may calculate the strength of the periodicity as a value normalized to "0" to "1" using a Fourier transform, a chi-square periodogram, or the like, and determine the correlation in the sensor based on that value.
- the determination unit 140 may determine whether the output value of the autocorrelation function exceeds a threshold. For example, when the threshold value is "0.7", the determination unit 140 determines that the calculated waveform data (the waveform of one detected vibration data) has no correlation if the output value of the autocorrelation function is "0.7" or less, and that the waveform data has a correlation if the output value exceeds "0.7".
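The in-sensor (autocorrelation) check can be sketched similarly. Using the largest normalized autocorrelation peak beyond a minimum lag as the 0-to-1 score, and the 1-second minimum lag, are assumptions made for this sketch; the patent only specifies an autocorrelation output normalized to "0" to "1".

```python
import numpy as np

def periodicity_strength(waveform, fs, min_lag_s=1.0):
    """In-sensor periodicity score in [0, 1] via autocorrelation.

    Returns the largest normalized autocorrelation value at lags of at
    least min_lag_s seconds, so the score reflects a repeating rhythm
    (breathing/heartbeat) rather than mere sample-to-sample smoothness.
    """
    x = waveform - np.mean(waveform)
    denom = np.sum(x * x)
    if denom == 0:
        return 0.0
    # Full autocorrelation, then keep non-negative lags; ac[0] == 1.
    ac = np.correlate(x, x, mode="full")[len(x) - 1:] / denom
    start = max(1, int(min_lag_s * fs))
    return float(min(1.0, np.max(np.abs(ac[start:]))))

# A regular breathing-like waveform scores high; white noise scores low.
fs = 10.0
t = np.arange(0, 60, 1.0 / fs)
periodic = np.sin(2 * np.pi * 0.25 * t)
irregular = np.random.default_rng(1).normal(size=len(t))
```

With the patent's example threshold of 0.7, `periodicity_strength` above 0.7 would map to "supine position" and at or below 0.7 to "prone position" in steps S116 to S120.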
- when the determination unit 140 determines that there is a correlation in the sensor, it determines that the posture of the user is the "supine position" (step S116; Yes → step S118). On the other hand, when the determination unit 140 determines that there is no correlation in the sensor, it determines that the posture of the user is the "prone position" (step S116; No → step S120).
- the determination unit 140 determines that there is no correlation between sensors in step S106 (step S106; No).
- the determination unit 140 determines that the posture of the user is "lateral decubitus position" (step S110).
- the determination unit 140 may simply determine that the user's posture is the "lateral decubitus position".
- alternatively, the determination unit 140 may determine whether the user's posture is the "right lateral decubitus position" or the "left lateral decubitus position".
- for example, when the heartbeat signal appears strongly, the determination unit 140 determines that the posture of the user is the "left lateral decubitus position".
- the determination unit 140 can determine the magnitude of the heartbeat signal using, for example, the ratio of the frequency component corresponding to the heartbeat signal to the frequency component corresponding to the respiration signal, or the signal intensity of data subjected to high-pass filter processing.
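The heartbeat-to-respiration ratio mentioned above can be illustrated as a spectral-power ratio. This is a sketch only; the band edges (typical adult respiration around 0.1-0.5 Hz, heart rate around 0.8-2.5 Hz) and all names are assumptions, since the text only states that the ratio of the two components is used:

```python
import numpy as np

def heartbeat_to_respiration_ratio(signal, fs, resp_band=(0.1, 0.5), heart_band=(0.8, 2.5)):
    """Ratio of spectral power in a heartbeat band to power in a respiration band.

    A larger ratio suggests the heartbeat signal appears more strongly,
    e.g. in the left lateral decubitus position.
    """
    x = np.asarray(signal, float)
    x = x - x.mean()
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return power[mask].sum()

    resp = band_power(*resp_band)
    return band_power(*heart_band) / resp if resp > 0 else float("inf")
```

For a synthetic signal whose heartbeat component is weaker than its respiration component, the ratio is below 1; when the heartbeat component dominates, the ratio exceeds 1.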
- the posture determination apparatus of the present embodiment can determine the posture (sleep posture) of the user from the vibration data.
- the determination unit 140 of the present embodiment determines the between-sensor correlation and the presence or absence of the in-sensor correlation based on the waveform data calculated by the second calculation unit 130. However, the determination unit 140 may determine the between-sensor correlation or the in-sensor correlation directly from the vibration data. In this case, the determination unit 140 need not perform the process of step S104.
- in the above description, the determination unit 140 calculates waveform data from the vibration data of the user to determine the posture of the user; however, without being limited thereto, the determination unit 140 may, for example, determine the posture by evaluating the shape of the frequency distribution. In this case, although the determination unit 140 may improve accuracy by comprehensively evaluating the shapes of the frequency distributions of the vibrations acquired by two or more sensors, the number of sensors acquiring the vibration may be one.
- the operation in which the determination unit 140 determines the posture of the user using frequency analysis is as follows.
- the determination unit 140 acquires vibration data from the detection unit 110 (step S152), and the second calculation unit 130 calculates (outputs) waveform data (step S154).
- the third calculation unit 135 calculates the frequency distribution of the waveform data output by the second calculation unit 130 (step S156). For example, when no high-frequency component appears in the frequency distribution, the determination unit 140 determines that the posture of the user is the "lateral decubitus position".
- depending on the posture of the user, the tissues of the body (muscle, fat, bone, viscera, etc.) between the vibration source and the sensor differ. Therefore, the manner in which the vibration is transmitted from the user to the sensor also changes, and a difference appears in the frequency components measured by the system 1.
- the sources of the detected vibration are chiefly the movement of the heart and the movement of the chest and abdomen due to breathing.
- for example, in the supine position, with respect to the movement of the chest and abdomen due to breathing, the movement in the direction perpendicular to the sensor (detection unit 110) is large.
- in the lateral decubitus position, the movement in the direction parallel to the sensor (detection unit 110) is large.
- the frequency distribution thus differs depending on the posture of the user. Therefore, by storing a frequency distribution for each posture, the determination unit 140 can determine the posture by comparing the actually extracted frequency distribution with the stored distributions.
- thereby, the determination unit 140 can determine, for example, that the user's posture is the "lateral decubitus position".
- the third calculation unit 135 calculates the frequency distribution from the waveform data, but may calculate the frequency distribution directly from the vibration data. In this case, step S154 is not performed.
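The comparison between an extracted frequency distribution and the distributions stored per posture could look like the following sketch. The L1 distance, the reference spectra, and all names are illustrative assumptions; the text does not fix a particular comparison metric:

```python
import numpy as np

def normalize(spectrum):
    """Scale a spectrum so that its bins sum to 1 (a shape-only comparison)."""
    s = np.asarray(spectrum, float)
    return s / s.sum()

def classify_by_spectrum(observed, references):
    """Return the posture whose stored frequency distribution is closest
    (smallest L1 distance) to the observed one.

    `references` maps a posture name to a stored spectrum, e.g. measured
    in advance for each posture (hypothetical data, not from the source).
    """
    obs = normalize(observed)
    return min(references, key=lambda p: np.abs(obs - normalize(references[p])).sum())
```

In practice the stored spectra would be measured per user or per bed in advance; here they are placeholders.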
- the determination unit 140 may execute the above-described posture determination process when the user is present. Further, the posture determination process may be performed when the determination unit 140 determines that the user or the state of the bed satisfies a predetermined condition. Hereinafter, a case will be described in which the determination unit 140 uses the body movement of the user as the condition for executing the posture determination process.
- the user's body movement means movement of the user's body, such as the user turning over.
- the time over which the determination unit 140 determines the posture of the user is divided into a plurality of sections, and the determination unit 140 determines the posture of the user for each section. For example, in a section that does not include body movement, the determination unit 140 determines that the user's posture continues as the same posture, and outputs the same posture for that section.
- the determination unit 140 may collectively analyze the sections not including the body movement of the user, or may totalize the results of the posture determination process over a certain fixed section (for example, every three minutes). When the fixed section contains sections that do not include body movement, the determination unit 140 outputs the posture determined most frequently among those sections as the result of the posture determination process for the fixed section.
- the determination unit 140 may determine the posture of the user only in sections in which no body movement occurs. For example, when the detection unit 110 detects a body movement of the user, the determination unit 140 cancels execution of the posture determination process. Then, after the detection unit 110 stops detecting the body movement and a predetermined time elapses (for example, 10 seconds after the body movement is no longer detected), the determination unit 140 executes the posture determination process again.
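The gating of the posture determination process on body movement, including the hold-off after movement stops, can be sketched as follows. The function name, the per-sample boolean input, and the returned sample indices are assumptions; the 10-second hold-off follows the example in the text:

```python
def gate_posture_updates(movement_flags, fs, holdoff_s=10.0):
    """Return the sample indices at which the posture determination may run.

    Determination is cancelled while body movement is detected and for a
    hold-off period after movement stops (10 s in the text's example).
    `movement_flags` is a per-sample sequence of booleans from the body
    movement detector; `fs` is its sampling rate in Hz.
    """
    holdoff = int(holdoff_s * fs)
    last_movement = -holdoff - 1          # pretend movement ended long ago
    allowed = []
    for i, moving in enumerate(movement_flags):
        if moving:
            last_movement = i             # movement resets the hold-off timer
        elif i - last_movement > holdoff:
            allowed.append(i)             # quiet for longer than the hold-off
    return allowed
```

With a 2-second hold-off at 1 Hz, a single movement sample suppresses determination for the next two samples.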
- each of the waveforms shown in FIGS. 7 to 10 is a waveform based on the vibration detected by the two vibration sensors.
- the horizontal axis represents time
- the vertical axis represents a voltage value
- each represents a waveform based on the vibration detected by the detection unit 110.
- for convenience of description, the posture is explained here based on the waveforms.
- a tendency may be detected from time-series vibration data to determine the posture of the user.
- the two waveforms shown in FIG. 7 have no in-sensor correlation, and the same shape does not appear repeatedly. That is, these waveforms indicate that the user's posture is the prone position (step S120 in FIG. 5).
- the two waveforms shown in FIG. 8 are correlated with each other between the sensors, and each also has an in-sensor correlation. That is, these waveforms indicate that the user's posture is the supine position (step S118 in FIG. 5).
- the two waveforms shown in each of FIGS. 9 and 10 have no correlation between the sensors. That is, these waveforms indicate that the user's posture is the lateral decubitus position (step S110 in FIG. 5).
- when FIG. 9 and FIG. 10 are compared, in the waveform of FIG. 10, which is the waveform of the left lateral decubitus position, the high-frequency signal due to the heartbeat is notably visible.
- FIG. 11 is a graph in which the frequency distribution is determined from the waveform.
- FIG. 11 shows the frequency distributions determined in step S156 of FIG. 6: FIG. 11(a) is a graph for the supine position, and FIG. 11(b) is a graph for the lateral decubitus position. As described above, the determination unit 140 can determine the posture of the user by further obtaining the frequency distribution from the waveform (vibration data).
- the second embodiment will be described.
- the second embodiment is an embodiment in which the posture of the user is determined based on a plurality of posture determination conditions.
- the operation flow of FIG. 5 in the first embodiment is replaced with the operation flow of FIG. 12.
- the description of the same parts as those of the first embodiment will be omitted.
- the determination unit 140 acquires vibration data (step S202). Then, the second calculator 130 calculates a waveform (step S204). Subsequently, the determination unit 140 calculates an index (value) for each condition (steps S206 to S214). The determination unit 140 calculates a total value from the calculated index value (step S216).
- the determination unit 140 calculates, as the total values, totals corresponding to the "supine position", the "prone position", and the "(left and right) lateral decubitus position". Then, the determination unit 140 determines the posture having the largest total value as the posture of the user.
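The totalization step can be sketched as a per-posture sum followed by an argmax. The posture names and the structure of the input are illustrative assumptions; the source only states that an index value is computed per condition and the posture with the largest total is selected:

```python
def decide_posture(index_values):
    """Sum per-posture index values and pick the posture with the largest total.

    `index_values` maps each index name to a dict of posture -> value, e.g.
    the first and second indices from steps S206-S214 (hypothetical layout).
    """
    postures = ("supine", "prone", "lateral")
    totals = {p: sum(idx[p] for idx in index_values.values()) for p in postures}
    return max(totals, key=totals.get), totals
```

Feeding in the worked "0.8"/"0.9" example values for the first and second indices yields the supine position as the largest total.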
- the determination unit 140 calculates an index value for each condition in steps S206 to S214.
- the method of calculating the index value will be described with reference to FIG.
- (step S206) the determination unit 140 calculates the value of the first index based on the in-sensor correlation of the waveform. First, the determination unit 140 calculates the value of the autocorrelation function from the in-sensor waveform by the method described in the first embodiment.
- the determination unit 140 weights the output value of the autocorrelation function to calculate the value of the first index.
- the method of weighting performed by the determination unit 140 will be described.
- the table of FIG. 13 shows the characteristics of the sensor waveforms for each posture of the user. For example, the in-sensor correlation of the waveform is "present" when the user's posture is the "supine position", "present" when the user's posture is the "lateral decubitus position", and "absent" when the user's posture is the "prone position".
- the determination unit 140 executes the autocorrelation function and outputs a value between "0" and "1".
- when the in-sensor correlation in FIG. 13 is "present" (the "supine position" and the "lateral decubitus position"), the determination unit 140 uses the output value directly as the value of the first index.
- when the in-sensor correlation in FIG. 13 is "absent" (the "prone position"), the determination unit 140 sets the value obtained by subtracting the output value of the autocorrelation function from the maximum value as the value of the first index.
- for example, when the output value of the autocorrelation function is "0.8", the value of the first index is "0.8" for the "supine position", "0.2" for the "prone position", and "0.8" for the "lateral decubitus position".
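The weighting of the first index per the table of FIG. 13 can be sketched as follows (a minimal illustration; the posture keys and function name are assumptions):

```python
def first_index(autocorr_value):
    """Weight the in-sensor autocorrelation output per the table of FIG. 13:
    postures whose in-sensor correlation is 'present' keep the output value;
    the posture whose correlation is 'absent' gets (maximum - output value).
    """
    return {
        "supine": autocorr_value,        # in-sensor correlation present
        "prone": 1.0 - autocorr_value,   # in-sensor correlation absent
        "lateral": autocorr_value,       # in-sensor correlation present
    }
```

With an autocorrelation output of 0.8 this reproduces the worked example: 0.8 for supine, 0.2 for prone, 0.8 for lateral.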
- (step S208) the determination unit 140 calculates the value of the second index based on the between-sensor correlation of the waveforms. First, the determination unit 140 calculates an output value using the cross-correlation function of the two sets of waveform data by the method described above.
- the determination unit 140 weights the output value to calculate the value of the second index.
- the method of weighting performed by the determination unit 140 will be described.
- the between-sensor correlation of the waveforms is, for example, "present" when the user's posture is the "supine position", "present or absent" when the user's posture is the "prone position", and "absent" when the user's posture is the "lateral decubitus position".
- the determination unit 140 executes the cross-correlation function and outputs a value between "0" and "1".
- when the between-sensor correlation is "present" (the "supine position"), the determination unit 140 uses the output value directly as the value of the second index.
- when the between-sensor correlation is "absent" (the "lateral decubitus position"), the determination unit 140 sets the value obtained by subtracting the output value from the maximum value as the value of the second index.
- when the between-sensor correlation is "present or absent" (the "prone position"), the determination unit 140 sets the value obtained by halving the output value as the value of the second index.
- for example, when the output value of the cross-correlation function is "0.9", the value of the second index is "0.9" for the "supine position", "0.45" for the "prone position", and "0.1" for the "lateral decubitus position".
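The weighting of the second index can likewise be sketched (posture keys and function name are assumptions):

```python
def second_index(crosscorr_value):
    """Weight the between-sensor cross-correlation output: keep it where the
    correlation is 'present' (supine), halve it where it is 'present or
    absent' (prone), and subtract it from the maximum value where it is
    'absent' (lateral)."""
    return {
        "supine": crosscorr_value,
        "prone": crosscorr_value / 2.0,
        "lateral": 1.0 - crosscorr_value,
    }
```

With a cross-correlation output of 0.9 this reproduces the worked example: 0.9 for supine, 0.45 for prone, 0.1 for lateral.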
- (step S210) the determination unit 140 calculates the value of the third index based on the waveform output from the second calculation unit 130. For example, via the third calculation unit 135, the determination unit 140 determines whether the ratio of the power spectral density at the heartbeat frequency, which lies in a higher frequency band than the respiration component, and at integral multiples thereof is a predetermined value or more. When this ratio is equal to or more than the predetermined value, the determination unit 140 determines that the heartbeat waveform appears strongly in the waveform output from the second calculation unit 130 (the waveform contains many heartbeat components).
- otherwise, the determination unit 140 determines that the heartbeat waveform does not appear strongly in the waveform output from the second calculation unit 130 (the waveform does not contain many heartbeat components), and "0" is output. Furthermore, the determination unit 140 outputs a value obtained by weighting this output value as the value of the third index. The method of weighting performed by the determination unit 140 will be described.
- when the user's posture is the "supine position" or the "prone position", the heartbeat waveform appears relatively weakly in the waveform output by the second calculation unit 130 (the waveform contains relatively few heartbeat components).
- on the other hand, when the user's posture is the "lateral decubitus position", the heartbeat waveform appears relatively strongly in the waveform output by the second calculation unit 130 (the waveform contains relatively many heartbeat components). However, when the posture of the user is the "right lateral decubitus position", the heartbeat waveform appears more weakly in the waveform output by the second calculation unit 130 than when the posture of the user is the "left lateral decubitus position".
- therefore, when the posture of the user is the "lateral decubitus position", the determination unit 140 outputs the output value as it is as the value of the third index. When the user's posture is the "supine position" or the "prone position", the determination unit 140 outputs a reduced value (for example, "0.1" or "0") as the value of the third index.
- (step S212) the determination unit 140 calculates a fourth index relating to the shapes of the peaks and troughs in the waveform.
- the portion from a trough to a peak in the waveform is, for example, the portion from time t1 to t2 in FIG.
- the portion from a peak to a trough is, for example, the portion from time t2 to t3 in FIG.
- these changes in the waveform indicate vibration transitions (pressure transitions), which usually correspond to the user's exhalation and inhalation.
- for each respiration cycle (trough to trough or peak to peak), the determination unit 140 evaluates the degree of coincidence between the two sensors (the vibration sensor 320a and the vibration sensor 320b) of the first time taken to go from trough to peak and the second time taken to go from peak to trough. For example, if the ratio of the first time within one cycle for the vibration sensor 320a and the ratio of the first time within one cycle for the vibration sensor 320b are within a desired range of each other, the determination unit 140 evaluates the vibration sensors 320a and 320b as coinciding (forming the "same pair").
- the determination unit 140 calculates the value of the fourth index based on the ratio of cycles forming the same pair between the two sensors (the vibration sensor 320a and the vibration sensor 320b) among the cycles included in the posture determination section.
- the determination unit 140 evaluates the two sensors (the vibration sensor 320a and the vibration sensor 320b) as coinciding when the proportion of same-pair cycles among the cycles included in the posture determination section exceeds a desired threshold.
- for example, the determination unit 140 outputs, as the value of the fourth index, the ratio of cycles in the posture determination section for which the time correspondence between the two sensors coincides, as it is.
- alternatively, the determination unit 140 may output, as the value of the fourth index, a value obtained by multiplying the ratio of same-pair cycles included in the posture determination section by "0.5".
- (step S214) the determination unit 140 calculates, as a fifth index, the result of comparing the first time from trough to peak with the second time from peak to trough in the waveform.
- from the waveform, for each respiration cycle (trough to trough or peak to peak) included in the posture determination section, the determination unit 140 compares the lengths of the first time from trough to peak and the second time from peak to trough, and identifies the cycles in which the first time is "short" and the second time is "long" ("short → long"). Then, the determination unit 140 calculates the value of the fifth index based on the ratio of cycles in the posture determination section for which the time lengths have the "short → long" relationship in both of the two sensors (the vibration sensor 320a and the vibration sensor 320b).
- for example, the determination unit 140 counts the respirations included in the posture determination section in which the first time is shorter than the second time, and outputs the ratio of such coincident respirations as the value of the fifth index.
- alternatively, the determination unit 140 may output, as the value of the fifth index, a value obtained by subtracting "0.5" from the ratio of cycles for which the two sensors form a pair in the posture determination section.
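The counting step of the fifth index can be sketched as follows. Extracting the per-cycle rise and fall times from the waveform is assumed done elsewhere; the function name and input layout are illustrative:

```python
def fifth_index(cycles_a, cycles_b):
    """Fraction of respiration cycles in which the rise time (trough to peak)
    is shorter than the fall time (peak to trough) on BOTH sensors.

    `cycles_a`/`cycles_b` are lists of (rise_time, fall_time) pairs, one per
    cycle, for the two vibration sensors.
    """
    if not cycles_a:
        return 0.0
    matches = sum(
        1 for (ra, fa), (rb, fb) in zip(cycles_a, cycles_b)
        if ra < fa and rb < fb
    )
    return matches / len(cycles_a)
```

With three cycles of which only the first satisfies "short → long" on both sensors, the index is 1/3.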
- when this characteristic is unlikely to appear in the waveform, the determination unit 140 does not use the fifth index; that is, the determination unit 140 outputs "0" as the value of the fifth index.
- by combining a plurality of indices in this way, the determination unit 140 can determine the posture of the user more appropriately.
- the determination unit 140 determines the posture using the waveform calculated by the second calculation unit 130 based on the vibration data.
- however, the determination unit 140 may determine the posture of the user directly based on the vibration data. That is, in the process of FIG. 12, the determination unit 140 may proceed from step S202 to step S206 without executing step S204.
- the determination unit 140 can output the third index by performing frequency analysis on vibration data.
- the determination unit 140 can determine the peaks and troughs of the vibration data in the same manner as for the waveform, as long as the local maxima (or their vicinities) and the local minima (or their vicinities) of the vibration data can be extracted.
- the present embodiment describes a case where the determination unit 140 determines the posture of the user using artificial intelligence (machine learning).
- the posture of the user, which is one of the states of the user, is estimated by the estimation unit 700 of FIG.
- the estimation unit 700 estimates the posture of the user by using vibration data and the state of the user as input values (input data) and using artificial intelligence and various statistical indexes.
- the estimation unit 700 includes an extraction unit 710, an identification unit 720, an identification dictionary 730 (learning model), and an output unit 740.
- vibration data and waveform data calculated from the vibration data are input to the estimation unit 700.
- the extraction unit 710 extracts feature points from the input data and outputs them as a feature vector.
- the contents to be extracted as feature points by the extraction unit 710 may be, for example, the following.
- the extraction unit 710 outputs a feature vector by combining one or more feature points of these contents.
- these feature points are examples, and content extracted as other feature points may be combined.
- the values output by the extraction unit 710 as feature points here are simplified for convenience of description. The extraction unit 710 outputs "1" as the value of an applicable feature point and "0" as the value of a non-applicable feature point.
- the extraction unit 710 may output a random variable as the value of the feature point.
- the extraction unit 710 outputs the 7-dimensional feature vector to the identification unit 720.
- the identification unit 720 identifies a class corresponding to the state of the user from the input feature vector. At this time, the identification unit 720 identifies a class by collating with a plurality of prototypes prepared in advance as an identification dictionary 730.
- the prototype may be stored as a feature vector corresponding to each class, or may store a feature vector representing the class.
- the identification unit 720 determines the class to which the closest prototype belongs. At this time, the identification unit 720 may determine the class according to the nearest neighbor determination rule or may determine the class according to the k-nearest neighbor method.
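The prototype matching with the nearest-neighbour and k-nearest-neighbour rules can be sketched as follows. The dictionary layout, feature values, and class names are placeholders, not from the source:

```python
import numpy as np

def classify(feature_vector, dictionary, k=1):
    """Match a feature vector against prototype vectors (the identification
    dictionary) and return the class of the nearest prototype(s).

    `dictionary` is a list of (prototype_vector, class_label) pairs.
    k=1 is the nearest-neighbour rule; k>1 is a majority vote over the
    k nearest prototypes (the k-nearest-neighbour method).
    """
    x = np.asarray(feature_vector, float)
    dists = sorted(
        (np.linalg.norm(x - np.asarray(p, float)), label) for p, label in dictionary
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)
```

A vector near the "supine" prototypes is classified as supine under either rule; one near the "prone" prototype is classified as prone.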
- the identification dictionary 730 used by the identification unit 720 may store a prototype in advance, may be newly stored using machine learning, or may be updated as needed.
- the output unit 740 outputs a (sleeping) posture as one of the user's states.
- the state of the user output by the output unit 740 may be identified and output as "supine position", "prone position", "lateral decubitus position", or the like, or the random variable may be output as it is.
- the determination unit 140 can acquire vibration data output from the sensor, and can estimate the posture of the user from the information using machine learning.
- the fourth embodiment is an embodiment in which the estimation unit 700 of the third embodiment estimates a user's posture using deep learning using a neural network.
- waveform data of the user is input to the estimation unit 700.
- the estimation unit 700 estimates the user state (posture) from the input waveform data.
- the estimation unit 700 uses deep learning for the estimation process. Processing using deep learning (a deep neural network) can estimate with particularly high accuracy in image recognition.
- FIG. 16 is a diagram for explaining the process in which the estimation unit 700 performs estimation using deep learning.
- the estimation unit 700 inputs the signal of the waveform data (image data) output by the second calculation unit 130 to a neural network composed of a plurality of layers and the neurons included in each layer. Each neuron receives signals from a plurality of other neurons, and outputs the calculated signal to a plurality of other neurons.
- since the neural network has a multilayer structure, its layers are referred to as the input layer, the intermediate layers (hidden layers), and the output layer, in the order in which the signals flow.
- when the intermediate layers of a neural network consist of multiple layers, the network is called a deep neural network (for example, a Convolutional Neural Network with convolution operations), and the machine learning method using it is called deep learning.
- the waveform data is subjected to various operations (convolution operation, pooling operation, normalization operation, matrix operation, etc.) on neurons in each layer of the neural network, flows while changing its form, and a plurality of signals are output from the output layer.
- the estimation unit 700 estimates, as the posture of the user, the posture linked to the output value having the largest value.
- instead of directly outputting the posture that is the user's state, the estimation unit 700 may pass one or more output values to a classifier and estimate the user's posture from the output of the classifier.
- in order to determine the parameters, which are the coefficients used in the various operations of the neural network, a large number of waveform data items and the user's posture corresponding to each are input to the neural network in advance. The error between the output value of the neural network and the correct value is propagated in the backward direction of the neural network by the error backpropagation method. In this way, the parameters of the neurons in each layer are repeatedly updated and determined. This process of updating and determining the parameters is called learning.
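The learning loop described above — forward pass, backward propagation of the error, and parameter updates — can be sketched in NumPy on toy data. The two-feature inputs, the two "posture" classes, the network size, and the learning rate are all illustrative assumptions; a real system would train on many waveform images with their known postures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for (waveform features, posture label) training pairs.
X = rng.standard_normal((120, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # two hypothetical classes

W1 = rng.standard_normal((2, 8)) * 0.5       # input -> hidden parameters
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 2)) * 0.5       # hidden -> output parameters
b2 = np.zeros(2)
lr, losses = 0.5, []

for _ in range(200):
    h = np.maximum(X @ W1 + b1, 0.0)         # hidden layer (ReLU)
    logits = h @ W2 + b2                     # output layer
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)        # softmax probabilities
    losses.append(-np.log(p[np.arange(len(y)), y]).mean())
    g = p.copy()
    g[np.arange(len(y)), y] -= 1.0           # error at the output layer
    g /= len(y)
    # Propagate the error backward and update each layer's parameters.
    gW2, gb2 = h.T @ g, g.sum(axis=0)
    gh = g @ W2.T
    gh[h <= 0] = 0.0                         # ReLU gradient
    gW1, gb1 = X.T @ gh, gh.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

The cross-entropy loss recorded in `losses` decreases as the parameters are updated, which is the "learning" the text describes.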
- the structure of the neural network and the individual operations are known techniques described in books and papers, and any of them may be used.
- as described above, the posture of the user is output with reference to the vibration waveform data (waveform data) calculated from the vibration data output from the sensor.
- in the present embodiment, the estimation unit 700 performs the estimation using a neural network with the waveform data treated as image data.
- the estimation unit 700 may estimate the posture of the user by simply inputting the vibration data (time-series voltage output values) and learning from it.
- the estimation unit 700 may estimate the posture of the user by inputting and learning data converted into a frequency domain signal by Fourier transform or discrete cosine transform.
- the configuration of the bed is shown in FIG.
- the bed 10 has a back bottom 12, a hip bottom 14, a knee bottom 16 and a foot bottom 18.
- the upper body of the user P is supported by the back bottom 12, and the lower back is supported by the hip bottom 14.
- the drive control unit 1000 controls the drive of the bed.
- the drive control unit 1000 includes the function of the bottom control unit 1100, which controls the back-raising and knee-raising (foot-lowering) functions and the like by operating the bottoms.
- the bottom control unit 1100 is connected to the back bottom driving unit 1110 and the knee bottom driving unit 1120 in order to realize the back lifting function.
- the back bottom drive unit 1110 is, for example, an actuator, and is connected to a back-raising link via a link mechanism. The back bottom drive unit 1110 controls the operation of the back bottom 12 placed on the link.
- the back bottom drive unit 1110 performs back up and down control.
- the knee bottom drive unit 1120 is, for example, an actuator.
- the knee bottom drive unit 1120 is connected to a knee lifting link via a link mechanism. Then, the knee bottom drive unit 1120 controls the operation of the knee bottom 16 placed on the link and the foot bottom 18 further connected.
- the knee bottom drive unit 1120 performs knee raising and knee lowering (foot lowering and foot raising) control.
- the bottom control unit 1100 does not perform a back-raising operation when the determination unit 140 (or the estimation unit 700) determines that the user's posture is the "prone position". That is, even if the user selects the back-raising operation, the bottom control unit 1100 does not drive the back bottom drive unit 1110 and the back-raising operation is not performed. Further, even when the bed is in automatic operation, the bottom control unit 1100 does not perform the back-raising operation when the determination unit 140 (or the estimation unit 700) determines that the posture of the user is the "prone position".
- the control unit 100 determines whether an operation has been selected by the user (step S302). For example, when the back-raising button is selected by the user via the input unit 160 (operation remote control), the control unit 100 determines that the back-raising operation has been selected.
- next, the control unit 100 executes the posture determination process (step S304).
- the determination unit 140 executes any of the above-described attitude determination processes to determine the attitude of the user on the bed.
- the estimation unit 700 may determine the attitude of the user.
- the control unit 100 determines whether the posture of the user is a specific posture (step S306). In the present application example, the control unit 100 does not execute the back-raising operation when the user's posture is the "prone position" (step S306; Yes). If the posture is other than that, the control unit 100 instructs the bottom control unit 1100 (back bottom drive unit 1110) to execute the back-raising operation (step S306; No → step S308).
- when the user releases the back-raising operation (for example, the user performs a cancel operation or releases the back-raising button), the control unit 100 stops the back-raising operation (step S310; Yes → step S312).
- the control unit 100 also stops the back-raising operation when the user's posture becomes the specific posture during the back-raising operation (for example, when the user assumes the prone position during the back-raising operation) (step S306; Yes → step S312).
- a posture determination device is applied to a posture conversion device that converts a user's posture.
- the control unit 100 automatically stores the frequency of body posture conversion (position change) and the ratio of each posture in order to grasp the pressure ulcer risk. That is, by automatically storing the posture of the user determined by the determination unit 140, the control unit 100 can support care and treatment.
- the posture conversion device notifies, or automatically performs posture conversion, when a predetermined time has elapsed while the posture of the user determined by the determination unit 140 remains the same.
- the control unit 100 and the drive control unit 1000 control the body posture conversion drive unit 1200.
- the body posture conversion drive unit 1200 converts the body position of the user by, for example, expanding and contracting the air cells provided on the left and right of the user or making the left and right bottoms uneven.
- the drive control unit 1000 controls the body posture conversion drive unit 1200 according to the posture determined by the determination unit 140, and performs control to change the body position of the user P.
- the process of FIG. 18 will be described as an example.
- the determination unit 140 determines the posture of the user by any of the methods described above (step S304).
- the body posture conversion drive unit 1200 executes processing according to the posture determined by the determination unit 140. For example, if the user is in the right lateral decubitus position, the body posture conversion drive unit 1200 controls the air cells provided on the right side to convert the user's posture. If the user is in the left lateral decubitus position, the body posture conversion drive unit 1200 controls the air cells provided on the left side to convert the user's posture. If the user is in the prone position, the body posture conversion drive unit 1200 controls neither the left nor the right air cells, and the user's posture is not converted.
- the body posture conversion drive unit 1200 may perform posture conversion based on elapsed time. For example, when the posture of the user determined by the determination unit 140 continues as the same posture for 10 minutes or more, the body posture conversion drive unit 1200 performs posture conversion.
- the determination unit 140 may determine that there is a change in the vibration waveform data (waveform data), and the body posture conversion drive unit 1200 may then convert the user's posture. That is, when there is a change in the vibration waveform data (waveform data), the determination unit 140 determines that the posture of the user has changed, and posture conversion is performed when the user's posture has changed. Thereby, the position change can be used effectively for pressure sore prevention of locations on the user's non-paralyzed side.
- the notification device performs notification according to the posture of the user determined by the determination unit 140.
- notification may be given by voice through the voice output device, or by display (light) through the display device.
- the notification device may notify to another terminal device (for example, a portable terminal device possessed by a medical worker).
- the following timings are conceivable for issuing a notification.
- in the above description, the posture of the user is determined by the processing device 5 based on the result output from the detection device 3, but all of the processing may instead be performed in a single device.
- processing may be performed on the server side and the processing result may be returned to the terminal device.
- the processing described above may be realized on the server side by uploading vibration data from the detection device 3 to the server.
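The client/server split described above can be sketched as follows. The transport, the payload format, and the placeholder determination rule are all illustrative assumptions; the actual server-side analysis would use the waveform features described in the embodiments, and the "upload" would be a network call rather than a direct function invocation.

```python
# Hypothetical sketch: the detection device uploads vibration data and the
# server returns the determined posture to the terminal device.
import json

def server_determine_posture(payload_json):
    """Server side: receive serialized vibration samples, return a label."""
    samples = json.loads(payload_json)["vibration"]
    # Placeholder rule standing in for the real waveform analysis.
    mean = sum(samples) / len(samples)
    label = "moving" if mean > 0.5 else "still"
    return json.dumps({"posture": label})

def detection_device_upload(samples):
    """Client side: serialize sensor data, 'upload' it, read the result."""
    request = json.dumps({"vibration": samples})
    response = server_determine_posture(request)  # stands in for an HTTP POST
    return json.loads(response)["posture"]

print(detection_device_upload([0.1, 0.2, 0.1]))  # still
```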
- the detection device 3 may be realized by, for example, a device such as a smartphone incorporating an acceleration sensor and a vibration sensor.
- although the description assumed two vibration sensors, more may be provided. Moreover, the method of the first embodiment, which determines the posture by calculating a frequency distribution, can be realized with even a single sensor.
- a program that operates in each device is a program that controls a CPU or the like (a program that causes a computer to function) so as to realize the functions of the embodiments described above. The information handled by these devices is temporarily held in a temporary storage device (for example, RAM) during processing, stored in storage devices such as ROMs, HDDs, and SSDs, and read, corrected, and written by the CPU as needed.
- the program can be stored and distributed on a portable recording medium, or transferred to a server computer connected via a network such as the Internet.
- in that case, the storage device of the server computer is also included in the present invention.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physiology (AREA)
- Physical Education & Sports Medicine (AREA)
- Nursing (AREA)
- Cardiology (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Rheumatology (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention relates to a posture determination device comprising at least two vibration sensors and a control unit, characterized in that the control unit uses the vibration sensors to detect vibrations while the user is lying down, calculates a waveform from the detected vibrations, and recognizes features of the waveform, thereby determining the user's posture from those features. With this configuration, it is possible to provide a posture determination device and the like capable of appropriately determining the posture of the user as the user's state.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310291475.XA CN116059051A (zh) | 2017-12-07 | 2018-12-05 | 床系统 |
CN201880020409.6A CN111417343B (zh) | 2017-12-07 | 2018-12-05 | 姿势判定装置 |
US16/496,550 US11510594B2 (en) | 2017-12-07 | 2018-12-05 | Posture determination apparatus |
US17/972,012 US11813055B2 (en) | 2017-12-07 | 2022-10-24 | Posture determination apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-235410 | 2017-12-07 | ||
JP2017235410A JP7066389B2 (ja) | 2017-12-07 | 2017-12-07 | 姿勢判定装置 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/496,550 A-371-Of-International US11510594B2 (en) | 2017-12-07 | 2018-12-05 | Posture determination apparatus |
US17/972,012 Continuation US11813055B2 (en) | 2017-12-07 | 2022-10-24 | Posture determination apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019111977A1 true WO2019111977A1 (fr) | 2019-06-13 |
Family
ID=66751562
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/044804 WO2019111977A1 (fr) | 2017-12-07 | 2018-12-05 | Dispositif de détermination de posture |
Country Status (4)
Country | Link |
---|---|
US (2) | US11510594B2 (fr) |
JP (1) | JP7066389B2 (fr) |
CN (2) | CN111417343B (fr) |
WO (1) | WO2019111977A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112650157A (zh) * | 2019-10-11 | 2021-04-13 | 八乐梦床业株式会社 | 控制装置 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11071668B1 (en) * | 2018-06-04 | 2021-07-27 | Encompass Group, Llc. | Hospital bed with inflatable bladders with random inflation and related methods |
JP7598700B2 (ja) * | 2019-03-18 | 2024-12-12 | パナソニックホールディングス株式会社 | 生体情報特定システム及び方法 |
JP6893528B2 (ja) * | 2019-04-15 | 2021-06-23 | ミネベアミツミ株式会社 | 生体情報モニタリングシステム、生体情報モニタリング方法、及びベッドシステム |
US20230056977A1 (en) * | 2020-01-31 | 2023-02-23 | Karlos Ishac | Posture detection system and posture detection method |
EP3882114B1 (fr) * | 2020-03-19 | 2023-09-20 | Honda Research Institute Europe GmbH | Système d'estimation de position de conduite |
JP7127905B1 (ja) | 2021-07-14 | 2022-08-30 | 株式会社フューチャーインク | 機械学習を用いて対象者の体位を判定する装置、方法及びプログラム |
JP7659513B2 (ja) | 2022-01-31 | 2025-04-09 | パラマウントベッド株式会社 | 判定装置 |
CN114557570A (zh) * | 2022-03-09 | 2022-05-31 | 湖南晚安床垫有限公司 | 智能床垫的自适应调整方法、智能床垫及存储介质 |
WO2024128038A1 (fr) * | 2022-12-12 | 2024-06-20 | 積水化学工業株式会社 | Système de traitement d'informations, dispositif de traitement d'informations, procédé de commande, et programme |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001095858A (ja) * | 1999-03-25 | 2001-04-10 | Matsushita Seiko Co Ltd | 身体を動作させる装置 |
JP2005144042A (ja) * | 2003-11-19 | 2005-06-09 | Denso Corp | 生体情報表示装置、寝姿及び体位検出装置 |
JP2008110031A (ja) * | 2006-10-30 | 2008-05-15 | Aisin Seiki Co Ltd | 寝姿勢判定装置 |
JP2012152283A (ja) * | 2011-01-24 | 2012-08-16 | Jepico Corp | 生体信号検出装置及び離床予兆検出システム |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3543392B2 (ja) * | 1994-11-11 | 2004-07-14 | 松下電器産業株式会社 | 睡眠時呼吸情報測定装置 |
US6874181B1 (en) * | 1995-12-18 | 2005-04-05 | Kci Licensing, Inc. | Therapeutic bed |
US7306564B2 (en) * | 2003-11-26 | 2007-12-11 | Denso Corporation | Breath monitor |
JP2005237479A (ja) * | 2004-02-24 | 2005-09-08 | Paramount Bed Co Ltd | 姿勢判断装置、姿勢判断による床ずれ予防装置、姿勢判断方法及び姿勢判断による床ずれ予防方法 |
JP2005270627A (ja) * | 2004-02-25 | 2005-10-06 | Paramount Bed Co Ltd | 気道確保ベッド及び気道確保方法 |
JP2005253608A (ja) * | 2004-03-10 | 2005-09-22 | Sumitomo Osaka Cement Co Ltd | 状態解析装置 |
US7792583B2 (en) * | 2004-03-16 | 2010-09-07 | Medtronic, Inc. | Collecting posture information to evaluate therapy |
KR100712198B1 (ko) * | 2005-02-17 | 2007-04-30 | 재단법인서울대학교산학협력재단 | 무구속 무게감지 기반 수면구조 분석 장치 |
JP4964477B2 (ja) * | 2005-02-23 | 2012-06-27 | パナソニック株式会社 | 生体情報検出装置及び該方法 |
JP4566090B2 (ja) | 2005-08-09 | 2010-10-20 | パラマウントベッド株式会社 | 動作支援ベッド |
JP2007061439A (ja) * | 2005-08-31 | 2007-03-15 | Toshiba Corp | 生体情報測定装置および生体情報計測方法 |
WO2009138976A2 (fr) * | 2008-05-12 | 2009-11-19 | Earlysense Ltd | Surveillance, évaluation et traitement d’épisodes cliniques |
JP5224462B2 (ja) * | 2009-02-06 | 2013-07-03 | 公立大学法人首都大学東京 | 身体情報測定装置および身体情報測定システム |
JP2011120667A (ja) | 2009-12-09 | 2011-06-23 | Aisin Seiki Co Ltd | 在床状態検出装置 |
JP5318810B2 (ja) * | 2010-03-30 | 2013-10-16 | シャープ株式会社 | 脈波伝播速度測定装置および脈波伝播速度の測定方法および脈波伝播速度の測定プログラム |
SG10201510693UA (en) * | 2010-12-28 | 2016-01-28 | Sotera Wireless Inc | Body-worn system for continous, noninvasive measurement of cardiac output, stroke volume, cardiac power, and blood pressure |
US10159429B2 (en) | 2011-05-30 | 2018-12-25 | Koninklijke Philips N.V. | Apparatus and method for the detection of the body position while sleeping |
JP5788293B2 (ja) * | 2011-10-31 | 2015-09-30 | オムロンヘルスケア株式会社 | 睡眠評価装置および睡眠評価用プログラム |
EP2783669B2 (fr) * | 2011-11-22 | 2022-08-31 | Paramount Bed Co., Ltd. | Dispositif de lit |
JP5831826B2 (ja) * | 2012-08-18 | 2015-12-09 | 有限会社チザイ経営社 | 寝姿勢制御ベッドシステム |
JP6123450B2 (ja) * | 2013-04-16 | 2017-05-10 | 富士通株式会社 | 生体情報取得装置、方法及びプログラム |
CN106073737B (zh) * | 2013-06-03 | 2018-11-23 | 飞比特公司 | 可佩戴心率监视器 |
JP2015006210A (ja) * | 2013-06-24 | 2015-01-15 | 住友理工株式会社 | 呼吸状態検出装置 |
DE112014006133B4 (de) * | 2014-06-02 | 2018-05-24 | Sumitomo Riko Company Limited | Lagebestimmungsvorrichtung und Lagebestimmungsverfahren |
WO2016033121A1 (fr) * | 2014-08-25 | 2016-03-03 | Georgia Tech Research Corporation | Systèmes et procédés non-invasifs pour surveiller des caractéristiques de santé |
CN106793878B (zh) * | 2014-09-30 | 2018-07-06 | 深圳市大耳马科技有限公司 | 姿态和生命体征监测系统及方法 |
EP3226753A4 (fr) * | 2015-01-27 | 2017-12-20 | Apple Inc. | Système de détermination de la qualité du sommeil |
JP2016192998A (ja) | 2015-03-31 | 2016-11-17 | 株式会社デンソー | 報知装置 |
WO2017018506A1 (fr) * | 2015-07-30 | 2017-02-02 | ミネベア株式会社 | Dispositif de détection de condition physique, procédé de détection de condition physique et système de lit |
JP6763749B2 (ja) * | 2015-11-05 | 2020-09-30 | セイコーインスツル株式会社 | 移動体制御システム及び移動体 |
KR102570068B1 (ko) * | 2015-11-20 | 2023-08-23 | 삼성전자주식회사 | 동작 인식 방법, 동작 인식 장치 및 웨어러블 장치 |
JP6240237B2 (ja) * | 2016-02-26 | 2017-11-29 | パラマウントベッド株式会社 | エアセル制御装置、プログラム及びエアセル制御システム |
CN106361330B (zh) * | 2016-09-21 | 2019-02-26 | 广州视源电子科技股份有限公司 | 智能辅助睡眠中的催眠状态识别方法和系统 |
CN106730238B (zh) * | 2016-12-30 | 2020-03-10 | 清华大学 | 一种环境自适应的智能辅助睡眠装置及方法 |
2017
- 2017-12-07 JP JP2017235410A patent/JP7066389B2/ja active Active
2018
- 2018-12-05 US US16/496,550 patent/US11510594B2/en active Active
- 2018-12-05 CN CN201880020409.6A patent/CN111417343B/zh active Active
- 2018-12-05 WO PCT/JP2018/044804 patent/WO2019111977A1/fr active Application Filing
- 2018-12-05 CN CN202310291475.XA patent/CN116059051A/zh active Pending
2022
- 2022-10-24 US US17/972,012 patent/US11813055B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001095858A (ja) * | 1999-03-25 | 2001-04-10 | Matsushita Seiko Co Ltd | 身体を動作させる装置 |
JP2005144042A (ja) * | 2003-11-19 | 2005-06-09 | Denso Corp | 生体情報表示装置、寝姿及び体位検出装置 |
JP2008110031A (ja) * | 2006-10-30 | 2008-05-15 | Aisin Seiki Co Ltd | 寝姿勢判定装置 |
JP2012152283A (ja) * | 2011-01-24 | 2012-08-16 | Jepico Corp | 生体信号検出装置及び離床予兆検出システム |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112650157A (zh) * | 2019-10-11 | 2021-04-13 | 八乐梦床业株式会社 | 控制装置 |
Also Published As
Publication number | Publication date |
---|---|
CN111417343A (zh) | 2020-07-14 |
CN116059051A (zh) | 2023-05-05 |
US20230037703A1 (en) | 2023-02-09 |
US20210106256A1 (en) | 2021-04-15 |
CN111417343B (zh) | 2023-04-14 |
US11813055B2 (en) | 2023-11-14 |
JP7066389B2 (ja) | 2022-05-13 |
US11510594B2 (en) | 2022-11-29 |
JP2019098069A (ja) | 2019-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019111977A1 (fr) | Dispositif de détermination de posture | |
JP6594399B2 (ja) | 生体情報モニタリングシステム | |
KR102687254B1 (ko) | 수면 상태 측정장치 및 방법, 위상 코히어런스 산출 장치, 생체 진동신호 측정 장치, 스트레스 상태 측정 장치 및 수면 상태 측정 장치 및 심박 파형 추출 방법 | |
US11510613B2 (en) | Biological condition determining apparatus and biological condition determining method | |
Enayati et al. | Sleep posture classification using bed sensor data and neural networks | |
JP6122188B1 (ja) | 身体状況検知装置、身体状況検知方法及びベッドシステム | |
CN111916211A (zh) | 一种适用于失能老人的安全监护系统和方法 | |
Adami et al. | A method for classification of movements in bed | |
CN110740684A (zh) | 床监测系统 | |
JP2019097829A (ja) | 異常報知装置及びプログラム | |
Dutta et al. | Using Fuzzy Logic for Decision Support in Vital Signs Monitoring. | |
Adami et al. | A system for assessment of limb movements in sleep | |
JP7497497B2 (ja) | システム | |
JP7320241B2 (ja) | 情報処理装置、褥瘡リスク評価方法及び褥瘡リスク評価プログラム | |
JP2025065486A (ja) | 判定装置 | |
JP2023111448A (ja) | 判定装置及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18884989 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18884989 Country of ref document: EP Kind code of ref document: A1 |