WO2019086997A2 - Wearable biofeedback system - Google Patents
Wearable biofeedback system
- Publication number
- WO2019086997A2 (PCT/IB2018/058164)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feedback
- insole
- biofeedback system
- person
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/0024—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system for multiple sensor units attached to the patient, e.g. using a body or personal area network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1036—Measuring load distribution, e.g. podologic studies
- A61B5/1038—Measuring plantar pressure during gait
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/112—Gait analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Biofeedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6804—Garments; Clothes
- A61B5/6807—Footwear
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/70—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
Definitions
- a wearable system for automatically providing a human subject with guidance in performing guided-gait operations such as walking, music therapy, exercise, physiotherapy, athletic, sports and/or dancing training, and which may be worn for a training session and/or continually throughout the day.
- the system may require a minimum of only two parts - one for sensor interface and a second for providing feedback to the user - with other embodiments requiring additional parts for additional features.
- algorithmic motion analysis occurs in the sensor-interfacing unit, thereby reducing the activity duration of wireless communication to dramatically extend battery life.
- Said sensor-interfacing unit may be an intermediate insole layer between the shoe sole and the insole.
- a wearable system for biofeedback which can be used for physical therapy, physical rehabilitation, sports training, dance training, etc., the system comprising two or more wirelessly communicating units.
- the described system is novel in that operation requires a minimum of only two components, is independent of any user equipment or separate computing unit such as a mobile phone, and does not depend upon a remote data link for proper operation. This is advantageous for extending the operating time by reducing power consumption such that a full day of use without charging is possible, and it also allows the system to be non-obstructive, ergonomic and extremely simple to use by physically and mentally challenged subjects.
- An additional benefit over prior art may be that the subject's mobile phone is available for other uses while the system is being used by the subject.
- a first embodiment comprises sensors embedded within two "smart" insoles each performing gait analysis while the subject is walking, standing, training and/or performing other activities.
- the insoles determine whether the subject's motion, such as but not limited to leg movement and/or gait and/or stance and/or balance and/or weight distribution, is acceptable or unacceptable based on comparison to a reference motion for this subject, and a headset provides the subject with audible music and/or voice directions in order to improve the subject's motion if deemed necessary.
- Such a device could be used for subjects with neural disorders, such as people with Parkinson's disease and/or people that have suffered stroke damage and/or other neural disorder, by cueing them audibly and/or visually whenever they are walking, specifically if their gait requires improvement, for example "shuffling gait". Some subjects may also require assistance to initiate walking and/or to prevent or overcome a freezing-of-gait (FoG) episode. External cueing has been shown in the art to improve gait quality in these subjects.
- Suggested suitable audible cues for such disorders may be, but are not limited to, music, for example rhythmical music such as used in music therapy, vocal cues, spoken feedback for example as would be administered by a live physiotherapist, a vocal marching cue for example "left, right, left", a gait initiation cue such as "one... two... three... go!", a metronome tone, other tones, an alarm, a radio broadcast, a digital broadcast, a buzzer and/or any other audible means.
- the cues can be predefined and/or configured and/or recorded by a user of the system, for example the subject or a therapist. Cues may be natural and/or synthesized.
- said cueing is not initiated whenever the subject's gait quality is within acceptable limits and/or the subject is not currently walking, for example is sitting or standing.
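As an illustration of the gating logic described above (cue only while the subject is walking and only when gait quality falls outside acceptable limits), a minimal sketch follows; the feature names (cadence, step symmetry) and thresholds are assumptions for illustration, not the patented algorithm.

```python
# Illustrative sketch of the cue-gating logic described above.
# Thresholds and feature names are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class GaitSnapshot:
    is_walking: bool      # derived from step detection
    cadence_spm: float    # steps per minute
    step_symmetry: float  # 1.0 = perfectly symmetric left/right

def needs_cue(snapshot: GaitSnapshot,
              min_cadence: float = 90.0,
              min_symmetry: float = 0.85) -> bool:
    """Return True only when the subject is walking AND gait quality
    is outside the acceptable limits; otherwise no cue is issued."""
    if not snapshot.is_walking:                 # sitting or standing: never cue
        return False
    if snapshot.cadence_spm < min_cadence:      # e.g. shuffling / slow gait
        return True
    if snapshot.step_symmetry < min_symmetry:   # e.g. limping
        return True
    return False

if __name__ == "__main__":
    print(needs_cue(GaitSnapshot(True, 75.0, 0.9)))   # True  -> play cue
    print(needs_cue(GaitSnapshot(False, 0.0, 0.0)))   # False -> not walking
```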
- a similar device configuration may assist subjects in physical rehabilitation from an injury by real time voice corrections whenever required, for example if the subject is limping or distributing weight incorrectly.
- a similar device configuration could also assist in training an athlete in order to improve performance and/or reduce the risk for injury.
- a similar device configuration could also assist in training a dancer in order to improve accuracy of a stage performance.
- a biofeedback system that consists essentially of at least one insole and a sensory output device; wherein the at least one insole comprises a sensing unit, and a processing unit; wherein once the at least one insole is inserted into at least one shoe of a person: the sensing unit is configured to monitor a movement of the person; the processing unit is configured to perform a gait analysis of a movement of the person and determine when to provide feedback to the person, wherein the feedback comprises stimulus; a communication unit of the insole is configured to directly send, using short range wireless communication, to the sensory output device, at least an indication about the feedback, when it is determined to send the feedback; and wherein the sensory output device is dedicated mainly to output the feedback, wherein the feedback is a human perceivable feedback.
- Mainly dedicated may mean that most of the resources of the sensory output device are used to output the feedback. This may mean that the processing of the feedback received by the sensory output device and generating the human perceivable feedback are the sole or major functionality of the sensory output device.
- the sensory output device may be dedicated solely (or almost solely) to the processing of the feedback received by the sensory output device and generating the human perceivable feedback.
- the sensory output device may differ from a general purpose smartphone.
- the sensory output device is also referred to as OUTPUT UNIT.
- the insole when adapted to be positioned between a sole of a shoe and another insole is also referred to as INTERMEDIATE INSOLE.
- the sensing unit may also be referred to as a SENSOR UNIT.
- One or more sensors of the insole may be included in a SENSOR UNIT (or sensing unit) that may have processing capabilities.
- One or more sensors may be included in a sensing unit without processing capabilities that is also referred to as SENSOR INTERFACE UNIT.
- the at least one insole may be a single insole.
- the at least one insole may be multiple insoles.
- the communication capabilities of the sensory output device may be limited to the short range wireless communication.
- the sensory output device may be incapable of cellular communication.
- the at least one insole may be configured to short range communicate with a communication device capable of long range communication.
- the sensory output device may be configured to short range communicate with a communication device capable of long range communication.
- the biofeedback device wherein the sensory output device consists essentially of a short range communication unit and a sensory output signal generator.
- the sensing unit may include at least one capacitive proximity sensor.
- the sensing unit may include at least one pressure sensor.
- the at least one insole may include a bottom interface for interfacing with a sole of a shoe and an upper interface for interfacing with another insole.
- the at least one insole may include a wireless charging unit for wirelessly charging the at least one insole.
- the feedback may be audible.
- the feedback may be visual.
- the feedback may be tactile.
- the feedback may include vocal directions.
- the feedback may include music.
- the sensing unit may include at least one out of an accelerometer, a gyroscope and an inertial measurement unit (IMU).
- the one or more of the sensory output device and the at least one insole may include a man machine interface.
- the man machine interface may allow a person to interface with the system by input and/or output.
- the man machine interface may include one or more buttons.
- the man machine interface may include a gesture identifier.
- the man machine interface may include a voice command identifier.
- the sensory output device may be configured to short range communicate with a communication device capable of long range communication and may be configured to output audio under a control of the communication device.
- the sensory output device may be a Bluetooth headset.
- the biofeedback system may be configured independently and regardless of any communication from a third party, to (a) monitor the movement of the person, (b) perform the gait analysis of the movement of the person, (c) directly send, using short range wireless communication the at least indication about the feedback, and (d) output the feedback.
- the biofeedback system further configured to reconfigure the at least one insole.
- a biofeedback system may include a first insole, a second insole and a sensory output device; wherein each insole of the first and second insoles may include a sensing unit, and a processing unit; wherein once the first and second insoles may be inserted into first and second shoes of a person: the sensing unit of each insole may be configured to monitor a movement of the person;
- the processing unit of the first insole may be configured to perform a first gait analysis of a movement of the person; the processing unit of the second insole may be configured to perform a second gait analysis of a movement of the person; the biofeedback system may be configured to determine, based on the first gait analysis and on the second gait analysis, when to provide feedback to the person, wherein the feedback may include stimulus; wherein a communication unit of at least one of the insoles may be configured to directly send, using short range wireless communication, to the sensory output device, at least an indication about the feedback, when it may be determined to send the feedback; and wherein the sensory output device may be dedicated mainly to output the feedback, wherein the feedback may be a human perceivable feedback.
- a biofeedback system may include a first insole, a second insole and a sensory output device; wherein at least one insole of the first and second insoles may include a sensing unit, and a processing unit; wherein once the first and second insoles may be inserted into first and second shoes of a person: the sensing unit of the first insole may be configured to monitor a movement of the person to provide a first monitoring result; the sensing unit of the second insole may be configured to monitor a movement of the person to provide a second monitoring result; the processing unit of the first insole may be configured to perform a gait analysis of a movement of the person based on the first monitoring result and based on the second monitoring result; the biofeedback system may be configured to determine, based on the gait analysis, when to provide feedback to the person, wherein the feedback may include stimulus; wherein a communication unit of at least one of the insoles may be configured to directly send, using short range wireless communication, to the sensory output device, at least an indication about the feedback, when it may be determined to send the feedback; and wherein the sensory output device may be dedicated mainly to output the feedback, wherein the feedback may be a human perceivable feedback.
- a biofeedback system may include a first insole, a second insole and a sensory output device; wherein at least one insole of the first and second insoles comprises a sensing unit, and a processing unit; wherein once the first and second insoles are inserted into first and second shoes of a person: the sensing unit of each insole is configured to monitor a movement of the person; the processing unit of at least one of the first insole and the second insole is configured to perform a gait analysis of a movement of the person; wherein the biofeedback system is configured to determine, based on the gait analysis, when to provide feedback to the person, wherein the feedback comprises stimulus; wherein a communication unit of at least one of the insoles is configured to directly send, using short range wireless communication, to the sensory output device, at least an indication about the feedback, when it is determined to send the feedback; and wherein the sensory output device is a headset and is configured to output the feedback, wherein the feedback is a human perceivable feedback.
- the SENSOR UNIT - includes and/or is electrically attached to one or more sensors that electrically quantify one or more physiological parameters of a subject's body.
- the sensors are coupled to at least one subject's body part, the coupling may be mechanical, electrical, optical and/or thermal.
- the SENSOR UNIT additionally includes a digital processor and may additionally include an accompanying digital storage.
- the SENSOR UNIT may execute a computer algorithm to determine whether a physiological measurement of the subject, for example a certain movement, tremor or gait, is acceptable within some criteria or is not.
- Said SENSOR UNIT additionally includes one or more short-range wireless communication means - for example a Bluetooth, ANT+, Zigbee, NFC etc. transceiver - and one or more power sources such as a rechargeable battery.
- the sensors may be, but are not limited to, one or more of the following sensor types: a pressure sensor, a distance sensor, a capacitive sensor, a proximity sensor, a pressure sensitive switch, a strain gauge, a flexion sensor, a force sensor, an accelerometer, a gyroscope, a magnetometer, a MEMS inertial measurement unit (IMU), a location sensor (for example a GPS receiver), an angle encoder, a pulse oximetry sensor, a pulse rate sensor, a blood pressure sensor, a fluid chemistry sensor, a pulmonary rate sensor, an EEG sensor, an EKG sensor, a skin conductivity sensor, and/or a temperature sensor.
- a pressure and/or distance sensor may be implemented by way of capacitive proximity sensing. This type of sensing is mechanically robust, very thin and of relatively low cost of implementation.
- there are a plurality of said sensors of a given type which may be located at different locations within a SENSOR UNIT, for example a plurality of pressure sensors or, for example, two accelerometers: one in the posterior section and another in the anterior section of an insole.
- data from two or more said sensors are mathematically and/or electrically combined into a single data stream prior to processing by said computer algorithm.
- data from one or more said sensor is filtered, converted or otherwise modified by a sensor conditioning operation prior to processing by said computer algorithm, said sensor conditioning operation may be performed by said digital processor and/or by a separate component.
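A rough sketch of the combining and conditioning steps mentioned above: two pressure channels are summed into one stream and smoothed with a moving-average filter before the gait algorithm sees them. The channel layout and window length are assumptions, not the system's defined conditioning operation.

```python
# Sketch: combine two pressure channels into one stream and condition it
# with a moving-average filter before the gait algorithm processes it.
# Channel meaning and window size are illustrative assumptions.
from collections import deque

def combine(heel: float, forefoot: float) -> float:
    """Combine two co-located pressure readings into a single value."""
    return heel + forefoot

class MovingAverage:
    """Simple conditioning stage: running mean over the last N samples."""
    def __init__(self, window: int = 5):
        self.buf = deque(maxlen=window)

    def update(self, x: float) -> float:
        self.buf.append(x)
        return sum(self.buf) / len(self.buf)

if __name__ == "__main__":
    conditioner = MovingAverage(window=5)
    heel_samples = [0.1, 0.4, 0.9, 1.2, 1.0, 0.3]
    fore_samples = [0.0, 0.1, 0.2, 0.6, 1.1, 1.3]
    for h, f in zip(heel_samples, fore_samples):
        print(round(conditioner.update(combine(h, f)), 3))
```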
- a second of these units - hereby the OUTPUT UNIT - includes one or more sensory output means which may be (1) auditory, such as an earpiece, speaker, headphones, headset, a Bluetooth headset, a hearing aid and/or bone conduction transducer, (2) visual, such as a video display, video projection, augmented reality display, virtual reality display, Light Emitting Diode (LED), a shutter, a moving object, a smartwatch, a Heads-Up-Display (HUD), a projected laser and/or other visual means, and/or (3) tactile, such as a vibrator motor, a haptic transducer, an induced pain, a local pressure, a temperature change, a functional muscle stimulation (FMS) and/or other electromechanical transducer.
- the OUTPUT UNIT may additionally include one or more subject-operable input means such as a button, touch pad, hand gesture, head gesture and/or voice command input.
- the OUTPUT UNIT may additionally include a digital storage.
- the OUTPUT UNIT additionally includes one or more short-range wireless communication means for communicating with said SENSOR UNIT and one or more power sources - such as a rechargeable battery.
- the SENSOR UNIT determines that the status of the physiological measurement of the subject is (or is not) acceptable.
- the SENSOR UNIT communicates said status via the short-range communication means to the OUTPUT UNIT, which then provides a predefined sensory stimulus to the subject such as voice directions, music, video, blinking, haptic feedback, etc.
- the predefined sensory stimulus information is stored within the OUTPUT UNIT. This embodiment is beneficial to reduce the power required for communication of the stimulus information from the SENSOR UNIT to the OUTPUT UNIT whenever an output stimulus is determined necessary by the algorithm.
- the predefined sensory stimulus information is stored within the SENSOR UNIT and communicated to the OUTPUT UNIT via said short- range communication means whenever determined necessary by the SENSOR UNIT.
- This embodiment reduces the requirements for storage of the stimulus information within the OUTPUT UNIT and simplifies its design, for example, the OUTPUT UNIT may be an off-the-shelf Bluetooth headset.
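To illustrate why storing the stimulus in the OUTPUT UNIT and sending only an indication keeps radio activity, and therefore power, low: the SENSOR UNIT can transmit a few bytes identifying a pre-stored cue rather than streaming the stimulus itself. The message layout below is a hypothetical example, not a protocol defined by the system.

```python
# Hypothetical few-byte "feedback indication" message, illustrating why
# transmitting a cue identifier is far cheaper than streaming audio.
import struct

# Assumed layout: 1-byte message type, 1-byte cue id, 2-byte intensity.
MSG_FEEDBACK = 0x01

def encode_indication(cue_id: int, intensity: int) -> bytes:
    return struct.pack("<BBH", MSG_FEEDBACK, cue_id, intensity)

def decode_indication(payload: bytes):
    msg_type, cue_id, intensity = struct.unpack("<BBH", payload)
    return msg_type, cue_id, intensity

# OUTPUT UNIT side: cue audio is stored locally, keyed by cue id (assumed names).
STORED_CUES = {1: "metronome_100bpm.wav", 2: "gait_initiation.wav"}

if __name__ == "__main__":
    frame = encode_indication(cue_id=2, intensity=30000)
    print(len(frame), "bytes on air")           # 4 bytes instead of audio data
    _, cue, _ = decode_indication(frame)
    print("play", STORED_CUES[cue])
```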
- the subject-operable input means in the OUTPUT UNIT may trigger an operation of the sensory output means of the OUTPUT UNIT.
- Said triggered sensory output may be identical or different from the sensory output triggered by the algorithm in the SENSOR UNIT.
- the subject-operable input means in the OUTPUT UNIT may trigger a recording of the data collected by the sensors or data processed by the algorithm, wherein the data may be stored at the SENSOR UNIT or at the OUTPUT UNIT or at a separate device that is in communication with the SENSOR UNIT and/or the OUTPUT UNIT.
- the recorded data may include data that was generated prior to and/or following the triggering of the subject-operable input means.
- the subject-operable input means in the OUTPUT UNIT may add a "tag" to mark a recording of the data collected by the sensors or data processed by the algorithm, wherein the data may be stored at the SENSOR UNIT or at the OUTPUT UNIT or at a separate device that is in communication with the SENSOR UNIT and/or the OUTPUT UNIT.
- the subject-operable input means in the OUTPUT UNIT may record a time stamp, wherein the time stamp may be stored at the SENSOR UNIT or at the OUTPUT UNIT or at a separate device that is in communication with the SENSOR UNIT and/or the OUTPUT UNIT.
- the subject-operable input means in the OUTPUT UNIT may select an operating mode of the system, for example turn the system on or off.
- the SENSOR UNIT is configured to identify a predefined movement gesture input from said subject which may be used to control the system, for example lifting the big toe twice may trigger a gait initiation cue by an OUTPUT UNIT, based on pressure sensors of an insole SENSOR UNIT.
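A minimal sketch of the toe-gesture example above, assuming a sampled big-toe pressure channel: two rapid pressure drops (toe lifted twice) within a short window trigger the cue. The threshold and window are illustrative assumptions.

```python
# Sketch: detect "big toe lifted twice" from a big-toe pressure channel.
# Threshold and timing window are illustrative assumptions.
def count_toe_lifts(pressure, threshold=0.2):
    """Count falling edges where pressure drops below the threshold."""
    lifts, above = 0, True
    for p in pressure:
        if above and p < threshold:
            lifts += 1
            above = False
        elif p >= threshold:
            above = True
    return lifts

def is_double_lift_gesture(pressure_window) -> bool:
    # pressure_window covers e.g. the last ~2 seconds of samples
    return count_toe_lifts(pressure_window) >= 2

if __name__ == "__main__":
    window = [1.0, 0.1, 0.9, 0.05, 1.0]    # two dips below threshold
    print(is_double_lift_gesture(window))   # True -> trigger gait-initiation cue
```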
- said SENSOR UNITs are identical.
- said SENSOR UNITs are not identical.
- said SENSOR UNIT may further include a subsystem for wireless charging of the power source included there within, for example a resonant Qi charging receiver.
- said OUTPUT UNIT and/or said SENSOR UNIT additionally includes one or more microphones.
- the system further includes one or more additional voice pickup devices, said devices including one or more microphones.
- Said device may be for example, but not limited to, a smart watch and/or a smartphone and/or a wired microphone and/or a wireless microphone.
- audio from said one or more microphones is further processed by a voice recognition algorithm, for example within the OUTPUT UNIT, within another unit of the system, or cloud-based, which controls the system functionality using the subject's voice.
- said OUTPUT UNIT is additionally capable of providing a 2-way voice interface for performing a voice conversation between a subject and a REMOTE ENTITY, for example a phone call.
- said system, including only one or more SENSOR UNITs and an OUTPUT UNIT, is functional for its intended use in providing a sensory stimulus; however, it is NOT capable of providing phone call capability to the subject, for example it is NOT a mobile phone.
- Such unit may still be able to provide a phone call capability to the subject by communicating with a separate device through which a phone call may be performed, for example a mobile phone and/or a CONFIGURATION UNIT as per described hereinbelow.
- said SENSOR UNITs each communicate separately with the OUTPUT UNIT.
- the OUTPUT UNIT receives communication separately from each of the one or more SENSOR UNITs; the OUTPUT UNIT further includes a digital processor and applies an algorithm to said received communication to determine whether a sensory stimulus is required or is not required.
- said OUTPUT UNIT algorithm may request additional data from one or more SENSOR UNITs, for example by polling, for example if said algorithm requires said additional data to reach a conclusion.
- two or more said SENSOR UNITs communicate with one another, wherein at least one of said SENSOR UNITs is additionally in communication with said OUTPUT UNIT.
- said communication means between the SENSOR UNITs is different than the short-range communication means between the SENSOR UNIT and the OUTPUT UNIT.
- one of the SENSOR UNITs' algorithms is further tasked with determining whether a sensory stimulus is required or is not required based upon received communication from one or more other SENSOR UNITs and/or upon received communication from one or more other SENSOR INTERFACE UNITs as per described hereinbelow and/or upon data received from its own attached one or more sensor/s.
- the OUTPUT UNIT additionally includes and/or is electrically attached to one or more sensors, for example an accelerometer, that electrically quantify one or more physiological parameters of a subject's body.
- the sensors are coupled to at least one subject's body part; the coupling may be mechanical, electrical, optical and/or thermal. Information collected by said additional sensors may be processed by and/or stored at the OUTPUT UNIT.
- a predefined sensory stimulus information may be further altered and/or edited by an algorithm prior to being output to the subject by the OUTPUT UNIT, for example said alteration may include variation in amplitude, tempo, pitch and/or timbre in the case of an auditory stimulus.
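As one illustrative form of such alteration, the tempo of an auditory cue could be nudged toward a target cadence based on the subject's measured cadence; the gain and tempo limits below are assumptions, not parameters specified by the system.

```python
# Sketch: adapt the tempo of an auditory cue toward a target cadence.
# Gain and clamping limits are illustrative assumptions.
def adjust_tempo(current_bpm: float, measured_cadence: float,
                 target_cadence: float, gain: float = 0.5,
                 lo: float = 60.0, hi: float = 140.0) -> float:
    """Move the cue tempo part-way toward the cadence the subject should reach."""
    error = target_cadence - measured_cadence
    new_bpm = current_bpm + gain * error
    return max(lo, min(hi, new_bpm))

if __name__ == "__main__":
    bpm = 100.0
    for cadence in (82, 86, 92, 97):        # subject gradually speeds up
        bpm = adjust_tempo(bpm, cadence, target_cadence=105)
        print(round(bpm, 1))
```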
- an algorithm on the SENSOR UNIT and/or the OUTPUT UNIT can identify one or more of the following conditions: freezing of gait, incorrect body posture, arm swinging, stride length, stride height, stride rate, turning rate, unstable balance, weight distribution, gait variability, gait parameters, shuffling gait, a predetermined exercise, a predetermined dance routine, a subject's vital signs, a subject's posture, sleep quality, activity rate, daily walking distance, flexibility of movement, quality of a movement, quantity of movement.
- said algorithm employs machine learning techniques in whole or in part.
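As a toy illustration of one such condition, gait variability can be estimated from stride-time statistics; in a real system such features might feed a trained model as mentioned above, and the threshold used here is an assumption.

```python
# Sketch: derive stride-time features from heel-strike timestamps and flag
# high gait variability. A trained classifier could consume such features;
# the 'variability' threshold here is an illustrative assumption.
import statistics

def stride_times(heel_strike_ts):
    """Intervals between consecutive heel strikes of the same foot (seconds)."""
    return [b - a for a, b in zip(heel_strike_ts, heel_strike_ts[1:])]

def gait_variability(heel_strike_ts) -> float:
    """Coefficient of variation of stride time (stdev / mean)."""
    st = stride_times(heel_strike_ts)
    return statistics.stdev(st) / statistics.mean(st)

def flags_high_variability(heel_strike_ts, threshold: float = 0.10) -> bool:
    return gait_variability(heel_strike_ts) > threshold

if __name__ == "__main__":
    steady   = [0.0, 1.0, 2.01, 3.0, 4.02, 5.0]
    variable = [0.0, 1.0, 2.5, 3.1, 4.6, 5.2]
    print(flags_high_variability(steady))    # False
    print(flags_high_variability(variable))  # True
```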
- a subject and/or a caregiver and/or trainer and/or other entity for example a physiotherapist or doctor, may guide the subject in performing one or more physical motions correctly while the system records the sensor inputs from the correctly performed one or more motions and stores said recording in the system's storage as a reference motion.
- said subject and/or a caregiver and/or trainer and/or other entity tags a motion performed by said subject as correctly or incorrectly performed, for example by means of a CONFIGURATION UNIT as per described hereinbelow.
- said subject tags a motion performed by said subject as correctly or incorrectly performed, for example by means of said OUTPUT UNIT additional input means.
- a caregiver and/or trainer and/or other entity for example a physiotherapist or doctor, may select one or more recordings of a correctly performed one or more motions out of a database of pre-recorded motions and upload said recordings into the system's storage as a reference motion.
- an artificial intelligence algorithm may select a recording of a correctly performed one or more motions out of a database of pre- recorded motions and upload said recording into the system's storage as a reference motion.
- an artificial intelligence algorithm may generate a correctly performed motion and upload said motion into the system's storage as a reference motion.
- a subject may use the system while performing one or more motions, for example as part of a training or rehabilitation program, whereas the system algorithm compares the one or more motions performed by the subject to said reference motion stored in the system's storage. The system may then alert the subject and/or indicate to the subject whenever said performed motions significantly differ from said reference motion. The system may also advise the subject via the OUTPUT UNIT on corrective action required to correct said one or more motions in order to better compare to said reference motion.
- said comparison between the motions performed by the subject and said reference motion may be performed by an algorithm employing machine learning techniques in whole or in part. Said algorithm may automatically deduce the tolerances of whether the comparison is considered similar or whether it is considered dissimilar. Such tolerances may be based on a plurality of reference motions and/or a plurality of motions performed by the subject and/or past performance.
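One plausible way to compare a performed motion against a stored reference, allowing the two recordings to differ in speed, is dynamic time warping; the similarity threshold below is an assumption standing in for the learned tolerances described above.

```python
# Sketch: compare a performed motion to a stored reference motion using
# dynamic time warping (DTW) on 1-D sensor traces. The similarity threshold
# is an assumption; the text above leaves tolerances to a learned model.
import math

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference cost."""
    n, m = len(a), len(b)
    d = [[math.inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def is_similar(performed, reference, threshold=2.0) -> bool:
    return dtw_distance(performed, reference) <= threshold

if __name__ == "__main__":
    reference = [0, 1, 2, 3, 2, 1, 0]
    slower    = [0, 0, 1, 2, 2, 3, 2, 1, 1, 0]   # same shape, different speed
    wrong     = [0, 3, 0, 3, 0, 3, 0]
    print(is_similar(slower, reference))  # True
    print(is_similar(wrong, reference))   # False
```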
- a subject and/or a caregiver and/or trainer and/or other entity may select a plurality of reference motions and/or an accompanying sequence to define a sequential training program.
- the subject may subsequently use the system to perform said training program.
- the system may guide the subject through the steps of said training program and/or advise the subject via the OUTPUT UNIT on corrective action required to correct the currently performed one or more motions in order to better compare to the reference motion in the current and/or a previous training program step.
- the system produces a report of the quality of performance of the subject while performing said training program.
- This report may be of interest to the subject and/or to another entity for example a physiotherapist or a trainer and may be communicated therewith through the OUTPUT UNIT and/or through another means such as a long-range or short-range wireless communication means.
- said training program trains the user in one or more of the following activities: injury rehabilitation, strengthening, posture improvement, movement training, balance training, relaxation, stretching, disease rehabilitation, overcoming disability, music therapy, dance training, athletic training, sports training, walking, running, swimming, cycling, spinning, golf, tennis, football, basketball, baseball, fencing, boxing, gymnastics, weightlifting, gym training, TRX etc.
- machine learning techniques may be used to determine which of a plurality of said reference motions is currently being performed.
- machine learning techniques may be used to determine the criteria of requirement for said corrective action guidance.
- machine learning techniques may be used to determine and/or report a grading of the quality of performance of said training program by a subject.
- said plurality of OUTPUT UNITs are identical to one another.
- said plurality of OUTPUT UNITs are not identical to one another, for example one OUTPUT UNIT may be a headset providing audible feedback and a second OUTPUT UNIT may be a smartwatch providing a visual status display.
- a second OUTPUT UNIT functionality is emulated by a smartphone application.
- one or more third SENSOR INTERFACE UNITs include and/or are electrically attached to one or more sensors that electrically quantify one or more physiological parameters of a subject's body.
- Said SENSOR INTERFACE UNIT additionally includes a digital processor and one or more short-range wireless communication means - for example a Bluetooth, ANT+, Zigbee, NFC etc. transceiver - and one or more power sources such as a rechargeable battery.
- the SENSOR INTERFACE UNIT communicates data collected from the sensors connected thereto to said SENSOR UNIT via the short-range wireless communication means.
- data from two or more said sensors within said SENSOR INTERFACE UNIT are mathematically and/or electrically combined into a single data stream and communicated as such to said SENSOR UNIT via said short- range wireless communication means.
- data from said SENSOR INTERFACE UNIT as received by said SENSOR UNIT is mathematically and/or electrically combined by said SENSOR UNIT with data from said SENSOR UNIT'S attached sensors, prior to processing by said SENSOR UNIT'S algorithm.
- data from said SENSOR INTERFACE UNIT sensors is filtered, converted or otherwise modified by a sensor conditioning operation within said SENSOR INTERFACE UNIT prior to being communicated to said SENSOR UNIT.
- additional sensors may be incorporated into one or more of the SENSOR UNIT and/or the OUTPUT UNIT and/or the SENSOR INTERFACE UNIT.
- Said sensors may or may not assist the primary function of the system; these may be, but are not limited to: a location sensor (for example a GPS receiver), an angle encoder, a pulse oximetry sensor, a pulse rate sensor, a blood pressure sensor, a pulmonary rate sensor, a gas concentration sensor, a gas flow rate sensor, an EEG sensor, an EKG sensor, a skin conductivity sensor, and/or a temperature sensor.
- one or more sensors and/or the SENSOR UNIT and/or the OUTPUT UNIT and/or the SENSOR INTERFACE UNIT may be embedded into and/or attached to an article of clothing such as, but not limited to: a cap, eyeglasses frame, shirt, gloves, trousers, a belt, underwear, socks, insoles, shoes, a headband, a neckband, an armband, a leg band, a neck pendant and/or another article of clothing.
- a sensor and/or the SENSOR UNIT and/or the OUTPUT UNIT and/or the SENSOR INTERFACE UNIT may be mechanically, electrically, optically and/or thermally coupled to one or more bodily parts of a human subject such as, but not limited to: a limb, a digit, a joint, the torso, the back, the chest, the belly, the waist, the neck, the buttocks, the shoulders, the head.
- the short-range communication means between the SENSOR INTERFACE UNIT and the SENSOR UNIT is different than the short- range communication means between the SENSOR UNIT and the OUTPUT UNIT.
- one or more sensors and/or said SENSOR UNIT and/or said SENSOR INTERFACE UNIT as described hereinabove are further designed to function as an INTERMEDIATE INSOLE to fit in between a shoe sole and a shoe insole, wherein for example a shoe insole is removed from the shoe, said INTERMEDIATE INSOLE is inserted into the shoe and a shoe insole is inserted on top of said INTERMEDIATE INSOLE.
- said INTERMEDIATE INSOLE includes at least a flexible portion and a rigid portion. Sensors may or may not be embedded within said flexible portion and/or within said rigid portion.
- said flexible portion is located at the anterior side of said INTERMEDIATE INSOLE whereas said rigid portion is located at the posterior side of said INTERMEDIATE INSOLE.
- said flexible portion is located on the top side of said INTERMEDIATE INSOLE and the rigid part is located on the bottom side of said INTERMEDIATE INSOLE.
- the entire INTERMEDIATE INSOLE is flexible.
- In one embodiment, said INTERMEDIATE INSOLE is fully or partially of a flat shape wherein at least a portion of the area of said INTERMEDIATE INSOLE is of considerably constant thickness, for example less than 5 mm of thickness difference.
- said INTERMEDIATE INSOLE is fully or partially of a wedge shape wherein the bottom surface of the INTERMEDIATE INSOLE is considerably flat and at least a portion of the upper surface is considerably flat and slanted in respect to said bottom surface.
- a posterior portion may be of a wedge shape and an anterior portion may be of considerably constant thickness, or, for example, the whole upper surface may be flat and slanted such that the anterior side is thinner than the posterior side.
- said INTERMEDIATE INSOLE is orthopedically and/or ergonomically shaped, for example an orthosis and/or for comfort, and the insole applied over the INTERMEDIATE INSOLE is considerably of a constant thickness.
- said INTERMEDIATE INSOLE's mechanical structure is 3D-printed, for example customized for the subject.
- said INTERMEDIATE INSOLE additionally includes one or more interface layers applied to one or more of its surfaces to increase friction and/or adhesiveness between said INTERMEDIATE INSOLE and a part of the shoe and/or between said INTERMEDIATE INSOLE and said insole covering it.
- said INTERMEDIATE INSOLE may further include a subsystem for wireless charging of the power source embedded therewithin, for example a resonant Qi charging receiver.
- An antenna coil required for said wireless charging may be embedded within any part of said INTERMEDIATE INSOLE for example in a flexible portion and/or in a rigid portion.
- charging of said INTERMEDIATE INSOLE's power source, for example its battery, may be performed by bringing it into proximity with a wireless power transmitter, for example a Qi charging pad or a Pi charging antenna.
- said proximity to a power transmitter does NOT require removal of said INTERMEDIATE INSOLE from the shoe but does require proximity of the shoe to a power transmitter with said INTERMEDIATE INSOLE therewithin.
- At least a portion of said INTERMEDIATE INSOLE may be trimmed, for example to the shape and size of the subject's shoe.
- said INTERMEDIATE INSOLE includes one or more pressure sensors and/or distance sensors.
- said pressure sensors and/or distance sensors and/or other sensors for example accelerometers are distributed at two or more areas of said INTERMEDIATE INSOLE, for example the heel, the big toe, the lateral side and/or the anterior side of the foot.
- there is a single said pressure sensor and/or distance sensor which may or may not cover most of the area of said INTERMEDIATE INSOLE.
- said distance sensor is a capacitive sensor which may convert the proximity between a subject's foot and said capacitive sensor into electrical capacitance which may be measured by said algorithm, for example utilizing a capacitance-to-digital converter.
- said pressure sensor is a capacitive sensor which may convert local mechanical pressure to local proximity by means of a mechanically compliant material, the thickness of said material being dependent upon local mechanical pressure, and successively convert said proximity into electrical capacitance which may be measured by said algorithm, for example utilizing a capacitance-to-digital converter.
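The chain described above can be illustrated with the parallel-plate relation C = ε0·εr·A/d: pressure compresses the compliant layer, the gap d shrinks, and the measured capacitance rises. The spring model and material constants in this sketch are assumptions for illustration only.

```python
# Sketch of the capacitive pressure-sensing chain described above:
# pressure compresses a compliant layer, reducing the gap d, which raises
# the parallel-plate capacitance C = eps0*eps_r*A/d. The linear-spring model
# and all material constants are illustrative assumptions.
EPS0 = 8.854e-12        # F/m, vacuum permittivity

def gap_from_capacitance(c_farad, area_m2, eps_r=3.0):
    """Invert the parallel-plate relation C = eps0*eps_r*A/d."""
    return EPS0 * eps_r * area_m2 / c_farad

def pressure_from_gap(gap_m, rest_gap_m=2.0e-3, stiffness_pa_per_m=2.0e8):
    """Linear-spring model: pressure proportional to compression of the layer."""
    compression = max(0.0, rest_gap_m - gap_m)
    return stiffness_pa_per_m * compression   # pascals

if __name__ == "__main__":
    area = 4e-4                                  # 2 cm x 2 cm plate
    for c in (5e-12, 8e-12, 12e-12):             # capacitance rises under load
        d = gap_from_capacitance(c, area)
        print(round(d * 1e3, 3), "mm gap ->",
              round(pressure_from_gap(d) / 1e3, 1), "kPa")
```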
- said mechanically compliant material may be a portion of said INTERMEDIATE INSOLE, for example said flexible portion, and/or may be an insole placed thereupon and/or may be an article of clothing of the subject, for example a sock.
- said capacitive sensor includes one or more conductive plates applied to an insulating substrate, wherein the conductive material is one or more of: a metal, for example copper or aluminum; an area of a printed circuit board, for example a flexible polyimide PCB; a metal foil, for example applied using hot foil stamping; and/or a conductive ink, for example carbon- or silver-based ink.
- said pressure sensor is a conductive-particle-infused polymer substrate, the electrical resistance of which is dependent on mechanical pressure applied thereupon, for example Velostat material and/or a strain gauge.
- said pressure sensor is a pressure-sensitive conductive paint.
- said pressure sensor is a piezoelectric material and/or a piezoresistive material, for example as produced by TE Connectivity.
- said SENSOR UNIT may additionally include one or more pressure-sensitive electrical switches which are operated by a mechanical pressure that a subject's foot applies thereupon.
- said one or more switches are used as sensors to provide input to said algorithm.
- a plurality of switches is used wherein the location of switches is such that weight distribution across the area of the foot can be estimated by an algorithm.
- two or more switches of different sensitivities may be located in proximity such that the magnitude of pressure applied to said two or more switches is considerably identical and wherein the magnitude of said pressure may be estimated by an algorithm based on which of said switches are closed and/or which of said switches are open.
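The arrangement above behaves like a coarse quantizer: each switch closes at a different actuation force, so the set of closed switches bounds the applied load. A sketch follows, with hypothetical actuation thresholds.

```python
# Sketch: estimate pressure magnitude from co-located switches that close at
# different actuation forces (thermometer-code style). Thresholds in newtons
# are hypothetical values.
THRESHOLDS_N = [5.0, 20.0, 50.0]     # switch i closes above THRESHOLDS_N[i]

def estimate_force_range(closed):
    """closed[i] is True if switch i is closed. Returns (low, high) bounds in N."""
    highest_closed = -1
    for i, state in enumerate(closed):
        if state:
            highest_closed = i
    low = THRESHOLDS_N[highest_closed] if highest_closed >= 0 else 0.0
    high = (THRESHOLDS_N[highest_closed + 1]
            if highest_closed + 1 < len(THRESHOLDS_N) else float("inf"))
    return low, high

if __name__ == "__main__":
    print(estimate_force_range([False, False, False]))  # (0.0, 5.0)
    print(estimate_force_range([True, False, False]))   # (5.0, 20.0)
    print(estimate_force_range([True, True, True]))     # (50.0, inf)
```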
- said one or more switches provide an indication of the presence of a subject's foot upon said SENSOR UNIT.
- said one or more switches may be used to power said SENSOR UNIT on and/or off.
- said one or more switches is a membrane switch consisting of two opposing conductors separated by an air gap, wherein application of pressure to said membrane switch may cause the conductors to contact and close an electrical circuit.
- one or more parts of the INTERMEDIATE INSOLE are protected from mechanical wear and/or damage by a lamination of one or more protective layers between said part and the exterior environment, for example a part of the shoe, an insole or the foot of a subject.
- one or more parts of the INTERMEDIATE INSOLE are protected from mechanical wear and/or damage by being embedded within the body of the INTERMEDIATE INSOLE, for example within a compartment and/or by overmolding the body material during manufacturing.
- one or more SENSOR UNIT and/or the OUTPUT UNIT may additionally include a long-range communication means such as but not limited to, a mobile network connection, a WiFi network connection, a wired network connection, a modem and/or an internet connection.
- Said long-range communication means may be utilized to transfer data from one or more of the sensors and/or data generated by the algorithm to a REMOTE ENTITY.
- the system may additionally include a mobile phone and/or other communication device including a short-range communication means and a long-range communication means configured as to communicate data with said SENSOR UNIT and/or said OUTPUT UNIT using said short-range communication means and communicate data with said REMOTE ENTITY using said long-range communication means, for example operating as a communication relay.
- said mobile phone and/or other communication device additionally modifies and/or otherwise processes the data passing between said short-range communication means and said long-range communication means.
- the REMOTE ENTITY is a computer server wherein said server may store and/or analyze the data transferred thereto via said long-range communication means.
- said REMOTE ENTITY is a mobile device, for example a mobile phone, wherein said mobile device may store and/or analyze the data transferred thereto via said long-range communication means or via said short-range communication means.
- said REMOTE ENTITY is a human operator, for example a physiotherapist or a trainer, wherein said human may receive data from the system corresponding to a motion currently being performed and/or a motion previously performed by the subject.
- the data transferred from the SENSOR UNIT and/or the OUTPUT UNIT to the REMOTE ENTITY constitutes data collected from the majority of the time the system has been in use by the subject.
- the data transferred from the SENSOR UNIT and/or the OUTPUT UNIT to the REMOTE ENTITY constitutes data collected at certain specific intervals that are determined to be of importance by the system, for example whenever the algorithm of the SENSOR UNIT so determines or, for example, whenever the subject triggers said input means of the OUTPUT UNIT.
- the data transferred from the SENSOR UNIT and/or the OUTPUT UNIT to the REMOTE ENTITY is collected over a relatively long period of time, stored locally in the SENSOR UNIT and transferred to the REMOTE ENTITY over a relatively short period of time from the storage means, for example collected over days or hours and transferred during minutes or seconds.
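- For illustration only, a minimal sketch of the store-and-forward behavior described above: samples are buffered locally over a long period and uploaded in a short burst when a long-range link is available; the transport callable and buffer size are assumptions, not an interface of the disclosed units.

```python
# Illustrative sketch only: collect samples locally and upload them in a
# short burst when a long-range link becomes available.  The `send`
# callable is a placeholder assumption, not an API of the disclosed system.
import json
import time
from collections import deque


class StoreAndForward:
    def __init__(self, max_samples=100_000):
        self.buffer = deque(maxlen=max_samples)  # local storage means

    def record(self, sample):
        self.buffer.append({"t": time.time(), "v": sample})

    def flush(self, send):
        """Drain the buffer through `send`, a callable that pushes one
        JSON-encoded batch over the long-range link."""
        if not self.buffer:
            return 0
        batch = list(self.buffer)
        send(json.dumps(batch))
        self.buffer.clear()
        return len(batch)


if __name__ == "__main__":
    store = StoreAndForward()
    for i in range(5):                     # data collected over a long period
        store.record(i * 0.1)
    sent = store.flush(lambda payload: print(f"uploading {len(payload)} bytes"))
    print(f"{sent} samples transferred in one burst")
```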
- the data transferred from the SENSOR UNIT and/or the OUTPUT UNIT to the REMOTE ENTITY is continually transferred whenever the system is collecting data from the subject and said long-range communication means is in operation.
- the REMOTE ENTITY transmits back to the SENSOR UNIT and/or to the OUTPUT UNIT data resulting from analysis performed by algorithms running on said REMOTE ENTITY.
- the REMOTE ENTITY transmits back to the SENSOR UNIT and/or to the OUTPUT UNIT data for configuration of the algorithms used by the SENSOR UNIT and/or the OUTPUT UNIT, including for example a firmware update.
- the REMOTE ENTITY transmits back to the SENSOR UNIT or directly to the OUTPUT UNIT data for the sensory stimulus.
- the REMOTE ENTITY transmits back to the SENSOR UNIT or directly to the OUTPUT UNIT information for guidance of the subject, for example instructions on how to improve the subject's motion.
- the REMOTE ENTITY communicates to and/or with the subject through an audiovisual means which is not the OUTPUT UNIT such as, but not limited to, a computer monitor, a tablet computer, a smartphone, a television set, a video display, a video projection, an augmented reality display, a virtual reality display, an audio system and/or a virtual assistant.
- the REMOTE ENTITY generates a report based on received data which may be accessed by the OUTPUT UNIT and/or the CONFIGURATION UNIT described hereinbelow and/or via a separate means such as, but not limited to, a mobile phone or a computer terminal.
- data collected and generated by the REMOTE ENTITY may be accessed by, or provided to, a separate entity which is not the subject using the system, for example a caregiver.
- data collected and generated by the REMOTE ENTITY may be used to develop the algorithms used by the REMOTE ENTITY and/or the algorithms used by the SENSOR UNIT.
- the REMOTE ENTITY is a human and said long-range communication network provides a two-way voice link between the subject and the REMOTE ENTITY, for example a voice call.
- the REMOTE ENTITY is an artificial intelligence virtual assistant and said long-range communication network provides a two-way voice link between the subject and the REMOTE ENTITY, for example a voice call.
- the REMOTE ENTITY is an assistance provider, for example a medical professional at an emergency helpline, and said long-range communication network provides a two-way voice link between the subject and the REMOTE ENTITY, for example a voice call.
- a fourth CONFIGURATION UNIT may utilize either said short-range communication means and/or said long-range communication means and/or a different means to configure the algorithm in the SENSOR UNIT and/or the sensory stimuli provided by the OUTPUT UNIT.
- the CONFIGURATION UNIT is a mobile device running an application ("app").
- the CONFIGURATION UNIT is a computer terminal.
- the CONFIGURATION UNIT is an internet web page.
- the CONFIGURATION UNIT receives said configuration of the algorithm in the SENSOR UNIT and/or said sensory stimuli provided by the OUTPUT UNIT from the REMOTE ENTITY.
- the CONFIGURATION UNIT is also the REMOTE ENTITY.
- the CONFIGURATION UNIT is also the OUTPUT UNIT.
- the above unit definitions, including the INTERMEDIATE INSOLE, are not mutually exclusive and, in certain embodiments, a single system unit may have the functionality of two or more of said attributes.
- short-range communication refers to communication between units of the system that are carried with the subject or within the subject's vicinity and are a part of the wearable and/or mobile system.
- long-range communication refers to communication between the system and a REMOTE ENTITY irrespective of the actual distance to said REMOTE ENTITY.
- the type of communication used by "short-range communication means" and by "long-range communication means" may or may not be the same type, for example WiFi, and may or may not both utilize a single transceiver and/or a single antenna.
- insole that may be less flexible than insole 242
- output unit (such as output sensory unit)
- Figure 2 depicts a minimal embodiment.
- a subject 10 is wearing a single "SENSOR UNIT” (insole 24) in one of his shoes and a short-range wireless audio device, for example a Bluetooth earpiece 26, is an "OUTPUT UNIT".
- Short range transmission denoted 14.
- Sensors in the SENSOR UNIT for example pressure sensors and/or MEMS inertial sensors and/or additional sensors, convert body motion to an electrical signal and a processor embedded therein performs analysis on said motion, for example gait analysis, and further communicates with the OUTPUT UNIT over a short-range wireless link such as Bluetooth.
- Figure 1 depicts an embodiment.
- a subject 10 is wearing smart insoles 24 "SENSOR UNITs" in his shoes and a headset consisting of audible earphones 26 and visual indicators mounted on an eyeglasses frame 27 is an "OUTPUT UNIT".
- Sensors 25 in the SENSOR UNITs convert body motion to an electrical signal and a processor embedded therein performs gait analysis and further communicates with the OUTPUT UNIT over a short-range wireless link such as Bluetooth.
- this embodiment includes a smartwatch 22 OUTPUT UNIT which may also serve as a "CONFIGURATION UNIT” or utilize its internal sensors to perform additional SENSOR UNIT and/or SENSOR INTERFACE UNIT tasks.
- the mobile phone 21 is also shown to provide a long-range communication means between the system and a REMOTE ENTITY for example over a cellular network.
- the long-range communication link may provide bidirectional voice and data transfer. Long range communication denoted 15. Short range communication denoted 14.
- a first REMOTE ENTITY shown is a computing "cloud" 13 which may collect data from the system, for example to perform long-term data analysis and/or for example to produce a report, and may transfer data to the system, for example change a configuration setting in the SENSOR UNIT's algorithm.
- a second REMOTE ENTITY shown is a caregiver 12, for example a physiotherapist, who may receive data from the system and/or from said first REMOTE ENTITY and may also communicate with the subject, for example through a voice call.
- Figure 3 depicts a block diagram of the internal components constituting a minimal embodiment system as described in figure 2 hereinabove.
- a SENSOR UNIT 30 includes one or more sensors 35 connected to a digital processor 34, for example a microcontroller unit (MCU). Said digital processor is connected to a digital storage means 33, for example RAM, ROM, EEPROM or Flash memory or other digital storage means, the storage means storing an algorithm to be performed by the processor and data for a sensory stimulus to be output to the subject.
- the SENSOR UNIT additionally includes a short-range wireless communication means and an accompanying antenna, for example a Bluetooth transceiver, and a power source such as a battery 31.
- An OUTPUT UNIT 40 depicted as a Bluetooth headset includes a short-range wireless communication means 42 and accompanying antenna 45, for example a Bluetooth transceiver, a microphone 43 (which may be optional in this embodiment), a speaker 44, for example an earpiece, and a power source, for example a battery 41.
- Figure 4 depicts the same SENSOR UNIT and OUTPUT UNIT as per fig. 3 but additionally includes a mobile phone 21 which may act as a communication relay.
- the mobile phone may communicate with the SENSOR UNIT using a short-range wireless link, for example to receive data from the SENSOR UNIT or for example to change a configuration of the SENSOR UNIT, and/or it may communicate with the OUTPUT UNIT, for example to provide a voice call with a REMOTE ENTITY.
- the mobile phone may additionally communicate using a long-range wireless link, for example a cellular network and/or a WiFi network, to one or more remote entities for example a "cloud" server and/or a caregiver.
- the mobile phone may relay data between a REMOTE ENTITY and a local system unit by transferring said data between the long-range communication link and the short-range communication link.
- the mobile phone may or may not modify said data, for example by further algorithmic processing and/or data compression etc.
- Figure 5 depicts additional features in the SENSOR UNIT and in the OUTPUT UNIT in addition to those described hereinabove.
- the SENSOR UNIT depicted has more than one sensor (sensors 351, 352, 353) and the sensor signals are pre-processed by a sensor conditioning block 39, for example to filter the data and/or convert from an analog signal to a sampled digital signal that is compatible with a digital processor 34.
- the storage 33 now additionally provides a means to record data while the system is in use for later analysis, for example sensor raw data and/or algorithmic output and/or tagging of events.
- the storage also additionally stores one or more reference motions to which an algorithm may compare a subject motion, in order to provide an indication of the quality of said subject motion in comparison to said reference motion.
- the SENSOR UNIT additionally includes a wireless power transfer means such as an inductive charging coil 38 and accompanying battery charging circuit 37 for charging battery 31.
- the OUTPUT UNIT further depicts more than one stimulus means, which may be audible, for example a speaker; visual, for example an LED indicator; and/or tactile, for example a vibrator motor.
- the OUTPUT UNIT may additionally include one or more sensors, for example an inertial measurement unit (IMU), which may facilitate the system algorithms and/or provide a user input means, such as to detect a gesture.
- the OUTPUT UNIT 40 may additionally include a digital processor 44 and accompanying storage means 48, wherein said storage means may store a sensory stimulus to be output through one or more of the stimulus means whenever the digital processor deems necessary, for example when an indication is received from a SENSOR UNIT and/or when an input means of the OUTPUT UNIT is triggered.
- the OUTPUT UNIT may include additional input means to operate the system such as a button 494 and/or a voice recognition input 47.
- the OUTPUT UNIT additionally includes a long-range communication means 42', for example such that the mobile phone as depicted in figure 4 is not necessarily required for communication with said REMOTE ENTITY.
- Tactile unit 491, LED 492 and speaker 493 are also shown.
- Figure 6 depicts a block diagram of a simplified SENSOR UNIT and a simplified OUTPUT UNIT 40' (including communication means, antenna 45 and speaker 44) wherein the SENSOR UNIT 30' further communicates with a SENSOR INTERFACE UNIT 30" in particular to receive data therefrom.
- the SENSOR INTERFACE UNIT 30" includes a plurality of sensors 351, 352 and 353, sensor conditioning 39, a digital processor 34, a short-range communication means 32 (antenna 36) and a power source as described above in figures 3/4/5; however, by definition, a SENSOR INTERFACE UNIT does not use an algorithm to process the sensor data and/or reach a conclusion based on said data, but rather only transfers said data, by way of a short-range communication means, to a SENSOR UNIT 30' (illustrated as including sensors 35, processor 34, antenna 36 and a communication means (not shown)) for further processing.
- FIG. 7 depicts a simplified block diagram with two SENSOR UNITs 30' and an OUTPUT UNIT 40'.
- each sensor unit performs partial processing on its connected sensors, for example quantifies the quality of measured motion by its connected sensors.
- Both SENSOR UNITs communicate with an OUTPUT UNIT which further includes an algorithm to process the data received from said two SENSOR UNITs and provide a sensory stimulus to a subject whenever determined necessary by said algorithm.
- the OUTPUT UNIT may have additional capability to poll and/or request data from a SENSOR UNIT; for example, in one scenario a first SENSOR UNIT determines that a sensory stimulus might be required, the OUTPUT UNIT receives a status from said first SENSOR UNIT and proceeds to request and receive additional data from a second SENSOR UNIT, and the OUTPUT UNIT then uses an algorithm to decide whether or not a sensory stimulus is required based upon data from both SENSOR UNITs.
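- For illustration only, a minimal sketch of the polling scenario described in the preceding paragraph; the status fields, scores and threshold are assumptions rather than the disclosed algorithm.

```python
# Illustrative sketch only of the polling scenario described above: a first
# SENSOR UNIT reports that a stimulus *might* be needed, the OUTPUT UNIT then
# requests data from a second SENSOR UNIT and decides using both readings.
# The unit interfaces are assumptions for illustration.

def decide_stimulus(first_status, poll_second_unit, threshold=0.5):
    """first_status: dict with a 'maybe_stimulus' flag and a 'score' in [0, 1].
    poll_second_unit: callable returning a comparable score from unit 2."""
    if not first_status.get("maybe_stimulus", False):
        return False                      # nothing suspicious, no polling needed
    second_score = poll_second_unit()     # request/receive additional data
    combined = 0.5 * (first_status["score"] + second_score)
    return combined < threshold           # low combined quality -> cue the subject


if __name__ == "__main__":
    status = {"maybe_stimulus": True, "score": 0.4}
    print(decide_stimulus(status, poll_second_unit=lambda: 0.3))  # True -> stimulate
```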
- Figure 8 depicts a system with multiple components, including two output units: one unit may provide an audible stimulus (using speaker 44) and the second may provide a visual stimulus (using smart watch 22). Both units communicate with a central SENSOR UNIT 30' using a first short-range communication link.
- the central SENSOR UNIT additionally communicates with a second SENSOR UNIT using a second short-range communication link.
- the three said short-range communication links may be of the same or of different communication types and may share communication means or have dedicated communication means for each link (shown).
- the central SENSOR UNIT also includes a long-range communication means (not shown) which may communicate with one or more REMOTE ENTITIES such as a caregiver 12 and/or a cloud server 13.
- a cloud server may update firmware and/or parameters and/or stimuli for the system.
- a human REMOTE ENTITY may receive sensor data and/or voice communication through said long-range communication means and/or may receive additional data from said cloud server, for example in the form of a report.
- Figure 9 depicts a method of training a subject using the system.
- the method consists of two sessions: a system programming session and a subject training session.
- a subject may perform one or more training sessions without re-programming the system.
- the first step in system programming is a subject performing a motion 61, either once or repeatedly, while correct motions are tagged 62 as such either by the subject and/or by a trainer. Said tagged motions are recorded and/or analyzed by the system while untagged motions may be either discarded or recorded and/or analyzed as bad examples.
- the second step in system programming is generation 63 of a reference motion from said tagged recordings, for example by a machine learning algorithm.
- Said reference motion may include spatial and/or temporal information including an acceptable tolerance limit. Steps 1 and 2 may be repeated multiple times and/or generate multiple reference motions.
- a trainer and/or the subject define 64 a training program comprising one or more reference motions 65. This training program is stored in the system for later use.
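- For illustration only, a minimal sketch of generating a reference motion with a tolerance band from correctly tagged recordings; simple point-wise averaging is used here as a stand-in for the machine learning generation step mentioned above.

```python
# Illustrative sketch only: build a reference motion by point-wise averaging
# of correctly tagged recordings, with a tolerance band from their spread.
# This simple averaging is a stand-in for the machine-learning generation
# step mentioned above, not the disclosed algorithm.
import numpy as np


def generate_reference(tagged_recordings, n_points=100):
    """tagged_recordings: list of 1-D arrays (one motion repetition each).
    Returns (mean_curve, tolerance) resampled to a common length."""
    resampled = []
    for rec in tagged_recordings:
        x_old = np.linspace(0.0, 1.0, len(rec))
        x_new = np.linspace(0.0, 1.0, n_points)
        resampled.append(np.interp(x_new, x_old, rec))
    stack = np.vstack(resampled)
    mean_curve = stack.mean(axis=0)
    tolerance = 2.0 * stack.std(axis=0)   # acceptable deviation per point
    return mean_curve, tolerance


if __name__ == "__main__":
    reps = [np.sin(np.linspace(0, np.pi, n)) for n in (90, 100, 110)]
    ref, tol = generate_reference(reps)
    print(ref.shape, float(tol.max()))
```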
- the subject may perform multiple motions 72 that are intended to match the multiple recorded reference motions 71.
- an algorithm, for example a machine learning algorithm, compares 73 said subject performed motions to said recorded reference motions of the training program. It may be possible for the algorithm to automatically identify which reference motion the subject motion is attempting to emulate and/or the beginning and end of said motion.
- in a third step of a subject training session the system provides feedback 75 to a subject and/or to a different entity, for example a trainer, based on the similarity between the subject performed motions and the reference motions as determined in the second step.
- Feedback may be of any kind, for example sensory, vocal, written 74 etc., and may be given during the performance of the motions and/or at the end of a subject training session.
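- For illustration only, a minimal sketch of the training-session comparison and feedback steps; the similarity metric and feedback wording are assumptions.

```python
# Illustrative sketch only: score a performed motion against a stored
# reference (from the programming session) and emit simple feedback.
# The scoring metric and feedback text are illustrative assumptions.
import numpy as np


def score_motion(performed, reference, tolerance):
    """Fraction of samples that stay inside the reference tolerance band."""
    x_old = np.linspace(0.0, 1.0, len(performed))
    x_new = np.linspace(0.0, 1.0, len(reference))
    resampled = np.interp(x_new, x_old, performed)
    inside = np.abs(resampled - reference) <= tolerance
    return float(inside.mean())


def feedback(score, good_threshold=0.8):
    return "good repetition" if score >= good_threshold else "adjust your movement"


if __name__ == "__main__":
    ref = np.sin(np.linspace(0, np.pi, 100))
    tol = np.full(100, 0.2)
    # a slightly noisy repetition of a different length
    attempt = np.sin(np.linspace(0, np.pi, 120)) + 0.05 * np.random.randn(120)
    s = score_motion(attempt, ref, tol)
    print(f"similarity {s:.2f}: {feedback(s)}")
```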
- Figures 10, 11, 12 depict embodiments of the INTERMEDIATE INSOLE.
- the INTERMEDIATE INSOLE 24 mechanical design is shown as installed within a shoe 23 and covered by a shoe insole 1 such that the subject does not come in direct contact with the INTERMEDIATE INSOLE.
- the INTERMEDIATE INSOLE 24 is supported by the sole 92 of the shoe.
- the INTERMEDIATE INSOLE is considerably flat.
- the INTERMEDIATE INSOLE is of a wedge shape and is generally as long as the inner length of the shoe
- the INTERMEDIATE INSOLE is of a wedge shape and is considerably shorter than the inner length of the shoe.
- the INTERMEDIATE INSOLE is partially of a wedge shape and partially generally flat.
- the flat portion of said INTERMEDIATE INSOLE may be exceptionally thin, for example less than 3mm thick, while bulky and/or sensitive components such as integrated circuits and/or a battery may be embedded within the wedge portion.
- an insole SENSOR UNIT is fabricated in an orthopedic shape, for example as a custom orthosis or for enhanced comfort, for example may be 3D printed, CNCed, molded or otherwise.
- said orthopedic SENSOR UNIT is covered by a considerably flat insole that adapts to its curves to improve a subject's comfort, such that the SENSOR UNIT described in the embodiment above is an INTERMEDIATE INSOLE.
- Figure 13 depicts the general position of components in a SENSOR UNIT insole for example in an INTERMEDIATE INSOLE.
- pressure sensors 351 are applied to an upper surface of the SENSOR UNIT whereas most other components (antenna 36, communication module 32, processor 34, battery 31, sensor conditioning 39, other sensors 352 such as an inertial measurement unit 353) are concentrated in the thicker and more protected portion of the insole.
- two inertial measurement units - one proximal and a second anterior are shown embedded within the insole.
- pressure sensors are applied to the bottom surface of the SENSOR UNIT whereas a flexion sensor 355 is embedded within the flexible anterior section.
- pressure sensors may be embedded within a SENSOR UNIT as long as the SENSOR UNIT is not rigid in these areas and pressure applied by the subject can be measured by said sensors.
- Figure 14 depicts embodiments wherein pressure sensors are replaced by capacitive proximity sensors 352. Measurement of both pressure and proximity by capacitive proximity sensors is explained in the following figures hereinbelow.
- the capacitive sensors are applied to an upper surface of an insole, for example by applying said sensor's conductive capacitor plates to a thin flexible substrate and laminating said substrate on the upper surface of a SENSOR UNIT.
- the capacitive sensors are applied to a bottom surface of an insole, for example by applying said sensor's conductive capacitor plates to a thin flexible substrate and laminating said substrate on the bottom surface of a SENSOR UNIT.
- Capacitive sensors are robust and easy to produce economically, for example by hot foil stamping, conductive paint, etching, cutting etc.
- Figure 15 depicts the electrical working of capacitive sensors 352.
- Capacitance is an electrical property between adjacent conductors that are separated by an insulator.
- the value of capacitance is determined by the area, orientation and distance between the conductors and by the material of the insulator between them.
- a capacitive proximity/distance/displacement sensor converts a physical change in one of said parameters, for example the distance, to a change in capacitance which may be converted to a digital reading using a capacitance-to-digital converter (C2D or CDC) 354 wired to said two conductor "plates".
- Multiple plates may be used in an application and capacitance measured between pairs of plates, for example sequentially.
- a body part in proximity to the sensor forms a serially connected circuit comprising a first capacitance 357 between a first plate and the body part, through the resistance 358 of the generally conductive body part, and back through a second capacitance 359 to a second plate.
- the C2D is insensitive to the resistance and the two serially-connected capacitances form an equivalent capacitance that is dependent upon the proximity between said body part and said first and second sensor plates.
- from the measured equivalent capacitance, an attached processor may deduce the physical distance between said body part and said first and second sensor plates.
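- For illustration only, a minimal numeric sketch of the series-capacitance model described above; the parallel-plate approximation C = eps0*eps_r*A/d, the plate area and the permittivity values are assumptions used to convert the measured equivalent capacitance back to a distance.

```python
# Illustrative sketch only: the two plate-to-body capacitances act in series,
# C_eq = C1*C2 / (C1 + C2), and with a parallel-plate approximation
# C = eps0 * eps_r * A / d the measured C_eq can be inverted to a distance.
# Plate area and permittivity values are assumptions for illustration.
EPS0 = 8.854e-12          # F/m, vacuum permittivity


def series_capacitance(c1, c2):
    return c1 * c2 / (c1 + c2)


def plate_capacitance(area_m2, distance_m, eps_r=1.0):
    return EPS0 * eps_r * area_m2 / distance_m


def distance_from_ceq(c_eq, area_m2, eps_r=1.0):
    """Invert C_eq for two identical plates at the same distance d:
    C_eq = C/2 with C = eps0*eps_r*A/d, hence d = eps0*eps_r*A / (2*C_eq)."""
    return EPS0 * eps_r * area_m2 / (2.0 * c_eq)


if __name__ == "__main__":
    area = 4e-4                                  # 2 cm x 2 cm plate, assumed
    c = plate_capacitance(area, 1e-3)            # each plate 1 mm from the body
    c_eq = series_capacitance(c, c)
    print(f"C_eq = {c_eq * 1e12:.2f} pF -> "
          f"d = {distance_from_ceq(c_eq, area) * 1e3:.2f} mm")
```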
- Figure 16 depicts the effect of movement on sensors within an insole.
- an INTERMEDIATE INSOLE 24 includes three capacitive pressure sensors 352 and an IMU sensor 353. The INTERMEDIATE INSOLE is separated from the subject's foot by an insole made of mechanically compliant material.
- the distance between said foot and a capacitive sensor may vary depending on the local pressure applied to the compliant material of the insole. As local pressure is increased, the insole is locally compressed and the distance between the subject's foot and the capacitive sensor plate decreases thus increasing the capacitance measured by said sensor.
- a SENSOR UNIT analyzing the capacitance values during standing may calculate the weight distribution and/or quality of balance of a subject based on the relative proximity between the foot and each of the sensors.
- Capacitive sensors are advantageous in that they measure proximity and not only pressure, for example, in part of a gait cycle or stance, the heel may lift from the insole and no longer contact it. Whereas a true pressure sensor can no longer identify the position of the heel under these conditions, a capacitive sensor may still measure the distance between the heel and the sensor. In the example shown, a subject standing on his toes increases the capacitance at the anterior sensor since the local pressure is concentrated in the anterior of the foot and decreases the capacitance at the posterior heel sensor which is not receiving any pressure from the foot.
- the angle of the gravity acceleration vector also changes with respect to the IMU, which may provide a reading to the algorithm representing said angle.
- the rate of angle change may additionally be measured by gyroscopes within the IMU.
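- For illustration only, a minimal sketch combining the two readings discussed above: a relative load per capacitive sensor (larger capacitance meaning a closer, more loaded region of the foot) and a tilt angle from the IMU gravity vector; the linear load model and the sensor layout are assumptions.

```python
# Illustrative sketch only: derive a simple relative weight distribution from
# per-sensor capacitances (larger capacitance = closer foot = more load) and a
# forward-tilt angle from a static IMU gravity-vector reading.  The linear
# load model and sensor positions are assumptions for illustration.
import math


def weight_distribution(capacitances_pf):
    """Normalize per-sensor capacitance into a relative load per sensor."""
    total = sum(capacitances_pf)
    return [c / total for c in capacitances_pf]


def tilt_angle_deg(accel_xyz):
    """Angle of the gravity vector from vertical, from a static accelerometer
    reading (ax, ay, az) in any consistent unit."""
    ax, ay, az = accel_xyz
    horizontal = math.hypot(ax, ay)
    return math.degrees(math.atan2(horizontal, az))


if __name__ == "__main__":
    # anterior, mid and posterior sensors while standing on the toes
    print(weight_distribution([5.0, 2.0, 0.5]))   # most load at the anterior sensor
    print(f"{tilt_angle_deg((0.17, 0.0, 0.98)):.1f} deg forward tilt")
```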
- Figure 17 depicts wireless charging of SENSOR UNIT within a shoe.
- an INTERMEDIATE INSOLE 24 is shown to additionally include an inductive wireless charging coil 38 (first coil), for example resonant-mode Qi charging.
- the shoe is placed on top of a compatible charger pad 103 and power is transferred wirelessly from a power transmission coil 104 (second coil) within said charger pad, through the shoe sole and the body of said INTERMEDIATE INSOLE and received in said wireless charging coil.
- Additional power conditioning circuits (not illustrated) are required preceding said transmission coil in said charger pad, as well as to receive power from said charging coil and to adapt said received power for charging a battery in the INTERMEDIATE INSOLE.
- Figure 18 depicts conceptual sensor data 112 received over time and representing a repetitive motion performed by a subject, for example walking, training, dancing, etc., as detected by one or more sensors and converted to data for processing by an algorithm, for example in a SENSOR UNIT.
- Said algorithm may continually or periodically analyze said sensor data for quality based on predetermined criteria, for example amplitude, frequency, variability and/or similarity to a reference motion or any other criteria, and calculate a rating for said quality which may change over time.
- the algorithm may have a threshold of decision (114) above which the motion is considered correct and below which corrective action should be taken, for example operating a sensory stimulus such as cueing, vocal instructions, haptic feedback etc.
- An algorithm and/or manual operation may also "tag" 118 the data, for example to indicate that an interesting event has occurred. Tagging may be beneficial for later analysis of recorded data wherein only data slightly preceding and slightly after the tag may be of interest for analysis. In the example illustrated, only the data collected around the tag event (pre-tag and post-tag) may be transferred 119 to a remote entity and otherwise discarded.
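- For illustration only, a minimal sketch of the threshold decision and tagging behavior described above: a ring buffer retains recent samples, a quality rating below the threshold triggers a cue and a tag, and only the pre-tag/post-tag window is kept for transfer; the window sizes and the variability-based rating are assumptions.

```python
# Illustrative sketch only: keep a short ring buffer of recent samples, rate
# motion quality against a decision threshold, and when an event is tagged
# capture a pre-tag/post-tag window for transfer while discarding the rest.
# Window sizes and the variability-based rating are assumed for illustration.
from collections import deque

PRE, POST, THRESHOLD = 50, 50, 0.6


def quality(samples):
    """Toy rating: 1 minus the variance of the recent samples, floored at 0."""
    if len(samples) < 2:
        return 1.0
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return max(0.0, 1.0 - var)


def run(stream):
    ring = deque(maxlen=PRE)
    captured, post_left = [], 0
    for sample in stream:
        if post_left:                       # still filling the post-tag window
            captured.append(sample)
            post_left -= 1
        q = quality(list(ring))
        if q < THRESHOLD and not captured:  # below threshold -> cue + tag event
            print("cue subject; tagging event")
            captured = list(ring) + [sample]
            post_left = POST
        ring.append(sample)
    return captured                         # only this window is transferred


if __name__ == "__main__":
    import random
    stream = [0.0] * 100 + [random.uniform(-2, 2) for _ in range(100)] + [0.0] * 100
    print(f"{len(run(stream))} samples kept for the remote entity")
```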
- a wearable, self-use physiotherapy guidance system may include
- a pair of shoe insole devices each including:
- a wireless communication channel, e.g. BT, ANT+, ZigBee etc.
- a power source, e.g. a battery
- At least one sensory output device including at least one of {a speaker, headphones, an earpiece, a headset, bone-conduction acoustic transducer, light emitting diode indicators, a visual display, a visual projection, a tactile transducer, a vibrator, a haptic transducer, an electromechanical transducer}
- a long-range communication device including
- a long-distance communication means e.g. cellular, telephone, modem, internet connection etc.
- One or more of the parts may additionally include one or more sensors from the group of: {a location sensor, an angle encoder, a pulse oximetry sensor, a pulse rate sensor, a blood pressure sensor, a pulmonary rate sensor, a gas concentration sensor, a gas flow rate sensor, an EEG sensor, an EKG sensor, a skin conductivity sensor, and/or a temperature sensor}
- One or more of the sensor data streams may be mathematically combined to provide a single, fused sensor input that may subsequently be processed as a single sensor data stream
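- For illustration only, a minimal sketch of fusing several synchronized sensor streams into a single stream by a fixed weighted average; the weights are assumptions, since the disclosure leaves the fusion method open.

```python
# Illustrative sketch only: mathematically combine several synchronized sensor
# streams into a single fused stream with a fixed weighted average.  The
# weights and the example streams are assumptions for illustration.

def fuse(streams, weights):
    """streams: list of equal-length lists of samples; weights should sum to 1."""
    fused = []
    for samples in zip(*streams):
        fused.append(sum(w * s for w, s in zip(weights, samples)))
    return fused


if __name__ == "__main__":
    pressure = [0.1, 0.5, 0.9, 0.4]      # normalized heel pressure
    imu_tilt = [0.0, 0.2, 0.8, 0.5]      # normalized forward tilt
    print(fuse([pressure, imu_tilt], weights=[0.7, 0.3]))
```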
- the sensor may be embedded within an article of clothing such as, but not limited to: a cap, eyeglasses, shirt, gloves, trousers, a belt, underwear, socks, insoles, shoes.
- the sensor may be mechanically coupled to one or more bodily parts of a human subject such as, but not limited to: a limb, a digit, a joint, the torso, the back, the chest, the belly, the waist, the neck, the buttocks, the shoulders, the head
- the physiotherapy guidance may be used to guide the user in performing controlled gait walking.
- the system may detect one or more of the following conditions: freezing of gait, incorrect body posture, arm swinging, stride length, stand height, step rate, turning rate, unstable balance, weight distribution.
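- For illustration only, a minimal sketch of two of the listed conditions: step rate estimated from heel-strike timestamps, and a possible freezing-of-gait flag raised when no strike occurs within an assumed time window.

```python
# Illustrative sketch only: estimate step rate from heel-strike timestamps and
# flag a possible freezing-of-gait episode when no strike occurs for a while.
# The 2-second freeze window is an assumed, illustrative parameter.

def step_rate_spm(strike_times_s):
    """Steps per minute from consecutive heel-strike timestamps (seconds)."""
    if len(strike_times_s) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(strike_times_s, strike_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))


def freezing_detected(strike_times_s, now_s, freeze_window_s=2.0):
    """True if the subject was walking but no strike occurred recently."""
    return bool(strike_times_s) and (now_s - strike_times_s[-1]) > freeze_window_s


if __name__ == "__main__":
    strikes = [0.0, 0.55, 1.1, 1.68, 2.2]
    print(f"{step_rate_spm(strikes):.0f} steps/min")
    print("freeze suspected:", freezing_detected(strikes, now_s=5.0))
```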
- the physiotherapy guidance may be used to guide the user in one or more of the following activities: injury rehabilitation, strengthening, posture improvement, movement training, balance training, relaxation, stretching, disease rehabilitation, overcoming disability
- the processing unit may be located with, and wired to, one or more of the sensors, the output of which may be wirelessly communicated to the output device
- the processing unit may be located with, and wired to, an output device, and the sensors communicate wirelessly with said processing unit
- a processing unit communicates wirelessly with both sensors and output devices.
- One or more processing units may each be located with, and wired to, a sensor and an output device respectively, and the processing units wirelessly communicate with one another.
- the output sensory feedback may be audible and may consist of one or more of: recorded human speech, synthesized speech, tones, an alarm, a prerecorded sound, a radio broadcast, a digital broadcast, music, a buzzer or any other audible means
- the output sensory feedback may be visual and may consist of one or more of: a blinking light, a moving object, a video, a projection, a shutter or any other visual means
- the output sensory feedback may be tactile and may consist of one or more of: a vibration, a haptic feedback, an induced pain, a local pressure, a temperature change or any other tactile means
- the output device consists of one or more audio speakers
- the speakers may be bone-conduction speakers
- the speakers may be mounted on one of: an eyeglasses frame, a headband, a neckband, a neck pendant, an article of clothing
- the speakers may be headphones or a headset
- the speaker may be part of a hearing aid
- the speakers may also provide reception of a telephone voice call
- the output device may consist of one or more modulated LED indicators
- the LED indicators may be mounted on an eyeglasses frame
- the LED indicators may be mounted on an article of clothing
- the system additionally may include a status display.
- the status display may be one of: a smartwatch, a mobile device, an augmented reality display, a HUD, a mobile video screen.
- the status display unit may also include the processing unit as per claim.
- the output device as per claim may also be the status display
- the long-range communication channel may communicate information from the user-side wearable system to a remote entity.
- the communication may be over a cellular network and the remote entity may be a cloud server.
- the remote entity may provide remote physiotherapy guidance to the user over the long range communication channel and the remote entity receives information from the user-side sensors over the long range communication channel in order to track the user's physiology
- the remote entity may be a human physiotherapy guide
- the remote entity may be a computer program
- the long-distance communication link may be one of: a cellular data network, a WiFi connection, a wired computer network, an internet connection
- the user-side system may collect information over a relatively long period of time (for example hours or days) and that information may be transferred to the remote entity over a relatively short period of time (for example minutes or seconds)
- the remote entity guides the user by providing one or more of the following outputs: audio, voice, video, picture, text, mechanical movement, tactile
- the provision of the output requires additional system elements including, but not limited to: a video screen, a smartphone, a video projector, speakers, virtual reality goggles, augmented reality display, haptic feedback, vibrator, functional muscle stimulation, an electromechanical device
- the system may include a microphone and a voice recognition algorithm (local or remote) which controls the system functionality using the user's voice.
- the microphone may be embedded within the output device of the system
- the microphone may be in a wrist-worn device such as a smart watch
- the microphone may be in a communication device such as a smartphone
- the microphone can provide voice communication transmission over the long-range communication channel
- the microphone may provide voice input to a telephone voice call
- the microphone may not be a separate device but rather a reversed functionality of the speaker in the output device
- a wearable, self-use walking gait assistance system comprising one or more body-worn sensors as inputs, one or more local wireless communication channels, one or more long-range communication channels with a remote entity, one or more processing units and one or more bone-conduction head-mounted speakers as outputs to provide an audible speech guidance to the user based on the sensor inputs.
- any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved.
- any two components herein combined to achieve a particular functionality can be seen as "associated with" each other such that the desired functionality is achieved, irrespective of architectures or intermedial components.
- any two components so associated can also be viewed as being “operably connected” or “operably coupled” to each other to achieve the desired functionality.
- the illustrated examples may be implemented as circuitry located on a single integrated circuit or within a same device.
- the examples may be implemented as any number of separate integrated circuits or separate devices interconnected with each other in a suitable manner.
- the examples, or portions thereof, may be implemented as software or code representations of physical circuitry or of logical representations convertible into physical circuitry, such as in a hardware description language of any appropriate type.
- the invention is not limited to physical devices or units implemented in non-programmable hardware but can also be applied in programmable devices or units able to perform the desired device functions by operating in accordance with suitable program code, such as mainframes, minicomputers, servers, workstations, personal computers, notepads, personal digital assistants, electronic games, automotive and other embedded systems, cell phones and various other wireless devices, commonly denoted in this application as 'computer systems'.
- any reference signs placed between parentheses shall not be construed as limiting the claim.
- the word 'comprising' does not exclude the presence of other elements or steps than those listed in a claim.
- the terms "a" or "an", as used herein, are defined as one or more than one.
- the use of introductory phrases such as "at least one" and "one or more" in the claims should not be construed to imply that the introduction of another claim element by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim element to inventions containing only one such element, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an".
- Any system, apparatus or device referred to in this patent application includes at least one hardware component.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Veterinary Medicine (AREA)
- Heart & Thoracic Surgery (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physiology (AREA)
- Biodiversity & Conservation Biology (AREA)
- Physical Education & Sports Medicine (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Computer Networks & Wireless Communication (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Footwear And Its Accessory, Manufacturing Method And Apparatuses (AREA)
- Rehabilitation Tools (AREA)
Abstract
The invention relates to a biofeedback system essentially composed of at least one insole and a sensory output device, said at least one insole comprising a sensing unit and a processing unit. Once said at least one insole is inserted into at least one shoe of a person, the sensing unit is configured to monitor a motion of the person; the processing unit is configured to perform an analysis of a gait motion of the person and to determine when feedback should be provided to that person, the feedback comprising a stimulus; a communication unit of the insole is configured to send at least one indication relating to the feedback directly to the sensory output device, via short-range wireless communication, when it is determined that the feedback should be sent; and the sensory output device is primarily dedicated to outputting the feedback, the feedback being human-perceptible feedback.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762579838P | 2017-10-31 | 2017-10-31 | |
US62/579,838 | 2017-10-31 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2019086997A2 true WO2019086997A2 (fr) | 2019-05-09 |
WO2019086997A3 WO2019086997A3 (fr) | 2019-08-22 |
Family
ID=66331397
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2018/058164 WO2019086997A2 (fr) | 2017-10-31 | 2018-10-19 | Système de biofeedback vestimentaire |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019086997A2 (fr) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10624559B2 (en) | 2017-02-13 | 2020-04-21 | Starkey Laboratories, Inc. | Fall prediction system and method of using the same |
CN111568434A (zh) * | 2020-05-21 | 2020-08-25 | 上海体育学院 | 一种人体平衡测试关节检测装置系统 |
US11145181B1 (en) | 2020-05-26 | 2021-10-12 | National Tsing Hua University | Method and wearable electronic device for imbalance warning |
US11191454B2 (en) | 2018-08-31 | 2021-12-07 | Feel The World, Inc. | Biofeedback for altering gait |
IT202000019825A1 (it) | 2020-08-07 | 2022-02-07 | Schema31 S P A | Sistema per la correzione motoria, particolarmente per il superamento di blocchi motori |
US11277697B2 (en) | 2018-12-15 | 2022-03-15 | Starkey Laboratories, Inc. | Hearing assistance system with enhanced fall detection features |
US11559252B2 (en) | 2017-05-08 | 2023-01-24 | Starkey Laboratories, Inc. | Hearing assistance device incorporating virtual audio interface for therapy guidance |
EP4132355A1 (fr) * | 2020-04-05 | 2023-02-15 | Ceriter | Procédé et système pour détecter et éliminer une démarche déviante |
US11638563B2 (en) | 2018-12-27 | 2023-05-02 | Starkey Laboratories, Inc. | Predictive fall event management system and method of using same |
NL2031060B1 (en) * | 2022-02-24 | 2023-09-06 | Cue2Walk Int B V | Cueing device with self-activation. |
US20230338225A1 (en) * | 2018-09-11 | 2023-10-26 | Encora, Inc. | Apparatus and method for reduction of neurological movement disorder symptoms using wearable device |
GB2619069A (en) * | 2022-05-26 | 2023-11-29 | Magnes Ag | Intervention based on detected gait kinematics |
US11974856B2 (en) | 2019-11-25 | 2024-05-07 | Analog Devices International Unlimited Company | Wearable sensor and method of forming thereof |
WO2024186292A1 (fr) * | 2023-03-08 | 2024-09-12 | Yilmaz Seyma | Lunettes de support pour les symptômes de "gel de la marche" chez les patients atteints de la maladie de parkinson |
US12095940B2 (en) | 2019-07-19 | 2024-09-17 | Starkey Laboratories, Inc. | Hearing devices using proxy devices for emergency communication |
US12254755B2 (en) | 2017-02-13 | 2025-03-18 | Starkey Laboratories, Inc. | Fall prediction system including a beacon and method of using same |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7526954B2 (en) * | 2004-07-24 | 2009-05-05 | Instep Usa, Llc | Gait assistive system and methods for using same |
US8627716B2 (en) * | 2009-02-06 | 2014-01-14 | Pressure Profile Systems Inc. | Capacitive proximity tactile sensor |
KR101263216B1 (ko) * | 2009-12-21 | 2013-05-10 | 한국전자통신연구원 | 스마트 신발 및 그 동작 방법 |
US9453772B2 (en) * | 2011-03-24 | 2016-09-27 | MedHab, LLC | Method of manufacturing a sensor insole |
JP2016528959A (ja) * | 2013-07-12 | 2016-09-23 | ロイヤル・メルボルン・インスティテュート・オブ・テクノロジーRoyal Melbourne Institute Of Technology | センサーアレイシステム |
KR20160145981A (ko) * | 2015-06-11 | 2016-12-21 | 엘지전자 주식회사 | 인솔, 이동 단말기 및 그 제어 방법 |
- 2018-10-19: WO PCT/IB2018/058164 patent/WO2019086997A2/fr active Application Filing
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12254755B2 (en) | 2017-02-13 | 2025-03-18 | Starkey Laboratories, Inc. | Fall prediction system including a beacon and method of using same |
US10624559B2 (en) | 2017-02-13 | 2020-04-21 | Starkey Laboratories, Inc. | Fall prediction system and method of using the same |
US11559252B2 (en) | 2017-05-08 | 2023-01-24 | Starkey Laboratories, Inc. | Hearing assistance device incorporating virtual audio interface for therapy guidance |
US12064261B2 (en) | 2017-05-08 | 2024-08-20 | Starkey Laboratories, Inc. | Hearing assistance device incorporating virtual audio interface for therapy guidance |
US11191454B2 (en) | 2018-08-31 | 2021-12-07 | Feel The World, Inc. | Biofeedback for altering gait |
US11839466B2 (en) | 2018-08-31 | 2023-12-12 | Feel The World, Inc. | Biofeedback for altering gait |
US20230338225A1 (en) * | 2018-09-11 | 2023-10-26 | Encora, Inc. | Apparatus and method for reduction of neurological movement disorder symptoms using wearable device |
US11277697B2 (en) | 2018-12-15 | 2022-03-15 | Starkey Laboratories, Inc. | Hearing assistance system with enhanced fall detection features |
US12149893B2 (en) | 2018-12-15 | 2024-11-19 | Starkey Laboratories, Inc. | Hearing assistance system with enhanced fall detection features |
US11638563B2 (en) | 2018-12-27 | 2023-05-02 | Starkey Laboratories, Inc. | Predictive fall event management system and method of using same |
US12095940B2 (en) | 2019-07-19 | 2024-09-17 | Starkey Laboratories, Inc. | Hearing devices using proxy devices for emergency communication |
US11974856B2 (en) | 2019-11-25 | 2024-05-07 | Analog Devices International Unlimited Company | Wearable sensor and method of forming thereof |
EP4132355A1 (fr) * | 2020-04-05 | 2023-02-15 | Ceriter | Procédé et système pour détecter et éliminer une démarche déviante |
CN111568434A (zh) * | 2020-05-21 | 2020-08-25 | 上海体育学院 | 一种人体平衡测试关节检测装置系统 |
US11145181B1 (en) | 2020-05-26 | 2021-10-12 | National Tsing Hua University | Method and wearable electronic device for imbalance warning |
IT202000019825A1 (it) | 2020-08-07 | 2022-02-07 | Schema31 S P A | Sistema per la correzione motoria, particolarmente per il superamento di blocchi motori |
NL2031060B1 (en) * | 2022-02-24 | 2023-09-06 | Cue2Walk Int B V | Cueing device with self-activation. |
WO2023227659A1 (fr) * | 2022-05-26 | 2023-11-30 | Magnes Ag | Intervention basée sur une cinématique de démarche détectée |
GB2619069A (en) * | 2022-05-26 | 2023-11-29 | Magnes Ag | Intervention based on detected gait kinematics |
WO2024186292A1 (fr) * | 2023-03-08 | 2024-09-12 | Yilmaz Seyma | Lunettes de support pour les symptômes de "gel de la marche" chez les patients atteints de la maladie de parkinson |
Also Published As
Publication number | Publication date |
---|---|
WO2019086997A3 (fr) | 2019-08-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019086997A2 (fr) | Système de biofeedback vestimentaire | |
US11925471B2 (en) | Modular physiologic monitoring systems, kits, and methods | |
US10973439B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis, and feedback | |
AU2017386412B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis, and feedback | |
US11679300B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis, and feedback | |
US20240298976A1 (en) | Method and system for adjusting audio signals based on motion deviation | |
US10089763B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis and feedback | |
CN108135537A (zh) | 用于治疗骨关节炎的系统、装置和方法 | |
CN204426918U (zh) | 一种用于缓解帕金森病患者冻结步态的智能手环 | |
CN116018080A (zh) | 步态调节辅助系统 | |
CN107998643A (zh) | 一种用于帕金森病人步态运动改善和训练监测的智能脚环 | |
US20230390608A1 (en) | Systems and methods including ear-worn devices for vestibular rehabilitation exercises | |
KR20220132812A (ko) | 의료 데이터를 이용하여 운동 프로그램을 제공하는 전자 장치 및 방법 | |
CN111603674B (zh) | 颈部按摩仪 | |
TWM657485U (zh) | 步態智能系統 | |
Woodward | Pervasive motion tracking and physiological monitoring | |
CN119792898A (zh) | 一种智能舞蹈训练装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18873384 Country of ref document: EP Kind code of ref document: A2 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18873384 Country of ref document: EP Kind code of ref document: A2 |