
WO1991001699A1 - Systeme de commande - Google Patents


Info

Publication number
WO1991001699A1
WO1991001699A1 PCT/GB1990/001226 GB9001226W WO9101699A1 WO 1991001699 A1 WO1991001699 A1 WO 1991001699A1 GB 9001226 W GB9001226 W GB 9001226W WO 9101699 A1 WO9101699 A1 WO 9101699A1
Authority
WO
WIPO (PCT)
Prior art keywords
menu
control
items
control system
pressure sensor
Prior art date
Application number
PCT/GB1990/001226
Other languages
English (en)
Inventor
Roger Edwin Wilson
Original Assignee
Gec-Marconi Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gec-Marconi Limited filed Critical Gec-Marconi Limited
Publication of WO1991001699A1 publication Critical patent/WO1991001699A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body

Definitions

  • This invention relates to a control system providing an interface between a human being and a range of machine and electronic devices, that is, a form of cybernetic system, and particularly to such a system for use by a human operator whose activities are restricted either by personal incapacity or by the demands of the environment in which he is operating.
  • Typical examples are paraplegic or quadriplegic people and persons whose mobility is restricted by confinement in a limited space and who require the use of all or most of their limbs (including head movement) for other purposes, eg operators of bathyspheres, mini-submarines, space capsules, aircraft cockpits and simulators related to these systems.
  • An object of the present invention is therefore to provide a control system of the above type which significantly improves the operator's control capability while still permitting operation according to selected individual modes.
  • a control system for use by a human operator comprises computer means adapted to control a range of utility items, display means controlled by the computer means and adapted to display at least one menu of the utility items, pressure sensor means adapted to be operated by the operator, and eye-direction sensor means adapted to be fitted to an operator's head, the pressure sensor means and the eye-direction sensor means being coupled to the computer means by an electro-magnetic link to control selection and/or activation of the utility items.
  • the electromagnetic link is preferably an infra-red link.
  • the display means may be adapted to display two or more stages of menu, a menu of the second or a subsequent stage constituting a utility item of a menu of the preceding stage.
  • the pressure sensor may be an air pressure sensor responsive to pressures above and below atmospheric produced by blowing and sucking.
  • a cursor may be movable over displayed menu items by the sucking or the blowing process, and activation of a selected menu item effected by the complementary process.
  • the eye-direction sensor means and the pressure sensor means may control, one, the selection of menu items, and the other, the activation of selected menu items.
  • the computer means may be adapted to scan menu items automatically until an item is selected by operation of the pressure sensor means.
  • the control system may be adapted to operate in an eye control mode, the eye-direction sensor means controlling the selection of menu items, and the system including means responsive to the dwell time of the eye direction to activate any menu item subjected to the same eye direction for a predetermined period.
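(As an illustration only, and not part of the patent text: the dwell-time activation described in the preceding paragraph can be sketched as a small sampler that fires an item once the same gaze target has been held beyond a threshold. All names are invented; the three-second default echoes the "eg three seconds" figure given later in the description.)

```python
import time

class DwellActivator:
    """Minimal sketch of dwell-time activation: an item fires once the same
    gaze target has been held for longer than a threshold."""

    def __init__(self, dwell_seconds=3.0):
        self.dwell_seconds = dwell_seconds  # eg three seconds, as in the description
        self._current = None                # item currently looked at
        self._since = None                  # when the gaze settled on that item

    def update(self, looked_at_item, now=None):
        """Feed the latest gaze target; returns an item when it should activate."""
        now = time.monotonic() if now is None else now
        if looked_at_item != self._current:
            self._current = looked_at_item  # gaze moved: restart the timer
            self._since = now
            return None
        if looked_at_item is not None and now - self._since >= self.dwell_seconds:
            self._since = now               # re-arm so the item does not refire at once
            return looked_at_item
        return None
```

Feeding `update()` from the eye-direction measurement a few times a second would reproduce the 'eye pause' behaviour described below.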
  • the control system may further include means responsive to the closure of one but not two eyes to switch between operational modes, and particularly from a standby to an active mode.
  • Figure 1 is a block diagram of a preferred arrangement providing a choice between, or combination of, eye direction sensor and pressure sensor;
  • Figure 2 is a diagram of an abbreviated version of the arrangement of Figure 1 showing a pressure sensor and pressure transducer in some detail;
  • Figure 3 is a control mode menu to be displayed to the operator;
  • Figure 4 is a main menu following automatically from Figure 3;
  • Figure 5(a) is a particular sub-menu of a typewriter mode;
  • Figure 5(b) is a sub-menu expansion of part of the menu of Figure 5(a);
  • Figure 6 is a sub-menu resulting from selection of radio equipment; and Figures 7 and 8 show, respectively, transmitting and receiving units of an I.R. link.
  • the system comprises an air pressure sensor unit 1, shown in detail in Figure 2, coupled to a computer 3 by way of an infra-red link 5.
  • the computer controls a display 7 and also a number of utility devices 9 which may include radio, TV, recorder, fan, lights, telephone, motor/servo units, robots, wheelchairs and many more.
  • the computer is programmed to associate the utility devices 9 with items on a menu displayed by the display system 7.
  • a cursor on the display is moved to select a particular item which is associated either with further displays in sub-menus or with a particular utility device 9 directly.
  • An activation signal may be provided by the pressure sensor unit 1 to the computer 3 to activate the selected item, ie the selected utility device.
  • An eye-direction sensor system 11 is incorporated to give additional control and flexibility to the overall system.
  • This eye-sensor system comprises a pair of electrodes 13 which are fitted to the operator's head in close contact with his temples.
  • the charge distribution around a person's eyes is affected by the 'direction-of-look'.
  • Suitably placed electrodes (13) can therefore be employed to sense this charge disturbance and hence the direction-of-look.
  • the potential difference arising can be used to control the cursor of the display system 7 by way of the computer 3 to perform the selection and, as will be seen, the activation, of menu items.
  • the activation of a selected menu item can be achieved by responding to the dwell time of the eyes on the selected item.
  • a sufficiently long dwell time, in excess of a predetermined threshold, produces an activation signal.
  • the eye sensor would be used for only one of the two functions, selection and activation, and most conveniently that one would be the selection function.
  • An advantage of the system is, however, the flexibility provided by the ability to choose either eye sensor or pressure sensor, or a combination, for selection and activation.
  • the pressure sensor unit 1 of Figure 1 comprises essentially a suck/blow mouthpiece 15 and a pressure transducer 17 which converts the pressure signals to electrical signals.
  • a manually operable valve 19 enables the mouthpiece to be isolated from the transducer 17.
  • the suck/blow mouthpiece and the tactile sensor 21 are shown associated with parallel branch tubes. They could, of course, be fitted to the same tube.
  • the pressure transducer 17 responds to both positive and negative pressure, that is, pressure above and below atmospheric. Positive pressure is sensed by a normally open contact 25 coupled to a diaphragm, not shown, and negative pressure is sensed by a normally open contact 27 coupled to a diaphragm also not shown.
  • the two contacts provide three output lines: a common line 29 connected to both 'wiper' contacts; and two lines 31 and 33 connected to the respective other contacts. These three lines are connected to control an infra-red transmitter 36 linked to a remote infra-red receiver 37 as in Figure 1, to control the selection and activation of the displayed menu items by the computer. As shown, selection is achieved by the positive pressure switch 25, and activation by the negative pressure switch 27, ie 'blow' for selection and 'suck' for activation. At the wish of the operator, this arrangement can be reversed by a reversing switch 35 which interchanges the signals on the lines 31 and 33.
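(Illustrative sketch only, with invented names: the two normally-open contacts and the reversing switch can be modelled as a decoder that turns contact states into selection and activation events, a boolean flag standing in for the interchange of lines 31 and 33.)

```python
from enum import Enum

class Event(Enum):
    SELECT = "select"      # step the cursor over the menu items
    ACTIVATE = "activate"  # activate the item under the cursor

def decode_pressure(blow_contact_closed, suck_contact_closed, reversed_roles=False):
    """Map the state of the two normally-open contacts to a control event.

    By default 'blow' (positive pressure, contact 25) selects and 'suck'
    (negative pressure, contact 27) activates; reversed_roles mimics the
    reversing switch that interchanges the two signal lines.
    """
    if blow_contact_closed == suck_contact_closed:
        return None  # neither contact closed (or both): no event
    blow_means_select = not reversed_roles
    if blow_contact_closed:
        return Event.SELECT if blow_means_select else Event.ACTIVATE
    return Event.ACTIVATE if blow_means_select else Event.SELECT
```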
  • the I.R. link has the advantage of making the sensor systems self-contained and local to the operator. There is no wire connection to any equipment fed from the mains.
  • the I.R. receiver 37, the computer 3, display 7 and interface with the utility devices 9 are all relatively remote from the operator and he can feel secure from any high voltages that they might exhibit.
  • Figure 3 shows a suitable control mode menu for setting up the mode of operation according to the operator's capabilities or preference.
  • the menu comprises the following control modes:
  • (1) eye-direction sensor, which is calibrated to the individual operator, and a mechanism such as the pressure sensor described above for activation of the menu item selected by the eye-sensor.
  • Other mechanisms, such as a sound activation unit or a joystick, may be incorporated.
  • the essential operation is, however, selection by eye direction and activation by the other mechanism.
  • each 'blow' causes the cursor to move along the one-dimensional array of menu items in predetermined steps.
  • (4) 'Eye pause'. This mode involves only the eye-direction sensor. Selection is performed as in (1) above but activation is also achieved by means of the eye-direction sensor. Thus, on selecting a particular menu item by looking directly at the item, the operator then holds the position for a predetermined period, eg three seconds. A timer is then triggered and the selected item activated.
  • the eye-direction sensor means is sensitive both to the 'direction-of-look' and to the open/closed state of the eye, the detected voltage levels corresponding distinctively to these conditions.
  • a control signal can thus be provided by blinking the eyes.
  • an exclusive-OR function is included to detect only single-eye blinking, ie a 'wink'.
  • a minimum wink duration is also imposed to ensure that the movement is deliberate.
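(A hedged sketch, not the patent's circuit: the exclusive-OR wink detection with a minimum duration can be modelled by counting consecutive samples in which exactly one eye is closed. The sampling rate and the half-second minimum are assumptions.)

```python
class WinkDetector:
    """Sketch of wink detection: exactly one eye closed (exclusive-OR of the
    two eye states) sustained for a minimum duration counts as a deliberate wink."""

    def __init__(self, min_duration=0.5, sample_period=0.05):
        self.min_samples = max(1, round(min_duration / sample_period))
        self._run = 0  # consecutive samples with exactly one eye closed

    def sample(self, left_open, right_open):
        """Feed one sample of the two eye states; returns True once per wink."""
        winking = left_open != right_open  # exclusive-OR of the eye states
        self._run = self._run + 1 if winking else 0
        return self._run == self.min_samples
```

A detector of this kind could drive the switch from standby to active mode, or the exit from standby described further on.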
  • the control mode may be arranged to default to a particular one of the modes shown in Figure 3, either to suit the particular operator or, in general, to the least demanding mode, ie the Eye-pause mode.
  • the operator, initially presented with the Figure 3 menu, may then keep the eye-pause mode or, using that mode, select one of the others.
  • Each of the control modes when selected and activated is followed by a menu such as that of Figure 4.
  • the three menu items shown are TYP for 'typewriter mode'; BYW for 'standby mode'; and DEV for 'device mode'.
  • Activation of the typewriter mode causes a menu such as shown in Figure 5(a) to be displayed, this being a text-writing menu.
  • the items consist of the alphabet, various editing items, and an outlet to the next menu, 'page 2' .
  • the 30 items are arranged in 6 columns and the cursor 39 can be controlled to move horizontally by blowing on the pressure sensor mouthpiece.
  • a particular column is selected by positioning the cursor and, assuming operation in the eye-scan mode, this selection is activated by sucking on the mouthpiece.
  • the selected column is then displayed in a horizontal row in the menu.
  • a further cursor selection and activation is performed, so selecting a particular item from the row.
  • a sequence of letters can thus be called up to form a piece of text which occupies the "text workspace" area. This may then be printed out in a hard copy by selection and activation of the "print" item in column 6 of the original menu.
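(Rough sketch under stated assumptions: the 30-item, 6-column layout is taken from the description, but the character set and all names here are invented. It models the two-stage selection just described: 'blow' steps the cursor, 'suck' activates, first over the columns and then over the items of the chosen column displayed as a row.)

```python
class TypewriterMenu:
    """Sketch of the two-stage typewriter selection: blow steps a cursor across
    the six columns, suck activates; the chosen column is then presented as a
    row and the same blow/suck gestures pick a single item from it."""

    def __init__(self, items, columns=6):
        rows = len(items) // columns           # simplest assumption: equal column lengths
        self.columns = [items[c * rows:(c + 1) * rows] for c in range(columns)]
        self.stage = "column"                  # choosing a column, then an item within it
        self.cursor = 0
        self._chosen = None

    def blow(self):
        """Step the cursor along the current one-dimensional array of choices."""
        limit = len(self.columns) if self.stage == "column" else len(self._chosen)
        self.cursor = (self.cursor + 1) % limit

    def suck(self):
        """Activate the choice under the cursor; returns a character once picked."""
        if self.stage == "column":
            self._chosen = self.columns[self.cursor]
            self.stage, self.cursor = "item", 0
            return None
        picked = self._chosen[self.cursor]
        self.stage, self.cursor = "column", 0
        return picked


# Invented 30-character layout (5 rows x 6 columns):
menu = TypewriterMenu(list("ABCDEFGHIJKLMNOPQRSTUVWXYZ.,? "))
menu.blow(); menu.suck()          # select the second column
menu.blow(); print(menu.suck())   # pick the second item of that column, 'G'
```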
  • Figure 5(b) shows the sub menu for a selection of the sixth column of Figure 5(a) and shows the cursor positioned under "page 2". This selection is activated as before by sucking on the mouthpiece to give the menu denoted PA2 (page 2) which may be related to the text writing function or may be an exit to a previous or other menu.
  • the operator may require a rest in the middle of some (particularly typewriting) operation.
  • the BYE (standby) mode of Figure 4 provides this function by effectively storing the current state of an operation until such time as the operator wishes to come back to it.
  • exit from the standby mode is effected by a wink, which returns the operation to the state immediately prior to the rest period.
  • the third operational mode available from the menu of Figure 4 is the device control mode (DEV). Selection of this mode produces a menu comprising a list of devices that can be remotely controlled by the computer, eg radio, telephone, door locks etc. These items are selected and activated as required.
  • Each menu available has an exit, which is either automatic, on a time-out, or is selectable as one menu item to enable the operator to revert to the main menu, to proceed to an associated menu or perhaps to close down.
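(Sketch only; the 30-second time-out and all identifiers are assumptions, not values from the patent. It illustrates the two exit routes mentioned above: a selectable EXIT item and an automatic time-out back to the main menu.)

```python
import time

class MenuSession:
    """Sketch of menu exit handling: an explicit EXIT item or an inactivity
    time-out returns the operator to the main menu."""

    def __init__(self, main_menu, timeout_seconds=30.0):
        self.main_menu = main_menu
        self.timeout = timeout_seconds        # assumed value
        self.current = main_menu
        self._last_action = time.monotonic()

    def activate(self, item):
        """Activate a menu item: EXIT reverts, a nested list opens a sub-menu."""
        self._last_action = time.monotonic()
        if item == "EXIT":
            self.current = self.main_menu     # selectable exit item
        elif isinstance(item, list):
            self.current = item               # a sub-menu is itself a list of items
        # otherwise the item would drive a utility device (not modelled here)

    def tick(self):
        """Call periodically; reverts to the main menu after inactivity."""
        if time.monotonic() - self._last_action > self.timeout:
            self.current = self.main_menu
```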
  • Figures 7 and 8 show respectively the transmitting and receiving units of the infra red link.
  • the transmitter, local to the operator, is coupled in this instance to a joystick device 41 which produces left/right and up/down output signals according to the direction of hand pressure applied.
  • the resulting signals are encoded by an integrated circuit unit 43 and arranged to pulse a pair of I.R. light-emitting-diodes 45 accordingly.
  • Figure 8 shows the corresponding circuitry of the receiver unit local to the computer and fed by a photo-sensitive diode 47.
  • a joystick in this embodiment does simplify the selection of (particularly textual) items from a rectangular array, removing one stage of selection. It may well, however, be beyond the capabilities of certain handicapped operators.
  • the system incorporates the 'visual-only' mode as a back-up providing communication and control for the rare cases where the eyes are the only available movement that the patient can control.
  • a minimum movement, minimum pressure sensor as described above is used to control an auto-scan of the display and select the menu item. In this way, the rare extreme cases are catered for, even if the best in communication and control that can be obtained for a patient in that condition is limited.
  • This mode uses both the eye sensor and the pressure sensor in such a way that the scan of the display is carried out by the eye direction-of-look measurement and the activation of selected items from the scan is carried out by the minimum-movement-minimum-pressure sensor.
  • the eyes are given continual rests from concentration when not scanning the display, prior to selection or after selection. It is found that, in most cases, a patient's concentration fades after too short a period if eye control is attempted continuously.
  • rests and alternative control procedures are provided both for eye control and for control by the force/pressure switch, by utilising both at once and also providing autoscan.
  • the main mode is based on this mix of techniques.
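(Finally, a sketch of the mixed 'main mode', again with invented names and an assumed auto-scan step rate: the highlighted item normally follows the eye direction-of-look, an automatic timed scan takes over while the eyes rest, and the minimum-movement pressure switch activates whatever is highlighted.)

```python
import time

class ScanActivate:
    """Sketch of the mixed mode: eye direction (or an automatic timed scan when
    the eyes are resting) moves the highlight; a pressure event activates it."""

    def __init__(self, items, autoscan_period=1.0):
        self.items = list(items)
        self.index = 0
        self.autoscan_period = autoscan_period  # assumed step rate for auto-scan
        self._last_step = time.monotonic()

    def eye_points_at(self, index):
        """An eye-direction measurement moves the highlight directly."""
        self.index = index % len(self.items)

    def autoscan_tick(self):
        """With no eye input, step the highlight automatically at a fixed rate."""
        now = time.monotonic()
        if now - self._last_step >= self.autoscan_period:
            self.index = (self.index + 1) % len(self.items)
            self._last_step = now

    def pressure_event(self):
        """The minimum-movement, minimum-pressure switch activates the highlight."""
        return self.items[self.index]
```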

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Vascular Medicine (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The control system described, which is designed principally for persons whose mobility and limb movement are restricted, comprises a computer-controlled display of menu sequences, each presenting a sequence of utility items, any of which can be selected by a cursor. Cursor control and activation of the selected utility item are effected by the patient by means of a pressure sensor (1), operated by mouth and/or finger, in conjunction with an eye-direction sensor (11), the two sensors sharing the selection and activation processes according to the patient's choices. The sensors (1, 11) fitted to the patient are coupled to the computer by an I.R. link, so as to eliminate any risk of electric shock and also to avoid the inconvenience and hazards of trailing cables.
PCT/GB1990/001226 1989-08-04 1990-08-06 Systeme de commande WO1991001699A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB8917929.5 1989-08-04
GB898917929A GB8917929D0 (en) 1989-08-04 1989-08-04 Control system

Publications (1)

Publication Number Publication Date
WO1991001699A1 true WO1991001699A1 (fr) 1991-02-21

Family

ID=10661217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1990/001226 WO1991001699A1 (fr) 1989-08-04 1990-08-06 Systeme de commande

Country Status (2)

Country Link
GB (2) GB8917929D0 (fr)
WO (1) WO1991001699A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0468340A3 (en) * 1990-07-24 1992-12-16 Biocontrol Systems, Inc. Eye directed controller
WO1993002622A1 (fr) 1991-08-07 1993-02-18 Software Solutions Limited Utilisation de systemes informatiques
ES2116910A1 (es) * 1996-05-03 1998-07-16 Amengual Colom Antonio Procedimiento y dispositivo para la activacion de un interruptor mediante el guiño de un ojo.
CN100374986C (zh) * 2003-06-12 2008-03-12 控制仿生学公司 用于交互式通信和分析的方法、系统与软件

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2727821A1 (fr) * 1994-12-13 1996-06-14 Lucas Sa G Systeme de commande d'une machine munie de plusieurs organes moteurs hydrauliques, attelee a un tracteur agricole
AT412745B (de) * 2001-10-01 2005-06-27 Arc Seibersdorf Res Gmbh Steuereinheit
WO2004034937A1 (fr) * 2002-10-17 2004-04-29 Arthur Prochazka Procede et appareil de commande de dispositif ou de procede a l'aide de vibrations produites par des clics de dents
ES2349540B1 (es) * 2009-03-11 2011-10-27 Carlos Sanz Juez Sistema domotico y procedimiento de control.

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3986030A (en) * 1975-11-03 1976-10-12 Teltscher Erwin S Eye-motion operable keyboard-accessory
CH660956A5 (en) * 1986-03-06 1987-06-30 Marcel Roux Control device for physically handicapped persons
US4688037A (en) * 1980-08-18 1987-08-18 Mcdonnell Douglas Corporation Electromagnetic communications and switching system
EP0288169A2 (fr) * 1987-04-07 1988-10-26 Possum Controls Limited Système de commande
EP0294812A2 (fr) * 1987-06-10 1988-12-14 Research Development Foundation Contrôleur de calibration pour contrôler des machines actionnées électriquement

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4109145A (en) * 1974-05-20 1978-08-22 Honeywell Inc. Apparatus being controlled by movement of the eye
EP0055338A1 (fr) * 1980-12-31 1982-07-07 International Business Machines Corporation Communication au moyen d'un appareil commandé par l'oeil humain
US4746913A (en) * 1984-04-23 1988-05-24 Volta Arthur C Data entry method and apparatus for the disabled
GB2179147A (en) * 1984-12-24 1987-02-25 Univ Adelaide Improvements relating to eye-gaze-direction controlled apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3986030A (en) * 1975-11-03 1976-10-12 Teltscher Erwin S Eye-motion operable keyboard-accessory
US4688037A (en) * 1980-08-18 1987-08-18 Mcdonnell Douglas Corporation Electromagnetic communications and switching system
CH660956A5 (en) * 1986-03-06 1987-06-30 Marcel Roux Control device for physically handicapped persons
EP0288169A2 (fr) * 1987-04-07 1988-10-26 Possum Controls Limited Système de commande
EP0294812A2 (fr) * 1987-06-10 1988-12-14 Research Development Foundation Contrôleur de calibration pour contrôler des machines actionnées électriquement

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0468340A3 (en) * 1990-07-24 1992-12-16 Biocontrol Systems, Inc. Eye directed controller
WO1993002622A1 (fr) 1991-08-07 1993-02-18 Software Solutions Limited Utilisation de systemes informatiques
ES2116910A1 (es) * 1996-05-03 1998-07-16 Amengual Colom Antonio Procedimiento y dispositivo para la activacion de un interruptor mediante el guiño de un ojo.
CN100374986C (zh) * 2003-06-12 2008-03-12 控制仿生学公司 用于交互式通信和分析的方法、系统与软件

Also Published As

Publication number Publication date
GB9017213D0 (en) 1990-09-19
GB8917929D0 (en) 1989-09-20
GB2236874A (en) 1991-04-17

Similar Documents

Publication Publication Date Title
US4746913A (en) Data entry method and apparatus for the disabled
CA2319525C (fr) Appareil lingual pour emissions tactiles
EP0734704B1 (fr) Système de communication interoral
JP2865875B2 (ja) 盲人および高度の視覚障害者のための像認識装置
EP1652504A3 (fr) Dispositif de communication et de contrôle des fonctions d'un lit
US6712613B2 (en) Display device suited for a blind person
US20050134559A1 (en) Touch pad sensor for motor vehicle
CA2159854A1 (fr) Circuit d'interface auto-alimente pour capteur a transducteur
WO1993014386A1 (fr) Capteur a effleurement du type a repartition
WO1991001699A1 (fr) Systeme de commande
WO2001096969A8 (fr) Procede permettant de simplifier le fonctionnement d'une machine
CA2126142A1 (fr) Appareil de communication visuel
WO1998040698A3 (fr) Instrument de mesure portatif a ecran tactile
US4390861A (en) Cockpit display system to reduce vertigo
US4758829A (en) Apparatus for stimulating a keyboard
JPH06197932A (ja) 人間または動物の医療処理装置用制御装置
SE9603315D0 (sv) Medicinsk utrustning
WO1998014860A1 (fr) Systeme pour communiquer des sensations
Bliss Kinesthetic-tactile communications
Grattan et al. Communication by eye closure-a microcomputer-based system for the disabled
EP0371284A3 (fr) Méthode et appareil pour réaliser une fonction d'aide à l'utilisateur
FR2452135A1 (fr) Systeme d'identification d'attitude
JPH06149473A (ja) 表示画像操作用ユーザインターフェース
Perron Typewriter control for an aphasic quadriplegic patient
JPH09198222A (ja) ワープロ装置

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB IT LU NL SE
