
WO1991001699A1 - Control system - Google Patents

Control system

Info

Publication number
WO1991001699A1
Authority
WO
WIPO (PCT)
Prior art keywords
menu
control
items
control system
pressure sensor
Prior art date
Application number
PCT/GB1990/001226
Other languages
French (fr)
Inventor
Roger Edwin Wilson
Original Assignee
Gec-Marconi Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gec-Marconi Limited filed Critical Gec-Marconi Limited
Publication of WO1991001699A1 publication Critical patent/WO1991001699A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F - FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00 - Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Vascular Medicine (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A control system primarily for patients having limited mobility and limb movement, and comprising a computer controlled display of menu sequences each showing a sequence of utility items and selectable by a cursor. Control of the cursor and activation of the utility item so selected are effected by the patient by means of a pressure sensor (1), either mouth or finger operated or both, in conjunction with an eye direction sensor (11), the two sharing the selection and activation processes at the will of the patient. The patient's sensors (1, 11) are linked to the computer by an I.R. link so that there is no fear of electrocution and also to avoid the inconvenience and danger of trailing cables.

Description

Control System
This invention relates to a control system providing an interface between a human being and a range of machine and electronic devices, that is, a form of cybernetic system, and particularly to such a system for use by a human operator whose activities are restricted either by personal incapacity or by the demands of the environment in which he is operating. Typical examples are paraplegic or quadriplegic people and persons having restricted mobility due to limited confinement who require the use of all or most of their limbs (including head-movement) for other purposes, eg operators of bathyspheres, mini-submarines, space capsules, aircraft cockpits and simulators related to these systems.
Previous arrangements for use in circumstances of the above kind have included electrode devices for attachment to the operator's head to sense charge variations on the eyeball. The resulting electrical signals may then be coded, by their duration, rate of change etc, to indicate requirements. While such arrangements may be satisfactory in specific circumstances they are of limited application. Dependence on the eyes alone for this control function can become irksome, especially when the operator is an invalid. A further disadvantage of known systems of this kind is the need for trailing connections between the operator's 'head set' and the computer. Such connections can constitute an actual nuisance and, perhaps more important, a psychological stress, in that there is a direct connection between the operator's head and a computer system which is, or appears to be, fed from 'the mains'.
An object of the present invention is therefore to provide a control system of the above type which significantly improves the operator's control capability while still permitting operation according to selected individual modes.
According to the present invention, a control system for use by a human operator comprises computer means adapted to control a range of utility items, display means controlled by the computer means and adapted to display at least one menu of the utility items, pressure sensor means adapted to be operated by the operator, and eye-direction sensor means adapted to be fitted to an operator's head, the pressure sensor means and the eye-direction sensor means being coupled to the computer means by an electro-magnetic link to control selection and/or activation of the utility items.
The electromagnetic link is preferably an infra-red link.
The display means may be adapted to display two or more stages of menu, a menu of the second or a subsequent stage constituting a utility item of a menu of the preceding stage.
The pressure sensor may be an air pressure sensor responsive to pressures above and below atmospheric produced by blowing and sucking. A cursor may be movable over displayed menu items by the sucking or the blowing process and activation of a selected menu item effected by the complementary process.
The eye-direction sensor means and the pressure sensor means may control, one, the selection of menu items, and the other, the activation of selected menu items.
The computer means may be adapted to scan menu items automatically until an item is selected by operation of the pressure sensor means. The control system may be adapted to operate in an eye control mode, the eye-direction sensor means controlling the selection of menu items, and the system including means responsive to the dwell time of the eye direction to activate any menu item subjected to the same eye direction for a predetermined period.
The control system may further include means responsive to the closure of one but not two eyes to switch between operational modes, and particularly from a standby to an active mode.
A control system for use by a human operator whose mobility is restricted by environment or personal incapacity will now be described, by way of example, with reference to the accompanying drawings, of which:
Figure 1 is a block diagram of a preferred arrangement providing a choice between, or combination of eye direction sensor and pressure sensor;
Figure 2 is a diagram of an abbreviated version of the arrangement of Figure 1 showing a pressure sensor and pressure transducer in some detail;
Figure 3 is a control mode menu to be displayed to the operator;
Figure 4 is a main menu following automatically from Figure 3;
Figure 5(a) is a particular sub-menu of a typewriter mode;
Figure 5(b) is a sub-menu expansion of part of the menu of Figure 5(a);
Figure 6 is a sub-menu resulting from selection of radio equipment; and Figures 7 and 8 show, respectively, transmitting and receiving units of an I.R. link.
Referring to Figure 1, the system comprises an air pressure sensor unit 1, shown in detail in Figure 2, coupled to a computer 3 by way of an infra-red link 5. The computer controls a display 7 and also a number of utility devices 9 which may include radio, TV, recorder, fan, lights, telephone, motor/servo units, robots, wheelchairs and many more.
The computer is programmed to associate the utility devices 9 with items on a menu displayed by the display system 7. A cursor on the display is moved to select a particular item which is associated either with further displays in sub-menus or with a particular utility device 9 directly. An activation signal may be provided by the pressure sensor unit 1 to the computer 3 to activate the selected item, ie the selected utility device.
An eye-direction sensor system 11 is incorporated to give additional control and flexibility to the overall system. This eye-sensor system comprises a pair of electrodes 13 which are fitted to the operator's head in close contact with his temples. The charge distribution around a person's eyes is affected by the 'direction-of-look'. Suitably placed electrodes (13) can therefore be employed to sense this charge disturbance and hence the direction-of-look. The potential difference arising can be used to control the cursor of the display system 7 by way of the computer 3 to perform the selection and, as will be seen, the activation, of menu items.
In an extreme case where the pressure sensor cannot be used, the activation of a selected menu item can be achieved by responding to the dwell time of the eyes on the selected item. A sufficiently long dwell time, in excess of a predetermined threshold, produces an activation signal.
More often, however, the eye sensor would be used for only one of the two functions, selection and activation, and most conveniently that one would be the selection function. An advantage of the system is however the flexibility provided by the ability to choose either eye sensor or pressure sensor, or a combination, for selection and activation.
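By way of illustration, the following Python sketch shows how the dwell-time rule just described might be implemented; the class name, the use of a monotonic clock and the default three-second threshold are assumptions for illustration rather than details taken from the patent.

    import time

    class DwellActivator:
        """Emit an activation signal once the gaze has rested on one item long enough."""

        def __init__(self, threshold_s=3.0):
            self.threshold_s = threshold_s      # predetermined dwell threshold
            self.current_item = None
            self.dwell_start = None

        def update(self, looked_at_item, now=None):
            """Feed the item currently looked at; return it once the dwell time is exceeded."""
            now = time.monotonic() if now is None else now
            if looked_at_item != self.current_item:
                # The direction-of-look has moved to a new item: restart the timer.
                self.current_item = looked_at_item
                self.dwell_start = now
                return None
            if looked_at_item is not None and now - self.dwell_start >= self.threshold_s:
                self.dwell_start = now          # avoid repeated activations of the same item
                return looked_at_item           # activation signal
            return None

    # The operator holds their gaze on 'TYP' for just over three seconds.
    activator = DwellActivator(threshold_s=3.0)
    activator.update('TYP', now=0.0)
    print(activator.update('TYP', now=3.1))     # -> 'TYP'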
Referring now to Figure 2, the pressure sensor unit 1 of Figure 1 comprises essentially a suck/blow mouthpiece 15 and a pressure transducer 17 which converts the pressure signals to electrical signals. A manually operable valve 19 enables the mouthpiece to be isolated from the transducer 17.
A further pressure sensor 21, which may be one of a series connected to a common closed tube 23, permits a pressure signal to be sent to the pressure transducer by means of, say, finger contact on an elastic surface. Operation of such sensors does of course require the suck/blow mouthpiece 15 to be isolated by the valve 19. The suck/blow mouthpiece and the tactile sensor 21 are shown associated with parallel branch tubes. They could, of course, be fitted to the same tube.
The pressure transducer 17 responds to both positive and negative pressure, that is, pressure above and below atmospheric. Positive pressure is sensed by a normally open contact 25 coupled to a diaphragm, not shown, and negative pressure is sensed by a normally open contact 27 coupled to a diaphragm also not shown.
The two contacts provide three output lines: a common line 29 connected to both 'wiper' contacts; and two lines 31 and 33 connected to the respective other contacts. These three lines are connected to control an infra red transmitter 36 linked to a remote infra red receiver 37 as in Figure 1, to control the selection and activation of the displayed menu items by the computer. As shown, selection is achieved by the positive pressure switch 25, and activation by the negative pressure switch 27, ie 'blow' for selection and 'suck' for activation. At the wish of the operator, this arrangement can be reversed by a reversing switch 35 which interchanges the signals on the lines 31 and 33.
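The mapping from the two pressure contacts to the selection and activation signals, including the effect of the reversing switch 35, can be summarised in a short Python sketch; the function name and the boolean interface are illustrative assumptions, standing in for the hardware lines 29, 31 and 33.

    def decode_pressure(blow_contact_closed, suck_contact_closed, reversed_polarity=False):
        """Map the positive-pressure ('blow') and negative-pressure ('suck') contacts
        to 'select' or 'activate'; reversed_polarity mimics the reversing switch
        that interchanges the two signal lines."""
        if blow_contact_closed and suck_contact_closed:
            return None                          # contradictory reading, ignore
        if blow_contact_closed:
            action = 'select'                    # default: 'blow' for selection
        elif suck_contact_closed:
            action = 'activate'                  # default: 'suck' for activation
        else:
            return None                          # no pressure event
        if reversed_polarity:
            action = 'activate' if action == 'select' else 'select'
        return action

    print(decode_pressure(True, False))                          # -> 'select'
    print(decode_pressure(False, True, reversed_polarity=True))  # -> 'select'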
The use of an I.R. link has the advantage of making the sensor systems self-contained and local to the operator. There is no wire connection to any equipment which is fed from 'the mains'. The I.R. receiver 37, the computer 3, display 7 and interface with the utility devices 9 are all relatively remote from the operator and he can feel secure from any high voltages that they might exhibit.
There is also the advantage that no trailing wires, cables etc, extend from the operator to the control equipment to constitute an accident hazard. As described above, the selection of a menu item was achieved by steering the cursor horizontally either by picking off a voltage from the eye direction sensor or by control via the positive pressure (blow) signal. An alternative method that is available in accordance with the invention is for the computer to control an automatic scan. Since no vertical control is required (manually) the scan can be two dimensional and cover a matrix or array of menu items. The operator then merely has to activate the system when the required item is 'lit upon'. The scan may be of the same form as the conventional TV scan but with only 6 to 10 (say) lines per frame. The scan is then cyclic and may be operable at two speeds.
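A minimal sketch of such an automatic scan is given below, assuming a small linear menu and two illustrative scan periods; the actual scan geometry, rates and menu contents are not specified to this level in the text.

    import itertools

    class AutoScanner:
        """Cyclically highlight menu items until the operator activates the one 'lit upon'."""

        SCAN_PERIODS = {'slow': 1.5, 'fast': 0.5}   # seconds per item (assumed values)

        def __init__(self, menu_items, speed='slow'):
            self.menu_items = list(menu_items)
            self.period_s = self.SCAN_PERIODS[speed]
            self._cycle = itertools.cycle(range(len(self.menu_items)))
            self.highlighted = next(self._cycle)

        def tick(self):
            """Advance the highlight to the next item; called once per scan period."""
            self.highlighted = next(self._cycle)
            return self.menu_items[self.highlighted]

        def activate(self):
            """Return the currently highlighted item when the pressure sensor fires."""
            return self.menu_items[self.highlighted]

    scanner = AutoScanner(['TYP', 'BYE', 'DEV'], speed='fast')
    scanner.tick()                # highlight moves on to 'BYE'
    print(scanner.activate())     # -> 'BYE'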
Figure 3 shows a suitable control mode menu for setting up the mode of operation according to the operator's capabilities or preference. The menu comprises the following control modes:
(1) Eyescan.
This requires the eye-direction sensor, which is calibrated to the individual operator, and a mechanism such as the pressure sensor described above for activation of the menu item selected by the eye-sensor. Other mechanisms, such as a sound activation unit or a joystick, may be incorporated. The essential operation is, however, selection by eye direction and activation by the other mechanism.
(2) Autoscan fast and slow.
This has been described above, the menu items being scanned automatically and selection and activation effected by operation of the pressure mechanism at the appropriate time. There are two scan rates available.
(3) 'Vacuscan'.
Here the selection and activation is performed manually, both by the same mechanism, eg the blow/suck pressure device may be used to select by blowing and to activate by sucking. In the selection process each 'blow' causes the cursor to move along the one-dimensional array of menu items in predetermined steps.
(4) 'Eye pause'.
This mode involves only the eye-direction sensor. Selection is performed as in (1) above but activation is also achieved by means of the eye-direction sensor. Thus, on selecting a particular menu item by looking directly at the item, the operator then holds the position for a predetermined period, eg three seconds. A timer is then triggered and the selected item activated.
The eye-direction sensor means is sensitive both to the 'direction-of-look' and to the open/closed state of the eye, the detected voltage levels corresponding distinctively to these conditions. A control signal can thus be provided by blinking the eyes. However, to avoid ambiguity arising from involuntary blinking, an exclusive-OR function is included to detect only single-eye blinking, ie a 'wink'. A minimum wink duration is also imposed to ensure that the movement is deliberate.
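The single-eye test and the minimum-duration check described above might be combined as in the following sketch; the half-second minimum duration and the boolean per-eye inputs are assumptions for illustration.

    class WinkDetector:
        """Report a deliberate single-eye 'wink', rejecting two-eye blinks and brief twitches."""

        def __init__(self, min_duration_s=0.5):
            self.min_duration_s = min_duration_s
            self.wink_start = None

        def update(self, left_closed, right_closed, now):
            one_eye_closed = left_closed != right_closed    # exclusive-OR: exactly one eye shut
            if one_eye_closed and self.wink_start is None:
                self.wink_start = now                       # a wink candidate begins
                return False
            if not one_eye_closed:
                held_long_enough = (self.wink_start is not None
                                    and now - self.wink_start >= self.min_duration_s)
                self.wink_start = None
                return held_long_enough                     # True only for a deliberate wink
            return False

    detector = WinkDetector()
    detector.update(True, False, now=0.0)          # left eye closes
    print(detector.update(False, False, now=0.6))  # reopens after 0.6 s -> True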
The control mode may be arranged to default to a particular one of the modes shown in Figure 3, either to suit the particular operator or, in general, the least demanding mode, ie the Eye-pause mode. The operator, initially presented with the Figure 3 menu, may then keep the eye-pause mode or, using that mode, select one of the others.
Each of the control modes when selected and activated is followed by a menu such as that of Figure 4. The three menu items shown are TYP for 'typewriter mode'; BYE for 'standby mode'; and DEV for 'device mode'.
Activation of the typewriter mode causes a menu such as shown in Figure 5(a) to be displayed, this being a text-writing menu.
The items consist of the alphabet, various editing items, and an outlet to the next menu, 'page 2'. The 30 items are arranged in 6 columns and the cursor 39 can be controlled to move horizontally by blowing on the pressure sensor mouthpiece. A particular column is selected by positioning the cursor and, assuming operation in the eye-scan mode, this selection is activated by sucking on the mouthpiece. The selected column is then displayed in a horizontal row in the menu. A further cursor selection and activation is performed, so selecting a particular item from the row. A sequence of letters can thus be called up to form a piece of text which occupies the "text workspace" area. This may then be printed out in hard copy by selection and activation of the "print" item in column 6 of the original menu.
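The two-stage column-then-item selection can be sketched as follows; the particular 30-item layout, the column contents and the method names are assumptions used only to illustrate the mechanism of Figure 5(a).

    # Assumed 6-column layout of 30 items (alphabet, a few editing items and
    # the 'page 2' outlet); the real Figure 5(a) layout may differ.
    COLUMNS = [
        list('ABCDE'), list('FGHIJ'), list('KLMNO'),
        list('PQRST'), list('UVWXY'), ['Z', 'DEL', 'SPACE', 'PRINT', 'PAGE 2'],
    ]

    class TypewriterMenu:
        """Two-stage selection: blow to step the cursor, suck to activate."""

        def __init__(self, columns=COLUMNS):
            self.columns = columns
            self.cursor = 0               # index into columns, then into the chosen column
            self.selected_column = None

        def blow(self):
            """Step the cursor along the current one-dimensional array."""
            array = self.columns if self.selected_column is None else self.selected_column
            self.cursor = (self.cursor + 1) % len(array)

        def suck(self):
            """Activate: first commit a column, then return the chosen item from it."""
            if self.selected_column is None:
                self.selected_column = self.columns[self.cursor]
                self.cursor = 0
                return None
            item = self.selected_column[self.cursor]
            self.selected_column, self.cursor = None, 0
            return item

    menu = TypewriterMenu()
    menu.blow(); menu.suck()      # select the second column (F..J)
    menu.blow(); menu.blow()
    print(menu.suck())            # -> 'H'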
Figure 5(b) shows the sub menu for a selection of the sixth column of Figure 5(a) and shows the cursor positioned under "page 2". This selection is activated as before by sucking on the mouthpiece to give the menu denoted PA2 (page 2) which may be related to the text writing function or may be an exit to a previous or other menu.
The operator may require a rest in the middle of some (particularly typewriting) operation. The BYE (standby) mode of Figure 4 provides this function by effectively storing the current state of an operation until such time as the operator wishes to come back to it. When in the standby mode and also in the eye-pause control mode, exit from the standby mode is effected by a wink, which returns the operation to the state immediately prior to the rest period.
The third operational mode available from the menu of Figure 4 is the device control mode (DEV). Selection of this mode produces a menu comprising a list of devices that can be remotely controlled by the computer, eg radio, telephone, door locks etc. These items are selected and activated as required. One further menu available as an adjunct to, say, the radio facility, is shown in Figure 6. This comprises a linear scale of frequency which can be selected for tuning a radio in either an analogue or digital manner. The selected frequency is displayed.
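As an example of how the linear frequency scale of Figure 6 could drive an analogue-style tuning command, the sketch below maps a cursor position to a frequency; the FM band limits and the 100-step scale are assumptions, not values given in the patent.

    def cursor_to_frequency(cursor_pos, scale_length=100,
                            band_low_mhz=88.0, band_high_mhz=108.0):
        """Map a cursor position on the linear scale to a tuning frequency in MHz."""
        fraction = cursor_pos / (scale_length - 1)
        return round(band_low_mhz + fraction * (band_high_mhz - band_low_mhz), 1)

    print(cursor_to_frequency(0))     # -> 88.0  (left end of the scale)
    print(cursor_to_frequency(99))    # -> 108.0 (right end of the scale)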
Each menu available has an exit, which is either automatic, on a time-out, or is selectable as one menu item to enable the operator to revert to the main menu, to proceed to an associated menu or perhaps to close down.
Figures 7 and 8 show respectively the transmitting and receiving units of the infra red link. The transmitter, local to the operator, is coupled in this instance to a joystick device 41 which produces left/right and up/down output signals according to the direction of hand pressure applied. The resulting signals are encoded by an integrated circuit unit 43 and arranged to pulse a pair of I.R. light-emitting-diodes 45 accordingly.
Figure 8 shows the corresponding circuitry of the receiver unit local to the computer and fed by a photo-sensitive diode 47.
The use of a joystick in this embodiment does simplify the selection of (particularly textual) items from a rectangular array, removing one stage of selection. It may well, however, be beyond the capabilities of certain handicapped operators.
With regard to the application of the described system to hospital patients, whatever the injuries, disease, disabilities, confinement conditions or requirements for the use of limbs, in by far the largest number of cases movement of the direction of look of the eyes is the last ability to be lost. Hence the system utilises the measurement of direction of look of the eyes in its main mode. However, it has been found that the effort of concentration needed if this is the only property used is too great for patients to sustain for usefully long and frequent periods; this is due to the relatively short period of steady visual concentration that will be tolerated by a patient in the condition postulated.
Hence the system incorporates the 'visual-only' mode as a back-up providing communication and control for the rare cases where the eyes are the only available movement that the patient can control. There are also rare cases, eg eye-damaged patients, in which eye movement is not adequately available but some other movement is. A minimum movement, minimum pressure sensor as described above is used to control an auto-scan of the display and select the menu item. In this way, the rare extreme cases are catered for, even if the best in communication and control that can be obtained for a patient in that condition is limited. The great majority of cases where eye control is in use will utilise the main mode above, which enables, without significant patient strain, almost continuous and fully effective operation in both communication and control as often as the patient or medical authority require.
This mode uses both the eye sensor and the pressure sensor in such a way that the scan of the display is carried out by the eye direction-of-look measurement and the activation of selected items from the scan is carried out by the minimum-movement, minimum-pressure sensor. In this way, the eyes are given continual rests from concentration when not scanning the display prior to selection or after selection. It is found that, in most cases, a patient's concentration fades after too short a period if eye control is attempted continuously. Usage that is several orders better can be obtained if rests and alternative control procedures are provided both for eye control and for control by the force/pressure switch, by utilising both at once and also by providing autoscan. The main mode is based on this mix of techniques.
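Putting the pieces together, the main mixed mode might look like the following sketch, in which the eye direction-of-look steers the highlight and the minimum-movement, minimum-pressure switch activates it; the direct mapping from direction-of-look to a menu index is a simplifying assumption.

    class MainMode:
        """Main mixed mode: eye direction selects, the pressure switch activates."""

        def __init__(self, menu_items):
            self.menu_items = list(menu_items)
            self.highlighted = 0

        def on_eye_direction(self, item_index):
            """The direction-of-look measurement moves the highlight (selection only)."""
            if 0 <= item_index < len(self.menu_items):
                self.highlighted = item_index

        def on_pressure_switch(self):
            """The minimum-movement, minimum-pressure switch activates the highlighted item."""
            return self.menu_items[self.highlighted]

    mode = MainMode(['RADIO', 'TELEPHONE', 'DOOR LOCKS'])
    mode.on_eye_direction(1)
    print(mode.on_pressure_switch())   # -> 'TELEPHONE'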
In all modes of the system, progressive menu driving, involving main menu, sub-menus and sub-sub-menus etc, each with a clear 'repeat to previous level' indication and simple clarity throughout, is an inherent requirement and provision in the preferred system. If not provided, the complications to both patient and medical authority are found to militate against actual use of the equipment. Wide spacing is used between display items. These combinations allow elimination of the need to monitor head movement in this proposed system and allow the use of smaller, standard sensor designs, eg EC6 sensors, which the patient will tolerate much more readily. With large-sensor designs, it has been found that the patient does not tolerate them at all.

Claims

1. A control system for use by a human operator and comprising computer means adapted to control a range of utility items, display means controlled by said computer means and adapted to display at least one menu of said utility items, pressure sensor means adapted to be operated by the operator, and eye-direction sensor means adapted to be fitted to an operator's head, said pressure sensor means and said eye-direction sensor means being coupled to said computer means by an electro-magnetic link to control selection and/or activation of the utility items.
2. A control system according to Claim 1, wherein said electromagnetic link is an infra-red link.
3. A control system according to Claim 1 or Claim 2, wherein said display means is adapted to display two or more stages of menu, a menu of the second or a subsequent stage constituting a said utility item of a menu of the preceding stage.
4. A control system according to any preceding claim, wherein said pressure sensor is an air pressure sensor responsive to pressures above and below atmospheric produced by blowing and sucking.
5. A control system according to Claim 4, wherein a cursor is movable over displayed menu items by the sucking or the blowing process and activation of a selected menu item is effected by the complementary process.
6. A control system according to any of Claims 1 to 4, wherein said eye-direction sensor means and said pressure sensor means control, one, the selection of menu items, and the other, the activation of selected menu items.
7. A control system according to any of Claims 1 to 4, wherein said computer means is adapted to scan menu items automatically until an item is selected by operation of said pressure sensor means.
PCT/GB1990/001226 1989-08-04 1990-08-06 Control system WO1991001699A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB8917929.5 1989-08-04
GB898917929A GB8917929D0 (en) 1989-08-04 1989-08-04 Control system

Publications (1)

Publication Number Publication Date
WO1991001699A1 true WO1991001699A1 (en) 1991-02-21

Family

ID=10661217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1990/001226 WO1991001699A1 (en) 1989-08-04 1990-08-06 Control system

Country Status (2)

Country Link
GB (2) GB8917929D0 (en)
WO (1) WO1991001699A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0468340A3 (en) * 1990-07-24 1992-12-16 Biocontrol Systems, Inc. Eye directed controller
WO1993002622A1 (en) 1991-08-07 1993-02-18 Software Solutions Limited Operation of computer systems
ES2116910A1 (en) * 1996-05-03 1998-07-16 Amengual Colom Antonio Procedure and device for activating a switch by blinking an eye.
CN100374986C (en) * 2003-06-12 2008-03-12 控制仿生学公司 Method, system, and software for interactive communication and analysis

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2727821A1 (en) * 1994-12-13 1996-06-14 Lucas Sa G CONTROL SYSTEM OF A MACHINE EQUIPPED WITH SEVERAL HYDRAULIC MOTORS, ATTACHED TO AN AGRICULTURAL TRACTOR
AT412745B (en) * 2001-10-01 2005-06-27 Arc Seibersdorf Res Gmbh CONTROL UNIT
WO2004034937A1 (en) * 2002-10-17 2004-04-29 Arthur Prochazka Method and apparatus for controlling a device or process with vibrations generated by tooth clicks
ES2349540B1 (en) * 2009-03-11 2011-10-27 Carlos Sanz Juez DOMOTIC SYSTEM AND CONTROL PROCEDURE.

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3986030A (en) * 1975-11-03 1976-10-12 Teltscher Erwin S Eye-motion operable keyboard-accessory
CH660956A5 (en) * 1986-03-06 1987-06-30 Marcel Roux Control device for physically handicapped persons
US4688037A (en) * 1980-08-18 1987-08-18 Mcdonnell Douglas Corporation Electromagnetic communications and switching system
EP0288169A2 (en) * 1987-04-07 1988-10-26 Possum Controls Limited Control system
EP0294812A2 (en) * 1987-06-10 1988-12-14 Research Development Foundation Calibration controller for controlling electrically operated machines

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4109145A (en) * 1974-05-20 1978-08-22 Honeywell Inc. Apparatus being controlled by movement of the eye
EP0055338A1 (en) * 1980-12-31 1982-07-07 International Business Machines Corporation Eye controlled user-machine communication
US4746913A (en) * 1984-04-23 1988-05-24 Volta Arthur C Data entry method and apparatus for the disabled
GB2179147A (en) * 1984-12-24 1987-02-25 Univ Adelaide Improvements relating to eye-gaze-direction controlled apparatus

Also Published As

Publication number Publication date
GB9017213D0 (en) 1990-09-19
GB8917929D0 (en) 1989-09-20
GB2236874A (en) 1991-04-17

Similar Documents

Publication Publication Date Title
US4746913A (en) Data entry method and apparatus for the disabled
CA2319525C (en) Tongue placed tactile output device
EP0734704B1 (en) Intraoral communication system
JP2865875B2 (en) Image recognition device for the blind and highly visually impaired
EP1652504A3 (en) Communication and bed function control apparatus
US6712613B2 (en) Display device suited for a blind person
US20050134559A1 (en) Touch pad sensor for motor vehicle
CA2159854A1 (en) Self-Powered Interface Circuit for Use with a Transducer Sensor
WO1993014386A1 (en) Distribution-type touch sensor
WO1991001699A1 (en) Control system
WO2001096969A8 (en) Interface for machine operation
CA2126142A1 (en) Visual communications apparatus
WO1998040698A3 (en) Hand held measurement instrument with touch screen display
US4390861A (en) Cockpit display system to reduce vertigo
US4758829A (en) Apparatus for stimulating a keyboard
JPH06197932A (en) Controlling apparatus for medical treatment device for human or animal
SE9603315D0 (en) Medical equipment
WO1998014860A1 (en) System for communication of feelings
Bliss Kinesthetic-tactile communications
Grattan et al. Communication by eye closure-a microcomputer-based system for the disabled
EP0371284A3 (en) Method and apparatus for providing a help function
FR2452135A1 (en) Direction sensor for use by handicapped person - consists of head attachment moved by body to detect patient to generate panel location signal without using electrical switch
JPH06149473A (en) User interface for operating display picture
Perron Typewriter control for an aphasic quadriplegic patient
JPH09198222A (en) Word processor device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB IT LU NL SE
