US20110010112A1 - Method and System for Controlling a User Interface of a Device Using Human Breath - Google Patents
- Publication number
- US20110010112A1 (Application US12/813,292)
- Authority
- US
- United States
- Prior art keywords
- user interface
- control signals
- human breath
- expulsion
- enabled
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H3/00—Instruments in which the tones are generated by electromechanical means
- G10H3/12—Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument
- G10H3/14—Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means
- G10H3/16—Instruments in which the tones are generated by electromechanical means using mechanical resonant generators, e.g. strings or percussive instruments, the tones of which are picked up by electromechanical transducers, the electrical signals being further manipulated or amplified and subsequently converted to sound by a loudspeaker or equivalent instrument using mechanically actuated vibrators with pick-up means using a reed
Definitions
- Certain embodiments of the invention relate to controlling a computer or electronic system. More specifically, certain embodiments of the invention relate to a method and system for controlling a user interface of a device using human breath.
- Mobile communications have changed the way people communicate, and mobile phones have been transformed from a luxury item into an essential part of everyday life.
- The use of mobile phones is today dictated by social situations rather than hampered by location or technology.
- Most mobile devices are equipped with a user interface that allows users to access the services provided via the Internet.
- Some mobile devices may have browsers, and software and/or hardware buttons may be provided to enable navigation and/or control of the user interface.
- Some mobile devices, such as smartphones, are equipped with touch screen capability that allows users to navigate or control the user interface by touching the screen with one hand while the device is held in the other.
- A system and/or method is provided for controlling a user interface of a device using human breath, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- FIG. 1A is a block diagram of an exemplary system for controlling a user interface of a plurality of devices using human breath, in accordance with an embodiment of the invention.
- FIG. 1B is a block diagram of an exemplary sensing module to detect human breath, in accordance with an embodiment of the invention.
- FIG. 1C is a block diagram of another embodiment of an exemplary system for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- FIG. 1D is a block diagram of an exemplary processor interacting with a device being controlled, in accordance with an embodiment of the invention.
- FIG. 1E is a block diagram of an exemplary system for side loading of information between two or more devices, in accordance with an embodiment of the invention.
- FIG. 2A is a diagram illustrating an exemplary MEMS sensing and processing module located on a stylus, in accordance with an embodiment of the invention.
- FIG. 2B is a diagram illustrating an exemplary MEMS sensing and processing module located on a headset for military personnel, in accordance with an embodiment of the invention.
- FIG. 2C is a diagram illustrating an exemplary MEMS sensing and processing module located on a headrest of a seating apparatus, in accordance with an embodiment of the invention.
- FIG. 2D is a diagram illustrating an exemplary MEMS sensing and processing module located inside an automobile, in accordance with an embodiment of the invention.
- FIG. 2E is a diagram illustrating an exemplary MEMS sensing and processing module located on detachable eyewear, in accordance with an embodiment of the invention.
- FIG. 2F is a diagram illustrating an exemplary MEMS sensing and processing module located on a neckset, in accordance with an embodiment of the invention.
- FIG. 2G is a diagram illustrating an exemplary MEMS sensing and processing module located on a stand alone device, in accordance with an embodiment of the invention.
- FIG. 2H is a diagram illustrating an exemplary MEMS sensing and processing module located on a clip, in accordance with an embodiment of the invention.
- FIG. 3A is a flow chart illustrating exemplary steps for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- FIG. 3B is a flow chart illustrating exemplary steps for side loading of information, in accordance with an embodiment of the invention.
- Certain aspects of the invention may be found in a method and system for controlling a user interface of a device using human breath.
- Exemplary aspects of the invention may comprise detecting movement caused by expulsion of human breath by a user.
- In response, one or more control signals may be generated.
- The generated control signals may be utilized to control the user interface of a device and may enable navigation and/or selection of components in the user interface.
- The generated one or more control signals may be communicated to the device being controlled via a wired and/or a wireless signal.
- The expulsion of the human breath may occur in open space, and the detection of the movement caused by the expulsion may occur without the use of a channel.
- The detection of the movement and/or the generation of the control signals may be performed by a MEMS.
- One exemplary embodiment of a user interface is a graphical user interface (GUI).
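To make the flow above concrete, the following is a minimal illustrative sketch in Python, not part of the patent; the `BreathEvent` type, the threshold values, and the control-signal names are all assumptions introduced for illustration.

```python
# Hypothetical sketch: translating a detected breath expulsion into UI control
# signals. All names and thresholds are illustrative, not from the patent.
from dataclasses import dataclass
from enum import Enum, auto

class Control(Enum):
    SCROLL_DOWN = auto()  # navigate within the user interface
    SELECT = auto()       # select a component in the user interface

@dataclass
class BreathEvent:
    velocity: float    # sensed air-flow velocity (arbitrary units)
    duration_ms: int   # how long the expulsion lasted

def to_control_signals(event: BreathEvent) -> list[Control]:
    """Map one sensed breath event to zero or more control signals."""
    if event.duration_ms > 500:     # assume a long, sustained puff selects
        return [Control.SELECT]
    if event.velocity > 1.0:        # assume a short puff scrolls
        return [Control.SCROLL_DOWN]
    return []                       # below threshold: treat as noise

print(to_control_signals(BreathEvent(velocity=2.5, duration_ms=120)))
```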
- FIG. 1A is a block diagram of an exemplary system for controlling a user interface of a plurality of devices using human breath, in accordance with an embodiment of the invention.
- Referring to FIG. 1A, there is shown a user 102, a micro-electro-mechanical system (MEMS) sensing and processing module 104, and a plurality of devices to be controlled, such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a personal computer (PC), laptop or a notebook computer 106 c, a display device 106 d and/or a television (TV)/game console/other platform 106 e.
- the multimedia device 106 a may comprise a user interface 107 a
- the cellphone/smartphone/dataphone 106 b may comprise a user interface 107 b
- the personal computer (PC), laptop or a notebook computer 106 c may comprise a user interface 107 c
- the display device 106 d may comprise a user interface 107 d
- the television (TV)/game console/other platform 106 e may comprise a user interface 107 e .
- Each of the plurality of devices to be controlled may be wired or wirelessly connected to a plurality of other devices 108 for loading of information via, for example, side loading, or loading via a peer-to-peer connection, and/or a network connection, and by wired and/or wireless communication.
- Exemplary other devices 108 may comprise game consoles, immersive or 3D reality devices, and/or telematic devices.
- Telematic devices refer to devices comprising integrated computing, wireless communication and/or global navigation satellite system capabilities, which enable sending, receiving and/or storing of information over networks.
- the user interface may enable interacting with the device being controlled by one or more inputs, for example, expulsion of a fluid such as air, tactual inputs such as button presses, audio actions such as voice commands, and/or movements of the electronic device 202 such as those detected by an accelerometer and/or gyroscope.
- the MEMS sensing and processing module 104 may comprise suitable logic, circuitry and/or code that may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be enabled to control a user interface of one or more of a plurality of devices, such as the user interface 107 a of the multimedia device 106 a , the user interface 107 b of the cellphone/smartphone/dataphone 106 b , the user interface 107 c of the PC, laptop or a notebook computer 106 c , the user interface 107 d of the display device 106 d , the user interface 107 e of the TV/game console/other platform 106 e , and the user interfaces of the mobile multimedia player and/or a remote controller.
- a user interface is a graphical user interface (GUI).
- the detection of the movement caused by expulsion of human breath may occur without use of a channel.
- the detection of the movement caused by expulsion of human breath may be responsive to the expulsion of human breath into open space, which is then sensed.
- the MEMS sensing and processing module 104 may be enabled to navigate within the user interface of one or more of the plurality of devices, such as a handheld device, for example, a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a PC, laptop or a notebook computer 106 c, a display device 106 d, and/or a TV/game console/other platform 106 e via the generated one or more control signals.
- the MEMS sensing and processing module 104 may be enabled to select one or more components within the user interface of the plurality of devices via the generated one or more control signals.
- the generated one or more control signals may comprise one or more of a wired and/or a wireless signal.
- one or more of the plurality of devices such as a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b and/or a PC, laptop or a notebook computer 106 c may be enabled to receive one or more inputs defining the user interface from another device 108 .
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cell phone/smartphone/dataphone 106 b .
- data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106 b via a service provider such as a cellular or PCS service provider.
- the transferred data that is associated or mapped to media content may be utilized to customize the user interface 107 b of the cellphone/smartphone/dataphone 106 b .
- media content associated with one or more received inputs may become an integral part of the user interface of the device being controlled.
- the associating and/or mapping may be performed on the other device 108 and/or on the cellphone/smartphone/dataphone 106 b. In instances where the associating and/or mapping is performed on the other device 108, the associated and/or mapped data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b.
- an icon transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b may be associated or mapped to media content such as an RSS feed or markup language content such as HTML and/or XML, which may be remotely accessed by the cellphone/smartphone/dataphone 106 b via the service provider of the cellphone/smartphone/dataphone 106 b. Accordingly, when the user 102 blows on the MEMS sensing and processing module 104, control signals generated by the MEMS sensing and processing module 104 may navigate to the icon and select the icon.
- the RSS feed or markup language may be accessed via the service provider of the cellphone/smartphone/dataphone 106 b and corresponding RSS feed or markup language content may be displayed on the user interface 107 b .
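The association of an icon with remotely accessible content can be pictured with a small sketch; the icon identifiers, URL, and dictionary layout below are hypothetical stand-ins, not the patent's data model.

```python
# Hypothetical mapping of transferred icons to remotely accessible media
# content (e.g., an RSS feed); identifiers and the URL are illustrative.
icon_map = {
    "news_icon": {"type": "rss", "source": "https://example.com/feed.xml"},
    "home_icon": {"type": "html", "source": "https://example.com/index.html"},
}

def on_icon_selected(icon_id: str) -> str:
    """Resolve the content mapped to an icon once breath control selects it."""
    entry = icon_map[icon_id]
    # On a real device the content would be fetched via the service provider
    # and rendered on the user interface; here we only report the mapping.
    return f"fetch {entry['type']} content from {entry['source']}"

print(on_icon_selected("news_icon"))
```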
- U.S. application Ser. No. 12/056,187 discloses an exemplary method and system for customizing a user interface of a device and is hereby incorporated herein by reference in its entirety.
- a user 102 may exhale into open space and the exhaled breath or air may be sensed by one or more detection devices or detectors, such as one or more sensors, sensing members and/or sensing segments in the MEMS sensing and processing module 104 .
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102 .
- One or more electrical, optical and/or magnetic signals may be generated by one or more detection devices or detectors within the MEMS sensing and processing module 104 in response to the detection of movement caused by expulsion of human breath.
- the processor firmware within the MEMS sensing and processing module 104 may be enabled to process the received electrical, optical and/or magnetic signals from the one or more detection device(s) or detector(s) utilizing various algorithms and generate one or more control signals to the device being controlled, for example, the multimedia device 106 a .
- the generated one or more control signals may be communicated to the device being controlled, for example, the multimedia device 106 a via a wired and/or a wireless signal.
- the processor in the device being controlled may utilize the communicated control signals to control the user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a , a user interface 107 b of the cellphone/smartphone/dataphone 106 b , a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c , a user interface 107 d of the display device 106 d , a user interface 107 e of the TV/game console/other platform 106 e , and a user interface of a mobile multimedia player and/or a remote controller.
- FIG. 1B is a block diagram of an exemplary detection device or detector to detect human breath, in accordance with an embodiment of the invention.
- the sensing module 110 may comprise a sensor control chip 109 and a plurality of sensors, for example, 111 a , 111 b , 111 c , and 111 d .
- the invention may not be so limited and the sensing module 110 may comprise more or fewer than the number of sensors or sensing members or segments shown in FIG. 1B without limiting the scope of the invention. Accordingly, any number of detectors and sources may be utilized according to the desired size, sensitivity, and resolution.
- the sources and detectors may comprise sensing mechanisms other than visible light.
- piezoelectric, ultrasonic, Hall effect, electrostatic, and/or permanent or electro-magnet sensors may be activated by deflected MEMS members to generate a signal to be communicated to the sensor control chip 109 .
- the sensing module 110 may be an electrochemical sensor or any other type of breath analyzing sensor, for example.
- the plurality of sensors or sensing members or segments 111 a - d may be an integral part of one or more MEMS devices that may enable the detection of various velocities of air flow from the user's 102 breath.
- the plurality of sensors or sensing members or segments 111 a - d may be enabled to detect kinetic energy and/or movement caused by the expulsion of human breath by the user 102 .
- the sensor control chip 109 may be enabled to generate an electrical, optical and/or magnetic signal that may be communicated to the processor in response to the detection of kinetic energy and/or movement caused by expulsion of human breath.
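One plausible reading of FIG. 1B is that the sensor control chip compares the deflection sensed on each member to decide whether, and in which direction, breath was expelled. The sketch below assumes four channels loosely mirroring sensors 111 a-d, a made-up noise floor, and stubbed readings.

```python
# Hypothetical combination of readings from four MEMS sensing members into a
# direction estimate; the spatial layout, threshold, and values are assumed.
THRESHOLD = 0.3  # assumed noise floor below which deflection is ignored

def read_channels() -> dict[str, float]:
    """Stand-in for sampling the deflection of each sensing member."""
    return {"up": 0.05, "down": 0.72, "left": 0.10, "right": 0.08}

def detect_direction(readings: dict[str, float]) -> str | None:
    """Report the dominant deflection, if any channel exceeds the floor."""
    name, value = max(readings.items(), key=lambda kv: kv[1])
    return name if value > THRESHOLD else None

print(detect_direction(read_channels()))  # -> 'down'
```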
- FIG. 1C is a block diagram of another embodiment of an exemplary system for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- Referring to FIG. 1C, there is shown a user 102, a MEMS sensing and processing module 104, and a device being controlled 106, such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a PC, laptop or a notebook computer 106 c, a display device 106 d and/or a TV/game console/other platform 106 e.
- the device being controlled 106 may be wired and/or wirelessly connected to a plurality of other devices 108 for side loading of information.
- the MEMS sensing and processing module 104 may comprise a sensing module 110 , a processing module 112 and passive devices 113 .
- the passive devices 113, which may comprise resistors, capacitors and/or inductors, may be embedded within a substrate material of the MEMS sensing and processing module 104.
- the processing module 112 may comprise, for example, an ASIC.
- the sensing module 110 may generally be referred to as a detection device or detector, and may comprise one or more sensors, sensing members and/or sensing segments that may be enabled to detect kinetic energy and/or movement caused by the expulsion of human breath by the user 102 .
- the sensing module 110 may be enabled to generate an electrical, optical and/or magnetic signal that may be communicated to the processing module 112 in response to the detection of kinetic energy and/or movement caused by expulsion of human breath.
- the processing module 112 may comprise suitable logic, circuitry and/or code that may be enabled to receive the generated electric signal from the sensing module 110 and generate one or more control signals to the device being controlled 106 .
- the processing module 112 may comprise one or more analog to digital converters that may be enabled to translate the sensed signal to one or more digital signals, which may be utilized to generate the one or more control signals.
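The analog-to-digital step can be sketched as a simple quantizer; the reference voltage and bit depth below are assumptions for illustration, not values from the patent.

```python
# Hypothetical ADC step: quantize an analog sense voltage to an n-bit code.
def adc_convert(voltage: float, v_ref: float = 3.3, bits: int = 10) -> int:
    """Convert an analog voltage into an unsigned n-bit digital code."""
    code = int((voltage / v_ref) * (2 ** bits - 1))
    return max(0, min(code, 2 ** bits - 1))  # clamp to the ADC's range

print(adc_convert(1.65))  # mid-scale input -> 511 of 1023
```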
- the generated one or more control signals may be enabled to control a user interface of the device being controlled 106 .
- the device being controlled 106 may comprise a user interface 107 . Accordingly, the generated one or more signals from the MEMS sensing and processing module 104 may be communicated to the device being controlled 106 and utilized to control the user interface 107 . In an exemplary embodiment of the invention, the one or more signals generated by the MEMS sensing and processing module 104 may be operable to control a pointer on the device being controlled 106 such that items in the user interface 107 may be selected and/or manipulated. In an exemplary embodiment of the invention, the device being controlled may be enabled to receive one or more inputs from the other devices 108 , which may be utilized to customize or define the user interface 107 .
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b .
- the other device 108 may be similar to or different from the type of device that is being controlled 106 .
- a processor in the other device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106 .
- a processor in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106 .
- U.S. application Ser. No. 12/056,187 discloses an exemplary method and system for customizing a user interface of a device and is hereby incorporated herein by reference in its entirety.
- FIG. 1D is a block diagram of an exemplary processor interacting with a device being controlled, in accordance with an embodiment of the invention.
- a processing module 112 and a device being controlled 106 such as a multimedia device 106 a , a cellphone/smartphone/dataphone 106 b , a PC, laptop or a notebook computer 106 c , a display device 106 d and/or a TV/game console/other platform 106 e .
- the processing module 112 may be an ASIC and may comprise one or more analog to digital converters (ADCs) 114 , processor firmware 116 , and a communication module 118 .
- the device being controlled 106 may comprise a communication module 120 , a processor 122 , memory 123 , firmware 124 , a display 126 , and a user interface 128 .
- the device being controlled 106 may be wired and/or wirelessly connected to a plurality of other devices 108 for loading of information via, for example, side loading, or loading via a peer-to-peer connection, and/or a network connection, and by wired and/or wireless communication.
- the processing module 112 may comprise suitable logic, circuitry and/or code that may be enabled to receive a digital sensing signal and/or an analog sensing signal from the sensing module 110 .
- the ADC 114 may comprise suitable logic, circuitry and/or code that may be enabled to receive the generated analog sensing signal from the sensing module 110 and convert the received signal into a digital signal.
- the processor firmware 116 may comprise suitable logic, and/or code that may be enabled to receive and process the digital signal from the ADC 114 and/or the digital sensing signal from the sensing module 110 utilizing a plurality of algorithms to generate one or more control signals.
- the processor firmware 116 may be enabled to read, store, calibrate, filter, modelize, calculate and/or compare the outputs of the sensing module 110 .
- the processor firmware 116 may also be enabled to incorporate artificial intelligence (AI) algorithms to adapt to a particular user's 102 breathing pattern.
- the processor firmware 116 may be enabled to generate one or more control signals to the device being controlled 106 based on processing the received digital signals.
- the generated one or more control signals may be enabled to control a user interface of the device being controlled 106 , for example, scrolling, zooming, and/or 3-D navigation within the device being controlled 106 .
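One way such firmware might adapt to a particular user's breathing pattern is to learn a running baseline of the sensed signal and trigger only on clear deviations; the smoothing factor and margin in this sketch are illustrative assumptions, not the patent's algorithm.

```python
# Hypothetical adaptive detector: learn the user's baseline signal level and
# flag samples that rise well above it. Parameters are illustrative.
class AdaptiveDetector:
    def __init__(self, alpha: float = 0.1, margin: float = 0.5):
        self.alpha = alpha      # smoothing factor for the running baseline
        self.margin = margin    # how far above baseline counts as a puff
        self.baseline = 0.0

    def update(self, sample: float) -> bool:
        """Return True when the sample rises well above the learned baseline."""
        triggered = sample > self.baseline + self.margin
        # adapt the baseline toward this user's typical signal level
        self.baseline = (1 - self.alpha) * self.baseline + self.alpha * sample
        return triggered

det = AdaptiveDetector()
print([det.update(s) for s in (0.1, 0.1, 0.2, 1.5, 0.1)])  # puff on 4th sample
```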
- the communication module 118 may comprise suitable logic, circuitry and/or code that may be enabled to receive and communicate the generated one or more control signals to the device being controlled 106 via a wired and/or a wireless signal.
- the communication modules 118 and 120 may support a plurality of interfaces.
- the communication modules 118 and 120 may support an external memory interface, a universal asynchronous receiver transmitter (UART) interface, an enhanced serial peripheral interface (eSPI), a general purpose input/output (GPIO) interface, a pulse-code modulation (PCM) and/or an inter-IC sound (I 2 S) interface, an inter-integrated circuit (I 2 C) bus interface, a universal serial bus (USB) interface, a Bluetooth interface, a ZigBee interface, an IrDA interface, and/or a wireless USB (W-USB) interface.
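A communication module supporting several such interfaces might expose them behind one dispatch point, as in the sketch below; the transport names and send functions are hypothetical stand-ins for real drivers.

```python
# Hypothetical transport dispatch: hand a generated control signal to the
# controlled device over one of several supported interfaces.
from typing import Callable

def send_over_usb(payload: bytes) -> None:
    print(f"USB       -> {payload!r}")

def send_over_bluetooth(payload: bytes) -> None:
    print(f"Bluetooth -> {payload!r}")

TRANSPORTS: dict[str, Callable[[bytes], None]] = {
    "usb": send_over_usb,
    "bluetooth": send_over_bluetooth,
}

def communicate(control_signal: str, transport: str = "bluetooth") -> None:
    """Serialize a control signal and send it over the chosen interface."""
    TRANSPORTS[transport](control_signal.encode())

communicate("SCROLL_DOWN")
```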
- the communication module 120 may be enabled to receive the communicated control signals via a wired and/or a wireless signal.
- the processor 122 may comprise suitable logic, circuitry and/or code that may be enabled to utilize the received one or more control signals to control the user interface 128 and/or the display 126 .
- the memory 123 may comprise suitable logic, circuitry and/or code that may be enabled to store data on the device being controlled 106.
- the firmware 124 may comprise a plurality of drivers and operating system (OS) libraries to convert the received control signals into functional commands.
- the firmware 124 may be enabled to map local functions, and convert received control signals into compatible data, such as user customization features, applets, and/or plugins to control the user interface 128 .
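The driver-level conversion of received control signals into functional commands can be pictured as a lookup table; the command names and handlers here are assumptions, not the patent's firmware API.

```python
# Hypothetical driver mapping: convert received control signals into local
# user-interface commands via a lookup table.
def scroll(amount: int) -> None:
    print(f"scrolling by {amount}")

def zoom(factor: float) -> None:
    print(f"zooming by {factor}x")

COMMAND_MAP = {
    "SCROLL_UP": lambda: scroll(+1),
    "SCROLL_DOWN": lambda: scroll(-1),
    "ZOOM_IN": lambda: zoom(2.0),
}

def handle_control_signal(signal: str) -> None:
    """Dispatch a control signal to its functional command, if mapped."""
    action = COMMAND_MAP.get(signal)
    if action is not None:
        action()

handle_control_signal("ZOOM_IN")
```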
- the device being controlled 106 may be enabled to receive one or more inputs defining the user interface 128 from another device 108 .
- the other device 108 may comprise a user interface 129 and a processor 125 .
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b .
- data may be transferred from the other device 108 to the device being controlled, such as the cellphone/smartphone/dataphone 106 b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106 b via a service provider such as a cellular or PCS service provider.
- the transferred data that is associated or mapped to media content may be utilized to customize the user interface 128 of the device being controlled, such as the cellphone/smartphone/dataphone 106 b .
- media content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled 106 .
- the processor 125 in the other device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106 .
- the processor 122 in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106 .
- FIG. 1E is a block diagram of an exemplary system for side loading of information between two or more devices, in accordance with an embodiment of the invention.
- Referring to FIG. 1E, there is shown a carrier network 124, a plurality of devices being controlled 106, such as a plurality of mobile phones 130 a, 130 b, 130 c and 130 d, and a PC, laptop or a notebook computer 132 connected to a network 134, such as the Internet.
- the network 134 may be coupled to a web server 136 , a wireless carrier portal 138 , a web portal 140 and/or a database 142 .
- Each of the plurality of devices being controlled 106 may have a user interface.
- the mobile phone 130 a may have a user interface 131 a
- the mobile phone 130 b may have a user interface 131 b
- the mobile phone 130 c may have a user interface 131 c
- the mobile phone 130 d may have a user interface 131 d
- the PC, laptop or a notebook computer 132 may have a user interface 133 .
- the carrier network 124 may be a wireless access carrier network.
- Exemplary carrier networks may comprise 2G, 2.5G, 3G, 4G, IEEE 802.11, IEEE 802.16 and/or any other suitable network capable of handling voice, video and/or data communication.
- the plurality of devices being controlled 106 may be wirelessly connected to the carrier network 124 .
- One of the devices being controlled, such as mobile phone 130 a may be connected to a plurality of mobile phones 130 b , 130 c and 130 d via a peer-to-peer (P2P) network, for example.
- the device being controlled, such as mobile phone 130 a may be communicatively coupled to a PC, laptop, or a notebook computer 132 via a wired or a wireless network.
- the mobile phone 130 a may be communicatively coupled to the PC, laptop, or a notebook computer 132 via an infrared (IR) link, an optical link, a USB link, a wireless USB link, a Bluetooth link and/or a ZigBee link.
- the PC, laptop, or a notebook computer 132 may be communicatively coupled to the network 134 , for example, the Internet network 134 via a wired or a wireless network.
- the plurality of devices being controlled, such as the plurality of mobile phones 130 a , 130 b , 130 c and 130 d may be wirelessly connected to the Internet network 134 .
- the web server 136 may comprise suitable logic, circuitry, and/or code that may be enabled to receive, for example, HTTP and/or FTP requests from clients or web browsers installed on the PC, laptop, or a notebook computer 132 via the Internet network 134 , and generate HTTP responses along with optional data contents, such as HTML documents and linked objects, for example.
- the wireless carrier portal 138 may comprise suitable logic and/or code that may be enabled to function as a point of access to information on the Internet network 134 via a mobile device, such as a mobile phone 130 a, for example.
- the wireless carrier portal 138 may be, for example, a website that may be enabled to provide a single function via a mobile web page, for example.
- the web portal 140 may comprise suitable logic and/or code that may be enabled to function as a point of access to information on the Internet 134 .
- the web portal 140 may be, for example, a site that may be enabled to provide a single function via a web page or site.
- the web portal 140 may present information from diverse sources in a unified way such as e-mail, news, stock prices, infotainment and various other features.
- the database 142 may comprise suitable logic, circuitry, and/or code that may be enabled to store a structured collection of records or data, for example.
- the database 142 may be enabled to utilize software to organize the storage of data.
- the device being controlled such as the mobile phone 130 a may be enabled to receive one or more inputs defining a user interface 128 from another device, such as the PC, laptop, or a notebook computer 132 .
- One or more processors 122 within the device being controlled 106 may be enabled to customize the user interface 128 of the device being controlled, such as the mobile phone 130 a so that content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled, such as the mobile phone 130 a .
- the mobile phone 130 a may be enabled to access content directly from the PC, laptop, or a notebook computer 132 rather than from the carrier network 124 . This method of uploading and/or downloading customized information directly from the PC, laptop, or a notebook computer 132 rather than from the carrier network 124 may be referred to as side loading.
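Side loading, as defined above, amounts to a direct device-to-device copy that bypasses the carrier network; in this sketch the two dictionaries stand in for the PC's and the phone's storage, and all names are illustrative.

```python
# Hypothetical side load: copy a user-interface definition from a PC to a
# handheld over a direct local link rather than through the carrier network.
pc_storage = {"ui_definition": {"background": "photo.jpg", "icons": ["news", "mail"]}}
phone_storage: dict[str, object] = {}

def side_load(key: str) -> None:
    """Copy an item from the PC directly to the phone over a local link."""
    phone_storage[key] = pc_storage[key]  # no carrier network involved

side_load("ui_definition")
print(phone_storage)
```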
- the user interface 128 may be created, modified and/or organized by the user 102 .
- the user 102 may choose, select, create, arrange, manipulate and/or organize content to be utilized for the user interface 128 and/or one or more content components.
- the user 102 may organize the content components on a screen and may choose content such as personal photographs for background and/or icon images.
- the user 102 may create and/or modify the way content components are activated or presented to the user 102 .
- the user 102 may make, import and/or edit icons and/or backgrounds for the user interface 128 .
- the user 102 may associate and/or map the icon to a function so that the user 102 may enable or activate a function via the icon.
- Exemplary icons may enable functions such as hyper-links, book marks, programs/applications, shortcuts, widgets, RSS or markup language feeds or information, and/or favorite buddies.
- the user 102 may organize and/or arrange content components within the user interface 128 .
- the icons may be organized by category into groups. Groups of icons such as content components may be referred to as affinity banks, for example.
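Grouping icons by category into affinity banks is essentially a keyed grouping, as the sketch below illustrates; the categories and icon names are made up.

```python
# Hypothetical grouping of icons into "affinity banks" keyed by category.
from collections import defaultdict

icons = [
    ("news_feed", "information"),
    ("stock_ticker", "information"),
    ("buddy_alice", "favorite buddies"),
    ("buddy_bob", "favorite buddies"),
]

affinity_banks: defaultdict[str, list[str]] = defaultdict(list)
for icon, category in icons:
    affinity_banks[category].append(icon)

print(dict(affinity_banks))
```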
- the processor 125 in the other device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106 .
- the processor 122 in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106 .
- the processor 122 may be enabled to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon and may organize and/or arrange content components within the user interface 128 .
- Creation, modification and/or organization of the user interface 128 and/or content components may be performed on the device being controlled, such as mobile phone 130 a and/or may be performed on another device such as the PC, laptop, or a notebook computer 132 .
- a user screen and/or audio that may be created, modified and/or organized on another device, such as the PC, laptop, or a notebook computer 132 may be side loaded to the device being controlled, such as mobile phone 130 a .
- the side loaded user interface 128 may be modified and/or organized on the device being controlled, such as mobile phone 130 a .
- a user interface 128 may be side loaded from the PC, laptop, or a notebook computer 132 to the mobile phone 130 a and may be customized on the mobile phone 130 a .
- One or more tools may enable creation, modification and/or organization of the user interface 128 and/or audio or visual content components.
- FIG. 2A is a diagram illustrating an exemplary MEMS sensing and processing module located on a stylus, in accordance with an embodiment of the invention.
- a user 102 and a device being controlled, such as a cellphone/smartphone/dataphone 106 b .
- the cellphone/smartphone/dataphone 106 b may comprise a user interface 107 b , and a stylus 202 .
- the stylus 202 may be retractable, collapsible, pivotable about an axis or axes and/or flexible and may be enclosed within the body of the cellphone/smartphone/dataphone 106 b .
- the stylus 202 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the user 102 may be enabled to retract the stylus 202 and exhale into open space and onto the MEMS sensing and processing module 104 .
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be enabled to control a user interface 107 b of the cellphone/smartphone/dataphone 106 b.
- FIG. 2B is a diagram illustrating an exemplary MEMS sensing and processing module located on a headset for military personnel, in accordance with an embodiment of the invention.
- a user 102 may wear a detachable helmet 208 .
- the detachable helmet 208 may comprise detachable eyewear 204 , a detachable microphone 206 , and a detachable headset 210 .
- the detachable headset 210 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the detachable eyewear 204 may comprise night vision and/or infrared vision capabilities, for example.
- the detachable microphone 206 may be utilized to communicate with other users, for example.
- the user 102 may be enabled to exhale into open space and the MEMS sensing and processing module 104 may be operable to sense or detect the exhalation. The exhalation may occur from the nostrils and/or the mouth of the user 102 .
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The generated one or more control signals may be enabled to control a user interface of the device being controlled such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c and/or a user interface 107 d of the display device 106 d.
- FIG. 2C is a diagram illustrating an exemplary MEMS sensing and processing module located on a headrest of a seating apparatus, in accordance with an embodiment of the invention.
- a seating apparatus 220 may comprise a headrest 222 and a backrest 226.
- the headrest 222 may comprise a detachable headset 224 .
- the user 102 may be enabled to sit in the seating apparatus 220 .
- the detachable headset 224 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104 .
- the seating apparatus 220 may be located inside a car or any other automobile or vehicle, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations without limiting the scope of the invention.
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102 seated in the seating apparatus 220. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals.
- the generated one or more control signals may be enabled to control a user interface of the device being controlled such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, and/or the user interface of a multimedia player, such as an audio and/or video player.
- FIG. 2D is a diagram illustrating an exemplary MEMS sensing and processing module located inside an automobile, in accordance with an embodiment of the invention.
- an automobile 230 may comprise a visor 232 and a steering wheel 234 .
- the visor 232 may comprise a flexible support structure 233 .
- the support structure 233 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the steering wheel 234 may comprise a flexible support structure 235 .
- the support structure 235 may comprise the MEMS sensing and processing module 104 located on one end, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations within the automobile 230 without limiting the scope of the invention.
- the user 102 may be seated in the seat behind the steering wheel 234, with the MEMS sensing and processing module 104 mounted on the steering wheel 234.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104 .
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102 .
- the MEMS sensing and processing module 104 may be enabled to generate one or more control signals to control a user interface of the device being controlled such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, and/or the user interface of a multimedia or other device, such as an audio and/or video player or a navigation (e.g., GPS) device.
- FIG. 2E is a diagram illustrating an exemplary MEMS sensing and processing module located on detachable eyewear, in accordance with an embodiment of the invention.
- a user 102 may wear detachable goggles or any other type of eyewear 240 , for example.
- the detachable eyewear 240 may comprise a detachable headset 242 .
- the detachable headset 242 may be flexible and/or deflectable.
- the detachable headset 242 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104 .
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals to control a user interface of the device being controlled such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, and/or the user interface of a multimedia player, such as an audio and/or video player.
- FIG. 2F is a diagram illustrating an exemplary MEMS sensing and processing module located on a neckset, in accordance with an embodiment of the invention.
- a detachable neckset 250 may comprise a flexible printed circuit board (PCB) 254 and processing and/or communication circuitry 252 .
- the flexible PCB 254 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- the processing and/or communication circuitry 252 may comprise a battery, a voltage regulator, one or more switches, one or more light emitting diodes (LEDs), a liquid crystal display (LCD), other passive devices such as resistors, capacitors, inductors, a communications chip capable of handling one or more wireless communication protocols such as Bluetooth and/or one or more wired interfaces.
- the processing and/or communication circuitry 252 may be packaged within a PCB. Notwithstanding, the invention may not be so limited and the processing and/or communication circuitry 252 may comprise other components and circuits without limiting the scope of the invention.
- the user 102 may be enabled to wear the neckset 250 around his/her neck and exhale into open space and the MEMS sensing and processing module 104 may be operable to sense or detect the exhalation.
- the exhalation may occur from the nostrils and/or the mouth of the user 102 .
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals, which may be communicated via the flexible PCB 254 to the processing and/or communication circuitry 252.
- the processing and/or communication circuitry 252 may be enabled to process and communicate the generated one or more control signals to a device being controlled, such as a multimedia device 106 a , a cellphone/smartphone/dataphone 106 b , a personal computer (PC), laptop or a notebook computer 106 c and/or a display device 106 d .
- One or more processors within the device being controlled may be enabled to utilize the communicated control signals to control a user interface of the device being controlled such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c and/or a user interface 107 d of the display device 106 d.
- FIG. 2G is a diagram illustrating an exemplary MEMS sensing and processing module located on a stand alone device, in accordance with an embodiment of the invention.
- a stand alone device 262 may be placed on any suitable surface, for example, on a table or desk top 263 .
- the stand alone device 262 may comprise a flexible support structure 264 .
- the support structure 264 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- Notwithstanding, the invention may not be so limited, and the MEMS sensing and processing module 104 may be located at other locations on or within the stand alone device 262, for example in a base of the stand alone device 262, without limiting the scope of the invention.
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of a fluid such as air from human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be enabled to control a user interface 107 b of the cellphone/smartphone/dataphone 106 b.
- FIG. 2H is a diagram illustrating an exemplary MEMS sensing and processing module located on a clip, in accordance with an embodiment of the invention.
- a user 102 and a clip 272 .
- the clip 272 may be placed on any suitable piece of clothing, for example, on a collar of a shirt, a lapel of a coat or a pocket.
- the clip 272 may comprise a flexible support structure 274, for example. Although a clip 272 is illustrated, other suitable attachment structures may be utilized to affix the support structure 274.
- the support structure 274 may comprise the MEMS sensing and processing module 104, which may be located on one end of, or anywhere on, the support structure 274, for example.
- the invention may not be so limited and the MEMS sensing and processing module 104 may be placed at other locations on the outerwear or innerwear of the user 102 without limiting the scope of the invention.
- the support structure 274 may not be utilized and the MEMS sensing and processing module 104 may be attached to the clip 272 or other suitable attachment structure.
- the MEMS sensing and processing module 104 may be enabled to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals.
- the MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals.
- the generated one or more control signals may be enabled to control a user interface 107 b of the cellphone/smartphone/dataphone 106 b.
- FIG. 3A is a flow chart illustrating exemplary steps for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- exemplary steps may begin at step 302 .
- the sensing module 110 in the MEMS sensing and processing module 104 may be enabled to detect movement or a change in composition, such as ambient air composition, caused, for example, by the expulsion of human breath by the user 102.
- the sensing module 110 may be enabled to generate one or more electrical, optical and/or magnetic signals in response to the detection of movement caused by the expulsion of human breath.
- the processor firmware 116 may be enabled to process the received electrical, magnetic and/or optical signals from the sensing module 110 utilizing various algorithms.
- the processor firmware 116 may also be enabled to incorporate artificial intelligence (AI) algorithms to adapt to a particular user's 102 breathing pattern.
- the processor firmware 116 may be enabled to generate one or more control signals to the device being controlled 106 based on processing the received electrical, optical and/or magnetic signals from the sensing module 110 .
- the generated one or more control signals may be communicated to the device being controlled 106 via a wired and/or a wireless signal.
- one or more processors within the device being controlled 106 may be enabled to utilize the communicated control signals to control a user interface 128 of the device being controlled 106, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, a user interface 107 e of the TV/game console/other platform 106 e, and a user interface of a mobile multimedia player and/or a remote controller. Control then passes to end step 316.
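Stitched together, the steps of FIG. 3A form a sense-process-communicate-apply pipeline; every function in the following sketch is an illustrative stand-in for the corresponding module, with assumed values throughout.

```python
# Hypothetical end-to-end pipeline mirroring the exemplary flow of FIG. 3A.
def sense_breath() -> float:
    return 1.8  # stand-in for the sensing module 110's output

def process(sample: float) -> str | None:
    # stand-in for the processor firmware 116's algorithms
    return "SCROLL_DOWN" if sample > 1.0 else None

def communicate(signal: str) -> str:
    return signal  # stand-in for the wired and/or wireless link

def apply_to_ui(signal: str) -> None:
    print(f"user interface handles {signal}")  # processor on the device

signal = process(sense_breath())
if signal is not None:
    apply_to_ui(communicate(signal))
```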
- FIG. 3B is a flow chart illustrating exemplary steps for side loading of information, in accordance with an embodiment of the invention.
- exemplary steps may begin at step 352 .
- the device being controlled 106 such as the mobile phone 130 a may be enabled to receive data and/or media content from another device 108 , such as the PC, laptop, or a notebook computer 132 .
- the device being controlled 106 such as the mobile phone 130 a may be enabled to retrieve data and/or media content from a network, such as the Internet 134 .
- the retrieved data and/or media content may comprise an RSS feed, a URL and/or multimedia content.
- In step 358, it may be determined whether the laptop, PC and/or notebook 132 may perform association and/or mapping of the received data and/or media content and the retrieved data and/or media content. If the association or mapping is performed on the laptop, PC and/or notebook 132, control passes to step 360; otherwise, control passes to step 364.
- one or more processors within the laptop, PC and/or notebook 132 may be enabled to associate and/or map the received and retrieved data and/or media content into icons or groups. For example, the laptop, PC and/or notebook 132 may be enabled to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon.
- Exemplary icons may enable functions such as hyper-links, book marks, shortcuts, widgets, RSS feeds and/or favorite buddies.
- the laptop, PC and/or notebook 132 may be enabled to communicate the associated icons or groups to the device being controlled 106 , such as the mobile phone 130 a . Control then passes to step 366 .
- In step 364, one or more processors within the device being controlled 106, such as the mobile phone 130 a may be enabled to associate and/or map the received and retrieved data and/or media content into icons or groups.
- the mobile phone 130 a may be enabled to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon.
- the device being controlled 106 such as the mobile phone 130 a may be enabled to customize the associated icons or groups so that content associated with the received data and/or media content may become an integral part of the user interface 131 a of the device being controlled, such as the mobile phone 130 a .
- the user interface 131 a may be modified and/or organized by the user 102 .
- the user 102 may choose, create, arrange and/or organize content to be utilized for the user interface 131 a and/or one or more content components.
- the user 102 may organize the content components on a screen and may choose content such as personal photographs for background and/or icon images.
- the user 102 may create and/or modify the way content components are activated or presented to the user 102 .
- the user 102 may make, import and/or edit icons and/or backgrounds for the user interface 128 .
- Control then passes to end step 368 .
- a method and system for controlling a user interface of a device using human breath may comprise a MEMS sensing and processing module 104 that may be enabled to detect movement caused by the expulsion of human breath by the user 102 .
- the MEMS sensing and processing module 104 may be enabled to generate one or more controls signals.
- the generated one or more control signals may be utilized to control a user interface 128 of a plurality of devices, such as a multimedia device 106 a , a cellphone/smartphone/dataphone 106 b , a PC, laptop or a notebook computer 106 c , a display device 106 d , a TV/game console/other platform 106 e , a mobile multimedia player and/or a remote controller.
- a multimedia device 106 a such as a multimedia device 106 a , a cellphone/smartphone/dataphone 106 b , a PC, laptop or a notebook computer 106 c , a display device 106 d , a TV/game console/other platform 106 e , a mobile multimedia player and/or a remote controller.
- the detection of the movement caused by the expulsion of human breath may occur without use of a channel.
- the detection of the movement caused by expulsion of human breath may be responsive to the human breath being exhaled into open space and onto a detection device or a sensing module 110 that enables the detection.
- the detecting of the movement and the generation of the one or more control signals may be performed utilizing a MEMS, such a MEMS sensing and processing module 104 .
- the MEMS sensing and processing module 104 may be enabled to navigate within the user interface of one of more of the devices being controlled 106 via the generated one or more control signals.
- the MEMS sensing and processing module 104 may be enabled to select one or more components within the user interface 128 of the devices being controlled 106 via the generated one or more control signals.
- the generated one or more control signals may comprise one or more of a wired and/or a wireless signal.
- one or more of the plurality of devices such as a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b and/or a PC, laptop or a notebook computer 106 c may be enabled to receive one or more inputs defining the user interface 128 from another device 108 .
- the other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b .
- data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106 b via a service provider such as a cellular or PCS service provider.
- the transferred data that is associated or mapped to media content may be utilized to customize the user interface of the cellphone/smartphone/dataphone 106 b .
- media content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled 106 .
- the MEMS may be enabled to detect the expulsion of any type of fluid such as air, and the source of the fluid may be an animal, a machine and/or a device.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Certain aspects of a method and system for controlling a user interface of a device using human breath may include detecting movement caused by expulsion of human breath by a user. In response to the detection of movement caused by expulsion of human breath, one or more control signals may be generated. The generated control signals may control the user interface of a device and may enable navigation and/or selection of components in the user interface. The generated one or more control signals may be communicated to the device being controlled via a wired and/or a wireless signal. The expulsion of the human breath may occur in open space and the detection of the movement caused by the expulsion may occur without the use of a channel. The detection of the movement and/or the generation of the control signals may be performed by a MEMS detector or sensor.
Description
- This application is a continuation of U.S. patent application Ser. No. 12/056,164, filed Mar. 26, 2008, which is a continuation-in-part of U.S. patent application Ser. No. 10/453,192, filed Jun. 2, 2003, now U.S. Pat. No. 7,584,064, which is a continuation of U.S. patent application Ser. No. 09/913,398, filed Aug. 10, 2001, now U.S. Pat. No. 6,574,571, which is a U.S. national application filed under 35 U.S.C. 371 of International Application No. PCT/FR00/00362, filed Feb. 14, 2000, which makes reference to, claims priority to, and claims the benefit of French Patent Application Serial No. 99 01958, filed Feb. 12, 1999.
- This application also makes reference to:
- U.S. application Ser. No. 12/055,999, filed Mar. 26, 2008;
- U.S. application Ser. No. 12/056,203, filed Mar. 26, 2008;
- U.S. application Ser. No. 12/056,171, filed Mar. 26, 2008;
- U.S. application Ser. No. 12/056,061, filed Mar. 26, 2008; and
- U.S. application Ser. No. 12/056,187, filed Mar. 26, 2008.
- Each of the above referenced applications is hereby incorporated herein by reference in its entirety.
- Certain embodiments of the invention relate to controlling a computer or electronic system. More specifically, certain embodiments of the invention relate to a method and system for controlling a user interface of a device using human breath.
- Mobile communications have changed the way people communicate, and mobile phones have been transformed from a luxury item into an essential part of everyday life. The use of mobile phones today is dictated by social situations rather than constrained by location or technology.
- While voice connections fulfill the basic need to communicate, and mobile voice connections continue to filter ever further into the fabric of everyday life, mobile access to services via the Internet has become the next step in the mobile communication revolution. Currently, most mobile devices are equipped with a user interface that allows users to access the services provided via the Internet. For example, some mobile devices may have browsers, and software and/or hardware buttons may be provided to enable navigation and/or control of the user interface. Some mobile devices, such as smartphones, are equipped with touch screen capability that allows users to navigate or control the user interface by touching the screen with one hand while the device is held in the other hand.
- Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
- A system and/or method for controlling a user interface of a device using human breath, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- Various advantages, aspects and novel features of the present invention, as well as details of an illustrated embodiment thereof, will be more fully understood from the following description and drawings.
- FIG. 1A is a block diagram of an exemplary system for controlling a user interface of a plurality of devices using human breath, in accordance with an embodiment of the invention.
- FIG. 1B is a block diagram of an exemplary sensing module to detect human breath, in accordance with an embodiment of the invention.
- FIG. 1C is a block diagram of another embodiment of an exemplary system for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- FIG. 1D is a block diagram of an exemplary processor interacting with a device being controlled, in accordance with an embodiment of the invention.
- FIG. 1E is a block diagram of an exemplary system for side loading of information between two or more devices, in accordance with an embodiment of the invention.
- FIG. 2A is a diagram illustrating an exemplary MEMS sensing and processing module located on a stylus, in accordance with an embodiment of the invention.
- FIG. 2B is a diagram illustrating an exemplary MEMS sensing and processing module located on a headset for military personnel, in accordance with an embodiment of the invention.
- FIG. 2C is a diagram illustrating an exemplary MEMS sensing and processing module located on a headrest of a seating apparatus, in accordance with an embodiment of the invention.
- FIG. 2D is a diagram illustrating an exemplary MEMS sensing and processing module located inside an automobile, in accordance with an embodiment of the invention.
- FIG. 2E is a diagram illustrating an exemplary MEMS sensing and processing module located on detachable eyewear, in accordance with an embodiment of the invention.
- FIG. 2F is a diagram illustrating an exemplary MEMS sensing and processing module located on a neckset, in accordance with an embodiment of the invention.
- FIG. 2G is a diagram illustrating an exemplary MEMS sensing and processing module located on a stand alone device, in accordance with an embodiment of the invention.
- FIG. 2H is a diagram illustrating an exemplary MEMS sensing and processing module located on a clip, in accordance with an embodiment of the invention.
- FIG. 3A is a flow chart illustrating exemplary steps for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention.
- FIG. 3B is a flow chart illustrating exemplary steps for side loading of information, in accordance with an embodiment of the invention.
- Certain aspects of the invention may be found in a method and system for controlling a user interface of a device using human breath. Exemplary aspects of the invention may comprise detecting movement caused by expulsion of human breath by a user. In response to the detection of movement caused by expulsion of human breath, one or more control signals may be generated. The generated control signals may be utilized to control the user interface of a device and may enable navigation and/or selection of components in the user interface. The generated one or more control signals may be communicated to the device being controlled via a wired and/or a wireless signal. The expulsion of the human breath may occur in open space and the detection of the movement caused by the expulsion may occur without the use of a channel. The detection of the movement and/or the generation of the control signals may be performed by a MEMS. One exemplary embodiment of a user interface is a graphical user interface (GUI).
- FIG. 1A is a block diagram of an exemplary system for controlling a user interface of a plurality of devices using human breath, in accordance with an embodiment of the invention. Referring to FIG. 1A, there is shown a user 102, a micro-electro-mechanical system (MEMS) sensing and processing module 104, and a plurality of devices to be controlled, such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a personal computer (PC), laptop or a notebook computer 106 c, a display device 106 d and/or a television (TV)/game console/other platform 106 e. The multimedia device 106 a may comprise a user interface 107 a, the cellphone/smartphone/dataphone 106 b may comprise a user interface 107 b, and the personal computer (PC), laptop or a notebook computer 106 c may comprise a user interface 107 c. Additionally, the display device 106 d may comprise a user interface 107 d and the television (TV)/game console/other platform 106 e may comprise a user interface 107 e. Each of the plurality of devices to be controlled may be wired or wirelessly connected to a plurality of other devices 108 for loading of information via, for example, side loading, or loading via a peer-to-peer connection and/or a network connection, and by wired and/or wireless communication. Exemplary other devices 108 may comprise game consoles, immersive or 3D reality devices, and/or telematic devices. Telematic devices refer to devices comprising integrated computing, wireless communication and/or global navigation satellite system devices, which enable sending, receiving and/or storing of information over networks. The user interface may enable interacting with the device being controlled via one or more inputs, for example, expulsion of a fluid such as air, tactual inputs such as button presses, audio inputs such as voice commands, and/or movements of the electronic device 202 such as those detected by an accelerometer and/or gyroscope.
- The MEMS sensing and processing module 104 may comprise suitable logic, circuitry and/or code that may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals. The generated one or more control signals may be enabled to control a user interface of one or more of a plurality of devices, such as the user interface 107 a of the multimedia device 106 a, the user interface 107 b of the cellphone/smartphone/dataphone 106 b, the user interface 107 c of the PC, laptop or a notebook computer 106 c, the user interface 107 d of the display device 106 d, the user interface 107 e of the TV/game console/other platform 106 e, and the user interfaces of the mobile multimedia player and/or a remote controller. One exemplary embodiment of a user interface is a graphical user interface (GUI). Any information and/or data presented on a display, including programs and/or applications, may be part of the user interface. U.S. application Ser. No. 12/055,999 discloses an exemplary MEMS sensing and processing module and is hereby incorporated herein by reference in its entirety.
- In accordance with an embodiment of the invention, the detection of the movement caused by expulsion of human breath may occur without use of a channel. The detection of the movement caused by expulsion of human breath may be responsive to the expulsion of human breath into open space, which is then sensed.
- In accordance with another embodiment of the invention, the MEMS sensing and processing module 104 may be enabled to navigate within the user interface of one or more of the plurality of devices, such as a handheld device, for example, a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a PC, laptop or a notebook computer 106 c, a display device 106 d, and/or a TV/game console/other platform 106 e via the generated one or more control signals. The MEMS sensing and processing module 104 may be enabled to select one or more components within the user interface of the plurality of devices via the generated one or more control signals. The generated one or more control signals may comprise one or more of a wired and/or a wireless signal.
- In accordance with another embodiment of the invention, one or more of the plurality of devices, such as a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b and/or a PC, laptop or a notebook computer 106 c may be enabled to receive one or more inputs defining the user interface from another device 108. The other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b. In this regard, data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106 b via a service provider such as a cellular or PCS service provider. The transferred data that is associated or mapped to media content may be utilized to customize the user interface 107 b of the cellphone/smartphone/dataphone 106 b. In this regard, media content associated with one or more received inputs may become an integral part of the user interface of the device being controlled. The associating and/or mapping may be performed on either the other device 108 and/or on the cellphone/smartphone/dataphone 106 b. In instances where the associating and/or mapping is performed on the other device 108, the associated and/or mapped data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b.
- In an exemplary embodiment of the invention, an icon transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b may be associated or mapped to media content such as an RSS feed or a markup language such as HTML or XML, which may be remotely accessed by the cellphone/smartphone/dataphone 106 b via the service provider of the cellphone/smartphone/dataphone 106 b. Accordingly, when the user 102 blows on the MEMS sensing and processing module 104, control signals generated by the MEMS sensing and processing module 104 may navigate to the icon and select the icon. Once the icon is selected, the RSS feed or markup language may be accessed via the service provider of the cellphone/smartphone/dataphone 106 b and the corresponding RSS feed or markup language content may be displayed on the user interface 107 b. U.S. application Ser. No. 12/056,187 discloses an exemplary method and system for customizing a user interface of a device and is hereby incorporated herein by reference in its entirety.
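- By way of illustration only (this sketch forms no part of the original disclosure, and its names, such as IconMapping and on_icon_selected, are hypothetical), the icon-to-content association described above may be pictured as a small lookup table that resolves a selected icon to its remotely hosted content:

    # Hypothetical sketch: resolving a side-loaded icon to the remote
    # media content (e.g., an RSS feed) it was mapped to, once breath-
    # generated control signals navigate to and select the icon.
    from dataclasses import dataclass
    import urllib.request

    @dataclass
    class IconMapping:
        icon_id: str       # icon transferred from the other device
        content_url: str   # remotely accessible content (RSS, HTML, XML)

    mappings = {"news": IconMapping("news", "https://example.com/feed.rss")}

    def on_icon_selected(icon_id: str) -> bytes:
        """Fetch the mapped content when the icon is selected."""
        mapping = mappings[icon_id]
        with urllib.request.urlopen(mapping.content_url) as response:
            return response.read()  # feed/markup to render in the UI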
- In operation, a user 102 may exhale into open space and the exhaled breath or air may be sensed by one or more detection devices or detectors, such as one or more sensors, sensing members and/or sensing segments in the MEMS sensing and processing module 104. The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. One or more electrical, optical and/or magnetic signals may be generated by one or more detection devices or detectors within the MEMS sensing and processing module 104 in response to the detection of movement caused by expulsion of human breath. The processor firmware within the MEMS sensing and processing module 104 may be enabled to process the received electrical, optical and/or magnetic signals from the one or more detection devices or detectors utilizing various algorithms and generate one or more control signals to the device being controlled, for example, the multimedia device 106 a. The generated one or more control signals may be communicated to the device being controlled, for example, the multimedia device 106 a via a wired and/or a wireless signal. The processor in the device being controlled may utilize the communicated control signals to control the user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, a user interface 107 e of the TV/game console/other platform 106 e, and a user interface of a mobile multimedia player and/or a remote controller.
- FIG. 1B is a block diagram of an exemplary detection device or detector to detect human breath, in accordance with an embodiment of the invention. Referring to FIG. 1B, there is shown a user 102 and a sensing module 110. The sensing module 110 may comprise a sensor control chip 109 and a plurality of sensors, for example, 111 a, 111 b, 111 c, and 111 d. Notwithstanding, the invention may not be so limited and the sensing module 110 may comprise more or less than the number of sensors or sensing members or segments shown in FIG. 1B without limiting the scope of the invention. Accordingly, any number of detectors and sources may be utilized according to the size, sensitivity, and resolution desired. Similarly, the type of sources and detectors may comprise other sensing mechanisms, other than visible light. For example, piezoelectric, ultrasonic, Hall effect, electrostatic, and/or permanent or electro-magnet sensors may be activated by deflected MEMS members to generate a signal to be communicated to the sensor control chip 109.
- The sensing module 110 may be an electrochemical sensor or any other type of breath analyzing sensor, for example. The plurality of sensors or sensing members or segments 111 a-d may be an integral part of one or more MEMS devices that may enable the detection of various velocities of air flow from the user's 102 breath. The plurality of sensors or sensing members or segments 111 a-d may be enabled to detect kinetic energy and/or movement caused by the expulsion of human breath by the user 102. The sensor control chip 109 may be enabled to generate an electrical, optical and/or magnetic signal that may be communicated to the processor in response to the detection of kinetic energy and/or movement caused by expulsion of human breath.
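- As a rough illustration of the role the sensing segments and sensor control chip might play (the function names, stub readings and threshold below are assumptions, not part of the original disclosure), detection can be sketched as comparing each segment's reading against a calibrated ambient baseline:

    # Illustrative multi-segment breath detection: each MEMS segment
    # reports a deflection reading; flow is declared where a reading
    # exceeds its calibrated ambient baseline by a threshold.
    from typing import Callable, Sequence

    def detect_breath(read_segments: Callable[[], Sequence[float]],
                      baselines: Sequence[float],
                      threshold: float = 0.2) -> list[int]:
        """Return indices of segments whose deflection indicates breath."""
        readings = read_segments()
        return [i for i, (r, b) in enumerate(zip(readings, baselines))
                if r - b > threshold]

    # Stubbed example: segments 111 a-d as four simulated channels.
    active = detect_breath(lambda: [0.05, 0.31, 0.28, 0.04],
                           baselines=[0.04, 0.05, 0.05, 0.04])
    print(active)  # -> [1, 2]: flow strongest over the middle segments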
- FIG. 1C is a block diagram of another embodiment of an exemplary system for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention. Referring to FIG. 1C, there is shown a user 102, a MEMS sensing and processing module 104, and a device being controlled 106, such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a PC, laptop or a notebook computer 106 c, a display device 106 d and/or a TV/game console/other platform 106 e. The device being controlled 106 may be wired and/or wirelessly connected to a plurality of other devices 108 for side loading of information.
- The MEMS sensing and processing module 104 may comprise a sensing module 110, a processing module 112 and passive devices 113. The passive devices 113, which may comprise resistors, capacitors and/or inductors, may be embedded within a substrate material of the MEMS sensing and processing module 104. The processing module 112 may comprise, for example, an ASIC. The sensing module 110 may generally be referred to as a detection device or detector, and may comprise one or more sensors, sensing members and/or sensing segments that may be enabled to detect kinetic energy and/or movement caused by the expulsion of human breath by the user 102. The sensing module 110 may be enabled to generate an electrical, optical and/or magnetic signal that may be communicated to the processing module 112 in response to the detection of kinetic energy and/or movement caused by expulsion of human breath.
- The processing module 112 may comprise suitable logic, circuitry and/or code that may be enabled to receive the generated electric signal from the sensing module 110 and generate one or more control signals to the device being controlled 106. In this regard, the processing module 112 may comprise one or more analog-to-digital converters that may be enabled to translate the sensed signal to one or more digital signals, which may be utilized to generate the one or more control signals. The generated one or more control signals may be enabled to control a user interface of the device being controlled 106.
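- The analog-to-digital step can be pictured as quantizing the sensed voltage into digital codes from which control decisions are derived. The following minimal sketch is illustrative only; the 10-bit width, reference voltage and code thresholds are assumptions, not values from the disclosure:

    # Minimal ADC sketch: quantize the sensed analog voltage into an
    # n-bit code from which control logic can derive a UI decision.
    def adc_convert(voltage: float, v_ref: float = 3.3, bits: int = 10) -> int:
        """Quantize 0..v_ref volts into a code in 0..(2**bits - 1)."""
        full_scale = (1 << bits) - 1
        code = int((voltage / v_ref) * full_scale)
        return max(0, min(code, full_scale))  # clamp to converter range

    def to_control_signal(code: int, idle_code: int = 120) -> str:
        """Map a digitized deflection to a coarse control decision."""
        if code <= idle_code:
            return "NONE"
        return "SCROLL" if code < 512 else "SELECT"

    print(to_control_signal(adc_convert(1.8)))  # -> "SELECT" (code ~558)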
user interface 107. Accordingly, the generated one or more signals from the MEMS sensing andprocessing module 104 may be communicated to the device being controlled 106 and utilized to control theuser interface 107. In an exemplary embodiment of the invention, the one or more signals generated by the MEMS sensing andprocessing module 104 may be operable to control a pointer on the device being controlled 106 such that items in theuser interface 107 may be selected and/or manipulated. In an exemplary embodiment of the invention, the device being controlled may be enabled to receive one or more inputs from theother devices 108, which may be utilized to customize or define theuser interface 107. Theother device 108 may be one or more of a PC, laptop or anotebook computer 106 c and/or a handheld device, for example, amultimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b. In this regard, theother device 108 may be similar to or different from the type of device that is being controlled 106. In some embodiments of the invention, a processor in theother device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106. In other embodiments of the invention, a processor in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106. U.S. application Ser. No. 12/056,187 discloses an exemplary method and system for customizing a user interface of a device and is hereby incorporated herein by reference in its entirety. -
- FIG. 1D is a block diagram of an exemplary processor interacting with a device being controlled, in accordance with an embodiment of the invention. Referring to FIG. 1D, there is shown a processing module 112, and a device being controlled 106 such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a PC, laptop or a notebook computer 106 c, a display device 106 d and/or a TV/game console/other platform 106 e. The processing module 112 may be an ASIC and may comprise one or more analog-to-digital converters (ADCs) 114, processor firmware 116, and a communication module 118. The device being controlled 106 may comprise a communication module 120, a processor 122, memory 123, firmware 124, a display 126, and a user interface 128. The device being controlled 106 may be wired and/or wirelessly connected to a plurality of other devices 108 for loading of information via, for example, side loading, or loading via a peer-to-peer connection and/or a network connection, and by wired and/or wireless communication.
- The processing module 112 may comprise suitable logic, circuitry and/or code that may be enabled to receive a digital sensing signal and/or an analog sensing signal from the sensing module 110. The ADC 114 may comprise suitable logic, circuitry and/or code that may be enabled to receive the generated analog sensing signal from the sensing module 110 and convert the received signal into a digital signal.
- The processor firmware 116 may comprise suitable logic and/or code that may be enabled to receive and process the digital signal from the ADC 114 and/or the digital sensing signal from the sensing module 110 utilizing a plurality of algorithms to generate one or more control signals. For example, the processor firmware 116 may be enabled to read, store, calibrate, filter, modelize, calculate and/or compare the outputs of the sensing module 110. The processor firmware 116 may also be enabled to incorporate artificial intelligence (AI) algorithms to adapt to a particular user's 102 breathing pattern. The processor firmware 116 may be enabled to generate one or more control signals to the device being controlled 106 based on processing the received digital signals. The generated one or more control signals may be enabled to control a user interface of the device being controlled 106, for example, scrolling, zooming, and/or 3-D navigation within the device being controlled 106.
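- The read/calibrate/filter/compare chain above might be sketched as follows. This is illustrative only: the exponential smoothing, the slowly adapting baseline, and all constants are stand-ins for the unspecified algorithms, and none of the names come from the disclosure:

    # Illustrative firmware pipeline: low-pass filter the digitized
    # stream, keep a per-user baseline, and flag exhalation events.
    class BreathFirmware:
        def __init__(self, calibration, alpha=0.5, sensitivity=4.0):
            self.alpha = alpha                # smoothing factor
            self.sensitivity = sensitivity    # threshold in deviations
            self.baseline = sum(calibration) / len(calibration)
            self.deviation = max(abs(s - self.baseline)
                                 for s in calibration) or 1e-6
            self.filtered = self.baseline

        def process(self, sample):
            # Smooth the raw ADC stream to suppress ambient noise.
            self.filtered = (self.alpha * sample
                             + (1 - self.alpha) * self.filtered)
            if self.filtered - self.baseline > self.sensitivity * self.deviation:
                return "BREATH_EVENT"         # hand off to signal mapping
            # Adapt slowly so the module learns the user's breathing pattern.
            self.baseline += 0.01 * (self.filtered - self.baseline)
            return None

    fw = BreathFirmware(calibration=[0.10, 0.12, 0.11, 0.09])
    print([fw.process(s) for s in (0.11, 0.10, 0.55, 0.60)])
    # -> [None, None, 'BREATH_EVENT', 'BREATH_EVENT']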
- The communication module 118 may comprise suitable logic, circuitry and/or code that may be enabled to receive and communicate the generated one or more control signals to the device being controlled 106 via a wired and/or a wireless signal.
- The communication module 120 may be enabled to receive the communicated control signals via a wired and/or a wireless signal. The processor 122 may comprise suitable logic, circuitry and/or code that may be enabled to utilize the received one or more control signals to control the user interface 128 and/or the display 126. The memory 123 may comprise suitable logic, circuitry and/or code that may be enabled to store data on the device being controlled 106. The firmware 124 may comprise a plurality of drivers and operating system (OS) libraries to convert the received control signals into functional commands. The firmware 124 may be enabled to map local functions, and convert received control signals into compatible data, such as user customization features, applets, and/or plugins, to control the user interface 128.
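- The driver-side conversion of control signals into functional commands can be pictured as a dispatch table. The sketch below is hypothetical (the signal codes and handler names are invented for illustration and are not part of the disclosure):

    # Hypothetical device-side dispatch: drivers translate received
    # control-signal codes into functional UI commands.
    def scroll_up(ui):   ui["cursor"] = max(0, ui["cursor"] - 1)
    def scroll_down(ui): ui["cursor"] = min(ui["items"] - 1, ui["cursor"] + 1)
    def select(ui):      ui["selected"] = ui["cursor"]

    COMMAND_TABLE = {0x01: scroll_up, 0x02: scroll_down, 0x03: select}

    def handle_control_signal(code, ui_state):
        """Run the local function mapped to a received control signal."""
        handler = COMMAND_TABLE.get(code)   # codes here are invented
        if handler is not None:
            handler(ui_state)

    ui = {"cursor": 0, "items": 5, "selected": None}
    for code in (0x02, 0x02, 0x03):
        handle_control_signal(code, ui)
    print(ui)  # -> {'cursor': 2, 'items': 5, 'selected': 2}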
- The device being controlled 106 may be enabled to receive one or more inputs defining the user interface 128 from another device 108. The other device 108 may comprise a user interface 129 and a processor 125. The other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b. In this regard, data may be transferred from the other device 108 to the device being controlled, such as the cellphone/smartphone/dataphone 106 b, and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106 b via a service provider such as a cellular or PCS service provider. The transferred data that is associated or mapped to media content may be utilized to customize the user interface 128 of the device being controlled, such as the cellphone/smartphone/dataphone 106 b. In this regard, media content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled 106.
processor 125 in theother device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106. In other embodiments of the invention, theprocessor 122 in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106. -
- FIG. 1E is a block diagram of an exemplary system for side loading of information between two or more devices, in accordance with an embodiment of the invention. Referring to FIG. 1E, there is shown a carrier network 124, a plurality of devices being controlled 106, such as a plurality of mobile phones 130 a, 130 b, 130 c and 130 d, a PC, laptop, or a notebook computer 132, and a network 134, such as the Internet. The network 134 may be coupled to a web server 136, a wireless carrier portal 138, a web portal 140 and/or a database 142. Each of the plurality of devices being controlled 106 may have a user interface. For example, the mobile phone 130 a may have a user interface 131 a, the mobile phone 130 b may have a user interface 131 b, the mobile phone 130 c may have a user interface 131 c and the mobile phone 130 d may have a user interface 131 d. The PC, laptop or a notebook computer 132 may have a user interface 133.
- The carrier network 124 may be a wireless access carrier network. Exemplary carrier networks may comprise 2G, 2.5G, 3G, 4G, IEEE 802.11, IEEE 802.16 and/or any suitable network capable of handling voice, video and/or data communication. The plurality of devices being controlled 106 may be wirelessly connected to the carrier network 124. One of the devices being controlled, such as the mobile phone 130 a, may be connected to a plurality of mobile phones 130 b, 130 c and 130 d. The mobile phone 130 a may be communicatively coupled to a PC, laptop, or a notebook computer 132 via a wired or a wireless network. For example, the mobile phone 130 a may be communicatively coupled to the PC, laptop, or a notebook computer 132 via an infrared (IR) link, an optical link, a USB link, a wireless USB link, a Bluetooth link and/or a ZigBee link. Notwithstanding, the invention may not be so limited and other wired and/or wireless links may be utilized without limiting the scope of the invention. The PC, laptop, or a notebook computer 132 may be communicatively coupled to the network 134, for example, the Internet 134, via a wired or a wireless network. The plurality of devices being controlled, such as the plurality of mobile phones 130 a, 130 b, 130 c and 130 d, may be communicatively coupled to the Internet 134.
- The web server 136 may comprise suitable logic, circuitry, and/or code that may be enabled to receive, for example, HTTP and/or FTP requests from clients or web browsers installed on the PC, laptop, or a notebook computer 132 via the Internet 134, and generate HTTP responses along with optional data contents, such as HTML documents and linked objects, for example.
- The wireless carrier portal 138 may comprise suitable logic and/or code that may be enabled to function as a point of access to information on the Internet 134 via a mobile device, such as a mobile phone 130 a, for example. The wireless carrier portal 138 may be, for example, a website that may be enabled to provide a single function via a mobile web page, for example.
- The web portal 140 may comprise suitable logic and/or code that may be enabled to function as a point of access to information on the Internet 134. The web portal 140 may be, for example, a site that may be enabled to provide a single function via a web page or site. The web portal 140 may present information from diverse sources in a unified way, such as e-mail, news, stock prices, infotainment and various other features. The database 142 may comprise suitable logic, circuitry, and/or code that may be enabled to store a structured collection of records or data, for example. The database 142 may be enabled to utilize software to organize the storage of data.
- In accordance with an embodiment of the invention, the device being controlled, such as the mobile phone 130 a, may be enabled to receive one or more inputs defining a user interface 128 from another device, such as the PC, laptop, or a notebook computer 132. One or more processors 122 within the device being controlled 106 may be enabled to customize the user interface 128 of the device being controlled, such as the mobile phone 130 a, so that content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled, such as the mobile phone 130 a. The mobile phone 130 a may be enabled to access content directly from the PC, laptop, or a notebook computer 132 rather than from the carrier network 124. This method of uploading and/or downloading customized information directly from the PC, laptop, or a notebook computer 132 rather than from the carrier network 124 may be referred to as side loading.
- In accordance with one embodiment of the invention, the user interface 128 may be created, modified and/or organized by the user 102. In this regard, the user 102 may choose, select, create, arrange, manipulate and/or organize content to be utilized for the user interface 128 and/or one or more content components. For example, the user 102 may organize the content components on a screen and may choose content such as personal photographs for background and/or icon images. In addition, the user 102 may create and/or modify the way content components are activated or presented to the user 102. For example, the user 102 may make, import and/or edit icons and/or backgrounds for the user interface 128. Accordingly, the user 102 may associate and/or map an icon to a function so that the user 102 may enable or activate the function via the icon. Exemplary icons may enable functions such as hyper-links, book marks, programs/applications, shortcuts, widgets, RSS or markup language feeds or information, and/or favorite buddies.
- In addition, the user 102 may organize and/or arrange content components within the user interface 128. For example, the icons may be organized by category into groups. Groups of icons such as content components may be referred to as affinity banks, for example. In some embodiments of the invention, the processor 125 in the other device 108 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106. In other embodiments of the invention, the processor 122 in the device being controlled 106 may be operable to associate or map the data to media content that is remotely accessible by the device being controlled 106. For example, the processor 122 may be enabled to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon and may organize and/or arrange content components within the user interface 128.
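- The grouping of icons into affinity banks can be illustrated with a simple bucketing step. The data layout below is an assumption made for illustration (only the term "affinity bank" comes from the description; the fields and targets are invented):

    # Illustrative "affinity banks": icons bucketed by category so the
    # user interface can present related content components together.
    from collections import defaultdict

    icons = [
        {"name": "news",    "category": "feeds",   "target": "rss://example/news"},
        {"name": "weather", "category": "feeds",   "target": "rss://example/wx"},
        {"name": "alice",   "category": "buddies", "target": "tel:555-0100"},
    ]

    def build_affinity_banks(icons):
        """Group icons by category; each bucket is one affinity bank."""
        banks = defaultdict(list)
        for icon in icons:
            banks[icon["category"]].append(icon)
        return dict(banks)

    print(sorted(build_affinity_banks(icons)))  # -> ['buddies', 'feeds']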
- Creation, modification and/or organization of the user interface 128 and/or content components may be performed on the device being controlled, such as the mobile phone 130 a, and/or may be performed on another device such as the PC, laptop, or a notebook computer 132. In this regard, a user screen and/or audio that may be created, modified and/or organized on another device, such as the PC, laptop, or a notebook computer 132, may be side loaded to the device being controlled, such as the mobile phone 130 a. In addition, the side loaded user interface 128 may be modified and/or organized on the device being controlled, such as the mobile phone 130 a. For example, a user interface 128 may be side loaded from the PC, laptop, or a notebook computer 132 to the mobile phone 130 a and may be customized on the mobile phone 130 a. One or more tools may enable creation, modification and/or organization of the user interface 128 and/or audio or visual content components.
- FIG. 2A is a diagram illustrating an exemplary MEMS sensing and processing module located on a stylus, in accordance with an embodiment of the invention. Referring to FIG. 2A, there is shown a user 102 and a device being controlled, such as a cellphone/smartphone/dataphone 106 b. The cellphone/smartphone/dataphone 106 b may comprise a user interface 107 b and a stylus 202. The stylus 202 may be retractable, collapsible, pivotable about an axis or axes and/or flexible, and may be enclosed within the body of the cellphone/smartphone/dataphone 106 b. The stylus 202 may comprise the MEMS sensing and processing module 104 located on one end, for example. In one embodiment of the invention, the user 102 may be enabled to retract the stylus 202 and exhale into open space and onto the MEMS sensing and processing module 104.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals. The generated one or more control signals may be enabled to control a user interface 107 b of the cellphone/smartphone/dataphone 106 b.
- FIG. 2B is a diagram illustrating an exemplary MEMS sensing and processing module located on a headset for military personnel, in accordance with an embodiment of the invention. Referring to FIG. 2B, there is shown a user 102. The user 102 may wear a detachable helmet 208. The detachable helmet 208 may comprise detachable eyewear 204, a detachable microphone 206, and a detachable headset 210. The detachable headset 210 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- The detachable eyewear 204 may comprise night vision and/or infrared vision capabilities, for example. The detachable microphone 206 may be utilized to communicate with other users, for example. In one embodiment of the invention, the user 102 may be enabled to exhale into open space and the MEMS sensing and processing module 104 may be operable to sense or detect the exhalation. The exhalation may occur from the nostrils and/or the mouth of the user 102.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The generated one or more control signals may be enabled to control a user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c and/or a user interface 107 d of the display device 106 d.
- FIG. 2C is a diagram illustrating an exemplary MEMS sensing and processing module located on a headrest of a seating apparatus, in accordance with an embodiment of the invention. Referring to FIG. 2C, there is shown a seating apparatus 220. The seating apparatus 220 may comprise a headrest 222 and a backrest 226. The headrest 222 may comprise a detachable headset 224. The user 102 may be enabled to sit in the seating apparatus 220.
- The detachable headset 224 may comprise the MEMS sensing and processing module 104 located on one end, for example. In one embodiment of the invention, the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104. In one embodiment, the seating apparatus 220 may be located inside a car or any other automobile or vehicle, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations without limiting the scope of the invention.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102 seated in the seating apparatus 220. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The generated one or more control signals may be enabled to control a user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, and/or the user interface of a multimedia player, such as an audio and/or video player.
- FIG. 2D is a diagram illustrating an exemplary MEMS sensing and processing module located inside an automobile, in accordance with an embodiment of the invention. Referring to FIG. 2D, there is shown an automobile 230. The automobile 230 may comprise a visor 232 and a steering wheel 234.
- In one embodiment of the invention, the visor 232 may comprise a flexible support structure 233. The support structure 233 may comprise the MEMS sensing and processing module 104 located on one end, for example. In another embodiment of the invention, the steering wheel 234 may comprise a flexible support structure 235. The support structure 235 may comprise the MEMS sensing and processing module 104 located on one end, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations within the automobile 230 without limiting the scope of the invention.
- For example and without limitation, the user 102 may be seated in the seat behind the steering wheel 234, with the MEMS sensing and processing module 104 mounted on the steering wheel 234. The user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104. The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals to control a user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, and/or the user interface of a multimedia or other device, such as an audio and/or video player or a navigation (e.g., GPS) device.
- FIG. 2E is a diagram illustrating an exemplary MEMS sensing and processing module located on detachable eyewear, in accordance with an embodiment of the invention. Referring to FIG. 2E, there is shown a user 102. The user 102 may wear detachable goggles or any other type of eyewear 240, for example. The detachable eyewear 240 may comprise a detachable headset 242. The detachable headset 242 may be flexible and/or deflectable. The detachable headset 242 may comprise the MEMS sensing and processing module 104 located on one end, for example. In one embodiment of the invention, the user 102 may be enabled to exhale into open space and onto the MEMS sensing and processing module 104.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals to control a user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, and/or the user interface of a multimedia player, such as an audio and/or video player.
- FIG. 2F is a diagram illustrating an exemplary MEMS sensing and processing module located on a neckset, in accordance with an embodiment of the invention. Referring to FIG. 2F, there is shown a detachable neckset 250. The detachable neckset 250 may comprise a flexible printed circuit board (PCB) 254 and processing and/or communication circuitry 252. The flexible PCB 254 may comprise the MEMS sensing and processing module 104 located on one end, for example.
- The processing and/or communication circuitry 252 may comprise a battery, a voltage regulator, one or more switches, one or more light emitting diodes (LEDs), a liquid crystal display (LCD), other passive devices such as resistors, capacitors and inductors, a communications chip capable of handling one or more wireless communication protocols such as Bluetooth, and/or one or more wired interfaces. In an exemplary embodiment of the invention, the processing and/or communication circuitry 252 may be packaged within a PCB. Notwithstanding, the invention may not be so limited and the processing and/or communication circuitry 252 may comprise other components and circuits without limiting the scope of the invention.
- In one embodiment of the invention, the user 102 may be enabled to wear the neckset 250 around his/her neck and exhale into open space, and the MEMS sensing and processing module 104 may be operable to sense or detect the exhalation. The exhalation may occur from the nostrils and/or the mouth of the user 102.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals via the flexible PCB 254 to the processing and/or communication circuitry 252. The processing and/or communication circuitry 252 may be enabled to process and communicate the generated one or more control signals to a device being controlled, such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a personal computer (PC), laptop or a notebook computer 106 c and/or a display device 106 d. One or more processors within the device being controlled may be enabled to utilize the communicated control signals to control a user interface of the device being controlled, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c and/or a user interface 107 d of the display device 106 d.
- FIG. 2G is a diagram illustrating an exemplary MEMS sensing and processing module located on a stand alone device, in accordance with an embodiment of the invention. Referring to FIG. 2G, there is shown a stand alone device 262. The stand alone device 262 may be placed on any suitable surface, for example, on a table or desk top 263. The stand alone device 262 may comprise a flexible support structure 264. The support structure 264 may comprise the MEMS sensing and processing module 104 located on one end, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be located at other locations on the stand alone device 262, for example, in a base of the stand alone device 262; the location of the MEMS sensing and processing module 104 within or on the stand alone device 262 may vary accordingly.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of a fluid such as air from human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals. The generated one or more control signals may be enabled to control a user interface 107 b of the cellphone/smartphone/dataphone 106 b.
- FIG. 2H is a diagram illustrating an exemplary MEMS sensing and processing module located on a clip, in accordance with an embodiment of the invention. Referring to FIG. 2H, there is shown a user 102 and a clip 272. The clip 272 may be placed on any suitable piece of clothing, for example, on a collar of a shirt, a lapel of a coat or a pocket. The clip 272 may comprise a flexible support structure 274, for example. Although a clip 272 is illustrated, other suitable attachment structures may be utilized to affix the support structure 274. The support structure 274 may comprise the MEMS sensing and processing module 104, the latter of which may be located on one end of or anywhere on the support structure 274, for example. Notwithstanding, the invention may not be so limited and the MEMS sensing and processing module 104 may be placed at other locations on the outerwear or innerwear of the user 102 without limiting the scope of the invention. In other exemplary embodiments of the invention, the support structure 274 may not be utilized and the MEMS sensing and processing module 104 may be attached to the clip 272 or other suitable attachment structure.
- The MEMS sensing and processing module 104 may be enabled to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The MEMS sensing and processing module 104 may comprise one or more segments or members that may be operable to sense the kinetic energy generated by the expulsion of the human breath and accordingly generate the one or more control signals. The generated one or more control signals may be enabled to control a user interface 107 b of the cellphone/smartphone/dataphone 106 b.
- FIG. 3A is a flow chart illustrating exemplary steps for controlling a user interface of a device using human breath, in accordance with an embodiment of the invention. Referring to FIG. 3A, exemplary steps may begin at step 302. In step 304, the sensing module 110 in the MEMS sensing and processing module 104 may be enabled to detect movement or a change in composition, such as ambient air composition, caused, for example, by the expulsion of human breath by the user 102. In step 306, the sensing module 110 may be enabled to generate one or more electrical, optical and/or magnetic signals in response to the detection of movement caused by the expulsion of human breath. In step 308, the processor firmware 116 may be enabled to process the received electrical, magnetic and/or optical signals from the sensing module 110 utilizing various algorithms. The processor firmware 116 may also be enabled to incorporate artificial intelligence (AI) algorithms to adapt to a particular user's 102 breathing pattern.
- In step 310, the processor firmware 116 may be enabled to generate one or more control signals to the device being controlled 106 based on processing the received electrical, optical and/or magnetic signals from the sensing module 110. In step 312, the generated one or more control signals may be communicated to the device being controlled 106 via a wired and/or a wireless signal. In step 314, one or more processors within the device being controlled 106 may be enabled to utilize the communicated control signals to control a user interface 128 of the device being controlled 106, such as a user interface 107 a of the multimedia device 106 a, a user interface 107 b of the cellphone/smartphone/dataphone 106 b, a user interface 107 c of the personal computer (PC), laptop or a notebook computer 106 c, a user interface 107 d of the display device 106 d, a user interface 107 e of the TV/game console/other platform 106 e, and a user interface of a mobile multimedia player and/or a remote controller. Control then passes to end step 316.
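- The FIG. 3A flow can be condensed into a sense/process/transmit/apply loop. In the sketch below, the callables stand in for the hardware stages and are stubbed for illustration; nothing in it is part of the original disclosure:

    # Condensed sketch of the FIG. 3A flow; the callables stand in for
    # hardware stages and are stubbed for the example run.
    def control_loop(sense, process, transmit, apply_to_ui):
        raw = sense()                    # steps 304-306: detect and signal
        for signal in process(raw):      # steps 308-310: firmware algorithms
            transmit(signal)             # step 312: wired/wireless link
            apply_to_ui(signal)          # step 314: device updates its UI

    control_loop(
        sense=lambda: [0.6],
        process=lambda raw: ["SCROLL" for r in raw if r > 0.3],
        transmit=lambda s: print("tx:", s),
        apply_to_ui=lambda s: print("ui:", s),
    )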
- FIG. 3B is a flow chart illustrating exemplary steps for side loading of information, in accordance with an embodiment of the invention. Referring to FIG. 3B, exemplary steps may begin at step 352. In step 354, the device being controlled 106, such as the mobile phone 130 a, may be enabled to receive data and/or media content from another device 108, such as the PC, laptop, or notebook computer 132. In step 356, the device being controlled 106, such as the mobile phone 130 a, may be enabled to retrieve data and/or media content from a network, such as the Internet 134. For example, the retrieved data and/or media content may comprise an RSS feed, a URL and/or multimedia content.
- In step 358, it may be determined whether the laptop, PC and/or notebook 132 may perform association and/or mapping of the received data and/or media content and the retrieved data and/or media content. If the association or mapping is performed on the laptop, PC and/or notebook 132, control passes to step 360. In step 360, one or more processors within the laptop, PC and/or notebook 132 may be enabled to associate and/or map the received and retrieved data and/or media content into icons or groups. For example, the laptop, PC and/or notebook 132 may be enabled to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon. Exemplary icons may enable functions such as hyperlinks, bookmarks, shortcuts, widgets, RSS feeds and/or favorite buddies. In step 362, the laptop, PC and/or notebook 132 may be enabled to communicate the associated icons or groups to the device being controlled 106, such as the mobile phone 130 a. Control then passes to step 366.
- If the association or mapping is not performed on the laptop, PC and/or notebook 132, control passes to step 364. In step 364, one or more processors within the device being controlled 106, such as the mobile phone 130 a, may be enabled to associate and/or map the received and retrieved data and/or media content into icons or groups. For example, the mobile phone 130 a may be enabled to associate and/or map an icon to a function so that the user 102 may enable or activate a function via the icon.
- In step 366, the device being controlled 106, such as the mobile phone 130 a, may be enabled to customize the associated icons or groups so that content associated with the received data and/or media content may become an integral part of the user interface 131 a of the device being controlled, such as the mobile phone 130 a. The user interface 131 a may be modified and/or organized by the user 102. In this regard, the user 102 may choose, create, arrange and/or organize content to be utilized for the user interface 131 a and/or one or more content components. For example, the user 102 may organize the content components on a screen and may choose content such as personal photographs for background and/or icon images. In addition, the user 102 may create and/or modify the way content components are activated or presented to the user 102. For example, the user 102 may make, import and/or edit icons and/or backgrounds for the user interface 128. Control then passes to end step 368.
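The FIG. 3B side-loading flow might be modeled as in the following hypothetical Python sketch, in which received and retrieved content items are associated with activatable icons and installed into the phone's user interface. The Icon structure, the associate() helper and the sample content items are illustrative assumptions rather than elements of this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Icon:
    label: str
    target: str  # e.g. an RSS feed, URL, bookmark or shortcut the icon activates

def associate(items: list[tuple[str, str]]) -> list[Icon]:
    # Steps 360/364: map received and retrieved content into icons or groups.
    return [Icon(label, target) for label, target in items]

def side_load(items: list[tuple[str, str]], associate_on_pc: bool) -> None:
    # Step 358 decides where association runs; either way the phone gets icons.
    icons = associate(items)
    print("associated on", "PC/laptop/notebook" if associate_on_pc else "mobile phone")
    # Step 366: the icons become an integral part of the phone's user interface.
    for icon in icons:
        print(f"installed icon '{icon.label}' -> {icon.target}")

side_load([("News", "https://example.com/rss"), ("Buddy", "contact:alice")],
          associate_on_pc=True)
```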
- In accordance with an embodiment of the invention, a method and system for controlling a user interface of a device using human breath may comprise a MEMS sensing and processing module 104 that may be enabled to detect movement caused by the expulsion of human breath by the user 102. In response to the detection of movement caused by the expulsion of human breath, the MEMS sensing and processing module 104 may be enabled to generate one or more control signals. The generated one or more control signals may be utilized to control a user interface 128 of a plurality of devices, such as a multimedia device 106 a, a cellphone/smartphone/dataphone 106 b, a PC, laptop or a notebook computer 106 c, a display device 106 d, a TV/game console/other platform 106 e, a mobile multimedia player and/or a remote controller.
- In an exemplary embodiment of the invention, the detection of the movement caused by the expulsion of human breath may occur without use of a channel. The detection of the movement caused by the expulsion of human breath may be responsive to the human breath being exhaled into open space and onto a detection device or a sensing module 110 that enables the detection. The detecting of the movement and the generation of the one or more control signals may be performed utilizing a MEMS, such as the MEMS sensing and processing module 104.
- In accordance with another embodiment of the invention, the MEMS sensing and processing module 104 may be enabled to navigate within the user interface of one or more of the devices being controlled 106 via the generated one or more control signals. The MEMS sensing and processing module 104 may be enabled to select one or more components within the user interface 128 of the devices being controlled 106 via the generated one or more control signals. The generated one or more control signals may comprise one or more of a wired and/or a wireless signal.
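One plausible device-side dispatch of such control signals is sketched below in Python; the signal names (SCROLL_UP, SCROLL_DOWN, SELECT) and the handler wiring are assumptions introduced for illustration, not a vocabulary defined by this disclosure.

```python
def navigate(direction: str) -> None:
    print(f"navigate {direction} within the user interface")

def select_component() -> None:
    print("select the highlighted user-interface component")

# Assumed control-signal vocabulary mapped onto user-interface operations;
# the device would receive these signals over its wired or wireless link.
HANDLERS = {
    "SCROLL_UP": lambda: navigate("up"),
    "SCROLL_DOWN": lambda: navigate("down"),
    "SELECT": select_component,
}

for signal in ["SCROLL_DOWN", "SCROLL_DOWN", "SELECT"]:
    HANDLERS[signal]()
```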
- In accordance with another embodiment of the invention, one or more of the plurality of devices, such as a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b and/or a PC, laptop or a notebook computer 106 c, may be enabled to receive one or more inputs defining the user interface 128 from another device 108. The other device 108 may be one or more of a PC, laptop or a notebook computer 106 c and/or a handheld device, for example, a multimedia device 106 a and/or a cellphone/smartphone/dataphone 106 b. In this regard, data may be transferred from the other device 108 to the cellphone/smartphone/dataphone 106 b, and this data may be associated or mapped to media content that may be remotely accessed by the cellphone/smartphone/dataphone 106 b via a service provider, such as a cellular or PCS service provider. The transferred data that is associated or mapped to media content may be utilized to customize the user interface of the cellphone/smartphone/dataphone 106 b. In this regard, media content associated with one or more received inputs may become an integral part of the user interface 128 of the device being controlled 106.
- The invention is not limited to the expulsion of breath. Accordingly, in various exemplary embodiments of the invention, the MEMS may be enabled to detect the expulsion of any type of fluid, such as air, and the source of the fluid may be an animal, a machine and/or a device.
- Certain embodiments of the invention may comprise a machine-readable storage having stored thereon, a computer program having at least one code section for controlling a user interface of a device using human breath, the at least one code section being executable by a machine for causing the machine to perform one or more of the steps described herein.
- Accordingly, aspects of the invention may be realized in hardware, software, firmware or a combination thereof. The invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- One embodiment of the invention may be implemented as a board-level product, as a single chip, as an application-specific integrated circuit (ASIC), or with varying levels of the system integrated on a single chip with other portions of the system as separate components. The degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
- The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form. However, other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
- While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present invention without departing from its scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed, but that the present invention will include all embodiments falling within the scope of the appended claims.
Claims (24)
1. A method for interaction, the method comprising:
detecting movement caused by expulsion of human breath; and
responsive to said detection, generating one or more control signals, wherein said generated one or more control signals are utilized to control a user interface of a device.
2. The method according to claim 1, wherein said device comprises one or more of a personal computer (PC), a laptop, a notebook computer, a television (TV), a game console, a display device, and/or a handheld device.
3. The method according to claim 2, wherein said handheld device comprises one or more of a mobile telephone, a mobile multimedia player, a navigation device and/or a remote controller.
4. The method according to claim 1, wherein said detecting of said movement caused by said expulsion of said human breath occurs without use of a channel.
5. The method according to claim 1, wherein said detecting of said movement caused by said expulsion of said human breath is responsive to said human breath being exhaled into open space and onto one or more detectors that enable said detection.
6. The method according to claim 1, wherein said detecting of said movement and said generating of said one or more control signals are performed utilizing a micro-electro-mechanical system (MEMS).
7. The method according to claim 1, comprising navigating within said user interface via said generated one or more control signals.
8. The method according to claim 1, comprising selecting one or more components within said user interface via said generated one or more control signals.
9. The method according to claim 1, wherein said generated one or more control signals comprise one or both of a wired and/or a wireless signal.
10. The method according to claim 1, comprising receiving one or more inputs defining said user interface from another device.
11. The method according to claim 10, wherein said another device comprises one or more of a personal computer (PC), a laptop, a notebook computer and/or a handheld device.
12. The method according to claim 1, comprising customizing said user interface so that content associated with one or more received inputs becomes an integral part of said user interface.
13. A system for interaction, the system comprising:
one or more detectors operable to detect movement caused by expulsion of human breath; and
responsive to said detection, one or more circuits operable to generate one or more control signals, wherein said generated one or more control signals are utilized to control a user interface of a device.
14. The system according to claim 13, wherein said device comprises one or more of a personal computer (PC), a laptop, a notebook computer, a television (TV), a game console, a display device, and/or a handheld device.
15. The system according to claim 14, wherein said handheld device comprises one or more of a mobile telephone, a mobile multimedia player, a navigation device, and/or a remote controller.
16. The system according to claim 13, wherein said detecting of said movement caused by said expulsion of said human breath occurs without use of a channel.
17. The system according to claim 13, wherein said detecting of said movement caused by said expulsion of said human breath is responsive to said human breath being exhaled into open space and onto said one or more detectors.
18. The system according to claim 13, comprising a micro-electro-mechanical system (MEMS), wherein said MEMS comprises said one or more detectors and said one or more circuits.
19. The system according to claim 13, wherein said one or more circuits enable navigation within said user interface via said generated one or more control signals.
20. The system according to claim 13, wherein said one or more circuits enable selection of one or more components within said user interface via said generated one or more control signals.
21. The system according to claim 13, wherein said generated one or more control signals comprise one or both of a wired and/or a wireless signal.
22. The system according to claim 13, wherein said one or more circuits enable receiving of one or more inputs defining said user interface from another device.
23. The system according to claim 22, wherein said another device comprises one or more of a personal computer (PC), a laptop, a notebook computer, and/or a handheld device.
24. The system according to claim 13, wherein said one or more circuits enable customization of said user interface so that content associated with one or more received inputs becomes an integral part of said user interface.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/813,292 US20110010112A1 (en) | 1999-02-12 | 2010-06-10 | Method and System for Controlling a User Interface of a Device Using Human Breath |
CN2010800511238A CN102782459A (en) | 2009-09-11 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
JP2012528957A JP2013542470A (en) | 2009-09-11 | 2010-09-13 | Method and system for controlling the user interface of a device using human exhalation |
PCT/US2010/048646 WO2011032096A2 (en) | 2009-09-11 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
US12/880,892 US9753533B2 (en) | 2008-03-26 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
EP10816236.3A EP2475969A4 (en) | 2009-09-11 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
KR1020127009299A KR20130022401A (en) | 2009-09-11 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR9901958 | 1999-02-12 | ||
FR9901958 | 1999-02-12 | ||
PCT/FR2000/000362 WO2000048066A1 (en) | 1999-02-12 | 2000-02-14 | Method and device for monitoring an electronic or computer system by means of a fluid flow |
US09/913,398 US6574571B1 (en) | 1999-02-12 | 2000-02-14 | Method and device for monitoring an electronic or computer system by means of a fluid flow |
US10/453,192 US7584064B2 (en) | 1999-02-12 | 2003-06-02 | Method and device to control a computer system utilizing a fluid flow |
US12/056,164 US7739061B2 (en) | 1999-02-12 | 2008-03-26 | Method and system for controlling a user interface of a device using human breath |
US12/813,292 US20110010112A1 (en) | 1999-02-12 | 2010-06-10 | Method and System for Controlling a User Interface of a Device Using Human Breath |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/056,164 Continuation US7739061B2 (en) | 1999-02-12 | 2008-03-26 | Method and system for controlling a user interface of a device using human breath |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/880,892 Continuation-In-Part US9753533B2 (en) | 2008-03-26 | 2010-09-13 | Method and system for controlling a user interface of a device using human breath |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110010112A1 (en) | 2011-01-13 |
Family
ID=39642069
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/056,164 Expired - Fee Related US7739061B2 (en) | 1999-02-12 | 2008-03-26 | Method and system for controlling a user interface of a device using human breath |
US12/813,292 Abandoned US20110010112A1 (en) | 1999-02-12 | 2010-06-10 | Method and System for Controlling a User Interface of a Device Using Human Breath |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/056,164 Expired - Fee Related US7739061B2 (en) | 1999-02-12 | 2008-03-26 | Method and system for controlling a user interface of a device using human breath |
Country Status (1)
Country | Link |
---|---|
US (2) | US7739061B2 (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8339287B2 (en) | 2002-03-29 | 2012-12-25 | Inputive Corporation | Device to control an electronic or computer system utilizing a fluid flow and a method of manufacturing the same |
US8976046B2 (en) * | 2008-03-26 | 2015-03-10 | Pierre Bonnat | Method and system for a MEMS detector that enables control of a device using human breath |
US7739061B2 (en) * | 1999-02-12 | 2010-06-15 | Pierre Bonnat | Method and system for controlling a user interface of a device using human breath |
US10216259B2 (en) * | 2000-02-14 | 2019-02-26 | Pierre Bonnat | Method and system for processing signals that control a device using human breath |
US9753533B2 (en) * | 2008-03-26 | 2017-09-05 | Pierre Bonnat | Method and system for controlling a user interface of a device using human breath |
EP2106818B1 (en) * | 2008-03-31 | 2013-12-25 | Nellcor Puritan Bennett Llc | System for compensating for pressure drop in a breathing assistance system |
US8181648B2 (en) * | 2008-09-26 | 2012-05-22 | Nellcor Puritan Bennett Llc | Systems and methods for managing pressure in a breathing assistance system |
US8302602B2 (en) | 2008-09-30 | 2012-11-06 | Nellcor Puritan Bennett Llc | Breathing assistance system with multiple pressure sensors |
US8776790B2 (en) | 2009-07-16 | 2014-07-15 | Covidien Lp | Wireless, gas flow-powered sensor system for a breathing assistance system |
EP2475969A4 (en) * | 2009-09-11 | 2016-11-02 | Novodigit Sarl | Method and system for controlling a user interface of a device using human breath |
US9174123B2 (en) * | 2009-11-09 | 2015-11-03 | Invensense, Inc. | Handheld computer systems and techniques for character and command recognition related to human movements |
EP2479892B1 (en) * | 2011-01-19 | 2013-08-28 | Sensirion AG | Input device |
EP2498481A1 (en) | 2011-03-09 | 2012-09-12 | Sensirion AG | Mobile phone with humidity sensor |
KR101219523B1 (en) * | 2011-03-23 | 2013-01-11 | 이승렬 | Method For Check A Message Using Air Sensing And Computer Readable Medium Recording The Program |
KR101410579B1 (en) * | 2013-10-14 | 2014-06-20 | 박재숙 | Wind synthesizer controller |
CN105278381A (en) * | 2015-11-03 | 2016-01-27 | 北京京东世纪贸易有限公司 | Method implemented by electronic equipment, electronic equipment control device and electronic equipment |
CN107145218B (en) * | 2016-03-01 | 2020-11-03 | 北京京东尚科信息技术有限公司 | Input device, mobile terminal, input method, and computer-readable storage medium |
RU192632U1 (en) * | 2019-06-18 | 2019-09-24 | Федеральное государственное автономное образовательное учреждение высшего образования "Санкт-Петербургский государственный электротехнический университет "ЛЭТИ" им. В.И. Ульянова (Ленина)" | Computer manipulator for people with disabilities |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
- 2008-03-26: US application 12/056,164 filed; granted as US7739061B2 (status: Expired - Fee Related)
- 2010-06-10: US application 12/813,292 filed; published as US20110010112A1 (status: Abandoned)
Patent Citations (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4085646A (en) * | 1975-05-28 | 1978-04-25 | Klaus Naumann | Electronic musical instrument |
US4207959A (en) * | 1978-06-02 | 1980-06-17 | New York University | Wheelchair mounted control apparatus |
US4383254A (en) * | 1979-09-14 | 1983-05-10 | David Gemmell | Control apparatus for a display matrix |
US4433685A (en) * | 1980-09-10 | 1984-02-28 | Figgie International Inc. | Pressure demand regulator with automatic shut-off |
US4521772A (en) * | 1981-08-28 | 1985-06-04 | Xerox Corporation | Cursor control device |
US4746913A (en) * | 1984-04-23 | 1988-05-24 | Volta Arthur C | Data entry method and apparatus for the disabled |
US4561309A (en) * | 1984-07-09 | 1985-12-31 | Rosner Stanley S | Method and apparatus for determining pressure differentials |
US4713540A (en) * | 1985-07-16 | 1987-12-15 | The Foxboro Company | Method and apparatus for sensing a measurand |
US5226416A (en) * | 1988-06-16 | 1993-07-13 | Pneu Pac Limited | Monitoring and alarm apparatus |
US4929826A (en) * | 1988-09-26 | 1990-05-29 | Joseph Truchsess | Mouth-operated control device |
US6040821A (en) * | 1989-09-26 | 2000-03-21 | Incontrol Solutions, Inc. | Cursor tracking |
US5341133A (en) * | 1991-05-09 | 1994-08-23 | The Rowland Institute For Science, Inc. | Keyboard having touch sensor keys for conveying information electronically |
US5378850A (en) * | 1992-01-14 | 1995-01-03 | Fernandes Co., Ltd. | Electric stringed instrument having an arrangement for adjusting the generation of magnetic feedback |
US5422640A (en) * | 1992-03-02 | 1995-06-06 | North Carolina State University | Breath actuated pointer to enable disabled persons to operate computers |
US5740801A (en) * | 1993-03-31 | 1998-04-21 | Branson; Philip J. | Managing information in an endoscopy system |
US5603065A (en) * | 1994-02-28 | 1997-02-11 | Baneth; Robin C. | Hands-free input device for operating a computer having mouthpiece with plurality of cells and a transducer for converting sound into electrical control signals |
US5765135A (en) * | 1994-03-09 | 1998-06-09 | Speech Therapy Systems Ltd. | Speech therapy system |
US5582182A (en) * | 1994-10-03 | 1996-12-10 | Sierra Biotechnology Company, Lc | Abnormal dyspnea perception detection system and method |
US5870705A (en) * | 1994-10-21 | 1999-02-09 | Microsoft Corporation | Method of setting input levels in a voice recognition system |
US5835077A (en) * | 1995-01-13 | 1998-11-10 | Remec, Inc., | Computer control device |
US6611778B2 (en) * | 1996-01-18 | 2003-08-26 | Yeda Research And Development Co., Ltd. | Apparatus for monitoring a system in which a fluid flows |
US5763792A (en) * | 1996-05-03 | 1998-06-09 | Dragerwerk Ag | Respiratory flow sensor |
US6396416B1 (en) * | 1996-06-17 | 2002-05-28 | Nokia Mobile Phones Ltd. | Add-on unit for connecting to a mobile station and a mobile station |
US6261238B1 (en) * | 1996-10-04 | 2001-07-17 | Karmel Medical Acoustic Technologies, Ltd. | Phonopneumograph system |
US5889511A (en) * | 1997-01-17 | 1999-03-30 | Tritech Microelectronics International, Ltd. | Method and system for noise reduction for digitizing devices |
US5907318A (en) * | 1997-01-17 | 1999-05-25 | Medina; Carlos A. | Foot-controlled computer mouse |
JPH10320108A (en) * | 1997-05-15 | 1998-12-04 | Yuji Tsujimura | Cursor moving device |
US6282183B1 (en) * | 1997-06-02 | 2001-08-28 | Motorola, Inc. | Method for authorizing couplings between devices in a capability addressable network |
US6064964A (en) * | 1997-11-04 | 2000-05-16 | Fujitsu Limited | Data processing apparatus having breath detecting function and image display control method using breath detection |
US6323846B1 (en) * | 1998-01-26 | 2001-11-27 | University Of Delaware | Method and apparatus for integrating manual input |
US6421617B2 (en) * | 1998-07-18 | 2002-07-16 | Interval Research Corporation | Interface including fluid flow measurement for use in determining an intention of, or an effect produced by, an animate object |
US6213955B1 (en) * | 1998-10-08 | 2001-04-10 | Sleep Solutions, Inc. | Apparatus and method for breath monitoring |
US7739061B2 (en) * | 1999-02-12 | 2010-06-15 | Pierre Bonnat | Method and system for controlling a user interface of a device using human breath |
US6574571B1 (en) * | 1999-02-12 | 2003-06-03 | Financial Holding Corporation, Inc. | Method and device for monitoring an electronic or computer system by means of a fluid flow |
US20030208334A1 (en) * | 1999-02-12 | 2003-11-06 | Pierre Bonnat | Method and device to control a computer system utilizing a fluid flow |
US6516671B2 (en) * | 2000-01-06 | 2003-02-11 | Rosemount Inc. | Grain growth of electrical interconnection for microelectromechanical systems (MEMS) |
US6396402B1 (en) * | 2001-03-12 | 2002-05-28 | Myrica Systems Inc. | Method for detecting, recording and deterring the tapping and excavating activities of woodpeckers |
US6664786B2 (en) * | 2001-07-30 | 2003-12-16 | Rockwell Automation Technologies, Inc. | Magnetic field sensor using microelectromechanical system |
US20040017351A1 (en) * | 2002-03-29 | 2004-01-29 | Pierre Bonnat | Device to control an electronic or computer system utilizing a fluid flow and a method of manufacturing the same |
US20070048181A1 (en) * | 2002-09-05 | 2007-03-01 | Chang Daniel M | Carbon dioxide nanosensor, and respiratory CO2 monitors |
US20060142957A1 (en) * | 2002-10-09 | 2006-06-29 | Pierre Bonnat | Method of controlling an electronic or computer system |
US20050127154A1 (en) * | 2003-11-03 | 2005-06-16 | Pierre Bonnat | Device for receiving fluid current, which fluid current is used to control an electronic or computer system |
US7053456B2 (en) * | 2004-03-31 | 2006-05-30 | Kabushiki Kaisha Toshiba | Electronic component having micro-electrical mechanical system |
US20050268247A1 (en) * | 2004-05-27 | 2005-12-01 | Baneth Robin C | System and method for controlling a user interface |
US20080092898A1 (en) * | 2004-08-27 | 2008-04-24 | John Hopkins University | Disposable Sleep And Breathing Monitor |
US20060118115A1 (en) * | 2004-12-08 | 2006-06-08 | James Cannon | Oxygen conservation system for commercial aircraft |
US20080011298A1 (en) * | 2006-06-30 | 2008-01-17 | Transoma Medical, Inc. | Monitoring physiologic conditions via transtracheal measurement of respiratory parameters |
DE102007063008A1 (en) * | 2007-12-21 | 2009-06-25 | Kouemou, Guy Leonard, Dr. Ing. | Method and device for cardiovascular and respiratory monitoring using hidden Markov models and neural networks |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090322675A1 (en) * | 1999-02-12 | 2009-12-31 | Pierre Bonnat | Method and device to control a computer system utilizing a fluid flow |
US9111515B2 (en) * | 1999-02-12 | 2015-08-18 | Pierre Bonnat | Method and device to control a computer system utilizing a fluid flow |
US20130335315A1 (en) * | 2008-03-26 | 2013-12-19 | Pierre Bonnat | Mobile handset accessory supporting touchless and occlusion-free user interaction |
US9904353B2 (en) * | 2008-03-26 | 2018-02-27 | Pierre Bonnat | Mobile handset accessory supporting touchless and occlusion-free user interaction |
CN106667631A (en) * | 2016-12-13 | 2017-05-17 | 天津大学 | Breathing airflow control switch |
US10587209B2 (en) | 2017-03-08 | 2020-03-10 | Natural Gas Solutions North America, Llc | Generating power for electronics on a gas meter |
CN109498295A (en) * | 2018-12-28 | 2019-03-22 | 电子科技大学中山学院 | Paralytic's auxiliary blows control wheelchair and blowing device |
CN109498296A (en) * | 2018-12-28 | 2019-03-22 | 电子科技大学中山学院 | The control method of control wheelchair is blown based on paralytic's auxiliary |
Also Published As
Publication number | Publication date |
---|---|
US20080177404A1 (en) | 2008-07-24 |
US7739061B2 (en) | 2010-06-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7739061B2 (en) | Method and system for controlling a user interface of a device using human breath | |
US9753533B2 (en) | Method and system for controlling a user interface of a device using human breath | |
WO2011032096A2 (en) | Method and system for controlling a user interface of a device using human breath | |
CN101794207B (en) | Pose to device mapping | |
US9122307B2 (en) | Advanced remote control of host application using motion and voice commands | |
CN108700982B (en) | Information processing apparatus, information processing method, and program | |
US8706170B2 (en) | Miniature communications gateway for head mounted display | |
CN102265242B (en) | Motion process is used to control and access content on the mobile apparatus | |
US20120075177A1 (en) | Lapel microphone micro-display system incorporating mobile information access | |
US9262867B2 (en) | Mobile terminal and method of operation | |
EP3929705A1 (en) | Wearable device and method of controlling the same | |
EP2439615A2 (en) | Magnetic sensor for use with hand-held devices | |
US20190294236A1 (en) | Method and System for Processing Signals that Control a Device Using Human Breath | |
CN110489573A (en) | Interface display method and electronic equipment | |
CN108346469A (en) | Method for determining human health status and mobile terminal | |
US10170099B2 (en) | Electronic device and method for representing web content for the electronic device | |
WO2021147767A1 (en) | Icon display method and electronic device | |
EP2538308A2 (en) | Motion-based control of a controllled device | |
WO2022227589A1 (en) | Audio processing method and apparatus | |
CN110472148A (en) | Using recommended method and terminal device | |
EP3451149A1 (en) | Information processing device, information processing method, and program | |
CN108833679A (en) | Object display method and terminal device | |
KR101687552B1 (en) | Mobile terminal and operation method thereof | |
KR20120057256A (en) | Mobile terminal and operation method thereof | |
KR20110133295A (en) | Mobile terminal and its operation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |