US20180267614A1 - Control system for a terminal device with two sensors and power regulation - Google Patents
- Publication number
- US20180267614A1 (Application No. US15/461,010)
- Authority
- US
- United States
- Prior art keywords
- accelerometer
- subsequent
- acoustic
- data signals
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
- H04L12/282—Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
Definitions
- the present invention relates to a control system for a terminal device, such as a television, lighting fixture, thermostat or laptop. More particularly, the present invention relates to controlling the terminal device with gestures. Additionally, the present invention relates to more accurately distinguishing gestures from background environment and managing power consumption.
- A user typically operates a control system or controller, such as a remote control device, to adjust lights, curtains, a thermostat, etc.
- Existing control systems include distinct remote control devices dedicated to and associated with the particular output or terminal device to be controlled.
- Remote control devices can also be associated with more than one terminal device, such as a master controller for electronics and a touchscreen computer tablet made integral with furniture or walls to control lighting and room temperature.
- Any computer with an interface can be a remote control device for multiple terminal devices with smart technology.
- Mobile phones are also known to be enabled for controlling terminal devices, such as home security cameras and door locks.
- Another existing control system involves voice recognition technology.
- Each output or terminal device typically is associated with a respective remote control device, such as a controller for the cable box, a controller for the DVD player, and a controller for the sound mixer.
- An excessive number of controllers is needed in order to remotely control multiple devices.
- an individual controller is often misplaced or left in locations that are not readily accessible to the user. The user must search for a controller or change locations to access the controller.
- voice recognition technology often requires cumbersome training sessions to calibrate for pronunciations and accents of each particular user.
- voice recognition technology is often impaired by background noise resulting in difficulties for that control system to recognize verbal commands.
- the sound produced by voice commands may be obtrusive in many environments such as in a room where others are sleeping, or in a room while watching a movie.
- remote control devices can be built into or integrated into furniture.
- Smart tables have been built with touchscreens that are able to receive touch-based gestures.
- the cost of the structure is significantly increased due to design modifications required to accommodate the remote control device, and the cost of the components and hardware.
- aesthetics are often affected. Appearances are altered when furniture, walls and surroundings are filled with touchscreens, touchpads, and other conspicuous devices. Integration of such hardware into furniture also requires the manufacturer to modify existing designs such that the hardware can be accommodated into the structure.
- Prior art manual control systems range from buttons on a television remote controller to a touchscreen of a mobile phone. Simple gestures of pressing dedicated buttons and complex gestures of finger motions on a touchscreen are both used to control terminal devices.
- Various patents and publications are available in the field of these manual control systems.
- U.S. Pat. No. 8,788,978 issued to Stedman et al on Jul. 22, 2014, teaches a gesture sensitive interface for a computer.
- the “pinch zoom” functionality is the subject matter: sensors detect first and second interaction points and the relative motion between the points.
- Various sensors are disclosed to define the field, including a touch screen, camera, motion sensor, and proximity sensors.
- World Intellectual Property Organization Publication No. WO20131653408 published for Bess on Nov. 7, 2013, describes a system with at least three accelerometers disposed in different locations of an area with a surface to capture respective vibration data corresponding to a command tapped onto the surface by a user.
- a processing system receives the vibration data from each accelerometer, identifying the command and a location of the user from the vibration data.
- a control signal based on the command and the location is generated.
- a control apparatus includes a projector that directs first light toward a scene including a hand of a user in proximity to a wall of a room, receives the first light reflected from the scene, and directs second light toward the wall so as to project an image of a control device onto the wall.
- a processor detects hand motions within the projected field.
- the projector is a unit worn on the body of the user to project onto surfaces, such as walls and tables.
- Spatial data is detected by a sensor array.
- Additional rendering operations may include tracking movements of the recognized body parts, applying a detection algorithm to the tracked movements to detect a predetermined gesture, applying a command corresponding to the detected predetermined gesture, and updating the projected images in response to the applied command.
- a control system can convert any independent mounting surface into a controller for a terminal device.
- a physically separate mounting surface such as a wall or table surface, can be used to activate and deactivate a television or light fixtures, without the user touching either appliance.
- the control system includes a housing engaged to a mounting surface, a sensor and microcontroller unit within the housing, a server in communication with the sensor, and a terminal device in communication with the server.
- the terminal device is to be controlled by gestures associated with the mounting surface.
- the control system further includes a server in communication with the sensor, through connections including but not limited to wifi, Bluetooth, local area network, wired or other wireless connections.
- the terminal device can be an appliance, lighting fixture or climate regulator.
- the sensor is an acoustic sensor
- background noise can affect the ability of the control system to identify the gesture from ambient sounds.
- the sensor is an accelerometer, accidentally colliding with the mounting surface or setting a coffee cup on the mounting surface can affect the ability of the control system to identify the gesture from inadvertent hits on the mounting surface.
- the sensors require power in order to remain active for detecting gestures. To regulate power consumption, switching between an energy-saving mode and an active mode can save energy. There is a need to improve control systems for accurately detecting gestures and saving energy.
- Embodiments of the present invention include a control system comprising a housing, an accelerometer sensor, an acoustic sensor, and a microcontroller.
- the housing has an engagement means for a mounting surface, and both the accelerometer sensor and acoustic sensor are contained within the housing.
- Each sensor forms a respective interactive zone defined by a range of the sensor, and each interactive zone is aligned with the mounting surface.
- the acoustic sensor has a first power consumption level so as to be in a slack status and a second power consumption level so as to be in an active status.
- the slack status is a relatively lower power mode than the active status.
- the acoustic sensor is not devoid of activity; the acoustic sensor is resting, but still operating.
- the acoustic sensor generally stays in the slack status at the lower power consumption level, while the accelerometer sensor remains in a respective active status.
- the system saves energy with the acoustic sensor in the slack status and with other components in respective slack statuses.
- a contact interaction associated with the mounting surface within the accelerometer interactive zone is detected by the accelerometer sensor as accelerometer data signals.
- the contact interaction is also within the acoustic interactive zone, but the acoustic sensor is in slack status so the contact interaction is not detected by the acoustic sensor.
- the microcontroller unit is contained within the housing and connected to the accelerometer sensor.
- the microcontroller unit receives the accelerometer data signals from the accelerometer sensor and determines a status data pattern corresponding to the accelerometer data signals of the contact interaction.
- the status data pattern can match a status gesture profile associated with a command to switch the acoustic sensor from the slack status to the active status.
- the microcontroller toggles the acoustic sensor to the active status, and the control system is ready to detect a subsequent contact interaction with both the accelerometer sensor and the acoustic sensor.
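- The slack-to-active wake-up sequence described above can be sketched as a small state toggle. This is an illustrative sketch, not the patent's implementation; the matching rule (a double knock matches the status gesture profile) and all names are assumptions.

```python
SLACK, ACTIVE = "slack", "active"

class AcousticSensor:
    """Stand-in for the acoustic sensor's two power consumption levels."""
    def __init__(self):
        self.status = SLACK  # lower power consumption level by default

def matches_status_profile(accelerometer_peaks):
    # Assumed rule: a double knock (two defined peaks) matches the
    # status gesture profile associated with the switch command.
    return len(accelerometer_peaks) == 2

def on_contact_interaction(acoustic, accelerometer_peaks):
    # The accelerometer sensor is always active; when its status data
    # pattern matches the profile, the microcontroller toggles the
    # acoustic sensor to the active status.
    if matches_status_profile(accelerometer_peaks):
        acoustic.status = ACTIVE
    return acoustic.status
```

A single stray peak leaves the sensor in slack status; only the matching gesture wakes the system for subsequent contact interactions.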
- the control system includes a server and a terminal device.
- the subsequent contact interactions control a terminal device, when the acoustic sensor is in the active status.
- the server in communication with the accelerometer sensor and the acoustic sensor can include a routing module, a processing module being connected to the routing module, and an output module connected to the processing module.
- the terminal device includes a receiving module in communication with the output module of the server and means for initiating activity of the terminal device.
- the subsequent accelerometer data signals and acoustic data signals from the subsequent contact interaction determine a subsequent data pattern, which is transmitted to the server.
- the subsequent data pattern matches with a gesture profile. This gesture profile is associated with a command for the terminal device.
- the control system of the present invention has an accelerometer sensor that remains in an active status at a low power consumption level of the control system.
- the user can awaken the control system with a gesture detected by only the accelerometer sensor to switch the acoustic sensor into an active status.
- the control system is now at a full power consumption level, instead of a low power consumption level, so as to detect subsequent gestures for terminal devices with both the accelerometer sensor and the acoustic sensor.
- Other components of the system can be awakened to corresponding active statuses.
- the interaction of the acoustic sensor with the accelerometer sensor can filter background noise and inadvertent hits on the mounting surface.
- a sound detected by the acoustic sensor without a vibration detected by the accelerometer is now filtered from subsequent contact interactions intended to be gestures.
- Similarly, accidental bumps on the mounting surface produce vibrations without a corresponding sound, so they too are filtered from subsequent contact interactions intended to be gestures.
- the present invention improves accuracy of detecting gestures and saves energy by limiting the powering of both sensors, until activated for listening for gestures.
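- The two-sensor filtering logic lends itself to a simple coincidence check: an event counts as an intentional gesture only when both a vibration and a sound are detected close together in time. A minimal sketch, assuming a 50 ms coincidence window (the patent does not specify one):

```python
def is_intentional_gesture(vibration_time, sound_time, window=0.05):
    """Coincidence filter for the two sensors.

    vibration_time: accelerometer detection time in seconds, or None
    sound_time: acoustic detection time in seconds, or None
    A sound with no vibration (background noise) and a vibration with
    no sound (an accidental bump) are both rejected.
    """
    if vibration_time is None or sound_time is None:
        return False
    return abs(vibration_time - sound_time) <= window
```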
- Embodiments of the present invention include the method of power regulation of a system for controlling a terminal device.
- the method includes installing a housing of the system on a mounting surface by an engagement device, the housing being comprised of an accelerometer sensor, an acoustic sensor, and a microcontroller unit.
- the acoustic sensor has a first power consumption level so as to be in a slack status and a second power consumption level so as to be in an active status.
- the system consumes less power when the acoustic sensor is in the slack status and when the microcontroller is in a corresponding slack status.
- When initially activated, the system has the acoustic sensor and other components, such as the microcontroller, in respective slack statuses, and only the accelerometer sensor is in an active status.
- the method further includes making a physical impact on the mounting surface so as to generate a contact interaction and detecting the contact interaction as accelerometer data signals with the accelerometer sensor.
- the microcontroller unit receives the accelerometer data signals to determine a status data pattern and commands the acoustic sensor to switch from the slack status to the active status, when the status data pattern matches a status gesture profile. With the acoustic sensor in the active status, the system is now fully activated and powered for a subsequent contact interaction within a set time duration.
- the method also includes switching the active status back to the slack status when the subsequent contact interaction occurs after the set time duration passes.
- Embodiments of the method include connecting a server in communication with the accelerometer sensor and the acoustic sensor and connecting the terminal device in communication with the server.
- the system with the acoustic sensor in active status can detect the subsequent contact interactions. Making a subsequent physical impact on the mounting surface generates a subsequent contact interaction, when the acoustic sensor is in the active status and before the set time duration passes.
- Subsequent accelerometer data signals and acoustic data signals determine a subsequent data pattern.
- the server matches the subsequent data pattern to a gesture profile associated with a command for the terminal device.
- the command is sent to the terminal device for performing the activity according to the command.
- Each of the subsequent accelerometer data signals and the acoustic data signals confirm each other to more accurately determine the subsequent data pattern.
- the background noise and extraneous vibrations to the mounting surface are filtered for a more accurate subsequent data pattern.
- FIG. 1 is a schematic view of an embodiment of the control system of the present invention with the accelerometer sensor and the acoustic sensor.
- FIG. 2 is a top plan view of another embodiment of the housing with the accelerometer sensor and the acoustic sensor on the mounting surface of the present invention.
- FIG. 3 is a flow diagram of the embodiment of the method for power regulation of the control system of the present invention showing the slack status and the active status of the acoustic sensor.
- FIG. 4 is a schematic view of another embodiment of the control system of the present invention with the server and terminal device.
- FIG. 5 is a flow diagram of the embodiment of the method for controlling a terminal device in the ready mode, according to the embodiment of the present invention of FIG. 4.
- the control system of the present invention regulates power and improves accuracy of gesture detection.
- the control system of the present invention includes two sensors, in particular an accelerometer sensor and an acoustic sensor.
- the vibrations detected by the accelerometer are compared with the sounds detected by the acoustic sensor in order to more accurately identify an intentional gesture from extraneous stimuli.
- the accelerometer detects a vibration on the mounting surface, and the acoustic sensor confirms a corresponding sound to determine the data pattern.
- a vibration on the mounting surface, caused by an accidental bump can no longer be confused as a data pattern for a gesture, such as an intentional knock.
- the power requirements for two sensors and processing data signals from two sensors can be high.
- the power requirements for connecting to a server can also be high.
- the present invention further accounts for power regulation with a control system with toggling between an activated and fully powered system and an activated and power saving system.
- FIGS. 1-3 show the control system 10 with the housing 20 comprised of an engagement means 24 for a mounting surface 22 .
- Planar surfaces such as tables and walls, as well as non-planar surfaces, such as beds, can be mounting surfaces 22 .
- the engagement means 24 attaches the accelerometer sensor unit 35 and the acoustic sensor unit 35′ to the mounting surface 22 and reduces damping so that the accelerometer sensor unit 35 and the acoustic sensor unit 35′ more accurately detect contact interactions 60 on the mounting surface 22.
- the control system 10 of the present invention includes an accelerometer sensor 35 and an acoustic sensor 35 ′ as shown in FIG. 1 .
- the housing 20 contains the printed circuit board 30 comprised of a board 34 with a flash memory 31 , microcontroller unit (MCU) 33 , the accelerometer sensor unit 35 , the acoustic sensor unit 35 ′, antenna 37 , and light emitting diode 39 .
- the microcontroller unit 33 and antenna 37 can have wifi capability for communication with a server 40 (See FIG. 4 ).
- the microcontroller unit 33 is connected to the accelerometer sensor unit 35 , the acoustic sensor unit 35 ′, and the flash memory 31 .
- the rigid position of the printed circuit board 30 establishes the transmission of the contact interaction to the accelerometer sensor unit 35 and the acoustic sensor unit 35 ′.
- the engagement means 24 is in a fixed position relative to the accelerometer sensor unit 35 and the acoustic sensor unit 35 ′.
- Other parts in the housing 20 include batteries 36 as a known power supply for the control system 10 .
- the batteries 36 power both the accelerometer sensor unit 35 and the acoustic sensor unit 35 ′.
- the stable construction of the housing 20 and the accelerometer sensor unit 35 and the acoustic sensor unit 35 ′ enable the accurate and efficient conversion of the contact interactions 60 as gestures into commands for a terminal device 50 (See FIG. 4 ).
- FIG. 2 shows the accelerometer sensor unit 35 and the acoustic sensor unit 35 ′ forming respective zones 32 , 32 ′.
- the accelerometer sensor unit 35 forms an accelerometer interactive zone 32 defined by an accelerometer range 34 of the accelerometer sensor 35 .
- a contact interaction 60 with the mounting surface 22 within the accelerometer interactive zone 32 is detected by the accelerometer sensor unit 35 as accelerometer data signals 70 .
- the acoustic sensor unit 35 ′ forms an acoustic interactive zone 32 ′ defined by an acoustic range 34 ′ of the acoustic sensor unit 35 ′.
- a contact interaction with the mounting surface 22 within the acoustic interactive zone 32 ′ is detected by the acoustic sensor unit 35 ′ as acoustic data signals.
- the accelerometer interactive zone 32 of the accelerometer sensor unit 35 overlaps with the acoustic interactive zone 32 ′ of the acoustic sensor unit 35 ′.
- FIG. 2 shows the interactive zones 32 , 32 ′ aligned with the mounting surface 22 , in particular, the interactive zones 32 , 32 ′ are coplanar with the mounting surface 22 .
- the contact interaction 60 on the mounting surface 22 can be detected by the accelerometer sensor unit 35 and the acoustic sensor unit 35 ′ on the mounting surface 22 .
- the acoustic sensor unit 35 ′ has a first power consumption level so as to be in a slack status and a second power consumption level so as to be in an active status.
- In the activated and fully powered control system 10, the acoustic sensor unit 35′ is in the active status, and both sensor units 35, 35′ detect the respective data signals.
- the microcontroller 33 permits communication to a server in the activated and fully powered control system 10 .
- In the activated and power saving control system 10, the acoustic sensor 35′ is in the slack status, and only the accelerometer sensor unit 35 detects respective data signals.
- Other components of the control system 10 can also be in respective slack statuses for lower power consumption.
- For one type of lower power consumption, the microcontroller 33 does not transmit to the server 40 while in the corresponding slack status of the microcontroller.
- In a deactivated control system 10, both sensor units 35, 35′ are off.
- FIG. 3 is a flow diagram of an embodiment of the present invention, showing the accelerometer data signals 70 of the accelerometer sensor unit 35 in relation to the microcontroller unit 33 .
- the contact interaction 60 generates the data signals 70 of the accelerometer sensor unit 35 through the housing 20 .
- the contact interaction 60 is comprised of an impact or plurality of impacts associated with the mounting surface 22 .
- Alternatively, the impact or plurality of impacts on an associated surface is the contact interaction 60, rather than an impact directly on the mounting surface 22.
- Such impacts are coordinated with, correspond to, or translate to the mounting surface 22 for detection by the accelerometer sensor unit 35 through the mounting surface 22 as accelerometer data signals 70.
- the microcontroller unit 33 receives the accelerometer data signals 70 from the accelerometer sensor unit 35 . These accelerometer data signals 70 correspond to the contact interaction 60 associated with the mounting surface 22 .
- the microcontroller unit 33 determines the accelerometer data pattern 80 corresponding to the accelerometer data signals 70 of the contact interaction 60 .
- the microcontroller unit 33 also matches the status data pattern 80 with a status gesture profile 90 .
- the status gesture profile 90 is associated with a switch command to change the status of the acoustic sensor unit 35 ′ and other components of the control system 10 , such as enabling communication with a server by the microcontroller unit 33 .
- the control system 10 as the activated power saving system has lower power consumption as an energy saving or sleep or slack mode.
- control system 10 remains able to detect the contact interaction 60 corresponding to the status gesture profile 90 .
- the control system 10 remains ready to change into the higher power consumption as an activated and fully powered system.
- the control system 10 can power the microcontroller unit 33 to connect to the server 40 as the activated and fully powered system (See FIG. 4 ).
- the status gesture profile 90 can be comprised of a threshold level for the status data pattern 80 . Any data pattern above the threshold level matches the status gesture profile 90 .
- the control system 10 remains able to detect the contact interaction 60 corresponding to the status gesture profile 90 , such that the control system 10 can toggle between the slack status and active status of the acoustic sensor unit 35 ′ by gestures.
- An elderly person in a wheelchair is able to regulate turning on or turning off the control system 10 by knocking twice on a tabletop instead of locating a dedicated button on the housing 20 .
- the control system 10 is not required to maintain high power consumption. Both sensor units 35, 35′ are not drawing power at the same time.
- the accelerometer data signals 70 have a respective defined peak corresponding to each impact, a measured time period between each defined peak, and a defined time period after a last defined peak. Each peak is a distinct spike in the data being detected with a quick increase from a baseline or background activity.
- An accelerometer data pattern 80 for each contact interaction 60 is determined by each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak, if there is a plurality of impacts.
- FIG. 3 shows an embodiment for the contact interaction 60 comprised of one impact or a plurality of impacts. A single knock or a sequence of knocks can be a contact interaction 60 .
- the control system 10 determines the accelerometer data pattern 80 for contact interactions 60 comprised of a single tap, three quick knocks, two taps, and other sequences.
- Contact interactions 60 such as tapping, knocking, sweeping, and dragging, can be detected by the accelerometer sensor unit 35 as accelerometer data signals 70 .
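- The peak-and-interval description above can be sketched as a simple pattern extractor: a defined peak is a sample that spikes well above baseline, and the data pattern is the peak count plus the measured time periods between consecutive peaks. The baseline value and spike factor below are illustrative assumptions:

```python
def extract_data_pattern(samples, baseline=0.1, spike_factor=5.0):
    """Illustrative sketch of forming an accelerometer data pattern.

    samples: list of (timestamp, magnitude) tuples from the
    accelerometer data signals. Returns the number of defined peaks
    and the gaps (measured time periods) between them, which together
    distinguish e.g. a single tap from three quick knocks.
    """
    threshold = baseline * spike_factor  # quick increase from baseline
    peak_times = [t for t, mag in samples if mag > threshold]
    gaps = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return {"peaks": len(peak_times), "gaps": gaps}
```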
- the relationship between the microcontroller 33 and the acoustic sensor unit 35 ′ is timed.
- the toggle to active status of the acoustic sensor unit 35 ′ is limited by time. Only subsequent contact interactions within a set time duration maintain the active status of the acoustic sensor 35 ′.
- the control system 10 distinguishes between accidentally switching to active status and purposely switching to active status and the higher power consumption level. Once switched, the user must make a subsequent contact interaction within a predetermined amount of time, so that the subsequent contact interaction is detected by both sensor units 35 , 35 ′.
- the control system 10 prevents accidental powering of the acoustic sensor unit 35 ′ and avoids unnecessary power consumption.
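- The set-time-duration behavior can be sketched as a watchdog on the acoustic sensor's active status: subsequent interactions inside the window keep the sensor active, while a late interaction drops it back to slack status. The 10-second default is an assumed value, not one from the patent:

```python
class AcousticSensorTimer:
    """Sketch of the timed toggle between slack and active status."""

    def __init__(self, duration=10.0):
        self.duration = duration  # set time duration, in seconds
        self.status = "slack"
        self.activated_at = None

    def activate(self, now):
        # The status gesture switched the sensor to active status.
        self.status = "active"
        self.activated_at = now

    def on_subsequent_interaction(self, now):
        # Only subsequent contact interactions within the set time
        # duration maintain the active status; otherwise the sensor
        # returns to the power-saving slack status.
        if self.status == "active" and now - self.activated_at <= self.duration:
            self.activated_at = now  # interaction refreshes the window
            return True
        self.status = "slack"
        return False
```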
- When the control system 10 is set as an activated and fully powered system, the control system 10 is ready to detect subsequent contact interactions for controlling the terminal device.
- the subsequent contact interactions will be detected as subsequent accelerometer data signals and acoustic data signals.
- the interaction allows for more accurate detection of gestures with inadvertent hits and background noise being more easily filtered from intentional gestures for the control system 10 .
- FIGS. 4-5 show an alternative embodiment of the invention, with the control system 10 including a housing 20, an accelerometer sensor unit 35 and an acoustic sensor unit 35′ within the housing 20, a server 40 in communication with the sensor units 35, 35′, and a terminal device 50 in communication with the server 40.
- Interfaces 99 are connected to the server 40 in order to interact with the control system 10 .
- the interfaces 99 can include computers, laptops, tablets and smartphones.
- FIG. 4 shows a variety of different interfaces 99 .
- the interfaces 99 allow the user to adjust the settings of the control system 10 . Gestures by a user associated with the mounting surface 22 regulate the control system 10 and control the terminal devices 50 .
- the devices that are interfaces 99 could also be terminal devices 50 .
- the server 40 is in communication with the sensor units 35 , 35 ′, when the system is an activated and fully powered system.
- the communication can be wireless or wired.
- the connection between the server 40 and the sensor units 35 , 35 ′ can include a router 42 , as shown in FIG. 4 , and may also include wifi, Bluetooth, local area network, or other connections.
- the server 40 can be comprised of a routing module 44 , a processing module 46 being connected to the routing module 44 , and an output module 48 connected to the processing module 46 .
- the flow chart of FIG. 5 shows the control system 10 controlling activity of a terminal device 50 by a subsequent contact interaction 160 .
- the routing module 44 receives the subsequent accelerometer data signals 170 from the accelerometer sensor unit 35 and the acoustic data signals 70 ′ from the acoustic sensor unit 35 ′. These subsequent accelerometer data signals 170 and acoustic data signals 70 ′ correspond to other subsequent contact interactions 160 associated with the mounting surface 22 , when the acoustic sensor unit 35 ′ is in active status.
- the processing module 46 determines the subsequent data pattern 180 corresponding to the subsequent accelerometer data signals 170 and acoustic data signals 70 ′ of the subsequent contact interaction 160 .
- the processing module 46 also matches the subsequent data pattern 180 with a gesture profile 190 .
- the gesture profile 190 is associated with a command for the terminal device 50 , such as power off or change channels or dim intensity. Then, the output module 48 transmits the command to the terminal device 50 .
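The routing, processing, and output steps above can be illustrated with a minimal Python sketch. The tuple-based pattern encoding, the `GESTURE_PROFILES` table, and the list standing in for a terminal device are hypothetical simplifications for demonstration, not the patented implementation.

```python
# Illustrative sketch of the routing/processing/output module chain.
# The profile table, the label-based signal encoding, and the list
# standing in for a terminal device are assumptions for demonstration.

GESTURE_PROFILES = {
    ("knock", "knock", "knock"): "channel_up",  # e.g. three fast knocks
    ("knock", "knock"): "power_off",            # e.g. two knocks
}

def routing_module(accel_signals, acoustic_signals):
    """Receives subsequent data signals from both sensor units."""
    return list(zip(accel_signals, acoustic_signals))

def processing_module(paired_signals):
    """Determines the subsequent data pattern and matches it with a
    gesture profile; a pattern here is the tuple of impact labels on
    which both sensors agree."""
    pattern = tuple(a for a, s in paired_signals if a == s)
    return GESTURE_PROFILES.get(pattern)

def output_module(command, terminal_device):
    """Transmits the matched command to the terminal device."""
    terminal_device.append(command)
    return command

television = []  # stand-in terminal device that records received commands
cmd = processing_module(routing_module(["knock"] * 3, ["knock"] * 3))
output_module(cmd, television)
```

The sketch keeps the division of labor described above: the routing module only pairs signals, the processing module owns pattern determination and profile matching, and the output module only transmits.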
- the terminal device 50 is a television.
- another contact interaction 160 of three fast knocks can be detected as subsequent accelerometer data signals 170 and acoustic data signals 70 ′ to generate a subsequent data pattern 180 .
- the subsequent data pattern 180 can be matched to a gesture profile 190 associated with changing channels up one channel.
- the output module 48 communicates the command to change channels up one channel through the server 40 to the television as the terminal device 50 .
- that same elderly person in a wheelchair is able to activate the control system 10 by knocking so that the person can change channels by knocking twice on a tabletop instead of locating a dedicated button on the television or fiddling with a touchscreen on a smartphone.
- the terminal device 50 can be an appliance, such as a television, stereo or coffee machine.
- the terminal device 50 may be a device running software, a light or climate regulator, such as a thermostat, fan or lighting fixture.
- the activity of the terminal device 50 depends upon the terminal device 50 .
- the activity is dedicated to the particular terminal device 50 .
- the command associated with the gesture profile 190 relates to the particular terminal device 50 .
- Knocking twice on a tabletop can be converted by the control system 10 into a command to change channels on a television or to lower the temperature of a thermostat or to create an entry in an online calendar software program on a computer.
- the control system 10 can also be used with multiple terminal devices 50 .
- a gesture profile 190 for a command is specific for an activity for a particular terminal device 50 . More than one terminal device 50 can be connected to the server 40 to receive the commands from gestures by the user against the mounting surface 22 .
- each of the subsequent accelerometer data signals 170 and the acoustic data signals have a respective defined peak corresponding to each impact, a measured time period between each defined peak, and a defined time period after a last defined peak. These peaks correspond to vibration data for the accelerometer sensor unit 35 and sound data for the acoustic sensor unit 35 ′. Each peak is a distinct spike in the data being detected with a quick increase from a baseline or background activity.
- the subsequent data pattern 180 for each subsequent contact interaction 160 is determined by each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak, if there is a plurality of impacts.
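A data pattern built from defined peaks and the measured time periods between them might be derived as in the following sketch; the baseline level, peak threshold, sampling interval, and rising-edge detection are illustrative assumptions, not values from the specification.

```python
# Minimal sketch of deriving a data pattern from defined peaks: a peak
# is a quick rise above baseline, and the pattern is the sequence of
# measured time periods between peaks. Baseline, threshold, and the
# sampling interval are assumed values for illustration.

BASELINE = 0.2        # background activity level (arbitrary units)
PEAK_THRESHOLD = 1.0  # rise above baseline that counts as a defined peak

def find_peaks(samples, dt=0.01):
    """Return timestamps of defined peaks in a signal sampled every dt seconds."""
    peaks, above = [], False
    for i, value in enumerate(samples):
        if value >= BASELINE + PEAK_THRESHOLD:
            if not above:  # rising edge marks a new distinct spike
                peaks.append(round(i * dt, 3))
            above = True
        else:
            above = False
    return peaks

def data_pattern(peak_times):
    """Pattern = the measured time periods between each defined peak."""
    return [round(b - a, 3) for a, b in zip(peak_times, peak_times[1:])]
```

The same extraction would run on both the subsequent accelerometer data signals and the acoustic data signals, yielding two peak sequences whose timing can then be compared.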
- FIG. 5 shows an embodiment for the subsequent contact interaction 160 comprised of one impact or a plurality of impacts.
- a single knock or a sequence of knocks can be a subsequent contact interaction 160 .
- the control system 10 determines the subsequent data pattern 180 for subsequent contact interactions 160 comprised of a single tap, three quick knocks, two taps, and other sequences.
- Subsequent contact interactions 160 such as tapping, knocking, sweeping, and dragging, can be detected by the accelerometer sensor unit 35 and acoustic sensor unit 35 ′.
- each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak of the acoustic data signals 70 ′ confirm each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak of the subsequent accelerometer data signals 170 . If a user knocks twice and then sets a glass down, the accelerometer detects three identical vibrations and the acoustic sensor, such as a microphone, detects the first two vibrations as from a first object and the surface (the user's hand knocking twice) and the third vibration as from a second object and the surface (setting the glass down) because the third sound was different from the first two sounds. The unwanted signals from the glass being set down are filtered with a degree of accuracy beyond the prior art.
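The confirmation of the accelerometer peaks by the acoustic peaks in the knocks-and-glass example can be sketched as follows. The alignment tolerance and the sound-class labels are assumptions for illustration; an actual implementation would classify sounds from the audio signal itself.

```python
# Hedged sketch of confirming accelerometer peaks with acoustic peaks.
# The alignment tolerance and the sound-class labels are illustrative
# assumptions; a real system would classify sounds from audio features.

TOLERANCE_S = 0.05  # assumed window for a sound to match a vibration

def confirm_pattern(accel_peaks, acoustic_peaks):
    """accel_peaks: timestamps of vibration spikes.
    acoustic_peaks: (timestamp, sound_class) tuples from the microphone.
    Keeps only vibrations accompanied by a sound of the same class as
    the first impact, filtering e.g. a glass set down after two knocks."""
    if not acoustic_peaks:
        return []
    reference_class = acoustic_peaks[0][1]  # sound of the first impact
    confirmed = []
    for t in accel_peaks:
        for ts, cls in acoustic_peaks:
            if abs(ts - t) <= TOLERANCE_S and cls == reference_class:
                confirmed.append(t)
                break
    return confirmed
```

With three identical vibrations but a different third sound, only the first two impacts survive confirmation, matching the filtering behavior described above.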
- Setting a bag on a tabletop may cause a vibration to be detected by the accelerometer sensor unit 35 and a sound detected by the acoustic sensor unit 35 ′.
- Knocking on a tabletop by the user as an intentional gesture may cause the same vibration to be detected by the accelerometer sensor unit 35 and a respective sound detected by the acoustic sensor unit 35 ′.
- the control system 10 can now distinguish setting the bag on the tabletop from knocking as an intentional gesture to control the terminal device 50 .
- the data pattern of setting the bag on the tabletop can generate a vibration analogous to the vibration of knocking as the intentional gesture, but the respective sound of the knocking as the intentional gesture is different. Thus, the data pattern of setting the bag no longer matches the subsequent data pattern of the knocking as an intentional gesture.
- the present invention prioritizes the accelerometer data signals to confirm the contact interaction in the accelerometer interactive zone, not the acoustic interactive zone.
- the acoustic interactive zone overlaps the mounting surface, but the acoustic interactive zone may be too large and detect too many sounds not associated with a subsequent contact interaction. A sound can be heard, but the location of origin of the sound can be difficult to screen. Filtering the acoustic data signals with the accelerometer sensor unit requires too much power and processing time.
- the present invention selects a particular hierarchy of the sensors, server and microcontroller unit.
- the accelerometer sensor unit 35 relates to the acoustic sensor unit 35 ′, microcontroller unit 33 and the server 40 to regulate power and more accurately determine subsequent data patterns for commands to the terminal devices.
- the present invention provides an improved system and method for controlling a terminal device.
- a user can make gestures to control activity of a terminal device, such as knocking against a wall to illuminate an overhead light.
- Reliably detecting gestures with a sensor is more complicated than simply activating an accelerometer or microphone to capture vibration and sound data.
- Extraneous stimuli, like background noise and inadvertent vibrations, interfere with identifying intentional vibrations and sounds intended to be gestures for controlling the terminal device.
- the control system of the present invention sets an accelerometer sensor unit, an acoustic sensor unit, a microcontroller, and a server in a particular relationship to more accurately detect the intentional gestures.
- the acoustic sensor unit confirms the accelerometer sensor unit to ensure the location of the gesture is in the accelerometer interactive zone.
- the control system further regulates power consumption with the interaction of the two sensors and the microcontroller. Although two sensors are needed for more accurate detection of gestures, the power demands of a system with two sensors cannot be easily sustained.
- the present invention regulates power consumption by toggling between an activated and fully powered system and an activated and power saving system, coordinated with the two sensors of the control system.
Description
- The present invention relates to a control system for a terminal device, such as a television, lighting fixture, thermostat or laptop. More particularly, the present invention relates to controlling the terminal device with gestures. Additionally, the present invention relates to more accurately distinguishing gestures from background environment and managing power consumption.
- With the development of electronic technology, output devices or terminal devices are used daily and are increasingly integrated with interactive features in order to enhance convenience and functionality. Users now can use a control system or controller, such as a remote control device, to adjust lights, curtains, a thermostat etc. Existing control systems include distinct remote control devices dedicated to and associated with the particular output or terminal device to be controlled. Remote control devices can also be associated with more than one terminal device, such as a master controller for electronics and a touchscreen computer tablet made integral with furniture or walls to control lighting and room temperature. Any computer with an interface (keyboard, mouse, touch pad or touchscreen) can be a remote control device for multiple terminal devices with smart technology. Mobile phones are also known to be enabled for controlling terminal devices, such as home security cameras and door locks. Another existing control system involves voice recognition technology.
- Existing control systems have limitations. Each output or terminal device typically is associated with a respective remote control device, such as a controller for the cable box, a controller for the DVD player, and a controller for the sound mixer. An excessive number of controllers is needed in order to remotely control multiple devices. Furthermore, an individual controller is often misplaced or left in locations that are not readily accessible to the user. The user must search for a controller or change locations to access the controller. Additionally, voice recognition technology often requires cumbersome training sessions to calibrate for pronunciations and accents of each particular user. Furthermore, voice recognition technology is often impaired by background noise resulting in difficulties for that control system to recognize verbal commands. Additionally, the sound produced by voice commands may be obtrusive in many environments such as in a room where others are sleeping, or in a room while watching a movie.
- For remote control devices associated with multiple terminal devices, for example, computer tablets with a touchscreen and computers with touchpads, remote control devices can be built into or integrated into furniture. Smart tables have been built with touchscreens that are able to receive touch-based gestures. In the case of integrating these touchscreen or touch pads into surfaces of structures such as furniture, the cost of the structure is significantly increased due to design modifications required to accommodate the remote control device, and the cost of the components and hardware. Furthermore, aesthetics are often affected. Appearances are altered when furniture, walls and surroundings are filled with touchscreens, touchpads, and other conspicuous devices. Integration of such hardware into furniture also requires the manufacturer to modify existing designs such that the hardware can be accommodated into the structure.
- Prior art manual control systems range from buttons on a television remote controller to a touchscreen of a mobile phone. Simple gestures of pressing dedicated buttons and complex gestures of finger motions on a touchscreen are both used to control terminal devices. Various patents and publications are available in the field of these manual control systems.
- U.S. Pat. No. 8,788,978, issued to Stedman et al on Jul. 22, 2014, teaches a gesture sensitive interface for a computer. The “pinch zoom” functionality is the subject matter, so that the detection of first and second interaction points, and the relative motion between the points are detected by sensors. A variety of sensors are disclosed to define the field, including a touch screen, camera, motion sensor, and proximity sensors.
- World Intellectual Property Organization Publication No. WO2013165348, published for Bess on Nov. 7, 2013, describes a system with at least three accelerometers disposed in different locations of an area with a surface to capture respective vibration data corresponding to a command tapped onto the surface by a user. A processing system receives the vibration data from each accelerometer, identifying the command and a location of the user from the vibration data. A control signal based on the command and the location is generated.
- U.S. Patent Publication No. 20140225824, published for Shpunt et al on Aug. 14, 2014, discloses flexible room controls. A control apparatus includes a projector for directing first light toward a scene that includes a hand of a user in proximity to a wall of a room and to receive the first light that is reflected from the scene, and to direct second light toward the wall so as to project an image of a control device onto the wall. A processor detects hand motions within the projected field.
- U.S. Patent Publication No. 20120249416, published for Maciocci et al on Oct. 4, 2012, describes another projection system with gesture identification. The projector is a unit worn on the body of the user to project onto surfaces, such as walls and tables. Spatial data is detected by a sensor array. Additional rendering operations may include tracking movements of the recognized body parts, applying a detection algorithm to the tracked movements to detect a predetermined gesture, applying a command corresponding to the detected predetermined gesture, and updating the projected images in response to the applied command.
- U.S. Patent Publication No. 20100019922, published for Van Loenen on Jan. 28, 2010, describes known prior art for an interactive surface controlled by tapping. Sound detection is filtered and interpreted either in the system to be controlled or else in the sensors themselves. The direction of movement of a hand stroking the surface can be interpreted as a command to increase or decrease a parameter, such as the sound volume level of a television, for example. Determination of the position of the user's hand is unnecessary.
- In other innovative systems, a control system can convert any independent mounting surface into a controller for a terminal device. A physically separate mounting surface, such as a wall or table surface, can be used to activate and deactivate a television or light fixtures, without the user touching either appliance. The control system includes a housing engaged to a mounting surface, a sensor and microcontroller unit within the housing, a server in communication with the sensor, and a terminal device in communication with the server. The terminal device is to be controlled by gestures associated with the mounting surface. The control system further includes a server in communication with the sensor, including but not limited to wifi, Bluetooth, local area network, wired or other wireless connection. The terminal device can be an appliance, lighting fixture or climate regulator.
- For gestures associated with the mounting surface, there is a need to distinguish the gestures from background noise. When the sensor is an acoustic sensor, background noise can affect the ability of the control system to identify the gesture from ambient sounds. When the sensor is an accelerometer, accidentally colliding with the mounting surface or setting a coffee cup on the mounting surface can affect the ability of the control system to identify the gesture from inadvertent hits on the mounting surface. Additionally, the sensors require power in order to remain active for detecting gestures. In order to regulate power consumption, switching between an energy-saving mode and an active mode can save energy. There are needs to improve the control systems for accurately detecting gestures and saving energy.
- It is an object of the present invention to provide a system and method for controlling a terminal device.
- It is an object of the present invention to provide a system and method to control a terminal device with gestures, including but not limited to knocks.
- It is another object of the present invention to provide a system and method to more accurately detect gestures.
- It is still another object of the present invention to provide a system and method to distinguish gestures from background stimuli.
- It is another object of the present invention to provide a system and method with two different sensors to identify gestures, including but not limited to knocks.
- It is still another object of the present invention to provide a system and method to confirm a sensor with another sensor with an improved level of confidence.
- It is an object of the present invention to provide a system and method to regulate power consumption by a slack mode and an active mode, said active mode requiring more power than said slack mode.
- These and other objectives and advantages of the present invention will become apparent from a reading of the attached specification.
- Embodiments of the present invention include a control system comprising a housing, an accelerometer sensor, an acoustic sensor, and a microcontroller. The housing has an engagement means for a mounting surface, and both the accelerometer sensor and acoustic sensor are contained within the housing. Each sensor forms a respective interactive zone defined by a range of the sensor, and each interactive zone is aligned with the mounting surface. Additionally, the acoustic sensor has a first power consumption level so as to be in a slack status and a second power consumption level so as to be in an active status. The slack status is a relatively lower power mode than the active status. The acoustic sensor is not devoid of activity; the acoustic sensor is resting, but still operating. The acoustic sensor generally stays in the slack status at the lower power consumption level, while the accelerometer sensor remains in a respective active status. The system saves energy with the acoustic sensor in the slack status and with other components in respective slack statuses.
- A contact interaction associated with the mounting surface within the accelerometer interactive zone is detected by the accelerometer sensor as accelerometer data signals. The contact interaction is also within the acoustic interactive zone, but the acoustic sensor is in slack status so the contact interaction is not detected by the acoustic sensor. However, the microcontroller unit is contained within the housing and connected to the accelerometer sensor. The microcontroller unit receives the accelerometer data signals from the accelerometer sensor and determines a status data pattern corresponding to the accelerometer data signals of the contact interaction. The status data pattern can match a status gesture profile associated with a command to switch the acoustic sensor from the slack status to the active status. The microcontroller toggles the acoustic sensor to the active status, and the control system is ready to detect a subsequent contact interaction with both the accelerometer sensor and the acoustic sensor.
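The microcontroller's wake-up decision described in this embodiment can be sketched as follows; the two-impact status gesture profile and its timing values are illustrative assumptions rather than claimed values.

```python
# Minimal sketch of the microcontroller's wake-up decision: a status
# data pattern is computed from accelerometer signals alone and, on a
# match with the status gesture profile, the acoustic sensor is toggled
# to active status. The two-impact profile values are assumptions.

STATUS_GESTURE_PROFILE = {"impacts": 2, "max_gap_s": 0.5}  # e.g. two quick knocks

def matches_status_profile(impact_times):
    """Status data pattern: impact count and gaps between impacts."""
    if len(impact_times) != STATUS_GESTURE_PROFILE["impacts"]:
        return False
    gaps = [b - a for a, b in zip(impact_times, impact_times[1:])]
    return all(g <= STATUS_GESTURE_PROFILE["max_gap_s"] for g in gaps)

class Microcontroller:
    def __init__(self):
        self.acoustic_active = False  # acoustic sensor starts in slack status

    def on_accelerometer_signals(self, impact_times):
        """Toggle the acoustic sensor to active status on a profile match."""
        if matches_status_profile(impact_times):
            self.acoustic_active = True
        return self.acoustic_active
```

Note that only the always-active accelerometer participates in this decision; the acoustic sensor consumes its higher power level only after the toggle.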
- Another embodiment of the control system includes a server and a terminal device. The subsequent contact interactions control a terminal device, when the acoustic sensor is in the active status. The server in communication with the accelerometer sensor and the acoustic sensor can include a routing module, a processing module being connected to the routing module, and an output module connected to the processing module. The terminal device includes a receiving module in communication with the output module of the server and means for initiating activity of the terminal device. The subsequent accelerometer data signals and acoustic data signals from the subsequent contact interaction determine a subsequent data pattern, which is transmitted to the server. The subsequent data pattern matches with a gesture profile. This gesture profile is associated with a command for the terminal device.
- The control system of the present invention has an accelerometer sensor that remains in an active status at a low power consumption level of the control system. The user can awaken the control system with a gesture detected by only the accelerometer sensor to switch the acoustic sensor into an active status. Thus, the control system is now at a full power consumption level, instead of a low power consumption level, so as to detect subsequent gestures for terminal devices with both the accelerometer sensor and the acoustic sensor. Other components of the system can be awakened to corresponding active statuses. Also, the interaction of the acoustic sensor with the accelerometer sensor can filter background noise and inadvertent hits on the mounting surface. A sound detected by the acoustic sensor without a vibration detected by the accelerometer is now filtered from subsequent contact interactions intended to be gestures. Similarly, accidental bumps on the mounting surface are vibrations without a sound corresponding to a subsequent contact interaction intended to be a gesture. The present invention improves accuracy of detecting gestures and saves energy by limiting the powering of both sensors, until activated for listening for gestures.
- Embodiments of the present invention include the method of power regulation of a system for controlling a terminal device. The method includes installing a housing of the system on a mounting surface by an engagement device, the housing being comprised of an accelerometer sensor, an acoustic sensor, and a microcontroller unit. The acoustic sensor has a first power consumption level so as to be in a slack status and a second power consumption level so as to be in an active status. The system consumes less power when the acoustic sensor is in the slack status and when the microcontroller is in a corresponding slack status. When initially activated, the system has the acoustic sensor and other components, such as the microcontroller, in respective slack statuses, and only the accelerometer sensor is in an active status.
- The method further includes making a physical impact on the mounting surface so as to generate a contact interaction and detecting the contact interaction as accelerometer data signals with the accelerometer sensor. The microcontroller unit receives the accelerometer data signals to determine a status data pattern and commands the acoustic sensor to switch from the slack status to the active status, when the status data pattern matches a status gesture profile. With the acoustic sensor in the active status, the system is now fully activated and powered for a subsequent contact interaction within a set time duration. The method also includes switching the active status back to the slack status when the subsequent contact interaction occurs after the set time duration passes.
- Embodiments of the method include connecting a server in communication with the accelerometer sensor and the acoustic sensor and connecting the terminal device in communication with the server. The system with the acoustic sensor in active status can detect the subsequent contact interactions. Making a subsequent physical impact on the mounting surface generates a subsequent contact interaction, when the acoustic sensor is in the active status and before the set time duration passes. Subsequent accelerometer data signals and acoustic data signals determine a subsequent data pattern. The server matches the subsequent data pattern to a gesture profile associated with a command for the terminal device. The command is sent to the terminal device for performing the activity according to the command. Each of the subsequent accelerometer data signals and the acoustic data signals confirm each other to more accurately determine the subsequent data pattern. The background noise and extraneous vibrations to the mounting surface are filtered for a more accurate subsequent data pattern.
FIG. 1 is a schematic view of an embodiment of the control system of the present invention with the accelerometer sensor and the acoustic sensor. -
FIG. 2 is a top plan view of another embodiment of the housing with the accelerometer sensor and the acoustic sensor on the mounting surface of the present invention. -
FIG. 3 is a flow diagram of the embodiment of the method for power regulation of the control system of the present invention showing the slack status and the active status of the acoustic sensor. -
FIG. 4 is a schematic view of another embodiment of the control system of the present invention with the server and terminal device. -
FIG. 5 is a flow diagram of the embodiment of the method for controlling a terminal device in the ready mode, according to the embodiment of the present invention of FIG. 4. - The control system of the present invention regulates power and improves accuracy of gesture detection. To better distinguish gestures from background noise and accidental vibrations, the control system of the present invention includes two sensors, in particular an accelerometer sensor and an acoustic sensor. The vibrations detected by the accelerometer are compared with the sounds detected by the acoustic sensor in order to more accurately identify an intentional gesture from extraneous stimuli. The accelerometer detects a vibration on the mounting surface, and the acoustic sensor confirms a corresponding sound to determine the data pattern. A vibration on the mounting surface, caused by an accidental bump, can no longer be confused as a data pattern for a gesture, such as an intentional knock. Furthermore, a sound without a vibration on the mounting surface can no longer be confused as a data pattern for a gesture. The power requirements for two sensors and processing data signals from two sensors can be high. The power requirements for connecting to a server can also be high. The present invention further accounts for power regulation by toggling the control system between an activated and fully powered system and an activated and power saving system.
FIGS. 1-3 show the control system 10 with the housing 20 comprised of an engagement means 24 for a mounting surface 22. Planar surfaces, such as tables and walls, as well as non-planar surfaces, such as beds, can be mounting surfaces 22. There is a rigid positioning of the accelerometer sensor unit 35 and the acoustic sensor unit 35′ relative to the mounting surface 22 through the housing 20. Any sound or vibration or both of the mounting surface 22 is transmitted to the accelerometer sensor unit 35 and the acoustic sensor unit 35′. The engagement means 24 attaches the accelerometer sensor unit 35 and the acoustic sensor unit 35′ and reduces damping so that the accelerometer sensor unit 35 and the acoustic sensor unit 35′ more accurately detect contact interactions 60 on the mounting surface 22. - The
control system 10 of the present invention includes an accelerometer sensor 35 and an acoustic sensor 35′ as shown in FIG. 1. The housing 20 contains the printed circuit board 30 comprised of a board 34 with a flash memory 31, microcontroller unit (MCU) 33, the accelerometer sensor unit 35, the acoustic sensor unit 35′, antenna 37, and light emitting diode 39. The microcontroller unit 33 and antenna 37 can have wifi capability for communication with a server 40 (See FIG. 4). The microcontroller unit 33 is connected to the accelerometer sensor unit 35, the acoustic sensor unit 35′, and the flash memory 31. The rigid position of the printed circuit board 30 establishes the transmission of the contact interaction to the accelerometer sensor unit 35 and the acoustic sensor unit 35′. The engagement means 24 is in a fixed position relative to the accelerometer sensor unit 35 and the acoustic sensor unit 35′. Other parts in the housing 20 include batteries 36 as a known power supply for the control system 10. The batteries 36 power both the accelerometer sensor unit 35 and the acoustic sensor unit 35′. The stable construction of the housing 20 and the accelerometer sensor unit 35 and the acoustic sensor unit 35′ enables the accurate and efficient conversion of the contact interactions 60 as gestures into commands for a terminal device 50 (See FIG. 4). - In this embodiment of the
control system 10, FIG. 2 shows the accelerometer sensor unit 35 and the acoustic sensor unit 35′ forming respective interactive zones 32, 32′. The accelerometer sensor unit 35 forms an accelerometer interactive zone 32 defined by an accelerometer range 34 of the accelerometer sensor 35. A contact interaction 60 with the mounting surface 22 within the accelerometer interactive zone 32 is detected by the accelerometer sensor unit 35 as accelerometer data signals 70. The acoustic sensor unit 35′ forms an acoustic interactive zone 32′ defined by an acoustic range 34′ of the acoustic sensor unit 35′. A contact interaction with the mounting surface 22 within the acoustic interactive zone 32′ is detected by the acoustic sensor unit 35′ as acoustic data signals. The accelerometer interactive zone 32 of the accelerometer sensor unit 35 overlaps with the acoustic interactive zone 32′ of the acoustic sensor unit 35′. FIG. 2 shows the interactive zones 32, 32′ aligned with the mounting surface 22, in particular, the interactive zones 32, 32′ overlapping the mounting surface 22. The contact interaction 60 on the mounting surface 22 can be detected by the accelerometer sensor unit 35 and the acoustic sensor unit 35′ on the mounting surface 22. - In the present invention, the
acoustic sensor unit 35′ has a first power consumption level so as to be in a slack status and a second power consumption level so as to be in an active status. In the activated and fully powered control system 10, the acoustic sensor unit 35′ is in the active status, and both sensor units 35, 35′ detect respective data signals. The microcontroller 33 permits communication to a server in the activated and fully powered control system 10. In the activated and power saving control system 10, the acoustic sensor 35′ is in the slack status, and only the accelerometer sensor unit 35 detects respective data signals. Other components of the control system 10, such as the microcontroller 33, can also be in respective slack status for lower power consumption. For example, the microcontroller 33 is not transmitting to the server 40 in the corresponding slack status of the microcontroller for one type of lower power consumption. In the deactivated control system 10, both sensor units 35, 35′ do not detect data signals. - The
control system 10 regulates power with the acoustic sensor unit 35′ and the microcontroller unit 33 in relation to the accelerometer sensor unit 35. FIG. 3 is a flow diagram of an embodiment of the present invention, showing the accelerometer data signals 70 of the accelerometer sensor unit 35 in relation to the microcontroller unit 33. The contact interaction 60 generates the data signals 70 of the accelerometer sensor unit 35 through the housing 20. In the present invention, the contact interaction 60 is comprised of an impact or plurality of impacts associated with the mounting surface 22. In some embodiments, the impact or plurality of impacts on an associated surface is the contact interaction 60, not an impact on the mounting surface 22 itself. The impacts are coordinated with, correspond to, or translate to the mounting surface 22 for detection by the accelerometer sensor unit 35 through the mounting surface 22 as accelerometer data signals 70. - According to
FIG. 3, the microcontroller unit 33 receives the accelerometer data signals 70 from the accelerometer sensor unit 35. These accelerometer data signals 70 correspond to the contact interaction 60 associated with the mounting surface 22. The microcontroller unit 33 determines the accelerometer data pattern 80 corresponding to the accelerometer data signals 70 of the contact interaction 60. The microcontroller unit 33 also matches the accelerometer data pattern 80 with a status gesture profile 90. The status gesture profile 90 is associated with a switch command to change the status of the acoustic sensor unit 35′ and other components of the control system 10, such as enabling communication with a server by the microcontroller unit 33. The control system 10 as the activated power saving system has lower power consumption as an energy saving or sleep or slack mode. However, the control system 10 remains able to detect the contact interaction 60 corresponding to the status gesture profile 90. The control system 10 remains ready to change into the higher power consumption as an activated and fully powered system. The control system 10 can power the microcontroller unit 33 to connect to the server 40 as the activated and fully powered system (see FIG. 4). The status gesture profile 90 can be comprised of a threshold level for the accelerometer data pattern 80. Any data pattern above the threshold level matches the status gesture profile 90. - The
control system 10 remains able to detect the contact interaction 60 corresponding to the status gesture profile 90, such that the control system 10 can toggle between the slack status and the active status of the acoustic sensor unit 35′ by gestures. An elderly person in a wheelchair is able to turn the control system 10 on or off by knocking twice on a tabletop instead of locating a dedicated button on the housing 20. The control system 10 is not required to maintain high power consumption. Both sensor units 35, 35′ need not remain fully powered at all times. - In the embodiments of the
control system 10, the accelerometer data signals 70 have a respective defined peak corresponding to each impact, a measured time period between each defined peak, and a defined time period after a last defined peak. Each peak is a distinct spike in the data being detected, with a quick increase from a baseline or background activity. An accelerometer data pattern 80 for each contact interaction 60 is determined by each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak, if there is a plurality of impacts. FIG. 3 shows an embodiment for the contact interaction 60 comprised of one impact or a plurality of impacts. A single knock or a sequence of knocks can be a contact interaction 60. The control system 10 determines the accelerometer data pattern 80 for contact interactions 60 comprised of a single tap, three quick knocks, two taps, and other sequences. Contact interactions 60, such as tapping, knocking, sweeping, and dragging, can be detected by the accelerometer sensor unit 35 as accelerometer data signals 70. - The relationship between the
microcontroller 33 and the acoustic sensor unit 35′ is timed. The toggle to active status of the acoustic sensor unit 35′ is limited by time. Only subsequent contact interactions within a set time duration maintain the active status of the acoustic sensor unit 35′. The control system 10 distinguishes between accidentally switching to active status and purposely switching to active status and the higher power consumption level. Once switched, the user must make a subsequent contact interaction within a predetermined amount of time, so that the subsequent contact interaction is detected by both sensor units 35, 35′. The control system 10 prevents accidental powering of the acoustic sensor unit 35′ and avoids unnecessary power consumption. - Now that the
control system 10 can be set as an activated and fully powered system, the control system 10 is ready to detect subsequent contact interactions for controlling the terminal device. The subsequent contact interactions will be detected as subsequent accelerometer data signals and acoustic data signals. There will be two sets of data signals to determine a subsequent data pattern, and the server can determine a command for the terminal device with a particular processing of the subsequent data pattern from the two sets of data signals. The interaction allows for more accurate detection of gestures, with inadvertent hits and background noise being more easily filtered from intentional gestures for the control system 10. -
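As a sketch of the two-signal processing just described, the flow from received data signals to a terminal-device command might look as follows in Python. The pattern representation, the profile table, and the function names are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical gesture-profile table mapping a subsequent data pattern
# (gesture kind, peak count) to a command for the terminal device.
GESTURE_PROFILES = {
    ("knock", 3): "channel_up",
    ("knock", 2): "power_off",
}

def processing_module(accel_peaks, acoustic_peaks):
    """Determine the subsequent data pattern from the two sets of data signals.

    Simplified: the accelerometer peak count must be confirmed by an equal
    acoustic peak count; otherwise no pattern is produced.
    """
    if len(accel_peaks) != len(acoustic_peaks):
        return None  # acoustic data fails to confirm the vibration data
    return ("knock", len(accel_peaks))

def output_module(pattern, send_to_terminal):
    """Match the pattern to a gesture profile and transmit the command."""
    command = GESTURE_PROFILES.get(pattern)
    if command is not None:
        send_to_terminal(command)
    return command
```

Under these assumptions, three knocks confirmed by three matching sounds yield a `channel_up` command, while a mismatched count (for example, a vibration with no accompanying sound) yields no command at all.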
FIG. 4-5 show an alternative embodiment of the invention, with the control system 10 including a housing 20, an accelerometer sensor unit 35 and an acoustic sensor unit 35′ within the housing 20, a server 40 in communication with the sensor units 35, 35′, and a terminal device 50 in communication with the server 40. Interfaces 99 are connected to the server 40 in order to interact with the control system 10. The interfaces 99 can include computers, laptops, tablets and smartphones. FIG. 4 shows a variety of different interfaces 99. The interfaces 99 allow the user to adjust the settings of the control system 10. Gestures by a user associated with the mounting surface 22 regulate the control system 10 and control the terminal devices 50. In some embodiments, the devices that are interfaces 99 could also be terminal devices 50. The server 40 is in communication with the sensor units 35, 35′. The communication between the server 40 and the sensor units 35, 35′ can be through a router 42, as shown in FIG. 4, and may also include wifi, Bluetooth, a local area network, or other connections. In FIG. 4, the server 40 can be comprised of a routing module 44, a processing module 46 connected to the routing module 44, and an output module 48 connected to the processing module 46. - The flow chart of
FIG. 5 shows the control system 10 controlling activity of a terminal device 50 by a subsequent contact interaction 160. The routing module 44 receives the subsequent accelerometer data signals 170 from the accelerometer sensor unit 35 and the acoustic data signals 70′ from the acoustic sensor unit 35′. These subsequent accelerometer data signals 170 and acoustic data signals 70′ correspond to other subsequent contact interactions 160 associated with the mounting surface 22, when the acoustic sensor unit 35′ is in active status. The processing module 46 determines the subsequent data pattern 180 corresponding to the subsequent accelerometer data signals 170 and acoustic data signals 70′ of the subsequent contact interaction 160. The processing module 46 also matches the subsequent data pattern 180 with a gesture profile 190. The gesture profile 190 is associated with a command for the terminal device 50, such as power off, change channels, or dim intensity. Then, the output module 48 transmits the command to the terminal device 50. For example, when the terminal device 50 is a television, another contact interaction 160 of three fast knocks can be detected as subsequent accelerometer data signals 170 and acoustic data signals 70′ to generate a subsequent data pattern 180. The subsequent data pattern 180 can be matched to a gesture profile 190 associated with changing channels up one channel. The output module 48 communicates the command to change channels up one channel through the server 40 to the television as the terminal device 50. Thus, that same elderly person in a wheelchair is able to activate the control system 10 by knocking, so that the person can change channels by knocking twice on a tabletop instead of locating a dedicated button on the television or fiddling with a touchscreen on a smartphone. - In the
control system 10, the terminal device 50 can be an appliance, such as a television, stereo or coffee machine. Alternatively, the terminal device 50 may be a device running software, or a light or climate regulator, such as a thermostat, fan or lighting fixture. The activity of the terminal device 50 depends upon the terminal device 50. The activity is dedicated to the particular terminal device 50. The command associated with the gesture profile 190 relates to the particular terminal device 50. Knocking twice on a tabletop can be converted by the control system 10 into a command to change channels on a television, to lower the temperature of a thermostat, or to create an entry in an online calendar software program on a computer. The control system 10 can also be used with multiple terminal devices 50. A gesture profile 190 for a command is specific to an activity for a particular terminal device 50. More than one terminal device 50 can be connected to the server 40 to receive the commands from gestures by the user against the mounting surface 22. - In the embodiments of the
control system 10, each of the subsequent accelerometer data signals 170 and the acoustic data signals have a respective defined peak corresponding to each impact, a measured time period between each defined peak, and a defined time period after a last defined peak. These peaks correspond to vibration data for the accelerometer sensor unit 35 and sound data for the acoustic sensor unit 35′. Each peak is a distinct spike in the data being detected, with a quick increase from a baseline or background activity. The subsequent data pattern 180 for each subsequent contact interaction 160 is determined by each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak, if there is a plurality of impacts. -
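The peak-and-interval structure described above can be sketched as a small routine that reduces timestamped sensor magnitudes to defined peaks and measured time periods. The sample format and the spike threshold are assumptions for illustration, not values from the disclosure.

```python
def data_pattern(samples, spike=1.0):
    """Reduce (time, magnitude) samples to defined peaks and the measured
    time periods between them, as described above.

    A defined peak is a quick rise from baseline: the magnitude crosses the
    spike threshold after having been below it.
    """
    peak_times = []
    above = False
    for t, magnitude in samples:
        if magnitude >= spike and not above:
            peak_times.append(t)  # new defined peak
        above = magnitude >= spike
    # measured time period between each defined peak
    between = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return {"peak_count": len(peak_times), "periods_between": between}
```

Two knocks roughly 0.4 s apart would reduce to a two-peak pattern with one measured period; the defined time period after the last peak would be measured by the caller once the signal stays below the threshold.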
FIG. 5 shows an embodiment for the subsequent contact interaction 160 comprised of one impact or a plurality of impacts. A single knock or a sequence of knocks can be a subsequent contact interaction 160. The control system 10 determines the subsequent data pattern 180 for subsequent contact interactions 160 comprised of a single tap, three quick knocks, two taps, and other sequences. Subsequent contact interactions 160, such as tapping, knocking, sweeping, and dragging, can be detected by the accelerometer sensor unit 35 and the acoustic sensor unit 35′. - In the present invention, each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak of the acoustic data signals 70′ confirm each defined peak and the defined time period after the last defined peak, and each measured time period between each defined peak of the subsequent accelerometer data signals 170. If a user knocks twice and then sets a glass down, the accelerometer detects three identical vibrations, while the acoustic sensor, such as a microphone, detects the first two vibrations as from a first object and the surface (the user's hand knocking twice) and the third vibration as from a second object and the surface (setting the glass down), because the third sound was different from the first two sounds. The unwanted signals from the glass being set down are filtered with a degree of accuracy beyond the prior art. Setting a bag on a tabletop may cause a vibration to be detected by the
accelerometer sensor unit 35 and a sound detected by the acoustic sensor unit 35′. Knocking on a tabletop by the user as an intentional gesture may cause the same vibration to be detected by the accelerometer sensor unit 35 and a respective sound detected by the acoustic sensor unit 35′. The control system 10 can now distinguish setting the bag on the tabletop from knocking as an intentional gesture to control the terminal device 50. The data pattern of setting the bag on the tabletop can generate a vibration analogous to the vibration of knocking as the intentional gesture, but the respective sound of the knocking as the intentional gesture is different. Thus, the data pattern of setting the bag no longer matches the subsequent data pattern of the knocking as an intentional gesture. - The present invention prioritizes the accelerometer data signals to confirm the contact interaction in the accelerometer interactive zone, not the acoustic interactive zone. With an acoustic sensor, the acoustic interactive zone overlaps the mounting surface, but the acoustic interactive zone may be too large and detect too many sounds not associated with a subsequent contact interaction. A sound can be heard, but the location of origin of the sound can be difficult to screen. Filtering the acoustic data signals with the accelerometer sensor unit requires too much power and processing time. The present invention selects a particular hierarchy of the sensors, server and microcontroller unit. The
accelerometer sensor unit 35 relates to the acoustic sensor unit 35′, the microcontroller unit 33 and the server 40 to regulate power and more accurately determine subsequent data patterns for commands to the terminal devices. - The present invention provides an improved system and method for controlling a terminal device. A user can make gestures to control activity of a terminal device, such as knocking against a wall to illuminate an overhead light. Reliably detecting gestures with a sensor is more complicated than simply activating an accelerometer or microphone to capture vibration and sound data. Extraneous stimuli, like background noise and inadvertent vibrations, interfere with identifying intentional vibrations and sounds intended as gestures for controlling the terminal device. To filter knocking in the interactive zone from extraneous stimuli, the control system of the present invention sets an accelerometer sensor unit, an acoustic sensor unit, a microcontroller, and a server in a particular relationship to more accurately detect the intentional gestures. The acoustic sensor unit confirms the accelerometer sensor unit to ensure the location of the gesture in the accelerometer interactive zone. The control system further regulates power consumption with the interaction of the two sensors and the microcontroller. Although two sensors are needed for more accurate detection of gestures, the power demands of a system with two sensors cannot be easily sustained. The present invention regulates power consumption by an activated and fully powered system and an activated and power saving system coordinated with the two sensors of the control system.
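The confirmation hierarchy summarized above, in which vibration peaks are accepted only when a time-aligned sound of the expected class corroborates them, might be sketched as follows. The timing tolerance and the sound-class labels are illustrative assumptions; classifying the sound itself is outside this sketch.

```python
def confirm_with_acoustics(accel_peak_times, acoustic_events, tol_s=0.05):
    """Keep only vibration peaks corroborated by a time-aligned knock sound.

    accel_peak_times: times of defined peaks in the accelerometer data.
    acoustic_events:  (time, sound_class) pairs from the acoustic sensor.
    """
    confirmed = []
    for t in accel_peak_times:
        for t_sound, sound_class in acoustic_events:
            # a sound close enough in time, and of the expected class,
            # confirms the vibration peak as part of the intentional gesture
            if abs(t - t_sound) <= tol_s and sound_class == "knock":
                confirmed.append(t)
                break
    return confirmed
```

Under these assumptions, two knocks followed by a glass being set down produce three vibration peaks but only two confirmed peaks, so the combined pattern no longer matches a knocking gesture profile, mirroring the bag-versus-knock distinction described above.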
- As described herein, the invention provides a number of advantages and uses; however, such advantages and uses are not limited by such description. Embodiments of the present invention are better illustrated with reference to the Figure(s); however, such reference is not meant to limit the present invention in any fashion. The embodiments and variations described in detail herein are to be interpreted by the appended claims and equivalents thereof.
- The foregoing disclosure and description of the invention is illustrative and explanatory thereof. Various changes in the details of the illustrated structures, construction and method can be made without departing from the true spirit of the invention.
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/461,010 US20180267614A1 (en) | 2017-03-16 | 2017-03-16 | Control system for a terminal device with two sensors and power regulation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180267614A1 true US20180267614A1 (en) | 2018-09-20 |
Family
ID=63519287
Country Status (1)
Country | Link |
---|---|
US (1) | US20180267614A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090093300A1 (en) * | 2007-10-05 | 2009-04-09 | Lutnick Howard W | Game of chance processing apparatus |
US20150173617A1 (en) * | 2007-03-23 | 2015-06-25 | Qualcomm Incorporated | Multi-Sensor Data Collection and/or Processing |
US20150301615A1 (en) * | 2014-04-21 | 2015-10-22 | Apple Inc. | Impact and contactless gesture inputs for docking stations |
US20170330429A1 (en) * | 2016-05-10 | 2017-11-16 | Google Inc. | LED Design Language for Visual Affordance of Voice User Interfaces |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: SWAN SOLUTIONS, INC., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: BOSHERNITZAN, YANIV; NEZER, OHAD; REEL/FRAME: 041601/0426. Effective date: 20170314 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| AS | Assignment | Owner name: HAPTIC, INC., TEXAS. Free format text: CHANGE OF NAME; ASSIGNOR: SWAN SOLUTIONS, INC.; REEL/FRAME: 053938/0842. Effective date: 20190610 |