US20110115741A1 - Touch sensitive panel supporting stylus input

Info

Publication number
US20110115741A1
Authority
US
United States
Prior art keywords
touch pad
touch
input
touch sensitive
sensitive element
Prior art date
Legal status
Abandoned
Application number
US12/912,472
Inventor
Bob Lukas
David A. Sobel
Monika Gupta
Sumant Ranganathan
Pieter Vorenkamp
Current Assignee
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US12/912,472
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GUPTA, MONIKA, RANGANATHAN, SUMANT, SOBEL, DAVID A., VORENKAMP, PIETER, LUKAS, BOB
Publication of US20110115741A1
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Navigation (AREA)

Abstract

Operating a user input device by scanning touch sensitive elements of a touch pad to measure touch sensitive element values. The touch sensitive element values are compared to a stylus input threshold pattern. Upon a favorable comparison, a stylus input condition is determined, stylus input touch pad processing settings are enacted, and a position of the stylus upon the touch pad is detected. Detection of the stylus position upon the touch pad is based upon the touch sensitive element values and the stylus input touch pad processing settings. The touch sensitive element values are compared to a touching finger threshold pattern. Upon a favorable comparison, a touching finger condition is determined, touching finger touch pad processing settings are enacted, and the touching finger's position upon the touch pad is detected based upon the touch sensitive element values.

Description

    CROSS-REFERENCE TO PRIORITY APPLICATION
  • The present U.S. Utility Patent Application claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application Ser. No. 61/261,702, entitled “TOUCH PAD USER IDENTIFICATION, GAMING INPUT, AND PREFERENCE INPUT,” (Attorney Docket No. BP20924), filed Nov. 16, 2009, pending, which is hereby incorporated herein by reference in its entirety and made part of the present U.S. Utility Patent Application for all purposes.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to electronic devices, and more particularly to electronic devices having touch pads/panels.
  • 2. Description of the Related Art
  • User gaming devices are fairly well known. These devices include game consoles with communicatively coupled controllers such as Nintendo game consoles, Sony game consoles, Microsoft game consoles, and various other game console devices. These game consoles couple to a television, may couple to an audio system, and support user game playing. Some of these game consoles support wireless communications with handheld game controllers and/or other game controllers. For example, the Nintendo Wii includes handheld controllers that detect their orientation and acceleration to some degree and receive standard button inputs from a user. This information is wirelessly relayed to the game console to control operation of corresponding game elements within the gaming environment. Other game controllers may include simulated game pieces such as musical instruments, baseball bats, golf clubs, and various other types of simulated devices. Further, other types of gaming systems are contained in a single unit, such as the Nintendo Game Boy and the Sony PlayStation Portable, among other units.
  • With the continued advancement of technology, the complexities and capabilities of game consoles have grown considerably. Game controllers support sophisticated gaming inputs received via numerous input sources, e.g., buttons, accelerometers, IR orientation detectors, positional detectors, and various other gaming inputs. The gaming environments in which these gaming inputs are received are quite complex, providing a fairly realistic experience for a user of the gaming device/console. While some games supported by a game console may require only a few gaming inputs, other games require a large number of gaming inputs.
  • Most game consoles support many differing games, each of which is software controlled via respective software programming. Some game controllers are specific to the particular game being supported, e.g., guitar hero, rock star, and various other particular types of games. In such cases, the various types of inputs must be supported by differing, unique game controllers. The expense and complexity of acquiring multiple game controllers can be prohibitive for some users.
  • Many gaming systems are contained within one unit, such as the Nintendo Game Boy and its successors and the Sony PlayStation and its successors, for example. These gaming systems include processing resources and a user interface contained within a single unit. With these units, various buttons receive user input while a display and speakers provide user output. Because of the limited battery life available to these units, their functionality has been limited in some regards.
  • Audio/video entertainment systems that include cable boxes, satellite boxes, and audio/visual components typically include one or more remote control devices that allow users to remotely control system operation. Such technology has been prevalent for many years. One problem with these devices, however, is that operation of the set-top box is generic to all users: the device must be programmed separately if customization is desired for a particular user, and even then such programming is typically applied across the board to all potential users of the device.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a system diagram illustrating a video game system constructed according to one or more embodiments of the present invention;
  • FIG. 2 is a system diagram illustrating an audio/video entertainment system constructed according to one or more embodiments of the present invention;
  • FIG. 3 is a block diagram illustrating a game console constructed according to one or more embodiments of the present invention;
  • FIG. 4A is a first perspective view of a game controller constructed according to one or more embodiments of the present invention;
  • FIG. 4B is a second perspective view of the game controller of FIG. 4A that is constructed according to one or more embodiments of the present invention;
  • FIG. 5 is a block diagram illustrating a game controller and coupled secondary game controller, both of which are constructed according to one or more embodiments of the present invention;
  • FIG. 6 is a block diagram illustrating a game controller constructed according to one or more embodiments of the present invention;
  • FIG. 7 is a block diagram illustrating an entertainment system remote control constructed according to one or more embodiments of the present invention;
  • FIG. 8 is a block diagram illustrating a touch pad and touch pad circuitry constructed according to one or more embodiments of the present invention;
  • FIG. 9 is a diagrammatic side view illustrating both a user's finger and a stylus touching a touch pad constructed and operating according to one or more embodiments of the present invention;
  • FIG. 10 is a diagram illustrating a touch pad and the manner in which a touching finger and a touching stylus may be detected via differing capacitance levels according to one or more embodiments of the present invention;
  • FIG. 11 is a flowchart illustrating operations of a user input device (video game controller, video game console, remote control, automobile data input device, keypad replacement device, etc.) according to one or more embodiments of the present invention;
  • FIG. 12 is a flowchart illustrating particular operations of the user input device of FIG. 11 when enacting stylus input touch pad settings and touching finger touch pad settings according to one or more embodiments of the present invention;
  • FIG. 13 is a block diagram illustrating a touch pad that operates according to one or more embodiments of the present invention; and
  • FIG. 14 is a block diagram illustrating a touch pad that operates according to one or more embodiments of the present invention.
  • DETAILED DESCRIPTION
  • FIG. 1 is a system diagram illustrating a video game system constructed according to one or more embodiments of the present invention. The gaming system 100 of FIG. 1 includes a game console 102 and a plurality of game controllers 108A, 108B, and 108C. The game console 102 couples to an audio/visual system 104 that includes a video monitor and an audio system. The game console 102 also couples to an infrared (IR) detector 106.
  • The game controllers 108A, 108B, and 108C communicate with the game console 102 via wired and/or wireless communication links. A wired communication link may be provided by a tethered controller that includes conductors supporting wired communications. Wireless communications may be in various Radio Frequency (RF) bands and/or in the infrared range. Thus, each of the game controllers 108A, 108B, and 108C includes communication circuitry that allows it to communicate with the game console 102.
  • According to one or more embodiments of the present invention, each of game controllers 108A, 108B, and 108C includes one or more touch pads/touch panels/touch sensitive pads (referred to herein interchangeably) 110A, 110B, and 110C, respectively. According to some aspects of the present invention, the touch pads 110A, 110B, and 110C of the game controllers 108A, 108B, and 108C are used to identify users of the game controllers, to provide gaming input, to determine whether a user is active, and/or to provide other information to the game console 102 for subsequent action. Data captured by the touch pads 110A, 110B, and 110C may be solely processed by a host game controller, e.g., 108A, may be partially processed and transmitted to the game console 102 for further processing, or may be transferred in an unprocessed format from the game controller 108A to the game console 102. Based upon one or more embodiments of the present invention, touch pads are coupled to touch pad circuitry that measures capacitance (inductance or RF propagation) characteristics observed by a plurality of touch sensitive elements of the touch pads.
  • According to one embodiment of the present invention, each of the game controllers 108A, 108B, and 108C includes touch pads 110A, 110B, and 110C that support both finger and stylus inputs. For finger input operations of the touch pads 110A, 110B, and 110C, the touch pads enact touching finger touch pad processing settings that are tailored for receiving input via a finger of the user. In another operation, each of the touch pads 110A, 110B, and 110C enacts stylus input touch pad processing settings that are tailored for receipt of input by a stylus. Distinguishing touching finger input from stylus input allows a user to exercise finer control or to provide greater resolution in inputs when using a stylus. Styluses used in conjunction with touch pads generally have a magnetic or metallic end portion with a relatively small diameter. When this end portion or tip of the stylus touches a touch pad, it presents a much smaller cross section than a finger touching the touch pad. The use of a stylus as an input to touch pads 110A, 110B, and 110C therefore provides greater resolution for input to a game serviced by the game console 102 and game controllers 108A, 108B, or 108C. The game console 102 may also include one or more touch pads that support both stylus input and touching finger input.
  • The inventive concepts described herein may also be applied to/embodied by a single package video game, i.e., a video game system that is contained in a single housing, a single package telephone, a single package remote control, a single package computer, a computer touch pad display, or other single package device. Such a single package system includes a display; a user input, which includes one or more touch pads; processing components; memory components; and powering components, such as a battery and power circuitry. Thus, the teachings of the present invention further apply to all forms of touch sensitive systems including touch sensitive panel computers, touch sensitive panel monitors, touch sensitive panel laptop computers, smart phones, etc.
  • FIG. 2 is a system diagram illustrating an audio/video entertainment system constructed according to one or more embodiments of the present invention. The audio/video entertainment system 200 includes a multimedia system 202 that couples to a monitor 204 and related multimedia system components such as speakers and audio components such as CD players, DVD players, a tape deck, and/or various other multimedia system components. In some embodiments, the multimedia system 202, the monitor 204, and the IR detector 206 may be contained in a single housing, e.g., a stand-alone television, a television with stereo, a television with CD/DVD/tape deck, etc.
  • The multimedia system 202 also couples to an IR detector 206 or has such an IR detector built-in. The audio/video entertainment system 200 of FIG. 2 further includes at least one remote control 208A, 208B, and/or 208C. Each of these remote controls 208A, 208B, and 208C includes respective touch pads 210A, 210B, and 210C.
  • Each of remote controls 208A, 208B, and 208C includes touch pads 210A, 210B, and 210C that support both stylus input and touching finger input. As was previously described with reference to FIG. 1, the stylus input and touching finger input are supported via different processing settings. The processing settings for processing input from the touch pads 210A, 210B, and 210C differ for stylus input touch pad operations and touching finger touch pad operations.
  • FIG. 3 is a block diagram illustrating a game console constructed according to one or more embodiments of the present invention. The game console 302 of FIG. 3 includes wireless interface(s) 304, an infrared interface 306, an IR transmit/receive element 307, processing circuitry 308, one or more wired interfaces 310, and memory 312. The game console 302 typically also includes a user interface 314, a video interface 316, an audio interface 318, and may include a video camera/video camera interface 320. The wireless interface(s) 304 support wireless communications with at least the game controllers 108A, 108B, and 108C described with reference to FIG. 1. This wireless interface may be a Bluetooth interface, a wireless local area network (WLAN) interface, or another type of wireless communication interface that supports communications between the game console 302 and one or more game controllers. Further, the wireless interface 304 may support communications with a WLAN router or access point, a cellular infrastructure, a satellite communications network, or another type of wireless communications system.
  • The IR interface 306 couples to the IR transmit/receive element 307 and supports IR communications with the game controllers 108A, 108B, and 108C shown in FIG. 1. The IR communications between the game console 302 and the game controllers 108A, 108B, and 108C may follow an industry standard or proprietary communications protocol. The processing circuitry 308 may include one or more of a system processor, a digital signal processor, a processing module, dedicated hardware, an application specific integrated circuit, or other circuitry that is capable of executing software instructions and of processing data. The processing circuitry 308 may perform some processing to detect a hovering finger and determine a position of the hovering finger, and then use that input as gaming input or non-gaming input. The memory 312 may be RAM, ROM, FLASH RAM, FLASH ROM, optical memory, magnetic memory, or another type of memory that is capable of storing data and/or instructions and allowing the processing circuitry to access the same. The wired interface(s) 310 may include a USB interface, a FireWire interface, a serial interface, a parallel interface, an optical interface, or another type of interface supported by a copper, metal, or optical medium.
  • The user interface 314 may include a keypad, a video display, a cursor control, a touch pad, or another type of interface that allows a user to interact with the game console 302. The video interface 316 couples the game console 302 to one or more video monitors to provide display for the gaming environment supported by the game console 302. The communications link between the video interface 316 and the video monitor(s) may be an HDMI interface, a composite video interface, a component video interface, an S-video interface, or another type of video interface supported by both the video monitor and the game console 302. The audio interface 318 couples the game console 302 to speakers and/or microphones for audio content delivery and receipt. The video camera/video camera interface 320 may include an onboard video camera or may couple the game console 302 to an external video camera. The external video camera may be used to provide gaming input or other types of information that the game console 302 uses within its operation to produce a gaming environment.
  • The game console 302 may receive touch pad input from one or more coupled game controllers. This touch pad input may be caused by the use of a stylus or by the finger (or other body part) of a user. According to one aspect of the present invention, the game console 302 may operate in conjunction with one or more game controllers to initiate stylus input touch pad processing settings and touching finger touch pad processing settings. In each case, the input received from the touch pad, indicative of a touch of a stylus or a finger at locations corresponding to a plurality of touch sensitive elements, will produce different user inputs based upon which processing settings are enacted.
  • The game console 302 may direct one or more game controllers to enact the stylus input touch pad processing settings and/or the touching finger touch pad processing settings based upon a particular gaming operation, based upon input from the touch pad, or based upon other operational conditions. For example, in some cases the game console 302 may support a video game that has a portion or segment designed to receive stylus input. In such a case, the game console 302 may enact the stylus input touch pad processing settings for that particular portion of the game. Likewise, other portions of the game supported by the game console 302 may require the touching finger touch pad processing settings to receive user finger input. The game console 302 therefore initiates the stylus input touch pad processing settings and the touching finger touch pad processing settings at different points in the operation of the game.
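  • As a rough illustration of this console-directed mode switching, the sketch below shows how a game console might signal a controller to enact one set of processing settings or the other for different segments of a game. The command codes, function names, and transport are illustrative assumptions, not details taken from this disclosure.

```c
/* Hypothetical sketch: a game console directs a controller to enact either
 * stylus input or touching finger touch pad processing settings for the
 * current game segment.  Command codes and the link routine are assumed.  */
#include <stdint.h>
#include <stdio.h>

enum touch_pad_directive {
    DIRECT_FINGER_SETTINGS = 0x01,  /* enact touching finger settings */
    DIRECT_STYLUS_SETTINGS = 0x02   /* enact stylus input settings    */
};

/* Stand-in for the wireless/IR/wired link to a game controller. */
static void send_to_controller(int controller_id, uint8_t command)
{
    printf("controller %d <- directive 0x%02x\n", controller_id, (unsigned)command);
}

/* Called when the game enters a segment that expects a particular input type. */
void configure_segment_input(int controller_id, int segment_uses_stylus)
{
    send_to_controller(controller_id,
                       segment_uses_stylus ? DIRECT_STYLUS_SETTINGS
                                           : DIRECT_FINGER_SETTINGS);
}

int main(void)
{
    configure_segment_input(1, 1);  /* drawing segment: stylus input  */
    configure_segment_input(1, 0);  /* menu navigation: finger input  */
    return 0;
}
```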
  • FIG. 4A is a first perspective view of a game controller constructed according to one or more embodiments of the present invention. As shown in FIG. 4A, a game controller 402 includes a cursor control 404, mechanical buttons 410 and 406, and may include a touch pad 408. The cursor control 404 may itself be a touch pad. When 404 and 408 are both touch pads, they receive inputs that may be used for user identification, gaming input, or other operations supported by the gaming system that includes the game controller 402. The touch pad 408 of the game controller 402 of FIG. 4A may support both stylus input and touching finger input.
  • FIG. 4B is a second perspective view of the game controller 402 of FIG. 4A that is constructed according to one or more embodiments of the present invention. As shown in FIG. 4B, a reverse portion of the game controller 402 may include a touch pad 452. The touch pad 452 may wrap around a back portion of the game controller 402. Alternatively, the touch pad 452 may reside on a battery cover of the game controller 402. As will be described further herein, the touch pad 452 includes a plurality of touch pad locations/touch sensitive elements that receive input that may be further used for user ID, gaming input, and/or other purposes. The touch pad 452 of FIG. 4B supports both stylus input touch pad processing settings and touching finger touch pad processing settings according to one or more operations of the present invention.
  • FIG. 5 is a block diagram illustrating a game controller and coupled secondary game controller, both of which are constructed according to one or more embodiments of the present invention. As shown in FIG. 5, primary game controller 502 includes a display 506, a circular input device 508, and input devices 510, 512, 514, 516, 518, and 520. Any of these input devices 508, 510, 512, 514, 516, 518, and 520 of the primary game controller 502 may be touch pads, as is further described herein. These touch pads receive gaming input in a manner consistent with the mechanical counterparts implemented in prior devices.
  • The primary game controller 502 couples to the secondary game controller 504 via either a wired or a wireless interface. The secondary game controller 504 includes input components 521, 522, and 524, which may be embodied by either mechanical input devices or touch pads. The manners in which touch pads are implemented are described further herein. Data collected from these input components 521, 522, and 524 is relayed to the primary game controller 502, which may process the inputs. Alternately, the input received from input components 521, 522, and/or 524 may be relayed to a servicing game console. The primary game controller 502 and the secondary game controller 504 may both be hand-held devices. Alternately, one or the other of these game controllers may be placed on the floor or inserted into a simulated gaming piece, e.g., a guitar, drums, a simulated golf club, a simulated baseball bat, etc. Each of these game controllers 502 and 504 may capture touch pad input as is further described herein with reference to the FIGs. The touch pad input captured by game controllers 502 and 504 may be processed to produce combined gaming input or transmitted separately to a servicing game console. The combined or separate touch pad input may be used as gaming input, may be processed to identify a user, or may be processed to otherwise provide input to a supported video game.
  • Still referring to FIG. 5, the input devices 521, 522, 524, 508, 510, 512, 514, 516, 518, and 520 may be touch pads that support both stylus input touch pad processing settings and touching finger touch pad processing settings according to one or more embodiments of the present invention. In some operations, the stylus input touch pad processing settings are enacted based upon detection of a touch of a stylus. In other operations, the stylus input processing settings are enacted based upon a direction received from a game console, as was described with reference to FIG. 3. In either case, when the stylus input touch pad processing settings are enacted, the touch pads of the game controllers 502 and 504 are tailored to receive input based upon the expected stylus characteristics. The game controller 504 may produce user input based upon the stylus input and relay the user input to a game console.
  • Likewise, each of the touch pads 522, 524, 514, 516, 518, and 520 may also be configured to enact touching finger touch pad processing settings. The touching finger touch pad processing settings are tailored for receipt of user input via a user's finger, as contrasted with the use of a stylus. In such case, the input received from the touching finger is used to produce user input that is relayed from game controller 502 to a game console.
  • FIG. 6 is a block diagram illustrating a game controller constructed according to one or more embodiments of the present invention. The game controller 602 includes one or more wireless interfaces 604, an IR interface 606 that includes an IR transmit/receive element 608, processing circuitry 610, wired interface(s) 612, memory 614, and user interface(s) 616. These particular components of the game controller 602 may be similar to the like named components of the game console 302 illustrated in FIG. 3 and described with reference thereto. However, in other embodiments, these like named components may have differing construct/functionality, e.g., smaller memory, less processing capability, lower power wireless interfaces, etc. Thus, commonly named components will not be described further herein as they have been previously described with reference to FIG. 3.
  • The game controller 602 includes one or more touch pad(s) 618, a motion/position detector 620, an orientation detector 622, a display 624, a speaker/microphone 626, and a video camera 628. The game controller may also include other components such as one or more environmental condition detectors 630 that are used to sense environmental conditions such as temperature, humidity, and other environmental conditions. The structure and operations of the touch pads 618 will be described further herein with reference to subsequent FIGs. The motion/position detector 620 detects motion/acceleration of the game controller 602. Detection of such motion/acceleration may be performed by the game controller itself, using a GPS system, an accelerometer, or a gyroscope of the game controller 602, and/or using external components to determine the motion, acceleration, and position of the game controller. The motion/position detector 620 may also determine the position of the game controller. The manner in which the motion/position detector 620 determines the position of the game controller 602 is not described further herein; however, the motion/position detector 620 may use external reference devices in order to determine the position of the game controller within a gaming environment. Motion, acceleration, and position of the game controller 602 may be provided to a servicing game console as gaming input. The game controller 602 also supports detection of a hovering finger and determines a position of the hovering finger, as described herein.
  • The orientation detector 622 determines an orientation and/or direction in which the game controller is pointed. Such orientation detection provided by orientation detector 622 may be accomplished in conjunction with the IR interface 606 of the game controller 602. Such orientation detection may be performed in conjunction with the IR detector 106 of the gaming system 100 of FIG. 1.
  • The display 624 of the game controller 602 may have a relatively small size or relatively large size that presents information to a user and that allows the user to respond accordingly. The speaker/microphone 626 may receive audio input and provide audio output to a user of the game controller 602. Audio input captured by the microphone may be used in conjunction with touch pad 618 input for user identification and/or for gaming input. Video camera 628 of the game controller may be used to determine a location of the game controller and/or may be used to provide additional gaming input for gaming environments supported by the game controller 602.
  • According to one particular aspect of the gaming system of FIG. 1, the touch pad(s) 618 of the game controller 602 (and/or game console) may be capacitive, inductive, or RF based. Raw data received via the touch pad of the game controller may be communicated in full to the game console of the gaming system. Alternatively, information captured via the touch pad(s) 618 may be processed by the processing circuitry 610 of the game controller 602 (or by other processing circuitry, such as the touch pad circuitry 806 of FIG. 8, which may be the same as or different from the processing circuitry 610) prior to communicating such information to the game console 102 of FIG. 1. Such processing may be full or partial and determines whether and what data to upload to the game console.
  • Referring again to FIG. 5, touch pad input may be received at both the primary 502 and secondary 504 game controllers of FIG. 5. The input received from multiple touch pads of the primary and secondary game controllers 502 and 504 may be received and at least partially processed by processing circuitry of the game controller(s) prior to uploading the data to a game console. The touch pad input processing may be based upon a current usage of the game controllers. For example, the primary game controller 502 may correspond to a first portion of a user's body while the secondary game controller 504 corresponds to a second portion of the user's body.
  • Referring again to FIG. 6, the game controller includes touch pads 618. These touch pads 618 support both stylus input and touching finger input. According to some operations of the present invention, the processing circuitry 610 enacts stylus input touch pad processing settings based upon a direction received from a game console or upon a stylus input condition being met. The stylus input condition is met when the game controller 602 detects that a stylus is being used to provide input to the touch pads 618. Likewise, the processing circuitry 610 enacts touching finger touch pad processing settings for the touch pads 618 upon detection of a touching finger or based upon a direction received from the game console via the wireless interface 604, the infrared interface 606, or the wired interface 612. In such case, the further operations of the touch pads 618 to receive user input are based upon the enacted settings.
  • In another embodiment of the present invention, the structure 602 of FIG. 6 is a single unit video game with the entirety of a video game supported thereby. With this embodiment, both stylus input and touching finger input are supported by the touch pad(s) 618.
  • FIG. 7 is a block diagram illustrating an entertainment system remote control constructed according to one or more embodiments of the present invention. The remote control 702 may be used as one of the remote controls 208A, 208B, or 208C in conjunction with the multimedia system 202 of the system 200 of FIG. 2. The remote control includes one or more wireless interfaces 704, IR interface 706 that includes an IR T/R element 708, processing circuitry 710 and one or more wired interfaces 712. The remote control 702 further includes memory 714, one or more user interfaces 716, one or more touch pads 718, and/or one or more displays 720. The remote control 702 of FIG. 7 may include components that are of same/similar construct as those components previously described with reference to the game controller 602 of FIG. 6. However, as is illustrated in FIG. 7, the remote control 702 may have fewer components than those typically included with a game controller. The functions of each of the components of the remote control 702 of FIG. 7 may have similar input characteristics to those of game controller 602 of FIG. 6. The touch pads 718 and supporting circuitry 710 may be configured to enact one or both of the stylus input touch pad processing settings and touching finger touch pad processing settings.
  • As will be further described with reference to subsequent FIGs herein (and as also applies to any of the game console, game controller, or remote control), the stylus input touch pad processing settings may be enacted for a first portion of the touch pad 718 and the touching finger touch pad processing settings may be enacted for a second portion of the touch pad 718. In either case, when the touch pad 718 is configured for particular types of input, the processing circuitry 710 will process touch sensitive element values based upon the settings to produce user input. Such user input may be relayed to the multimedia system 202 of FIG. 2 via the wireless interface 704, the infrared interface 706, or the wired interface 712.
  • FIG. 8 is a block diagram illustrating a touch sensitive pad and touch pad circuitry constructed according to one or more embodiments of the present invention. A touch pad 802 includes a plurality of touch sensitive elements 804 each of which corresponds to a particular location of the touch pad 802. With the embodiment of FIG. 8, the touch pad includes an array of touch sensitive elements 804, each of which may be a particular capacitively coupled location, inductively coupled location, or a radio frequency (RF) touch sensitive element. Touch pad circuitry 806 couples via a grid structure to the plurality of touch sensitive elements 804 to sense the particular capacitance, inductive, or RF characteristics at each of the touch sensitive elements.
  • Touch pad circuitry 806 scans the plurality of touch sensitive elements 804 via access of particular row-column combinations at particular times. The frequency or voltage at which the touch pad circuitry 806 scans the plurality of touch sensitive elements 804 may be altered over time. Choosing the scanning frequency or scanning voltage may be based upon a particular operational use of the touch pad. For example, at some points in time the manner in which the touch pad is scanned will change based upon a particular point in a game of a gaming system with which the touch pad functions as a gaming input device. Further, a first scanning frequency/scanning voltage may be employed for user identification while a second scanning frequency/scanning voltage may be employed for gaming input functions.
  • The scanning done by the touch pad circuitry 806 of the plurality of touch sensitive elements 804 may be made using a spread spectrum frequency scanning technique. Such technique may be employed to more efficiently capture information from the touch pad 802 at the various touch sensitive elements 804 or to determine which particular scanning frequencies are more successful than others in capturing input information.
  • Further, the scanning of each row and column corresponding to a particular touch sensitive element 804 may be altered based upon a detected capacitance (inductance/RF propagation) at the location. For example, one particular touch sensitive element 804 may have a fixed capacitance that does not vary over time. Such a fixed capacitance may indicate that the particular touch sensitive element 804 is inoperable or that it receives no discernible input. In such a case, by not scanning that particular touch sensitive element, other touch sensitive elements may be scanned more frequently, or energy may be saved by not scanning all touch sensitive elements.
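  • A minimal sketch of such adaptive scanning appears below. The grid size, the scan frequency table, and the rule for declaring an element unresponsive are assumptions made purely for illustration; the patent does not specify these values.

```c
/* Hypothetical sketch: scan the touch sensitive element grid, hop among
 * scan frequencies, and stop scanning elements whose values never vary.  */
#include <stdint.h>
#include <stdlib.h>

#define ROWS 16
#define COLS 16

/* Stand-in for the hardware measurement at one row/column crossing. */
static uint16_t measure_element(int row, int col, uint32_t scan_freq_hz)
{
    (void)row; (void)col; (void)scan_freq_hz;
    return (uint16_t)(rand() & 0x3FF);   /* placeholder for real readout */
}

static const uint32_t scan_freqs[4] = { 100000, 125000, 150000, 175000 };
static uint16_t last_value[ROWS][COLS];
static uint16_t unchanged_count[ROWS][COLS];
static uint8_t  skip_element[ROWS][COLS];  /* 1 = fixed value, stop scanning */

void scan_touch_pad(uint16_t values[ROWS][COLS], unsigned pass)
{
    /* Spread-spectrum style hop: use a different frequency on each pass. */
    uint32_t freq = scan_freqs[pass % 4];

    for (int r = 0; r < ROWS; r++) {
        for (int c = 0; c < COLS; c++) {
            if (skip_element[r][c])
                continue;                        /* save energy and time  */
            uint16_t v = measure_element(r, c, freq);
            if (v == last_value[r][c]) {
                if (++unchanged_count[r][c] > 1000)
                    skip_element[r][c] = 1;      /* value never varies    */
            } else {
                unchanged_count[r][c] = 0;
            }
            last_value[r][c] = v;
            values[r][c] = v;
        }
    }
}

int main(void)
{
    uint16_t values[ROWS][COLS];
    for (unsigned pass = 0; pass < 4; pass++)
        scan_touch_pad(values, pass);
    return 0;
}
```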
  • According to another aspect of the present invention, some portions of the touch pad may be disabled while others are enabled at differing points in time. Enablement of some touch sensitive elements and not others may be based upon a custom configuration of the touch pad for a particular input function provided.
  • The touch pad 802 may also be calibrated by the touch pad circuitry 806 based upon environmental factors such as temperature, humidity, and surrounding noise as detected by measured capacitance, inductance, or RF propagation characteristics. Calibration of the touch pad 802 allows the touch pad 802 to more efficiently and effectively receive touch pad input for user identification and/or for other input purposes. The calibration of the touch pad 802 by the touch pad circuitry 806 may be initiated at particular points in time. The touch pad circuitry 806 may simply initiate calibration of the touch pad 802 upon the expiration of a timer such that the touch pad is calibrated at a regular time interval. Alternatively, the touch pad 802 may be calibrated after a period of inactivity, i.e., the touch pad circuitry 806 performs calibration when it determines that no input is present on the touch pad 802. In other operations or embodiments, the touch pad 802 may be calibrated by the touch pad circuitry 806 using other criteria as well.
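  • The timing of such recalibration might be handled along the lines of the following sketch. The interval and idle threshold values, the structure fields, and the function name are illustrative assumptions only.

```c
/* Hypothetical sketch: decide when the touch pad circuitry should
 * recalibrate, either on a regular interval or after a period of
 * inactivity, and never while a touch is present.                   */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define CAL_INTERVAL_MS   60000u  /* illustrative: recalibrate every minute  */
#define IDLE_THRESHOLD_MS  5000u  /* illustrative: 5 s with no touch present */

struct cal_state {
    uint32_t last_cal_ms;    /* time of the most recent calibration */
    uint32_t last_touch_ms;  /* time any touch was last detected    */
};

bool should_calibrate(const struct cal_state *s, uint32_t now_ms,
                      bool touch_present)
{
    if (touch_present)
        return false;                              /* never calibrate mid-touch */
    if (now_ms - s->last_cal_ms >= CAL_INTERVAL_MS)
        return true;                               /* regular interval expired  */
    if (now_ms - s->last_touch_ms >= IDLE_THRESHOLD_MS)
        return true;                               /* pad has been idle a while */
    return false;
}

int main(void)
{
    struct cal_state s = { .last_cal_ms = 0, .last_touch_ms = 2000 };
    printf("calibrate now: %d\n", should_calibrate(&s, 8000, false));
    return 0;
}
```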
  • Still referring to FIG. 8, the touch pad circuitry 806 may enact stylus input touch pad processing settings and/or touching finger touch pad processing settings for processing of input received via the plurality of touch sensitive elements 804. As will be further described herein, the touch pad circuitry 806 may enact the stylus input touch pad processing settings upon detection of a stylus touching the touch pad 802 or upon a direction received locally or from a remote device. Likewise, the touch pad circuitry 806 may enact the touching finger touch pad processing settings upon detection of a finger touching the touch pad 802 or upon a direction received locally or from a remote device. Once the particular processing settings are enacted, the touch pad circuitry 806 processes the plurality of touch sensitive element values according to such processing settings to produce user input.
  • FIG. 9 is a diagrammatic side view illustrating both a user's finger and a stylus touching a touch pad constructed and operating according to one or more embodiments of the present invention. As shown in FIG. 9, the touch pad 902 includes a plurality of touch pad elements 904 that are disposed in two dimensions, as was shown in FIG. 8, across the touch pad 902. Each of the touch pad elements 904 is coupled to touch pad circuitry 912 and scanned by the touch pad circuitry 912 to determine a corresponding plurality of touch sensitive element values. Each touch sensitive element value corresponds to a particular touch sensitive element. As was previously described, each of the touch sensitive element values may be one or more of touch sensitive element measured capacitance, touch sensitive element measured inductance, or touch sensitive element measured Radio Frequency (RF) impedance.
  • Processing circuitry 916 couples to touch pad circuitry 912. The touch pad 902 may include/be constructed in conjunction with a touch pad display that includes a plurality of touch pad display elements 910 controlled by touch pad display circuitry 914, which also couples to processing circuitry 916. The touch pad display circuitry 914 controls the touch pad display elements 910 to create visible icons that may be viewed by a user of the touch pad 902. The manner in which icons are created and displayed is described further herein with reference to FIG. 14.
  • According to one aspect of the present invention, the touch pad 902 supports both stylus 908 input and touching finger 906 input. As is shown, the stylus 908 has a much smaller diameter at its tip than does the finger 906. The tip of the stylus 908 is constructed of a material that causes the touch pad elements 904 to alter the characteristics (touch sensitive element values) that are measured by the touch pad circuitry 912. The cross section of the touch pad 902 whose touch sensitive element values are affected by the stylus 908 is correspondingly smaller than the cross section affected by the touching finger 906. Further, the extent to which a touching finger alters touch sensitive element values may differ from that of a touching stylus.
  • FIG. 10 is a diagram illustrating a touch pad and the manner in which a touching finger and a touching stylus may be detected via differing capacitance levels according to one or more embodiments of the present invention. Referring to both FIGS. 9 and 10, the touch sensitive element values are measured by the touch pad circuitry 912 for a plurality of touch sensitive elements 904. For a location at which a finger touches the touch pad 902, the touch sensitive element values are depicted as touching finger pattern 1004. Likewise, a location upon the touch pad 902 that the stylus 908 touches is depicted as touching stylus pattern 1002. As is shown, the touching finger 906 creates a much different touch sensitive element value pattern 1004 than does the pattern 1002 created by the stylus 908 touching the touch pad 902. The different characteristics of the touching finger 906 and the touching stylus 908 as measured by the touch pad circuitry 912 enable the touch pad 902 to differentiate between a touching finger 906 and a touching stylus 908. Generally, the pattern of the touch sensitive element values is compared to both a touching finger threshold pattern and a stylus input threshold pattern. Based upon this comparison, processing circuitry may enact one or both of the stylus input touch pad processing settings and the touching finger touch pad processing settings, as described further herein. The manner in which the touch pad 902 may be operated to distinguish these varying operational characteristics and to enact processing settings based thereon is described further with reference to FIGS. 11-13.
  • FIG. 11 is a flowchart illustrating operations of a user input device (video game controller, video game console, remote control, automobile data input device, keypad replacement device, etc.) according to one or more embodiments of the present invention. The operations 1100 commence with touch pad circuitry or other circuitry scanning a plurality of touch sensitive elements of a touch pad (step 1102). Based upon the scanning, the touch pad circuitry in cooperation with the touch pad may determine a stylus input condition (step 1104), determine a touching finger input condition (step 1106), or enter touch pad calibration operations (step 1108).
  • The touch pad circuitry determines that a stylus input condition is met at step 1104 by comparing a plurality of touch sensitive element values measured at step 1102 to a stylus input threshold pattern. The determination made at step 1104 is made based upon a favorable comparison of the plurality of touch sensitive element values to the stylus input threshold pattern. Likewise, the touch pad circuitry or other processing circuitry operating upon the touch pad input determines that a touching finger input condition is met at step 1106 based upon comparing the plurality of touch sensitive element values to a touching finger threshold pattern.
  • According to one embodiment of the present invention, the stylus input threshold pattern has substantially uniform touch sensitive element thresholds for a first proximate group of touch sensitive elements. Further, the touching finger threshold pattern has substantially uniform touch sensitive element thresholds for a second group of touch sensitive elements, the second group corresponding to the touching finger threshold pattern being greater in number than the first proximate group of touch sensitive elements. Such patterns therefore reflect the fact that a touching finger has a greater surface area in contact with the touch pad than does a stylus.
  • Referring again to FIG. 10, the stylus input threshold pattern may correspond to the touching stylus pattern 1002, while the touching finger threshold pattern may correspond to the touch sensitive element values indicated at the touching finger pattern 1004. As shown in FIG. 10, the touching finger pattern 1004 has a greater number of affected touch sensitive elements than does the touching stylus pattern 1002. Further, depending upon the characteristics of the stylus, the effect, e.g., change in capacitance, inductance, or RF propagation, on the touch sensitive elements of the touching stylus pattern 1002 may be greater than the effect on the touch sensitive elements of the touching finger pattern 1004. Such would be the case if the stylus tip has a strongly magnetic or metallic structure.
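  • In code, the comparison of measured element values against the two threshold patterns might reduce to something like the sketch below, which classifies a touch by the size of the contact patch. The grid size, thresholds, and patch-size limits are illustrative assumptions; an actual implementation would derive them from the calibration operations described with reference to FIG. 11.

```c
/* Hypothetical sketch: compare measured touch sensitive element values to
 * stylus and touching finger threshold patterns by contact patch size.    */
#include <stdint.h>

#define ROWS 16
#define COLS 16

enum touch_class { TOUCH_NONE, TOUCH_STYLUS, TOUCH_FINGER };

/* Count elements whose measured value crosses the contact threshold. */
static int count_active(uint16_t v[ROWS][COLS], uint16_t threshold)
{
    int n = 0;
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            if (v[r][c] >= threshold)
                n++;
    return n;
}

enum touch_class classify_touch(uint16_t v[ROWS][COLS])
{
    int contact = count_active(v, 300 /* illustrative contact threshold */);

    if (contact == 0)
        return TOUCH_NONE;
    if (contact <= 4)         /* small patch: favorable stylus comparison */
        return TOUCH_STYLUS;
    if (contact >= 6)         /* broad patch: favorable finger comparison */
        return TOUCH_FINGER;
    return TOUCH_NONE;        /* ambiguous area: neither comparison holds */
}

int main(void)
{
    static uint16_t grid[ROWS][COLS];  /* all zeros: no touch anywhere     */
    grid[5][5] = 900;                  /* two strongly driven elements ... */
    grid[5][6] = 850;                  /* ... form a stylus-like patch     */
    return classify_touch(grid) == TOUCH_STYLUS ? 0 : 1;
}
```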
  • Referring again to FIG. 11, upon a determination of the stylus input condition at step 1104, the touch pad circuitry or other circuitry controlling operation of the touch pad enacts stylus input touch pad processing settings (step 1110). The touch pad circuitry then processes the touch sensitive element values using the stylus input touch pad processing settings to produce user input (step 1112). The processing circuitry or another component of a device, e.g., a game controller, remote control, vehicle data input device, or keypad replacement device, may transmit the user input to a remote device (step 1114). The user input is produced by the processing circuitry using the stylus input touch pad processing settings and the touch sensitive element values of the plurality of touch sensitive elements of the user input device. Further, as will be described with reference to FIG. 14, the touch pad display may be operated to indicate a button depression (step 1116). From step 1116, operation returns to step 1102.
  • From step 1106, upon determining a touching finger input condition, the touch pad circuitry or other processing circuitry enacts the touching finger touch pad processing settings (step 1118). With the touching finger touch pad processing settings determined, the user input device processes the touch sensitive element values of the touch pad using the touching finger touch pad processing settings to produce user input (step 1120). The device may then transmit the user input to a remote device (step 1122). The remote device may be a game console and the user input device may be a game controller. Alternately, the user input device may be a remote control and the remote device may be an entertainment system. Further, the user input device may be a keyboard with a touch pad built therein and the remote device may be a computer. The user input device may alternately be a touch pad located within an automobile and the remote device may be an automobile computer system. Further, after step 1122, the user input device may indicate a button depression (step 1124).
  • The operations 1100 may further include calibrating the touch pad (step 1108). The touch pad may be calibrated for stylus input at step 1126 by directing a user to touch the touch pad at various locations with a stylus. Then, based upon the touch sensitive element values received at step 1126, the user input device determines the stylus input threshold pattern at step 1128, which is subsequently used for detection of a stylus input condition. At step 1130, the user input device calibrates the touch pad for touching finger operations. The calibration operations of step 1130 may include directing a user to touch various locations of the touch pad with his or her finger and measuring touch sensitive element values upon such input condition. Then, at step 1132, the user input device determines the touching finger threshold pattern based upon the data captured at step 1130. Operations from step 1124 and step 1132 also return to step 1102.
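  • The calibration steps 1126 through 1132 might determine a threshold pattern roughly as sketched below, by summarizing the peak values and contact areas observed while the user touches the directed locations. The averaging rule, margins, and structure fields are assumptions for illustration only.

```c
/* Hypothetical sketch: derive a stylus or touching finger threshold pattern
 * from calibration touches captured at steps 1126/1130.                    */
#include <stdint.h>
#include <stdio.h>

#define CAL_SAMPLES 8   /* user is directed to touch this many locations */

struct threshold_pattern {
    uint16_t element_threshold; /* value an element must reach      */
    int      min_elements;      /* smallest acceptable contact area */
    int      max_elements;      /* largest acceptable contact area  */
};

/* peak[i] = strongest element value seen for calibration touch i,
 * area[i] = number of elements affected by calibration touch i.    */
struct threshold_pattern derive_pattern(const uint16_t peak[CAL_SAMPLES],
                                        const int area[CAL_SAMPLES])
{
    uint32_t peak_sum = 0;
    int area_min = area[0], area_max = area[0];

    for (int i = 0; i < CAL_SAMPLES; i++) {
        peak_sum += peak[i];
        if (area[i] < area_min) area_min = area[i];
        if (area[i] > area_max) area_max = area[i];
    }

    struct threshold_pattern p;
    /* Require roughly half the average peak so typical touches qualify. */
    p.element_threshold = (uint16_t)(peak_sum / CAL_SAMPLES / 2);
    p.min_elements = (area_min > 1) ? area_min - 1 : 1;  /* small margin */
    p.max_elements = area_max + 1;
    return p;
}

int main(void)
{
    /* Illustrative data from eight calibration touches with a stylus. */
    const uint16_t peak[CAL_SAMPLES] = { 900, 880, 910, 870, 905, 890, 895, 885 };
    const int      area[CAL_SAMPLES] = { 1, 2, 1, 1, 2, 1, 1, 2 };
    struct threshold_pattern p = derive_pattern(peak, area);
    printf("threshold=%u elements=%d..%d\n",
           (unsigned)p.element_threshold, p.min_elements, p.max_elements);
    return 0;
}
```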
  • According to other operations 1100 of FIG. 11, the stylus input condition and the touching finger condition may be enacted based upon a direction received from another device or from a user. As was previously described with reference to the gaming system, different points in a game may require stylus input or touching finger input, and the direction to configure the touch pad and related circuitry for either stylus input or touching finger input may be received accordingly. Upon enacting the stylus input touch pad processing settings, the touch pad may have a greater sensitivity, a greater scanning rate, or a reduced detected touch area. These modifications of the operation of the touch pad may also be enacted based upon a function of the touch pad, e.g., keypad replacement device, gaming input device, remote control settings input, etc.
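  • The two settings profiles mentioned above might be captured in a small structure such as the one sketched below, with a faster scan, higher sensitivity, and smaller accepted touch area when stylus input is expected. Field names and numeric values are illustrative assumptions, not values from this disclosure.

```c
/* Hypothetical sketch of the two touch pad processing settings profiles. */
struct touch_pad_processing_settings {
    unsigned scan_rate_hz;     /* stylus mode scans faster               */
    unsigned sensitivity_gain; /* stylus mode amplifies smaller signals  */
    unsigned max_touch_radius; /* in elements; stylus contact is smaller */
};

static const struct touch_pad_processing_settings touching_finger_settings = {
    .scan_rate_hz = 60, .sensitivity_gain = 1, .max_touch_radius = 4
};

static const struct touch_pad_processing_settings stylus_input_settings = {
    .scan_rate_hz = 120, .sensitivity_gain = 4, .max_touch_radius = 1
};

/* Returns the profile to enact given whether a stylus touch was detected. */
const struct touch_pad_processing_settings *select_settings(int stylus_detected)
{
    return stylus_detected ? &stylus_input_settings : &touching_finger_settings;
}
```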
  • FIG. 12 is a flowchart illustrating particular operations of the user input device of FIG. 11 when enacting stylus input touch pad processing settings and touching finger touch pad processing settings according to one or more embodiments of the present invention. The operations 1200 of FIG. 12 commence with enacting stylus input touch pad processing settings for a first portion of a touch pad (step 1202). Operation continues with enacting touching finger touch pad processing settings for a second portion of the touch pad (step 1204). Once the stylus input touch pad processing settings and the touching finger touch pad processing settings have been enacted for different portions of the touch pad, the touch pad circuitry or other processing circuitry may ignore touch sensitive element values for the second portion of the touch pad and process touch sensitive element values for the first portion of the touch pad (step 1206).
  • Alternatively, the user input device may differently process touch sensitive element values received via the first portion of the touch pad and/or the second portion of the touch pad (step 1208). For example, with the operations 1200 of FIG. 12, one portion of the screen may be set in a stylus input touch pad processing mode of operation while the other portion of the screen is set in a touching finger touch pad processing mode of operation. The operations 1200 of FIG. 12 may be enacted when the processing circuitry or touch pad circuitry detects a finger resting on the touch pad. In such case, when the user input device detects that a finger is resting on the display, it accepts stylus input as the primary screen input. Likewise, the touch pad circuitry may detect that the stylus is resting on the display and accept touching finger input as the primary touch pad input.
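  • The split-portion processing described here might be organized as in the sketch below, where values from a region with a resting finger are discarded while the stylus region is still processed. The split column, policy flag, and the simple activity accumulation are assumptions made for illustration.

```c
/* Hypothetical sketch: process one portion of the touch pad with stylus
 * input settings and another with touching finger settings, optionally
 * ignoring the portion on which a finger is merely resting.             */
#include <stdint.h>

#define ROWS 16
#define COLS 16
#define SPLIT_COL 8   /* columns 0..7: finger portion, 8..15: stylus portion */

enum portion_policy {
    PROCESS_BOTH_PORTIONS,   /* each portion processed with its own settings */
    IGNORE_FINGER_PORTION    /* e.g., a finger is resting on the finger side */
};

void process_portions(uint16_t v[ROWS][COLS], enum portion_policy policy,
                      uint32_t *stylus_activity, uint32_t *finger_activity)
{
    *stylus_activity = 0;
    *finger_activity = 0;
    for (int r = 0; r < ROWS; r++) {
        for (int c = 0; c < COLS; c++) {
            if (c >= SPLIT_COL) {
                *stylus_activity += v[r][c];   /* stylus-settings portion */
            } else if (policy != IGNORE_FINGER_PORTION) {
                *finger_activity += v[r][c];   /* finger-settings portion */
            }                                  /* else: values discarded  */
        }
    }
}
```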
  • According to other operations of FIG. 12, multi-touch operations of the touch pad may be initiated or ceased according to the touch pad processing settings enacted. For example, with the touching finger touch pad processing settings, multiple fingers may be detected upon the touch pad. In such case, the multiple touches may be used to determine touch pad input. Further, when in the stylus input touch pad processing mode, a first touch of the stylus may be received and stored and, after a delay period, the touch pad may detect a second stylus touch at a different location on the touch pad and treat the first and second touches as multi-touches for multi-touch operations of the touch pad. Further, multi-touch operations may be employed wherein the touching finger provides one of the multi-touches and the stylus provides a second of the multi-touches on the touch pad.
  • FIG. 13 is a block diagram illustrating a touch pad that operates according to one or more embodiments of the present invention. Different processing settings may be enacted for differing portions 1312 and 1314 of the touch pad, and touch sensitive element values for the differing portions 1312 and 1314 may be processed differently. With the example of FIG. 13, touching fingers are detected at locations 1308 and 1306. Likewise, a stylus touch is detected at location 1310. Based upon these detected events, stylus input touch pad processing settings are enacted for a first portion 1314 of the touch pad 1302 while touching finger touch pad processing settings are enacted for a second portion 1312 of the touch pad 1302. With these respective processing settings enacted for the first 1314 and second 1312 portions of the touch pad 1302, the touch pad circuitry may ignore touch sensitive element values for the first portion 1314 of the touch pad 1302 and use touch sensitive element values for the second portion 1312 of the touch pad 1302 to produce touch pad input. Alternatively, touch sensitive element values of the second portion 1312 of the touch pad 1302 may be ignored and touch sensitive element values of the first portion 1314 of the touch pad 1302 used to determine touch pad input.
  • With the example of FIG. 13, the touch sensitive element values for the second portion 1312 of the touch pad 1302 may be processed using the touching finger touch pad processing settings to determine first touch pad input, while the touch sensitive element values for the first portion 1314 of the touch pad 1302 may be processed using the stylus input touch pad processing settings to determine second touch pad input. In other operations, the touch pad 1302 may be divided into differing portions.
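As a concrete (assumed) illustration of processing the two portions with their own settings to form a first and a second touch pad input, the following sketch uses a hypothetical 32x32 element grid split down the middle; the sensitivity and threshold numbers are placeholders:

```python
# Minimal sketch (assumed, not the patent's implementation): processing both portions
# of the touch pad with their own settings, producing a first touch pad input from the
# finger portion and a second touch pad input from the stylus portion. The 32x32 element
# grid, the split down the middle, and the numeric settings are illustrative assumptions.
FINGER_PORTION = {"cols": range(0, 16),  "sensitivity": 1.0, "threshold": 0.5}  # e.g. portion 1312
STYLUS_PORTION = {"cols": range(16, 32), "sensitivity": 2.0, "threshold": 0.8}  # e.g. portion 1314

def process_portions(values):
    """values maps (row, col) -> measured element value; returns (first_input, second_input)."""
    first_input, second_input = [], []
    for (row, col), raw in values.items():
        if col in FINGER_PORTION["cols"] and raw * FINGER_PORTION["sensitivity"] > FINGER_PORTION["threshold"]:
            first_input.append((row, col))      # touching finger touches
        elif col in STYLUS_PORTION["cols"] and raw * STYLUS_PORTION["sensitivity"] > STYLUS_PORTION["threshold"]:
            second_input.append((row, col))     # stylus touches
    return first_input, second_input

# Example: a finger touch in the finger portion and a stylus touch in the stylus portion.
scan = {(10, 3): 0.7, (4, 20): 0.6}
print(process_portions(scan))   # -> ([(10, 3)], [(4, 20)])
```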
  • FIG. 14 is a block diagram illustrating a touch pad that operates according to one or more embodiments of the present invention. The touch pad 1402 of FIG. 14 includes a plurality of touch sensitive elements 1404. The structure of the touch pad 1402 of FIG. 14 corresponds to the structure described with reference to FIG. 9 and includes a touch pad display and touch pad display circuitry. Under control of the touch pad display circuitry, which drives the touch pad display elements, button icons 1406, 1408, 1410, 1412, 1414, 1416, 1418, 1420, 1422, 1424, 1426, and 1428 are displayed to a user. With the example of FIG. 14, each of these icons is a simulated button, with each button corresponding to a plurality of touch sensitive elements. The processing circuitry is operable to cause the touch pad display to indicate depression of a simulated button based upon the touch pad input received via the touch sensitive elements 1404.
  • The structure and operation of FIG. 14 may be employed when the touch pad/touch pad circuitry is configured to receive user input at particular locations, with each input location serving as a particular button function in some embodiments. For example, when the user input device is a remote control, the plurality of simulated buttons 1406, 1408, 1410, 1412, 1414, 1416, 1418, 1420, 1422, 1424, 1426, and 1428 correspond to functions of the audio/visual system. User input (finger input, stylus input, etc.) is received when a user touches the touch pad at the location(s) corresponding to the simulated buttons. Of course, the touch pad display can be configured to display a virtually limitless number of differing icons and icon combinations. Thus, the touch pad may be customized to receive user input in many different manners.
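To illustrate how simulated-button hit-testing of this kind might work, the following sketch maps a touched element location to one of twelve button icons; the 3x4 icon layout, the 32x32 element grid, and the button labels are illustrative assumptions rather than details from the patent:

```python
# Minimal sketch (assumed): mapping a touch location to a simulated button, as when the
# touch pad display shows a grid of button icons. Geometry and labels are illustrative.
BUTTONS = ["power", "vol+", "vol-", "ch+", "ch-", "mute",
           "play", "pause", "stop", "rew", "ff", "menu"]   # 3 columns x 4 rows of icons

COLS, ROWS = 3, 4
PAD_W, PAD_H = 32, 32        # touch sensitive elements across and down the pad

def button_at(row, col):
    """Return the simulated button under element (row, col), or None if out of range."""
    if not (0 <= row < PAD_H and 0 <= col < PAD_W):
        return None
    icon_col = col * COLS // PAD_W
    icon_row = row * ROWS // PAD_H
    return BUTTONS[icon_row * COLS + icon_col]

# Example: a touch near the upper-left of the pad lands on the first simulated button.
print(button_at(2, 3))       # -> "power"
```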
  • The terms “circuit” and “circuitry” as used herein may refer to an independent circuit or to a portion of a multifunctional circuit that performs multiple underlying functions. For example, depending on the embodiment, processing circuitry may be implemented as a single chip processor or as a plurality of processing chips. Likewise, a first circuit and a second circuit may be combined in one embodiment into a single circuit or, in another embodiment, operate independently perhaps in separate chips. The term “chip,” as used herein, refers to an integrated circuit. Circuits and circuitry may comprise general or specific purpose hardware, or may comprise such hardware and associated software such as firmware or object code.
  • The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention. One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
  • As may be used herein, the terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences. As may also be used herein, the terms “coupled to” and/or “coupling” include direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As may further be used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as “coupled to.” As may even further be used herein, the term “operable to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items. As may still further be used herein, the term “associated with” includes direct and/or indirect coupling of separate items and/or one item being embedded within another item. As may be used herein, the term “compares favorably” indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.
  • Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, the present invention is not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.

Claims (21)

1. A method for operating a user input device comprising:
scanning a plurality of touch sensitive elements of a touch pad to measure a plurality of touch sensitive element values;
comparing the plurality of touch sensitive element values to a stylus input threshold pattern and, upon a favorable comparison:
determining a stylus input condition;
enacting stylus input touch pad processing settings; and
detecting a position of the stylus upon the touch pad based upon the plurality of touch sensitive element values and the stylus input touch pad processing settings; and
comparing the plurality of touch sensitive element values to a touching finger threshold pattern and, upon a favorable comparison:
determining a touching finger condition;
enacting touching finger touch pad processing settings; and
detecting a position of the touching finger upon the touch pad based upon the plurality of touch sensitive element values.
2. The method of claim 1, wherein the touch sensitive element values are selected from the group consisting of:
touch sensitive element measured capacitance;
touch sensitive element measured inductance; and
touch sensitive element measured Radio Frequency (RF) impedance.
3. The method of claim 1, further comprising processing the plurality of touch sensitive element values to produce one of:
video game controller input;
remote control input;
vehicle data input device input;
cellular telephone input;
portable electronic device input;
computer input; and
keypad replacement device input.
4. The method of claim 1, wherein:
the stylus input threshold pattern comprises substantially uniform touch sensitive element thresholds for a first proximate group of touch sensitive elements; and
the touching finger threshold pattern comprises substantially uniform touch sensitive element thresholds for a second proximate group of touch sensitive elements, the second proximate group of touch sensitive elements greater in number than the first proximate group of touch sensitive elements.
5. The method of claim 1, wherein the stylus input touch pad processing settings have finer resolution than do the touching finger touch pad processing settings.
6. The method of claim 1, wherein:
the stylus input touch pad processing settings are enacted for a first portion of the touch pad; and
the touching finger touch pad processing settings are enacted for a second portion of the touch pad.
7. The method of claim 6, further comprising:
ignoring touch sensitive element values for the second portion of the touch pad; and
processing the touch sensitive element values for the first portion of the touch pad to determine touch pad input.
8. The method of claim 6, further comprising:
processing the touch sensitive element values for the first portion of the touch pad using the stylus input touch pad processing settings to determine first touch pad input; and
processing the touch sensitive element values for the second portion of the touch pad using the touching finger touch pad processing settings to determine second touch pad input.
9. The method of claim 1, wherein:
the stylus input touch pad processing settings support single touch input; and
the touching finger touch pad processing settings support multiple touch input.
10. A method for operating a user input device comprising:
scanning a plurality of touch sensitive elements of a touch pad to measure a plurality of touch sensitive element values;
comparing the plurality of touch sensitive element values to a stylus input threshold pattern and, upon a favorable comparison:
determining a stylus input condition; and
enacting stylus input touch pad processing settings;
comparing the plurality of touch sensitive element values to a touching finger threshold pattern and, upon a favorable comparison:
determining a touching finger condition; and
enacting touching finger touch pad processing settings; and
processing the plurality of touch sensitive element values based upon an enacted one of the stylus input touch pad processing settings and the touching finger touch pad processing settings to produce touch pad input.
11. The method of claim 10, further comprising transmitting the touch pad input to a remote device via a communications interface of the user input device.
12. The method of claim 10, wherein:
the stylus input threshold pattern comprises substantially uniform touch sensitive element thresholds for a first proximate group of touch sensitive elements; and
the touching finger threshold pattern comprises substantially uniform touch sensitive element thresholds for a second proximate group of touch sensitive elements, the second proximate group of touch sensitive elements greater in number than the first proximate group of touch sensitive elements.
13. A user input device comprising:
a communications interface;
a touch pad having a plurality of touch sensitive elements; and
processing circuitry coupled to the communications interface and to the touch pad, the processing circuitry operable to:
scan the plurality of touch sensitive elements to measure a plurality of touch sensitive element values;
compare the plurality of touch sensitive element values to a stylus input threshold pattern and, upon a favorable comparison:
determine a stylus input condition; and
enact stylus input touch pad processing settings;
compare the plurality of touch sensitive element values to a touching finger threshold pattern and, upon a favorable comparison:
determine a touching finger condition; and
enact touching finger touch pad processing settings; and
process the plurality of touch sensitive element values based upon an enacted one of the stylus input touch pad processing settings and the touching finger touch pad processing settings to produce touch pad input.
14. The user input device of claim 13, wherein touch sensitive element values are selected from the group consisting of:
touch sensitive element measured capacitance;
touch sensitive element measured inductance; and
touch sensitive element measured Radio Frequency (RF) impedance.
15. The user input device of claim 13, wherein the user input device comprises one of:
a video game controller;
a remote control;
a vehicle data input device;
a cellular telephone;
a portable electronic device;
a computer; and
a keypad replacement device.
16. The user input device of claim 13, wherein:
the stylus input threshold pattern comprises substantially uniform touch sensitive element thresholds for a first proximate group of touch sensitive elements; and
the touching finger threshold pattern comprises substantially uniform touch sensitive element thresholds for a second proximate group of touch sensitive elements, the second proximate group of touch sensitive elements greater in number than the first proximate group of touch sensitive elements.
17. The user input device of claim 13:
further comprising a touch pad display coupled to the processing circuitry and corresponding to the touch pad, the touch pad display having a plurality of display elements configured to display at least one simulated button, each simulated button corresponding to a plurality of touch sensitive elements; and
wherein the processing circuitry is operable to cause the touch pad display to indicate depression of a simulated button based upon the touch pad input.
18. The user input device of claim 13, wherein the stylus input touch pad processing settings have finer resolution than do the touching finger touch pad processing settings.
19. The user input device of claim 13, wherein:
the processing circuitry enacts the stylus input touch pad processing settings for a first portion of the touch pad; and
the processing circuitry enacts the touching finger touch pad processing settings for a second portion of the touch pad.
20. The user input device of claim 19, wherein:
the processing circuitry ignores touch sensitive element values for the second portion of the touch pad; and
the processing circuitry processes touch sensitive element values for the first portion of the touch pad to determine the touch pad input.
21. The user input device of claim 19, wherein:
the processing circuitry supports single touch input for the stylus input touch pad processing settings; and
the processing circuitry supports multiple touch input for the touching finger touch pad processing settings.
US12/912,472 2009-11-16 2010-10-26 Touch sensitive panel supporting stylus input Abandoned US20110115741A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/912,472 US20110115741A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel supporting stylus input

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26170209P 2009-11-16 2009-11-16
US12/912,472 US20110115741A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel supporting stylus input

Publications (1)

Publication Number Publication Date
US20110115741A1 true US20110115741A1 (en) 2011-05-19

Family

ID=44010905

Family Applications (12)

Application Number Title Priority Date Filing Date
US12/894,011 Active 2031-03-08 US8535133B2 (en) 2009-11-16 2010-09-29 Video game with controller sensing player inappropriate activity
US12/912,595 Abandoned US20110118027A1 (en) 2009-11-16 2010-10-26 Altering video game operations based upon user id and-or grip position
US12/912,472 Abandoned US20110115741A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel supporting stylus input
US12/912,342 Abandoned US20110118024A1 (en) 2009-11-16 2010-10-26 Adjusting operation of touch sensitive panel of game controller
US12/912,405 Active 2032-05-21 US8614621B2 (en) 2009-11-16 2010-10-26 Remote control for multimedia system having touch sensitive panel for user ID
US12/912,651 Abandoned US20110118029A1 (en) 2009-11-16 2010-10-26 Hand-held gaming device with touch sensitive panel(s) for gaming input
US12/912,645 Active 2031-03-13 US8449393B2 (en) 2009-11-16 2010-10-26 Hand-held gaming device with configurable touch sensitive panel(s)
US12/912,422 Abandoned US20110118025A1 (en) 2009-11-16 2010-10-26 Game controller with touch pad user interface
US12/912,637 Abandoned US20110115606A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel in vehicle for user identification
US12/943,768 Active 2033-02-04 US8838060B2 (en) 2009-11-16 2010-11-10 Device communications via intra-body communication path
US12/945,556 Active 2031-09-08 US9007331B2 (en) 2009-11-16 2010-11-12 Touch sensitive panel detecting hovering finger
US13/867,316 Active US8845424B2 (en) 2009-11-16 2013-04-22 Hand-held gaming device with configurable touch sensitive panel(s)

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US12/894,011 Active 2031-03-08 US8535133B2 (en) 2009-11-16 2010-09-29 Video game with controller sensing player inappropriate activity
US12/912,595 Abandoned US20110118027A1 (en) 2009-11-16 2010-10-26 Altering video game operations based upon user id and-or grip position

Family Applications After (9)

Application Number Title Priority Date Filing Date
US12/912,342 Abandoned US20110118024A1 (en) 2009-11-16 2010-10-26 Adjusting operation of touch sensitive panel of game controller
US12/912,405 Active 2032-05-21 US8614621B2 (en) 2009-11-16 2010-10-26 Remote control for multimedia system having touch sensitive panel for user ID
US12/912,651 Abandoned US20110118029A1 (en) 2009-11-16 2010-10-26 Hand-held gaming device with touch sensitive panel(s) for gaming input
US12/912,645 Active 2031-03-13 US8449393B2 (en) 2009-11-16 2010-10-26 Hand-held gaming device with configurable touch sensitive panel(s)
US12/912,422 Abandoned US20110118025A1 (en) 2009-11-16 2010-10-26 Game controller with touch pad user interface
US12/912,637 Abandoned US20110115606A1 (en) 2009-11-16 2010-10-26 Touch sensitive panel in vehicle for user identification
US12/943,768 Active 2033-02-04 US8838060B2 (en) 2009-11-16 2010-11-10 Device communications via intra-body communication path
US12/945,556 Active 2031-09-08 US9007331B2 (en) 2009-11-16 2010-11-12 Touch sensitive panel detecting hovering finger
US13/867,316 Active US8845424B2 (en) 2009-11-16 2013-04-22 Hand-held gaming device with configurable touch sensitive panel(s)

Country Status (1)

Country Link
US (12) US8535133B2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110216032A1 (en) * 2010-03-05 2011-09-08 Wacom Co., Ltd. Position detection apparatus
US20130176270A1 (en) * 2012-01-09 2013-07-11 Broadcom Corporation Object classification for touch panels
US20130307803A1 (en) * 2011-02-04 2013-11-21 Panasonic Corporation Electronic device
US8660978B2 (en) 2010-12-17 2014-02-25 Microsoft Corporation Detecting and responding to unintentional contact with a computing device
US20140168116A1 (en) * 2012-12-18 2014-06-19 Logitech Europe S.A. Method and system for discriminating stylus and touch interactions
US20140253520A1 (en) * 2013-03-11 2014-09-11 Barnesandnoble.Com Llc Stylus-based slider functionality for ui control of computing device
US20140253464A1 (en) * 2013-03-11 2014-09-11 Barnesandnoble.Com Llc Stylus sensitive device with stylus idle functionality
US8902181B2 (en) 2012-02-07 2014-12-02 Microsoft Corporation Multi-touch-movement gestures for tablet computing devices
US8982045B2 (en) 2010-12-17 2015-03-17 Microsoft Corporation Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
US8988398B2 (en) 2011-02-11 2015-03-24 Microsoft Corporation Multi-touch input device with orientation sensing
US8994646B2 (en) 2010-12-17 2015-03-31 Microsoft Corporation Detecting gestures involving intentional movement of a computing device
US20150205376A1 (en) * 2014-01-21 2015-07-23 Seiko Epson Corporation Position detecting device, position detecting system, and controlling method of position detecting device
EP2911016A1 (en) * 2014-02-21 2015-08-26 Polar Electro Oy Radio frequency based touchscreen
US9195351B1 (en) * 2011-09-28 2015-11-24 Amazon Technologies, Inc. Capacitive stylus
US9201520B2 (en) 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
US9244545B2 (en) 2010-12-17 2016-01-26 Microsoft Technology Licensing, Llc Touch and stylus discrimination and rejection for contact sensitive computing devices
US20160034051A1 (en) * 2014-07-31 2016-02-04 Cisco Technology, Inc. Audio-visual content navigation with movement of computing device
US9727161B2 (en) 2014-06-12 2017-08-08 Microsoft Technology Licensing, Llc Sensor correlation for pen and touch-sensitive computing device interaction
US9766723B2 (en) 2013-03-11 2017-09-19 Barnes & Noble College Booksellers, Llc Stylus sensitive device with hover over stylus control functionality
US9870083B2 (en) 2014-06-12 2018-01-16 Microsoft Technology Licensing, Llc Multi-device multi-user sensor correlation for pen and computing device interaction
US9946365B2 (en) 2013-03-11 2018-04-17 Barnes & Noble College Booksellers, Llc Stylus-based pressure-sensitive area for UI control of computing device
US10222935B2 (en) 2014-04-23 2019-03-05 Cisco Technology Inc. Treemap-type user interface
US10372520B2 (en) 2016-11-22 2019-08-06 Cisco Technology, Inc. Graphical user interface for visualizing a plurality of issues with an infrastructure
US10397640B2 (en) 2013-11-07 2019-08-27 Cisco Technology, Inc. Interactive contextual panels for navigating a content stream
US10739943B2 (en) 2016-12-13 2020-08-11 Cisco Technology, Inc. Ordered list user interface
US10862867B2 (en) 2018-04-01 2020-12-08 Cisco Technology, Inc. Intelligent graphical user interface

Families Citing this family (206)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9151564B1 (en) 2006-08-15 2015-10-06 Triggermaster, Inc. Firearm trigger pull training system and methods
US8777620B1 (en) * 2006-08-15 2014-07-15 Triggermaster, Inc. Firearm trigger pull training system and methods
US8556628B1 (en) 2006-08-15 2013-10-15 Malcom E. Baxter Shooting training device
US8463182B2 (en) * 2009-12-24 2013-06-11 Sony Computer Entertainment Inc. Wireless device pairing and grouping methods
US20100060592A1 (en) * 2008-09-10 2010-03-11 Jeffrey Traer Bernstein Data Transmission and Reception Using Optical In-LCD Sensing
TWI483145B (en) * 2009-02-26 2015-05-01 Htc Corp Portable electronic device and method for avoiding erroneously touching touch panel thereof
US8668145B2 (en) * 2009-04-21 2014-03-11 Technology Innovators Inc. Automatic touch identification system and method thereof
JP5195637B2 (en) * 2009-05-21 2013-05-08 富士通株式会社 BAN sensor wireless communication apparatus and method
KR20100126958A (en) * 2009-05-25 2010-12-03 삼성전자주식회사 Multi-device control method and apparatus
US9323398B2 (en) 2009-07-10 2016-04-26 Apple Inc. Touch and hover sensing
KR20110080894A (en) * 2010-01-07 2011-07-13 삼성전자주식회사 Multi-touch input processing method and device
JP5508122B2 (en) * 2010-04-30 2014-05-28 株式会社ソニー・コンピュータエンタテインメント Program, information input device, and control method thereof
US20120068952A1 (en) * 2010-05-25 2012-03-22 Motorola Mobility, Inc. User computer device with temperature sensing capabilities and method of operating same
US9103732B2 (en) 2010-05-25 2015-08-11 Google Technology Holdings LLC User computer device with temperature sensing capabilities and method of operating same
GB2481596B (en) * 2010-06-29 2014-04-16 Nds Ltd System and method for identifying a user through an object held in a hand
US9357024B2 (en) 2010-08-05 2016-05-31 Qualcomm Incorporated Communication management utilizing destination device user presence probability
US9098138B2 (en) * 2010-08-27 2015-08-04 Apple Inc. Concurrent signal detection for touch and hover sensing
CN103221910B (en) * 2010-08-27 2016-04-13 Uico公司 There is the capacitive touch screen of the touch sensing of dynamic capacity control and improvement
US9569003B2 (en) * 2010-09-30 2017-02-14 Broadcom Corporation Portable computing device including a three-dimensional touch screen
AU2011318246A1 (en) 2010-10-22 2013-05-09 Joshua Michael Young Methods devices and systems for creating control signals
US10262324B2 (en) 2010-11-29 2019-04-16 Biocatch Ltd. System, device, and method of differentiating among users based on user-specific page navigation sequence
US11223619B2 (en) 2010-11-29 2022-01-11 Biocatch Ltd. Device, system, and method of user authentication based on user-specific characteristics of task performance
US10621585B2 (en) 2010-11-29 2020-04-14 Biocatch Ltd. Contextual mapping of web-pages, and generation of fraud-relatedness score-values
US10069837B2 (en) 2015-07-09 2018-09-04 Biocatch Ltd. Detection of proxy server
US10834590B2 (en) 2010-11-29 2020-11-10 Biocatch Ltd. Method, device, and system of differentiating between a cyber-attacker and a legitimate user
US11269977B2 (en) 2010-11-29 2022-03-08 Biocatch Ltd. System, apparatus, and method of collecting and processing data in electronic devices
US11210674B2 (en) 2010-11-29 2021-12-28 Biocatch Ltd. Method, device, and system of detecting mule accounts and accounts used for money laundering
US10404729B2 (en) 2010-11-29 2019-09-03 Biocatch Ltd. Device, method, and system of generating fraud-alerts for cyber-attacks
US10776476B2 (en) 2010-11-29 2020-09-15 Biocatch Ltd. System, device, and method of visual login
US10949757B2 (en) 2010-11-29 2021-03-16 Biocatch Ltd. System, device, and method of detecting user identity based on motor-control loop model
US20190158535A1 (en) * 2017-11-21 2019-05-23 Biocatch Ltd. Device, System, and Method of Detecting Vishing Attacks
US10298614B2 (en) * 2010-11-29 2019-05-21 Biocatch Ltd. System, device, and method of generating and managing behavioral biometric cookies
US10685355B2 (en) 2016-12-04 2020-06-16 Biocatch Ltd. Method, device, and system of detecting mule accounts and accounts used for money laundering
US10474815B2 (en) 2010-11-29 2019-11-12 Biocatch Ltd. System, device, and method of detecting malicious automatic script and code injection
US10970394B2 (en) 2017-11-21 2021-04-06 Biocatch Ltd. System, device, and method of detecting vishing attacks
US10949514B2 (en) 2010-11-29 2021-03-16 Biocatch Ltd. Device, system, and method of differentiating among users based on detection of hardware components
US10728761B2 (en) 2010-11-29 2020-07-28 Biocatch Ltd. Method, device, and system of detecting a lie of a user who inputs data
US10586036B2 (en) 2010-11-29 2020-03-10 Biocatch Ltd. System, device, and method of recovery and resetting of user authentication factor
US10747305B2 (en) 2010-11-29 2020-08-18 Biocatch Ltd. Method, system, and device of authenticating identity of a user of an electronic device
US10917431B2 (en) * 2010-11-29 2021-02-09 Biocatch Ltd. System, method, and device of authenticating a user based on selfie image or selfie video
US12101354B2 (en) * 2010-11-29 2024-09-24 Biocatch Ltd. Device, system, and method of detecting vishing attacks
US10897482B2 (en) 2010-11-29 2021-01-19 Biocatch Ltd. Method, device, and system of back-coloring, forward-coloring, and fraud detection
US9851849B2 (en) * 2010-12-03 2017-12-26 Apple Inc. Touch device communication
US9310923B2 (en) 2010-12-03 2016-04-12 Apple Inc. Input device for touch sensitive devices
WO2012094740A1 (en) * 2011-01-12 2012-07-19 Smart Technologies Ulc Method for supporting multiple menus and interactive input system employing same
AU2012201543B2 (en) * 2011-03-15 2015-04-09 Aristocrat Technologies Australia Pty Limited An environmental controller, an environment control system and an environment control method
US20120287065A1 (en) * 2011-05-10 2012-11-15 Kyocera Corporation Electronic device
JP2012247911A (en) * 2011-05-26 2012-12-13 Sony Corp Information processing apparatus, information processing method, and program
US8975903B2 (en) 2011-06-09 2015-03-10 Ford Global Technologies, Llc Proximity switch having learned sensitivity and method therefor
US8928336B2 (en) 2011-06-09 2015-01-06 Ford Global Technologies, Llc Proximity switch having sensitivity control and method therefor
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
US10004286B2 (en) 2011-08-08 2018-06-26 Ford Global Technologies, Llc Glove having conductive ink and method of interacting with proximity sensor
US9143126B2 (en) 2011-09-22 2015-09-22 Ford Global Technologies, Llc Proximity switch having lockout control for controlling movable panel
US20130104039A1 (en) * 2011-10-21 2013-04-25 Sony Ericsson Mobile Communications Ab System and Method for Operating a User Interface on an Electronic Device
EP2587347A3 (en) * 2011-10-25 2016-01-20 Broadcom Corporation Portable computing device including a three-dimensional touch screen
US8750852B2 (en) 2011-10-27 2014-06-10 Qualcomm Incorporated Controlling access to a mobile device
US10112556B2 (en) 2011-11-03 2018-10-30 Ford Global Technologies, Llc Proximity switch having wrong touch adaptive learning and method
US8994228B2 (en) 2011-11-03 2015-03-31 Ford Global Technologies, Llc Proximity switch having wrong touch feedback
US8878438B2 (en) 2011-11-04 2014-11-04 Ford Global Technologies, Llc Lamp and proximity switch assembly and method
US9331743B2 (en) * 2011-12-08 2016-05-03 Microsoft Technology Licensing, Llc Biological entity communication channel
US20130147602A1 (en) * 2011-12-12 2013-06-13 Cisco Technology, Inc. Determination of user based on electrical measurement
US20130154958A1 (en) * 2011-12-20 2013-06-20 Microsoft Corporation Content system with secondary touch controller
US9354748B2 (en) 2012-02-13 2016-05-31 Microsoft Technology Licensing, Llc Optical stylus interaction
US9013425B2 (en) * 2012-02-23 2015-04-21 Cypress Semiconductor Corporation Method and apparatus for data transmission via capacitance sensing device
US8873227B2 (en) 2012-03-02 2014-10-28 Microsoft Corporation Flexible hinge support layer
US9298236B2 (en) 2012-03-02 2016-03-29 Microsoft Technology Licensing, Llc Multi-stage power adapter configured to provide a first power level upon initial connection of the power adapter to the host device and a second power level thereafter upon notification from the host device to the power adapter
US9870066B2 (en) 2012-03-02 2018-01-16 Microsoft Technology Licensing, Llc Method of manufacturing an input device
US9075566B2 (en) 2012-03-02 2015-07-07 Microsoft Technoogy Licensing, LLC Flexible hinge spine
US9460029B2 (en) 2012-03-02 2016-10-04 Microsoft Technology Licensing, Llc Pressure sensitive keys
US9520875B2 (en) 2012-04-11 2016-12-13 Ford Global Technologies, Llc Pliable proximity switch assembly and activation method
US9660644B2 (en) 2012-04-11 2017-05-23 Ford Global Technologies, Llc Proximity switch assembly and activation method
US9184745B2 (en) 2012-04-11 2015-11-10 Ford Global Technologies, Llc Proximity switch assembly and method of sensing user input based on signal rate of change
US9568527B2 (en) 2012-04-11 2017-02-14 Ford Global Technologies, Llc Proximity switch assembly and activation method having virtual button mode
US9197206B2 (en) 2012-04-11 2015-11-24 Ford Global Technologies, Llc Proximity switch having differential contact surface
US8933708B2 (en) 2012-04-11 2015-01-13 Ford Global Technologies, Llc Proximity switch assembly and activation method with exploration mode
US9559688B2 (en) 2012-04-11 2017-01-31 Ford Global Technologies, Llc Proximity switch assembly having pliable surface and depression
US9065447B2 (en) 2012-04-11 2015-06-23 Ford Global Technologies, Llc Proximity switch assembly and method having adaptive time delay
US9831870B2 (en) 2012-04-11 2017-11-28 Ford Global Technologies, Llc Proximity switch assembly and method of tuning same
US9944237B2 (en) 2012-04-11 2018-04-17 Ford Global Technologies, Llc Proximity switch assembly with signal drift rejection and method
US9219472B2 (en) 2012-04-11 2015-12-22 Ford Global Technologies, Llc Proximity switch assembly and activation method using rate monitoring
US9287864B2 (en) 2012-04-11 2016-03-15 Ford Global Technologies, Llc Proximity switch assembly and calibration method therefor
US9531379B2 (en) 2012-04-11 2016-12-27 Ford Global Technologies, Llc Proximity switch assembly having groove between adjacent proximity sensors
KR20140135839A (en) * 2012-04-20 2014-11-26 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Online game experience using multiple devices
US9201547B2 (en) 2012-04-30 2015-12-01 Apple Inc. Wide dynamic range capacitive sensing
US20130300590A1 (en) 2012-05-14 2013-11-14 Paul Henry Dietz Audio Feedback
US9136840B2 (en) 2012-05-17 2015-09-15 Ford Global Technologies, Llc Proximity switch assembly having dynamic tuned threshold
US9159221B1 (en) * 2012-05-25 2015-10-13 George Stantchev Steering wheel with remote control capabilities
US8981602B2 (en) 2012-05-29 2015-03-17 Ford Global Technologies, Llc Proximity switch assembly having non-switch contact and method
US9337832B2 (en) 2012-06-06 2016-05-10 Ford Global Technologies, Llc Proximity switch and method of adjusting sensitivity therefor
US9684382B2 (en) 2012-06-13 2017-06-20 Microsoft Technology Licensing, Llc Input device configuration having capacitive and pressure sensors
US9459160B2 (en) 2012-06-13 2016-10-04 Microsoft Technology Licensing, Llc Input device sensor configuration
JP5923394B2 (en) * 2012-06-20 2016-05-24 株式会社Nttドコモ Recognition device, recognition method, and recognition system
US9641172B2 (en) 2012-06-27 2017-05-02 Ford Global Technologies, Llc Proximity switch assembly having varying size electrode fingers
US9557845B2 (en) 2012-07-27 2017-01-31 Apple Inc. Input device for and method of communication with capacitive devices through frequency variation
US9652090B2 (en) 2012-07-27 2017-05-16 Apple Inc. Device for digital communication through capacitive coupling
JP2015532803A (en) * 2012-08-07 2015-11-12 ウエブチユーナー・コーポレイシヨン Targeting multimedia ads and recommending content with a viewer identifier detection system
US8964379B2 (en) 2012-08-20 2015-02-24 Microsoft Corporation Switchable magnetic lock
US8922340B2 (en) 2012-09-11 2014-12-30 Ford Global Technologies, Llc Proximity switch based door latch release
US9426274B2 (en) * 2012-09-27 2016-08-23 Intel Corporation Device, method, and system for portable configuration of vehicle controls
US8796575B2 (en) 2012-10-31 2014-08-05 Ford Global Technologies, Llc Proximity switch assembly having ground layer
US10817096B2 (en) * 2014-02-06 2020-10-27 Apple Inc. Force sensor incorporated into display
US9244576B1 (en) 2012-12-21 2016-01-26 Cypress Semiconductor Corporation User interface with child-lock feature
US20140201205A1 (en) * 2013-01-14 2014-07-17 Disney Enterprises, Inc. Customized Content from User Data
US9176538B2 (en) 2013-02-05 2015-11-03 Microsoft Technology Licensing, Llc Input device configurations
KR20150113169A (en) 2013-02-08 2015-10-07 애플 인크. Force determination based on capacitive sensing
US10578499B2 (en) 2013-02-17 2020-03-03 Microsoft Technology Licensing, Llc Piezo-actuated virtual buttons for touch surfaces
US9311204B2 (en) 2013-03-13 2016-04-12 Ford Global Technologies, Llc Proximity interface development system having replicator and method
US9143715B2 (en) 2013-03-14 2015-09-22 Intel Corporation Remote control with capacitive touchpad
US9696839B1 (en) * 2013-03-15 2017-07-04 Adac Plastics, Inc. Vehicle door control
KR102213486B1 (en) * 2013-03-15 2021-02-08 텍추얼 랩스 컴퍼니 Fast multi-touch noise reduction
JP5697113B2 (en) * 2013-04-26 2015-04-08 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Electronics
US9440143B2 (en) 2013-07-02 2016-09-13 Kabam, Inc. System and method for determining in-game capabilities based on device information
FR3008510B1 (en) 2013-07-12 2017-06-23 Blinksight DEVICE AND METHOD FOR CONTROLLING ACCESS TO AT LEAST ONE MACHINE
US9671889B1 (en) 2013-07-25 2017-06-06 Apple Inc. Input member with capacitive sensor
US9939935B2 (en) 2013-07-31 2018-04-10 Apple Inc. Scan engine for touch controller architecture
US9415306B1 (en) * 2013-08-12 2016-08-16 Kabam, Inc. Clients communicate input technique to server
KR20150020865A (en) * 2013-08-19 2015-02-27 삼성전자주식회사 Method and apparatus for processing a input of electronic device
US9117100B2 (en) 2013-09-11 2015-08-25 Qualcomm Incorporated Dynamic learning for object tracking
US10025489B2 (en) 2013-09-16 2018-07-17 Microsoft Technology Licensing, Llc Detecting primary hover point for multi-hover point device
US9602624B2 (en) 2013-09-30 2017-03-21 AT&T Intellectual Property I, L.L.P. Facilitating content management based on profiles of members in an environment
WO2015069311A1 (en) * 2013-11-08 2015-05-14 Seyamak Vaziri Capacitive track pad transmission shift knob
US9623322B1 (en) 2013-11-19 2017-04-18 Kabam, Inc. System and method of displaying device information for party formation
US9933879B2 (en) 2013-11-25 2018-04-03 Apple Inc. Reconfigurable circuit topology for both self-capacitance and mutual capacitance sensing
WO2015080696A1 (en) * 2013-11-26 2015-06-04 Rinand Solutions Llc Self-calibration of force sensors and inertial compensation
US9295916B1 (en) 2013-12-16 2016-03-29 Kabam, Inc. System and method for providing recommendations for in-game events
US20150177945A1 (en) * 2013-12-23 2015-06-25 Uttam K. Sengupta Adapting interface based on usage context
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US9227141B2 (en) 2013-12-31 2016-01-05 Microsoft Technology Licensing, Llc Touch screen game controller
US9753562B2 (en) 2014-01-15 2017-09-05 Nokia Technologies Oy Dynamic threshold for local connectivity setup
US20150199941A1 (en) * 2014-01-15 2015-07-16 Nokia Corporation 3d touch sensor reader
EP3072040B1 (en) 2014-02-12 2021-12-29 Apple Inc. Force determination employing sheet sensor and capacitive array
US10120420B2 (en) 2014-03-21 2018-11-06 Microsoft Technology Licensing, Llc Lockable display and techniques enabling use of lockable displays
US9400880B2 (en) 2014-06-17 2016-07-26 Qualcomm Incorporated Method and apparatus for biometric-based security using capacitive profiles
GB2528086A (en) * 2014-07-09 2016-01-13 Jaguar Land Rover Ltd Identification method and apparatus
US10712116B1 (en) 2014-07-14 2020-07-14 Triggermaster, Llc Firearm body motion detection training system
US10324733B2 (en) 2014-07-30 2019-06-18 Microsoft Technology Licensing, Llc Shutdown notifications
US20160034171A1 (en) * 2014-08-04 2016-02-04 Flextronics Ap, Llc Multi-touch gesture recognition using multiple single-touch touch pads
US9424048B2 (en) 2014-09-15 2016-08-23 Microsoft Technology Licensing, Llc Inductive peripheral retention device
US9946371B2 (en) 2014-10-16 2018-04-17 Qualcomm Incorporated System and method for using touch orientation to distinguish between users of a touch panel
US10038443B2 (en) 2014-10-20 2018-07-31 Ford Global Technologies, Llc Directional proximity switch assembly
KR102380228B1 (en) * 2014-11-14 2022-03-30 삼성전자주식회사 Method for controlling device and the device
US10065111B1 (en) * 2014-12-16 2018-09-04 Oculus Vr, Llc Mapping user interactions with a controller to a hand position
US9763088B2 (en) * 2014-12-31 2017-09-12 Ruckus Wireless, Inc. Mesh network with personal pre-shared keys
US9654103B2 (en) 2015-03-18 2017-05-16 Ford Global Technologies, Llc Proximity switch assembly having haptic feedback and method
US9548733B2 (en) 2015-05-20 2017-01-17 Ford Global Technologies, Llc Proximity sensor assembly having interleaved electrode configuration
US10191579B2 (en) * 2015-05-22 2019-01-29 Tactual Labs Co. Transmitting and receiving system and method for bidirectional orthogonal signaling sensors
US9898091B2 (en) 2015-06-03 2018-02-20 Oculus Vr, Llc Virtual reality system with head-mounted display, camera and hand-held controllers
US10416799B2 (en) 2015-06-03 2019-09-17 Microsoft Technology Licensing, Llc Force sensing and inadvertent input control of an input device
US10222889B2 (en) 2015-06-03 2019-03-05 Microsoft Technology Licensing, Llc Force inputs and cursor control
US9870052B2 (en) 2015-06-11 2018-01-16 Oculus Vr, Llc Hand-held controller with pressure-sensing switch for virtual-reality systems
US9999833B2 (en) * 2015-06-11 2018-06-19 Oculus Vr, Llc Hand-held controllers with capacitive touch sensors for virtual-reality systems
GB2539705B (en) 2015-06-25 2017-10-25 Aimbrain Solutions Ltd Conditional behavioural biometrics
US10007421B2 (en) * 2015-08-03 2018-06-26 Lenovo (Singapore) Pte. Ltd. Natural handwriting detection on a touch surface
US9660968B2 (en) 2015-09-25 2017-05-23 Intel Corporation Methods and apparatus for conveying a nonce via a human body communication conduit
US9701202B2 (en) 2015-11-13 2017-07-11 Thunder Power New Energy Vehicle Development Company Limited Vehicle fingerprint bookmark
US10325134B2 (en) * 2015-11-13 2019-06-18 Fingerprint Cards Ab Method and system for calibration of an optical fingerprint sensing device
US20170140233A1 (en) * 2015-11-13 2017-05-18 Fingerprint Cards Ab Method and system for calibration of a fingerprint sensing device
US9639620B1 (en) * 2015-11-13 2017-05-02 Thunder Power Hong Kong Ltd. Vehicle fingerprint bookmark
US9891773B2 (en) 2015-12-17 2018-02-13 Synaptics Incorporated Detecting hover distance with a capacitive sensor
US20170185980A1 (en) * 2015-12-24 2017-06-29 Capital One Services, Llc Personalized automatic teller machine
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
US10025492B2 (en) 2016-02-08 2018-07-17 Microsoft Technology Licensing, Llc Pointing detection
EP3419729A4 (en) * 2016-02-25 2019-10-09 Box Dark Industries Pty. Ltd. Articulated gaming controller
KR102559030B1 (en) 2016-03-18 2023-07-25 삼성전자주식회사 Electronic device including a touch panel and method for controlling thereof
US10007343B2 (en) 2016-03-31 2018-06-26 Apple Inc. Force sensor in an input device
GB2552032B (en) 2016-07-08 2019-05-22 Aimbrain Solutions Ltd Step-up authentication
US10086267B2 (en) 2016-08-12 2018-10-02 Microsoft Technology Licensing, Llc Physical gesture input configuration for interactive software and video games
KR102425576B1 (en) 2016-09-13 2022-07-26 삼성전자주식회사 Wearable device and the operation method thereof
US10579784B2 (en) 2016-11-02 2020-03-03 Biocatch Ltd. System, device, and method of secure utilization of fingerprints for user authentication
US10444927B2 (en) 2016-11-04 2019-10-15 Microsoft Technology Licensing, Llc Stylus hover and position communication protocol
US10379806B2 (en) 2016-11-04 2019-08-13 International Business Machines Corporation Dynamic selection for touch sensor
US10397262B2 (en) 2017-07-20 2019-08-27 Biocatch Ltd. Device, system, and method of detecting overlay malware
CN107485854B (en) * 2017-08-03 2022-03-01 惠州Tcl移动通信有限公司 Game paddle control method, storage medium and game paddle
JP6719433B2 (en) 2017-09-22 2020-07-08 株式会社日立製作所 Moving body control system and moving body control method
US10437365B2 (en) 2017-10-11 2019-10-08 Pixart Imaging Inc. Driver integrated circuit of touch panel and associated driving method
US10773153B2 (en) * 2017-11-02 2020-09-15 Michael Callahan Method and system for a personal interaction game platform
US10599259B2 (en) * 2017-11-20 2020-03-24 Google Llc Virtual reality / augmented reality handheld controller sensing
US10194019B1 (en) * 2017-12-01 2019-01-29 Qualcomm Incorporated Methods and systems for initiating a phone call from a wireless communication device
CN108845613B (en) * 2018-04-09 2021-07-09 广州视源电子科技股份有限公司 Interactive intelligent tablet computer and data processing method and device thereof
CN108777854A (en) * 2018-05-25 2018-11-09 恒玄科技(上海)有限公司 A kind of wireless headset realization stereosonic system and method for high-quality transmission
US11056923B2 (en) * 2018-06-05 2021-07-06 Avago Technologies International Sales Pte. Limited Wireless charging relay and method
US10866683B2 (en) 2018-08-27 2020-12-15 Apple Inc. Force or touch sensing on a mobile device using capacitive or pressure sensing
US10814222B2 (en) 2018-09-21 2020-10-27 Logitech Europe S.A. Gaming controller with adaptable input configurations
CN109350962A (en) * 2018-10-08 2019-02-19 业成科技(成都)有限公司 Touch device
WO2020111308A1 (en) * 2018-11-28 2020-06-04 전자부품연구원 Intuitive interaction method and system for augmented reality display for vehicle
EP3892344A4 (en) * 2018-12-07 2023-01-25 Sony Interactive Entertainment Inc. Entertainment device, emission control device, operating device, emission control method, and program
US10635202B1 (en) * 2018-12-18 2020-04-28 Valve Corporation Dynamic sensor assignment
US10905946B2 (en) * 2019-02-28 2021-02-02 Valve Corporation Continuous controller calibration
US11934244B2 (en) * 2019-03-06 2024-03-19 Sony Interactive Entertainment Inc. Low battery switchover
US11281373B2 (en) * 2019-05-07 2022-03-22 Yifang Liu Multi-perspective input for computing devices
GB2586333A (en) 2019-06-05 2021-02-17 Touch Biometrix Ltd Apparatus and method
US11216065B2 (en) * 2019-09-26 2022-01-04 Lenovo (Singapore) Pte. Ltd. Input control display based on eye gaze
US11504610B2 (en) * 2020-02-14 2022-11-22 Valve Corporation Dynamically enabling or disabling controls of a controller
US12121800B2 (en) 2020-03-03 2024-10-22 Backbone Labs, Inc. Haptics for touch-input hardware interfaces of a game controller
US12115443B2 (en) 2020-03-03 2024-10-15 Backbone Labs, Inc. Game controller with magnetic wireless connector
US12268956B2 (en) 2020-03-03 2025-04-08 Backbone Labs, Inc. Game controller for a mobile device with audio waveguide feature
US12194374B2 (en) 2020-03-03 2025-01-14 Backbone Labs, Inc. Game controller for a mobile device with extended bumper button
US12145052B2 (en) 2020-03-03 2024-11-19 Backbone Labs, Inc. Game controller for a mobile device with flat flex connector
US11144160B1 (en) * 2020-03-30 2021-10-12 Sigmasense, Llc. Three-dimensional data reduction method and system
CN111462557B (en) * 2020-04-09 2022-03-01 中国人民解放军陆军军医大学第二附属医院 A game-style teaching application system for clinical cases of cardiovascular disease
US12153764B1 (en) 2020-09-25 2024-11-26 Apple Inc. Stylus with receive architecture for position determination
US11606353B2 (en) 2021-07-22 2023-03-14 Biocatch Ltd. System, device, and method of generating and utilizing one-time passwords
US20230084581A1 (en) * 2021-09-16 2023-03-16 Voyetra Turtle Beach Inc. Video game controller with a graphical user interface
US12074946B2 (en) 2022-11-04 2024-08-27 Backbone Labs, Inc. System and method for automatic content capability detection
US12070678B2 (en) * 2022-12-21 2024-08-27 Backbone Labs, Inc. Dynamically changing button indicia for a game controller
WO2024148214A1 (en) 2023-01-06 2024-07-11 Backbone Labs, Inc. Open and close features for game controller bridge

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090167715A1 (en) * 2007-12-26 2009-07-02 Htc Corporation User interface of portable device and operating method thereof
US20100066693A1 (en) * 2008-09-12 2010-03-18 Mitsubishi Electric Corporation Touch panel device
US20110069012A1 (en) * 2009-09-22 2011-03-24 Sony Ericsson Mobile Communications Ab Miniature character input mechanism

Family Cites Families (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4525859A (en) * 1982-09-03 1985-06-25 Bowles Romald E Pattern recognition system
US4857916A (en) * 1987-02-26 1989-08-15 Bellin Robert W System and method for identifying an individual utilizing grasping pressures
US6343991B1 (en) * 1997-10-01 2002-02-05 Brad A. Armstrong Game control with analog pressure sensor
DE4416507C5 (en) * 1994-05-10 2006-10-19 Volkswagen Ag Method for detecting a use authorization for a vehicle
US5812252A (en) * 1995-01-31 1998-09-22 Arete Associates Fingerprint--Acquisition apparatus for access control; personal weapon and other systems controlled thereby
JP2845175B2 (en) * 1995-08-25 1999-01-13 株式会社オプテック Game console controller
US5896125A (en) * 1995-11-06 1999-04-20 Niedzwiecki; Richard H. Configurable keyboard to personal computer video game controller adapter
US6400835B1 (en) * 1996-05-15 2002-06-04 Jerome H. Lemelson Taillight mounted vehicle security system employing facial recognition using a reflected image
GB9705267D0 (en) * 1997-03-13 1997-04-30 Philips Electronics Nv Hand biometrics sensing device
US5982913A (en) * 1997-03-25 1999-11-09 The United States Of America As Represented By The National Security Agency Method of verification using a subset of claimant's fingerprint
US6100811A (en) * 1997-12-22 2000-08-08 Trw Inc. Fingerprint actuation of customized vehicle features
US6408087B1 (en) * 1998-01-13 2002-06-18 Stmicroelectronics, Inc. Capacitive semiconductor user input device
US7663607B2 (en) * 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
JP2002501271A (en) * 1998-01-26 2002-01-15 ウェスターマン,ウェイン Method and apparatus for integrating manual input
US6225890B1 (en) * 1998-03-20 2001-05-01 Trimble Navigation Limited Vehicle use control
JP3171575B2 (en) * 1998-07-31 2001-05-28 株式会社ソニー・コンピュータエンタテインメント Entertainment system and program supply medium
US6028950A (en) * 1999-02-10 2000-02-22 The National Registry, Inc. Fingerprint controlled set-top box
US6351695B1 (en) * 1999-04-23 2002-02-26 Ronald Weiss Verified common carrier truck operation log
US6369706B1 (en) * 1999-05-10 2002-04-09 Gateway, Inc. System and method for protecting a digital information appliance from environmental influences
US7047419B2 (en) * 1999-09-17 2006-05-16 Pen-One Inc. Data security system
IL134527A (en) * 2000-02-14 2011-08-31 Bioguard Components And Technology Ltd Biometrics interface
JP3868701B2 (en) * 2000-03-21 2007-01-17 三菱電機株式会社 Vehicle key system
US6565441B1 (en) * 2000-04-07 2003-05-20 Arista Enterprises Inc. Dedicated wireless digital video disc (DVD) controller for video game consoles
US20060250213A1 (en) * 2000-07-28 2006-11-09 Cain George R Jr Biometric data controlled configuration
US6819219B1 (en) * 2000-10-13 2004-11-16 International Business Machines Corporation Method for biometric-based authentication in wireless communication for access control
US6990219B2 (en) * 2000-12-15 2006-01-24 Nippon Telegraph And Telephone Corporation Image capturing method and apparatus and fingerprint collation method and apparatus
TW507158B (en) * 2001-01-05 2002-10-21 Darfon Electronics Corp Detecting device and method of mouse touch pad
US8939831B2 (en) * 2001-03-08 2015-01-27 Brian M. Dugan Systems and methods for improving fitness equipment and exercise
US6603462B2 (en) * 2001-03-21 2003-08-05 Multidigit, Inc. System and method for selecting functions based on a finger feature such as a fingerprint
US6563940B2 (en) * 2001-05-16 2003-05-13 New Jersey Institute Of Technology Unauthorized user prevention device and method
US6902481B2 (en) * 2001-09-28 2005-06-07 Igt Decoupling of the graphical presentation of a game from the presentation logic
JP2003140823A (en) * 2001-11-08 2003-05-16 Sony Computer Entertainment Inc Information input device and information processing program
US7352356B2 (en) * 2001-12-13 2008-04-01 United States Of America Refreshable scanning tactile graphic display for localized sensory stimulation
US20050084138A1 (en) * 2002-02-13 2005-04-21 Inkster D R. System and method for identifying a person
US20030220142A1 (en) * 2002-05-21 2003-11-27 Mark Siegel Video Game controller with display screen
US7616784B2 (en) * 2002-07-29 2009-11-10 Robert William Kocher Method and apparatus for contactless hand recognition
US20040021681A1 (en) * 2002-07-30 2004-02-05 Liao Chin-Hua Arthur Dual-touch-screen mobile computer
US7180508B2 (en) * 2002-09-17 2007-02-20 Tyco Electronics Corporation Dynamic corrections for a non-linear touchscreen
US7050798B2 (en) * 2002-12-16 2006-05-23 Microsoft Corporation Input device with user-balanced performance and power consumption
US7280678B2 (en) * 2003-02-28 2007-10-09 Avago Technologies General Ip Pte Ltd Apparatus and method for detecting pupils
JP2005145351A (en) * 2003-11-18 2005-06-09 Tokai Rika Co Ltd Vehicle theft preventive device
US8170945B2 (en) * 2004-01-15 2012-05-01 Bgc Partners, Inc. System and method for providing security to a game controller device for electronic trading
JP4454335B2 (en) * 2004-02-12 2010-04-21 Necインフロンティア株式会社 Fingerprint input device
US8131026B2 (en) * 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
US7308112B2 (en) * 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
US7956890B2 (en) * 2004-09-17 2011-06-07 Proximex Corporation Adaptive multi-modal integrated biometric identification detection and surveillance systems
US7180401B2 (en) * 2004-12-03 2007-02-20 Kulite Semiconductor Products, Ic. Personal identification apparatus using measured tactile pressure
WO2006118555A1 (en) * 2005-03-31 2006-11-09 Brian Scott Miller Biometric control of equipment
US20060244733A1 (en) * 2005-04-28 2006-11-02 Geaghan Bernard O Touch sensitive device and method using pre-touch information
KR20060131542A (en) * 2005-06-16 2006-12-20 엘지전자 주식회사 Touch Screen Power Save Device and Method
US20060284853A1 (en) * 2005-06-16 2006-12-21 Xm Satellite Radio, Inc. Context sensitive data input using finger or fingerprint recognition
KR100668341B1 (en) * 2005-06-29 2007-01-12 삼성전자주식회사 Method and apparatus for inputting a function of a portable terminal using a user's grip form.
ES2374221T3 (en) * 2005-07-11 2012-02-14 Volvo Technology Corporation METHODS AND DEVICE FOR CARRYING OUT THE IDENTITY CHECK OF A DRIVER.
US7942745B2 (en) * 2005-08-22 2011-05-17 Nintendo Co., Ltd. Game operating device
DE102005047137A1 (en) * 2005-09-30 2007-04-05 DaimlerChrysler AG Passenger protection and/or comfort system for use in a vehicle, having a person identification device with a biometric sensor, such as a fingerprint scanner, for identifying a passenger and unambiguously assigning the passenger to a vehicle seat
US7649522B2 (en) * 2005-10-11 2010-01-19 Fish & Richardson P.C. Human interface input acceleration system
US7868874B2 (en) * 2005-11-15 2011-01-11 Synaptics Incorporated Methods and systems for detecting a position-based attribute of an object using digital codes
US20070111796A1 (en) * 2005-11-16 2007-05-17 Microsoft Corporation Association of peripherals communicatively attached to a console device
US10048860B2 (en) * 2006-04-06 2018-08-14 Google Technology Holdings LLC Method and apparatus for user interface adaptation
JP5044272B2 (en) * 2006-05-24 2012-10-10 Nippon Soken, Inc. Vehicle user support device
KR100827234B1 (en) * 2006-05-30 2008-05-07 Samsung Electronics Co., Ltd. Touch sensor malfunction prevention method and device
US20070299670A1 (en) * 2006-06-27 2007-12-27 Sbc Knowledge Ventures, Lp Biometric and speech recognition system and method
US20080004113A1 (en) * 2006-06-30 2008-01-03 Jason Avery Enhanced controller with modifiable functionality
WO2008007372A2 (en) * 2006-07-12 2008-01-17 N-Trig Ltd. Hover and touch detection for a digitizer
US8175346B2 (en) * 2006-07-19 2012-05-08 Lumidigm, Inc. Whole-hand multispectral biometric imaging
US7660442B2 (en) * 2006-09-01 2010-02-09 Handshot, Llc Method and system for capturing fingerprints, palm prints and hand geometry
JP5294442B2 (en) * 2006-09-13 2013-09-18 Nintendo Co., Ltd. Game device and game program
US20080069412A1 (en) * 2006-09-15 2008-03-20 Champagne Katrina S Contoured biometric sensor
US20100234074A1 (en) * 2006-10-02 2010-09-16 Nokia Corporation Keypad emulation
US8232970B2 (en) * 2007-01-03 2012-07-31 Apple Inc. Scan sequence generator
US8094128B2 (en) * 2007-01-03 2012-01-10 Apple Inc. Channel scan logic
US7848825B2 (en) * 2007-01-03 2010-12-07 Apple Inc. Master/slave mode for sensor processing devices
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
US20080231604A1 (en) * 2007-03-22 2008-09-25 Cypress Semiconductor Corp. Method for extending the life of touch screens
JP5285234B2 (en) * 2007-04-24 2013-09-11 Nintendo Co., Ltd. Game system, information processing system
US8027518B2 (en) * 2007-06-25 2011-09-27 Microsoft Corporation Automatic configuration of devices based on biometric data
WO2009006557A1 (en) * 2007-07-03 2009-01-08 Cypress Semiconductor Corporation Method for improving scan time and sensitivity in touch sensitive user interface device
US20090073112A1 (en) * 2007-09-14 2009-03-19 International Business Machines Corporation Method and system for dynamically configurable tactile feedback for navigational support
US8031175B2 (en) * 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
US8299889B2 (en) * 2007-12-07 2012-10-30 Cisco Technology, Inc. Home entertainment system providing presence and mobility via remote control authentication
US20090176565A1 (en) * 2008-01-07 2009-07-09 Bally Gaming, Inc. Gaming devices for biometrically identifying a player
WO2009089050A1 (en) * 2008-01-08 2009-07-16 Cirque Corporation Game controller touchpad providing touch stick functionality and relative and absolute position input
US8195220B2 (en) * 2008-02-01 2012-06-05 Lg Electronics Inc. User interface for mobile devices
EP2113828B1 (en) * 2008-04-30 2017-10-11 InnoLux Corporation Display device with touch screen
US20090284532A1 (en) * 2008-05-16 2009-11-19 Apple Inc. Cursor motion blurring
US20090289780A1 (en) * 2008-05-21 2009-11-26 Danette Sue Tenorio-Fox SenCora print system
US20100039224A1 (en) * 2008-05-26 2010-02-18 Okude Kazuhiro Biometrics information matching apparatus, biometrics information matching system, biometrics information matching method, person authentication apparatus, and person authentication method
US8355003B2 (en) * 2008-06-13 2013-01-15 Microsoft Corporation Controller lighting activation by proximity and motion
KR20100006219A (en) * 2008-07-09 2010-01-19 Samsung Electronics Co., Ltd. Method and apparatus for user interface
US20100062833A1 (en) * 2008-09-10 2010-03-11 Igt Portable Gaming Machine Emergency Shut Down Circuitry
US8116453B2 (en) * 2008-12-29 2012-02-14 Bank Of America Corporation Gaming console-specific user authentication
US8217913B2 (en) * 2009-02-02 2012-07-10 Apple Inc. Integrated touch screen
US8264455B2 (en) * 2009-02-03 2012-09-11 Microsoft Corporation Mapping of physical controls for surface computing
US8172675B2 (en) * 2009-03-27 2012-05-08 Microsoft Corporation Personalization using a hand-pressure signature
US8154529B2 (en) * 2009-05-14 2012-04-10 Atmel Corporation Two-dimensional touch sensors
US20110009195A1 (en) * 2009-07-08 2011-01-13 Gunjan Porwal Configurable representation of a virtual button on a game controller touch screen
US20110028194A1 (en) * 2009-07-31 2011-02-03 Razer (Asia-Pacific) Pte Ltd System and method for unified-context mapping of physical input device controls to application program actions
US8334849B2 (en) * 2009-08-25 2012-12-18 Pixart Imaging Inc. Firmware methods and devices for a mutual capacitance touch sensing device
US8773366B2 (en) * 2009-11-16 2014-07-08 3M Innovative Properties Company Touch sensitive device using threshold voltage signal
US20120052929A1 (en) * 2010-08-31 2012-03-01 Khamvong Thammasouk Interactive phone case

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090167715A1 (en) * 2007-12-26 2009-07-02 Htc Corporation User interface of portable device and operating method thereof
US20100066693A1 (en) * 2008-09-12 2010-03-18 Mitsubishi Electric Corporation Touch panel device
US20110069012A1 (en) * 2009-09-22 2011-03-24 Sony Ericsson Mobile Communications Ab Miniature character input mechanism

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8395598B2 (en) * 2010-03-05 2013-03-12 Wacom Co., Ltd. Position detection apparatus
US20110216032A1 (en) * 2010-03-05 2011-09-08 Wacom Co., Ltd. Position detection apparatus
US8660978B2 (en) 2010-12-17 2014-02-25 Microsoft Corporation Detecting and responding to unintentional contact with a computing device
US9244545B2 (en) 2010-12-17 2016-01-26 Microsoft Technology Licensing, Llc Touch and stylus discrimination and rejection for contact sensitive computing devices
US8982045B2 (en) 2010-12-17 2015-03-17 Microsoft Corporation Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
US8994646B2 (en) 2010-12-17 2015-03-31 Microsoft Corporation Detecting gestures involving intentional movement of a computing device
US9069384B2 (en) * 2011-02-04 2015-06-30 Panasonic Intellectual Property Management Co., Ltd. Electronic device
US20130307803A1 (en) * 2011-02-04 2013-11-21 Panasonic Corporation Electronic device
US9201520B2 (en) 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
US8988398B2 (en) 2011-02-11 2015-03-24 Microsoft Corporation Multi-touch input device with orientation sensing
US9195351B1 (en) * 2011-09-28 2015-11-24 Amazon Technologies, Inc. Capacitive stylus
US20130176270A1 (en) * 2012-01-09 2013-07-11 Broadcom Corporation Object classification for touch panels
US8902181B2 (en) 2012-02-07 2014-12-02 Microsoft Corporation Multi-touch-movement gestures for tablet computing devices
US20140168116A1 (en) * 2012-12-18 2014-06-19 Logitech Europe S.A. Method and system for discriminating stylus and touch interactions
US9367185B2 (en) * 2012-12-18 2016-06-14 Logitech Europe S.A. Method and system for discriminating stylus and touch interactions
US9946365B2 (en) 2013-03-11 2018-04-17 Barnes & Noble College Booksellers, Llc Stylus-based pressure-sensitive area for UI control of computing device
US20140253464A1 (en) * 2013-03-11 2014-09-11 Barnesandnoble.Com Llc Stylus sensitive device with stylus idle functionality
US20140253520A1 (en) * 2013-03-11 2014-09-11 Barnesandnoble.Com Llc Stylus-based slider functionality for UI control of computing device
US9632594B2 (en) * 2013-03-11 2017-04-25 Barnes & Noble College Booksellers, Llc Stylus sensitive device with stylus idle functionality
US9766723B2 (en) 2013-03-11 2017-09-19 Barnes & Noble College Booksellers, Llc Stylus sensitive device with hover over stylus control functionality
US9785259B2 (en) * 2013-03-11 2017-10-10 Barnes & Noble College Booksellers, Llc Stylus-based slider functionality for UI control of computing device
US10397640B2 (en) 2013-11-07 2019-08-27 Cisco Technology, Inc. Interactive contextual panels for navigating a content stream
US20150205376A1 (en) * 2014-01-21 2015-07-23 Seiko Epson Corporation Position detecting device, position detecting system, and controlling method of position detecting device
US9753580B2 (en) * 2014-01-21 2017-09-05 Seiko Epson Corporation Position detecting device, position detecting system, and controlling method of position detecting device
EP2911016A1 (en) * 2014-02-21 2015-08-26 Polar Electro Oy Radio frequency based touchscreen
US10222935B2 (en) 2014-04-23 2019-03-05 Cisco Technology Inc. Treemap-type user interface
US9870083B2 (en) 2014-06-12 2018-01-16 Microsoft Technology Licensing, Llc Multi-device multi-user sensor correlation for pen and computing device interaction
US10168827B2 (en) 2014-06-12 2019-01-01 Microsoft Technology Licensing, Llc Sensor correlation for pen and touch-sensitive computing device interaction
US9727161B2 (en) 2014-06-12 2017-08-08 Microsoft Technology Licensing, Llc Sensor correlation for pen and touch-sensitive computing device interaction
US20160034051A1 (en) * 2014-07-31 2016-02-04 Cisco Technology, Inc. Audio-visual content navigation with movement of computing device
US10372520B2 (en) 2016-11-22 2019-08-06 Cisco Technology, Inc. Graphical user interface for visualizing a plurality of issues with an infrastructure
US11016836B2 (en) 2016-11-22 2021-05-25 Cisco Technology, Inc. Graphical user interface for visualizing a plurality of issues with an infrastructure
US10739943B2 (en) 2016-12-13 2020-08-11 Cisco Technology, Inc. Ordered list user interface
US10862867B2 (en) 2018-04-01 2020-12-08 Cisco Technology, Inc. Intelligent graphical user interface

Also Published As

Publication number Publication date
US20130237322A1 (en) 2013-09-12
US20110115604A1 (en) 2011-05-19
US20110118029A1 (en) 2011-05-19
US20110115742A1 (en) 2011-05-19
US20110118030A1 (en) 2011-05-19
US20110115606A1 (en) 2011-05-19
US20110118024A1 (en) 2011-05-19
US20110118025A1 (en) 2011-05-19
US8614621B2 (en) 2013-12-24
US8838060B2 (en) 2014-09-16
US20110118023A1 (en) 2011-05-19
US8535133B2 (en) 2013-09-17
US20110118028A1 (en) 2011-05-19
US9007331B2 (en) 2015-04-14
US20110118027A1 (en) 2011-05-19
US8449393B2 (en) 2013-05-28
US8845424B2 (en) 2014-09-30

Similar Documents

Publication Publication Date Title
US20110115741A1 (en) Touch sensitive panel supporting stylus input
US8754746B2 (en) Hand-held gaming device that identifies user based upon input from touch sensitive panel
US8184100B2 (en) Inertia sensing input controller and receiver and interactive system using thereof
JP5841409B2 (en) Control program, input terminal device, control system, and control method
US10709971B2 (en) Information processing system, extended input device, and information processing method
CN106681638A (en) Method, device and mobile terminal for controlling touch screen
CN106506554A (en) Live coding method, device, terminal, linkage encoder server and system
CN108174016A (en) Terminal shatter-resistant control method, terminal and computer readable storage medium
CN107911777A (en) Method, device and mobile terminal for processing ear return function
CN107562303B (en) Method and device for controlling element motion in display interface
CN111651387B (en) Interface circuits and electronic equipment
CN111506191B (en) Control method and electronic equipment
CA2834791C (en) Configuring the functionality of control elements of a control device based on orientation
TW202312755A (en) Input device having integrated haptics and near field communication antenna
CN108170310B (en) A touch screen control method and mobile terminal
CN114153334B (en) Electronic device and control method and control device thereof
US8432264B1 (en) Motion-activated remote control backlight
US20100309155A1 (en) Two-dimensional input device, control device and interactive game system
CN108900942B (en) A playback control method and electronic device
CN108418961B (en) Audio playback method and mobile terminal
TWI611312B (en) Method for transforming mobile communication device into game joystick
CN113873308A (en) Remote control method and control system

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LUKAS, BOB;SOBEL, DAVID A.;GUPTA, MONIKA;AND OTHERS;SIGNING DATES FROM 20100927 TO 20101025;REEL/FRAME:025203/0029

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119
