US20120249474A1 - Proximity and force detection for haptic effect generation - Google Patents
Proximity and force detection for haptic effect generation
- Publication number
- US20120249474A1 (Application US13/434,623)
- Authority
- US
- United States
- Prior art keywords
- haptic
- touch
- user
- touch screen
- user interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
Abstract
The present invention may provide a device including a haptic driver to drive a coupled actuator, causing the actuator to generate a vibratory haptic effect. A touch screen may display a user interface and may include a sensor to detect user interaction with the touch screen within a predetermined range above the touch screen. A controller may calculate a proximity event based on the detected user interaction above the touch screen and may control haptic driver operations according to the proximity event.
Description
- This application claims priority to provisional U.S. Patent Application Ser. No. 61/470,764, entitled “Touch Screen and Haptic Control” filed on Apr. 1, 2011, the content of which is incorporated herein in its entirety.
- The present invention relates to user interface control, in particular to haptic effect generation techniques based on proximity, touch, and/or force detection.
- Haptics refers to the sense of touch. In electronic devices, haptics relates to providing touch-based sensory feedback to the user. Electronic devices incorporating haptics may include cell phones, PDAs, gaming devices, etc. The user interacts with electronic devices through a user interface, such as a touch screen; however, the user often does not know whether the desired function was recognized or is being performed by the electronic device. Thus, electronic devices generate haptic feedback in the form of a vibro-tactile sensation (often, a simulated “click”) to alert the user that the electronic device is performing the desired function. Stated differently, haptic feedback lets the user know what is going on with the electronic device. In a gaming electronic device, for example, haptics can provide sensory stimuli according to game interactions.
- For a user to accept haptics, the haptic response should follow closely in time with the user action. Thus, prolonged latency in the haptic response, which is the delay between the moment of user interaction and the corresponding haptics response, causes a disconnect between the touch and the haptic response. When the latency exceeds about 250 ms, the latency becomes noticeable to the user and it can be perceived as device error rather than an event that was triggered by the user's input. For example, a user may touch a first button on a touch screen and move onto another function of the device before feeling the haptic response to the first button. This temporal disconnect results in low user acceptance of haptics leading to a poor user experience.
- Moreover, as electronic devices become more complex, user interaction with the device may expand beyond mere point touches on the screen. For example, a user hovering his/her finger over a screen may constitute one type of user interaction, and touches of different force may constitute different types of user interaction events depending on the amount of force applied. Thus, different haptic effects should complement these new types of user interaction events.
- Therefore, the inventors recognized a need in the art for efficient haptic effect generation, with reduced latency, that complements different types of user interaction events.
- FIG. 1 is a simplified block diagram of a display device according to an embodiment of the present invention.
- FIGS. 2(a)-(c) illustrate an integrated touch screen sensor grid according to an embodiment of the present invention.
- FIGS. 3(a)-(b) illustrate a series of user interaction event detections according to an embodiment of the present invention.
- FIG. 4 illustrates a haptic effect generation operation according to an embodiment of the present invention.
- FIG. 5 illustrates a haptic effect generation operation according to an embodiment of the present invention.
- FIGS. 6(a)-(b) illustrate a force detection operation according to an embodiment of the present invention.
- FIGS. 7(a)-(d) illustrate a haptic bubble effect generation operation according to an embodiment of the present invention.
- Embodiments of the present invention may provide a device including a haptic driver to drive a coupled actuator, causing the actuator to generate a vibratory haptic effect. A touch screen may display a user interface and may include a sensor to detect user interaction with the touch screen within a predetermined range above the touch screen. A controller may calculate a proximity event based on the detected user interaction above the touch screen and may control haptic driver operations according to the proximity event.
- FIG. 1(a) is a simplified block diagram of a haptic-enabled display device 100 according to an embodiment of the present invention. The device 100 may include a user interface (UI) controller 110 with a processor 112 and a memory 114, a haptics driver 120, a haptics actuator 130, a touch screen 140 with a touch screen (TS) sensor 142, and a host system 150. The device 100 may be embodied as a consumer electronic device such as a cell phone, PDA, gaming device, etc.
- Based on the TS sensor results, the UI controller 110 may calculate proximity, touch, and/or force user interaction events. Haptic generation, consequently, may be linked to these proximity, touch, and/or force events and thus may be significantly improved in terms of efficiency and precision. In an embodiment, latency may be improved by pre-charging the haptics actuator 130 based on detected proximity events such as location and/or rate of approach (i.e., velocity and/or acceleration). Therefore, a haptic effect may be generated faster upon an actual touch being detected because of the pre-charged actuator. In another embodiment, haptic generation may be dynamically changed and/or adjusted based on detected proximity, touch, and/or force events.
- The UI controller 110 may include the processor 112 and the memory 114. The processor 112 may control the operations of the UI controller 110 according to instructions stored in the memory 114. The memory 114 may also store haptic effect profiles associated with different feedback responses. Different user interaction events may be associated with different haptic effect profiles. The memory 114 may be provided as a non-volatile memory, a volatile memory such as random access memory (RAM), or a combination thereof.
- The UI controller 110 may be coupled to the host system 150 of the device. The UI controller 110 may receive instructions from the host system 150. The host system 150 may include an operating system and application(s) that are being executed by the device 100. The host system 150 may represent processing resources for the remainder of the device and may include central processing units, memory for storage of instructions representing an operating system and/or applications, and input/output devices such as display drivers, audio drivers, user input keys and the like (not shown). The host system 150 may include program instructions to govern operations of the device and manage device resources on behalf of various applications. The host system 150 may, for example, manage content of the display, providing icons and softkeys thereon to solicit user input through the touch screen 140. In an embodiment, the UI controller 110 may be integrated into the host system 150.
- The UI controller 110 may be coupled to the touch screen 140 and to the TS sensor 142 therein, which measures different user interactions with the touch screen 140. The touch screen 140 may also include an overlain display, which may be provided as a backlit LCD display with an LCD matrix, lenticular lenses, polarizers, etc.
- FIG. 1(b) is a functional block diagram of the UI controller 110 according to an embodiment of the present invention. The processor 112 in the UI controller 110 may include a proximity classification module 112.1, a touch classification module 112.2, a force classification module 112.3, and a haptics response search module 112.4. The memory 114 in the UI controller 110 may include haptics profiles data 114.1. The data may be stored as look-up tables (LUTs). The proximity classification module 112.1, the touch classification module 112.2, and the force classification module 112.3 may receive the TS sensor data. Based on the TS sensor data, each classification module may calculate corresponding proximity, touch, and/or force event(s).
- The proximity classification module 112.1 may calculate user proximity to the touch screen 140, for example before contact is made, based on proximity-associated TS sensor data. The proximity classification module 112.1 may calculate the location (or locations for multi-touch user interactions) and time of the user movement (e.g., finger, stylus, pen, etc.) as it hovers over the touch screen 140. The proximity classification module 112.1 may be complementary to the type of touch screen 140 and TS sensor 142. For example, if the touch screen 140 and the TS sensor 142 are provided as a capacitive touch screen and a corresponding capacitive sensor (or grid of capacitive sensors), the proximity classification module 112.1 may calculate changes in respective capacitive fields for detecting proximity events. Further, the proximity classification module 112.1 may be programmed to differentiate between true positives for desired user proximity events and false positives for objects larger than a typical user interaction instrument (e.g., finger, pen, stylus).
- The touch classification module 112.2 may calculate user touch(es) on the touch screen 140 and the touch characteristics (e.g., icon selection, gesture, etc.). The touch classification module 112.2 may calculate the location (or locations for multi-touch user interactions) and time of the user touch. The touch classification module 112.2 may be complementary to the type of touch screen 140 and TS sensor 142. For example, if the touch screen 140 and the TS sensor 142 are provided as a capacitive touch screen and a corresponding capacitive sensor (or grid of capacitive sensors), the touch classification module 112.2 may calculate changes in respective capacitive fields for detecting touch events. Further, the touch classification module 112.2 may be programmed to differentiate between true positives for desired user touch events and false positives for objects larger than a typical user interaction instrument (e.g., finger, pen, stylus).
- The force classification module 112.3 may calculate an amount of force corresponding to a user touch on the touch screen 140. The force classification module 112.3 may calculate how hard the user presses down, and for how long, with respect to a touch screen 140 contact. The force classification module 112.3 may be complementary to the type of touch screen 140 and TS sensor 142. For example, if the touch screen 140 and the TS sensor 142 are provided as a capacitive touch screen and a corresponding capacitive sensor (or grid of capacitive sensors), the force classification module 112.3 may calculate changes in respective capacitive fields for detecting force events. Further, the force classification module 112.3 may be programmed to differentiate between true positives for desired user force events and false positives for objects larger than a typical user interaction instrument (e.g., finger, pen, stylus).
- The haptic response search module 112.4 may receive proximity, touch, and/or force events as calculated by the modules 112.1-112.3, and may generate a haptic command based on the haptics profile data 114.1. For example, the haptic response search module 112.4 may match the calculated proximity, touch, and/or force event data to a stored haptic profile and may generate a haptic command associated with the matched haptic profile.
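- The profile search performed by the haptics response search module 112.4 lends itself to a table lookup over the stored LUTs. The following C sketch shows one way such a lookup might be organized; the event structure, the profile fields, and all values are illustrative assumptions rather than details taken from the patent.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical event and profile types, for illustration only. */
typedef enum { EVENT_PROXIMITY, EVENT_TOUCH, EVENT_FORCE } event_type_t;

typedef struct {
    event_type_t type;     /* classified by modules 112.1-112.3          */
    int16_t x, y, z;       /* z > 0: hover height; z < 0: force "depth"  */
    uint32_t timestamp_ms;
} ui_event_t;

typedef struct {
    event_type_t type;     /* event class this profile responds to */
    int16_t z_min, z_max;  /* Z range the profile covers           */
    uint8_t effect_id;     /* haptic effect to command             */
} haptic_profile_t;

/* Haptic profile data as it might be stored in memory (cf. 114.1). */
static const haptic_profile_t profile_lut[] = {
    { EVENT_PROXIMITY,  10,  50, 1 },  /* hover: gentle pre-effect */
    { EVENT_TOUCH,       0,   9, 2 },  /* contact: click effect    */
    { EVENT_FORCE,     -50,  -1, 3 },  /* hard press: alert effect */
};

/* Match an event against the stored profiles and return the haptic
 * command associated with the matched profile, or -1 if none match. */
int haptic_lookup(const ui_event_t *ev)
{
    for (size_t i = 0; i < sizeof(profile_lut) / sizeof(profile_lut[0]); i++) {
        if (profile_lut[i].type == ev->type &&
            ev->z >= profile_lut[i].z_min && ev->z <= profile_lut[i].z_max)
            return profile_lut[i].effect_id;
    }
    return -1;  /* no haptic response for this event */
}
```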
- Returning to FIG. 1(a), the UI controller 110 may receive the TS sensor results and may generate corresponding haptic commands based on stored haptic profiles. The UI controller 110 may be coupled to the haptics driver 120. Based on the haptic command from the UI controller 110, the haptics driver 120 may generate a corresponding drive signal. The drive signal may be an analog current or voltage signal.
- The haptics driver 120 may be coupled to the haptics actuator 130. The haptics actuator 130 may be embodied as piezoelectric elements, linear resonant actuators (LRAs), eccentric rotating mass actuators (ERMs), and/or other known actuator types. The haptics driver 120 may transmit the drive signal to the haptics actuator 130, causing it to vibrate according to the drive signal properties. The vibrations may be felt by the user, providing a vibro-tactile sensory feedback stimulus.
- In an embodiment, the haptics actuator 130 may include a mechanical system such as a motor that vibrates to generate the desired haptic effect. For example, the haptics actuator 130 may include a coil motor with a spring-loaded mass and a permanent magnet. The coil motor may cause the spring-loaded mass to vibrate to generate the haptic effect. The haptics actuator 130 may also include magnetic coils to generate the motion.
- In an embodiment, a plurality of haptic actuators may be provided in the device to generate a plurality of haptic effects at different parts of the device. The haptic actuators may be driven by the haptics driver 120 with the same drive signal or with multiple drive signals.
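- As a concrete illustration of the driver's role, a haptic command might be translated into drive-signal properties (amplitude, frequency, duration) before being applied to the actuator. The C sketch below uses assumed values; an LRA, for instance, would typically be driven near its resonant frequency.

```c
#include <stdint.h>

/* Hypothetical drive-signal parameters; all values are assumptions. */
typedef struct {
    uint16_t amplitude_mv;  /* drive amplitude in millivolts */
    uint16_t frequency_hz;  /* e.g., near an LRA's resonance */
    uint16_t duration_ms;
} drive_signal_t;

/* Map a haptic command (cf. haptic_lookup() above) to drive-signal
 * properties for the coupled actuator. */
drive_signal_t driver_build_signal(int effect_id)
{
    switch (effect_id) {
    case 1:  return (drive_signal_t){  800, 175, 15 }; /* light hover cue  */
    case 2:  return (drive_signal_t){ 2500, 175, 25 }; /* simulated click  */
    case 3:  return (drive_signal_t){ 3000, 175, 60 }; /* over-force alert */
    default: return (drive_signal_t){    0,   0,  0 }; /* idle             */
    }
}
```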
- FIG. 2 illustrates a simplified touch screen arrangement with capacitive sensors according to an embodiment of the present invention. FIG. 2(a) illustrates a capacitive sensor grid layout of the touch screen arrangement, FIG. 2(b) illustrates a cross-sectional view of the touch screen arrangement, and FIG. 2(c) illustrates capacitive fields of the capacitive sensor grid.
- The touch screen arrangement may include a touch screen 210, a plurality of capacitive sensors 220, and a cover 240. The capacitive sensors 220 may be provided in a grid fashion that overlaps the display panel 230. The cover 240 may protect the display panel. For example, the cover 240 may be provided as a glass cover.
- The capacitive sensors 220 may be arranged in a grid with multiple columns and rows. The grid may include m columns and n rows, thus generating an m×n array (say, 11×15). The size of the array may be designed to accommodate different screen sizes and/or the desired accuracy/precision level of the touch screen. Cross points (CS) of the sensor grid may be placed a distance (D) apart from each other. In an embodiment, each cross point CS, for example, may be 5 mm apart from its neighboring cross points.
- The capacitive sensors 220 may detect proximity events, touch events, and/or force events, as will be described below. The array of capacitive sensors 220 may be scanned at a scanning frequency. The scanning frequency may be programmable. For example, the scanning frequency may be set to 100 or 120 Hz. In an embodiment, the scanning frequency may be dynamically changed based on present conditions. For example, the scanning frequency may be dynamically changed based on a rate of approach as detected by the capacitive sensors 220 (e.g., 5× the rate of approach). Hence, the scanning frequency may increase as the rate of approach increases.
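- A minimal sketch of such rate-dependent scanning follows; the base rate matches the 120 Hz example above, while the gain and the upper clamp are assumed values chosen only to illustrate the scaling behavior.

```c
#include <stdint.h>

#define SCAN_HZ_BASE  120u   /* programmable base rate (e.g., 100 or 120 Hz) */
#define SCAN_HZ_MAX   960u   /* assumed hardware ceiling                     */
#define APPROACH_GAIN   5u   /* scale factor on the rate of approach         */

/* Raise the scanning frequency as the detected rate of approach
 * increases, clamped to what the sensor front end can sustain. */
uint32_t scan_frequency_hz(uint32_t approach_mm_per_s)
{
    uint32_t hz = SCAN_HZ_BASE + APPROACH_GAIN * approach_mm_per_s;
    return (hz > SCAN_HZ_MAX) ? SCAN_HZ_MAX : hz;
}
```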
- For proximity detection, each CS may detect changes in its capacitive field as shown in
FIG. 2( c). Further, a user's finger hovering over the touch screen may be sensed by multiple CSs. In theFIG. 2( c), CS 1.1 may detect a larger presence in its capacitive field as compared to it's neighboring CS 1.2. As a result, the code change of CS 1.1 may be higher than that of CS 1.2. From the sensor results, data representing the X,Y,Z coordinates of the finger location may be generated. X,Y location may correspond to the location of the CS(s) in the grid that detected the presence (i.e., code change), and the Z location may correspond to the amount of change detected. Moreover, based on one or more sets of X,Y,Z coordinates, other characteristics such as the rate of approach may be calculated. - The
capacitive sensors 220 may also detect location and time of actual touches. For touch detection, X,Y coordinates and the time of the touches may be generated based on the sensor results. In addition, other characteristics such as the type of touch (e.g., movement on the touch surface) may be calculated from one or more sets of scan results. Force detection by thecapacitive sensors 220 may also be performed by the sensors as will be described below. -
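- One plausible way to reduce per-cross-point code changes to X,Y,Z data is a code-weighted centroid over the grid, sketched below in C for the 11×15 array and 5 mm pitch described above. The Z output is left as the raw peak code change, since its calibration to a physical distance would be device-specific.

```c
#include <stdint.h>

#define ROWS 15
#define COLS 11

typedef struct { int32_t x_um, y_um, z_code; } coord_t;

/* Derive X,Y as the code-weighted centroid of one scan's code changes
 * (deltas from the no-user baseline) and Z as the peak change. */
coord_t locate(const uint16_t delta[ROWS][COLS])
{
    const int32_t pitch_um = 5000;  /* 5 mm between cross points */
    int64_t sum = 0, sx = 0, sy = 0;
    uint16_t peak = 0;

    for (int r = 0; r < ROWS; r++) {
        for (int c = 0; c < COLS; c++) {
            uint16_t d = delta[r][c];
            sum += d;
            sx  += (int64_t)d * c * pitch_um;
            sy  += (int64_t)d * r * pitch_um;
            if (d > peak) peak = d;
        }
    }
    if (sum == 0)
        return (coord_t){ -1, -1, 0 };  /* no presence detected */
    return (coord_t){ (int32_t)(sx / sum), (int32_t)(sy / sum), peak };
}
```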
- FIGS. 3(a)-(b) illustrate user interaction detection by the sensors. FIG. 3(a) illustrates a two-dimensional workspace 310 (i.e., UI map) without any user interaction in accordance with embodiments of the present invention. The workspace 310 is illustrated as including a plurality of icons 320 and buttons 330 that identify interactive elements of the workspace 310. The workspace 310 may include other areas that are not designated as interactive. For example, icons 320 may be spaced apart from each other by a certain separation distance. Further, other areas of the display may be unoccupied by content or occupied with display data that is non-interactive. Thus, non-interactive areas of the device may be designated as “dead zones” (DZs) for purposes of user interaction (shown in gray in the example of FIG. 3(a)).
- The sensors' detection may become more localized as the user's finger approaches and touches the screen. FIG. 3(b) illustrates the two-dimensional workspace 310 with a series of user interactions beginning with proximity detection, followed by touch detection and then force detection. At time t1, the user's finger is a certain distance above the workspace 310, and the detection area for t1 may be generalized to a broad region of the bottom-left corner. As the finger approaches, the detection may increase in accuracy and may become more localized. At time t2, the user's finger may be slightly above the button 330. At time t3, the user's finger may touch the screen at the button 330. And at time t4, the detection area may increase, registering a larger amount of force as compared to the touch at time t3. Based on the localization of the detection area, different haptic generation operations may be controlled and optimized.
- FIG. 4 illustrates a pre-charging haptic generation operation according to an embodiment of the present invention. FIG. 4 includes two plots. The top plot shows a user finger approaching the touch surface in a distance-versus-time graph, and the bottom plot shows the corresponding voltage across the haptic actuator in a voltage-versus-time graph. As the finger approaches the touch surface, the device may detect the location of the finger via proximity sensing. The device, consequently, may generate X,Y,Z coordinates based on the proximity results. Based on at least two sets of coordinates, the device may also calculate the rate of approach and/or the direction of the finger's movement. Hence, the device may anticipate the time and/or location of the touch.
- At time t1, the device may detect the finger at a predetermined distance (Threshold P) from the touch surface. At this time t1, the device may initiate pre-charging of the haptic actuator. The haptic actuator may be pre-charged according to a haptic profile for the anticipated time and location of the touch. At time t2, the device may detect the finger making contact with the touch surface via Threshold T. The device, consequently, may generate X,Y,Z coordinates based on the touch results. At this time t2, the device may drive the haptic actuator with a corresponding haptic effect voltage based on the haptic effect profile associated with the touch characteristics. Therefore, the device may generate the haptic effect faster upon touch screen contact because the haptic generating components were pre-charged, thereby reducing latency between the user touching the screen and feeling the corresponding haptic feedback.
- The Threshold P value may be programmable. In an embodiment, the Threshold P value may be dynamically adjustable based on finger movement characteristics. For example, the Threshold P value may be directly proportional to the rate of approach: as the rate of approach increases, the Threshold P value increases, and vice versa. As a result, the pre-charging time may be maintained independent of the rate of approach, allowing sufficient time for pre-charging the haptic actuator to the desired voltage level.
- In an embodiment of the present invention, different types of haptic events may be selected based in part on proximity, touch, and/or force events. For example, a set of different haptic effects may be generated based on different measured events such as the rate of approach, direction, location, force, etc.
FIG. 5 illustrates a multi-haptic effect generation operation according to an embodiment of the present invention.FIG. 5 includes two plots. The top plot shows a user finger approaching the touch surface in a distance versus time graph, and the bottom plot shows a corresponding voltage through the haptic actuator versus time graph.FIG. 5 illustrates different haptic effect generation based on different user interaction events as detected by the sensor(s) via thresholds. - At time t1, the device may detect the finger at a predetermined distance,
Threshold 1, from the touch surface. The device, consequently, may be generating X,Y,Z coordinates based on the proximity sensor results. At this time t1, the device may drive a haptic actuator to generate a first haptic effect according to a haptic profile associated with the finger location and/or movement characteristics (e.g., rate of approach). - At time t2, the device may detect the finger touching the touch surface with
Threshold 2. The device, consequently, may be generating X,Y coordinates based on the touch sensor results. At this time t2, the device may drive the haptic actuator to generate a second haptic effect according to a haptic profile associated with the touch location and/or movement characteristics (e.g., type of contact). - At time t3, the device may detect the force of the finger contact crossing a predetermined level with
Threshold 3. The device, consequently, may be generating X,Y,Z coordinates based on the force sensor results. At this time t3, the device may drive the haptic actuator to generate a third haptic effect according to a haptic profile for the finger touch location and/or movement characteristics (e.g., amount of force). The third haptic effect, for example, may be an alert to the user that he/she is pressing too hard on the touch screen. The same actuator or different actuators may used to generate the first, second, and/or third haptic effects. - In an embodiment, the haptic effect selection for different interaction events such as proximity, touch, and/or force events may be dynamically changed based on user interaction history. For example, in a text entry application, different users may enter text at different rates. If a user touches a first letter and the device initiates a haptic effect, then the user's moves toward another letter, the device may recognize the approaching finger and terminate the first haptic effect sufficiently early before the second letter is touched so as to minimize blur between successive haptic effects.
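- The three-threshold sequence of FIG. 5 can be read as a small state machine over the Z coordinate, with hover heights as positive Z and force “depth” as negative Z (as described for FIG. 6 below). The C sketch that follows uses assumed threshold values and a stub in place of the haptics driver.

```c
#include <stdint.h>

typedef enum { ST_IDLE, ST_NEAR, ST_TOUCH, ST_PRESS } state_t;

#define THRESHOLD_1_UM  10000    /* proximity: assumed 10 mm above surface */
#define THRESHOLD_2_UM      0    /* touch: contact with the surface        */
#define THRESHOLD_3_UM  (-3000)  /* force: assumed 3 mm of "virtual depth" */

/* Stub standing in for the haptics driver 120. */
static void drive_effect(int effect_id) { (void)effect_id; }

/* Advance the interaction state as the Z coordinate falls through the
 * thresholds, firing the first, second, and third haptic effects. */
state_t step(state_t s, int32_t z_um)
{
    switch (s) {
    case ST_IDLE:
        if (z_um <= THRESHOLD_1_UM) { drive_effect(1); return ST_NEAR; }
        return ST_IDLE;
    case ST_NEAR:
        if (z_um <= THRESHOLD_2_UM) { drive_effect(2); return ST_TOUCH; }
        return ST_NEAR;
    case ST_TOUCH:
        if (z_um <= THRESHOLD_3_UM) { drive_effect(3); return ST_PRESS; }
        return ST_TOUCH;
    case ST_PRESS:
    default:
        return (z_um > THRESHOLD_2_UM) ? ST_IDLE : s;  /* lift-off resets */
    }
}
```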
-
FIG. 6 illustrates operation of a haptics enabled device to detect force applied to a touch screen for use in accordance with embodiments of the present invention. As illustrated inFIG. 6( a), a user may press the touch screen lightly with his/her finger. In this case, there may be a small deflection of the user's finger at the point of contact, which may be registered on the touch screen as an area of contact. InFIG. 6( a), the area may be considered as a circle having radius R. A touch sensor may derive a force at the point of contact from the calculated area. InFIG. 6( b), the user presses the touch screen with greater force, causing a greater amount of deformation in the user's finger. The user's finger, therefore, may register a greater area of contact than in theFIG. 6( a) case, which the touch sensor may use to derive a corresponding higher value of force. - In an embodiment, the force sensor may represent force as a distance value in the Z plane. The force sensor may calculate an area of contact between the user and the touch screen and convert the value to a distance value in the Z plane. If, in the proximity and presence detection operations, distance values are represented as positive Z values, distance values representing user force may be represented as negative Z values. See,
FIG. 6(b). In the example of FIG. 6(b), the negative Z value models a hypothetical depth of an operator's touch, based on deformation of the user's finger, rather than an actual depth of touch. - In an embodiment, haptic effects may be pre-charged and driven before the user touch based on proximity detection. The device, for example, may generate a "bubble" effect, which may correspond to simulating a clicking functionality using haptic effects.
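- As an illustrative aside, the area-to-force conversion of FIG. 6 and the signed-Z convention can be sketched as follows. This is a minimal sketch under stated assumptions; the linear calibration constant K_NEWTONS_PER_MM2 and the function names are hypothetical, not taken from the patent text:

```python
import math

# Hypothetical calibration: force assumed proportional to contact area.
K_NEWTONS_PER_MM2 = 0.05

def force_from_contact_radius(radius_mm):
    """Derive a force estimate from the radius R of the contact circle."""
    area_mm2 = math.pi * radius_mm ** 2
    return K_NEWTONS_PER_MM2 * area_mm2

def z_coordinate(distance_mm, contact_radius_mm):
    """Unified Z axis: positive Z above the screen, negative Z models force.

    A hovering finger reports its distance as a positive Z value; a touching
    finger reports a hypothetical 'depth' derived from its contact area.
    """
    if distance_mm > 0:
        return distance_mm                                 # proximity event
    return -force_from_contact_radius(contact_radius_mm)   # force event
```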
FIGS. 7(a)-7(d) illustrate a bubble effect generation operation according to an embodiment of the present invention.
FIG. 7(a) illustrates a state of the touch screen prior to detection. As the user's finger approaches the touch screen, it enters the "field of view" of the touch screen and is identified by the proximity sensor. In response, a haptics driver may pre-charge a haptics actuator to cause the touch screen to deflect toward the user's finger by a predetermined amount, shown as ΔZ in FIGS. 7(a)-(d). Thus, the touch screen may be deflected toward the user's finger by the time the finger makes contact with the touch screen, as shown in FIG. 7(b). When the touch sensor determines that the user's finger has made contact, it may initiate the haptic effect. As shown in FIG. 7(c), a mechanical button click may be simulated, for example, by removing the pre-charge effect and inducing the touch screen to return to its default level (shown as "Z"). After the retraction of the touch screen, the user's finger may fall to the surface of the screen at the Z level, shown in FIG. 7(d). The click effect may be induced by the user feeling mechanical resistance at the first point of contact, with the screen deflected forward (FIG. 7(b)), and then at the second point of contact, with the screen at the rest position (FIG. 7(d)). This simulates a mechanical compression (i.e., the bubble effect). - Of course, the proximity-based deflection operations are not limited to click effects. Vibration effects may be induced by deflecting the screen forward prior to initial contact, then oscillating the screen forward and backward after contact is made. A variety of different haptic effects may be used in connection with proximity detection operations.
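- For illustration, the pre-charge/release sequence of FIGS. 7(a)-7(d) can be sketched as a small event handler. This is a hypothetical sketch; the driver interface (pre_charge, release) and the deflection amount are assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the FIG. 7 "bubble" click sequence.
DELTA_Z_MM = 0.2  # assumed forward deflection of the screen on approach

class BubbleEffect:
    def __init__(self, haptics_driver):
        self.driver = haptics_driver  # assumed to expose pre_charge()/release()
        self.armed = False

    def on_proximity(self, distance_mm):
        """Finger enters the field of view: deflect the screen toward it."""
        if not self.armed:
            self.driver.pre_charge(DELTA_Z_MM)  # FIG. 7(b): screen at Z + dZ
            self.armed = True

    def on_touch(self):
        """First contact: drop the pre-charge so the screen snaps back to its
        default level Z and the finger falls to the rest position (the click)."""
        if self.armed:
            self.driver.release()  # FIGS. 7(c)-(d): screen retracts to Z
            self.armed = False
```

A vibration variant could instead oscillate the deflection after contact rather than releasing it once.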
- The foregoing description refers to finger touches for illustration purposes only, and it should be understood that embodiments of the present invention are applicable to other types of user interaction, such as with a pen, stylus, etc.
- Those skilled in the art will appreciate from the foregoing description that the present invention may be implemented in a variety of forms, and that the various embodiments may be implemented alone or in combination. Therefore, while the embodiments of the present invention have been described in connection with particular examples thereof, the true scope of the embodiments and/or methods of the present invention should not be so limited, since other modifications will become apparent to the skilled practitioner upon a study of the drawings, specification, and following claims.
- Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate arrays (FPGA), logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth. Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.
- Some embodiments may be implemented, for example, using a computer-readable medium or article which may store an instruction or a set of instructions that, if executed by a machine, may cause the machine to perform a method and/or operations in accordance with the embodiments. Such a machine may include, for example, any suitable processing platform, computing platform, computing device, processing device, computing system, processing system, computer, processor, or the like, and may be implemented using any suitable combination of hardware and/or software. The computer-readable medium or article may include, for example, any suitable type of memory unit, memory device, memory article, memory medium, storage device, storage article, storage medium and/or storage unit, for example, memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analog media, hard disk, floppy disk, Compact Disc Read Only Memory (CD-ROM), Compact Disc Recordable (CD-R), Compact Disc Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of Digital Versatile Disc (DVD), a tape, a cassette, or the like. The instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, encrypted code, and the like, implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.
Claims (20)
1. A device, comprising:
a haptic driver to drive a coupled actuator causing the actuator to generate a vibratory haptic effect;
a touch screen to display a user interface, wherein the touch screen includes a sensor to detect user interaction with the touch screen within a predetermined range above the touch screen; and
a controller to calculate a proximity event based on the detected user interaction above the touch screen, and to control haptic driver operations according to the proximity event.
2. The device of claim 1 , wherein the proximity event includes a rate of approach.
3. The device of claim 1 , wherein the sensor comprises a capacitive sensor grid that is scanned at a scanning frequency.
4. The device of claim 3 , wherein the scanning frequency is dynamically adjusted based on prior calculated user interaction properties.
5. The device of claim 1 , wherein the controller is configured to pre-charge the haptic driver to a voltage level based on the proximity event.
6. The device of claim 1 , wherein the controller is further configured to calculate touch and force events.
7. The device of claim 6 , wherein the device is configured to generate multiple haptic effects based on different proximity, touch, and/or force events.
8. The device of claim 6 , wherein force events are detected based on an area of user touch on the touch screen.
9. A method of generating haptic effects, comprising:
detecting a user interaction above a touch surface within a predetermined range;
calculating location coordinates of the user interaction;
calculating user interaction properties based on the location coordinates; and
applying voltage through a haptic actuator based on the user interaction properties.
10. The method of claim 9 , further comprising:
pre-charging the haptic actuator to a first voltage level based on the user interaction properties;
detecting a user touch on the touch surface;
calculating touch location coordinates of the user touch; and
driving the haptic actuator from the first voltage level to generate a haptic effect based on the user touch.
11. The method of claim 10 , further comprising:
detecting an amount of force of the user touch; and
driving the haptic actuator to generate a second haptic effect based on the amount of force.
12. The method of claim 11 , wherein the detected amount of force is proportional to an area of the user touch on the touch surface.
13. The method of claim 9 , wherein the user interaction properties include a rate of approach.
14. The method of claim 9 , wherein the detecting is performed by scan reads of a capacitive sensor grid at a scanning frequency.
15. The method of claim 14 , wherein the scanning frequency is dynamically adjusted based on prior calculated user interaction properties.
16. A user interface controller, comprising:
a sensor input to receive sensor data related to user interaction above a touch screen within a predetermined range;
a memory to store program instructions and a plurality of haptic profiles;
a processor to calculate user interaction properties from the sensor data, to match a haptic profile from the memory to the user interaction properties, and to generate a haptic command associated with the haptic profile; and
a haptic driver output to send the haptic command.
17. The user interface controller of claim 16 , wherein the user interaction properties include a rate of approach.
18. The user interface controller of claim 16 , wherein the haptic command includes an instruction to pre-charge an actuator.
19. The user interface controller of claim 16 , wherein the sensor data also relates to a user touch on the touch screen and to an amount of force of the user touch.
20. The user interface controller of claim 19 , wherein the processor is configured to generate multiple haptic commands based on the sensor data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/434,623 US20120249474A1 (en) | 2011-04-01 | 2012-03-29 | Proximity and force detection for haptic effect generation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161470764P | 2011-04-01 | 2011-04-01 | |
US13/434,623 US20120249474A1 (en) | 2011-04-01 | 2012-03-29 | Proximity and force detection for haptic effect generation |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120249474A1 true US20120249474A1 (en) | 2012-10-04 |
Family
ID=46926543
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/433,069 Abandoned US20120249461A1 (en) | 2011-04-01 | 2012-03-28 | Dedicated user interface controller for feedback responses |
US13/433,105 Active 2032-08-25 US8937603B2 (en) | 2011-04-01 | 2012-03-28 | Method and apparatus for haptic vibration response profiling and feedback |
US13/434,623 Abandoned US20120249474A1 (en) | 2011-04-01 | 2012-03-29 | Proximity and force detection for haptic effect generation |
US13/434,677 Abandoned US20120249475A1 (en) | 2011-04-01 | 2012-03-29 | 3d user interface control |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/433,069 Abandoned US20120249461A1 (en) | 2011-04-01 | 2012-03-28 | Dedicated user interface controller for feedback responses |
US13/433,105 Active 2032-08-25 US8937603B2 (en) | 2011-04-01 | 2012-03-28 | Method and apparatus for haptic vibration response profiling and feedback |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/434,677 Abandoned US20120249475A1 (en) | 2011-04-01 | 2012-03-29 | 3d user interface control |
Country Status (2)
Country | Link |
---|---|
US (4) | US20120249461A1 (en) |
WO (4) | WO2012135373A2 (en) |
Cited By (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110060499A1 (en) * | 2009-09-04 | 2011-03-10 | Hyundai Motor Japan R&D Center, Inc. | Operation system for vehicle |
US20120319938A1 (en) * | 2011-06-20 | 2012-12-20 | Immersion Corporation | Haptic theme framework |
US20130154948A1 (en) * | 2011-12-14 | 2013-06-20 | Synaptics Incorporated | Force sensing input device and method for determining force information |
US20130335338A1 (en) * | 2012-06-15 | 2013-12-19 | Research In Motion Limited | Electronic device including touch-sensitive display and method of controlling same |
US20140092003A1 (en) * | 2012-09-28 | 2014-04-03 | Min Liu | Direct haptic feedback |
US20140104320A1 (en) * | 2012-10-17 | 2014-04-17 | Perceptive Pixel, Inc. | Controlling Virtual Objects |
US20140230575A1 (en) * | 2013-02-17 | 2014-08-21 | Microsoft Corporation | Piezo-actuated virtual buttons for touch surfaces |
US20140232944A1 (en) * | 2011-10-19 | 2014-08-21 | Thomson Licensing | Remote control with feedback for blind navigation |
US20140368445A1 (en) * | 2013-06-17 | 2014-12-18 | Lenovo (Singapore) Pte. Ltd. | Simulation of control areas on touch surface using haptic feedback |
US9182823B2 (en) | 2014-01-21 | 2015-11-10 | Lenovo (Singapore) Pte. Ltd. | Actuating haptic element of a touch-sensitive device |
US9195351B1 (en) * | 2011-09-28 | 2015-11-24 | Amazon Technologies, Inc. | Capacitive stylus |
US20160070353A1 (en) * | 2012-02-01 | 2016-03-10 | Immersion Corporation | Eccentric rotating mass actuator optimization for haptic effects |
US20160103500A1 (en) * | 2013-05-21 | 2016-04-14 | Stanley Innovation, Inc. | System and method for a human machine interface utilizing near-field quasi-state electrical field sensing technology |
US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9471143B2 (en) * | 2014-01-20 | 2016-10-18 | Lenovo (Singapore) Pte. Ltd | Using haptic feedback on a touch device to provide element location indications |
WO2017053761A1 (en) * | 2015-09-25 | 2017-03-30 | Immersion Corporation | Haptic effects design system |
WO2017111928A1 (en) * | 2015-12-22 | 2017-06-29 | Intel Corporation | Reduction of touchscreen bounce |
US20170185151A1 (en) * | 2015-12-28 | 2017-06-29 | Microsoft Technology Licensing, Llc | Haptic feedback for non-touch surface interaction |
WO2018002189A1 (en) * | 2016-06-29 | 2018-01-04 | Dav | Control method and control interface for a motor vehicle |
WO2018002186A1 (en) * | 2016-06-29 | 2018-01-04 | Dav | Control method and control interface for a motor vehicle |
US10061385B2 (en) | 2016-01-22 | 2018-08-28 | Microsoft Technology Licensing, Llc | Haptic feedback for a touch input device |
US20180356891A1 (en) * | 2015-11-27 | 2018-12-13 | Kyocera Corporation | Tactile sensation providing apparatus and tactile sensation providing method |
US10671167B2 (en) | 2016-09-01 | 2020-06-02 | Apple Inc. | Electronic device including sensed location based driving of haptic actuators and related methods |
US10890978B2 (en) * | 2016-05-10 | 2021-01-12 | Apple Inc. | Electronic device with an input device having a haptic engine |
US10936071B2 (en) | 2018-08-30 | 2021-03-02 | Apple Inc. | Wearable electronic device with haptic rotatable input |
US10966007B1 (en) | 2018-09-25 | 2021-03-30 | Apple Inc. | Haptic output system |
US20210149494A1 (en) * | 2017-02-13 | 2021-05-20 | Snap Inc. | Generating a response that depicts haptic characteristics |
US11024135B1 (en) | 2020-06-17 | 2021-06-01 | Apple Inc. | Portable electronic device having a haptic button assembly |
US11054932B2 (en) | 2017-09-06 | 2021-07-06 | Apple Inc. | Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module |
US11098951B2 (en) | 2018-09-09 | 2021-08-24 | Ultrahaptics Ip Ltd | Ultrasonic-assisted liquid manipulation |
US11169610B2 (en) | 2019-11-08 | 2021-11-09 | Ultraleap Limited | Tracking techniques in haptic systems |
US11189140B2 (en) | 2016-01-05 | 2021-11-30 | Ultrahaptics Ip Ltd | Calibration and detection techniques in haptic systems |
US11204644B2 (en) * | 2014-09-09 | 2021-12-21 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US11276281B2 (en) | 2015-02-20 | 2022-03-15 | Ultrahaptics Ip Ltd | Algorithm improvements in a haptic system |
US11307664B2 (en) | 2016-08-03 | 2022-04-19 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
US11360546B2 (en) | 2017-12-22 | 2022-06-14 | Ultrahaptics Ip Ltd | Tracking in haptic systems |
US11374586B2 (en) | 2019-10-13 | 2022-06-28 | Ultraleap Limited | Reducing harmonic distortion by dithering |
US11378997B2 (en) | 2018-10-12 | 2022-07-05 | Ultrahaptics Ip Ltd | Variable phase and frequency pulse-width modulation technique |
US20220334658A1 (en) * | 2021-04-20 | 2022-10-20 | Microsoft Technology Licensing, Llc | Stylus haptic component arming and power consumption |
US11529650B2 (en) | 2018-05-02 | 2022-12-20 | Ultrahaptics Ip Ltd | Blocking plate structure for improved acoustic transmission efficiency |
US11531395B2 (en) | 2017-11-26 | 2022-12-20 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
US11543507B2 (en) | 2013-05-08 | 2023-01-03 | Ultrahaptics Ip Ltd | Method and apparatus for producing an acoustic field |
US11550432B2 (en) | 2015-02-20 | 2023-01-10 | Ultrahaptics Ip Ltd | Perceptions in a haptic system |
US11553295B2 (en) | 2019-10-13 | 2023-01-10 | Ultraleap Limited | Dynamic capping with virtual microphones |
US11550395B2 (en) | 2019-01-04 | 2023-01-10 | Ultrahaptics Ip Ltd | Mid-air haptic textures |
US11704983B2 (en) | 2017-12-22 | 2023-07-18 | Ultrahaptics Ip Ltd | Minimizing unwanted responses in haptic systems |
US11715453B2 (en) | 2019-12-25 | 2023-08-01 | Ultraleap Limited | Acoustic transducer structures |
US11727790B2 (en) | 2015-07-16 | 2023-08-15 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
US11816267B2 (en) | 2020-06-23 | 2023-11-14 | Ultraleap Limited | Features of airborne ultrasonic fields |
US11842517B2 (en) | 2019-04-12 | 2023-12-12 | Ultrahaptics Ip Ltd | Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network |
US11886639B2 (en) | 2020-09-17 | 2024-01-30 | Ultraleap Limited | Ultrahapticons |
US11955109B2 (en) | 2016-12-13 | 2024-04-09 | Ultrahaptics Ip Ltd | Driving techniques for phased-array systems |
Families Citing this family (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012256214A (en) * | 2011-06-09 | 2012-12-27 | Sony Corp | Information processing device, information processing method, and program |
US20130009915A1 (en) | 2011-07-08 | 2013-01-10 | Nokia Corporation | Controlling responsiveness to user inputs on a touch-sensitive display |
IL216118A0 (en) | 2011-11-03 | 2012-02-29 | Google Inc | Customer support solution recommendation system |
KR101873759B1 (en) * | 2012-04-10 | 2018-08-02 | 엘지전자 주식회사 | Display apparatus and method for controlling thereof |
US10281986B2 (en) * | 2012-05-03 | 2019-05-07 | Georgia Tech Research Corporation | Methods, controllers and computer program products for accessibility to computing devices |
WO2013170099A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Calibration of haptic feedback systems for input devices |
CN203324713U (en) * | 2012-05-09 | 2013-12-04 | 布里斯托尔D/B/A远程自动化解决方案公司 | Device for displaying information via process control equipment |
US9977499B2 (en) | 2012-05-09 | 2018-05-22 | Apple Inc. | Thresholds for determining feedback in computing devices |
US20130318437A1 (en) * | 2012-05-22 | 2013-11-28 | Samsung Electronics Co., Ltd. | Method for providing ui and portable apparatus applying the same |
WO2013188307A2 (en) | 2012-06-12 | 2013-12-19 | Yknots Industries Llc | Haptic electromagnetic actuator |
US9886116B2 (en) | 2012-07-26 | 2018-02-06 | Apple Inc. | Gesture and touch input detection through force sensing |
JP6308132B2 (en) * | 2012-09-11 | 2018-04-11 | 日本電気株式会社 | Electronic device, control method of electronic device, and program |
US9874972B2 (en) * | 2012-09-25 | 2018-01-23 | Synaptics Incorporated | Systems and methods for decoupling image generation rate from reporting rate in capacitive sensing |
KR20140047897A (en) * | 2012-10-15 | 2014-04-23 | 삼성전자주식회사 | Method for providing for touch effect and an electronic device thereof |
JP6498863B2 (en) * | 2012-12-13 | 2019-04-10 | イマージョン コーポレーションImmersion Corporation | Haptic system with increased LRA bandwidth |
US9202350B2 (en) * | 2012-12-19 | 2015-12-01 | Nokia Technologies Oy | User interfaces and associated methods |
KR102044826B1 (en) * | 2013-01-02 | 2019-11-14 | 삼성전자 주식회사 | Method for providing function of mouse and terminal implementing the same |
US10175874B2 (en) * | 2013-01-04 | 2019-01-08 | Samsung Electronics Co., Ltd. | Display system with concurrent multi-mode control mechanism and method of operation thereof |
CN103970291B (en) * | 2013-01-31 | 2018-08-14 | 索尼公司 | Mobile terminal |
US9304587B2 (en) | 2013-02-13 | 2016-04-05 | Apple Inc. | Force sensing mouse |
JP6032364B2 (en) | 2013-06-26 | 2016-11-24 | 富士通株式会社 | DRIVE DEVICE, ELECTRONIC DEVICE, AND DRIVE CONTROL PROGRAM |
WO2014207842A1 (en) * | 2013-06-26 | 2014-12-31 | 富士通株式会社 | Drive device, electronic apparatus, and drive control program |
EP3019943A4 (en) * | 2013-07-12 | 2017-05-31 | Tactual Labs Co. | Reducing control response latency with defined cross-control behavior |
US11229239B2 (en) * | 2013-07-19 | 2022-01-25 | Rai Strategic Holdings, Inc. | Electronic smoking article with haptic feedback |
US9520036B1 (en) * | 2013-09-18 | 2016-12-13 | Amazon Technologies, Inc. | Haptic output generation with dynamic feedback control |
US9213408B2 (en) | 2013-10-08 | 2015-12-15 | Immersion Corporation | Generating haptic effects while minimizing cascading |
TWI606386B (en) * | 2013-10-31 | 2017-11-21 | 富智康(香港)有限公司 | Page switching system, touch device and page switching method |
JP2015121983A (en) * | 2013-12-24 | 2015-07-02 | 京セラ株式会社 | Tactile sensation presentation device |
US20150242037A1 (en) | 2014-01-13 | 2015-08-27 | Apple Inc. | Transparent force sensor with strain relief |
US9817489B2 (en) | 2014-01-27 | 2017-11-14 | Apple Inc. | Texture capture stylus and method |
US20150323994A1 (en) * | 2014-05-07 | 2015-11-12 | Immersion Corporation | Dynamic haptic effect modification |
US9323331B2 (en) * | 2014-05-21 | 2016-04-26 | International Business Machines Corporation | Evaluation of digital content using intentional user feedback obtained through haptic interface |
US10146318B2 (en) | 2014-06-13 | 2018-12-04 | Thomas Malzbender | Techniques for using gesture recognition to effectuate character selection |
KR102294193B1 (en) | 2014-07-16 | 2021-08-26 | 삼성전자주식회사 | Apparatus and method for supporting computer aided diagonosis based on probe speed |
CN106796456B (en) | 2014-07-30 | 2019-10-22 | 惠普发展公司,有限责任合伙企业 | Detector for display |
CN115963922A (en) | 2014-09-02 | 2023-04-14 | 苹果公司 | Semantic framework for variable haptic output |
US10297119B1 (en) | 2014-09-02 | 2019-05-21 | Apple Inc. | Feedback device in an electronic device |
US9939901B2 (en) | 2014-09-30 | 2018-04-10 | Apple Inc. | Haptic feedback assembly |
EP3002666A1 (en) | 2014-10-02 | 2016-04-06 | Huawei Technologies Co., Ltd. | Interaction method for user interfaces |
US9400570B2 (en) | 2014-11-14 | 2016-07-26 | Apple Inc. | Stylus with inertial sensor |
US9846484B2 (en) * | 2014-12-04 | 2017-12-19 | Immersion Corporation | Systems and methods for controlling haptic signals |
US9575573B2 (en) | 2014-12-18 | 2017-02-21 | Apple Inc. | Stylus with touch sensor |
US9798409B1 (en) | 2015-03-04 | 2017-10-24 | Apple Inc. | Multi-force input device |
US9645647B2 (en) | 2015-05-13 | 2017-05-09 | Immersion Corporation | Systems and methods for haptic feedback for modular devices |
EP3314369B1 (en) | 2015-06-26 | 2021-07-21 | SABIC Global Technologies B.V. | Electromechanical actuators for haptic feedback in electronic devices |
US10109161B2 (en) * | 2015-08-21 | 2018-10-23 | Immersion Corporation | Haptic driver with attenuation |
US10516348B2 (en) | 2015-11-05 | 2019-12-24 | Mems Drive Inc. | MEMS actuator package architecture |
KR102181938B1 (en) * | 2016-04-19 | 2020-11-23 | 니폰 덴신 덴와 가부시끼가이샤 | Pseudo force generating device |
DK179489B1 (en) | 2016-06-12 | 2019-01-04 | Apple Inc. | Devices, methods and graphical user interfaces for providing haptic feedback |
DK179823B1 (en) | 2016-06-12 | 2019-07-12 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
DK179802B1 (en) * | 2016-09-06 | 2019-06-26 | Apple Inc. | Devices, methods and graphical user interfaces for generating tactile outputs |
US10606355B1 (en) * | 2016-09-06 | 2020-03-31 | Apple Inc. | Haptic architecture in a portable electronic device |
DK201670720A1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
DK179278B1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, methods and graphical user interfaces for haptic mixing |
CN106980362A (en) | 2016-10-09 | 2017-07-25 | 阿里巴巴集团控股有限公司 | Input method and device based on virtual reality scenario |
KR102629409B1 (en) * | 2016-11-11 | 2024-01-26 | 삼성전자주식회사 | Method for providing object information and electronic device thereof |
US11494986B2 (en) * | 2017-04-20 | 2022-11-08 | Samsung Electronics Co., Ltd. | System and method for two dimensional application usage in three dimensional virtual reality environment |
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
DK201770372A1 (en) | 2017-05-16 | 2019-01-08 | Apple Inc. | Tactile feedback for locked device user interfaces |
US10712930B2 (en) | 2017-05-28 | 2020-07-14 | International Business Machines Corporation | 3D touch based user interface value pickers |
US10871829B2 (en) | 2017-12-05 | 2020-12-22 | Tactai, Inc. | Touch enabling process, haptic accessory, and core haptic engine to enable creation and delivery of tactile-enabled experiences with virtual objects |
US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US10599221B2 (en) | 2018-06-15 | 2020-03-24 | Immersion Corporation | Systems, devices, and methods for providing limited duration haptic effects |
US11269415B2 (en) | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
CN112639685B (en) * | 2018-09-04 | 2024-03-08 | 苹果公司 | Display device sharing and interaction in Simulated Reality (SR) |
US10831276B2 (en) | 2018-09-07 | 2020-11-10 | Apple Inc. | Tungsten frame of a haptic feedback module for a portable electronic device |
US10852830B2 (en) * | 2018-09-11 | 2020-12-01 | Apple Inc. | Power efficient, dynamic management of haptic module mechanical offset |
GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
US12035445B2 (en) | 2019-03-29 | 2024-07-09 | Cirrus Logic Inc. | Resonant tracking of an electromagnetic load |
US10828672B2 (en) | 2019-03-29 | 2020-11-10 | Cirrus Logic, Inc. | Driver circuitry |
US11644370B2 (en) * | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
US11283337B2 (en) | 2019-03-29 | 2022-03-22 | Cirrus Logic, Inc. | Methods and systems for improving transducer dynamics |
US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
CN114008569A (en) | 2019-06-21 | 2022-02-01 | 思睿逻辑国际半导体有限公司 | Method and apparatus for configuring a plurality of virtual buttons on a device |
US11921923B2 (en) * | 2019-07-30 | 2024-03-05 | Maxim Integrated Products, Inc. | Oscillation reduction in haptic vibrators by minimization of feedback acceleration |
US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
WO2021141936A1 (en) * | 2020-01-06 | 2021-07-15 | Tactai, Inc. | Haptic waveform generation and rendering at interface device |
US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
US12244253B2 (en) | 2020-04-16 | 2025-03-04 | Cirrus Logic Inc. | Restricting undesired movement of a haptic actuator |
US11567575B2 (en) * | 2021-06-14 | 2023-01-31 | Microsoft Technology Licensing, Llc | Haptic response control |
US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110291976A1 (en) * | 2009-03-12 | 2011-12-01 | Ricoh Company, Ltd | Touch panel device, display device equipped with touch panel device, and control method of touch panel device |
US20120056825A1 (en) * | 2010-03-16 | 2012-03-08 | Immersion Corporation | Systems And Methods For Pre-Touch And True Touch |
US20120105357A1 (en) * | 2010-10-31 | 2012-05-03 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Capacitive Touchscreen System with Reduced Power Consumption Using Modal Focused Scanning |
Family Cites Families (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5825308A (en) * | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
US6411276B1 (en) | 1996-11-13 | 2002-06-25 | Immersion Corporation | Hybrid control of haptic feedback for host computer and interface device |
DE20080209U1 (en) * | 1999-09-28 | 2001-08-09 | Immersion Corp | Control of haptic sensations for interface devices with vibrotactile feedback |
US7730401B2 (en) | 2001-05-16 | 2010-06-01 | Synaptics Incorporated | Touch screen with user interface enhancement |
US11275405B2 (en) * | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
GB0323570D0 (en) * | 2003-10-08 | 2003-11-12 | Harald Philipp | Touch-sensitivity control panel |
US7295015B2 (en) * | 2004-02-19 | 2007-11-13 | Brooks Automation, Inc. | Ionization gauge |
US7956846B2 (en) | 2006-01-05 | 2011-06-07 | Apple Inc. | Portable electronic device with content-dependent touch sensitivity |
US8681098B2 (en) | 2008-04-24 | 2014-03-25 | Oblong Industries, Inc. | Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes |
US8612024B2 (en) | 2006-02-24 | 2013-12-17 | Medtronic, Inc. | User interface with 3D environment for configuring stimulation therapy |
US7890863B2 (en) | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
US8103109B2 (en) | 2007-06-19 | 2012-01-24 | Microsoft Corporation | Recognizing hand poses and/or object classes |
WO2009008686A2 (en) | 2007-07-11 | 2009-01-15 | Eui Jin Oh | Data input device by detecting finger's moving and the input process thereof |
US20090066660A1 (en) * | 2007-09-06 | 2009-03-12 | Ure Michael J | Interface with and communication between mobile electronic devices |
KR101424259B1 (en) | 2007-08-22 | 2014-07-31 | 삼성전자주식회사 | Method and apparatus for providing input feedback in portable terminal |
US8917247B2 (en) | 2007-11-20 | 2014-12-23 | Samsung Electronics Co., Ltd. | External device identification method and apparatus in a device including a touch spot, and computer-readable recording mediums having recorded thereon programs for executing the external device identification method in a device including a touch spot |
KR20090066368A (en) | 2007-12-20 | 2009-06-24 | 삼성전자주식회사 | A mobile terminal having a touch screen and a method of controlling the function thereof |
JP5166955B2 (en) * | 2008-04-24 | 2013-03-21 | キヤノン株式会社 | Information processing apparatus, information processing method, and information processing program |
US20090279107A1 (en) | 2008-05-09 | 2009-11-12 | Analog Devices, Inc. | Optical distance measurement by triangulation of an active transponder |
US9285459B2 (en) | 2008-05-09 | 2016-03-15 | Analog Devices, Inc. | Method of locating an object in 3D |
US8099332B2 (en) | 2008-06-06 | 2012-01-17 | Apple Inc. | User interface for application management for a mobile device |
US20090309825A1 (en) * | 2008-06-13 | 2009-12-17 | Sony Ericsson Mobile Communications Ab | User interface, method, and computer program for controlling apparatus, and apparatus |
US8174372B2 (en) | 2008-06-26 | 2012-05-08 | Immersion Corporation | Providing haptic feedback on a touch surface |
KR101014263B1 (en) | 2008-09-04 | 2011-02-16 | 삼성전기주식회사 | Tactile sensor |
KR20100036850A (en) | 2008-09-30 | 2010-04-08 | 삼성전기주식회사 | Touch panel apparatus using tactile sensor |
KR101021440B1 (en) | 2008-11-14 | 2011-03-15 | 한국표준과학연구원 | Touch input device, mobile device using same and control method thereof |
US20100134409A1 (en) | 2008-11-30 | 2010-06-03 | Lenovo (Singapore) Pte. Ltd. | Three-dimensional user interface |
US9746544B2 (en) | 2008-12-03 | 2017-08-29 | Analog Devices, Inc. | Position measurement systems using position sensitive detectors |
US8823518B2 (en) * | 2008-12-08 | 2014-09-02 | Motorola Solutions, Inc. | Method of sensor cluster processing for a communication device |
US7843277B2 (en) | 2008-12-16 | 2010-11-30 | Immersion Corporation | Haptic feedback generation based on resonant frequency |
US8686952B2 (en) | 2008-12-23 | 2014-04-01 | Apple Inc. | Multi touch with multi haptics |
US8291348B2 (en) | 2008-12-31 | 2012-10-16 | Hewlett-Packard Development Company, L.P. | Computing device and method for selecting display regions responsive to non-discrete directional input actions and intelligent content analysis |
US8760413B2 (en) | 2009-01-08 | 2014-06-24 | Synaptics Incorporated | Tactile surface |
CN102802509B (en) | 2009-05-27 | 2017-06-09 | 美国亚德诺半导体公司 | Multiuse optical sensor |
US8279197B2 (en) | 2009-08-25 | 2012-10-02 | Pixart Imaging Inc. | Method and apparatus for detecting defective traces in a mutual capacitance touch sensing device |
KR20110031797A (en) * | 2009-09-21 | 2011-03-29 | 삼성전자주식회사 | Input device and method of mobile terminal |
US8487759B2 (en) * | 2009-09-30 | 2013-07-16 | Apple Inc. | Self adapting haptic device |
KR101120894B1 (en) | 2009-10-20 | 2012-02-27 | 삼성전기주식회사 | Haptic feedback device and electronic device |
US9104275B2 (en) | 2009-10-20 | 2015-08-11 | Lg Electronics Inc. | Mobile terminal to display an object on a perceived 3D space |
US9043732B2 (en) | 2010-10-21 | 2015-05-26 | Nokia Corporation | Apparatus and method for user input for controlling displayed information |
US9164586B2 (en) * | 2012-11-21 | 2015-10-20 | Novasentis, Inc. | Haptic system with localized response |
2012
- 2012-03-28 US US13/433,069 patent/US20120249461A1/en not_active Abandoned
- 2012-03-28 WO PCT/US2012/030994 patent/WO2012135373A2/en active Application Filing
- 2012-03-28 WO PCT/US2012/031003 patent/WO2012135378A1/en active Application Filing
- 2012-03-28 US US13/433,105 patent/US8937603B2/en active Active
- 2012-03-29 WO PCT/US2012/031272 patent/WO2012135532A1/en active Application Filing
- 2012-03-29 WO PCT/US2012/031279 patent/WO2012135534A1/en active Application Filing
- 2012-03-29 US US13/434,623 patent/US20120249474A1/en not_active Abandoned
- 2012-03-29 US US13/434,677 patent/US20120249475A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110291976A1 (en) * | 2009-03-12 | 2011-12-01 | Ricoh Company, Ltd | Touch panel device, display device equipped with touch panel device, and control method of touch panel device |
US20120056825A1 (en) * | 2010-03-16 | 2012-03-08 | Immersion Corporation | Systems And Methods For Pre-Touch And True Touch |
US20120105357A1 (en) * | 2010-10-31 | 2012-05-03 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Capacitive Touchscreen System with Reduced Power Consumption Using Modal Focused Scanning |
Cited By (97)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110060499A1 (en) * | 2009-09-04 | 2011-03-10 | Hyundai Motor Japan R&D Center, Inc. | Operation system for vehicle |
US8849506B2 (en) * | 2009-09-04 | 2014-09-30 | Hyundai Motor Japan R&D Center, Inc. | Operation system for vehicle |
US20120319938A1 (en) * | 2011-06-20 | 2012-12-20 | Immersion Corporation | Haptic theme framework |
US10191546B2 (en) * | 2011-06-20 | 2019-01-29 | Immersion Corporation | Haptic theme framework |
US9195351B1 (en) * | 2011-09-28 | 2015-11-24 | Amazon Technologies, Inc. | Capacitive stylus |
US9591250B2 (en) * | 2011-10-19 | 2017-03-07 | Thomson Licensing | Remote control with feedback for blind navigation |
US20140232944A1 (en) * | 2011-10-19 | 2014-08-21 | Thomson Licensing | Remote control with feedback for blind navigation |
US20130154948A1 (en) * | 2011-12-14 | 2013-06-20 | Synaptics Incorporated | Force sensing input device and method for determining force information |
US8633911B2 (en) * | 2011-12-14 | 2014-01-21 | Synaptics Incorporated | Force sensing input device and method for determining force information |
US9207801B2 (en) | 2011-12-14 | 2015-12-08 | Synaptics Incorporated | Force sensing input device and method for determining force information |
US10101815B2 (en) | 2012-02-01 | 2018-10-16 | Immersion Corporation | Eccentric rotating mass actuator optimization for haptic effects |
US20160070353A1 (en) * | 2012-02-01 | 2016-03-10 | Immersion Corporation | Eccentric rotating mass actuator optimization for haptic effects |
US9921656B2 (en) | 2012-02-01 | 2018-03-20 | Immersion Corporation | Eccentric rotating mass actuator optimization for haptic effect |
US9710065B2 (en) * | 2012-02-01 | 2017-07-18 | Immersion Corporation | Eccentric rotating mass actuator optimization for haptic effects |
US9158405B2 (en) * | 2012-06-15 | 2015-10-13 | Blackberry Limited | Electronic device including touch-sensitive display and method of controlling same |
US20130335338A1 (en) * | 2012-06-15 | 2013-12-19 | Research In Motion Limited | Electronic device including touch-sensitive display and method of controlling same |
US20140092003A1 (en) * | 2012-09-28 | 2014-04-03 | Min Liu | Direct haptic feedback |
US20140104320A1 (en) * | 2012-10-17 | 2014-04-17 | Perceptive Pixel, Inc. | Controlling Virtual Objects |
US9589538B2 (en) * | 2012-10-17 | 2017-03-07 | Perceptive Pixel, Inc. | Controlling virtual objects |
US20140230575A1 (en) * | 2013-02-17 | 2014-08-21 | Microsoft Corporation | Piezo-actuated virtual buttons for touch surfaces |
CN105074621A (en) * | 2013-02-17 | 2015-11-18 | 微软公司 | Piezo-actuated virtual buttons for touch surfaces |
US10578499B2 (en) * | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
US11624815B1 (en) | 2013-05-08 | 2023-04-11 | Ultrahaptics Ip Ltd | Method and apparatus for producing an acoustic field |
US11543507B2 (en) | 2013-05-08 | 2023-01-03 | Ultrahaptics Ip Ltd | Method and apparatus for producing an acoustic field |
US20160103500A1 (en) * | 2013-05-21 | 2016-04-14 | Stanley Innovation, Inc. | System and method for a human machine interface utilizing near-field quasi-state electrical field sensing technology |
US10591992B2 (en) * | 2013-06-17 | 2020-03-17 | Lenovo (Singapore) Pte. Ltd. | Simulation of control areas on touch surface using haptic feedback |
US20140368445A1 (en) * | 2013-06-17 | 2014-12-18 | Lenovo (Singapore) Pte. Ltd. | Simulation of control areas on touch surface using haptic feedback |
US10359848B2 (en) | 2013-12-31 | 2019-07-23 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9471143B2 (en) * | 2014-01-20 | 2016-10-18 | Lenovo (Singapore) Pte. Ltd | Using haptic feedback on a touch device to provide element location indications |
US9182823B2 (en) | 2014-01-21 | 2015-11-10 | Lenovo (Singapore) Pte. Ltd. | Actuating haptic element of a touch-sensitive device |
US10684688B2 (en) * | 2014-01-21 | 2020-06-16 | Lenovo (Singapore) Pte. Ltd. | Actuating haptic element on a touch-sensitive device |
US20150338920A1 (en) * | 2014-01-21 | 2015-11-26 | Lenovo (Singapore) Pte. Ltd. | Actuating haptic element on a touch-sensitive device |
US11204644B2 (en) * | 2014-09-09 | 2021-12-21 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US12204691B2 (en) | 2014-09-09 | 2025-01-21 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US11768540B2 (en) | 2014-09-09 | 2023-09-26 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US11656686B2 (en) | 2014-09-09 | 2023-05-23 | Ultrahaptics Ip Ltd | Method and apparatus for modulating haptic feedback |
US11830351B2 (en) | 2015-02-20 | 2023-11-28 | Ultrahaptics Ip Ltd | Algorithm improvements in a haptic system |
US11550432B2 (en) | 2015-02-20 | 2023-01-10 | Ultrahaptics Ip Ltd | Perceptions in a haptic system |
US11276281B2 (en) | 2015-02-20 | 2022-03-15 | Ultrahaptics Ip Ltd | Algorithm improvements in a haptic system |
US12100288B2 (en) | 2015-07-16 | 2024-09-24 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
US11727790B2 (en) | 2015-07-16 | 2023-08-15 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
WO2017053761A1 (en) * | 2015-09-25 | 2017-03-30 | Immersion Corporation | Haptic effects design system |
US10963089B2 (en) * | 2015-11-27 | 2021-03-30 | Kyocera Corporation | Tactile sensation providing apparatus and tactile sensation providing method |
US20180356891A1 (en) * | 2015-11-27 | 2018-12-13 | Kyocera Corporation | Tactile sensation providing apparatus and tactile sensation providing method |
WO2017111928A1 (en) * | 2015-12-22 | 2017-06-29 | Intel Corporation | Reduction of touchscreen bounce |
US10698518B2 (en) | 2015-12-22 | 2020-06-30 | Intel Corporation | Reduction of touchscreen bounce |
US10976819B2 (en) * | 2015-12-28 | 2021-04-13 | Microsoft Technology Licensing, Llc | Haptic feedback for non-touch surface interaction |
US20170185151A1 (en) * | 2015-12-28 | 2017-06-29 | Microsoft Technology Licensing, Llc | Haptic feedback for non-touch surface interaction |
US11189140B2 (en) | 2016-01-05 | 2021-11-30 | Ultrahaptics Ip Ltd | Calibration and detection techniques in haptic systems |
US10061385B2 (en) | 2016-01-22 | 2018-08-28 | Microsoft Technology Licensing, Llc | Haptic feedback for a touch input device |
US10890978B2 (en) * | 2016-05-10 | 2021-01-12 | Apple Inc. | Electronic device with an input device having a haptic engine |
US11762470B2 (en) | 2016-05-10 | 2023-09-19 | Apple Inc. | Electronic device with an input device having a haptic engine |
WO2018002186A1 (en) * | 2016-06-29 | 2018-01-04 | Dav | Control method and control interface for a motor vehicle |
FR3053488A1 (en) * | 2016-06-29 | 2018-01-05 | Dav | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE |
FR3053489A1 (en) * | 2016-06-29 | 2018-01-05 | Dav | CONTROL METHOD AND CONTROL INTERFACE FOR MOTOR VEHICLE |
WO2018002189A1 (en) * | 2016-06-29 | 2018-01-04 | Dav | Control method and control interface for a motor vehicle |
US11307664B2 (en) | 2016-08-03 | 2022-04-19 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
US12001610B2 (en) | 2016-08-03 | 2024-06-04 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
US11714492B2 (en) | 2016-08-03 | 2023-08-01 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
US10671167B2 (en) | 2016-09-01 | 2020-06-02 | Apple Inc. | Electronic device including sensed location based driving of haptic actuators and related methods |
US11955109B2 (en) | 2016-12-13 | 2024-04-09 | Ultrahaptics Ip Ltd | Driving techniques for phased-array systems |
US11789534B2 (en) * | 2017-02-13 | 2023-10-17 | Snap Inc. | Generating a response that depicts haptic characteristics |
US20210149494A1 (en) * | 2017-02-13 | 2021-05-20 | Snap Inc. | Generating a response that depicts haptic characteristics |
US11054932B2 (en) | 2017-09-06 | 2021-07-06 | Apple Inc. | Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module |
US11460946B2 (en) | 2017-09-06 | 2022-10-04 | Apple Inc. | Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module |
US11921928B2 (en) | 2017-11-26 | 2024-03-05 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
US11531395B2 (en) | 2017-11-26 | 2022-12-20 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
US12158522B2 (en) | 2017-12-22 | 2024-12-03 | Ultrahaptics Ip Ltd | Tracking in haptic systems |
US11704983B2 (en) | 2017-12-22 | 2023-07-18 | Ultrahaptics Ip Ltd | Minimizing unwanted responses in haptic systems |
US11360546B2 (en) | 2017-12-22 | 2022-06-14 | Ultrahaptics Ip Ltd | Tracking in haptic systems |
US11529650B2 (en) | 2018-05-02 | 2022-12-20 | Ultrahaptics Ip Ltd | Blocking plate structure for improved acoustic transmission efficiency |
US11883847B2 (en) | 2018-05-02 | 2024-01-30 | Ultraleap Limited | Blocking plate structure for improved acoustic transmission efficiency |
US10936071B2 (en) | 2018-08-30 | 2021-03-02 | Apple Inc. | Wearable electronic device with haptic rotatable input |
US11098951B2 (en) | 2018-09-09 | 2021-08-24 | Ultrahaptics Ip Ltd | Ultrasonic-assisted liquid manipulation |
US11740018B2 (en) | 2018-09-09 | 2023-08-29 | Ultrahaptics Ip Ltd | Ultrasonic-assisted liquid manipulation |
US11805345B2 (en) | 2018-09-25 | 2023-10-31 | Apple Inc. | Haptic output system |
US10966007B1 (en) | 2018-09-25 | 2021-03-30 | Apple Inc. | Haptic output system |
US11378997B2 (en) | 2018-10-12 | 2022-07-05 | Ultrahaptics Ip Ltd | Variable phase and frequency pulse-width modulation technique |
US11550395B2 (en) | 2019-01-04 | 2023-01-10 | Ultrahaptics Ip Ltd | Mid-air haptic textures |
US11842517B2 (en) | 2019-04-12 | 2023-12-12 | Ultrahaptics Ip Ltd | Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network |
US11374586B2 (en) | 2019-10-13 | 2022-06-28 | Ultraleap Limited | Reducing harmonic distortion by dithering |
US12191875B2 (en) | 2019-10-13 | 2025-01-07 | Ultraleap Limited | Reducing harmonic distortion by dithering |
US11742870B2 (en) | 2019-10-13 | 2023-08-29 | Ultraleap Limited | Reducing harmonic distortion by dithering |
US11553295B2 (en) | 2019-10-13 | 2023-01-10 | Ultraleap Limited | Dynamic capping with virtual microphones |
US11169610B2 (en) | 2019-11-08 | 2021-11-09 | Ultraleap Limited | Tracking techniques in haptic systems |
US12002448B2 (en) | 2019-12-25 | 2024-06-04 | Ultraleap Limited | Acoustic transducer structures |
US11715453B2 (en) | 2019-12-25 | 2023-08-01 | Ultraleap Limited | Acoustic transducer structures |
US11024135B1 (en) | 2020-06-17 | 2021-06-01 | Apple Inc. | Portable electronic device having a haptic button assembly |
US12073710B2 (en) | 2020-06-17 | 2024-08-27 | Apple Inc. | Portable electronic device having a haptic button assembly |
US11756392B2 (en) | 2020-06-17 | 2023-09-12 | Apple Inc. | Portable electronic device having a haptic button assembly |
US11816267B2 (en) | 2020-06-23 | 2023-11-14 | Ultraleap Limited | Features of airborne ultrasonic fields |
US11886639B2 (en) | 2020-09-17 | 2024-01-30 | Ultraleap Limited | Ultrahapticons |
US20230400937A1 (en) * | 2021-04-20 | 2023-12-14 | Microsoft Technology Licensing, Llc | Stylus haptic component arming and power consumption |
US11775084B2 (en) * | 2021-04-20 | 2023-10-03 | Microsoft Technology Licensing, Llc | Stylus haptic component arming and power consumption |
US20220334658A1 (en) * | 2021-04-20 | 2022-10-20 | Microsoft Technology Licensing, Llc | Stylus haptic component arming and power consumption |
US12216840B2 (en) * | 2021-04-20 | 2025-02-04 | Microsoft Technology Licensing, Llc | Stylus haptic component arming and power consumption |
Also Published As
Publication number | Publication date |
---|---|
WO2012135373A3 (en) | 2014-05-01 |
US20120249475A1 (en) | 2012-10-04 |
WO2012135534A1 (en) | 2012-10-04 |
WO2012135373A2 (en) | 2012-10-04 |
US20120249462A1 (en) | 2012-10-04 |
WO2012135532A1 (en) | 2012-10-04 |
US20120249461A1 (en) | 2012-10-04 |
US8937603B2 (en) | 2015-01-20 |
WO2012135378A1 (en) | 2012-10-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120249474A1 (en) | Proximity and force detection for haptic effect generation | |
US10296136B2 (en) | Touch-sensitive button with two levels | |
JP6212175B2 (en) | System and method for an interface featuring surface-based haptic effects | |
CN102349039B (en) | For providing the system and method for feature in friction display | |
CN102349041B (en) | For the system and method for rub display and additional tactile effect | |
EP2406702B1 (en) | System and method for interfaces featuring surface-based haptic effects | |
US20110109577A1 (en) | Method and apparatus with proximity touch detection | |
JP2019133679A (en) | Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction | |
US8368667B2 (en) | Method for reducing latency when using multi-touch gesture on touchpad | |
KR20110130474A (en) | Systems and methods for friction display and additional haptic effects | |
KR20160007634A (en) | Feedback for gestures | |
US20200012348A1 (en) | Haptically enabled overlay for a pressure sensitive surface | |
US20140282279A1 (en) | Input interaction on a touch sensor combining touch and hover actions | |
KR20160019449A (en) | Disambiguation of indirect input | |
JP2017208024A (en) | Haptic feedback device | |
CN117795459A (en) | Method of generating haptic output and electronic device for generating haptic output using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ANALOG DEVICES, INC., MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PRATT, SUSAN MICHELLE;MURPHY, MARK J.;ENGLISH, EOIN E.;AND OTHERS;SIGNING DATES FROM 20120411 TO 20120412;REEL/FRAME:028370/0337 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |