US20150268768A1 - Touch Keyboard Calibration
- Publication number
 - US20150268768A1 (U.S. application Ser. No. 14/219,921)
 - Authority
 - US
 - United States
 - Prior art keywords
 - locations
 - pressures
 - touch keyboard
 - inputs
 - input
 - Prior art date
 - Legal status
 - Granted
 
Classifications
- G—PHYSICS
 - G06—COMPUTING OR CALCULATING; COUNTING
 - G06F—ELECTRIC DIGITAL DATA PROCESSING
 - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
 - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
 - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
 - G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
 - G06F3/0416—Control or interface arrangements specially adapted for digitisers
 - G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
 
- G—PHYSICS
 - G06—COMPUTING OR CALCULATING; COUNTING
 - G06F—ELECTRIC DIGITAL DATA PROCESSING
 - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
 - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
 - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
 - G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
 - G06F3/045—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
 
 
Definitions
- This description relates to touch keyboards.
 - Touch keyboards may display keys on a screen or display, and may detect contact with the screen or display by a user's fingers and/or hands. At times, the user may accidentally contact part of the screen or display.
 - According to an example implementation, a non-transitory computer-readable storage medium may comprise instructions stored thereon. When executed by at least one processor, the instructions may be configured to cause a computing system to at least display keys on a touch keyboard, display a prompt to type into the touch keyboard, store locations and pressures of calibration inputs onto the keys on the touch keyboard in association with an account, and recognize subsequent inputs into the touch keyboard during a session with the account based on the stored locations and pressures.
 - According to an example implementation, a non-transitory computer-readable storage medium may comprise instructions stored thereon. When executed by at least one processor, the instructions may be configured to cause a computing system to at least authenticate a login associated with an account, display keys on a touch keyboard, the touch keyboard comprising a flat display and being configured to detect locations and pressures of contacts onto the flat display, display text on a display for a user of the account to type into the touch keyboard, store locations and pressures of contacts onto the touch keyboard for expected inputs, the expected inputs being based on the displayed text, determine minimum pressure thresholds and locations for multiple characters, the minimum pressure thresholds and locations being based on the stored locations and pressures, compare, during a session with the account, at least a first input from the touch keyboard to a first pressure threshold associated with a first character and a second input from the touch keyboard to a second pressure threshold associated with a second character, recognize the first character based on the first input meeting the first pressure threshold, and ignore the second input based on the second input not meeting the second pressure threshold.
 - According to an example implementation, a computing device may include a touch screen configured to display text and receive touch inputs, at least one processor, and a non-transitory computer-readable storage medium comprising instructions stored thereon.
 - When executed by at least one processor, the instructions may be configured to cause the computing device to at least display keys on the touch screen, display a prompt to type into the touch screen, store locations and pressures of inputs onto the keys on the touch screen, and recognize subsequent inputs into the touch screen based on the stored locations and pressures.
 - According to an example implementation, a computing device may include means for displaying keys, means for displaying a prompt to type into the means for displaying keys, means for storing locations and pressures of inputs onto the keys on the means for displaying keys, and means for recognizing subsequent inputs into the means for displaying keys based on the stored locations and pressures.
 - FIG. 1A shows a server, a computing device, and a touch keyboard according to an example implementation.
 - FIG. 1B shows a computing device according to another example implementation.
 - FIG. 1C is a diagram of a touch screen and related components according to an example implementation.
 - FIG. 1D is a diagram of a sensor grid according to an example implementation.
 - FIG. 2 is a diagram of a system according to an example implementation.
 - FIG. 3 shows the computing device, the touch keyboard, and a user typing onto the touch keyboard according to an example implementation.
 - FIG. 4 is a flowchart showing a method according to an example implementation.
 - FIG. 5 is a flowchart showing another method according to an example implementation.
 - FIG. 6 shows an example of a computer device and a mobile computer device that can be used to implement the techniques described here.
 - FIG. 1A shows a server 102 , a computing device 104 A, and a touch keyboard 106 according to an example implementation.
 - the touch keyboard 106 may display keys and detect a location of a contact(s) and an amount(s) of force or pressure with which a user presses and/or types onto the displayed keys.
 - the server 102 and/or computing device 104 A may determine whether to recognize contacts as valid inputs based on the amount of force or pressure applied, and previous determinations of minimum or threshold amounts of force or pressure at various parts of the keyboard, to recognize a valid input.
 - A user may log into the server 102 via the computing device 104 A, and the server 102 may recognize input from the user onto the touch keyboard 106 based on settings associated with the user's account. For example, the user may have previously calibrated his or her account and/or settings for typing into the touch keyboard 106 , and the server 102 may recognize or ignore inputs into the touch keyboard 106 based on the settings associated with the user's account.
 - the settings, and/or determinations of whether to recognize or ignore input or contacts may be stored on and/or performed by the server 102 , the computing device 104 A, or the touch keyboard 106 , according to example implementations.
 - the server 102 may include a remote server that communicates with the computing device 104 A via a network, such as the Internet.
 - the server 102 may include a single server or multiple servers.
 - the server 102 may provide remote computing services to computing devices including the computing device 104 A, such as providing search results, email, word processing, or other productivity services.
 - the remote computing services may be provided to the computing device 104 A via a browser of the computing device 104 A, and the browser may display output from within the computing device 104 A.
 - the server 102 may, for example, receive and process information received from the computing device 104 A (and other computing devices), and send image data to the computing device 104 A prompting the computing device 104 A to display images.
 - the information received by the server 102 from the computing device 104 A may be based on input received by the computing device 104 A from the touch keyboard 106 .
 - For example, if the server 102 is providing word processing services to the computing device 104 A, a user may type into the touch keyboard 106 , the computing device 104 A may receive input from the touch keyboard 106 such as alphanumeric keys and/or text, the computing device 104 A may send the received input to the server 102 , the server 102 may process the input, and the server 102 may send image data to the computing device 104 A, prompting the computing device 104 A to display the alphanumeric keys and/or text that the user inputted into the touch keyboard 106 .
 - the computing device 104 A may include a general purpose computing system, and may include any computing device capable of sending and receiving data to and from the server 102 , displaying data to a user, and receiving input from a keyboard, such as the touch keyboard 106 .
 - the computing device 104 A may include, for example, a tablet computing device, a laptop or desktop computer, or a smartphone.
 - the computing device 104 A may include a display 110 A for presenting visual information to a user, and may include an interface for coupling to the touch keyboard 106 .
 - the interface to the touch keyboard 106 may be a wired interface, such as Universal Serial Bus (USB) or Ethernet, or may be wireless, such as IEEE 802.15 Bluetooth or IEEE 802.11 Wireless Fidelity (WiFi).
 - the computing device 104 A may include a bezel 108 A surrounding the display 110 A.
 - the display 110 A may present images to a user, which may be based on a combination of software stored in the computing device 104 A and data received from the server 102 .
 - the touch keyboard 106 may include a standalone touch keyboard, and may display keys and detect when the user contacts and/or presses the displayed keys.
 - the touch keyboard 106 may include a bezel 112 surrounding a display 114 .
 - the display 114 may display keys.
 - the keyboard 106 may also include a touch sensor for detecting where the user has touched the display 114 and provided input.
 - the display 114 may be flat, enabling the touch keyboard 106 to be thinner than keyboards with physical keys.
 - the touch keyboard 106 may detect an amount of force or pressure applied by the user to the display.
 - the touch keyboard 106 may, for example, include resistive and/or capacitive elements and may detect changes in resistance and/or capacitance. The change in resistance and/or capacitance may indicate the location and force or pressure of the contacts.
 - the touch keyboard 106 may not include the display 114 .
 - the touch keyboard 106 may, for example, include an image of a keyboard printed onto material such as fabric or plastic.
 - the example touch keyboard 106 that does not include the display 114 may detect pressure and/or location of contacts, as described herein.
 - the touch keyboard 106 may not include and/or display any images of keys, and may include a trackpad, for example, that detects locations and pressures of contacts.
 - FIG. 1B shows a computing device 104 B according to another example implementation.
 - the computing device 104 B may include any or all of the features and functionalities of the computing device 104 A described herein.
 - the display 110 B, which may be surrounded by a bezel 108 B and may include a touch screen, may display a keyboard 116 .
 - the computing device 104 B may detect amounts of force or pressure applied by the user onto the keyboard 116 of the display 110 B and locations of the force or pressure, in similar manner to the display 110 A and/or touch keyboard 106 described herein.
 - FIG. 1C is a diagram of a touch screen 140 and related components according to an example implementation.
 - the touch screen 140 may be included in the display 110 A of the computing device 104 A, the display 114 of the touch keyboard 106 , or the display 110 B of the computing device 104 B.
 - the touch screen 140 may include a surface 150 , a sensor 152 , and a controller 154 .
 - the surface 150 may be configured to be contacted by a user to actuate and trigger an electrical response within the touch screen 140 .
 - the surface 150 may, for example, be on top of the touch screen 140 (such as on the portion of the display 110 A, 114 , 110 B closest to the user) and above the sensor 152 .
 - the surface 150 may be operably coupled to the sensor 152 .
 - the sensor 152 can be activated when a user enters an input (e.g., a touch or a tap), such as by applying pressure on the top surface 150 of the touch screen 140 .
 - the sensor 152 can be, for example, a flame-retardant class-4 (FR4) printed circuit board.
 - the sensor 152 may be responsive to applications of pressure on the surface 150 and/or sensor 152 , and may provide signals to a controller 154 indicating changes in resistance and/or capacitance in the sensor 152 based on the applications of pressure, and locations of the changes in resistance and/or capacitance.
 - the controller 154 may be operably coupled to the sensor 152 .
 - the controller 154 may be an embedded microcontroller chip and may include, for example, read-only firmware.
 - the controller 154 may include a single integrated circuit containing a processor core, memory, and programmable input/output peripherals.
 - the controller 154 may provide input to the computing device 104 A, 104 B and/or touch keyboard 106 , such as the locations and force or pressure of contacts, or the force or pressure of contacts and the keys associated with the locations of the contacts.
 - FIG. 1D is a diagram of a sensor grid 170 according to an example implementation.
 - the sensor grid 170 may be included as part of the touch screen 140 , such as part of sensor 152 shown in FIG. 1C .
 - Other implementations are possible, and the specific depiction of sensor grid 170 shown in FIG. 1D is merely for illustration.
 - the grid 170 may have any number of columns and rows (rather than the eight columns and five rows shown in the example of FIG. 1D ), and may be formed in any shape.
 - the sensor grid 170 may include any number of sensors, such as sensors 180 , 182 , 184 , 186 .
 - the sensors 180 , 182 , 184 , 186 may be spaced any distance (such as a few millimeters) apart from each other and may be designed to sense tactile input.
 - the sensors 180 , 182 , 184 , 186 may sense tactile input by sensing applications of pressure to the surface 150 of the touch screen 140 , such as by detecting or determining resistance and/or capacitance levels.
 - the resistance and/or capacitance levels may be changed by the received tactile input, such as changes or applications of pressure to the surface 150 and/or sensor 152 .
 - Input 172 , which may be a fingerpad contact, represents a position on the grid 170 when a user places a finger on the tactile input device 110 .
 - input 172 may span multiple rows and columns of sensors 180 , 182 , 184 , 186 on grid 170 .
 - the sensors 180 , 182 , 184 , 186 , and/or controller 154 may sense and/or determine an amount of pressure applied by the user's finger based on changes in the resistance and/or capacitance, and/or based on the number or area of sensors 180 , 182 , 184 , 186 that detect the user's finger contacting the surface 150 .
 - the sensors 180 , 182 , 184 , 186 , and/or controller 154 may sense and/or determine a location of the contact input 172 .
 - the location may be measured in both a horizontal or ‘X’ direction and a vertical or ‘Y’ direction.
 - the location may map to a key on the display 114 of the touch keyboard 106 or keyboard 116 of the display 110 B described above.
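 - As a minimal illustration (not taken from the patent), the sketch below shows one way a contact's grid location could be mapped to a displayed key and its pressure estimated from the number of activated sensors; the Contact fields, the linear pressure model, and the bounding-box lookup are assumptions.

```python
# Hypothetical sketch (names and the linear pressure model are assumptions):
# estimate pressure from the number of activated grid sensors and map the
# contact location to whichever displayed key's bounding box contains it.

from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class Contact:
    x: float               # horizontal position, e.g., millimeters from the upper-left corner
    y: float               # vertical position
    active_sensors: int    # grid sensors whose resistance/capacitance changed

def estimate_pressure(contact: Contact, per_sensor_weight: float = 0.1) -> float:
    """Approximate pressure as proportional to contact area (activated sensor count)."""
    return contact.active_sensors * per_sensor_weight

def locate_key(contact: Contact,
               key_bounds: Dict[str, Tuple[float, float, float, float]]) -> Optional[str]:
    """Return the key whose box (x0, y0, x1, y1) contains the contact, or None."""
    for key, (x0, y0, x1, y1) in key_bounds.items():
        if x0 <= contact.x <= x1 and y0 <= contact.y <= y1:
            return key
    return None
```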
 - FIG. 2 is a diagram of a system 200 according to an example implementation.
 - the system 200 may determine whether to recognize contacts, such as the input 172 shown and described above with respect to FIG. 1D , on the touch keyboard 106 as valid keyboard inputs.
 - the system 200 may also determine whether to recognize contacts on the keyboard 116 ; while contacts will be described herein with respect to the touch keyboard 106 , the functions and processes described herein may also be applied with respect to the keyboard 116 .
 - the functions and/or modules of the system 200 may be included in any combination of the server 102 , the computing devices 104 A, 104 B, and/or the touch keyboard 106 .
 - the functions and/or modules may, for example, be performed by the server 102 in a remote computing context in which the user logs into the server 102 via the computing device 104 A, in a local computing context in which the user logs into the computing device 104 B or in which the computing device 104 B does not require a login and stores the same settings for all users, or by the keyboard 106 in an example in which the keyboard 106 is calibrated for one or more users.
 - the functions and/or modules may also be distributed between the server 102 , computing device 104 A, 104 B, and/or keyboard 106 . Any combination of the functions and/or modules described herein may also be included in or performed by touch DJ equipment, trackpads, mice, or smartphones.
 - the system 200 may determine whether to recognize contacts on the touch keyboard 106 as valid keyboard inputs based on locations of the contacts, amounts of force or pressure applied by the contacts to the touch keyboard 106 , and previous determinations of minimums or thresholds of force or pressure required to recognize a valid contact at various locations on the touch keyboard 106 .
 - the minimums or thresholds of force or pressure may be different at different locations on the touch keyboard 106 . For example, a first minimum or threshold of force or pressure at a location on the touch keyboard 106 where the space bar is displayed may be greater than a second minimum or threshold of force or pressure at one or more locations on the touch keyboard 106 where keys are displayed.
 - the guide keys 'a', 's', 'd', 'f', 'j', 'k', 'l', and the apostrophe ('), at which users typically rest their fingers before beginning to type, may have a greater or higher threshold than other keys on the keyboard, preventing the system 200 from recognizing inputs when the user is merely resting his or her fingers on the touch keyboard 106 .
 - the minimums or thresholds may also be different for different locations associated with the same key.
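 - A hypothetical per-location threshold table is sketched below; the guide-key set, the space-bar rule, and the numeric values are illustrative assumptions, not values from the patent.

```python
# Illustrative per-location thresholds: the space bar and the guide keys get
# higher minimum pressures so resting fingers are not recognized as keystrokes.
# The key sets and numeric values are made up for the example.

DEFAULT_THRESHOLD = 0.3                       # arbitrary pressure units
GUIDE_KEYS = {"a", "s", "d", "f", "j", "k", "l", "'"}

def threshold_for(key: str) -> float:
    if key == " ":                            # space bar
        return 0.6
    if key in GUIDE_KEYS:                     # keys where fingers typically rest
        return 0.5
    return DEFAULT_THRESHOLD
```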
 - the system 200 may include a calibration engine 202 .
 - the calibration engine may calibrate the user's account to determine whether contacts on the touch keyboard 106 were intentional contacts intended as keystrokes, or accidental contacts that were not intended as keystrokes.
 - the calibration engine 202 may prompt the user to type sample text, monitor the user typing the sample text, and determine pressure or force thresholds or minimums for various locations to recognize contacts as valid keystrokes and/or inputs. Calibration may be performed by the calibration engine 202 when the user sets up an account for using the touch keyboard 106 , or at any time after the user sets up an account, and the settings, such as pressure or force thresholds and any location variances, may be accessed when the user later types into the touch keyboard 106 .
 - the calibration engine 202 may determine force or pressure thresholds during a calibration process.
 - the calibration engine 202 may include a presenter 204 .
 - the presenter 204 may cause a prompt to be displayed on the display of the computing device 104 during the calibration process.
 - the prompt may, for example, include text for the user to type into the touch keyboard 106 .
 - the text shown on the display may change as the user types into the touch keyboard 106 , allowing the user to type a significant amount of text into the touch keyboard 106 for calibration.
 - the calibration engine 202 may also include a listener 206 .
 - the listener 206 may detect contacts, such as input 172 , onto the touch keyboard 106 during the calibration process.
 - the listener may determine, for each contact, a determined key based on the prompted text and/or location of the contact on the touch keyboard 106 , a location on the touch keyboard 106 of the contact, and a pressure or amount of force of the contact.
 - the determined key may also be “none” or “null,” if the system determines that no keystroke was intended, such as if the keystroke was inconsistent with the prompted text and/or the amount of force or pressure was low.
 - FIG. 3 shows the computing device 104 , the touch keyboard 106 , and a user typing onto the touch keyboard 106 according to an example implementation.
 - the presenter 204 may have instructed the display of the computing device to display the text, “Please type this sentence into the keyboard.”
 - the listener 206 may detect contacts, such as input 172 , from the user's fingers and/or hands 302 , 304 into the touch keyboard 106 .
 - the calibration engine 202 may recognize the characters in the text, “Please type this sentence into the keyboard,” as expected input, and may associate sequential contacts with sequential characters in the text.
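 - This sequential association might be sketched as follows, assuming hypothetical read_contact and store_record helpers; the patent does not prescribe this structure.

```python
# Rough calibration-pass sketch: for each prompted character, read the next
# contact and store its location and pressure against that expected key.
# read_contact and store_record are hypothetical helpers.

from typing import Callable, Tuple

def calibrate_account(prompt_text: str,
                      read_contact: Callable[[], Tuple[float, float, float]],
                      store_record: Callable[[str, float, float, float], None]) -> None:
    for expected_key in prompt_text:
        x, y, pressure = read_contact()       # next contact on the touch keyboard
        store_record(expected_key, x, y, pressure)
```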
 - the calibration engine 202 may also determine locations for keys or characters. While most contacts will be inside the intended keys, some contacts may be outside or between keys. Different users may have different typing patterns. For example, a first user may contact the touch keyboard 106 between the ‘o’ and the ‘p’ when intending to type an ‘o’, whereas a second user may contact the touch keyboard 106 between the ‘o’ and the ‘p’ when intending to type a ‘p’.
 - the calibration engine 202 may store the location between the ‘o’ and the ‘p’ in association with the letter ‘o’ for the first user's account, and may store the location between the ‘o’ and the ‘p’ in association with the letter ‘p’ for the second user's account.
 - the locations may, for example, include X and Y values with respect to a reference point, such as an upper-left corner of the touch keyboard 106 , according to an example implementation.
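 - The per-account treatment of ambiguous locations could look like the sketch below, where the same coordinates resolve to ‘o’ for one account and ‘p’ for another; the nearest-record rule and the coordinate values are assumptions.

```python
# Hypothetical per-account calibration records: the same coordinates between
# the 'o' and 'p' keys resolve to different characters for different accounts.
# The nearest-record lookup is an assumed matching rule.

calibration = {
    "user_one": [("o", 212.0, 40.0), ("p", 230.0, 40.0)],   # (key, x, y)
    "user_two": [("o", 200.0, 40.0), ("p", 214.0, 40.0)],
}

def key_for(account: str, x: float, y: float) -> str:
    records = calibration[account]
    # Resolve the contact to the calibrated location nearest to it.
    return min(records, key=lambda r: (r[1] - x) ** 2 + (r[2] - y) ** 2)[0]

print(key_for("user_one", 213.0, 40.0))  # 'o' for the first account
print(key_for("user_two", 213.0, 40.0))  # 'p' for the second account
```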
 - the calibration engine 202 may change the images of keys displayed by the display 114 , 110 B based on the user's typing patterns.
 - the calibration engine 202 may change the locations of images of keys based on the locations users contact when they intend to type specific keys. For example, if a user contacts the touch keyboard 106 between the ‘o’ and the ‘p’ when intending to type an ‘o’, the calibration engine 202 may move the ‘o’ and ‘p’ to the right, so that the ‘o’ is where the user contacts the touch keyboard 106 when the user is intending to type the ‘o’. By moving the keys to the locations that the user contacts the keyboard 106 when intending to type the respective keys, the calibration engine 202 may generate a keyboard that conforms to the user's typing patterns and is easier for the user to type into.
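 - One possible realization of this key relocation, assuming calibration samples are kept per key, is to recenter each displayed key on the centroid of the contacts made when that key was intended, as in the hypothetical sketch below.

```python
# Hypothetical key relocation: recenter each displayed key on the centroid of
# the calibration contacts made while the user intended that key.

from typing import Dict, List, Tuple

def recenter_keys(key_positions: Dict[str, Tuple[float, float]],
                  samples: Dict[str, List[Tuple[float, float]]]) -> Dict[str, Tuple[float, float]]:
    updated = dict(key_positions)
    for key, points in samples.items():
        if points:
            cx = sum(x for x, _ in points) / len(points)
            cy = sum(y for _, y in points) / len(points)
            updated[key] = (cx, cy)           # move the displayed key to the contact centroid
    return updated
```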
 - the calibration engine 202 may determine pressure thresholds during use, such as during a login session, in addition to or instead of, during a calibration process.
 - the calibration engine 202 may update the minimum force or pressure thresholds during a login session by a user, such as by lowering the force or pressure threshold if the user retypes or recontacts a key after the input processor 222 (described below) ignored a contact on the key, or may raise the force or pressure threshold if the user deletes a character that the input processor 222 recognized.
 - the calibration engine 202 may also adjust the thresholds over time by averaging new inputs with stored inputs, according to an example implementation.
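 - A rough sketch of such in-session adjustment follows; the step sizes and the exponential-moving-average weight are assumptions, not values from the patent.

```python
# Assumed in-session adjustments: lower a key's threshold after the user retypes
# an ignored contact, raise it after the user deletes a recognized character,
# and blend new pressure samples into the stored value over time.

from typing import Dict

def on_retype_after_ignored(thresholds: Dict[str, float], key: str, step: float = 0.05) -> None:
    thresholds[key] = max(0.0, thresholds[key] - step)

def on_delete_after_recognized(thresholds: Dict[str, float], key: str, step: float = 0.05) -> None:
    thresholds[key] += step

def blend_sample(thresholds: Dict[str, float], key: str, new_pressure: float, weight: float = 0.1) -> None:
    # Exponential moving average of the stored threshold and the new observation.
    thresholds[key] = (1 - weight) * thresholds[key] + weight * new_pressure
```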
 - the system 200 may also include a memory 208 .
 - the memory 208 may store instructions 210 that, when executed by at least one processor, are configured to cause a computing system to perform any combination of the functions and processes described herein.
 - the memory 208 may also store data 212 .
 - the data 212 may include both information that is related to the functions and processes described herein, and information that is not related to the functions and processes described herein.
 - the memory 208 may also store input units 214 . While only one input unit 214 is shown in FIG. 2 , the memory 208 may store multiple input units 214 .
 - the calibration engine 202 may cause the memory 208 to store input units 214 based on contacts detected by the listener 206 during calibration.
 - the memory 208 may, for example, store at least one input unit 214 for each key displayed by the touch keyboard 106 .
 - the memory 208 may, for example, store multiple input units 214 for each key displayed by the touch keyboard 106 , with each input unit 214 corresponding to a location where the user may contact the touch keyboard 106 when intending to type the respective key.
 - the location may include an X value and a Y value with respect to a reference point on the touch keyboard 106 .
 - the memory 208 may store versions of these input units 214 for each account.
 - Each input unit 214 may be associated with, and/or include, a key 216 , such as an alphanumeric key on the touch keyboard 106 and/or an American Standard Code for Information Interchange (ASCII) character or Unicode character.
 - the input unit 214 may also include a location 218 .
 - the location 218 may be a detected location on the touch keyboard 106 .
 - the location 218 may, for example, correspond to a location on the touch keyboard 106 , which may be represented by an X value and a Y value, in which the associated key 216 is displayed and/or where the user contacts the touch keyboard 106 intending to type the associated key 216 .
 - the input unit 214 may also include a pressure threshold 220 .
 - the pressure threshold 220 may indicate a minimum or threshold of pressure or force that must be detected at the location 218 associated with the input unit 214 for the associated key 216 to be recognized as a valid input during a given contact onto the touch keyboard 106 . If the contact does not meet the pressure threshold 220 , then no input from the touch keyboard may be recognized and/or the system 200 may ignore the contact.
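 - An input unit might be represented roughly as below; the field names, the rectangular location region, and the containment test are assumptions layered on the key 216 , location 218 , and pressure threshold 220 described above.

```python
# Assumed representation of an input unit: a key (216), a calibrated location
# (218) with a small region around it, and a minimum pressure threshold (220).

from dataclasses import dataclass

@dataclass
class InputUnit:
    key: str                   # ASCII or Unicode character
    x: float                   # calibrated X position relative to a reference corner
    y: float                   # calibrated Y position
    half_width: float          # half-extent of the region treated as this location
    half_height: float
    pressure_threshold: float  # minimum pressure for a contact here to be recognized

    def contains(self, x: float, y: float) -> bool:
        return abs(x - self.x) <= self.half_width and abs(y - self.y) <= self.half_height
```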
 - the system 200 may include an input processor 222 .
 - the input processor 222 may process input from the touch keyboard 106 , such as contacts, and determine which character, if any, to recognize.
 - the input processor 222 may include a receiver 224 .
 - the receiver 224 may receive inputs from the touch keyboard 106 , such as contact inputs including an input 172 .
 - the receiver 224 may receive the input from, for example, the controller 154 .
 - the inputs may include locations of the contacts, which may include X values and Y values, and pressures of the contacts.
 - the input processor 222 may also include a comparator 226 .
 - the comparator 226 may compare the received inputs to the input units 214 stored in the memory 208 .
 - the comparator 226 may, for example, determine which, if any, input unit 214 includes the location 218 corresponding to the location on the touch keyboard 106 where the contact occurred. If no input unit 214 includes a location 218 corresponding to the location of the contact, then the input processor 222 may not recognize any input from the touch keyboard 106 and/or may ignore the contact. If an input unit 214 does include a location 218 corresponding to the location of the contact, then the comparator 226 may compare the pressure or force of the contact with the pressure threshold 220 included in the input unit 214 .
 - If the pressure or force of the contact meets or exceeds the pressure threshold 220 , the input processor 222 may recognize the input as the key 216 stored in the input unit 214 . If the pressure or force of the contact does not meet or exceed the pressure threshold 220 , then the input processor 222 may ignore the input and/or not recognize any input from the touch keyboard 106 based on the contact.
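 - The comparator flow could be sketched as follows, reusing the hypothetical InputUnit from the earlier sketch; this is an illustration of the described recognize-or-ignore logic, not the patented implementation.

```python
# Sketch of the recognize-or-ignore decision, reusing the hypothetical InputUnit
# above: find a unit whose region contains the contact, then recognize its key
# only if the contact's pressure meets that unit's threshold.

from typing import List, Optional

def process_contact(units: List["InputUnit"], x: float, y: float, pressure: float) -> Optional[str]:
    for unit in units:
        if unit.contains(x, y):
            if pressure >= unit.pressure_threshold:
                return unit.key        # recognized input
            return None                # pressure below threshold: ignore the contact
    return None                        # no unit covers this location: ignore the contact
```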
 - the system 200 may include an account manager 228 .
 - the account manager 228 may manage and/or store accounts for one or multiple users.
 - the account manager 228 may, for example, store a name 230 and authentication 232 for each user.
 - the name 230 may include a username associated with the user.
 - the authentication 232 may include authentication information, such as a password, biometric information, and/or address information associated with the name 230 .
 - the account manager 228 may also include a session manager 234 .
 - the session manager 234 may manage logins, such as by requesting a name and authentication, to log a user in.
 - the session manager 234 may associate the input units 214 of the account associated with the logged-in name with inputs and/or contacts received from the touch keyboard 106 .
 - the session manager 234 may, for example, cause the input processor 222 to compare received contacts to a first set of input units 214 that are associated with a first account during a session after the first account was logged in, and may cause the input processor 222 to compare received contacts to a second set of input units 214 that are associated with a second account during a session after the second account was logged in.
 - the comparison to different sets of input units 214 may allow different users to use the same computing device 104 and/or touch keyboard 106 and have the system 200 respond to their respective calibrations.
 - the session manager 234 may cause the input processor 222 to compare the received contacts to input units 214 associated with a given account even when the account is logged in via a different computing device 104 and/or touch keyboard 106 , allowing a user to log in via different devices and type into a different touch keyboard with the same calibration settings.
 - the account manager 228 may recognize and authenticate a user and/or account based on typing patterns into the touch keyboard 106 .
 - the memory 208 may have stored average intervals between keys and/or average pressures on keys for a given account. Rather than require a name and authentication, the account manager 228 may simply let a user begin typing and identify and authenticate the user based on typing patterns into the touch keyboard 106 , such as intervals between keys and/or pressure on keys.
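 - A toy sketch of identifying an account from typing patterns appears below; the per-account profile format, the distance measure, and the acceptance threshold are all assumptions.

```python
# Toy identification by typing pattern: compare observed inter-key interval and
# pressure against stored per-account averages. Profile format, distance measure,
# and the acceptance threshold are assumptions.

from typing import Dict, Optional, Tuple

def identify_account(profiles: Dict[str, Tuple[float, float]],
                     observed_interval: float,
                     observed_pressure: float,
                     max_distance: float = 1.0) -> Optional[str]:
    """profiles maps account name -> (average inter-key interval, average pressure)."""
    best_account, best_distance = None, float("inf")
    for account, (avg_interval, avg_pressure) in profiles.items():
        distance = abs(observed_interval - avg_interval) + abs(observed_pressure - avg_pressure)
        if distance < best_distance:
            best_account, best_distance = account, distance
    return best_account if best_distance <= max_distance else None
```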
 - the system 200 may include one or more processors 236 .
 - the one or more processors 236 may include one or more microprocessors capable of executing instructions, such as the instructions 210 stored in the memory 208 .
 - the one or more processors 236 may implement modules or functionalities of the system 200 , such as the calibration engine 202 , input processor 222 , account manager 228 , keyboard interface 238 , and/or computing device interface 240 .
 - the system 200 may include a keyboard interface 238 .
 - the keyboard interface 238 may receive and/or send signals from and/or to the touch keyboard 106 , such as from the controller 154 .
 - the keyboard interface 238 may, for example, receive contact inputs, including locations and pressures, from the touch keyboard 106 , and pass the contact inputs to the input processor 222 .
 - the system 200 may include a computing device interface 240 .
 - the computing device interface 240 may send and/or receive signals to and/or from the computing device 104 A, 104 B.
 - the computing device interface 240 may, for example, receive contact inputs, and may send display signals for generating display output to the computing device 104 A, 104 B.
 - the system 200 may include an input/output module 242 .
 - the input/output module 242 may include a single module which receives input and sends output, or may include a first module that receives input and a second module that sends output.
 - the input/output module(s) 242 may include one or multiple wired or wireless interfaces for communicating with the server 102 , computing device 104 A, 104 B, and/or touch keyboard 106 .
 - FIG. 4 is a flowchart showing a method 400 according to an example implementation.
 - the method 400 may include calibrating an account for input from a touch keyboard 106 ( 402 ).
 - the touch keyboard 106 may include a flat display.
 - the calibrating may include storing locations and pressures of contacts onto the touch keyboard for expected inputs.
 - Calibrating the account ( 402 ) may include presenting text for the expected inputs.
 - the calibration engine 202 may calibrate the touch keyboard 106 ( 402 ) by, for example, instructing the computing device 104 to present sample text to the user, listening for and/or detecting contact inputs, and storing input units 214 based on the presented text and received inputs.
 - Calibrating the account ( 402 ) may include storing locations and pressures of contacts for each of multiple characters, the multiple characters corresponding to the expected inputs.
 - Calibrating the account ( 402 ) may include determining a minimum pressure threshold for each of a plurality of characters, each minimum threshold being based on multiple contacts and pressures associated with the character.
 - the calibration engine 202 may also determine which contact inputs represented keyboard entries and which were accidental contacts. For contacts associated with a same key or character, the calibration engine 202 may determine a pressure threshold 220 based on a lowest pressure of all the contacts, may discard a fraction of the contacts with lower pressures or force, or may discard outliers based on a calculated standard deviation, according to example implementations.
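 - The three alternatives listed above might be sketched as follows; the 10% fraction and the two-standard-deviation cutoff are illustrative choices, not values from the patent.

```python
# Three alternative threshold rules for a key's calibration pressures: the lowest
# observed pressure, the lowest after discarding a fraction of the softest
# contacts, or the lowest after discarding outliers beyond a few standard
# deviations. The 10% fraction and 2-sigma cutoff are illustrative choices.

import statistics
from typing import List

def threshold_lowest(pressures: List[float]) -> float:
    return min(pressures)

def threshold_discard_fraction(pressures: List[float], fraction: float = 0.1) -> float:
    kept = sorted(pressures)[int(len(pressures) * fraction):]
    return min(kept) if kept else min(pressures)

def threshold_discard_outliers(pressures: List[float], sigmas: float = 2.0) -> float:
    mean = statistics.fmean(pressures)
    sd = statistics.pstdev(pressures)
    kept = [p for p in pressures if abs(p - mean) <= sigmas * sd]
    return min(kept) if kept else min(pressures)
```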
 - the method 400 may also include authenticating an account based on received login information ( 404 ). Authenticating the account ( 404 ) may include receiving login information, such as a name and authentication information (for example, a username and password) associated with the account. Authenticating the account ( 404 ) may also include loading a set of input units 214 associated with the account, with which the input processor 222 will determine whether to recognize contact inputs.
 - the method 400 may also include recognizing inputs from the touch keyboard 106 based on comparing received locations and pressures of contacts onto the touch keyboard to the stored locations and pressures of contacts ( 406 ).
 - the inputs may be recognized during a session with an account. During the session, the set of input units 214 associated with the account may have been loaded.
 - the recognizing inputs ( 406 ) may include the receiver 224 of the input processor 222 receiving contact inputs, including location on the touch keyboard 106 and force or pressure, representing contacts on the touch keyboard 106 .
 - the recognizing inputs ( 406 ) may also include the comparator 226 comparing the pressure or force of the contact inputs to the pressure threshold 220 of the input unit 214 with a location 218 matching or including the location of the contact input. If the pressure or force of the contact input meets or exceeds the pressure threshold 220 , the input processor 222 may recognize the key 216 associated with the location 218 of the contact input.
 - the method 400 may also include ignoring input from the touch keyboard based on comparing locations and pressures of contacts onto the touch keyboard to the calibrated account. If the pressure or force of the contact input does not meet or exceed the pressure threshold 220 , the input processor 222 may discard and/or ignore the contact input.
 - FIG. 5 is a flowchart showing another method 500 according to an example implementation.
 - the method 500 may include displaying a prompt to type into the touch keyboard ( 502 ).
 - the method 500 may also include storing locations and pressures of calibration inputs onto keys on the touch keyboard in association with an account ( 504 ).
 - the method 500 may also include recognizing subsequent inputs into the touch keyboard during a session with the account based on the stored locations and pressures ( 506 ).
 - the touch keyboard 106 may include a display that displays the keys.
 - the touch keyboard 106 may include a resistive touch keyboard that monitors locations and pressures of inputs and/or contacts.
 - the touch keyboard 106 may include a standalone touch keyboard configured to transmit keyboard input signals to a general purpose computing system.
 - the touch keyboard 106 may include a standalone touch keyboard configured to wirelessly transmit keyboard input signals to a general purpose computing system, such as the computing device 104 .
 - the touch keyboard 106 may include a standalone touch keyboard configured to display the keys and display the prompt.
 - the server 102 may include a remote server configured to store the locations and pressures of the calibration inputs and recognize the subsequent inputs.
 - the recognizing ( 506 ) may include receiving a login associated with the account and retrieving stored locations and pressures associated with the account.
 - the recognizing ( 506 ) may include comparing locations and pressures of typing inputs with the stored locations and pressures of calibration inputs associated with the account.
 - the method 500 may further include ignoring an input to a predetermined location based on a pressure of the input being less than a threshold for the predetermined location, the threshold for the predetermined location being higher than thresholds for other locations on the touch keyboard.
 - the method 500 may further include storing locations and pressures of a second set of calibration inputs onto the keys on the touch keyboard in association with a second account, and recognizing subsequent inputs during a session with the second account based on the second set of stored locations and pressures.
 - the method 500 may further include recognizing subsequent inputs into a second touch keyboard during a second session with the account based on the stored locations and pressures.
 - the method 500 may further include determining a pressure threshold for each of the keys based on the stored locations and pressures.
 - FIG. 6 shows an example of a generic computer device 600 and a generic mobile computer device 650 , which may be used with the techniques described here.
 - Computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
 - Computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, and other similar computing devices.
 - the components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.
 - Computing device 600 includes a processor 602 , memory 604 , a storage device 606 , a high-speed interface 608 connecting to memory 604 and high-speed expansion ports 610 , and a low speed interface 612 connecting to low speed bus 614 and storage device 606 .
 - Each of the components 602 , 604 , 606 , 608 , 610 , and 612 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
 - the processor 602 can process instructions for execution within the computing device 600 , including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as display 616 coupled to high speed interface 608 .
 - multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
 - multiple computing devices 600 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
 - the memory 604 stores information within the computing device 600 .
 - the memory 604 is a volatile memory unit or units.
 - the memory 604 is a non-volatile memory unit or units.
 - the memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk.
 - the storage device 606 is capable of providing mass storage for the computing device 600 .
 - the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.
 - a computer program product can be tangibly embodied in an information carrier.
 - the computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above.
 - the information carrier is a computer- or machine-readable medium, such as the memory 604 , the storage device 606 , or memory on processor 602 .
 - the high speed controller 608 manages bandwidth-intensive operations for the computing device 600 , while the low speed controller 612 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only.
 - the high-speed controller 608 is coupled to memory 604 , display 616 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 610 , which may accept various expansion cards (not shown).
 - low-speed controller 612 is coupled to storage device 606 and low-speed expansion port 614 .
 - the low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
 - the computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620 , or multiple times in a group of such servers. It may also be implemented as part of a rack server system 624 . In addition, it may be implemented in a personal computer such as a laptop computer 622 . Alternatively, components from computing device 600 may be combined with other components in a mobile device (not shown), such as device 650 . Each of such devices may contain one or more of computing device 600 , 650 , and an entire system may be made up of multiple computing devices 600 , 650 communicating with each other.
 - Computing device 650 includes a processor 652 , memory 664 , an input/output device such as a display 654 , a communication interface 666 , and a transceiver 668 , among other components.
 - the device 650 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage.
 - Each of the components 650 , 652 , 664 , 654 , 666 , and 668 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
 - the processor 652 can execute instructions within the computing device 650 , including instructions stored in the memory 664 .
 - the processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
 - the processor may provide, for example, for coordination of the other components of the device 650 , such as control of user interfaces, applications run by device 650 , and wireless communication by device 650 .
 - Processor 652 may communicate with a user through control interface 658 and display interface 656 coupled to a display 654 .
 - the display 654 may be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology.
 - the display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user.
 - the control interface 658 may receive commands from a user and convert them for submission to the processor 652 .
 - an external interface 662 may be provided in communication with processor 652 , so as to enable near area communication of device 650 with other devices. External interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
 - the memory 664 stores information within the computing device 650 .
 - the memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
 - Expansion memory 674 may also be provided and connected to device 650 through expansion interface 672 , which may include, for example, a SIMM (Single In Line Memory Module) card interface.
 - expansion memory 674 may provide extra storage space for device 650 , or may also store applications or other information for device 650 .
 - expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also.
 - expansion memory 674 may be provided as a security module for device 650 , and may be programmed with instructions that permit secure use of device 650 .
 - secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
 - the memory may include, for example, flash memory and/or NVRAM memory, as discussed below.
 - a computer program product is tangibly embodied in an information carrier.
 - the computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
 - the information carrier is a computer- or machine-readable medium, such as the memory 664 , expansion memory 674 , or memory on processor 652 , that may be received, for example, over transceiver 668 or external interface 662 .
 - Device 650 may communicate wirelessly through communication interface 666 , which may include digital signal processing circuitry where necessary. Communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 668 . In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to device 650 , which may be used as appropriate by applications running on device 650 .
 - Device 650 may also communicate audibly using audio codec 660 , which may receive spoken information from a user and convert it to usable digital information. Audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 650 . Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 650 .
 - the computing device 650 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 680 . It may also be implemented as part of a smart phone 682 , personal digital assistant, or other similar mobile device.
 - Implementations of the various techniques described herein may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Implementations may be implemented as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device or in a propagated signal, for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
 - a computer program such as the computer program(s) described above, can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
 - a computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
 - Method steps may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method steps also may be performed by, and an apparatus may be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
 - processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
 - a processor will receive instructions and data from a read-only memory or a random access memory or both.
 - Elements of a computer may include at least one processor for executing instructions and one or more memory devices for storing instructions and data.
 - a computer also may include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
 - Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
 - the processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
 - implementations may be implemented on a computer having a display device, e.g., a cathode ray tube (CRT) or liquid crystal display (LCD) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer.
 - Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
 - Implementations may be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation, or any combination of such back-end, middleware, or front-end components.
 - Components may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), e.g., the Internet.
 
Abstract
Description
-  This description relates to touch keyboards.
 -  Touch keyboards may display keys on a screen or display, and may detect contact with the screen or display by a user's fingers and/or hands. At times, the user may accidentally contact part of the screen or display.
 -  According to an example implementation, a non-transitory computer-readable storage medium may comprise instructions stored thereon. When executed by at least one processor, the instructions may be configured to cause a computing system to at least display keys on a touch keyboard, display a prompt to type into the touch keyboard, store locations and pressures of calibration inputs onto the keys on the touch keyboard in association with an account, and recognize subsequent inputs into the touch keyboard during a session with the account based on the stored locations and pressures.
 -  According to an example implementation, a non-transitory computer-readable storage medium may comprise instructions stored thereon. When executed by at least one processor, the instructions may be configured to cause a computing system to at least authenticate a login associated with an account, display keys on a touch keyboard, the touch keyboard comprising a flat display and being configured to detect locations and pressures of contacts onto the flat display, display text on a display for a user of the account to type into the touch keyboard, store locations and pressures of contacts onto the touch keyboard for expected inputs, the expected inputs being based on the displayed text, determine minimum pressure thresholds and locations for multiple characters, the minimum pressure thresholds and locations being based on the stored locations and pressures, compare, during a session with the account, at least a first input from the touch keyboard to a first pressure threshold associated with a first character and a second input from the touch keyboard to a second pressure threshold associated with a second character, recognize the first character based on the first input meeting the first pressure threshold, and ignore the second input based on the second input not meeting the second pressure threshold.
 -  According to an example implementation, a computing device may include a touch screen configured to display text and receive touch inputs, at least one processor, and a non-transitory computer-readable storage medium comprising instructions stored thereon. When executed by at least one processor, the instructions may be configured to cause the computing device to at least display keys on the touch screen, display a prompt to type into the touch screen, store locations and pressures of inputs onto the keys on the touch screen, and recognize subsequent inputs into the touch screen based on the stored locations and pressures.
 -  According to an example implementation, a computing device may include means for displaying keys, means for displaying a prompt to type into the means for displaying keys, means for storing locations and pressures of inputs onto the keys on the means for displaying keys, and means for recognizing subsequent inputs into the means for displaying keys based on the stored locations and pressures.
 -  The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
 -  
FIG. 1A shows a server, a computing device, and a touch keyboard according to an example implementation. -  
FIG. 1B shows a computing device according to another example implementation. -  
FIG. 1C is a diagram of a touch screen and related components according to an example implementation. -  
FIG. 1D is a diagram of a sensor grid according to an example implementation. -  
FIG. 2 is a diagram of a system according to an example implementation. -  
FIG. 3 shows the computing device, the touch keyboard, and a user typing onto the touch keyboard according to an example implementation. -  
FIG. 4 is a flowchart showing a method according to an example implementation. -  
FIG. 5 is a flowchart showing another method according to an example implementation. -  
FIG. 6 shows an example of a computer device and a mobile computer device that can be used to implement the techniques described here. -  
FIG. 1A shows a server 102, a computing device 104A, and a touch keyboard 106 according to an example implementation. The touch keyboard 106 may display keys and detect a location of a contact(s) and an amount(s) of force or pressure with which a user presses and/or types onto the displayed keys. The server 102 and/or computing device 104A may determine whether to recognize contacts as valid inputs based on the amount of force or pressure applied, and previous determinations of minimum or threshold amounts of force or pressure at various parts of the keyboard, to recognize a valid input. -  A user may log into the server 102 via the
computing device 104A, and the server 102 may recognize input from the user onto the touch keyboard 106 based on settings associated with the user's account. For example, the user may have previously calibrated his or her account and/or settings for typing into the touch keyboard 106, and the server 102 may recognize or ignore inputs into the touch keyboard 106 based on the settings associated with the user's account. The settings, and/or the determinations of whether to recognize or ignore inputs or contacts, may be stored on and/or performed by the server 102, the computing device 104A, or the touch keyboard 106, according to example implementations. -  The server 102 may include a remote server that communicates with the
computing device 104A via a network, such as the Internet. The server 102 may include a single server or multiple servers. The server 102 may provide remote computing services to computing devices including the computing device 104A, such as providing search results, email, word processing, or other productivity services. The remote computing services may be provided to the computing device 104A via a browser of the computing device 104A, and the browser may display output from within the computing device 104A. -  The server 102 may, for example, receive and process information received from the
computing device 104A (and other computing devices), and send image data to the computing device 104A prompting the computing device 104A to display images. The information received by the server 102 from the computing device 104A may be based on input received by the computing device 104A from the touch keyboard 106. For example, if the server 102 is providing word processing services to the computing device 104A, a user may type into the touch keyboard 106, the computing device 104A may receive input from the touch keyboard 106 such as alphanumeric keys and/or text, the computing device 104A may send the received input to the server 102, the server 102 may process the input, and the server 102 may send image data to the computing device 104A, prompting the computing device 104A to display the alphanumeric keys and/or text that the user inputted into the touch keyboard 106. -  The
computing device 104A may include a general purpose computing system, and may include any computing device capable of sending and receiving data to and from the server 102, displaying data to a user, and receiving input from a keyboard, such as the touch keyboard 106. The computing device 104A may include, for example, a tablet computing device, a laptop or desktop computer, or a smartphone. The computing device 104A may include a display 110A for presenting visual information to a user, and may include an interface for coupling to the touch keyboard 106. The interface to the touch keyboard 106 may be a wired interface, such as Universal Serial Bus (USB) or Ethernet, or may be wireless, such as IEEE 802.15 Bluetooth or IEEE 802.11 Wireless Fidelity (WiFi). The computing device 104A may include a bezel 108A surrounding the display 110A. The display 110A may present images to a user, which may be based on a combination of software stored in the computing device 104A and data received from the server 102. -  The
touch keyboard 106 may include a standalone touch keyboard, and may display keys and detect when the user contacts and/or presses the displayed keys. The touch keyboard 106 may include a bezel 112 surrounding a display 114. The display 114 may display keys. The keyboard 106 may also include a touch sensor for detecting where the user has touched the display 114 and provided input. The display 114 may be flat, enabling the touch keyboard 106 to be thinner than keyboards with physical keys. The touch keyboard 106 may detect an amount of force or pressure applied by the user to the display. The touch keyboard 106 may, for example, include resistive and/or capacitive elements and may detect changes in resistance and/or capacitance. The change in resistance and/or capacitance may indicate the location and force or pressure of the contacts. -  In an example implementation, the
touch keyboard 106 may not include the display 114. The touch keyboard 106 may, for example, include an image of a keyboard printed onto material such as fabric or plastic. The example touch keyboard 106 that does not include the display 114 may detect pressure and/or location of contacts, as described herein. In an example implementation, the touch keyboard 106 may not include and/or display any images of keys, and may include a trackpad, for example, that detects locations and pressures of contacts. -  
FIG. 1B shows a computing device 104B according to another example implementation. The computing device 104B may include any or all of the features and functionalities of the computing device 104A described herein. In this example, the display 110B, which may be surrounded by a bezel 108B and may include a touch screen, may display a keyboard 116. The computing device 104B may detect amounts of force or pressure applied by the user onto the keyboard 116 of the display 110B and locations of the force or pressure, in similar manner to the display 110A and/or touch keyboard 106 described herein. -  
FIG. 1C is a diagram of a touch screen 140 and related components according to an example implementation. The touch screen 140 may be included in the display 110A of the computing device 104A, the display 114 of the touch keyboard 106, or the display 110B of the computing device 104B. The touch screen 140 may include a surface 150, a sensor 152, and a controller 154. -  The
surface 150 may be configured to be contacted by a user to actuate and trigger an electrical response within the touch screen 140. The surface 150 may, for example, be on top of the touch screen 140 (such as on the portion of the display 110A, 114, 110B closest to the user) and above the sensor 152. The surface 150 may be operably coupled to the sensor 152. The sensor 152 can be activated when a user enters an input (e.g., a touch or a tap), such as by applying pressure on the top surface 150 of the touch screen 140. The sensor 152 can be, for example, a flame-retardant class-4 (FR4) printed circuit board. The sensor 152 may be responsive to applications of pressure on the surface 150 and/or sensor 152, and may provide signals to a controller 154 indicating changes in resistance and/or capacitance in the sensor 152 based on the applications of pressure, and locations of the changes in resistance and/or capacitance. -  The
controller 154 may be operably coupled to the sensor 152. The controller 154 may be an embedded microcontroller chip and may include, for example, read-only firmware. The controller 154 may include a single integrated circuit containing a processor core, memory, and programmable input/output peripherals. The controller 154 may provide input to the computing device 104A, 104B and/or touch keyboard 106, such as the locations and force or pressure of contacts, or the force or pressure of contacts and the keys associated with the locations of the contacts. -  
FIG. 1D is a diagram of a sensor grid 170 according to an example implementation. The sensor grid 170 may be included as part of the touch screen 140, such as part of the sensor 152 shown in FIG. 1C. Other implementations are possible, and the specific depiction of the sensor grid 170 shown in FIG. 1D is merely for illustration. For example, the grid 170 may have any number of columns and rows (rather than the eight columns and five rows shown in the example of FIG. 1D), and may be formed in any shape. The sensor grid 170 may include any number of sensors, such as sensors 180, 182, 184, 186. The sensors 180, 182, 184, 186 may be spaced any distance (such as a few millimeters) apart from each other and may be designed to sense tactile input. The sensors 180, 182, 184, 186 may sense tactile input by sensing applications of pressure to the surface 150 of the touch screen 140, such as by detecting or determining resistance and/or capacitance levels. The resistance and/or capacitance levels may be changed by the received tactile input, such as changes or applications of pressure to the surface 150 and/or sensor 152. -  
Input 172, which may be a fingerpad contact, represents a position on the grid 170 when a user places a finger on the tactile input device 110. As shown in FIG. 1D, input 172 may span multiple rows and columns of sensors 180, 182, 184, 186 on the grid 170. The sensors 180, 182, 184, 186, and/or the controller 154, may sense and/or determine an amount of pressure applied by the user's finger based on changes in the resistance and/or capacitance, and/or based on the number or area of sensors 180, 182, 184, 186 that detect the user's finger contacting the surface 150. The sensors 180, 182, 184, 186, and/or the controller 154, may sense and/or determine a location of the contact input 172. In an example implementation, the location may be measured in both a horizontal or ‘X’ direction and a vertical or ‘Y’ direction. The location may map to a key on the display 114 of the touch keyboard 106 or the keyboard 116 of the display 110B described above.
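 -  As an illustrative sketch only (not part of the original disclosure), the following Python example shows one way a controller such as the controller 154 might estimate a contact's location and pressure from per-sensor readings of a grid like the sensor grid 170. The function name, the activation threshold, and the sample readings are assumptions made for illustration.

```python
# Illustrative sketch: estimate a contact's location and pressure from a grid
# of per-sensor readings (e.g., changes in capacitance). Names are hypothetical.

def estimate_contact(readings, threshold=0.1):
    """readings: dict mapping (row, col) -> sensed change at that sensor."""
    active = {pos: v for pos, v in readings.items() if v >= threshold}
    if not active:
        return None
    total = sum(active.values())
    # Weighted centroid approximates the contact location in grid coordinates.
    y = sum(r * v for (r, _), v in active.items()) / total
    x = sum(c * v for (_, c), v in active.items()) / total
    # Pressure is approximated from the summed signal; area from sensor count.
    return {"x": x, "y": y, "pressure": total, "area": len(active)}

print(estimate_contact({(1, 2): 0.4, (1, 3): 0.6, (2, 2): 0.2}))
```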
 -  FIG. 2 is a diagram of a system 200 according to an example implementation. The system 200 may determine whether to recognize contacts, such as the input 172 shown and described above with respect to FIG. 1D, on the touch keyboard 106 as valid keyboard inputs. The system 200 may also determine whether to recognize contacts on the keyboard 116; while contacts will be described herein with respect to the touch keyboard 106, the functions and processes described herein may also be applied with respect to the keyboard 116. The functions and/or modules of the system 200 may be included in any combination of the server 102, the computing devices 104A, 104B, and/or the touch keyboard 106. The functions and/or modules may, for example, be performed by the server 102 in a remote computing context in which the user logs into the server 102 via the computing device 104A, in a local computing context in which the user logs into the computing device 104B or in which the computing device 104B does not require login and stores the same settings for all users, or by the keyboard 106 in an example in which the keyboard 106 is calibrated for one or more users. The functions and/or modules may also be distributed between the server 102, the computing device 104A, 104B, and/or the keyboard 106. Any combination of the functions and/or modules described herein may also be included in or performed by touch DJ equipment, trackpads, mice, or smartphones. -  The
system 200 may determine whether to recognize contacts on the touch keyboard 106 as valid keyboard inputs based on locations of the contacts, amounts of force or pressure applied by the contacts to the touch keyboard 106, and previous determinations of minimums or thresholds of force or pressure required to recognize a valid contact at various locations on the touch keyboard 106. The minimums or thresholds of force or pressure may be different at different locations on the touch keyboard 106. For example, a first minimum or threshold of force or pressure at a location on the touch keyboard 106 where the space bar is displayed may be greater than a second minimum or threshold of force or pressure at one or more locations on the touch keyboard 106 where keys are displayed. In another example, the guide keys, ‘a’, ‘s’, ‘d’, ‘f’, ‘j’, ‘k’, ‘l’, and the apostrophe ('), at which users typically rest their fingers before beginning to type, may have a greater or higher threshold than other keys on the keyboard, preventing the system 200 from recognizing inputs when the user is merely resting his or her fingers on the touch keyboard 106. The minimums or thresholds may also be different for different locations associated with the same key.
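 -  The following Python sketch illustrates, under assumed names and threshold values, how per-location minimums of the kind described above might be represented, with higher minimums for the guide keys and the space bar than for other keys. It is an illustrative example, not the claimed implementation.

```python
# Hypothetical per-key minimum pressure thresholds. Guide keys and the space
# bar get higher minimums so resting fingers are not recognized as keystrokes.
GUIDE_KEYS = set("asdfjkl'")

def default_thresholds(keys, base=0.30, guide=0.55, space=0.65):
    thresholds = {}
    for key in keys:
        if key == " ":
            thresholds[key] = space
        elif key in GUIDE_KEYS:
            thresholds[key] = guide
        else:
            thresholds[key] = base
    return thresholds

def accept(key, pressure, thresholds):
    # A contact is recognized only if it meets the minimum for that key.
    return pressure >= thresholds.get(key, 1.0)

t = default_thresholds("asdfjkl' qwer")
print(accept("f", 0.4, t), accept("q", 0.4, t))  # False, True
```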
 -  The system 200 may include a calibration engine 202. The calibration engine may calibrate the user's account to determine whether contacts on the touch keyboard 106 were intentional contacts intended as keystrokes, or accidental contacts that were not intended as keystrokes. The calibration engine 202 may prompt the user to type sample text, monitor the user typing the sample text, and determine pressure or force thresholds or minimums for various locations to recognize contacts as valid keystrokes and/or inputs. Calibration may be performed by the calibration engine 202 when the user sets up an account for using the touch keyboard 106, or at any time after the user sets up an account, and the settings, such as pressure or force thresholds and any location variances, may be accessed when the user later types into the touch keyboard 106. -  The
calibration engine 202 may determine force or pressure thresholds during a calibration process. The calibration engine 202 may include a presenter 204. The presenter 204 may cause a prompt to be displayed on the display of the computing device 104 during the calibration process. The prompt may, for example, include text for the user to type into the touch keyboard 106. The text shown on the display may change as the user types into the touch keyboard 106, allowing the user to type a significant amount of text into the touch keyboard 106 for calibration. -  The
calibration engine 202 may also include a listener 206. The listener 206 may detect contacts, such as input 172, onto the touch keyboard 106 during the calibration process. The listener may determine, for each contact, a determined key based on the prompted text and/or location of the contact on the touch keyboard 106, a location on the touch keyboard 106 of the contact, and a pressure or amount of force of the contact. The determined key may also be “none” or “null,” if the system determines that no keystroke was intended, such as if the keystroke was inconsistent with the prompted text and/or the amount of force or pressure was low.
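 -  As a hedged illustration of how a listener such as the listener 206 might pair calibration contacts with prompted text, the following Python sketch records a determined key, location, and pressure for each contact, and records a null key when the contact does not fall on the expected character. The pairing logic, the key_at helper, and the demo data are assumptions made for illustration.

```python
# Sketch of pairing calibration contacts with the prompted text. A contact is
# stored against the expected character when its location falls on that
# character's key; otherwise it is treated as accidental (key = None).

def pair_contacts(prompt_text, contacts, key_at):
    samples, expected = [], iter(prompt_text)
    for x, y, pressure in contacts:
        char = next(expected, None)
        determined = char if char is not None and key_at(x, y) == char else None
        samples.append({"key": determined, "x": x, "y": y, "pressure": pressure})
    return samples

demo = pair_contacts("hi", [(1.0, 2.0, 0.5), (9.0, 9.0, 0.05)],
                     key_at=lambda x, y: "h" if x < 5 else None)
print(demo)
```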
 -  FIG. 3 shows the computing device 104, the touch keyboard 106, and a user typing onto the touch keyboard 106 according to an example implementation. In this example, the presenter 204 may have instructed the display of the computing device to display the text, “Please type this sentence into the keyboard.” The listener 206 may detect contacts, such as input 172, from the user's fingers and/or hands 302, 304 into the touch keyboard 106. The calibration engine 202 may recognize the characters in the text, “Please type this sentence into the keyboard,” as expected input, and may associate sequential contacts with sequential characters in the text. -  The
calibration engine 202 may also determine locations for keys or characters. While most contacts will be inside the intended keys, some contacts may be outside or between keys. Different users may have different typing patterns. For example, a first user may contact the touch keyboard 106 between the ‘o’ and the ‘p’ when intending to type an ‘o’, whereas a second user may contact the touch keyboard 106 between the ‘o’ and the ‘p’ when intending to type a ‘p’. The calibration engine 202 may store the location between the ‘o’ and the ‘p’ in association with the letter ‘o’ for the first user's account, and may store the location between the ‘o’ and the ‘p’ in association with the letter ‘p’ for the second user's account. The locations may, for example, include X and Y values with respect to a reference point, such as an upper-left corner of the touch keyboard 106, according to an example implementation.
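 -  The following Python sketch illustrates, with assumed calibration data, how a contact between two keys might resolve to different characters for different accounts by finding the nearest stored calibration location for the logged-in account. The data layout and function names are illustrative only.

```python
import math

# Sketch: resolve a contact that lands between keys by finding the nearest
# stored calibration location for this account. The data below is an
# illustrative assumption (the first user tends to hit 'o' slightly right).
calibration = {  # account -> list of (x, y, intended_key)
    "first_user":  [(70.0, 10.0, "o"), (78.0, 10.0, "p")],
    "second_user": [(66.0, 10.0, "o"), (73.0, 10.0, "p")],
}

def resolve_key(account, x, y):
    samples = calibration[account]
    _, key = min((math.hypot(x - sx, y - sy), k) for sx, sy, k in samples)
    return key

# The same contact between 'o' and 'p' resolves differently per account.
print(resolve_key("first_user", 73.0, 10.0))   # 'o'
print(resolve_key("second_user", 73.0, 10.0))  # 'p'
```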
 -  In an example implementation, the calibration engine 202 may change the images of keys displayed by the displays 114, 110B based on the user's typing patterns. The calibration engine may change the locations of images of keys based on the locations users contact when they intend to type specific keys. For example, if a user contacts the touch keyboard 106 between the ‘o’ and the ‘p’ when intending to type an ‘o’, the calibration engine 202 may move the ‘o’ and ‘p’ to the right, so that the ‘o’ is where the user contacts the touch keyboard 106 when the user is intending to type the ‘o’. By moving the keys to the locations that the user contacts the keyboard 106 when intending to type the respective keys, the calibration engine 202 may generate a keyboard that conforms to the user's typing patterns and is easier for the user to type into.
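 -  As an illustrative sketch, the following Python example nudges a displayed key's center toward the mean of the locations a user actually contacts for that key, which is one plausible way to realize the key-movement behavior described above. The weighting factor and the sample contacts are assumptions.

```python
# Sketch: nudge a displayed key toward the locations a user actually hits for
# it, so the rendered keyboard conforms to the user's typing pattern.

def adjusted_center(nominal, observed, weight=0.5):
    """nominal: (x, y) of the drawn key; observed: list of (x, y) contacts."""
    if not observed:
        return nominal
    ox = sum(x for x, _ in observed) / len(observed)
    oy = sum(y for _, y in observed) / len(observed)
    nx, ny = nominal
    return (nx + weight * (ox - nx), ny + weight * (oy - ny))

# Contacts for 'o' land to the right of the drawn key, so the key moves right.
print(adjusted_center((70.0, 10.0), [(73.0, 10.0), (74.0, 11.0)]))
```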
 -  In an example implementation, the calibration engine 202 may determine pressure thresholds during use, such as during a login session, in addition to, or instead of, during a calibration process. For example, the calibration engine 202 may update the minimum force or pressure thresholds during a login session by a user, such as by lowering the force or pressure threshold if the user retypes or recontacts a key after the input processor 222 (described below) ignored a contact on the key, or may raise the force or pressure threshold if the user deletes a character that the input processor 222 recognized. The calibration engine 202 may also adjust the thresholds over time by averaging new inputs with stored inputs, according to an example implementation.
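 -  The following Python sketch shows one plausible way the in-session adjustments described above might be expressed: lowering a threshold after an ignored contact is retyped, raising it after a recognized character is deleted, and otherwise averaging toward new samples. The event names, step size, and smoothing factor are assumptions made for illustration.

```python
# Sketch of in-session threshold adjustment for a single key.

def update_threshold(current, event, pressure, step=0.02, smooth=0.9):
    if event == "retyped_after_ignore":
        return max(0.0, current - step)   # the minimum was probably too high
    if event == "deleted_after_recognize":
        return current + step             # the minimum was probably too low
    if event == "accepted":
        # Exponential moving average keeps the threshold tracking typical force.
        return smooth * current + (1.0 - smooth) * pressure
    return current

t = 0.40
t = update_threshold(t, "retyped_after_ignore", pressure=0.35)
t = update_threshold(t, "accepted", pressure=0.50)
print(round(t, 3))
```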
 -  Returning to FIG. 2, the system 200 may also include a memory 208. The memory 208 may store instructions 210 that, when executed by at least one processor, are configured to cause a computing system to perform any combination of the functions and processes described herein. The memory 208 may also store data 212. The data 212 may include both information that is related to the functions and processes described herein, and information that is not related to the functions and processes described herein. -  The
memory 208 may also store input units 214. While only one input unit 214 is shown in FIG. 2, the memory 208 may store multiple input units 214. The calibration engine 202 may cause the memory 208 to store input units 214 based on contacts detected by the listener 206 during calibration. The memory 208 may, for example, store at least one input unit 214 for each key displayed by the touch keyboard 106. The memory 208 may, for example, store multiple input units 214 for each key displayed by the touch keyboard 106, with each input unit 214 corresponding to a location where the user may contact the touch keyboard 106 when intending to type the respective key. The location may include an X value and a Y value with respect to a reference point on the touch keyboard 106. In an example with multiple accounts, the memory 208 may store versions of these input units 214 for each account. -  Each
input unit 214 may be associated with, and/or include, a key 216, such as an alphanumeric key on the touch keyboard 106 and/or an American Standard Code for Information Interchange (ASCII) character or Unicode character. The input unit 214 may also include a location 218. The location 218 may be a detected location on the touch keyboard 106. The location 218 may, for example, correspond to a location on the touch keyboard 106, which may be represented by an X value and a Y value, in which the associated key 216 is displayed and/or where the user contacts the touch keyboard 106 intending to type the associated key 216. The input unit 214 may also include a pressure threshold 220. The pressure threshold 220 may indicate a minimum or threshold of pressure or force that must be detected at the location 218 associated with the input unit 214 for the associated key 216 to be recognized as a valid input during a given contact onto the touch keyboard 106. If the contact does not meet the pressure threshold 220, then no input from the touch keyboard may be recognized and/or the system 200 may ignore the contact.
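 -  As an illustrative sketch of an input unit of the kind described above, the following Python example bundles a key, a rectangular region standing in for the location 218, and a minimum pressure, and recognizes or ignores a contact accordingly. The field names and the rectangular region are assumptions.

```python
from dataclasses import dataclass

# Sketch of an "input unit" record: the key it represents, the region of the
# keyboard it covers, and the minimum pressure needed to recognize it.

@dataclass
class InputUnit:
    key: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    pressure_threshold: float

    def contains(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    def recognize(self, x, y, pressure):
        # Returns the key for a valid contact, or None to ignore the contact.
        if self.contains(x, y) and pressure >= self.pressure_threshold:
            return self.key
        return None

unit = InputUnit("a", 10, 18, 20, 28, pressure_threshold=0.55)
print(unit.recognize(12, 24, 0.6), unit.recognize(12, 24, 0.3))  # a None
```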
 -  The system 200 may include an input processor 222. The input processor 222 may process input from the touch keyboard 106, such as contacts, and determine which character, if any, to recognize. -  The
input processor 222 may include a receiver 224. The receiver 224 may receive inputs from the touch keyboard 106, such as contact inputs including an input 172. The receiver 224 may receive the input from, for example, the controller 154. The inputs may include locations of the contacts, which may include X values and Y values, and pressures of the contacts. -  The
input processor 222 may also include a comparator 226. The comparator 226 may compare the received inputs to the input units 214 stored in the memory 208. The comparator 226 may, for example, determine which, if any, input unit 214 includes the location 218 corresponding to the location on the touch keyboard 106 where the contact occurred. If no input unit 214 includes a location 218 corresponding to the location of the contact, then the input processor 222 may not recognize any input from the touch keyboard 106 and/or may ignore the contact. If an input unit 214 does include a location 218 corresponding to the location of the contact, then the comparator 226 may compare the pressure or force of the contact with the pressure threshold 220 included in the input unit 214. If the pressure or force of the contact meets or exceeds the pressure threshold 220, then the input processor 222 may recognize the input as the key 216 stored in the input unit. If the pressure or force of the contact does not meet or exceed the pressure threshold 220, then the input processor 222 may ignore the input and/or not recognize any input from the touch keyboard 106 based on the contact.
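 -  The following Python sketch illustrates the comparison path described above: find the stored unit whose location covers the contact, then recognize the key only if the contact's pressure meets that unit's threshold. The dictionary layout and sample values are assumptions made for illustration.

```python
# Sketch of the comparator path over stored units, represented as plain dicts.

def compare(contact, units):
    x, y, pressure = contact
    for unit in units:
        x0, y0, x1, y1 = unit["region"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return unit["key"] if pressure >= unit["threshold"] else None
    return None  # no unit covers this location: ignore the contact

units = [
    {"key": "a", "region": (10, 20, 18, 28), "threshold": 0.55},
    {"key": "s", "region": (19, 20, 27, 28), "threshold": 0.55},
]
print(compare((12, 24, 0.6), units))  # 'a'
print(compare((12, 24, 0.3), units))  # None (below threshold)
```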
 -  The system 200 may include an account manager 228. The account manager 228 may manage and/or store accounts for one or multiple users. The account manager 228 may, for example, store a name 230 and authentication 232 for each user. The name 230 may include a username associated with the user. The authentication 232 may include authentication, such as a password, biometric information, and/or address information associated with the name 230. -  The
account manager 228 may also include a session manager 234. The session manager 234 may manage logins, such as by requesting a name and authentication, to log a user in. -  Once the user is logged in, the
session manager 234 may associate the input units 214 that are associated with the logged-in account with inputs and/or contacts received from the touch keyboard 106. The session manager 234 may, for example, cause the input processor 222 to compare received contacts to a first set of input units 214 that are associated with a first account during a session after the first account was logged in, and may cause the input processor 222 to compare received contacts to a second set of input units 214 that are associated with a second account during a session after the second account was logged in. The comparison to different sets of input units 214 may allow different users to use the same computing device 104 and/or touch keyboard 106 and have the system 200 respond to their respective calibrations. The session manager 234 may cause the input processor 222 to compare the received contacts to input units 214 associated with a given account even when the account is logged in via a different computing device 104 and/or touch keyboard 106, allowing a user to log in via different devices and type into a different touch keyboard with the same calibration settings. -  According to an example implementation, the
account manager 228 may recognize and authenticate a user and/or account based on typing patterns into the touch keyboard 106. For example, the memory 208 may have stored average intervals between keys and/or average pressures on keys for a given account. Rather than require a name and authentication, the account manager 228 may simply let a user begin typing and identify and authenticate the user based on typing patterns into the touch keyboard 106, such as intervals between keys and/or pressure on keys.
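 -  As a hedged illustration of identifying an account from typing patterns, the following Python sketch compares the mean inter-key interval and mean pressure of a fresh typing sample against stored per-account averages and returns the closest account within a tolerance. The profile values and tolerance are assumptions, not part of the original disclosure.

```python
# Sketch: identify an account from typing rhythm and pressure.

profiles = {  # account -> (mean interval between keys in seconds, mean pressure)
    "alice": (0.18, 0.62),
    "bob":   (0.31, 0.41),
}

def identify(intervals, pressures, tolerance=0.08):
    mi = sum(intervals) / len(intervals)
    mp = sum(pressures) / len(pressures)
    best, best_d = None, None
    for account, (pi, pp) in profiles.items():
        d = abs(mi - pi) + abs(mp - pp)
        if best_d is None or d < best_d:
            best, best_d = account, d
    return best if best_d is not None and best_d <= tolerance else None

print(identify([0.17, 0.20, 0.19], [0.60, 0.65, 0.61]))  # 'alice'
```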
 -  The system 200 may include one or more processors 236. The one or more processors 236 may include one or more microprocessors capable of executing instructions, such as the instructions 210 stored in the memory 208. The one or more processors 236 may implement modules or functionalities of the system 200, such as the calibration engine 202, input processor 222, account manager 228, keyboard interface 238, and/or computing device interface 240. -  The
system 200 may include a keyboard interface 238. The keyboard interface 238 may receive and/or send signals from and/or to the touch keyboard 106, such as from the controller 154. The keyboard interface 238 may, for example, receive contact inputs, including locations and pressures, from the touch keyboard 106, and pass the contact inputs to the input processor 222. -  The
system 200 may include a computing device interface 240. The computing device interface 240 may send and/or receive signals to and/or from the computing device 104A, 104B. The computing device interface 240 may, for example, receive contact inputs, and may send display signals for generating display output to the computing device 104A, 104B. -  The
system 200 may include an input/output module 242. The input/output module 242 may include a single module which receives input and sends output, or may include a first module that receives input and a second module that sends output. The input/output module(s) 242 may include one or multiple wired or wireless interfaces for communicating with the server 102, the computing device 104A, 104B, and/or the touch keyboard 106. -  
FIG. 4 is a flowchart showing a method 400 according to an example implementation. According to this example, the method 400 may include calibrating an account for input from a touch keyboard 106 (402). In an example implementation, the touch keyboard 106 may include a flat display. The calibrating may include storing locations and pressures of contacts onto the touch keyboard for expected inputs. Calibrating the account (402) may include presenting text for the expected inputs. The calibration engine 202 may calibrate the touch keyboard 106 (402) by, for example, instructing the computing device 104 to present sample text to the user, listening for and/or detecting contact inputs, and storing input units 214 based on the presented text and received inputs. Calibrating the account (402) may include storing locations and pressures of contacts for each of multiple characters, the multiple characters corresponding to the expected inputs. -  Calibrating the account (402) may include determining a minimum pressure threshold for each of a plurality of characters, each minimum threshold being based on multiple contacts and pressures associated with the character. The
calibration engine 202 may also determine which contact inputs represented keyboard entries and which were accidental contacts. For contacts associated with a same key or character, the calibration engine 202 may determine a pressure threshold 220 based on a lowest pressure of all the contacts, may discard a fraction of the contacts with lower pressures or force, or may discard outliers based on a calculated standard deviation, according to example implementations.
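 -  The following Python sketch illustrates one way the threshold derivation described above might be combined: discard a fraction of the lightest contacts, optionally drop statistical outliers, and take the lowest remaining pressure as the threshold for that key. The parameter values are assumptions made for illustration.

```python
import statistics

# Sketch of deriving a per-key minimum pressure from calibration contacts.

def pressure_threshold(pressures, discard_fraction=0.1, max_deviations=2.0):
    # Drop the lightest fraction of contacts, which are likely accidental.
    kept = sorted(pressures)[int(len(pressures) * discard_fraction):]
    if len(kept) >= 2:
        mean, stdev = statistics.mean(kept), statistics.pstdev(kept)
        # Optionally drop outliers beyond a few standard deviations.
        kept = [p for p in kept if abs(p - mean) <= max_deviations * stdev] or kept
    return min(kept)

print(pressure_threshold(
    [0.05, 0.42, 0.45, 0.48, 0.50, 0.52, 0.55, 0.60, 0.58, 0.47]))  # 0.42
```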
 -  The method 400 may also include authenticating an account based on received login information (404). Authenticating the account (404) may include receiving login information, such as a name and authentication information (for example, a username and password) associated with the account. Authenticating the account (404) may include loading a set of input units 214 associated with the account, with which the input processor will determine whether to recognize contact inputs. -  The
method 400 may also include recognizing inputs from the touch keyboard 106 based on comparing received locations and pressures of contacts onto the touch keyboard to the stored locations and pressures of contacts (406). The inputs may be recognized during a session with an account. During the session, the set of input units 214 associated with the account may have been loaded. The recognizing inputs (406) may include the receiver 224 of the input processor 222 receiving contact inputs, including location on the touch keyboard 106 and force or pressure, representing contacts on the touch keyboard 106. The recognizing inputs (406) may also include the comparator 226 comparing the pressure or force of the contact inputs to the pressure threshold 220 of the input unit 214 with a location 218 matching or including the location of the contact input. If the pressure or force of the contact input meets or exceeds the pressure threshold 220, the input processor 222 may recognize the key 216 associated with the location 218 of the contact input. -  The
method 400 may also include ignoring input from the touch keyboard based on comparing locations and pressures of contacts onto the touch keyboard to the calibrated account. If the pressure or force of the contact input does not meet or exceed the pressure threshold 220, the input processor 222 may discard and/or ignore the contact input. -  
FIG. 5 is a flowchart showing another method 500 according to an example implementation. The method 500 may include displaying a prompt to type into the touch keyboard (502). The method 500 may also include storing locations and pressures of calibration inputs onto keys on the touch keyboard in association with an account (504). The method 500 may also include recognizing subsequent inputs into the touch keyboard during a session with the account based on the stored locations and pressures (506). -  According to an example implementation, the
touch keyboard 106 may include a display that displays the keys. -  According to an example implementation, the
touch keyboard 106 may include a resistive touch keyboard that monitors locations and pressures of inputs and/or contacts. -  According to an example implementation, the
touch keyboard 106 may include a standalone touch keyboard configured to transmit keyboard input signals to a general purpose computing system. -  According to an example implementation, the
touch keyboard 106 may include a standalone touch keyboard configured to wirelessly transmit keyboard input signals to a general purpose computing system, such as the computing device 104. -  According to an example implementation, the
touch keyboard 106 may include a standalone touch keyboard configured to display the keys and display the prompt. The server 102 may include a remote server configured to store the locations and pressures of the calibration inputs and recognize the subsequent inputs. -  According to an example implementation, the recognizing (506) may include receiving a login associated with the account and retrieving stored locations and pressures associated with the account.
 -  According to an example implementation, the recognizing (506) may include comparing locations and pressures of typing inputs with the stored locations and pressures of calibration inputs associated with the account.
 -  According to an example implementation, the
method 500 may further include ignoring an input to a predetermined location based on a pressure of the input being less than a threshold for the predetermined location, the threshold for the predetermined location being higher than thresholds for other locations on the touch keyboard. -  According to an example implementation, the
method 500 may further include storing locations and pressures of a second set of calibration inputs onto the keys on the touch keyboard in association with a second account, and recognizing subsequent inputs during a session with the second account based on the second set of stored locations and pressures. -  According to an example implementation, the
method 500 may further include recognizing subsequent inputs into a second touch keyboard during a second session with the account based on the stored locations and pressures. -  According to an example implementation, the
method 500 may further include determining a pressure threshold for each of the keys based on the stored locations and pressures. -  
FIG. 6 shows an example of a generic computer device 600 and a generic mobile computer device 650, which may be used with the techniques described here. Computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document. -  
Computing device 600 includes a processor 602, memory 604, a storage device 606, a high-speed interface 608 connecting to memory 604 and high-speed expansion ports 610, and a low speed interface 612 connecting to low speed bus 614 and storage device 606. Each of the components 602, 604, 606, 608, 610, and 612 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 602 can process instructions for execution within the computing device 600, including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as display 616 coupled to high speed interface 608. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 600 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system). -  The
memory 604 stores information within the computing device 600. In one implementation, the memory 604 is a volatile memory unit or units. In another implementation, the memory 604 is a non-volatile memory unit or units. The memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk. -  The
storage device 606 is capable of providing mass storage for the computing device 600. In one implementation, the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. A computer program product can be tangibly embodied in an information carrier. The computer program product may also contain instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 604, the storage device 606, or memory on processor 602. -  The
high speed controller 608 manages bandwidth-intensive operations for the computing device 600, while the low speed controller 612 manages lower bandwidth-intensive operations. Such allocation of functions is exemplary only. In one implementation, the high-speed controller 608 is coupled to memory 604, display 616 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 610, which may accept various expansion cards (not shown). In the implementation, low-speed controller 612 is coupled to storage device 606 and low-speed expansion port 614. The low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet) may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter. -  The
computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620, or multiple times in a group of such servers. It may also be implemented as part of a rack server system 624. In addition, it may be implemented in a personal computer such as a laptop computer 622. Alternatively, components from computing device 600 may be combined with other components in a mobile device (not shown), such as device 650. Each of such devices may contain one or more of computing devices 600, 650, and an entire system may be made up of multiple computing devices 600, 650 communicating with each other. -  
Computing device 650 includes a processor 652, memory 664, an input/output device such as a display 654, a communication interface 666, and a transceiver 668, among other components. The device 650 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage. Each of the components 650, 652, 664, 654, 666, and 668 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate. -  The
processor 652 can execute instructions within the computing device 650, including instructions stored in the memory 664. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, for coordination of the other components of the device 650, such as control of user interfaces, applications run by device 650, and wireless communication by device 650. -  
Processor 652 may communicate with a user through control interface 658 and display interface 656 coupled to a display 654. The display 654 may be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user. The control interface 658 may receive commands from a user and convert them for submission to the processor 652. In addition, an external interface 662 may be provided in communication with processor 652, so as to enable near area communication of device 650 with other devices. External interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used. -  The
memory 664 stores information within the computing device 650. The memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 674 may also be provided and connected to device 650 through expansion interface 672, which may include, for example, a SIMM (Single In Line Memory Module) card interface. Such expansion memory 674 may provide extra storage space for device 650, or may also store applications or other information for device 650. Specifically, expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, expansion memory 674 may be provided as a security module for device 650, and may be programmed with instructions that permit secure use of device 650. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
memory 664,expansion memory 674, or memory onprocessor 652, that may be received, for example, overtransceiver 668 orexternal interface 662. -  
Device 650 may communicate wirelessly through communication interface 666, which may include digital signal processing circuitry where necessary. Communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 668. In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to device 650, which may be used as appropriate by applications running on device 650. -  
Device 650 may also communicate audibly using audio codec 660, which may receive spoken information from a user and convert it to usable digital information. Audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 650. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 650. -  The
computing device 650 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 680. It may also be implemented as part of a smart phone 682, personal digital assistant, or other similar mobile device.
 -  Method steps may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method steps also may be performed by, and an apparatus may be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
 -  Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Elements of a computer may include at least one processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer also may include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in special purpose logic circuitry.
 -  To provide for interaction with a user, implementations may be implemented on a computer having a display device, e.g., a cathode ray tube (CRT) or liquid crystal display (LCD) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
 -  Implementations may be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation, or any combination of such back-end, middleware, or front-end components. Components may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (LAN) and a wide area network (WAN), e.g., the Internet.
 -  While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the embodiments of the invention.
 
Claims (20)
Priority Applications (7)
| Application Number | Priority Date | Filing Date | Title | 
|---|---|---|---|
| US14/219,921 US9158426B1 (en) | 2014-03-19 | 2014-03-19 | Touch keyboard calibration | 
| BR112016020776A BR112016020776A8 (en) | 2014-03-19 | 2015-03-16 | VIRTUAL KEYBOARD CALIBRATION | 
| CN201580014290.8A CN106104454B (en) | 2014-03-19 | 2015-03-16 | Touch keyboard calibration | 
| JP2016558083A JP6255512B2 (en) | 2014-03-19 | 2015-03-16 | Touch keyboard adjustment | 
| AU2015231608A AU2015231608B2 (en) | 2014-03-19 | 2015-03-16 | Touch keyboard calibration | 
| PCT/US2015/020772 WO2015142740A1 (en) | 2014-03-19 | 2015-03-16 | Touch keyboard calibration | 
| EP15764749.6A EP3120234B1 (en) | 2014-03-19 | 2015-03-16 | Touch keyboard calibration | 
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title | 
|---|---|---|---|
| US14/219,921 US9158426B1 (en) | 2014-03-19 | 2014-03-19 | Touch keyboard calibration | 
Publications (2)
| Publication Number | Publication Date | 
|---|---|
| US20150268768A1 true US20150268768A1 (en) | 2015-09-24 | 
| US9158426B1 US9158426B1 (en) | 2015-10-13 | 
Family
ID=54142091
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date | 
|---|---|---|---|
| US14/219,921 Active US9158426B1 (en) | 2014-03-19 | 2014-03-19 | Touch keyboard calibration | 
Country Status (7)
| Country | Link | 
|---|---|
| US (1) | US9158426B1 (en) | 
| EP (1) | EP3120234B1 (en) | 
| JP (1) | JP6255512B2 (en) | 
| CN (1) | CN106104454B (en) | 
| AU (1) | AU2015231608B2 (en) | 
| BR (1) | BR112016020776A8 (en) | 
| WO (1) | WO2015142740A1 (en) | 
Cited By (57)
| Publication number | Priority date | Publication date | Assignee | Title | 
|---|---|---|---|---|
| US20160259458A1 (en) * | 2015-03-06 | 2016-09-08 | Sony Corporation | Touch screen device | 
| US20180095596A1 (en) * | 2016-09-30 | 2018-04-05 | Biocatch Ltd. | System, device, and method of estimating force applied to a touch surface | 
| USD835661S1 (en) * | 2014-09-30 | 2018-12-11 | Apple Inc. | Display screen or portion thereof with graphical user interface | 
| US10262324B2 (en) | 2010-11-29 | 2019-04-16 | Biocatch Ltd. | System, device, and method of differentiating among users based on user-specific page navigation sequence | 
| US10298614B2 (en) * | 2010-11-29 | 2019-05-21 | Biocatch Ltd. | System, device, and method of generating and managing behavioral biometric cookies | 
| US10397262B2 (en) | 2017-07-20 | 2019-08-27 | Biocatch Ltd. | Device, system, and method of detecting overlay malware | 
| US10404729B2 (en) | 2010-11-29 | 2019-09-03 | Biocatch Ltd. | Device, method, and system of generating fraud-alerts for cyber-attacks | 
| US10417408B2 (en) * | 2017-03-10 | 2019-09-17 | International Business Machines Corporation | Tactile-based password entry | 
| US10474815B2 (en) | 2010-11-29 | 2019-11-12 | Biocatch Ltd. | System, device, and method of detecting malicious automatic script and code injection | 
| US10523680B2 (en) * | 2015-07-09 | 2019-12-31 | Biocatch Ltd. | System, device, and method for detecting a proxy server | 
| US10579784B2 (en) | 2016-11-02 | 2020-03-03 | Biocatch Ltd. | System, device, and method of secure utilization of fingerprints for user authentication | 
| US10586036B2 (en) | 2010-11-29 | 2020-03-10 | Biocatch Ltd. | System, device, and method of recovery and resetting of user authentication factor | 
| US10621585B2 (en) | 2010-11-29 | 2020-04-14 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values | 
| US10685355B2 (en) | 2016-12-04 | 2020-06-16 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering | 
| US10719765B2 (en) | 2015-06-25 | 2020-07-21 | Biocatch Ltd. | Conditional behavioral biometrics | 
| US10728761B2 (en) | 2010-11-29 | 2020-07-28 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data | 
| US10747305B2 (en) | 2010-11-29 | 2020-08-18 | Biocatch Ltd. | Method, system, and device of authenticating identity of a user of an electronic device | 
| US10776476B2 (en) | 2010-11-29 | 2020-09-15 | Biocatch Ltd. | System, device, and method of visual login | 
| US10834590B2 (en) | 2010-11-29 | 2020-11-10 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user | 
| US20200401292A1 (en) * | 2019-06-21 | 2020-12-24 | Cirrus Logic International Semiconductor Ltd. | Method and apparatus for configuring a plurality of virtual buttons on a device | 
| US10884617B2 (en) * | 2016-06-12 | 2021-01-05 | Apple Inc. | Handwriting keyboard for screens | 
| US10897482B2 (en) | 2010-11-29 | 2021-01-19 | Biocatch Ltd. | Method, device, and system of back-coloring, forward-coloring, and fraud detection | 
| US10917431B2 (en) * | 2010-11-29 | 2021-02-09 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video | 
| US10949757B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | System, device, and method of detecting user identity based on motor-control loop model | 
| US10949514B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | Device, system, and method of differentiating among users based on detection of hardware components | 
| US10970394B2 (en) | 2017-11-21 | 2021-04-06 | Biocatch Ltd. | System, device, and method of detecting vishing attacks | 
| US11016658B2 (en) | 2013-06-09 | 2021-05-25 | Apple Inc. | Managing real-time handwriting recognition | 
| US11055395B2 (en) | 2016-07-08 | 2021-07-06 | Biocatch Ltd. | Step-up authentication | 
| US11112968B2 (en) | 2007-01-05 | 2021-09-07 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations | 
| US20210329030A1 (en) * | 2010-11-29 | 2021-10-21 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks | 
| US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces | 
| US11210674B2 (en) | 2010-11-29 | 2021-12-28 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering | 
| US11223619B2 (en) | 2010-11-29 | 2022-01-11 | Biocatch Ltd. | Device, system, and method of user authentication based on user-specific characteristics of task performance | 
| US11269977B2 (en) | 2010-11-29 | 2022-03-08 | Biocatch Ltd. | System, apparatus, and method of collecting and processing data in electronic devices | 
| US20220321121A1 (en) * | 2016-09-20 | 2022-10-06 | Apple Inc. | Input device having adjustable input mechanisms | 
| US11500538B2 (en) * | 2016-09-13 | 2022-11-15 | Apple Inc. | Keyless keyboard with force sensing and haptic feedback | 
| US11606353B2 (en) | 2021-07-22 | 2023-03-14 | Biocatch Ltd. | System, device, and method of generating and utilizing one-time passwords | 
| US11636742B2 (en) | 2018-04-04 | 2023-04-25 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer | 
| US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load | 
| US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator | 
| US11669165B2 (en) | 2019-06-07 | 2023-06-06 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system | 
| US11692889B2 (en) | 2019-10-15 | 2023-07-04 | Cirrus Logic, Inc. | Control methods for a force sensor system | 
| US11726596B2 (en) | 2019-03-29 | 2023-08-15 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors | 
| US11736093B2 (en) | 2019-03-29 | 2023-08-22 | Cirrus Logic Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter | 
| US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive | 
| US11779956B2 (en) | 2019-03-29 | 2023-10-10 | Cirrus Logic Inc. | Driver circuitry | 
| US11847906B2 (en) | 2019-10-24 | 2023-12-19 | Cirrus Logic Inc. | Reproducibility of haptic waveform | 
| US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system | 
| US20240080339A1 (en) * | 2010-11-29 | 2024-03-07 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks | 
| US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters | 
| US11966513B2 (en) | 2018-08-14 | 2024-04-23 | Cirrus Logic Inc. | Haptic output systems | 
| US11972105B2 (en) | 2018-10-26 | 2024-04-30 | Cirrus Logic Inc. | Force sensing system and method | 
| US12035445B2 (en) | 2019-03-29 | 2024-07-09 | Cirrus Logic Inc. | Resonant tracking of an electromagnetic load | 
| US12032744B2 (en) | 2017-05-08 | 2024-07-09 | Cirrus Logic Inc. | Integrated haptic system | 
| US12176781B2 (en) | 2019-03-29 | 2024-12-24 | Cirrus Logic Inc. | Methods and systems for estimating transducer parameters | 
| US12244253B2 (en) | 2020-04-16 | 2025-03-04 | Cirrus Logic Inc. | Restricting undesired movement of a haptic actuator | 
| US12276687B2 (en) | 2019-12-05 | 2025-04-15 | Cirrus Logic Inc. | Methods and systems for estimating coil impedance of an electromagnetic transducer | 
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title | 
|---|---|---|---|---|
| CN107168554B (en) * | 2017-04-26 | 2020-07-24 | Goertek Inc. | Trigger key calibration method, device and equipment | 
| CN117270637A (en) | 2017-07-26 | 2023-12-22 | 苹果公司 | Computer with keyboard | 
| US20210141877A1 (en) * | 2018-07-30 | 2021-05-13 | Hewlett-Packard Development Company, L.P. | User interface modification | 
| US10852842B1 (en) * | 2019-07-29 | 2020-12-01 | Cirque Corporation, Inc. | Keyboard capacitive backup | 
| CN112445356B (en) * | 2019-08-28 | 2025-03-28 | Beijing Taifang Technology Co., Ltd. | A method, device and system for testing and calibrating a touch component | 
Family Cites Families (26)
| Publication number | Priority date | Publication date | Assignee | Title | 
|---|---|---|---|---|
| JP2000056877A (en) * | 1998-08-07 | 2000-02-25 | Nec Corp | Touch panel type layout free keyboard | 
| JP2000066817A (en) * | 1998-08-20 | 2000-03-03 | Nec Corp | Keyboard device | 
| US7750891B2 (en) * | 2003-04-09 | 2010-07-06 | Tegic Communications, Inc. | Selective input system based on tracking of motion parameters of an input device | 
| GB0611032D0 (en) | 2006-06-05 | 2006-07-12 | Plastic Logic Ltd | Multi-touch active display keyboard | 
| US8051468B2 (en) | 2006-06-14 | 2011-11-01 | Identity Metrics Llc | User authentication system | 
| US8452978B2 (en) | 2006-09-15 | 2013-05-28 | Identity Metrics, LLC | System and method for user authentication and dynamic usability of touch-screen devices | 
| US8009147B2 (en) | 2007-09-27 | 2011-08-30 | At&T Intellectual Property I, Lp | Multi-touch interfaces for user authentication, partitioning, and external device control | 
| US8683582B2 (en) * | 2008-06-16 | 2014-03-25 | Qualcomm Incorporated | Method and system for graphical passcode security | 
| US8941466B2 (en) | 2009-01-05 | 2015-01-27 | Polytechnic Institute Of New York University | User authentication for devices with touch sensitive elements, such as touch sensitive display screens | 
| US8619043B2 (en) | 2009-02-27 | 2013-12-31 | Blackberry Limited | System and method of calibration of a touch screen display | 
| TW201044232A (en) | 2009-06-05 | 2010-12-16 | Htc Corp | Method, system and computer program product for correcting software keyboard input | 
| CN102447836A (en) * | 2009-06-16 | 2012-05-09 | Intel Corporation | Camera applications in a handheld device | 
| JP5482023B2 (en) * | 2009-08-27 | 2014-04-23 | ソニー株式会社 | Information processing apparatus, information processing method, and program | 
| US8988191B2 (en) | 2009-08-27 | 2015-03-24 | Symbol Technologies, Inc. | Systems and methods for pressure-based authentication of an input on a touch screen | 
| US8390583B2 (en) * | 2009-08-31 | 2013-03-05 | Qualcomm Incorporated | Pressure sensitive user interface for mobile devices | 
| CN102402373B (en) * | 2010-09-15 | 2014-12-10 | 中国移动通信有限公司 | Method and device for controlling touch keyboard in mobile terminal | 
| US8938787B2 (en) | 2010-11-29 | 2015-01-20 | Biocatch Ltd. | System, device, and method of detecting identity of a user of a mobile electronic device | 
| US8928589B2 (en) | 2011-04-20 | 2015-01-06 | Qualcomm Incorporated | Virtual keyboards and methods of providing the same | 
| US9262076B2 (en) * | 2011-09-12 | 2016-02-16 | Microsoft Technology Licensing, Llc | Soft keyboard interface | 
| US20130106709A1 (en) | 2011-10-28 | 2013-05-02 | Martin John Simmons | Touch Sensor With User Identification | 
| US8436828B1 (en) | 2012-01-27 | 2013-05-07 | Google Inc. | Smart touchscreen key activation detection | 
| WO2013137455A1 (en) * | 2012-03-16 | 2013-09-19 | NTT DOCOMO, Inc. | Information terminal and execution control method | 
| US9104260B2 (en) * | 2012-04-10 | 2015-08-11 | Typesoft Technologies, Inc. | Systems and methods for detecting a press on a touch-sensitive surface | 
| US9430778B2 (en) * | 2012-07-30 | 2016-08-30 | Kount Inc. | Authenticating users for accurate online audience measurement | 
| US8487897B1 (en) | 2012-09-12 | 2013-07-16 | Google Inc. | Multi-directional calibration of touch screens | 
| CN106774965A (en) * | 2013-05-15 | 2017-05-31 | Chengdu Idealsee Technology Co., Ltd. | Touch keyboard, handheld mobile terminal, and method for fast text entry | 
- 2014
  - 2014-03-19 US US14/219,921 patent/US9158426B1/en active Active
- 2015
  - 2015-03-16 WO PCT/US2015/020772 patent/WO2015142740A1/en active Application Filing
  - 2015-03-16 BR BR112016020776A patent/BR112016020776A8/en not_active Application Discontinuation
  - 2015-03-16 EP EP15764749.6A patent/EP3120234B1/en active Active
  - 2015-03-16 CN CN201580014290.8A patent/CN106104454B/en active Active
  - 2015-03-16 AU AU2015231608A patent/AU2015231608B2/en active Active
  - 2015-03-16 JP JP2016558083A patent/JP6255512B2/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title | 
|---|---|---|---|---|
| US20140204059A1 (en) * | 2005-06-08 | 2014-07-24 | 3M Innovative Properties Company | Touch location determination involving multiple touch location processes | 
| US20080309622A1 (en) * | 2007-06-13 | 2008-12-18 | Apple Inc. | Periodic sensor autocalibration and emulation by varying stimulus level | 
| US20120075194A1 (en) * | 2009-06-16 | 2012-03-29 | Bran Ferren | Adaptive virtual keyboard for handheld device | 
| US8982160B2 (en) * | 2010-04-16 | 2015-03-17 | Qualcomm Incorporated | Apparatus and methods for dynamically correlating virtual keyboard dimensions to user finger size | 
| US20110275412A1 (en) * | 2010-05-10 | 2011-11-10 | Microsoft Corporation | Automatic gain control based on detected pressure | 
| US20120050229A1 (en) * | 2010-08-27 | 2012-03-01 | Tenuta Matthew D | Touch sensor panel calibration | 
| US20140160085A1 (en) * | 2012-12-07 | 2014-06-12 | Qualcomm Incorporated | Adaptive analog-front-end to optimize touch processing | 
Cited By (82)
| Publication number | Priority date | Publication date | Assignee | Title | 
|---|---|---|---|---|
| US11112968B2 (en) | 2007-01-05 | 2021-09-07 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations | 
| US11416141B2 (en) | 2007-01-05 | 2022-08-16 | Apple Inc. | Method, system, and graphical user interface for providing word recommendations | 
| US11314849B2 (en) | 2010-11-29 | 2022-04-26 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data | 
| US11580553B2 (en) | 2010-11-29 | 2023-02-14 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering | 
| US12101354B2 (en) * | 2010-11-29 | 2024-09-24 | Biocatch Ltd. | Device, system, and method of detecting vishing attacks | 
| US10262324B2 (en) | 2010-11-29 | 2019-04-16 | Biocatch Ltd. | System, device, and method of differentiating among users based on user-specific page navigation sequence | 
| US10298614B2 (en) * | 2010-11-29 | 2019-05-21 | Biocatch Ltd. | System, device, and method of generating and managing behavioral biometric cookies | 
| US20240080339A1 (en) * | 2010-11-29 | 2024-03-07 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks | 
| US10404729B2 (en) | 2010-11-29 | 2019-09-03 | Biocatch Ltd. | Device, method, and system of generating fraud-alerts for cyber-attacks | 
| US11838118B2 (en) * | 2010-11-29 | 2023-12-05 | Biocatch Ltd. | Device, system, and method of detecting vishing attacks | 
| US10474815B2 (en) | 2010-11-29 | 2019-11-12 | Biocatch Ltd. | System, device, and method of detecting malicious automatic script and code injection | 
| US11250435B2 (en) | 2010-11-29 | 2022-02-15 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values | 
| US11269977B2 (en) | 2010-11-29 | 2022-03-08 | Biocatch Ltd. | System, apparatus, and method of collecting and processing data in electronic devices | 
| US10586036B2 (en) | 2010-11-29 | 2020-03-10 | Biocatch Ltd. | System, device, and method of recovery and resetting of user authentication factor | 
| US10621585B2 (en) | 2010-11-29 | 2020-04-14 | Biocatch Ltd. | Contextual mapping of web-pages, and generation of fraud-relatedness score-values | 
| US10949757B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | System, device, and method of detecting user identity based on motor-control loop model | 
| US11223619B2 (en) | 2010-11-29 | 2022-01-11 | Biocatch Ltd. | Device, system, and method of user authentication based on user-specific characteristics of task performance | 
| US10728761B2 (en) | 2010-11-29 | 2020-07-28 | Biocatch Ltd. | Method, device, and system of detecting a lie of a user who inputs data | 
| US11210674B2 (en) | 2010-11-29 | 2021-12-28 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering | 
| US10776476B2 (en) | 2010-11-29 | 2020-09-15 | Biocatch Ltd. | System, device, and method of visual login | 
| US10834590B2 (en) | 2010-11-29 | 2020-11-10 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user | 
| US10747305B2 (en) | 2010-11-29 | 2020-08-18 | Biocatch Ltd. | Method, system, and device of authenticating identity of a user of an electronic device | 
| US11425563B2 (en) | 2010-11-29 | 2022-08-23 | Biocatch Ltd. | Method, device, and system of differentiating between a cyber-attacker and a legitimate user | 
| US20210329030A1 (en) * | 2010-11-29 | 2021-10-21 | Biocatch Ltd. | Device, System, and Method of Detecting Vishing Attacks | 
| US10897482B2 (en) | 2010-11-29 | 2021-01-19 | Biocatch Ltd. | Method, device, and system of back-coloring, forward-coloring, and fraud detection | 
| US10917431B2 (en) * | 2010-11-29 | 2021-02-09 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video | 
| US11330012B2 (en) | 2010-11-29 | 2022-05-10 | Biocatch Ltd. | System, method, and device of authenticating a user based on selfie image or selfie video | 
| US10949514B2 (en) | 2010-11-29 | 2021-03-16 | Biocatch Ltd. | Device, system, and method of differentiating among users based on detection of hardware components | 
| US11816326B2 (en) | 2013-06-09 | 2023-11-14 | Apple Inc. | Managing real-time handwriting recognition | 
| US11016658B2 (en) | 2013-06-09 | 2021-05-25 | Apple Inc. | Managing real-time handwriting recognition | 
| US11182069B2 (en) | 2013-06-09 | 2021-11-23 | Apple Inc. | Managing real-time handwriting recognition | 
| USD835661S1 (en) * | 2014-09-30 | 2018-12-11 | Apple Inc. | Display screen or portion thereof with graphical user interface | 
| US10126854B2 (en) * | 2015-03-06 | 2018-11-13 | Sony Mobile Communications Inc. | Providing touch position information | 
| US20160259458A1 (en) * | 2015-03-06 | 2016-09-08 | Sony Corporation | Touch screen device | 
| US10719765B2 (en) | 2015-06-25 | 2020-07-21 | Biocatch Ltd. | Conditional behavioral biometrics | 
| US11238349B2 (en) | 2015-06-25 | 2022-02-01 | Biocatch Ltd. | Conditional behavioural biometrics | 
| US11323451B2 (en) | 2015-07-09 | 2022-05-03 | Biocatch Ltd. | System, device, and method for detection of proxy server | 
| US10834090B2 (en) * | 2015-07-09 | 2020-11-10 | Biocatch Ltd. | System, device, and method for detection of proxy server | 
| US10523680B2 (en) * | 2015-07-09 | 2019-12-31 | Biocatch Ltd. | System, device, and method for detecting a proxy server | 
| US10884617B2 (en) * | 2016-06-12 | 2021-01-05 | Apple Inc. | Handwriting keyboard for screens | 
| US11640237B2 (en) | 2016-06-12 | 2023-05-02 | Apple Inc. | Handwriting keyboard for screens | 
| US12422979B2 (en) | 2016-06-12 | 2025-09-23 | Apple Inc. | Handwriting keyboard for screens | 
| US11941243B2 (en) | 2016-06-12 | 2024-03-26 | Apple Inc. | Handwriting keyboard for screens | 
| US11055395B2 (en) | 2016-07-08 | 2021-07-06 | Biocatch Ltd. | Step-up authentication | 
| US11500538B2 (en) * | 2016-09-13 | 2022-11-15 | Apple Inc. | Keyless keyboard with force sensing and haptic feedback | 
| US20220321121A1 (en) * | 2016-09-20 | 2022-10-06 | Apple Inc. | Input device having adjustable input mechanisms | 
| US12341508B2 (en) * | 2016-09-20 | 2025-06-24 | Apple Inc. | Input device having adjustable input mechanisms | 
| US20180095596A1 (en) * | 2016-09-30 | 2018-04-05 | Biocatch Ltd. | System, device, and method of estimating force applied to a touch surface | 
| US10198122B2 (en) * | 2016-09-30 | 2019-02-05 | Biocatch Ltd. | System, device, and method of estimating force applied to a touch surface | 
| US10579784B2 (en) | 2016-11-02 | 2020-03-03 | Biocatch Ltd. | System, device, and method of secure utilization of fingerprints for user authentication | 
| US10685355B2 (en) | 2016-12-04 | 2020-06-16 | Biocatch Ltd. | Method, device, and system of detecting mule accounts and accounts used for money laundering | 
| US10417408B2 (en) * | 2017-03-10 | 2019-09-17 | International Business Machines Corporation | Tactile-based password entry | 
| US12032744B2 (en) | 2017-05-08 | 2024-07-09 | Cirrus Logic Inc. | Integrated haptic system | 
| US10397262B2 (en) | 2017-07-20 | 2019-08-27 | Biocatch Ltd. | Device, system, and method of detecting overlay malware | 
| US10970394B2 (en) | 2017-11-21 | 2021-04-06 | Biocatch Ltd. | System, device, and method of detecting vishing attacks | 
| US11636742B2 (en) | 2018-04-04 | 2023-04-25 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer | 
| US12190716B2 (en) | 2018-04-04 | 2025-01-07 | Cirrus Logic Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer | 
| US11966513B2 (en) | 2018-08-14 | 2024-04-23 | Cirrus Logic Inc. | Haptic output systems | 
| US11972105B2 (en) | 2018-10-26 | 2024-04-30 | Cirrus Logic Inc. | Force sensing system and method | 
| US12314558B2 (en) | 2018-10-26 | 2025-05-27 | Cirrus Logic Inc. | Force sensing system and method | 
| US11779956B2 (en) | 2019-03-29 | 2023-10-10 | Cirrus Logic Inc. | Driver circuitry | 
| US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load | 
| US12176781B2 (en) | 2019-03-29 | 2024-12-24 | Cirrus Logic Inc. | Methods and systems for estimating transducer parameters | 
| US12035445B2 (en) | 2019-03-29 | 2024-07-09 | Cirrus Logic Inc. | Resonant tracking of an electromagnetic load | 
| US11726596B2 (en) | 2019-03-29 | 2023-08-15 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors | 
| US11736093B2 (en) | 2019-03-29 | 2023-08-22 | Cirrus Logic Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter | 
| US11620046B2 (en) | 2019-06-01 | 2023-04-04 | Apple Inc. | Keyboard management user interfaces | 
| US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces | 
| US11842044B2 (en) | 2019-06-01 | 2023-12-12 | Apple Inc. | Keyboard management user interfaces | 
| US11669165B2 (en) | 2019-06-07 | 2023-06-06 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system | 
| US11972057B2 (en) | 2019-06-07 | 2024-04-30 | Cirrus Logic Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system | 
| US11656711B2 (en) * | 2019-06-21 | 2023-05-23 | Cirrus Logic, Inc. | Method and apparatus for configuring a plurality of virtual buttons on a device | 
| US20200401292A1 (en) * | 2019-06-21 | 2020-12-24 | Cirrus Logic International Semiconductor Ltd. | Method and apparatus for configuring a plurality of virtual buttons on a device | 
| US11692889B2 (en) | 2019-10-15 | 2023-07-04 | Cirrus Logic, Inc. | Control methods for a force sensor system | 
| US11847906B2 (en) | 2019-10-24 | 2023-12-19 | Cirrus Logic Inc. | Reproducibility of haptic waveform | 
| US12276687B2 (en) | 2019-12-05 | 2025-04-15 | Cirrus Logic Inc. | Methods and systems for estimating coil impedance of an electromagnetic transducer | 
| US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator | 
| US12244253B2 (en) | 2020-04-16 | 2025-03-04 | Cirrus Logic Inc. | Restricting undesired movement of a haptic actuator | 
| US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters | 
| US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system | 
| US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive | 
| US11606353B2 (en) | 2021-07-22 | 2023-03-14 | Biocatch Ltd. | System, device, and method of generating and utilizing one-time passwords | 
Also Published As
| Publication number | Publication date | 
|---|---|
| AU2015231608A1 (en) | 2016-08-18 | 
| BR112016020776A8 (en) | 2018-01-02 | 
| BR112016020776A2 (en) | 2017-08-15 | 
| JP2017509253A (en) | 2017-03-30 | 
| EP3120234B1 (en) | 2019-09-18 | 
| CN106104454A (en) | 2016-11-09 | 
| US9158426B1 (en) | 2015-10-13 | 
| CN106104454B (en) | 2020-06-23 | 
| AU2015231608B2 (en) | 2016-11-03 | 
| EP3120234A1 (en) | 2017-01-25 | 
| JP6255512B2 (en) | 2017-12-27 | 
| WO2015142740A1 (en) | 2015-09-24 | 
| EP3120234A4 (en) | 2018-01-03 | 
Similar Documents
| Publication | Title |
|---|---|
| US9158426B1 (en) | Touch keyboard calibration |
| US9710639B1 (en) | Single input unlock for computing devices |
| US8850349B2 (en) | Smart user-customized graphical keyboard |
| US9063653B2 (en) | Ranking predictions based on typing speed and typing confidence |
| WO2019096008A1 (en) | Identification method, computer device, and storage medium |
| KR101376286B1 (en) | Touchscreen text input |
| US9304595B2 (en) | Gesture-keyboard decoding using gesture path deviation |
| EP2703955B1 (en) | Scoring predictions based on prediction length and typing speed |
| WO2018137448A1 (en) | Method for fingerprint recognition of terminal, and mobile terminal |
| US9275210B2 (en) | System and method of enhancing security of a wireless device through usage pattern detection |
| US20140007115A1 (en) | Multi-modal behavior awareness for human natural command control |
| US20130346905A1 (en) | Targeted key press zones on an interactive display |
| US9244612B1 (en) | Key selection of a graphical keyboard based on user input posture |
| US20140105664A1 (en) | Keyboard Modification to Increase Typing Speed by Gesturing Next Character |
| US10591580B2 (en) | Determining location using time difference of arrival |
| US9965098B2 (en) | Clamshell electronic device and calibration method capable of enabling calibration based on separated number of cover |
| US8884881B2 (en) | Portable electronic device and method of controlling same |
| CN110825306A (en) | Braille input method, device, terminal and readable storage medium |
| US9015798B1 (en) | User authentication using pointing device |
| US9606973B2 (en) | Input correction enhancement |
| US20160103556A1 (en) | Display apparatus and control method thereof |
| US20140267055A1 (en) | Electronic device including touch-sensitive keyboard and method of controlling same |
| CN107608967A (en) | Error character recognition method and terminal |
| JP2015153338A (en) | Electronic apparatus, authentication method, and authentication program |
Legal Events
| Date | Code | Title | Description | 
|---|---|---|---|
| | AS | Assignment | Owner name: GOOGLE INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: WOODHULL, CHARLES; TANNER, JAMES; SIGNING DATES FROM 20140411 TO 20140416; REEL/FRAME: 035392/0872 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | AS | Assignment | Owner name: GOOGLE LLC, CALIFORNIA; Free format text: CHANGE OF NAME; ASSIGNOR: GOOGLE INC.; REEL/FRAME: 044334/0466; Effective date: 20170929 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY; Year of fee payment: 4 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY; Year of fee payment: 8 |